Running head: ADVISING TECHNOLOGY

Advisor Perceptions of Advising Technology and the Impact on Advising

Meghan Arias

George Mason University


Final paper for qualitative research methods course.



Advisor Perceptions of Advising Technology and the Impact on Advising

There are two schools of thought for providing academic advising to students. Prescriptive advising involves simply prescribing to the student what courses to take or what path to follow, without delving any deeper. In contrast, developmental advising sees the advisor/advisee relationship as similar to a teacher/student relationship. Advisors using a developmental technique focus on teaching students and helping them grow, particularly in ways that may not be addressed in the classroom. Prescriptive tasks must still be dealt with in developmental advising, but they are not seen as the primary purpose of the interaction (Hurt, 2007). Regardless of the philosophical view one has of this role, advisors in both camps use various technologies designed to improve the process. I feel that technology is affecting advisors in their role, though it is unclear whether it is a positive or negative influence.

In a meeting I attended a few years ago about technology being introduced at Large Public University (LPU), a timeline was presented for when new technology would be implemented and which groups on campus would have access. Administrators had a few new things to learn, students had access to a handful of the incoming tools, career services had one new program, and a few other areas on campus were also impacted. Advisors would have access to, and need to learn to use, almost all of the technology being proposed at that time. Those who saw advising as prescriptive hailed the new technology as a cost-cutting measure that would reduce the university's need for staff to do advising. Advocates of developmental advising saw the technology as a way to reduce the time spent on prescriptive advising tasks, opening up more opportunity for developmental advising. No one seemed to ask how the advisors felt about it, though.
The assumption seemed to be that the technology would all contribute to advising in some way, but none of the people introducing the new tools were actually advisors. I chose to explore how advisors feel about their roles in student development and how the technology they use in advising influences this role and their relationships with students. Specifically, this research sought to answer three separate but related questions. First, how do developmental advisors see their role? Second, how do advisors feel about the technology they use in advising? Finally, does the technology have an impact on the role of advisors and their relationships with students?

Background

I am interested in this topic personally, professionally, and educationally. I wrote a paper about advising for an assessment course, so I do have some knowledge of the subject. It is an important topic because much of the research links strong advising to student retention (e.g., Fowler & Boylan, 2010; Young-Jones, 2013), which is relevant to me personally and professionally. My professional role does not include advising, but I do work regularly with advisors throughout the university. While one cannot generalize from such a small group, talking in depth with those individuals could help improve my understanding of the university from an advisor's perspective. This could be useful in my job and potentially in my future research.

My first job out of college was with a university, so I have been involved in higher education for more than the past decade of my life. I have worked in the Registrar's office at LPU for the past eight years, most recently as the head of the graduation and degree compliance section. During this time I have been intensely involved in the development and implementation of new degree evaluation software, Degree Works. This system shows how the classes students have already taken apply to their program requirements and what remains for them to take.
It is designed to help students and advisors understand the degree requirements and more easily identify what courses still need to be completed. I have also been involved in several advising-related committees because of my role with this software, which is intended for use by both students and advisors. These include committees that discuss day-to-day duties and provide information for advisors at LPU, as well as a committee tasked with proposing advising reform at the university. As a PhD student, I have also done research related to advising in my program.

While I believe I have some experiences with students that are similar to those of an advisor because of my role in graduation and degree compliance, I have never been an academic advisor. Even without direct advising experience, my involvement in higher education has shaped my beliefs and assumptions about the topic significantly. I believe that good academic advisors can improve student success and retention, and my prior research supports this belief. My experience working with the degree evaluation system and answering advisor questions has led me to believe that professional advisors are significantly better at academic advising than faculty advisors. Faculty advisors have a greater depth of understanding of their field, the research, and often the job prospects, but many seem to fall short in their knowledge of academic policy and procedures. They have enough other responsibilities that I feel it does not make sense to expect these individuals to do the work of an academic advisor, an entirely separate profession, in addition to their duties as faculty. While I see the faculty role as one of a mentor and guide for students in the field, I see the advisor role as one of a guide and mentor for general academic and life skills.
LPU now provides several tools to advisors and students to increase their access to information, and I believe advisors should utilize the available technology to assist them in serving their students to the best of their ability. The academic catalog, with all of the program requirements and academic policies, and the schedule of classes for each semester are available online for students and advisors. Degree Works allows a student to see if they are on track for graduation, and can help an advisor identify whether the student has room for things like a study abroad opportunity or adding a minor while still graduating on time. We are currently working on an addition to Degree Works that would allow students to set out their entire educational plan in advance, with advisor approval, and register for classes directly from that plan. We hope this simple registration process and clear plan will help students progress through their degree programs more quickly, taking fewer superfluous courses and allowing more students to graduate on time. This would help the students we serve, and the university as a whole, as graduation rates are an important metric by which universities are measured.

In addition, several other systems provide information about individual students or groups. The Student Success Collaborative (SSC) pulls information about students from the past ten years, which can be used to identify patterns and perhaps pinpoint problem areas. For example, advisors may see that 60% of students who took two particular classes at the same time did poorly, and encourage future students to take those courses in different semesters to balance the workload. Map Works was the university's first attempt at implementing an early alert system, which was replaced by the current iteration, Beacon. Beacon allows individuals from many areas across campus to indicate when they have concerns about a student who seems to be struggling.
This can trigger a message to someone in the student's support network, allowing someone to reach out and help the student before it is too late.

While I am specifically responsible for the Degree Works system, my office has primary responsibility for, or was at least involved in, the implementation of many products aimed at advisors. This means I have behind-the-scenes knowledge of the programs, which helped me understand some jargon that came up in my interviews. However, it also means that I am extremely close to this topic and needed to be careful not to allow my emotions to get the best of me. I have been involved in Degree Works since the implementation process and I am responsible for many of the decisions made during that time, so it often feels as if the system is mine. My involvement with Degree Works in particular was a large concern for me when beginning the project. Because of my responsibility and commitment to the system, it often feels like a personal attack when others speak ill of it, and I was concerned this would be difficult for me to cope with during interviews. My role with the system is well known among the advisors my office works with regularly, adding the additional concern that the advisors might not be open about the system due to their knowledge of my involvement with it. I knew I needed to be aware of my emotions while interviewing, and to refrain from responding to my interviewees with information about the system. I had to remember that my purpose was to listen to their thoughts, opinions, and experiences, not to impose my own.

The plethora of tools available is designed to give advisors access to more information and to streamline basic advising tasks in the hopes of better serving students. The hope is that if advisors have access to the most important information about a student and easy access to what classes the student needs to take, they can spend more time on other areas.
The advisor would have time to find out more about the student's personal interests and guide each student in a more meaningful way. I am interested in obtaining the perspective of some advisors using these tools to determine their thoughts on the effectiveness of the systems. I believe strongly in the power of technology, but understand that it is often not a perfect fit. While I hope this is not the case, I expect to find that advisors feel technology actually takes away from the advising relationship in some situations. I deal with advisors on a daily basis who ask basic questions about Degree Works or other systems. While basic questions would be fine from a new advisor, it can be difficult to answer the same questions repeatedly from the same people! Personally, if I were a student with an advisor who appeared uncomfortable with the tools provided by the university, I would lose faith in their ability to advise me appropriately. I also know the degree evaluation system is not perfect, so I went into the study expecting advisors to feel frustrated by its limitations. The goal is to have them trust and rely on the system to free up time for deeper conversations with their advisees, but I believed some advisors would report not trusting the system enough to do this. I anticipated this would mean that they still go through some menial advising tasks manually, and therefore do not reap the full benefit of the tool. These expectations were largely supported. I also hoped to hear about positive impacts of technology on advising. While I have already admitted the new degree evaluation system is not perfect, it is able to identify several common student issues the old system did not recognize, such as a requirement for a certain number of unique credits (credits not applied to other degree requirements) within a minor.
Many past students were forced to take additional coursework or drop minors they thought were complete because the old system did not clearly indicate when the unique credit requirement was unmet; the new system does a much better job of this. All of the advisors did express a reliance on technology, noting that it would be difficult or impossible to conduct most advising sessions without power or internet.

Methods

Study Site

I chose to study advisors at the university where I work. There were several reasons for this choice. First, the condensed time in which this research needed to be completed necessitated using a population I could access easily. Since I work and attend classes at this university, I was able to accommodate my schedule to fit the needs of my interviewees with minimal difficulty. Additionally, my experience with the technology used by the advisors at this university would allow me to hit the ground running, so to speak. I would not need to do background research on the types of tools used by advisors, and I would be familiar with much of the jargon and acronyms that I found to be common throughout the interviews. I also knew that advisors at this university have a great deal of autonomy in their work, so I would not need to negotiate with a central authority to gain access to the individuals I was interested in interviewing. This allowed me to move more quickly from identifying potential participants to reaching out and setting up times for interviews. The summer timing of this research also had a largely positive impact on the availability of advisors. While orientations for new students are in full swing during the summer months, taking up much of the advisors' time, there are overall fewer students on campus, so the advisors had time to meet with me when orientation was not being held.

Concerns. I initially had some concerns about using my home institution; however, those initial concerns turned out to be largely unfounded.
One concern was that my interviewees would not open up to me about the system I oversee. However, several of them did talk about the Degree Works system, providing some positive impressions and several negative ones, so I do not believe they held back. While I did experience some reactions to these comments, I was able to control myself well enough not to respond, instead focusing on listening and asking some follow-up questions of my participants. However, I do believe that much of my focus on not displaying my reaction to negative comments about the system I am responsible for prevented me from following up as deeply as I perhaps could have in other circumstances. My personality also shies away from prying into others' thoughts too deeply, which may have reduced my ability to recognize where deeper follow-up questions may have been appropriate, or to ask them even if I did recognize the opportunity. For example, I could have followed up on how errors in the various systems impact the advisors in sessions with students and how they feel that may impact the relationship, but I largely avoided digging deeper in these areas to avoid imposing my own views through un-scripted questions.

While many of my initial fears were assuaged, I developed new concerns to take their place. In describing his research on medical students, Becker (1998) discusses the use of the word "crock" by a medical student. When Becker inquired as to what the student meant when he called the patient a crock, the student had difficulty putting the definition into words. It took several comparisons of patients and much input from the student and his peers to finally make sense of the word. The students used the word crock as a type of jargon; the meaning was so obvious to them that they could not even describe it. Before coding, I became concerned that, because I am so close to the subject, working with advisors at my own university, I would be blind to any comparable terminology in our own field.
While my closeness to the topic and institution certainly has advantages, it could also cause me to be blind to potential deeper meaning behind our own jargon words, as the medical students were to crock. While my awareness of this encouraged me to attend more closely to the words my participants used, I still requested feedback from my classmates to see if I had any blind spots they might be able to see more clearly. The consultations with my classmates turned out to be incredibly helpful, pointing out some areas I could explore further as well as providing confirmation of some ideas I had already identified.

An additional, unexpected, complication was a recruitment email sent for research being conducted by a full faculty member at the university. This faculty member's research topic was extremely similar to mine. She would be conducting interviews and focus groups around the same time I was hoping to interview, and her focus was the same population as in my research. While I was concerned our two research projects would compete with each other, and that my less rigorous class research would lose access to participants for this reason, this turned out not to be an issue. At least two of my participants also participated in the other research project. I believe the commitment these individuals feel to their role as advisors, and a desire to have their opinions heard, played a role in their willingness to devote so much of their precious time to being involved in two separate, but similar, research projects.

Participants

Each college at our university has its own advising format, some with professional advisors, some with faculty advisors, others with a combination of the two. Therefore, I wanted to interview individuals from different colleges to obtain opinions from advisors in different circumstances. My final sample consisted of four individuals, representing three colleges. I decided to focus on professional advisors.
Professional advisors are hired with a primary focus on providing advising. Some teach, particularly the introduction to college course (UNIV 100), or have additional administrative duties, but advising is a primary responsibility for professional advisors. For faculty advisors, advising is a secondary task. They often receive a course release, which allows them to spend time they would traditionally spend in a classroom, or in outside activities related to a class, on advising. Teaching and research are the primary responsibilities of faculty at a university, and faculty advisors vary greatly in their given responsibilities and their effectiveness in this secondary role.

The specific advisors I chose were ones with whom I already had an established professional and/or personal relationship. I knew this would be easier for me on a practical level than reaching out to unknown individuals, but I feared that it would cause my participants to temper their opinions because they know about my role in some of the technology they use. As mentioned earlier, my concern about participants not being open turned out not to be an issue, with many of my participants directly critiquing the system with which I work most closely. While this was somewhat difficult for me to hear, I believe I was able to maintain an appropriate demeanor throughout the interviews because I appreciated my participants' candor, and to react otherwise may have discouraged them from providing further insight on their use of the technology, particularly the Degree Works system. A more positive impact of my insider's perspective was that when the advisors used terms like "percentage bar," something several complained about, I knew what they were referring to without needing to ask for clarification. The percentage bar shows a student's progress toward their degree. The formula is proprietary, but it is largely based on the number of requirements marked as completed on the evaluation.
It shows only 98% complete while students are completing their final courses, not showing 100% until they have successfully passed those requirements. This has caused stress for many students and, as I have come to realize, their advisors.

Data Collection

I began by reaching out via email to one advisor, Jane* (*all names are pseudonyms). I only emailed one person at first to ascertain this individual's reaction to the email I had drafted and to determine if any edits would be necessary before reaching out to others. The email explained that I was conducting research on the role of technology in academic advising as part of a research methods course. The university has a generous benefit which allows faculty and staff to take coursework through the school, so many of those I reached out to have taken or will take a similar research methods course that requires them to do such research. I believe this also played a role in their willingness to help me, as they understood the purpose and difficulty of such an assignment. The email went on to express the desire to meet for an hour-long one-on-one interview, and I provided broad categories that I would ask about in this interview. The communication concluded by noting that I hoped to audio record the encounter and would keep their information confidential. The first response came back almost immediately: of course!

After setting an interview appointment with Jane, I reached out to two more advisors, Jay and Jill, both of whom quickly agreed to participate in my research. I conducted my first interview with Jay, who had already participated in the faculty member's similar study. We talked briefly about this other research and the trials of working full time while going to school before officially beginning the interview. Our prior professional relationship made the interview go quite smoothly, with Jay sharing details of his day-to-day experience and strong opinions freely.
The interview with Jill was a few days later and was similar in execution to Jay's interview. Jill had not participated in the other research yet, but said she was planning to when she returned from vacation. I asked about the vacation and we chatted about how nice it is to be able to get away in the somewhat calmer summer months before officially beginning the interview. Again, I believe the prior relationship allowed Jill to open up rather quickly, as she seemed to have no hesitation sharing her concerns about the technology our institution implements.

Jane ended up requesting to reschedule her interview, so I reached out to a fourth advisor, Joy, just in case I was unable to connect with Jane. I was able to connect with both of them, for a total of four interviews, which worked out well as the final three interviews each ran just under an hour. Table 1 provides a breakdown of the length and location of each interview.

Table 1. Interview Times and Locations

Interviewee   Date/Time            Length              Place
Jay           6/26/15, 12:30 pm    1 hour, 3 minutes   My office
Jill          6/29/15, 12:00 pm    50 minutes          My office
Jane          7/9/15, 2:30 pm      41 minutes          Jane's temporary office
Joy           7/10/15, 9:30 am     42 minutes          My office

After transcribing the first two interviews, I heard where some of my questions went against the flow of the participants' answers, so I restructured my interview guide in an attempt to maintain an appropriate flow to the questions (Appendix C). I did ask some follow-up questions not included in the guide in most of the interviews, which may have contributed to the flow issues when I tried to jump back to the interview guide. However, the changes I made did seem to help maintain a more appropriate flow in the remaining interviews, even with follow-up questions.

One thing I would certainly like to add to this study if I did it again, with additional time and resources, would be observations of actual advising sessions. Since there are various types of advising sessions and each student and session is different, it was difficult to obtain specifics about actual advising sessions. It would be useful to observe the advisors I spoke with in an advising session to see how they interact with the students and with the technology. While the summer timing of this project was beneficial for scheduling interviews, the advisors are focused on orientations rather than general advising sessions, and those are quite different, as orientation advising is often in a group setting. Being able to see how advisors are actually using the technology, in addition to hearing how they report using it and how they talk about the tools, would add depth to my study. Instead, I received only the view these advisors chose to present to me. While they appeared open and forthcoming in our conversations, it is possible they have not analyzed their own behavior, and examining their words within the context of their actions in an actual advising session could reveal important information they might not be aware enough of to articulate.
Data Analysis

As I transcribed my interviews, I kept another document open to make note of anything interesting that came up, particularly things that might be relevant to my research questions. I completed the transcriptions in a somewhat piecemeal fashion, transcribing a few minutes here and there when I had time, so I read through all the interviews again once I finished transcribing. This allowed me to review everything together and maintain the larger picture of the interviews in my mind, as well as begin to identify some larger themes before fragmenting the data. I recognized both positive and negative views of technology in each advisor, but all indicated experiencing frustration with various systems at least some of the time. Communication, both with the student and within the university, came up in all the interviews as well. Finally, all the interviewees expressed a desire for the technology to be more streamlined and less cumbersome to use.

Next, I loaded all the transcripts into NVivo and created a few codes I thought would be useful to begin with, based on the connections I identified previously. These original codes were student relationships, communication, frustration, and positive and negative views of technology. Several of our readings warn against too much reliance on computer-assisted data analysis programs. While as a novice researcher I certainly appreciate the wisdom of those who have gone through this process countless times already, I feel some of the reservation comes from personal preferences developed before the technology was fully mature. Emerson, Fretz, and Shaw (2011) caution that it is "difficult to modify codes once applied to specific pieces of data" (p. 176), but I found the process to be mostly painless. It was certainly easier for me than even the thought of printing, highlighting, and cutting dozens of sheets of paper!
Most of our readings emphasized that the physical process is largely a personal preference developed through trial and error. While I have never done qualitative research before, I have done countless literature reviews. In the early days of my education, I would print or photocopy articles and spread them in piles by topic, with different colored highlights for different key points. However, as the available technology has improved, and my partner's patience for vast piles of paperwork has diminished, I have become comfortable doing equivalent tasks electronically. The analysis involved in coding the interviews is different from what is required for a literature review, but the process described in many of our readings struck me as similar enough that I was confident in my ability to use NVivo effectively.

As I read each transcript, I used a fairly open coding method, adding codes when important ideas did not fit neatly into one of my early codes. I ended up with more than 30 categories (Appendix A), mostly substantive, some of which are subsets of larger areas. I went back through each interview with this larger arsenal of codes to ensure I captured all the important concepts and identified a sufficient number of examples for each code. In the first round of coding, I tried not to overthink things, letting my data tell me what codes to use. This resulted in some codes that overlapped, so I went back through the snippets I had coded in similar categories to determine if they were sufficiently similar to combine, or if they were subcategories of some larger concept. I was able to combine several categories and break others out. I also began eliminating codes that did not speak to the larger relationships and processes being identified, and reorganized the codes to make the relationships I identified more prominent (Appendix B). The re-coding process was the most frustrating part of using the NVivo software.
I went back through each of the nodes and read the sections of interview text I had identified as relating to that particular node. However, in going back through these snippets out of context, I often felt the need to read the surrounding information to make more sense of the comment. I also wanted to ensure that any further coding or interpretation I did of these smaller snippets would be faithful to the interviewee's context and not overly laden with my own desired perspective. I felt the software should have made it easier to view the content surrounding a coded passage.

Findings

In poring over what the advisors said, one point of interest that emerged actually had to do with what the advisors did not say. I explained my research topic and general interests to each participant before beginning the interview, so they were all aware of my particular interest in advising technology. However, for the most part, their answers to my more general advising questions did not mention technology at all. This lack of focus on technology is more telling than anything else; the word NVivo identified as occurring most often in the interviews was "student." Several of my early questions dealt with general advising topics. I asked how they had gotten into advising, how they see their role as advisors, and what they generally do each day or in a typical advising session. Two advisors mentioned email as an important part of their daily routine, and all mentioned, at least in passing, helping students decide their schedules, which inherently relies on the technology since the course schedule and registration are online. However, the focus of the answers to these general questions was far more on the student than the technology. Student relationships came up repeatedly in these interviews.
Jay said that while he enjoys working with the data he can obtain from the SSC, he does not use it directly in advising or in preparing for advising appointments because "it's not gonna tell me information that I can't ask the student for." The relationship with the student is at the forefront, and the advisors I spoke with seemed to resist technology that attempted to usurp their interactions with students. Jay talked about his preference for SSC over another system because SSC simply provided information and the advisor could take what he or she needed and leave the rest. The other system, Map Works, "was trying to take away that, because it was compensating for a lack of human resource." He also felt the technology actually "detract[s] me from what I'm trying to accomplish with those students. If I've got like 17 windows open, I'm more focused on the window than I am on the student sitting right in front of me." Joy agreed, noting that the data from SSC can be useful to the administration, but not useful in direct interactions with students. The SSC displays graphs of how a student is doing in relation to other students in the same program, and Joy feels "it's not helpful for them to see [the graphs]. We can have a conversation with them."

Three of the advisors also talked about the uniqueness of students at the university, or in their program in particular. Jill said the nationally normed data used by the Beacon system is essentially worthless because "LPU is a unicorn. I don't care who you talk to, it's just its own special butterfly and you cannot use normed data for it." All schools are unique to some extent, but the level of pride I heard in the advisors' voices when talking about their special school or students suggests a deep level of commitment. This is especially meaningful in connection with the lack of power these advisors feel in their roles.

Issues of power or control came up in several interviews.
Control, as I mean it here, deals mostly with the lack of power advisors feel they have. First, the advisors often seemed to feel powerless in the choice and implementation of the technology they are expected to use. Jay says he feels none of the advisors have ever been asked about implementation, or even about the products themselves, before they are purchased and implemented. Jill expressed the same and lamented that administrators assume "that it's gonna be a system that will be helpful for [advisors], but usually people making those decisions have not advised a day in their lives." Closely tied to the implementation is the lack of control the advisors have over what the technology provides to them. The Beacon early alert system identifies students who are in the "murky middle." These are students who are at risk of dropping out of college, but are not easily identified through traditional means. While many details of the formula are not released, part-time students are automatically identified as at-risk. Jay complains about this automatic classification, noting that many of his part-time students are the best about checking in with him because they are part-time and need to make every course count. He feels bogged down by these many part-time students being flagged as needing contact in the system and feels he should have some control over that. He argues that "if I have a well-established relationship with them, I should be able to pull them off my report." Joy says students are marked in red, or high risk, in Beacon and "they might have like a 3.8 GPA, but they're marked in red because they said they were homesick. And it's like, ya know, they're a new student, we should expect them to be homesick. Is that a reason that we should mark them as red?"
Jill has similar feelings about SSC, stating:

Every now and then students will pop up on a list [from SSC] that shouldn't be there, and that's annoying because then it creates trust issues for me, and you have to be able to trust a technology system. So, if a student's popping up on a list and they don't qualify to be on that list, or you're thinking, I can't filter this student out and there's probably more students like this student on the list, therefore, I can't use this list, right? Well then, you're useless to me. SSC right now is just, it's not giving me the level of filter that I get from Requests [@gmu.edu, the Registrar IT staff's email to request student data reports], so basically, I use SSC to kinda play around to see what types of populations I would be interested in getting, and then I'd request a list that has more information that I could use. So is SSC helping me that much? I mean, it's helping me brainstorm, it's making the data more usable for me, um, it's making me feel like I understand things, I guess, so there's a lot of value added there, but just give me what I want, ya know.

While not all settings in a purchased software package can be changed, most of these systems provide significant customization options for the various institutions that implement them. Had the advisors been consulted during the implementation, it is possible different markers could have been identified or additional filters provided. Of course, since Jay noted that he has "sort of butted heads a little bit" with some colleagues on what he considers an at-risk student, even advisor input would not resolve everything.

While there was definite frustration expressed about the noticeable lack of advisor input on the technology advisors were supposed to use and the problems in using it, the larger frustration was often aimed at feelings of being powerless to serve all of their students. Lack of resources and issues of feeling overloaded were brought up again and again.
Jane talks about advising students in what was essentially a closet because her department did not have sufficient resources to provide her with an office; the closet had an outlet and a chair, which was all she needed to make it work. Both Jay and Jill talked about how the technology cannot be fully utilized because people are still needed. As Jill puts it, "The technology will give you the information, it can show you the problems, but it's not going to solve them. You need people to solve the problems." Jill explains it this way:

I feel personally responsible for them. Now, am I responsible for all of them? Absolutely not, that's ridiculous, but the type of people that work in this field, we feel responsible for them. I feel like it's my job to make sure every single one of them make it to graduation. And if they're not making it there, I need to figure out why and help them do that.

This statement precedes one of her only positive comments about advising technology:

The technology helps you with that, you feel like you're getting closer to catching all of them. 'Cause you can't go up to every single person and say, "How are you doing, how are you feeling, what's going on?" You can't do that with 6 thousand people, so I think that's what technology helps us with. Taking all that big data, boiling it down to something that makes sense and is a good predictor of those things, it's never gonna be 100% accurate, but even asking people isn't a hundred percent accurate, so I think it helps me in that way.

Joy feels the sheer quantity is an issue, stating that the school has "way too many technologies for advisors... you have to pull up at least four different systems to get the information you need before you pull a student back. Like that's just ridiculous, that's not a good use of time." One unexpected theme, related to the overwhelming number of technologies in use at LPU, was the desire for integration. One of the final questions I asked in each interview was, "What would advising technology do in a perfect world?" Many participants wanted a system that made communicating with students easier, but all four advisors wanted a more integrated system. Earlier quotes show the advisors' frustration with having to open multiple windows and check multiple systems. Jay says the perfect technology is one in which "it all talks to each other and it's in real time." Jill asks for it to "all be in one place... [and] manage communication in a way that there's no confusion about who's telling the student what." Jane wants the technology to be "more streamlined, just be simple" and feels "there should not be 10 steps for me to get into something that should take maybe 5 steps." Until she can have a hologram that could pop up to answer questions when advisors are not available, Joy's dream technology would have "one interface that I can use to take notes, review the student's case file, scan and upload paperwork to, see where the student is in terms of GPA, see what their interests are." The desire for integration was by far the most consistent answer, but unfortunately, it is also a dream that is difficult to fulfill without the proper funding and support. Each individual system LPU uses is proprietary, so the systems often have trouble communicating, hence the need for advisors to open so many windows. Jill talked about a homegrown system at her last institution, a system developed within the university for their own use.
When I expressed how impressive that was, she explained that "they have the resources to do that, but I also think that LPU is really far behind as far as that goes. Again, because they don't have enough people."

There was also evidence of a tension between the needs of the student and the needs of the university. Jay feels that the tools are "always there to serve the institution, and it's never really created to serve the student." LPU is looking to grow as an institution, and according to Joy there is a "huge effort... [to] reach out to students who have checked [on a Beacon survey] that they intend to leave." However, she feels that "some students, they should leave. If another school is a much better fit for them, it's where they wanted to go from the beginning, they have the grades to do it and... the student's gonna have better experiences somewhere else, well, why would we keep them here?" Jay expresses the same sentiment: "If they're gonna leave the institution, yes, that's at-risk as far as the university is concerned, but if it's in the best interest of the student, then that's what I'm gonna help them with." By stating that he will not help [students] "stay here because I want to keep their enrollment," Jay is declaring the value he places on his students over that of the university. While Jill also values the needs of her students, she argues the needs of students and the needs of the university do not have to be at odds because "if you provide support [to students], we get our retention and persistence numbers up." Part of this support, to Jill, is strong advising, and as discussed earlier, these advisors often do not feel they receive the support they need from their university.

In summary, I believe I mostly answered my original research questions and found additional themes of interest. Advisors see their role as focused on the needs of their students, not what the university needs.
They all see advisors as playing a role in student development to some degree, which is not surprising considering my focus on developmental advising. Advisors are generally not fans of the technology they use, but acknowledge it can be useful, and sometimes necessary. It seems to impact their roles as advisors mostly by being one more thing to check, and the technology appears to have little direct impact on their relationships with students. However, this could be a symptom of the advisors' lack of control over the technology and the lack of integration. Perhaps if they had more control and the student information was more easily accessible in one central system, it would play a larger role. I think Jill sums this up best when she says, "make sure that technology is working for you and not the other way around." Unfortunately, the technology seems to be working for the administration at the university, with little concern for the advisors.

Limitations

My personal relationship with many of the participants I selected came from my experiences with them in class. This had an unexpected consequence that I did not fully appreciate until coding and analysis: all four participants had roles that included duties beyond advising. Jill and Joy have worked up to the manager or director level and spend more of their time overseeing advising than doing it, Jane is an adjunct faculty member as well as an advisor, and Jay has several coordinator roles outside of advising. While there were common themes throughout, Jill and Joy had more in common with each other than they did with the other two participants. If I could do the study over again, I would choose either individuals at the advising director level or full-time advisors, not a combination. The perspectives of both are valuable, but I think there are enough differences to warrant looking at them separately, or at least with more people from each group for a more robust comparison.
This difference impacted a theme that began to take shape from Jane's and Jay's interviews, but did not appear in Jill's or Joy's. Competence was definitely an issue for Jane, who stated that when a new technology is introduced she does a mock advising session so she "doesn't look like an idiot basically in front of my students. I don't want them to think I'm not credible because of a technology thing." Jay used less straightforward words that seemed to address a similar issue. He describes opening several key systems before students arrive to avoid "floundering around trying to get into the site." He also talks about checking the degree evaluation "just to see if something's wrong" when students are not present, so he can address any issues ahead of time so "the student is not getting the same experience that I'm looking at." One possible explanation for the lack of this theme in Jill's and Joy's interviews is that they spend less time in one-on-one advising and more time on administrative issues. However, it is not possible to determine the reason for this difference with the data I have. Another limitation was, of course, the shortened summer time frame. Glesne (2014) suggests several steps qualitative researchers can take to improve the researcher's trustworthiness, similar to the idea of validity in quantitative research. Unfortunately, the short time frame did not allow for many of the recommendations, such as extended observations and member checking. I was, however, able to utilize other suggestions. Glesne (2014) recommends "reflecting upon your subjectivities and upon how they are both used and monitored" and obtaining external reflection and input on your work (p. 53). The class has these steps built in through the memo assignments and consultations, both of which were incredibly helpful throughout the process. Reactivity is present in all research, particularly qualitative research, and my study was no exception.
While I did my best to develop non-leading questions for my interview guide, there was a potential influence I had not considered. While I work at the same university as all of my participants, I do not see myself as having any power over them. However, in my final consultation, my partner pointed out that even if I do not have direct power over them, my role in the Registrar's Office does impact their work. I work with the degree evaluation they use with students, and I am the gatekeeper for graduation, which they hope for their students to obtain. I had not considered this angle prior to the final consultation, and so did nothing to address it. As stated previously, participants did not seem to have any hesitation in discussing the system I work with, so it does not appear my influence was any more than what is to be expected in this type of research. However, had this occurred to me earlier, I may have decided to conduct the study at a different site.

Conclusion

The most important thing I learned about doing qualitative research from this study is that I am not a fan! Breaking the quantitative thought processes and habits ingrained over years has been extremely difficult. I have taken several "introduction to research" type classes, and even these supposedly broad-view introductions focus heavily on the quantitative perspective. I am not someone who documents things regularly, and while I attempted to jot down notes throughout the process, as several of the readings recommended, it was not something that came naturally to me. I felt most comfortable with the whole process after entering some codes and playing with various report capabilities in NVivo. I could potentially do a mixed-methods study in the future, because I do see value in getting a more nuanced understanding that is easily missed with quantitative research. However, I do not see an entirely qualitative study in my future! This research has taught me a lot about my topic.
I have already started using information from the interviews to try to make a difference for advisors. While speaking about a different subject with the project manager for (yet another) new technology that will be arriving soon, I mentioned (without going into detail about how I knew this!) that advisors often feel left out of implementations and that she might receive more support for her project if she tries to involve them in this system. Advisors will not be required to use her system, but advisor buy-in will greatly improve its chance of success, so she seemed enthusiastic about the suggestion. However, I do not know if she will follow through. Additionally, I will work to make changes to my own system based on some of the feedback I received in the interviews. There is only so much I can do, but moving forward, I will always consider ways I can help the advisors feel like they have more control over the systems they are expected to use.

References

Becker, H. S. (1998). Understanding strange talk. In Tricks of the trade: How to think about your research while you're doing it (pp. 150-157). Chicago, IL: University of Chicago Press.

Emerson, R. M., Fretz, R. I., & Shaw, L. L. (2011). Writing ethnographic field notes (2nd ed.). Chicago: University of Chicago Press.

Fowler, P. R., & Boylan, H. R. (2010). Increasing student success and retention: A multidimensional approach. Journal of Developmental Education, 34(2), 2-10.

Glesne, C. (2014). Becoming qualitative researchers: An introduction. Boston: Pearson.

Hurt, R. L. (2007). Advising as teaching: Establishing outcomes, developing tools, and assessing student learning. NACADA Journal, 27(2), 36-40. http://doi.org/10.12930/0271-9517-27.2.36

Young-Jones, A. D. (2013). Academic advising: Does it really impact student success? Quality Assurance in Education, 21(1), 7-19. http://doi.org/10.1108/09684881311293034

Appendix A

Original Codes

Academic integrity
Barriers
Bureaucracy
Communication
Easiness (to use technology)
Handed the role
Helping
Knowledge/Competence
Lack of resources
Needs of college
Old school
One-size-fits-all
Overload
Ownership/control
People using it
Responsibilities
Resistance
Student Relationships
Student Engagement
Student Needs
Support
Technology
Customization
Frustration
Integration
Lack of Understanding
Negative views
Positive views
Tether (role of advisor)
Time
Trust in data
Uniqueness of students
Wants

Appendix B

Final Codes

Students
    Communication
    Engagement
    Relationships
    Student Needs
University
    Communication
    Bureaucracy
    Lack of Resources/Support
    Needs of college
Technology
    Customization
    Frustration
    Integration
    Trust
    Ownership/control
    Knowledge/Competence

Appendix C

Updated Interview Guide

General
    Why advising? How did you get here (advising in general and/or current job)? What steps did you take to get here?
    Did you know, or realize the extent, that advising would be part of your job when you started here?
    How do you see your role as an advisor?

Advisee Relationships
    How do you see your role with students/student development?
    Describe your perfect advisee. What is a typical relationship with advisees? Best? Worst?
    Provide some examples of when advising has been rewarding.
    Provide examples of times you have been unhappy or stressed with work.
    Has technology played a role in any of these experiences?

Day in the life...
    Walk me through a typical day (advising primary responsibility?).
    The schedule for next semester has just come out and a student has come in for advising. Walk me through what that session looks like.
    What other advising sessions do you often have? Role of technology in those?
    What technology do you use regularly in advising sessions? For what reasons? Are you happy with them?
    How would you do things differently in an advising session if the power or internet was out?

Technology
    What were your feelings before new technology impacting advising was implemented?
    Example of the worst experience using the technology.
    Example of the best experience using the technology.
    In a perfect world, what would advising be like? What would the technology do? How close or far do you think it is from this?