
NCSU LIBRARIES SUMMON USER RESEARCH: FINDINGS AND RECOMMENDATIONS

Prepared for NCSU Libraries by MoreBetterLabs, Inc.

Lead analyst: Abe Crystal, Ph.D. Last updated: 4/13/10. Contact: 919.593.6129 | [email protected]

CONTENTS

Research overview
    Purpose of this research
    Research methods
Key findings
    Overall impressions
    User experience
    Integration of Summon with library website redesign
Next steps
Appendix A – Recruiting screener
Appendix B – User research guide
    Checklist of issues to observe
Appendix C – Information retrieval system relevance evaluation
    Suggested method for relevance evaluation


RESEARCH OVERVIEW

PURPOSE OF THIS RESEARCH

The Libraries have been investigating ways to improve discovery and retrieval of library resources, particularly for undergraduate students who can find the complex array of databases and e-journals difficult to navigate. The “Summon” application, developed by Serials Solutions, attempts to address these issues by providing a unified search interface.

ABOUT SUMMON

Summon simplifies the process of finding information in the Libraries. It searches through a massive collection of books, scholarly journals, newspaper articles, e-books, dissertations, conference proceedings, and numerous academic databases. Rather than search for all of these things in separate databases, Summon lets you find them all at once, quickly and easily. Even better, it's tied directly to what is available through the NCSU Libraries, online or in print.

The purpose of this engagement was to evaluate how students use the beta implementation of Summon, assess how well Summon supports library users, and develop recommendations for improving its design, functionality, and integration with other library websites and applications.

RESEARCH METHODS

Research scope

We focused on assessing the strengths and weaknesses of Summon’s information retrieval capabilities and user interface.

In particular, we sought to observe students interacting with Summon to complete a variety of tasks, including:

Known item
  o Find a textbook for a course you're taking
  o Find a favorite book or author

Citation lookup
  o Get to the full text of an article from a properly formatted citation

Simple research
  o Find articles from the most recent issue of a given journal

Complex research
  o Find information for a real research assignment, ideally from a current class

For the complete research guide and scenarios, see Appendix B. We developed the guide based on input from the NCSU team, examples of syllabi and assignments from NCSU classes, and tasks from previous user studies at NCSU.

Participants

We recruited ten student participants, with a range of backgrounds, academic interests, and information literacy skills. See Appendix A, “Recruiting screener.”

Participant #   Academic standing        Major/field          Gender
Pilot A         Sophomore                Undeclared           Female
Pilot B         Senior                   Political science    Male
1               Junior                   Business             Male
2               Masters                  Food Science         Female
3               Masters                  Plant Pathology      Female
4               Junior                   Psychology           Female
5               Masters                  Computer Science     Male
6               Senior                   Philosophy           Male
7               Junior/nontraditional    Anthropology         Female
8               Senior                   Textiles             Male
9               Junior                   History              Male
10              Doctoral                 Science Education    Female

Procedure

Eliciting background and context

Sessions began with a brief semi-structured interview, in which participants were asked to provide background on their academic interests and goals and their use of the library. These prompts led into discussion of the student’s research process and use of resources in the library and on the Web. Based on the discussion, we identified a particular research interest to focus on, such as an assignment in a current class for an undergraduate, or a thesis topic for a graduate student.

Example research topics:

- Effect of reniform nematodes on cotton plants
- Role of memory in OCD
- The TCP incast problem in datacenters
- The effects of large dam-building projects in India
- The knot strength of filaments for ropes

Usability testing

This qualitative research was structured to gather key insights on students’ experience with Summon. Students were given considerable leeway to choose their own tasks, explore Summon, experiment, and make mistakes. Because of this approach we did not collect data on task efficiency or error rate, as these data would not be comparable across individuals.

We asked students to use Summon to perform both directed tasks (e.g., find an article based on this citation) and natural tasks (look for information on your research topic, as you would on your own). We alternated between directed and natural tasks as time allowed, seeking to have each participant use all of Summon’s key features. In some cases, this required prompting students directly to use features they had not discovered, such as “Save Items.”

Sessions were recorded in Morae and are available for review by NCSU staff.

KEY FINDINGS

OVERALL IMPRESSIONS

Each participant was asked to answer seven assessment questions after having used Summon.

Here are selected answers to these post-test questions, chosen to represent typical student impressions of Summon:

A. How would you describe Summon to your friends?
- A new search engine for the library to bring resources together.
- Like Google but for the library.
- A fairly easy way to find full articles and books online.

B. Would you recommend your friends use Summon?
- Definitely – just for the sheer convenience – that's always a good thing.
- Yeah – a lot better than trying to search the library website before.

C. What is the primary benefit you received from using Summon?
- Having access to all library resources in one spot.
- Saved items – that's amazing.

D. What other sites would you compare Summon to? Why?
- Google.
- JSTOR.

E. Do you plan to use Summon again for your own work? What would you most likely use it for?
- I will use it again, probably as a way to back up a philosophical paper with empirical research.

F. If you could change or improve anything about Summon, what would you change or improve?
- The ranking of results.
- Make it easier to get the full text.
- The subject terms weren't always well-matched or relevant to me.
- The preview article window was too small and hard to read.

G. Would you be disappointed if you could no longer use Summon?
- Maybe a little.
- Oh yeah.
- Probably, because it's roundabout to use the rest of the [library] website.

Overall, undergraduate students appeared to value Summon much more than graduate students did. Undergraduates' nonverbal communication (tone, expressions) about Summon was markedly more enthusiastic than graduates', and they appeared more likely to use Summon in their future work.

USER EXPERIENCE

Information retrieval effectiveness and relevance ranking

Known-item queries

Students often need to retrieve the full text of an article based on a citation provided by a professor, or a reference in an article they've read. Summon has all the technical capabilities and interface features needed to support these known-item queries. We observed students attempting to retrieve articles based on real citations (either from their own reading, or provided by the researcher). Students' success with these tasks varied widely. Several students attempted to use extremely detailed citation information – for example, entering the entire title of an article, even if it was very long, or copying-and-pasting an entire reference. These students were generally unsuccessful, while more successful students adapted the reference information to Summon's metadata model, sometimes using the Advanced Search tool.

From an overall user experience perspective, going from either a printed or electronic citation to the full text of the article isn't easy or fluid for most students. Our impression is that motivated students would persist past the initial difficulties and track down a needed article using a combination of tools, including Summon, Google, and perhaps other databases that they have been trained to use in class (e.g., Academic Search Premier).

Success in searching for books in Summon also varied widely. While some students were able to locate specific textbooks and popular books readily, others struggled to match their understanding of a book’s title to Summon’s model. For example, some students tried to search for a specific edition of a book by entering “5th. ed.” or similar qualifiers after a book’s title, which proved ineffective. In general, Summon appeared considerably less forgiving of typos and partial matches than Google, which may have confused students accustomed to Google’s extremely loose matching.

Finding the catalog record for a book often required using Summon’s filters to eliminate book reviews and journal articles related to the book’s title. Only a minority of students were sophisticated enough to navigate this process. Specific user experience issues with the filters are discussed below.

Topical research queries

A key purpose of Summon is to provide relevant responses to queries – in particular, to have high precision within the first page or two of results [1]. It is expected that many students will compare Summon against their mental model of Web-scale search engines such as Google and Bing, which often have very high precision on their first results page.

Conducting a detailed, quantitative evaluation of Summon’s relevance ranking algorithms is beyond the scope of this study. (See Appendix C for recommendations for assessing Summon’s relevance in a more rigorous way.) However, we developed general impressions of Summon’s ranking capabilities based on observations of participants searching on a variety of topics and research questions.

[1] Precision of the entire result set is considered less important because students are unlikely to evaluate more than a few dozen result listings before changing their query or moving to a different search strategy. See http://www.oracle.com/technology/products/text/htdocs/imt_quality.htm for an explanation of "Precision at n."
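To make "Precision at n" concrete, here is a minimal sketch in Python (not part of the original study; the ranked list and relevance judgments below are invented for illustration):

def precision_at_n(ranked_ids, relevant_ids, n=10):
    # Fraction of the top-n retrieved items that were judged relevant.
    top_n = ranked_ids[:n]
    return sum(1 for doc_id in top_n if doc_id in relevant_ids) / n

# Hypothetical query run: ten document IDs, four of which a judge marked relevant.
ranked = ["d7", "d2", "d9", "d1", "d5", "d3", "d8", "d4", "d6", "d0"]
relevant = {"d2", "d1", "d5", "d4"}
print(precision_at_n(ranked, relevant, n=10))  # 0.4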


In general, we observed students experiencing considerable difficulty finding information that they considered relevant and useful to their interests. These difficulties resulted from a complex interplay between students' subject knowledge, the specificity of their research topics, and their information literacy and search skills on the one hand, and Summon's ranking and user interface on the other. In addition, most participants were inexperienced with Summon, either using it for the first time or having only limited experience with it.

With those caveats in mind, however, our overall qualitative assessment is that the current implementation of Summon is only a first step toward the vision of a “Google-like” experience for discovery of library resources.

As a case study, consider a sophomore psychology major who had a research assignment and had chosen to focus on the relationship between memory and obsessive-compulsive disorder (OCD). She had already been introduced to Summon in one of her classes, and had started using it for her research before participating in this study.

Using Summon, she searched for ocd memory. The results included some promising but highly specialized articles, mostly from one journal, Behaviour Research and Therapy. Example citation:

Hout, Marcel van den (03/01/2003). "Phenomenological validity of an OCD-memory model and the remember/know distinction". Behaviour research and therapy (0005-7967), 41 (3), p. 369.

These articles would likely be quite difficult for a sophomore to read and synthesize, and it would be challenging for her to get a general understanding of the role of memory in OCD by reading such specialized articles.

In contrast, searching Google for ocd memory returns as the second result an article from About.com, "Is OCD Associated With Memory Problems?" (http://ocd.about.com/od/whatisocd/f/OCD_Memory.htm). This article provides a clear, succinct overview of the issue, and a reference to an authoritative scholarly review article which is almost certainly more useful to a student than an array of specialized articles:

Olley, A., Malhi, G., & Sachdev, P. (2007). "Memory and executive functioning in obsessive-compulsive disorder: A selective review." Journal of Affective Disorders, 104, 15-23.

Note that Summon did not return this review article in the first three pages of results.

This analysis is not meant to imply that Google is always preferable or that Google is optimal. There are plenty of flaws in the results Google returns, including the first result, which is almost useless. The point is simply that Summon's use of standard information retrieval algorithms to search on a vast corpus of extremely diverse content, without the benefit of citation metrics or link-weighting algorithms (like Google's PageRank), often leads to results that aren't well-suited to the research needs of students.

Summon does provide a "Content Type" filter to help control the diverse result sets it returns. However, most students saw only modest improvements in their results when they filtered by Content Type. Theoretically, students would be better served by metadata such as "Audience" (e.g., undergraduate, researcher, etc.) or "Scope" (e.g., introduction, overview, synthesis of recent research, etc.), but automatically generating such metadata is still difficult.

We are particularly concerned about students who need to find a general introduction to or overview of a topic, rather than drilling down into a specialized line of research. To be sure, this problem is not unique to Summon—most students would face similar challenges using other databases such as Web of Science or Academic Search Premier.

Accessing full text

In general, students were able to use the 360 Link function to retrieve the full text of articles successfully. However, few participants were able to use 360 Link easily, without confusion, frustration, or hesitation (see additional points in "User interface," below).

Metadata and surrogates

Summon presents surrogates of results in its results listings (see screenshot). The surrogates combine metadata with a snippet of full text (when available).

Displayed metadata fields include:
1. Title
2. Author
3. Reference information (journal/publication, ISBN/ISSN, date, page)
4. Subject terms
5. Text snippet
6. Availability

The primary information retrieval issue we observed with the surrogates is the relative "opacity" of the results. In other words, the connection between the user's query and each surrogate often seemed unclear. For example, the surrogate shown above was retrieved by the query european union infringement on civil liberties. The term "liberty" is shown in bold in the surrogate, which provides a clue as to the system's matching, but it's still unclear why Summon considered this article relevant to the entire query.

Common behaviors indicating the effects of this opacity included scrolling through the entire first page of results without showing much interest in any of the surrogates, or saying explicitly, "I don't understand why this is here."

User interface

Summon’s user interface (UI) has several major components. Observation of students’ interaction with these components revealed strengths and weaknesses in several areas. The following sections summarize our observations.


Advanced search

Strengths
o Participants expressed desire to control search parameters beyond the content type and subject filters, suggesting Advanced Search functionality has value
o Some participants found Advanced Search efficient and effective for known-item retrieval

Weaknesses
o Hard to discover this function – unclear what "advanced" means
o Form layout is difficult to scan and parse
o Users expected "author" instead of "written by"
o Some users struggled to map the citation format they were working with to the metadata fields
o Some users weren't sure how much of the title or journal field to fill out
o Summon modifies the query string after the Advanced Search form is filled out, which can be confusing for novice users
o Most participants looked for the ability to specify article/title/journal in the filters area, not in Advanced Search

Filters

Overall, students clearly understood and were able to use the left-rail filters. The problem was that the metadata and filter UI, as currently implemented, weren’t especially useful to students.

Refine your search

Strengths
o Being able to limit to peer-reviewed literature maps to many students' assignments, particularly for introductory classes [2]

Weaknesses
o Long checkbox labels are effortful to scan and read, and the first word in each label is vague ("items" / "limit" / "exclude" / "add")
o Most participants ignored this area of the UI

[2] We did not control the tasks students performed to ensure they searched specifically for peer-reviewed material. Most students seemed aware of this general issue, though they often used different terminology. For example, one senior said he avoided newspaper articles because they "can be subjective."

Content type


Strengths
o Some participants found the filters appealing and engaging (e.g., "Hey, it's like eBay!")
o Near real-time filtering provides a sense of immediacy and control (dynamic manipulation)

Weaknesses
o No apparent order or structure to the list of items, making it hard to understand (participants didn't pick up on sorting the list by # of items retrieved)
o Similar-sounding content types can be confusing ("journal" vs. "journal article")
o Can be confusing to see a different set of options for each query, especially when expecting a particular option such as "book" to be present
o Participants expected to be able to filter by author/title/journal name

Subject terms

Strengths
o Participants expressed desire to control search parameters beyond the content type, suggesting control of subject or topic could have value
o Summon can return results from many different fields/disciplines, which subject terms can help to disambiguate

Weaknesses
o Somewhat overwhelming number of terms presented in a very long list
o Large variety of terms from different fields and disciplines, without structure or grouping to facilitate understanding
o Default sorting (by # of items) is difficult to scan, and participants did not use the alphabetical sort function

Date

Strengths
o Slider widget is engaging and easy to understand

Weaknesses
o We observed little interest in or need for this filter, across all participants
o Dropdown menus for choosing a specific date are opaque and hard to interpret at first


Article preview

Strengths
o Some participants liked the preview and said they would use it regularly
o Easier (cognitively and physically) than clicking the link and loading a new page

Weaknesses
o Often provides little or no additional metadata beyond the surrogate, which can be frustrating
o Hard to discover the feature – almost all participants had to be prompted to try it
o The small size of the preview pane, and the scrolling it requires, frustrated users

Save items


Strengths
o Students love the capability to save items (e.g., "awesome," "cool")
o Ability to email results fits with students' mental model and workflow
o Ability to format results in various citation formats is valuable

Weaknesses
o Confusion about persistent access to saved items – most participants expected to log in with their NCSU ID, which isn't supported
o Hard to discover; no clear affordances (such as a dedicated, always-visible pane, or the ability to drag-and-drop items)
o RefWorks integration isn't very elegant
o RefWorks tab doesn't provide cues/help to students who aren't familiar with RefWorks


Full text (360 Link) [3]

Strengths
o Participants love getting access to full text online
o Summon UI makes it easier to ensure you get full-text results (versus Google)

Weaknesses
o Risk of tab overload during exploratory research (many students opened 10+ tabs within a few minutes of starting research)
o 360 Link screen is confusing, hard to scan, and lacks visual hierarchy [4] and overall visual coherence and polish

[3] The 360 Link screen has been slightly revised since the Summon study was conducted – the sizes of the PDF icon and article link have been increased (based on feedback gathered during the study).
[4] See http://52weeksofux.com/post/443828775/visual-hierarchy for a brief discussion of visual hierarchy. In particular, this article states that "the best visual hierarchies lead users to take the action confidently," whereas we observed users of the 360 Link screen acting tentatively and lacking confidence.


INTEGRATION OF SUMMON WITH LIBRARY WEBSITE REDESIGN

A further goal of this research was to identify directions for integrating Summon with the new library website design.

In general, undergraduate students preferred the experience of using Summon to the experience of using the current library website and specific databases. This finding suggests that Summon should be given prominence and made easily accessible to undergraduates. However, this finding is strongly qualified by the numerous user experience issues (including both information retrieval and user interface issues) observed during the usability testing.

Given the frequent interaction problems we observed, we recommend that Summon be placed in context, with design elements created to help students understand Summon before they attempt to use it in a "Google-like" way and encounter problems. This approach entails sensitivity to how Summon is described on the website, as well as exploring the use of tours or introductions (described in Next Steps, below).

Many graduate students will likely prefer more powerful subject-specific databases and tools (e.g., ACM Digital Library, Web of Science, PubMed, ERIC) to Summon. They should not be encouraged too overtly to use Summon, as it may not be the tool of choice for many disciplines (in contrast, graduate students with highly interdisciplinary research questions may find that Summon expands the breadth of material they can access).

Considering these issues, one approach would be to offer different entry points to Summon – for early undergraduates, advanced undergraduates, and graduate students. This audience-specific approach might or might not influence the new home page design, but could be integrated on subsidiary pages as needed. Regardless, wherever Summon is presented, it should not be implied that it has the same flexibility of use, and high first-page precision, as Google.

QuickSearch

Some participants briefly used the current implementation of QuickSearch. These participants preferred Summon to QuickSearch overall, with most comments referring to QuickSearch's "layout." Participants found the current layout somewhat confusing and unappealing. This suggests that QuickSearch could be a more effective component of the library website if it were redesigned with a simpler layout and a more modern visual treatment.


NEXT STEPS

In a typical user-centered design process, the goal of collecting usability feedback (summarized above) is to inform future iterations of the user interface. Since the Libraries don’t directly control the Summon UI, options for improving the user experience include:

Design
o Collaborate with Serials Solutions to improve the current UI based on the ideas developed during this research.
o Explore the process and resources that would be necessary to build a custom UI on top of the Summon API.
o Explore systematic relevance evaluation (see Appendix C) to develop a deeper understanding of how Summon's relevance ranking works, and what search strategies are needed to use it effectively.
o Consider adding an integrated feedback mechanism such as UserVoice [http://uservoice.com/], to gather additional user comments over time.

Help
o Develop a lightweight "tour" that introduces new users to key aspects of the Summon UI – this could help address some of its discoverability issues. See AmberJack (http://amberjack2.org/) for an example of a tool to support this type of tour. Facebook and Twitter also provide examples of custom-designed introductory tours for new users.
o Develop an introductory tutorial (screenshot walkthrough; video; slidecast) that provides an overview of how Summon works, gives examples of effective search techniques, and demonstrates the key UI features. See http://manybills.researchlabs.ibm.com/help/tour.html for an example of a quick, slidecast-style tour.
o Provide stronger calls to action for help and reference assistance within the UI (the 'Help' link in the top right is fairly minimal).

Instruction
o Incorporate training on Summon searching and its UI into library instruction.
o Reach out to instructors of key classes (English 101, Psychology 101, etc.) to ensure they are familiar with Summon's capabilities and can help their students use it.


APPENDIX A – RECRUITING SCREENER

1. What name do you prefer to be called?

2. What is your gender? ___ Female ___ Male

3. Age group?
   A. Younger than 18
   B. 18-25
   C. 26-39
   D. 40-59
   E. 60-74
   F. 75 or older

4. Can you confirm your status on campus?

A. Undergraduate | Year? 1st 2nd 3rd 4th 5th
B. Graduate Student | Are you also an instructor or research assistant? Y / N
C. Staff / Research Assistant / Grad
D. Faculty

5. What is your department or program of study?

6. About how much time per week do you spend on the NCSU Libraries Web site?
   A. Less than 1 hour a week
   B. 1-3 hours a week
   C. 4-10 hours a week
   D. More than 10 hours a week

7. About how much time per week do you spend doing anything online?
   A. Less than 1 hour a week
   B. 1-3 hours a week
   C. 4-10 hours a week
   D. More than 10 hours a week


8. About how much time per week do you spend in any of the NCSU campus libraries?

A. Less than 1 hour a week
B. 1-3 hours a week
C. 4-10 hours a week
D. More than 10 hours a week

9. What would you say is your main purpose for using the library?

10. Are you available next week to participate?

11. What are the best times to call you back to schedule an interview?

12. Can you give me a general idea of what times would work best for you to meet with us?


APPENDIX B – USER RESEARCH GUIDE

1. Introduction and framing [0:02]

A. I'm Abe Crystal, a user experience researcher working with the library. Thanks for coming in today! We'll spend a few minutes talking about your experiences using the library, and then we'll spend approximately 45 minutes going through several activities that will allow me to see your use of the library's online tools.

B. I’d like to emphasize that I’m not the Web designer for the library. I am a neutral researcher, so nothing you say today will hurt my feelings. You can be completely open and candid with me. This is about me learning from you so I can help the library better serve students.

C. We are working to understand how the library can build its next-generation Web site and need to understand how you rely on the Web to achieve your goals. This plays a key part in helping us think about how to better structure the website to meet your needs. To that end, we'll be recording today's session.

D. Please stop me any time if you have questions.

2. Warm-up and background [0:03]

A. Could you tell me a bit about yourself…
   i. Where are you from, what attracted you to NCSU?
   ii. Class (freshman…)
   iii. If upperclassman, what's your major?
      a. How did you choose this?
      b. What attracted you to this subject?

iv. Do you have any particular aspirations for career or graduate school?


3. Guided tour of your library experience [0:10]

A. Now, I'd appreciate it if you could help me familiarize myself with how you use the library's website.

   i. Can you go to the library's website?
   ii. How familiar are you with this site? Would you say you use it almost every day, once a week, a couple of times a semester, or…?
   iii. Could you show me some of the things you normally do on the site?
   iv. Academics
      a. Can you walk me through the classes you're taking currently? Review syllabi online.
      b. For each class… how have you used the library for this class? Prompt for use of:
         - Catalog
         - Journal articles
         - Databases
         What other tools have you used for research in this class? (It's OK to say Google, Wikipedia, etc. – I'm not evaluating you!)
   v. Anything else you'd like to discuss?

4. Usability test [0:40]

A. Standard framing

i. You’re not being tested.

ii. Please think-aloud.

iii. Be open and candid.

B. Exploration and learnability

i. Go to http://www.lib.ncsu.edu/summon/

ii. First impressions?

iii. What do you think you’d use this for?

C. Known item – catalog

i. Find a textbook for a course you’re taking

a. First impressions of main Summon UI

ii. Who’s an author you like? Find a book by him/her.

a. Or: find The Tipping Point


D. Known item – journal

i. What’s a journal you’ve cited in your writing? Find the latest issue.

ii. Find the latest issue of the 'Journal of the American Chemical Society' (either electronic or print copy).

E. Class research

i. Prompt to try to find info for one of the classes we discussed

a. Where are you in your research process for this class?

b. What would you do next on Summon as you try to find info for your class/research?

[can prompt for use of different features]

c. Citation lookup for an article cited in her class or in her previous research

ii. Fallback or complement -- Prompt to try to find info based on one of a set of standard research questions/tasks

a. Citation lookup

Properly formatted citation – can draw from a list from different disciplines:

- Political science

- Shefter, Martin. "Institutional Conflict over Presidential Appointments: The Case of Clarence Thomas." PS: Political Science & Politics 25.4 (1992): 676-79. Print.

- English

- Hodder, Alan D. "In the Glass of God's Word: Hooker's Pulpit Rhetoric and the Theater of Conversion." New England Quarterly 66, no. 1 (1993): 67-109. (In JSTOR, but not Summon)?

- Dale T Adams. "Twice Convicted, Once Executed: A Literary Naturalist's Interpretation of Richard Brooks's Film in Cold Blood." Literature/Film Quarterly 37.4 (2009): 246. Print.


- Psychology

- Devine, P. G., & Sherman, S. J. (1992). Intuitive versus rational judgment and the role of stereotyping in the human condition: Kirk or Spock? Psychological Inquiry, 3(2), 153-159.

- Physics

- K.A. Nelson, R.J. Dwayne Miller, D.R. Lutz, and M.D. Fayer, "Optical generation of tunable ultrasonic waves," Journal of Applied Physics, vol. 53, no. 2, Feb., pp. 1144-1149.

- Environment/agriculture

- Humberto Blanco-Canqui, & R Lal. (2009). Crop residue removal impacts on soil productivity and environmental quality. Critical Reviews in Plant Sciences, 28(3), 139.

- CivE

- Chiu, H.-S., Chern, J.-C., and Chang, K.-C. (1996a) "Long-term deflection control in cantilever prestressed concrete bridges. I: Control method and algorithm." J. Engrg. Mech., ASCE, 122(6), 489-494.

- CS

- Zhu, Y., Ye, S., and Li, X. 2005. Distributed PageRank computation based on iterative aggregation-disaggregation methods. In Proceedings of the 14th ACM international Conference on information and Knowledge Management (Bremen, Germany, October 31 - November 05, 2005). CIKM '05. ACM, New York, NY, 578-585. DOI= http://doi.acm.org/10.1145/1099554.1099705

b. Simple research – draw from a pool of approaches, such as:

Find an article about global warming in a 'scholarly' journal. Start with something simple like 'global warming' and maybe then proceed to 'relationship between global warming and hurricanes'. Walk through how they go about finding a scholarly journal.

Something about the planet Saturn (as opposed to cars) - do they use the facets vs. changing the keyword?

Find a newspaper article from the New York Times (or about a topic) in the year/month you were born (this might test the pubdate facet, newspaper article facet).

Find three of the most recent papers written by an author (a professor might ask students to do this).

find the most recent articles on a given topic (sometimes hard to find the most recent in Summon due to relevance ranking)

- Is genetically modified "golden rice" safe?

- What is the effect of advertising on adolescents?

- What do America's bridges need re: repair and improvement?

Or, find an author who’s written a lot on a topic

F. Comparison

i. Starting on library home vs. QS home vs. Summon home

ii. Search on Summon directly vs. QuickSearch

5. Post-test interview [0:05]

A. How would you describe Summon to your friends?

B. Would you recommend your friends use Summon?

C. What is the primary benefit you received from using Summon?

D. What other sites would you compare Summon to? Why?

E. Do you plan to use Summon again for your own work? What would you most likely use it for?

F. If you could change or improve anything about Summon, what would you change or improve?

G. Would you be disappointed if you could no longer use Summon?

6. Explain next steps [0:01]

Thank you for your time and your valuable insights!

We’ll be using your feedback and ideas to improve the next-generation website.


Please don’t hesitate to get in touch with me if you have any questions.

[Present participant with cash].


CHECKLIST OF ISSUES TO OBSERVE

Feature usage [can prompt to use if neglected]
o Facets
   - Refine your search
   - Content type
   - Subject terms
   - Publication date
o Advanced search
o Preview on hover
o Preview on icon
o Save this item
o Reference list and different citation formats

Does the student…
o End up with a peer-reviewed article?
o Look at the abstract?
o What criteria do they use to decide if an article/resource is likely to be relevant/useful?
o Get to full text?
o Use the facets?
   - Are they confused when facets disappear with a new search?

APPENDIX C – INFORMATION RETRIEVAL SYSTEM RELEVANCE EVALUATION

SUGGESTED METHOD FOR RELEVANCE EVALUATION

1. Develop a set of test documents and topic statements

Topical retrieval
We could develop topic statements by talking with reference librarians, students, and faculty, by reviewing class syllabi, and by reviewing logs from the current catalog.

Very rough draft example topic statements for NCSU Libraries...

- What are the economic benefits of recycling tires?
- What are the environmental consequences of the use of phosphates in laundry detergent?
- How was Picasso's early work received in his own era?
- Who were some of the key inventors in the development of semiconductors and microcomputers?

Known-item retrieval
An extension of this approach is to evaluate retrieval of known items, by using specific resource names (e.g., book title, article title, call number) rather than topics.

2. Develop a sample of documents to be judged

Use the systems to be evaluated (e.g., NCSU Catalog and/or QuickSearch; EBSCO; Summon) to search for each topic statement and download the results.

Pool the results from all systems to create a single set for relevance judging (typically, we take the top 50 or 100 results from each system).

3. Judge the relevance of the documents retrieved for each topic

Multiple judges (could be librarians, faculty, and any other relevant stakeholders) independently grade the relevance of each result in the pool for each topic. [This process would require the largest time commitment from NCSU staff].

4. Evaluate the retrieval runs using standard measures

With all the relevance judgments for all of the topics in hand, we can calculate measures of topical retrieval effectiveness, such as:

- mean average precision
- precision at ten retrieved documents

Or for known-item retrieval effectiveness, we can calculate...

- mean reciprocal rank
- success rate (% of queries for which the correct item appears in the first 10 results)
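As a minimal sketch of steps 2-4 (not part of the original report; the systems, document IDs, runs, and judgments below are invented), the following Python snippet pools results for judging, computes per-topic average precision (averaging this over all topics gives mean average precision), and computes mean reciprocal rank and the known-item success rate. Precision at ten can be computed as in the footnote [1] sketch earlier in this report.

def pool_results(runs, depth=50):
    # Step 2: pool the top-`depth` results from each system into one set for relevance judging.
    pool = set()
    for ranked_ids in runs.values():
        pool.update(ranked_ids[:depth])
    return pool

def average_precision(ranked_ids, relevant_ids):
    # Sum of precision values at each rank where a relevant item appears,
    # divided by the total number of relevant items (retrieved or not).
    hits, precision_sum = 0, 0.0
    for rank, doc_id in enumerate(ranked_ids, start=1):
        if doc_id in relevant_ids:
            hits += 1
            precision_sum += hits / rank
    return precision_sum / len(relevant_ids) if relevant_ids else 0.0

def reciprocal_rank(ranked_ids, correct_id):
    # For known-item retrieval: 1/rank of the correct item, or 0 if it was not retrieved.
    for rank, doc_id in enumerate(ranked_ids, start=1):
        if doc_id == correct_id:
            return 1.0 / rank
    return 0.0

# Invented topical runs from three hypothetical systems, pooled for judging (step 2).
runs = {"Summon": ["a", "b", "c", "d"], "QuickSearch": ["b", "e"], "EBSCO": ["f", "a"]}
print("Pool to judge:", pool_results(runs))

# Invented relevance judgments (step 3) and a topical measure for one run (step 4).
summon_run = ["a", "b", "c", "d"]
judged_relevant = {"a", "d", "z"}          # "z" was judged relevant but never retrieved by this run
print("AP:", average_precision(summon_run, judged_relevant))

# Known-item measures over two invented queries: MRR and success rate in the top 10.
known_item_runs = {"query1": (["x", "k", "m"], "k"), "query2": (["p", "q"], "z")}
rrs = [reciprocal_rank(ids, target) for ids, target in known_item_runs.values()]
print("MRR:", sum(rrs) / len(rrs))
print("Success rate:", sum(1 for ids, t in known_item_runs.values() if t in ids[:10]) / len(known_item_runs))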

5. Compare the test systems


With measures for each system completed, we can assess their relative effectiveness using simple comparisons, as well as measures of statistical reliability and effect size.