BENCHMARKING USABILITY PERFORMANCE
Jennifer Romano Bergstrom, Ph.D., UX Research Leader, Fors Marsh Group
George Mason University, Dec 9, 2014
WHAT IS USER EXPERIENCE?
Usability + emotions and perceptions = UX
Usability = “the extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency, and satisfaction in a specified context of use.” ISO 9241-11
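The ISO 9241-11 definition names three measurable components: effectiveness, efficiency, and satisfaction. As a minimal sketch of how a benchmark study might quantify each (the session records and the 1–7 satisfaction scale are made-up assumptions, not from the slides):

```python
# Hypothetical session records: each dict is one participant's attempt at a task.
sessions = [
    {"completed": True,  "seconds": 48,  "satisfaction": 6},
    {"completed": True,  "seconds": 75,  "satisfaction": 5},
    {"completed": False, "seconds": 120, "satisfaction": 3},
    {"completed": True,  "seconds": 52,  "satisfaction": 7},
]

# Effectiveness: share of participants who completed the task.
effectiveness = sum(s["completed"] for s in sessions) / len(sessions)

# Efficiency: mean time on task, computed over successful attempts only.
success_times = [s["seconds"] for s in sessions if s["completed"]]
efficiency = sum(success_times) / len(success_times)

# Satisfaction: mean post-task rating on a (hypothetical) 1-7 scale.
satisfaction = sum(s["satisfaction"] for s in sessions) / len(sessions)

print(effectiveness, round(efficiency, 1), satisfaction)  # 0.75 58.3 5.25
```

Recording the same three numbers per task in every benchmark round is what makes later iterations comparable.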
USABILITY & USER EXPERIENCE
useful
valuable
desirable
accessible
trustworthy
engaging
usable
The 5 Es to Understanding Users (W. Quesenbery): http://www.wqusability.com/articles/getting-started.html
WHEN TO TEST
Benchmark
WHY TEST
WHY BENCHMARK?
‣ Provide a framework of current website performance
‣ Compare metrics in future testing
WHY DO IT?
‣ Ensure you’re solving a problem that exists
‣ Ensure you’re building a product that is tailored to its audience
‣ Ensure that your product solution aligns with behaviors
WHERE TO TEST
LABORATORY
• Controlled environment
• All participants have the same experience
• Record and communicate from control room
• Observers watch from control room and provide additional probes (via moderator) in real time
• Incorporate physiological measures (e.g., eye tracking, EDA)
• No travel costs

REMOTE
• Participants in their natural environments (e.g., home, work)
• Use video chat (moderated sessions) or online programs (unmoderated)
• Conduct many sessions quickly
• Recruit participants in many locations (e.g., states, countries)

IN THE FIELD
• Participants tend to be more comfortable in their natural environments
• Recruit hard-to-reach populations (e.g., children, doctors)
• Moderator travels to various locations
• Bring equipment (e.g., eye tracker)
• Natural observations
HOW TO TEST
ONE-ON-ONE SESSIONS
• In-depth feedback from each participant
• No group think
• Can allow participants to take their own route and explore freely
• No interference
• Remote in participant’s environment
• Flexible scheduling
• Qualitative and quantitative

FOCUS GROUPS
• Participants may be more comfortable with others
• Interview many people quickly
• Opinions collide
• Peer review
• Qualitative

SURVEYS
• Representative
• Large sample sizes
• Collect a lot of data quickly
• No interviewer bias
• No scheduling sessions
• Quantitative analysis
WHAT TO MEASURE
Benchmark
EXAMPLE IN-LAB ONE-ON-ONE METHODS

Example Methodology

Participants:
• N = 74 | Average Age = 37
• Mix of gender, ethnicity, income
• Random assignment to diary condition: New, Old, Prototype, Bilingual

Usability Testing session:
• Participants read a description of the study.
• The moderator gave instructions and calibrated the eye tracker.
• Participants completed Steps 1-5 in the diary at their own pace.
• End-of-session satisfaction questionnaire
• Debriefing interview

[Photos: eye tracker; control room. Moderators worked from another room.]

Slide from: Walton, L., Romano Bergstrom, J., Hawkins, D., & Pierce, C. (2014). User Experience and Eye-Tracking Study: Paper Diary Design Decisions. Paper presentation at the American Association for Public Opinion Research (AAPOR) Conference, Anaheim, CA, May 2014.
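The methodology above randomly assigns participants to one of four diary conditions. A minimal sketch of balanced random assignment (the participant IDs and the group size of 24 are assumptions for illustration; the study itself had N = 74):

```python
import random
from collections import Counter

# The four diary conditions named on the slide.
conditions = ["New", "Old", "Prototype", "Bilingual"]
participants = [f"P{i:02d}" for i in range(1, 25)]  # 24 made-up recruits

# Build one condition slot per participant, then shuffle so the order
# of arrival does not determine the condition a person lands in.
slots = (conditions * (len(participants) // len(conditions) + 1))[:len(participants)]
random.shuffle(slots)
assignment = dict(zip(participants, slots))

# With 24 participants, each condition ends up with exactly 6.
print(Counter(assignment.values()))
```

Pre-building the slot list (rather than rolling a random condition per person) keeps the groups equal-sized, which simplifies later comparisons between conditions.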
EXAMPLE IN-LAB ONE-ON-ONE METHODS
No Think Aloud in Benchmark studies: we want a pure measure of performance.
PREPARATION
CREATE TASKS
‣ What are the most important things users should be able to do on this site?
  ‣ Most frequent
  ‣ Most important (e.g., registration)
‣ Tasks should be clear and unambiguous and in the user’s language (no jargon).
‣ Don’t prompt the solution.
TASK SCENARIO EXAMPLE
‣ “You want to book a romantic holiday for you and your partner for Valentine’s day. How would you do that?”
‣ “Use this site to…” is even better. It is a task. You can measure behavior.
‣ NOT: Go to the home page of romanticholidays.com and click “sign up now” then click “Valentine’s day.”
THINGS TO AVOID
‣ Asking participants to predict the future
  ‣ Asking if a participant would use something like X or might enjoy X feature is not productive.
  ‣ Instead, ask about current behavior (“Do you currently do X?”) or show them something and observe how they interact with it.
THINGS TO AVOID
‣ Leading people
  ‣ Let them make their own mistakes; that is valuable.
  ‣ If you give the answers, you’ll never learn what you need to learn.
‣ AVOID:
  ‣ Telling people what to do or explaining how it works
  ‣ “Is there anywhere else you would click?”
  ‣ “Go ahead and click on that…”
THINGS TO AVOID
‣ Bias
  ‣ Try to remain neutral, even if the person is really funny or mean.
  ‣ Use open-ended questions to understand perceptions.
‣ AVOID:
  ‣ Testing friends
  ‣ Acting differently with different participants
  ‣ “Did you like it?”
  ‣ “Interesting.”
  ‣ “Now we are going to work with this awesome page.”
THINGS TO AVOID
‣ Interrupting
  ‣ You don’t want to interfere with what participants would normally do on their own.
  ‣ Wait until the end to ask follow-up questions.
‣ AVOID:
  ‣ Probing mid-task
  ‣ “Why?”
THINGS TO AVOID
‣ Explaining the purpose
  ‣ Your job is to pull as much information as possible; your job is not to explain how it works.
  ‣ “What do you think it is for?”
  ‣ “What would you do if I was not here?”
‣ AVOID:
  ‣ Explaining how to find information
  ‣ Explaining the purpose of the product
ANALYZING RESULTS
USABILITY & UX TESTING
COMPARE TO GOALS
‣ It is a good idea to set goals (e.g., 90% of participants should be able to register in less than one minute).
‣ Keep results simple so people will use them and appreciate them.
‣ Compare performance to goals.
‣ In future iterations, compare performance to benchmark.
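A goal like the registration example above can be checked mechanically once the timing data is in. A minimal sketch (the timing values are made up; only the 90%-in-under-one-minute goal comes from the slide):

```python
# Made-up registration times, in seconds, one per participant.
register_times = [42, 55, 38, 61, 47, 50, 58, 45, 90, 52]

# Goal from the slide: 90% of participants register in under one minute.
goal_share, goal_seconds = 0.90, 60

# Share of participants who met the time threshold.
within_goal = sum(t < goal_seconds for t in register_times) / len(register_times)

print(f"{within_goal:.0%} registered in under {goal_seconds}s "
      f"(goal: {goal_share:.0%}) -> {'PASS' if within_goal >= goal_share else 'FAIL'}")
```

With these made-up numbers, 8 of 10 participants (80%) beat the one-minute mark, so the benchmark falls short of the 90% goal; that gap is exactly what future iterations would be measured against.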
OUTPUTS
‣ Notes, data, video/audio recordings
‣ Usability labs will create full reports (doc or PPT).
‣ Unmoderated tests may provide data reports and recorded sessions.
‣ When writing research notes, remember to:
  ‣ Report good and bad findings
  ‣ Stick to what you observed in the test
‣ Use the data!
BENCHMARKING USABILITY PERFORMANCE
THANK YOU!
Jennifer Romano Bergstrom, Ph.D.
Fors Marsh Group
[email protected]
@romanocog
Links to more info:
‣ EdUI slides (see other slides on Slideshare too)
‣ Eye Tracking in UX Design