
Design and Usability Center

175 Forest Street, Waltham, MA 02452 Tel: 1-800-523-2354

Beyond the Usability Lab: Exploring Large-Scale User Experience Research

Bill Albert, Director, Design and Usability Center

walbert@bentley.edu

Lena Dmitrieva, Senior Consultant, Design and Usability Center

ldmitrieva@bentley.edu


Acknowledgments

• Thanks to Tom Tullis and Donna Tedesco, Fidelity Investments

• Based on the upcoming book: “Beyond the Usability Lab: Conducting Large-Scale User Experience Studies” (Morgan Kaufmann/Elsevier, January 2010)


• Founded in 1999

• Provides consulting services to corporate clients

• Works with the Master’s program in Human Factors in Information Design

• Consultants

Bill Albert, PhD – Director Bill Gribbons, PhD – Sr. Consultant

Rich Buttiglieri – Usability Consultant Lena Dmitrieva – Usability Consultant

Kris Engdahl – Usability Consultant Chris Hass – Usability Consultant

9 Research Associates

About the Center


Our Technical Capabilities

• Two usability labs

• Live streaming of usability sessions

• Eye tracking

• Remote testing capability

• Equipment for testing mobile and medical devices


What We Offer

Understanding customer needs

• User analysis – personas

• Online surveys

• Research outside the lab

• Focus groups

Testing, evaluation, and design

• Quantitative and qualitative research

• Research conducted in Russian

• Expert review of interfaces

• Usability testing: in the lab, remote, and unmoderated

• Information architecture

• Accessibility of information and communication technologies

User Experience training

• Corporate training

• Intensive training courses


1. Context of UX research

2. What is large-scale UX research?

3. Strengths, limitations and myths

4. Complements with other UX research methods

5. How to conduct a study (planning, design, pilot/launch, and analysis)

6. Keys to success

Agenda

Beyond the Usability Lab: Context of UX Research


The Goal of UX Research

• Making smart and quick decisions

• Informing design and strategy

• Sparking innovation

• Methods don’t matter


I am a big fan of traditional usability


Missing Piece in our Toolkit


Simplified Toolkit


Big Picture

Beyond the Usability Lab: What is Large-Scale UX Research?


What do you call this?

Unmoderated Usability

Online Usability

Self-Guided Usability

Remote Usability

Automated Usability

Asynchronous Usability

Large-Scale Usability


5 basic ingredients

1. Interactive system (usually a website)

2. A lot of participants (n>50 per segment)

3. An online tool to moderate study and collect data

4. Tasks (not just attitudes/opinions)

5. Capturing data about their experience


Welcome Page


Screener Questions


Task Instruction


Dialog


Post Task Questions


Follow-up Questions


Open-ended Questions


Types of Studies

• Comprehensive evaluation

• UX benchmark

• Competitive evaluation

• Live site vs. prototype comparison

• Feature/function test

• Discovery


Types of Data

• Performance
  • Success, time, efficiency
• Self-reported
  • Task ratings, SUS, expectations, verbatims, etc.
• Clickstream
  • Pages, times, clicks, data entry, abandonment, etc.


Outcomes

• What are the usability issues, and how big?

• Which design is better, and by how much?

• How do different customer segments differ?

• What is the user experience like?

Beyond the Usability Lab: Strengths, Limitations & Myths


Strengths

1. Comparing products

2. Measuring user experience

3. Finding the right participants

4. Focusing on design improvements

5. Insight into the user’s real experience


Limitations

1. Not well suited to rapid, iterative design

2. Need a deep understanding of issues

3. Studies that require long sessions

4. Lose control over prototypes

5. Internet access


Myths

1. Only test with websites

2. It is very expensive

3. Only gather quantitative data

4. A lot of noise in the data

5. Same as any market research study

Beyond the Usability Lab: Complements to Other UX Methods


• Use a lab test to identify issues, iterate, and then run an online study to validate

• After an online study, use the lab to do a deep dive on issues

Traditional Lab Testing


• Use focus groups to generate design concepts, then use an online study to validate

Focus Groups


• Conduct an expert review to identify the most significant issues, then test online

Expert Reviews


Web Traffic Analysis

• Web traffic analysis can be used to identify areas of concern, followed by an online test as a deep dive into the problems

• Web traffic analysis can also follow an online test to validate design changes

Beyond the Usability Lab: Planning a Large-Scale UX Study


Setting Goals

• Compare prototype designs

• Benchmark

• Identify issues and prioritize

• Validate new design against current design

• Compare specific design features/functions


Budget

• Recruiting costs
  • $2 - $20 per complete (panel)
  • No cost if using an internal list
• Incentives
  • $5 - $10 per complete
  • Drawing to win
  • No incentive
• Technology
  • Annual license or per-study fee for a commercial service
  • Discount approach
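Taken together, these line items make a quick budget estimate easy to script. A minimal sketch, using mid-range values from the bands above; the sample size and zero tool fee are illustrative assumptions:

```python
def study_cost(n_completes, recruit_per=10.0, incentive_per=7.5, tool_fee=0.0):
    """Rough per-study budget: recruiting + incentives + tool licensing.
    Default rates sit mid-range in the $2-$20 recruiting and $5-$10
    incentive bands from the slide; tool_fee covers a per-study license."""
    return n_completes * (recruit_per + incentive_per) + tool_fee

# 200 completes from a panel, no separate tool fee
print(study_cost(200))  # 3500.0
```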


Timelines

Example: one-week study

ID | Task Name | Start | Finish | Duration
1 | Plan goals and structure of study | 01/04/2010 | 01/06/2010 | 3d
2 | Build prototypes (2 wizard flows, with slight difference) | 01/05/2010 | 01/11/2010 | 5d
3 | Build tasks and questions into existing online study tool | 01/07/2010 | 01/11/2010 | 3d
4 | Pilot and test tool (2 iterations) | 01/11/2010 | 01/13/2010 | 3d
5 | Launch study | 01/14/2010 | 01/15/2010 | 2d
6 | Analysis | 01/18/2010 | 01/19/2010 | 2d
7 | Present results | 01/20/2010 | 01/20/2010 | 1d

Example: one-day study (Mon Jan 4)

ID | Task Name | Start | Finish | Duration
1 | Point online study tool to prototypes and create tasks and questions | 01/04/2010 | 01/04/2010 | 1h
2 | Pilot with team members and iterate | 01/04/2010 | 01/04/2010 | 1h
3 | Launch study internally | 01/04/2010 | 01/04/2010 | 2h
4 | Analyze and report back results | 01/04/2010 | 01/04/2010 | 2h


Commercial Services


Discount Approach

Beyond the Usability Lab: Designing a Study


Introducing the Study

• Purpose

• Sponsor/contact information

• Time estimate

• Incentive

• Technical requirements

• Legal stuff

• Instructions


Screeners

• Screener questions focus study on target users

• Most common screens are based on product experience and demographics

• Screening questions need to be grouped together so they are hard to guess

• Be careful about fraudulent participants
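Grouping target criteria with distractors so the “right” answer is hard to guess reduces to simple predicate logic. A hypothetical sketch; the field names and qualification criteria here are invented for illustration:

```python
def passes_screener(answers):
    """Qualify only participants who report using the target product
    among distractor options and fall in the target experience range.
    Criteria are hypothetical, not from the presentation."""
    owns_target = "online brokerage account" in answers["products_used"]
    experienced = 1 <= answers["years_experience"] <= 10
    return owns_target and experienced

candidate = {
    "products_used": ["savings account", "online brokerage account"],
    "years_experience": 3,
}
print(passes_screener(candidate))  # True
```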


Starter Questions

• Customer segmentation

• Product experience

• Expectations

• Avoid sensitive questions

• Conditional logic


Tasks

• Open-ended (discovery)
  • “Learn about how carbon offsets work”
• Close-ended (measuring success)
  • “What is the ISBN of the book ‘Measuring the User Experience’?”
• Self-generated (discovery)
  • “What would you like to learn on this website?”
• Self-selected
  • “Choose the 5 tasks that are most important to you”


Mental Cheaters

• About 10% of participants go through the study as fast as possible to get the incentive

• Speed traps

• Consistency checks
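A speed trap can be as simple as flagging participants whose total session time falls far below the group median. A sketch; the one-third-of-median cutoff is an illustrative assumption, not a rule from the presentation:

```python
from statistics import median

def flag_speeders(times_by_participant, fraction=1/3):
    """Flag participants whose total session time is under a fraction
    of the median session time (a common speed-trap heuristic)."""
    totals = {pid: sum(times) for pid, times in times_by_participant.items()}
    cutoff = median(totals.values()) * fraction
    return {pid for pid, total in totals.items() if total < cutoff}

sessions = {
    "P101": [30, 45, 60],   # plausible task times (seconds)
    "P102": [28, 50, 55],
    "P103": [2, 3, 4],      # suspiciously fast: likely a mental cheater
}
print(flag_speeders(sessions))  # {'P103'}
```

Consistency checks work similarly: ask the same question twice in different wording and flag contradictory answers.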


Post-Task Questions

• Ease of use

• Confidence

• Open-ended

• Proxy measures


Post Session Questions

• Overall assessment (SUS)

• Demographics

• Open-ended (focus on specifics)
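The SUS mentioned above has a standard published scoring rule (Brooke’s System Usability Scale): odd items contribute response minus 1, even items contribute 5 minus response, and the sum is scaled by 2.5 to a 0-100 range. A minimal sketch:

```python
def sus_score(responses):
    """Score a 10-item SUS questionnaire (1-5 responses) on a 0-100 scale."""
    if len(responses) != 10:
        raise ValueError("SUS has exactly 10 items")
    total = 0
    for i, r in enumerate(responses, start=1):
        # odd items are positively worded, even items negatively worded
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

print(sus_score([4, 2, 5, 1, 4, 2, 5, 1, 4, 2]))  # 85.0
```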

Beyond the Usability Lab: Piloting and Launching


Piloting and Launching

• Technical checks
• Usability checks
• Full pilot with data checks
• Timing the launch
  • Singular launch
  • Phased launches
• Monitoring results

Beyond the Usability Lab: Data Analysis


Data Cleanup

• Removing participants
  • Incomplete data
  • Fraudulent participants
  • Mental cheaters
• Removing or modifying data for individual tasks
  • Outliers
  • Contradictory responses
• Removing a task for all participants
• Modifying task success
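The outlier step can be made concrete with a standard-deviation rule. A sketch, assuming per-task times in seconds; the 2-SD cutoff is a common convention, not one prescribed by the presentation:

```python
from statistics import mean, stdev

def drop_time_outliers(task_times, k=2):
    """Remove task times more than k standard deviations from the mean."""
    m, s = mean(task_times), stdev(task_times)
    return [t for t in task_times if abs(t - m) <= k * s]

times = [22, 25, 19, 28, 24, 300]  # 300 s: participant likely walked away
print(drop_time_outliers(times))  # [22, 25, 19, 28, 24]
```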


Data Analysis

• Task performance

• Self-reported

• Clickstream

• Combined metrics


Task Performance

• Task success

• Task time

• Efficiency (tasks per minute)

• Learnability (efficiency over time)
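Efficiency as defined here (correct tasks per minute) is straightforward to compute from per-task success flags and times. A sketch with invented data:

```python
def efficiency(successes, times_sec):
    """Correct tasks completed per minute of total task time."""
    total_minutes = sum(times_sec) / 60
    return sum(successes) / total_minutes

# one participant: 4 of 5 tasks correct, times in seconds
done = [1, 1, 0, 1, 1]
secs = [30, 45, 90, 25, 50]
print(round(efficiency(done, secs), 2))  # 1.0
```

Tracking this metric across successive tasks or sessions gives the learnability curve named above.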

[Chart: Number of correct tasks per minute for the Static TOC, Expand-Collapse TOC, and Auto-collapse TOC conditions; error bars represent the 90% confidence interval]


Self-Reported Data

• Rating scales
  • Likert scale (disagree – agree)
  • Semantic differential (easy – difficult)
• Open-ended questions

• Based on conditional logic

[Chart: Before and After Task Ease Ratings, a scatter plot of before vs. after ratings for tasks T1-T5, with quadrants labeled Fix It Fast, Big Opportunity, Promote It, and Don't Touch It]

[Chart: Task Ease Ratings (1-5, higher = easier) for Site A vs. Site B on four tasks (Ferret Food, Store Location, Gift Card Shipping, Pine Cat Litter); error bars represent the 90% confidence interval]


Clickstream Analysis

• Percentage of participants who used specific paths

• Intended paths, acceptable paths, unacceptable paths

• Abandonment (rates and pages)

• Total page visits (compared to minimum)

• Lostness metric

[Chart: number of pages visited beyond the minimum for Tasks 1-4]
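The lostness metric referenced here is typically Smith's measure, also described in Tullis and Albert's Measuring the User Experience: with N unique pages visited, S total page visits, and R the minimum pages required, lostness is sqrt((N/S - 1)^2 + (R/N - 1)^2). A sketch:

```python
from math import sqrt

def lostness(unique_pages, total_pages, minimum_pages):
    """Smith's lostness: 0 = perfectly direct path; values near 1 = very lost."""
    n, s, r = unique_pages, total_pages, minimum_pages
    return sqrt((n / s - 1) ** 2 + (r / n - 1) ** 2)

# direct path: visited exactly the 4 required pages, once each
print(round(lostness(4, 4, 4), 2))    # 0.0
# wandering: 10 unique pages, 16 visits, only 4 were needed
print(round(lostness(10, 16, 4), 2))  # 0.71
```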


Combined Metrics

• Convert each metric to a percentage, then average all percentages

• Z-score transformation

• Jeff Sauro’s SUM method

Participant | Condition | Mean Task Ease Rating | Percent Correct | Mean Task Time (sec) | Time % | Task Rating % | Combined Task %
P382 | Static TOC | 4.0 | 100% | 20 | 50% | 75% | 75%
P383 | Static TOC | 4.4 | 100% | 24 | 42% | 85% | 76%
P384 | Auto-collapse TOC | 4.4 | 100% | 19 | 51% | 85% | 79%
P385 | Expand/Collapse TOC | 4.0 | 100% | 47 | 21% | 75% | 65%
P386 | Static TOC | 3.8 | 90% | 26 | 39% | 70% | 66%
P387 | Auto-collapse TOC | 4.2 | 100% | 29 | 35% | 80% | 72%
P388 | Auto-collapse TOC | 4.1 | 90% | 33 | 30% | 78% | 66%
P389 | Expand/Collapse TOC | 3.2 | 100% | 19 | 53% | 55% | 69%
P390 | Static TOC | 4.9 | 100% | 23 | 43% | 98% | 80%
P391 | Expand/Collapse TOC | 4.1 | 80% | 50 | 20% | 78% | 59%
P392 | Static TOC | 4.3 | 90% | 22 | 46% | 83% | 73%
P393 | Static TOC | 4.4 | 100% | 17 | 60% | 85% | 82%
P394 | Auto-collapse TOC | 4.0 | 100% | 41 | 24% | 75% | 66%
P395 | Static TOC | 3.2 | 100% | 25 | 40% | 55% | 65%
P396 | Auto-collapse TOC | 4.6 | 100% | 44 | 23% | 90% | 71%
P397 | Auto-collapse TOC | 4.1 | 89% | 60 | 17% | 78% | 61%
P398 | Auto-collapse TOC | 2.3 | 60% | 41 | 24% | 33% | 39%
P399 | Expand/Collapse TOC | 4.3 | 90% | 24 | 41% | 83% | 71%
P400 | Expand/Collapse TOC | 4.8 | 90% | 36 | 28% | 95% | 71%
P401 | Expand/Collapse TOC | 4.0 | 80% | 35 | 29% | 75% | 61%

[Chart: Overall Task % (accuracy, speed, task ease) for the Static TOC, Expand/Collapse TOC, and Auto-collapse TOC conditions; error bars represent the 90% confidence interval]
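The first approach above (convert each metric to a percentage, then average) can be sketched as follows. The time-to-percentage mapping used here (a reference “best” time divided by actual time, capped at 1) is an assumption for illustration, not necessarily the one that produced the table:

```python
def combined_task_pct(ease_rating, pct_correct, time_sec, best_time_sec):
    """Average three metrics after converting each to a 0-1 scale:
    ease on a 1-5 scale -> (ease - 1) / 4, accuracy already 0-1,
    time -> reference best time / actual time (capped at 1)."""
    ease_pct = (ease_rating - 1) / 4
    time_pct = min(best_time_sec / time_sec, 1.0)
    return (ease_pct + pct_correct + time_pct) / 3

# a participant like P382: ease 4.0, 100% correct, 20 s against a 10 s reference
print(round(combined_task_pct(4.0, 1.0, 20, 10), 2))  # 0.75
```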

Beyond the Usability Lab: Keys to Success


Keys to Success

1. Choosing the right tool

2. Thinking beyond the web

3. Test early

4. Compare alternatives

5. Consider the entire user experience

6. Use your entire toolkit

7. Explore your data

8. Sell your results

9. Trust your data (within limits)

10. Just dive in


Resources

• www.measuringux.com

• www.remoteusability.com

• www.userzoom.com

• www.measuringusability.com

• http://keynote.com/products/customer_experience/

Thank you

Bill Albert, Director, Design and Usability Center

walbert@bentley.edu

Lena Dmitrieva, Senior Consultant, Design and Usability Center

ldmitrieva@bentley.edu
