Storytelling: Rhetoric of Heuristic Evaluation

The Rhetoric of Heuristic Evaluation
Carol Barnum, Director of Graduate Studies and The Usability Center @ Southern Polytechnic
How to tell the story

Upload: ux-firm

Posted on 01-Nov-2014

Category: Design

DESCRIPTION

Keynote at a symposium on Usability and Communicating Complex Information, East Carolina University

TRANSCRIPT

Page 1: Storytelling: Rhetoric of heuristic evaluation

The Rhetoric of Heuristic Evaluation
Carol Barnum, Director of Graduate Studies and The Usability Center @ Southern Polytechnic

How to tell the story

Page 2: Storytelling: Rhetoric of heuristic evaluation

Heuristic evaluation is a popular pick

Slide 2, UX Workshop, ECU

UPA survey results for HE/expert review (% of respondents):

Survey year   % of respondents
2007          77%
2009          74%
2011          75%

Page 3: Storytelling: Rhetoric of heuristic evaluation

Why so popular? Fact or myth?


• Fast
• Cheap
• Easy
• Effective
• Convenient

Page 4: Storytelling: Rhetoric of heuristic evaluation


“It can often be more expensive and difficult to find 3-5 usability professionals as it is to test 3-5 users.”

Jeff Sauro, “What’s the difference between a Heuristic Evaluation and a Cognitive Walkthrough?” Measuring Usability, Aug. 2, 2011

Care to comment?


Page 5: Storytelling: Rhetoric of heuristic evaluation

HE output


• A list of usability problems
• Tied to a heuristic or rule of practice
• A ranking of findings by severity
• Recommendations for fixing problems
• Oh, and the positive findings, too
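The HE output described above can be modeled as a simple record plus a severity sort. This is a minimal sketch in Python; the field names, severity scale, and example findings are illustrative assumptions, not a format the talk prescribes.

```python
from dataclasses import dataclass

# Hypothetical record for one heuristic-evaluation finding; field names
# and example data are illustrative, not taken from the talk.
@dataclass
class Finding:
    description: str        # the usability problem observed
    heuristic: str          # the heuristic or rule of practice it ties to
    severity: int           # e.g. 0 (cosmetic) .. 4 (catastrophe)
    recommendation: str     # suggested fix
    positive: bool = False  # positive findings get reported, too

findings = [
    Finding("Error messages use internal codes",
            "Help users recognize, diagnose, and recover from errors", 3,
            "Replace codes with plain-language messages"),
    Finding("Navigation labels match users' vocabulary",
            "Match between system and real world", 0,
            "Keep as-is", positive=True),
]

# Rank problems by severity, worst first, as an HE report would.
report = sorted((f for f in findings if not f.positive),
                key=lambda f: -f.severity)
print(report[0].description)  # → Error messages use internal codes
```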

Page 6: Storytelling: Rhetoric of heuristic evaluation


Nielsen’s 10 heuristics

1. Visibility of system status
2. Match between system and real world
3. User control and freedom
4. Consistency and standards
5. Error prevention
6. Recognition rather than recall
7. Flexibility and efficiency of use
8. Aesthetic and minimalist design
9. Help users recognize, diagnose, and recover from errors
10. Help and documentation

J. Nielsen and R. Mack, eds., Usability Inspection Methods, 1994


Page 7: Storytelling: Rhetoric of heuristic evaluation


What would Aristotle do?

• Audience
• Purpose
• Context of use


Page 8: Storytelling: Rhetoric of heuristic evaluation

What do you do?


• Do you do it (or teach it)?
• How do you do it?
• Why do you do it?
• Do you do it alone or with others?
• How do you report it?
• How do you charge for it?

Page 9: Storytelling: Rhetoric of heuristic evaluation

Slide 9, usability.spsu.edu, UPA 2011

What do I do? A brief history

• Phase 1: Nielsen is my bible

Page 10: Storytelling: Rhetoric of heuristic evaluation


CUE-4: Hotel Pennsylvania

• Comparative evaluation of the reservation process
• 17 teams
  – 8 did expert review/HE
  – Only one team used heuristic evaluation
• Rolf’s conclusions
  – Findings “overly sensitive”: too many to manage
  – Need to improve classification schemes
  – Need more precise and usable recommendations

CHI 2003. Results available at Rolf Molich’s DialogDesign website, http://www.dialogdesign.dk/CUE-4.htm


Page 11: Storytelling: Rhetoric of heuristic evaluation



HE sample findings page

Page 12: Storytelling: Rhetoric of heuristic evaluation


What do I do? A brief history

• Phase 1: Nielsen is my bible
• Phase 2: loosely based findings from Nielsen; tables, screen captures, recommendations

Page 13: Storytelling: Rhetoric of heuristic evaluation


Sample findings table (columns: Finding / Description / Recommendation / H / C / S / Severity Rating):

Finding: Objectives/goals for the modules
Description: reason content is being presented; conciseness of presentation; definitions required to work with the module/content; evaluation criteria and methods; direct tie between content and assessment measure; sequence of presentation follows logically from introduction; quizzes challenge users
Recommendation: develop a consistent structure that defines what’s noted in the points above; avoid generic statements that don’t focus users on what they will be accomplishing; advise that there is an assessment used for evaluation and indicate if it’s at the end or interspersed in the module; connect ideas in the goals and objectives with outcomes in the assessment; follow the order of presentation defined at the beginning; develop interesting and challenging questions; re-frame goals/objectives at the end of the module
Severity rating: 3

H = Hyperspace; C = Cardiac Arrest; S = Shock. All three require more clearly defined goals and objectives.

Page 14: Storytelling: Rhetoric of heuristic evaluation



Page 15: Storytelling: Rhetoric of heuristic evaluation



35-page report

Page 16: Storytelling: Rhetoric of heuristic evaluation


What do I do? A brief history

• Phase 1: Nielsen is my bible
• Phase 2: loosely based findings from Nielsen; tables, screen captures, recommendations
• Phase 3: screen captures, UX terminology

Page 17: Storytelling: Rhetoric of heuristic evaluation


Page 18: Storytelling: Rhetoric of heuristic evaluation



A unique password between 6 and 16 characters was required. What “unique” means is not defined. This is a problem with terminology.

Usually, passwords must be a combination of letters and numbers for higher security. An all-letter password—Heuristics—was accepted. A dictionary term is not a secure password and contradicts accepted conventions. The ability to input a dictionary word may be a component of trust for users.
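The conventions this finding appeals to (6 to 16 characters, a mix of letters and numbers, no bare dictionary words) can be sketched as a simple check. This is a hedged illustration only: the tiny word list and function name are made up, and this is not the evaluated product's actual rule.

```python
import re

# Illustrative only: a tiny stand-in for a real dictionary word list.
COMMON_WORDS = {"heuristics", "password", "welcome"}

def acceptable_password(pw: str) -> bool:
    """Sketch of the password conventions the finding describes."""
    if not 6 <= len(pw) <= 16:
        return False
    # Require both letters and numbers, per the accepted convention.
    if not (re.search(r"[A-Za-z]", pw) and re.search(r"\d", pw)):
        return False
    # Reject bare dictionary words (the finding notes "Heuristics" was accepted).
    if pw.lower() in COMMON_WORDS:
        return False
    return True

print(acceptable_password("Heuristics"))  # → False (all letters, dictionary word)
print(acceptable_password("Heuristic5"))  # → True
```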

The username and security question answer were rejected on submit.

This result is confusing, as the name was confirmed on the previous screen. This relates to establishing conventions for the form of names/passwords on the input screen. Input formats need to be defined on the relevant page. Differences in spelling (“username” vs. “user name”) are subtle but are consistency issues.

The red banner is confusing as the user chose the gold (Free Edition). This is a consistency issue.

Page 19: Storytelling: Rhetoric of heuristic evaluation


What do I do? A brief history

• Phase 1: Nielsen is my bible
• Phase 2: loosely based findings from Nielsen; tables, screen captures, recommendations
• Phase 3: screen captures, UX terminology
• Phase 3.1: user experience emerges


Page 20: Storytelling: Rhetoric of heuristic evaluation


State Tax

Reviewer comments: I wanna click on the map, not the pulldown. WAH! Also, I’ve got no idea what the text on this page means.

Page 21: Storytelling: Rhetoric of heuristic evaluation


What do I do? A brief history

• Phase 1: Nielsen is my bible
• Phase 2: loosely based findings from Nielsen; tables, screen captures, recommendations
• Phase 3: screen captures, UX terminology
• Phase 3.1: user experience emerges
• Phase 4: tell the story of the user experience


Page 22: Storytelling: Rhetoric of heuristic evaluation


Persona-based scenario review

• Ginny Redish and Dana Chisnell
• AARP report: 58 pages, 50 websites
  – Two personas: Edith and Matthew
  – Evaluators “channel” the user via persona and tasks/goals
  – Their story emerges

Available from Redish & Associates: http://www.redish.net/images/stories/PDF/AARP-50Sites.pdf


Page 23: Storytelling: Rhetoric of heuristic evaluation

While the clickable area is very large in the navigation blocks, Edith expected to click on the labels, so she was surprised when the menu appeared.

When trying to click an item in the menu above, Edith had trouble selecting it, because her mouse hovered close enough to the choices below to open that menu, obscuring the item she wanted to click.

Chisnell and Redish, Designing Web Sites for Older Adults: Expert Review of Usability for Older Adults at 50 Web Sites, (for AARP)

Page 24: Storytelling: Rhetoric of heuristic evaluation


Steve Krug’s approach

• All sites have usability problems
• All organizations have limited resources
• You’ll always find more problems than you have resources to fix
• It’s easy to get distracted by less serious problems that are easier to solve . . .
• Which means that the worst ones often persist
• Therefore, you have to be intensely focused on fixing the most serious problems first

Rocket Surgery Made Easy, New Riders, 2010


Page 25: Storytelling: Rhetoric of heuristic evaluation


Krug’s maxims

• Focus ruthlessly on a small number of the most important problems.
• When fixing problems, always do the least you can do.


Page 26: Storytelling: Rhetoric of heuristic evaluation


Conversation, Storytelling

• Ginny Redish
  – Letting Go of the Words, Morgan Kaufmann, 2007
  – Engage in conversation with your reader
• Whitney Quesenbery and Kevin Brooks
  – Storytelling for User Experience Design, Rosenfeld, 2010
  – Stories can be a part of all stages of work, from user research to evaluation


Page 27: Storytelling: Rhetoric of heuristic evaluation


Report deliverable options

• No deliverable
• Quick findings
• Presentation
• Detailed report


Jim Ross, “Communicating User Research Findings,“ UX Matters, Feb. 6, 2012

Page 28: Storytelling: Rhetoric of heuristic evaluation


Why report?


Big honkin’ report—courtesy Steve Krug

Page 29: Storytelling: Rhetoric of heuristic evaluation


What’s a writer to do? Rhetoric to the rescue!


Rhetorical question → rhetorical principle:

• Who are my readers? → Audience
• What is my purpose in writing? What is their purpose in reading? → Purpose
• How will they use the report? → Context of use

Page 30: Storytelling: Rhetoric of heuristic evaluation


What have we learned today?


• Things to keep

• Things to change

• Things to think about