Heuristic Evaluation


Page 1:

Heuristic Evaluation

Page 2:

Sources for today’s lecture:

Professor James Landay:

http://bmrc.berkeley.edu/courseware/cs160/fall98/lectures/heuristic-evaluation/heuristic-evaluation.ppt

Jakob Nielsen’s web site:

http://www.useit.com/papers/heuristic/heuristic_evaluation.html

Nielsen articles linked to course web site

Page 3:

Heuristic evaluation (what is it?)

Method for finding usability problems

Popularized by Jakob Nielsen

“Discount” usability engineering

Use with working interface or scenario

Convenient

Fast

Easy to use

Page 4:

Heuristic evaluation (how?)

Small set of evaluators (3-5)

Each one works independently

Find problems with an interface using a small number of heuristics (principles)

Aggregate findings afterward

Page 5:

Use multiple evaluators

These people can be novices or experts:

“novice evaluators”

“regular specialists”

“double specialists” (Nielsen)

Each evaluator finds different problems

The best evaluators find both hard and easy problems

Page 6:

Use multiple evaluators

Page 7:

Proportion of usability problems found by different numbers of evaluators (Nielsen)

Page 8:

Heuristic Evaluation - Advantages

Evaluators can be experts.

There need not be a working system.

Evaluators evaluate the same system or scenario.

Often, about 5 evaluators can discover around 75% of the problems.
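A rough sketch of where numbers like these come from: Nielsen and Landauer model the proportion of problems found by n evaluators as 1 - (1 - λ)^n, where λ is the chance that a single evaluator spots any given problem. The λ used below (about 0.31, a value Nielsen reports for typical projects) is an assumption, not something fixed by this lecture.

    # Sketch of the Nielsen & Landauer model: proportion found = 1 - (1 - lam)^n.
    # lam (per-evaluator detection rate) is assumed to be ~0.31 here.

    def proportion_found(evaluators: int, lam: float = 0.31) -> float:
        """Expected fraction of all usability problems found by this many evaluators."""
        return 1 - (1 - lam) ** evaluators

    for n in range(1, 6):
        print(f"{n} evaluator(s): ~{proportion_found(n):.0%} of problems")

    # With lam = 0.31, five evaluators find roughly 84%; the ~75% figure above
    # corresponds to a somewhat lower per-evaluator detection rate (about 0.24).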

Page 9:

Principles (Nielsen’s original set)

Simple & natural dialog

Speak the users’ language

Minimize users’ memory load

Be consistent

Provide feedback

Provide clearly marked exits

Provide shortcuts

Good error messages

Prevent errors

Page 10:

Sample Heuristics (we’ll be using these)

1. Visibility of system status

2. Match between system & real world

3. User control and freedom

4. Consistency & standards

5. Error prevention

6. Recognition rather than recall

7. Flexibility & efficiency of use

8. Minimalist design

9. Help users recognize, diagnose, and recover from errors

10. Help & documentation

(PRS pp. 408-409)

Page 11:

Revised principles (PRS, 408-9)

1. Visibility of system status

Example status message: “Searching database for matches…”

Page 12:

What is “reasonable time”?

0.1 sec: Feels immediate to the user. No additional feedback needed.

1.0 sec: Tolerable, but doesn’t feel immediate. Some feedback needed.

10 sec: Maximum duration for keeping user’s focus on the action.

For longer delays, use % done progress bars.
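To make these thresholds concrete, here is a minimal sketch (the function and feedback labels are invented for illustration, not any particular toolkit’s API) of choosing feedback from an operation’s expected duration:

    def feedback_for(expected_seconds: float) -> str:
        """Pick the kind of feedback to show, following the 0.1 s / 1 s / 10 s limits."""
        if expected_seconds <= 0.1:
            return "none"                         # feels immediate; no extra feedback needed
        if expected_seconds <= 1.0:
            return "subtle busy indicator"        # noticeable but tolerable delay
        if expected_seconds <= 10.0:
            return "spinner"                      # near the limit for keeping focus on the action
        return "percent-done progress bar"        # longer: show % done (and ideally allow cancel)

    print(feedback_for(0.05))   # -> none
    print(feedback_for(3.0))    # -> spinner
    print(feedback_for(45.0))   # -> percent-done progress bar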

Page 13:

2. Match between the system and the real world

Page 14:

Natural dialog?

Socrates: Please select command mode

Student: Please find an author named Octavia Butler.

Socrates: Invalid Folio command: please

Page 15:

Another example: dragging a diskette into the trash (to eject it)

(Stay tuned for the lecture on metaphors!)

Page 16:

3. User control and freedom

Provide exits for mistaken choices

Enable undo and redo (see the sketch below)

Don’t force users to take a particular path
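A minimal sketch of the undo/redo part of this heuristic, using two stacks of whole-document snapshots. Real editors usually store commands or deltas instead, so treat this only as the idea:

    class History:
        """Keeps past and future states so the user can undo and redo freely."""

        def __init__(self, initial):
            self.state = initial
            self._undo = []
            self._redo = []

        def apply(self, new_state):
            self._undo.append(self.state)
            self.state = new_state
            self._redo.clear()            # a fresh action invalidates the redo chain

        def undo(self):
            if self._undo:
                self._redo.append(self.state)
                self.state = self._undo.pop()

        def redo(self):
            if self._redo:
                self._undo.append(self.state)
                self.state = self._redo.pop()

    h = History("draft 1")
    h.apply("draft 2")
    h.undo()
    assert h.state == "draft 1"           # a mistaken change is reversible
    h.redo()
    assert h.state == "draft 2"           # ...and the reversal itself is reversible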

Page 17:

4. Consistency and standards

See also: SPSS menus (the “OK” button is inconsistently located).

Page 18:

5. Error prevention

People make errors.

Yet we can try to prevent them.

How might you go about trying to prevent errors?

Page 19:

5. Error prevention

People make errors.

Yet we can try to prevent them.

How might you go about trying to prevent errors?

(try adding forcing functions)
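A forcing function constrains the interaction so that the slip cannot happen at all. A minimal, toolkit-free sketch (the file-deletion scenario and names are invented for illustration):

    def confirm_delete(filename: str, typed_confirmation: str) -> bool:
        """Only allow a destructive action once the user re-types the exact name."""
        return typed_confirmation == filename

    # The delete button would stay disabled until this returns True, preventing
    # the one-click deletion of the wrong file.
    assert not confirm_delete("thesis.tex", "thesis")       # blocked
    assert confirm_delete("thesis.tex", "thesis.tex")       # allowed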

Page 20:

6. Recognition rather than recall

Ex: Can’t copy info from one window to another

Violates: Minimize the users’ memory load

(see also Norman’s book)

Page 21:

7. Flexibility and efficiency of use

Provide shortcuts

Enable macros

Provide multiple ways of accomplishing the same thing (sketched below)
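A minimal sketch of exposing one command through several paths, so novices can use the menus while experts use an accelerator (the names are hypothetical, not a specific GUI toolkit’s API):

    def save_document() -> None:
        print("document saved")

    # Both the menu path and the keyboard accelerator dispatch to the same command.
    commands = {
        "File > Save": save_document,
        "Ctrl+S": save_document,
    }

    commands["File > Save"]()   # novice path through the menus
    commands["Ctrl+S"]()        # expert shortcut, same underlying action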

Page 22:

8. Aesthetic and minimalist design

NOT! (the slide shows a counterexample: a cluttered, decidedly non-minimalist design)

Page 23:

9. Help users recognize, diagnose, and recover from errors

Page 24:

Error messages (Unix)

SEGMENTATION VIOLATION! Error #13

ATTEMPT TO WRITE INTO READ-ONLY MEMORY!

Error #4: NOT A TYPEWRITER
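These violate heuristic 9, which asks for messages that state the problem in the user’s terms and suggest a recovery. A hedged sketch of wrapping raw system errors in friendlier text; the wording and the errno-to-message mapping are illustrative assumptions, not a standard library feature:

    import errno

    # Hypothetical user-oriented rewrites of two common errno values.
    USER_MESSAGES = {
        errno.EACCES: "You don't have permission to save here. Try another folder "
                      "or ask the owner for access.",
        errno.ENOSPC: "The disk is full. Free up some space or save to another disk.",
    }

    def explain(error: OSError) -> str:
        """Prefer a plain-language message; fall back to the raw system text."""
        return USER_MESSAGES.get(error.errno, error.strerror or "Unknown error")

    try:
        open("/root/forbidden.txt", "w")    # typically fails for a non-root user
    except OSError as e:
        print(explain(e))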

Page 25:

10. Help and documentation

Page 26:

Heuristics adapted to web site evaluation: (PRS p. 415)

Adapt the general heuristics provided by Nielsen to the particular domain!

Page 27:

Phases of heuristic evaluation

1. Pre-evaluation training - give evaluators needed domain knowledge and information on the scenario (readings, this lecture!)

2. Have them evaluate the interface independently

3. Classify each problem & rate for severity

4. Aggregate results (Matt will do this)

5. Debrief: Report the results to the interface designers

Page 28:

Severity ratings

Each evaluator rates individually:

0 - don’t agree that this is a usability problem

1 - cosmetic problem

2 - minor usability problem

3 - major usability problem; important to fix

4 - usability catastrophe; imperative to fix

In giving a rating, consider both the flaw’s impact and its frequency.
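A minimal sketch of the aggregation that follows (step 4 on the earlier slide): average each problem’s ratings across evaluators and surface the most severe problems first. The example data is invented:

    from statistics import mean

    # One list of severity ratings per problem, one entry per evaluator (invented data).
    ratings = {
        "OK button moves between dialogs":          [3, 3, 2],
        "No feedback during long database search":  [4, 3, 4],
        "Jargon in field labels":                   [1, 2, 1],
    }

    # Sort by mean severity, highest first, so the worst problems top the report.
    for problem, scores in sorted(ratings.items(), key=lambda kv: mean(kv[1]), reverse=True):
        print(f"{mean(scores):.1f}  {problem}")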

Page 30:

Conclusion

Heuristic evaluation is a great “discount” method. (You will try out this method with Assignment #2.)

But it’s not perfect - some “problems” may not matter, and some problems will be missed.

For best results, use heuristic evaluation in combination with user testing!
