Human-Computer Interaction: Evaluation Techniques
TRANSCRIPT
Human-Computer Interaction
What is evaluation?
The role of evaluation: We still need to assess our designs and test our
systems to ensure that they actually behave as we expect and meet user requirements.
What is evaluation? (Cont.)
Evaluation should not be thought of as a single phase in the design process.
Evaluation should occur throughout the design life cycle, with the results of the evaluation feeding back into modifications to the design.
It is not usually possible to perform extensive experimental testing continuously throughout the design, but analytical and informal techniques can and should be used.
Broad headings on evaluation techniques
We will consider evaluation techniques under two broad headings:
Evaluation through expert analysis. Evaluation through user participation.
Goals of Evaluation
Evaluation has three main goals:
To assess the extent and accessibility of the system's functionality.
To assess the user's experience of the interaction.
To identify specific problems with the system.
Goals of Evaluation (Cont.)
1. To assess the extent and accessibility of the system's functionality:
The system's functionality is important in that it must accord with the users' requirements.
The design of the system should therefore enable users to perform their intended tasks more easily.
The use of the system must match the user's expectations of the task.
Goals of Evaluation (Cont.)
2. To assess the user's experience of the interaction:
This includes considering aspects such as: how easy the system is to learn, its usability, the user's satisfaction with it, and the user's enjoyment and emotional response.
Goals of Evaluation (Cont.)
3. To identify specific problems with the system:
Unexpected results lead to confusion amongst users.
These problems relate to both the functionality and the usability of the design.
Objectives of User Interface Evaluation
Key objective of both UI design and evaluation: "Minimize malfunctions".
Key reason for focusing on evaluation: without it, the designer would be working "blindfold". Designers wouldn't really know whether they are solving the customer's problems in the most productive way.
Evaluation Techniques
Evaluation:
Tests the usability and functionality of the system.
Evaluates both design and implementation.
Should be considered at all stages in the design life cycle.
But, in order for evaluation to give feedback to designers, we must understand why a malfunction occurs.
Malfunction analysis: determine why a malfunction occurs, and determine how to eliminate it.
Overview of Interface Evaluation Methods
Three types of methods: passive evaluation, active evaluation, and predictive evaluation (usability inspections).
All types of methods are useful for optimal results, and they are used in parallel.
All attempt to prevent malfunctions.
1. Passive evaluation
Performed while prototyping and testing.
Does not actively seek malfunctions: it only finds them when they happen to occur, so infrequent malfunctions may not be found.
Generally requires realistic use of a system. Users become frustrated with malfunctions.
1. Passive evaluation (Cont.)
Gathering information:
a) Problem report monitoring: users should have an easy way to register their frustration and suggestions.
Best if integrated with the software.
1. Passive evaluation (Cont.)
b) Automatic software logs. Can gather much data about usage:
Command frequency. Error frequency. Undone operations (a sign of malfunctions).
Logs can be taken of just keystrokes and mouse clicks, or of the full details of the interaction.
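The logging idea above can be sketched in a few lines of Python. This is an illustrative sketch, not part of any real toolkit: the class name and event fields are assumptions, but it shows how a log of commands yields command frequency, error frequency, and undo counts.

```python
from collections import Counter

class UsageLog:
    """Minimal sketch of an automatic software log (hypothetical class):
    records each command a user issues and summarizes command frequency,
    error frequency, and the number of undone operations."""

    def __init__(self):
        self.events = []

    def record(self, command, error=False):
        # Each entry captures what was done and whether it failed.
        self.events.append({"command": command, "error": error})

    def command_frequency(self):
        return Counter(e["command"] for e in self.events)

    def error_count(self):
        return sum(1 for e in self.events if e["error"])

    def undo_count(self):
        # Frequent undos are a sign of malfunctions or user confusion.
        return sum(1 for e in self.events if e["command"] == "undo")

log = UsageLog()
for cmd in ["open", "paste", "undo", "paste", "undo", "save"]:
    log.record(cmd)
log.record("save", error=True)
print(log.command_frequency()["paste"])  # 2
print(log.undo_count())                  # 2
print(log.error_count())                 # 1
```

A real deployment would also timestamp events and write them to disk; the summary methods are what an evaluator would inspect afterwards.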
1. Passive evaluation (Cont.)
c) Questionnaires: useful to obtain statistical data from large numbers of users. Proper statistical means are needed to analyze the results.
Gathers subjective data about the importance of each malfunction: less frequent malfunctions may be more important, and users can prioritize needed improvements.
Questionnaires
A set of fixed questions given to users.
There is a limit on the number of questions: it is very hard to phrase questions well. Questions can be closed- or open-ended.
Advantages: quick, and reaches a large user group.
Open-ended questions
Open-ended questions are those that solicit additional information from the inquirer.
Examples:
How may/can I help you? Where have you looked already? What aspect are you looking for? What kind of information are you looking for? What would you like to know about [topic]? When you say [topic], what do you mean?
Closed-ended questions
Closed-ended questions are those that can be answered finitely, typically with "yes" or "no".
Examples: a. Can I help you? b. May I help you? c. Can you give me more information? d. Have you searched elsewhere? e. Can you describe the kind of information you want? f. Can you give me an example? g. Could you be more specific? h. Are you looking for [topic]?
2. Active evaluation
Actively study specific actions performed by users. Performed when prototyping is done.
Gathering information:
d) Experiments and usability engineering: prove a hypothesis about measurable attributes of one or more UIs, e.g. speed, learning, accuracy, frustration. In usability engineering, test against the goals of the system. It is hard to control for all variables.
e) Observation sessions (videotaped evaluation). Also called 'interpretive evaluation'. Study active use on realistic tasks.
3. Predictive evaluation
Studies of the system by experts rather than users. Performed when the UI is specified (useful even before a prototype is developed). Can eliminate many malfunctions before users ever see the software.
Also called "usability inspections".
Gathering information:
f) Heuristic evaluation. Based on a UI design principle document. Analyze whether each guideline is adhered to in the context of the task and users. Can also look at compliance with standards.
g) Cognitive walkthroughs. Step-by-step analysis of: the steps in the task being performed, the goals users form to perform these tasks, and how the system leads the user through the tasks.
Summary of evaluation techniques (technique: when to use)
a) Problem reporting: always.
b) Automatic logs: in any moderately complex system, and whenever there are large numbers of users and commands.
c) Questionnaires: whenever there are large numbers of users.
d) Experiments & usability engineering: in special cases where it is hard to choose between alternatives.
e) Observation sessions: almost always, especially when the user has to interact with a client while using the system.
f) Heuristic evaluation: always.
g) Cognitive walkthrough: when usability must be optimized.
Evaluating Designs
Evaluation should occur throughout the design process.
These methods can be used at any stage in the development process, from a design specification, through storyboards and prototypes, to full implementations, making them flexible evaluation approaches.
1. Videotaped Evaluation
A software engineer studies users who are actively using the user interface, to observe what problems they have. The sessions are videotaped. Can be done in the user's environment.
Activities of the user: the user preferably talks to him/herself as if alone in a room.
This process is called 'co-operative evaluation' when the software engineer and the user talk to each other.
1. Videotaped Evaluation (Cont.)
The importance of video: with it, you can see what you want to see from the system, and you can repeatedly analyze the recording, looking for different problems.
Tips for using video: several cameras are useful. Software is available to help analyse video by dividing it into segments and labelling the segments.
2. Experiments
1. Pick a set of subjects (users): a good mix to avoid biases, and a sufficient number to get statistical significance (so that random happenings do not affect the results).
2. Pick variables to test. Variables are manipulated to produce different conditions: there should not be too many, they should not affect each other too much, and make sure there are no hidden variables.
3. Develop a hypothesis: a prediction of the outcome. The aim of the experiment is to show that it is correct.
Variables
Independent variable (IV): a characteristic changed to produce different conditions, e.g. interface style, number of menu items.
Dependent variable (DV): a characteristic measured in the experiment, e.g. time taken, number of errors.
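The IV/DV distinction can be made concrete with a small sketch. The data below is invented for illustration: the IV is the interface condition (menu vs. toolbar), the DV is task completion time, and Welch's t statistic is one standard way to compare the two samples.

```python
from statistics import mean, variance

def welch_t(a, b):
    """Welch's t statistic for two independent samples with
    (possibly) unequal variances. Degrees of freedom and the
    p-value lookup are omitted to keep the sketch short."""
    se = (variance(a) / len(a) + variance(b) / len(b)) ** 0.5
    return (mean(a) - mean(b)) / se

# Hypothetical DV measurements: task completion times in seconds
# under two IV conditions.
menu_times    = [41.2, 38.5, 44.0, 39.7, 42.3, 40.1]
toolbar_times = [33.8, 35.1, 31.9, 36.4, 34.2, 32.7]

t = welch_t(menu_times, toolbar_times)
print(round(t, 2))  # a large positive t supports "toolbar is faster"
```

In practice one would use a statistics package to get the p-value and check assumptions; the point here is only how the manipulated and measured variables map onto an analysis.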
3. Heuristic Evaluation
Developed by Jakob Nielsen and Rolf Molich in the early 1990s. Helps find usability problems in a UI design.
A heuristic is based on a UI guideline. Usability criteria (heuristics) are identified, and the design is examined by experts to see if these are violated.
Heuristic Evaluation (Cont.)
Evaluators go through the UI several times, inspect the various dialogue elements, and consider and compare them against a list of usability principles.
Usability principles: Nielsen's "heuristics"; competitive analysis and user testing of existing products.
Use the violations found to redesign and fix problems.
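Since several evaluators inspect the UI independently, their findings are usually merged before redesign. The sketch below is illustrative (the reports and problem labels are invented): counting how many evaluators hit each violation helps prioritize fixes.

```python
from collections import defaultdict

def merge_findings(evaluator_reports):
    """Aggregate heuristic-evaluation reports: each report is a list of
    (heuristic, problem) pairs found by one evaluator. Returns how many
    evaluators found each problem."""
    counts = defaultdict(int)
    for report in evaluator_reports:
        for finding in set(report):  # count each evaluator at most once
            counts[finding] += 1
    return dict(counts)

# Hypothetical reports from three evaluators:
reports = [
    [("H-1", "no progress bar on export"), ("H-4", "Ok/Cancel order varies")],
    [("H-4", "Ok/Cancel order varies")],
    [("H-1", "no progress bar on export"), ("H-4", "Ok/Cancel order varies")],
]
merged = merge_findings(reports)
print(merged[("H-4", "Ok/Cancel order varies")])  # 3
```

Problems found by most evaluators, or rated severe, are the natural candidates for the first round of redesign.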
1. Heuristic Evaluation
A type of predictive evaluation: uses HCI experts as reviewers instead of users.
Benefits of predictive evaluation: the experts know what problems to look for; it can be done before the system is built; experts give prescriptive feedback.
Important points about predictive evaluation: reviewers should be independent of the designers; reviewers should have experience in both the application domain and HCI; include several experts to avoid bias; experts must know the classes of users. Beware: novices can do some very bizarre things that experts may not anticipate.
1. Heuristic Evaluation
Example heuristics: system behaviour is predictable; system behaviour is consistent; feedback is predictable.
Heuristics are being developed for mobile devices, virtual worlds, etc.
Nielsen's ten heuristics are:
1. Visibility of system status.
2. Match between system and the real world.
3. User control and freedom.
4. Consistency and standards.
5. Error prevention.
6. Recognition rather than recall.
7. Flexibility and efficiency of use.
8. Aesthetic and minimalist design.
9. Help users recognize, diagnose and recover from errors.
10. Help and documentation.
H-1: Visibility of system status
Provide feedback. Keep users informed about what is going on. Example: pay attention to response time.
0.1 sec: no special indicators needed.
1.0 sec: the user tends to lose track of the data.
10 sec: maximum duration if the user is to stay focused on one action. For longer delays, use percent-done progress bars.
H-1: Visibility of system status
Continuously inform the user about: what the system is doing, and how it is interpreting the user's input. The user should always be aware of what is going on.
[Cartoon: without feedback, the user types "> Do it" and wonders "What's it doing?"; with feedback, "> Do it" is answered by "This will take 5 minutes..." and the user knows there is time for coffee.]
H-1: Visibility of system status
[Screenshot: Microsoft Paint. The user wonders: What did I select? What mode am I in now? How is the system interpreting my actions?]
H-1: Visibility of system status
Be as specific as possible, based on the user's input. Feedback is best given within the context of the action.
[Screenshot: Drawing Board LT. Multiple files are being copied, but feedback is given file by file.]
H-1: Visibility of system status
Dealing with long delays:
Cursors: for short transactions.
Percent-done dialogs: show time left or estimated time.
Indeterminate indicators for unknown times, e.g. "Contacting host (10-60 seconds)", with a cancel button.
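The percent-done idea above can be sketched as a tiny text progress bar. This is an illustrative stand-in for a GUI widget; the function name and layout are assumptions, not any toolkit's API.

```python
def percent_done(step, total, width=30):
    """Render a percent-done progress indicator as a string.
    A steadily advancing, quantified indicator keeps the user
    informed during delays longer than a few seconds (H-1)."""
    frac = step / total
    filled = int(width * frac)
    bar = "#" * filled + "-" * (width - filled)
    return f"[{bar}] {frac:6.1%}"

print(percent_done(7, 10))  # a bar roughly 70% filled
```

A real implementation would redraw the same line as work progresses and add an estimated-time-left figure; for unknown durations, an indeterminate spinner plus a cancel button is the usual fallback.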
H-2: Match between system and real world
Speak the users' language. Follow real-world conventions.
Example: dragging a disk to the trash should delete it, not eject it.
H-2: Match between system and real world
[Cartoon: a program gives the message "Rstrd Info". The user asks what it means; support explains that it stands for "Restricted Information", which in turn means the program is too busy to let the user log on. The user gives up and goes for coffee.]
H-2: Match between system and real world
Terminology based on the users' language for the task, e.g. withdrawing money from a bank machine.
Use meaningful mnemonics, icons and abbreviations, e.g. File / Save: Ctrl+S (abbreviation), Alt-F-S (mnemonic for the menu action), and a tooltip on the icon.
H-3: User control and freedom
Provide "exits" for mistaken choices: undo, redo. Don't force users down fixed paths.
Wizards [screenshot: SketchUp 6] make the user respond to one question before going to the next; good for beginners.
H-3: User control and freedom
Users don't like to feel trapped by the computer! The system should offer an easy way out of as many situations as possible.
Strategies: a Cancel button (for dialogs waiting for user input); universal Undo (can get back to a previous state); Quit (for leaving the program at any time); Defaults (for restoring a property sheet).
H-4: Consistency & standards
Consistent syntax of input.
Consistent language and graphics: same visual appearance across the system; same information and controls in the same location on all windows.
Consistent effects: commands and actions have the same effect in equivalent situations. Predictability.
[Illustration: dialogs with inconsistent button pairs such as Ok/Cancel, Cancel/Ok, Accept/Dismiss.]
H-4: Consistency & standards
[Screenshot] These are labels with a raised appearance. Is it any surprise that people try to click on them?
H-5: Error prevention
Make it difficult to make errors.
Even better than a good error message is a careful design that prevents a problem from occurring in the first place.
H-6: Recognition rather than recall
Make objects, actions, options, and directions visible. The user should not have to remember information from one part of the dialog to another.
H-6: Recognition rather than recall
Computers are good at remembering; people are not! Promote recognition over recall.
Menus, icons and choice dialog boxes rather than typed commands and field formats. Relies on the visibility of objects to the user.
H-6: Recognition rather than recall
[Screenshot: a form field that gives the input format, an example, and a default.]
H-7: Flexibility and efficiency of use
Accelerators for experts, e.g. keyboard shortcuts.
Allow users to tailor frequent actions, e.g. macros.
Customized user profiles on the web.
H-8: Aesthetic and minimalist design
No irrelevant information in dialogues.
H-9: Help users recognize, diagnose, and recover from errors
Error messages in plain language. Precisely indicate the problem. Constructively suggest a solution.
H-9: Help users recognize, diagnose, and recover from errors
People will make errors!
Errors we make:
Mistakes: conscious actions that lead to an error instead of the correct solution.
Slips: unconscious behaviour that gets misdirected en route to satisfying a goal.
H-9: Help users recognize, diagnose, and recover from errors
Provide meaningful error messages. Error messages should be in the user's task language. Don't make people feel stupid. From worst to best:
Try again!
Error 25.
Cannot open this document.
Cannot open "chapter 5" because the application "Microsoft Word" is not on your system.
Cannot open "chapter 5" because the application "Microsoft Word" is not on your system. Open it with "Teachtext" instead?
H-9: Help users recognize, diagnose, and recover from errors
Prevent errors: try to make errors impossible. Modern widgets can be constrained so that the user can only enter legal data.
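The "only legal data" idea can be sketched as a bounded input field. The class below is hypothetical, not from any GUI toolkit: instead of accepting bad input and complaining later, it rejects out-of-range values so an illegal state is never entered.

```python
class BoundedField:
    """Sketch of an error-preventing input widget (illustrative class):
    only values within [lo, hi] can ever be stored."""

    def __init__(self, lo, hi, value):
        self.lo, self.hi = lo, hi
        self.value = value

    def set(self, v):
        # Reject illegal input up front rather than storing it and
        # producing an error message afterwards.
        if self.lo <= v <= self.hi:
            self.value = v
            return True
        return False  # a real widget might beep or flash here

month = BoundedField(1, 12, 1)
print(month.set(5), month.value)   # True 5
print(month.set(13), month.value)  # False 5  (illegal value never entered)
```

Spinners, date pickers and masked input fields all apply the same principle: the widget's design makes the error impossible.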
H-10: Help and documentation
Easy to search. Focused on the user's task. Lists concrete steps to carry out. Not too large.
H-10: Help and documentation
Help is not a replacement for bad design!
Simple systems: use minimal instructions.
Most other systems: simple things should be simple; provide a learning path for advanced features.
[Cartoon: a huge manual, "Volume 37: A user's guide to..."]
Documentation and how it is used
Many users do not read manuals; manuals are usually used when users are in some kind of panic.
Paper manuals are unavailable in many businesses, e.g. a single copy locked away in the system administrator's office. Online documentation is better, and online help can be specific to the current context.
Documentation is sometimes used for quick reference, e.g. a list of shortcuts.
2. Cognitive Walkthrough
Proposed by Polson and colleagues. Evaluates a design on how well it supports the user in learning a task; the focus is on ease of learning.
Usually performed by an expert in cognitive psychology. The expert is told the assumptions about the user population, the context of use, and the task details.
The expert 'walks through' the design to identify potential problems using psychological principles.
2. Cognitive Walkthrough
Walkthroughs require a detailed review of a sequence of actions.
In the cognitive walkthrough, the sequence of actions refers to the steps that an interface will require a user to perform in order to accomplish some task.
A walkthrough needs four things:
1. A specification or prototype of the system. It doesn't have to be complete, but it should be fairly detailed.
2. A description of the task the user is to perform on the system.
3. A written list of the actions needed to complete the task with the proposed system.
4. An indication of who the users are and what kind of experience and knowledge the evaluators can assume about them.
Four Questions
The evaluator will answer these questions for each action:
1. Is the effect of the action the same as the user's goal at that point?
2. Will users see that the action is available?
3. Once users have found the correct action, will they know it is the one they need?
4. After the action is taken, will users understand the feedback they get?
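Recording the walkthrough can itself be mechanized. The sketch below is illustrative (the action names and yes/no answers are invented): each action in the sequence carries an answer to the four questions, and any "no" is flagged as a potential usability problem.

```python
QUESTIONS = (
    "Effect matches user's goal?",
    "Action visibly available?",
    "User will know it's the right action?",
    "Feedback understandable?",
)

def walkthrough(actions):
    """Flag every (action, question) pair where the evaluator
    answered 'no' during a cognitive walkthrough."""
    problems = []
    for name, answers in actions:
        for question, ok in zip(QUESTIONS, answers):
            if not ok:
                problems.append((name, question))
    return problems

# Hypothetical answers for the first two steps of a VCR-programming task;
# Q3 fails on the first step because the button's icon is ambiguous.
actions = [
    ("Press 'timed record' button", (True, True, False, True)),
    ("Press digits 1 2 0 0",        (True, True, True, True)),
]
for name, question in walkthrough(actions):
    print(f"Potential problem at '{name}': {question}")
```

In a real walkthrough each "no" would also be recorded with a story explaining why the user might fail at that point.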
Cognitive Walkthrough
For each task, the walkthrough considers: What impact will the interaction have on the user? What processes are required? What learning problems may occur?
The analysis focuses on goals and knowledge: does the design lead the user to generate the correct goals?
Steps of a Cognitive Walkthrough
Define inputs. Convene analysts. Step through the action sequences for each task. Record important information. Modify the UI.
Define Inputs: Example
Task: move an application to a new folder or drive.
Who: a Win 2003 user. Interface: the Win 2003 desktop.
The folder containing the desired app is open. The destination folder/drive is visible.
Action sequence follows...
Action Sequence
Move mouse to the app icon. Right mouse button down on the app icon:
Result: the app icon highlights.
Failure: the user may not know that the right mouse button is the proper one to use.
Success?: highlighting shows something happened, but was it the right thing?
Action Sequence (cont'd)
Release the mouse button:
Result: a menu appears: Cut, Copy, Create Shortcut, Cancel.
Success: the user is prompted for the next action.
Move mouse to "Cut":
Result: the selection highlights.
Success: standard GUI menu interaction.
Action Sequence (cont'd)
Move mouse to the destination icon:
Result: the app icon follows the mouse; the destination icon highlights when the mouse reaches it.
Success: dragging is intuitive (and common in GUIs) for moving. The feedback is appropriate.
Action Sequence (cont'd)
Click the mouse button:
Result: the app icon disappears from under the mouse; the app icon disappears from the original folder; the app icon appears in the destination folder.
Success: standard GUI menu selection. Feedback shows the desired goal was accomplished.
Cognitive Walkthrough Example
Step 1: identify the task.
Step 2: identify the action sequence for the task.
User action: press the 'timed record' button.
System display: the display moves to timer mode; a flashing cursor appears after 'start'.
Step 3: perform the walkthrough. For each action, answer the following questions:
Is the effect of the action the same as the user's goal at that point? Will users see that the action is available? Once users have found the correct action, will they know it is the one they need? After the action is taken, will users understand the feedback they get?
We might find a potential usability problem relating to the icon on the 'timed record' button.
Example: Programming a VCR by remote control
[Figure: a VCR remote control with a numeric keypad, showing 'Time 21:45' and 'Channel 3', and a timer-programming display with 'Start:', 'End:', 'Channel:' and 'Date:' fields.]
Example: VCR (Cont.)
Task: program the video to time-record a program starting at 12.00 and finishing at 13.30 on channel 2 on 23 February 2008.
Who: assume the user is familiar with VCRs but not with this particular design.
Action sequence: user's action (UA); system's display (SD).
UA 1: Press the 'timed record' button.
SD 1: Display moves to timer mode. Flashing cursor appears after 'start:'.
UA 2: Press digits 1 2 0 0.
SD 2: Each digit is displayed as typed and the flashing cursor moves to the next position.
UA 3: Press the 'timed record' button.
SD 3: Flashing cursor moves to 'end:'.
UA 4: Press digits 1 3 3 0.
SD 4: Each digit is displayed as typed and the flashing cursor moves to the next position.
UA 5: Press the 'timed record' button.
SD 5: Flashing cursor moves to 'channel:'.
UA 6: Press digit 2.
SD 6: Digit is displayed as typed and the flashing cursor moves to the next position.
UA 7: Press the 'timed record' button.
SD 7: Flashing cursor moves to 'date:'.
UA 8: Press digits 2 3 0 2 0 8.
SD 8: Each digit is displayed as typed and the flashing cursor moves to the next position.
UA 9: Press the 'timed record' button.
SD 9: Stream number in the top right-hand corner of the display flashes.
UA 10: Press the transmit button.
SD 10: Details are transmitted to the video player and the display returns to normal mode.
Example: VCR (Cont.)
We must answer the four questions and tell a story about the usability of the system.
UA 1: Press the 'timed record' button.
Q1) Is the effect of the action the same as the user's goal at that point? The 'timed record' button initiates timer programming. It is reasonable to assume that a user familiar with VCRs would be trying to do this as his first goal.
Q2) Will users see that the action is available? The 'timed record' button is visible on the remote control.
Example: VCR (Cont.)
Q3) Once users have found the correct action, will they know it is the one they need? It is not clear which button is the 'timed record' button. The icon of a clock is a possible candidate, but this could be interpreted as a button to change the time. Other possible candidates might be the fourth button down on the left, or the filled circle (associated with record). In fact, the icon of the clock is the correct choice, but it is quite possible that the user would fail at this point. This identifies a potential usability problem.
Q4) After the action is taken, will users understand the feedback they get? Once the action is taken, the display changes to the timed record mode and shows familiar headings (start, end, channel, date). It is reasonable to assume that the user would recognize these as indicating successful completion of the first action.
Evaluation through user participation
Some of the techniques we have considered so far concentrate on evaluating a design or system through analysis by the designer or an expert evaluator, rather than through testing with actual users.
User participation in evaluation tends to occur in the later stages of development, when there is at least a working prototype of the system in place.
Styles of evaluation
There are two distinct evaluation styles: those performed under laboratory conditions, and those conducted in the work environment or 'in the field'.
Laboratory Studies
In the first type of evaluation study, users are taken out of their normal work environment to take part in controlled tests, often in a specialist usability laboratory.
This approach has a number of benefits and disadvantages.
A well-equipped usability laboratory may contain sophisticated audio/visual recording and analysis facilities.
Laboratory Studies (Cont.)
There are some situations where laboratory observation is the only option, e.g. if the system is to be located in a dangerous or remote location, such as a space station.
Some very constrained single-user tasks may be adequately performed in a laboratory.
The laboratory is also appropriate when you want to manipulate the context in order to uncover problems or observe less used procedures, or when you want to compare alternative designs within a controlled context.
Field Studies
The second type of evaluation takes the designer or evaluator out into the user's work environment in order to observe the system in action.
High levels of ambient noise, greater levels of movement and constant interruptions, such as phone calls, all make field observation difficult.
Field Studies (Cont.)
The very 'open' nature of this situation means that you will observe interactions between systems and between individuals that would have been missed in a laboratory study.
The context is retained and you are seeing the user in his 'natural environment'.