[hci] week 14. evaluation reporting
TRANSCRIPT
Lecture 14
Evaluation Reporting
Human Computer Interaction / COG3103, 2015 Fall. Class hours: Tue 1-3 pm / Thu 12-1 pm. 1 & 3 December
INTRODUCTION
Lecture #14 COG_Human Computer Interaction 3
Figure 17-1 You are here; at reporting, within the evaluation activity in the context of the overall Wheel lifecycle template.
INTRODUCTION
• Importance of Quality Communication and Reporting
– Hornbæk and Frøkjær (2005) show the need for usability evaluation
reports that summarize and convey usability information, not just lists of
problem descriptions by themselves.
– What the report should do
• inform the team and project management about the UX problems in the
current design
• persuade them of the need to invest even more in fixing those problems.
REPORTING INFORMAL SUMMATIVE RESULTS
• What if You Are Required to Produce a Formative Evaluation Report for
Consumption Beyond the Team?
– The informal summative UX results are intended to be used only as a
project management tool and should not be used in public claims.
• What if You Need a Report to Convince the Team to Fix the Problems?
– The need for the rest of your team, including management, to be
convinced of the need to fix UX problems that you have identified in your
UX engineering process could be considered a kind of litmus test for a
lack of teamwork and trust within your organization.
REPORTING QUALITATIVE FORMATIVE RESULTS
• Common Industry Format (CIF) for Reporting Formal Summative UX Evaluation
Results
– A description of the product
– Goals of the testing
– A description of the number and types of participants
– Tasks used in evaluation
– The experimental design of the test (very important for formal summative studies
because of the need to eliminate bias and to ensure the results do not suffer
from external, internal, or other validity concerns)
– Evaluation methods used
– Usability measures and data collection methods employed
– Numerical results, including graphical methods of presentation
REPORTING QUALITATIVE FORMATIVE RESULTS
• Common Industry Format (CIF) for Reporting Qualitative Formative
Results
– Mary Theofanos, Whitney Quesenbery, and others organized two workshops
in 2005 aimed at a CIF for formative reports (Theofanos et al., 2005;
Quesenbery, 2005; Theofanos & Quesenbery, 2005).
– They concluded that requirements for content, format, presentation style,
and level of detail depended heavily on the audience, the business
context, and the evaluation techniques used.
FORMATIVE REPORTING CONTENT
• Individual Problem Reporting Content
– the problem description
– a best judgment of the causes of the problem in the design
– an estimate of its severity or impact
– suggested solutions
FORMATIVE REPORTING CONTENT
• Include Video Clips Where Appropriate
– If you are giving an oral report, show anonymized video clips of users
encountering critical incidents; in a written report, include links to the clips.
• Pay Special Attention to Reporting on Emotional Impact Problems
– Provide a holistic summary of the overall emotional impact on participants.
Report specific positive and negative highlights with examples from
particular episodes or incidents. If possible, try to inspire by comparing
with products and systems having high emotional impact ratings.
• Including Cost-Importance Data
FORMATIVE REPORTING CONTENT
Figure 16-4 The relationship of importance and cost in prioritizing which problems to fix first.
FORMATIVE REPORTING CONTENT
| Problem | Imp. | Solution | Cost | Prio. Ratio | Prio. Rank | Cuml. Cost | Resolution |
| Did not recognize the "counter" as being for the number of tickets; as a result, the user failed to even think about how many tickets he needed. | M | Move quantity information and label it | 2 | M | 1 | 2 | |
| User confused by the button label "Submit" to conclude the ticket-purchasing transaction | 4 | Change the label wording to "Proceed to Payment" | 1 | 4000 | 2 | 3 | |
| User confused about "Theatre" on the "Choose a domain" screen; thought it meant choosing a physical theater (as a venue) rather than the category of theatre arts. | 3 | Improve the wording to "Theatre Arts" | 1 | 3000 | 3 | 4 | |
| Unsure of current date and what date he was purchasing tickets for | 5 | Add current date field and label all dates precisely | 2 | 2500 | 4 | 6 | |
| Did not like having a "Back" button on the second screen, since the first screen was only a "Welcome" | 2 | Remove it | 1 | 2000 | 5 | 7 | |
| Users were concerned about their work being left for others to see | 5 | Add a timeout feature that clears the screens | 3 | 1667 | 6 | 10 | |
| Transaction flow for purchasing tickets (group problem; see Table 16-8) | 3 | Establish a comprehensive and more flexible model of transaction flow and add labeling to explain it. | 5 | 600 | 7 | 15 | |
|---- Line of affordability (16 person-hours = 2 work days) ----|
| Did not recognize what geographical area theater information was being displayed for | 4 | Redesign graphical representation to show search radius | 12 | 333 | 8 | 27 | |
| Ability to find events hampered by lack of a search capability | 4 | Design and implement a search function | 40 | 100 | 9 | 67 | |
Table 16-11 The Ticket Kiosk System cost-importance table, sorted by priority ratio, with cumulative cost values entered, and the "line of affordability" showing the cutoff for this round of problem fixing
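The prioritization behind Table 16-11 can be sketched in code. The priority ratio is importance × 1000 / cost, "M" (must-fix) problems are fixed first regardless of ratio, and problems are fixed in ratio order until the cumulative cost crosses the line of affordability (16 person-hours in the table). The function name and sample data below are hypothetical, chosen only to illustrate the calculation:

```python
# Sketch of a cost-importance analysis (priority ratio = importance * 1000 / cost).
# Problems with importance "M" (must fix) always come first; the rest are sorted
# by priority ratio, and the "line of affordability" caps the cumulative cost.

def prioritize(problems, affordability):
    """problems: list of (description, importance, cost) tuples.
    importance is 1-5 or "M" for must-fix; cost is in person-hours."""
    must_fix = [p for p in problems if p[1] == "M"]
    rated = sorted(
        (p for p in problems if p[1] != "M"),
        key=lambda p: p[1] * 1000 / p[2],  # priority ratio, highest first
        reverse=True,
    )
    plan, cumulative = [], 0
    for desc, imp, cost in must_fix + rated:
        cumulative += cost
        ratio = "M" if imp == "M" else imp * 1000 // cost
        plan.append((desc, ratio, cumulative, cumulative <= affordability))
    return plan

problems = [
    ("Counter not recognized", "M", 2),
    ("'Submit' label confusing", 4, 1),
    ("Search capability missing", 4, 40),
]
for desc, ratio, cuml, affordable in prioritize(problems, 16):
    print(f"{desc}: ratio={ratio}, cumulative cost={cuml}, fix now={affordable}")
```

Sorting by the ratio rather than importance alone is the point of the technique: a cheap medium-importance fix can outrank an expensive high-importance one.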
FORMATIVE REPORTING AUDIENCE, NEEDS, GOALS, AND CONTEXT OF USE
• The 2005 UPA Workshop Report on formative evaluation reporting (Theofanos et al., 2005)
– Documenting the process: The author is usually part of the team, and the goal is to document
team process and decision making. The scope of the “team” is left undefined and could be just
the evaluation team or the whole project team, or perhaps even the development organization
as a whole.
– Feeding the process: This is the primary context in our perspective, an integral part of
feedback for iterative redesign. The goal is to inform the team about evaluation results,
problems, and suggested solutions. In this report the author is considered to be related to the
team but not necessarily a member of the team. However, it would seem that the person most
suited as the author would usually be a UX practitioner who is part of the evaluator team and, if
UX practitioners are considered part of the project team, a member of that team, too.
– Informing and persuading: The audience depends on working relationships within the
organization. It could be the evaluator team informing and persuading the designers and
developers (i.e., software implementers) or it could be the whole project team (or part thereof)
informing and persuading management, marketing, and/or the customer or client.
FORMATIVE REPORTING AUDIENCE, NEEDS, GOALS, AND CONTEXT OF USE
• Introducing UX Engineering to Your Audience
– Engender awareness and appreciation
– Teach concepts
– Sell buy-in
– Present results
• Reporting to Inform Your Project Team
• Reporting to Inform and/or Influence Your Management
• Reporting to Your Customer or Client
FORMATIVE REPORTING AUDIENCE, NEEDS, GOALS, AND CONTEXT OF USE
• Formative Evaluation Reporting Format and Vocabulary
– Jargon
– Precision and specificity
• Formative Reporting Tone
– Respect feelings: Bridle your acrimony
– Accentuate the positive and avoid blaming
• Formative Reporting over Time
FORMATIVE REPORTING AUDIENCE, NEEDS, GOALS, AND CONTEXT OF USE
• Problem Report Effectiveness: The Need to Convince and Get Action
with Formative Results
– problem severity: more severe problems are more salient and carry more
weight with designers
– problem frequency: more frequently occurring problems are more likely to
be perceived as “real”
– perceived relevance of problems: designers who disagreed with usability
practitioners on the relevance (similar to "realness") of problems did not
fix the problems that practitioners recommended be fixed
FORMATIVE REPORTING AUDIENCE, NEEDS, GOALS, AND CONTEXT OF USE
• Reporting on Large Amounts of Qualitative Data
– One possible approach is to use a highly abridged version of the affinity
diagram technique (Chapter 4).
• Your Personal Presence in Formative Reporting
– Do not just write up a report and send it out, hoping that will do the job.
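The abridged affinity-diagram approach mentioned above can be sketched as a simple grouping step. In practice the theme labels emerge bottom-up from the notes during a team session; here they are pre-assigned and the notes are hypothetical, purely to show how a large set of observations collapses into a compact, reportable structure:

```python
# Minimal sketch of an abridged affinity-diagram pass over qualitative notes:
# group raw critical-incident notes under theme labels so they can be
# reported as a handful of themes instead of a long flat list.

from collections import defaultdict

notes = [
    ("navigation", "User could not find the Back button"),
    ("wording", "'Submit' label misread as final purchase"),
    ("navigation", "User lost track of the current screen"),
    ("wording", "'Theatre' interpreted as a venue"),
]

groups = defaultdict(list)
for theme, note in notes:
    groups[theme].append(note)

for theme, items in sorted(groups.items()):
    print(f"{theme} ({len(items)} notes)")
    for item in items:
        print(f"  - {item}")
```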
FORMATIVE REPORTING AUDIENCE, NEEDS, GOALS, AND CONTEXT OF USE
http://www.adityagujaran.me/portfolio-item/rhomania/
FORMATIVE REPORTING AUDIENCE, NEEDS, GOALS, AND CONTEXT OF USE
http://www.smashingmagazine.com/2013/12/16/using-brainwriting-for-rapid-idea-generation/
FORMATIVE REPORTING AUDIENCE, NEEDS, GOALS, AND CONTEXT OF USE
http://people.ischool.berkeley.edu/~sophia.lay/01-portfolio/
Exercise 17-1: Formative Evaluation Reporting for Your System
• Goal
– Write a report of the formative UX evaluation you did on the system of your choice.
• Activities
– Report on your informal summative evaluation results using a table showing UX targets, benchmark tasks,
questionnaires, and so on used to gather data, along with target values and observed values.
– Add brief statements about whether or not each UX target was met.
– Write a full report on a selected subset (about half a dozen) of UX problems found in the qualitative part of your
formative UX evaluation. Follow the guidelines in this chapter regarding content, tone, and format, being sure to
include redesign proposals for each problem.
– Report on the results of your cost-importance analysis, including problem resolutions, for all the problems you
reported previously and, if appropriate, some others for context.
• Deliverables: Your formative evaluation report.
• Schedule: We expect this exercise to take about an hour.
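The informal summative results table asked for in the first activity can be sketched as follows. The target names, instruments, and values are hypothetical, standing in for whatever UX targets you defined for your system; the point is the structure: each row pairs a target value with an observed value and a statement of whether the target was met:

```python
# Sketch of an informal summative results table: UX target, measuring
# instrument, target value, observed value, and whether the target was met.
# For time and error counts, lower observed values are better.

ux_targets = [
    # (UX target, instrument, target, observed, lower_is_better)
    ("Time on benchmark task 1 (s)", "benchmark task", 60, 45, True),
    ("Errors on benchmark task 2",   "benchmark task", 2, 3, True),
    ("Avg. questionnaire rating",    "questionnaire",  4.0, 4.2, False),
]

print(f"{'UX target':<30} {'Target':>7} {'Observed':>9} {'Met?':>5}")
for name, instrument, target, observed, lower_is_better in ux_targets:
    met = observed <= target if lower_is_better else observed >= target
    print(f"{name:<30} {target:>7} {observed:>9} {'yes' if met else 'no':>5}")
```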
Final Submission Content Table
• Part I. Design
– System Concept Statement
– Social Model
– Persona
– Sketches & Wireframes
– Kickstarter Page + Concept Test Results and Analysis
• Part II. UX Evaluation (Due on 8th December)
– A description of the product
– Goals of the testing
– A description of the number and types of participants
– Tasks used in evaluation [Prototype Links or Captures]
– The experimental design of the test (very important for formal summative studies because of the need
to eliminate bias and to ensure the results do not suffer from external, internal, or other validity concerns)
– Evaluation methods used
– Usability measures and data collection methods employed
– Numerical results, including graphical methods of presentation
Guideline
Report the Evaluation Result
• R1: Final Presentation. Due: 6th December (submission due 11:59 pm Mon, 6th December)
• R2: Part I Design + Part II Evaluation. Due: 8th December (the week of independent study)
• R3: Final Submission. By compiling Part I and Part II, complete your Design and Analysis Report. Due: 24th December, the end of the semester (yay)