Usability Issues in a Census Web Survey
Kathleen T. Ashenfelter, Temika Holland, Victor Quach,
Elizabeth Nichols, & Sabin Lakhe
U.S. Census Bureau
Center for Survey Methodology
Human Factors & Usability Research Group
Background
• The Census Bureau is developing an online response option for the 2020 Decennial Census.
• The Census Quality Survey (CQS) was conducted about 5 months after the 2010 Decennial Census
– A reinterview of Census 2010 respondents, using essentially the same questions
– Conducted in order to estimate measurement error from a Census Internet questionnaire compared to that from a paper questionnaire
• We performed usability testing on prototypes of an online form.
Usability Testing
• Methods
• Two rounds of testing were conducted
– First Round: wireframe prototype (n = 5)
– Second Round: fully functioning instrument (n = 34)
• When possible, participants were recruited from complex households.
Procedure
• Participants used a concurrent think-aloud technique while completing the survey
• Participants also completed background and satisfaction surveys
• A debriefing interview was conducted following the session.
Metrics
• Efficiency – time to completion
– Avg. time: 11 min 6 sec
• Satisfaction – score based on the satisfaction survey
– Avg. satisfaction: 7.97 on a 9-point scale
• Usability Issues
– High, Medium, and Low Priority
Usability Issues
• High Priority: These issues can prevent respondents from accomplishing their goals. The user-system interaction is interrupted and no work can continue.
• Medium Priority: These issues slow down and frustrate the user, but do not necessarily halt the interaction.
• Low Priority: These issues are minor, but significant enough to warrant user comments.
Issue 1: Logging In (High Priority)
Login screen for the online CQS Instrument
Issue 1: Logging In (High Priority)
• 11/34 participants in Round 2 had difficulty logging in.
• Many entered 0000s into the Access Code fields
– The example number was changed to #s after those symbols worked better in a concurrent usability test with a nearly identical login screen
– The change appeared to alleviate the problem
Issue 1: Logging In (High Priority)
• The eye-tracking data for the login screen (n = 13) showed that most of the participants ignored the instruction text presented on the page.
• Instead, participants looked at the example image for the necessary information on how to log into the survey.
Issue 1: Logging In
• After the change, no participants entered # symbols into the fields
• Perhaps this is the best method (so far) of presenting an example user ID
• Instructions about mailing materials may need to be more prominent.
• Ultimately, the login screen was changed before it went into production.
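The fix above can be sketched in a few lines. This is a hypothetical illustration, not the Bureau's code: the function name, the grouping parameter, and the three-groups-of-four format are assumptions. The idea is simply to render the example access code from # placeholders rather than literal zeros, so respondents have nothing copyable to type into the real fields.

```typescript
// Hypothetical sketch: build the example access code shown next to the
// login fields from '#' placeholders instead of literal digits, so
// respondents do not mistake the example for a real code.
function exampleAccessCode(groupSizes: number[]): string {
  // One group of '#' characters per field, joined with hyphens.
  return groupSizes.map((n) => "#".repeat(n)).join("-");
}

// e.g. an access code entered as three groups of four characters:
const example = exampleAccessCode([4, 4, 4]); // "####-####-####"
```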
Logging In
Production login screen
Issue 2: Ps Failed to Notice Important Information (High Priority)
• During debriefing, 17/34 Ps said they did not notice the reference date (Jan 1, 2010)
Issue 2: Ps Failed to Notice Important Information (High Priority)
• Gaze opacity screenshot of the first AGE screen with the name in black (n = 13)
Issue 2: Ps Failed to Notice Important Information (High Priority)
• Ps did not notice the reference date
• Any important information should be prominently displayed because Ps do not like to read text.
Issue 3: Use of “Help” Links (Medium Priority)
• Ps did not tend to use the help links
• Only 2 Ps in Round 2 clicked on any help links
• Possibly not prominent enough, and/or Ps possibly won’t click help no matter how prominent it is
Issue 3: Use of “Help” Links (Medium Priority)
CQS Relationship 1 screen (n = 13)
Issue 3: Use of Help Links
• If there is information that a P needs to answer a question, put it on the screen and not behind a help link – it won’t be seen much!
– The CQS only asked demographic questions, so most items probably did not warrant the use of help
– Can be an issue when important info like residence rules are only available behind help links
Issue 4: Auto-Tab Feature
(Medium/Low Priority Issue)
• Auto-tabbing was removed from the login screen because of problems in testing with other instruments
• However, some Ps had difficulty with entering their access codes correctly
• Perhaps because it was not present on the login screen, Ps were generally not expecting it for the later Date of Birth question
– They clicked on the text entry boxes for each DOB field more than they needed to
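The auto-tab behavior discussed above comes down to one small decision per keystroke. As a minimal sketch under assumed names (a real instrument would wire this to DOM focus events; the field model here is an illustration, not the CQS code): focus advances only when the current field has reached its maximum length and another field follows.

```typescript
// Hypothetical sketch of auto-tab logic for a multi-field entry
// such as Date of Birth (MM / DD / YYYY).
interface Field {
  maxLength: number;
  value: string;
}

// After a keystroke in field i, return the index of the field that
// should hold focus: advance only when field i is full and is not
// the last field.
function nextFocusIndex(fields: Field[], i: number): number {
  const full = fields[i].value.length >= fields[i].maxLength;
  return full && i < fields.length - 1 ? i + 1 : i;
}

// Date of Birth as three fields: month, day, year.
const dob: Field[] = [
  { maxLength: 2, value: "04" }, // month is full, so focus advances
  { maxLength: 2, value: "" },
  { maxLength: 4, value: "" },
];
```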
Auto-Tab Feature
• In some situations, like the DOB question, participants expect auto-tabbing
• It can create issues when calculations are made based on the entries in the auto-tabbed boxes (e.g., the wrong age being calculated from the DOB boxes).
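The age-calculation hazard can be made concrete with a short sketch. This function is an illustration, not the CQS code (its name and field-based signature are assumptions): it computes age as of the survey's Jan 1, 2010 reference date from DOB box entries, and shows why a premature auto-tab that leaves the year box truncated silently yields a wildly wrong age.

```typescript
// Hypothetical sketch: age as of the survey reference date
// (Jan 1, 2010 for the CQS), computed from DOB field entries.
function ageOnReferenceDate(
  ref: Date,
  birthYear: number,
  birthMonth: number, // 1-12
  birthDay: number
): number {
  let age = ref.getFullYear() - birthYear;
  // Subtract one if the birthday has not yet occurred by the
  // reference date. Note Date.getMonth() is zero-based.
  const beforeBirthday =
    ref.getMonth() + 1 < birthMonth ||
    (ref.getMonth() + 1 === birthMonth && ref.getDate() < birthDay);
  return beforeBirthday ? age - 1 : age;
}

const refDate = new Date(2010, 0, 1); // Jan 1, 2010

// Complete entry (born Jun 15, 1975) gives the right age; a year box
// left truncated at "197" by a premature auto-tab does not.
```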
Issue 5: Aesthetics of the CQS
(Low Priority Issue)
• Several participants gave negative comments about the aesthetics of the online CQS.
– During debriefing, when asked, six participants commented that they did not like the yellow coloring on the screen.
Issue 5: Aesthetics of the CQS (Low Priority Issue)
Debriefing comments:
• Yellow color
• Outdated look and feel
Conclusions/Discussion
• Iterative usability testing can catch unanticipated issues with Web surveys before they go live
• Respondents do not necessarily recognize an example user ID, so explicit labels and #s are a good way to go
• If there is information that a P needs to answer a question, behind a help link is not the best place
• Leave time in the design schedule for usability testing! It works!
Thank you!
• Kathleen Ashenfelter, U.S. Census Bureau
• 301-763-4922