
EHR Usability Test Report of EyeCare360 v 2.8

Report based on NISTIR 7742 Common Industry Format for Usability Test Reports

EyeCare360 Version 2.8

Date of Usability Test: Testing schedule March 15-27, 2019

Date of Report: April 1, 2019

Report Prepared By: Christina Jesser

Support Center Manager, EyeCare Partners LLC

636.227.2600 ext 2043

[email protected]

15933 Clayton Rd, Clarkson Valley, MO 63011

Table of Contents

1 EXECUTIVE SUMMARY .......................................................................................................................................... 2

2 INTRODUCTION ..................................................................................................................................................... 4

3 METHOD ............................................................................................................................................................... 4

3.1 PARTICIPANTS………………………………………………………………………………………………………………………4

3.2 STUDY DESIGN………………………………………………………………………………………………………………………5

3.3 TASKS……………………………………………………………………………………………………………………………………6

3.4 PROCEDURE………………………………………………………………………………………………………………………….6

3.5 TEST LOCATION…………………………………………………………………………………………………………………….7

3.6 TEST ENVIRONMENT…………………………………………………………………………………………………………….7

3.7 TEST FORMS AND TOOLS………………………………………………………………………………………………………8

3.8 PARTICIPANT INSTRUCTIONS………………………………………………………………………………………………..8

3.9 USABILITY METRICS……………………………………………………………………………………………………………...8

4 DATA SCORING ...................................................................................................................................................... 9

5 RESULTS .............................................................................................................................................................. 10

5.1 PARTICIPANTS……………………………………………………………………………………………………………………10

5.2 EFFECTIVENESS………………………………………………………………………………………………………………….11

5.3 EFFICIENCY………………………………………………………………………………………………………………………..12

5.4 SATISFACTION……………………………………………………………………………………………………………….…..12

5.5 MAJOR FINDINGS………………………………………………………………………………………………………….…..12

5.6 AREAS FOR IMPROVEMENT………………………………………………………………………………………….……12

5.7 RISK ASSESSMENT………………………………………………………………………………………………………….….13

6 APPENDICES

1: Sample Recruiting Screener

2: Participant Demographics

3: Non-disclosure Agreement (NDA) and Informed Consent Form

4: Example Data Loggers Guide, also referred to as Moderators Guide

5: Safety Enhanced Design Checklist and Safety Enhanced Design Data Log

6: DrFirst, Rcopia Version 4 Usability Report and Safety Enhanced Design Checklist


1 EXECUTIVE SUMMARY

A usability test of EyeCare360 version 2.8 was conducted from March 15 – 27, 2019 at various
times by the EyeCare Partners LLC Support Center, with Christina Jesser serving as administrator
and Ashley Jones as data logger. The purpose of this test was to validate the usability of the
current user interface and provide evidence of usability in the EHR Under Test (EHRUT).

During the usability test, users holding various positions in the healthcare industry, including
practice administrator, front desk scheduler, certified ophthalmic technician, physician scribe,
hospital laboratory technician, and physician coding specialist, and matching the target
demographic criteria, served as participants and used the EHRUT in simulated but representative tasks.

The study collected performance data on 43 tasks typically conducted in an EHR. Tasks
included: recording/changing/displaying a lab order via CPOE; recording/changing/displaying an
imaging order via CPOE; recording/changing/displaying patient demographic information, including
preferred language, date of birth, sex, race, ethnicity, gender identity, and sexual orientation;
recording/changing a problem on the problem list; displaying the active/historic problem list;
recording/changing a medication on the medication list; displaying the active/historic medication
list; recording/changing a medication allergy on the medication allergy list; displaying the
active/historic medication allergy list; adding/changing/viewing/triggering Clinical Decision
Support interventions/resources on a variety of data elements; recording/changing/viewing the UDI
for an implantable device; incorporating a CCDA and reconciling medications, medication allergies,
and problems in the CCDA with existing patient records; and generating a CCDA with the reconciled
information. An integrated partner was used for the 12 remaining tasks involving e-prescribing,
including formulary/non-formulary drug recognition, drug-drug/drug-allergy interactions,
creating/changing/canceling/refilling a prescription, receiving refill status notifications, and
requesting/receiving medication history information. The partner site is accessible via single
sign-on in the EHR, website login, and smartphone app. Information reported for these tasks was
taken from the integrated partner's Safety Enhanced Design report (People, 2017) attached in
Appendix 2.

During the 60- to 120-minute one-on-one usability test, each participant was greeted by the
administrator and asked to review and sign an informed consent/release form (included in
Appendix 3). They were instructed that they could withdraw at any time. Participants did not have
prior experience with the EHR. The participants were given step-by-step instructions for the
actions to perform to complete the tasks necessary for usability testing. The administrator
introduced the test and instructed participants to complete a series of tasks using the instruction
guides provided for EyeCare360 version 2.8. During the testing, the administrator timed the test


and, along with the data logger, recorded user performance data on paper and electronically. The
administrator did not assist participants in completing the tasks.

Participant screens, head shots, and audio were recorded for subsequent analysis. The following

types of data were collected for each participant:

• Number of tasks successfully completed within the allotted time without assistance

• Time to complete the tasks

• Number and types of errors

• Path deviations

• Participant verbalizations

• Participant's satisfaction ratings of the system

All participant data was de-identified; no link can be made between a participant's identity
and the data recorded in this report. Following the conclusion of the testing,
participants were asked to answer post-test questions and were compensated with a gift card

for their time. Various recommended metrics, in accordance with the examples set forth in

the NIST Guide to the Processes Approach for Improving the Usability of Electronic Health

Records, were used to evaluate the usability of the EHRUT. Results of the testing can be found

in the 2015 Ed SED Checklist and 2015 SED Data Log attached in Appendix 5.

The results from the System Usability Scale rated subjective satisfaction with the system,
based on performance of these tasks, as above satisfactory (Tullis, 2008).

In addition to the performance data, the following qualitative observations were made:

Major Findings

o Each participant commented during post-test questioning that the system is

“easy to use”.

o Each participant commented that entering the UDI for the Implantable

Device was time consuming.

o Several participants commented that Clinical Decision Support should be an

administrative function and not part of standard usability.

Areas for Improvement

o While Clinical Decision Support creation was the task that was deemed most

difficult by participants, this is an administrative function within our

organization. Due to limited user access, no modifications will be made to

the process.


o Our organization will examine the need to add a barcode scanning solution

to the system for Implantable Device UDI recording. The frequency with which our
organization will use this function will be weighed against the investment required
to implement recording the UDI via barcode.

2 INTRODUCTION

The EHRUT tested for this study was EyeCare360, version 2.8. Designed to present medical

information to healthcare providers in optometric settings, the EHRUT consists of various aspects

including but not limited to: scheduling, sales, ordering, inventory management, and a fully

functioning EMR. The usability testing attempted to represent realistic exercises and conditions.

The purpose of this study was to test and validate the usability of the current user interface, and

provide evidence of usability in the EHR Under Test (EHRUT). To this end, measures of effectiveness,

efficiency and user satisfaction, such as time to complete task, deviations from task guides, and errors

made during the task were captured during the usability testing.

Eyecare360 v 2.8 was designed for use in an optometric practice. The features of this software are

focused toward benefiting an optometric practice including its patients, doctors, opticians,

technicians, support groups, and management.

3 METHOD

3.1 PARTICIPANTS

A total of 14 participants were recruited for this study. Due to circumstances such as scheduling
conflicts or lack of preparedness for testing, only 10 participants were tested in the EHRUT.

Participants in the test came from a variety of backgrounds. Participants were recruited
by Christina Jesser and were compensated with a gift card for their time. In addition, the
participants had never used EyeCare360 prior to test participation, were not associated with the
testing or supplier organization, and had no direct connection to the development of the software.
Participants were given a modified orientation and level of training relative to what our actual
end users would have received.

For the test purposes, end user characteristics were identified and translated into a recruitment

screener used to solicit potential participants. An example of the screener is provided in

Appendix 1.

Recruited participants had a mix of backgrounds and demographic characteristics conforming to

the recruitment screener. The following is a table (Table 1) of participants by characteristics

including demographics, professional experience, computing experience, and user needs for

assistive technology. Participant names were replaced with Participant IDs so that an individual’s

data cannot be tied back to individual identities.


Participant ID | Gender | Age | Education | Occupation/Role | Professional Experience | Computer Experience | Prior EHR Product Experience | Assistive Technology Needs
02 | F | 40-49 | College Grad | Practice Administrator | 15+ Years | EHR access and research | 1 | No
03 | F | 40-49 | Some College | Front Desk Scheduler | 5 Years | EHR access and research | 2 | No
04 | M | 30-39 | High School Grad | Certified Ophthalmic Tech | 13 Years | EHR access and research | 3 | No
05 | M | 40-49 | Some College | Scribe | 10+ Years | EHR access and research | 2 | No
06 | F | 20-29 | Some College | Certified Ophthalmic Tech | 5 Years | EHR access and research | 1 | No
07 | F | 20-29 | Some College | Scribe | 7 Years | EHR access and research | 3 | No
08 | F | 20-29 | College Grad | Certified Ophthalmic Tech | < 1 Year | EHR access and research | 1 | No
09 | F | 20-29 | Some College | Certified Ophthalmic Tech | 7 Years | EHR access and research | 2 | No
10 | F | 20-29 | College Grad | Lab Tech/Nursing Student | 10 Years | EHR access and research | 2 | No
11 | F | 30-39 | Some College | Physician Coder/Scribe | 2 Years | EHR access and research | 3 | No

Table 1 – Participant Characteristics

Fourteen potential participant interviews were performed, fourteen individuals were recruited,
and ten individuals participated in the usability test. The four recruits who were screened but
did not participate were either unavailable during test times or were cancelled due to lack of
preparedness in the test environment.

Participants were scheduled for 120-minute sessions, with 15 minutes before the beginning of each
session and 5 minutes between sessions for training/debrief by the administrator or data
logger. The system was reset to proper test conditions during these breaks. A spreadsheet was
used to track the participant schedule and included each participant's demographics as
provided by the recruiter.

3.2 STUDY DESIGN

The overall objective of this test was to uncover areas where the application performed well –
that is, effectively, efficiently, and with satisfaction – and areas where the application failed
to meet the needs of the participants. The data from this test may serve as a baseline for future
tests with an updated version of the same EHR and/or for comparison with other EHRs, provided
the same tasks are used. In short, this testing serves both to record and benchmark
current usability and to identify where improvements must be made.

During the usability test, participants interacted with one EHR, EyeCare360. Each participant

used the system on the same laptop and was provided the same instructions. The system was

evaluated for effectiveness, efficiency and satisfaction as defined by measures collected and

analyzed for each participant:

• Number of tasks completed without assistance

• Time to complete the tasks

• Number and types of errors

• Path deviations

• Participant verbalizations (comments)


• Participant satisfaction rating of the system

3.3 TASKS

A number of tasks were constructed that would be realistic and representative of the kinds of

activities a user might do with this EHR, including:

• Creating/changing/viewing a patient demographic record and recording information such as
name, address, date of birth, sex, gender identity, sexual orientation, race, and ethnicity

• Scheduling a patient appointment

• Recording vital signs including blood pressure and pulse

• Recording/changing/viewing medications and medication allergies

• Creating/changing/viewing imaging and diagnostic tests and their results

• Sending/receiving CCDA messages with other providers

• Creating/reconciling medication orders, medication allergies and problems

Additional tasks that are not frequently used in an optometry practice environment were
selected based on test criteria from The Office of the National Coordinator for Health Information
Technology and proctoring guidelines from Drummond Group, Inc., in reference to Test Procedure
170.314.g3 Safety Enhanced Design, including:

• Creating/changing/viewing laboratory and radiology tests and recording the results

• Creating/changing/viewing Clinical Decision Support rules

• Recording/viewing Implantable Device UDI

Results of tasks and testing by the integrated e-prescribing vendor, DrFirst, are included in the

report to prove usability in additional areas, including:

• Triggering drug-drug and drug-allergy interactions while entering a new medication order

• Creating/changing/canceling a prescription

• Receiving a fill status notification

• Refilling a prescription

• Requesting and receiving medication history information

This test data was obtained from the DrFirst.com Rcopia V3 Usability Report dated June 20, 2017.

This integrated partner can be accessed by our users via single sign on in EyeCare360, online at

web.drfirst.com with use of secure, registered username and password, or by accessing the

iPrescribe app on a smart device using a secure, registered username and password.

3.4 PROCEDURES

Upon arrival, participants were greeted, and their identity was verified and matched with the

name on the participant schedule. Participants were then assigned a participant ID. Each

participant reviewed and signed an informed consent release form and non-disclosure

agreement (Appendix 3). The test administrator witnessed the participant's signature.

To ensure that the test ran smoothly, two staff members participated in this test: the usability

administrator and the data logger. The usability testing staff conducting the test were


experienced usability practitioners with 11-15 years of experience, varying educational
backgrounds, and a firm understanding of the Safety Enhanced Design requirements.

The administrator moderated the session including administering instructions and tasks. The

administrator also monitored task times, obtained post-task rating data, and took notes on

participant comments. A second person served as the data logger and took notes on task

success, path deviations, number and type of errors, and comments.

Participants were instructed to perform tasks:

• As quickly as possible while making as few errors and deviations as possible

• Without assistance; administrators were allowed to give immaterial guidance and
clarification on tasks, but not instruction on use

• Without using a "think aloud" technique

For each task, the participants were given a written copy of the task. Task timing began once the
participant started recording the task session. The task time stopped once the participant
paused the recording after the task had been successfully completed, or in the event that the
participant was unable to complete the task. Scoring is discussed in Section 3.9.

Following the session, the data logger asked the participant a series of post-test questions

(System Usability Scale), compensated them with a gift card for their time, and thanked each

individual for their participation. Participant demographic information, task success rate, time on

task, errors, deviations, verbal responses, and post-test questionnaire responses were recorded

into a spreadsheet. Participants signed a receipt and acknowledgement form indicating they had

received compensation.

3.5 TEST LOCATION

The testing area included a waiting area, a test area with laptop with an attached 20-inch

monitor for ease of viewing by the participant, and recording devices controlled by the

administrator. Only the participant, administrator and data logger were in the test area. To

ensure that the environment was comfortable for users, noise levels were kept to a minimum

with an ambient temperature within a normal range. All of the safety instruction and evacuation

procedures were valid, in place, and visible to the participants.

3.6 TEST ENVIRONMENT

The EHRUT would typically be used in an optometry office. In this instance, the testing was
conducted in a business office. For testing, the laptop used was a Dell Latitude 3480 running
Windows 10. The participants used a mouse and keyboard while interacting with the EHRUT.

The EHRUT connected to a 20-inch monitor, set at 1680 x 1050 resolution and system adjusted

color settings. The application was set up by the administrator according to vendor

documentation provided by EyeCare Partners, LLC Software Development, which described the

system setup and preparation. The application itself was running on a network provided by

EyeCare Partners, LLC using a test database on a LAN connection. Technically, the system

performance was representative of what actual users would experience in a field

implementation. Additionally, participants were instructed not to change any default system

settings.


3.7 TEST FORMS AND TOOLS

During the usability test, various documents and instructions were used, including:

• Informed Consent

• Data Loggers Guide

• System Usability Questionnaire

• Usability Test Tasks Training Guide

• Non-Disclosure Agreement

Examples of these documents can be found in the attached Appendices, respectively. The Data
Loggers Guide was designed to capture the required data.

The participant’s interaction with the EHRUT was captured and recorded digitally with screen

capture software running on the test machine. A video/web camera and microphone built into

the laptop recorded each participant's facial expressions and reactions, synced with the screen capture.

The test results were transmitted electronically to a nearby office so the results could be

confirmed and recorded by the Data Logger.

3.8 PARTICIPANT INSTRUCTIONS

The administrator read the following instructions aloud to each test participant (see also the
full Data Loggers Guide in Appendix 4):

Thank you for participating in the study. Your input is very important. Our session today

will last 80-120 minutes. During that time, you will use an instance of an electronic health

record. I will ask you to complete tasks using the system and then answer questions. You

should complete the tasks as quickly as possible making as few errors as possible. Please

try to complete the tasks on your own using the instructions provided before testing

begins. Please note that we are not testing you, we are testing the system, therefore if

you have difficulty, all this means is that something needs to be improved in the system. I

will be here to provide specific instruction, but I am unable to instruct you or provide you

help in how to use the application once testing begins.

Overall, we are interested in how easy or difficult this system is to use, what in it would be

useful to you, and how we could improve it. All of the information that you provide will be

kept confidential and your name will not be associated with your comments at any time.

Should you find it necessary, you may withdraw from the testing at any time.

Following the instructions, participants were shown the EHR and, as their first task, were given 5
minutes to explore the system and make comments. Once this task was complete, the
administrator gave the following instructions:

For each task I will show you how to perform the task, answer any questions you may

have, then provide you with a task guide and begin to record the session. At that point,

please perform the task and pause the recording once you have completed the task. I

would like to ask that you not talk aloud or verbalize while you are doing the tasks. I will

ask you your impressions of the system and about the task once you are done.

3.9 USABILITY METRICS

According to the NIST Guide to the Processes Approach for Improving the Usability of Electronic

Health Records, EHRs should support a process that provides a high level of usability for all users.


The goal is for users to interact with the system effectively, efficiently, and with an acceptable

level of satisfaction. To this end, metrics for effectiveness, efficiency and user satisfaction were

captured during the usability testing. The goals of the test were to assess:

• Effectiveness of EyeCare360 by measuring participant success rates and errors

• Efficiency of EyeCare360 by measuring the average task time and path deviations

• Satisfaction with EyeCare360 by measuring ease of use ratings
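Satisfaction was measured post-session with the System Usability Scale. As a minimal sketch, assuming the conventional 10-item SUS instrument (the report does not reproduce its questionnaire items), the score on the 100-point scale referenced in Section 4 is conventionally computed as follows:

```python
def sus_score(responses):
    """Conventional SUS scoring (Brooke, 1996): ten items rated 1-5.

    Odd-numbered items contribute (response - 1); even-numbered items
    contribute (5 - response). The sum is multiplied by 2.5 to map the
    0-40 raw range onto a 0-100 usability score.
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("expected ten responses in the range 1-5")
    raw = sum((r - 1) if i % 2 == 0 else (5 - r)
              for i, r in enumerate(responses))
    return raw * 2.5

# A neutral answer (3) to every item yields the midpoint score.
print(sus_score([3] * 10))  # -> 50.0
```

This assumes the standard alternating positive/negative item wording of the SUS; if the study used a modified questionnaire, the item-level arithmetic would differ.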

4 DATA SCORING

The following table (Table 2) details how tasks were scored, errors evaluated, and the time data
analyzed (Tullis, 2008).

MEASURES: RATIONALE AND SCORING

EFFECTIVENESS: Task Success
A task was counted as a "Success" if the participant was able to achieve the correct outcome, without assistance, within the time allotted on a per-task basis. The total number of successes was calculated for each task and then divided by the total number of times that task was attempted. The results are provided as a percentage.
Task times were recorded for successes. Observed task time divided by the optimal time for each task is a measure of optimal efficiency. Optimal task performance time, as benchmarked by expert performance under realistic conditions, is recorded when constructing tasks. Target task times used in the Data Loggers Guide were defined by taking multiple measures of optimal performance and multiplying by a factor of 1.25 to allow some time buffer, because the participants are not trained to expert performance. Thus, if expert, optimal performance on a task was 60 seconds, then the allotted task time was 60 * 1.25 = 75 seconds. This ratio was aggregated across tasks and reported with mean and variance scores.

EFFECTIVENESS: Task Failure
If the participant abandoned the task, did not reach the correct answer, performed it incorrectly, or reached the end of the allotted time before successful completion, the task was counted as a "Failure." No task times were recorded for failures. The total number of errors was calculated for each task and then divided by the total number of times that task was attempted. Not all deviations were counted as errors. This was also expressed as the mean number of failed tasks per participant.

EFFICIENCY: Task Deviations
The participant's path through the application was recorded. Deviations occur if the participant, for example, went to a wrong screen, clicked on an incorrect menu item, followed an incorrect link, or interacted incorrectly with an on-screen control. This path was compared to the optimal path. The number of steps in the observed path is divided by the number of optimal steps to provide a ratio of path deviation.

EFFICIENCY: Task Time
Each task was timed from when the administrator said "Recording" until the participant paused the recording. If he or she failed to pause the recording, the time was stopped when the participant stopped performing the task. Only times for tasks that were successfully completed were included in the average task time analysis. Average time per task was calculated for each task. Variance measures (standard deviation and standard error) were also calculated.

SATISFACTION: Task Rating
Participants' subjective impressions of the ease of use of the application were measured by administering both a simple post-task question and a post-session questionnaire. After each task, the participant was asked to rate "Overall, this task was:" on a scale of 1 (Very Difficult) to 5 (Very Easy) (Tullis & Albert, 2006). This data was averaged across participants. Common convention is that average ratings for systems judged easy to use should be 3.3 or above. System Usability Scores were based on a scale of 1 (Very Unusable) to 100 (Very Usable).

Table 2 – Details of how observed data were scored
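The scoring rules above can be expressed concretely. As an illustrative sketch (the per-attempt records and field names here are hypothetical, not the study's actual spreadsheet layout), the completion rate, path deviation ratio, 1.25 time buffer, and task time statistics would be computed as:

```python
from statistics import mean, stdev

# Hypothetical per-attempt log records for one task; field names are
# illustrative only.
attempts = [
    {"success": True,  "time_s": 52, "steps": 9},
    {"success": True,  "time_s": 61, "steps": 8},
    {"success": False, "time_s": 75, "steps": 12},
]
OPTIMAL_TIME_S = 60  # expert benchmark time for this task
OPTIMAL_STEPS = 8    # steps in the optimal path

# Allotted time: optimal expert time with the 1.25 buffer factor.
allotted_s = OPTIMAL_TIME_S * 1.25  # 75.0 seconds

# EFFECTIVENESS: successes divided by attempts, as a percentage.
completion_rate = 100 * sum(a["success"] for a in attempts) / len(attempts)

# EFFICIENCY: observed steps over optimal steps (path deviation ratio),
# and time statistics over successfully completed attempts only.
deviation_ratios = [a["steps"] / OPTIMAL_STEPS for a in attempts]
success_times = [a["time_s"] for a in attempts if a["success"]]
mean_time_s = mean(success_times)
sd_time_s = stdev(success_times)
```

With the sample data above, the completion rate is 2 of 3 attempts and only the two successful times enter the mean and standard deviation, mirroring the Task Failure rule that no times are kept for failed tasks.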


5 RESULTS

5.1 DATA ANALYSIS AND REPORTING

The results of the usability test were calculated according to the methods specified in the

Usability Metrics section above. Participants who failed to follow session and task instructions

had their data excluded from the analyses. Participant 3 had several instances of failing to
follow the instructions and deviated from the path by not following the provided task guide,
which the participant later described as "looking for shortcuts". The participant's results were
excluded only for tasks related to 170.315.a.9 Clinical Decision Support, as the administrator
repeatedly instructed the participant to adhere to the provided task instructions. The excluded
tasks were not included in the means or averages of the results.

The usability testing results for DrFirst Rcopia (Table 3) and EyeCare360 (Table 4) are detailed

below. The results should be seen in light of the objectives and goals outlined in Section 3.2

STUDY DESIGN. The data should yield actionable results that, if corrected, yield material, positive

impact on user performance.

Task | Mean Task Time (mm:ss) | SD (mm:ss) | Completion Rate (%) | Mean # Path Deviations | SD | Mean Task Satisfaction | SD
1 Med Allergy List – Add Allergy | 1:47 | 0:54 | 100% | 0.70 | 0.64 | 4.30 | 1.00
2.1 – 2.2 Drug-Allergy Interaction Check | 0:56 | 0:37 | 100% | 0.50 | 0.67 | 4.10 | 1.22
3.1 Electronic Prescribing | 2:49 | 1:10 | 90% | 1.60 | 1.74 | 3.50 | 1.20
4.1 – 4.2 Electronic Rx Cancel/Change | 3:21 | 1:35 | 90% | 2.00 | 1.84 | 3.30 | 1.10
5.1 – 5.2 Electronic Rx – Renew/Refill | 0:42 | 0:21 | 100% | 0.20 | 0.60 | 4.80 | 0.60
6.1 – 6.4 Problem List CDS Intervention | 0:35 | 0:12 | 100% | 0.00 | 0.00 | 5.00 | 0.00
7.1 Medication List; Medication Stop | 0:22 | 0:09 | 100% | 0.00 | 0.00 | 4.80 | 0.60
8.1 – 8.4 Medication Order; Drug/Drug Interaction | 2:23 | 0:21 | 100% | 0.10 | 0.30 | 4.70 | 0.46
9 Medication Allergy List – Remove Allergy | 0:38 | 0:19 | 100% | 0.00 | 0.00 | 4.50 | 0.67

Table 3 – DrFirst Rcopia V 4 Testing Results


Task | Mean Task Time (h:mm:ss) | SD (h:mm:ss) | Completion Rate (%) | Mean # Path Deviations | SD | Mean Task Satisfaction | SD
10.1 Record Lab Order via CPOE | 0:00:40 | 0:00:07 | 100% | 0.10 | 0.32 | 4.7 | 0.48
10.2 Change Lab Order via CPOE | 0:00:12 | 0:00:05 | 100% | 0.00 | 0.00 | 4.8 | 0.42
10.3 Display Lab Order via CPOE | 0:00:03 | 0:00:02 | 100% | 0.00 | 0.00 | 4.8 | 0.42
11.1 Record Imaging Order via CPOE | 0:00:36 | 0:00:09 | 100% | 0.00 | 0.00 | 4.7 | 0.48
11.2 Change Imaging Order via CPOE | 0:00:12 | 0:00:07 | 100% | 0.10 | 0.32 | 4.8 | 0.42
11.3 Display Changed Imaging Order via CPOE | 0:00:03 | 0:00:02 | 100% | 0.00 | 0.00 | 5.0 | 0.00
12.1 Record Required Demographic Information | 0:00:53 | 0:00:09 | 100% | 0.40 | 0.70 | 4.8 | 0.42
12.2 Change Required Demographic Information | 0:00:22 | 0:00:07 | 100% | 0.00 | 0.00 | 4.8 | 0.42
12.3 Display Changed Demographic Information | 0:00:04 | 0:00:02 | 100% | 0.00 | 0.00 | 5.0 | 0.00
6.1 Record a Problem to the Problem List | 0:00:35 | 0:00:04 | 100% | 0.20 | 0.42 | 4.7 | 0.48
6.2 Change a Problem in the Problem List | 0:00:11 | 0:00:05 | 100% | 0.00 | 0.00 | 4.8 | 0.42
6.3 Display the Active Problem List | 0:00:03 | 0:00:01 | 100% | 0.10 | 0.32 | 5.0 | 0.00
6.4 Display the Historical Problem List | 0:00:02 | 0:00:00 | 100% | 0.10 | 0.32 | 5.0 | 0.00
13.1 Record a Medication to the Medication List | 0:00:35 | 0:00:11 | 100% | 0.10 | 0.32 | 4.8 | 0.42
13.2 Change a Medication on the Medication List | 0:00:13 | 0:00:10 | 100% | 0.00 | 0.00 | 4.8 | 0.42
13.3 Display the Active Medication List | 0:00:03 | 0:00:02 | 100% | 0.00 | 0.00 | 5.0 | 0.00
13.4 Display the Historical Medication List | 0:00:02 | 0:00:01 | 100% | 0.00 | 0.00 | 5.0 | 0.00
14.1 Record a Medication Allergy | 0:00:24 | 0:00:06 | 100% | 0.10 | 0.32 | 4.8 | 0.42
14.2 Change a Medication Allergy | 0:00:10 | 0:00:04 | 100% | 0.00 | 0.00 | 4.8 | 0.42
14.3 Display the Active Medication Allergy List | 0:00:02 | 0:00:00 | 100% | 0.00 | 0.00 | 5.0 | 0.00
14.4 Display the Historical Medication Allergy List | 0:00:02 | 0:00:01 | 100% | 0.00 | 0.00 | 5.0 | 0.00
15.1 Add CDS Interventions/Resources: Problem List, Medication List, Medication Allergy List, Demographic, Laboratory Test, Vital Sign and a Combination of at least 2 of the above elements | 0:09:28 | 0:00:38 | 90% | 0.10 | 0.33 | 2.9 | 0.78
15.2 Trigger the CDS Intervention/Resources Added | 0:01:00 | 0:00:07 | 90% | 0.10 | 0.35 | 4.2 | 0.67
15.3 View Intervention/Resource via Infobutton | 0:00:12 | 0:00:07 | 90% | 0.00 | 0.00 | 4.6 | 0.73
15.4 Trigger CDS Interventions/Resources by Incorporating Transition of Care/Referral Summary | 0:00:12 | 0:00:02 | 90% | 0.00 | 0.00 | 4.3 | 0.71
15.5 Access Attributes for one of the Triggered CDS Interventions/Resources: Bibliographic, Citation, Developer, Funding Source, Release/Revision Date | 0:00:05 | 0:00:01 | 90% | 0.10 | 0.33 | 4.4 | 0.73
16.1 Record UDI | 0:00:43 | 0:00:15 | 100% | 0.00 | 0.00 | 4.2 | 0.63
16.2 Change UDI Status | 0:00:26 | 0:00:05 | 100% | 0.00 | 0.00 | 4.5 | 0.53
16.3 Access UDI, Device Description, Identifiers, Attributes | 0:00:05 | 0:00:01 | 100% | 0.00 | 0.00 | 4.9 | 0.32
17.1 Incorporate CCDA and Reconcile Medication/Medication Allergy/Problems with Information Currently in Patient Record | 0:01:14 | 0:00:14 | 100% | 0.20 | 0.63 | 4.0 | 0.67
17.2 Generate a New CCDA with Reconciled Data | 0:00:53 | 0:00:16 | 100% | 0.10 | 0.32 | 4.0 | 0.67

Table 4 EyeCare360 V 2.8 Testing Results


As Table 4 shows, relative to the optimal performance standards as defined by EyeCare Partners, LLC, participant performance in the EyeCare360 usability test was very satisfactory. The overall average task completion rate was ninety-nine percent (99%).

5.2 EFFECTIVENESS

There were various deviations by participants while performing the tasks. However, each error was unique and is attributed to the individual participant’s interpretation of the training and instruction.

5.3 EFFICIENCY

When analyzing the time each participant spent on each task, time on task was typically within reason, with few exceptions. An experienced user of the software performed the tasks to develop the control times. These control times were used in the formula for target task completion time (T = S x 1.25), which was used to assess whether the time participants took to complete a task indicated that the task was more difficult for a less experienced user. The administrator did not assign time constraints to the tasks during the EHRUT. A majority of the tasks were completed in a reasonable amount of time in comparison to the target task completion times.
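The target-time check described above can be sketched in Python. The only part taken from this report is the formula T = S x 1.25; the task names and times below are illustrative, not the study's data.

```python
# Flag tasks whose mean participant time exceeded the target time,
# where the target is the expert's control time S plus a 25% allowance.

def target_time(control_seconds: float) -> float:
    """Target task completion time: T = S * 1.25."""
    return control_seconds * 1.25

def flag_difficult_tasks(tasks: dict) -> list:
    """tasks maps task name -> (control_seconds, mean_participant_seconds).

    Returns the names of tasks where participants exceeded the target,
    i.e. tasks that may be more difficult for less experienced users.
    """
    return [
        name
        for name, (control, observed) in tasks.items()
        if observed > target_time(control)
    ]

if __name__ == "__main__":
    # Hypothetical data: expert time vs. mean participant time, in seconds.
    sample = {
        "Record Lab Order": (30.0, 40.0),   # 40 > 37.5, so flagged
        "Display Lab Order": (3.0, 3.0),    # 3 <= 3.75, not flagged
    }
    print(flag_difficult_tasks(sample))  # ['Record Lab Order']
```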

5.4 SATISFACTION

Participants were asked to score each task on a scale of 1 to 5, where 1 was very difficult and 5 was very easy. The results indicate that the mean rating was consistently above the 3.3 mean necessary to deem the software “usable.” Participants were also asked to offer an overall rating on a scale of 1 to 100 at the end of the test session, where 1 indicated the participant felt the software was very unusable and 100 indicated the participant felt the software was very usable. The mean score was 87.5. According to the SUS (System Usability Scale), scores below 60 represent systems with poor usability, while systems with scores over 80 are considered above average. (Tullis T. &., 2008)
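The per-task scoring convention can be illustrated with a short snippet. The ratings below are made up; only the 1-5 scale and the 3.3 usability threshold come from this report.

```python
from statistics import mean

# Hypothetical post-task ratings from ten participants (1 = very difficult,
# 5 = very easy), for a single task.
ratings = [5, 4, 5, 4, 5, 5, 4, 5, 5, 4]

# Conventional cutoff: mean ratings of 3.3 or above indicate a system
# judged easy to use.
USABILITY_THRESHOLD = 3.3

task_mean = mean(ratings)
print(f"mean rating: {task_mean:.2f}")                 # mean rating: 4.60
print(f"usable: {task_mean >= USABILITY_THRESHOLD}")   # usable: True
```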

5.5 MAJOR FINDINGS

In analyzing the results of the EHRUT, Clinical Decision Support creation was found to be the most difficult task for participants. Because this task will remain an administrative function performed solely by the EyeCare Partners, LLC software development team, the developers determined it would be imprudent to change the way this function operates: changes to this part of the system would have no benefit to the targeted users.

5.6 AREAS FOR IMPROVEMENT

Findings suggest that the software as a whole is user friendly and functional, with the exception of Clinical Decision Support creation. Participants commented on EyeCare360’s functionality and ease of use:

“I like the color scheme on the scheduler. It gives an overall impression of the day at a quick glance.”

“The developers did a great job writing this program. It is very user friendly.”

“There were many things I liked in EyeCare360. It is different than what I am used to using but I think I could learn to use it very easily.”

“I really like the Launchpad. It keeps the desktop clean and the windows less cluttered.”


The only negative feedback concerned Clinical Decision Support creation and the absence of a barcode scanner for UDI entry.

5.7 RISK ASSESSMENT

The risk assessments for DrFirst Rcopia (Table 5) and EyeCare360 (Table 6) present the list of tasks with the associated risk to the patient if the user were to make an error. Risk is measured using the following scale:

NONE – An error by the user will result in no risk to the patient
LOW – An error by the user will result in little risk to the patient
MODERATE – An error by the user will result in some risk to the patient
HIGH – An error by the user will result in high risk to the patient

Task | Description | Risk Status
1 Med Allergy List – Add Allergy | Add a patient reported allergy | NONE
2.1 – 2.2 Drug-Allergy Interaction Check | Prescribe a drug, react to alerts | NONE
3.1 Electronic Prescribing | Electronically prescribe and view interaction | LOW
4.1 – 4.2 Electronic Rx Cancel/Change | Modify electronic prescription | LOW
5.1 – 5.2 Electronic Rx – Renew/Refill | Renew/refill electronic Rx | NONE
6.1 – 6.4 Problem List CDS Intervention | Add and update problems | NONE
7.1 Medication List; Medication Stop | View medication list; stop a medicine | NONE
8.1 – 8.4 Medication Order; Drug/Drug Interaction | Order new medications, view and react to alerts | NONE
9 Medication Allergy List – Remove Allergy | Remove a patient allergy | NONE

Table 5 DrFirst Rcopia V 4 Risk Assessment, recorded from DrFirst_Rcopia_4_SED_2018Edition_SummativeStudy.pdf

Task | Risk Status
10.1 Record Lab Order via CPOE | LOW
10.2 Change Lab Order via CPOE | LOW
10.3 Display Lab Order via CPOE | NONE
11.1 Record Imaging Order via CPOE | LOW
11.2 Change Imaging Order via CPOE | LOW
11.3 Display Changed Imaging Order via CPOE | NONE
12.1 Record Required Demographic Information | NONE
12.2 Change Required Demographic Information | NONE
12.3 Display Changed Demographic Information | NONE
6.1 Record a Problem to the Problem List | NONE
6.2 Change a Problem in the Problem List | NONE
6.3 Display the Active Problem List | NONE
6.4 Display the Historical Problem List | NONE
13.1 Record a Medication to the Medication List | LOW
13.2 Change a Medication on the Medication List | LOW
13.3 Display the Active Medication List | NONE
13.4 Display the Historical Medication List | NONE
14.1 Record a Medication Allergy | LOW
14.2 Change a Medication Allergy | LOW
14.3 Display the Active Medication Allergy List | NONE
14.4 Display the Historical Medication Allergy List | NONE
15.1 Add CDS Interventions/Resources: Problem List, Medication List, Medication Allergy List, Demographic, Laboratory Test, Vital Sign and a Combination of at least 2 of the above elements | LOW
15.2 Trigger the CDS Intervention/Resources Added | NONE
15.3 View Intervention/Resource via Infobutton | NONE
15.4 Trigger CDS Interventions/Resources by Incorporating Transition of Care/Referral Summary | NONE
15.5 Access Attributes for one of the Triggered CDS Interventions/Resources: Bibliographic, Citation, Developer, Funding Source, Release/Revision Date | NONE
16.1 Record UDI | LOW
16.2 Change UDI Status | LOW
16.3 Access UDI, Device Description, Identifiers, Attributes | NONE
17.1 Incorporate CCDA and Reconcile Medication/Medication Allergy/Problems with Information Currently in Patient Record | NONE
17.2 Generate a New CCDA with Reconciled Data | NONE

Table 6 EyeCare360 V 2.8 Risk Assessment


6 APPENDICES

The following appendices include supplemental data for this usability test report.

1: Sample Recruiting Screener

2: Participant Demographics

3: Non-disclosure Agreement (NDA) and Informed Consent Form

4: Example Data Logger’s Guide, also referred to as Moderator’s Guide

5: Safety Enhanced Design Checklist and Safety Enhanced Design Data Log

6: DrFirst, Rcopia Version 4 Usability Report and Safety Enhanced Design Checklist


APPENDIX 1: SAMPLE RECRUITING SCREENER

Script:

Hello, my name is ___________. I am with EyeCare Partners, LLC. I am recruiting individuals to participate in a usability study for electronic health record software. I would like to ask you a few questions to determine if you qualify and would like to participate. This should only take a few minutes of your time. This is strictly for research purposes. If you are interested and qualify for the study, you will be paid to participate.

Can I ask you a few questions?

1. Are you male or female?

2. Have you participated in a focus group or usability test in the past 6 months?

3. Do you, or does anyone in your home, work in marketing research, usability research, or web design? [if so terminate]

4. Do you, or does anyone in your home, have a commercial or research interest in electronic health record software or a consulting company? [if so terminate]

5. Which of the following best describes your age? [20-29; 30-39; 40-49; 50-59; 60-69; 70 and older]

6. Which of the following best describes your race or ethnic group? [Caucasian, Asian, Black/African-American, Latino/a or Hispanic, etc.]

7. Do you require any assistive technologies to use a computer? [if so describe]

Professional Demographics

8. What is your current/most recent position and title?

a. Optometrist : _____________

b. Physician: Specialty ___________

c. Certified Technician : Specialty ____________

d. Administrative Staff _____________

e. Other [specify]________________

9. How long have you held this position?

10. Describe your work location and environment. [private practice, health system, government clinic]

11. Which of the following describes your highest level of education? [high school graduate/GED, some college, college graduate (RN, BSN), postgraduate (MD/PhD), other]

Computer Expertise

12. Besides reading email, what professional activities do you do on the computer? [e.g., access EHR; research; reading news; shopping/banking; digital pictures; programming/word processing, etc.] [if no computer use at all terminate]

13. About how many hours per week do you spend on the computer?

14. What computer platform do you usually use? [Mac, Windows]


15. What Internet browser do you usually use? [e.g., Firefox, IE, Chrome, etc.]

16. In the last month, how often have you used an electronic health record?

17. How many years have you used electronic health records?

18. How many EHRs do you use or are you familiar with?

19. How does your work environment manage patient records?

a. On paper

b. Some paper, some electronic

c. All electronic

Contact Information (if the person matches the qualifications, continue)

Those are all the questions I have for you. Your background matches the people we’re looking for, and for your participation you will be paid $50 via gift card.

Would you be able to participate on [date time]? (If so collect contact information)

May I get your contact information?

Name of participant:

Address:

City, State, Zip:

Daytime phone number:

Evening phone number:

Alternate [cell] phone number:

Email address:

Before your session starts, we will ask you to sign a release form allowing us to record your session. The recording will only be used internally for further study needs. Will you consent to being recorded? This study will take place at 217 Clarkson Rd, Ballwin, Mo 63011. I will confirm your appointment a couple of days before your session and provide you with directions to our office. What time is the best time to reach you?


Appendix 2: PARTICIPANT DEMOGRAPHICS [de-identified]

The following is a high-level overview of the participants in this study.

Gender: Men 2, Women 8 (Total 10)

Occupation: RN/BSN 0, Optometrist 0, Admin Staff 2, Specialist Tech 5, Scribe 3 (Total 10)

Facility Use of EHR: All Paper 0, Some Paper/Some Electronic 3, All Electronic 7 (Total 10)


Appendix 3: NON-DISCLOSURE AGREEMENT AND INFORMED CONSENT FORM

Non-Disclosure Agreement

THIS AGREEMENT is entered into as of ___________, 2019, between _______________________ (“the Participant”) and the testing organization EyeCare Partners, LLC, located at 15933 Clayton Rd, Suite 210, Clarkson Valley, Mo, 63011.

The Participant acknowledges his or her voluntary participation in today’s usability study may bring the Participant into possession of Confidential Information. The term “Confidential Information” means all technical and commercial information of a proprietary or confidential nature which is disclosed by EyeCare Partners, LLC, or otherwise acquired by the Participant, in the course of today’s study. By way of illustration, but not limitation, Confidential Information includes trade secrets, processes, formulae, data, know-how, products, designs, drawings, computer aided design files and other computer files, computer software, ideas, improvements, inventions, training methods and materials, marketing techniques, plans, strategies, budgets, financial information, or forecasts.

Any information the Participant acquires relating to this product during this study is confidential and proprietary to EyeCare Partners, LLC and is being disclosed solely for the purposes of the Participant’s participation in today’s usability study. By signing this form, the Participant acknowledges that s/he will receive monetary compensation for feedback and will not disclose the confidential information obtained today to anyone else or any other organizations.

Participant’s printed name: ___________________________________________________________

Signature: _______________________________________________ Date: ____________________


Informed Consent

EyeCare Partners, LLC would like to thank you for participating in this study. The purpose of this study is to evaluate an electronic health records system. If you decide to participate, you will be asked to perform several tasks using the prototype and give your feedback. The study will last about 45 minutes. At the conclusion of the test, you will be compensated for your time.

Agreement

I understand and agree that as a voluntary participant in the present study conducted by EyeCare Partners, LLC I am free to withdraw consent or discontinue participation at any time. I understand the recording may be copied and used by EyeCare Partners, LLC without further permission. I understand and agree that the purpose of this study is to make software applications more useful and usable in the future. I understand and agree that the data collected from this study may be shared outside of EyeCare Partners, LLC and with EyeCare Partners, LLC clients. I understand and agree that data confidentiality is assured, because only de-identified data – i.e., identification numbers, not names – will be used in analysis and reporting of the results. I agree to immediately raise any concerns or areas of discomfort with the study administrator. I understand that I can leave at any time.

Please check one of the following:

YES, I have read the above statement and agree to be a participant.

NO, I choose not to participate in this study.

Signature: _________________________________________ Date: _____________________________


Appendix 4: Data Logger’s Guide

EHRUT Usability Test
Administrator: _______________________
Data Logger: ________________________
Date: _________________________ Time: _____________
Participant #: ____________
Location: EyeCare Partners, LLC, IT Department, 217 Clarkson Road, Ellisville, Mo, 63011

Prior to testing:
- Confirm schedule with participants
- Ensure EHRUT lab environment is running properly
- Ensure lab and data recording equipment is running properly

Prior to each participant:
- Reset application
- Start session recordings with tool

Prior to each task:
- Reset application to starting point for next task

After each participant:
- End session recordings with tool

After all testing:
- Back up all video and data files

GENDER | AGE | EDUCATION | OCCUPATION/ROLE | PROFESSIONAL EXP | COMPUTER EXP | PRIOR EHR PRODUCT EXP | ASSISTIVE TECHNOLOGY NEEDS


Orientation

Thank you for participating in this study. Our session today will last 45 minutes. During that time, you will take a look at an electronic health record system. I will ask you to complete a few tasks using this system and answer some questions. We are interested in how easy (or difficult) this system is to use, what in it would be useful to you, and how we could improve it. You will be asked to complete these tasks on your own, trying to do them as quickly as possible with the fewest possible errors or deviations. Do not do anything more than asked. If you get lost or have difficulty, I cannot answer or help you with anything to do with the system itself. Please save your detailed comments until the end of a task or the end of the session as a whole, when we can discuss freely. I did not have any involvement in its creation, so please be honest with your opinions. The product you will be using today is a test version of EyeCare360 PM and EMR. Some of the data may not make sense, as it is test data. We are recording this session today. All of the information that you provide will be kept confidential, and your name will not be associated with your comments at any time. Do you have any questions or concerns?

Preliminary Questions (5 minutes)

What is your job title/appointment?
How long have you been working in this role?
What are some of your main responsibilities?
Tell me about your experience with electronic health records.


First Impressions (timed)

This is the application you will be working with. Have you ever heard of it? ___Yes ___No
If so, tell me what you know about it.

Show test participant the EHRUT.

Please don’t click on anything just yet. What do you notice? What are you able to do here? Please be specific.

Notes/Comments:


Criteria 170.315: Medication List/Medication Allergy List

Take the participant to the starting point for the task. The medication list and medication allergy list tasks involve entering, editing, and accessing a patient’s medications and medication allergies.

Success:
Easily completed / Completed with difficulty or help (describe below) / Not completed
Comments:

Task Time: _______ seconds

Was Optimal Path Followed? YES / NO
Correct / Minor deviations (describe below) / Major errors (describe below)
Comments:

Observed Errors:

Rating: Overall, this task was: ______ (rate 1-5)
Show participant written scale: “Very Difficult” (1) to “Very Easy” (5)

Administrator/Note taker Comments:










Final Questions

What was your overall impression of this system?
What aspects of the system did you like the most?
What aspects of the system did you like the least?
Were there any features that you were surprised to see?
What features did you expect to encounter but did not see? That is, is there anything missing in this application?
How does this system compare to other systems you have used?
Would you recommend this system to your colleagues? YES NO Why or why not?
On a scale of 1-100, 1 being very unusable and 100 being very usable, how would you rate this system?
Do you have any additional comments before we end the session?