

HITWG03N006

Proposed AAMI Provisional American National Standard AAMI HIT1000-4 (PS):20XX (Committee Draft)

12 December 2019

Health IT Software and Systems – Part 4: Application of human factors engineering

Association for the Advancement of Medical Instrumentation

This is a working draft of AAMI HIT1000-4.

Abstract: Describes how to apply human factors engineering to HIT system and software user interfaces throughout the HIT product lifecycle to ensure such systems are reasonably safe and effective.

Keywords: human factors engineering, human factors, usability, health software, health IT, safety, effectiveness, security, health IT system, sociotechnical system, use error


Association for the Advancement of Medical Instrumentation

901 N. Glebe Rd., Suite 300

Arlington, VA 22203

www.aami.org

© 2019 by the Association for the Advancement of Medical Instrumentation

All Rights Reserved

Publication, reproduction, photocopying, storage, or transmission, electronically or otherwise, of all or any part of this document without the prior written permission of the Association for the Advancement of Medical Instrumentation is strictly prohibited by law. It is illegal under federal law (17 U.S.C. § 101, et seq.) to make copies of all or any part of this document (whether internally or externally) without the prior written permission of the Association for the Advancement of Medical Instrumentation. Violators risk legal action, including civil and criminal penalties, and damages of $100,000 per offense. For permission regarding the use of all or any part of this document, complete the reprint request form at www.aami.org or contact AAMI at 901 N. Glebe Road, Suite 300, Arlington, VA 22203. Phone: +1-703-525-4890; Fax: +1-703-525-1424.

Printed in the United States of America


Health IT Software and Systems – Part 4: Application of human factors engineering

WD stage

Warning

This document is not a standard. It is distributed for review and comment. It is subject to change without notice and may not be referred to as a standard. It may not be distributed beyond the membership of the AAMI Health IT initiative without explicit permission from AAMI ([email protected]).

Recipients of this draft are invited to submit, with their comments, notification of any relevant patent rights of which they are aware and to provide supporting documentation.

Please see the AAMI (ANSI) Patent Policy < http://s3.amazonaws.com/rdcms-aami/files/production/public/FileDownloads/Standards/AAMI_ANSI_Patent_Policy.pdf > and the AAMI Antitrust Statement < http://s3.amazonaws.com/rdcms-aami/files/production/public/FileDownloads/Standards/AAMIAntitrust.pdf > for more information.


Contents

Committee Representation
Foreword
Introduction
1 Scope
2 Normative References
3 Terms and definitions
4 Context and concepts
5 Development stage human factors engineering
5.1 General Development stage human factors engineering process
5.2 Development stage optimizing the user experience through user-centered design and evaluation methods
5.3 Development stage conducting user research
5.4 Development stage managing use-related risk
5.5 Development stage designing the user interface
5.6 Development stage evaluating the user interface
6 Acquisition stage human factors engineering
6.1 Procurement
6.2 Guidance and good practice for procurement
7 Integration stage human factors engineering
7.1 General Integration stage human factors engineering
7.2 Integration stage of human factors engineering process
7.3 Guidance and good practice for integration stage of human factors engineering process
8 Implementation stage human factors engineering
8.1 General Implementation stage - human factors engineering
8.2 Implementation stage - human factors engineering process
8.3 Guidance and good practice for implementation stage - human factors engineering process
9 Operational use in the clinical setting stage - human factors engineering
9.1 General operational use in the clinical setting stage - human factors engineering
9.2 Operational use in the clinical setting stage - human factors engineering process
9.3 Guidance and good practice for Operational stage - human factors engineering process
10 Decommissioning stage - human factors engineering
10.1 General decommissioning stage - human factors engineering
10.2 Decommissioning stage - human factors engineering process
10.3 Guidance and good practice for Decommissioning stage - human factors engineering process
Annex A Summary of Requirements


Annex B Considerations for conducting validation usability tests
B.1 Overview of validation usability tests
B.2 Recommendations for conducting successful validation usability tests (See IEC 62366-1 and IEC 62366-2)
B.3 Additional considerations for usability testing
Annex C Sample expert review findings
C.1 Sample Finding (Opportunity for Improvement): Placement of referral function
C.2 Sample Finding (Opportunity for improvement): Use of ellipses
C.3 Sample Finding (Strength): Allergy Alerts
Bibliography and cited references


Committee Representation

At the time the document was published, the AAMI HIT/WG03 – HIT Usability Committee had the following members:

Chair: Zach Hettinger, MedStar Institute National Center for Human Factors Engineering in Healthcare
Michael Wiklund, Emergo by UL – Human Factors Research & Design

Members:
Pat Baird, Philips
Janey Barnes, User-View Inc
Rick Botney, Oregon Health & Science University
David Brick, Independent expert
Neil Gardner, Independent expert
Daryle Jean Gardner-Bonneau, Bonneau and Associates
Richard Gibson, Association of Medical Directors of Information Systems
Zach Hettinger, MedStar Institute National Center for Human Factors Engineering in Healthcare
Tina Mirchi, DexCom Inc
Elaine Plonski, Independent expert
Jody Polk, ICU Medical Inc
Harsha Sripuram, Boston Scientific Corporation
Sharon Stanford, American Dental Association
Sandra Stuart, Kaiser Foundation Health Plan/Hospitals
Matt Weinger, Vanderbilt University Medical Center
Nancy Wilck, Department of Veterans Affairs National Center for Patient Safety
Michael Wiklund, Emergo by UL – Human Factors Research & Design
Nicole Zuk, Siemens Healthineers

Alternates:
Shilo Anders, Vanderbilt University Medical Center
Elisabeth George, Philips
Jeremy Jensen, Boston Scientific Corporation
Susumu Nozawa, Siemens Healthineers
Simon Psavko, DexCom Inc
Mark Segal, Independent Expert
Walter Suarez, Kaiser Foundation Health Plan/Hospitals
Karen Zimmer, Independent expert

Liaisons:
Darren Dahlin, Cantel
Holly Chico Drake, DexCom
Sherm Eagles, Software CPR
David Osborn, Philips
Robert Phillips, Siemens Healthineers
Beth Pumo, Kaiser Permanente

At the time the document was published, the AAMI – Health IT Committee had the following members:

Chair: David Classen, University of Utah Hospital and Clinics
Mark Segal, Independent expert

Members:
Pat Baird, Philips
Rick Botney, Oregon Health & Science University
Jane Carrington, University of Arizona - College of Nursing
Gerard Castro, The Joint Commission
Richard De La Cruz, Silver Lake Group Inc
Sherman Eagles, SoftwareCPR
Neil Gardner, Independent expert
Richard Gibson, Association of Medical Directors of Information Systems


Peter Goldschmidt, World Development Group Inc
William Greenrose, Deloitte Risk and Financial Advisory
Zach Hettinger, MedStar Institute National Center for Human Factors Engineering in Healthcare
Erich Murrell, US Army Medical Materiel Agency
Vidya Murthy, MedCrypt
Susumu Nozawa, Siemens Healthineers
Mike Powers, Christiana Care Health Service
Susan Regli, University of Pennsylvania Health System
Rebecca Schnall, Columbia University
Jeanie Scott, Stratton VA Medical Center
Elliot Sloane, Center for Healthcare Information Research and Policy
Jeffrey Smith, American Medical Informatics Association
John Snyder, US Dept of Health & Human Services
Harsha Sripuram, Boston Scientific Corporation
Sharon Stanford, American Dental Association
Sandra Stuart, Kaiser Foundation Health Plan/Hospitals
Matt Weinger, Vanderbilt University Medical Center
Michael Wiklund, Emergo by UL – Human Factors Research & Design
Michael Willingham, 98point6 Inc
Marisa Wilson, Alliance for Nursing Informatics (ANI)
Ben Nhi Xavier, ICU Medical Inc
Karen Zimmer, Independent expert

Alternates:
Elisabeth George, Philips
Andrew Gettinger, US Dept of Health & Human Services
Jeremy Jensen, Boston Scientific Corporation
Brian Pate, Software CPR
Walter Suarez, Kaiser Foundation Health Plan/Hospitals
S. Sree Vivek, ICU Medical Inc
Nicole Zuk, Siemens Healthineers

Liaisons:
Patty Krantz-Zuppan, Medtronic Inc Campus
David Osborn, Philips
Robert Phillips, Siemens Healthineers
Frank Pokrop, Sotera Wireless Inc
Beth Pumo, Kaiser Permanente
Elizabeth Quill, American Society of Anesthesiologists
Diane Warner, American Health Information Management Association
Diane Wurzburger, GE Healthcare

NOTE—Participation by federal agency representatives in the development of the document does not constitute endorsement by the federal government or any of its agencies.


Foreword

This standard, HIT1000-4, Safety and effectiveness of health IT software and systems—Part 4: Application of human factors engineering, is published as a provisional National Standard—a standard for trial use—and must be processed as a full American National Standard within 2 years of its publication date (see front cover).

This document has been processed in accordance with ANSI's requirements for a Provisional American National Standard. The Provisional Standards will undergo the standards development process set forth in AAMI's accredited procedures. This Provisional ANS or pertinent Provisional Amendment(s) shall be withdrawn on or before DD MONTH YYYY.

Comments on this standard or suggestions for improving it are invited. Comments and suggested revisions should be sent to Technical Programs, AAMI, 901 N. Glebe Road, Suite 300, Arlington, VA 22203, or by email to [email protected].


Introduction

Note: This introduction does not contain provisions of AAMI HIT1000-4 (PS), Health IT Software and Systems – Part 4: Application of human factors engineering, but it does provide important information about the development and intended use of the document.

The vital role that standards for quality systems, risk management, and usability can play in enhancing the safety and effectiveness of health IT has been recognized both in the United States [1] and globally [2]. Safety and effectiveness are properties of health IT software or systems that directly impact patient outcomes; quality systems, human factors (usability) engineering, and risk management are tools, in turn, that support safety and effectiveness.

This triad (quality systems, risk management, and usability) is used successfully in many high-risk industries, including medical devices, nuclear engineering, and aeronautics. Existing general standards addressing this triad (e.g., ISO 9001:2015 or ISO 31000:2018), however, are organization-focused and do not sufficiently address the complexities of the health IT world, where responsibility for safety and efficacy is shared among many different organizations and stakeholders across the product lifecycle. Standards for regulated healthcare technology (e.g., medical device standards, such as ANSI/AAMI HE75:2009 or ANSI/AAMI/IEC 62366-1:2015) provide very useful concepts and direction but are developed to support regulatory compliance; applying them in the health IT sector is difficult as the regulatory status of components and systems (especially health software) and the regulatory responsibilities of stakeholders vary by product and jurisdiction.

There is a need for standards specific to health IT that address the full range of stakeholders across the health IT lifecycle. The AAMI HIT1000 series is intended to address this need. The standards supplement existing quality management systems, risk management frameworks, and human factors engineering processes. They also facilitate shared responsibility among stakeholders by identifying specific roles and defining the responsibilities needed to ensure health IT safety and quality. The HIT1000 series will provide a common framework for cooperation and collaboration among the many organizations and individuals that develop, implement, and use health IT.

The AAMI HIT1000 series (Safety and effectiveness of health IT software and systems) is envisioned to initially comprise the following parts:

‒ Part 1: Fundamental concepts, principles, and requirements
‒ Part 2: Application of quality systems principles and practices
‒ Part 3: Application of risk management
‒ Part 4: Application of human factors engineering.

This part of the series, HIT1000-4, describes how to apply human factors engineering to HIT systems and software user interfaces throughout the HIT product lifecycle to ensure such systems are reasonably safe and effective. Throughout this document, the term "HIT system" refers to both HIT systems and software.

[1] See especially the April 2014 FDASIA Health IT Report: Proposed Strategy and Recommendations for a Risk-Based Framework.
[2] See Report of the ISO/TC 215-IEC/SC 62 Joint Task Force on Health Software (available from International Organization for Standardization ISO/TC 215 or IEC/SC 62A, Geneva). International Standards for health IT are under development in a joint ISO/IEC working group (ISO/TC 215-IEC/SC 62A Joint Working Group 7). AAMI manages this Joint Working Group and is ensuring coordination between the international work and the development of the HIT1000 series. The International Standards will take several years to complete and may be considered for adoption at that time, if they reflect the specific needs of the U.S. health IT sector. (See note 4 below.)


AAMI HIT1000 – Health IT Software and HIT Systems – Part 4: Application of human factors engineering

1 Scope

1.1 This part of the HIT1000 standard describes an approach to developing and validating a health IT system's user interface so that such systems are safe and effective. The intent is to promote good development practices without being overly prescriptive.

1.2 In line with the approach of the HIT1000 series, this document describes how to apply a user-centered design process throughout the entire health IT lifecycle. As such, this guidance covers the development, acquisition, integration, implementation, and operational use lifecycle stages. Additionally, the guidance includes a section describing usability considerations for health IT system replacement and decommissioning.

1.3 Given the importance of human factors engineering during the development process and the iterative nature of health IT system design and development, the development section generally serves to outline the user-centered design process in detail, while the following sections (e.g., integration, implementation) include references to certain sub-steps within the user-centered design process (e.g., usability evaluations, risk management).

2 Normative References

The following documents, in whole or in part, are normatively referenced in this document and are indispensable for its application. For references with dates, only the edition cited applies. For undated references, the latest edition of the referenced document (including any amendments) applies.

AAMI HIT1000-1 (PS):2018, Safety and effectiveness of health IT software and systems – Part 1: Fundamental concepts, principles and requirements

3 Terms and definitions

3.1
decision support
Provides health care providers and patients with knowledge and person-specific information, intelligently filtered or presented at appropriate times, to enhance health and health care

[Source: FDASIA Health IT Report, 2014]

3.2
effectiveness
Accuracy and completeness with which users achieve specified goals

Note: In the context of this standard, the "specified goal" is enabling the intended healthcare diagnosis, treatment, and care in order to improve patient outcomes.

[Adapted from ISO 9241-11:2018]

3.3
efficiency
Resources expended in relation to effectiveness

[Adapted from ISO 9241-11:2018]

Note: Additional terms and definitions may be found in AAMI HIT1000-1 (PS), Safety and effectiveness of health IT software and systems – Part 1: Fundamental concepts, principles, and requirements.

3.4


ethnographic studies
Observational research of people behaving in and interacting with their natural use environment.

3.5
human factors
Human factors engineering (HFE) is the application of knowledge about human capabilities (physical, sensory, emotional, and intellectual) and limitations to the design and development of tools, devices, systems, environments, and organizations.

[Adapted from AAMI/ANSI HE75:2009]

3.6
usability
Extent to which a product or system can be used by intended users to achieve their goals with effectiveness, efficiency, and satisfaction in the intended contexts of use.

Note: All aspects of usability, including effectiveness, efficiency, and user satisfaction, can potentially affect safety.

[Note adapted from ISO 9241-11:2018 and ANSI/AAMI/IEC 62366-1:2015]

3.7
usability testing
(1) User testing conducted with representative end users to obtain direct information and observations about how people will use the system, whether they encounter problems, and how they deal with those problems. (2) Procedure to assess usability and to determine whether usability goals have been achieved.

Note: Usability tests can be performed in a laboratory setting, in a simulated environment, or in the environment of intended use.

[Source: ANSI/AAMI HE75:2009]

3.8
use error
User action or lack of user action while using the health IT software or system that leads to a different result than that intended by the developer or expected by the user

Note 1: Use errors include the inability of the user to complete a task.

Note 2: Use errors can result from a mismatch between the characteristics of the user, user interface, task, or use environment.

Note 3: A user might be aware or unaware that a use error has occurred.

Note 4: An unexpected physiological response of the patient is not by itself considered use error.

[Adapted from ANSI/AAMI/IEC 62366-1:2015]

3.9
user interface
Means by which the user and the health IT software and system interact

[Adapted from ANSI/AAMI/IEC 62366-1:2015]


4 Context and concepts

For the purposes of this document, the context and concepts in HIT1000-1 apply.

The establishment and maintenance of a human factors engineering process is essential in ensuring that a health IT product does not compromise patient safety. It is an activity that requires collaborative, frequent, and open dialogue among the various roles across the lifecycle, from initial design to post-deployment monitoring.

Note: The principles below (adapted from HIT1000-1) summarize best practices for applying human factors engineering to HIT software and health IT system development and implementation. These principles are adapted from NIST GCR 15-996 – Technical Basis for User Interface Design of Health IT (see Wiklund et al. 2015).

The human factors engineering process should be applied in parallel with a use-related risk management process [3], which details and analyzes related risks, ensuring that the product does not enable users to commit a safety-related use error or that the risk associated with such use errors is mitigated to the greatest possible extent. Developers should convene a dedicated, multidisciplinary team to consider, eliminate, or mitigate sources of use-related risk. Key components of the human factors engineering process include:

• determining user needs;
• identifying opportunities for improved efficiency and effectiveness;
• identifying potential use errors;
• determining use-related risks;
• establishing user interface requirements;
• designing the user interface;
• conducting formative usability tests;
• conducting summative (i.e., validation) usability tests.
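The standard does not prescribe any tooling for these components, but a development team will typically track each one alongside the evidence it produces. The Python sketch below is one hypothetical way to record that status; the activity names, data structure, and status values are illustrative only and are not defined by HIT1000-4.

```python
from dataclasses import dataclass, field

# Hypothetical activity names mirroring the key HFE components listed above;
# HIT1000-4 does not prescribe these identifiers or this structure.
HFE_ACTIVITIES = [
    "determine user needs",
    "identify efficiency/effectiveness opportunities",
    "identify potential use errors",
    "determine use-related risks",
    "establish user interface requirements",
    "design the user interface",
    "conduct formative usability tests",
    "conduct summative (validation) usability tests",
]

@dataclass
class HfeActivityRecord:
    """Tracks one HFE activity: its status and the artifacts it produced."""
    name: str
    status: str = "not started"                          # e.g., "in progress", "complete"
    artifacts: list[str] = field(default_factory=list)   # e.g., report or file references

def open_hfe_plan() -> dict[str, HfeActivityRecord]:
    """Create an empty tracking record for each key HFE activity."""
    return {name: HfeActivityRecord(name) for name in HFE_ACTIVITIES}

def incomplete_activities(plan: dict[str, HfeActivityRecord]) -> list[str]:
    """List activities that still lack evidence of completion."""
    return [a.name for a in plan.values() if a.status != "complete"]

if __name__ == "__main__":
    plan = open_hfe_plan()
    plan["identify potential use errors"].status = "complete"
    plan["identify potential use errors"].artifacts.append("use-error analysis v0.3")
    print(incomplete_activities(plan))
```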

Developers should establish and maintain a diverse user base, enabling them to obtain rapid feedback throughout the design and development process, as well as to identify and address usability issues that occur after implementation. Organizations should engage the user base to participate in regular user research activities and identify a core group of intended users representing diverse demographic characteristics, practice settings (e.g., small community hospital, large research hospital, outpatient clinic), and clinical specialties.

Considering typical and optimal user workflows and clinicians' mental models of frequent, urgent, and critical tasks will improve product efficiency, usability, and user satisfaction. As such, developers should collaborate with representative users to draft user profiles and use cases to better understand users, tasks, and workflows. Additionally, further investigating users' typical workflows will help develop an understanding of the cognitive requirements of particular tasks, users' mental models of particular tasks, and the nature of user collaboration and interaction.

Conducting a variety of human factors engineering (i.e., user-centered design) activities helps developers form a multi-faceted understanding of user interactions with the product. Organizations should plan human factors engineering activities that provide relevant and appropriate input during particular phases of the design and development process, as well as at key junctures throughout the health IT system lifecycle (e.g., shortly after implementation and after major system updates).

Conducting multiple formative tests in the early stages of the design process, and using test findings to inform product development, facilitates safe and usable health IT system design. Organizations should allocate resources to formative usability testing throughout the product development lifecycle, focusing on phases in which study findings can still inform health IT system design and design changes are less costly (i.e., relative to later-stage testing). Finally, summative evaluations (also known as HFE validations) should be conducted before implementation to ensure that the product is safe and effective for use by the intended users in the intended use environments.

Table 1, repeated from HIT1000-1 for convenience, outlines roles and responsibilities as relevant for the health IT system lifecycle.

[3] AAMI HIT1000-3 (PS):2019, Safety and effectiveness of health IT software and systems – Part 3: Application of risk management


Table 1 – Lifecycle roles and responsibilities

Top Management – Group of people who direct and control an organization and thus have overall responsibility for a health IT System and are accountable for its performance

Business Owner – The healthcare organization procuring, using, and decommissioning HIT software or health IT systems and accountable for its overall safety and effectiveness within the context of the healthcare sociotechnical ecosystem

Developer – Role responsible for the design, development, and maintenance of the HIT software
Note: This would include any actor that modifies, customizes, or creates features, functionality, or user experience attributes of HIT software or systems.

Integrator – Role responsible for the technical installation, configuration, and integration of HIT software or health IT systems with the other technology being used by the healthcare organization

Implementer – Role responsible for the clinical installation (including workflow alignment, training, etc.) of the HIT software or health IT systems in the clinical setting (an Implementer may be the Business Owner, Developer, or Integrator)

Operator – Role responsible for keeping the HIT software or health IT system operational (and/or may be the implementers for a managed service)

User – Persons interacting with (i.e., operating or handling) the HIT software or system, which may include, for example, consumers in the case of personal health records


The roles and responsibilities related to human factors engineering are shown below in Table 2. At each stage of the HIT software and system lifecycle, the role that assumes ownership for human factors engineering is identified.

Table 2 – Health IT software lifecycle stages, roles, and activities involved in HFE

Each entry gives the lifecycle stage or activity, the step definition and HFE activities needed during the step, and the role(s) involved in HFE.

Design & Development

Overarching responsibility – Design and development are processes using resources to transform requirements (inputs) into characteristics or specifications (outputs) for HIT products. Organizations shall follow an internal process (e.g., plan, standard operating procedure) to ensure human factors engineering is applied and communicated effectively during the software lifecycle. Role(s): Developer

Concept/scope – Conceiving, imagining, and specifying the initial design of the aesthetics and primary functions of the software. Defining the clinical scope, the intended use environment, and how the HIT product is to be used while identifying potential failure scenarios. Role(s): Developer, User


Requirement analysis – A requirement is a need, expectation, or obligation. It can be stated or implied by an organization, its customers, or other interested parties. Organizations shall identify requirements for HIT systems based on user research activities. Role(s): Developer

Task analysis – Process in which all potential User interactions with the software are analyzed as a means to identify potential use errors, particularly those with the potential to cause significant harm. The task analysis shall be incorporated into the risk analysis. Role(s): Developer

Design – A design is concerned with how user requirements will be met and how a problem or problems are to be resolved, inclusive of aesthetics and navigation. Organizations shall conduct comprehensive risk analysis to identify the user interface design requirements to be implemented in the HIT system's design. Role(s): Developer

Development – The design is transformed into an HIT product. Throughout the process, the organization shall ensure risks are mitigated through the HIT system's design and/or implementation. Role(s): Developer

Formative Evaluation – Conduct formative usability evaluations throughout the development processes and use the results to improve the product's design/efficacy. Role(s): Developer

Verification – The output of the development step is reviewed, inspected, or tested to establish and document that it correctly implements the requirements. Safety requirements are tested for effectiveness. Role(s): Developer

Summative Evaluation – Conducting summative evaluations (also known as HFE Validation) on the production-equivalent system. Role(s): Developer

Delivery – A release is a specific version of the product that is made available by distribution to owners or Implementers for a particular purpose. Role(s): Developer

Transition point from Developer to Business Owner – At this transition, the Developer provides information to the Business Owner that is sufficient for the Business Owner to determine that the HIT product meets the organization's needs. The Safety Assurance Case report (i.e., summative usability test report) provided by the Developer demonstrates the product can be used safely and effectively by its intended Users in the intended use environment. It is the Business Owner's responsibility to then make the correct determination about the level of risk and effort required to ensure safe use of the HIT product when implemented within their local socio-technical environment. The Business Owner then factors that determination into their procurement decision.

Acquisition

Procurement – Defining requirements and acquiring a solution to meet the organization's needs through an available product or engaging an organization for the production of bespoke or in-house developed products. Role(s): Business Owner, Developer


Transition point from Business Owner to Integrator – At this transition, the Business Owner provides the Integrator with (1) the planned context of use of the HIT software product in the specific socio-technical ecosystem and (2) any known requirements from the acquisition stage for configuration or customization of the HIT software, including (but not limited to) training, and monitoring of the integrated HIT system. The Safety Assurance Case report (i.e., summative usability test report) provided by the Developer serves as input to the Integrator's risk management process.

Integration
The stage risk owner during integration is the Integrator. The Developer, Implementer, and Operator are also involved in usability.

Installation – Software conformance testing and certification may also be included in the integration step, either as a first or pre-installation step. Usability parameters for successful integration of the HIT software in the health IT system are defined.

Configuration – Configuring the HIT software and other supporting components of the health IT system to address the organization's specific requirements. Human factors engineering is performed to ensure safety of the configuration. Role(s): Integrator, Developer, Business Owner, Operator

Customization – Modifications or additions to the HIT software or other components of the health IT system that require customized coding (as they cannot be addressed through configuration). Human factors engineering is performed to identify and control any risk introduced during customization. Role(s): Integrator, Developer

Integration – Connection of the HIT software with the other health IT system components (e.g., to allow for data exchange or validation). Human factors engineering is performed on the resulting health IT system after the integration of the HIT software. Role(s): Integrator

Data extraction and transformation – Transforming and loading source data into the appropriate tables in the health IT system. Human factors engineering is performed to consider the potential risk in transforming and loading the source data. Role(s): Integrator

Integration testing – Testing the configuration, integration, or interfaces between components of the health IT system (e.g., between different components, such as the operating system, file system, and hardware, as well as interfaces with other health IT systems with which the HIT software needs to communicate). Usability controls are tested for effectiveness and to ensure any new risks that are identified are mitigated. Role(s): Integrator

Transition point from Integrator to Implementer – At this transition, the Integrator provides the Implementer additional information in the Safety Assurance Case report about any hazards that were identified during integration, including those that may have emerged during configuration and customization.



Assumptions, mitigation strategies, and evidence or logic for adequacy of mitigations are also provided. Any hazards that are expected to be mitigated during implementation (e.g., through user training) are identified.

This stage balances the usability needs of the Operators as configuration and customization decisions are made and occurs when the Business Owner provides to the Integrator the planned context of use of the HIT software in the HIT system and healthcare sociotechnical ecosystem, and any known configuration or customization of the HIT software, training of Operators or Users, or special testing and monitoring of the integrated HIT system.

Implementation
The stage risk owner during implementation is the Implementer.

Workflow assessment and optimization – Assessing the current clinical and business workflow and identifying how the new HIT software should be optimally used to meet each affected organizational unit's objectives.

Decision support – Confirming that decision support rules in the system, where present, align with clinical best practices and expectations for the targeted organizational environment(s). Role(s): Implementer, Operator

Data quality – Ensuring that the system facilitates accurate data capture, storage, interpretation, and communication of patient information in a timely manner [4]. Usability scenarios are identified that could result in safety risk, and risk is managed accordingly. Role(s): Implementer, Operator

Change management and training – Preparing the end user environment for accepting the work process changes and supporting Users in utilizing the new system safely and effectively. The human factors engineering process identifies scenarios that could result in safety risk and manages the risk to ensure it is minimized and acceptable. Role(s): Implementer, Operator

Pre-roll-out testing – Implementing the system in a pre-production test environment so that end users can do a final test of all functions of the system using 'real world' scenarios. The human factors engineering process identifies scenarios that could result in safety risk and ensures risks are appropriately managed before the first pilot or production roll-outs begin. Role(s): Implementer, Operator

Pilot or limited production roll-out – Implementing the system in a small number of user production environments to assess and ensure the system's readiness. The human factors engineering process ensures that the aggregate level of risk is acceptable prior to the first roll-out and that risks and benefits are monitored, optimized, and re-evaluated throughout this stage. The risk management outcomes from this stage are critical input to updating the Safety Assurance Case, which then informs the 'Go-Live' decision and broader roll-out strategy. Role(s): Implementer, Operator

[4] The Joint Commission, National Patient Safety Goals Effective January 2019, NPSG.02.03.01.


Go-Live – Making the system fully active so that its intended Users can access and utilize it in carrying out the full range of targeted functions. Depending on the scale of the implementation, the Go-Live stage may involve a staged roll-out in order to ensure the diversity of end user environments can be adequately supported and any new risks (or risk levels) can be identified and managed appropriately. Processes for the ongoing monitoring and surveillance of risks need to be established. Role(s): Implementer, Operator

Transition point from Implementer to Operator – At this transition, the Implementer provides the Operator with additional information in the Safety Assurance Case report about any specific actions needed by the Operator to maintain safety during use of the health IT system and any hazards that may need special attention during the subsequent decommissioning and disposal stage. This stage occurs when the Integrator provides the Implementer additional information in the safety assurance case about any hazards that were identified during integration, including those that may have emerged during configuration and customization.

Operational use in a clinical setting
This stage occurs when the Implementer documents, in the safety assurance case, any specific actions needed by the Operator to maintain safety of the health IT system and any hazards that may need special attention on decommissioning and disposal of the HIT software. The User shall report usability-related issues to the Operator and Business Owner. The role responsible for risk during the operational use stage is the Operator.

Post-deployment monitoring – Monitoring and optimizing network, database, and infrastructure support to the health IT system.

Surveillance and monitoring – Includes active monitoring of the system's use in the clinical setting through measures such as User satisfaction, data quality, and the effectiveness of critical functions such as decision support, as well as ensuring any hazards and safety events are identified and analyzed with appropriate remediation (including reporting to appropriate parties) to reduce the likelihood of future re-occurrence. Human factors engineering is used to identify and control any new risks resulting from remediation. Role(s): Operator, Top Management, Developer, Business Owner, User

Modification and maintenance – Modification or maintenance of HIT software or the health IT system after delivery to correct faults; improve or assure technical performance or other attributes; improve or restore functionality; address data quality or integrity issues, or improve integration with clinical workflow processes; and address other barriers to effective system use and adoption. Human factors engineering is used to identify and control any new risk resulting from modification and maintenance. Role(s): Operator, Top Management, Developer, Business Owner, User

Transition point from Operator to Business Owner – At this transition, the Operator documents, in the Safety Assurance Case report, any specific actions needed by the Business Owner to maintain safety during decommissioning of the health IT system and any hazards that may need special attention on decommissioning of the HIT software.

Decommission – Retiring and ending the existence of a system's existing software products or services while preserving the integrity of organizational operations. The system is removed from the operational environments, and system work products and data are archived in the appropriate manner. The human factors engineering process is used to identify and control any risk caused by retiring the HIT software or health IT system. The Operator documents, in the safety assurance case, any specific actions needed by the Business Owner to maintain safety during decommissioning of the HIT software in the HIT system and any hazards that may need special attention on decommissioning of the HIT software. The usability owner during the decommission stage is the Business Owner; the Operator and Developer are also involved in usability.


5 Development stage human factors engineering

5.1 General Development stage human factors engineering process

This section outlines best practices for conducting user-centered design during health IT system development.

Note that a particular health IT system user interface development effort is likely to involve parallel activities and iterative cycles that are difficult to describe in a linear narrative (as illustrated in Figure 1 below). Therefore, developers should consider opportunities for sequencing the steps described below in a manner that complements other health IT system development activities and proceeds productively and efficiently.



Figure 1 – An illustration of a sample implementation of a user-centered design process [5]

Clause 5.2, below, provides an introduction regarding how developers can optimize the user experience through user-centered design and evaluation methods. The following sections cover these elements:

• Conducting user research;
• Managing use-related risk;
• Designing the user interface;
• Evaluating the user interface.

Note that the guidance in this section was adapted primarily from a NIST grant/contract report (NIST GCR 15-996) [6].

5.2 Development stage optimizing the user experience through user-centered design and evaluation methods

5.2.1 Ensuring use-safety

5.2.1.1 The ability to deliver safe, high quality health care is closely tied to the quality of a health IT system's user interface design (Payne et al., 2015). Simply stated, a poorly designed health IT system can induce use errors that could lead to patient injury and even death. For example, just as an intravenous infusion pump can be mis-programmed to deliver a morphine overdose, a user can enter the wrong medication order into a health IT system. Moreover, the interplay among multiple health IT systems may also lead to hazardous situations.

[5] NIST GCR 15-996, Technical Basis for User Interface Design of Health IT, National Institute of Standards and Technology, Gaithersburg, Maryland.

[6] NIST GCR 15-996, Technical Basis for User Interface Design of Health IT, National Institute of Standards and Technology, Gaithersburg, Maryland.


As health IT systems become increasingly integral to healthcare delivery, it is even more important to minimize the chance of potentially harmful use errors and to develop systems based on human factors engineering best practices and principles.

5.2.1.2 Mirroring the well-established approach to reducing risk that prevails in other industries (e.g., power generation, aviation, transportation, medical devices), developers shall follow a user interface design process that seeks to identify potentially harmful use errors and decrease their likelihood of occurrence. Fortunately, a use-safety focused design process typically also increases task effectiveness, work efficiency, and user satisfaction.

5.2.1.3 In the medical device industry, taking steps to reduce the chance of potentially harmful use errors is an important part of a broad-based risk management process. Such efforts make sense because medical devices can directly cause harm: use errors can lead to infection, laceration, burn, overdose, asphyxiation, and exsanguination, among others. In such cases, risk reduction techniques might include placing automatically activating guards on needles, insulating heat sources, rounding the edges of mechanical parts, shape- or color-coding tubes and associated ports, incorporating design forcing functions, and adding pressure sensors. The contribution to safety of these design features may be augmented by attention-getting warnings, clearly written and illustrated instructions, thorough training approaches, and comprehensive testing throughout the entire health IT system lifecycle.

5.2.1.4 A similarly structured approach to identifying potential use errors when using a health IT system will lead to appropriate mitigations. For example, a health IT system might prevent orders of the wrong medication by presenting prescription information in a coherent structure, including bold screen titles, using color to effectively highlight important information, and including helpful, non-distracting prompts. The user interface principles to be presented in AAMI's planned Technical Information Report on HIT design principles promote the implementation of these features, along with others proven to reduce use-related risks associated with health IT systems.
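As a purely illustrative example of such a mitigation, the sketch below shows how an order-entry workflow might force a structured confirmation that echoes back the patient identity and prescription details before an order is committed. The function and field names are hypothetical and are not drawn from HIT1000-4 or any particular product; the point is only that the interaction logic, not just visual styling, can be designed to interrupt a wrong-medication use error.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class MedicationOrder:
    # Hypothetical fields; a real order would carry far more context.
    patient_name: str
    patient_id: str
    medication: str
    dose: str
    route: str

def format_confirmation(order: MedicationOrder) -> str:
    """Build a single, coherent summary the user must review before committing."""
    return (
        f"Patient: {order.patient_name} (ID {order.patient_id})\n"
        f"Medication: {order.medication}\n"
        f"Dose: {order.dose}  Route: {order.route}\n"
        "Confirm this order? [yes/no]"
    )

def submit_order(order: MedicationOrder, confirm: Callable[[str], bool]) -> bool:
    """Commit the order only after explicit confirmation of the summarized details.

    `confirm` is any callable that shows the summary and returns True/False,
    e.g. a dialog in a real user interface or `input()` in a console prototype.
    """
    if not confirm(format_confirmation(order)):
        return False  # user declined; nothing is committed
    # ... persist the order here ...
    return True

if __name__ == "__main__":
    demo = MedicationOrder("Pat Example", "000000", "morphine", "2 mg", "IV")
    # Console stand-in for a confirmation dialog.
    submitted = submit_order(demo, lambda text: input(text + " ") == "yes")
    print("Order submitted" if submitted else "Order cancelled")
```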

5.2.2 Facilitating task effectiveness

5.2.2.1 While the priority is to ensure that user interactions with a health IT system are safe, it is also important to ensure that interactions are effective.

5.2.2.2 An effective health IT system not only enables, but also facilitates, task performance. It helps various types of users to complete tasks easily and efficiently. Moreover, an effective health IT system is likely to garner higher usability satisfaction ratings from users. Both formative and summative usability testing are required for health IT systems. Formative assessments can provide needed input to developers during design and coding phases. Summative testing, including in situ testing, can demonstrate adherence to usability objectives.

5.2.2.3 Formative and summative usability testing are productive means to evaluate a health IT system's effectiveness. In a usability test, a representative sample of a health IT system's intended users attempt to perform selected tasks. Such tasks may be derived from certification and regulatory criteria (e.g., ONC certification criteria) but should be broad enough that these criteria are not the only criteria tested. In parallel, tasks judged to be safety-critical, urgent, frequent, or challenging shall be tested (see clause A.3.2, Specifying usability test tasks, for additional information on task selection); tasks may also be developed based on earlier user research.

5.2.2.4 If a test participant completes a task but also commits a potentially harmful use error, the task is characterized as a failure. Task failure also occurs if there is any need for the test administrator to help the participant complete a task, because such assistance might not be available during actual use and a use error could result.

5.2.2.5 Developers should aspire to produce health IT systems with which all intended users can interact effectively, albeit while recognizing the near impossibility of achieving this goal. Inevitably, certain people at certain times in certain situations will fail to complete a given task, perhaps due to use errors that cannot be attributed to a design-related root cause. However, if a usability test reveals that several individuals (e.g., 3 out of 30) cannot complete selected tasks, further investigation should be undertaken to determine if the user interface needs modification or if other factors, such as training or experience levels, are the issue.
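The failure criteria in 5.2.2.4 and the "several individuals" threshold in 5.2.2.5 are simple enough to express in code. The Python sketch below is a minimal, hypothetical illustration of how a test team might tabulate task outcomes; the field names and the example threshold of 3 failures among 30 participants come from the text above, not from any prescribed reporting format.

```python
from dataclasses import dataclass

@dataclass
class TaskAttempt:
    """One participant's attempt at one usability test task."""
    participant: str
    task: str
    completed: bool
    harmful_use_error: bool = False   # committed a potentially harmful use error
    assisted: bool = False            # test administrator had to help

def is_failure(attempt: TaskAttempt) -> bool:
    # Per 5.2.2.4: a task is a failure if it was not completed, if a potentially
    # harmful use error occurred, or if the administrator had to assist.
    return (not attempt.completed) or attempt.harmful_use_error or attempt.assisted

def tasks_needing_investigation(attempts: list[TaskAttempt], threshold: int = 3) -> list[str]:
    """Flag tasks whose failure count reaches the threshold (e.g., 3 of 30 participants)."""
    failures: dict[str, int] = {}
    for a in attempts:
        if is_failure(a):
            failures[a.task] = failures.get(a.task, 0) + 1
    return [task for task, count in failures.items() if count >= threshold]

if __name__ == "__main__":
    data = [
        TaskAttempt("P01", "enter medication order", True),
        TaskAttempt("P02", "enter medication order", True, harmful_use_error=True),
        TaskAttempt("P03", "enter medication order", False),
        TaskAttempt("P04", "enter medication order", True, assisted=True),
    ]
    print(tasks_needing_investigation(data))  # ['enter medication order']
```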

5.2.3 Optimizing usability

5.2.3.1 Usability is a key component of user experience, especially as it relates to user satisfaction. Satisfaction with a technology or software application influences users' willingness to employ the product and to tolerate aspects of the user interface that do not fully meet their needs. User satisfaction is influenced by multiple, interrelated user interface design features, rather than just one or a few features. For example, a user might be satisfied by a health IT system that has the following characteristics:


− Intuitiveness enabling the user to perform basic tasks without first reading instructional materials

− User support in the form of access to additional information for as-needed assistance

− Workflow that closely matches the conventional clinical workflow that existed before health IT system implementation

− Visual appeal resulting from the balanced-looking arrangement of onscreen contents, use of harmonious colors, easy-to-interpret icons augmented by text labels, large and three-dimensional looking controls (i.e., buttons), and white space to separate blocks of information

− Information usability (e.g., accessibility, comprehensibility, legibility, readability).

5.2.3.2 Table 3 provides additional examples of user interface characteristics that may promote or degrade usability and, accordingly, impact efficiency, effectiveness, and/or user satisfaction.

Table 3 – Examples of user interface characteristics that can promote or degrade usability

Characteristics promoting usability | Characteristics degrading usability
Harmonious color palette | Garish screen appearance
Large, meaningful screen titles | Lack of screen titles
Prompts placed in a consistent location | Minimal prompts presented in varying formats
Key information summarized on a single screen | Key information spread across multiple screens
Confirmation pop-ups associated with irreversible actions | No means to undo an erroneous action
Inclusion of patient-identifying information on each screen or window | Lack of patient-identifying information on some screens
Information, prompts, and alerts provided in a context-appropriate manner and time | Alerts and prompts that interrupt user concentration and workflow
Meaningful error and information messages | Vague or conflicting error messages
User-controlled customization within appropriate parameters | No options for customization
On-demand data visualizations | Data presented only in tabular form
Fonts of sufficient size | Small or otherwise illegible fonts
Text hierarchy, clear indications of workflow, and navigation tools | Lack of navigation or indication of workflow/next steps
Tasks presented in line with other tasks users are performing in context | Data presented out of order

5.2.3.3 These characteristics are expected to have a positive influence on health IT system use-safety and effectiveness. Notably, a well-351 designed health IT system should possess these and/or comparable interface characteristics to foster task efficiency and user satisfaction. 352

5.2.3.4 The automobile is a fitting comparison for recognizing the value of usability. Comfortable seats, air conditioning, and cup holder locations might not be essential for safety and effectiveness, but all are important to users and likely to have a disproportionately strong influence on consumer preference for one car versus another. Similarly, usability is likely to have a strong influence on user preference for one health IT platform versus another and, by extension, on a clinician's preference to work for one healthcare provider over another.


5.2.3.5 Accordingly, developers shall take the following steps, as applicable, to increase a health IT system's usability; doing so may include engaging a human factors or usability engineering expert:

− Conduct ethnographic studies to understand users and their environments as an input to user interface requirement 359 development; 360

− Develop usability-focused user interface requirements (e.g., in the form of written statements, conceptual designs, user stories, 361 etc.); 362

− As practicable, implement accepted user interface design practices, such as those documented in industry guidance 363 documents;7 364

− Iteratively design, conduct formative evaluations (e.g., usability testing, expert reviews), and revise the health IT system’s design 365 based on usability testing results; 366

− Conduct summative usability testing and residual risk analysis. 367

5.2.3.6 Notably, developers are likely to find that usability is often a natural by-product of efforts to ensure a health IT system’s use-safety 368 and effectiveness because an investment in user-centered design can have a positive influence on all three factors. 369

5.3 Development stage conducting user research 370

5.3.1 General development stage conducting user research 371

5.3.1.1 Developers shall conduct user research as a foundational step in the user-centered design process. The primary goal of user-372 centered design is to develop user interface design solutions that meet users’ needs, a key one of which is to operate the user interface 373 safely and effectively. This goal can be accomplished in part by taking steps to identify user needs. Even though a health IT system 374 developer might already have (or believe he or she has) extensive knowledge of user needs based on the developer’s prior market 375 research activities and/or general knowledge, some specific user research is usually still warranted. 376

5.3.1.2 It is important to emphasize that clinicians who are employed by a developer, or are hired as consultants because of their expertise, 377 can be very valuable to usability efforts but can never be fully representative of all of the health IT system's intended users. Thus, research 378 involving a cross-section of typical users is beneficial. 379

5.3.1.3 This subsection describes how developers can identify user needs, convert the needs into user interface requirements, and then 380 use the requirements to drive the user interface design effort. 381

5.3.2 Determine user needs 382

5.3.2.1 Introduction 383

5.3.2.1.1 As suggested above, a broad and deep understanding of a health IT system's users’ needs is fundamental to designing a health 384 IT system user interface that is safe, effective, and easy to use. Common and effective methods for developing this understanding involve 385 (1) observing the intended users at work, and (2) speaking with them about their health IT related needs and desires, as well as their 386 typical workflow, task-oriented goals, and priorities. 387

5.3.2.1.2 Observation and interview methods vary in name and emphasis, depending on what type of professionals perform the research. 388 The types of professionals who might perform the research include people trained in the overlapping fields of human factors, ethnography, 389 interaction design, information design, marketing, communications, and related disciplines. For example, human factors specialists use 390 the term contextual inquiry when referring to interviewing people while they work. Ethnographers speak of performing ethnographic 391 research as a specialized means of observing people unobtrusively to gain a more accurate understanding of how they work, as well as 392

7 ANSI/AAMI HE75:2009, Human factors engineering – Design of medical devices. Association for the Advancement of Medical Instrumentation; 2009. Arlington, VA.


performance-shaping factors related to the environment and system in which they work. Market researchers speak about conducting 393 interviews and focus groups to capture the voice of the customer. 394

5.3.2.1.3 During such research activities, users might articulate their needs, such as those presented below, in hypothetical, first-person 395 forms: 396

• “I want the patient’s chart to clearly show their name, age, hospital ID number, principal medical condition, and maybe even a 397 photograph for ease of recognition.” 398

• “I don’t like having to go to many screens to get the information I need to make a clinical decision. The key information should be 399 on one screen. Stuff it in if you need to.” 400

• “I need the system to check for drug interactions and grab my attention if there is a problem.” 401 402

5.3.2.1.4 The overarching point is that observing and speaking with users can generate useful insights regarding the qualities that will 403 make a health IT system safe, effective, efficient, usable, and satisfying. As such, developers shall conduct some form of user research 404 with a heavy focus on health IT system safety and effectiveness. 405

5.3.2.1.5 A critical element of any user research method is to look beyond users' current practices (i.e., the tasks they currently perform with the tools they currently use) and identify their underlying work goals and priorities. The key is to understand what the user is trying to accomplish. For example, the best software products do not simply convert a paper-based form into an electronic one. By first gaining a thorough understanding of the underlying work, developers can produce electronic forms that harness the value added by a computer to improve task safety and effectiveness, as well as usability (encapsulating efficiency, intuitiveness, and satisfaction).

5.3.2.1.6 Further, neither interviews nor observations alone will suffice. Observations, no matter how extensive and detailed, in the 411 absence of user interviews, will not reveal users’ underlying reasons or motivations for their actions and thus miss essential aspects of 412 users’ real needs. On the other hand, relying solely on users’ descriptions of their current practices and their preferences for future health 413 IT systems will inevitably yield suboptimal products. Without observations, researchers will be unable to identify latent user needs (i.e., 414 needs users are not even aware of themselves). Therefore, it is important to both observe and speak to users. 415

5.3.2.1.7 Ultimately, the health IT system developer should adopt a user research approach of its own choosing, while considering the following advice:

• Extend the research to cover the widest possible range of intended users, including primary and secondary users.
• Extend the research to cover not only common use scenarios, but also uncommon ones (i.e., edge cases), especially if those use scenarios are safety-related (i.e., associated with a high-severity risk).
• Extend the research to cover both experienced and inexperienced users to understand (1) how experience influences use of a system, and (2) how inexperience influences users' ability to intuit use of a system.

5.3.2.1.8 Common deliverables for documenting user research include a prioritized list of user needs, a list of use scenarios (i.e., common 424 and high-risk tasks), and a description of user characteristics, sometimes expressed as user profiles and/or user personas, which are 425 described below. Research findings may be described in stand-alone documents for each user research activity or converted directly into 426 user interface requirements (see clause 5.3.7 Develop user interface requirements). 427

5.3.2.2 User profiles 428

A user profile is a description of users who are members of a distinct user group (i.e., a sub-group of the overall intended user population). User profiles may be brief (e.g., ½ page) or expanded (e.g., 2-3 pages) documents. They typically cover characteristics that could have bearing on a health IT system's user interface design, such as the following (an informative sketch of a structured user profile record follows these lists):

• Demographic traits such as age, gender, ethnicity and other cultural attributes;
• Education, experience and expertise (particularly pertaining to health IT system use);
• Learning style;
• Work environment;
• Common tasks (as they affect health IT system use);


• Performance shaping factors (pertaining to a health IT system)8. 437 438

Performance shaping factors might include the following details: 439

• General health and mental state (stressed, relaxed, rested, tired, affected by medication or disease) when using the health IT system;
• Sensory capabilities (vision, hearing, touch);
• Coordination (manual dexterity) as it might influence onscreen navigation and data entry;
• Cognitive ability and memory;
• Knowledge about health IT system operation and associated functions (e.g., physician order entry);
• Previous experience with health IT systems (particularly those that serve similar purposes);
• Previous experience using software applications (including desktop, web-based, and mobile applications);
• Expectations about how a health IT system will operate;
• Motivation to use the system;
• Ability to adapt to adverse circumstances associated with health IT system use (e.g., using it in an emergency use scenario);
• Amount of cooperation with other health IT system users (e.g., individual health IT system use versus team-based use);
• Work pace (e.g., user driven or system driven).
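As an informative illustration only, a user profile covering the characteristics and performance shaping factors listed above might be captured as a structured record. The Python sketch below uses assumed field names and example values; neither is prescribed by this provisional standard.

    from dataclasses import dataclass, field

    @dataclass
    class UserProfile:
        # Field names are illustrative; tailor them to the characteristics
        # identified during user research.
        user_group: str
        demographics: str
        education_and_experience: str
        learning_style: str
        work_environment: str
        common_tasks: list[str] = field(default_factory=list)
        performance_shaping_factors: list[str] = field(default_factory=list)

    med_surg_nurse = UserProfile(
        user_group="Medical/surgical registered nurse",
        demographics="Ages roughly 22-65; varied cultural backgrounds",
        education_and_experience="RN licensure; 0-20+ years using legacy EHRs",
        learning_style="Prefers brief, hands-on orientation over manuals",
        work_environment="Noisy unit with frequent interruptions",
        common_tasks=["Review medication orders", "Document vital signs"],
        performance_shaping_factors=["Fatigue near the end of a 12-hour shift",
                                     "Team-based, interruption-driven workflow"],
    )
    print(med_surg_nurse.user_group)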

5.3.2.3 Personas

A persona is an alternative or a complement to a user profile. Typically, a 1-2 page narrative, it describes a particular but usually 455 hypothetical (or at least an actual but de-identified) individual rather than summarizing what is known about the members of an overall 456 user population. Developers may create multiple personas for each user group, with each persona reflecting characteristics and 457 behavioral traits amalgamated from user research findings. 458

Personas can help guide user-centric design and build empathy for the user. Personas are often used by market researchers to describe 459 specific types of customers. They often include potentially relevant personal life details (e.g., hobbies) and behavioral descriptions that 460 are not normally included in a user profile. Some user interface developers prefer personas over user profiles because they believe that 461 it can be more productive to design an application to meet the needs of selected individuals rather than the general needs associated 462 with distinct types of users. They may find it easier to imagine and empathize with hypothetical individuals rather than more general user 463 profiles. While the comparative values of both end products are debatable, either approach (or both approaches) can support the 464 development of user interface requirements. 465

5.3.3 Identify work environment characteristics 466

A strong understanding of the environments in which people will interact with a health IT system (the “use environments”) also provides 467 a foundation for making good user interface design decisions. 468

It is practical to study health IT system users’ work environments at the same time one learns about user’s needs and preferences (i.e., 469 while conducting user interviews or observations – see clause 5.3.2. Determine user needs). Insights are typically documented in the 470 form of use environment descriptions – one for each significant type of use environment. 471

Similar to a user profile, a use environment description may be a brief (e.g., ½ page) or expanded (e.g., 1-3 pages) description of a 472 particular setting. Use environment descriptions discuss characteristics that might affect a health IT system’s user interface design8, such 473 as: 474

• Health IT system placement within the use environment and expected viewing distances;
• Availability of work surfaces;

8 AAMI/IEC TR 62366-2:2016, Medical devices – Part 2: Guidance on the application of usability engineering to medical devices. International Electrotechnical Commission, 2016. Geneva, Switzerland.


• Network connectivity;
• Connected and adjacent medical devices;
• Seating (affecting viewing angle);
• Noise levels, lighting, and other environmental factors;
• Sources of distraction (e.g., phone calls, patient requests);
• Concurrent tasks and activities;
• Ease of access to materials necessary to complete tasks (e.g., patient intake or referral forms, scanners);
• Nearby personnel;
• Potential for unauthorized use (e.g., by hospital visitors).


Note that a health IT system might be used on a fixed workstation, such as those installed in central nursing stations, patient rooms, and 487 offices; on a rolling or hallway workstation; or on mobile devices (e.g., tablets, smartphones) that travel throughout a clinical environment. 488

Similar to user profiles, use environment descriptions support the development of user interface requirements pertaining to such factors 489 as visual and physical access to the health IT system display and pointing device (if not a touchscreen), and conditions and events that 490 could interrupt interactions with the health IT system. 491

5.3.4 Define preferred workflows 492

Whether or not people are accustomed to healthcare delivery workflows involving existing health IT systems, they will be accustomed to 493 particular workflows. Those workflows can vary widely in terms of safety, effectiveness, efficiency, and the degree to which workers 494 consider them to be satisfactory. Developers should study existing workflows to determine which characteristics must or should be 495 preserved in a new health IT system, and most importantly, how the health IT system could enhance or facilitate task safety, effectiveness, 496 and efficiency. 497

There are ample examples of health IT systems’ technical limitations or user interface design flaws that have led to undesirable changes 498 in clinicians’ workflows. Developers should be acutely aware that changes to existing workflows might be disruptive and even unsafe, 499 sometimes in unanticipated ways. As such, they should analyze the potential effects of all changes in workflow and proceed with solutions 500 that change workflow only when there is a significant benefit to the health IT system users. 501

5.3.5 Determine and evaluate common workflows 502

Developers should study existing workflows before designing a health IT system, noting that a health IT system will invariably affect future 503 workflows to some extent. Specifically, a developer should determine how clinicians complete health IT system-related tasks in the 504 absence of a health IT system (i.e., using paper-based records) and/or with a legacy health IT system. 505

It is common to document existing and anticipated workflows in the form of flow charts that depict user actions and decisions in response 506 to computer actions. 507
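Where helpful, such a flow chart can also be captured in a lightweight, machine-readable form. The sketch below is an informative Python example of an order-entry workflow represented as a directed graph of user actions, decisions, and system actions; the step names are hypothetical and not drawn from this provisional standard.

    # Hypothetical order-entry workflow captured as a directed graph.
    workflow = {
        "Select patient chart":       ["Review allergies"],
        "Review allergies":           ["Enter medication order"],
        "Enter medication order":     ["System checks interactions"],
        "System checks interactions": ["Interaction alert shown", "Order submitted"],
        "Interaction alert shown":    ["Modify order", "Override with reason"],
        "Modify order":               ["System checks interactions"],
        "Override with reason":       ["Order submitted"],
        "Order submitted":            [],
    }

    def print_paths(graph, step, path=()):
        """Print every acyclic path from the given step to the end of the workflow."""
        if step in path:        # guard against looping back through a prior step
            return
        path = path + (step,)
        if not graph[step]:
            print(" -> ".join(path))
        for next_step in graph[step]:
            print_paths(graph, next_step, path)

    print_paths(workflow, "Select patient chart")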

5.3.6 Define use scenarios 508

In the context of health IT system development, a use scenario is a description of possible interactions between the user and the system. 509 Use scenario details may include: 510

• Types of users involved in the use scenario;
• Use environment in which the use scenario occurs;
• Conditions influencing user interactions (e.g., workload, task urgency, experience using other health IT systems);
• Tasks performed during the use scenario.


Common use scenarios and their associated details can be identified by speaking with health IT system users about how they use their 516 current health IT systems or how they currently perform tasks that will eventually be performed using a health IT system under 517 development. Direct observations are useful in both identifying potential use scenarios and in confirming scenarios derived from user 518 interviews. 519


Developers should identify a wide range of health IT system use scenarios as an input to the user interface requirements development 520 and risk management processes. 521

The following are summaries of a few sample use scenarios; an informative structured sketch of the first scenario follows the list:

• Neonatologist using a health IT system in a neonatal intensive care unit to order an emergency medication for a newborn baby;
• Nurse using a health IT system in an outpatient clinic to prepare for a patient office visit, including recording vital signs, verifying the patient’s current allergies, and reconciling the patient’s medications;
• Medical assistant using a health IT system in an outpatient clinic to schedule a patient’s follow-up visit;
• A physician on call accessing a health IT system on a portable device to view a patient’s chart and document a clinical encounter that occurred via phone consultation;
• During shift turnover, a registered nurse starting her shift by using a health IT system in a hospital’s medical/surgical unit to review current medication orders for the eight patients for whom she will be caring.
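As an informative illustration, the first scenario above could be recorded in a structured form that covers the elements listed earlier in this clause. The Python sketch below uses assumed field names; it is not a required format.

    from dataclasses import dataclass

    @dataclass
    class UseScenario:
        # Fields mirror the use scenario elements listed in 5.3.6; names are illustrative.
        users: list[str]
        use_environment: str
        conditions: list[str]
        tasks: list[str]

    emergency_medication_order = UseScenario(
        users=["Neonatologist"],
        use_environment="Neonatal intensive care unit",
        conditions=["High urgency", "Frequent interruptions",
                    "Weight-based dosing for a newborn"],
        tasks=["Locate the newborn's chart",
               "Order the emergency medication",
               "Confirm the dose against the current weight"],
    )
    print(emergency_medication_order.use_environment)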


5.3.7 Develop user interface requirements 532

User interface requirements or specifications refer to documented characteristics and functions that are expected from health IT software, 533 applications, and systems. The goal of user interface requirements is to optimally support user needs, to enhance care effectiveness, 534 safety, and efficiency as well as the user and patient experience. 535

Just as technical requirements can drive the design of a health IT system’s database structure or software architecture, user interface 536 requirements for health IT software and systems will drive user interface design. 537

The user-centered design cycle typically begins with a research phase, followed by risk management, design, and evaluation (see Figure 538 1). User interface requirements are initially an output of the research phase, and an input to the design phase. The initial creation of the 539 user interface requirements should be based on a thorough understanding of the intended users, context of use, and users’ needs given 540 the functions and processes intended to be supported by the health IT software or system. 541

The initial user requirements document may be written at a relatively high level and, subsequently, iteratively refined and made more 542 specific and detailed during the design and evaluation phases. 543

User interface requirements cannot be created without considering pragmatic engineering, implementation, and adoption issues. For 544 example, an application that must be used in real-time while users move about a clinical environment will need to be implemented on 545 mobile hardware; this platform will in turn drive interface control and display requirements. Regarding adoption, a thorough understanding 546 of intended users and their needs will guide many user interface design requirement decisions such as interaction preferences including 547 how data will be displayed, screen organization and interface element prioritization and presentation. 548

The user interface requirements should address all important user needs, ensure adequate health IT software or system functionality, 549 delineate error-resistant and error-tolerant features, and describe all relevant usability-related characteristics of the final product. As such, 550 it is common for HIT software or health IT systems to have many user interface requirements, typically numbering in the hundreds. It is 551 common to have a hierarchical structure with a smaller number of higher-level requirements (e.g., “Log In”) under which there are several 552 more granular requirements (e.g., “Password Requirements”). 553
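The hierarchical structure described above can be illustrated with a brief, informative sketch. The requirement identifiers and wording below are hypothetical examples, not requirements drawn from this provisional standard.

    # Hypothetical hierarchy: a few high-level requirements (e.g., "Log in"),
    # each with more granular requirements nested beneath it.
    ui_requirements = {
        "UIR-1 Log in": [
            "UIR-1.1 The system shall support single sign-on from the hospital network.",
            "UIR-1.2 The system shall mask the password as it is typed.",
            "UIR-1.3 The system shall display a meaningful message after a failed login attempt.",
        ],
        "UIR-2 Patient chart header": [
            "UIR-2.1 The patient's name, age, and ID shall remain visible while scrolling.",
            "UIR-2.2 Known allergies shall be displayed in the persistent header.",
        ],
    }

    for high_level, children in ui_requirements.items():
        print(high_level)
        for child in children:
            print("   ", child)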

The user interface requirements document should specify and clarify the characteristics of all aspects of the health IT software or system 554 with which the user interacts. Thus, it should specify how each function will be enabled (often delineated in a separate functional 555 requirements document). The level of specificity and detail can evolve throughout development. 556

For many health IT software and systems, user interface requirements will often address the specific design requirements for user 557 interface elements, including: 558

• Input controls, including buttons, text fields, checkboxes, radio buttons, dropdown lists or menus, list boxes, toggles, date fields;
• Navigational controls, including breadcrumbs, slider, search field, pagination, tags, icons;
• Informational controls, including tooltips, icons, progress bars, notifications, alerts, alarms, message boxes, modal windows;


• Containers, including accordions.

User interface requirements can be derived from a variety of sources, including:

• User interface design principles, such as those found in this document and many other articles, websites, textbooks, and standards;
• Needs expressed by prospective health IT system users, documented in user research documents, user profiles, or personas;
• Use environment characteristics that influence users’ interactions with the health IT system, documented in use environment descriptions;
• User interface designer judgment, informed by education, experience, and design talent;
• Usability problems and use errors identified in legacy or competitor health IT systems, in the literature, or in publicly available databases;
• User interface design principles applicable to the particular operating system on which the health IT system runs;
• Data from user research or marketing.


Best practices that guide the development of user interface requirements are discussed throughout this document but may include, for 576 example: 577

• Keep the interface simple;
• Create consistency and use common UI elements;
• Be purposeful in UI layout;
• Strategically use colors, textures, other graphical options;
• Use typography to create hierarchical structures and for enhancing clarity;
• Ensure that the system communicates the status of operation;
• Consider establishing defaults;
• Think about different screen sizes and media (desktop, mobile, etc.);
• Develop a user interface flow chart.

587 5.3.8 Develop acceptance criteria and usability goals 588

5.3.8.1 Throughout the user interface development process, a developer might choose to establish metrics to assess when and whether 589 the user interface is “acceptable.” For the product as a whole, there will be numerous acceptance criteria, depending on the stakeholder, 590 from corporate leaders (e.g., return on investment criteria), regulatory bodies, and customers (e.g., cost and functionality). 591

5.3.8.2 This section focuses on methods to ascertain whether a health IT software or system has acceptable usability, from the perspective 592 of end-users and those who may evaluate the product for possible acquisition. Such “acceptance criteria” for health IT software or system 593 usability must address all aspects of usability – effectiveness, safety, efficiency, and satisfaction. Acceptance criteria, in general, must be 594 “SMART” – specific, measurable, achievable, relevant, and timely. More specifically, for each attribute of usability that is to be assessed, 595 the metric must actually measure the desired attribute, be detectable in the test environment (e.g., a usability test lab), and do so in a 596 cost-effective and timely manner. 597

5.3.8.3 Assessing the safety of health IT software and systems prior to real-world use, a critical activity during all phases of product development, is addressed elsewhere in this standard (see clause 5.4 Development stage managing use-related risk). At a high level, use safety must be anchored by a rigorous use-related risk analysis that is continually updated based on all sources of information. The use-related risk analysis shall be created during the research phase, be updated during usability evaluations, and then continue to be updated during the lifecycle stages following development (i.e., post-market release, including integration, implementation, and operational use). Notably, the risk analysis should be based on real-world safety experience with the software or system, if possible.


5.3.8.4 Usability evaluation methods for safety include expert reviews and formal usability testing. However, as described below, when 604 evaluating safety-related task performance during formative or summative functional usability testing, acceptance criteria need to be 605 carefully crafted because even a single instance of failure is cause for serious investigation and possible design revision. 606

5.3.8.5 In the rest of this section, usability goals (also called usability objectives or performance goals) will be discussed. Usability goals 607 quantitatively specify the desired quality of user interaction with and the desired user impressions of the health IT system. Usability goals 608 can be established based on usability evaluations of legacy and competing systems, competitive market goals, or the developer’s 609 aspirations for the overall quality of the system’s user experience. Setting measurable usability goals will help assess the performance of 610 the HIT software or health IT system throughout development. Benchmarking and improving on task performance can drive improvements 611 in the usability of the system. 612

5.3.8.6 These goals are primarily intended to ascertain the acceptability of the results of a usability test. Data collected during usability 613 testing efforts can be compared to the established usability goals, enabling the developer to determine whether the evolving or final health 614 IT software or system design meets the target metrics. A failure to attain a well-conceived usability goal during a usability test allows 615 developers to engage in a meaningful (evidence-based) discussion about design trade-offs that assures usability is on equal footing with 616 other product objectives (i.e., functionality, cost, memory requirements, hardware constraints, etc.). 617

5.3.8.7 Usability goals that are poorly conceived may mislead developers into accepting sub-optimal performance. For example, if a developer sets a usability goal that 90% of new users will be able to complete a task, meeting the goal would still allow that 10% of new users may be unable to complete that task during actual use. On the other hand, setting a usability goal of 99% is simply not feasible, as at least 100 users would need to be tested to provide assurance that the goal has been attained. Thus, percentages should generally not be used in usability goals unless the percentage is part of a goal that also includes an objective metric (such as time on task) and/or a second, looser performance measure that must be attained by all users.
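The sample-size constraint noted above can be illustrated with a short, informative calculation. The Python sketch below assumes independent participants and a fixed true failure probability; both are simplifying assumptions used only for illustration.

    def p_zero_failures(true_failure_rate: float, n_participants: int) -> float:
        """Probability that a usability test with n participants observes no failures."""
        return (1.0 - true_failure_rate) ** n_participants

    def rule_of_three_upper_bound(n_participants: int) -> float:
        """Approximate 95% upper confidence bound on the true failure rate
        when zero failures are observed among n participants."""
        return 3.0 / n_participants

    # With 15 participants and a true failure rate of 5%, a test will often
    # show no failures at all, so a "perfect" test result is weak evidence.
    print(round(p_zero_failures(0.05, 15), 2))       # about 0.46
    # Even a flawless test with 30 participants only bounds the true failure
    # rate at roughly 10%, far from demonstrating a 99% success goal.
    print(round(rule_of_three_upper_bound(30), 2))   # 0.1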

5.3.8.8 At the end of a usability test, it can be useful to ask the test participant (who will often be unaware of their “failure”) their perceptions 624 of their task performance, particularly but not exclusively focused on task failures. While there may be many good reasons to discount 625 their explanation (e.g., rationalization, hindsight bias, failed recollection), patterns of participant responses can yield useful information to 626 guide design decisions. 627

5.3.8.9 It is important to emphasize that failure to achieve a well-conceived usability goal during a usability test does not a priori mean 628 that this aspect of the user interface is a “failure.” Rather, it should be viewed as an opportunity for the team to re-evaluate its assumptions 629 and processes for both the design and the evaluation. Especially during early phase usability testing, usability goal failure should also 630 prompt an assessment of the appropriateness of the usability goal and of the specific task scenario. 631

5.3.8.10 Typical usability goals address use attributes of speed, accuracy, overall success (or absence of use error), and satisfaction 632 measures. Table 4 presents measurable usability goals for a health IT system. 633

Table 4 – Measurable usability goals9

Goal | What it measures | Sample benchmark
Speed | How long it takes to complete the task | 95% of experienced users will be able to use the alphabetical list to find terms in the glossary in less than 2 minutes, and all users will be able to do so within 4 minutes.
Accuracy | The number of attempts to complete the task | 95% of new users will be able to locate the patient’s last encounter with less than 2 false clicks.
Overall success | The percentage of users who completed the task | 90% of new users will be able to create a new patient encounter.
Satisfaction | How satisfied the user was with the process of completing the task | Users will rate the overall usability of the application as an average of four on a five-point scale, where five is the best.

9 Adapted from: www.usability.gov.

5.3.8.11 Typically, a complete set of usability goals emphasizes goals that are observable and thus more objective. The set will also include some goals that require users to express their opinions and thus are more subjective. Objective usability goals are usually based on the time taken to perform a specific task or the rate at which new users are able to complete a task successfully. Subjective usability goals call upon individuals who have used an application to express their opinion about it, typically using a rating scale. Table 5 presents a sample of objective and subjective goals pertaining only to a health IT system's usability.

Table 5 – Sample objective and subjective usability goals

Objective usability goal | Subjective usability goals
85% of new users shall successfully log into the system on their first attempt. | On average, new users shall rate the system’s ease-of-use as 5 or better (scale: 1 = poor, 7 = excellent).
90% of new users shall be able to load a patient’s chart in less than 30 seconds. | On average, new users shall rate the patient chart’s readability as 5 or better (scale: 1 = poor, 7 = excellent).
95% of experienced users shall be able to enter a patient’s vital signs in less than 5 minutes. | Consider presenting a subjective usability goal that is unrelated to ratings, because ratings are not the only way to collect subjective data; for example, “X percent of participants considered the system safe and effective as is.”


5.3.8.12 Developers have the prerogative to set the success criteria for each goal (e.g., target task time for an objective goal, target average rating for a subjective goal) based on their ambitions for the system's interaction quality. Ideally, these criteria should be based on users' needs derived from user research, as well as data from legacy or competitive products (e.g., average task times collected from a prior usability test).
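As an informative illustration of how test data can be compared against goals of the kinds shown in Table 5, consider the Python sketch below. The goal values and the sample data are illustrative assumptions, not benchmarks recommended by this provisional standard.

    from statistics import mean

    def objective_goal_met(successes: list[bool], required_proportion: float) -> bool:
        """E.g., '85% of new users shall successfully log in on the first attempt.'"""
        return sum(successes) / len(successes) >= required_proportion

    def subjective_goal_met(ratings: list[int], required_mean: float) -> bool:
        """E.g., 'New users shall rate ease-of-use as 5 or better on a 1-7 scale.'"""
        return mean(ratings) >= required_mean

    first_attempt_logins = [True] * 13 + [False] * 2         # 13 of 15 participants
    ease_of_use_ratings = [6, 5, 7, 4, 6, 5, 5, 6, 7, 5, 4, 6, 6, 5, 5]

    print(objective_goal_met(first_attempt_logins, 0.85))    # True (13/15, about 0.87)
    print(subjective_goal_met(ease_of_use_ratings, 5.0))     # True (mean about 5.5)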

5.4 Development stage managing use-related risk 646

5.4.1 General development stage managing use-related risk 647

5.4.1.1 Another goal of user-centered design is to develop user interface solutions that are less vulnerable to potentially harmful use 648 errors. This goal can be accomplished by taking the necessary steps to understand how health IT system users might err and either 649 preventing such occurrences altogether or mitigating their negative effect. This process is called use-related risk management. 650


5.4.1.2 Notably, all developers shall implement and maintain a use-related risk management process. In summary, use-related risk 651 management involves identifying hazards and associated use errors, estimating the risk associated with the use errors, and reducing the 652 risk of use errors that could cause significant harm. The requisite risk management steps are described below. 653

5.4.1.3 For more information regarding risk management and maintaining safe and effective health IT software and systems, see AAMI 654 HIT1000-3 (PS) 2019, Safety and effectiveness of health IT software and systems – Part 3: Application of Risk Management. 655

5.4.2 Identify hazards and associated use errors (top-down approach) 656

5.4.2.1 One might normally think about hazards as physical things that can cause harm. In clinical environments, for example, use of medical devices can introduce hazards such as an unguarded needle, an energized electrode, leaking blood tubing, or an unshielded radiation beam. Each of these hazards could cause harm to a patient or a clinician. To address these risks of harm, device developers are compelled to identify hazardous situations that could be created in association with the use of their devices and to implement protections against use errors that could expose users to these hazards and lead to harm. The implementation of risk control (i.e., reduction) measures is often termed risk mitigation. The “top down” approach described here (starting with identification of possible hazards and the harms they can cause, and then analyzing how use of the product could lead to such harms) is typically called a use-related hazard analysis.

5.4.2.2 In contrast to therapeutic medical devices such as infusion pumps, a use error committed with a health IT system (one that is not linked to a physical medical device) cannot harm a patient directly but can expose a patient to a hazardous situation leading to harm. That is, health IT system users can commit use errors that create hazardous situations whereby the actions or inactions resulting from health IT system use can be injurious or even deadly. Therefore, health IT system developers shall identify potential hazards, and foreseeable sequences of events, that may result in exposure of a patient to hazardous situations that could lead to harm, just as would the developer of nuclear radiation delivery software, dialysis machines, or anesthesia workstations.

5.4.2.3 Table 6 presents sample use errors and hazardous situations that may lead to harm. 671

Table 6 – Sample use errors and hazardous situations leading to possible harm

Use errors | Hazardous situations leading to possible harm
Ordered procedure for the wrong patient | Intended patient does not receive treatment, resulting in possible harm ranging from disease progression to death.
Ordered procedure for the wrong patient | Incorrect patient receives unnecessary procedure, resulting in harm ranging from temporary and/or mild discomfort to permanent injury or death.
Ordered wrong test | Delay in diagnosis due to additional time required to recognize wrong test order and re-order correct test, resulting in disease progression.
Ordered wrong test | False positive diagnosis based on results of the wrong test, and subsequent treatment, resulting in possible harm ranging from temporary and/or mild discomfort to permanent injury or death.
Stopped a medication to which a patient had an allergic reaction but did not add the medication to the patient’s allergy list | Patient is subsequently (re)prescribed that medication, resulting in a deadly allergic reaction.
Scheduled appointment for wrong patient | Delay in follow-up care for intended patient, resulting in disease progression.
Ordered an unnecessary medication | Patient receives unnecessary medication, resulting in anything from mild discomfort to a deadly allergic reaction.
Documented that an influenza vaccine had been administered without actually ordering (or administering) the vaccine | Patient does not receive influenza vaccination and contracts a form of influenza, resulting in a life-threatening pulmonary infection.
Overlooked clinical reminder and failed to order a diagnostic screening test (e.g., a mammogram) | Patient receives delayed diagnostic screening, resulting in progression of disease (e.g., breast cancer) to an untreatable state.
Input chemotherapy medication concentration with decimal point in incorrect location | Patient receives chemotherapy with toxic concentration, leading to permanent injury or death.


5.4.2.4 Risk analysts should consider how user interactions with the health IT system could create hazardous situations leading to patient harm. Note that user interactions with the health IT system cannot cause harm as directly as mishandling of a syringe can cause a needlestick injury. Rather, use errors committed when interacting with a software application can cause indirect harm. In other words, use error can lead to harmful action or inaction, as in the examples described below.

Example 1: Enters wrong patient weight

• Use Error: User measures an infant’s weight in pounds and enters the measurement into an electronic medical record field set 678 to kilograms. 679

• Harm: Infant receives an underdose of a weight-based medication. 680

Example 2: Does not deliver antibiotic 681

• Use Error: User does not notice an order to give a patient a preoperative antibiotic injection. 682 • Harm: Patient develops a wound infection. 683

5.4.2.5 Accordingly, use errors involving health IT systems can indirectly but ultimately expose patients to many types of hazards (e.g., 684 biological, chemical, electrical, mechanical, radiological, thermal). 685


5.4.2.6 Once hazards are identified, developers should determine all of the possible use errors that could lead to exposure of the patient 686 (or even provider or bystanders) to one or more hazardous situations. Fault tree analysis is a common way to determine use errors that 687 could expose the user to a hazard and potential harm. The technique calls for analysts to start with a harm and then work backwards to 688 determine what event or series of events, including one or more use errors, can lead to the harm. Such fault trees can become large 689 because the fault may stem not only from the use of health IT system data but also from use error in inputting data presented by the 690 health IT system. 691
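As an informative illustration only, a fragment of such a fault tree can be represented as nested OR/AND gates and traversed to list the basic events, including use errors, that could contribute to a harm. The events below are hypothetical and drawn loosely from the examples in Table 6.

    # Hypothetical use-related fault tree, worked backwards from a harm.
    fault_tree = {
        "harm": "Patient receives chemotherapy at a toxic concentration",
        "gate": "OR",
        "events": [
            {"event": "Use error: concentration entered with decimal point in wrong location"},
            {"gate": "AND",
             "events": [
                 {"event": "Use error: order placed in the wrong patient's chart"},
                 {"event": "Pharmacist does not detect the wrong-patient order"},
             ]},
        ],
    }

    def basic_events(node):
        """Collect every basic event (including use errors) appearing in the tree."""
        if "event" in node:
            return [node["event"]]
        collected = []
        for child in node["events"]:
            collected.extend(basic_events(child))
        return collected

    for event in basic_events(fault_tree):
        print(event)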

5.4.3 Identify potential use errors and associated hazardous scenarios (bottom-up approach) 692

In addition to taking the top-down approach of identifying hazards and linking them to use errors that can lead to exposure, developers 693 can take a bottom-up approach that calls for systematically identifying the consequences of potential use errors. In effect, this method 694 asks the question “What happens if….?” This process is often termed a use-related failure modes and effects analysis (uFMEA). 695

The first step in this approach is to perform a task analysis, defining each step users will take to complete particular tasks with the health IT software. With a preliminary user interface design, or a new version based heavily on a previous version (i.e., one that shares many of the predecessor's characteristics), the task of tracing the interactive steps involved in the selected tasks is relatively straightforward. Where the steps are not yet well defined, the task/risk analysis remains a living document until the health IT system is finalized.

Having identified the discrete tasks to be performed, a user-system interaction can be deconstructed into the following elements: 700

• Perception: acquisition of information (e.g., reading a growth chart) 701

• Cognition: mental processing of the information (e.g., deciding if a child is growing at a healthy pace and what to do if this is not 702 the case) 703

• Action: taking action based on the information (e.g., ordering a diagnostic test or a medication). 704

Deconstructing tasks into these categories constitutes a Perception-Cognition-Action (PCA) analysis, which can assist in understanding how and why potential task breakdowns may occur. Considering the health IT system users, the use environment, and the user interface, in conjunction with an understanding of the task, the users' perceptual, cognitive, and action-related needs and expectations help identify those areas of system design that may be incongruous with those needs. This information can then be used to support estimates of likelihood of harm, as well as to identify opportunities for improvement and risk mitigations. Consideration of PCA should be part of task analysis.

5.4.4 Estimate risks associated with use errors 710

Traditional risk estimation in domains other than human factors involves estimating risk as a combination of the likelihood of occurrence and the severity of harm. However, for the purposes of human factors studies, the potential severity of harm alone may be of primary concern due to the inherent difficulty in establishing estimates of use-error likelihood with a high level of accuracy. This approach is similar to that employed in other areas, such as software, where the accuracy of likelihood estimates is difficult to establish. In such cases, the severity of the potential harm alone is considered, with the likelihood estimate set to a value of “1” (i.e., the use error is assumed to occur). Within the human factors domain, it is an established practice to identify “critical” tasks (i.e., those tasks that, if performed incorrectly or not performed at all, would or could cause serious harm). Identification of critical tasks directs the focus and rigor of the human factors effort toward those areas of system design having the greatest potential for serious harm.
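An informative sketch of how these ideas might be recorded appears below: each use-related FMEA (uFMEA) row combines the PCA classification from 5.4.3 with severity-only prioritization (likelihood treated as 1). The field names, severity scale, and threshold are illustrative assumptions, not requirements of this provisional standard.

    from dataclasses import dataclass

    @dataclass
    class UseErrorRecord:
        task_step: str
        pca_category: str         # "perception", "cognition", or "action"
        potential_use_error: str
        potential_harm: str
        severity: int             # assumed scale: 1 (negligible) to 5 (catastrophic)

    CRITICAL_SEVERITY = 4         # assumed threshold for "serious harm"

    records = [
        UseErrorRecord("Read weight-based dose prompt", "perception",
                       "Misreads kg as lb", "Tenfold-scale dosing error", 5),
        UseErrorRecord("Select patient from list", "action",
                       "Selects adjacent patient", "Order placed for wrong patient", 4),
        UseErrorRecord("Review appointment summary", "cognition",
                       "Overlooks follow-up reminder", "Delayed follow-up visit", 2),
    ]

    # Critical tasks are identified by severity alone, regardless of how likely
    # the use error is believed to be.
    for r in records:
        if r.severity >= CRITICAL_SEVERITY:
            print(f"CRITICAL: {r.task_step} -> {r.potential_use_error} ({r.potential_harm})")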

That said, there are several other points to consider. Clause 4.4 of ISO 14971,10 the FDA-recognized international standard on applying risk management to medical devices, discusses estimating risks arising from hazardous situations; it states that the estimates should be based on available information or data and that they can be quantitative or qualitative. By their very nature, qualitative estimates are imprecise, and the basis of such estimates may include limited direct observational evidence. Consequently, limited direct observational evidence is not necessarily a deterrent to making an initial use-error likelihood estimate.

10 ANSI/AAMI/ISO 14971:2007, Medical devices — Application of risk management to medical devices. International Organization for Standardization; 2007. Geneva, Switzerland.


Use-error likelihood estimates serve other purposes as well. As illustrated in Figure 2 below, from the FDA Design Control Guidance for Medical Device Manufacturers,11 manufacturers identify a user need (such as performance of a critical task safely and effectively) and establish this as a design input requirement.


Figure 2 – FDA Design Control Guidance for Medical Device Manufacturers 728

Human factors techniques are then applied to reduce (control) the risk associated with the task. For such critical tasks, two levels of risk exist: one level prior to the introduction of a risk control, and a second (hopefully reduced) level of risk after introduction of a risk control. Even though the pre- and post-control levels of risk are qualitative estimates, over time an iterative process of formative studies and HF validation testing will generate data and information that increase confidence in the estimates. Ultimately, the level of confidence should be such that a manufacturer concludes that it has reduced risk to an acceptable level and is prepared for a summative usability test (Validation in the figure).

Documenting the pre- and post-levels of risk throughout the design process, as well as the test results, provides demonstrable likelihood 735 data that increases confidence in the qualitative risk estimates. This also serves to demonstrate compliance with regulatory verification 736 and validation requirements. Consequently, use-error likelihood estimates should be viewed as an integral element of a functioning and 737 effective design process. 738

5.4.5 Reduce use-related risks 739

When the severity of a particular use error is high, the risk must be reduced to a level that the developer’s risk management plan stipulates 740 is an acceptable level. Health IT system developers have little, if any, control over the severity of the harm that can result from a health 741 IT system use error, such as overdosing a patient, because the harm is a matter of biology rather than technology. 742

Therefore, health IT system risk control measures focus on reducing or eliminating the chance of a use error occurring or providing means 743 to detect and recover from the error before it causes harm. 744

Applying the user interface design principles (e.g., as presented in the AAMI Health IT Committee’s forthcoming Technical Information 745 Report on health IT design principles) is one way to reduce the risk of use error occurrence. 746

Table 7 lists several sample principles and their intended use-safety benefit. 747

11 FDA CDRH, Design Control Guidance for Medical Device Manufacturers, 1997.


Table 7 – Sample user interface design guidelines and their intended use-safety benefits

Guideline | Intended use-safety benefit
Persistently display and highlight important patient information, such as disease mechanisms, current medications, and allergies, on patient information displays. Information may be presented in a persistent header, “dashboard,” “whiteboard,” or similar user interface element that contains the key information. Do not permit the patient name to become hidden due to paging or scrolling, for example. | Reduce likelihood that a user will place a medication order for the wrong patient.
Present users with options that are appropriate to the operational mode. For example, when prescribing medication to a pediatric patient, present medication dose options that are appropriate to children rather than adults. | Reduce likelihood that a clinician will prescribe an inappropriate dose.
Provide users with a direct means to perform emergency tasks, rather than requiring them to first perform non-essential tasks; relatively unimportant and non-urgent steps could cause an undesirable delay that could lead to patient harm. For example, enable appropriate users to order an emergency medication (e.g., tissue plasminogen activator (t-PA) for ischemic stroke) without having to complete a patient admission form ahead of time. | Reduce likelihood that there will be a delay in delivering therapy to a new patient with an acute stroke who has not yet been “admitted” in the system.

The principles provided in AAMI’s forthcoming Technical Information Report on health IT user interface design principles can serve as a 749 good general starting point for identifying mitigations. However, certain risks will likely require developers to identify specific, custom 750 mitigations based on the developed health IT system’s distinct design. 751

To produce a safety-enhanced health IT system, developers must be prepared to modify a system’s user interface in response to evidence and expert judgments that certain user interface aspects are leading to potentially harmful use errors. Human factors experts describe a hierarchy of risk mitigations in which the most effective (and often the most effortful) approach is to redesign the user interface to eliminate the potential for use error. Effective but somewhat less reliable risk mitigation strategies include making the correct actions more obvious by providing cues (i.e., affordances), making the error less likely through preventative measures (i.e., constraints), or making the use error or its consequences more detectable. Generally, warnings, alerts, and instructions that essentially inform the user to “be careful” or “avoid making this error” are much less effective mitigations.

It is important that developers recognize the regulatory context as they seek to mitigate the likelihood of possible use errors in a health IT system’s design. That is, ISO 14971 (an FDA-recognized consensus standard) requires manufacturers to identify and give priority to risk control measures that provide inherent safety, followed by protective measures, and lastly information for safety, because information for safety is demonstrably less effective in reducing risk than other risk control measures. The underlying philosophy is to employ the risk control measures that will provide the highest level of effectiveness first. The EU has the same requirements, as stated in the Medical Device Directive12 and the new Medical Device Regulation13.

As discussed in clause 5.6 Development stage evaluating the user interface, developers can conduct usability tests to evaluate the effectiveness of chosen risk reduction measures. If the risk mitigations are effective, users will be able to complete tasks without committing safety-related use errors (i.e., those associated with high-severity risks). If even one user makes a safety-relevant use error despite the mitigation(s), then different or additional risk control measures may be warranted.

5.5 Development stage designing the user interface 769

5.5.1 General development stage designing the user interface 770

A comprehensive description of a health IT system user interface design process14 is outside the scope of this provisional standard, but two important considerations for the user interface design process are outlined below:

a) Ensure that user interface requirements drive the design, rather than letting technical software requirements and 773 programmers’ personal judgments override the user needs expressed in the form of user interface requirements. Additionally, 774 user interface requirements should be driven by the goals of the user. 775

b) Repeatedly evaluate the design-in-progress to determine if there are any new opportunities for use error that should be new 776 input to the risk management process. For example, it may be quite productive to evaluate a design near the completion of the 777 following stages of development: 778

• Conceptual design – which might be represented in the form of several sample screens (e.g., wireframes) and a user 779 interface structure diagram. 780

• Preliminary design – which might be represented by a medium-fidelity, interactive prototype of selected portions of 781 the application. 782

• Detailed design – which might be represented by a high-fidelity, interactive prototype of key portions of the application. 783

• Final (production-equivalent) design – which should be represented by a fully functional (i.e., production-equivalent, 784 ready for release) version of the health IT system, including the supporting learning aids. 785

5.5.2 Fulfill user interface design requirements 786

As suggested above, the user interface design effort shall be driven by the user interface requirements that, in turn, are based on user 787 needs and preferences as well as risk management efforts. In practice, this means that user interface designers shall be held accountable 788 for meeting the user interface requirements as opposed to designing the user interface according to team members’ personal judgments 789 and/or programming expedience. 790

5.5.3 Re-analyze tasks and perform additional risk management 791

As a health IT system's user interface evolves and becomes more complete, it enables a more detailed task analysis. Instead of 792 speculating about how users will perform tasks as a basis for identifying potential use errors, risk analysts can perform increasingly 793 specific and accurate task analyses resulting in a more accurate and comprehensive set of potential use errors, although the results 794 should substantially overlap with results of the initial analysis. Updated task analysis results – specifically the updated list of potential use 795 errors – shall be an input to a continuing risk management effort. 796

12 European Union. Council Directive 93/42/EEC of 14 June 1993 concerning medical devices OJ L 169 of 12 July 1993 13 European Union. Regulation 2017/745 of the European Parliament and of the Council of 5 April 2017 on medical devices 14 ANSI/AAMI HE75:2009, Human factors engineering – Design of medical devices. Association for the Advancement of Medical Instrumentation; 2009. Arlington, VA.

Page 37: Health IT Software and Systems – Part 4: Application of ... · Sherm Eagles, Software CPR 37 . David Osborn, Philips 38 . Robert Phillips, Siemens Healthineers 39 . Beth Pumo, Kaiser

31

5.6 Development stage evaluating the user interface 797

Developers shall evaluate the health IT system’s user interface throughout the design and development process, including an evaluation 798 of initial concepts, evaluation of preliminary and interim versions of the system, and validation of the “final” version of the system (i.e., 799 “default” version that exists at the close of the development stage, prior to integration and implementation for a particular client). More 800 commonly, if the “default” version supports multiple configurations, the developer should validate (a) the most likely configuration and (b) 801 the configuration(s) associated with the highest levels of use-related risk. 802

The section below describes recommended evaluation and validation methods in detail. For an overview of additional usability evaluation 803 methods, see Usability.gov’s section on Usability Evaluation Methods.15 804

5.6.1 Evaluate initial concept(s) 805

Notably, developers should evaluate the user interface design in the early stage of its development. This section describes two common 806 early-stage evaluation approaches: the expert reviews and the cognitive walk-through. 807

5.6.1.1 Conduct expert review 808

A review by experts in the clinical domain and human factors experts is a productive way to identify a health IT system’s user interface 809 design strengths and opportunities for improvement at various stages of its development. For example, experts can evaluate a health IT 810 system when it is at the concept development stage and expressed in the form of sample screens comprising one or just a few important 811 workflows. They can evaluate an early interactive prototype that includes only certain functions. And, they can evaluate more complete 812 prototypes as well as finished products, the latter for the purpose of identifying opportunities to produce an improved, next-generation 813 version of a health IT system already in use. Additionally, developers can engage expert reviewers in an extended dialogue regarding a 814 particular finding or comment, whereas it might be difficult or impractical to obtain additional feedback from usability test participants who 815 have limited time to spend in a health IT system evaluation. Therefore, it might be advantageous for a health IT system developer to 816 initiate two, three, or even more expert reviews over the course of a development effort, noting that such reviews can be completed 817 quickly and at relatively low cost. 818

An expert review can be a valuable complement to formative usability testing because it can generate additional findings over a wider 819 range of time that might arise from an analysis of test data. However, developers should not consider expert reviews to be a complete 820 substitute for formative usability testing because certain design strengths and opportunities for improvement become evident only when 821 actual users perform representative tasks with a health IT system. 822

An expert review calls for one or more experts to inspect a health IT system and provide feedback regarding its interactive qualities, 823 including the strengths and opportunities for improvement. Both domain experts (i.e., clinicians with experience in usability and human 824 factors), as well as human factors professionals who do not possesses any domain expertise should participate in expert reviews. 825

Still, certification as a human factors professional holding a degree in a user interface design-related field and similar credentials are 826 certainly hallmarks of a genuine expert. A degree of objectivity is advantageous as well, which is why it makes little sense for members 827 of the development team to conduct their own expert reviews. For this reason, developers often turn to outside consultants, or at least 828 co-workers who are not members of the particular health IT system development team, to conduct such reviews. 829

Ultimately, the productivity of a particular review depends on the selected expert(s) to possess a suitable base of knowledge and 830 experience in areas such as, but not limited to, the following: 831

• The applicable clinical domain (i.e., clinical terms, tasks, and overall workflows); 832 • User interface design and its variants (e.g., user experience design, interaction design, user interface architecture); 833 • Human factors engineering (or usability engineering); 834 • Human-computer interaction; 835 • Visual design. 836

15 See: https://www.usability.gov/how-to-and-tools/methods/usability-evaluation/index.html

Page 38: Health IT Software and Systems – Part 4: Application of ... · Sherm Eagles, Software CPR 37 . David Osborn, Philips 38 . Robert Phillips, Siemens Healthineers 39 . Beth Pumo, Kaiser

32

837

An expert review is a flexible process; one that is usually shaped by a reviewer’s or review team members’ preferred practices, which 838 may include the following activities: 839

• Develop an initial understanding by exploring the health IT system and attempting to perform tasks before getting a complete 840 introduction; 841

• Learn about the health IT system's intended uses, users, and the environments in which the system will be used; 842 • Review user interface design goals with the development team; 843 • Participate in training, most likely of a condensed form but possibly in its fullest form, to use the health IT system. Alternatively, 844

observe all or a portion of an actual training session; 845 • Scan or read in-full the health IT system learning aids; 846 • Establish a set of general (i.e., global) and specific (i.e., local) user interface design topics upon which the review will focus; 847 • Document user interface design strengths and opportunities for improvement (i.e., weaknesses, shortcomings). The term 848

“opportunities for improvement” is preferable because it is a more constructive expression that is aligned with the end-goal of 849 producing an optimal user interface. 850

851

Optionally, provide recommendations pertaining to the opportunities for improvement. Such recommendations may be narrative and/or 852 graphical depending on the topic at hand. They may vary in specificity based on the developer’s needs, the reviewers’ user interface 853 design expertise, and the development stage at which the review is conducted. 854

Expert review findings are typically presented in slideshow or narrative document formats (see Annex B Sample expert review findings). 855 Notably, expert reviews might also utilize a priority or severity scale that identifies those findings that a developer might prioritize as they 856 improve the health IT system’s design. 857

The most productive expert reviews typically employ a formal process that ensures comprehensive coverage of the topics of interest and 858 associated documentation. The review should be presented to the development team such that individual findings evoke a constructive 859 response rather than neglect or outright dismissal. In addition, the review should be presented at appropriate points within the 860 development timeline (e.g., taking into account development sprints if within the framework of an agile methodology). 861

Measures to ensure that the expert review findings have a positive influence on a health IT system in development or subject to updating 862 include the following: 863

• Draw upon the expert review results to update the use-related risk assessment. For example, review the expert review results 864 to determine if additional use errors have been identified that warrant risk analysis, or suggest modifications to prior estimates 865 of a use error’s likelihood (i.e., probability) and the severity of the harm(s) that could arise; 866

• Add the expert review findings – particularly opportunities for improvement linked to user interface design shortcomings – into a 867 tracking system that assures they will be addressed in some manner, which could include implementing a design change or 868 deciding to keep the design as is. This approach to dealing with expert review findings is akin to how software developers 869 normally track backlogged design tasks and/or programming “bugs,” ensuring that high-priority bugs or usability issues are fixed 870 before the application’s release. 871

872

A heuristic analysis is a type of expert review that serves the same basic purpose, but is worth discussion given that it is widely used 873 among software developers. The technique calls for multiple reviewers – typically two to three – to conduct independent reviews according 874 to a common set of evaluation criteria (i.e., design principles such as those presented in AAMI’s forthcoming Technical Information Report 875 regarding health IT user interface design principles). After independently developing a list of problems, the reviewers discuss and reach 876 a consensus regarding the problems and their severity according to applicable criteria such as use-safety, task completion, and usability. 877 In principle, adding more experts is likely to identify a greater proportion of the existing design problems. As such, heuristic analysis is a 878 more rule-based approach to conducting an expert review, which might or might not pay dividends in terms of the end-product’s quality. 879 Notably, a heuristic analysis is normally focused on problems as opposed to design strengths. This approach makes it best suited to the 880 evaluation of well-developed user interfaces when the goal is to perfect them, whereas an expert review that also focuses on design 881 strengths may be more helpful at an earlier stage of development. 882

An example of the "rules" that might be evaluated in a heuristic review might include: 883

Page 39: Health IT Software and Systems – Part 4: Application of ... · Sherm Eagles, Software CPR 37 . David Osborn, Philips 38 . Robert Phillips, Siemens Healthineers 39 . Beth Pumo, Kaiser

33

• Data organization. Is the data organized in useful ways that put related things together and separate unrelated things? 884

• Visibility of system status. Are users kept informed about what is going on, through timely and meaningful feedback? 885

• Does the design strive for error prevention? Clear and concise error messages are important. However, designing to prevent a 886 problem from occurring at all is the ideal? 887

A variety of human factors heuristics are available in the human factors and/or health IT domains, including by Nielsen-Shneiderman16, 888 HIMSS, and AHRQ17. 889

5.6.1.2 Conduct cognitive walkthrough 890

It might be impractical to conduct a usability test of a user interface design at the early stage of its development when the health IT system 891 might not be functional enough to support such testing. In place of a usability test, developers may engage prospective health IT system 892 users in a cognitive walkthrough. 893

Whereas a formative usability test calls for representative users to interact with a prototype of a health IT system, a cognitive walkthrough 894 requires only static screens or wireframes of screens, complemented by supporting explanations by the walkthrough administrator (i.e., 895 the administrator “fills in the blanks” left by a lack of design detail and dynamic behavior). 896

The term walkthrough delineates the process of stepping through a series of activities – a workflow – without actually performing the 897 activities. Participants in a cognitive walkthrough of a health IT system would view certain screens and share their impressions of them. 898 They would also describe how they might act if interacting with a working health IT system and discuss the underlying thought processes 899 pertaining to each step of a task (e.g., considering options, performing mental calculations, identifying missing information, arriving at 900 decisions). 901

The conceptual design stage is a good time to conduct a cognitive walkthrough, presuming that the application’s user interface is 902 instantiated in the form of static screens only, or perhaps only a marginally interactive prototype. Otherwise, it would be more productive 903 to conduct a formative usability test. 904

A good sample size for a cognitive walkthrough is the same as a formative usability test of an early prototype – perhaps 5-10 individuals. 905 However, the variety of health IT system users might warrant a larger sample, albeit one that includes only a few representatives of each 906 type of user. 907

The technical steps are similar to a formative test (see clause 5.6.2.1 Conduct formative usability test). In brief, they may include: 908

• Determining cognitive walkthrough goals; 909 • Developing cognitive walkthrough plan; 910 • Recruiting the participants; 911 • Conducting the cognitive walkthrough sessions; 912 • Consolidating and analyzing the data; 913 • Documenting the results. 914

As compared to formal usability tests, walkthroughs can be conducted by a single researcher more readily; in this case, consider recording 915 the sessions for use in data analysis. However, a two-person research team might still be warranted if the walkthrough covers a significant 916 amount of content. In place of a comprehensive report, the research team might choose to write a concise memo to document the results. 917 Alternatively, the team might choose to convey the study findings in a briefing, capturing the major points in meeting notes. 918

16 See: Nielsen-Shneiderman Heuristics: A tool for evaluating product usability https://www.patientsafety.va.gov/docs/TIPS/usability_tool.pdf

17 See: https://healthit.ahrq.gov/health-it-tools-and-resources/evaluation-resources/workflow-assessment-health-it-toolkit/all-workflow-tools/heuristic-evaluation

Page 40: Health IT Software and Systems – Part 4: Application of ... · Sherm Eagles, Software CPR 37 . David Osborn, Philips 38 . Robert Phillips, Siemens Healthineers 39 . Beth Pumo, Kaiser

34

A cognitive walkthrough should be a helpful, convenient exercise rather than a burden to the development process. Still, similar to the 919 way developers should track expert review findings, developers should incorporate the findings from a cognitive walkthrough into the risk 920 management and design optimization efforts. 921

5.6.2 Evaluate the preliminary user interface 922

When a health IT system's user interface has evolved beyond the concept stage and a functional prototype is available, an expert review 923 might still be in order. An expert review takes relatively little time and can be very productive. However, a functional prototype enables a 924 usability test, which is generally considered the optimal way to evaluate a software user interface because it can reveal problems that 925 might go undetected in an expert review. When testing involves a design in formation, it is commonly called a formative usability test. 926

Developers should conduct formative usability tests of their health IT system design. This technique is described below. 927

5.6.2.1 Conduct formative usability test 928

Formative usability testing calls for prospective users to perform hands-on tasks with a prototype of the health IT system. As suggested 929 above, the prototype may be fully or partially functional. Indeed, while some portions of the user interface might be refined, others might 930 be static or missing altogether. The point is to invite prospective users to interact with the health IT system at its early stage of 931 development, when design and software changes are relatively easy to implement, and then at subsequent stages of its development to 932 ensure the design is on the right path toward a successful validation (see clause 5.3.4 Validate the user interface). 933

Human factors research suggests that even small-scale formative usability tests, involving as few as 5-10 prospective users, generate 934 valuable results. As part of an agile approach, it is advisable to conduct multiple small-scale formative usability tests at various points in 935 the development process, rather than one large one at a single point in the development process. 936

Formative usability tests usually have the following characteristics: 937

• Test participants represent the intended users. Ideally, but with practical exceptions, test participants should not be development 938 organization employees, even if they might have previously served in a role matching that of an intended user (e.g., as a 939 clinician). Rather, participants should be recruited from the organizations that might use the health IT system in the future, such 940 as nurses working in large medical practices; 941

• Testing is guided by a plan (i.e., protocol) that describes the test goal, test method, test participants, state of health IT system 942 development (i.e., the prototype’s functional and visual fidelity), hands-on tasks, and data to be collected. NIST provides a 943 sample usability test plan on its website18; 944

• Test participants should think aloud while interacting with the prototype health IT system so that testing personnel can 945 understand the participants’ thought processes and identify the root causes of interaction problems that arise; 946

• Testing reveals the following types of usability-related events, which will indicate where the health IT system user interface might 947 require modification to ensure its safe, effective, and satisfying use: 948 o Use errors – in which a participant uses the health IT system in a manner that does not match its intended use and/or could 949

lead to patient injury; 950 o Close calls – in which a participant almost commits a use error or avoids a use error by relying on her/his vigilance (as 951

opposed to guidance provided by the user interface); 952 o Difficulties – in which participants struggle to complete a particular task. 953

• Formative testing presents multiple opportunities to collect the test participants’ opinions about health IT system features they 954 like and dislike, thereby identifying opportunities for design improvement; 955

• Test findings should be described in a document that may include design recommendations. NIST advises that documents 956 include the following contents:19 957 o Executive summary 958 o Introduction 959

18 Schumacher, R. M., & Lowry, S. Z. (2010). NIST guide to the processes approach for improving the usability of electronic health records. National Institute of Standards and Technology.

19 Schumacher, R. M., & Lowry, S. Z. NISTIR 7742: Customized Common Industry Format Template for Electronic Health Record Usability Testing. 15-November 2010.

Page 41: Health IT Software and Systems – Part 4: Application of ... · Sherm Eagles, Software CPR 37 . David Osborn, Philips 38 . Robert Phillips, Siemens Healthineers 39 . Beth Pumo, Kaiser

35

o Method 960 o Participants 961 o Study design 962 o Procedure 963 o Test location 964 o Test environment 965 o Test forms and tools 966 o Participant instructions 967 o Usability metrics 968 o Results 969 o Data analysis and documentation 970 o Discussion of the findings 971 o Appendices 972 o Sample recruiting screener 973 o Participant demographics 974 o Non-disclosure agreement, if used, and informed consent form 975 o Example moderator’s guide 976 o System usability scale questionnaire (if used) 977 o Incentive receipt and acknowledgment form (if used). 978

5.6.3 Verify the user interface 979

5.6.3.1 Developers can ensure that health IT systems fulfill user interface requirements by verifying a final user interface design. 980 Verification is a matching exercise, often performed and documented using a spreadsheet or other requirements-tracking application, 981 that seeks to ensure there is a design feature or behavior to match each user interface requirement. The exercise serves to ensure that 982 no user interface requirement has been overlooked and/or unmet. 983

5.6.3.2 Developers often carry out user acceptance testing (UAT) to verify the user interface requirements. For example, in FDA’s parlance 984 regarding medical devices, verification calls for developers to confirm that there is a design output (i.e., user interface design feature or 985 behavior) to match each design input (i.e., user interface design requirement). For example, if there is a requirement to place a meaningful 986 title at the top of every window and major grouping, verification calls for the inspection of every health IT system window and major 987 grouping to ensure that such titles are present. 988

5.6.3.3 A health IT system's user interface may be considered verified when all user interface requirements have been met by the final 989 (i.e., production-equivalent) user interface design. In addition to user interface verification, the developer shall validate the user interface. 990

5.6.4 Validate the user interface 991

5.6.4.1 Finally, developers shall validate the final user interface. 992

5.6.4.2 While a developer might produce a health IT system that meets established user interface requirements (i.e., is verified), it does 993 not ensure that the intended users will be able to interact with it safely, effectively, or with satisfaction. Therefore, design verification is 994 insufficient evidence of a user interface’s design quality. Further evidence of interactive quality is needed and can be provided by 995 performing a step called validation. 996

5.6.4.3 To validate a user interface is to conduct a summative (i.e., final) usability test on a production-equivalent health IT system. Such 997 testing differs from formative usability testing because the focus is no longer on identifying design strengths and opportunities for 998 improvement, but rather on determining if users who interact with the health IT system in a representative manner are able to complete 999 tasks successfully and without committing use errors that could cause harm. 1000

5.6.4.4 Notably, the developer should define the configuration of the production-equivalent health IT system used to conduct the usability 1001 test. This system might be the “default” version that exists at the close of the development stage, prior to integration and implementation 1002 for a particular client. More commonly, if the “default” version supports multiple configurations, the developer should validate (a) the most 1003 likely configuration and (b) the configuration(s) associated with the highest levels of use-related risk. 1004

Page 42: Health IT Software and Systems – Part 4: Application of ... · Sherm Eagles, Software CPR 37 . David Osborn, Philips 38 . Robert Phillips, Siemens Healthineers 39 . Beth Pumo, Kaiser

36

5.6.4.5 Summative usability testing is the standard approach to validate a medical device’s use-safety and effectiveness, and both US20 1005 and international standards2122 on the application of human factors engineering to medical devices call for such testing. 1006

5.6.4.6 Summative usability tests should have the following characteristics: 1007

• Test participants include representatives of each distinct group of intended users. Most health IT system user populations will 1008 include multiple distinct user groups. 1009

• The number of test participants is usually greater than the number included in a formative usability test. For example, in the 1010 medical device industry, the standard of care for summative usability testing is to include 15 representatives of each distinct user 1011 population, suggesting a minimum sample size of 15, 30, and 45 participants for health IT systems that will be used by people 1012 who individually fit into 1, 2, and 3 distinct user populations (e.g., physicians, registered nurses, and medical assistants), 1013 respectively. 1014

• Testing is guided by a plan (i.e., protocol) that describes among other details the test goal, test method, test participants, state 1015 of the health IT system prototype, hands-on tasks, and data to be collected. See NISTIR 7742 for a sample usability test plan. 1016

• Test participants are not directed to think aloud while interacting with the health IT system, as they might be instructed to do in 1017 a formative usability test, because the information sharing technique can affect how people interact with the health IT system in 1018 ways that do not serve validation purposes. 1019

• Testing focuses on the riskiest tasks. The objective is to put the high-risk use error mitigations (i.e., within the user interface, 1020 instructional material, and training) to the test by calling upon test participants to perform the associated tasks and see if the 1021 error occurs or if there are patterns of close calls and difficulties that suggests a greater chance of use errors occurring in real-1022 world settings. 1023

• Test personnel seek to identify the root causes of any use errors and patterns of close calls and difficulties based on their 1024 observations of test participants performing tasks and associated interviews that come at the end of the test when the participant 1025 has performed all tasks. A key interview question to pose after a test participant has attempted all hands-on tasks is whether he 1026 or she believes the user interface is safe and effective as designed or needs modification to ensure that it is. 1027

• Test findings are described in a document that may include design recommendations. See NISTIR 7742 for a sample usability 1028 test plan. 1029

• While the participant completes use scenarios, the moderator will not ask follow-up questions to solicit subjective feedback. This 1030 approach limits opportunities for bias to be introduced to the usability test’s findings. Rather, the moderator will hold most 1031 questions (apart from those intentionally asked during post-task debriefs) until the final interview (i.e., the post-test debrief). 1032

5.6.4.7 With test results and associated analyses in hand, developers shall perform a residual risk analysis23 on any use errors that 1033 occurred during the test as well as patterns of close calls and difficulties suggesting a heightened chance of use error. If the residual risk 1034 analysis suggests that the health IT system is reasonably safe and effective, a developer may conclude that the application has been 1035 validated. If the residual risk analysis suggests that the health IT system is not reasonably safe and effective, the health IT system will 1036 require modification and follow-up validation. 1037

5.6.4.8 See Annex A Considerations for conducting validation usability tests for additional guidance on conducting validation usability 1038 tests. 1039

20 ANSI/AAMI HE75:2009, Human factors engineering – Design of medical devices. Association for the Advancement of Medical Instrumentation; 2009. Arlington, VA.

21 ANSI/AAMI/IEC 62366-1:2015, Medical devices – Part 1: Application of usability engineering to medical devices. Association for the Advancement of Medical Instrumentation; 2015. Arlington, VA.

22 AAMI/IEC TR 62366-2:2016, Medical devices – Part 2: Guidance on the application of usability engineering to medical devices. International Electrotechnical Commission, 2016. Geneva, Switzerland.

23 ANSI/AAMI/ISO 14971:2019, Medical devices—Application of risk management to medical devices. International Organization for Standardization; 2019. Geneva, Switzerland.

Page 43: Health IT Software and Systems – Part 4: Application of ... · Sherm Eagles, Software CPR 37 . David Osborn, Philips 38 . Robert Phillips, Siemens Healthineers 39 . Beth Pumo, Kaiser

37

6 Acquisition stage human factors engineering 1040

6.1 Procurement 1041

6.1.1 The Business Owner shall work with internal User subject matter experts (i.e., User SMEs, individuals responsible for 1042 representing clinical user needs in unique situations) to develop scenarios of use for key capabilities that represent the organization’s 1043 practice goals (including current pain points with care delivery and desired practice improvements). The User SMEs should support 1044 scenario development by documenting scenarios that represent key aspects of healthcare delivery (clinical work, administrative tasks, IT 1045 tasks, etc.). 1046

6.1.2 The Business Owner shall ensure that a human factors engineer with healthcare experience can be part of the selection team 1047 and participate in contract negotiations. 1048

6.1.3 The Business Owner shall determine how usability assessments of vendor products will be assessed and compared, and how 1049 usability will be factored into the overall decision matrix. 1050

6.1.4 The Business owner shall engage human factors engineers to conduct a usability evaluation of the top two or three vendor 1051 products to identify usability and safety risks. Users shall walk through scenarios with each vendor products and generate a risk score 1052 based on the number and severity to issues identified. 1053

6.1.5 The Business Owner, including user SMEs and human factors engineers, shall establish quality gates at key points during 1054 integration and implementation to assess the usability and safety of the implemented system. Such gates include acceptance testing and 1055 post-training user testing. 1056

6.1.6 The Business Owner, working with human factors engineers, shall specify usability and safety issues, describe how issues are 1057 captured and tracked, and define how issue severity will be determined (including requirements for addressing issues).24 1058

6.1.7 The Business Owner, working with human factors engineers, shall ensure that pertinent information about the workflow and 1059 context of use for the software or system that is ultimately selected, including their assessment of the usability risks and mitigations that 1060 will be required, is communicated to the Integrator. 1061

6.2 Guidance and good practice for procurement 1062

In preparing to address human factors engineering aspects during the selection process, the Business Owner should: 1063

• commission user research within the organization to identify opportunities to improve care delivery and usability issues with the 1064 current health IT system. Aspects of the current system that work well should be documented so they can be retained/replicated 1065 in the replacement system; 1066

• involve internal or external human factors engineering resources in investigating known usability and safety issues associated 1067 with the types of health software and systems solutions likely to be proposed by vendor, as part of market research on competing 1068 electronic health record (EHR) vendors, products, and services; 1069

• provide an overview of how usability will be assessed as part of product selection in the RFP. In doing so, it is important to first 1070 communicate the scenarios of use for key capabilities that represent the organization’s practice goals (including current pain 1071 points with care delivery and desired practice improvements). This overview should then include a sample of representative 1072 scenarios (and supporting materials) that will be used for demos and usability assessments. The overview should also specify 1073 the expectations of the vendor to support the usability assessment of their product; 1074

• request that vendors provide a list of current customers who can be interviewed and/or visited by members of the selection 1075 team, including human factors engineers. 1076

At the acquisition stage, the healthcare organization has obtained information from prospective vendors on their corporate profile, product 1077 capabilities and features, hardware and network requirements, and product cost estimates. The organization is now ready to determine 1078 which product or system will be the best option for their environment. In assessing the human factors aspects of the proposed solutions, 1079 the Business Owner should: 1080

• ensure vendors describe their EHR software engineering process in the context of usability and patient safety; 1081

24 See National Quality Forum’s recommendations for additional guidance: NQF: Identification and Prioritization of HIT Patient Safety Measures.

Page 44: Health IT Software and Systems – Part 4: Application of ... · Sherm Eagles, Software CPR 37 . David Osborn, Philips 38 . Robert Phillips, Siemens Healthineers 39 . Beth Pumo, Kaiser

38

• ask vendors to provide evidence of a user-centered design process that includes formative, summative, and post-1082 implementation testing; 1083

• ensure vendors describe how quality, usability and safety will be a focus during implementation; this description should address 1084 use-related risk management of product configuration and customization; 1085

• ask vendors to provide evidence (standard operating procedures, process descriptions, documentation) of monitoring quality 1086 and safety risks; 1087

• contact current vendor customers of similar setting, specialty, or practice size to inquire about satisfaction with product quality, 1088 usability and safety, as well as the implementation process; 1089

• investigate known usability and safety issues associated with the vendor EHRs as part of market research on competing EHR 1090 vendors, products, and services. 1091

Additionally, the Business Owner and representatives of the intended users (e.g., user SMEs) can arrange to visit customers of similar 1092 setting, specialty, or practice size, of the top two or three vendors. Users can observe two or more practices using the product(s) being 1093 considered in order to assess user experiences with the implementation process and to inquire about user satisfaction with quality, 1094 usability and safety. 1095

At this stage, the healthcare organization has evaluated vendor products, references and corporate stability, and assessed the vendors’ 1096 ability to conduct a safe and effective implementation. The organization determines which product(s) or system(s) they wish to acquire, 1097 who is responsible for delivery of the various elements and stages and how the product(s) and system(s) are to be integrated and 1098 implemented into their specific socio-technical environment in order to meet the health delivery organization’s procurement goals. 1099

In transitioning to the Integration stage it is important that the Business Owner communicate pertinent information and insights obtained 1100 through the acquisition process about the selected software to the Integrator, including details of the intended use environment, the 1101 required adaptations to the system through configuration and customization that will be required to align with the organization’s clinical 1102 workflow and their assessment of the requisite risks involved. 1103

7 Integration stage human factors engineering 1104

7.1 General Integration stage human factors engineering 1105

In the Integration stage, the systems’ expected functionality and interoperability with other systems in the HDO’s socio-technical 1106 ecosystem is translated into detailed specifications and project plans by a cross-functional team that includes the Business Owner, 1107 Software Developers, Integrator, Implementer, Human Factors Engineers and a representative cross-section of Users of the system. 1108

A theme of this section is balancing the usability needs of the Users as configuration and customization decisions are made. 1109 Configurations and customizations are intended to improve efficacy of the system. Human factors expertise is important to ensuring that 1110 configurations and customizations do not introduce unintended usability degradations. 1111

Per HIT1000-1, the transition from the Acquisition to the Integration stage occurs when the Business Owner provides the planned context 1112 of use of the HIT software in the health IT system and healthcare sociotechnical eco-system, and any known requirements for 1113 configuration or customization of the health IT software, training of Operators or Users, or special testing and monitoring of the integrated 1114 health IT system to the Integrator. 1115

At the Integration stage’s conclusion – and transition to the Implementation stage – the Integrator provides the Implementer additional 1116 information in the safety assurance case about any hazards (including those involving human factors and usability) that were identified 1117 during integration, including those that may have emerged during configuration and customization. Assumptions, mitigation strategy, and 1118 evidence or rationale for adequacy of mitigations are also provided. Any hazards that are expected to be mitigated during implementation 1119 are identified. 1120

7.2 Integration stage of human factors engineering process 1121

7.2.1 The Integrator shall engage individuals with human factors experience in health care in the integration process wherever use-1122 related risk is involved; 1123

7.2.2 The Integrator shall review the safety assurance case documentation provided by the Developer and assess the impact during 1124 the Integration stage of the Developer-identified use-related risks, as well as identifying additional use-related risks resulting 1125

Page 45: Health IT Software and Systems – Part 4: Application of ... · Sherm Eagles, Software CPR 37 . David Osborn, Philips 38 . Robert Phillips, Siemens Healthineers 39 . Beth Pumo, Kaiser

39

from with variations between the intended uses defined by the developer and the use environment specified by the business 1126 owner; 1127

7.2.3 The Integrator shall identify constraints that influence system requirements, architecture, and/or design; 1128 7.2.4 The Integrator shall consult with developers of the products involved as well as an appropriate cross-section HDO leaders and 1129

user representatives to ensure the use-related risks and impacts (including interdependences of the key configurations) are 1130 understood and managed appropriately when making configuration and customization decisions; 1131

7.2.5 The Integrator shall identify the potential risks of the planned customizations and discuss with the Business Owner; 1132 7.2.6 As a component of the integration process, the Integrator shall conduct usability evaluations after configuration and 1133

customization with representative users and then act on findings, prioritizing based on use-related risk; 1134 7.2.7 The Integrator shall define a testing plan that verifies the correct operation of the system and then documents this for the 1135

Implementer and Operator; 1136 7.2.8 The Integrator shall track configurations made as part of the change management process. 1137 7.2.9 The Integrator, working with human factors engineers, shall provide the Implementer with additional information in the safety 1138

assurance case about any hazards that were identified during integration, including those that may have emerged during 1139 configuration and customization. Assumptions, mitigation strategy, and evidence or logic for adequacy of mitigations are also 1140 provided. Any hazards that are expected to be mitigated during implementation are identified. 1141

7.3 Guidance and good practice for integration stage of human factors engineering process 1142

Because the integration stage deals with connecting one type of health IT system with another type of health IT system or device, many 1143 steps will actually involve multiple developers in the expected task. For example, if one type of health IT is being interfaced to another 1144 type, the developers of each system might be involved. 1145

The Integrator should: 1146

• review use-related risk analysis documentation from the developers and use it as a starting point for conducting integration-1147 specific risk analysis. Clinical input and human factors expertise are essential in assessing the risks of a particular integration 1148 from a usability and clinical process and safety perspective; 1149

• obtain best practice recommendations regarding configuration or customization of the product from the Developer. 1150 Recommendations should be sensitive to the varied needs of expected users and reflect experience and feedback the 1151 Developer has obtained (including incident reports) from customers with similar business requirements). For example, 1152 recommendations for a community-based rural hospital might be different than for a large academic hospital; 1153

• engage and consult with the following types of experts when making configuration and customization decisions: 1154

1. the developers of the products; 1155 2. key stakeholders such as user representatives and clinical leaders; 1156 3. human factors engineers 1157

• evaluate the relationship of usability and workflow alignment with the risks of configuration or customization while deciding on 1158 an appropriate scope of integration; 1159

• consider information about current workflows (often working with implementers in doing this step) and use the insights to inform 1160 the configuration and customization steps. Known usability issues with the current state should in particular be considered; 1161

• thoroughly test the integrated system from both a technical and clinical perspective (including workflow, usability and data 1162 quality perspectives); 1163

• share the results of integration testing, usability evaluations and use-related documentation with the developers to inform future 1164 product development and best practice recommendations; 1165

• provide the complete set of usability evaluations and use-related risk documentation to the Business Owner and Implementer, 1166 along with integration stage updates to the Safety Assurance Case at the transition to the next stage. 1167

8 Implementation stage human factors engineering 1168

8.1 General Implementation stage - human factors engineering 1169

Highly usable health IT software and systems contribute to well-functioning health IT software and systems by decreasing the cognitive 1170 load on practitioners, by not impairing the speed of clinical workflow, and by leading to fewer practitioner errors in use. Involving end 1171 users at every stage of system development, integration, implementation, and support will lead to higher adoption of more usable systems. 1172

Page 46: Health IT Software and Systems – Part 4: Application of ... · Sherm Eagles, Software CPR 37 . David Osborn, Philips 38 . Robert Phillips, Siemens Healthineers 39 . Beth Pumo, Kaiser

40

Well-functioning, usable health IT software and systems are ultimately intended to fulfil specific functions and goals in an effective and 1173 safe manner, where effectiveness focuses on the degree to which health IT software and systems facilitate the user accomplishing the 1174 intended task. Essentially, effective and safe health IT software and systems are frequently associated with the degree to which errors 1175 are avoided when performing a function, and tasks are successfully achieved. 1176

Distinct from the concept of usability, health IT software and systems possess features and functions that enhance user performance 1177 such as by offering options for data entry, suggest best practice for a given workflow, provide feedback to the user based on their entries, 1178 and display evidence and knowledge for the specific user function at hand. Such feedback loops apply not only to administrative and 1179 clinical practices, but also to the implementation and refinement of health IT software and systems. 1180

The roles of Integrator and Implementer are complementary. Integrators deal with the technical installation, configuration, integration with 1181 other systems and any necessary customization of the health IT software and systems for the organization’s specific health IT 1182 environment, while implementers then deal with the clinical, business, and human workflow context of health IT software and systems 1183

This section of the document describes the various components and actors involved in the Implementation stage of well-functioning health 1184 IT software and systems that lead to highly usable systems that help users perform easily, effectively, consistently and safely. 1185

At the Implementation stage’s conclusion – and transition to the Operational Use stage – the Implementer documents any specific actions 1186 needed by the Operator to maintain safety during use of the HIT software in the health IT system and any hazards that may need special 1187 attention on decommissioning and disposal of the health IT software in the safety assurance case. 1188

8.2 Implementation stage - human factors engineering process 1189

8.2.1 The Implementer shall engage individuals with human factors experience in similar health care system implementations; 1190 8.2.2 The Implementer shall review the safety assurance case documentation provided by the Integrator and assess their impact, as 1191

well as identifying additional use-related risks resulting from the specific implementation environment; 1192 8.2.3 The Implementer shall involve clinical leaders and a representative cross-section of the targeted user community throughout 1193

the implementation process, as well as consulting with the Developer(s) and other similar implementation sites as appropriate: 1194 8.2.4 Leveraging experiences with any pre-existing systems, the Implementer shall conduct a thorough workflow assessment to 1195

optimize the effectiveness of the new health IT software and system in improving the organization’s health care delivery 1196 environment, as well as minimizing use-related risks and negative impacts on clinical workflow and safety; 1197

8.2.5 The Implementer shall ensure the system decision support rules being implemented in the system align with the clinical best 1198 practices adopted by the organization for the targeted clinical environment(s) and are transparent to the users with appropriate 1199 thresholds to avoid cognitive burden, as well as processes for documenting clinical exceptions and updating the rules as clinical 1200 practices and new treatments evolve; 1201

8.2.6 The Implementer shall ensure that there are consistent ways for users to accurately and readily identify patients within, and 1202 across, the multiple systems that clinicians may use at various points in their workflow, as well as appropriate patient 1203 identification and data quality edits in order to catch potential errors and facilitate their correction at source; 1204

8.2.7 The implementer shall utilize an appropriate process for managing changes in clinical work processes as a result of the system, 1205 addressing human factors that may impact its safe use and supporting the transition and adoption of the new health IT system; 1206

8.2.8 The Implementer shall ensure that staff are appropriately skilled in supporting the new system and system users are well trained 1207 in its safe and effective use as a means for improving patient care quality and safety (See HIT1000-3): 1208

8.2.9 The Implementer shall develop a deployment plan for the system that includes pre‐production environments to test of all 1209 functions of the system using ‘real world’ scenarios as well as to support evaluation of the new system’s usability prior to full 1210 roll-out so that final adjustments to the system’s implementation can be made; 1211

8.2.10 The Implementer, working with human factors engineers, shall provide the Operator with additional information in the safety 1212 assurance case about any hazards that were identified during the implementation stage. Assumptions, mitigation strategy, and 1213 evidence or logic for adequacy of mitigations are also provided. Any hazards that are expected to arise and should be carefully 1214 monitored are to be identified for the Operator. 1215

8.3 Guidance and good practice for implementation stage - human factors engineering process 1216

Workflow assessment and optimization best practices enhance the effectiveness and safety of the health IT software and system as 1217 implemented in the organization’s health care delivery environment by minimizing negative effects on user workflow and taking full 1218 advantage of benefits of new health IT software and systems. 1219

Page 47: Health IT Software and Systems – Part 4: Application of ... · Sherm Eagles, Software CPR 37 . David Osborn, Philips 38 . Robert Phillips, Siemens Healthineers 39 . Beth Pumo, Kaiser

41

Implementers should involve a cross-section of expected users from each area of pertinent expertise in: 1220

• assessing significant usability issues and problems associated with the outgoing (“sun-setted”) system, including incident 1221 reports, safety logs and work arounds; 1222

• aligning system decision support rules with consensus clinical best practices for the targeted organizational environment(s). 1223 • observing the current workflow and asking users about their desired workflow for the new system using interviews, checklists, 1224

and surveys, while taking into account the new system’s capabilities and functions; 1225 • designing and then validating the chosen workflow steps to meet the needs of all participants and enhance effectiveness and 1226

safety of the new system. 1227

Some consider the best health IT software and systems to be those that do not change any users’ workflows. Others feel that the best 1228 health IT software and systems are those with features and functions leading to improved practice and patient outcomes, despite 1229 modifying many user workflows. Experienced Implementers strive for a balance of the two ends of a spectrum when a new system is 1230 installed and are aware of the organization’s readiness and senior leadership support for change, including issues that are both barriers 1231 to, and facilitators of, the change for the various types of users involved. 1232

To avoid user fatigue, decision support and data capture features should similarly be implemented to engage the user and facilitate 1233 decision making where critical patient safety risks are involved, while avoiding too many interruptive decision support alerts and data edit 1234 rules that disrupt workflows and can lead to workarounds. 1235

Implementers strive to meet day-to-day needs of users to make the new system user-friendly, but should be thoughtful and cautious when 1236 altering the system to the organization’s or individuals’ preferences, since local customizations and optionality often introduce new risks 1237 and can be difficult to maintain over the life of the system. 1238

Accurate patient identification and data are vital elements in ensuring that health IT systems provide complete and accurate information 1239 to clinicians in enabling safer care. Data quality and usability can be enhanced with the effective deployment and training on features 1240 such as range-checking, pick-lists, standardized terminologies and options to enter free text where appropriate (e.g., for supplementary 1241 notes and explanations). 1242

Before the health IT system can be put into productive use, it needs to be configured with real-world data so that users in each clinical or 1243 administrative specialty can run through complete simulation scenarios (“day-in-the-life”) with scripts to confirm system completeness 1244 and readiness for production. Depending on the scale of the implementation and its impact on existing clinical workflows, a short pilot 1245 or limited production roll-out may then be appropriate. This ensures the new system can be further evaluated and fine-tuned in a 1246 supportive clinical environment to secure further user feedback and enhance the system’s readiness for a wider system rollout by 1247 addressing any previously unforeseen usability or safety issues – e.g. through workflow adjustments, adjustments to alerts, training, etc. 1248

9 Operational use in the clinical setting stage - human factors engineering 1249

9.1 General operational use in the clinical setting stage - human factors engineering 1250

While the Operator is directly accountable to the Business Owner for the safety and effective use of the HIT system, changes occur 1251 throughout the ecosystem that the system operates within, e.g., changes in clinical processes, workflow and data; changes in other 1252 supporting systems, integrations and technology within the HDO’s health IT infrastructure; as well as updates by the Developer to the 1253 software itself. The Developer, therefore, shares important responsibilities at the operational stage with its customers and close Operator-1254 Developer collaboration is essential, especially in identifying and managing usability issues that can impact safety and effectiveness. 1255

HFE activities to support both the Operator and the Developer during Operational Use in the clinical setting include: 1256

• post-deployment monitoring, surveillance and safety event management, 1257 • analysis of workflow, information flow and decision making, 1258 • user-centered simulation and modelling of the proposed processes in support of system modifications, and 1259 • usability testing for changes in the software product and its implementation. 1260

Close collaboration and good communication between Operators and Developers is essential in identifying and mitigating risks to safety 1261 and effectiveness for systems at the operational stage. 1262

Page 48: Health IT Software and Systems – Part 4: Application of ... · Sherm Eagles, Software CPR 37 . David Osborn, Philips 38 . Robert Phillips, Siemens Healthineers 39 . Beth Pumo, Kaiser

42

9.2 Operational use in the clinical setting stage - human factors engineering process 1263

Throughout the operational use stage, the Operator is involved in activities associated with post-deployment monitoring, event 1264 management, reporting and resolution in their specific clinical setting, providing feedback to the Developer on usability issues. For 1265 upgrades and updates, the Operator supports Developer efforts to correct faults and to improve technical performance. 1266

The Operator shall: 1267

9.2.1 Engage individuals with human factors experience in managing usability issues during the operational stage and ensure clinical 1268 leaders and a cross-section of Users are involved in these processes as appropriate; 1269

9.2.2 Review the safety assurance case documentation provided by the Implementer and assess the impact on operations, paying 1270 special attention to actively monitoring key residual risks to usability that are identified; 1271

9.2.3 Perform post-deployment monitoring, identifying usability-related issues as a distinct component in its safety event or incident 1272 reporting system and collaborating with the Developer; 1273

9.2.4 Investigate and prioritize reported/discovered safety and usability issues in a timely fashion, identifying affected patients and 1274 root causes; as well as reporting safety and usability issues, potential workarounds and results to the Developer as appropriate; 1275

9.2.5 Address issues that are within their ability to fix by implementing workflow changes, workarounds, changes in the software’s 1276 configuration and/or installing and implementing updates to address identified patient safety risks, understanding that 1277 unintended breaks to the system may follow an update which will need to be tested before rollout; 1278

9.2.6 Review new features and updates from a usability standpoint before implementing them, in order to anticipate potential 1279 problems from the system and periodically evaluate and update workflow analysis documents, especially where system 1280 updates or clinical/business process updates or changes are likely to affect how the system is optimally used; 1281

9.2.7 Periodically review safety issues such as adverse events, near misses, and/or unsafe conditions reported in systems other 1282 than the safety event reporting system, inclusive of their IT help desk and available vendor reports (e.g., latency reports, near 1283 misses, how much time users are spending on the system); 1284

9.2.8 Monitor and optimize the supporting health IT infrastructure for the system and its impact on the usability, minimizing burdens 1285 (e.g., system downtime) associated with changes so they have minimal impact on users’ work activity; 1286

9.2.9 Document any hazards related to usability that may need special attention on decommissioning and disposal of the health IT 1287 software in the safety assurance case. For example, hazards relevant to the decommissioning stage could be “loss of patient 1288 data,” or “inadvertent disclosure of patient data.” 1289

For the Developer, post-market surveillance for health IT usability errors is critical to providing safe and effective user interfaces for its customers, and customer collaboration in reporting usability issues is essential to that surveillance. Throughout operational use in the clinical setting, the Developer maintains shared responsibility for providing safe health IT software, working in collaboration with the Operator. As part of upgrades and updates, the Developer will provide the Operator with best practices aimed at minimizing training burden and costs, while at the same time optimizing usability and reducing the risk of patient and clinician harm.

The Developer shall:

9.2.10 Collaborate with the Operator in conducting post-deployment monitoring and lessen Operator burden by providing best practices, tools, and guidance, including a post-implementation plan, measures, and dashboards (an illustrative sketch of such measures follows this list);

9.2.11 Establish internal processes to address usability and patient safety issues, as well as processes to actively collaborate with other stakeholders, primarily the Operator, to address issues, needs, and risk reduction;

9.2.12 Investigate, triage, and prioritize reported safety and usability issues in a timely manner, reporting the results (including the estimated timeframe for necessary software fixes) so that the Operator can implement corrective and/or preventive actions;

9.2.13 Update internal use-related risk analysis documents based on post-deployment monitoring insights, and communicate updates and changes to their customer base through advisories and safety assurance case updates;

9.2.14 Incorporate insights from post-deployment monitoring, including observed data regarding usage patterns and user behaviour, into future software updates and releases, as well as documentation supporting safe and effective use;

9.2.15 Minimize the training burden, impact on workflow, usability risks and associated costs when product updates and new enhancements are introduced;

9.2.16 Employ HFE best practices to improve user effectiveness, efficiency and satisfaction, and reduce risk of patient and clinician harm during system design, development and modifications.
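
The following minimal sketch (illustrative only, not part of this standard) shows one way post-implementation usability measures such as those referenced in 9.2.10 might be derived from a hypothetical application event log and summarized for a dashboard; the field names and measures are assumptions, not a data model defined by this document.

from collections import defaultdict
from statistics import mean

# Hypothetical post-deployment event log records (field names are assumptions).
events = [
    {"task": "med_order", "user_role": "nurse", "duration_s": 42,
     "alert_overridden": True, "usability_tagged": False},
    {"task": "med_order", "user_role": "nurse", "duration_s": 95,
     "alert_overridden": True, "usability_tagged": True},
    {"task": "med_order", "user_role": "physician", "duration_s": 38,
     "alert_overridden": False, "usability_tagged": False},
]

def summarize(events):
    """Aggregate simple per-task measures that could feed a post-implementation dashboard."""
    by_task = defaultdict(list)
    for e in events:
        by_task[e["task"]].append(e)
    summary = {}
    for task, rows in by_task.items():
        summary[task] = {
            "mean_duration_s": round(mean(r["duration_s"] for r in rows), 1),
            "alert_override_rate": sum(r["alert_overridden"] for r in rows) / len(rows),
            "usability_tagged_events": sum(r["usability_tagged"] for r in rows),
        }
    return summary

print(summarize(events))

In practice, the specific measures, data sources, and thresholds would be agreed between the Developer and Operator as part of the post-implementation plan.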

9.3 Guidance and good practice for Operational stage - human factors engineering process

Strong post-market surveillance neither obviates the need for good pre-market usability testing (see 5.6.4 Validate the user interface) by health IT system vendors, nor does it replace a thorough root cause analysis process (see B.3.3 Conducting root cause analysis) to understand the contribution of health IT systems to safety events. The primary goal of post-market surveillance is to identify often subtle health IT usability issues that may not be recognized during pre-market usability testing or caught during the implementation stage of a health IT system or other significant change to the system. Over time, latent usability issues may become apparent and, if left unmonitored, may lead to safety issues, including adverse events. End users of the health IT system may develop workaround strategies to address usability issues with the system, which in turn may create safety concerns or could impact other system design considerations such as cyber security or data integrity. For example, while a computerized physician order entry (CPOE) component of the health IT system may work flawlessly during usability testing, the interface may not assist the user to recover from an interruption and may result in a wrong-patient error when a provider orders the correct test on the incorrect patient.

There are two different approaches to post-market surveillance data collection that an organization or health IT system vendor may take: active data collection and the use of "passive" data sources that contain potential patterns of use errors. Active surveillance involves the deliberate collection of data regarding health IT usability issues. It is the most direct route and can provide primary evidence of use errors and usability issues in an implemented health IT system. This may include directed queries of the clinical health IT system database, basic usability testing, or observation in the live clinical environment. Passive surveillance uses alternative data sources from the health IT system that are collected during clinical operations to help identify potential health IT system usability problems. These alternative sources of information may be a result of informatics, quality and safety, process improvement or other operations where health IT system usability may be an unrecognized contributing factor to the process under investigation. Common sources include informatics help desk tickets and patient safety event report databases.
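
As an illustration of the passive approach, the following hedged sketch flags help desk tickets and safety event reports whose free-text narratives contain usability-related terms. The keyword list and record fields are assumptions made for illustration; a real surveillance program would use locally validated terms and coded data sources.

import re

# Illustrative usability-related terms; a real deployment would use locally validated terms.
USABILITY_TERMS = [
    r"wrong patient", r"couldn'?t find", r"hard to (see|read|find)",
    r"by mistake", r"confus(ing|ed)", r"workaround", r"alert fatigue",
]
PATTERN = re.compile("|".join(USABILITY_TERMS), re.IGNORECASE)

# Hypothetical help desk and safety event records (fields are assumptions).
reports = [
    {"id": "HD-1042", "source": "help_desk",
     "text": "Order placed on wrong patient after an interruption."},
    {"id": "SE-0310", "source": "safety_event",
     "text": "Nurse used a workaround because the allergy field was hard to find."},
    {"id": "HD-1077", "source": "help_desk", "text": "Password reset request."},
]

def flag_possible_usability_issues(reports):
    """Return reports whose free-text narrative matches any usability-related term."""
    return [r for r in reports if PATTERN.search(r["text"])]

for r in flag_possible_usability_issues(reports):
    print(r["id"], r["source"])   # HD-1042 and SE-0310 are flagged for human review

Flagged reports would still require human review to confirm whether usability was a contributing factor.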

A combination of these post-market surveillance methods is necessary to find the subtle and potentially rare events that can be catastrophic to patients when the normal safety systems break down. Using empirical measures of real-world use errors can provide a means to refine the risk assessments and safety assurance cases to more accurately reflect the likelihood of hazardous situations occurring. Calculating these risks at an organizational level can help mitigate them, making the collaboration between Developers and Operators critical for understanding the true risk of health IT usability safety issues. Ultimately, leveraging post-market surveillance across healthcare organizations and across products will be critical to delivering safer and more efficient healthcare to our patients.
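
The following minimal sketch (illustrative only) suggests how observed real-world use-error counts might be translated into an empirical likelihood estimate for an entry in a use-related risk analysis. The likelihood bands, labels, and field names are assumptions, not a scale defined by this standard.

# Qualitative likelihood bands expressed as events per 10,000 task opportunities
# (the bands and labels are assumptions, not a normative scale).
LIKELIHOOD_BANDS = [
    (0.1, "remote"),
    (1.0, "occasional"),
    (10.0, "probable"),
    (float("inf"), "frequent"),
]

def empirical_likelihood(observed_events, opportunities):
    """Map an observed real-world rate to a qualitative likelihood label."""
    rate_per_10k = 10_000 * observed_events / opportunities
    for upper_bound, label in LIKELIHOOD_BANDS:
        if rate_per_10k <= upper_bound:
            return label

# Example: 7 wrong-patient order events observed across 52,000 ordering sessions
# (about 1.3 per 10,000), so the entry moves from "remote" to "probable".
risk_entry = {"hazard": "wrong-patient order", "estimated_likelihood": "remote"}
risk_entry["observed_likelihood"] = empirical_likelihood(7, 52_000)
print(risk_entry)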

Top Management support for the mitigation of health IT system usability issues is important at this stage in promoting safety event reporting systems and providing feedback to all users on how previous reports have improved the system. Strong User engagement continues to be very important at the operational stage in reporting usability-related issues to the Operator and Business Owner, as well as actively sharing insights gained through ongoing use of the system – e.g., identifying further opportunities to optimize workflow, improvements to staff training regarding system use and suggestions for system enhancements.

10 Decommissioning stage - human factors engineering

10.1 General decommissioning stage - human factors engineering

During this stage, the Business Owner assumes overall responsibility for the system and engages the Operator and Developer in decommissioning the HIT system. In most situations the system is being replaced by a new system, in which case the Developer and Integrator for the new system will also be involved. Therefore, good communication between these stakeholders is essential. This communication includes use-related knowledge about hazards and their mitigations with the existing system, as well as hazards that need to be taken into account in transitioning to the new system (e.g., changes in use patterns and workflows with the new system, data migration, etc.). The Business Owner is also responsible for ensuring that Users can safely, effectively, and efficiently complete their tasks and achieve their goals while minimizing negative impact and burden on the User due to the Decommissioning activities.

Users (e.g., physician, clinical staff, patient, layperson caregiver) must be at the center of all decommissioning and replacement Health IT activities, because they are the ones using the health IT software products and services to achieve patient care goals. Users' goals and needs will not change during Decommissioning or Replacement. However, how Users will achieve their goals will invariably be impacted by Decommissioning or Replacement. The application of HFE to Decommissioning and Replacement will increase the likelihood that the Users will be able to complete their work activities to achieve their goals (i.e., to deliver quality care in a safe, effective, and efficient manner) and will minimize the burden associated with changes that negatively impact Users' work activities.

10.2 Decommissioning stage - human factors engineering process

During Decommissioning, the Business Owner must plan for increased complexity associated with parallel activities: those associated with Decommissioning and those associated with replacement of the system.

During the Decommissioning stage the Business Owner shall:

10.2.1 Document a project plan for the decommissioning and replacement of the HIT system that includes the elements necessary to address risks to usability and safety, including impacts on clinical workflows, archiving and migration of the clinical data, etc.;

10.2.2 Coordinate the activities of the key stakeholders (e.g., Operator, Developer(s), Users) in managing hazards related to usability throughout the parallel activities of decommissioning and replacement of the system;

10.2.3 Engage individuals with human factors experience in managing usability issues during this stage and ensure that Users are actively involved in key decisions.

10.3 Guidance and good practice for Decommissioning stage - human factors engineering process

Throughout the Decommissioning stage, the Business Owner is responsible for ensuring that Users can safely, effectively, and efficiently complete their tasks and achieve their goals while minimizing negative impact and burden on the User due to the HIT system Decommissioning and associated replacement activities. The Business Owner must plan for the increased complexity associated with parallel activities and take into account the cognitive and workload impacts on Users who are simultaneously using the current system while being trained and/or using a replacement system in parallel that has a different interface and functional characteristics.

The goals and needs of the Users (e.g., physician, clinical staff, patient, layperson caregiver) will not change during Replacement and Decommissioning. A Decommissioning project plan that fails to allow Users to meet their needs throughout the transition process will increase the risk of patient harm as well as significantly affect User morale and motivation. The Business Owner also needs to ensure that the Developer and Operator continue to fully support the soon-to-be legacy system through the end stage of its life.

Annex A Summary of Requirements

Table 8 – Summary of Requirements

Development Stage Requirements Clause

Developers shall follow a user interface design process that seeks to identify potentially harmful use errors and decrease their likelihood of occurrence.

5.2.1.1

Developers shall follow a user interface design process that seeks to identify potentially harmful use errors and decrease their likelihood of occurrence.

5.2.1.2

Tasks judged to be safety-critical, urgent, frequent, or challenging shall be tested during a summative usability test.

5.2.2.3

Accordingly, developers shall take the following applicable steps to increase a health IT system's usability, which may include using a human factors or usability engineering expert:

− Conduct ethnographic studies to understand users and their environments as an input to user interface requirement development;

− Develop usability-focused user interface requirements (e.g., in the form of written statements, conceptual designs, user stories, etc.);

− As practicable, implement accepted user interface design practices, such as those documented in industry guidance documents;25

− Iteratively design, conduct formative evaluations (e.g., usability testing, expert reviews), and revise the health IT system’s design based on usability testing results;

− Conduct summative usability testing and residual risk analysis.

5.2.3.5

Developers shall conduct user research as a foundational step in the user-centered design process.

5.3.1.1

Developers shall conduct some form of user research with a heavy focus on health IT systems safety and effectiveness.

5.3.2.1.4

The use-related risk analysis shall be created during the research phase, be updated during usability evaluations, and then continue to be updated during the lifecycle stages following development (i.e., post-market release, including integration, implementation, and operational use).

5.3.8.3

Developers shall implement and maintain a use-related risk management process. 5.4.1.2

Developers shall identify potential hazards, and foreseeable sequences of events, that may result in exposure of patients to hazardous situations that could lead to harm.

5.4.2.2

User interface designers shall be held accountable for meeting the user interface requirements as opposed to designing the user interface according to team members’ personal judgments and/or programming expedience.

5.5.2

25 ANSI/AAMI HE75:2009, Human factors engineering – Design of medical devices. Association for the Advancement of Medical Instrumentation; 2009. Arlington, VA.

Updated task analysis results [i.e., throughout the development of the Health IT system] – specifically the updated list of potential use errors – shall be an input to a continuing risk management effort.

5.5.3

Developers shall evaluate the health IT system’s user interface throughout the design and development process, including an evaluation of initial concepts, evaluation of preliminary and interim versions of the system, and validation of the “final” version of the system [i.e., for a particular client].

5.6

Developers shall ensure that health IT systems fulfill user interface requirements by verifying a final user interface design.

5.6.3.1

Developers shall validate the user interface. 5.6.3.3

Developers shall validate the final user interface. 5.6.4.1

With test results and associated analyses in hand, developers shall perform a residual risk analysis.

5.6.4.7

Acquisition Stage Requirements Clause

The Business Owner shall work with internal User subject matter experts (i.e., User SMEs, individuals responsible for representing clinical user needs in unique situations) to develop scenarios of use for key capabilities that represent the organization’s practice goals (including current pain points with care delivery and desired practice improvements). The User SMEs should support scenario development by documenting scenarios that represent key aspects of healthcare delivery (clinical work, administrative tasks, IT tasks, etc.).

6.1.1

The Business Owner shall ensure that a human factors engineer with healthcare experience can be part of the selection team and participate in contract negotiations.

6.1.2

The Business Owner shall determine how the usability of vendor products will be assessed and compared, and how usability will be factored into the overall decision matrix.

6.1.3

The Business Owner shall engage human factors engineers to conduct a usability evaluation of the top two or three vendor products to identify usability and safety risks. Users shall walk through scenarios with each vendor product and generate a risk score based on the number and severity of issues identified.

6.1.4

The Business Owner, including user SMEs and human factors engineers, shall establish quality gates at key points during integration and implementation to assess the usability and safety of the implemented system. Such gates include acceptance testing and post-training user testing.

6.1.5

The Business Owner, working with human factors engineers, shall specify usability and safety issues, describe how issues are captured and tracked, and define how issue severity will be determined (including requirements for addressing issues).26

6.1.6

The Business Owner, working with human factors engineers, shall ensure that pertinent information about the workflow and context of use for the software or system that is ultimately selected, including their assessment of the usability risks and mitigations that will be required, is communicated to the Integrator.

6.1.7

26 See National Quality Forum's recommendations for additional guidance: NQF: Identification and Prioritization of HIT Patient Safety Measures.

Integration Stage Requirements Clause

The Integrator shall engage individuals with human factors experience in health care in the integration process wherever use-related risk is involved.

7.2.1

The Integrator shall review the safety assurance case documentation provided by the Developer and assess the impact during the Integration stage of the Developer-identified use-related risks, as well as identifying additional use-related risks resulting from variations between the intended uses defined by the Developer and the use environment specified by the Business Owner.

7.2.2

The Integrator shall identify constraints that influence system requirements, architecture, and/or design.

7.2.3

The Integrator shall consult with developers of the products involved as well as an appropriate cross-section of HDO leaders and user representatives to ensure the use-related risks and impacts (including interdependencies of the key configurations) are understood and managed appropriately when making configuration and customization decisions.

7.2.4

The Integrator shall identify the potential risks of the planned customizations and discuss with the Business Owner.

7.2.5

As a component of the integration process, the Integrator shall conduct usability evaluations after configuration and customization with representative users and then act on findings, prioritizing based on use-related risk.

7.2.6

The Integrator shall define a testing plan that verifies the correct operation of the system and then document this plan for the Implementer and Operator.

7.2.7

The Integrator shall track configurations made as part of the change management process. 7.2.8

The Integrator, working with human factors engineers, shall provide the Implementer with additional information in the safety assurance case about any hazards that were identified during integration, including those that may have emerged during configuration and customization. Assumptions, mitigation strategy, and evidence or logic for adequacy of mitigations are also provided. Any hazards that are expected to be mitigated during implementation are identified.

7.2.9

Implementation Stage Requirements Clause

The Implementer shall engage individuals with human factors experience in similar health care system implementations.

8.2.1

The Implementer shall review the safety assurance case documentation provided by the Integrator and assess their impact, as well as identifying additional use-related risks resulting from the specific implementation environment.

8.2.2

The Implementer shall involve clinical leaders and a representative cross-section of the targeted user community throughout the implementation process, as well as consulting with the Developer(s) and other similar implementation sites as appropriate.

8.2.3

Leveraging experiences with any pre-existing systems, the Implementer shall conduct a thorough workflow assessment to optimize the effectiveness of the new health IT software and system in improving the organization’s health care delivery environment, as well as minimizing use-related risks and negative impacts on clinical workflow and safety.

8.2.4

The Implementer shall ensure the system decision support rules being implemented in the system align with the clinical best practices adopted by the organization for the targeted clinical environment(s) and are transparent to the users with appropriate thresholds to avoid cognitive burden, as well as processes for documenting clinical exceptions and updating the rules as clinical practices and new treatments evolve.

8.2.5

The Implementer shall ensure that there are consistent ways for users to accurately and readily identify patients within, and across, the multiple systems that clinicians may use at various points in their workflow, as well as appropriate patient identification and data quality edits in order to catch potential errors and facilitate their correction at source.

8.2.6

The Implementer shall utilize an appropriate process for managing changes in clinical work processes as a result of the system, addressing human factors that may impact its safe use and supporting the transition and adoption of the new health IT system.

8.2.7

The Implementer shall ensure that staff are appropriately skilled in supporting the new system and system users are well trained in its safe and effective use as a means for improving patient care quality and safety (See HIT1000-3).

8.2.8

The Implementer shall develop a deployment plan for the system that includes pre-production environments to test all functions of the system using 'real-world' scenarios as well as to support evaluation of the new system's usability prior to full roll-out so that final adjustments to the system's implementation can be made.

8.2.9

The Implementer, working with human factors engineers, shall provide the Operator with additional information in the safety assurance case about any hazards that were identified during the implementation stage. Assumptions, mitigation strategy, and evidence or logic for adequacy of mitigations are also provided. Any hazards that are expected to arise and should be carefully monitored are to be identified for the Operator.

8.2.10

Operational Stage Requirements Clause

Engage individuals with human factors experience in managing usability issues during the operational stage and ensure clinical leaders and a cross-section of Users are involved in these processes as appropriate.

9.2.1

Review the safety assurance case documentation provided by the Implementer and assess the impact on operations, paying special attention to actively monitoring key residual risks to usability that are identified.

9.2.2

Perform post-deployment monitoring, identifying usability-related issues as a distinct component in its safety event or incident reporting system and collaborating with the Developer.

9.2.3

Investigate and prioritize reported/discovered safety and usability issues in a timely fashion, identifying affected patients and root causes; as well as reporting safety and usability issues, potential workarounds and results to the Developer as appropriate.

9.2.4

Address issues that are within their ability to fix by implementing workflow changes, workarounds, changes in the software's configuration and/or installing and implementing updates to address identified patient safety risks, understanding that an update may introduce unintended breaks in the system, which will need to be tested before rollout.

9.2.5

Review new features and updates from a usability standpoint before implementing them, in order to anticipate potential problems from the system and periodically evaluate and update workflow analysis documents, especially where system updates or clinical/business process updates or changes are likely to affect how the system is optimally used.

9.2.6

Periodically review safety issues such as adverse events, near misses, and/or unsafe conditions reported in systems other than the safety event reporting system, inclusive of their IT help desk and available vendor reports (e.g., latency reports, near misses, how much time users are spending on the system).

9.2.7

Monitor and optimize the supporting health IT infrastructure for the system and its impact on usability, minimizing burdens (e.g., system downtime) associated with changes so they have minimal impact on users' work activity.

9.2.8

Document any hazards related to usability that may need special attention on decommissioning and disposal of the health IT software in the safety assurance case. For example, hazards relevant to the decommissioning stage could be “loss of patient data,” or “inadvertent disclosure of patient data.”

9.2.9

Decommissioning Stage Requirements Clause

During the Decommissioning stage the Business Owner shall document a project plan for the decommissioning and replacement of the HIT system that includes the elements necessary to address risks to usability and safety, including impacts on clinical workflows, archiving and migration of the clinical data, etc.

10.2.1

During the Decommissioning stage the Business Owner shall coordinate the activities of the key stakeholders (e.g., Operator, Developer(s), Users) in managing hazards related to usability throughout the parallel activities of decommissioning and replacement of the system.

10.2.2

During the Decommissioning stage the Business Owner shall engage individuals with human factors experience in managing usability issues during this stage and ensure that Users are actively involved in key decisions.

10.2.3

Annex B Considerations for conducting validation usability tests

B.1 Overview of validation usability tests

This section provides expanded guidance on conducting an effective HF validation usability test of a health IT system.

Developers may consider a health IT system to be validated when they can make the claim, supported by validation usability test data, that:

• The application is not vulnerable to use errors that could potentially lead to significant harm (i.e., that the implemented design mitigations effectively protect against use error).

• The intended users could successfully complete all tasks that are essential to ensuring patient safety.

The veracity of these claims depends heavily on the quality of the summative usability test (i.e., the testing of tasks associated with the potential for serious harm) and how such testing interrelates with risk management efforts, which in turn interrelate with user research efforts. This explains why it is imperative to apply human factors engineering (or usability engineering) techniques early and throughout the health IT system user interface development process. Starting late or skipping a step not only stands to erode the quality of the user interface, but also jeopardizes a developer's ability to make the above claims.

One approach to summative usability testing, and associated risk management efforts, includes the following steps:

a) Review the use-related risk analysis results to identify high-risk tasks. See HIT1000-3.

b) Write a summative usability test plan that calls for a sample of representative users to attempt to complete high-risk tasks.

c) Conduct a pilot test (including one or more sessions) to ensure that the test will proceed smoothly without artifacts that could artificially degrade or enhance user performance.

d) Conduct the test sessions.

e) Consolidate and analyze the test data (an illustrative sketch of this analysis follows Figure 3).

f) Perform root cause analysis of user interaction problems (see next section for more detail).

g) Write a summative usability test report.

h) Conduct a residual risk analysis of identified user interaction problems.

i) (If necessary) Modify the health IT system to reduce the chance of user interaction problems and conduct follow-on validation testing (potentially focusing only on the modified portions of the health IT system). In other words, conduct another iteration of the design-model-test cycle.

j) Update the summative usability test report to include the results of the residual risk analysis: rationales supporting the claim that the health IT system is reasonably safe and effective.

As shown in Figure 3 and discussed above (Step i), the process allows for multiple iterations to ensure that all significant, use-related risks have been reduced to an acceptable level in the spirit of preventing harm, and that users are able to complete essential tasks. Simply stated, if a summative usability test reveals that users are committing significant (i.e., safety-related) use errors, the health IT system developer should implement changes to eliminate or reduce the chance of such use errors and then conduct a supplemental summative evaluation to validate the effectiveness of the design change.

Figure 3 – An illustration of the risk analysis and mitigation process (Johnathan Kendler, Curiolis)
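
As a hedged illustration of steps (e) and (f) above, the following minimal sketch consolidates per-participant task observations and collects the use errors that would feed root cause analysis. The record fields and outcome labels are assumptions made for illustration, not a data format defined by this standard.

from collections import defaultdict

# Hypothetical per-participant observations (fields and labels are assumptions).
observations = [
    {"participant": "P01", "task": "order_weight_based_dose",
     "outcome": "success", "use_errors": []},
    {"participant": "P02", "task": "order_weight_based_dose",
     "outcome": "use_error", "use_errors": ["entered weight in lb instead of kg"]},
    {"participant": "P03", "task": "review_allergies",
     "outcome": "difficulty", "use_errors": []},
]

def consolidate(observations):
    """Summarize outcomes per task and gather use errors needing root cause analysis."""
    summary = defaultdict(lambda: {"attempts": 0, "successes": 0, "use_errors": []})
    for o in observations:
        row = summary[o["task"]]
        row["attempts"] += 1
        row["successes"] += (o["outcome"] == "success")
        row["use_errors"].extend(o["use_errors"])
    return dict(summary)

for task, row in consolidate(observations).items():
    print(task, f"{row['successes']}/{row['attempts']} successful", row["use_errors"])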

B.2 Recommendations for conducting successful validation usability tests (See IEC 62366-1 and IEC 62366-2)

Specific validation usability testing recommendations include the following:

• Test participants: Test participants should represent the intended users, including representatives from each distinct user group. Developer employees should not be test participants, even if they have appropriate clinical backgrounds. If the health IT system is being validated for use within the USA, the test participants should be USA residents and should include at least 15 participants from each user population; this is similar to the recommendations set forth by FDA for validation usability testing of medical devices.27

• System configuration: Recognizing that health IT systems are often customized for particular institutions and departments within them, developers should choose the most common and complete configuration for validation testing, as well as any configurations required to assess high-risk interactions.

• Production equivalence: The health IT system should be in a production-equivalent form. It should not be a "work-in-progress" that is expected to change before its release.

• Sample data: The health IT system should be populated with clinically realistic and complete data to enable naturalistic interactions with the application.

• Test session: Common steps in a usability test session are the following:

a) Greet the test participant and have him or her review and complete an informed consent form, which may include statements pertaining to the non-disclosure of confidential information and the use of video recordings and photographs. Inform the test participant that the test is intended to evaluate the health IT system, not the participant's skills, work procedures, or clinical expertise.

b) Orient the test participant to the test environment, test personnel, test purpose, and data collection instruments (rating scales, rating criteria). The orientation may include a general introduction to the health IT system in a manner that does not constitute training or task assistance.

27 See CDRH, Applying Human Factors and Usability Engineering to Medical Devices.

c) Direct the test participant to independently perform pre-selected hands-on tasks, identified during the risk analysis process. Depending on the health IT system, the number and extent of the tasks can vary widely. It is common for a validation usability test to call upon test participants to perform 10-20 separate tasks.

d) After the test participant completes all hands-on tasks, conduct an interview to collect test participant impressions of user interaction problems (e.g., use errors, close calls, and difficulties) and seek feedback about the health IT system's overall use-safety and ways to address any particular safety concerns. In certain cases, it is also beneficial to ask follow-up questions after the test participant completes a series of interrelated tasks (see Timing of questions below).

e) Compensate (if applicable), thank, and dismiss the test participant.

• Test session duration: A summative usability test usually consumes at least 60 minutes, averages 2 hours, and may extend to 3 hours or longer. Longer test sessions usually include one or more rest periods.

• Test personnel: A summative usability test calls for adherence to a formal plan and requires extensive data collection. As such, it is advantageous to have two people work as a team to administer each test session. Typically, one team member serves as the test administrator, directing the test activities and leading discussions with the test participant. The other team member serves as the data analyst, observing user interactions with the health IT system and recording the associated data (e.g., use errors, close calls, difficulties, anecdotal comments, observations). However, it is possible for one person to conduct test sessions depending on the test's focus, amount of test data, and data collection methods.

• Test participant training: Test participants should receive training if the health IT system developer plans to implement administrative controls that will ensure users receive training before using the health IT system. Training should represent the training that will be delivered after implementation. Otherwise, the test should include both trained and untrained participants. Practicality suggests providing consolidated training in a single session as opposed to spreading multiple training sessions over many days or weeks.

• Decay period: There should be a decay period between the end of a test participant training session and the ensuing test session. Decay as short as 1-2 hours might be sufficient, particularly if the training and test sessions are short and the real-world gap between training and first use is also short. Longer training and test sessions, and real-world cases in which substantial time often passes between training and first use, suggest a longer training decay period (e.g., training on Day 1 and testing on Day 2) is appropriate. Logistically, it is unusual for decay periods to extend more than a few days.

• Familiarization period: In cases where a user is likely to take time to become familiar or reacquaint himself or herself with a health IT system before using it, it is acceptable to designate time at the beginning of a test session for this purpose. However, test personnel should not dictate a minimum amount of time that should be spent, or how the participant should act. For example, it would be wrong to instruct the participant to study the health IT system's use in detail and review the user manual's content. Rather, it is appropriate to state something like this:

"Imagine that it is your first day working in a new hospital unit. The charge nurse expects you to use the unit's health IT system independently, and to seek help from a colleague only if you face difficulties you cannot resolve independently. You now have the option to take up to 15 minutes to become familiar with the health IT system in the same manner that you might in a real use situation. My colleague and I will be nearby doing other work. Let us know when you feel comfortable proceeding to the hands-on tasks."

• Thinking aloud: As mentioned earlier, a technique called thinking aloud calls for the test participant to verbalize his or her thoughts while performing a task. It is appropriate during formative usability testing but not during summative usability testing. However, it is acceptable for a summative usability test participant to think aloud spontaneously, so long as the test administrator does not encourage it. Most commonly, the test administrator may ask the participant for feedback once a task is complete.

• Task directions: Task directions are a prompt or instruction provided to a participant at a task's start that describes the high-level task the test participant should perform. It is common to present task directions on a card for the test participant to read aloud to ensure participants understand all of the words in the prompt and the prompt's meaning. Test personnel may then ask the test participant if he or she understands the task. If there is a lack of understanding, the test administrator may choose to provide minor clarifications regarding the task goal, taking care not to provide any information that could be considered task assistance that could lead the participant to act a certain way during the usability test.

Task directions should be worded in a manner that clarifies the task goal but does not assist test participants by instructing them how to perform the task. For example, avoid using terms that cue users to the proper menu selections or buttons to click. Task directions should use words that all test participants will understand, regardless of age and education level.

Good: Determine if Olivia Green, a 10-month-old girl, is growing at a normal rate.

Poor: Olivia Green is a 10-month-old girl. Review the growth chart on the "Patient Chart" screen to determine if she is growing at a normal rate.

• Presentation of learning aids: Test personnel should not direct users to access available learning aids (e.g., user manual, online help), noting that no such direction would occur in a real-world use scenario. Rather, test participants should access learning aids only on their own initiative and in cases when such aids are normally available.

• Assistance: The test administrator may choose to provide assistance to a test participant when the test participant has (1) reached an impasse such that he or she cannot continue independently, and (2) passed a practical task time limit. Provide progressive levels of assistance, starting with a minimal assist that will help the participant to "get back on track" (i.e., provide prompts such as "What are you working on?" or "What are you looking for?"). Note that providing assistance constitutes a task failure. The reasons to provide any assistance at all, as opposed to stopping the task, are to (1) determine how well the test participant will be able to perform subsequent portions of the task, and (2) provide the user with the necessary, contextual knowledge to proceed with follow-on tasks.

• Assistance threshold: Only assist a test participant with a task when he or she has endeavored to complete the task without assistance for a predetermined period of time (e.g., 5 minutes) that is considered sufficient for even slow performers to complete it. If a participant asks for assistance before the assistance threshold, encourage the participant to persevere with the task.

• Hotline access: It is acceptable to provide test participants access to an actual or simulated hotline (i.e., help line) if such a resource will be available to actual health IT system users and the hotline service can be delivered in a "production-equivalent" form. Participants may call a hotline using their cellphone or a portable phone provided in the test room. Ensure hotline calls are completed using a speakerphone to enable test personnel to monitor and record the call.

• Natural task flow: Do not interrupt the natural flow of a user task by dividing an integrated task into multiple steps or stopping to pose questions or collect ratings, for example. That said, it might be justifiable to divide a large span of activity into chunks that are normally separated by significant amounts of time.

• Timing of questions: Pose essential questions about use errors, close calls, and difficulties – as well as their potential root causes – only at the natural stopping points during the hands-on portion of the test, or after the test participant completes all hands-on tasks. If debriefing about an event during a post-task interview could bias the participant's subsequent task performance, collect these subjective assessments during the post-test28 interview.

• Documenting user interaction problems: Document all user interaction problems in as much detail as possible, including onset conditions and performance influencing factors.

• Identifying root causes: In a post-task or post-session interview regarding use errors, close calls, difficulties, and instances of test administrator assistance, be sure to ask questions and follow-up questions that generate responses helpful to determining the root causes of the interaction problems. Dissuade users from blaming themselves and encourage them to suggest design-related root causes. Note that the test participants' shared insights might be judged at a later point to be accurate or inaccurate. Indeed, participants might speculate and suggest some erroneous root causes, but analysts can subsequently separate truth from creative, off-base suggestions.

• Design improvements: Do not solicit design suggestions from participants during a summative usability test. Instead, summative usability testing should focus on collecting data that will determine if the health IT system is safe and effective. The only appropriate time to solicit design suggestions is at the end of the test, if the participant says he or she thinks the design needs to change to be safer to use.

• Final use-safety and usability assessment: Near the end of a test after the participant has attempted all hands-on tasks, test personnel should give the test participant the opportunity to comment on the health IT system's overall use-safety and indicate if he or she believes there is a need for design modifications (including changes to instructions and training) to ensure its use-safety. Additionally, test personnel may seek test participants' overall impressions of the health IT system's usability (unrelated to use-safety) if their responses are of commercial interest.

• Ratings: Test personnel may collect usability ratings focusing on matters such as ease of use and task speed if they are of commercial interest. However, developers should not treat the ratings as evidence of use-safety. Also, to avoid biasing discussions, test personnel should delay the collection of commercial-oriented feedback until all other technical portions of the summative usability test are completed.

• Pleasant, neutral demeanor: Test personnel, and particularly the test administrator, should maintain a pleasant, neutral demeanor during all discussions about the health IT system. They should not communicate any pleasure or displeasure regarding the system's performance or the test participants' expressed views.

28 See the Usability Test Plan on usability.gov for more information on post-task interviews and post-test interviews.

B.3 Additional considerations for usability testing

B.3.1 Selecting usability test participants

The quality of a usability test depends in part on the appropriateness of the test participants involved, which is largely under the developer's control. The usability test plan should include a recruiting screener that establishes the characteristics of qualified test participants, and characteristics that would disqualify individuals as test participants, which should already be available in the form of user profiles or personas (see clause 5.3.2.2 User profiles and clause 5.3.2.3 Personas).

As discussed in clause A.3.1 Selecting usability test participants, usability test participants should bring objectivity to a test session. Therefore, it is generally considered poor practice to engage developer organization employees or contractors, as well as employees or contractors from other companies developing comparable products, as test participants. Even if such participants can be objective from a technical standpoint, their views of a given health IT system might be influenced by their organizational affiliation and personal relationships (direct or indirect) with test personnel and other stakeholders. Consequently, their feedback about the health IT system could be distorted (overly positive or negative) as compared to the feedback that could be collected from more objective individuals.

Also, as discussed in clause A.3.1 Selecting usability test participants, test participants should represent members of a health IT system's distinct user populations. Individuals in different user populations have characteristics that differentiate them in a way that could influence use of a health IT system. For example, a Nurse and a Physician will interact differently with health IT systems due to their experience, roles and responsibilities, and education. It is common practice to engage three to five representatives of each distinct user population in a formative usability test, although larger sample sizes might be warranted, especially when test results will drive major design decisions.

When selecting participants for a summative usability test, test planners should take care to sample each distinct user group equally rather than assemble a group that proportionally mirrors the user population. In other words, if two-thirds of the users are nurses, but there are three additional types of distinct users, nurses should comprise only one quarter of the test participants (e.g., 15 out of 60); a brief worked sketch follows the list below. In parallel, it is advantageous to select individuals with specific, secondary characteristics to obtain a sample that represents a distribution within a particular group of 15 individuals, such as the following:

• Type of care environment (e.g., hospital/unit, clinic, physician's office, long-term care facility)

• Occupational experience (e.g., years working as a nurse)

• Experience using one or several specific health IT systems

• Other relevant characteristics to consider (e.g., demographics).
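
The following minimal sketch (illustrative only) works through the equal-sampling arithmetic described above; the user group names are hypothetical.

# Equal sampling across distinct user groups, with at least 15 participants per group.
MIN_PER_GROUP = 15
user_groups = ["nurse", "physician", "pharmacist", "unit_clerk"]  # hypothetical groups

total_participants = MIN_PER_GROUP * len(user_groups)
share = MIN_PER_GROUP / total_participants
print(f"{MIN_PER_GROUP} per group x {len(user_groups)} groups = {total_participants} total; "
      f"each group is {share:.0%} of the sample")
# Prints: 15 per group x 4 groups = 60 total; each group is 25% of the sample,
# even if one group (e.g., nurses) represents two-thirds of the actual user population.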

B.3.2 Specifying usability test tasks

Formative usability tests afford the opportunity to have test participants perform any task of interest to the health IT system development team. As such, tasks might or might not be considered critical (i.e., might or might not be related to high-severity risks), and task selection might be driven by which portions of the health IT system have been prototyped. Summative usability testing does not afford this degree of freedom. As discussed in clause 5.6.4 Validate the user interface, a summative usability test's purpose is to validate that the intended users can complete safety-related tasks without committing potentially harmful use errors. This requirement automatically constrains the task list, at least the list that pertains to validation, noting that a test may include other tasks about which there is an interest in ensuring a high degree of usability for the sake of commercialization.

Each task description within a summative usability test plan may include the following contents (an illustrative sketch of such a task description follows this list):

• Short task description.

• Test participant prompt. (Note that the prompt should exclude content that could instruct the participant how to perform the task.)

• List of associated risks (found in use-related risk management documents, such as uFMEA). The list can simply reference risk identification numbers included in the risk analysis.

• Health IT system configuration at the task's starting point.

• Additional information required to complete the task (e.g., a set of patient vital signs requiring manual entry).

• Distractions (if any), such as realistically interrupting a task with a phone call that the test participant would be required to answer.

• Summary of expected actions.

• List of anticipated potential use errors. Note that all use errors, anticipated or not, as well as close calls and difficulties should be captured in a data log.
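
The following minimal sketch (illustrative only) shows one way a single task description containing the contents listed above might be represented; the field names and the uFMEA identifier are assumptions rather than a format defined by this standard. The example prompt reuses the "Good" task direction from clause B.2.

from dataclasses import dataclass, field
from typing import List

@dataclass
class TaskDescription:
    short_description: str
    participant_prompt: str            # must not instruct how to perform the task
    associated_risk_ids: List[str]     # references into the use-related risk analysis
    starting_configuration: str
    additional_information: str = ""
    distractions: List[str] = field(default_factory=list)
    expected_actions: List[str] = field(default_factory=list)
    anticipated_use_errors: List[str] = field(default_factory=list)

growth_task = TaskDescription(
    short_description="Assess infant growth",
    participant_prompt="Determine if Olivia Green, a 10-month-old girl, is growing at a normal rate.",
    associated_risk_ids=["uFMEA-112"],  # hypothetical identifier
    starting_configuration="Pediatric clinic build, signed in as RN, no chart open",
    distractions=["Simulated phone call 60 seconds into the task"],
    expected_actions=["Open patient chart", "Locate growth chart", "Interpret percentile trend"],
    anticipated_use_errors=["Reviews the growth chart for the wrong patient"],
)
print(growth_task.short_description, growth_task.associated_risk_ids)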

Note: Given that a health IT system is subject to configuration, usability test tasks should be based on the following configurations:

• The most common configuration(s) or set of configuration(s)

• The configurations that are most vulnerable to use-related risks.

B.3.3 Conducting root cause analysis

This section provides expanded guidance on how to determine the root causes of use errors and patterns of close calls and difficulties.

It is most productive and accurate to assume that most user interaction problems with a health IT system stem from shortcomings in its user interface and support system. Conversely, it is unproductive to blame the users for interaction problems. As such, summative usability testing specialists abide by the mantra "don't blame the users."

Nonetheless, "to err is human." People play the leading role in the performance of a use error. There is no question that people can be forgetful, careless, clumsy, lazy, confused, and demonstrate many more shortcomings. But just as it is obvious that people can exhibit these shortcomings, it is also obvious that a health IT system must be designed defensively to account for them.

Developers should be mindful that health IT systems are used by different types of people. These people will perform a wide range of tasks in various environments and use scenarios, and a health IT system may be in use for many years. Logic dictates that some interactions will involve individuals who might not be the strongest performers when it comes to using a health IT system, and that the health IT system must protect against potentially harmful use errors. Moreover, the health IT system's user interface must be free of flaws that would induce even the most experienced users to commit use errors despite their best efforts to perform tasks properly to ensure safety and success.

Moving forward with the viewpoint that user interface design shortcomings, and not users, are the principal cause of use errors, the challenge is to identify the most likely user interface-related cause of use errors. One approach to root cause analysis (RCA) includes the following steps, some of which may be performed in parallel (an illustrative sketch of an RCA record follows the list):

• Identify provisional root causes: Test personnel, and perhaps other team members familiar with the test results, should have a sense for what might have caused a given user interaction problem. To start the RCA process, document such cause(s) and indicate them as provisional and subject to change. It might be productive for team members to identify provisional root causes independently and then discuss them en route to consensus provisional findings.

• Analyze anecdotal evidence: Test data should include test participant explanations regarding the cause of user interaction problems. Recognizing that this information might be accurate or inaccurate, analysts should review it to see if it reinforces the provisional root cause(s) or suggests modifying them.

• Inspect the health IT system for user interface design flaws: Arguably, the most productive step in the RCA process is to inspect the health IT system's user interface for design flaws. The inspection may be based on the principles provided in AAMI's forthcoming Technical Information Report on health IT user interface design principles, as well as other sources.

• Consider other contributing factors: Analysts should also consider whether other factors contributed to the root cause, such as characteristics of other health IT systems, which might cause negative transfer of experience that leads to a use error; use environment factors, such as poor interoperability among systems; and test artifacts that induce unnatural user-system interactions.

• Develop a final hypothesis: The last step is to develop a final hypothesis about the user interaction problem's root cause(s), informed by the aforementioned analyses. Clearly, the final hypothesis will be the end product of considering evidence and applying professional judgment.
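
The following minimal sketch (illustrative only) shows a working record that carries one interaction problem through the RCA steps above; all content is hypothetical.

# Hypothetical working record for one interaction problem, carried through the RCA steps.
rca_record = {
    "interaction_problem": "Participant ordered the test on the wrong patient after an interruption",
    "provisional_root_causes": ["No persistent patient banner on the ordering screen"],
    "anecdotal_evidence": ["'I assumed I was still in the same chart when I came back.'"],
    "ui_inspection_findings": ["Patient identifiers scroll out of view; two charts can be open "
                               "without a clear visual distinction"],
    "other_contributing_factors": ["Interruption mirrors real workflow; test artifact judged unlikely"],
    "final_hypothesis": "Weak persistent patient-identification cues allow loss of context after interruptions",
}
print(rca_record["final_hypothesis"])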

B.3.4 Documenting user interaction problems

This section provides expanded guidance about reporting user interaction problems in a summative usability test report.

A report about a summative usability test of a health IT system should thoroughly document the quality of user interactions with the system. Presuming that a developer has followed a user-centered approach to user interface design that has included early user research and formative usability testing, the quality of user interactions with a production-equivalent health IT system should be quite good. However, there are almost always residual, and potentially minor, findings to report. Therefore, it is important to fully document user interaction problems, possibly in a narrative or tabular form, including the following details (an illustrative sketch of such a finding record follows the list):

− Finding title: A short description of the finding for convenient reference.

− Associated task: The single or multiple tasks during which the finding occurred.

− Associated risks: Cross references to the use-related risk analysis document and line item (or equivalent) that addresses the finding, indicating its likelihood, the severity of the potential harm, and any risk reduction measures.

− Potential harm(s): The harm that may arise due to the finding, as already stated in the use-related risk analysis document.

− Finding description: Detailed description of the finding, including the number of test participants who experienced the finding (e.g., committed the use error) and how many times the finding occurred in total.

− Participant-reported root causes: Descriptions of the root causes identified by the participants.

− Root cause analysis: Descriptions of the root causes determined by the usability analysts based on their analysis of test participants' performance and commentary.

In addition to the components described above, a residual risk analysis should be conducted as follows: 1636

− Residual risk analysis: Document the results of a residual risk analysis of the finding, which may indicate whether the health IT system's user interface is reasonably safe and effective “as is,” or whether it needs modification. A description of a potentially effective modification is not necessary, but it may be included in a summative usability test report.
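
As an informative illustration only, the sketch below shows one way to capture the details listed above as a structured record that could support either a narrative or a tabular report. The field names and the simple acceptability flag are assumptions made for the example, not requirements of this standard.

    # Illustrative sketch only (not part of this standard): a structured record for a
    # summative usability test finding, mirroring the documentation details listed above.
    # Field names and the acceptability flag are assumptions made for the example.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class SummativeTestFinding:
        finding_title: str                    # short description for convenient reference
        associated_tasks: List[str]           # task(s) during which the finding occurred
        associated_risk_ids: List[str]        # cross-references to the use-related risk analysis
        potential_harms: List[str]            # harms as stated in the risk analysis
        finding_description: str              # detailed description of the finding
        participants_affected: int            # number of participants who experienced the finding
        total_occurrences: int                # how many times the finding occurred in total
        participant_reported_root_causes: List[str] = field(default_factory=list)
        analyst_root_causes: List[str] = field(default_factory=list)
        residual_risk_acceptable_as_is: bool = False   # outcome of the residual risk analysis
        proposed_modification: str = ""                # optional; not required by this standard

    # Hypothetical example of use
    finding = SummativeTestFinding(
        finding_title="10-fold overdose selected from dose list",
        associated_tasks=["Order IV heparin"],
        associated_risk_ids=["URRA-042"],
        potential_harms=["Overdose leading to hemorrhage"],
        finding_description="One participant selected 50,000 units instead of 5,000 units.",
        participants_affected=1,
        total_occurrences=1,
    )
    print(finding.finding_title)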

Importantly, even if only a single test participant commits an error that could lead to patient harm (e.g., ordering a 10-fold overdose of the intended medication), the user interface should be assessed as a potential cause of the error. Often, the assessment suggests that the user interface could be improved to prevent the use error and that the improvement should be implemented to bring use-related risk into an acceptable range. After all, once the health IT system is released, there will be many thousands of opportunities for other clinicians to make the same harmful error.
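
Purely as an illustration of this decision rule, and not as a substitute for professional judgment, the following sketch flags any finding with potential for patient harm for user interface assessment, regardless of how few participants experienced it; the function name and threshold logic are assumptions made for the example.

    # Illustrative sketch only (not part of this standard): even a single potentially
    # harmful use error should trigger assessment of the user interface as a contributing
    # cause. The threshold logic is an assumption made for the example.
    def requires_ui_assessment(participants_affected: int, could_cause_patient_harm: bool) -> bool:
        """Return True when a finding warrants assessment of the user interface."""
        return could_cause_patient_harm and participants_affected >= 1

    # One participant ordering a 10-fold overdose is enough to warrant assessment.
    print(requires_ui_assessment(participants_affected=1, could_cause_patient_harm=True))  # True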


Annex C Sample expert review findings

Expert review findings are typically categorized as strengths or opportunities for improvement, augmented by potential design improvements. This annex presents several examples of expert review findings.
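
The examples in C.1 through C.3 each follow the same structure of labeled fields. As an informative illustration only, the following Python sketch shows one way to capture that structure in a reusable record; the class and field names are assumptions made for the example, not requirements of this standard.

    # Illustrative sketch only (not part of this standard): a record capturing the
    # labeled fields used in the sample expert review findings below. The class and
    # field names are assumptions made for the example.
    from dataclasses import dataclass

    @dataclass
    class ExpertReviewFinding:
        title: str
        finding_category: str                  # e.g., "Content Organization", "Affordances"
        finding_type: str                      # "Strength" or "Opportunity for improvement"
        issue_or_finding: str                  # description of the issue (or of the strength)
        potential_use_error: str = ""          # applicable to opportunities for improvement
        hazardous_situation_and_harm: str = ""
        potential_design_improvement: str = ""
        potential_benefits: str = ""           # applicable to strengths

    # Hypothetical example of use, mirroring finding C.1
    example = ExpertReviewFinding(
        title="Placement of referral function",
        finding_category="Content Organization",
        finding_type="Opportunity for improvement",
        issue_or_finding="The Referrals function is categorized under the Demographics tab.",
        potential_use_error="Users might not find the Referral function.",
        hazardous_situation_and_harm="Minor delay in referral and treatment.",
        potential_design_improvement="Add a new tab titled Referral.",
    )
    print(example.finding_type, "-", example.title)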

C.1 Sample Finding (Opportunity for improvement): Placement of referral function

Finding category: Content Organization

Issue: The referral placement function (i.e., Referrals) is categorized under the Demographics tab.

Potential use error: Users might not find the Referral function because it does not seem related to Demographics.

Hazardous situation and harm: A health care provider (HCP) searching for the Referral function experiences a minor delay in referral and treatment.

Potential design improvement: Add a new tab titled Referral.


Figure 4 – Referral function located under Demographics

C.2 Sample Finding (Opportunity for improvement): Use of ellipses

Finding category: Affordances

Issue: The use of ellipses to indicate that the user can access more details about patient Allergies might not draw the user’s attention because the symbol is neither informative nor conspicuous. Moreover, some users might not recognize whether the presence of the ellipses indicates that the patient has allergies; the symbol is ambiguous. The user should not have to navigate to a list of Allergies to determine that the patient has none.

Potential use error: A user might overlook the fact that the patient has an allergy that has bearing on a diagnosis and/or treatment.

Hazardous situation and harm: HCP administers a medication to which the patient is allergic, and the patient experiences an allergic reaction.

Potential design improvement: Replace the ellipses with an informative term, such as “Yes,” that links to a list of Allergies (a minimal sketch of this approach follows Figure 6).


Figure 5 – Indication of allergies is subtle


Figure 6 – Clicking on “…” leads to list of allergies or none
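
The sketch below is an informative illustration of the design improvement suggested in C.2: replacing the ambiguous ellipses with an explicit, informative label. The function name and label wording are assumptions made for the example, not recommendations of this standard.

    # Illustrative sketch only (not part of this standard): derive an explicit allergy
    # indicator label rather than relying on an ambiguous "..." symbol. The function name
    # and label wording are assumptions made for the example.
    from typing import List

    def allergy_indicator_label(allergies: List[str]) -> str:
        """Return an informative label for an allergy field in a patient summary."""
        if not allergies:
            return "No known allergies"      # the user need not navigate further to confirm "none"
        return f"Yes ({len(allergies)})"     # rendered as a link to the full list of allergies

    print(allergy_indicator_label([]))                         # No known allergies
    print(allergy_indicator_label(["Pollen", "Penicillin"]))   # Yes (2)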

C.3 Sample Finding (Strength): Allergy alerts

Finding category: Content Organization

Finding: The application alerts the HCP that the patient is allergic to pollen. Pollen is highlighted in red text to draw the user’s attention.

Potential benefits: The HCP will be aware of ongoing health issues that might have bearing on patient diagnosis and treatment.


Figure 7 – Clinical Reminders


Bibliography and cited references

I. Standards

AAMI HIT1000-1 (PS):2018, Safety and effectiveness of health IT software and systems – Part 1: Fundamental concepts, principles, and requirements. Association for the Advancement of Medical Instrumentation. Arlington, VA.

AAMI HIT1000-2 (PS) (in development), Safety and effectiveness of health IT software and systems – Part 2: Application of quality systems principles and practices. Association for the Advancement of Medical Instrumentation. Arlington, VA.

AAMI HIT1000-3 (PS) (in development), Safety and effectiveness of health IT software and systems – Part 3: Application of risk management. Association for the Advancement of Medical Instrumentation. Arlington, VA.

ANSI/AAMI HE75:2009/(R)2018, Human factors engineering – Design of medical devices. Association for the Advancement of Medical Instrumentation; 2018. Arlington, VA.

ANSI/AAMI/IEC 62366-1:2015, Medical devices – Part 1: Application of usability engineering to medical devices. Association for the Advancement of Medical Instrumentation; 2015. Arlington, VA.

ANSI/AAMI/ISO 14971:2007, Medical devices – Application of risk management to medical devices. International Organization for Standardization; 2007. Geneva, Switzerland.

ANSI/AAMI/ISO 13485:2016, Medical devices – Quality management systems – Requirements for regulatory purposes. International Organization for Standardization; 2016. Geneva, Switzerland.

AAMI/IEC TR 62366-2:2016, Medical devices – Part 2: Guidance on the application of usability engineering to medical devices. International Electrotechnical Commission; 2016. Geneva, Switzerland.

ISO 9001:2015, Quality management systems – Requirements. International Organization for Standardization; 2015. Geneva, Switzerland.

ISO 9241-11:2018, Ergonomics of human-system interaction – Part 11: Usability: Definitions and concepts. International Organization for Standardization; 2018. Geneva, Switzerland.

ISO 31000:2018, Risk management – Guidelines. International Organization for Standardization; 2018. Geneva, Switzerland.

II. Guides

AHRQ – Agency for Healthcare Research and Quality, Heuristic Evaluation. https://healthit.ahrq.gov/health-it-tools-and-resources/evaluation-resources/workflow-assessment-health-it-toolkit/all-workflow-tools/heuristic-evaluation

CDRH – Center for Devices and Radiological Health, Applying Human Factors and Usability Engineering to Medical Devices; 2016. Silver Spring, MD.

CDRH, Design Control Guidance for Medical Device Manufacturers; 1997. Silver Spring, MD.

Nielsen-Shneiderman Heuristics, A tool for evaluating product usability.

SAFER Guides. https://www.healthit.gov/topic/safety/safer-guides

Usability.gov, Usability Evaluation Methods.

III. Reports

FDASIA Health IT Report: Proposed Strategy and Recommendations for a Risk-Based Framework. Jointly released by the Office of the National Coordinator for Health IT (ONC), the Food and Drug Administration (FDA), and the Federal Communications Commission (FCC); April 2014.


National Quality Forum. Identification and Prioritization of Health IT Patient Safety Measures; 2016.

Report of the ISO/TC 215-IEC/SC 62 Joint Task Force on Health Software (unpublished; available from International Organization for Standardization ISO/TC 215 or IEC/SC 62A. Geneva, Switzerland).

IV. Articles

Payne, T. H., Corley, S., Cullen, T. A., Gandhi, T. K., Harrington, L., Kuperman, G. J., ... & Tierney, W. M. Report of the AMIA EHR-2020 Task Force on the status and future direction of EHRs. Journal of the American Medical Informatics Association; 2015.

Schumacher, R. M., & Lowry, S. Z. NIST guide to the processes approach for improving the usability of electronic health records. National Institute of Standards and Technology; 2010. Gaithersburg, Maryland.

Schumacher, R. M., & Lowry, S. Z. NISTIR 7742: Customized Common Industry Format Template for Electronic Health Record Usability Testing; 2010. Gaithersburg, Maryland.

Wiklund ME, Kendler J, Hochberg L, Weinger MB. NIST GCR 15-996: Technical Basis for User Interface Design of Health IT. National Institute of Standards and Technology; 2015. Gaithersburg, Maryland.

V. Laws and Regulations

European Union. Council Directive 93/42/EEC of 14 June 1993 concerning medical devices. OJ L 169, 12 July 1993.

European Union. Regulation 2017/745 of the European Parliament and of the Council of 5 April 2017 on medical devices.