
Understanding How We Are Wired and Explaining Why We Short-Circuit:

A Workshop in Medical Decision Making and Error

APPD Annual Meeting 2015

Orlando, Florida

March 2015

Presenters:

Emily Ruedinger, MD – Seattle Children’s Hospital and University of Washington

Andrew Olson, MD – University of Minnesota

Maren Olson, MD, MPH – Children’s Hospitals and Clinics of Minnesota and University of Minnesota

Emily Borman-Shoap, MD – University of Minnesota

Andrew Olson and Emily Ruedinger. APPD Workshop 2015.

CASE ONE

Instructions

Read the following vignette and discuss the questions below. In reading the vignette, please keep in mind that some details have been omitted for the sake of brevity. Please try not to focus too much on any missing medical details.

Vignette

Calvin is a 16-year-old, otherwise healthy male football player who was transferred from a community hospital with arthralgias, extreme fatigue, rash and fever. You obtain a history from Calvin and his parents, as follows:

• 8 days prior to arrival (PTA): developed R shoulder pain and restricted ROM (most bothersome symptom), chills, low grade fever, headache, fatigue and a rash on his bilateral LE, chest and hands. Went to an orthopedist where exam revealed pain with ROM at the R shoulder. Orthopedist recommended acetaminophen and ibuprofen, with follow up if not improving.

• 6-7 days PTA: Rash resolved, fevers improved, felt overall better. Ongoing R shoulder pain. Saw PMD; Lyme and mono tests sent and negative. Returned to school.

• 5 days PTA: Back to the orthopedist. Unable to tolerate ROM at R shoulder. Says that orthopedist gave him a “novocaine” injection into the joint. Had an MRI showing “a lot of fluid in the joint.” Orthopedist put his shoulder in a brace and recommended close follow up.

• 2 days PTA: shoulder not better. Otherwise well. Followed up with the orthopedist, and started on oral steroids “to help with inflammation in the shoulder”.

• 1 day PTA: return of fevers and chills

• Day of arrival: abrupt worsening of the shoulder pain, developed full body arthralgias and extreme fatigue. Had to leave school in a wheelchair. Taken to local ER, where the ER physician sent a blood culture, gave a dose of ceftriaxone, and transferred him to the university hospital.

On arrival, he was febrile to 38.5C, had pain with movement of the R shoulder, a macular rash on the trunk and legs, peeling skin on the hands/feet, a “strawberry tongue” and conjunctivitis. Labs showed a leukocytosis and thrombocytosis, elevated ESR and CRP, normocytic and normochromic anemia, mild transaminase elevation and a sterile pyuria. CK was normal. He was mildly hyponatremic.

Due to concern for Kawasaki disease, you ordered an echo showing possible mild coronary dilation. He was started on IVIg and aspirin. Antibiotics were not continued.


Fevers resolved. He was overall feeling well. He continued to experience rash, conjunctival injection, peeling skin on hands and feet, and began to complain of pain in the inguinal area. His shoulder improved slightly.

Three days after initiating treatment for Kawasaki disease, his fever re-emerged. Inflammatory markers had increased, though WBC was trending down. Groin pain and R shoulder pain were worsening, with restriction in ROM at the R shoulder and L hip. Pelvic MRI showed edema and swelling of the iliopsoas muscle group, paraspinal muscles and intermuscular plane of the L thigh as well as a small L hip effusion and diffuse synovial enhancement, with hypointensity of the lower vertebral bone marrow. A repeat R shoulder MRI showed a joint effusion with bursitis, synovitis and inflammatory change. Joint aspiration was bloody and purulent with 76K WBC. He was taken to the OR by orthopedics who ultimately diagnosed pyomyositis of the R shoulder and L hip, with a possible septic arthritis.

Calvin was initiated on cefazolin and vancomycin until cultures returned showing MSSA, at which point the vancomycin was discontinued. He ultimately completed 4 weeks total of antibiotics and aggressive PT of the hip and shoulder. He did well after a long recovery.

Questions

Please do not move on to Question 2 until your group has finished discussing Question 1.

1. Identify any instances of cognitive error you can recognize in this case. You do not need to correctly “name” the error, but try to cite as many examples as you can.


2. We would like you to specifically focus on availability bias as a source of error. Availability bias is defined as: jumping to a diagnosis that comes to mind quickly because it is common, serious, recently encountered, or otherwise noteworthy. Where do you think this happened in this case?

3. Can any of you think of any real-life examples—from your personal experiences, or cases you just heard about, where availability bias may have played a role? We would like your group to think of at least two examples.

4. Decision-making is impacted by the environmental and system conditions in which we practice. How do you think the environmental and system conditions in this case influenced decision making?


5. What specific cognitive strategies (irrespective of the system in which we practice) could be used to try to avoid similar errors in the future?

6. What system changes could be made to help avoid similar errors in the future?

7. If you were to talk with a resident about this case, how would you do it? What specific phrases might you use?


CASE TWO

Instructions

Read the following vignette and discuss the questions below. In reading the vignette, please keep in mind that some details have been omitted for the sake of brevity. Please try not to focus too much on any missing medical details.

Vignette

It’s a busy Saturday night in the pediatric emergency room. You are 11 hours into a 12-hour shift. You look at the triage notes for the next patient you are about to see, and note that the patient is an 18-year-old female named Samira, with a chief complaint of “1 week of fevers.” Her vital signs are as follows: T 39.3, P 132, BP 126/50, R 30, O2 sat 99% RA, BMI 30.4.

You enter the room and Samira greets you by saying, “I’m dying.” She then proceeds to give you a very dramatic and incredibly descriptive depiction of her entire life story. She is simultaneously moaning and complaining of extreme pain “everywhere,” while somehow also appearing cheerful and upbeat. She is one of the most verbose patients you’ve ever encountered, and you find yourself cutting her off as her BP drops to the 70s/40s. Samira now reports that she is feeling much better and keeps talking. At your request, the nurse places two IVs, draws labs, starts NS wide open and administers antibiotics.

Her blood pressure improves with fluids. In between writing orders and communicating with nursing and the attending, you are able to ascertain that she has had about 1 week of headaches, initially intermittent but now persistent, 10/10, bi-frontal and radiating to the neck. She’s been having intermittent fevers up to “158” degrees, and lower abdominal/pelvic pain radiating to both sides, the left worse than the right. Samira also complains of vaginal discharge that has been present for two months but has been worse lately; she describes it as copious, white and brown in color, and she’s been having pain with sex. Her last period was 11 days ago. Samira has a diffusely positive review of systems: whatever you ask her about, she endorses. She has no allergies and takes no medications. She’s had two surgeries for ectopic pregnancies, one of which ruptured and required a unilateral salpingectomy. She has also had one miscarriage; she denies any history of elective terminations or live births. She was last pregnant over 2 years ago. She has been with the same male partner for the past 6 months, and has had more than 10 lifetime partners. She does not use condoms or any other method of birth control. Samira immigrated to the US from Nigeria with her mother 2 years ago. She denies any foreign travel since immigrating. Her mom is currently sick and hospitalized elsewhere in the city for unknown reasons.

Her exam is notable for ill appearance, faint crackles at the lung bases and tachycardia. Her lower abdomen is tender, L > R, without rebound or guarding – she has mild RUQ tenderness. Her bed is not equipped with stirrups so you perform a limited gynecological exam. She has a large amount of white vaginal discharge visible externally. Bimanual is negative for cervical motion tenderness. You cannot appreciate any uterine or adnexal tenderness but find this portion of the exam frustratingly difficult because of the room, her positioning and her body habitus.


Her initial lab work shows pancytopenia, elevated inflammatory markers, and a negative HCG. Chest X-ray and an EKG are both normal. Pelvic ultrasound shows normal adnexal flow and a homogenous 4 cm x 2.8 cm mass that the overnight radiology resident says is possibly a tubo-ovarian abscess.

OB-GYN is consulted. The consulting physician repeats the bimanual exam and reports left adnexal tenderness and a mass. You put in for a PICU bed, and Samira is admitted with a diagnosis of sepsis caused by PID with tubo-ovarian abscess. You finally leave the ER, exhausted, four hours after going in to see Samira.

The next morning, a repeat CBC shows worsening anemia. The attending radiologist looks at the US images and requests a pelvic CT scan. This reveals a large fibroid, and no abscess. Samira continues to be treated for presumed PID while further workup ensues. Over the next couple of days, Samira essentially has no change in her subjective symptoms and continues to have periods of stability punctuated by spiking fevers and fluid-responsive hypotension.

Three days after her initial presentation, a malaria smear returns showing Plasmodium ovale. She then remembers that she received treatment for malaria at the time of immigration. She is started on appropriate therapy, improves rapidly and is discharged the day after her malaria diagnosis.

Questions

Please do not move on to Question 2 until your group has finished discussing Question 1.

1. Identify any instances of cognitive error you can recognize in this case. You do not need to correctly “name” the error, but try to cite as many examples as you can.


2. We would like you to specifically focus on visceral bias. Visceral bias is defined as: allowing personal feelings towards the patient to influence diagnostic conclusions. Where do you think this happened in this case?

3. Can any of you think of any real-life examples—from your personal experiences, or cases you just heard about, where visceral bias may have played a role? We would like your group to think of at least two examples.

4. Decision-making is impacted by the environmental and system conditions in which we practice. How do you think the environmental and system conditions in this case influenced decision making?


5. What specific cognitive strategies (irrespective of the system in which we practice) could be used to try to avoid similar errors in the future?

6. What system changes could be made to help avoid similar errors in the future?

7. If you were to talk with a resident about this case, how would you do it? What specific phrases might you use?


CASE THREE

Instructions

Read the following vignette and discuss the questions below. In reading the vignette, please keep in mind that some details have been omitted for the sake of brevity. Please try not to focus too much on any missing medical details.

Vignette

It’s about 9 PM and you are one hour away from finishing your shift in the emergency department. There’s been a stream of patients with upper respiratory symptoms tonight, and you are ready to be done. You pick up the chart for Steven, a 2-year-old boy presenting with a barky cough and fever; it seems like parainfluenza is hitting hard this year, as this is the fifth case of croup you’ve seen today. You peek at his vitals on the sheet (T 39.4, HR 148, R 36, SaO2 96%) and head into the room.

Steven is sitting on his mother’s lap on the gurney, and she is helping to hold the dinosaur-shaped nebulizer mask over his mouth and nose while he tries to pry it off. She states that his cough started last night and kept them up most of the night, and they had to stay home today because the coughing “was so bad” and his “breathing was loud.” He’s never had anything like this before, but his older brother has had croup a few times. She says that he has been drinking a little bit today but has had very little to eat. He hasn’t started toilet training yet, and the last wet diaper she saw was around 2 PM. She says the only funny thing is that his cough didn’t really get much better during the day today, which is different from the experience that Seth, his older brother, had when he had croup.

Steven has otherwise been pretty healthy and has had all of his shots. He was born on time and there is no smoking in the home. On examination, he is using accessory muscles to breathe. His respiratory rate is about 40. His mucous membranes are moist and he has a lot of secretions everywhere: from his nose, his mouth, and all over his shirt. He has audible inspiratory stridor without putting the stethoscope on his chest, and when you do listen with your stethoscope, you only hear transmitted upper airway sounds. His heart is tachycardic but he doesn’t have a murmur. His capillary refill time is slightly delayed. There is no rash. By the end of your examination, you think the racemic epinephrine nebulization has made a little difference, and you tell Steven’s mom that a nurse will be in with a dose of steroids and another nebulizer treatment, still optimistic he will get to go home tonight.

After a few more nebulizer treatments and the dexamethasone, Steven still has stridor at rest but is more comfortable: he is working slightly less hard to breathe and still has a fever. He wasn’t able to swallow the steroids, so an IV was started and an IV dose administered. Given his poor improvement in the ED, you talk to your attending about admitting him to short stay and she agrees. You call the short stay attending and tell her the story, and she says back to you, “Yup, sounds like another crouper – send him up.”

A few hours after admission, Steven’s breathing gets worse and he seems less alert: he has both inspiratory and expiratory stridor now, and racemic epinephrine nebs are not making a difference. A lateral neck film is obtained that shows “haziness and irregularity” of the anterior border of the trachea, concerning for bacterial tracheitis. ENT is called and the patient undergoes intubation in the operating room. He is started on broad-spectrum antibiotics and transferred to the PICU.

You feel like you really missed this one – in fact, this diagnosis hadn’t even crossed your mind. You walk up to talk to the resident in the PICU who happens to be one of your friends.

Questions

Please do not move on to Question 2 until your group has finished discussing Question 1.

1. Identify any instances of cognitive error you can recognize in this case. You do not need to correctly “name” the error, but try to cite as many examples as you can.


2. We would like you to specifically focus on premature closure as a source of error. Premature closure is defined as: the tendency to close the decision-making process prematurely, accepting a diagnosis before it has been fully verified. Where do you think this happened in this case?

3. Can any of you think of any real-life examples—from your personal experiences, or cases you just heard about, where premature closure may have played a role? We would like your group to think of at least two examples.

4. Decision-making is impacted by the environmental and system conditions in which we practice. How do you think the environmental and system conditions in this case influenced decision making?


5. What specific cognitive strategies (irrespective of the system in which we practice) could be used to try to avoid similar errors in the future?

6. What system changes could be made to help avoid similar errors in the future?

7. If you were to talk with a resident about this case, how would you do it? What specific phrases might you use?


CASE FOUR

Instructions

Read the following vignette and discuss the questions below. In reading the vignette, please keep in mind that some details have been omitted for the sake of brevity. Please try not to focus too much on any missing medical details.

Vignette

You are an intern seeing a 16-month-old boy, Anwar, in your continuity clinic for a one-year-old well child check, accompanied by his father. Anwar’s father speaks English relatively well, but you aren’t always sure that he understands your questions and comments. You offer to bring an interpreter into the room, but as usual, Anwar’s father declines.

Anwar is quiet and clingy. He watches you warily during the whole encounter, and does not leave his father’s lap. You are unable to get much of a spontaneous developmental exam due to Anwar’s demeanor. His father has no concerns, and states that Anwar has a few words, crawls, and pulls to stand but is not yet walking. On exam, you note that Anwar has a prominent genu varum (bow-legged) appearance. You know this can be a normal physiologic appearance for toddlers, but because it seems so prominent and he is approaching the later end of “normal” to start walking, you ask them to come back when he is 18 months old to check in.

The next time you see this family, Anwar is almost two years old. He is accompanied by his father, who has no concerns. Anwar is again quiet and does not leave his father’s lap. His father reports that he did start walking and puts words together into short sentences. He thinks everything is going “fine.” The developmental screening form dad filled out is negative. When you ask more questions he gives you short answers and seems to be in a hurry. Dad seems confident and assertive, and has two older children. On exam, you continue to note a prominent genu varum and start to feel like this really might be more than you would expect for his age. You mention this to your precepting attending and she quickly reassures you: “It’s probably because he was such a late walker, don’t worry.” You are reassured by her response, and by the fact that the father thinks things are going well.

At Anwar’s next visit, he is nearly 3 years old and you are now in your third year of residency. Anwar is more interactive and engaged than at past visits. However, you note that he has difficulty climbing the steps up to the exam table, trips frequently while walking around the room, and continues to display a prominent genu varum and some metatarsus adductus. Patellar reflexes are diminished. You decide to refer him to neurology. A couple of weeks later, you check in with one of your friends who is a pediatric neurology resident. He says this is a “great case,” tells you that Anwar’s exam was “definitely abnormal” and that his CK was “really elevated.” He thinks that Anwar probably has a congenital muscular dystrophy. Anwar is still undergoing further workup, and has started aggressive physical therapy.


You can’t help but think that this could have been diagnosed earlier if you had done things differently. You know that regardless of when this was diagnosed, congenital muscular dystrophy is progressive and incurable. Yet you wonder if starting PT sooner could have slowed his motor decline. You are kicking yourself because you probably didn’t press dad as much as you could have regarding Anwar’s developmental status during the earlier visits, when you didn’t have a great chance to observe Anwar’s behavior, and because you trusted your attending’s reassurance that everything was probably fine at his 2-year-old visit, even though your gut instinct told you that something might be wrong.

Questions

Please do not move on to Question 2 until your group has finished discussing Question 1.

1. Identify any instances of cognitive error you can recognize in this case. You do not need to correctly “name” the error, but try to cite as many examples as you can.


2. We would like you to specifically focus on “playing the odds” and “zebra restraint” as sources of error. Playing the odds is the tendency, in equivocal or ambiguous presentations, to opt for a benign diagnosis on the basis that it is significantly more likely than a serious one. Zebra restraint is when a rare diagnosis (a “zebra”) figures prominently on the differential diagnosis but the physician retreats from it for various reasons. Where do you think this happened in this case?

3. Can any of you think of any real-life examples—from your personal experiences, or cases you just heard about, where “playing the odds” or zebra restraint may have played a role? We would like your group to think of at least two examples.

4. Decision-making is impacted by the environmental and system conditions in which we practice. How do you think the environmental and system conditions in this case influenced decision making?


5. What specific cognitive strategies (irrespective of the system in which we practice) could be used to try to avoid similar errors in the future?

6. What system changes could be made to help avoid similar errors in the future?

7. If you were to talk with a resident about this case, how would you do it? What specific phrases might you use?


ARTICLE

The Importance of Cognitive Errors in Diagnosis and Strategies to Minimize Them

Pat Croskerry, MD, PhD

The recent article by Graber et al.1 provides a comprehensive overview of diagnostic errors in medicine. There is, indeed, a long overdue and pressing need to focus on this area. They raise many important points, several of which deserve extra emphasis in the light of recent developments. They also provide an important conceptual framework within which strategies may be developed to minimize errors in this critical aspect of patient safety. Diagnostic errors are associated with a proportionately higher morbidity than is the case with other types of medical errors.2–4

The no-fault and system-related categories of diagnostic errors described1 certainly have the potential for reduction. In fact, very simple changes to the system could result in a significant reduction in these errors. However, the greatest challenge, as they note, is the minimization of cognitive errors, and specifically the biases and failed heuristics that underlie them. Historically, there has prevailed an unduly negative mood toward tackling cognitive bias and finding ways to minimize or eliminate it.

The cognitive revolution in psychology that took place over the last 30 years gave rise to an extensive, empirical literature on cognitive bias in decision-making, but this advance has been ponderously slow to enter medicine. Decision-making theorists in medicine have clung to normative, often robotic, models of clinical decision making that have little practical application in the real world of decision making. What is needed, instead, is a systematic analysis of what Reason5 has called “flesh and blood” decision-making. This is the real decision making that occurs at the front line, when resources are in short supply, when time constraints apply, and when shortcuts are being sought. When we look more closely at exactly what

ABSTRACT

In the area of patient safety, recent attention has focused on diagnostic error. The reduction of diagnostic error is an important goal because of its associated morbidity and potential preventability. A critical subset of diagnostic errors arises through cognitive errors, especially those associated with failures in perception, failed heuristics, and biases; collectively, these have been referred to as cognitive dispositions to respond (CDRs). Historically, models of decision-making have given insufficient attention to the contribution of such biases, and there has been a prevailing pessimism against improving cognitive performance through debiasing techniques. Recent work has catalogued the major cognitive biases in medicine; the author lists these and describes a number of strategies for reducing them (“cognitive debiasing”). Principal among them is metacognition, a reflective approach to problem solving that involves stepping back from the immediate problem to examine and reflect on the thinking process. Further research effort should be directed at a full and complete description and analysis of CDRs in the context of medicine and the development of techniques for avoiding their associated adverse outcomes. Considerable potential exists for reducing cognitive diagnostic errors with this approach. The author provides an extensive list of CDRs and a list of strategies to reduce diagnostic errors.

Acad. Med. 2003;78:775–780.

Dr. Croskerry is associate professor, Departments of Emergency Medicine and Medical Education, Dalhousie University Faculty of Medicine, Halifax, Nova Scotia, Canada. He is also a member of the Center for Safety in Emergency Care, a research consortium of the University of Florida College of Medicine, Dalhousie University Faculty of Medicine, Northwestern University The Feinberg School of Medicine, and Brown Medical School.

Correspondence and requests for reprints should be sent to Dr. Croskerry, Emergency Department, Dartmouth General Hospital Site, Capital District, 325 Pleasant Street, Dartmouth, Nova Scotia, Canada B2Y 4G8; telephone: (902) 465-8491; fax: (902) 460-4148; e-mail: [email protected]. The responses to this article are printed after it.

Academic Medicine, Vol. 78, No. 8 / August 2003, 775

cognitive activity is occurring when these clinical decisions are being made, we may be struck by how far it is removed from what normative theory describes. Although it seems certain we would be less likely to fail patients diagnostically when we follow rational, normative models of decision making, and although such models are deserving of “a prominent place in Plato’s heaven of ideas,”6 they are impractical at the sharp end of patient care. Cognitive diagnostic failure is inevitable when exigencies of the clinical workplace do not allow such Olympian cerebral approaches.

Medical decision makers and educators have to do three things: (1) appreciate the full impact of diagnostic errors in medicine and the contribution of cognitive errors in particular; (2) refute the inevitability of cognitive diagnostic errors; and (3) dismiss the pessimism that surrounds approaches for lessening cognitive bias.

For the first, the specialties in which diagnostic uncertainty is most evident and in which delayed or missed diagnoses are most likely are internal, family, and emergency medicine; this is borne out in findings from the benchmark studies of medical error.2–4 However, all specialties are vulnerable to this particular adverse event. The often impalpable nature of diagnostic error perhaps reflects why it does not appear in lists of serious reportable events.7

For the second, there needs to be greater understanding of the origins of the widespread inertia that prevails against reducing or eliminating cognitive errors. This inertia may exist because such errors appear to be so predictable, so widespread among all walks of life, so firmly entrenched, and, therefore, probably hardwired. Although the evolutionary imperatives that spawned them may have served us well in earlier times, it now seems we are left with cognitively vestigial approaches to the complex decision making required of us in the modern world. Although “cognitive firewalls” may have evolved to quarantine or avoid cognitive errors, they are clearly imperfect8 and will require ontogenetic assistance (i.e., cognitive debiasing) to avoid their consequences. Accepting this, we should say less about biases and failed heuristics and more about cognitive dispositions to respond (CDRs) to particular situations in various predictable ways. Removing the stigma of bias clears the way toward accepting the capricious nature of decision-making, and perhaps goes some way toward exculpating clinicians when their diagnoses fail.

An understanding of why clinicians have particular CDRs in particular clinical situations will throw considerable light on cognitive diagnostic errors. The unmasking of cognitive errors in the diagnostic process then allows for the development of debiasing techniques. This should be the ultimate goal, and it is not unrealistic.

Certainly, a number of clear strategies exist for reducing the memory limitations and excessive cognitive loading1 that can lead to diagnostic errors, but the most important strategy may well lie in familiarizing clinicians with the various types of CDRs that are out there, and how they might be avoided. I made a recent extensive trawl of medical and psychological literature, which revealed at least 30 CDRs,9 and there are probably more (List 1). This catalogue provides some idea of the extent of cognitive bias on decision-making and gives us a working language to describe it. The failures to show improvement in decision support for clinical diagnosis that are noted by Graber et al.1 should come as no surprise. They are likely due to insufficient awareness of the influence of these CDRs, which is often subtle and covert.10 There appears to have been an historic failure to fully appreciate, and therefore capture, where the most significant diagnostic failures are coming from.

Not surprisingly, all CDRs are evident in emergency medicine, a discipline that has been described as a “natural laboratory of error.”11 In this milieu, decision-making is often naked and raw, with its flaws highly visible. Nowhere in medicine is rationality more bounded by relatively poor access to information and with limited time to process it, all within a milieu renowned for its error-producing conditions.12 It is where heuristics dominate, and without them emergency departments would inexorably grind to a halt.13

Best of all, for those who would like to study real decision making, it is where heuristics can be seen to catastrophically fail. Approximately half of all litigation brought against emergency physicians arises from delayed or missed diagnoses.14

If we accept the pervasiveness and predictability of the CDRs that underlie diagnostic cognitive error, then we are obliged to search for effective debiasing techniques. Despite the prevailing pessimism, it has been demonstrated that, using a variety of strategies15,16 (Table 1), CDRs can be overcome for a number of specific biases.16–23 It appears that there are, indeed, cognitive pills for cognitive ills,22 which makes intuitive sense. This is fortunate, for otherwise, how would we learn to avoid pitfalls, develop expertise, and acquire clinical acumen, particularly if the predisposition for certain cognitive errors is hardwired? However, medical educators should be aware that if the pills are not sufficiently sugared, they may not be swallowed.

Yates et al.24 have summarized some of the major impediments that have stood in the way of developing effective cognitive debiasing strategies, and they are not insurmountable. The first step is to overcome the bias against overcoming bias. Metacognition will likely be the mainstay of this approach. A recent cognitive debiasing technique using cognitive forcing strategies is based on metacognitive principles10 and seems to be teachable to medical undergraduates and postgraduates.25 Essentially, the strategy requires first that the learner be aware of the

COGNITIVE ERRORS IN DIAGNOSIS, CONTINUED

ACADEMIC MEDICINE, VOL. 78, NO. 8 / AUGUST 2003, 776

List 1

Cognitive Dispositions to Respond (CDRs) That May Lead to Diagnostic Error*

Aggregate bias: when physicians believe that aggregated data, such as those used to develop clinical practice guidelines, do not apply to individual patients (especially their own), they are invoking the aggregate fallacy. The belief that their patients are atypical or somehow exceptional may lead to errors of commission, e.g., ordering x-rays or other tests when guidelines indicate none are required.

Anchoring: the tendency to perceptually lock onto salient features in the patient's initial presentation too early in the diagnostic process, and failing to adjust this initial impression in the light of later information. This CDR may be severely compounded by the confirmation bias.

Ascertainment bias: occurs when a physician's thinking is shaped by prior expectation; stereotyping and gender bias are both good examples.

Availability: the disposition to judge things as being more likely, or frequently occurring, if they readily come to mind. Thus, recent experience with a disease may inflate the likelihood of its being diagnosed. Conversely, if a disease has not been seen for a long time (is less available), it may be underdiagnosed.

Base-rate neglect: the tendency to ignore the true prevalence of a disease, either inflating or reducing its base-rate, and distorting Bayesian reasoning. However, in some cases, clinicians may (consciously or otherwise) deliberately inflate the likelihood of disease, such as in the strategy of "rule out worst-case scenario" to avoid missing a rare but significant diagnosis.
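
The distortion produced by base-rate neglect is easy to see with a worked Bayes calculation. The short Python sketch below is illustrative only (it is not part of the original list, and the test characteristics are hypothetical): even a highly accurate test yields mostly false positives when the disease is rare.

```python
def positive_predictive_value(prevalence, sensitivity, specificity):
    """P(disease | positive test), via Bayes' theorem."""
    true_pos = sensitivity * prevalence              # diseased AND test-positive
    false_pos = (1 - specificity) * (1 - prevalence)  # healthy AND test-positive
    return true_pos / (true_pos + false_pos)

# A 99%-sensitive, 95%-specific test for a disease with 1% prevalence:
ppv = positive_predictive_value(0.01, 0.99, 0.95)
print(round(ppv, 3))  # 0.167 -- most positive results are false positives
```

The intuitive answer ("the test is 99% accurate, so a positive result is 99% likely to mean disease") ignores the 1% base rate; carrying the prevalence through Bayes' theorem shows the true figure is only about 17%.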

Commission bias: results from the obligation toward beneficence, in that harm to the patient can only be prevented by active intervention. It is the tendency toward action rather than inaction. It is more likely in over-confident physicians. Commission bias is less common than omission bias.

Confirmation bias: the tendency to look for confirming evidence to support a diagnosis rather than look for disconfirming evidence to refute it, despite the latter often being more persuasive and definitive.

Diagnosis momentum: once diagnostic labels are attached to patients they tend to become stickier and stickier. Through intermediaries (patients, paramedics, nurses, physicians), what might have started as a possibility gathers increasing momentum until it becomes definite, and all other possibilities are excluded.

Feedback sanction: a form of ignorance trap and time-delay trap CDR. Making a diagnostic error may carry no immediate consequences, as considerable time may elapse before the error is discovered, if ever, or poor system feedback processes prevent important information on decisions getting back to the decision maker. The particular CDR that failed the patient persists because of these temporal and systemic sanctions.

Framing effect: how diagnosticians see things may be strongly influenced by the way in which the problem is framed, e.g., physicians' perceptions of risk to the patient may be strongly influenced by whether the outcome is expressed in terms of the possibility that the patient might die or might live. In terms of diagnosis, physicians should be aware of how patients, nurses, and other physicians frame potential outcomes and contingencies of the clinical problem to them.

Fundamental attribution error: the tendency to be judgmental and blame patients for their illnesses (dispositional causes) rather than examine the circumstances (situational factors) that might have been responsible. In particular, psychiatric patients, minorities, and other marginalized groups tend to suffer from this CDR. Cultural differences exist in terms of the respective weights attributed to dispositional and situational causes.

Gambler's fallacy: attributed to gamblers, this fallacy is the belief that if a coin is tossed ten times and is heads each time, the 11th toss has a greater chance of being tails (even though a fair coin has no memory). An example would be a physician who sees a series of patients with chest pain in clinic or the emergency department, diagnoses all of them with an acute coronary syndrome, and assumes the sequence will not continue. Thus, the pretest probability that a patient will have a particular diagnosis might be influenced by preceding but independent events.
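
The independence at the heart of the gambler's fallacy can be checked empirically. The Python sketch below (an illustrative simulation, not part of the original list) estimates the probability of heads on the toss immediately following a run of heads; for a fair coin it stays near 0.5 no matter how long the run.

```python
import random

def p_heads_after_run(run_length, trials=200_000, seed=1):
    """Estimate P(heads) on the toss immediately following
    `run_length` consecutive heads of a fair coin."""
    rng = random.Random(seed)
    following = heads_after = streak = 0
    for _ in range(trials):
        toss = rng.random() < 0.5    # True = heads
        if streak >= run_length:     # the preceding tosses were all heads
            following += 1
            heads_after += toss
        streak = streak + 1 if toss else 0
    return heads_after / following

# Both estimates land near 0.5: the coin has no memory.
print(round(p_heads_after_run(3), 2), round(p_heads_after_run(10), 2))
```

The same logic underlies the clinical example: each patient's pretest probability is unaffected by the diagnoses that happened to precede it.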

Gender bias: the tendency to believe that gender is a determining factor in the probability of diagnosis of a particular disease when no such pathophysiological basis exists. Generally, it results in an overdiagnosis of the favored gender and underdiagnosis of the neglected gender.

Hindsight bias: knowing the outcome may profoundly influence the perception of past events and prevent a realistic appraisal of what actually occurred. In the context of diagnostic error, it may compromise learning through either an underestimation (illusion of failure) or overestimation (illusion of control) of the decision maker's abilities.

Multiple alternatives bias: a multiplicity of options on a differential diagnosis may lead to significant conflict and uncertainty. The process may be simplified by reverting to a smaller subset with which the physician is familiar but may result in inadequate consideration of other possibilities. One such strategy is the three-diagnosis differential: "It is probably A, but it might be B, or I don't know (C)." Although this approach has some heuristic value, if the disease falls in the C category and is not pursued adequately, it will minimize the chances that some serious diagnoses can be made.

Omission bias: the tendency toward inaction, rooted in the principle of nonmaleficence. In hindsight, events that have occurred through the natural progression of a disease are more acceptable than those that may be attributed directly to the action of the physician. The bias may be sustained by the reinforcement often associated with not doing anything, but it may prove disastrous. Omission biases typically outnumber commission biases.

Order effects: information transfer is a U-function: we tend to remember the beginning part (primacy effect) or the end (recency effect). Primacy effect may be augmented by anchoring. In transitions of care, in which information transferred from patients, nurses, or other physicians is being evaluated, care should be taken to give due consideration to all information, regardless of the order in which it was presented.

Outcome bias: the tendency to opt for diagnostic decisions that will lead to good outcomes, rather than those associated with bad outcomes, thereby avoiding chagrin associated with the latter. It is a form of value bias in that physicians may express a stronger likelihood in their decision-making for what they hope will happen rather than for what they really believe might happen. This may result in serious diagnoses being minimized.


Overconfidence bias: a universal tendency to believe we know more than we do. Overconfidence reflects a tendency to act on incomplete information, intuitions, or hunches. Too much faith is placed in opinion instead of carefully gathered evidence. The bias may be augmented by both anchoring and availability, and catastrophic outcomes may result when there is a prevailing commission bias.

Playing the odds: (also known as frequency gambling) is the tendency in equivocal or ambiguous presentations to opt for a benign diagnosis on the basis that it is significantly more likely than a serious one. It may be compounded by the fact that the signs and symptoms of many common and benign diseases are mimicked by more serious and rare ones. The strategy may be unwitting or deliberate and is diametrically opposed to the rule out worst-case scenario strategy (see base-rate neglect).

Posterior probability error: occurs when a physician's estimate for the likelihood of disease is unduly influenced by what has gone on before for a particular patient. It is the opposite of the gambler's fallacy in that the physician is gambling on the sequence continuing, e.g., if a patient presents to the office five times with a headache that is correctly diagnosed as migraine on each visit, it is the tendency to diagnose migraine on the sixth visit. Common things for most patients continue to be common, and the potential for a nonbenign headache being diagnosed is lowered through posterior probability.

Premature closure: a powerful CDR accounting for a high proportion of missed diagnoses. It is the tendency to apply premature closure to the decision-making process, accepting a diagnosis before it has been fully verified. The consequences of the bias are reflected in the maxim: "When the diagnosis is made, the thinking stops."

Psych-out error: psychiatric patients appear to be particularly vulnerable to the CDRs described in this list and to other errors in their management, some of which may exacerbate their condition. They appear especially vulnerable to fundamental attribution error. In particular, comorbid medical conditions may be overlooked or minimized. A variant of psych-out error occurs when serious medical conditions (e.g., hypoxia, delirium, metabolic abnormalities, CNS infections, head injury) are misdiagnosed as psychiatric conditions.

Representativeness restraint: the representativeness heuristic drives the diagnostician toward looking for prototypical manifestations of disease: "If it looks like a duck, walks like a duck, quacks like a duck, then it is a duck." Yet restraining decision-making along these pattern-recognition lines leads to atypical variants being missed.

Search satisfying: reflects the universal tendency to call off a search once something is found. Comorbidities, second foreign bodies, other fractures, and coingestants in poisoning may all be missed. Also, if the search yields nothing, diagnosticians should satisfy themselves that they have been looking in the right place.

Sutton's slip: takes its name from the apocryphal story of the Brooklyn bank-robber Willie Sutton who, when asked by the judge why he robbed banks, is alleged to have replied: "Because that's where the money is!" The diagnostic strategy of going for the obvious is referred to as Sutton's law. The slip occurs when possibilities other than the obvious are not given sufficient consideration.

Sunk costs: the more clinicians invest in a particular diagnosis, the less likely they may be to release it and consider alternatives. This is an entrapment form of CDR more associated with investment and financial considerations. However, for the diagnostician, the investment is time and mental energy and, for some, ego may be a precious investment. Confirmation bias may be a manifestation of such an unwillingness to let go of a failing diagnosis.

Triage cueing: the triage process occurs throughout the health care system, from the self-triage of patients to the selection of a specialist by the referring physician. In the emergency department, triage is a formal process that results in patients being sent in particular directions, which cues their subsequent management. Many CDRs are initiated at triage, leading to the maxim: "Geography is destiny."

Unpacking principle: failure to elicit all relevant information (unpacking) in establishing a differential diagnosis may result in significant possibilities being missed. The more specific a description of an illness that is received, the more likely the event is judged to exist. If patients are allowed to limit their history-giving, or physicians otherwise limit their history-taking, unspecified possibilities may be discounted.

Vertical line failure: routine, repetitive tasks often lead to thinking in silos: predictable, orthodox styles that emphasize economy, efficacy, and utility. Though often rewarded, the approach carries the inherent penalty of inflexibility. In contrast, lateral thinking styles create opportunities for diagnosing the unexpected, rare, or esoteric. An effective lateral thinking strategy is simply to pose the question: "What else might this be?"

Visceral bias: the influence of affective sources of error on decision-making has been widely underestimated. Visceral arousal leads to poor decisions. Countertransference, both negative and positive feelings toward patients, may result in diagnoses being missed. Some attribution phenomena (fundamental attribution error) may have their origin in countertransference.

Yin-Yang out: when patients have been subjected to exhaustive and unavailing diagnostic investigations, they are said to have been worked up the Yin-Yang. The Yin-Yang out is the tendency to believe that nothing further can be done to throw light on the dark place where, and if, any definitive diagnosis resides for the patient, i.e., the physician is let out of further diagnostic effort. This may prove ultimately to be true, but to adopt the strategy at the outset is fraught with the chance of a variety of errors.

*The terms used to describe the various CDRs above are those by which they are commonly known in the psychology and medicine literature, as well as colloquially. Some, such as feedback sanction and hindsight bias, are indirect, reflecting more on processes that interfere with physician calibration. There is considerable overlap among CDRs, some being known by other synonyms. These, together with further detail and citations for the original work, are described in Croskerry P. Achieving quality in clinical decision making: cognitive strategies and detection of bias. Acad Emerg Med. 2002;9:1184–1204. The above list was based on material in that article and in an earlier work.27


various cognitive pitfalls, and second that specific forcing strategies be developed to counter them.

Much of clinical decision making, as Reason5 notes, is where "the cognitive reality departs from the formalized ideal." This cognitive reality is extremely vulnerable to error. The problem is that cognitive error is high-hanging fruit and difficult to get at, and there will be a tendency to pursue more readily attainable goals. There is a story about a jogger who came across a man on his knees under a streetlight one evening. He explained that he had dropped his wedding ring. The jogger offered to help him search, and he accepted. With no luck after a half hour, the jogger asked the man if he was sure he had dropped the ring at the place where they were searching. The man replied that he actually dropped it several yards away in the shadows. "Then why are we looking here?" asked the jogger. "Because the light is better," came the reply.

Real solutions to cognitive diagnostic errors lie in the shadows, and they will be difficult to find. One very clear goal in reducing diagnostic errors in medicine is to first describe, analyze, and research CDRs in the context of medical decision making, and to then find effective ways of cognitively debiasing ourselves and those whom we teach. Not only should we be able to reduce many cognitive diagnostic errors, but we may also be pleasantly surprised to find how many can be eliminated.

The author gratefully acknowledges support through a Senior Clinical Research Fellowship from the Faculty of Medicine, Dalhousie University, Halifax, Nova Scotia, Canada, and a grant (#P20HS11592-02) awarded by the Agency for Healthcare Research and Quality.

REFERENCES

1. Graber M, Gordon R, Franklin N. Reducing diagnostic errors in medicine: what's the goal? Acad Med. 2002;77:981–92.
2. Brennan TA, Leape LL, Laird NM, et al. Incidence of adverse events and negligence in hospitalized patients: results of the Harvard Medical Practice Study I. N Engl J Med. 1991;324:370–6.
3. Wilson RM, Runciman WB, Gibberd RW, et al. The Quality in Australian Health Care Study. Med J Aust. 1995;163:458–71.
4. Thomas EJ, Studdert DM, Burstin HR, et al. Incidence and types of adverse events and negligent care in Utah and Colorado. Med Care. 2000;38:261–2.
5. Reason J. Human Error. New York: Cambridge University Press, 1990.
6. Simon HA. Alternate visions of rationality. In: Arkes HR, Hammond KR (eds). Judgment and Decision Making: An Interdisciplinary Reader. New York: Cambridge University Press, 1986: 97–113.
7. Serious reportable events in patient safety: a National Quality Forum consensus report. Washington, DC: National Quality Forum, 2002.
8. Cosmides L, Tooby J. Consider the source: the evolution of adaptations for decoupling and metarepresentation. In: Sperber D (ed). Metarepresentation. Vancouver Studies in Cognitive Science. New York: Oxford University Press, 2001.
9. Croskerry P. Achieving quality in clinical decision making: cognitive strategies and detection of bias. Acad Emerg Med. 2002;9:1184–1204.

Table 1

Cognitive Debiasing Strategies to Reduce Diagnostic Error*

Develop insight/awareness: Provide detailed descriptions and thorough characterizations of known cognitive biases, together with multiple clinical examples illustrating their adverse effects on decision-making and diagnosis formulation.

Consider alternatives: Establish forced consideration of alternative possibilities, e.g., the generation and working through of a differential diagnosis. Encourage routinely asking the question: What else might this be?

Metacognition: Train for a reflective approach to problem solving: stepping back from the immediate problem to examine and reflect on the thinking process.

Decrease reliance on memory: Improve the accuracy of judgments through cognitive aids: mnemonics, clinical practice guidelines, algorithms, hand-held computers.

Specific training: Identify specific flaws and biases in thinking and provide directed training to overcome them: e.g., instruction in fundamental rules of probability, distinguishing correlation from causation, basic Bayesian probability theory.

Simulation: Develop mental rehearsal, "cognitive walkthrough" strategies for specific clinical scenarios to allow cognitive biases to be made and their consequences to be observed. Construct clinical training videos contrasting incorrect (biased) approaches with the correct (debiased) approach.

Cognitive forcing strategies: Develop generic and specific strategies to avoid predictable bias in particular clinical situations.

Make task easier: Provide more information about the specific problem to reduce task difficulty and ambiguity. Make available rapid access to concise, clear, well-organized information.

Minimize time pressures: Provide adequate time for quality decision-making.

Accountability: Establish clear accountability and follow-up for decisions made.

Feedback: Provide as rapid and reliable feedback as possible to decision makers so that errors are immediately appreciated, understood, and corrected, resulting in better calibration of decision makers.26

*Based on information from: Slovic and Fischhoff (1977),19 Fischhoff (1982),15 Arkes (1986),16 Plous (1993),23 Croskerry (2002),9 and Croskerry (2003).10


10. Croskerry P. Cognitive forcing strategies in clinical decision making. Ann Emerg Med. 2003;41:110–20.
11. Bogner MS (ed). Human Error in Medicine. New Jersey: Lawrence Erlbaum Associates, 1994.
12. Croskerry P, Wears RL. Safety errors in emergency medicine. In: Markovchick VJ, Pons PT (eds). Emergency Medicine Secrets, 3rd ed. Philadelphia: Hanley and Belfus, 2002: 29–37.
13. Kovacs G, Croskerry P. Clinical decision making: an emergency medicine perspective. Acad Emerg Med. 1999;6:947–52.
14. Data from the U.S. General Accounting Office, the Ohio Hospital Association, and the St. Paul (MN) Insurance Company, 1998. http://hookman.com/mp9807.htm. Accessed April 24, 2003.
15. Fischhoff B. Debiasing. In: Kahneman D, Slovic P, Tversky A (eds). Judgment under Uncertainty: Heuristics and Biases. New York: Cambridge University Press, 1982: 422–44.
16. Arkes HR. Impediments to accurate clinical judgment and possible ways to minimize their impact. In: Arkes HR, Hammond KR (eds). Judgment and Decision Making: An Interdisciplinary Reader. New York: Cambridge University Press, 1986: 582–92.
17. Nathanson S, Brockner J, Brenner D, et al. Toward the reduction of entrapment. J Appl Soc Psychol. 1982;12:193–208.
18. Schwartz WB, Gorry GA, Kassirer JP, Essig A. Decision analysis and clinical judgment. Am J Med. 1973;55:459–72.
19. Slovic P, Fischhoff B. On the psychology of experimental surprises. J Exp Psychol Hum Percept Perform. 1977;3:544–51.
20. Edwards W, von Winterfeldt D. On cognitive illusions and their implications. In: Arkes HR, Hammond KR (eds). Judgment and Decision Making: An Interdisciplinary Reader. New York: Cambridge University Press, 1986: 642–79.
21. Wolf FM, Gruppen LD, Billi JE. Use of a competing-hypothesis heuristic to reduce pseudodiagnosticity. J Med Educ. 1988;63:548–54.
22. Keren G. Cognitive aids and debiasing methods: can cognitive pills cure cognitive ills? In: Caverni JP, Fabre JM, Gonzales M (eds). Cognitive Biases. New York: Elsevier, 1990: 523–52.
23. Plous S. The Psychology of Judgment and Decision Making. Philadelphia: Temple University Press, 1993.
24. Yates JF, Veinott ES, Patalano AL. Hard decisions, bad decisions: on decision quality and decision aiding. In: Schneider S, Shanteau J (eds). Emerging Perspectives in Judgment and Decision Making. New York: Cambridge University Press, 2003.
25. Croskerry P. Cognitive forcing strategies in emergency medicine. Emerg Med J. 2002;19(suppl 1):A9.
26. Croskerry P. The feedback sanction. Acad Emerg Med. 2000;7:1232–38.
27. Hogarth RM. Judgment and Choice: The Psychology of Decision. Chichester, England: Wiley, 1980.
