
Mixed Results In The Safety Performance Of Computerized Physician Order Entry

By Jane Metzger, Emily Welebob, David W. Bates, Stuart Lipsitz, and David C. Classen

ABSTRACT Computerized physician order entry is a required feature for hospitals seeking to demonstrate meaningful use of electronic medical record systems and qualify for federal financial incentives. A national sample of sixty-two hospitals voluntarily used a simulation tool designed to assess how well safety decision support worked when applied to medication orders in computerized order entry. The simulation detected only 53 percent of the medication orders that would have resulted in fatalities and 10-82 percent of the test orders that would have caused serious adverse drug events. It is important to ascertain whether actual implementations of computerized physician order entry are achieving goals such as improved patient safety.

Many people have suggested that electronic health records represent essential infrastructure for the provision of safe health care in the United States. For several years, the Institute of Medicine, the Leapfrog Group, the National Quality Forum, and other national groups concerned about patient safety have recommended, in particular, widespread adoption of electronic health records with computerized physician order entry.1-4

Background On Decision-Support Tools

With computerized physician order entry, physicians and other licensed clinicians write their orders for hospitalized patients electronically. This recommendation is based in large part on demonstrations by pioneering organizations. The organizations have shown that important improvements in safety can be achieved when rules-based decision support aids in averting medication errors and adverse events by providing advice and warnings as physicians write orders using a specially programmed computer.5-7

In this application of clinical decision support, physicians are made aware of potential safety issues that can result when, for example, ampicillin is given to a patient with a known allergy to penicillin, or the dose being ordered for a pediatric patient is much higher than the therapeutic range for a child of this age and weight. Prescribing errors such as these can lead to anaphylaxis or seizures, which are known as adverse drug events, if the medications are actually administered. The goal of medication safety decision support in computerized physician order entry is to prevent these types of serious errors as the orders are being written.
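To make the mechanism concrete, the sketch below shows the kind of rule-based check described above, written in Python. The drug names, cross-sensitivity table, dose limit, and patient fields are all hypothetical illustrations, not the logic of any particular commercial system.

```python
# Hypothetical knowledge tables; a real system draws these from a drug
# database plus the patient's allergy list and demographics in the EHR.
CROSS_SENSITIVITY = {"ampicillin": {"penicillin"}}          # illustrative only
PEDIATRIC_MAX_DAILY_MG_PER_KG = {"acetaminophen": 75}       # illustrative only

def check_order(drug, daily_dose_mg, patient):
    """Return warnings for a single medication order (sketch, not a real rule set)."""
    warnings = []
    # Drug-to-allergy contraindication, including simple cross-sensitivity
    related = CROSS_SENSITIVITY.get(drug, set()) | {drug}
    if related & set(patient["allergies"]):
        warnings.append(f"{drug}: contraindicated by documented allergy")
    # Weight-based pediatric dose range check
    limit = PEDIATRIC_MAX_DAILY_MG_PER_KG.get(drug)
    if limit is not None and patient["age_years"] < 18:
        if daily_dose_mg > limit * patient["weight_kg"]:
            warnings.append(f"{drug}: daily dose exceeds pediatric weight-based limit")
    return warnings

patient = {"allergies": ["penicillin"], "age_years": 6, "weight_kg": 20}
print(check_order("ampicillin", 1000, patient))       # allergy warning
print(check_order("acetaminophen", 2000, patient))    # dose warning
```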

A study demonstrated that one in ten patients hospitalized in Massachusetts suffered an adverse drug event that could be prevented by decision-support tools in computerized physician order entry.8 The study spurred the passage of legislation requiring Massachusetts hospitals to implement computerized physician order entry by 2012 as a condition of licensure.9

More recently, meaningful use of computerized physician order entry and clinical decision support has been singled out as a requirement for hospitals to qualify for new financial incentives. These incentives will be offered under the health information technology (IT) stimulus provisions of the American Recovery and Reinvestment Act (ARRA) of 2009.

doi: 10.1377/hlthaff.2010.0160 | Health Affairs 29, no. 4 (2010): 655-663 | ©2010 Project HOPE, The People-to-People Health Foundation, Inc.

Jane Metzger ([email protected]) is a principal researcher at CSC Healthcare in Waltham, Massachusetts.

Emily Welebob is an independent consultant in Indianapolis, Indiana.

David W. Bates is division chief for general internal medicine at Brigham and Women's Hospital in Boston, Massachusetts.

Stuart Lipsitz is a researcher at Brigham and Women's Hospital.

David C. Classen is an associate professor of medicine at the University of Utah in Salt Lake City, and is also with CSC Healthcare.


Computerized physician order entry interacts with other applications in the suite of digital tools that constitute the inpatient electronic health record (for example, to obtain information on allergies and patients' weight) and is typically one of the later modules to be implemented. Hospitals' adoption of the computerized physician order entry module is increasing, but slowly.10,11 Several reports have suggested that the successful application of decision support achieved among pioneering organizations is not being replicated and that implementation can create new problems.12-15

This report summarizes results for sixty-two hospitals across the United States that used a new simulation tool to assess their use of medication safety decision support in electronic health records with computerized physician order entry.

Study Data And Methods

History Of The Assessment Tool The impetus for developing the assessment tool was initially the standard developed by the Leapfrog Group. This is an employer group that seeks to accomplish breakthroughs, or "big leaps," in hospital patient safety through a combination of public awareness and rewards to higher-quality providers. The group selected computerized physician order entry as one of the first three "leaps" in 2001.

There was accumulating evidence concerning the frequency, tragic consequences, and financial costs of adverse drug events in hospitalized patients, and computerized physician order entry and decision support had demonstrated the ability to help avert many of them.16

The Leapfrog standard includes two elements of meaningful use to ensure that computerized physician order entry has been implemented in such a way as to improve medication safety. According to the standard, physicians and other licensed providers must enter at least 75 percent of medication orders using computerized entry. Clinical decision support must also be able to avert at least 50 percent of common, serious prescribing errors.16
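As a rough illustration of how those two thresholds combine, here is a minimal Python check. The function name and the percentage inputs are assumptions made for illustration; the actual Leapfrog survey scoring is more detailed.

```python
def meets_leapfrog_cpoe_standard(pct_orders_entered_via_cpoe,
                                 pct_serious_errors_averted):
    """Check the two elements described above: at least 75 percent of
    medication orders entered through CPOE, and decision support that
    averts at least 50 percent of common, serious prescribing errors."""
    return (pct_orders_entered_via_cpoe >= 75.0
            and pct_serious_errors_averted >= 50.0)

# Illustrative values only: strong order-entry use, weak decision support
print(meets_leapfrog_cpoe_standard(80.0, 44.3))   # -> False
```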

Clinical decision support in this setting is the logic built into the computerized physician order entry system that, for example, checks to see if ampicillin has been ordered for a patient who is known to be allergic to penicillin. This tool was developed to specifically measure the ability of implemented electronic health record systems with computerized physician order entry to detect and avert these common yet serious prescribing errors in live hospital settings.

The tool was intentionally designed to give individual hospitals detailed and specific feedback on their performance and to give purchasers, through Leapfrog, an overall score for the hospital that can be used for benchmarking purposes.17,18

This assessment complements efforts by the Certification Commission for Health Information Technology (CCHIT) to evaluate the capabilities available in vendors' electronic medical record products "on the shelf." The assessment, in contrast, evaluates how products were implemented and are actually being used in hospitals.

The development of the tool was initially funded by the Robert Wood Johnson Foundation, the California HealthCare Foundation, and the Agency for Healthcare Research and Quality, and was completed in 2006. In April 2008 the assessment was incorporated into the Leapfrog Annual Safe Practices Survey for the first time, and hospitals completing the assessment received a feedback report. Beginning in 2009, assessment results were also factored into determining the extent to which the computerized physician order entry implementation met the Leapfrog standard.

Design Of The Assessment Tool The assessment methodology is modeled after tools that are commonly used in other industries. It mimics what happens when a physician writes an order for an actual patient in the implemented electronic health record with computerized physician order entry. But it uses "test patients" (in effect, fictitious patients created for purposes of the assessment) and test orders.

A group of experts on adverse drug events, as well as the use of decision support in computerized physician order entry to decrease adverse drug events, developed test orders that are judged likely to cause serious harm (rather than those with low potential for harm). The test orders belong to the categories of adverse drug events (such as drug-to-allergy or drug-to-diagnosis contraindication) that prior research shows cause the most harm to patients. In most cases, they are actual orders that have caused adverse drug events, taken from primary adverse drug event data collection studies. The assessment offers a one-time, cross-sectional look at whether decision support provides advice to a physician writing such an order.18,19

Decision support for this purpose is a set of tools or logic that can be integrated into the computerized physician order entry system to suggest appropriate orders (such as a dose calculator or a reminder to consider renal function) or to critique them once they have been entered, as through a message or an alert. The assessment gives credit for all of these forms of decision support as relevant advice or information.

Because of differences in the epidemiology of preventable adverse drug events, different versions of the assessment were designed for adult and pediatric inpatient settings.

Use Of The Assessment Tool A designated team in the hospital performs a self-assessment in the following fashion. First, the team downloads instructions and information profiles for ten to twelve test patients. Then the team downloads around fifty test orders, instructions, and observation sheets to be used in the assessment.

A participating physician enters test orders for the test patients into the local electronic health record and observes and notes any guidance provided by decision support, such as the calculated dose, a message or alert displayed, and so on. The team enters the results obtained for each test order (decision support received or not).17 The assessment tool instantly computes an overall score (percentage of test orders identified), as well as the score for the orders in each adverse drug event category. It then displays results to the testing team. The entire process takes no more than six hours, and many hospitals complete it more quickly.
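A minimal Python sketch of the scoring arithmetic just described follows. The category labels and sample entries are hypothetical stand-ins for the observation sheets, not the tool's actual data format.

```python
from collections import defaultdict

def score_assessment(results):
    """results: (adverse-drug-event category, detected?) per test order."""
    by_category = defaultdict(lambda: [0, 0])     # category -> [detected, total]
    for category, detected in results:
        by_category[category][1] += 1
        by_category[category][0] += int(detected)
    overall_detected = sum(d for d, _ in by_category.values())
    overall_total = sum(t for _, t in by_category.values())
    overall = 100.0 * overall_detected / overall_total
    per_category = {c: 100.0 * d / t for c, (d, t) in by_category.items()}
    return overall, per_category

# Hypothetical entries for a handful of test orders
results = [
    ("drug-allergy contraindication", True),
    ("drug-allergy contraindication", True),
    ("drug-drug interaction", False),
    ("inappropriate dose", False),
]
overall, per_category = score_assessment(results)
print(f"Overall score: {overall:.1f}%")            # 50.0%
print(per_category)
```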

During the development period (2002-6), multiple testing was performed at more than twenty-five different hospitals. The testing reflected all of the leading electronic health record and computerized physician order entry vendors' products and ensured that the test could effectively evaluate each product. Details of the reliability and validity of the assessment methodology are available in the Online Appendix.20

Analysis Between April and August 2008, eighty-one U.S. hospitals completed the version of the assessment for adult patients. Test orders and posted results were reviewed, and information potentially identifying patients was removed. Nine hospitals were eliminated because registration information indicated that computerized physician order entry was being used only in the emergency department rather than more broadly in the hospital.

In addition, ten hospitals were excluded because they exceeded a deception-analysis threshold based on standard gaming detection strategies (testing irregularities such as exceeding time limits or multiple false positive results). A review of excluded hospitals did not reveal results that would have skewed the findings in this study. The resulting sample contained sixty-two hospitals.
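The article does not publish the actual deception-analysis criteria, so the sketch below only illustrates the general idea of flagging testing irregularities; the field names and cutoffs are assumptions for illustration.

```python
def flag_testing_irregularities(submission,
                                max_hours=6.0,
                                max_false_positive_alerts=2):
    """Return reasons a hospital's submission might warrant exclusion review.
    Thresholds and field names are illustrative assumptions, not the
    study's actual deception-analysis criteria."""
    reasons = []
    if submission["hours_to_complete"] > max_hours:
        reasons.append("exceeded time limit")
    if submission["false_positive_alerts"] > max_false_positive_alerts:
        reasons.append("multiple false positive results")
    return reasons

print(flag_testing_irregularities(
    {"hours_to_complete": 7.5, "false_positive_alerts": 3}))
```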

The categories of adverse drug events addressed by the test orders include ones for which decision-support tools are fairly straightforward to implement (drug-to-drug or drug-to-allergy interactions, therapeutic duplication, inappropriate single dose, and inappropriate route of administration). The assessment also included test orders that require more effort to configure or customize: for example, use of an inappropriate daily dose or weight-based dose, drug-to-age or drug-to-diagnosis contraindication, contraindications based on renal status or other metabolic abnormalities indicated by laboratory tests, or lack of monitoring.

The framework for basic and advanced decision support was based on prior work on clinical decision support in computerized physician order entry,21 and the categories are consistent with recent research on preventable adverse drug events.8

Statistical Analysis The basic unit for the analyses was the hospital. Overall scores for test orders and the scores for orders assigned to the basic group were both found to be approximately normally distributed. However, scores for test orders in the advanced group were slightly right-skewed. Thus, we present additional statistics to aid in interpretation. A fuller description of our analysis is available in the Online Appendix.20

Tests based on standard assumptions about the population, such as parametric tests, gave results almost identical to those of tests that made no such assumptions (nonparametric tests). Thus, for simplicity, parametric tests are displayed (see the Online Appendix for details).20

The total hospital scores were found to be approximately normally distributed. Thus, we assumed a linear regression model with the overall score as the dependent variable, estimating the relationships between the outcome and covariates using ordinary least squares regression. Covariates that we identified as having a possible influence on total hospital score were vendor (there were nine), teaching status, hospital size groupings (number of beds), and whether or not the hospital was part of a health system.
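A sketch of this hospital-level regression in Python (using statsmodels rather than the SAS 9.2 the authors used) is shown below; the data frame is synthetic and the variable names are assumptions made for illustration.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 62  # one record per hospital, as in the study sample

# Synthetic stand-in for the hospital-level data set (all values invented)
df = pd.DataFrame({
    "overall_score": rng.uniform(10, 82, n),                   # percent of test orders detected
    "vendor": rng.choice([f"V{i}" for i in range(1, 10)], n),  # nine vendors
    "teaching": rng.choice([0, 1], n),                         # teaching status
    "bed_size": rng.choice(["small", "medium", "large"], n),   # size grouping
    "in_system": rng.choice([0, 1], n),                        # part of a health system
})

# Ordinary least squares with the overall score as the dependent variable
model = smf.ols(
    "overall_score ~ C(vendor) + teaching + C(bed_size) + in_system",
    data=df,
).fit()

print(model.summary())
print("Overall R-squared:", round(model.rsquared, 2))
```

With real data, the vendor, size grouping, and system membership would come from the hospitals' registration information rather than being generated.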

We tested our linear regression model for appropriateness using techniques that are described in the Online Appendix.20 We also used standard statistical techniques to determine the percentage of the overall variation explained by each factor.22

For dichotomous outcome variables, such as test order detected (yes or no), percentages were calculated. However, to account for the hierarchical nature of the order data, or the test orders nested within hospitals, the 95 percent binomial confidence intervals for percentages were adjusted using the approach described by David Williams.23
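As a rough illustration of adjusting a pooled binomial confidence interval for clustering, the sketch below inflates the variance by a simple heterogeneity (design-effect) factor. This is a simplified stand-in, not the iterative Williams (1982) estimator the authors applied, and the hospital counts are hypothetical.

```python
import math

def cluster_adjusted_ci(detected_per_hospital, tested_per_hospital, z=1.96):
    """Pooled detection percentage with a variance-inflated binomial CI.
    Simplified design-effect adjustment, not the Williams (1982) estimator."""
    k = len(detected_per_hospital)
    n = sum(tested_per_hospital)
    x = sum(detected_per_hospital)
    p = x / n
    # Pearson heterogeneity of hospital-level proportions around the pooled rate
    chi2 = sum((xi - ni * p) ** 2 / (ni * p * (1 - p))
               for xi, ni in zip(detected_per_hospital, tested_per_hospital))
    phi = max(1.0, chi2 / (k - 1))                 # overdispersion factor, >= 1
    se = math.sqrt(phi * p * (1 - p) / n)          # inflated binomial standard error
    return 100 * p, 100 * (p - z * se), 100 * (p + z * se)

# Hypothetical counts: detected fatal-order alerts out of four test orders per hospital
print(cluster_adjusted_ci([2, 3, 1, 2, 4], [4, 4, 4, 4, 4]))
```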

All analyses were conducted using the statistical software package SAS 9.2. All tests were two-tailed, and a p value less than 0.05 (therefore not likely to be due to chance) was considered statistically significant.

Study Results

The types of hospitals included in the sample were broadly representative of larger U.S. hospitals (Exhibit 1). Nearly two-thirds were teaching hospitals. The higher representation of teaching hospitals and lower representation of smaller hospitals is consistent with the current pattern of adoption of computerized physician order entry in hospitals.10,11 Among the sixty-two hospitals, all but one reported using electronic health record applications including computerized physician order entry from one of seven commercial vendors.

Individual Hospitals Scores for individual hospitals ranged from 10 percent to 82 percent of test orders detected. The scores for the top 10 percent, or six hospitals, ranged from 71 percent to 82 percent. Scores for the six hospitals with the lowest scores ranged from 10 percent to 18 percent (Exhibit 2).

Mean hospital scores (Exhibit 3) were higher for orders that would lead to adverse drug events that can be addressed by basic decision support (61 percent) than for those requiring more advanced decision support (25 percent).

Aggregate Results: Adverse Drug Events When results for all hospitals were pooled, the adverse drug event category detected most reliably was drug-to-allergy contraindication. Much higher scores were obtained for each of the categories addressed by basic clinical decision support than for those requiring advanced tools (Exhibit 4).

Drug-to-diagnosis contraindication includes pregnancy, which was also analyzed separately. These potential adverse drug events were detected only 15 percent of the time.

The set of test orders for each hospital includes four that are judged to result in patient fatality. When results for this subset were analyzed, we found that 47 percent (95 percent confidence interval: 36.9-57.6) were not detected by the decision support in use in these hospitals. Although hospitals do have pharmacy and nursing review processes in place that sometimes catch orders like these before the medication reaches the patient, these medication orders are far outside safe limits and would never be appropriate physician orders.

Contributing Factors The information available for exploring contributing factors was limited to the vendor software solution in use, teaching status, hospital size by number of beds, and whether or not the hospital was part of a health system. We assessed the relationship between performance on the assessment and these factors.

EXHIBIT 1

Characteristics Of Hospitals Participating In The Study Of Computerized Physician Order Entry (CPOE), 2008

Columns: Characteristic; Participating hospitals (number, percent); All U.S. hospitals (%); Hospitals with CPOE (%). Characteristics include hospital size (beds). (Data rows not reproduced in this copy.)


High-low scores for hospitals using the same computerized physician order entry software product ranged by as much as 40-65 percent (Exhibit 2). Some hospitals using each product detected at least 50 percent of the potential adverse drug events. The six top-performing hospitals used six different software products: one homegrown solution and five vendor products.

In a multiple regression model, vendor choice was significantly correlated with performance (p = 0.009, or not likely to be due to chance). This means that there is good statistical evidence to suggest that choice of vendor does have some positive effect on performance. However, vendor choice accounted for only 27 percent of the total variation that we observed in performance.

EXHIBIT 2

Hospital Scores For Detection Of Test Orders That Would Cause An Adverse Drug Event In An Adult Patient, According To The Software Product (Vendor) Implemented

Vertical axis: hospital score (percent). SOURCE: Authors' analysis. (Chart not reproduced in this copy.)

EXHIBIT 3

Hospital Mean Scores For Detecting Test Orders For Adult Patients Corresponding To Adverse Drug Event Categories Addressed By Basic And Advanced Decision Support

Percent of test orders detected (N = 62 hospitals):

Basic (relatively easy to implement)a: mean 61.4 (SE 2.4); median 61.1b; interquartile range 53.9-76.2
Advanced (requires more configuration or customization)c: mean 24.8 (SE 2.6); median 18.8b; interquartile range 5.9-38.9
Overall score: mean 44.3 (SE 2.3); median 41.7; interquartile range 31.6-57.1

SOURCE: Authors' analysis. NOTES: SE is standard error. The interquartile range is the range containing the middle 50 percent of the observations (25 percent below the median, 25 percent above). aDrug-to-drug or drug-to-allergy contraindication, inappropriate single dose, therapeutic duplication, and inappropriate route. bp < 0.0001 using a paired t-test for Basic Score = Advanced Score. cInappropriate cumulative (daily) dose, inappropriate dose (patient weight), age or diagnosis contraindication, contraindication based on renal function or other condition indicated by laboratory tests, lack of monitoring.


Teaching status also correlated significantly with performance (p = 0.007, very unlikely to be due to chance), accounting for 10 percent of the observed variation in performance. But hospital size and being part of a hospital system did not correlate significantly with performance. At least as far as we can detect statistically, hospital size and being part of a hospital system do not influence performance one way or the other.

We tested for interactions between vendor and all other variables, and none were significant (p > 0.2). Finally, although we were able to account for only 39 percent of the observed variation in performance (overall R2 = 0.39), standard tests (goodness-of-fit statistics and assessments) indicated that our model was appropriate for our study, as detailed in the Online Appendix.20

Discussion

Many of the benefits from inpatient electronic health records, and the computerized physician order entry module in particular, come from decision support.24 In this study, we found wide variation in the ability of implemented computerized physician order entry decision support to detect medication orders judged likely to cause serious harm to adult patients.

Many hospitals performed poorly, and the mean score was only 44 percent of potential adverse drug events detected. However, top-performing hospitals in this sample achieved scores of 70-80 percent or greater. To achieve these higher results, the hospitals have implemented advanced clinical decision support, as well as basic tools, and their performance provides a benchmark for the continuing efforts of lower-performing hospitals.

A comparison of aggregate scores with the findings from a recent study of adverse drug events, based on chart reviews in six community hospitals not using the computerized physician order entry module of the inpatient electronic health record,8 showed that decision support in the sixty-two hospitals is doing a much better job detecting adverse drug events that occur infrequently than those that occur more frequently.

The categories in our study with the three highest detection rates (drug-to-allergy, drug-to-drug, and duplicate medication) together contributed only 7 percent of the adverse drug events in the community hospital chart review study. Fifty-five percent of the adverse drug events in the community hospital chart review study required considering laboratory results or patients' age in determining medication appropriateness or dosing. For these categories in aggregate, corresponding test orders were detected only 20.6 percent of the time, not very good odds from the patient's perspective.

Different Products In Use The high variability found, including among hospitals using the same electronic health record software product, emphasizes the importance of this type of assessment to gauge the extent to which decision support is being used. In fact, this is the rationale used by the Leapfrog Group and the National Quality Forum in including the use of the assessment tool in their recommended safe practices.4,16

EXHIBIT 4

Pooled Hospital Scores For Detecting Test Orders For Adults In Various Adverse Drug Event Categories

Percent detected (lower-upper 95 percent confidence interval), N = 62 hospitals:

Addressed by basic clinical decision supporta
Drug-allergy contraindication: 83.3 (77.7-87.8)
Inappropriate single dose: 46.4 (37.6-56.6)
Therapeutic duplication: 54.5 (43.7-64.9)
Drug-drug interaction: 52.4 (43.4-61.3)
Inappropriate route: 65.3 (55.7-72.5)

Addressed by advanced clinical decision supportb
Inappropriate cumulative (daily) dose: 39.1 (28.9-50.4)
Inappropriate dosing (patient weight): 36.7 (27.9-46.4)
Age contraindication: 14.1 (7.9-24.0)
Labs, creatinine: 20.2 (12.9-30.1)
Labs, other: 26.1 (18.7-35.1)
Drug-diagnosis contraindication: 15.0 (9.9-22.1)
Corollary orders (monitoring): 27.0 (19.7-35.7)

SOURCE: Authors' analysis. NOTES: aImplementation of applicable decision support is relatively straightforward. bImplementation of applicable decision support requires more effort to configure or customize.


Hospital boards and executive teams who are investing in these systems, practicing physicians who adopt the technology in their routine work to improve patient safety, and external stakeholders requiring the use of electronic health records with computerized physician order entry or providing incentives for adoption all need a way to gauge progress in applying decision support to improve medication safety. This becomes acutely important as hospitals attempt to meet meaningful-use requirements to qualify for federal stimulus incentives.

Assessing Patient Safety Given the large investment in achieving meaningful use nationwide, it will be important to ascertain whether actual implementation is achieving important goals such as improved patient safety. To our knowledge, this test is the first such objective evaluation of electronic health record systems in actual use.

The assessment measures the extent to which clinical decision support is providing some form of advice or an alert in response to medication orders that would cause an adverse drug event. The design and scope of decision-support capabilities in electronic health record software products vary, and local hospital configuration and customization are always required.

Although there was a relationship between the product involved and performance on the assessment, only 27 percent of the variation is associated with using different electronic health record products. This suggests that other factors had a bigger influence on the wide variation in performance among the hospitals. Although results are presented for one homegrown system, this is not a sufficient sample from which to draw any general conclusions.

In most hospitals, the use of decision support grows over time. At the outset, managing it is a new process.6 Including too much, poorly designed clinical decision support initially can even contribute to lack of computerized physician order entry adoption.25

Exploring Variability In Use Several factors are contributing to the variability observed. These include the completeness and ease of applying the decision-support tool sets in the electronic health record software; relevant knowledge and experience within the hospital; the availability and commitment of staff resources; and whether the use of clinical decision support began recently or has been honed over many years.

There are multiple possible explanations for the observed correlation between hospital teaching status and performance. These include such factors as research interest and having more staff resources to invest. However, these could not be explored further in this study.

For specific categories of adverse drug events, there are also multiple possible explanations for variability among hospitals. For drug-to-diagnosis contraindications, for example, many hospitals are likely not yet applying decision support because there is no constantly updated electronic problem list maintained by physicians for patients during their hospital stay.

The assessment also does not provide insight into whether or not the information or advice being presented is followed in practice, another prerequisite for improving medication safety.26

A better understanding of all of these factors is needed so that strategies for speeding progress can be devised.

Some reports have suggested one remedy: providing assistance to hospitals in using decision support effectively.3,21,27 This assessment tool provides guidance to the hospital in the form of a scorecard. Several publications exist to guide hospitals in their planning for and decisions about clinical decision support.21,26,28,29 Also, some vendors of computerized physician order entry software provide clinical decision support "starter sets" or facilitate sharing among customers.

Work is now under way to begin to develop libraries of decision-support rules and practical advice, although it is in early stages.

Study Limitations This study has several limitations. Like all voluntary surveys, the study is subject to possible response bias. It is likely that the hospitals completing the assessment were more interested than many of their peers were in clinical decision support, because it was initially offered as a learning exercise instead of being required for the annual Leapfrog survey.

Another limitation is that the sixty-two hospitals in the study represent about 8 percent of U.S. hospitals with the computerized physician order entry module of the inpatient electronic health record in use.10 Thus, they might not be representative. The fact that this is the first test of its kind means that there is no "gold standard" for comparison of these results.


However, during development of the assessment tool, the reliability and validity were extensively evaluated and found to be very high.

Finally, because results are self-reported, hospitals may have reported better performance than was actually assessed. However, it should be noted that public reporting of overall scores, which could influence hospitals' behavior, was not implemented until 2009, and hospitals were asked to participate in part to aid them with improving their decision support. To address the potential for gaming, several standard safeguards were built into the test to detect patterns of use that suggest that gaming may be going on. Any assessment that included these characteristics was excluded from analysis.

Conclusions

In sixty-two hospitals using an inpatient electronic health record with computerized physician order entry, we found significant variability in the use of decision support to detect and provide advice or an alert concerning a medication order that would result in serious harm to an adult patient. Some hospitals performed very well, while others performed very poorly. In addition, the studied hospitals as a group were using basic decision support far more than the more advanced tools needed to detect types of orders that are major contributors to adverse drug events in chart-review studies.

These findings point to the importance of evaluations of the use of clinical decision support by hospitals to help guide their continuing efforts to improve medication safety. In addition, incentives to hospitals relating to computerized physician order entry should include some type of demonstration that clinical decision support is actually being employed, instead of being based solely on whether computerized physician order entry is in use.

The broader use of this type of assessment of meaningful electronic health record use should be explored for other software applications used in direct clinical care.

The authors thank Peter Kilbridge, Fran Turisco, and the project expert advisers, as well as individuals in many hospitals, health systems, and software vendor companies, for their involvement and assistance in some phase of the development and testing of the assessment tool used in the study. They also thank the Leapfrog Group for access to the data for analysis. The development of the assessment tool used in this study was supported by grants from the California HealthCare Foundation, the Robert Wood Johnson Foundation, and the Agency for Healthcare Research and Quality (Contract no. 290-04-0016, Subcontract no. 6275-FCG-01). These organizations did not sponsor and were not involved in the reported analysis or preparation of the manuscript. Four of the authors (Jane Metzger, Emily Welebob, David W. Bates, and David C. Classen) were involved in the development of the assessment tool. The content is solely the responsibility of the authors and does not necessarily represent the official views of the funding agency.

    NOTES

1. Institute of Medicine. Preventing medication errors: quality chasm series. Washington (DC): National Academies Press; 2007.

2. Institute of Medicine. Crossing the quality chasm: a new health system for the 21st century. Washington (DC): National Academies Press; 2001.

3. Aspden P, Corrigan JM, Wolcott J, Erickson SM. Patient safety: achieving a new standard of care. Washington (DC): National Academies Press; 2003.

4. National Quality Forum. Safe practices for better healthcare: a consensus report [Internet]. Washington (DC): National Quality Forum; 2003 [cited 2010 Feb 21]. Available from: http://www.ahrq.gov/qual/nqfpract.pdf

5. Bates DW, Leape LL, Cullen DJ, Laird N, Petersen LA, Teich JM, et al. Effect of computerized physician order entry and a team intervention on prevention of serious medication errors. JAMA. 1998;280(15):1311-6.

6. Evans RS, Pestotnik SL, Classen DC, Clemmer TP, Weaver LK, Orme JF Jr., et al. A computer-assisted management program for antibiotics and other anti-infective agents. N Engl J Med. 1998;338(4):232-8.

7. Teich JM, Merchia PR, Schmiz JL, Kuperman GJ, Spurr CD, Bates DW. Effects of computerized physician order entry on prescribing practices. Arch Intern Med. 2000;160(18):2741-7.

8. Adams M, Bates DW, Coffman G, Everett W. Saving lives, saving money: the imperative for computerized physician order entry in Massachusetts hospitals [Internet]. Boston (MA): Massachusetts Technology Collaborative and New England Healthcare Institute; 2008 [cited 2010 Feb 21]. Available from: http://web3.streamhoster.com/mtc/cpoe20808.pdf

9. Getz L. On the right track in Massachusetts. For the Record [serial on the Internet]. 2008 Nov 11;20(23) [cited 2010 Feb 21]. Available from: http://www.fortherecordmag.com/archives/ftr_111008p16.shtml

10. Pedersen CA, Gumpper KF. ASHP survey on informatics: assessment of the adoption and use of pharmacy informatics in U.S. hospitals, 2007. Am J Health Syst Pharm. 2008;65:2244-64.

11. American Hospital Association. Continued progress: hospital use of information technology [Internet]. Chicago (IL): AHA; 2007 [cited 2010 Feb 21]. Available from: http://www.aha.org/aha/content/2007/pdf/070227-continuedprogress.pdf

12. Nebeker JR, Hoffman JM, Weir CR, Bennett CL, Hurdle JF. High rates of adverse drug events in a highly computerized hospital. Arch Intern Med. 2005;165(10):1111-6.

13. Walsh KE, Landrigan CP, Adams WG, Vinci RJ, Chessare JB, Cooper MR, et al. Effect of computer order entry on prevention of serious medication errors in hospitalized children. Pediatrics. 2008;121(3):e421-7.

14. Koppel R, Metlay JP, Cohen A, Abaluck B, Localio AR, Kimmel SE, et al. Role of computerized physician order entry systems in facilitating medication errors. JAMA. 2005;293(10):1197-203.


15. Han YY, Carcillo JA, Venkataraman ST, Clark RS, Watson RS, Nguyen TC, et al. Unexpected increased mortality after implementation of a commercially sold computerized physician order entry system. Pediatrics. 2005;116(6):1506-12.

16. Leapfrog Group. Fact sheet: computerized physician order entry. Washington (DC): Leapfrog Group; 2009 Mar 3 [cited 2010 Feb 21]. Available from: http://www.leapfroggroup.org/media/file/FactSheet_CPOE.pdf

17. Metzger JB, Welebob E, Turisco F, Classen DC. The Leapfrog Group's CPOE standard and evaluation tool. Patient Safety and Quality Healthcare [serial on the Internet]. 2008 Jul/Aug: p. 22-5 [cited 2010 Feb 21]. Available from: http://www.psqh.com/julaug08/cpoe.html

18. Kilbridge P, Welebob E, Classen DC. Overview of the Leapfrog Group evaluation tool for computerized physician order entry [Internet]. Lexington (MA): Leapfrog Group and First Consulting Group; 2001 Dec [cited 2010 Feb 21]. Available from: http://www.leapfroggroup.org/media/file/Leapfrog-CPOE_Evaluation2.pdf

19. Kilbridge PM, Welebob EM, Classen DC. Development of the Leapfrog methodology for evaluating hospital implemented inpatient computerized physician order entry systems. Qual Saf Health Care. 2006;15(2):81-4.

20. The Online Appendix can be accessed by clicking on the Online Appendix link in the box to the right of the article online.

21. Kuperman G, Bobb A, Payne TH, Avery AJ, Gandhi TK, Burns G, et al. Medication-related clinical decision support in computerized provider order entry systems: a review. J Am Med Inform Assoc. 2007;14(1):29-40.

22. Lin DY, Wei LJ, Ying Z. Model-checking techniques based on cumulative residuals. Biometrics. 2002;58(1):1-12.

23. Williams DA. Extra-binomial variation in logistic linear models. Appl Stat. 1982;31:144-8.

24. Kaushal R, Shojania KG, Bates DW. Effects of computerized physician order entry and clinical decision support systems on medication safety: a systematic review. Arch Intern Med. 2003;163(12):1409-16.

25. Connolly C. Cedars-Sinai doctors cling to pen and paper. Washington Post. 2005 Mar 21. p. A01.

26. Bates DW, Kuperman GJ, Wang S, Gandhi T, Kittler A, Volk L, et al. Ten commandments for effective clinical decision support: making the practice of evidence-based medicine a reality. J Am Med Inform Assoc. 2006;13:523-30.

27. Osheroff JA, Teich JM, Middleton BF, Steen EB, Wright A, Detmer DE. A roadmap for national action on clinical decision support. Bethesda (MD): American Medical Informatics Association, CDS Roadmap Steering Committee; 2006 Jun 13 [cited 2008 Sep 14]. Available from: https://www.amia.org/files/cdsroadmap.pdf

28. Osheroff JA, Pifer EA, Sittig DF, Jenders RA, Teich JM. Clinical decision support implementers' workbook. 2nd ed. Chicago (IL): Healthcare Information and Management Systems Society; 2004 [cited 2010 Mar 9]. Available from: http://www.himss.org/cdsworkbook

29. Healthcare Information and Management Systems Society. Improving outcomes with clinical decision support: an implementer's guide. Chicago (IL): HIMSS; 2005.
