
From Big Data to Real Policy: Making EHR Data Matter in Health Care

R. Adams Dudley, MD, MBA
Professor of Medicine and Health Policy
Director, UCSF Center for Healthcare Value

Twitter: @RAdamsDudleyMD

Financial disclosures

No relevant financial disclosures

Educational Objectives

• To learn how measurements made in the health care setting become policy

• To understand how this came to be—then faded away—in California intensive care units (ICUs)

• To recognize how new versions of these tools should be developed to increase the probability of uptake


How I Think about My Research

My patient: The health care system

Its diagnoses:
• Quality is variable,
• Cost is worse than that,
• But at least we don't insure everyone.

How Payors Try To Influence The System

Many of the weaknesses of the system reflect the history of how care is paid for:

• Payors have been told that doctors know better than they do what the patient needs,

• They have therefore been told to pay doctors for the work they do, that is, fee-for-service.

How Payors Try To Influence The System

Payors at the national level have only recently recognized that fee-for-service incentivizes volume, not quality or value:

• Starting to use incentives and public reporting to stimulate performance

• In California, this started in 2004

California History with Incentives: How I Succeeded and Then Failed

• The California Hospital Assessment and Reporting Task Force was founded in 2004: representatives from all stakeholder groups, including insurers, hospitals, doctors and nurses, employers/labor, and consumers


History: Built ICU Outcome Models Using Hand-Collected Data


…and then to a highly accurate, multi-hospital model.

[Figure: example model output for an individual patient, shown as a patient risk of 32.5%.]

History: Built ICU Outcome Models Using Hand-Collected Data


Yields pretty good predictions (AUC for mortality prediction 0.82-0.89), but data collection was very expensive
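To make the evaluation concrete, here is a minimal sketch of the kind of model this describes: a logistic regression on a few hand-collected admission variables, scored by AUC for hospital mortality. This is not the CHART model itself; the file name and column names are hypothetical.

```python
# Minimal sketch (not the actual CHART model): logistic regression on a few
# hand-collected admission variables, evaluated by AUC for hospital mortality.
# File and column names (age, worst_gcs, admit_lactate, mech_vent) are hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

icu = pd.read_csv("icu_admissions.csv")           # hypothetical hand-collected data
X = icu[["age", "worst_gcs", "admit_lactate", "mech_vent"]]
y = icu["died_in_hospital"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
pred = model.predict_proba(X_test)[:, 1]          # predicted probability of death
print(f"AUC: {roc_auc_score(y_test, pred):.2f}")  # the slides report 0.82-0.89 for such models
```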

Much More Variation in Mortality Rates than Expected by Chance in 2006 Pilot

[Figure: Observed ICU Deaths/Expected ICU Deaths (O/E ratio, y-axis 0 to 2) at each of the 110 hospitals that submitted data on ≥100 patients; O/E < 1 indicates low mortality, O/E > 1 indicates high mortality.]

3-fold difference in death rates between red oval (worse) and green oval (better) hospitals.

Source: www.CalHospitalCompare.org, run by the California Hospital Assessment and Reporting Taskforce (CHART), CHCF, and UCSF
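For readers unfamiliar with O/E ratios: each hospital's observed deaths are divided by the deaths the risk model expected given its case mix. Below is a minimal sketch of that calculation, assuming a hypothetical table of patient-level predictions; it is not the CHART code.

```python
# Minimal sketch of observed/expected (O/E) mortality ratios per hospital.
# Assumes a hypothetical DataFrame with one row per patient:
#   hospital_id, died (0/1), predicted_risk (model probability of death).
import pandas as pd

patients = pd.read_csv("icu_predictions.csv")      # hypothetical file

oe = (patients
      .groupby("hospital_id")
      .agg(observed=("died", "sum"),
           expected=("predicted_risk", "sum"),
           n_patients=("died", "size")))
oe = oe[oe["n_patients"] >= 100]                   # the pilot required ≥100 patients
oe["o_over_e"] = oe["observed"] / oe["expected"]   # <1 low mortality, >1 high mortality
print(oe.sort_values("o_over_e"))
```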


California Hospitals Improve Outcomes: Risk-adjusted ICU Mortality Rate Fell >2% in Three Years

[Figure: Risk-adjusted ICU mortality by year, 2007-2010; y-axis from 10.0% to 14.0%.]


What Happened Next?


What Happens When You Lose Track of the Incentives?

You become irrelevant


Electronic ICU Outcomes Model (e-ICOM)

An eight-year scramble to use big data and machine learning to climb back to relevance


Big Data Example from ICU Severity of Illness Models

Old way: Making 'big data' little. In the past, ICU models would take the "single worst value" of the Glasgow Coma Scale (GCS), which is the same in these 3 patients. But their prognoses are very different.

New way: Making 'big data' big again. If data collection and analysis are cheap and easy, use all GCS values and calculate the change in GCS over time.
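To make the old-way/new-way contrast concrete, here is a minimal sketch of deriving trajectory features from repeated GCS measurements. It is not the e-ICOM implementation; the file layout and column names are assumptions.

```python
# Minimal sketch: turn a series of GCS measurements per ICU stay into features.
# Assumes a hypothetical file with columns stay_id, charttime, gcs.
import pandas as pd

gcs = pd.read_csv("gcs_measurements.csv", parse_dates=["charttime"])
gcs = gcs.sort_values(["stay_id", "charttime"])

features = gcs.groupby("stay_id").agg(
    gcs_worst=("gcs", "min"),      # the "old way": single worst value
    gcs_first=("gcs", "first"),
    gcs_last=("gcs", "last"),
    gcs_mean=("gcs", "mean"),
    n_obs=("gcs", "size"),
)
# the "new way": use the trajectory, e.g. change from first to last measurement
features["gcs_delta"] = features["gcs_last"] - features["gcs_first"]
print(features.head())
```

Patients with the same worst value but different gcs_delta (improving versus deteriorating) now look different to the model, which is exactly the distinction the slide describes.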


Estimating Severity of Illness/Risk of Death in an ICU Patient Sample


Using the "Bigness" of the Data: As We Use More of the Lab and Vital Sign Observations, the Severity Model Improves
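One way to see this effect, sketched under assumed data rather than the study's own pipeline, is to refit the same mortality model while including more summary statistics of each lab and vital sign series and tracking cross-validated AUC. The helper and file names are hypothetical.

```python
# Minimal sketch: compare models built from a single summary per lab/vital sign
# against models that add richer summaries of the full time series.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

obs = pd.read_csv("icu_observations.csv")    # stay_id, item (e.g. "hr", "gcs"), value
outcomes = pd.read_csv("icu_outcomes.csv")   # stay_id, died

def build_features(obs, stats):
    """One row per stay, with the requested summary stats for each item."""
    wide = obs.groupby(["stay_id", "item"])["value"].agg(stats).unstack("item")
    wide.columns = ["_".join(col) for col in wide.columns]
    return wide.fillna(0)

for stats in (["min"], ["min", "max"], ["min", "max", "mean", "std", "count"]):
    X = build_features(obs, stats)
    y = outcomes.set_index("stay_id").loc[X.index, "died"]
    auc = cross_val_score(LogisticRegression(max_iter=1000), X, y,
                          scoring="roc_auc", cv=5).mean()
    print(f"stats={stats}: cross-validated AUC = {auc:.3f}")
```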


Conclusion: Using Big Data Improves Severity Assessments in Ways that Are Pretty Intuitive


Adding Natural Language Processing and Machine Learning Makes It Even Better


Electronic ICU Outcomes Model (e-ICOM)

Most importantly: The approach is consistent with CMS' directive to pull performance measures from the EHR and with hospitals' insistence on low data collection costs


Overcoming the "Black Box": Which words in the text have predictive power?

[Figure: individual words plotted on a scale from − (favors survival) to + (favors mortality); weights give the strength of association with patient mortality.]
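A common way to produce word-level weights like these, without claiming it is the exact e-ICOM pipeline, is a linear classifier over a bag-of-words representation of the notes; its coefficients can then be ranked by sign and magnitude. The data layout below is hypothetical.

```python
# Minimal sketch: bag-of-words logistic regression over clinical notes, then
# rank words by coefficient (negative favors survival, positive favors mortality).
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

notes = pd.read_csv("icu_notes.csv")             # hypothetical: stay_id, note_text, died

vec = TfidfVectorizer(min_df=20, stop_words="english")
X = vec.fit_transform(notes["note_text"])
clf = LogisticRegression(max_iter=1000).fit(X, notes["died"])

weights = pd.Series(clf.coef_[0], index=vec.get_feature_names_out())
print("Favors mortality:", weights.nlargest(10).index.tolist())
print("Favors survival:", weights.nsmallest(10).index.tolist())
```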


Topic Modeling: Interpretation and contribution of human-understandable concepts
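The slides do not name the topic-modeling algorithm, so the sketch below uses latent Dirichlet allocation (LDA) as one standard choice, applied to the same hypothetical notes table as above. The per-stay topic proportions (lda.transform(counts)) could then serve as clinician-interpretable features in the mortality model.

```python
# Minimal sketch of topic modeling over clinical notes with LDA; whether e-ICOM
# used LDA specifically is an assumption, and the data are hypothetical.
import pandas as pd
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

notes = pd.read_csv("icu_notes.csv")                  # hypothetical: stay_id, note_text

vec = CountVectorizer(min_df=20, stop_words="english")
counts = vec.fit_transform(notes["note_text"])

lda = LatentDirichletAllocation(n_components=20, random_state=0).fit(counts)
terms = vec.get_feature_names_out()
for k, topic in enumerate(lda.components_):
    top_words = [terms[i] for i in topic.argsort()[-8:][::-1]]  # top 8 words per topic
    print(f"Topic {k}: {', '.join(top_words)}")                 # reviewed by clinicians
```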



Sensitivity analysis for the modeling process: Extensive overlap in both categories and specific terms



Making sense of topics: Clinician-understandable topics lend intuitive meaning to "black box" predictions


Implications for you

• It doesn't matter how good your methods are if your product doesn't fit in with the business interests of the stakeholders

• Although you need leading clinicians on your team to get payors to believe that what you are doing is clinically appropriate, that is still not enough for your work to have an impact (and, hence, a market)

• How providers get paid is where you should focus:
  - Not on anything insurers or government say they want…unless they pay for it
  - Not on what consumers want

• This applies to my team, too: The methods I’ve described will only matter if CMS or insurers re-adopt incentives addressing critical care.
