TRANSCRIPT
Learning Healthcare Systems and Driving Changes in Quality of Care
David Flum, MD, MPH, Professor of Surgery, Health Services, and Pharmacy
University of Washington
Strategies for Improving Surgical Quality
• Selective Referral (e.g., COE)
• Process Compliance (e.g., P4P, SCIP measures)
• Outcomes Feedback (e.g., NSQIP, STS)
• Regional CQI and Learning Systems (e.g., NNE, SCOAP, MSQC)
Outline
• NNE
• The tools of the surgical collaborative and learning system: Michigan, Washington
• Take-home points
CQI Toolbox
• Data
• Performance feedback
• Collaborative meetings
• Surveys, site visits, videotape
Rate of Overall Bypass Surgery Mortality Improvement, by State
Chassin MR. Health Aff 2002;21:40-51. ©2002 by Project HOPE - The People-to-People Health Foundation, Inc.
The Michigan Program
• Partnership between BCBSM, Michigan hospitals, and clinician scientists
• 12 collaborative quality improvement programs: cardiac care, cancer surgery, bariatrics, breast cancer, cardiac CT, trauma/acute care, joint replacement, and medical admissions; 50+ hospitals; 200,000+ patients/year
• $30 million annual investment from BCBSM
1. Beaumont Grosse Pointe
2. Borgess Medical Center
3. Bronson Medical Center
4. Crittenton Hospital and Medical Center
5. Forest Health Medical Center
6. Gratiot Medical Center
7. Harper University Hospital
8. Henry Ford Macomb Hospital
9. Henry Ford Hospital
10. Henry Ford Wyandotte
11. Hurley Medical Center
12. Lakeland Community Hospital
13. Marquette General Hospital
14. McLaren Regional Medical Center
15. Mercy General Health Partners
16. Metro Health in Wyoming
17. Munson Medical Center
18. Oakwood Hospital
19. Port Huron Hospital
20. Sparrow Health System
21. Spectrum Health System
22. St. John Hospital and Medical Center
23. St. John Oakland
24. St. Mary Mercy Hospital
25. St. Mary's Grand Rapids
26. University of MI Health System
27. Beaumont Troy
28. Beaumont Royal Oak
29. Huron Valley Sinai
30. Henry Ford West Bloomfield
31. St. Joseph Mercy Oakland
32. North Ottawa Community Hospital
Types of Data Collected by the MBSC

Component | Data Sources/Timing | Content
Peri-operative care and outcomes | Chart review for all patients at 30 days post-op | Risk factors, treatment details, complications
Late outcomes | Survey at baseline and mailed annually to all consenting patients | Late complications, weight loss, comorbidity resolution, quality of life
Structure and process of care | Annual survey of surgeons and other bariatric program staff | Specifics of bariatric practice, OR environment, patient safety culture
Structure and process of care | Site visit | Observed structure and process specifics
Technical quality | Videotaped operative procedures | Peer skill ratings; subjective aspects of quality
Cost | BCBSM claims | Payments for facility, professional, ancillary care
Collaborative Quality Improvement
• Robust, externally audited clinical registry
• Rigorous, timely feedback to clinicians about comparative performance
• Identifying and implementing best practices
– Surgeons learning from their data
– Surgeons learning from each other
• Quarterly meetings
• Site visits
• Watching videos
Measuring Technical Quality
• Surgeons submitted a videotape of a "typical" laparoscopic gastric bypass
• Blinded peer rating
• Technical skill rated according to a modified OSATS instrument
[Figure: Average of six ratings of technical skill, shown by video number and number of raters. ◊ represents the mean; bars extend from mean ± standard error.]
[Figure: Average Rating vs. Any Complication, Lap-RYGB Procedures Only. Any-complication rate (0.00 to 0.20) plotted against average surgeon rating (2 to 5); p-value for slope = 0.020.]
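The p-value reported for the slope comes from a simple linear fit of complication rate against average surgeon rating. As a minimal sketch of the underlying estimate (the data below are illustrative, not the study's), the slope itself is ordinary least squares:

```python
def least_squares_slope(x, y):
    """Ordinary least-squares slope of y on x, e.g., complication rate
    vs. average surgeon rating. The p-value shown in the figure would
    additionally require a significance test on this estimate."""
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n
    num = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
    den = sum((xi - mean_x) ** 2 for xi in x)
    return num / den

# Illustrative values only (hypothetical, not study data)
ratings = [2.5, 3.0, 3.5, 4.0, 4.5]
complication_rates = [0.18, 0.15, 0.12, 0.09, 0.06]
slope = least_squares_slope(ratings, complication_rates)
# Negative slope: higher-rated technical skill, fewer complications
```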
Surgical Care and Outcomes Assessment Program
[Map: SCOAP participating sites across Washington State, including Seattle, Spokane, Yakima, Wenatchee, Richland, Port Townsend, Sunnyside, Aberdeen, Kirkland, Longview, Port Angeles, Mt Vernon, Tacoma, Olympia, and Portland.]
• Learning system focused on process of care
• Surveillance
• Process control
• Process learning
• Outcome control
• Evidence-based interventions
• Impact behavior through: policy, peer-to-peer networking, checklists, education and public health initiatives
Performance Benchmarking
• Green = hospital meets or exceeds the benchmark performance rate for the metric
• Gray = hospital at least meets the SCOAP average, but not the benchmark rate, for the metric
• Yellow = hospital is not reaching the SCOAP average or the benchmark rate for the metric
• Red = hospital is one standard deviation away from the SCOAP average
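The color key above amounts to a threshold classification of each hospital's metric rate against the benchmark, the SCOAP average, and its standard deviation. A minimal sketch, assuming a process metric where a higher rate is better (e.g., leak-testing adherence) and treating the exact cutoffs as illustrative rather than SCOAP's precise rules:

```python
def benchmark_color(rate, benchmark_rate, scoap_avg, scoap_sd):
    """Classify a hospital's adherence rate (higher is better) per the
    benchmarking color key. Thresholds are illustrative assumptions."""
    if rate >= benchmark_rate:
        return "green"   # meets or exceeds the benchmark rate
    if rate >= scoap_avg:
        return "gray"    # at least the SCOAP average, below benchmark
    if rate > scoap_avg - scoap_sd:
        return "yellow"  # below average, within one standard deviation
    return "red"         # a full standard deviation below the average

# Hypothetical example: benchmark 90%, SCOAP average 80%, SD 5%
color = benchmark_color(0.85, benchmark_rate=0.90, scoap_avg=0.80, scoap_sd=0.05)
```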
[Chart: Leak Testing in Colon Surgery, 2005 to 2011; y-axis 0% to 100%.]
[Chart: Bending the Cost Curve. Costs ($10,000 to $22,000) for SCOAP vs. non-SCOAP hospitals, 2006 to 2009.]
Kwon et al. SCOAP at 5 years. Surgery 2012
Growing a “Real World” Learning System
• Spread across the interventional space: spine, urology, general/pediatric surgery, vascular; cross disciplines; target conditions, not procedures
• Post-hospital care
• Pre-hospital space
• Include patients NOT having interventions
• Focus on metrics that matter
Had a spine fusion procedure. The X-ray looks great... she doesn't feel better.
In a learning system, Deborah becomes a partner in assessing care, starting in the doctor's office:
o Contact preference card
o Baseline function, QoL
o Surveyed at 6, 12, and 24 months
Capturing the Patient’s Voice
CERTAIN Hub
• Improving PRO data collection and the patient survey experience
• Patients are introduced to CERTAIN and asked to complete a PRO survey for neck or back pain
[Diagram: learning-system cycle linking patient voices, clinician offices, hospitals, and long-term care; partners in QI and research generate evidence (CER/PCOR) and move evidence into practice.]
Surgery February 2012
Learning System Targets
• Topics emerge from the community
• Perform statewide CER/PCOR to: generate new evidence; evaluate whether research applies to the real world
• Evidence is evaluated, synthesized, and "translated" into practice support
Do we all need to switch to Chloraprep?
• 849 general surgery patients; 41% lower risk with CHG+IPA vs. PVI
[Chart: Observed-to-Expected Ratio for SSI, all cases (n = 11,000), by skin-prep regimen (CHG, CHG+IPA, PVI, IPC+IPA, Non-IPA, IPA); y-axis 0.4 to 1.6.]
Journal of the American College of Surgeons, Volume 218, Issue 3, March 2014, Pages 336-344
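An observed-to-expected (O/E) ratio compares the number of SSIs that actually occurred with the number a risk model predicted; values below 1 suggest better-than-expected performance. A minimal sketch, assuming each patient carries a model-predicted SSI probability (the function name and data are illustrative, not SCOAP's implementation):

```python
def observed_to_expected(events, predicted_risks):
    """O/E ratio: observed event count divided by the 'expected' count,
    i.e., the sum of per-patient model-predicted probabilities."""
    observed = sum(events)           # 1 = SSI occurred, 0 = no SSI
    expected = sum(predicted_risks)  # per-patient predicted probabilities
    return observed / expected

# Hypothetical cohort: 3 observed SSIs against 2.5 expected -> O/E > 1
ratio = observed_to_expected([1, 1, 1, 0, 0], [0.5, 0.5, 0.5, 0.5, 0.5])
```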
The hospital changed our pain regimen and we noticed that we started to have leaks again. What's up with that?
NSAID Safety
Background
• NSAIDs are being used more post-op. Do they impair GI healing?
• Schlacta et al., in a study of post-operative pain regimens, showed a 4x increase in anastomotic leak in the NSAID arm (p = 0.15)
• Four other small studies show elevated risk
• Aim: assess the relationship between NSAIDs and anastomotic complications
Results
• SCOAP linked to CHARS: 47 hospitals, 4 years, 13,000+ patients undergoing GI surgery involving an anastomosis
• RED FLAG for our community: education campaign and decision support
• Opportunities for further research
Do we really need to use oral contrast?
Radiology-Pathology Concordance by Contrast Regimen

Regimen | Concordance
All CT scans | 90.6%
No contrast (n = 997) | 85.7%*
IV contrast only (n = 4177) | 90.4%
IV + enteral contrast (n = 2201) | 90.0%
Enteral contrast only (n = 339) | 92.6%
Should our surgeons be coming in to the hospital in the middle of the night?
[Chart: Percent Perforation, by Deciles of ED-to-OR Time (deciles 1 to 10; y-axis 0% to 25%).]
[Chart: Perforation by Time of ED Presentation. Percent perforation (10% to 22%) by 3-hour block of presentation time, daytime vs. nighttime hours.]
[Chart: Perforation by Time of ED Presentation, stratified by < median vs. > median ED-to-OR time. Percent perforation (10% to 22%) by 3-hour block, daytime vs. nighttime hours.]
Collaboratives and Learning Systems
• Focus on the real world, not just unusual care environments
• Put all the elements together:
– Doctors drive it; they don't become "victims" of it
– Patients become partners; outcomes that matter
– Data spans the continuum of care
– Data from diverse sources
– All stakeholders at the table
Take-Home Points
• Collaboratives are effective tools for creating change and learning
• The learning-system goal: experiences of prior patients enrich the experience of all
• Surveillance is not enough; change is the goal