International Forum on Quality and Safety in Healthcare | ExCeL London UK | April 21-24, 2015 | Privileged and Confidential

KP | Quality Measures
How an online reporting tool of “whole-system” measures
helped Kaiser Permanente better understand, track and
improve quality across the entire healthcare system
A review of how Kaiser Permanente formulated a comprehensive list of strategic quality
and service measures (complete with composites/sub-scales), established
accountability with incentives, and created the technology necessary to track
performance compared with enterprise-wide goals and targets.
After this session, participants will be able to:
Select measures to meet strategic quality goals/targets
Use technology to track performance at all levels (enterprise, facility and department, etc.)
Create composites/subscales
Establish accountability
Andy Amster - Senior Director
Joseph Jentzsch - Principal Consultant
Center for Healthcare Analytics
Kaiser Permanente, USA
• Kaiser Permanente paid for travel and expenses to this conference.
• Andy/Joe have received travel expenses from NCQA, AORN, NQF, and
the AMA for past presentations on quality measurement and
reporting.
• KP limits its staff from receiving honoraria (N/A for this conference)
• Research costs are paid from Kaiser Permanente’s budget.
• No conflicts of interest.
Kaiser Permanente
By the Numbers
• Nearly 10 million members
• More than 17,000 physicians and 174,000 employees (including 48,000 nurses)
• 7 regions serving 8 states and the District of Columbia
• 38 hospitals (co-located with medical offices)
• 618 medical offices and other outpatient facilities
• 70 years of providing care (opened to public in 1945)
Problem Statement
State of Healthcare Quality reporting at Kaiser Permanente
(circa 2007)
• Leadership had difficulty understanding and interpreting quality
performance results
• Lack of broad system-wide measures reflecting quality “writ large”
• Heavily manual quality oversight process
– Typically, 3-ring binders vs. readily accessible online displays
Key Solutions
• Identify and focus on areas with opportunities for improvement
• Create composites / subscales to monitor
• Set numeric goals and targets
• Create an on-line tool for quality measurement and reporting
Identifying and targeting areas with opportunities for improvement
How did we decide what areas needed improvement?
• Gap from desired performance
• Unjustified/unexplained variation in performance
Creating composites and subscales
Promotes system wide reporting and improvement
Enables higher-level understanding of performance on logical
groups of measures
• Composite
– A composite measure combines two or more individual measures or subscales into a single measure that yields a single score
• Subscale
– A subscale measure combines two or more individual measures into a single measure that yields a single score, and is used as a subcomponent of a composite
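The composite/subscale definitions above can be illustrated in a few lines of code. This is a minimal sketch, not KPQM's actual implementation: the slides do not specify the weighting scheme, so equal weighting is assumed, and the measure rates are made up.

```python
# Illustrative only: KPQM's real weighting scheme is not specified in the
# slides; equal weighting and the example rates below are assumptions.

def subscale(measure_scores):
    """Combine two or more individual measure scores into a single subscale score."""
    assert len(measure_scores) >= 2, "a subscale combines two or more measures"
    return sum(measure_scores) / len(measure_scores)

def composite(components):
    """Combine two or more measures and/or subscale scores into a single composite score."""
    assert len(components) >= 2, "a composite combines two or more components"
    return sum(components) / len(components)

# Hypothetical outpatient-care composite built from two subscales.
screening = subscale([0.82, 0.76, 0.91])  # e.g. cancer-screening measure rates
chronic = subscale([0.68, 0.74])          # e.g. diabetes-control measure rates
print(round(composite([screening, chronic]), 3))  # -> 0.77
```

The single-score idea is what lets a leader read one number per logical group of measures before drilling into the individual components.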
Setting goals and targets
Highly dependent on the underlying purpose of the measure
• Accountability
– Acceptance of specific level of performance
– Pay for performance (incentives): incent improvement or maintenance of high performance; often multiple targets (short-term and long-term)
• Improvement
– Commitment to specific target or benchmark
• Exploratory/understanding
– Can we calculate valid and reliable performance results?
– Can we set realistic performance targets for future measures?
Creating An On-line Tool for Quality Measurement and Reporting
Determine what to measure
Populate the database with measure, metadata, performance results
Create tool to search measures
Track progress
Enable on-line performance review
Identify areas with opportunities for improvement
• Wide variation in performance
• Gap from target or benchmark
Result of this Work
Key Solutions proved immensely helpful to senior leaders in
tracking progress and focusing attention
• Key solutions included a combination of identifying areas with opportunities for improvement, creating composites / subscales to monitor, setting goals and targets, and creating an on-line tool for quality measurement
• Outcome Examples:
– Hospital Standardized Mortality Ratios
– Service – HCAHPS Rate Hospital 9 and 10s
– HEDIS - Colorectal Cancer Screening
Example 1: Hospital Standardized Mortality Ratios
Interventions: Hospital Standardized Mortality Ratios
• First presentation of HSMR to the National Quality Committee in
2006
• Program wide HSMR Summit with Sir Brian Jarman in 2008
• Major sepsis process improvement and mortality reduction
initiative kicked off in California in 2008-09
• Mortality “deep dive” conducted and presented to senior leaders in
2009
• Ongoing work since 2010 to ensure an adequate number of hospice contracts and more widespread adoption of inpatient and ambulatory palliative care programs
Example 2: Service - Rate Hospital 9 and 10s
Interventions: Service - Rate Hospital 9 and 10s
• HCAHPS incorporated into incentive programs in 2009
• Goals are aligned and cascade from the most senior executives down to front-line managers' line-of-sight goals
• Hourly rounding
• Nurse Knowledge Exchange (bedside shift report)
• Nurse Leader Rounding
• Direct Report Rounding
Example 3: HEDIS - Colorectal Cancer Screening
Interventions: HEDIS - Colorectal Cancer Screening
• Inreach: leverage the electronic health record to flag patients needing screening and offer to schedule it at any and all clinical opportunities
• Outreach: letters, phone calls, and e-mails to accomplish the above
• FIT: leverage the newer, more widely accepted fecal immunochemical test (FIT) to increase willingness and uptake
Example 4: HEDIS - Colorectal Cancer Screening, by Race/Ethnicity
Interventions: HEDIS - Colorectal Cancer Screening, by Race/Ethnicity
• Adoption of the Equitable Care Health Outcomes (ECHO) Program
– Identification of disparities through collection and leveraging of data
– Trust building (based on increased understanding of cultural aspects of
care and communication)
– Health promotion (culturally tailored resources for member education)
– Evidence-based medicine
– Implementing innovative ideas (spread effective practices and provide
educational opportunities)
On-line Tool – Introduction
KP | Quality Measures (KPQM) is a software tool created by
Kaiser Permanente (KP)
• KPQM serves as a “one-stop” repository for its national quality
measures
• KPQM contains measure information and specifications, and tracks performance, including benchmarks/targets
• KPQM measures are used for Accountability, Improvement and
Exploration/Understanding
On-line Tool – Introduction (cont.)
KPQM also displays performance by individual domains of quality (e.g. Safety, Service, Clinical Effectiveness), by settings of care (e.g. Ambulatory Surgery, Home Health), and by specific topics (e.g. Infection Prevention)
Measures can be searched in a variety of ways:

Disease / Condition: Cancer, Diabetes, CV disease, Asthma, COPD, etc.
Accrediting categories: Prevention and screening, Respiratory, Cardiovascular, Diabetes, Musculoskeletal, Behavioral, Survey-based, Access/availability
Care Setting: Inpatient, Ambulatory, Home Health, Skilled Nursing, Hospice
Framework: IOM Aims (safe, effective, efficient, timely, patient-centered, equitable); Donabedian classification (structure, process, outcome)
Product Line / Demographic: Commercial, Medicare, Medicaid; Sex; Pediatric, Adult
On-line Tool – Components
Clearinghouse
• A searchable database of measures of national strategic performance with performance results

Performance View
• Graphical and/or tabular presentation representing performance results
• Performance may be compared to one or more benchmarks, regions, or facilities
• An arrow is used to indicate the direction of 'good' in KPQM graphs

Dashboard View
• A collection of performance views enabling a balanced perspective of a topic, setting of care, domain of quality, or interest area for a specified audience
• Focus is on trend, variation, and comparison with benchmarks to provide a high-level, informative view of overall performance
On-line Tool – Components (cont.)
KPQM Measure Clearinghouse
• KP quality measures reported nationally
• Includes specifications and metadata

KPQM Performance
• Trend analysis of KP, regional and facility level measures

KPQM Dashboards
• Sets of measures joined into a single view supporting the needs of a target audience
On-line Tool – Measures by Category
497 Measures as of October 21, 2014

Major Category | Measures in Clearinghouse | Performance View Available
Clinical Effectiveness | 221 | 160
Safety | 84 | 82
Service | 68 | 68
Ambulatory Surgical Centers | 49 | 47
Other (to be classified) | 32 | 0
Resource Stewardship | 26 | 26
Behavioral Health | 10 | 10
People Pulse | 4 | 4
Medication Management | 3 | 3
On-line Tool – Clearinghouse (Search)
Ability to search for all measures of interest and
their detailed information within the
Clearinghouse
On-line Tool – Clearinghouse (Content)
Ability to see detailed information about the measure including full
specification
On-line Tool – Performance
Incorporation of narrative
interpretation
Ability to see historical trends and compare facilities / regions /
enterprise to targets and national benchmarks
On-line Tool – Dashboards
Ability to combine historical trends of related measures
Conclusion
Our problems:
• Leadership had difficulty understanding and interpreting performance results
• Lack of broad system-wide measures reflecting quality “writ large”
• Heavily manual quality oversight process
– 3-ring binders vs. readily accessible online displays
The tool helped solve these problems
• Enabled the logical grouping of measures into composites and subscales and related measures to enable display of performance “writ large”
• Allowed the incorporation of narrative interpretation alongside graphical performance views to facilitate understanding and interpretation by senior leaders
• Automated the production of graphical displays and analyses and was available on the Web, eliminating the need for 3-ring binders
Appendix
Andy Amster
Senior Director, Center for Healthcare Analytics
Kaiser Permanente, USA
• Andy Amster is responsible for establishing the strategic direction for quality measurement, evaluation, and reporting throughout Kaiser Permanente nationally. Andy received his BA from Pomona College in Claremont, California, and his MS in public health/epidemiology from the UCLA School of Public Health.
Other contributors:
• Joseph Jentzsch – KPQM Architect and Developer
• Dennis Famularo – KPQM Program Manager
KPQM Application Engine – Toolset
KPQM Application Engine is the toolset used to create KPQM. The toolset is available for other (non-KPQM) applications
Toolset includes:
• Web application (Measures, Performance, Dashboards)
– Requires two (2) servers (one for the web application, one for the database); servers can run Windows 7 up to Windows Server 2012
– SQL Server; can use SQL Server Express (free)
• Data Update Tool (MS Access) – used to update the web application with current data
• Application Design Tool (MS Access) – used to add, modify, and remove content
• Data Export Tool (MS Access) – used to provide users with full datasets
• Training, Support, Documentation
KPQM Application Engine – Supported Devices
KPQM Application Engine is currently supported on the following devices / browsers:
• Windows 7 and 8.1 PCs
– IE8-IE11 (IE9-IE11 for best performance)
– Firefox, Chrome, Safari
• Mobile devices
– Apple iPad and iPhone (small form factor)
– Android devices
– Windows 8.1 tablets and phones

Displays optimized for mobile devices will be released in Q2 2015
KPQM Application Engine – Security
Only devices logged into the enterprise network directly or via VPN can gain access

Only IE on Windows devices supports behind-the-scenes authorization
• All other browsers / devices require login using a Windows username/password
• Some browsers allow saving credentials for subsequent use
KPQM Application Engine – Other Users
TPMG uses the KPQM Application Engine to create its independent website

To date, they have created an application consisting of:
• 63 Measures
• 17 Dashboards

Their lead developer had this to say about their experience with the KPQM Application Engine:
• Very pleased working with the KPQM Application Engine
• The KPQM Application Engine has been a robust tool to create a professional-looking, polished, web-based dashboard with relatively little effort
• Relatively easy learning curve
• Tool is flexible enough to do any task I need to do quickly
• KPQM Application Engine is EXTREMELY cost effective (costs bordering on trivial) and it performs as well as solutions that are much more expensive
KPQM Application Engine – Availability
Kaiser Permanente will offer the online tool to government
entities and not-for-profit organizations
• Availability – 4Q2015
• Contact:
Joseph Jentzsch
Kaiser Permanente
Data behind the headlines
Simon Mackenzie
Medical Director St George’s University Hospitals NHS Foundation Trust
Professor of Quality of Care and Patient Safety St George’s University London
Formerly
Clinical Lead for Business Intelligence
Healthcare Improvement Scotland
Declarations
• Conflicts of interest: none
• Acknowledgments: many
– Brian Robson, Peter Christie, Donald Morrison, Tim Norwood
– Roger Black, Robyn Munro
– Andrew Longmate
– Don Goldmann, Rocco Perla, Gareth Parry
Objectives
• To explain how NHS Scotland uses data to maximise improvement
• Help you do the same
– Variation over time
– Variation within aggregated data
– Understanding what you are looking at
– Be quick to act but slow to judge
The headline
16.3% reduction in HSMR
(Hospital Standardised Mortality Ratio)
Principles to get the best from data
• Understand how the data are derived
• Interpret the data correctly
• Learn from variation
• Use multiple sources of information
Context
• NHS Scotland
• SPSP : Scottish Patient Safety Programme
• HSMR : Hospital Standardised Mortality Ratio
NHS Scotland cares for 5 million people
14 territorial Health Boards
156,000 staff
65,610 nurses
4,287 Hospital Consultants
4,000 General Practitioners
Source: https://isdscotland.scot.nhs.uk/Health-Topics/Workforce/Publications/2013-05-28/2013-05-28-Workforce-Report.pdf
Source : HM Treasury 2012
Healthcare Improvement Scotland (HIS)
Information Services Division (ISD) / Public Health Intelligence (PHI)
National commitment to quality
http://www.scotland.gov.uk/Resource/Doc/311667/0098354.pdf Scottish Government, May 2010
3 Quality Ambitions
• Safe care
• Effective care
• Person-centred care
http://www.scottishpatientsafetyprogramme.scot.nhs.uk/programme
SPSP key aims
• 15% reduction in mortality (extended to 20%)
• 30% reduction in adverse events
HSMR (as used in NHS Scotland)
HSMR = Observed number of deaths / Predicted number of deaths*
*based on a model using administrative data

Mortality is measured 30 days after admission to an acute hospital

This matters, but it is not the key point today, which is about how to use data
Certainly HSMR is controversial
“Measurement of safety sits at the messy
end of measuring quality in healthcare.
Measuring and learning from hospital
mortality is at the very messy end of that
scale.”
Prof Charles Vincent, Imperial College London
June 6th 2013
HSMR (as used in NHS Scotland)
HSMR = Observed number of deaths / Predicted number of deaths*
*based on a model using administrative data
Mortality is measured 30 days after admission to an acute hospital
HSMR >1 : more deaths occurred than predicted
HSMR <1 : fewer deaths occurred than predicted
This does not necessarily mean that care was
poor (or good) or that lives were lost (or saved)
But it does raise questions
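The definition above can be sketched as code. This is a minimal illustration with made-up counts; in practice the predicted count comes from NHS Scotland's risk-adjustment model over administrative data, which is not reproduced here.

```python
# Sketch of the HSMR calculation described above. The predicted count is
# produced by a risk-adjustment model in practice; 112 and 100 are made up.

def hsmr(observed_deaths, predicted_deaths):
    """Ratio of observed to model-predicted deaths within 30 days of admission."""
    return observed_deaths / predicted_deaths

def interpret(value):
    # >1: more deaths than predicted; <1: fewer.
    # Neither proves care was poor (or good) - it raises questions.
    if value > 1:
        return "more deaths occurred than predicted"
    if value < 1:
        return "fewer deaths occurred than predicted"
    return "deaths matched prediction"

print(interpret(hsmr(112, 100)))  # HSMR 1.12 -> "more deaths occurred than predicted"
```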
What does the value mean?
Scotland HSMR – 12.5% reduction
Is this good news?
• Yes!
• But it is a step on the journey, not the end
• What does it actually mean and how do we use it to improve further?
NHS Scotland approach
• HSMR reported quarterly:
– http://www.isdscotland.org/Publications/index.asp
Approach is to support and work with hospitals
rather than to judge
HIS/ISD evaluate trends and variation
Trends
• Prime interest is to see if performance is improving
• To be alert to signs of concern
• To understand reasons for change
Variation within a hospital – HSMR time series
[Run charts with annotations: 'Potential concern'; 'Downwards shift from baseline'; 'Run chart – two shifts – continuing improvement'; 'Run chart – no change']
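The slides do not state which run-chart rule NHS Scotland applies; a common convention in improvement work flags six or more consecutive points on the same side of the median (points on the median are skipped) as a non-random shift. A sketch under that assumption, with a made-up HSMR series whose baseline run and post-improvement run each register as a shift:

```python
# Run-chart shift detection: the six-consecutive-points rule is an assumed
# convention, and the HSMR series below is made-up illustrative data.
from statistics import median

def shifts(values, run_length=6):
    med = median(values)
    runs, side, count = 0, 0, 0
    for v in values:
        s = (v > med) - (v < med)   # +1 above the median, -1 below, 0 on it
        if s == 0:
            continue                # points on the median extend neither side
        if s == side:
            count += 1
        else:
            side, count = s, 1
        if count == run_length:     # count each qualifying run exactly once
            runs += 1
    return runs

series = [1.08, 1.06, 1.07, 1.05, 1.09, 1.06,   # baseline above the median
          0.95, 0.94, 0.93, 0.92, 0.94, 0.91]   # sustained run below it
print(shifts(series))  # -> 2
```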
Two components: Observed/Predicted
Variation

‘Aggregated data may camouflage variation… Leaders need to seek out variation… if safety and quality are to be effectively monitored and improved’
Variation between hospitals – funnel plot for one quarter's HSMR data
[Funnel plot annotations: larger hospitals to the right; points above the funnel are higher-than-expected values, points below are lower than expected; points inside the funnel reflect normal variation]
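The funnel shape follows from the arithmetic of the ratio: under a Poisson assumption, the standard error of observed/expected is roughly sqrt(1/E), where E is the expected number of deaths, so control limits tighten as hospitals get larger. A sketch using that normal approximation (published funnel plots often use exact Poisson limits instead, and the expected counts here are made up):

```python
# Approximate funnel-plot control limits for an O/E ratio such as HSMR.
# Normal approximation to the Poisson case, for illustration only.
import math

def funnel_limits(expected_deaths, z=1.96):   # z = 1.96 ~ 95% limits
    se = math.sqrt(1.0 / expected_deaths)
    return 1.0 - z * se, 1.0 + z * se

for e in (25, 100, 400):                      # small -> large hospital
    lo, hi = funnel_limits(e)
    print(f"E={e}: {lo:.2f} to {hi:.2f}")
```

This is why a small hospital can sit far from 1.0 without being a statistical outlier, while the same value at a large hospital would fall outside the funnel.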
An approach to using hospital-level data (source: Raj Behal; courtesy of Professor CJ Peden)
• Hospital-level data, e.g. mortality/harm
• Major procedures with high crude or adjusted mortality, e.g. aortic aneurysm repair, major colorectal surgery
• Major diagnoses with high crude or adjusted mortality, e.g. liver disease, pneumonia
• Other subgroups with high mortality, e.g. elderly emergency admissions
• Drill down to generate hypotheses; structured case reviews; system/process reviews
Guide for boards
• Must have a standard process
– HIS contacts any hospital which is a statistical outlier
– Normally leaves action to the hospital but offers support
• Two unusual examples:
– Crosshouse Hospital 2010
– NHS Lanarkshire 2013
What to do when HSMR is high or rising

Methodology
• Extensive data evaluation – not just mortality
• Site visits by a peer review team of doctors and nurses from other Boards
– visited over 40 clinical areas
– spoke to over 200 staff, over 300 patients and carers
– reviewed 152 clinical records
Outcome
• There were data issues
• There were opportunities to improve clinical care
• Worthwhile but painful process
Learning for all
• All Boards have looked at recommendations
• HIS and ISD have reviewed how we support Boards
• Learning for directors and leaders in interpreting data
– Look for opportunities to improve, not simply assurance
– If something is ‘too good to be true’ it is probably untrue
– Process and improvement data are both important
– Aggregate data may be misleading
A single measure of safety is a fantasy
While “Zero Harm” is a bold and worthy aspiration, the scientifically correct goal is “Continual Reduction”
The Scottish Patient Safety Indicator

Process and Outcome measures matter
• Ventilator Associated Pneumonia
– Process: bundle compliance. Outcome: VAP rate.
• Central line related blood stream infection
– Process: bundle compliance. Outcome: CRBSI rate.
• Questions leaders should ask if reported process improvement doesn't deliver improved outcomes:
– Is the theory wrong?
– Is the process improvement real?
– Are the measurements accurate?
4/24/2015
Data source: Extranet; 9 of 12 ITUs; monthly n = 220
In conclusion
• Data are useful only if properly understood
• Aggregated data can only tell part of the story
– Look for real change, real variation
• Multiple measures are better than any single one
• Leaders need to understand enough to ask appropriate questions
• Focus on improvement, not just assurance