2016-07-08
1.3 Relating process to outcomes including modelling
Walter Sermeus, RN, PhD
Catholic University Leuven
Belgium
Halle, July 11, 2016
The European Academy of Nursing Science
Background: MRC CI-framework
Modelling
The European Academy of Nursing Science 3
What does the MRC CI framework say about modelling process and outcomes?
• Modelling a complex intervention prior to a full-scale evaluation can provide important information about the design of both the intervention and the evaluation.
• One useful approach to modelling is to undertake a pre-trial economic evaluation. This may identify weaknesses and lead to refinements, or it may show that a full-scale evaluation is unwarranted, for example because the effects are so small that a trial would have to be infeasibly large.
• Formal frameworks for developing and testing complex interventions, such as MOST or RE-AIM, may be a good source of ideas, and the National Institute for Health and Clinical Excellence has produced detailed guidance on the development and evaluation of behaviour change interventions.
The European Academy of Nursing Science 4
What will we do in this session ?
• What & why
• Models – Tools
• Examples
• Exercise
CHALLENGE
• Developing a complex intervention is the most important and most difficult part of your research
• Most important: it will impact the effect
• Most difficult: what to include, and what not? How will the different elements interact? The human factor?
A complex intervention?
[Diagram: PROBLEM (AS IS) → BLACK BOX → SOLUTION (TO BE)]
A complex intervention?
[Diagram: PROBLEM (AS IS) → SOLUTION (TO BE), via TIDieR; divergent part and convergent part; evidence and process evaluation; model → practice]
TIDieR
• Template for Intervention Description and Replication (TIDieR)
• Full description of the intervention
• To be able to understand the intervention – opening the black box
TIDieR
Hoffmann et al., BMJ, 2014
Template for intervention description and replication (TIDieR)
• Name of intervention
• WHY: describe rationale/theory essential to the intervention
• WHAT (materials): describe any physical and informational materials used
• WHAT (procedures): describe procedures, activities, processes
• WHO: for each intervention provider: expertise, training
• HOW: describe modes of delivery
• WHERE: describe types of locations
• WHEN, HOW MUCH: describe frequency, duration
• TAILORING: describe tailoring activities (what, when, why, ...)
• MODIFICATIONS: if the intervention was modified, describe changes
• HOW WELL (planned): intervention adherence/fidelity: how, by whom, ...
• HOW WELL (actual): describe the extent to which the intervention was delivered as planned
Hoffmann et al., BMJ, 2014
How to develop a complex intervention? Divergent part
Vanwersch R., 2016
How to develop a complex intervention? Divergent part
• Aim:
– Revenue/costs; quality/patient safety; patient satisfaction/experience; time; ...
– Radical or incremental improvement
• Actors:
– Clinicians, managers, patients, external experts, ...
Vanwersch R., 2016
How to develop a complex intervention? Divergent part
• Input:
– AS-IS process specifications: process weaknesses, patient needs, culture scan, process modelling, benchmark process insights
• Output:
– TO-BE process specifications: to-be service concepts, impact analysis, force-field analysis
Vanwersch R., 2016
How to develop a complex intervention? Divergent part
• Techniques:
– Unstructured: brainstorming, out-of-the-box thinking, visioning, ...
– Semi-structured: Delphi, nominal group technique, ...
– Structured: rule-based, case-based, repository-based (coordination, process models, ...)
• Tools:
– Communication, voting, modelling, simulation, ...
Vanwersch R., 2016
How to develop a complex intervention? Convergent part
Murphy et al., 1998
Developing a complex intervention: the evidence base
• WHO Evidence-Informed Policy Network
• How strong is the evidence?
Lavis et al., Health Research Policy and Systems 2009
Bradford Hill criteria of causation (1965)
• Direct evidence (no confounding, time, dose-response)
• Mechanistic evidence (theory, explanation)
• Parallel evidence (replication across populations & settings)
From theory/model to practice
[Diagram: PROBLEM (AS IS) → SOLUTION (TO BE), via TIDieR; divergent part and convergent part; evidence and process evaluation; model → practice]
Process evaluation – Intervention fidelity
Moore et al., BMJ, 2015
(Ralph) Stacey diagram
https://vimeo.com/25979052
RE-AIM model
Dimension | Level | Description
Reach | Individual | Proportion of the target population that participated in the intervention
Efficacy/effectiveness | Individual | Success rate; positive/negative outcomes
Adoption | Organization | Proportion of settings that will adopt the intervention
Implementation | Organization | Extent to which the intervention is implemented as intended in the real world
Maintenance | Individual & Organization | Extent to which a program is sustained over time
Source: Glasgow RE et al., 1999
(www.re-aim.org)
The goal of RE-AIM is to encourage program planners, evaluators, funders, and policy-makers to pay more attention to essential program elements, including external validity, that can improve the sustainable adoption and implementation of effective, generalizable, evidence-based interventions.
Normalization process model
• Theoretical framework for understanding complex interventions
• Normalization vs adoption/rejection
• 4 dimensions:
– Interactional workability: effect on interactions between people and practices
– Relational integration: relation to existing knowledge and relationships
– Skill-set workability: effect on the current division of labour
– Contextual integration: relation to the organisation
May C, BMC Health Services Research, 2006
From theory/model to practice
[Diagram: PROBLEM (AS IS) → SOLUTION (TO BE), via TIDieR; divergent part and convergent part; evidence and process evaluation; model → practice; aims/outcomes → key indicators]
Relation between context, problem definition, intervention and evaluation
(Source: Campbell NC et al., BMJ, 2007)
Using the right metrics?
[Figure: two men, both 180 cm and 85 kg, hence both BMI = 26.2 – but do both men look identical?]
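The slide's BMI arithmetic (weight in kilograms divided by height in metres squared) can be sketched as follows; the function name is illustrative:

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight (kg) divided by height (m) squared."""
    return round(weight_kg / height_m ** 2, 1)

# Both men on the slide: 180 cm and 85 kg.
print(bmi(85, 1.80))  # 26.2
```

The identical value for both men is the slide's point: BMI alone cannot distinguish muscle from fat, so a metric must match what the intervention is meant to change.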
EXAMPLES OF MODELS
• Eight-step model for building evidence-based care pathways (Lodewijckx et al., 2012)
• ProvenCare model, Geisinger (Berry et al., 2009)
• PRocess Modelling in ImpleMEntation research (PRIME)
• Experience-based co-design
• IHI Model for Improvement
How to build a complex intervention ?
• (1) Review and validation of best-practice evidence
– (International) expert panel; Delphi study
– Final selection of active components and grading of evidence
• (2) Redesign of the process
– Clustering of clinical activities into key interventions
– A process flow diagram
– Detailed description of the key interventions
– Translation into a set of process and outcome indicators
• (3) Implementation
Lodewijckx et al., Trials, 2012; Berry et al., BMJ Q&S, 2009
Process steps in developing a complex intervention (1)
Translating evidence into concrete procedures
List of selected outcomes
PRIME - models
• PRocess Modelling in ImpleMEntation research
– Identifying active ingredients in professional behaviour change
– Process modelling: understanding the factors underlying clinical practice, in order to identify what sorts of processes should be targeted in implementation interventions
– 4 levels: individuals, teams, organisations, systems
– Different kinds of theories
(Source: Walker AE et al., BMC HSR, 2003)
Example of PRIME Theories - individual level
Theory of Planned Behaviour (Ajzen 1991)
Experience based co-design (EBCD)
http://www.kingsfund.org.uk/projects/ebcd
www.ihi.org
IHI Model for Improvement
• Developed by Langley et al. (2009, IHI)
• To accelerate change
• Three questions
• PDSA cycle
1. Setting aims
• S – Specific
• M – Measurable
• A – Achievable
• R – Relevant
• T – Time-related
• "Some is not a number and soon is not a time."
2. Establish measures
• Outcome indicators
• Process indicators
• "Balancing measures" (side effects)
Lloyd R. et al., Better Quality through Better Measurement, IHI, 2012
3. Selecting change
• Differentiate between common-cause and special-cause variation
• Define a plan for change
– Based on evidence (what works)
– Maier's law: effectiveness = efficacy × acceptability
• Basic ingredients
– Ideas (what works)
– Will (what is acceptable, or how can we increase acceptability)
– Execution (PDSA)
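Maier's law can be sketched numerically; the efficacy and acceptability values below are hypothetical illustrations, not figures from the source:

```python
def effectiveness(efficacy: float, acceptability: float) -> float:
    """Maier's law: real-world effectiveness = efficacy x acceptability."""
    return efficacy * acceptability

# Hypothetical numbers: a highly efficacious intervention (0.8) that only
# half of clinicians accept (0.5) delivers modest real-world effectiveness.
print(effectiveness(0.8, 0.5))  # 0.4
```

The product form makes the design trade-off explicit: raising acceptability can matter as much as raising efficacy.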
PDSA cycle
• PLAN: prepare what, how, who, when, ...
• DO: do what you have planned and evaluate/measure what is happening
• STUDY: evaluate what you see and adapt the change if necessary
• ACT: formulate the adaptation and start over again.
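The four PDSA steps can be sketched as a loop; the adherence measure, target, and adaptation rule below are hypothetical illustrations, not from the source:

```python
def run_pdsa(initial_change, do_and_measure, meets_aim, adapt, max_cycles=5):
    """Generic Plan-Do-Study-Act loop: plan a change, try it, study the
    measurement, and act by adapting the change for the next cycle."""
    change = initial_change                  # PLAN: prepare the change
    for cycle in range(1, max_cycles + 1):
        result = do_and_measure(change)      # DO: execute and measure
        if meets_aim(result):                # STUDY: evaluate what you see
            return cycle, result
        change = adapt(change, result)       # ACT: adapt and start over
    return max_cycles, result

# Hypothetical example: raise reminder frequency until adherence >= 90%.
cycle, adherence = run_pdsa(
    initial_change=1,
    do_and_measure=lambda freq: min(60 + 10 * freq, 100),  # simulated adherence %
    meets_aim=lambda a: a >= 90,
    adapt=lambda freq, _result: freq + 1,
)
print(cycle, adherence)  # 3 90
```

Each pass through the loop is one PDSA cycle: the change is refined iteratively rather than designed once and frozen.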
PDSA is an improvement cycle …
Lloyd R. et al. Better Quality through better measurement, IHI, 2012
From development to optimization
• MOST models – Multiphase Optimization Strategy
MOST- models
• Multiphase Optimization Strategy
– Screening phase:
• Components that are candidates for inclusion in an intervention (active components)
• Based on significant effect, effect size, cost, ...
– Refining phase:
• Fine-tuning, e.g. the optimal level
– Confirming phase:
• Evaluating the intervention as a package
(Source: Collins LM et al., Am J Prev Med, 2007)
Design: factorial design
Let's assume 3 components in the intervention:
- keeping a food diary
- increasing physical activity
- home visits

We then have a 2³ factorial design: 8 randomized conditions
http://methodology.psu.edu/ra/most
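The 2³ design above can be enumerated programmatically; a minimal sketch using the three components named on the slide:

```python
from itertools import product

# The three candidate components from the slide.
components = ["food diary", "physical activity", "home visits"]

# Each component is either off (0) or on (1): 2**3 = 8 randomized conditions.
conditions = list(product([0, 1], repeat=len(components)))

for cond in conditions:
    print({name: bool(flag) for name, flag in zip(components, cond)})

print(len(conditions))  # 8
```

Adding a fourth component would double the count to 16 conditions, which is why MOST screens components before the confirming trial.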
Some examples
Hillman et al., MERIT Study, Lancet 2005;365:2091-97
Process flow of cataract pathway
Main interventions:
1. One-stop pre-assessment
2. Formulating surgical care plan based on the patient record only
3. Next-day telephone review by nurse
4. Final review 4 weeks after surgery by optometrist
5. Note: protocol exceptions for 2, 3 and 4 so that the patient is seen by an eye surgeon
Van Vliet et al., Qual Saf Health Care, April 22, 2010
Root cause analysis using the 5-Why technique
• Why do you order 'unnecessary' ophthalmic screening appointments?
– "Because I want to keep personal contact with my patients."
• Why do you want to keep personal contact with your patients?
– "This offers me the possibility to know the patient's wishes and expectations for the refractive aim of the surgery and to know the surgical complexity of the cataract."
• Why do you need personal contact to get this information?
– "The information I need is not registered in the patient record, and if it is registered I am not sure how adequate it is."
• Why is this information not registered or not adequate?
– "Ophthalmologists are not used to determining the refractive aim of the surgery in detail when admitting the patient for surgery. Discussing the refractive aim with the patient costs additional time during the consultation. If information is present about the refractive aim, it mostly states to aim for plano (i.e., intended refractive error = 0), but patients sometimes prefer a different refractive error or a multifocal lens. I have to be sure this is discussed when I decide on the type and power of the intraocular lens without the patient present."
• Why are ophthalmologists not used to determining the refractive aim in detail during the initial consultation?
– "Because we were used to this during the ophthalmic screening and we never officially coupled conducting this activity to setting the diagnosis. So, actually, every cataract surgeon did this in his own way. Now we have to achieve consensus on how to determine the refractive aim for a patient, so we can protocol it and train our residents and colleague ophthalmologists."

Example of an interview with a cataract surgeon to identify the root cause for non-adherence to the decision rule for ophthalmic screening (Van Vliet E., 2011)
Exercise
• Diabetes Care
• References:
• (1) Borgermans L, Goderis G, Van Den Broeke C, Mathieu C, Aertgeerts B, Verbeke G, Carbonez A, Ivanova A, Grol R, Heyrman J. A cluster randomized trial to improve adherence to evidence-based guidelines on diabetes and reduce clinical inertia in primary care physicians in Belgium: study protocol [NTR 1369]. Implement Sci. 2008;3:42.
• (2) Goderis G, Borgermans L, Mathieu C, Van Den Broeke C, Hannes K, Heyrman J, Grol R. Barriers and facilitators to evidence based care of type 2 diabetes patients: experiences of general practitioners participating to a quality improvement program. Implement Sci. 2009;4:41.
• (3) Goderis G, Borgermans L, Grol R, Van Den Broeke C, Boland B, Verbeke G, Carbonez A, Mathieu C, Heyrman J. Start improving the quality of care for people with type 2 diabetes through a general practice support program: a cluster randomized trial. Diabetes Res Clin Pract. 2010;88(1):56-64.
Borgermans L. et al., BMC HSR, 2009
Adoption: 142/336 = 42%
Reach: 226/1577 = 14%
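The adoption and reach figures above are simple RE-AIM proportions; a minimal sketch reproducing the slide's arithmetic (the function name is illustrative):

```python
def proportion(numerator: int, denominator: int) -> int:
    """RE-AIM proportion expressed as a whole-number percentage."""
    return round(100 * numerator / denominator)

# Adoption: settings that adopted out of settings approached.
adoption = proportion(142, 336)
# Reach: participants out of the eligible target population.
reach = proportion(226, 1577)
print(adoption, reach)  # 42 14
```

Keeping the denominators explicit is the point of RE-AIM reporting: a trial can show strong efficacy yet reach only a small fraction of its target population.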
Results
Goderis G. et al., Implementation Science, 2009
Assignment
• Evaluate the planned intervention
– What are the active components of the «Diabetes» intervention?
– Is it the best possible intervention?
– What are the results?
• Formulate 3 recommendations to improve the intervention.
(4) Borgermans L. et al., BMC HSR, 2009
References
• Berry SA, Doll MC, McKinley KE, Casale AS, Bothe A Jr. ProvenCare: quality improvement model for designing highly reliable care in cardiac surgery. Quality & Safety in Health Care. 2009;18:360-8.
• Campbell NC et al. Designing and evaluating complex interventions to improve health care. British Medical Journal. 2007;334:455-9.
• Collins LM, Murphy SA, Strecher V. The multiphase optimization strategy (MOST) and the sequential multiple assignment randomized trial (SMART): new methods for more potent eHealth interventions. American Journal of Preventive Medicine. 2007;32:S112-18.
• Glasgow RE, Vogt TM, Boles SM. Evaluating the public health impact of health promotion interventions: the RE-AIM framework. Am J Public Health. 1999;89(9):1322-7.
• Hoffmann TC et al. Better reporting of interventions: template for intervention description and replication (TIDieR) checklist and guide. BMJ. 2014;348:g1687.
• Langley GL, Moen R, Nolan KM, Nolan TW, Norman CL, Provost LP. The Improvement Guide: A Practical Approach to Enhancing Organizational Performance (2nd edition). San Francisco: Jossey-Bass; 2009.

References (2)
• Lavis JN et al. SUPPORT Tools for evidence-informed health Policymaking (STP). Health Res Policy Syst. 2009;7(Suppl 1):I1.
• Lodewijckx C, Decramer M, Sermeus W, Panella M, Deneckere S, Vanhaecht K. Eight-step method to build the clinical content of an evidence-based care pathway: the case for COPD exacerbation. Trials. 2012;13:229.
• May C. A rational model for assessing and evaluating complex interventions in health care. BMC Health Services Research. 2006;6:1-11.
• Moore GF et al. Process evaluation of complex interventions: Medical Research Council guidance. BMJ. 2015;350:h1258.
• Murphy MK et al. Consensus development methods, and their use in clinical guideline development. Health Technol Assessment. 1998;2(3).
• Vanwersch R. Rethinking care processes: does anyone have an idea? PhD dissertation, Technical University Eindhoven; 2016, 198pp.
• Walker AE, Grimshaw J, Johnston M, Pitts N, Steen N, Eccles M. PRIME – PRocess modelling in ImpleMEntation research: selecting a theoretical basis for interventions to change clinical practice. BMC Health Services Research. 2003;3:22.