Developing, Measuring, and Improving Program Fidelity: Achieving positive outcomes through high-fidelity implementation SPDG National Conference Washington, DC March 5, 2013 Allison Metz, PhD, Associate Director, NIRN Frank Porter Graham Child Development Institute University of North Carolina


Page 1: SPDG National Conference  Washington, DC March 5, 2013

Developing, Measuring, and Improving Program Fidelity:

Achieving positive outcomes through high-fidelity implementation

SPDG National Conference

Washington, DC

March 5, 2013

Allison Metz, PhD, Associate Director, NIRN Frank Porter Graham Child Development Institute

University of North Carolina

Page 2

Goals for Today

• Define fidelity and its link to outcomes
• Identify strategies for developing fidelity measures
• Discuss fidelity within a stage-based context
• Describe the use of Implementation Drivers to promote high fidelity
• Provide case example

Page 3

“PROGRAM FIDELITY”

“The degree to which the program or practice is implemented ‘as intended’ by the program developers and researchers.”

“Fidelity measures detect the presence and strength of an intervention in practice.”

Page 4

Context, Compliance, and Competence

• Three components:
– Context: Structural aspects that encompass the framework for service delivery
– Compliance: The extent to which the practitioner uses the core program components
– Competence: Process aspects that encompass the level of skill shown by the practitioner and the “way in which the service is delivered”

Definition of Fidelity

Page 5

Purpose and Importance

• Interpret outcomes: is this an implementation challenge or an intervention challenge?
• Detect variations in implementation
• Replicate consistently
• Ensure compliance and competence
• Develop and refine interventions in the context of practice
• Identify “active ingredients” of the program

Fidelity

Page 6

Formula for Success

Effective Interventions (a well-operationalized “What”) × Effective Implementation (Methods) × Enabling Contexts = Socially Significant Outcomes

Page 7

• Clear description of the program

• Clear essential functions that define the program

• Operational definitions of essential functions (practice profiles; do, say)

• Practical performance assessment

Usable Intervention Criteria

Page 8

Developing Fidelity Measures: Practice Profiles

Practice Profiles Operationalize the Work

• Describe the essential functions that allow a model to be teachable, learnable, and doable in typical human service settings
• Promote consistency across practitioners at the level of actual service delivery
• Consist of measurable and/or observable, behaviorally-based indicators for each essential function

Gene Hall and Shirley Hord (2010), Implementing Change: Patterns, Principles, and Potholes (3rd Edition)

Page 9

Measuring Competency

Practice Profiles

For each Essential Function:
• Identifies “expected” activities
• Identifies “developmental” variation(s) in practice
• Identifies “unacceptable,” incompatible, or undesirable practices

Page 10

Sample Template: Practice Profiles

Page 11

Case Example: Differential Response Implementation Science

Essential Functions:
• Engagement
• Assessment
• Partnership
• Goal Planning
• Implementation
• Communication
• Evaluation
• Advocacy
• Culturally Competent Service Delivery

Page 12

Case Example: Differential Response Practice Profiles

Page 13

Multiple Purposes for Implementation Practice Profiles

If you know what “it” is, then:
• You know the practice to be implemented
• You can improve “it”
• Increased ability to effectively develop the Drivers
• Increased ability to replicate “it”
• More likely to deliver high-quality services
• Outcomes can be accurately interpreted
• Common language and deeper understanding

Page 14

When are we ready to assess fidelity?

Stages of Implementation

Practice profiles are a part of stage-based work. When we are engaged in program development work, practice profiles operationalize the intervention so that installation activities can be effective and fidelity can be measured during initial implementation.

Stages

Page 15

Practice Profiles: When are they developed?

In order to create the necessary conditions for…
• Creating practitioner competence and confidence
• Changing organizations and systems

…we need to define our program and practice adequately so that we can install the Drivers necessary to promote consistent implementation of the specific activities associated with the essential functions of the new service(s).

Drivers

Page 16

Fidelity Measures: Practice Profiles

Start with the Expected/Proficient column:
• Develop an indicator for each Expected/Proficient activity
• Identify “evidence” that this activity has taken place
• Identify “evidence” that this activity has taken place with high quality
• Identify potential data source(s)
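The indicator-development steps above can be sketched as a small data structure. This is a hypothetical illustration in Python; the `FidelityIndicator` class and its field names are invented for the example and are not part of any NIRN tooling:

```python
from dataclasses import dataclass, field

@dataclass
class FidelityIndicator:
    """One fidelity indicator derived from an Expected/Proficient activity."""
    activity: str            # the Expected/Proficient activity from the profile
    evidence_occurred: str   # evidence that the activity took place
    evidence_quality: str    # evidence that it took place with high quality
    data_sources: list = field(default_factory=list)

# Example drawn from the Engagement performance-assessment slide
visits = FidelityIndicator(
    activity="Occurrence of visits (Initial, Assessment, Goal Planning)",
    evidence_occurred="X visits in Y months",
    evidence_quality="Engagement interventions used during the visits",
    data_sources=["Database"],
)
print(visits.data_sources[0])
```

Keeping evidence-of-occurrence separate from evidence-of-quality mirrors the compliance/competence distinction drawn earlier in the deck.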

Page 17

Performance Assessment: Practice Profiles

Engagement example:

Indicator: Occurrence of visits (Initial, Assessment, Goal Planning)
Data source: Database
Evidence that engagement is happening: X visits in Y months; time spent in those visits
Evidence that engagement is happening WELL: What engagement interventions were used during the visits? Outcome of visits/what happened?

Page 18

Establishing Fidelity: Practice Profiles

II. Engagement: The ongoing ability to establish and sustain a genuinely supportive relationship with the family while developing a partnership, establishing healthy boundaries, and maintaining contact as mutually negotiated. The ability to identify risk and protective factors, including the family’s positive supports, and to assess the best way to engage the family and/or who in the family to engage first.

A. Initial Engagement

Indicator | Data Source
SC explains service, ensures family’s understanding that service is voluntary, and describes the scope of service | Engagement Survey (ES)
SC clearly establishes the purpose of involvement with the family | ES
Family signs service agreement | DB
Family signs consents | DB
Health status forms completed (by SC with family) | DB
Where appropriate, SC alerts Educational Advocate (EA) | Educational Advocate & DB
Engagement activity conducted within 60 days of the service agreement (A. who was there; B. what activity) | DB
SC demonstrates respect, genuineness, and empathy for all family members, as defined by the family | ES (Q: 1, 3 & 18)

Page 19

Establishing Fidelity: Practice Profiles

B. Ongoing Engagement (SC creates trust and buy-in)

Indicator | Data Source
Providing services that clients view as relevant and helpful | ES (Q: 3, 10, 12, 13 & 14)
Parents/caregivers listen carefully, obtain information, and begin to develop trust | ES (Q: 1 & 25)
SC is consistent, reliable, and honest with families | ES (Q: 35)
SC maintains contact as negotiated with family | DB (add to DB)
Respecting the cultural, racial, ethnic, linguistic, and religious/spiritual backgrounds and sexual orientation of children, youth, and families, and using these as protective factors | ES (Q: 14)
Motivational Interviewing (used to elicit change) | DB
Family-centered case planning and management: includes all family members in the process, reflected in SP | DB / SP-SCS review (checkbox/tab)
Use of family satisfaction measure feedback to inform work | SCS

Page 20

5 Steps: Fidelity Data Collection

For new or established criteria:
1. Assure fidelity assessors are available, understand the program or innovation, and are well versed in the education setting
2. Develop a schedule for conducting fidelity assessments
3. Assure adequate preparation for teachers/practitioners being assessed
4. Report results of the fidelity assessment promptly
5. Enter results into the decision-support data system
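As a loose sketch, the five steps can be laid out as a small data collection pipeline. Everything here is hypothetical scaffolding: the function names, the dictionary standing in for a decision-support data system, and the weekly schedule are all invented for illustration:

```python
# Hypothetical sketch of the five fidelity data collection steps.

def prepare_assessors(candidates):
    # Step 1: keep only assessors who know the program and the setting
    return [c for c in candidates if c.get("trained")]

def schedule_assessments(practitioners):
    # Step 2: develop a schedule for conducting fidelity assessments
    return {p: f"week {i + 1}" for i, p in enumerate(practitioners)}

def prepare_practitioners(schedule):
    # Step 3: notify teachers/practitioners so they can prepare
    return [f"{p}: fidelity assessment in {when}" for p, when in schedule.items()]

def report_and_store(data_system, results):
    # Steps 4 and 5: report results promptly and enter them into the
    # decision-support data system
    data_system.update(results)
    return data_system

assessors = prepare_assessors([{"name": "R1", "trained": True}, {"name": "R2"}])
schedule = schedule_assessments(["teacher_a", "teacher_b"])
notices = prepare_practitioners(schedule)
dss = report_and_store({}, {"teacher_a": 0.85})
```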

Page 21

Implementation Supports

Promote High Fidelity

Fidelity is an implementation outcome

How can we create an implementation infrastructure that supports high-fidelity implementation?

Page 22

(Implementation Drivers diagram)

IMPROVED OUTCOMES FOR CHILDREN AND FAMILIES

Competency Drivers: Selection, Training, Coaching, Performance Assessment (fidelity)
Organization Drivers: Decision Support Data System, Facilitative Administration, Systems Intervention
Leadership Drivers: Technical, Adaptive

All drivers are Integrated & Compensatory.

Page 23

Building the Infrastructure: Practice Profiles

Driver: Selection
Differential Response Essential Functions: Engagement, Assessment, Partnership, Communication, Evaluation

• What prerequisites (skills, values, knowledge) would practitioners ideally have when hired or redeployed to implement DR?
• What features of DR practice would be helpful to assess through behavioral rehearsals during the selection process?
• What aspects of DR practice would be important to include in the caseworker job descriptions?

Page 24

Building the Infrastructure: Practice Profiles

Driver: Facilitative Administration
Differential Response Essential Functions: Engagement, Assessment, Partnership, Communication, Evaluation

• Will new policies or procedures need to be developed by the State or County to support DR essential functions?
• What role does leadership need to play at State and County levels to reduce administrative barriers to DR practice?
• How can State leadership institute policy-practice feedback loops?

Page 25

Function × Driver Example: Practice Profiles

Success Coach Example

Essential Function: Engagement
The ongoing ability to establish and sustain a genuinely supportive relationship with family while developing a partnership, establishing healthy boundaries, and maintaining contact as mutually negotiated. The ability to identify family’s positive supports and assess the best way to engage the family and/or who in the family to engage first.

KSAs:
• Develop rapport and build relationship with family members
• Listen actively and openly to families’ perspective, needs, and story
• Complete life circles, genograms, eco maps, case mapping, and other engagement/assessment tools
• Use basic Motivational Interviewing techniques with families to overcome resistance

Training:
• Motivational Interviewing (MI)
• Family Preservation Training
• Cultural Competency Training

Page 26

Improvement Cycles: Practice Profiles

Shewhart (1924); Deming & Juran (1948); Six Sigma (1990)

Plan: Decide what to do
Do: Do it (be sure)
Study: Look at the results
Act: Make adjustments
Cycle: Repeat over and over again until the intended benefits are realized
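The Plan-Do-Study-Act cycle can be expressed as a small loop. This is a toy sketch; the numeric "fidelity" values and the +20 adjustment per cycle are invented purely to make the loop run:

```python
def pdsa(plan, do, study, act, target, max_cycles=10):
    """Repeat Plan-Do-Study-Act until the intended benefit is realized."""
    state = plan()                    # Plan: decide what to do
    result = None
    for _ in range(max_cycles):
        result = do(state)            # Do: do it
        if study(result) >= target:   # Study: look at the results
            break                     # intended benefit realized
        state = act(state, result)    # Act: make adjustments, then cycle
    return result

# Toy run: a "fidelity percentage" improved by 20 points per cycle
out = pdsa(
    plan=lambda: 20,
    do=lambda s: s,
    study=lambda r: r,
    act=lambda s, r: s + 20,
    target=80,
)
print(out)  # 80
```

The `max_cycles` cap reflects that in practice the cycle is bounded by the initial implementation period rather than left to run indefinitely.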

Page 27

Improvement Cycles: Practice Profiles

Apply PDSA cycles to:
• Competency Drivers
• Organization Drivers
• Leadership
• Essential Functions of the Profile
• Data collection activities

Practice Profiles and accompanying implementation supports will change multiple times during initial implementation.

Cycles

Page 28

Program Improvement

Program Review Process
• Process and outcome data
• Detection systems for barriers
• Communication protocols

Questions to Ask
• What formal and informal data have we reviewed?
• What are the data telling us?
• What barriers have we encountered?
• Would improving the functioning of any Implementation Driver help address the barrier?

Fidelity Data

Page 29

Results from Child Wellbeing Project

Case Example

Component | T1
Selection | 1.44
Training | 1.33
Coaching | 1.27
Perf. Assessment | 0.78
DSDS | 0.18
Fac. Administration | 1.38
Systems Intervention | 1.29
Average Composite Score | 1.1
Fidelity (% of cases) | 18%

Case management model involved intense program development of core intervention components and accompanying implementation drivers. Clinical case management and home visiting model for families post-care.
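The Average Composite Score row appears to be the unweighted mean of the seven driver component scores; assuming that is the scoring rule, it can be checked directly (the dictionary below is transcribed from the T1 column of the table):

```python
# T1 driver component scores, transcribed from the Child Wellbeing
# Project table; the mean is assumed (not stated) to be unweighted.
t1_scores = {
    "Selection": 1.44,
    "Training": 1.33,
    "Coaching": 1.27,
    "Perf. Assessment": 0.78,
    "DSDS": 0.18,
    "Fac. Administration": 1.38,
    "Systems Intervention": 1.29,
}

composite = sum(t1_scores.values()) / len(t1_scores)
print(round(composite, 1))  # 1.1, matching the reported Average Composite Score
```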

Page 30

Case Example

• How did Implementation Teams improve fidelity?

– Intentional action planning based on implementation drivers assessment data and program data

– Improved coaching, administrative support, and use of data to drive decision-making; adapted model

– Diagnosed adaptive challenges, engaged stakeholders, inspired change

Using Data to Improve Fidelity

Page 31

Results from Child Wellbeing Project

Case Example

Component | T1 | T2 | T3
Selection | 1.44 | 2.00* | 1.89*
Training | 1.33 | 1.50* | 1.10
Coaching | 1.27 | 1.73* | 1.83*
Perf. Assessment | 0.78 | 1.34 | 2.00*
DSDS | 0.18 | 1.36 | 2.00*
Fac. Administration | 1.38 | 2.00* | 2.00*
Systems Intervention | 1.29 | 1.86* | 2.00*
Average Composite Score | 1.1 | 1.68* | 1.83*
Fidelity (% of cases) | 18% | 83% | 83%

Success Coach model involved intense program development of core intervention components and accompanying implementation drivers

Page 32

Positive Outcomes

High Fidelity

Did high-fidelity implementation lead to improved outcomes? Early outcomes include…
• Stabilized families
• Prevented re-entry of children into out-of-home placements

Page 33

Methods, Resources, and Feasibility

If fidelity criteria are already developed:
1. Understand the reliability and validity of instruments
   a. Are we measuring what we thought we were?
   b. Is fidelity predictive of outcomes?
   c. Does fidelity assessment discriminate between programs?
2. Work with program developers or purveyors to understand the detailed protocols for data collection
   a. Who collects the data (expert raters, teachers)?
   b. How often are data collected?
   c. How are data scored and analyzed?
3. Understand issues (reliability, feasibility, cost) in collecting different kinds of fidelity data
   a. Process data vs. structural data

Fidelity Data Collection

Page 34

Program Fidelity

• Fidelity has multiple facets and is critical to achieving outcomes

• Fully operationalized programs are prerequisites for developing fidelity criteria

• Valid and reliable fidelity data need to be collected carefully with guidance from program developers or purveyors

• Fidelity is an implementation outcome; effective use of Implementation Drivers can increase our chances of high-fidelity implementation

• Fidelity data can and should be used for program improvement

Summary

Page 35

Program Fidelity

Examples of fidelity instruments

• Teaching Pyramid Observation Tool for Preschool Classrooms (TPOT), Research Edition, Mary Louise Hemmeter and Lise Fox
• The PBIS fidelity measure (the SET), described at http://www.pbis.org/pbis_resource_detail_page.aspx?Type=4&PBIS_ResourceID=222

Articles
• Sanetti, L., & Kratochwill, T. (2009). Toward developing a science of treatment integrity: Introduction to the special series. School Psychology Review, 38(4), 445–459.
• Mowbray, C.T., Holter, M.C., Teague, G.B., & Bybee, D. (2003). Fidelity criteria: Development, measurement and validation. American Journal of Evaluation, 24(3), 315–340.
• Hall, G.E., & Hord, S.M. (2011). Implementing Change: Patterns, principles and potholes (3rd ed.). Boston: Allyn and Bacon.

Resources

Page 36

Stay Connected!

nirn.fpg.unc.edu www.scalingup.org

www.implementationconference.org

[email protected]@unc.edu