Welcome! The webinar will begin shortly… We are the Evidence: Evaluation of Peer Programs


TRANSCRIPT

Page 1

Welcome! The webinar will begin shortly…

We are the Evidence: Evaluation of Peer Programs

Page 2

We are the Evidence: Evaluation of Peer Programs

Presenters: Jean Campbell, Ph.D., Laysha Ostrow, Ph.D., and Bevin Croft, MPP, MASP

August 5, 2015

Page 3

Agenda

• Welcome, introductions, and housekeeping - Oryx Cohen, Acting Executive Director
• Website: www.power2u.org
• Presenters: Jean Campbell, Ph.D.; Laysha Ostrow, Ph.D., Live & Learn, Inc.; Bevin Croft, MPP, MASP, Human Services Research Institute
• Q&A session and close

Page 4

“We Are the Evidence” A Short Story of Consumer Evaluation

Professor Jean Campbell, PhD

August 5, 2015

Page 5

Research and evaluation ought to and can enhance consumer choice, power, and knowledge

--Jeanne Dumont, Mental Health Consumer/Survivor Research and Policy Work Group, 1992

Page 6

We are all the embodiment of our lived experience

Page 7

I am both scientist and artist

Page 8

I have also experienced madness

Page 9

In 1987 my worlds collided when I was hired by the California Network of Mental Health Clients to co-direct with Ron Schraiber the Well-Being Project

The result was the first consumer research project in history and the beginning of our journey as peers to come to voice through research and evaluation

Page 10

The types of questions you ask lead to the types of answers you get

Page 11

A Collision of Values

Biomedical research ignores the meaningful human aspects that encompass personal and social needs and all the factors that differentiate people from symptoms, brains, or molecules

• From the perspective of mental health professionals, mental illness refers to pathology
• Researchers monitor functionality, recidivism, and symptom control
• Providers tend to look at the way the results of services affect the mental health system

Page 12

• In a review of the literature, Ridgway (1988) found that there were wide differences between consumers and professionals on
  • the relative importance of treatment goals,
  • identification of problems,
  • barriers to service, and
  • needs and preferences for housing and supports
• The inclusion of consumers in the conduct of research challenged “expert-driven” research
• Participation of consumers in defining and measuring psychiatric services precipitated a “kind of turf war over controlling human beings in a landscape that includes an entire array of service options and widely divergent goals and definitions of mental health and quality of life” (Scott, 1993)

Page 13

THE WELL-BEING PROJECT

We Speak for Ourselves

Page 14

• The Well-Being Project (1986-1989) was conducted to determine what factors promote or deter the well-being of persons commonly labeled as mentally ill in California (N=500)
• It was written, administered, and analyzed entirely by mental health consumers
• Rather than reifying patienthood, the study affirmed our personhood
• In a world of stigma, poverty, loneliness, and injustice, our voices illuminated the value of self-help, creativity, meaningful work, human dignity, and respect

Page 15

Research Methods

Peer engagement, participatory instrument development, standardized interviewing, convenience sampling, data entry, statistical analysis, data interpretation, reporting

Creative Mediums

First-person narratives, improvisational theater, painting, collage, poetry, photography, documentary filmmaking, music, stand-up comedy

Page 16

Multi-Media ToolKITs Produced

• The Well-Being Project: Mental Health Clients Speak for Themselves, an academic report and research guide of study findings
• People Say I’m Crazy, 56-minute award-winning video documentary (available on YouTube)
• People Say I’m Crazy, compendium to the documentary with study findings integrated and reflected in edited testimony, poetry, prose, photography, and paintings (available from Amazon.com)

Page 17

What We Learned

• Examination of the evolving peer evaluation process suggested that the struggle between rival approaches in science is no mere question of the relative efficacy of different methodologies, but rather one facet of a clash of moral and spiritual outlooks
• Through research we were empowered to speak for ourselves as individuals and as a community of peers
• We were not alone. Rather, we found a consumer perspective honed from our lived experiences
• Therefore, involving consumers in research and evaluation is not only politically correct, but produces new knowledge about mental illness that is vastly different from that of mental health professionals, administrators, and policy-makers

Page 18

The Well-Being Project was about people telling their stories

• The project surveyed over 500 mental health consumers, family members, and service providers and collected over 40 hours of recorded testimony
• It identified new knowledge about the
  • power of personhood in promoting well-being
  • important role of empowerment, meaning in life, and self-efficacy in recovery
  • negative outcomes produced from coercion in the traditional service system

Page 19

Emerging Peer Research and Evaluation Model

• Research is organized from the bottom up
• Consumer perspective is sought & respected
• Participants are seen as study partners
• Issues of ownership & control are confronted
• Evaluation story is told
• Personal impact is captured
• Knowledge exchange is a key goal

Page 20

Redefining Recovery and Peer Support

• The goal of recovery is rooted in the rich history of the Mental Health Consumer/Survivor/Ex-Patient Movement and its development of organized peer support services
• Through efforts of consumers to research and evaluate peer support, a peer vision of recovery emerged
• In the early 1970s, large numbers of psychiatric patients were discharged from psychiatric hospitals to find themselves adrift in uncaring communities
• In response, they began to organize small groups for mutual support through self-help approaches and to advocate for social justice

Page 21

• By the 1990s, persons with mental illness began to more formally organize on a national level, championing the South African disability motto “Nothing About Us, Without Us”
• Peer advocates, providers, and researchers formed the Consumer/Survivor Mental Health Research and Policy Work Group with federal support from the Center for Mental Health Services
• Peer-run support groups matured, diversified, and increased in numbers
• By the turn of the 21st century, the push for recovery and the demand for accountability through measurement of outcomes and program satisfaction accelerated across the United States

Page 22

• To support peer advocacy efforts, consumers initiated their own studies:
  • identified the needs and preferences for housing and supports;
  • profiled state mental health systems;
  • developed measurement tools (satisfaction, empowerment, well-being, healing);
  • introduced consumer satisfaction teams; and
  • promoted the use of focus groups

Page 23

In a series of focus group sessions funded by CMHS, the Consumer/Survivor Mental Health Research and Policy Work Group began a systematic articulation of outcomes of mental health services and supports from a consumer perspective

Page 24

From the brainstorming, sorting, and ranking sessions, “maps” were generated that identified domains and performance indicators

• The most frequently identified concerns were mental health provider threats of involuntary treatments, subtle forms of coercion, lack of respect, and debilitating side effects of psychotropic medications
• Recovery, personhood, well-being, and liberty were identified as valued outcomes

Building on these preliminary studies, the federal government encouraged states to implement the value-based Consumer-Oriented Mental Health Statistics Improvement Program Report Card, which included some of the performance indicators peers had identified

Page 25

To survive in an era of evidence-based funding, peer providers realized that they needed to measure cost, effectiveness, quality, utilization and appropriateness of the services they provided

• The Peer Outcomes Protocol Project (POPP) developed, field-tested, and disseminated an evaluation protocol to measure service and programmatic outcomes for mental health community-based peer support programs

• Funded as part of the 1995-2000 University of Illinois at Chicago’s (UIC) National Research and Training Center (NRTC) on Psychiatric Disability, it was largely designed, directed, and implemented by consumer researchers, peer advocates and providers

Page 26

POPP Outcome Domains

Specific outcome domains organized into individual modules:

• Demographics
• Service Use
• Employment
• Housing/Community Life
• Quality of Life
• Well-Being (Recovery, Empowerment & Personhood)
• Program Satisfaction

Page 27

• Mental Health Recovery: What Helps and What Hinders? (Onken, Dumont, Ridgway, Doran & Ralph, 2002) was a national research project for the development of performance indicators of recovery for mental health systems that evolved from collaborative efforts among a team of consumer and professional researchers, state mental health authorities, and a consortium of sponsors
• Structured focus groups were used in nine states with a diverse cross-section of 115 consumers to gain knowledge on what helps and what hinders mental health recovery
• Performance indicators were developed and incorporated into a systems-level recovery protocol that is rapidly becoming a standard in the field

Page 28

Consumer-Operated Services Recognized as an Evidence-Based Practice by Federal Government

• Campbell and others in the Consumer-Operated Service Program (COSP) Multisite Research Initiative (1998-2010) found that peer-run service programs were effective as an adjunct to traditional mental health services in improving the outcomes of adults diagnosed with serious mental illness
• Analysis of over 1,800 participants revealed that those offered consumer-operated services as an adjunct to their traditional mental health services showed significant gains in positive subjective well-being (hope, self-efficacy, empowerment, goal attainment, and meaning of life) in comparison to those who were offered traditional mental health services only
• The COSP Multisite Research Initiative was the largest and most rigorous study of consumer-operated services ever conducted
  • Randomized controlled trial
  • 8 study sites & Coordinating Center
  • 1,827 participants
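The kind of between-group comparison of well-being change described above can be illustrated with a minimal sketch. The data below are simulated for illustration only (they are not the COSP dataset); the group sizes echo the study, but the scores and effect size are invented, and the study's actual analysis was more sophisticated than a single two-sample test.

```python
import random
import statistics

random.seed(0)

# Simulated well-being change scores (baseline to 12 months).
# Group sizes echo the study (797 TMHS-only, 803 COSP+TMHS);
# the scores themselves are invented for illustration.
tmhs_only = [random.gauss(0.00, 1.0) for _ in range(797)]
cosp_plus = [random.gauss(0.15, 1.0) for _ in range(803)]

def welch_t(a, b):
    """Welch's t statistic for two independent samples (b minus a)."""
    ma, mb = statistics.mean(a), statistics.mean(b)
    va, vb = statistics.variance(a), statistics.variance(b)
    se = (va / len(a) + vb / len(b)) ** 0.5
    return (mb - ma) / se

diff = statistics.mean(cosp_plus) - statistics.mean(tmhs_only)
print(f"difference in mean change: {diff:.3f}")
print(f"Welch t: {welch_t(tmhs_only, cosp_plus):.2f}")
```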

Page 29

[Figure: Change in Well-Being Over Time, COSP+TMHS vs. TMHS Only. Line chart of mean well-being scores (y-axis -.2 to .3) at Base (N=1600), 4 months (N=1441), 8 months (N=1357), and 12 months (N=1272), for TMHS-only (n=797) and COSP+TMHS (n=803) groups]

Page 30

COSP Study Results Summary

• The greatest gains in well-being were found for the participants who used the peer support services the most
• Variations in well-being effects across sites were unrelated to formal COSP models of peer support service delivery
• Most important, analyses of COSP common ingredients and outcome results established evidence of a strong relationship between key peer practices and positive subjective well-being outcomes

Page 31

[Figure: Well-Being by Intensity of COSP Use. Line chart of mean well-being scores (y-axis -.2 to .3) at Base (N=516), 4 months (N=479), 8 months (N=449), and 12 months (N=404), for High (n=77), Low (n=79), and None (n=360) intensity-of-use groups]

Page 32

COSP Common Ingredients

• The COSP Multisite Research Initiative identified 46 common ingredients (CIs) of consumer-operated service programs and established a significant link between the service elements of peer-run programs (structures, values, and practices) and positive psychological functioning of participants
• Some of the most powerful CIs include opportunities to tell one’s story, creative self-expression, informal and formal peer support, knowledge of the consumer movement, and a safe environment
• The Fidelity Assessment Common Ingredients Tool (FACIT), based on the common ingredients, was developed and tested to evaluate how effectively a peer-run program is implementing evidence-based peer practices
• Missouri DMH funds use of the FACIT in its drop-in centers, conducted by several 2-person teams of certified FACIT peer evaluators
• The Ohio consumer network has trained over 20 certified FACIT peer evaluators
• Inquiries regarding the FACIT protocol and training materials may be directed to Dr. Jean Campbell at [email protected]

Page 33

• COS Evidence-Based ToolKIT and video available for download from the Center for Mental Health Services/SAMHSA
• Book: Clay, S. (Ed.) On Our Own, Together (2005)
• COSP website soon to be published

Page 34

Peer research and evaluation models have the capacity to examine the context in which evaluation occurs, going beyond the statistics that record numbers to include the meaningful interactions of those living with psychiatric diagnoses

As a result, new questions, methods, and ways of interpreting data have emerged in the margins of traditional services research

Page 35

Recommendations

• Consumer evaluation assessments should be inexpensive in terms of time, money, and labor.
• Consumer evaluation assessments should be standardized to enable comparisons between different programs or points in time.
• It is essential to practice value-added consumer evaluations in which plans for use of assessment information are established before beginning.

Page 36

Evaluation Checklist

• Why does this evaluation need to be done?
• How will you determine the concepts of interest (i.e., focus groups of consumers/family members, staff, other stakeholders, consultants)?
• Who is the target population? Define the target population in terms of factors that may influence the goals of assessment, including sex, age, socio-economic status, race, and program variables.
• Who will have a personal or professional interest in the results of the evaluation? What is your results distribution plan?
• What are the evaluation objectives? Are you looking at one point in time, between different groups, or changes over time?
• What are the evaluation goals? Is the goal to identify necessary improvements, the elimination of unsatisfactory services, or continuous quality improvement?
• What type of data collection will be used to answer your questions? Would it be best to use mail, telephone, or face-to-face surveys?
• Who will be involved in the data collection? Will you use interviewers? If so, who will serve as interviewers (e.g., staff members, students, or peers)?
• Determine how you will measure the concepts of interest. Will you use a preexisting scale? Are the scale items neutral, clear, and operationally defined? Is it better to use open-ended or closed-ended questions, or a combination of both?
• How will consumers be involved in scale construction, data collection, and data analysis?
• How will you train interviewers, peers, and other stakeholders about the project?

Page 37

References

• Campbell, J., and Schraiber, R. (1989). In Pursuit of Wellness: The Well-Being Project, We Speak for Ourselves, Vol. 6, California Department of Mental Health, Office of Prevention, Sacramento, CA.

• Campbell, J., and Einspahr, K. (2004). Peer Outcomes Protocol Project (POPP) Toolkit. Program in Consumer Studies and Training, Missouri Institute of Mental Health, University of Missouri—St. Louis and the University of Illinois-Chicago, Department of Psychology, National Institute of Disability Research and Training: Chicago, IL.

• Campbell, J. (2005). The historical and philosophical development of peer-run support programs. In S. Clay (Ed.): On Our Own, Together: Peer Programs for People with Mental Illness, Vanderbilt Press: TN.

• Campbell, J. (2006, June 1). Effectiveness findings of a large multisite study of consumer-operated service programs (1998-2006), Final Report for the SAMHSA/CMHS COSP Multisite Research Initiative Coordinating Center 4UDI SM52328-04 Project CG004186, Center for Mental Health Services, Substance Abuse and Mental Health Services Administration/HHS.

• Campbell, J. (2009). We are the evidence: An examination of service user research involvement as voice. In J. Wallcraft, B. Schrank and M. Amering (Eds.) Handbook of Service User Involvement in Mental Health Research. John Wiley and Sons: West Sussex, UK.

• Campbell, J., Curtis, L., Deegan, P., Mead, S., and Ringwald, C. (2010, August). Evidence Based COS Practices KIT, CMHS/SAMHSA: Rockville, MD.

• Campbell, J. (2011). Outcomes measurement and the mental health consumer movement: Research, recovery, and recognition in the human services. In J. Magnabosco and R. Manderscheid (Eds.) Outcomes Measurement in the Human Services 2nd Edition, NASW Press: Washington, D.C.

Page 38

Peer Respites in the U.S.: Data and Directions

Laysha Ostrow, PhD
University of Southern California
Live & Learn, Inc.
August 5, 2015

Page 39

Introduction

• Discuss survey of characteristics conducted in summer/fall of 2014 (N=17 from across the U.S.)
• Research base to date on peer respites and how this relates to organizational structures
• Hope to highlight throughout issues to think about in design, implementation, and evaluation

Page 40

Where, What, Who, & Why

Page 41

Where are the current & planned peer respites?

[Figure: Current & Planned Peer Respite Programs]

Page 42

Why are there peer respites?

• Inpatient and emergency service utilization is the driver of a lot of unnecessary mental health system spending
• And there is a human cost to these services that includes iatrogenic harm, life disruption, and dependence on the mental health system rather than independence
• Peer respites are created with the idea that more humane, trauma-sensitive, and less coercive supports could be available and divert both of these “costs”

Page 43

What are peer respites?

• Voluntary, short-term residential programs designed to support individuals experiencing or at risk of experiencing a mental health crisis
  • Intended to build supportive connections and avert the need for psychiatric emergency services
• Staffed and operated by people with lived experience of the mental health system who have professional training in crisis support
  • Seek to build mutual, trusting relationships

Page 44

Overview of Operational Models

• Peer-run indicates that the board of directors is at least 51% peers
  • Peers staff, operate, and oversee the respite at all levels
• Peer-operated indicates that although the board is not a majority peers, the director and staff are peers
  • Often attached to a traditional provider
• Mixed are embedded in a traditional provider but have peer staff
  • Peers do not have to be in leadership roles

Page 45

Why have these “models”?

• Based on existing research, who is involved and how may have an impact on service user outcomes
• Values of mutuality & equality in peer support may be even more important in crisis support
• Structures and processes of peer respites need to be studied, along with outcomes of guests
  • Who, what, where, why….?
  • How are they different or the same?
  • Fidelity measurement to understand what are the “best” practices
  • May be no meaningful differences based on identification of staff and leadership as people with “lived experience”

Page 46

Who works there?

                                             Mean    Range (Min-Max)
Full-time peer staff                         3.65    0 – 7
Part-time peer staff                         5.71    1 – 13
Peer volunteers                              3.1     0 – 22
Number of staff per shift on weekdays        2.12    1 – 4
Number of staff per shift on weekends        1.76    1 – 3
Number of staff per shift at night           1.18    1 – 2
Also employs non-peer staff (# of programs)  2       -

Page 47

Who can go there?

• Harm reduction approach
  • One small house in one community can’t change everything, but it could provide an opportunity to reduce harm to some people some of the time
• 47% of peer respites in the country have “no restriction” related to experiencing suicidality
• 41% have “no restriction” related to housing status

Page 48

Capacity & Census

                                        Mean (or %)   Range (Min-Max)
Number of guests accommodated at once   4             2 – 8
Average length of stay (days)           5.9           0 – 11
Maximum length of stay (days)           9.0           5 – 29
Average program census                  51%           29% – 99%

Page 49

Financing

Page 50

[Figure: Operating Budgets. Bar chart of the number of programs (x-axis 0 – 4.5) in each operating budget range: $150,000 - $199,000; $200,000 - $249,000; $250,000 - $299,000; $300,000 - $349,000; $350,000 - $399,000; $400,000 - $449,000; $450,000 - $499,000; $500,000 or more]

Page 51

[Figure: Funding Sources. Bar chart of the number of programs (x-axis 0 – 16) reporting each funding source: state revenues or block grant; county or local behavioral health agency; SAMHSA; private foundation; private donations]

Page 52

[Figure: Estimated Dollars by Funder. Bar chart of estimated dollars (x-axis $0 – $6,000,000) by funder: state revenues or block grant; county or local behavioral health agency; SAMHSA; private foundation; private donations]

Page 53

Research

Page 54

Research Base

• One RCT of a peer respite (Greenfield, et al., 2008)
  • Greater satisfaction
• Second Story evaluation (Croft & Isvan, 2015; next segment)
• Peer support has a moderate evidence base (Chinman, et al., 2014)
• Acute residential crisis programs have a moderate evidence base (Thomas, et al., 2014)

Page 55

Research on Peer-Run Organizations

• Previous research on peer-run organizations has found that those that are more “lateral”, participatory, and democratic have shown improvements for service users in outcomes of empowerment and stigma reduction over those that are more hierarchical (Segal, et al., 2013)
• Research on degree of involvement by people with lived experience in peer-run organizations has shown that those that include more of us in operations are more likely to engage in the strategies that lead to these outcomes (Ostrow & Hayes, 2015)

Page 56

What’s next for peer respites in practice?

• Refine and clarify eligibility criteria and length of stay requirements
  • Homelessness
  • Suicidality
  • Shorter stays?
• Align program operations with mission
  • Organizational structure: peer-run vs. peer-operated
  • Financing: capacity for Medicaid reimbursement
  • Relationship to traditional mental health system: consultation with clinical staff, referrals, and outreach

Page 57

Next steps

• Developing adapted FACIT for peer respite settings
  • Includes measurement of organizational structure (peer-run, -operated, -staffed)
  • Measures actual qualities of the program, rather than relying solely on the blunt categories related to involvement of people with lived experience on staff
  • Adaptation will include particulars related to short-term, overnight crisis programs
• Also developing more robust measures of IPS in respites
• Will repeat/update survey on characteristics this fall 2015

Page 58

Resources & Contact

• Toolkit for Evaluating Peer Respites (includes data presented here): http://www.madinamerica.com/2014/11/toolkit-evaluating-peer-respites/
• Ostrow, L. & Croft, B. (2015). Peer Respites: A Research and Practice Agenda. Psychiatric Services. http://dx.doi.org/10.1176/appi.ps.201400422

Contact: Laysha Ostrow
[email protected]

Page 59

Evaluation Strategies and Early Results of the 2nd Story Peer Respite Evaluation

Bevin Croft, MA, MPP
Research Associate, Human Services Research Institute

March 31, 2015

Page 60

Presentation Overview

1. Describe 2nd Story peer respite evaluation
2. Present some results
3. Discuss next steps for peer respite research

Page 61

2nd Story Respite House

Funded by SAMHSA Transformation Grant, administered through Encompass, a community-based mental health organization, overseen by the County behavioral health department, evaluated by HSRI

Page 62

Outcome Evaluation Questions: To what extent has the program met its objectives to…

• Reduce emergency hospitalizations
• Reduce costs for services
• Foster recovery for guests
• Increase meaningful choices for recovery

Process Evaluation Research Questions:

• How was the program implemented?
• What is the relationship between the program and the community?
• What lessons can we learn for future programs?

Page 63

The life of an external evaluator

• Regular calls with an evaluation liaison team, including program manager, the data collectors, and a county data person
• Frequent contact: emails, calls, texts
• Site visits crucial for establishing collaborative relationships
• Worked to feed back information as quickly as possible through quarterly reports and ad hoc meetings
• Involved program in analysis and dissemination of results

Page 64

Evaluation Data Sources

• County service use data
• Stakeholder interviews
• Process/fidelity measures
• Meeting observations
• Site visits
• Document reviews
• National Outcome Measures
• Guest satisfaction survey
• Recovery survey

Page 65

Process Measures – IPS Core Competencies

Page 66

Preliminary Examination of Impact on Service Use

• May 2011 to June 2013
• County administrative data, including
  • Demographics
  • Diagnosis and GAF
  • Service use (for study period and historic)
• Propensity score-matched comparison group
• Outcome: Inpatient and Emergency Service (IES) use after date of first respite stay (or index date for comparison group)
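The propensity score-matched comparison group mentioned above can be sketched in a few lines: fit a model of the probability of being a respite guest given observed covariates, then greedily pair each guest with the closest-scoring unmatched person from the administrative pool. Everything here is a simplified illustration with simulated data; the actual evaluation used real county records and its own matching specification.

```python
import math
import random

random.seed(1)

# Simulated covariates (e.g., standardized age, prior service use) for
# respite guests (treated) and an administrative pool (untreated).
# Values are invented for illustration only.
def draw(n, treated):
    shift = 0.5 if treated else 0.0
    return [[random.gauss(shift, 1.0), random.gauss(shift, 1.0)] for _ in range(n)]

treated = draw(50, True)
pool = draw(500, False)

X = treated + pool
y = [1] * len(treated) + [0] * len(pool)

# Fit a logistic regression P(treated | x) by simple gradient ascent.
w, b, lr = [0.0, 0.0], 0.0, 0.1
for _ in range(500):
    gw, gb = [0.0, 0.0], 0.0
    for xi, yi in zip(X, y):
        p = 1.0 / (1.0 + math.exp(-(w[0] * xi[0] + w[1] * xi[1] + b)))
        gw[0] += (yi - p) * xi[0]
        gw[1] += (yi - p) * xi[1]
        gb += yi - p
    n = len(X)
    w = [w[0] + lr * gw[0] / n, w[1] + lr * gw[1] / n]
    b += lr * gb / n

def score(x):
    """Estimated propensity score for covariate vector x."""
    return 1.0 / (1.0 + math.exp(-(w[0] * x[0] + w[1] * x[1] + b)))

# Greedy 1:1 nearest-neighbor matching on the propensity score.
pool_scores = [(score(x), i) for i, x in enumerate(pool)]
used = set()
matches = []
for t in treated:
    st = score(t)
    _, best_i = min((abs(s - st), i) for s, i in pool_scores if i not in used)
    used.add(best_i)
    matches.append((t, pool[best_i]))

print(f"matched {len(matches)} treated units to comparison units")
```

Production analyses would typically also check covariate balance after matching and may use calipers or matching with replacement; this sketch omits those refinements.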

Page 67

Who used the respite?

• Guests stayed an average of 2.3 times during the two-year study period
• Total respite days (across all visits) averaged 28.4
• Average age was 44 years old, and about half were women
• About 13% were Hispanic, and 72% were non-Hispanic and white
• About 6% were married, and about 5% were employed
• There were no significant differences between the guests and the comparison group in the analysis of County service use data

Page 68

Analysis and Results

Stage One: Logistic Regression
• Outcome: Likelihood of IES use after the respite/index date, controlled for relevant covariates
• Respite guests were 70% less likely to use inpatient and emergency services*
• But likelihood of IES use increased with each additional day of respite stay*

Stage Two: OLS Regression
• Outcome: Total IES hours for those who used these services after the respite/index date, controlled for relevant covariates
• Respite days were associated with significantly fewer inpatient and emergency service hours*
• But the longer the stay, the more IES hours the guests were likely to use

*All associations reported here are significant at p≤.05

Page 69

Stakeholder Interviews

• Central question: What is the program's impact on guests' lives?
• 23 in-depth interviews, 19 with guests
  • Focus on transition-age youth, high inpatient/crisis service users, and those dissatisfied with the program
• 6 interviews with 9 providers
• 11 interviews with staff
• 30 minutes to 1 hour each
• Recorded, transcribed, and organized into themes

Stakeholder groups: guests, staff, leadership and oversight, providers, other community stakeholders

Page 70

Guest Surveys

Peer data collectors invited all guests to take surveys at entry, discharge, and six months out. Surveys asked about recovery, quality of life, and perceptions of the program.

Some issues that came up…
• Burden on guests' time
• Ethics and confidentiality
• Losing people to follow-up
• Data collector recruitment and retention
• Conflict of interest (staff as data collectors)
• Feeding the information back to staff

Page 71

Preliminary Results

Interview response themes:
• Taking a rest
• Living the life you want
• Connecting to a peer community
• Developing relationships
• Being treated as an equal
• Finding direction
• Gaining independence

Survey areas of focus:
• Quality of life
• Social connectedness
• Functioning
• Recovery
• Substance use
• Trauma and violence
• Housing, education, and employment
• Demographics

101 guests completed both baseline and discharge surveys, May 2011 – December 2014

Page 72

Connecting to a Peer Community

2nd Story described as…
• support network
• community base
• this free love thing
• a second family
• like a commune
• a warm, loving place
• communal, comfortable, welcoming
• community and continuance

• 61%: I contribute to my community (n=94)
• 58%: I have a sense of belonging (n=93)
• 83%: I feel I belong in my community (n=96)

"I'd say that it gave you a sense of identity. It gave you a sense of belonging. It showed me that there are people whose minds work the way mine does who are in control of their minds, don't let their minds control them—who are hugely intelligent and really run their own lives."

Page 73

Taking a Rest

"[2nd Story] is my safe place. I come here, and I can breathe. I come here, and I have people who love me. You know? It's my safe place. I love it here. I don't know why I would choose any other place but here."

• 62%: I feel alert and alive (n=94)
• 63%: I am able to deal with stress (n=94)

Note: All percentages indicate improvement in guest ratings from first baseline to last discharge and are statistically significant (p ≤ .05)
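A minimal sketch of that baseline-to-discharge comparison (the ratings below are simulated Likert scores, not the evaluation's data, and the actual significance test used may differ):

```python
import numpy as np
from scipy.stats import ttest_rel

# Simulated 1-5 agreement ratings for 94 guests at baseline and discharge
rng = np.random.default_rng(1)
baseline = rng.integers(1, 5, size=94)
discharge = np.clip(baseline + rng.integers(0, 3, size=94), 1, 5)

# Paired test: did ratings improve from first baseline to last discharge?
t_stat, p_value = ttest_rel(discharge, baseline)

# Share of guests whose rating rose (the kind of percentage reported above)
improved = np.mean(discharge > baseline)
```

The key point is the pairing: each guest's discharge rating is compared with that same guest's baseline rating, so the percentages on these slides describe within-person change rather than a difference between two separate groups.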

Page 74

Living the Life You Want

"You know, you are all there for, to kind of discover wellness, I guess, discover who you are. I think Freud said, 'To work and to love.' And that really is what it is."

• 60%: I have more good days than bad (n=95)
• 62%: I have decent quality of life (n=97)
• 37%: Overall health status (n=92)

Page 75

Developing Relationships

• 82%: I am happy with the friendships I have (n=94)
• 87%: I have people with whom I can do enjoyable things (n=94)
• 85%: In a crisis, I would have the support I need from family or friends (n=96)

"2nd Story made me feel comfortable enough to go up to people and talk to people. Not just staff, but clients too. I realize that I wasn't alone. There was other people there, quiet too, like me. And so it just helped me out a lot. I took baby steps, and it is easier for me to talk to people now. Because I've got 2nd Story."

Page 76

Developing Relationships, cont.

• 61%: I have trusted people I can turn to for help (n=94)
• 68%: I have at least one close mutual relationship (n=98)

"I really like that we can have a real serious conversation between each other and exchange information from each other. It is not like a one-way talking…it is a two-way relationship and communication, and it's really genuine…We're just really real with each other. And they tell me when something's not working for them…They're real. It's like a friendship instead of a very closed, cold-hearted professional support…There's connection. There's real connection at 2nd Story."

Page 77

Being Treated as an Equal

"I don't feel less than in this environment. I feel like across the table we're all equals. Even though I'm not peer staff, still...[In traditional crisis services] I may have come out of this feeling like somehow I'm defective. You know, if this wasn't around and there was just the hospital and crisis house, I would feel in those environments very mentally ill. Like labeled that. Like, 'These are mentally ill patients.' And I'm not a patient. I'm a person. And I get treated like a full human being."

• 59%: I control the important decisions in my life (n=95)
• 83%: I am able to control my life (n=97)

Page 78

Finding Direction

"[Without 2nd Story,] I wouldn't have as much direction in my life or a sense of hope that I can recover...it is hard to say exactly where I'd be, but I can see how it has changed me or helped me to change patterns in my life. I think if I didn't have that, I'd just be stuck in the same old cycle of doing the same thing over again and having the same exact thing happen to me over again...Really being mindful, is the word, being able to see that there is a way to be able to change. Live the life that you want to live, basically. And I'm striving to do that. And I have accomplished a lot of goals that I didn't think I could accomplish since I first came to 2nd Story."

• 70%: I am growing as a person (n=96)
• 65%: I feel hopeful about my future (n=85)

Page 79

Gaining Independence

"I remember when I first started coming here. I was like a little kid. I was like 'Oh My God'. People weren't saying 'I have to do this' and always reminding me about it. It was really a joy. But I've matured a lot, so I'm pretty much used to it now."

[At 2nd Story] you feel as though you can make your own decisions. And that people don’t take full control of your lives and tell you what to do, when to eat. The door swings both ways pretty much.

• 64%: I am using my personal strengths, skills, and talents (n=97)
• 63%: I believe I can make positive changes in my life (n=86)

Page 80

To sum up these results…

• We observed statistically significant improvements in quality of life, wellness, hope, self-determination, independence, personal relationships, and community connections for respite guests
• Respite is significantly associated with reductions in inpatient and emergency service use, but:
  • The association between respite and inpatient and emergency service use is complex and non-linear
  • Unobserved characteristics, such as housing instability and social supports, likely affect the relationship between respite and inpatient and emergency service use

Page 81

Future Evaluation Directions

Process Evaluation
• Was the program implemented as intended?
• What strategies worked well, and what needed to be changed?
• What is the ideal organizational structure for 2nd Story?

Outcomes Evaluation
• What guest characteristics are predictive of program success?
• What is the impact of the program on the lives of the peer staff?
• What is the impact of the program on the mental health system as a whole? On the community?

Service Use Analysis, Round 2
• Larger sample size and longer time period
• Detect more variation, verify previous findings, or see whether they have changed
• What guest characteristics are associated with reductions in inpatient/emergency service use?

Page 82

What’s next for peer respite research?

Capture additional outcomes and covariates
• Quality of life
• Housing stability
• Social connectedness
• Cost

Employ a mix of research approaches
• Experiments and quasi-experiments
• Qualitative inquiry
• Participatory methods

Charting fidelity
• Formative process evaluation
• Instrument development (FACIT adaptation, IPS Core Competencies)

Page 83

References and contact

Croft, B., & Isvan, N. Impact of the 2nd Story Peer Respite Program on Inpatient and Emergency Service Use. Psychiatric Services. Available online first at http://ps.psychiatryonline.org/doi/abs/10.1176/appi.ps.201400266

Bevin Croft, Research Associate
Human Services Research Institute
2336 Massachusetts Avenue
Cambridge, MA 02140
Office: 617-844-2536
Fax: (617) [email protected]