
Page 1: Program Evaluation

PROGRAM EVALUATION

Michelle Mohr Carney, Ph.D.

Institute for Nonprofit Organizations

University of Georgia

Page 2: Program Evaluation

PURPOSE/OBJECTIVES

• Increase your knowledge of processes involved in program evaluation

• Provide information and resources to help you design and conduct your own program evaluation

Page 3: Program Evaluation

Introduction

• Why evaluate?
• What is evaluation?
• What does evaluation do?
• Kinds of evaluation

Page 4: Program Evaluation

Why Evaluate?

• Determine program outcomes
• Identify program strengths
• Identify and improve weaknesses
• Justify use of resources
• Increased emphasis on accountability
• Professional responsibility to show effectiveness of program

Page 5: Program Evaluation

What is Program Evaluation?

• Purposeful, systematic, and careful collection and analysis of information used to document the effectiveness and impact of programs, establish accountability, and identify areas needing change and improvement

Page 6: Program Evaluation

What Evaluation Does

• Looks at the results of your investment of time, expertise, and energy, and compares those results with what you said you wanted to achieve

Page 7: Program Evaluation

Kinds of Evaluation

• Outcome
• Implementation
• Formative
• Summative

Page 8: Program Evaluation

TYPES AND TIMING OF EVALUATION

Formative: provides information on program’s activities and how the program is progressing (valuable in developing and improving a program; accreditation activity)

Summative: examines how well a program has achieved its goals (valuable to funding sources, etc.)

Page 9: Program Evaluation

Outcome Evaluation

What: Identifies the results or effects of a program
When: You want to measure students’ or clients’ knowledge, attitudes, and behaviors as a result of a program
Examples: Did program increase achievement, reduce truancy, create better decision-making?

Page 10: Program Evaluation

Implementation Evaluation

What: Documents what the program is and to what extent it has been implemented
When: A new program is being introduced; identifies and defines the program; identifies what you are actually evaluating
Examples: Who receives the program? Where is the program operating? Is it being implemented the same way at each site?

Page 11: Program Evaluation

Overview – The Process

• Planning
• Development
• Implementation
• Feedback

Page 15: Program Evaluation

Scope/Purpose of Evaluation

• Why are you doing the evaluation?

– mandatory? program outcomes? program improvement?

• What is the scope? How large will the effort be?

– large/small; broad/narrow

• How complex is the proposed evaluation?

– many variables, many questions?

• What can you realistically accomplish?

Page 16: Program Evaluation

Resource Considerations

• Resources
  – $$
  – Staff
    • Who can assist?
    • Need to bring in expertise?
    • Do it yourself?
    • Advisory team?
  – Time
• Set priorities
• How you will use the information

Page 17: Program Evaluation

Evaluation Questions

What is it that you want to know about your program?

operationalize it (make it measurable)

Do not move forward if you cannot answer this question.

Page 18: Program Evaluation

PURPOSES OF EVALUATIONS AND CHOICES OF EVALUATORS

Purpose | Commissioning Body | Types of Evaluators
Program improvement | Agency board and staff | External, internal, or combination
Accreditation | Agency board and staff | Chosen by accrediting body
Summative | Funding sources, agency board, or staff | External, internal, or combination
Formative | Funding sources, agency board, or staff | External reviewers (e.g., auditors)
Justify changes in program or leadership | Board or other authority | External evaluators
Response to criticisms/concerns | Board, staff, or other authority | External evaluators or internal staff

Page 19: Program Evaluation

KINDS OF DATA NEEDED FOR SOUND EVALUATION

1. Coverage (Does the program meet the community’s needs?)

2. Equity (Does the program meet the needs of women, minorities, etc.?)

3. Process (Extent to which the program is implemented as designed)

4. Effort (Extent to which the program is producing results)

5. Cost-efficiency (Cost-benefit analysis; see the sketch after this list)

6. Outcomes (Impact of the program and achievement of its outcomes)
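Item 5, cost-efficiency, comes down to simple arithmetic once costs and outcomes have been counted. A minimal sketch in Python, using invented figures for a hypothetical tutoring program (none of these numbers come from the presentation):

```python
# Hypothetical cost-per-outcome calculation. Every figure below is an
# illustrative assumption, not data from the presentation.
total_program_cost = 48_000.00   # staff, materials, facilities for the year
participants_served = 120        # total youngsters tutored
successful_outcomes = 90         # youngsters whose academic performance increased

cost_per_participant = total_program_cost / participants_served   # 400.00
cost_per_outcome = total_program_cost / successful_outcomes       # 533.33

print(f"Cost per participant: ${cost_per_participant:,.2f}")
print(f"Cost per successful outcome: ${cost_per_outcome:,.2f}")
```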

Page 20: Program Evaluation

GETTING TO OUTCOMES

Name some things agencies measure or track in their programs (e.g., number of staff, units of service, number of participants).

Page 21: Program Evaluation

PROGRAM OUTCOME MODEL

INPUTS (Resources) | ACTIVITIES (Services) | OUTPUTS (Products)
Money | Shelter | Classes taught
Staff | Training | Counseling sessions
Volunteers | Education | Educational materials
Equipment & supplies | Counseling | Hours of service delivered
 | Mentoring |

Constraints: laws, regulations, funders’ requirements

Page 22: Program Evaluation

PROGRAM OUTCOME MODEL

INPUTS (Resources) | ACTIVITIES (Services) | OUTPUTS (Products) | OUTCOMES (Benefits for people)
Money | Shelter | Classes taught | New knowledge
Staff | Training | Counseling sessions | Increased skills
Volunteers | Education | Educational materials | Changed attitudes
Equipment & supplies | Counseling | Hours of service delivered | Values modified
 | Mentoring | | Behavior improved
 | | | Condition changed

Constraints: laws, regulations, funders’ requirements
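One way to keep the columns straight in practice is to record a program’s logic model as a small data structure. A minimal sketch in Python; the dictionary layout is an assumption for illustration, with contents drawn from the model above and the parenting education example on the next slide:

```python
# A program logic model as a plain dictionary, following the
# INPUTS -> ACTIVITIES -> OUTPUTS -> OUTCOMES columns above.
logic_model = {
    "inputs": ["money", "staff", "volunteers", "equipment", "supplies"],
    "activities": ["group workshops", "role plays", "group discussions"],
    "outputs": ["six workshops conducted", "parents from 10 families attend"],
    "outcomes": [
        "parents' understanding of developmental issues increases",
        "parents provide more age-appropriate guidance",
    ],
    "constraints": ["laws", "regulations", "funders' requirements"],
}

for stage, items in logic_model.items():
    print(f"{stage.upper()}: {'; '.join(items)}")
```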

Page 23: Program Evaluation

PARENTING EDUCATION PROGRAM

• Parents from 10 families attend the workshops
• Six group workshops are conducted
• Parents’ understanding of children’s developmental issues increases
• Parents provide more age-appropriate guidance to children
• Parents participate in role plays and group discussions

Page 24: Program Evaluation

AFTER-SCHOOL PROGRAM

• Children master new individual and group activities
• 15 at-risk children attend after-school sessions at the church
• Activities are designed to encourage cooperative play
• Children’s social skills improve
• Children make more positive use of free time outside of the program

Page 25: Program Evaluation

TUTORING PROGRAM

• 20 school-agers in grades 4 to 8 are matched with high school tutors
• Youngsters’ academic performance increases
• Youngsters indicate increased belief in their abilities to learn new subjects
• Youngsters receive one-on-one help in reading and math
• Tutors emphasize the importance of education

Page 26: Program Evaluation

CONFLICT MANAGEMENT PROGRAM

• Youths are involved in fewer physical conflicts
• Discussion sessions explore experiences with stereotyping and cultural differences
• Youths display greater tolerance of differing points of view
• Youths practice communication and negotiation skills
• Youths report more willingness to have friends with backgrounds different from theirs

Page 27: Program Evaluation

INPUTS THROUGH OUTCOMES: THE CONCEPTUAL CHAIN

Inputs → Activities → Outputs → Initial Outcomes → Intermediate Outcomes → Longer-term Outcomes

Page 28: Program Evaluation

OUTCOMES VS. INDICATORS VS. TARGETS

Outcomes: Benefits for participants during or after their involvement with a program (e.g., Parents read to their preschoolers more often).

Outcome Indicators: The specific information collected to track a program’s success on outcomes (e.g., The number and percent of parents who read to their preschoolers more often now than before coming to the program).

Page 29: Program Evaluation

OUTCOMES VS. INDICATORS VS. TARGETS (CONT.)

Outcome Targets: Numerical objectives for a program’s level of achievement on its outcomes (e.g., 75% of parents will report an increase in how often they read to their preschoolers).
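Once the indicator and target are defined, checking performance against the target is mechanical. A minimal sketch, assuming hypothetical follow-up survey responses (the 75% target is from the slide; the response data are invented):

```python
# Outcome indicator: number and percent of parents who read to their
# preschoolers more often now than before the program.
# The response list below is hypothetical illustration data.
reads_more_often = [True, True, False, True, True, True, False, True, True, False]

number_achieving = sum(reads_more_often)
percent_achieving = 100 * number_achieving / len(reads_more_often)
print(f"{number_achieving} of {len(reads_more_often)} parents "
      f"({percent_achieving:.0f}%) report reading more often")

TARGET_PERCENT = 75  # outcome target from the slide
print("Target met" if percent_achieving >= TARGET_PERCENT else "Target not met")
```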

Page 30: Program Evaluation

KEEP EXPECTATIONS MODEST

Outcome findings will not tell you:
• Whether the program caused the outcome
• Why this level of outcome was achieved
• What actions to take to improve the outcome

Page 31: Program Evaluation

“PROGRAM” FOR OUTCOME MEASUREMENT PURPOSES

A set of related activities and outputs directed at common or closely related purposes that a meaningful portion of the agency’s resources is dedicated to achieve

Page 32: Program Evaluation

CRITERIA FOR CHOOSING FIRST PROGRAM FOR OUTCOME MEASUREMENT

Consider a program if:
• It has a recognizable and reasonably defined mission and clientele
• It represents a substantial portion of the agency’s activity
• Funders or others have been asking about the program’s results
• Program supervisors and staff are likely to be supportive of the outcome measurement effort

Page 33: Program Evaluation

SAMPLE TIMELINE FOR PLANNING AND IMPLEMENTING OUTCOME MEASUREMENT IN A PROGRAM

Phases: Initial Preparation (Months 1–7), Trial Run (Months 8–?), Implementation (Months ?+1 through ?+5)

Step | Phase | Duration (months)
1. Get ready | Initial preparation | 2
2. Choose outcomes | Initial preparation | 2
3. Specify indicators | Initial preparation | 1
4. Prepare to collect data | Initial preparation | 2

Page 34: Program Evaluation

SAMPLE TIMELINE FOR PLANNING AND IMPLEMENTING OUTCOME MEASUREMENT IN A PROGRAM (CONT.)

Step | Phase | Duration (months)
5. Try out system | Trial run | 5
6. Analyze findings | Trial run | 3
7. Improve system | Trial run | 1
8. Use findings | Implementation | 5

Page 35: Program Evaluation

SOURCES OF IDEAS FOR OUTCOMES

• Program documents
• Program staff
• Key volunteers
• Program participants
• Participants’ parents or other caregivers
• Records of complaints about the program’s value or relevance

Page 36: Program Evaluation

SOURCES OF IDEAS FOR OUTCOMES (CONT.)

• Programs or agencies that are “next steps” for your participants
• Programs with missions, services, and participants similar to yours
• Outside observers of your program in action

Page 37: Program Evaluation

PROGRAM OUTCOME CRITERIA

For each outcome:
• Is it reasonable to think the program can influence the outcome in a non-trivial way, even though it can’t control it?
• Would measurement of the outcome help identify program successes and pinpoint problems?
• Will the program’s various “publics” accept this as a valid outcome of the program?

Page 38: Program Evaluation

PROGRAM OUTCOME CRITERIA (CONT.)

For the set of outcomes:
• Do they reflect the program logic: the chain of changes program outputs are intended to set in motion for participants?
• Do the longer-term outcomes represent meaningful benefits or changes in the participants’ condition or quality of life?
• Are potential negative outcomes identified?

Page 39: Program Evaluation

OUTCOME INDICATOR

• The specific item of information that tracks a program’s success on an outcome
• Identifies the characteristic or change that signals that an outcome has been achieved
• Is observable and measurable
• Usually is expressed as number and percent of participants achieving the outcome

Page 40: Program Evaluation

EXAMPLES OF FACTORS THAT COULD INFLUENCE PARTICIPANT OUTCOMES

Participant Characteristics
• Age group
• Sex
• Race/ethnicity
• Educational level
• Household income group
• Household composition (size, number of children, etc.)
• Disability status

Page 41: Program Evaluation

EXAMPLES OF FACTORS THAT COULD INFLUENCE PARTICIPANT OUTCOMES (CONT.)

Degree of Difficulty of the Participants’ Situation

Geographic Location of Residence
• Neighborhood
• Political boundaries
• Zip code
• Census tract
• City or county

Page 42: Program Evaluation

EXAMPLES OF FACTORS THAT COULD INFLUENCE PARTICIPANT OUTCOMES (CONT.)

Organization’s Service Unit

Type or Amount of Service Provided
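These factors matter because an aggregate outcome rate can hide large differences between subgroups. A minimal sketch of disaggregating an outcome by one participant characteristic, assuming hypothetical records in a pandas DataFrame (the column names are invented for illustration):

```python
import pandas as pd

# Hypothetical participant records; fields are illustrative assumptions.
records = pd.DataFrame({
    "age_group":        ["<18", "<18", "18-64", "18-64", "65+", "65+"],
    "outcome_achieved": [True,  False, True,    True,    False, True],
})

# Number, count, and percent achieving the outcome, by age group.
by_age = records.groupby("age_group")["outcome_achieved"].agg(["sum", "count", "mean"])
by_age["percent"] = 100 * by_age["mean"]
print(by_age[["sum", "count", "percent"]])
```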

Page 43: Program Evaluation

SOURCES OF DATA

• Written records
• Specific individuals (participants, parents, teachers, employers, etc.)
• General public
• Trained observers (rating behavior, facilities, environments, etc.)
• Mechanical tests and measurements

Page 44: Program Evaluation

EXAMPLES OF OUTCOMES THAT CAN BE MEASURED BY TRAINED OBSERVER RATINGS

• Participants use direct eye contact during job interview role-plays
• Youths use verbal rather than physical means to resolve conflicts
• Recipients of rehabilitative services are able to undertake activities of daily living

Page 45: Program Evaluation

EXAMPLES OF OUTCOMES THAT CAN BE MEASURED BY TRAINED OBSERVER RATINGS (CONT.)

• Adult day care participants eat nutritious meals
• Condition of neighborhood parks and playgrounds (amount of litter, broken glass, etc.) improves

Page 46: Program Evaluation

DESIGNING DATA COLLECTION METHODS

1. Decide how to obtain needed data from each source
2. Prepare data collection instruments
3. Develop data collection procedures

Page 47: Program Evaluation

METHODS OF COLLECTING DATA

• Extract data from written records
• Survey individuals or households
  – Self-administered questionnaire
  – Interviewer-administered questionnaire
• Have trained observers rate behavior or environments
• Take physical measurements

Page 48: Program Evaluation

MODES OF SURVEY ADMINISTRATION

• Mail
• Telephone
• In-person at home
• In-person at a public facility
• Combination of the above (e.g., mail questionnaire with telephone follow-up)

Page 49: Program Evaluation

COMPARISON OF MAJOR DATA COLLECTION METHODS

Characteristic | Review of Program Records | Self-Administered Questionnaire | Interview | Rating by Trained Observer
Cost | Low | Moderate | Moderate to high | Depends on availability of low-cost observers
Amount of training required for data collectors | Some | None to some | Moderate to high | Moderate to high
Completion time | Depends on amount of data needed | Moderate to long | Long | Short to moderate
Response rate | High, if records contain needed data | Depends on distribution | Moderate to good | High

Page 50: Program Evaluation

KEY ISSUES IN DATA COLLECTION PROCEDURES

When will data be collected?
• When entering program
• When completing program
• Fixed interval after entering
• Fixed interval after completing
• Combination of the above

Who is considered a participant?
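The timing options above can be scheduled mechanically once entry and completion dates are recorded. A minimal sketch, assuming hypothetical dates and an invented 90-day fixed interval:

```python
from datetime import date, timedelta

# Hypothetical participant dates; both are illustrative assumptions.
entered_program = date(2016, 2, 1)
completed_program = date(2016, 5, 1)

FOLLOW_UP = timedelta(days=90)  # assumed fixed interval, not from the slides

collection_points = {
    "when entering program": entered_program,
    "when completing program": completed_program,
    "fixed interval after entering": entered_program + FOLLOW_UP,
    "fixed interval after completing": completed_program + FOLLOW_UP,
}
for label, when in collection_points.items():
    print(f"{label}: {when}")
```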

Page 51: Program Evaluation

KEY ISSUES IN DATA COLLECTION PROCEDURES (CONT.)

• Include all participants or only a sample?
• Who will collect the data?
• How will confidentiality be protected?
• How will participants be informed about data collection?

Page 52: Program Evaluation

PLEASE!

DO NOT SKIP THE TRIAL RUN

Page 53: Program Evaluation

WHAT YOU DON’T KNOW CAN HURT YOU

Measurement problems, e.g.:
• Overlooked outcomes
• Badly defined indicators
• Inadequate data collector training
• Conflicting instruments for related outcomes

Page 54: Program Evaluation

WHAT YOU DON’T KNOW CAN HURT YOU (CONT.)

Administration problems, e.g.:
• Agency records aren’t current
• Data collectors lose interest
• No return address on questionnaire
• Too many long-distance follow-up phone calls
• Respondents refuse consent, don’t keep appointments, can’t remember

Page 55: Program Evaluation

THE TRIAL RUN

Does not have to involve the entire program, but must:
• Include all aspects of the outcome measurement system
• Involve a representative group of participants
• Last long enough to span all the key data collection points

Page 56: Program Evaluation

SOME OPTIONS FOR USING A SUBSET OF PARTICIPANTS IN A TRIAL RUN

• For multi-site programs, use only some sites
• If staff are organized into units, use only some units
• If participants go through the program in groups, use only some groups

THE SUBSET MUST BE REPRESENTATIVE OF ALL PARTICIPANTS
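One simple way to guard against an unrepresentative subset is to choose sites, units, or groups at random rather than by convenience. A minimal sketch, assuming a hypothetical list of program sites:

```python
import random

# Hypothetical site list for a multi-site program.
sites = ["north", "south", "east", "west", "downtown", "riverside"]

random.seed(42)                          # fixed seed for a reproducible pick
trial_sites = random.sample(sites, k=3)  # select half the sites at random
print("Trial-run sites:", trial_sites)

# Random selection helps, but still compare the chosen sites with the full
# program on key participant characteristics before trusting the results.
```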

Page 57: Program Evaluation

OUTCOME MEASUREMENT SYSTEM FEATURES TO MONITOR

• Time spent
• Former participants not located
• Data frequently missing in records
• Response rates
• Refusal rates
• Planned observations not completed
• Data collection errors
• Data needed but unavailable
• Costs beyond staff time
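Most of the features above reduce to simple rates computed from trial-run tallies. A minimal sketch with invented counts (every number is an illustrative assumption):

```python
# Hypothetical trial-run tallies.
questionnaires_sent = 200
questionnaires_returned = 130
explicit_refusals = 25
records_reviewed = 160
records_with_missing_data = 40

response_rate = 100 * questionnaires_returned / questionnaires_sent
refusal_rate = 100 * explicit_refusals / questionnaires_sent
missing_data_rate = 100 * records_with_missing_data / records_reviewed

print(f"Response rate:     {response_rate:.1f}%")      # 65.0%
print(f"Refusal rate:      {refusal_rate:.1f}%")       # 12.5%
print(f"Missing-data rate: {missing_data_rate:.1f}%")  # 25.0%
```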

Page 58: Program Evaluation

Drawing Conclusions

• Examine results carefully and objectively
• Draw conclusions based on your data
• What do the results signify about your program?

Page 59: Program Evaluation

DISAPPOINTING OUTCOME FINDINGS: THE STORY BEHIND THE NUMBERS

Internal factors, e.g.:
• Sudden staff turnover
• New service delivery strategy
• New target group
• Unrealistic outcome target
• A problem in the measurement system

Page 60: Program Evaluation

DISAPPOINTING OUTCOME FINDINGS: THE STORY BEHIND THE NUMBERS (CONT.)

External factors, e.g.:
• Community unemployment increased
• Related service used by participants closed
• Public transportation increased fares, shut down some routes serving your program
• Severe weather caused sudden increase in service requests

Page 61: Program Evaluation

USES OF OUTCOMES FINDINGS

Internal:
• Provide direction for staff
• Identify training needs
• Improve programs
• Support annual and long-range planning
• Guide budgets and justify resource allocations
• Suggest outcome targets
• Focus board members’ attention on programmatic issues

Page 62: Program Evaluation

USES OF OUTCOMES FINDINGS (CONT.)

External:
• Recruit talented staff and volunteers
• Promote the program to potential participants and referral sources
• Identify partners for collaboration
• Enhance the program’s public image
• Retain and increase funding

Page 63: Program Evaluation

Feedback to Program Improvement

• You can use evaluation findings to make program improvements
  – Consider adjustments
  – Re-examine/revise program strategies
  – Change programs or methodologies
  – Increase time with the program

• Use your results as a needs assessment for future efforts

Page 64: Program Evaluation

Conclusion

Evaluation helps you:
1. Determine the effects of the program on recipients
2. Know if you have reached your objectives
3. Improve your program

Page 65: Program Evaluation

LOGIC MODEL PROCESS: THE UNITED WAY MODEL

A copy of the United Way handbook is available at: http://www.liveunited.org/Outcomes/Resources/MPO/

To order, contact the United Way Store at 800-772-0008 (toll-free U.S.) or 703-212-6300. Item No. 0989. Price: $5 (plus shipping and handling).