Evaluation 101

Evaluation 101: Presentation at The Conference for Family Literacy, Louisville, Kentucky, by Apter & O'Connor Associates, April 2013



TRANSCRIPT

Page 1: Evaluation 101

Evaluation 101

Presentation at The Conference for Family Literacy

Louisville, Kentucky

By Apter & O'Connor Associates

April 2013

Page 2: Evaluation 101

Coalition

A group of individuals representing diverse organizations or constituencies who agree to work together to achieve a common GOAL

- Feighery & Rogers, 1990

Page 3: Evaluation 101
Page 4: Evaluation 101

Evaluation is . . .

- the systematic collection of information . . . to reduce uncertainties, improve effectiveness, and make decisions

(Michael Q. Patton, 1988)

Page 5: Evaluation 101
Page 6: Evaluation 101

Why Evaluate?

Provide Accountability to Community, Funders & Stakeholders

Effectiveness, Quality, Efficiency

Page 7: Evaluation 101

Why Evaluate?

What gets measured, gets done

If you don't measure results, you can't tell success from failure

If you can't see success, you can't reward it

If you can't reward success, you're probably rewarding failure

If you can't see success, you can't learn from it

If you can't recognize failure, you can't correct it

Adapted from: Reinventing Government, Osborne and Gaebler, 1992

Page 8: Evaluation 101

Why Evaluate?

Monitor overall progress toward goals

Determine whether individual interventions are producing the desired progress

Permit comparisons among groups

Continuous quality improvement

Ensure only effective programs are maintained

Justify the need for further funding

Page 9: Evaluation 101

Types of Evaluation

Formative – infrastructure, functions, and procedures

Process – extent of implementation

Outcome – realization of vision

Page 10: Evaluation 101

Collective Impact Model

• Common agenda – a vision

• Shared measurement system

• Mutually reinforcing activities

• Continuous communication

• Backbone support organization

Page 11: Evaluation 101

Formative – vision, stakeholder engagement

Process – implementation of mutually reinforcing activities

Outcome – collective IMPACT

Page 12: Evaluation 101

Formative Evaluation

Why is the collaboration needed?

Do we have the resources needed?

Do we have strong leadership?

Are the right stakeholders represented?

Is a collaboration the best approach?

Is there a shared vision?

Page 13: Evaluation 101

Process Evaluation

COALITION

Are the right people participating?

Are meetings productive?

Are workgroup charges clear?

Is the work beginning?

STRATEGIES

Are you implementing things as planned?

Are you reaching the target population?

Are you implementing with quality?

How many are you reaching?

Page 14: Evaluation 101

What has changed or improved?

Are we achieving our intended goals?

Was the effort worth the time & costs?

Outcome Evaluation

Short, Intermediate, Long Term

Page 15: Evaluation 101

Evaluating a Coalition

Can be tricky:

• Each partner needs to evaluate its own accomplishments without undermining the success of the whole

• We are all in this together, but how do we distinguish the contribution of one agency or one stakeholder from another?

Page 16: Evaluation 101
Page 17: Evaluation 101

CDC’s Framework for Evaluation

Engage Stakeholders

Set Goals & Plan Programs

Focus the Evaluation Design

Choose Methods & Collect Data

Analyze Data & Interpret Results

Ensure Use & Share Lessons Learned

Page 18: Evaluation 101

Standards to Consider

Utility

Feasibility

Propriety

Accuracy

Page 19: Evaluation 101

CDC’s Framework for Evaluation

(framework steps repeated; see Page 17)

Page 20: Evaluation 101

Engaging Stakeholders

– Include stakeholders who are:

• Implementers

• Partners

• Participants – those affected

• Decision-makers

– Establish evaluation team at onset with areas for stakeholder input

– Obtain buy-in & commitment to plan

Page 21: Evaluation 101

CDC’s Framework for Evaluation

(framework steps repeated; see Page 17)

Page 22: Evaluation 101

Set Goals and Develop Plan

• Problem Statement - define the need Who? Where? Why?

• Envision the Future - Set Your Goals

• Set the Context

• Select Strategies and Set Targets

• Connect the Dots . . . create a Logic Model

Page 23: Evaluation 101

Logic Models

A logic model is a road map for the shared work of all of the stakeholders... it answers the questions:

Where are we now?

Where are we going?

How will we get there?

Page 24: Evaluation 101
Page 25: Evaluation 101

A Logic Model . . . Can it get any simpler?

Needs and Strengths → Strategies → Outcomes
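The three-box model above can be sketched as a minimal data structure; every entry below is an illustrative example, not content from the presentation:

```python
# A minimal sketch of the simplest logic model:
# Needs and Strengths -> Strategies -> Outcomes.
# All list entries are hypothetical examples for a literacy coalition.
logic_model = {
    "needs_and_strengths": ["low adult literacy rates", "strong library network"],
    "strategies": ["family reading nights", "tutor training"],
    "outcomes": ["more families reading together", "improved literacy scores"],
}

def describe(model):
    """Render the model as a left-to-right chain of its three boxes."""
    return " -> ".join(f"{box}: {len(items)} item(s)" for box, items in model.items())

print(describe(logic_model))
```

The point of keeping it this simple is that each strategy should trace back to a documented need and forward to an intended outcome.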

Page 26: Evaluation 101

Logical Chain of Connections

Resources – Needs & Strengths (where we are)

Strategies – Activities and Outputs (what we do, who we reach)

Outcomes – Short, Medium, Long-term (our results)

University of Wisconsin-Extension, Program Development and Evaluation

Page 27: Evaluation 101

University of Wisconsin-Extension, Program Development and Evaluation

Detailed logic model

Page 28: Evaluation 101
Page 29: Evaluation 101

CDC’s Framework for Evaluation

(framework steps repeated; see Page 17)

Page 30: Evaluation 101

Focus the Evaluation Design

What do we want to know?

Coalition

Programs

Participants

Outcomes

Coalition impact

Influencing factors

Page 31: Evaluation 101

EVALUATION DOMAINS

Inputs → Activities → Outputs → Short-Term Outcomes → Intermediate Outcomes → Long-Term Outcomes

Formative/process evaluation covers inputs, activities, and outputs; outcome evaluation covers the short-term, intermediate, and long-term outcomes.

Page 32: Evaluation 101

Develop Indicators

What will change? For whom? By how much? By when?

Indicators for activities – process indicators

Indicators for outcomes – outcome indicators

There can be more than one indicator for each activity or outcome
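The four-question checklist (what changes, for whom, by how much, by when) can be sketched as a small record type; the field names and the two sample indicators are hypothetical, not from the presentation:

```python
from dataclasses import dataclass

@dataclass
class Indicator:
    """One indicator: what changes, for whom, by how much, by when."""
    what: str        # what will change
    for_whom: str    # target population
    target: float    # by how much (units depend on the indicator)
    by_when: str     # deadline
    kind: str        # "process" (tracks activities) or "outcome" (tracks results)

# Hypothetical indicators for a family literacy coalition.
indicators = [
    Indicator("parents attending workshops", "enrolled families", 50, "June 2014", "process"),
    Indicator("children reading at grade level", "participating children", 10, "June 2015", "outcome"),
]

# Splitting by kind mirrors the process/outcome distinction on the slide.
outcome_indicators = [i for i in indicators if i.kind == "outcome"]
```

An activity can carry several such records, one per aspect being tracked.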

Page 33: Evaluation 101

Level of impact: Individuals/clients → Communities

Page 34: Evaluation 101

Coalition Evaluation Questions

Are we meeting our members' needs?

Do our work groups function well?

Have we improved community awareness?

Are we influencing policies & practices?

Are we building organizational/community capacity?

Are we building strategic partnerships?

Have we strengthened our base of support?

Are we reaching our priority audiences?

Which strategies are effective?

Are we making a difference?

Page 35: Evaluation 101

CDC’s Framework for Evaluation

(framework steps repeated; see Page 17)

Page 36: Evaluation 101

Choose Methods and Collect Data

• Collect enough data to be reliable, but consider burden

• Consider existing data sources

• Don't try to measure everything

• Use mixed methods – Qualitative & Quantitative

Page 37: Evaluation 101

Methods

• Focus Groups

• Interviews

• Structured Observations

• Document/ Record Review

• Case Studies

• Surveys

• Participant Assessments

• Statistical Analysis of program data

• Cost-Benefit Analysis

Page 38: Evaluation 101

CDC’s Framework for Evaluation

(framework steps repeated; see Page 17)

Page 39: Evaluation 101

Analyze Data & Interpret Results

Organize and classify the data

Tabulate – counts, numbers, descriptive statistics

Identify Themes

Stratify – look at data by variables/demographics

Make Comparisons – pre-post – between groups

Present Data in clear format – use narratives, charts, tables, graphs, maps
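The tabulate/stratify/compare steps above can be sketched with a few hypothetical pre-post records; the site names and scores are invented for illustration:

```python
from statistics import mean
from collections import defaultdict

# Hypothetical pre/post literacy scores, stratified by site.
records = [
    {"group": "site_a", "pre": 52, "post": 61},
    {"group": "site_a", "pre": 48, "post": 55},
    {"group": "site_b", "pre": 60, "post": 62},
    {"group": "site_b", "pre": 57, "post": 63},
]

# Tabulate: overall pre-post change (a simple descriptive statistic).
overall_change = mean(r["post"] - r["pre"] for r in records)

# Stratify: look at the same change broken out by group.
changes_by_group = defaultdict(list)
for r in records:
    changes_by_group[r["group"]].append(r["post"] - r["pre"])
group_change = {g: mean(deltas) for g, deltas in changes_by_group.items()}

print(overall_change, group_change)
```

Even this toy comparison shows why stratifying matters: an overall average can hide the fact that one site is improving much faster than another.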

Page 40: Evaluation 101

CDC’s Framework for Evaluation

(framework steps repeated; see Page 17)

Page 41: Evaluation 101

Ensure Use & Share Lessons Learned

Preparation – engage & guide stakeholders

Recommendations

Feedback

Follow-up

Dissemination

Page 42: Evaluation 101

ABC Literacy Coalition Evaluation Plan

Project Outcome: Improve Literacy Levels Across the Lifespan

Plan columns: Objectives/Activities | Evaluation Questions | Methodology | Time Frame | Who is Responsible

Page 43: Evaluation 101

The only man who behaves sensibly is my tailor; he takes my measurements anew every time he sees me, while all the rest go on with their old measurements and expect me to fit them.

- George Bernard Shaw

Page 44: Evaluation 101

QUESTIONS?

Page 45: Evaluation 101

RESOURCES

Full Resource List on Literacy Powerline Website

University of Wisconsin – Extension www.uwex.edu/ces/pdande

U.S. Dept. HHS CDC - Strategy & Innovation www.cdc.gov/eval/guide/CDCEvalManual.pdf

Annie E. Casey Foundation www.aecf.org

Two reports by Organizational Research Services:

– A Practical Guide to Documenting Influence and Leverage

– A Guide to Measuring Advocacy and Policy

Page 46: Evaluation 101

For more information…

Literacy Powerline www.literacypowerline.com

Apter & O’Connor Associates

[email protected]

[email protected] 315-427-5747

Page 47: Evaluation 101

Research vs. Evaluation

Research seeks to prove:

• Investigator-controlled

• Authoritative

• Scientific method – isolate/control variables

• Limited number of sources – accuracy

• Facts – descriptions, associations, effects

Evaluation seeks to improve:

• Stakeholder-controlled

• Collaborative

• Incorporates variables – accounts for circumstances

• Multiple sources – triangulation

• Values – quality, value, importance