
Evaluation 101

Presentation at The Conference for Family Literacy

Louisville, Kentucky

By Apter & O’Connor Associates

April 2013

Coalition

A group of individuals representing diverse organizations or constituencies who agree to work together to achieve a common GOAL

- Feighery & Rogers, 1990

Evaluation is . . .

- the systematic collection of information . . . to reduce uncertainties, improve effectiveness, and make decisions

(Michael Q. Patton, 1988)

Why Evaluate?

Provide accountability to community, funders & stakeholders.

(Slide graphic: Effectiveness · Quality · Efficiency)

Why Evaluate?

What gets measured, gets done.

If you don’t measure results, you can’t tell success from failure.

If you can’t see success, you can’t reward it.

If you can’t recognize failure, you can’t correct it.

If you can’t see success, you can’t learn from it.

If you can’t reward success, you’re probably rewarding failure.

Adapted from: Reinventing Government, Osborne and Gaebler, 1992

Why Evaluate?

Monitor overall progress toward goals

Determine whether individual interventions are producing the desired progress

Permit comparisons among groups

Continuous quality improvement

Ensure only effective programs are maintained

Justify the need for further funding

Types of Evaluation

Formative – infrastructure, functions, and procedures

Process – extent of implementation

Outcome – realization of vision

Collective Impact Model

• Common agenda – a vision

• Shared measurement system

• Mutually reinforcing activities

• Continuous communication

• Backbone support organization

Formative – Vision, Stakeholder Engagement

Process – Implementation of Mutually Reinforcing Activities

Outcome – Collective IMPACT

Formative Evaluation

Why is the collaboration needed?

Do we have the resources needed?

Do we have strong leadership?

Are the right stakeholders represented?

Is a collaboration the best approach?

Is there a shared vision?

COALITION

Are the right people participating?

Are meetings productive?

Are workgroup charges clear?

Is the work beginning?

STRATEGIES

Are you implementing things as planned?

Are you reaching the target population?

Are you implementing with quality?

How many are you reaching?

Process Evaluation

What has changed or improved?

Are we achieving our intended goals?

Was the effort worth the time & costs?

Outcome Evaluation

Short – Intermediate – Long Term

Evaluating a Coalition

Can be tricky:

• Each member needs to evaluate its own accomplishments without undermining the success of the whole.

• We are all in this together, but how do we distinguish the contribution of one agency or stakeholder from another?

CDC’s Framework for Evaluation

Engage Stakeholders

Set Goals & Plan Programs

Focus the Evaluation Design

Choose Methods & Collect Data

Analyze Data & Interpret Results

Ensure Use & Share Lessons Learned

Standards to Consider: Utility, Feasibility, Propriety, Accuracy


Engaging Stakeholders

– Include Stakeholders who are:

• Implementers

• Partners

• Participants – those affected

• Decision-makers

– Establish evaluation team at onset with areas for stakeholder input

– Obtain buy-in & commitment to plan


Set Goals and Develop Plan

• Problem Statement - define the need Who? Where? Why?

• Envision the Future - Set Your Goals

• Set the Context

• Select Strategies and Set Targets

• Connect the Dots . . . create a Logic Model

Logic Models

A logic model is a road map for the shared work of all of the stakeholders... it answers the questions:

Where are we now?

Where are we going?

How will we get there?

A Logic Model . . . Can it get any simpler?

Needs and Strengths → Strategies → Outcomes

Logical Chain of Connections

Where we are: Needs & Strengths (Resources)

What we do / Who we reach: Strategies – Activities and Outputs

Our results: Outcomes – Short, Medium, Long-term

University of Wisconsin-Extension, Program Development and Evaluation

Detailed logic model
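The logical chain above can be written down as a simple ordered structure. A minimal sketch in Python (the specific needs, strategies, and outcomes listed are hypothetical examples, not from the presentation):

```python
# Logic model as an ordered chain: where we are -> what we do -> our results
logic_model = {
    "Needs & Strengths": ["low adult literacy rates", "strong library network"],
    "Strategies": ["family reading nights", "tutor training"],
    "Outcomes": {
        "Short": "more families read together weekly",
        "Medium": "children's reading scores improve",
        "Long-term": "literacy levels rise across the lifespan",
    },
}

# Walk the chain in order, mirroring the left-to-right logic model columns
for stage, entries in logic_model.items():
    print(stage, "->", entries)
```

Writing the model down this way forces each outcome to trace back to a named strategy and need, which is the point of the "connect the dots" step.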


Focus the Evaluation Design

What do we want to know?

• Coalition

• Programs

• Participants

• Outcomes

• Coalition impact

• Influencing factors

EVALUATION DOMAINS

Inputs → Activities → Outputs → Short-Term Outcomes → Intermediate Outcomes → Long-Term Outcomes

(Inputs, Activities, and Outputs fall under FORMATIVE/PROCESS evaluation; the three outcome stages fall under OUTCOME evaluation.)

Develop Indicators

What Will Change? For Whom? By How Much? By When?

Indicators for Activities – process indicators

Indicators for Outcomes – outcome indicators

There can be more than one indicator for each activity or outcome.

Level of impact: Individuals/clients → Communities
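The four indicator questions above map naturally onto a small data structure. A hypothetical sketch in Python (the field names and the sample indicator are illustrative, not from the presentation):

```python
from dataclasses import dataclass

@dataclass
class Indicator:
    what_will_change: str  # What will change?
    for_whom: str          # For whom?
    by_how_much: str       # By how much?
    by_when: str           # By when?
    kind: str              # "process" (for activities) or "outcome"

# Hypothetical outcome indicator for a literacy coalition
example = Indicator(
    what_will_change="third-grade reading proficiency",
    for_whom="children at participating sites",
    by_how_much="increase by 10 percentage points",
    by_when="end of the school year",
    kind="outcome",
)
print(example)
```

Spelling each indicator out this way makes the "by how much, by when" targets explicit before data collection begins, rather than leaving them implied.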

Coalition Evaluation Questions

Are we meeting our members’ needs?

Do our work groups function well?

Have we improved community awareness?

Are we influencing policies & practices?

Are we building organizational/community capacity?

Are we building strategic partnerships?

Have we strengthened our base of support?

Are we reaching our priority audiences?

Which strategies are effective?

Are we making a difference?


Choose Methods and Collect Data

• Collect enough data to be reliable, but consider burden

• Consider existing data sources

• Don’t try to measure everything

• Use mixed methods – Qualitative & Quantitative

Methods

• Focus Groups

• Interviews

• Structured Observations

• Document/ Record Review

• Case Studies

• Surveys

• Participant Assessments

• Statistical Analysis of program data

• Cost-Benefit Analysis


Analyze Data & Interpret Results

Organize and classify the data

Tabulate – counts, numbers, descriptive statistics

Identify Themes

Stratify – look at data by variables/demographics

Make Comparisons – pre-post – between groups

Present Data in clear format – use narratives, charts, tables, graphs, maps
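The tabulate, stratify, and compare steps above can be sketched in a few lines of code. A minimal illustration using only Python’s standard library (the survey records, site names, and score fields are hypothetical, not from the presentation):

```python
from collections import Counter
from statistics import mean

# Hypothetical pre/post literacy assessment records (illustrative only)
records = [
    {"site": "North", "pre": 42, "post": 55},
    {"site": "North", "pre": 38, "post": 49},
    {"site": "South", "pre": 51, "post": 54},
    {"site": "South", "pre": 47, "post": 60},
]

# Tabulate: counts and a descriptive statistic
counts = Counter(r["site"] for r in records)
overall_gain = mean(r["post"] - r["pre"] for r in records)

# Stratify and compare: pre-post change broken out by site
gain_by_site = {
    site: mean(r["post"] - r["pre"] for r in records if r["site"] == site)
    for site in counts
}

print(counts)        # participants per site
print(overall_gain)  # average pre-post gain across all participants
print(gain_by_site)  # gain stratified by site
```

With real data the same pattern scales up: tabulate first for an overall picture, then stratify by the demographics that matter before drawing comparisons.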


Ensure Use & Share Lessons Learned

Recommendations

Preparation – Engage & Guide Stakeholders

Feedback

Follow-up

Dissemination

ABC Literacy Coalition Evaluation Plan

Project Outcome: Improve Literacy Levels Across the Lifespan

Plan columns: Objectives/Activities – Evaluation Questions – Methodology – Time Frame – Who is Responsible

The only man who behaves sensibly is my tailor; he takes my measurements anew every time he sees me, while all the rest go on with their old measurements and expect me to fit them.

- George Bernard Shaw

QUESTIONS?

RESOURCES

Full Resource List on Literacy Powerline Website

University of Wisconsin – Extension www.uwex.edu/ces/pdande

U.S. Dept. HHS CDC - Strategy & Innovation www.cdc.gov/eval/guide/CDCEvalManual.pdf

Annie E. Casey Foundation www.aecf.org

Two reports by Organizational Research Services – A Practical Guide to Documenting Influence and Leverage – A Guide to Measuring Advocacy and Policy.

For more information…

Literacy Powerline www.literacypowerline.com

Apter & O’Connor Associates

www.apteroconnor.com

dianne@apteroconnor.com

cynthia@apteroconnor.com 315-427-5747

Research vs. Evaluation

Research seeks to prove:

• Investigator-controlled

• Authoritative

• Scientific method – isolate/control variables

• Limited number of sources – accuracy

• Facts – descriptions, associations, effects

Evaluation seeks to improve:

• Stakeholder-controlled

• Collaborative

• Incorporates variables – accounts for circumstances

• Multiple sources – triangulation

• Values – quality, value, importance
