If I Knew Then What I Know Now: Building a Successful Evaluation

Roblyn Brigham, Brigham Nahas Research Associates
Andy Hoge, New Jersey SEEDS
Janet Smith, The Steppingstone Foundation
Danielle Stein Eisenberg, KIPP Foundation

April 8, 2010

DESCRIPTION

This workshop focused on evaluation tips and tools, lessons learned, and mistakes to avoid. It was designed for those charged with leading evaluation at their organizations.

TRANSCRIPT

Page 1: Research, Policy & Evaluation: If I Knew Then What I Know Now: Building Successful Evaluation

If I Knew Then What I Know Now:

Building a Successful Evaluation

Roblyn Brigham, Brigham Nahas Research Associates
Andy Hoge, New Jersey SEEDS
Janet Smith, The Steppingstone Foundation
Danielle Stein Eisenberg, KIPP Foundation

April 8, 2010

Page 2: Overview and Introduction

• The workshop focus – if I knew then what I know now…

• Presentation Outline
• Introduction of Panelists

Page 3: Internal and External Evaluation

• The What, Why, and Who
• New Jersey SEEDS Alumni Follow-up Study
  – The story from inside and out

Page 4: Evaluation Planning: Factors to Consider

• Organizational Characteristics
• Data Collection Capacity
• Data Analysis

Page 5: Organizational Characteristics and Evaluation Design

• Size and Structure of Organization
• Culture of Organization
• Age of Organization
• Nature of the Program Offering(s)

Page 6: Designing Evaluation: Non-negotiables

• Identify programmatic or evaluation goals upfront, during the program design stage
• Involve key stakeholders at all phases, including analysis
• Articulate “Evaluable” Questions
• Articulate Action Plan for Using Results
  – Short-term & Long-term Evaluation Plan

Page 7: Data Collection: Capacity and Commitment

• What Skills for What Aspects of Collection?
• Standardize Terminology: e.g., enrollment, placement
• Monitor Data Integrity and Accuracy

Page 8: Data Analysis: Capacity and Action

• Who is Involved in Analyzing the Data?
  – Skills
  – Key Points of View
  – What jumps out? What is missing?

• Prioritize Action to be Taken in Response to Analysis

Page 9: Presenting Results: Know Your Audience

• The Presenter and the Audience
• Lessons learned:
  – Making Claims, Issues of Accuracy
  – Multiple Audiences: Most Effective Format
  – Audience Response

Page 10: Test Results Over Time

Page 11: Test Results Over Time

Page 12:

Page 13: Additional Slides

• Evaluation Design Tool (KIPP)
• Vision Mapping (KIPP)

Page 14:

The KSS core team articulated specific goals, objectives, and metrics for the event (which mapped back to the overall vision).

Strand leads did the same.


Strand: Boards

• Goal: Board members should be inspired by KIPP's mission and energized to contribute as Board members
  – Objective: Board members will feel inspired to continue their work with KIPP
  – Metric: 95% of board members will indicate that they feel somewhat or very inspired to continue their work with KIPP
  – Evaluation Tool: Strand Survey

• Goal: Board members should feel part of a network-wide Board community and national reform movement, rather than just a supporter of a local KIPP effort
  – Objective: Board members will feel like part of a national network
  – Metric: 90% of board members will indicate that they feel somewhat or very connected to a national community
  – Evaluation Tool: Strand Survey

• Goal: Board members should learn practical skills and/or obtain tools that will enhance their Board's effectiveness
  – Objective: Board members should leave KSS with at least one tool or practical skill they can immediately put to use
  – Metric: Can name 1 tool or skill they used
  – Evaluation Tool: Strand Survey and Follow-Up Survey

• Goal: Board members should learn about KIPP initiatives that are meaningful to their Board service – e.g., KIPP share
  – Objective: Board members will leave KSS knowing about national initiatives
  – Metric: Can name 2 KIPP initiatives that are relevant to their region or school
  – Evaluation Tool: Strand Survey

The Board Strand’s Evaluation Planning Tool
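To make the metric column concrete, here is a minimal sketch, in Python, of how a percentage target like the 95% threshold in the first row could be checked against strand-survey responses. The function name, response labels, and sample data are hypothetical illustrations, not part of the KIPP tool.

```python
# Hypothetical sketch: checking a strand-survey metric against its target.
# The "somewhat or very" wording and the 95% threshold mirror the first row
# of the planning tool above; the function name and sample data are made up.

def met_target(responses, qualifying, threshold):
    """Return (share, met): the fraction of responses in the qualifying set
    and whether that fraction reaches the threshold."""
    if not responses:
        return 0.0, False
    share = sum(r in qualifying for r in responses) / len(responses)
    return share, share >= threshold

# Example: inspiration ratings collected from board members (illustrative data).
survey = ["very", "somewhat", "very", "not at all", "very", "somewhat"]
share, met = met_target(survey, {"somewhat", "very"}, threshold=0.95)
print(f"{share:.0%} felt somewhat or very inspired; target met: {met}")
```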

Page 15:

Executive summary: KSS 2009 successfully delivered against our vision; per-participant costs were at their lowest level ever

• 1,810 people attended KSS 2009 – up 16% from last year and the 7th consecutive year of record attendance
• 94% of respondents strongly (70%) or somewhat agree (24%) that KSS enhances their sense of belonging to the KIPP community


Vision elements (slide graphic):
• Collective Power; Intro/Reconnection
• Network; Share, Reflect, and Learn
• Personal Learning
• Kick off school year with high energy; Renew collective commitment
• Per-participant costs were lowest level ever

Please see the appendix for the supporting data behind the bullet points below.

• 91% of attendees strongly (50%) or somewhat agree (41%) that KSS provides opportunities to learn helpful strategies from colleagues in other schools or organizations.

• The top two reasons why teacher respondents attend KSS are “I value the Professional Development opportunities KSS provides” and “I came to learn new instructional strategies”

• 90% of all respondents strongly (59%) or somewhat agree (31%) that they learned new ideas and strategies at KSS that they could directly apply to their work

• 94% of all overall session ratings were either “excellent” (34%) or “good” (60%)
• 94% of respondents strongly (70%) or somewhat agree (24%) that KSS “renewed my sense of purpose in my work”

• KSS was fun!

• KSS 2009 costs were 11% higher than originally budgeted due primarily to attendance being 13% higher than projected…

• …resulting in KSS 2009 per-participant costs being our lowest ever, down 5.5% from the previous low in 2007 (see the arithmetic sketch below)
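As a quick check on the cost bullets above, here is a minimal arithmetic sketch in Python. Only the +11% cost and +13% attendance figures come from the slide; the budgeted total dollars are a hypothetical placeholder, and the projected attendance is simply back-calculated from the reported 1,810 attendees.

```python
# Sketch of the per-participant cost arithmetic in the two bullets above.
# Only the +11% cost and +13% attendance changes come from the slide;
# the budgeted total cost below is a made-up placeholder.

budgeted_cost = 1_000_000                    # hypothetical budgeted total ($)
projected_attendance = round(1810 / 1.13)    # ≈ 1,602, implied by the slide

actual_cost = budgeted_cost * 1.11                 # 11% over budget
actual_attendance = projected_attendance * 1.13    # 13% over projection

per_head_budgeted = budgeted_cost / projected_attendance
per_head_actual = actual_cost / actual_attendance

# Attendance grew faster than costs, so the per-participant figure drops by
# roughly 1 - 1.11/1.13 ≈ 1.8% relative to what was budgeted per head.
print(f"budgeted per participant: ${per_head_budgeted:,.2f}")
print(f"actual per participant:   ${per_head_actual:,.2f}")
```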

Page 16: The Lie Factor

(The Visual Display of Quantitative Information, 2nd Ed. by E.R. Tufte, 2001)

Los Angeles Times, Aug. 5, 1979, p. 3
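Tufte defines the Lie Factor as the size of the effect shown in the graphic divided by the size of the effect in the data, with values far from 1 signaling visual distortion. The sketch below implements that ratio; the numbers are made up for illustration and are not taken from the slide or the cited newspaper chart.

```python
# Tufte's Lie Factor: (size of effect shown in graphic) / (size of effect in data).
# A value near 1 means the graphic represents the data honestly; values far
# from 1 indicate exaggeration or understatement. Numbers below are illustrative.

def effect_size(first, last):
    """Relative change from the first value to the last."""
    return abs(last - first) / abs(first)

def lie_factor(data_first, data_last, graphic_first, graphic_last):
    return effect_size(graphic_first, graphic_last) / effect_size(data_first, data_last)

# A 50% increase in the data drawn as a bar that grows 400% yields a lie factor of 8.
print(lie_factor(10, 15, 1.0, 5.0))  # 8.0
```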