
Developing Implementation Evaluation Models To Provide Assistance to the National Science Foundation’s MSP Projects

Catherine Callow-Heusser

Project Director, Co-PI, Evaluation Capacity Building Project

An NSF-Funded Research, Evaluation, and Technical Assistance (MSP-RETA) Project

Goals of NSF’s MSP Program

The Math and Science Partnership (MSP) program is a major research and development effort that supports innovative partnerships to improve K-12 student achievement in mathematics and science.

MSP projects are expected to both raise the achievement levels of all students and significantly reduce achievement gaps in the mathematics and science performance of diverse student populations.

Successful projects serve as models that can be widely replicated in educational practice to improve the mathematics and science achievement of all the Nation's students.

(NSF’s MSP RFP: http://www.nsf.gov/pubs/2003/nsf03605/nsf03605.htm)

MSP’s Five Key Characteristics

- Partnership-Driven (Higher Ed + K-12 + Others)
- Teacher Quality, Quantity, and Diversity
- Challenging Courses and Curricula
- Evidence-Based Design and Outcomes
- Institutional Change and Sustainability

Math-Science Partnership Program

[Diagram: MSP funding and intervention enter the MSP “black box” and lead to increased student success in math and science. Inside the black box: professional development, community involvement, mentoring, partnerships, recruitment, challenging curriculum, teacher retention, teacher leaders, universities, tutoring, summer workshops, and pre-service redesign, with K-12 schools, scientists/engineers, and business as partners. MSP goals and dollars drive the model, with student achievement as the outcome.]

Westat (2003). [http://www.mspinfo.com/Source/Chap9_Evidence_and_Evaluation.asp]

Example from an MSP Strategic Plan

Goal: To increase student achievement and reduce achievement gaps in science and mathematics for all preK-12 students in partner school districts.

Strategies for achieving the goal:

- Work with districts to develop and implement strategic plans for improving math and science achievement and reducing achievement gaps.
- Work with districts to develop internal leadership structures and practices (among teacher-leaders, principals, and district staff) to improve teaching of math and science.
- Provide well-designed, continuing professional development to help teachers learn new content and practices, become more attuned to students’ thinking, and use new curriculum materials aligned with state and national standards.

Components in the “Black Box”

[Diagram: MSP funding and intervention flow through two components, curriculum and professional development, and out to student achievement.]

Simplified Theory of Action for Example

[Diagram: MSP funding and intervention feed leadership, teacher knowledge and practice, curriculum, district resources, and professional development; these drive student learning and, in turn, student achievement. Family and community involvement and recruitment/retention activities also feed into the model.]

Implementation Evaluation

- Definition (Scriven, 1991): “mere monitoring of program delivery”
- Definition (Frechtling, 2002; GAO, 1998): assess whether the project is being conducted as planned, e.g., fidelity of implementation
- Ensure the program and its components are operating, and operating according to the proposed plan or description
- Monitor and evaluate well-articulated activities and processes* in the “black box”

* “A process is a series of causally linked events or changes taking place over time” (Scriven)

Why Implementation Evaluation?

Ensure that activities are implemented as PLANNED in a timely manner.

Indicators are based on PLANS for project activities--PLANS that:

- Explain the project’s rationale
- Document the context in which a project operates
- Describe the planned activities and processes
- Identify potential side effects

Implementation Evaluation

Answers questions such as (Westat, 2003):

- Were the appropriate participants selected and involved in the planned activities?
- Do the activities and strategies match those described in the plan? If not, are the changes in activities justified and described?
- Were the appropriate staff members hired and trained, and are they working in accordance with the proposed plan?
- Were the appropriate materials and equipment obtained?
- Were activities conducted according to the proposed timeline? By appropriate personnel?
- Was a management plan developed and followed?
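Questions like these reduce to plan-versus-actual checks. Below is a minimal sketch in Python (all activity names, dates, and indicators are hypothetical, invented for illustration rather than drawn from any MSP project) of how planned activities and their indicators might be recorded so timeliness can be monitored:

    from dataclasses import dataclass
    from datetime import date
    from typing import Optional

    @dataclass
    class PlannedActivity:
        """One activity from the project plan, with a measurable indicator."""
        name: str
        planned_by: date                  # deadline from the proposed timeline
        indicator: str                    # evidence the activity happened as planned
        completed_on: Optional[date] = None

    def timeliness_report(plan: list[PlannedActivity], today: date) -> None:
        """Flag activities that are late or drifted from the proposed timeline."""
        for act in plan:
            if act.completed_on is None and today > act.planned_by:
                print(f"LATE:     {act.name} (due {act.planned_by}; indicator: {act.indicator})")
            elif act.completed_on is not None and act.completed_on > act.planned_by:
                print(f"SLIPPED:  {act.name} (finished {act.completed_on}, due {act.planned_by})")
            else:
                print(f"ON TRACK: {act.name}")

    # Hypothetical plan entries, for illustration only
    plan = [
        PlannedActivity("Summer teacher workshop", date(2004, 7, 1),
                        "80% of partner-district math/science teachers attend",
                        completed_on=date(2004, 6, 28)),
        PlannedActivity("Curriculum alignment review", date(2004, 9, 15),
                        "review report delivered to all partner districts"),
    ]
    timeliness_report(plan, today=date(2004, 10, 1))

The same record answers several of the Westat questions at once: the indicator documents who was to be involved and how, while the dates track the proposed timeline.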

“Models” for Describing & Monitoring

- Program Logic Modeling: a picture of how a program works, including the theory and assumptions underlying the program
  (Logic Model Development Guide, W. K. Kellogg Foundation, http://www.wkkf.org)
- Key Evaluation Checklist: a checklist for evaluating and reporting on programs and evaluations of them
  (M. Scriven, http://www.wmich.edu/evalctr/checklists/kec.htm)
- Others

Program Logic Modeling

What? A systematic and visual method for presenting relationships among program resources, activities, and anticipated changes or results.

Why? Provides a “road map” describing the sequence of related events and processes that connect the need for the program with the desired results.
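Concretely, a logic model can be treated as a small directed graph: nodes fall into the standard categories (resources, activities, outputs, outcomes) and edges encode the hypothesized causal links. The sketch below is hypothetical; the node names are invented for illustration, and the traversal simply follows each element’s first outgoing link to print one “road map” path:

    # Nodes of a hypothetical logic model, grouped by category
    logic_model = {
        "resources":  ["NSF funding", "university faculty", "district staff"],
        "activities": ["professional development", "curriculum redesign"],
        "outputs":    ["teachers trained", "aligned course materials"],
        "outcomes":   ["improved teaching practice", "higher student achievement"],
    }

    # Each edge is a hypothesized causal link in the theory of action
    links = [
        ("NSF funding", "professional development"),
        ("university faculty", "professional development"),
        ("professional development", "teachers trained"),
        ("teachers trained", "improved teaching practice"),
        ("improved teaching practice", "higher student achievement"),
    ]

    def road_map(start: str) -> list[str]:
        """Trace the 'road map' forward from one element, taking first links."""
        path, current = [start], start
        while True:
            next_steps = [dst for src, dst in links if src == current]
            if not next_steps:
                return path
            current = next_steps[0]
            path.append(current)

    print(" -> ".join(road_map("NSF funding")))
    # NSF funding -> professional development -> teachers trained
    #   -> improved teaching practice -> higher student achievement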

The Importance of Logic Modeling

Why programs often run into trouble:

- Lack of a well-articulated, research-based, experience-based theory or road map.
- Failure to follow the road map during the trip!

If program planners don’t have any hypotheses guiding them, their potential for success is limited, as is their potential for learning; the program is probably in trouble! (1)

Why evaluations often run into trouble:

- Lack of a well-articulated, research-based, experience-based theory or road map.

The bane of evaluation is a poorly designed program! (1)

(1) Kellogg (2001); McLaughlin (2003)

[Logic model examples: University of Wisconsin-Extension, Program Development & Evaluation, http://www.uwex.edu/ces/pdande/progdev/index.html; Westat (2003), http://www.mspinfo.com/Source/Chap9_Evidence_and_Evaluation.asp]

MSP Project Logic Models

Show relationships and links between:

- Resources (inputs) from NSF, Higher Education, K-12, Partners
- Activities and processes that will address MSP’s five key characteristics
- Outcomes: short, intermediate, and long term

Complex! Nested, with multiple “levels” or depths. Require thoughtful, thorough, rigorous, systematic planning and development.
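One way to picture the nesting is a component-level logic model embedded inside the program-level model. The sketch below (hypothetical names, for illustration only) nests a professional development component inside a program model, so that walking the program model can descend to the next level of depth:

    # Hypothetical component-level logic model for professional development
    pd_component = {
        "resources":  ["university faculty", "workshop funds"],
        "activities": ["summer institutes", "school-year follow-up"],
        "outputs":    ["hours of training delivered"],
        "outcomes":   ["teacher content knowledge"],
    }

    # Program-level model whose activities may carry nested component models
    program_model = {
        "resources":  ["NSF funding", "district staff"],
        "activities": {
            "professional development": pd_component,
            "curriculum redesign": None,  # a second component model could nest here
        },
        "outcomes":   ["student achievement"],
    }

    # Walking the program model shows which activities have been
    # elaborated to the next level of depth.
    for name, component in program_model["activities"].items():
        status = "nested model present" if component else "not yet elaborated"
        print(f"{name}: {status}")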

Key Evaluation Checklist

What? A checklist of necessary items to be addressed (iteratively) in a program evaluation.

Why? To avoid invalidity in a program evaluation, and to align the proposal/plan and the evaluation.

Key Evaluation Checklist Components

- Description*
- Background, context*
- Consumers
- Resources
- Values
- Processes*
- Outcomes
- Costs
- Comparisons with alternative options
- Generalizability
- Significance
- Recommendations
- Report
- Meta-evaluation

* Used for Implementation Evaluation
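For planning purposes, the checklist itself is easy to encode. The sketch below is a simplified representation (not Scriven’s published tool) that flags the starred components so an evaluator can pull out the implementation evaluation subset:

    from dataclasses import dataclass

    @dataclass
    class ChecklistItem:
        """One component of the Key Evaluation Checklist."""
        name: str
        used_for_implementation: bool  # the starred items above

    KEC = [
        ChecklistItem("Description", True),
        ChecklistItem("Background, context", True),
        ChecklistItem("Consumers", False),
        ChecklistItem("Resources", False),
        ChecklistItem("Values", False),
        ChecklistItem("Processes", True),
        ChecklistItem("Outcomes", False),
        ChecklistItem("Costs", False),
        ChecklistItem("Comparisons with alternative options", False),
        ChecklistItem("Generalizability", False),
        ChecklistItem("Significance", False),
        ChecklistItem("Recommendations", False),
        ChecklistItem("Report", False),
        ChecklistItem("Meta-evaluation", False),
    ]

    # Components to address in an implementation evaluation
    for item in KEC:
        if item.used_for_implementation:
            print(item.name)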

Key Evaluation Checklist: Background and Context

- Historical, contemporary, projected settings
- Stakeholders
- Relevant legislation, funder’s policy changes
- Underlying rationale (e.g., program theory, political logic)
- Review of previous research and evaluations

Key Evaluation Checklist: Description and Definitions

- Definitions of “technical terms”
- Official description of program and components
- Detailed description for replication
- Goals, mileposts, benchmarks

Key Evaluation Checklist: Processes

- Assessment of the quality of everything significant that happens or applies before true outcomes emerge
- Causally relevant context and support
- Goals, design, degree of implementation, management, quality of work, activities, procedures
- Quality of inputs (i.e., logic model resources)
- Intermediate results (i.e., logic model outputs)

Key Evaluation Checklist Applied to MSP Projects

Goes from “What’s So?” (Step I: the fact-finding phase) to “So What?” (Step II: combining facts with values that bear on those facts).

Complex! Iterative and multi-step; requires thoughtful, thorough, rigorous, systematic planning and development.

Complexity of Implementation Evaluation “Models”

Implementation evaluation requires:

- Accurate description of project contexts, activities, processes, and the relationships between them
- Realistic benchmarks and measurable indicators
- Regular monitoring of project plans, activities, processes, and timelines

Complex! Nested designs with multiple “levels” or depths, and iterative, multi-step methods for planning and documentation; requires thoughtful, thorough, rigorous, systematic planning, development, and evaluation.
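To illustrate the nested, multi-level character of the monitoring task, here is a minimal sketch (structure and scores are hypothetical) in which implementation-fidelity scores recorded per teacher roll up to school-level and district-level summaries:

    # Hypothetical fidelity-of-implementation scores (0-1), nested by level:
    # district -> school -> teacher
    district = {
        "School A": {"teacher_1": 0.9, "teacher_2": 0.7},
        "School B": {"teacher_3": 0.8, "teacher_4": 0.6},
    }

    def mean(values) -> float:
        values = list(values)
        return sum(values) / len(values)

    # Roll teacher-level scores up to the school level, then the district level
    school_fidelity = {school: mean(scores.values())
                       for school, scores in district.items()}
    district_fidelity = mean(school_fidelity.values())

    for school, score in sorted(school_fidelity.items()):
        print(f"{school}: mean fidelity {score:.2f}")
    print(f"District overall: {district_fidelity:.2f}")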

USU’s MSP-RETA Project

- Provide evaluation technical assistance to MSP projects
- Collect evaluation needs assessment information
- Build upon existing evaluation “models” or processes to develop evaluation processes that:
  - Address the complexity of MSP projects
  - Help identify and measure causal effects
  - Incorporate relevant contextual factors
  - Involve stakeholders

Culture of Evidence

In particular, we are working to help MSP projects build a “Culture of Evidence” to meet NSF’s goal of identifying successful projects that will “serve as models that can be widely replicated in educational practice to improve the mathematics and science achievement of all the Nation's students.”

References

Frechtling, J. (2002). The 2002 user-friendly handbook for project evaluation. Washington, DC: NSF. [Document Number 02-057]

GAO. (1998). Performance measurement and evaluation: Definitions and relationships. Washington, DC: U.S. GAO. [http://www.gao.gov/special.pubs/gg98026.pdf]

McLaughlin, J.A. (October, 2003). Logic modeling: A tool for describing and aligning your program to your monitoring and evaluation. A presentation at USU’s MSP Building Evaluation Capacity of STEM/MSP Projects Workshop, Baltimore, MD.

Scriven, M. (1991). Evaluation thesaurus, 4th ed. Newbury Park, CA: Sage.

Scriven, M. (2002). Key evaluation checklist. Kalamazoo, MI: Western Michigan University, The Evaluation Center. [http://www.wmich.edu/evalctr/checklists/kec.htm]

University of Wisconsin-Extension, Program Development and Evaluation. (2002). Enhancing program performance with logic models. Madison, WI: Author. [http://www.uwex.edu/ces/pdande/ and http://www1.uwex.edu/ces/lmcourse/]

W. K. Kellogg Foundation. (2001). Logic model development guide. Battle Creek, MI: Author.

Westat, Inc. (2003). Developing math and science partnerships: Toolkit. Rockville, MD: Author. [http://www.mspinfo.com/Source/toolkit.asp]

Contact Information

USU’s MSP-RETA Evaluation Capacity Building Project

PI, Project Director: Catherine Callow-Heusser ([email protected])

Co-PI: Jim Dorward ([email protected])

Co-PI: Steve Lehman ([email protected])

PI (retired): Blaine Worthen

Consortium for Building Evaluation Capacity

http://www.usu.edu/cbec/

2810 Old Main Hill
Utah State University
Logan, UT 84322-2810

Phone: 435-797-1111
FAX: 435-797-1448
[email protected]