
A1 - Developing Evaluation Plans & Reports

Leader Presenter: Steve Goodman

Exemplar: Holly Niedermeyer

Key Words: Applied Evaluation, Assessment, Training

National PBIS Leadership Forum October 27 & 28, 2016

www.pbis.org

2 MIBLSI

Session Objectives

1. Learn key questions to address in a comprehensive program evaluation
2. Review core features of a comprehensive program evaluation plan
3. Identify core content to include in a comprehensive program evaluation report
4. Identify facilitators and inhibitors in conducting a comprehensive program evaluation
5. Review promising practices for planning and reporting a comprehensive program evaluation

miblsi.org

Acknowledgements: Rob Horner, Bob Algozzine, Anna Harms, Christine Russell, Jennifer Rollenhagen, Pat Sorrelle, Julie Morrison

4 MIBLSI

“The goal is to turn data into information, and information into insight.”

– Carly Fiorina, former executive, president, and chair of Hewlett-Packard Co.

5 MIBLSI

Take Home Message

Consistent with the project's goals, objectives, and logic model, the evaluation is focused on documenting what was planned to bring about change (inputs and outputs) as well as what was accomplished (formative outcomes) and the difference it made (summative impact).

Core Features of a Program Evaluation

B. Algozzine (2015)

6 MIBLSI

Core Features of a Program Evaluation: Context/Input/Fidelity/Impact

Plan/Define/Do/Document:
•  Identify Purpose, Team, and Timeline
•  Identify Questions to be Answered
•  Select Measures and Gather Data to Answer Questions
•  Analyze Data and Answer Questions

B. Algozzine (2015)

7 MIBLSI

Core Evaluation Questions [Inputs and Outputs]

Context
§  What are/were the goals and objectives for the project?
§  State/district capacity and school adoption?
§  Student, family, and community outcomes?
§  Who are the primary and secondary stakeholders?

Input
§  What technical assistance was part of implementation?
§  Was the projected level of TA capacity provided (training/coaching)?
§  Who delivered the training and technical assistance?
§  Who received technical assistance (schools/cohorts)?
§  How many schools/districts/regional centers?

B. Algozzine (2015)

8 MIBLSI

Core Evaluation Questions [Formative Outcomes]

Fidelity
§  To what extent are we providing technical assistance with integrity? [intervention fidelity]
§  To what extent are schools implementing interventions with integrity? [implementation fidelity]
§  To what extent are participants satisfied? [social validity]

B. Algozzine (2015)

9 MIBLSI

Core Evaluation Questions [Summative Outcomes]

Impact
§  To what extent is a leadership and policy structure established?
§  To what extent is the project associated with changes in academic outcomes?
§  To what extent is the project associated with changes in behavior outcomes?
§  To what extent is the project associated with changes in other outcomes?

B. Algozzine (2015)

10 MIBLSI

Core Evaluation Questions [Replication, Sustainability, and Improvement]

§  To what extent is district/state capacity (local training, coaching, evaluation, behavioral expertise) established?
§  To what extent do outcomes sustain across time?
§  To what extent does initial implementation affect implementation with later cohorts?
§  To what extent did implementation change educational/behavioral capacity/policy?
§  To what extent did implementation affect systemic educational practice?
§  To what extent are modifications needed?

B. Algozzine (2015)

11 MIBLSI

Logic Model

•  Process/Activity Output
•  Targeted Receiver
•  Impact/Outcomes
   –  Short-Term Objective: Change Learning
   –  Intermediate-Term Objective: Change Behavior
   –  Long-Term Objective: Change Conditions

12 MIBLSI

13 MIBLSI

14 MIBLSI

Activities

15 MIBLSI

Training Record Example

Date    | Training Topic                      | Duration | District                          | Schools                                                                              | Participants
1/27/15 | School Wide PBIS Day 1 - Elementary | 1 day    | Charlton Public Schools           | Carlton Early Elementary; Charlton Upper Elementary                                  | 22
2/10/15 | School Wide PBIS Day 1 - Secondary  | 1 day    | Alton Community Schools           | Westview High School; Robertson High School                                          | 14
2/12/15 | School Wide PBIS Day 1 - Elementary | 1 day    | Williamsburg Consolidated Schools | Pinewood Elementary; Hickory Elementary; Maple Hill Elementary; Sand Lake Elementary | 31
TOTALS  |                                     | 3 days   | 3 Districts                       | 8 Schools (6 Elementary, 2 Secondary)                                                | 67 Participants
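The totals row can be generated directly from the individual training records rather than tallied by hand. A minimal sketch in Python, assuming a simple list-of-dictionaries record format (the field names and data structure are illustrative, not MiBLSi's actual tracking system):

```python
# Aggregate training records into the totals shown above.
# Field names and record structure are illustrative, not MiBLSi's actual data model.
records = [
    {"days": 1, "district": "Charlton Public Schools", "level": "Elementary",
     "schools": ["Carlton Early Elementary", "Charlton Upper Elementary"], "participants": 22},
    {"days": 1, "district": "Alton Community Schools", "level": "Secondary",
     "schools": ["Westview High School", "Robertson High School"], "participants": 14},
    {"days": 1, "district": "Williamsburg Consolidated Schools", "level": "Elementary",
     "schools": ["Pinewood Elementary", "Hickory Elementary",
                 "Maple Hill Elementary", "Sand Lake Elementary"], "participants": 31},
]

total_days = sum(r["days"] for r in records)                  # 3 days
districts = {r["district"] for r in records}                  # 3 districts
schools_by_level = {}
for r in records:
    schools_by_level.setdefault(r["level"], set()).update(r["schools"])

total_schools = sum(len(s) for s in schools_by_level.values())    # 8 schools
total_participants = sum(r["participants"] for r in records)      # 67 participants

print(f"{total_days} days, {len(districts)} districts, {total_schools} schools "
      f"({len(schools_by_level['Elementary'])} elementary, "
      f"{len(schools_by_level['Secondary'])} secondary), "
      f"{total_participants} participants")
```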

16 MIBLSI

Training Perception Example

Training Evaluation Form
Workshop Title: School Wide PBIS Day 1    Date: 1/27/15
Response options: Strongly Agree, Agree, Slightly Agree, Disagree, Strongly Disagree (an X marks the sample response for each item)

1.  Today's learning was a valuable use of my time.
2.  I am leaving with tools and strategies to successfully complete the next steps (assignment, communication, activities) that were identified in today's session.
3.  The trainer(s) presented the content in a way that promoted active engagement, opportunities for processing, and time for participants to work together.
4.  The pacing and amount of material presented were appropriate for the time allocated.

17 MIBLSI

Training Perception Example

18 MIBLSI

Retrospective Self Assessment Example

Instructions: Read the definitions for each skill dimension. Reflect on your current knowledge and practice (as of right now) and your knowledge and practice at the beginning of this training. Read each behavioral statement below the definition and circle the number for each item that best describes your behavior. Please be honest with yourself.
1 = Not Competent/Need More PD   2 = Developing Competence   3 = Competence   4 = Mastery

Each item is rated twice on the 1–4 scale: once for Before Training and once for After Training.

1.  I can articulate the purpose of an implementation team as it relates to the implementation of Positive Behavioral Interventions & Supports (PBIS).
2.  I can summarize our building's data related to office discipline referrals.
3.  I can create a behavior expectation matrix that defines the expectations and examples of each expectation within an identified setting.
4.  I can describe the purpose of teaching behavior expectations.
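A retrospective pre/post form like this is usually summarized as the average rating before and after training for each item. A minimal sketch with invented responses (the 1–4 scale comes from the form above; the item labels and data are hypothetical):

```python
# Summarize retrospective self-assessment ratings (1-4 scale) as mean before/after per item.
# Responses are invented for illustration; each tuple is (before_training, after_training).
responses = {
    "Purpose of the implementation team": [(1, 3), (2, 4), (2, 3)],
    "Summarizing office discipline referral data": [(2, 3), (1, 3), (2, 4)],
}

def mean(values):
    values = list(values)
    return sum(values) / len(values)

for item, pairs in responses.items():
    before = mean(p[0] for p in pairs)
    after = mean(p[1] for p in pairs)
    print(f"{item}: before {before:.1f}, after {after:.1f}, change {after - before:+.1f}")
```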

19 MIBLSI

Knowledge and Understanding Example

Participant Knowledge and Understanding Survey
Response options: Strongly Agree, Agree, Slightly Agree, Disagree, Strongly Disagree (an X marks the sample response for each item)

1.  As a result of participating in this project, I have increased my knowledge of the key features of a Multi-Tiered Behavioral Framework.
2.  As a result of participating in this project, I have a deeper understanding of how to effectively implement a Multi-Tiered Behavioral Framework.
3.  As a result of participating in this project, I feel that our schools can better implement a Multi-Tiered Behavioral Framework.

20 MIBLSI

Knowledge and Understanding Annual Survey Example

[Chart: annual survey results for Increased Knowledge, Deeper Understanding, and Better Implementation]
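A chart like the annual survey example above is typically built by collapsing each item's Likert responses into a single percent-agreement figure per year. A minimal sketch with invented response counts (the response labels follow the survey; everything else is hypothetical):

```python
# Collapse Likert responses into percent agreeing (Strongly Agree + Agree) per item and year.
# Counts are invented for illustration.
AGREE = {"Strongly Agree", "Agree"}

survey = {
    "2014-15": {"Increased Knowledge": {"Strongly Agree": 40, "Agree": 35, "Slightly Agree": 15,
                                        "Disagree": 7, "Strongly Disagree": 3}},
    "2015-16": {"Increased Knowledge": {"Strongly Agree": 55, "Agree": 30, "Slightly Agree": 10,
                                        "Disagree": 4, "Strongly Disagree": 1}},
}

for year, items in survey.items():
    for item, counts in items.items():
        total = sum(counts.values())
        pct_agree = 100 * sum(n for r, n in counts.items() if r in AGREE) / total
        print(f"{year}  {item}: {pct_agree:.0f}% agreement")
```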

21 MIBLSI

MiBLSi Evaluation Components

•  REACH: extending the number of districts/schools implementing
•  CAPACITY: develop organizational structures and staff competencies
•  FIDELITY: implementation of effective practices
•  IMPACT: successful outcomes in student reading and behavior
•  Feedback loops across these components

22 MIBLSI

Reach

23 MIBLSI

Number of Schools Implementing MTBF Example

Level      | Schools in Participating District(s) | Implementing 2013-14 | Implementing 2014-15 | % Implementing 2013-14 | % Implementing 2014-15
Elementary | 24                                   | 10                   | 14                   | 42%                    | 58%
Middle     | 12                                   | 6                    | 9                    | 50%                    | 75%
High       | 8                                    | 3                    | 4                    | 38%                    | 50%
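The two percentage columns are simply the number of implementing schools divided by the number of schools in the participating districts. A quick sketch that recomputes them from the counts in the table:

```python
# Recompute the "% Implementing" columns from the counts in the table above.
schools = {  # level: (total schools, implementing 2013-14, implementing 2014-15)
    "Elementary": (24, 10, 14),
    "Middle": (12, 6, 9),
    "High": (8, 3, 4),
}

for level, (total, y2014, y2015) in schools.items():
    print(f"{level}: {y2014 / total:.0%} (2013-14), {y2015 / total:.0%} (2014-15)")
# Elementary: 42% / 58%, Middle: 50% / 75%, High: 38% / 50%
```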

24 MIBLSI

Cumulative Implementing Schools Chart Example

[Bar chart: number of schools implementing (new vs. existing) for Elementary, Middle, and High levels in 2013-14 and 2014-15]
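A chart like this can be produced directly from the reach data. A minimal matplotlib sketch, using the counts from the previous table and assuming all 2013-14 schools continued implementing into 2014-15 (so the "existing" vs. "new" split is an inference, not figures from the slide):

```python
# Cumulative implementing schools, stacked as existing vs. new, by level and year.
# Counts follow the previous table; the new/existing split assumes continuation from 2013-14.
import matplotlib.pyplot as plt

labels   = ["Elem\n2013-14", "Elem\n2014-15", "Middle\n2013-14", "Middle\n2014-15",
            "High\n2013-14", "High\n2014-15"]
existing = [0, 10, 0, 6, 0, 3]   # schools already implementing at the start of the year
new      = [10, 4, 6, 3, 3, 1]   # schools that began implementing that year

fig, ax = plt.subplots()
ax.bar(labels, existing, label="Existing")
ax.bar(labels, new, bottom=existing, label="New")
ax.set_ylabel("Number of Schools")
ax.set_title("Cumulative Implementing Schools")
ax.legend()
plt.tight_layout()
plt.show()
```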

25 MIBLSI

Capacity

26 MIBLSI

Measures of SEA/LEA Capacity (www.sisep.org)

State Capacity Assessment (SCA)
•  Assesses the extent to which a state department has the capacity to implement evidence-based practices at scales of social significance
•  No cost; available from OSEP's SISEP TA Center: www.sisep.org; www.scalingup.org
•  Scores: Total, Sub-scale, Item
•  Used for initial assessment, action planning, and progress monitoring

District Capacity Assessment (DCA)
•  Assesses the extent to which a school district (or region) has the capacity to implement evidence-based practices at scales of social significance
•  No cost; available from OSEP's SISEP TA Center
•  Scores: Total, Sub-scale, Item
•  Used for initial assessment, action planning, and progress monitoring
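Both instruments report Total and Sub-scale scores as the percentage of points earned out of points possible across their items. A minimal sketch of that roll-up, assuming the 0/1/2 item scoring used by the SISEP capacity assessments (the item and sub-scale names here are invented):

```python
# Roll item scores up into Sub-scale and Total percentages.
# Assumes 0/1/2 item scoring; item and sub-scale names are invented for illustration.
MAX_POINTS_PER_ITEM = 2

dca_items = {
    "Leadership": {"District implementation team in place": 2, "Coordinator identified": 1},
    "Competency": {"Training plan in place": 1, "Coaching system in place": 0,
                   "Fidelity data reviewed": 2},
}

total_earned = total_possible = 0
for subscale, items in dca_items.items():
    earned = sum(items.values())
    possible = MAX_POINTS_PER_ITEM * len(items)
    total_earned += earned
    total_possible += possible
    print(f"{subscale}: {100 * earned / possible:.0f}%")

print(f"Total: {100 * total_earned / total_possible:.0f}%")
```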

27 MIBLSI

District Capacity Assessment (DCA)

[Bar chart: DCA Total Score (0–70% range shown) for districts with 2+ years of implementation (n=11), 1 year of implementation (n=10), and new implementation (n=14)]

•  The primary purpose of the DCA is to assist school districts to implement effective innovations, such as PBIS, that benefit students.
•  The capacity of a district to facilitate building-level implementation refers to the systems, activities, and resources that are necessary for schools to successfully adopt and sustain Effective Innovations.

28 MIBLSI

Port Huron Area School District: District Capacity and PBIS School Tier 1 Fidelity

[Chart: District Capacity Assessment score and average school Tier 1 fidelity scores (0–100%) for the district and its schools in 2014-15 and 2015-16, with the fidelity criterion marked]

29 MIBLSI

Fidelity

30 MIBLSI

Fidelity Measures!

31 MIBLSI

Implementation Fidelity Summary Example

[Charts: implementation fidelity summary, Winter 2014 and Winter 2015]

32 MIBLSI

Impact

33 MIBLSI

[Charts: Schools with High Implementation Fidelity vs. Schools with Low Implementation Fidelity. Each chart plots behavioral referrals per 100 students per day and percent of students at reading benchmark across Year 1, Year 2, and Year 3]
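The behavior metric on these charts, behavioral (office discipline) referrals per 100 students per day, normalizes raw referral counts for enrollment and days of instruction so schools of different sizes can be compared. A minimal sketch of the calculation (the example numbers are illustrative):

```python
# Office discipline referrals per 100 students per school day.
def referrals_per_100_per_day(referrals: int, enrollment: int, school_days: int) -> float:
    """Normalize a referral count by enrollment (per 100 students) and days of instruction."""
    return referrals / (enrollment / 100) / school_days

# Example: 450 referrals in a school of 600 students over 180 school days.
print(round(referrals_per_100_per_day(450, 600, 180), 2))  # 0.42
```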

34 MIBLSI

Decision Support Data Systems

35 MIBLSI

Example Data Dashboard

36 MIBLSI

MIBLSI Website

37 MIBLSI

38 MIBLSI

Core Features of an Evaluation Report
•  Executive Summary
•  Goals
•  Activities
•  Outcomes
   •  Capacity
   •  Reach
   •  Fidelity
   •  Impact
•  Lessons Learned
•  Next Steps

39 MIBLSI

40 MIBLSI

Lessons Learned

•  Teams need to be taught how to analyze and use data
•  Emphasize directing resources to areas of need and removing competing activities
•  As we grow, it is even more important to systematically gather accurate data and then act on it for continuous improvement
•  More work is needed in developing feedback cycles

41 MIBLSI

“Things get done only if the data we gather can inform and inspire those in a position to make [a] difference.”

– Mike Schmoker, author and former school administrator, English teacher, and football coach

59 MIBLSI

•  Questions/Comments?

Maximizing Your Session Participation

When Working In Your Team

Consider 4 questions:

–  Where are we in our implementation?

–  What do I hope to learn?

–  What did I learn?

–  What will I do with what I learned?

Where are you in the implementation process?
Adapted from Fixsen & Blase, 2005

•  Exploration & Adoption: We think we know what we need, so we are planning to move forward (evidence-based)
•  Installation: Let's make sure we're ready to implement (capacity infrastructure)
•  Initial Implementation: Let's give it a try & evaluate (demonstration)
•  Full Implementation: That worked, let's do it for real and implement all tiers across all schools (investment); let's make it our way of doing business & sustain implementation (institutionalized use)

Leadership Team Action Planning Worksheets: Steps

•  Self-Assessment: Accomplishments & Priorities (Leadership Team Action Planning Worksheet)
•  Session Assignments & Notes: High Priorities (Team Member Note-Taking Worksheet)
•  Action Planning: Enhancements & Improvements (Leadership Team Action Planning Worksheet)