Cecil J. Picard Center for Child Development
University of Louisiana at Lafayette
Sessions 22A & 22B
Holly Howat, Oliver Winston, Greg Crandall


TRANSCRIPT

Cecil J. Picard Center for Child Development
University of Louisiana at Lafayette

Sessions 22A & 22B
Holly Howat, Oliver Winston, Greg Crandall

PBS in Louisiana: 2006-2007 Evaluation Findings
Understanding the power of data-based decisions

Cecil J. Picard Center for Child Development
The Cecil J. Picard Center for Child Development was established in 2005 at the University of Louisiana at Lafayette. Our mission is to improve Louisiana by focusing on its children. The Center is dedicated to providing high-quality, rigorous evaluation of programs that address learning from birth to adulthood. The Center is proud to partner with many state agencies, including the Department of Education. Our Center's work with DOE includes the evaluation of the implementation of Positive Behavior Support.

Evaluation Focus
School-wide Evaluation Tool
Correlation Analysis
Behavioral Characteristics
Academic Characteristics
Risk and Protective Factors
Qualitative Results for District-Wide Implementation

Positive Behavioral Support: Schools Trained
2006-2007 School Year


School-wide Evaluation Tool

The more experience a sampled school has with universal-level PBS, the better it is at implementing it.

Most sampled schools showed strengths in monitoring and district support and had difficulty with expectations taught.

[Chart: Comparison of 2006-07 SET Total Scores Across Cohorts by Years of Experience. Y-axis: percentage (0-100); X-axis: 1-4 years of experience. Cohort 1, N=6 schools; Cohort 2, N=15 schools; Cohort 3, N=11 schools; Cohort 4, N=7 schools.]

[Chart: SET Total and Subcategory Mean Scores for All Sampled PBS Schools. Y-axis: percentage (0-100); categories: Total Score, Expectations Defined, Expectations Taught, Reward System, Violation System, Monitoring, Management, District Support.]

Correlation Analysis
This graph indicates a statistically significant correlation between School-wide Evaluation Tool (SET) scores and Benchmarks of Quality scores.

[Scatter plot: SET Scores in Relation to Benchmark Scores. X-axis: Benchmarks (0-100); Y-axis: SET (0-100); SET-Benchmarks data points with a linear trend line.]
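The analysis above pairs each sampled school's SET score with its Benchmarks of Quality (BoQ) score and tests whether the two move together. A minimal sketch of that kind of check, using hypothetical scores rather than the Center's actual data or analysis code:

```python
# Hypothetical school-level scores (percent implementation), one pair per school.
from scipy import stats

set_scores = [62, 71, 80, 85, 90, 93, 55, 78, 88, 96]
boq_scores = [58, 65, 77, 82, 88, 91, 50, 74, 85, 94]

# Pearson correlation coefficient and its two-sided p-value.
r, p_value = stats.pearsonr(set_scores, boq_scores)
print(f"r = {r:.2f}, p = {p_value:.4f}")

# Slope and intercept of a trend line, like the "Linear (SET - Benchmarks)"
# series shown in the scatter plot (SET regressed on Benchmarks).
slope, intercept, *_ = stats.linregress(boq_scores, set_scores)
print(f"SET is approximately {slope:.2f} * BoQ + {intercept:.2f}")
```

A small p-value here would support the slide's claim that the two implementation measures are statistically related.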

Behavioral Characteristics: Suspension Rates

Sampled schools with over two years of PBS implementation had much lower increases in in-school suspension rates.

A similar pattern existed for out-of-school suspension rates.

[Chart: Change in ISS Rates from 2003-04 to 2005-06, by cohort (percentage): Cohort 1, 1.05; Cohort 2, 1.04; Cohort 3, 6.73; Cohort 4, 6.86.]

[Chart: Change in OSS Rates from 2003-04 to 2005-06, by cohort (percentage): Cohort 1, -2.45; Cohort 2, 0.02; Cohort 3, 0.45; Cohort 4, 2.67.]
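For readers less familiar with the rate arithmetic behind these two charts: each change figure is the difference between a cohort's suspension rate in 2005-06 and its rate in 2003-04. A worked example with hypothetical counts, assuming a rate of incidents per 100 students (the evaluation's exact rate definition may differ):

```python
def suspension_rate(suspensions, enrollment):
    """Suspension incidents per 100 students (a percentage)."""
    return 100.0 * suspensions / enrollment

rate_2003_04 = suspension_rate(suspensions=48, enrollment=600)  # 8.0
rate_2005_06 = suspension_rate(suspensions=54, enrollment=600)  # 9.0

# The charts report the difference between the two school years.
change = rate_2005_06 - rate_2003_04
print(f"Change in suspension rate: {change:+.2f} percentage points")
```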

Academic Characteristics: Test Scores and Retention Rates

A general pattern of decline in retention rates can be observed in this sample.

From the data collected for 2006-2007, there was no discernible correlation between PBS implementation and academic outcomes on test scores.

[Chart: Retention Rates Over Time by Year (2003-04, 2004-05, 2005-06). Y-axis: percentage (0-20). Cohort 1, N=9 schools; Cohort 2, N=24 schools; Cohort 3, N=17 schools; Cohort 4, N=8 schools; state average, N=1,475 schools.]

Risk and Protective Factors

Protective factors increased in Grades 6 and 8, particularly rewards for pro-social behavior.

Risk factors decreased in Grades 6 and 8, particularly low commitment to school.

[Chart: PBS Sample School Results on CCYS Protective Factor "Rewards for Pro-social Behaviors," by grade (6, 8, 10). Y-axis: percentage (0-100); 2004 vs. 2006, N=34 schools each year.]

[Chart: PBS Sample School Results on CCYS Risk Factor "Low Commitment to School," by grade (6, 8, 10). Y-axis: percentage (0-100); 2004 vs. 2006.]

Qualitative Results for District-Wide Implementation

Component One: District-wide PBS implementation must have continued technical assistance from LDE, LSU, and UL Lafayette in the development, implementation, and evaluation of a district-wide plan.

Component Two: District-wide PBS implementation must include the organization of personnel, resources, and time, as well as set out goals and strategies for sustainability and expansion.

Component Three: District-wide PBS implementation must have superintendent buy-in at the district level as well as principal buy-in at the school level.

Qualitative Results for District-Wide Implementation

Component Four: District-wide PBS implementation must have training and technical assistance that is consistent and continual.

Component Five: District-wide PBS implementation must address a systematic method for collecting, analyzing, and using data to make decisions.

Component Six: District-wide PBS implementation must include an evaluation of the implementation across the district.

Data Driven Decision Making
At the Picard Center for Child Development, we collect and analyze data to inform policy makers so they can make informed decisions.

Schools and districts can also collect and analyze data so they can make informed decisions.

Data Driven Decision Making

PURPOSE: To review critical features & essential practices of data collection and the analysis of data for interventions.

School-wide Positive Behavior Support Systems

[Diagram: four interrelated systems: School-wide Systems; Classroom Setting Systems; Non-classroom Setting Systems; Individual Student Systems.]

Data Collection Examples

An elementary school principal found that over 45% of the school's behavioral incident reports were coming from the playground.

A high school assistant principal reported that over two-thirds of behavior incident reports came from the cafeteria.
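Findings like the two above come straight from counting referrals by location. A short sketch of that tally, using a hypothetical list of referral records (the field names are illustrative, not a state data-system schema):

```python
from collections import Counter

# Hypothetical office discipline referral records.
referrals = [
    {"student": "A", "location": "playground"},
    {"student": "B", "location": "playground"},
    {"student": "C", "location": "cafeteria"},
    {"student": "A", "location": "playground"},
    {"student": "D", "location": "classroom"},
    {"student": "E", "location": "bus"},
]

# Count referrals per location and report each location's share of the total.
by_location = Counter(r["location"] for r in referrals)
total = len(referrals)
for location, count in by_location.most_common():
    print(f"{location:<12} {count:>3}  ({100 * count / total:.0f}% of referrals)")
```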

Data Collection Examples

A middle school secretary reported that she was getting at least one neighborhood complaint daily about student behavior during arrival and dismissal times.

Over 50% of referrals were occurring on buses during daily transitions.

Data Collection Examples

At least twice per month, police are called to settle arguments between parents and their children in parking lots.

A high school nurse lamented that “too many students were asking to use her restroom” during class transitions.

Data Collection Questions

What system does the parish utilize for data collection?

How is the data system being used in each school setting?

How frequently are data collection system reports generated (bi-weekly, monthly, grading-period, and/or semester reports)?
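On the report-frequency question, the core step is rolling referral records up by reporting period. A sketch with hypothetical dates, grouping by calendar month (the same idea works for bi-weekly or semester windows by changing the grouping key):

```python
from collections import Counter
from datetime import date

# Hypothetical referral dates, e.g. pulled from the parish's data system.
referral_dates = [
    date(2006, 9, 5), date(2006, 9, 18), date(2006, 10, 2),
    date(2006, 10, 3), date(2006, 10, 24), date(2006, 11, 7),
]

# Count referrals per calendar month to produce a periodic report.
per_month = Counter(d.strftime("%Y-%m") for d in referral_dates)
for month, count in sorted(per_month.items()):
    print(f"{month}: {count} referrals")
```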

Minimal School-Level Data Collection Needs

Minor referrals
Major referrals
Referrals by staff member
Referrals by infraction
Referrals by location
Referrals by time
Referrals by student
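One way to see why these fields are the minimum: if each referral is stored as a record with these attributes, every breakdown in the list above is a simple group-and-count over one field. A hypothetical record shape (illustrative names, not an official schema):

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Referral:
    student_id: str
    referring_staff: str
    infraction: str        # e.g. "disruption", "fighting"
    location: str          # e.g. "bus", "cafeteria", "playground"
    occurred_at: datetime  # supports referrals-by-time reports
    major: bool            # True for major referrals (ODRs), False for minor

# With records in this shape, each breakdown is a group-and-count over one
# field (staff member, infraction, location, hour of day, or student).
example = Referral("S-0042", "Ms. Smith", "disruption", "bus",
                   datetime(2007, 3, 14, 7, 45), major=False)
print(example)
```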

Minimal District-Level Data Collection Needs

Major referrals (ODRs)
Referrals by incident
Referrals by infraction
Times of incidents
Locations of incidents (what school and where in the school)

Data Analysis Questions

How is the data displayed (graphs, tables, etc.), and is it effective?

What are the outcomes of data review?

Are data-based decisions reached?

How are data-based decisions monitored for effectiveness?

Minimal School-Level Data Analysis Needs

PBS team should be part of analysis process

Data should be reviewed to determine patterns of problem behaviors

Decisions should be based upon data presented

Decisions should include an intervention that can be successfully implemented and monitored.

Using Data to Make Decisions

What interventions are needed to respond to problem behaviors?

How do we implement the intervention throughout the school?

What is the timetable for the intervention to show a decrease in undesirable behavior?
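A sketch of the monitoring step behind that last question: compare referral counts before and after the intervention's start date over the agreed timetable. The dates, counts, and weekly grouping below are hypothetical:

```python
from datetime import date

# (week_start, referral_count) pairs, e.g. pulled from the school's data system.
weekly_counts = [
    (date(2007, 1, 8), 22), (date(2007, 1, 15), 25), (date(2007, 1, 22), 21),
    (date(2007, 2, 5), 14), (date(2007, 2, 12), 12), (date(2007, 2, 19), 11),
]
intervention_start = date(2007, 2, 1)

# Split the weeks at the intervention start date and compare averages.
before = [n for week, n in weekly_counts if week < intervention_start]
after = [n for week, n in weekly_counts if week >= intervention_start]

avg_before = sum(before) / len(before)
avg_after = sum(after) / len(after)
print(f"Average weekly referrals: {avg_before:.1f} before, {avg_after:.1f} after")
print(f"Change: {100 * (avg_after - avg_before) / avg_before:+.0f}%")
```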

Contact Information

Dr. Holly Howat, [email protected]

Mr. Oliver Winston, [email protected]

http://ccd-web.louisiana.edu/