TRANSCRIPT
Illinois Statewide Implementation of the Problem Solving/RtI Initiative
NASP 2010 Annual Convention, Hyatt Regency, Chicago, IL
March 4, 2010
David Bell, St. Xavier University & I-ASPIRE Chicago, [email protected]
Gary L. Cates, Illinois State University & I-ASPIRE Central, [email protected]
Cox, Illinois State Board of Education, [email protected]
Ben Ditkowsky, Lincolnwood SD 74 & I-ASPIRE North, [email protected]
Golomb, Doctoral Student, Loyola University Chicago, [email protected]
Mark E. Swerdlik, Illinois State University & I-ASPIRE Central, [email protected]
Session Objectives
In today’s presentation, we will:
Provide an overview of Illinois ASPIRE and the major project evaluation components
Discuss project evaluation results from three data sources: the Self-Assessment of Problem Solving Implementation (SAPSI), student outcome data, and the IHE Checklist (a review of selected educator preparation programs in Illinois)
Share some of the challenges associated with data collection for project evaluation
5-year State Personnel Development Grant from OSEP (now in Year 5)
Based on 10+ years of state experience with problem-solving in a 3-tier model
Primary Goal: Establish and implement a coordinated, regionalized system of personnel development that will increase the capacity of LEAs to provide early intervening services [with an emphasis on K-3 reading], aligned with the general education curriculum, to at-risk students and students with disabilities, as measured by improved student progress and performance.
Illinois ASPIRE: Alliance for School-based Problem-solving & Intervention Resources in Education
I-ASPIRE Objectives
1. Deliver research-based professional development and technical assistance.
2. Increase the participation of parents in decision-making across district sites.
3. Incorporate professional development content into IHE general and special education preservice curricula.
4. Evaluate the effectiveness of project activities.
I-ASPIRE: Regional System for T.A. & Professional Development
4 regional Illinois ASPIRE Centers:
Illinois ASPIRE - Chicago: Chicago Public Schools
Illinois ASPIRE - North: Northern Suburban Special Ed.
Illinois ASPIRE - Central: Peoria ROE #48
Illinois ASPIRE - South: Southern Illinois University Edwardsville
Collaboratives of LEAs, IHEs, regional providers and parent entities
Responsible for:
Training to districts and parents in region
General technical assistance (T.A.)
On-site T.A. to school demonstration sites
Illinois T.A. Project Evaluation
Coordinated through Loyola University Chicago, Center for School Evaluation, Intervention & Training (CSEIT); http://www.luc.edu/cseit/i-aspire.shtml
Evaluation Framework:
1. If people are trained, do they implement?
2. If people implement, do they do so with fidelity?
3. If people implement with fidelity, do they sustain the practice(s) over time?
4. If people sustain the practice(s), what is the impact on student outcomes (school, group, individual)?
I-ASPIRE Evaluation: Key Data Sources
Self-Assessment of Problem Solving Implementation (SAPSI): Assesses the degree of implementation of the problem solving process at the building level as self-reported by school sites
Fidelity of Implementation Checklist: Designed to assess the degree to which problem solving & RtI processes are implemented as intended; involves a review of products by external reviewer
Student Outcome Data: Involves analysis of universal screening, progress monitoring, and state assessment (ISAT) results
Parent Survey: Assesses participation (more than satisfaction) in the problem solving process of parents and guardians whose children are receiving Tier 3 interventions
IHE Checklist: Designed to assess the amount of RtI content incorporated into IHE general and special education pre-service and graduate curricula
SAPSI and Fidelity Of Implementation Checklist
Gary L. Cates, Illinois State University
I-ASPIRE Central
Self-Assessment of Problem-Solving Implementation (SAPSI)
School/Administration Focused Problem-Solving Survey
25 questions
Completed twice per year
Action Planning Document
Developed over time and tweaked when necessary
SAPSI
[Embedded Adobe Acrobat document: the SAPSI form]
SAPSI Outcomes
[Series of charts showing SAPSI outcomes across seven slides; chart data not recoverable from the transcript]
Fidelity Checklist
External check of fidelity of implementation
Completed once per year (Spring)
5 “cases” from 5 randomly chosen schools
Inter-rater reliability of evaluators >80%
Sources: SIP, IPF, data files, CBM, training logs
Dichotomous scoring
Comments
A few additional scoring guidelines for specific items
Fidelity Checklist
[Embedded Adobe Acrobat document: the checklist form]
Fidelity Outcomes
[Series of charts showing Fidelity Checklist outcomes across thirteen slides; chart data not recoverable from the transcript]
Illinois ASPIRE Northern Region: Targeted Program Evaluation
Ben Ditkowsky, Lincolnwood School District 74
I-ASPIRE North
Evaluation Question
Assumption(s)
Successful implementation of RtI will increase average overall achievement and, in particular, ROI for students who receive Tier 2 and Tier 3 intervention.
Facts
Increases in achievement come from changes in curriculum and instruction, fidelity of implementation, increased behavior support, etc.
Why Use Local Assessments, Such As CBM?
State-mandated tests assess outcomes. Local assessments allow us to:
Measure students earlier than 3rd grade
Monitor progress more frequently than once per year
Rely on multiple assessment tools for our information
Develop an integrated assessment system with benchmarks for performance, linked to a common outcome
Question #1. Do scores on CBM Matter?
Source data: AIMSWEB
How many words did they read in one minute?
Sam read 22
Mary read 44
Juan read 65
Dorothy read 94
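For readers unfamiliar with the metric, the scores above are words read correctly in one minute. A minimal sketch of the standard CBM convention for computing that score (the word and error counts below are hypothetical, not from the slides):

```python
# Sketch: computing correct words per minute (CWPM/WRC) from a timed
# oral-reading probe. Probe lengths and error counts are invented.

def words_correct_per_minute(words_attempted, errors, seconds):
    """Correct words per minute from a timed reading probe."""
    correct = words_attempted - errors
    return correct * 60 / seconds

# The students above, assuming standard 60-second probes and
# hypothetical error counts:
print(words_correct_per_minute(24, 2, 60))   # Sam: 22.0 CWPM
print(words_correct_per_minute(47, 3, 60))   # Mary: 44.0 CWPM
```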
Curriculum-Based Measurement and High Stakes Testing
[Scatter plot: fall Reading CBM (correct words per minute) on the x-axis vs. ISAT Reading scale score (120-260) on the y-axis, with performance bands labeled Exceeds Standards, Meets Standards, Below Standards, and Academic Warning]
Curriculum-Based Measurement and High Stakes Testing
[Same scatter plot, highlighting one student]
In fall, Dorothy read 94 correct words in a minute. She obtained a score of 169 on the state test in the spring.
Curriculum-Based Measurement and High Stakes Testing
[Series of scatter plots of fall Reading CBM (CWPM) vs. ISAT Reading scale score for Grade 3, stepping a cut score upward slide by slide. Recoverable figures:]

Cut score (CWPM)    Group size n (of N)    Met / Did not meet    % met
at or below 20      1 (N = 143)            0 / 1                 0%
at or below 30      4 (N = 144)            0 / 4                 0%
at or below 40      9 (N = 144)            0 / 9                 0%
at or below 50      13 (N = 144)           2 / 11                15%
at or below 60      20 (N = 140)           6 / 14                30%
at or below 70      33 (N = 143)           15 / 18               45%
at or above 80.5    90 (N = 138)           85 / 5                94%

(At or below 80: n = 49, 27 met / 22 did not, 55% met.)

Summary chart (Grade 3, N = 138): below the cut, 55% met standards and 45% did not; above it, 94% met and 6% did not (0% / 100% also shown on the chart).
Curriculum-Based Measurement and High Stakes Testing
[Scatter plot, Grade 3 (N = 517); percentages shown on chart: 49% / 51% and 89% / 11%]
Curriculum-Based Measurement is a measure of general reading competence
Validity coefficients for R-CBM with the Comprehension subtest of the SAT were .91, as compared with Question Answering .82, Recall .70, and Cloze .72 (Fuchs, Fuchs & Maxwell, 1988).
The validity coefficient for Text Fluency of Folk Tales with the Iowa Test of Basic Skills Comprehension was .83 (Jenkins, Fuchs, Espin, van den Broek & Deno, 2000).
Fluency is causally related to reading comprehension (National Reading Panel - NICHD, 2000).
Is fall Curriculum-Based Measurement related to state testing?
Curriculum-Based Measurement and High Stakes Testing
[Scatter plot: fall Reading CBM (CWPM) vs. ISAT Reading scale score, Grade 3]
Curriculum-Based Measurement and High Stakes Testing
[Scatter plot, Grade 3; percentages shown on chart: 46% / 54% and 90% / 10%]
Curriculum-Based Measurement and High Stakes Testing
[Scatter plot, Grade 3; percentages shown on chart: 50% / 50% and 91% / 9%]
Curriculum-Based Measurement and High Stakes Testing
[Scatter plot, Grade 3; percentages shown on chart: 29% / 71%, 91% / 9%, and 62% / 38%]
Big Idea #1. Scores on CBM are related to results of high-stakes testing
Correct classification rates across samples: 90%, 81%, 71%, 71%, 83%, 88%
Below Basic: 115 of 144 did not meet standards (80%)
Proficient: 242 of 279 met standards (87%)
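The two percentages on this slide can be checked directly from the counts shown. A small sketch, using only the numbers above:

```python
# Sketch: verifying the slide's classification percentages from raw counts
# (115 of 144 Below Basic students did not meet standards; 242 of 279
# Proficient students met standards).

def percent(numerator, denominator):
    """Percentage, rounded to the nearest whole number."""
    return round(100 * numerator / denominator)

below_basic_correct = percent(115, 144)  # flagged Below Basic, did not meet
proficient_correct = percent(242, 279)   # flagged Proficient, met standards

print(below_basic_correct, proficient_correct)  # 80 87
```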
2006-07 Grade 3 ISAT and RCBM (AIMSWEB)
[Scatter plot: WRC (0-250) vs. ISAT scale score (100-350), with bands AW / BS / MS / ES (Academic Warning, Below Standards, Meets Standards, Exceeds Standards)]
74% risk identification; 92% proficiency identification
Large unit district, N = 1133: ~0.1% Native American, ~6% Asian, ~25% Black, ~29% Hispanic, ~38% White
RCBM (DIBELS) and ISAT 2006-07, Grade 3
[Scatter plot: WRC (0-260) vs. ISAT scale score (0-400), with bands W / B / M / E; percentages shown on chart: 91%, 88%, 63%, 12%, 37%, 9%]
Large unit district: Native American 0.1%, Asian 2.1%, Black 19.4%, Hispanic 69.5%, White 8.2%
Cross Validation of Cut Scores to ISAT (2008)
[Scatter plot: Curriculum-Based Measurement (0-260) vs. ISAT Reading scale score (100-350); 67% and 94% shown on chart; N = 2557]
CBM Is A Reliable Predictor Of ISAT
Predictive and Concurrent Validity

Grade   Fall r (N)     Winter r (N)   Spring r (N)
3       .71 (2557)     .74 (2601)     .74 (2605)
4       .74 (1320)     .73 (1335)     .72 (1333)
5       .71 (1811)     .73 (1620)     .72 (1659)
6       .72 (1364)     .74 (1370)     .70 (1319)

Note. Data from 8 small to moderate school districts in the Northern Region of Illinois, 2008.
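The coefficients in this table are correlations between R-CBM scores and ISAT scale scores. A minimal sketch of how such a Pearson correlation is computed, using invented score pairs rather than the project's data:

```python
# Sketch: Pearson correlation between fall R-CBM (CWPM) and spring ISAT
# scale scores. The paired scores below are hypothetical illustrations.
from math import sqrt

def pearson_r(xs, ys):
    """Pearson product-moment correlation of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical (fall CWPM, ISAT scale score) pairs:
cbm = [22, 44, 65, 94, 120, 150]
isat = [140, 155, 170, 185, 200, 215]
print(round(pearson_r(cbm, isat), 2))
```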
What Is The Probability Of Meeting Standards On The State Test?
[Logistic curves: probability of meeting standards on ISAT (0.0-1.0) as a function of fall R-CBM (Grade 3, 0-260 WRC), with empirical confidence bands]
For a probability of .5, a student would have to read 53 WRC (interval: 51 to 54 WRC)
For a probability of .8, a student would have to read 77 WRC (interval: 76 to 80 WRC)
Note. Empirical confidence intervals constructed through bootstrapping 100 samples without replacement.
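The two anchor points quoted above (probability .5 at 53 WRC, .8 at 77 WRC) are enough to pin down a logistic curve. As an illustration only (this reconstructs the displayed relationship from those two points; it is not the project's fitted model), the WRC needed for any target probability can then be read off:

```python
# Sketch: a logistic curve p(meet) = 1/(1+exp(-(a + b*wrc))) consistent
# with the slide's two anchors: p = .5 at 53 WRC and p = .8 at 77 WRC.
import math

def logit(p):
    return math.log(p / (1 - p))

# Solve for slope and intercept from the two anchor points:
x50, x80 = 53, 77
b = (logit(0.80) - logit(0.50)) / (x80 - x50)  # slope per WRC
a = -b * x50                                   # logit(.5) = 0 at 53 WRC

def p_meet(wrc):
    """Probability of meeting standards at a given fall WRC."""
    return 1 / (1 + math.exp(-(a + b * wrc)))

def wrc_for(p):
    """WRC needed for a target probability of meeting standards."""
    return (logit(p) - a) / b

print(round(p_meet(53), 2), round(p_meet(77), 2))  # 0.5 0.8
print(round(wrc_for(0.9)))  # WRC needed for a .9 probability
```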
Cut-Scores For Proficiency

Grade   Measure   Fall BB/Prof   Winter BB/Prof   Spring BB/Prof
K       LNF*      2 / 8          15 / 27          29 / 40
K       LSF       3 / 13         15 / 30          25 / 40
K       ISF*      4 / 8          10 / 25          - / -
K       PSF*      - / -          7 / 18           10 / 35
K       NWF*      - / -          5 / 20           15 / 25
1       LSF       20 / 35        - / -            - / -
1       NWF*      15 / 25        30 / 50          - / -
1       R-CBM     0 / 20         20 / 40          40 / 60
1       MAZE      x / x          2 / 6            5 / 10
2       R-CBM     30 / 45        55 / 65          70 / 90
2       MAZE      2 / 4          5 / 10           8 / 15
3       R-CBM     51 / 76        72 / 97          87 / 114
3       MAZE      5 / 10         10 / 17          15 / 22
4       R-CBM     71 / 97        87 / 113         98 / 126
4       MAZE      7 / 11         13 / 18          15 / 20
5       R-CBM     80 / 106       94 / 121         110 / 138
5       MAZE      11 / 17        16 / 23          21 / 28

(BB = Below Basic cut; Prof = Proficient cut)
Note. Cut scores for grades 3-5 revised based on 2008 ISAT data.
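As a sketch of how such a table could be applied in screening, here is a lookup for the Grade 3 R-CBM rows above. The three-way labels (a "monitor" band for scores between the two cuts) are our assumption for illustration, not terminology from the slides:

```python
# Sketch: flagging Grade 3 students against the R-CBM cut scores above.
# Assumes scores at or below the Below Basic cut are flagged, scores at or
# above the Proficient cut are on track, and scores in between warrant
# monitoring (an interpretive choice, not stated on the slide).

GRADE3_RCBM_CUTS = {          # season: (below_basic_cut, proficient_cut)
    "fall":   (51, 76),
    "winter": (72, 97),
    "spring": (87, 114),
}

def flag_student(season, wrc):
    below, proficient = GRADE3_RCBM_CUTS[season]
    if wrc <= below:
        return "below basic"
    if wrc >= proficient:
        return "proficient"
    return "monitor"

print(flag_student("fall", 44))    # below basic
print(flag_student("fall", 80))    # proficient
print(flag_student("winter", 85))  # monitor
```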
A Summary of Adequate Progress
Using cut scores, we can determine the effects of curriculum and instruction on students based on an estimate of initial skill.
Kindergarten Measures by Time
Students were administered CBM or TELS at three points in time (Fall, Winter, and Spring).
Caveat: Not every district measured the same things or at the same times (i.e., because of local decision-making, the dependent variable is not well controlled).
Fall – Letter Naming Fluency
Winter – Either Letter Sounds Fluency or Nonsense Word Fluency
Spring – Either Letter Sounds Fluency or Nonsense Word Fluency
Approximately 70% of students began the year with adequate Print Awareness, but for only about 60% of these did this translate into alphabetic skills.
Approximately 12% of students began the year with questionable Print Awareness. About 40% of these students were able to demonstrate sufficient alphabetic skill in spring.
Approximately 17% of students began the year with little to no Print Awareness. About 60% of these demonstrated progress, but only 30% demonstrated sufficient alphabetic skill.
ALL Schools Effects in K: For Students Entering With and Without Print Awareness
Proficient: 1.2 times more likely to demonstrate proficiency
1.7 times more likely to demonstrate proficiency
Overall: 1.2 times more likely to demonstrate progress

A High Fidelity Implementation Site
Proficient: 1.1 times more likely to demonstrate progress
0.7 times as likely to demonstrate progress
Overall: 1.1 times more likely to demonstrate progress

A Demographically Diverse Site
Below Basic: 1.3 times more likely to demonstrate progress
Proficient: equally likely to demonstrate progress
0.4 times as likely to demonstrate progress
Overall: 0.9 times as likely to demonstrate progress

A New Implementation Site
Below Basic: 0.4 times as likely to demonstrate progress
Quick Summary of SoAP Findings for Kindergarten Students in the N. Region
Print Awareness increases the likelihood of the development of Alphabetic Skill, but it is not enough.
Too few students enter kindergarten with sufficient Print Awareness (1 of 3 does not have sufficient print awareness).
Students who enter kindergarten with sufficient print awareness are 1.5 times more likely to meet key benchmarks than those with questionable skill, and 2.14 times more likely than those who enter without sufficient skill.
Only our highest implementation site consistently outperformed the general trend, and even there only 2 of 3 students demonstrated sufficient progress.
Caveat: Not every district measured the same things or at the same times (i.e., because of local decision-making, the dependent variable is not well controlled).
Growth Over Time: Examination of Cohorts
[Three line charts of R-CBM scores (0-200) across eight benchmark periods for Cohort 1 (Gr 1 2004-05), Cohort 2 (Gr 1 2005-06), and Cohort 3 (Gr 1 2005-06), each plotted against benchmark targets and the Cohort 1 trend]
Adequate Progress With CBM and TELS
On average, I-ASPIRE North sites have maintained performance well above expected targets.
As new sites enter the project, the overall trends have been maintained.
Ultimately, Schools in Illinois Are Evaluated Based on State Testing Results
A High Fidelity Implementation Site
89%, 97% — Progress?
82%, 77% — Progress?

When Demographics Change: A High Poverty Implementation Site
33%, 27% — Progress?
Up until 2006, ELL students were assessed with the Illinois Measure of Annual Progress in English.

A Newer Implementation Site
74%, 85% — Progress?

Another Newer Implementation Site
41%, 47% — Progress?
Conclusion
We have seen improvement over time for students.
How much improvement we see depends on our focus.
Ultimately, increases in achievement come from changes in curriculum and instruction, fidelity of implementation, increased behavior support, etc.
Survey of Institutions of Higher Education
Mark E. Swerdlik, Illinois State University
I-ASPIRE Central
Institutions of Higher Education Checklist: Purpose, Methodology, and Sample
Purpose: Review preservice and graduate curricula for the degree to which they include school-based problem solving and early intervening services (the RtI knowledge and skill set).
Methodology: An expert rater reviewed syllabi for relevant courses and interviewed a faculty member familiar with the preparation program.
Sample: Five IHEs within the four I-ASPIRE regions. Programs included school psychology, general education, special education, and educational leadership/administration.
Sections of IHE Checklist
Section 1: Three-Tier Problem Solving and Response to Intervention
Section 2: Universal Screening and Problem Identification
Section 3: Scientifically Based Reading Instruction in a Three Tier Model
Section 4: Scientifically Based Progress Monitoring Tools
Section 5: Effective Problem Solving Teams
Rating Scale for IHE Checklist
Rating / Meaning of Score
0 / No evidence that the component is included in the class
1 / Component is mentioned in the class
2 / Component is mentioned in the class AND there are required readings, assignments, and/or projects for application
A higher rating indicated that the curriculum included problem solving, RtI, and early intervening services content.
Open-Ended Interview
How does your program prepare pre-service students to participate in three-tier problem-solving models and Response to Intervention?
How does your program prepare pre-service students to participate in universal screening and problem identification as part of this model?
How does your program prepare pre-service students to implement scientifically-based reading instruction as part of this model?
How does your program prepare pre-service students to implement scientifically-based progress monitoring in a three-tier model?
How does your program prepare pre-service students to participate in effective problem-solving teams?
Space was left for additional questions that arose during the interview. Only two of the five schools submitting syllabi completed the interview process described.
Institutions of Higher Education Checklist: Overall, little implementation
Percentage of Items by Section
[Stacked bar chart: percentage of items at each level by section (all schools). Values shown for sections 1-5: Level 0: 77.5%, 81.8%, 80.6%, 81.4%, 88.1%; Level 1: 10.5%, 10.9%, 11.2%, 9.0%, 12.0%; Level 2: 7.4%, 8.1%, 9.6%, 5.8%, 6.1%]
School Psychology: Highest Percentage of Infusion
[Stacked bar chart: percentage of items at each level by program (all schools); row pairings reconstructed from the chart values, which sum to ~100% per program]

Program                     Level 0   Level 1   Level 2
Special Education           74.4%     14.9%     10.7%
School Psychology           67.1%     10.2%     22.7%
Administration              64.7%     21.2%     14.1%
General Education           76.5%     20.0%     3.5%
Early Childhood Education   100.0%    -         -
Reading                     97.1%     1.0%      2.0%
Secondary Education         99.0%     -         1.0%
Elementary Education        100.0%    -         -
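The stacked-bar percentages in the two charts above are tallies of the 0/1/2 checklist ratings. A minimal sketch of that tally, with invented ratings for illustration:

```python
# Sketch: percentage of checklist items at each rating level
# (0 = no evidence, 1 = mentioned, 2 = mentioned with required
# readings/assignments). The ratings list below is hypothetical.
from collections import Counter

def level_percentages(ratings):
    """Percentage of items at each level, rounded to one decimal."""
    counts = Counter(ratings)
    n = len(ratings)
    return {level: round(100 * counts[level] / n, 1) for level in (0, 1, 2)}

print(level_percentages([0, 0, 0, 1, 2, 0, 1, 0]))
# {0: 62.5, 1: 25.0, 2: 12.5}
```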
Themes and Subcategories of Interview
Field Experience
- Students participate in clinical experiences
- Students participate in a practicum
- Students participate in student teaching
Coursework
- Topics are highlighted in class work
- Students must take a special education course dealing with these topics
- A topic running through these interviews is reading strategies
Lack of Implementation/Application
- Interviewees indicate a possibility for future courses
Conclusions and Implications
Low rate of infusion of RtI knowledge and skill set development in preservice preparation programs
School psychology graduate preparation programs, followed by special education, have the highest rates of infusion
At some schools this activity has led to increased communication about the importance of these topics and professional development for program faculties (e.g., a meeting with public deans and the topic of the annual Education Institute)
Data Collection “Challenges”
David Bell, St. Xavier University
I-ASPIRE North
Data Collection
One major component of the grant is focused on retrieving student outcome data (behavior and academic).
Some barriers:
Inconsistency with respect to the data that are collected
Some districts did not have an appropriate data management system to collect behavior information
Additional variables (beyond the grant’s control) impacted the ability to collect academic screening data
SAPSI
Collecting reliable information from the participants required revisions to the original document
Some participants viewed the document as “busy work,” especially districts that had been implementing the key principles for years prior to the grant
Multiple administrations seemed overwhelming for participants
IHE Tool
Required the analysis/interpretation of syllabi and interviews with professors
Difficulty truly understanding the depth of course content just by reviewing syllabi
Difficulty establishing a sense of urgency among university professors
Additional Tools
Fidelity Checklist
Collecting data was a challenge, at a minimum
Parent Survey
Collecting the information was a consistent barrier
General Information on I-ASPIRE
http://www.illinoisaspire.org/welcome/
All Evaluation Forms Available
http://www.luc.edu/cseit/i-aspireresourcesforcoordinator.shtml
Questions
Thank You for Your Attention!