1
Data Analysis within an RtI2 Framework: Linking Assessment to Intervention
Aimee R. Holt, PhD
Middle Tennessee State University
2
What is RTI2?
•A systematic and data-based method for addressing academic concerns:
–identifying
–defining &
–resolving
Brown-Chidsey & Steege (2010)
3
RTI2 is a general education initiative….
•Components of RTI2
–High-quality instruction
–Frequent assessment of academic skills
–Data-based decision making
Brown-Chidsey & Steege (2010)
4
Problem Solving
• At each tier within RTI2, a problem-solving model is employed to make decisions:
– Problem Identification: Define the Problem; Develop an Assessment Plan
– Problem Analysis: Analyze the Assessment Plan Results; Develop an Intervention Plan
– Plan Implementation: Implement Plan; Progress Monitor
– Plan Evaluation: Analyze the Results of Implementation; Determine Next Steps
5
What would Assessment at Tier I look like?
6
Universal Screeners
•LEAs are required to:
–Administer a nationally normed, skills-based universal screener to students at their grade level
7
–For K-8, universal screeners should be administered 3 times per year
–In grades 9-12, there are multiple sources of data that can be reviewed, such as:
• EXPLORE, PLAN, and ACT; the Tennessee Comprehensive Assessment Program (TCAP), which includes Writing (TCAP-WA), End of Course (EOC), and 3-8 Achievement; in 2014-2015, the Partnership for Assessment of Readiness for College and Careers (PARCC); TVAAS
8
Characteristics of Appropriate Universal Screening Tools
•Helps answer questions about the efficiency of the core program
–Aligns with curriculum for each grade level
•Skills mastery aligns with the state-mandated year-end assessment
Ikeda, Neessen, & Witt (2008).
9
3 Types of CBMs
•General Outcome Measures (GOMs)
•Skills-Based Measures
•Subskill Mastery Measures
10
General Outcome Measures
•GOMs sample performance across several goals at the same time (capstone tasks)
–Ex. Oral reading fluency
•Can be used for
–screening (benchmarking),
–survey & specific level assessment
–progress monitoring
11
Skills-Based Measures
•SBM are similar to GOM’s but can be used when capstone tasks are not available–Ex. Math computation
•Can be used for –screening (benchmarking), –survey & specific level assessment–progress monitoring
12
Subskill Mastery Measures
•SMMs are very narrow in focus
–Ex. Names of letters
•Should not be used for benchmarking
–(exception: early skills such as Letter Naming Fluency, Letter Sound Fluency, Number Naming Fluency)
13
Example Reading Skills Typically Assessed by Universal Screeners
Grade | Areas Typically Assessed by Universal Screeners
6th | Oral Reading Fluency; Reading for understanding
5th | Oral Reading Fluency; Reading for understanding
4th | Oral Reading Fluency; Reading for understanding
3rd | Oral Reading Fluency; Reading for understanding
2nd | Oral Reading Fluency; Reading for understanding
1st | Letter Naming Fluency (beginning); Phonemic Awareness; Phonics; Word Identification Fluency; Oral Reading Fluency (end)
K | Letter Naming Fluency; Phonemic Awareness; Early Phonics Skills including Letter Sound Fluency
14
What would Data Analysis at Tier I look like?
15
Making Decisions about Group Data
•Review universal screening data to answer the following questions:
–Is there a class wide problem?
–Who needs a Tier II intervention?
•Be sure to examine students at the margin
–Does anyone need Tier III now?
16
17
Who needs Tier II or Enrichment?
• Winter benchmark for ORF: 90th percentile = 153; 25th percentile = 72
• Winter benchmark for Maze: 90th percentile = 25; 25th percentile = 9
• Instructional level criteria:
– For contextual reading: 93-97% correct
– For most other academic skills: 85-90% correct

Sample student data (score / accuracy):
ORF         Maze
154 / 100%  26 / 98%
154 / 85%   26 / 79%
68 / 95%    9 / 94%
68 / 88%    8 / 80%
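As a rough sketch of how the cut scores and accuracy criteria above combine into a placement decision (the function name, wiring of thresholds, and return labels are illustrative, not an official protocol):

```python
# Hypothetical sketch of the Tier-placement screening rules described above.
# The 25th/10th percentile cut scores and the 93-97% instructional-level
# criterion are the winter ORF values from the slides; everything else
# (names, labels) is invented for illustration.

def screening_decision(orf_wcpm, orf_accuracy,
                       p25=72, p10=44, instr_low=0.93):
    """Return a coarse placement suggestion from one ORF screening score."""
    if orf_wcpm < p10:
        return "consider Tier III"
    if orf_wcpm < p25:
        return "Tier II candidate"
    # At or above the 25th percentile: check accuracy, since the
    # instructional level for contextual reading is 93-97% correct.
    if orf_accuracy < instr_low:
        return "at the margin - collect additional data"
    return "core instruction (Tier I)"

for wcpm, acc in [(154, 1.00), (73, 0.82), (68, 0.95), (42, 0.83)]:
    print(wcpm, acc, "->", screening_decision(wcpm, acc))
```

Note how the 73 wcpm / 82% student passes the rate cut but fails the accuracy check, which is exactly the "student at the margin" case on the next slide.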
18
Examining Students at the Margins
• Winter benchmark for ORF: 90th percentile = 153; 25th percentile = 72
• Winter benchmark for Maze: 90th percentile = 25; 25th percentile = 9
• Instructional level criteria: for contextual reading, 93-97% correct

ORF        Maze
75 / 96%   11 / 100%
80 / 100%  10 / 97%
73 / 82%   11 / 75%
19
Identifying who needs Tier III
• Winter benchmark for ORF: 25th percentile = 72; 10th percentile = 44
• Winter benchmark for Maze: 25th percentile = 9; 10th percentile = 6
• Instructional level criteria: for contextual reading, 93-97% correct

ORF        Maze
46 / 76%   6 / 80%
42 / 83%   5 / 75%
20
Referral to Tier II Decision Tree
Core literacy instruction has been implemented with fidelity ≥80% of student needs are met by core instruction
Differentiated instruction has been provided in a small group within core literacy instruction
Student has been present for ≥75% of instructional days
Student has passed vision and hearing screening
Data indicate performance below the 25th percentile on universal screening of student achievement compared to national norms
Additional Assessment data supports universal screening data
21
What do we mean by linking assessment to intervention?
22
Linking Assessment to Interventions….
•Research has shown that effective interventions have certain features in common:
–Correctly targeted to the student’s deficit
–Appropriate level of challenge (instructional range)
–Explicit instruction in the skill
–Frequent opportunities to practice (respond)
–Provide immediate corrective feedback
(e.g., Brown-Chidsey & Steege, 2010; Burns, Riley-Tillman, & VanDerHeyden, 2013; Burns, VanDerHeyden, & Boice, 2008)
23
Academic Instruction in Reading
•Both NCLB and IDEA require that instruction in the general education setting cover all 5 areas of reading identified by the National Reading Panel
• Phonemic Awareness
• Phonics
• Fluency
• Vocabulary
• Text Comprehension Strategies
24
Linking the 5 Skill Areas to 3 SLD Areas
• Reading Comprehension: Vocabulary; Text Comprehension Strategies
• Reading Fluency: Fluency
• Basic Word Reading: Phonemic Awareness; Phonics
25
Phonological Awareness
•A metacognitive understanding that words we hear have internal structures based on sound
–Research on PA has shown that it exerts an independent causal influence on word-level reading (Berninger & Wagner, 2008)
–Phoneme: the smallest unit of speech
•The English language has 44-46 phonemes
26
Phonics
•Alphabetic principle: linking phonological (sound) and orthographic (symbol) features of language (Joseph, 2006)
–Important for learning how to read and spell
•National Reading Panel: students who received explicit alphabetic principle instruction showed benefits through the 6th grade
–Phonological awareness is a prerequisite skill
27
• Word Reading Skills - (McCormick, 2003)
–Word identification: the instance when a reader accesses one or more strategies to aid in reading words (e.g., applying phonic rules or using analogies)
•Decoding – blending sounds in words or using letters in words to cue the sounds of others in a word (Joseph, 2006)
–Word recognition: the instant recall of words or reading words by sight; automaticity
28
Fluency
•“The ability to read a text quickly, accurately, and with proper expression” (NRP, 2000, pp. 3-5)
•Most definitions of fluency include an emphasis on prosody: the ability to read with correct expression, intonation, and phrasing (Fletcher et al., 2007)
•National Reading Panel: good reading fluency skills improved recognition of novel words, expression during reading, accuracy, and comprehension
29
Vocabulary & Text Comprehension Skills
• Vocabulary knowledge, including understanding multiple meanings of words, figurative language, etc.
• Identifying stated details
• Sequencing events
• Recognizing cause and effect relationships
• Differentiating facts from opinions
• Recognizing main ideas – getting the gist of the passage
• Making inferences
• Drawing conclusions
30
What Would Assessment at Tier II Look Like?
31
So you have identified your “at-risk” students. Now what?
•You will need to conduct Survey Level Assessment (SLA) for these students
•Survey Level Assessment (SLA)
–Can be used to: (a) identify the difference between prior knowledge and skill deficits in order to plan instructional interventions, and (b) serve as a baseline for progress monitoring
32
Why is it important to conduct Survey Level Assessments before beginning Tier II interventions?
•The primary question being addressed by the survey level assessment at Tier II is
–“What is the CATEGORY of the problem”
–(What is the specific area of academic deficit?)
(e.g., Riley-Tillman, Burns, & Gibbons, 2013)
33
An Example of Survey Level Assessment Using DIBELS
Grade CBM Assessed Benchmarked
6th Oral Reading Fluency Fall, Winter, Spring
5th Oral Reading Fluency Fall, Winter, Spring
4th Oral Reading Fluency Fall, Winter, Spring
3rd Oral Reading Fluency Fall, Winter, Spring
2nd Oral Reading Fluency Fall, Winter, Spring
1st Oral Reading Fluency Winter, Spring
1st Nonsense Word Fluency Fall, Winter, Spring
1st Phoneme Segmentation Fluency Fall, Winter, Spring
1st Letter Naming Fluency Fall
K Nonsense Word Fluency Winter, Spring
K Phoneme Segmentation Fluency Winter, Spring
K Letter Naming Fluency Fall, Winter, Spring
K Initial Sound Fluency Fall, Winter
1) Start at student’s grade level
2)Test backwards by grade until the student has reached the “low risk” benchmark for a given skill
•Low risk/ established indicates the student has “mastered” that skill
34
For example... in reading:
• Low comprehension, but adequate fluency → comprehension intervention
• Low comprehension + low fluency, but adequate decoding → fluency intervention
• Low comprehension + low fluency + low decoding, but adequate phonemic awareness skills → decoding intervention
Riley-Tillman et al. (2013)
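The decision rules above reduce to a cascade down the skill hierarchy: target the lowest intact prerequisite. A minimal sketch (the boolean flags and function name are illustrative):

```python
# Illustrative sketch of the survey-level decision logic: work down the
# hierarchy (comprehension -> fluency -> decoding -> phonemic awareness)
# and intervene at the first skill that is adequate below a deficit.

def intervention_category(comprehension_ok, fluency_ok, decoding_ok):
    if comprehension_ok:
        return "no reading intervention indicated"
    if fluency_ok:
        return "comprehension intervention"
    if decoding_ok:
        return "fluency intervention"
    return "decoding intervention"

print(intervention_category(False, True, True))    # low comprehension only
print(intervention_category(False, False, True))   # plus low fluency
print(intervention_category(False, False, False))  # plus low decoding
```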
35
Let’s look at Michael, a 2nd grade student
•At the fall benchmark, he was identified on ORF as being in the some-risk range
•His score was 30 wcpm
•Survey level assessments were conducted using:
–DORF, 1st grade (fluency)
–DNWF, 1st grade (decoding)
–DPSF, 1st grade (phonemic awareness)
Problem Identification
Problem Analysis
36
Michael’s Scores
• DORF – 35 wcpm
•DNWF – 28 scpm
•DPSF – 38 pcpm
DIBELS Scores Representing Skills Mastery (1st grade)
       Fall   Winter  Spring
DORF   ---    > 20    > 40
DNWF   > 24   > 50    > 50
DPSF   > 35   > 35    > 35
DLNF   > 37   ---     ---
37
What next….
•You link your assessment data to an intervention that targets the category of skill deficit that was identified
•You select progress monitoring probe(s) that assess that skill
•You set the student’s goal for improvement– You can use ROI & Gap Analysis Worksheets to help with this
38
What progress monitoring is not…
•It is NOT an instructional method or intervention
•Think of progress monitoring as a template that can be laid over goals and objectives from an assortment of content areas
39
What Would Data Analysis at Tier II Look Like?
40
41
Referral to Tier III Decision Tree
Tier II intervention(s) have occurred daily for 30 minutes in addition to core instruction
Intervention logs attached (3) Fidelity checks completed and attached
Implementation integrity has occurred with at least 80% fidelity
Student has been present for ≥75% of intervention sessions Tier II intervention(s) adequately addressed the student’s area of need
42
Tier II intervention was appropriate and research-based. Research-based interventions are:
□ Explicit
□ Systematic
□ Standardized
□ Peer reviewed
□ Reliable/valid
□ Able to be replicated
Progress monitoring has occurred with at least 10-15 weekly data points –OR- 8-10 bi-monthly data points
Gap analysis indicates that student’s progress is not sufficient for making adequate growth with current interventions
43
Does a student require Tier III intervention?
•Step 1: Need to check to see if the data can be interpreted
–A minimum of 8-10 data points (if progress monitoring every other week) or 10-15 data points (if progress monitoring weekly) is needed to make a data-based decision to change to Tier III
44
• Step 2: Examine Rate of Improvement
–You can compare the student’s actual ROI to the goal that was established
–You can use the ROI worksheets
• Let’s complete one for Michael
45
Completing the ROI Worksheet for Michael
Assessment used: DIBELS NWF
Student’s score on first probe administered: 28
Student’s score on last probe administered: 37
Fall benchmark expectation: 24
Spring benchmark expectation: 50

Step 1: Typical ROI (slope)
(Spring benchmark expectation − Fall benchmark expectation) / Number of weeks
(50 − 24) / 36 = 0.72

Step 2: Michael’s actual ROI
(Last probe score − First probe score) / Number of weeks of intervention
(37 − 28) / 13 = 0.69

Step 3: Goal ROI (the 2.0 and 1.5 multipliers are implied by the worksheet values)
0.72 × 2.0 = 1.44 (ambitious goal)
0.72 × 1.5 = 1.08 (reasonable goal)
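The worksheet arithmetic above can be sketched in a few lines. This assumes the 1.5x and 2.0x goal multipliers implied by the 1.08 and 1.44 values on the worksheet; the function name is illustrative:

```python
# ROI worksheet arithmetic, using Michael's DIBELS NWF numbers from the
# slides. ROI = change in score divided by the number of weeks elapsed.

def roi(first_score, last_score, weeks):
    return (last_score - first_score) / weeks

typical_roi = roi(24, 50, 36)    # fall -> spring benchmark expectations
michael_roi = roi(28, 37, 13)    # first -> last probe administered

print(f"typical ROI:   {typical_roi:.2f}")           # 0.72
print(f"Michael's ROI: {michael_roi:.2f}")           # 0.69
print(f"ambitious goal ROI:  {typical_roi * 2.0:.2f}")  # 1.44
print(f"reasonable goal ROI: {typical_roi * 1.5:.2f}")  # 1.08
```

Michael's actual ROI (0.69) is just below the typical rate (0.72) and well below either goal line, which motivates the trend-line analysis that follows.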
48
You also can visually analyze the graphed progress monitoring data:
–Calculate the trend line of the intervention data points and compare it to the aim (goal) line
»If the slope of the trend line is less than the slope of the aim line, the student may need to be moved to Tier III,
»especially if it appears that, given the student’s current ROI, they will not meet year-end grade-level standards
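A hedged sketch of that comparison: the weekly probe scores below are invented for illustration, and the trend slope is an ordinary least-squares fit computed by hand so no plotting library is needed:

```python
# Compare the slope of the trend line (fit to progress-monitoring probes)
# against the slope of the aim (goal) line. Scores are hypothetical.

def slope(ys):
    """Least-squares slope of ys against week numbers 0..n-1."""
    n = len(ys)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

weekly_scores = [28, 29, 31, 30, 32, 33, 33, 35, 36, 37]  # hypothetical probes
aim_slope = 1.44  # e.g., the ambitious goal ROI in words gained per week

trend = slope(weekly_scores)
print(f"trend slope: {trend:.2f} vs aim slope: {aim_slope:.2f}")
if trend < aim_slope:
    print("trend below aim line -> consider moving to Tier III")
```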
49
Dual Discrepancy
•A student should be deficient in level and have a poor response to evidence-based interventions (slope), to the degree that he/she is unlikely to meet benchmarks in a reasonable amount of time without intensive instruction, in order to move:
–between Tier II and Tier III, as well as between Tier III and referral for a comprehensive special education evaluation
–(e.g., Brown-Chidsey & Steege, 2008; Lichtenstein, 2008)
50
What Would Assessment at Tier III Look Like?
51
Specific Level Assessment
•Functional analysis of skills
–Used to: (a) identify specific skill deficits; (b) identify students’ prior knowledge; and (c) serve as a baseline for progress monitoring
–Specific level assessments rely primarily on subskill mastery measures
•They “drill down” to specific deficits
52
Functional Analysis
•R- review
• I – interview
•O – observe
•T - test
• I – instruction
•C – curriculum
•E – environment
•L- learner
RIOT/ICEL Matrix
53
Linking Assessment Data to Intervention at Tier III
•The learner
–focus on alterable learner variables
–identify academic entry-level skills
•The task
–level of the material the student is expected to master
•The instruction
–research-based methods and management strategies used to deliver curriculum

Match (Student + Task + Instruction) = Success
54
Targets for Academic Instructional Materials
•Instructional level
–contextual reading: 93-97% correct
–other academic skills: 85-90% correct
–Produces larger gains more quickly
Gravois, T. A., & Gickling, E. E. (2008). Best practices in instructional assessment. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology (5th ed., pp. 503-518). Bethesda, MD: National Association of School Psychologists.
55
Phonemic Awareness Hierarchy
•Alliteration: identifying initial, final & medial sounds in words
•Blending: blending individual sounds to make a whole word
•Segmenting: breaking a whole word into its individual parts
•Manipulating:
–Deleting: saying the new word created by omitting a syllable or individual sound in a word
–Substituting: changing the initial, final, or medial sound in a word to create a new word
–Reversing: saying the sounds of a word in reverse order to create a new word
Daly, Chafouleas, & Skinner (2005)
56
Let’s look at Michael again...
•Specific Level Assessment
–Phonics:
•Decoding Skills Test
•Developmental Spelling Analysis
–Sight words:
•Graded word list
–Phonemic Awareness:
•LAC-3
Problem Analysis
57
Linking specific level assessment data to interventions
•Basing interventions on direct samples of a student’s academic skills has been shown to result in larger effect sizes than interventions derived from other data
–This is also known as a skill-by-treatment interaction
–(Burns, Codding, Boice, & Lukito, 2010)
58
What Would Data Analysis at Tier III Look Like?
59
•Need to look at 3 areas»Level
»Slope
»Variability
60
Level
•Central location of data within a phase
•Often compared to a benchmark (goal/aim line)
•Can also look at the mean or median for each phase
–(e.g., Daly III et al., 2010; Hixson et al., 2008; Riley-Tillman & Burns, 2009)
•Can conduct a Gap Analysis using the worksheet
61
Slope/Trend
•How the central location changes over time
•With academic data, we are usually looking for an increase in skills
•The target student’s ROI can be compared with a peer group’s ROI or a benchmark
(e.g., Daly III et al., 2010; Hixson et al., 2008; Riley-Tillman & Burns, 2009)
62
2 approaches for analyzing slope
•Calculate ROI and compare to an identified peer group using the ROI worksheet
•Plot the trend line and compare the aim (goal) line to the slope (trend) line
63
Variability
•Should be examined both within and between phases
–General rule: most of the variability in the data should be explained by the trend line
•80% of the data points should fall within 15% of the trend line
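The 80%-within-15% rule above can be checked mechanically. This sketch fits a least-squares trend line and counts the points inside the band; the score series and function names are hypothetical:

```python
# Check whether progress-monitoring data are stable enough to interpret:
# at least 80% of points should fall within 15% of the trend-line value.

def trend_fit(ys):
    """Return (slope, intercept) of the least-squares line over weeks 0..n-1."""
    n = len(ys)
    mean_x = (n - 1) / 2
    mean_y = sum(ys) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(ys))
    den = sum((x - mean_x) ** 2 for x in range(n))
    b = num / den
    return b, mean_y - b * mean_x

def stable_enough(ys, band=0.15, required=0.80):
    b, a = trend_fit(ys)
    within = sum(1 for x, y in enumerate(ys)
                 if abs(y - (a + b * x)) <= band * (a + b * x))
    return within / len(ys) >= required

print(stable_enough([28, 29, 31, 30, 32, 33, 33, 35, 36, 37]))  # True
print(stable_enough([10, 40, 10, 40, 10, 40]))                  # False
```

When the data are too variable to pass this check, the trend line explains little, and more data points are needed before a tier-change decision.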
64
65
Referral for SLD Evaluation Decision Tree
Tier III Intervention(s) have occurred daily for 60 minutes in addition to core instruction
Intervention logs attached(5) Fidelity checks completed and attached
Implementation integrity has occurred with at least 80% fidelity
Student has been present for ≥75% of intervention sessions
Tier III intervention(s) adequately addressed the student’s area of need
66
Referral for SLD Evaluation Decision Tree
Tier III intervention was appropriate and research-based. Research-based interventions are:
□ Explicit
□ Systematic
□ Standardized
□ Peer reviewed
□ Reliable/valid
□ Able to be replicated
Progress monitoring has occurred with at least 10-15 weekly data points –OR- 8-10 bi-monthly data points at Tier III
Gap analysis indicates that student’s progress is not sufficient for making adequate growth with current interventions
67
Referral for SLD Evaluation Decision Tree
The following have preliminarily been ruled out as the primary cause of the student’s lack of response to intervention
□ Visual, motor, or hearing disability□ Emotional disturbance□ Cultural factors□ Environmental or economic factors□ Limited English proficiency□ Excessive absenteeism
68
Deciding to refer for SLD evaluation
•As part of the team’s decision to refer for an SLD evaluation, a Gap Analysis should be conducted
•Let’s look at how to complete the Gap Analysis worksheet with Michael
69
Gap Analysis
Assessment used: 2nd grade ORF
Student’s current benchmark performance: 66
Student’s current rate of improvement (ROI): 1.3
Current benchmark expectation: 90
End-of-year benchmark expectation: 90
Number of weeks left in the school year: 5

Step 1: Is the gap significant?
Current benchmark expectation / Current performance = Current gap
90 / 66 = 1.4 → Gap significant? □ Yes □ No
70
Conducting a Gap Analysis
•Step 2: Can the gap be closed in the time remaining?
–Gap: 90 − 66 = 24 words
–ROI needed to reach the end-of-year benchmark: 24 / 5 weeks = 4.8 words per week
–Weeks needed at the current ROI: 24 / 1.3 ≈ 18 weeks, with only 5 weeks left in the year
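The two worksheet steps above reduce to a few divisions. A minimal sketch with Michael’s numbers (function and variable names are illustrative):

```python
# Gap-analysis arithmetic from the worksheet: size of the gap, the ROI
# needed to close it by year end, and the weeks it would actually take
# at the student's current rate of improvement.

def gap_analysis(current, benchmark, current_roi, weeks_left):
    gap_ratio = benchmark / current      # Step 1: size of the gap
    gap = benchmark - current            # Step 2: points left to close
    needed_roi = gap / weeks_left        # growth per week needed to catch up
    weeks_needed = gap / current_roi     # weeks to catch up at current ROI
    return gap_ratio, needed_roi, weeks_needed

ratio, needed, weeks = gap_analysis(66, 90, 1.3, 5)
print(f"gap ratio: {ratio:.1f}")                     # 1.4
print(f"needed ROI: {needed:.1f} words/week")        # 4.8
print(f"weeks needed at current ROI: {weeks:.0f}")   # 18
```

Needing 18 weeks of growth with only 5 weeks remaining is the kind of result that supports the referral decision.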
71
Additional Consideration
72
SEM
• Additionally, we cannot ignore issues such as interpreting CBM scores in light of the standard error of measurement (SEM) or confidence intervals (CI) when those scores are used for purposes such as diagnosis and eligibility determinations
• For a more detailed discussion, including suggested SEM guidelines for oral reading fluency scores in grades 1-5, see:
– Christ, T. J., & Silberglitt, B. (2007). Estimates of the standard error of measurement for curriculum-based measures of oral reading fluency. School Psychology Review, 36, 130-146.
73
Use of Progress Monitoring in Special Education
• Because CBM data
–can be directly tied to the skill development necessary for success in the curriculum,
–possess a higher level of sensitivity, and
–allow for graphic representation,
they allow for the development of a higher quality IEP
• Progress monitoring should continue after the IEP is initiated
• Exit criteria can be set to determine whether early reevaluation can be completed due to student success
74
Helpful Resources
75
Helpful Resources from NASP
76
77
Additional Helpful Resources
•Guilford Press
78