
Page 1: Foundations of Assessment

Foundations of Assessment: Train-the-Trainer

Competency 3

Acknowledgements: These materials were prepared by the Florida Center for Reading Research in partnership with the Florida Department of Education, the Just Read Florida! Office, and faculty in the College of Education at Florida State University. The authors would especially like to thank the following faculty and staff for their significant contributions to creating and reviewing materials for this integrated reading endorsement pathway: Mr. Nathan Archer, Ms. Amy Carroll, Dr. Jennifer Gans, Dr. Jennifer Hamilton, Dr. Laurie Lee, Dr. Arzu Leushuis, Ms. Shayla Lightfoot-Brown, Dr. Nicole Patton Terry, Dr. Kevin Smith, and Dr. Kelly Whalon. We acknowledge the authors of the Professional Learning Community materials that support Foundational Skills to Support Reading for Understanding in Kindergarten Through 3rd Grade.* With those authors' permission, as well as additional resources and materials developed by instructional leaders in Seminole County Public Schools, the same format and five-step process for implementing the PLC sessions were utilized for content in Competencies 1 and 2. Permission to reprint or use these materials is required. Inquiries may be directed to the Florida Center for Reading Research at [email protected].

*Kosanovich, M., & Foorman, B. (2016). Professional learning communities facilitator's guide for the What Works Clearinghouse practice guide: Foundational skills to support reading for understanding in kindergarten through 3rd grade (REL 2016-227). Washington, DC: U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance, Regional Educational Laboratory Southeast. Retrieved from http://ies.ed.gov/ncee/edlabs.

Page 2: Foundations of Assessment

Purpose of this Module

• This course provides a foundation in assessment with an emphasis on literacy/reading and is required for the Florida Reading Endorsement (Competency 3).

• Grounded in the principles of research-based reading instruction and the Reading Endorsement guiding principle that teaching reading for understanding is an ongoing, systematic problem-solving process, teachers will implement and analyze assessments and select appropriate instruction/interventions based on the collected data.

Page 3: Foundations of Assessment

Goals for the Train-the-Trainer Course: Foundations of Assessment (Module 3)

• Become familiar with the materials in Module 3 that address using assessment as a systematic problem-solving process
  – Selecting and administering assessments
  – Analyzing assessment data to inform instruction

• Learn how to use the materials in Module 3 to deliver content that will meet the Competency 3 Reading Endorsement indicators.

Page 4: Foundations of Assessment

Materials for Trainers of the Course: Foundations of Assessment (Module 3)

This module includes several components:
1. Framework for Assessing Reading
2. Foundations of Assessment
3. Example Measures
   1. Standardized, Norm-Referenced Measures
   2. Informal Reading Inventories
   3. Curriculum-Based Measures
4. Multi-Tiered Systems of Support
5. Identification of Learning Disabilities/Dyslexia

Page 5: Foundations of Assessment

Materials for Trainers of the Course: Foundations of Assessment (Module 3)

• Syllabus
• Slide Deck
• Case Study 1
• Case Study 2
• Case Study 3
• Culminating Assessment with Example Rubric

Page 6: Foundations of Assessment

Goals for Today
• Discuss your preparation related to assessment
• Review foundational information related to assessment
• Model and role-play information presented in the slide deck
• Review the syllabus and culminating project
• Create a plan for implementation

Page 7: Foundations of Assessment

FRAMEWORK FOR ASSESSING READING

Page 8: Foundations of Assessment

The Simple View of Reading (Gough & Tunmer, 1986)

Word recognition and language comprehension are relatively independent of each other but are both highly correlated with reading comprehension.

Word Recognition × Language Comprehension = Reading Comprehension
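To make the multiplicative relationship concrete, here is a minimal sketch (not part of the original slides; the 0-1 proficiency values are hypothetical):

```python
# The Simple View of Reading models reading comprehension as the
# PRODUCT of word recognition and language comprehension, so a
# weakness in either component limits the outcome. Values below are
# hypothetical proficiencies scaled 0.0-1.0, for illustration only.
def reading_comprehension(word_recognition: float, language_comprehension: float) -> float:
    """SVR: RC = WR x LC."""
    return word_recognition * language_comprehension

print(reading_comprehension(0.9, 0.9))  # strong in both      -> 0.81
print(reading_comprehension(0.3, 0.9))  # weak decoding       -> 0.27
print(reading_comprehension(0.9, 0.3))  # weak language comp. -> 0.27
print(reading_comprehension(0.0, 1.0))  # no decoding at all  -> 0.0
```

Because the relationship is a product rather than a sum, a reader with no decoding gets no reading comprehension from print, however strong their oral language is.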

The Simple View Explained

Page 9: Foundations of Assessment

Classification of Reading Problems: The Simple View of Reading

[Quadrant figure: the axes are Word Recognition (poor/good) and Language Comprehension (poor/good); each quadrant names a reader profile.]

• Poor word recognition + good language comprehension: Dyslexia
• Poor word recognition + poor language comprehension: Mixed
• Good word recognition + poor language comprehension: Specific Comprehension Deficit
• Good word recognition + good language comprehension: Non-specified

(Kamhi, Catts, & Adlof, 2012)

Page 10: Foundations of Assessment

Foundational Skills for Effective Reading

Word Recognition
• Phonological awareness
• Alphabet recognition
• Letter/sound correspondence
• Word recognition
• Nonword recognition
• Spelling
• Sight words
• Oral reading fluency

Language Comprehension
• Vocabulary (expressive and receptive; breadth and depth)
• Background knowledge
• Language structures (syntax, semantics)
• Inference generation
• Print concepts
• Text structures

Reading Rope Video

Page 11: Foundations of Assessment

Explaining SVR as a Framework for Assessment

In small groups,
1. Discuss how SVR can be used as a framework for assessment
2. Role-play how you will share the first 3 slides with teachers to explain how the SVR can be used as a framework for assessment

Page 12: Foundations of Assessment

Foundations of Assessment

Let's Review the Foundations of Assessment Information

Page 13: Foundations of Assessment

Purposes for Assessment

Screening
• Brief measures designed to identify learner(s) in need of additional supports or instruction
• Often administered universally 2-3 times a year

Diagnostic
• Typically administered when screening measures identify a need
• Used to create an informed instructional plan

Progress Monitoring
• Brief, frequent, ongoing assessment
• Used to determine if instruction or intervention is working, whether the student is making progress toward goals, and the rate of progress

Outcomes
• Used to determine if students have achieved mastery
• Often used to make high-stakes decisions, including grade promotion and accountability

Page 14: Foundations of Assessment

Two Categories of Assessment

Formative
• Assessing to make decisions about teaching and learning
  – Is instruction effective?
  – Are students learning?

Summative
• Evaluate effectiveness of instruction
• Evaluate what students have learned

Page 15: Foundations of Assessment

Types of Assessments

Formal
• Typically standardized
• Compare individual performance to a group or criteria to identify specific strengths and/or areas of need
• Follow a prescribed process for administration and scoring

Informal
• Typically non-standardized
• Assess progress toward learning goals
• Allow greater flexibility
• Most frequently used to make daily instructional decisions

Page 16: Foundations of Assessment

Types of Assessments

Norm-Referenced
• Compare individual scores to those of the norm group
• These measures often include multiple norm groups (e.g., gender, age, and grade norms)
• These tests produce derived scores, including standard scores and percentile ranks

Examples include commercially available standardized tests.

Criterion-Referenced
• Compare individual scores to a predetermined criterion or benchmark
• These measures can include norm groups, but often focus on mastery learning

Examples include curriculum-based measurement and informal reading inventories.

Page 17: Foundations of Assessment

Interpreting Norm-Referenced Tests

Scores of Relative Standing
• Tell us the percentage or proportion of students who earned better or worse scores
• Highly comparable because they mean the same thing regardless of age or content tested
• Used to compare performance on several tests
• Used to compare several people on the same test
• Examples include standard scores and percentile ranks
These are the scores recommended by professional organizations.

Developmental Scores
• Based on average performance
• Age and grade equivalents
• Professional organizations suggest not reporting these scores because they are easily misinterpreted (e.g., American Psychological Association, International Reading Association, Council for Exceptional Children)

"Your child performed the same as a 9.2 year old."

This is inaccurate. Why?

Page 18: Foundations of Assessment

A Few Example Problems with Developmental Scores (Salvia et al., 2017)

Systematic Misinterpretation
• Achieving the same number correct as an older/younger child does not mean the child being tested performed in the same way

Estimates Based on Data from the Norm Group
• These scores are estimates based on the norm group
  – For example, a student may get an age-equivalent score of 4.1 when the norm sample does not include children under age 5

Typological Thinking
• There is no such thing as the average X-year-old. These scores are based on a range and represent the middle 50 percent.

Implication of a False Standard of Performance
• In the norm group, 50 percent of students tested scored below the median

Page 19: Foundations of Assessment

Interpreting Norm-Referenced Tests
• Raw scores are converted to scores of relative standing to determine how students performed compared to a norm sample of same-age or same-grade peers
• These scores tell us to what extent an individual's raw score differs from that of a norm group of same-age/grade peers

(Salvia et al., 2017)

Page 20: Foundations of Assessment

Example of a distribution from a norm-referenced standardized test with a mean of 100 and standard deviation of 15

[Figure: normal curve with mean (X) = 100; -1 SD = 85, -2 SD = 70, +1 SD = 115, +2 SD = 130. About 34% of scores fall in each band between the mean and 1 SD, 14% in each band between 1 and 2 SD, and 2% in each tail beyond 2 SD; roughly 68% of scores fall within 1 SD of the mean and 96% within 2 SD.]

What percentage of learners will score
1. within two standard deviations of the mean?
2. within the mean or average range?
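As a check on the figure above, here is a minimal sketch (not part of the original slides) using scipy's normal CDF. Note the figure's 96% for within 2 SD comes from rounding each band to whole percents (14 + 34 + 34 + 14); the exact value is about 95.4%.

```python
# Verify the normal-curve band percentages for a test with
# mean 100 and standard deviation 15.
from scipy.stats import norm

MEAN, SD = 100, 15

def pct_within(k_sd: float) -> float:
    """Percentage of scores within k standard deviations of the mean."""
    low = norm.cdf(MEAN - k_sd * SD, loc=MEAN, scale=SD)
    high = norm.cdf(MEAN + k_sd * SD, loc=MEAN, scale=SD)
    return (high - low) * 100

print(f"within 1 SD (85-115): {pct_within(1):.1f}%")  # ~68.3%
print(f"within 2 SD (70-130): {pct_within(2):.1f}%")  # ~95.4%
```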

Page 21: Foundations of Assessment

Interpreting Standard Scores and Percentile Ranks

Standard Score
• Ron scored 85 on a measure with a mean of 100 and SD of 15
• Ron scored one standard deviation below the mean, or below average for his grade

Percentile Rank
• Sue achieved a percentile rank of 25
• Sue scored the same as or better than 25 percent of the people in the norm sample in her grade

Which is easier to interpret? Why?
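For trainers who want to show how the two score types line up, a minimal sketch (not from the slides), assuming normally distributed scores:

```python
# Convert between standard scores and percentile ranks, assuming
# scores are normally distributed with mean 100 and SD 15.
from scipy.stats import norm

MEAN, SD = 100, 15

def percentile_rank(standard_score: float) -> float:
    """Percent of the norm sample scoring at or below this standard score."""
    return norm.cdf(standard_score, loc=MEAN, scale=SD) * 100

def standard_score_at(percentile: float) -> float:
    """Standard score corresponding to a given percentile rank."""
    return norm.ppf(percentile / 100, loc=MEAN, scale=SD)

print(percentile_rank(85))    # ~15.9: Ron's 85 is about the 16th percentile
print(standard_score_at(25))  # ~89.9: Sue's 25th percentile is about 90
```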

Page 22: Foundations of Assessment

Your Turn to Interpret

Standard Score and Percentile Rank
Ally's scores on the Woodcock Johnson Test of Achievement, Letter-Word Identification subtest (Mean 100, Standard Deviation 15):
– Standard score = 83
– Percentile rank = 10

Standard Score and Percentile Rank
Carlos' scores on the Woodcock Johnson Test of Achievement, Letter-Word Identification subtest (Mean 100, Standard Deviation 15):
– Standard score = 98
– Percentile rank = 46

Page 23: Foundations of Assessment

Technical Adequacy

Reliability
• A reliable test provides consistent results
• Allows us to generalize a person's test performance
  – to other test items measuring the same construct/domain,
  – to a different day/time,
  – and with different testers

Validity
• A valid test measures what it intends to measure
• Allows us to make valid inferences about test data

Let's Learn More

Page 24: Foundations of Assessment

All Tests Include Error: Standard Error of Measurement (SEM)

• Because we are unable to measure performance across all possible times, items, and testers, there is always error
• Every score includes the learner's true score + error
• SEM tells us how much error or noise there is in a test score
• The larger the SEM, the greater the uncertainty of a test score
• SEM helps us interpret test scores by quantifying the accuracy of the score
• SEM is used to create confidence intervals

Page 25: Foundations of Assessment

Confidence Intervals
• Confidence intervals are a range within which the true score has a known probability of falling
  – A confidence interval is the score +/- the SEM
• If a score is 100 and the SEM is 5, then the confidence interval is +/- 5
• The confidence interval is based on how certain we want to be that the true score falls within a specified range
  – Levels of confidence typically reported are 68, 90, and 95%
  – When making important decisions we want to be confident – 95%
  – We select the desired level of confidence first
• For example, if we want to be 95% confident in a vocabulary knowledge test score and a child scored 99 with a 95% confidence interval of +/- 4, we would say that we are 95% confident that the child's true score falls between 95 and 103
• If our confidence interval is really large, that means the error is large
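A minimal sketch (not from the slides) of the arithmetic: "score +/- the SEM" corresponds to a 68% interval, and wider levels multiply the SEM by the matching z-value (about 1.64 for 90%, 1.96 for 95%), which is how a published interval like 99 +/- 4 at 95% is produced:

```python
# Build a confidence interval for the true score from an observed
# score and the test's SEM.
from scipy.stats import norm

def confidence_interval(score: float, sem: float, confidence: float = 0.95):
    """Return (low, high) bounds for the true score at the given confidence."""
    z = norm.ppf(1 - (1 - confidence) / 2)  # ~1.0 for 68%, ~1.96 for 95%
    return score - z * sem, score + z * sem

print(confidence_interval(100, 5, 0.68))  # ~(95.0, 105.0): +/- 1 SEM
print(confidence_interval(100, 5, 0.95))  # ~(90.2, 109.8): +/- 1.96 SEM
```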

Page 26: Foundations of Assessment

Your Turn to Interpret
• Tom's standard scores with 95% confidence intervals on the Test of Preschool Early Literacy (TOPEL):
  – Phonological Awareness: 96 +/- 7 (89-103)
  – Print Knowledge: 99 +/- 5 (94-104)
  – Definitional Vocabulary: 78 +/- 6 (72-84)

Explain in your own words…

Page 27: Foundations of Assessment

Bias

• All tests have error
• Bias can occur when
  – We do not consider who the test is designed for and for what purpose
    • Background knowledge and experiences of the learner
  – We fail to follow standardized procedures

Page 28: Foundations of Assessment

Relationship Between Reliability and Validity
• Anything that interferes with a test measuring what it is intended to measure affects validity
• Therefore, if a test is not reliable, it is not valid
  – Reliability, norm groups, bias, administration, interpretation, error, etc. are all subparts of validity

Describe the relationship in your own words.

Consider this scenario: Lupe has lived in the US for 3 months. When selecting a norm-referenced test to determine her current reading level, what are some considerations for reliability and validity?

Page 29: Foundations of Assessment

Important Criteria for Establishing Validity
• A test must be reliable to be valid, but reliability does not ensure validity
• Students must have the skills necessary to complete the test
  – Language
  – Hearing
• Items should work the same way for various groups of students
• Administration must be free of errors
• Norms must be representative

Page 30: Foundations of Assessment

Small Group Activity

Page 31: Foundations of Assessment

Role-Play in Small Groups
1. Review the slide deck for the example measures assigned to your group:
   1. Norm-referenced, standardized measures (Session 4, Slides 9-24)
   2. Informal reading inventories (Session 5, Slides 3-22)
   3. Curriculum-based measures (Session 6, Slides 6-27)
2. Make notes on the information that you would want to be sure to cover
   – You can type in the slide deck notes section
3. Identify strategies/activities that you might use to facilitate learning
4. Share your ideas when you return to the whole group

Page 32: Foundations of Assessment

Data Collection for Effective and Targeted Planning

Let’s Review the Slides on Instruction, Planning, Prevention and Diagnosis

Page 33: Foundations of Assessment

What is a Multi-Tiered System of Supports (MTSS) Model?

We can't take a "wait and see" approach

Page 34: Foundations of Assessment

Causes of Reading Difficulties: (1) ineffective instruction and (2) individual differences

1. Ineffective instruction may reflect
   – Limited time dedicated to building necessary prerequisite knowledge or skills, as well as to learning new skills or content
   – Lack of teacher content or pedagogical knowledge (i.e., how to deliver instruction)
   – Commitment to ineffective methods over evidence-based practices
     • For example, some schools continue to reject the idea of systematically teaching phonics despite clear evidence that this approach works.
2. Individual differences
   – Individual learning differences can impact learning.
   – Motivation to learn also influences skill development.

Page 35: Foundations of Assessment

What is Dyslexia?
• https://www.youtube.com/watch?v=klG8vHRLtJU&t=3s

Children identified with dyslexia have trouble decoding, spelling, and reading fluently.

These difficulties are associated with phonological processing deficits or specific language-based difficulties.

For children with dyslexia, comprehension may also be a problem because of difficulty building vocabulary and background knowledge, which are often gained through effective reading.

Page 36: Foundations of Assessment

Importance of Early Identification and Early Screening

• https://www.youtube.com/watch?v=0FJJLhJweWk&list=PLLxDwKxHx1yJkKZlbnu6gc_lBbB4TyxQU&index=2

• https://www.youtube.com/watch?v=cPHK5IHKva4&list=PLLxDwKxHx1yJkKZlbnu6gc_lBbB4TyxQU&index=4

Reading Rockets Interview with Dr. Jack Fletcher

Page 37: Foundations of Assessment

Identifying Dyslexia
• In the past, children with reading disabilities, including dyslexia, were identified using a model called the severe discrepancy model
  – Educational achievement in reading was much lower than would be expected based on cognitive or IQ tests
• Today, most advocate for using a multi-tiered systems of support model to avoid a wait-to-fail approach
• A delay in receiving additional or targeted instructional services fails to address the foundational skills children require to catch up to their peers, or even to prevent the identification of a reading disability
• IDEA states that the use of the severe discrepancy model must not be required for identification of learning or reading disabilities, including dyslexia
• Some students with complicated learning profiles, such as twice-exceptional students (most commonly, gifted students with a reading disability or dyslexia), will need cognitive or intellectual testing to demonstrate their unique learning profile and needs

Page 38: Foundations of Assessment

Steps in MTSS Models
• A universal screener is provided to all students to determine progress in reading
• Diagnostic measures of specific reading skills associated with reading disabilities are used to pinpoint target areas for intervention
• Tiered interventions are provided based on results of initial screening and progress monitoring (an illustrative tier-assignment sketch follows this list)
  – All children receive quality, evidence-based Tier I instruction in reading
  – Tier II instruction is provided in areas identified in early screeners as needing additional instruction
  – Tier III instruction is intensive, delivered in small groups or 1:1, and specifically targets a need identified through progress monitoring
  – Tiers are not fixed, as learners may move to different tiers at different times
• Progress monitoring is provided every 2-3 weeks in Tiers II and III and less frequently in Tier I
• Identification for special education occurs when children are not making adequate progress in targeted instruction
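To make the screening-to-tier flow concrete, here is a hypothetical sketch; the percentile cut points are invented for illustration and are not prescribed by the module or by any specific screener:

```python
# Hypothetical tier-assignment rule driven by a universal screening
# percentile. Real cut points are set by districts and screeners.
def assign_tier(screening_percentile: float) -> str:
    if screening_percentile >= 25:
        return "Tier I: core, evidence-based classroom instruction"
    elif screening_percentile >= 10:
        return "Tier II: supplemental small-group instruction"
    else:
        return "Tier III: intensive, individualized intervention"

for pct in (50, 18, 5):
    print(f"{pct}th percentile -> {assign_tier(pct)}")
```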

Page 39: Foundations of Assessment

Assessment to Determine Dyslexia (International Dyslexia Association)

Assessments should measure
• Phonological Awareness
• Phonological or Language-Based Memory – ability to recall sounds, syllables, words
• Rapid Automatic Naming – speed of naming objects, colors, digits, or letters
• Receptive Vocabulary – understanding of words heard
• Phonics Skills – understanding of letter/sound correspondence
• Decoding – ability to use letter/sound correspondence to identify words
  – Real Words
  – Nonsense Words
• Oral Reading Fluency – ability to read accurately and at a rate that facilitates understanding
  – Single Words
  – Sentences and Paragraphs
• Spelling
• Writing
  – Sentence Level
  – Paragraph Level

Page 40: Foundations of Assessment

Data-Based Individualization (DBI)
Step 1: Implement the current validated intervention program with increased intensity (e.g., smaller group size, more time).
Step 2: Collect frequent progress monitoring data to determine whether the student is responding.
Step 3: If the student continues to struggle, collect diagnostic information to identify difficulties.
Step 4: Use the diagnostic data along with educator expertise to modify or adapt the intervention to meet the student's individual need.
Step 5: Continue to collect progress monitoring data at regular intervals to determine responsiveness and make adaptations as needed.
(An illustrative sketch of this decision cycle follows.)
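As a rough illustration of the DBI decision logic, a hypothetical sketch; the slope-based response rule and the goal value are placeholders, not part of the DBI materials:

```python
# Hypothetical DBI decision step: estimate a student's rate of progress
# from recent progress-monitoring scores and decide whether to adapt.
def dbi_decision(weekly_scores: list[float], goal_slope: float = 1.0) -> str:
    """Return the next DBI action given recent weekly scores."""
    # Step 2: estimate rate of progress (score gain per week)
    slope = (weekly_scores[-1] - weekly_scores[0]) / (len(weekly_scores) - 1)
    if slope >= goal_slope:
        return "Responding: continue the intensified intervention (Step 1)"
    # Steps 3-4: inadequate response -> diagnose and adapt
    return "Not responding: collect diagnostic data and adapt (Steps 3-4)"

print(dbi_decision([10, 11, 13, 14]))  # slope ~1.3/week -> responding
print(dbi_decision([10, 10, 11, 10]))  # slope ~0 -> adapt the intervention
```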

Page 41: Foundations of Assessment

Small Group Activity 2

Page 42: Foundations of Assessment

Case Study Implementation
1. Review the case study assigned to your group:
   1. Marco (Session 3)
   2. David (Session 6)
   3. Jarred (Session 7)

2. Identify strategies and/or activities for using the case study in your course

3. Share your ideas when you return to the whole group

Page 43: Foundations of Assessment

Syllabus & Culminating Activity

Let’s Review the Syllabus & Culminating Activity

– Discuss the syllabus and any questions about developing this course

– Review the rubric for the culminating activity and build your own