Getting the Most Out of Assessment
“You haven’t taught until they’ve learned.” – UCLA Basketball Coach John Wooden
“So how do you know they have learned?”
2012 Legislation
• Requires Colorado to participate as a Governing Board member in a consortium of states that focuses on the readiness of students for college and careers.
• Requires the Board to rely upon the assessments developed by the consortium, expected to be ready for spring 2015.
• Encourages the Board to conduct a fiscal and student achievement benefit analysis of Colorado remaining a Governing Board member, starting on or before January 1, 2014.
Proposed Summative Assessment Timeline

2013
• TCAP and CoAlt continue as is
• Field test new social studies and computer-based science items

2014
• TCAP and CoAlt Reading, Writing, and Math will continue
• First year of new social studies and science assessments will be operational

2015
• New Reading, Writing, and Math assessments
• Second year of new social studies and science assessments will be operational
Tensions: Multiple Measures
• validity vs. reliability
• performance/constructed-response tasks vs. selected-response items
• all students vs. sampling
• content vs. skills
• local educators vs. professional testing contractors
• local scoring vs. outside scoring
• student work vs. numeric scores
• summative vs. formative
• holistic vs. analytic
• stand-alone vs. embedded
• one year's growth vs. differences in resources (instructional time, etc.)
• mandate by edict vs. preparation through professional development
Moving Through Tensions
Re-envisioning the purpose of assessment: asking key questions to invite innovative thinking regarding evidence of student learning
Evidence of Student Learning
• A Private Universe / MIT and the Light Bulb
• Honors Student and the Light Bulb
• What Causes the Seasons?
Office of Assessment, Research & Evaluation
Assessments and protocols that uncover private thinking
• You and the Moon
• More examples (Traffic Light, etc.)
• Honeycomb!
Tensions: Assessment Literacy
• validity vs. reliability
• performance/constructed-response tasks vs. selected-response items
• all students vs. sampling
• content vs. skills
• local educators vs. professional testing contractors
• local scoring vs. outside scoring
• student work vs. numeric scores
• summative vs. formative
• holistic vs. analytic
• stand-alone vs. embedded
• one year's growth vs. differences in resources (instructional time, etc.)
• mandate by edict vs. preparation through professional development
Moving Through Tensions
Defining purposes for assessment across all disciplines
Formative Assessment
“Process used by both teachers and students during instruction that provides ‘in the moment’ feedback for adjusting teaching and learning. It reveals points of confusion, misunderstanding, or progress toward mastery of an idea.” (CDE, 2011)

Typical uses
• An instructional process used to inform instruction and learning during the learning process
• Aligned to standards and focused on learning progressions
• Intended to motivate students toward learning targets
• Used for instructional purposes; not punitive and not used to compare students to students, teachers to teachers, schools to schools, or districts to districts
• Uses informal and formal instructional strategies to gather, interpret, and use information to adjust and monitor teaching and learning

Examples
• Exit ticket
• Formative performance task
• Think-Pair-Share
• Self-assessments
• Response journals
• Observations
• Anecdotal records
Interim Assessment
“Assessments typically administered every few months to fulfill one or more of the following functions: instructional (e.g., to supply teachers with student diagnostic data); evaluative (e.g., to appraise ongoing educational programs); predictive (e.g., to identify student performance on a later high-stakes test).” (CDE, 2011)

Typical uses
• Provides a predictive measure of postsecondary and workforce readiness
• Provides student demonstration of current knowledge and progress toward mastery of standards
• Informs instructional and/or programmatic adjustments
• Results can be used in educator effectiveness evaluation

Examples
• Acuity
• Galileo
• NWEA
• Quarterly district assessments
Summative Assessment
“End-of-unit or end-of-year, comprehensive and standardized measurement of student mastery in order to assess student learning, inform taxpayers and state policy makers, support identification of successful programs, and/or serve a variety of state and federal accountability needs.” (CDE, 2011)

Typical uses
• Accountability, including school, educator, and student (e.g., graduation)
• Certify mastery
• Program/curricular evaluation
• Monitor trends and progress
• Know students’ achievement levels
• Grades

Examples
• TCAP
• NAEP
• End-of-unit summative assessment
• Final exam
Tensions: Diverse Stakeholders
• validity vs. reliability
• performance/constructed-response tasks vs. selected-response items
• all students vs. sampling
• content vs. skills
• local educators vs. professional testing contractors
• local scoring vs. outside scoring
• student work vs. numeric scores
• summative vs. formative
• holistic vs. analytic
• stand-alone vs. embedded
• one year's growth vs. differences in resources (instructional time, etc.)
• mandate by edict vs. preparation through professional development
Moving Through Tensions
Commitment to collaborative approaches and multiple layers of implementation
Content Collaboratives
P-12 educators from around the state are gathering to identify and create high-quality assessments that are aligned to the new Colorado Academic Standards and may be used in the context of Educator Effectiveness evaluations. The Content Collaboratives and CDE, along with state and national experts, will establish examples of student learning measures within each K-12 content area, including:

Cohort I: Dance; Drama & Theatre Arts; Music; Reading, Writing and Communicating; Social Studies; Visual Arts
Cohort II: Physical Education; Science; World Languages; Comprehensive Health; Mathematics
Cohort I & II: Flow Chart of Work
1. National Researchers (Cohort I: Jan-Mar 2012; Cohort II: Jun-Aug 2012): gather existing fair, valid, and reliable measures for consideration.
2. Technical Steering Committee (Cohort I: Feb-May 2012; Cohort II: July-Nov 2012): creates frameworks and design principles for the collaboratives to use in reviewing and creating measures; later reviews the collaboratives’ recommendations.
3. Colorado Content Collaboratives (Cohorts I & II: Feb-Dec 2012): use a protocol to review the researchers’ measures for feasibility, utility, and gaps; prepare to fill gaps; provide recommendations to the Technical Steering Committee.
4. Pilot, then peer review (Cohorts I & II: Aug 2012-Aug 2014): Cohort I piloting and peer review Aug 2012-Aug 2013; Cohort II piloting and peer review January 2013-Aug 2014.
5. Resource Bank (Cohort I: Aug 2013; Cohort II: Aug 2014): measures placed in the online Educator Effectiveness Resource Bank for voluntary use (future work).
High Quality Assessment Review Tool
A high-quality assessment:
• Is aligned to the content you want students to master
• Uses clear and rigorous scoring criteria
• Is fair and unbiased
• Provides students with an opportunity to learn
Colorado Content Collaboratives, CDE
How are we doing this? Who is involved?
• Researchers
• Content Collaborative Members
• Technical Steering Committee
• Center for Assessment (NCIEA)
• Pilot Districts
• Peer Reviewers
• Other states and districts
[Chart: Content Review Tool Summary: Scored Social Studies Assessments. Clustered bars (0%-100%) report Overall Total, Total Standards Match, Total Scoring, Total Fair & Unbiased, and Opportunities to Learn scores for seven assessments: Alberta ES; Alberta HS; God, Gold, Glory; S Africa; Ancient Greece; World Views & Conflict; and Worldview Gr 6. Scores range from 38.9% to 100.0%.]
Inventory of Assessments
Review Progress
How Colorado Will Determine Student Learning
• Quality Criteria for One Measure
• Multiple Measure Design Principles for Combinations of Measures
• Growth Measure Composite
Assessment in DPS (Denver Public Schools)
• Teachers in math, reading, and writing have a variety of standardized state and/or district assessments that can contribute a measure of student growth to a teacher’s evaluation
• In all other subjects, there are neither state nor district assessments that can be used in a similar way
• Nearly 70 percent of the 4,500 teachers in DPS teach something other than math, reading, or writing
Non-Tested Subjects Assessment Development
• Beginning in November 2011, DPS embarked on developing assessments in traditionally non-tested subjects
• Working collaboratively with teachers
• Assessments will be used to develop measures of student growth and contribute to teacher evaluations
• Work will continue through the summer of 2014
Current and Future Work
• Cohort 1
  – Music
  – Visual Arts
  – Physical Education
• Work during the 2012-13 school year will include:
  – Dance
  – Drama/Theater Arts
  – Social Studies