TRANSCRIPT
Understanding and Using Data to Plan for School Improvement
2009 Reading and Mathematics
Assessment for Learning
Outcomes
• Bringing context and meaning to the math and reading assessment project results;
• Initiating reflection and discussion among school staff members related to the math and reading assessment results;
• Encouraging school personnel to judiciously review and utilize different comparators (criterion-referenced, experience-referenced, standards-referenced, longitudinal-referenced, and normative-referenced) when judging assessment results;
• Modeling processes that can be used at the school level for building understanding of the links between assessment, curriculum and instruction among school staff and the broader community; and,
• Providing opportunity to discuss and plan at the school level
Agenda
• Principles of the Saskatchewan Assessment for Learning Program
• Conceptual Frameworks
• Nature of Assessment Data
• Comparators
• Opportunity-to-Learn
• Student Performance Outcomes
• Standards and Cut Scores
• Mining the Report
• Data Analysis
• Designing Interventions
• Identifying Teaching and Learning Strategies
• Action Planning
Purposes of the AFL Project
• Stimulate and inform discussion around student performance
• Assist identification of areas of focus for school and division planning in reading and mathematics
• Promote and support a teaching and learning community across grade levels
• Strengthen assessment literacy and use of data for improvement.
Principles of the AFL Program
1. Cooperation and Shared Responsibility
2. Equity and Fairness
3. Comprehensiveness
4. Continuous Improvement that Promotes Quality and Excellence
5. Teacher Professionalism
6. Authenticity and Validity
7. Honesty and Openness
Conceptual Frameworks – pp. 3 & 4
• With a partner, read the Conceptual Framework for either Math or Reading.
• Use the Say Something strategy as you read.
– Determine with your partner how you will break up the document into sections.
– Pause after reading each section, look up and say something – an aha, a question or a comment.
Nature of Assessment Data
[Figure: a continuum of assessment data, from definitive to indicative, spanning the individual, classroom, school, division, provincial, national and international levels. Individual and classroom assessments are student evaluations, offering in-depth knowledge of specific students; division, provincial, national and international assessments are system evaluations, offering in-depth knowledge of systems but little knowledge of specific students. Vertical axis: depth and specificity of knowledge.]
From Saskatchewan Learning. (2006). Understanding the numbers.
Assessment for Learning is a Snapshot
• Results from a large-scale assessment are a snapshot of student performance.
– The results are not definitive. They do not tell the whole story. They need to be considered along with other sources of information available at the school.
– The results are more reliable when larger numbers of students participate and when aggregated at the provincial and division level, and should be considered cautiously at the school level. Individual student mastery of learning is best determined through effective and ongoing classroom-based assessment. (Saskatchewan Ministry of Education, 2008)
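The caution about small groups can be made concrete with a rough statistical sketch (an illustration added here, not part of the AFL report): the uncertainty in a reported percentage shrinks roughly with the square root of the number of students assessed, so a school-level figure based on one class carries far more noise than a division-level aggregate.

```python
import math

def standard_error(p, n):
    """Approximate standard error of a proportion p measured on n students."""
    return math.sqrt(p * (1 - p) / n)

# A school result of 61% adequate based on 25 students is far less certain
# than the same 61% aggregated across 2500 students in a division.
school = standard_error(0.61, 25)      # roughly +/- 10 percentage points
division = standard_error(0.61, 2500)  # roughly +/- 1 percentage point
print(f"school SE: {school:.3f}, division SE: {division:.3f}")
```

The sample sizes above are hypothetical; the point is only that the same percentage is a much firmer estimate at higher levels of aggregation.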
Comparators: Types of Referencing
• Criterion-referenced: Comparing how students perform relative to curriculum objectives, level attribution criteria (rubrics) and the level of difficulty inherent in the assessment tasks. If low percentages of students are succeeding with respect to specific criteria identified in rubrics, this may be an area for further investigation, and for planning intervention to improve student learning. (Detailed rubrics, OTL rubrics and test items can be sourced at www.education.gov.sk.ca)
• Standards-referenced: Comparing how students performed relative to a set of professionally or socially constructed standards. Results can be compared to these standards to help identify key areas for investigation and intervention.(The standards set by the panel.)
Refer to p. 2 in the detailed reports for more information
Comparators: Types of Referencing
• Experience- or self-referenced: Comparing how students perform relative to the assessment data gathered by teachers during the school year. Where discrepancies occur, further investigation or intervention might be considered. It is recommended that several sources of data be considered in planning. (e.g., comparing these results to current school data.)
• Norm-referenced: Comparing how students in a school performed relative to the performance of students in the division, region or project. Note cautions around small groups of students. Norm-referenced comparisons contribute very little to determining how to use the assessment information to make improvements. (e.g., tables comparing the school, division and province.)
Refer to p. 2 in the detailed reports for more information
Comparators: Types of Referencing
• Longitudinal-referenced: Comparing how students perform relative to earlier years’ performance of students. Viewed across several years, assessment results and other evidence can identify trends and improvements.
– Four years of data for Math are available.
– Two years of data for Reading are available.
• It is important to note that identification of trends should be done with extreme caution until more assessment cycles have been completed.
Refer to p. 2 in the detailed reports for more information
Opportunity-to-Learn Elements as Reported by Students
MATH
• Preparation for and commitment to learn
• Persistence in solving math problems
• General support for learning
• Support for learning in math
READING
• Preparation for and commitment to learn
• Knowledge and use of reading strategies
• Home support for reading
Look at the detailed Opportunity-to-Learn rubric. Discuss with your table group the key features of each criterion.
Opportunity-to-Learn Elements as Reported by Teachers
• Availability and Use of Resources
• Classroom Instruction and Learning
Math only:
• Approaches to Problem Solving
Look at the detailed Opportunity-to-Learn rubric. Discuss with your table group the key features of each criterion.
Student Performance Outcome Results
• Reading Comprehension Skills
– Explicit
– Implicit
– Critical
• By Strategy:
– Using Cueing Systems
– Connecting to Prior Knowledge
– Making Inferences/Predictions
– Noting Key Ideas and Finding Support
– Summarizing/Recalling/Organizing Information
– Recognizing Author’s Message and Craft
• Reader Response
Student Performance Outcome Results
• Math Content Skills 5 & 8
– Number
– Patterns and Relationships
– Shape and Space
– Statistics and Probability
• Short Answer
• Math Challenge – integrated applications
• Estimation skills
• Computation (no calculator)
• Math 20 Content Skills
– Irrational Numbers
– Consumer Mathematics
– Polynomials and Rational Functions
– Quadratic Functions
– Quadratic Equations
– Probability
– Angles and Polygons
– Circles
• Estimations
• Math Challenge
Standards
To help make meaningful longitudinal comparisons, three main processes are used.
1. Assessment items will be developed for each assessment cycle using a consistent table of specifications.
2. The assessment items will undergo field-testing, one purpose of which is to inform the comparability of the two assessments.
3. Standards will be set for each of the assessment items, so that any differences in difficulty between the two assessments are accounted for by varying the standards.
Opportunity-to-Learn and Performance Standards
• In order to establish Opportunity-to-Learn and Performance standards for the 2009 Reading and Math Assessments, three panels were convened (one for each assessed grade), consisting of teachers from a variety of settings and post-secondary academics, including Education faculty.
• The panelists studied each section of the 2009 assessments in significant detail and established expectations for both the performance elements and opportunity to learn.
Thresholds of Adequacy and Proficiency
[Figure: a five-level performance scale – Beginning, Developing, Adequate, Proficient, Insightful. The Threshold of Adequacy marks the start of the Adequate level; the Threshold of Proficiency marks the start of the Proficient level.]
• 61% of students achieved at least the adequate standard of performance for a particular measure.
• 14% of students achieved at least the proficient standard of performance for a particular measure.
Cut Scores
• On page 4 of the detailed reports you will find the cut scores. Cut scores indicate the percentage correct required for students to be classified at the excellent or sufficient standard with respect to Opportunity-to-Learn, and at the proficient or adequate standard with respect to performance:
Opportunity-to-Learn Elements: Excellent Standard, Sufficient Standard
Performance Component: Proficient Standard, Adequate Standard
Math & Reading (5-level scale) – reported as percentages
Math Challenges and Reader Response – reported on the 5-level scale
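The classification logic that cut scores encode can be sketched as follows. The function name and the example cut-score values here are hypothetical illustrations, not the 2009 values (those appear on page 4 of the detailed reports):

```python
def classify_performance(percent_correct, adequate_cut, proficient_cut):
    """Classify a percent-correct score against performance cut scores.

    A cut score is the minimum percent correct required to reach a
    standard; the example values used below are hypothetical.
    """
    if percent_correct >= proficient_cut:
        return "proficient"
    if percent_correct >= adequate_cut:
        return "adequate"
    return "below adequate"

# Hypothetical cut scores: adequate at 50% correct, proficient at 75%
print(classify_performance(82, 50, 75))  # proficient
print(classify_performance(60, 50, 75))  # adequate
print(classify_performance(40, 50, 75))  # below adequate
```

Because standard-setting panels may assign different cut scores to different assessment cycles to account for differences in item difficulty, the same percent correct can map to different standards in different years.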
Effective Data Use
Making significant progress in improving student learning and closing achievement gaps is a moral responsibility and a real possibility in a relatively short amount of time – two to five years. It is not children’s poverty or race or ethnic background that stands in the way of achievement. It is school practices and policies and the beliefs that underlie them that pose the biggest obstacles.
Love, N., Stiles, K., Mundry, S., & DiRanna, K. (2008). Passion and principle ground effective data use. Journal of Staff Development, 29(4), 10-14.
Mining the Report: Multiple Choice Questions
MATH
• Resource: MC answer key with reasons for distractors
• Grade 5 & 8 – Table Grade.5
• Grade 11 – Table Grade.5
READING
• Resource: Explanation of reading strategies and levels of comprehension
• Table Grade.6
What patterns are you noticing? Identify strengths and weaknesses. Write them on sticky notes – one per note!
Do any of these patterns extend to sub-groups? (Boys, Girls, First Nations & Métis) Refer to the Gender Comparisons.
Mining the Report:
Math: Challenges, Computation & Estimation; Reading: Reader Response
MATH
• Integrated Applications
– Gr. 5 & 8 – Section 4
– Gr. 11 – Section 3
• Computation and Estimation
– Gr. 5 & 8 – Section 5
– Gr. 11 – Section 4
READING
• Resource: Informational and Literary Text Coding Guides
• Table Grade.8
What patterns are you noticing? Identify strengths and weaknesses. Write them on sticky notes – one per note!
Do any of these patterns extend to sub-groups? (Boys, Girls, First Nations & Métis) Refer to the Gender Comparisons.
Looking for Trends
• Refer to pages 10-15 of the detailed report for Mathematics. What trends seem to be emerging? Write these on sticky notes as well.
• For Reading, refer to pages 10-19; there will only be one other year of data to review. While this is not enough data to establish a trend, in what ways have results changed in comparison to the 2007 assessment? Write these on sticky notes as well.
Reflection on Findings – pg. 9
• In what ways does your school data in these areas confirm or contradict what you have found?– For the school?– For subpopulations?
• What other information might you require?
Student Performance Outcome Results
• Reading Comprehension Skills
– Explicit
– Implicit
– Critical
• By Strategy:
– Using Cueing Systems
– Connecting to Prior Knowledge
– Making Inferences/Predictions
– Noting Key Ideas and Finding Support
– Summarizing/Recalling/Organizing Information
– Recognizing Author’s Message and Craft
• Reader Response
Take the sticky notes and classify them by the categories within the AFL report:
Student Performance Outcome Results
• Math Content Skills 5 & 8
– Number
– Patterns and Relationships
– Shape and Space
– Statistics and Probability
• Short Answer
• Math Challenge – integrated applications
• Estimation skills
• Computation (no calculator)
• Math 20 Content Skills
– Irrational Numbers
– Consumer Mathematics
– Polynomials and Rational Functions
– Quadratic Functions
– Quadratic Equations
– Probability
– Angles and Polygons
– Circles
• Estimations
• Math Challenge
Take the sticky notes and classify them by the categories within the AFL report:
Lunch
We resume at 12:45
Cause and Effect Analysis – pp. 10-11
• Choose one strength and one weakness to analyze.
• Using the supplied template, analyze the causal factors that contributed to that strength or weakness in the areas of:
– Instruction (Math Grade.2d; Reading Grade.13, .14)
– School-Based Data/Student Work
– Demographics (pp. 23-24)
– Student Perceptions (Tables Grade.2a, 2b, 2c; Reading Grade.9, 10, 11, 12)
Designing Interventions
Using data itself does not improve teaching. Improved teaching comes about when teachers implement sound teaching practices grounded in cultural proficiency – understanding and respect for their students’ cultures – and a thorough understanding of the subject matter and how to teach it, including understanding student thinking and ways of making content accessible to all students.
Love, N., Stiles, K., Mundry, S., & DiRanna, K. (2008). Passion and principle ground effective data use. Journal of Staff Development, 29(4), 10-14.
Look for Teaching Strategies – pg. 12
• Choose one of the categories that you have identified.
• Place the name of that category in the middle of the bubble chart provided.
• Using the materials provided, begin a preliminary investigation of teaching strategies that are available.
• Discussion Questions
– Which strategies are well integrated in our practice?
– Which strategies might we study further and begin to integrate into our practice?
– What does the research say about how students work with this content?
Four Tasks of Action Planning – pg. 13
1. Decide on strategies for improvement.
2. Agree on what your plan will look like in classrooms.
3. Put the plan down on paper.
4. Plan how you will know if the plan is working.
Boudett, K. P., City, E. A., & Murnane, R. J. (2005). Data wise: A step-by-step guide to using assessment results to improve teaching and learning. Cambridge, MA: Harvard Education Press.
Writing Out the Plan – pg. 14
• Using the supplied Implementation Indicators template, begin to draft the details of the plan as you work toward achieving your goal.
• The supplied template is only a suggestion – you may use another of your own design.
Boudett, K. P., City, E. A., & Murnane, R. J. (2005). Data wise: A step-by-step guide to using assessment results to improve teaching and learning. Cambridge, MA: Harvard Education Press.
EVIDENCE OF IMPLEMENTATION INDICATORS
Strategy: Estimation & Computations
What we will see and hear in classrooms | Evidence | Collected by
Teachers
Teachers will teach strategies and processes for using estimation in computations.
Teachers will model and encourage students to demonstrate more than one strategy for computations.
Teachers will ask a variety of students to explain their reasoning.
Students
Students will be experimenting with a variety of ways to mentally solve numerical problems.
Students will “talk through” their computation of a math problem.
Students will be explaining how they reason.
Classrooms
Noise, as students collaborate and “talk through” their problems.
Manipulatives, drawings, small groups. Processes or steps for mental computations.
Student Work
Students can clearly articulate processes for mental computations.
Physical and graphic representations of solutions to problems.
Increasing speed and fluency in computation skills.
Boudett, K. P., City, E. A., & Murnane, R. J. (2005). Data wise: A step-by-step guide to using assessment results to improve teaching and learning. Cambridge, MA: Harvard Education Press.
EVIDENCE OF IMPLEMENTATION INDICATORS
Strategy: Making Inferences/Predictions
What we will see and hear in classrooms | Evidence | Collected by
Teachers
Teachers will teach strategies for making inferences and predictions.
Teachers will be seen using supports such as anticipation-reaction guides.
Teachers will encourage students to make inferences and predictions in the materials they are listening to, reading, viewing.
Students
Students will be using a variety of graphic organizers and other supports such as anticipation-reaction guides.
Students will articulate inferences and predictions with reasons.
Students will evaluate the accuracy of their inferences and predictions.
Classrooms
Noise, as students collaborate. Samples of graphic organizers. Information on making inferences and predictions.
Student Work
Students will journal inferences and predictions with supporting reasons.
Students will reflect on their inferences and predictions in writing.
Boudett, K. P., City, E. A., & Murnane, R. J. (2005). Data wise: A step-by-step guide to using assessment results to improve teaching and learning. Cambridge, MA: Harvard Education Press.
Plan How You Will Knowif the Plan is Working
• Before implementing your plan, it is important to determine what type of data you will need to collect in order to understand whether students are moving towards the goal.
Boudett, K. P., City, E. A., & Murnane, R. J. (2005). Data wise: A step-by-step guide to using assessment results to improve teaching and learning. Cambridge, MA: Harvard Education Press.
Put the Plan Down on Paperpg. 16
• By documenting team members’ roles and responsibilities and specifying the concrete steps that need to occur, you build internal accountability for making the plan work.
• Identifying the professional development time and instruction your team will need and including it in your action plan lets teachers know they will be supported through the process of instructional improvement.
Boudett, K. P., City, E. A., & Murnane, R. J. (2005). Data wise: A step-by-step guide to using assessment results to improve teaching and learning. Cambridge, MA: Harvard Education Press.
Advancing Assessment Literacy Modules – pg. 18
• 17 Modules designed to facilitate conversations and work with data for improvement of instruction.
• www.spdu.ca – Publications – Advancing Assessment Literacy Modules
• Download a PDF of a PowerPoint and an accompanying lesson plan for use by education professionals in schools.
Key Word Cycle (Adapted from Anne Davies, 2009)
• As you think about your learning today, create a key word cycle.
Outcomes
• Bringing context and meaning to the math and reading assessment project results;
• Initiating reflection and discussion among school staff members related to the math and reading assessment results;
• Encouraging school personnel to judiciously review and utilize different comparators (criterion-referenced, experience-referenced, standards-referenced, longitudinal-referenced, and normative-referenced) when judging assessment results;
• Modeling processes that can be used at the school level for building understanding of the links between assessment, curriculum and instruction among school staff and the broader community; and,
• Providing opportunity to discuss and plan at the school level