Measuring Growth Without a Measuring Tape: What Teachers Need to Consider in Thinking About Teacher Effectiveness

Presented by: Sara Bryant, Measured Progress
Michigan Assessment Consortium
April 15, 2013, 1:00–2:30



TRANSCRIPT

Page 1:

Measuring Growth Without a Measuring Tape: What Teachers Need to Consider in Thinking About Teacher Effectiveness

Presented by: Sara Bryant, Measured Progress
Michigan Assessment Consortium

April 15, 2013, 1:00–2:30

Page 2:

Acknowledgment

The work described here has been developed for the Literacy Design Collaborative by Measured Progress and the Stanford Center for Assessment, Learning, and Equity, with funding from the Bill and Melinda Gates Foundation.

Page 3:

Overview

• Objectives for today's session
• Teacher Moderated Scoring Systems (TeaMSS)
• Literacy Design Collaborative (LDC)
• Teacher Effectiveness
• "Take 5s"

Page 4:

Session Objectives

• To learn about Measured Progress’ work on TeaMSS

• To learn about a partnership with the LDC

• To think about how this project might inform your own local work on teacher effectiveness models

Page 5:

Take 5s

The Big Question:

As I learn about Measured Progress' work with LDC and the Gates Foundation, what connections am I making to my own local work?

Page 6:

Project Components

Page 7:

Measured Progress: Scoring Professional Development

Page 8:

Teacher Moderated Scoring Systems (TeaMSS)

• Teachers scoring student tasks

• Common rubrics aligned to Common Core Standards (CCS)

• Common summative assessments ("tasks") aligned to rubrics and CCS

• Other PD tools and resources to help teachers learn to score and become calibrated with others

Page 9:

PD Components

1. Grade student work with no rubric/guidance
2. Learn the intricacies of a common rubric
3. Learn how anchor sets are used as a scoring tool
4. Practice scoring rubric elements
5. Reflect on essential Scoring Principles
6. Practice, practice, practice
7. Score two final papers to look for calibration
8. Continue practicing with additional student tasks

Page 10:
Page 11:

Learning Objectives: Big Ideas

• Adopting a Mind-Set of “Learning to Score”

• Understanding and Using Rubrics

• Understanding and Using Anchor Sets

• Scoring

• Application to Classroom

Page 12:

Grading vs. Scoring: What's the Difference?

Grading: reflects the performance of students relative to expectations at a particular point in time.

Scoring: uses fixed standards of quality that do not change over time.

Page 13:

Analytic vs. Holistic Scoring: What's the Difference?

Holistic Scoring: balances characteristics of writing to arrive at a score appropriate to its overall quality.

Analytic Scoring: considers criteria of assessment separately, identifying a single score for each criterion.
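
To make the contrast concrete, here is a minimal Python sketch (an illustration, not part of the presentation; the criterion names are borrowed from the LDC scoring elements shown later, and the scores are invented). Analytic scoring reports one score per criterion; holistic scoring reports a single overall judgment. The average below only stands in for that judgment, to show the shape of the output.

```python
# Analytic scoring: the reader assigns a separate score to each criterion.
# Criterion names follow the LDC scoring elements; scores are invented.
analytic_scores = {
    "Focus": 3.0,
    "Controlling Idea": 2.5,
    "Organization": 3.5,
    "Conventions": 2.0,
}

# Holistic scoring: the reader balances all characteristics and assigns one
# overall score. A real holistic reader uses judgment, not a formula; the
# average here only illustrates that the result is a single number.
holistic_score = sum(analytic_scores.values()) / len(analytic_scores)

print(analytic_scores)           # analytic: one score per criterion
print(round(holistic_score, 1))  # holistic: one score overall -> 2.8
```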

Page 14:

Take 5

When you think about your own experiences organizing and implementing the scoring of common student work, what tools and trainings might have enhanced those experiences?

What tools and resources might help you learn more about scoring common student work?

Page 15:
Page 16:

1. Know the rubric.

Page 17:

2. Trust evidence, not intuition.

Page 18:

3. Match evidence to language in the rubric.

Page 19:

4. Weigh evidence carefully; base judgments on the preponderance of evidence.

Page 20:

5. Know your biases; leave them at the door.

Page 21:

6. Focus on what the student does, not on what the student does not do.

Page 22:

7. Isolate your judgment: One bad element does not equal a bad paper.

Page 23:

8. Resist seduction: One good element does not equal a good paper.

Page 24:

9. Recognize direct copy or plagiarism.

Page 25:

10. Stick to the rubric.

Page 26:

Literacy Design Collaborative

Examples on the following slides and more information about LDC can be found at:

www.literacydesigncollaborative.org

Page 27:

What is LDC?

• A framework for building literacy skills and core content knowledge, aligned to Common Core Standards (CCS)

• "Template Tasks" built on text-based essential questions and a genre of writing (e.g., essay)

• Common rubrics for argumentation, informational, and narrative writing

Page 28:
Page 29:

Template Task

“LDC ‘template tasks’ provide fill-in-the-blank shells that teachers use to create powerful assignments. For example, Template Task 2 calls for student analysis that builds an argument.”

- www.literacydesigncollaborative.org/tasks

Page 30:

Template Tasks

“[Insert question] After reading _____ (literature or informational texts), write _________ (essay or substitute) that addresses the question and support your position with evidence from the text(s). L2 Be sure to acknowledge competing views.”

- www.literacydesigncollaborative.org/tasks
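
As a concrete illustration of how such a fill-in-the-blank shell works, the short Python sketch below instantiates the template quoted above. This is not LDC tooling; the sample question, texts, and product are invented.

```python
# A fill-in-the-blank shell, mirroring the LDC template task quoted above.
TEMPLATE = (
    "{question} After reading {texts}, write {product} that addresses the "
    "question and support your position with evidence from the text(s). "
    "L2 Be sure to acknowledge competing views."
)

# Invented sample values; a teacher would supply a real question and texts.
task = TEMPLATE.format(
    question="Should your school adopt a year-round calendar?",
    texts="two informational articles",
    product="an essay",
)
print(task)
```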

Page 31:

Rubric Scoring Elements

• Focus
• Controlling Idea
• Reading/Research
• Development
• Organization
• Conventions
• Content Understanding

Page 32:
Page 33:

Scoring Element Example

Controlling Idea

Score scale: 1, 1.5, 2, 2.5, 3, 3.5, 4 (descriptors are given for the whole-number scores)

Score 1: Attempts to establish a claim, but lacks a clear purpose. (L2) Makes no mention of counter claims.

Score 2: Establishes a claim. (L2) Makes note of counter claims.

Score 3: Establishes a credible claim. (L2) Develops claim and counter claims fairly.

Score 4: Establishes and maintains a substantive and credible claim or proposal. (L2) Develops claims and counter claims fairly and thoroughly.
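
Seen as data, this element is a lookup from scores to descriptors, with half-point scores falling between two descriptors. The Python sketch below (an illustration, not LDC code) captures that structure using the Controlling Idea descriptors above.

```python
# Whole-number descriptors for the Controlling Idea element, taken from the
# rubric above; half-point scores (1.5, 2.5, 3.5) fall between two of them.
CONTROLLING_IDEA = {
    1: "Attempts to establish a claim, but lacks a clear purpose. "
       "(L2) Makes no mention of counter claims.",
    2: "Establishes a claim. (L2) Makes note of counter claims.",
    3: "Establishes a credible claim. (L2) Develops claim and counter claims fairly.",
    4: "Establishes and maintains a substantive and credible claim or proposal. "
       "(L2) Develops claims and counter claims fairly and thoroughly.",
}

def describe(score: float) -> str:
    """Return the descriptor for a whole score, or name the pair it sits between."""
    if score in CONTROLLING_IDEA:
        return CONTROLLING_IDEA[score]
    low, high = int(score - 0.5), int(score + 0.5)
    return f"Between the score {low} and score {high} descriptors."

print(describe(3))    # prints the score-3 descriptor
print(describe(2.5))  # "Between the score 2 and score 3 descriptors."
```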

Page 34:

Scoring Element Example

Organization

Score scale: 1, 1.5, 2, 2.5, 3, 3.5, 4 (descriptors are given for the whole-number scores)

Score 1: Attempts to organize ideas, but lacks control of structure.

Score 2: Uses an appropriate organizational structure for development of reasoning and logic, with minor lapses in structure and/or coherence.

Score 3: Maintains an appropriate organizational structure to address specific requirements of the prompt. Structure reveals the reasoning and logic of the argument.

Score 4: Maintains an organizational structure that intentionally and effectively enhances the presentation of information as required by the specific prompt. Structure enhances development of the reasoning and logic of the argument.

Page 35:

Scoring Element Example

Reading/Research

Score scale: 1, 1.5, 2, 2.5, 3, 3.5, 4 (descriptors are given for the whole-number scores)

Score 1: Attempts to reference reading materials to develop response, but lacks connections or relevance to the purpose of the prompt.

Score 2: Presents information from reading materials relevant to the purpose of the prompt with minor lapses in accuracy or completeness.

Score 3: Accurately presents details from reading materials relevant to the purpose of the prompt to develop argument or claim.

Score 4: Accurately and effectively presents important details from reading materials to develop argument or claim.

Page 36:

Scoring Element Example

Content Understanding

Score scale: 1, 1.5, 2, 2.5, 3, 3.5, 4 (descriptors are given for the whole-number scores)

Score 1: Attempts to include disciplinary content in argument, but understanding of content is weak; content is irrelevant, inappropriate, or inaccurate.

Score 2: Briefly notes disciplinary content relevant to the prompt; shows basic or uneven understanding of content; minor errors in explanation.

Score 3: Accurately presents disciplinary content relevant to the prompt with sufficient explanations that demonstrate understanding.

Score 4: Integrates relevant and accurate disciplinary content with thorough explanations that demonstrate in-depth understanding.

Page 37:

Take 5

When thinking about the common assessment work in your districts, what LDC processes and structures might appeal to you and your colleagues?

Page 38:

Putting it All Together

• Local teacher development of modules

• Common modules and tasks used across districts and states

• Student work samples and common rubrics used to develop scoring professional development

• Teachers scoring student tasks across districts and states

Page 39:

Putting it All Together

Page 40:

Professional Dialogue

Page 41:

Calibration
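
As a rough sketch of what checking calibration can involve (a generic illustration, not necessarily the TeaMSS procedure), the Python below compares one teacher's scores against consensus scores on the same papers and reports exact and adjacent agreement on the 1-4 half-point scale. All scores are invented.

```python
# Exact and adjacent agreement between one teacher's scores and consensus
# scores on the same papers; a common, generic way to quantify calibration.
def agreement_rates(teacher, consensus, adjacent_within=0.5):
    """Return (exact, adjacent) agreement as fractions of papers scored."""
    pairs = list(zip(teacher, consensus))
    exact = sum(t == c for t, c in pairs) / len(pairs)
    adjacent = sum(abs(t - c) <= adjacent_within for t, c in pairs) / len(pairs)
    return exact, adjacent

# Invented scores on the 1-4 scale with half points:
teacher_scores   = [2.0, 3.0, 2.5, 4.0, 1.5]
consensus_scores = [2.0, 3.5, 2.5, 3.5, 1.0]

exact, adjacent = agreement_rates(teacher_scores, consensus_scores)
print(f"exact: {exact:.0%}, adjacent: {adjacent:.0%}")  # exact: 40%, adjacent: 100%
```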

Page 42:

Take 5

How might a common assessment model that includes common local modules, assessments, and teacher scoring be part of a Michigan teacher effectiveness model?

Page 43:

Final Thoughts

• Models such as LDC honor teacher involvement in the process of curriculum, instruction, and assessment.

• Scoring professional development allows teachers to become part of the game.

• Teacher dialogue about student work enhances teacher knowledge.

• Teacher effectiveness can be measured using processes such as these!

Page 44:

For More Information

Sara Bryant
[email protected]

Literacy Design Collaborative
www.literacydesigncollaborative.org

Measured Progress
www.measuredprogress.org