Waiting Room. Today’s webinar will begin shortly. REMINDERS: Dial 800-503-2899 and enter the passcode 6496612# to hear the audio portion of the presentation. Download today’s materials from the sign-in page: Webinar Series Part 6 PowerPoint slides; Correlation Example Excel file.


TRANSCRIPT

Page 1: Waiting Room

Today’s webinar will begin shortly.

REMINDERS:

• Dial 800-503-2899 and enter the passcode 6496612# to hear the audio portion of the presentation

• Download today’s materials from the sign-in page:

• Webinar Series Part 6 PowerPoint slides

• Correlation Example Excel file

Page 2: Title

Determining How to Integrate Assessments into Educator Evaluation: Developing Business Rules and Engaging Staff (Webinar Series Part 6)

Page 3: Webinar Series

# | Title | Date | Length | Time
1 | Introduction: District-Determined Measures and Assessment Literacy | 3/14 | 60 minutes | 4-5pm
2 | Basics of Assessment | 4/4 | 90 minutes | 4-5:30pm
3 | Assessment Options | 4/25 | 60 minutes | 4-5pm
- | TA and Networking Session I | 7/11 | 3 hours | 9am-12pm
4 | Determining the Best Approach to District-Determined Measures | 7/18 | 60 minutes | 4-5pm
5 | Measuring Student Growth and Piloting District-Determined Measures | 8/15 | 60 minutes | 4-5pm
- | TA and Networking Session II | 9/19 | 3 hours | 2:30pm-5:30pm
6 | Integrating Assessments into Educator Evaluation: Developing Business Rules and Engaging Staff | 10/24 | 60 minutes | 4-5pm
7 | Communicating Results | 12/5 | 60 minutes | 4-5pm
- | TA and Networking Session III | 12/12 | 3 hours | 2:30pm-5:30pm
8 | Sustainability | 1/23 | 60 minutes | 4-5pm

Page 4: Audience & Purpose

Target audience: district teams that will be engaged in the work of identifying, selecting, and piloting District-Determined Measures.

After today, participants will understand:

• Examples of practical solutions to issues of fairness in using District-Determined Measures (DDMs).

• Practical examples of engaging educators in the process of implementing DDMs.

Page 5: Agenda

• Student Impact Rating Rollout Reminder

• DDM Comparability

• Identifying Bias

• Standardizing DDMs

• Ensuring Sufficient Variability

• Q&A and Next Steps

Page 6: Student Impact Rating Rollout

Date | Action
Sept. 2013 | Decide which DDMs to pilot and submit list to ESE.
Sept. 2013 - June 2014 | Pilot DDMs in at least the five required areas and research DDMs in additional areas.
June 2014 | Submit final plans, including any extension requests, for implementing DDMs during the 2014-15 school year.*
SY 2014-2015 | Implement DDMs and collect Year 1 Student Impact Rating data for all educators (with the exception of educators who teach the particular grades/subjects or courses for which an extension has been granted).
SY 2015-2016 | Implement DDMs, collect Year 2 Student Impact Rating data, and determine and report Student Impact Ratings for all educators (with the exception of educators who teach the particular grades/subjects or courses for which a district has received an extension).

*ESE will release the June 2014 submission template and DDM implementation extension request form in December 2013.

Page 7: DDM Key Questions

Is the measure aligned to content?

• Does it assess what the educators intend to teach and what’s most important for students to learn?

Is the measure informative?

• Do the results tell educators whether students are making the desired progress, falling short, or excelling?

• Do the results provide valuable information to schools and districts about their educators?

Page 8: Refining Your Pilot DDMs

Districts will employ a variety of approaches to identify pilot DDMs (e.g., build, borrow, buy). Key considerations:

1. How well does the assessment measure growth?
2. Is there a common administration protocol?
3. Is there a common scoring process?
4. How do results correspond to low, moderate, or high growth?
5. Is the assessment comparable to other DDMs?

Use the DDM Key Questions and these considerations to strengthen your assessments during the pilot year.

Page 9: DDM Comparability: Two Types

DDMs must be “comparable across schools, grades, and subject matter district-wide” (per 603 CMR 35.09(2)(a)).

Comparability comes in two types:

• (Type 1) Comparable across schools

• (Type 2) Comparable across grades and subject matter

Learn more in Technical Guide B, page 9 and Appendix G.

Page 10: Comparability (Type 1)

Comparable across schools. Example: teachers with the same job (e.g., all 5th grade teachers).

• Where possible, measures are identical; identical measures are easier to compare.

• Do identical measures provide meaningful information about all students?

• When might they not be identical?
  - Different content (different sections of Algebra I)
  - Differences in untested skills (reading and writing on a math test for ELL students)
  - Other accommodations (fewer questions for students who need more time)

Page 11: Error and Bias

Error is the difference between true ability and a student’s score.

• Random error: the student sleeps poorly, makes a lucky guess, etc.

• Systematic error (bias): error occurs for one type or group of students, e.g., an ELL student misreads a set of questions. Systematic error = bias.

Why this matters:

• Error (OK) decreases with longer or additional measures.

• Bias (BAD) does not decrease with longer or additional measures.

• Even with identical DDMs, bias threatens comparability.

Page 12: When Does Bias Occur?

Situation: Students who score high on the pre-test have less opportunity to grow because they cannot get more than a top score (ceiling effect).

Situation: Special education students gain fewer points from pre- to post-test, and as a result are less likely to be labeled as having high growth.

Page 13: Checking for Bias

Do all students have an equal chance to grow? Is there a relationship between the initial score and the gain score? We can check this in Excel using correlation.

We have: Pre-Test Score, Post-Test Score, Gain Score.

• Type “=CORREL” and click the formula.

• Highlight the pre-test scores and press comma.

• Highlight the gain (difference) scores, close the parentheses, and press Enter.

Correlation formula in Excel: =CORREL(PRE-TEST SCORES, GAIN SCORES)
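The same check can be reproduced outside a spreadsheet. Below is a minimal Python sketch; the score lists are hypothetical (not from the webinar’s example file) and are chosen to show a ceiling-effect pattern:

```python
# Check for bias: correlate pre-test scores with gain scores.
def pearson(xs, ys):
    """Pearson correlation, equivalent to Excel's =CORREL(xs, ys)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

pre = [10, 12, 15, 18, 20, 22, 25, 28]    # hypothetical pre-test scores
post = [14, 17, 19, 21, 22, 24, 26, 29]   # hypothetical post-test scores
gain = [b - a for a, b in zip(pre, post)]

r = pearson(pre, gain)
# r is strongly negative here: students with high pre-test scores gained
# less, which is the ceiling-effect pattern the slides warn about.
```

A correlation below -0.3, as in this illustration, would flag the measure for a closer look.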

Page 14: Interpreting Correlation

Correlation is the degree to which two sets of numbers are related. It is a number between -1 and 1: a zero correlation means the numbers are unrelated, and values closer to 1 or -1 indicate stronger correlation.

DDMs should provide all students an opportunity to demonstrate growth, so we want to see little to no correlation between pre-test scores and gain scores. A correlation above 0.3 or below -0.3 suggests that there are systematic differences in gain for low- and high-ability students.

Page 15: Correlation Example

Demonstration of computing the correlation between pre-test and gain:

• Very low correlation: students of all abilities were equally likely to demonstrate growth.

• Negative correlation: students of high ability systematically demonstrated less growth (due to a ceiling effect).

• Positive correlation: students with lower scores generally grew less (bias).

Page 16: Interpreting Correlation

Strong correlation is an indication of a problem, but a low correlation is not a guarantee of no bias:

• A strong effect in a small sub-population can be hidden in the overall number.

• Counteracting effects at the low and high ends can cancel each other out.

• Use common sense.

Always look at a graph! Create a scatter plot and look for patterns.

Page 17: Example of Bias at Teacher Level

Teacher A:

Pre | Post | Gain
3 | 4 | 1
3 | 4 | 1
3 | 4 | 1
3 | 4 | 1
8 | 14 | 6

Teacher B:

Pre | Post | Gain
3 | 4 | 1
8 | 14 | 6
8 | 14 | 6
8 | 14 | 6
8 | 14 | 6

Even though similar students gained the same amount, Teacher A’s average gain is 2 and Teacher B’s average gain is 5.
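The slide’s arithmetic can be verified directly from the two gain columns:

```python
# Gain scores for Teacher A and Teacher B from the table above.
teacher_a_gains = [1, 1, 1, 1, 6]   # four low pre-test students, one high
teacher_b_gains = [1, 6, 6, 6, 6]   # one low pre-test student, four high

avg_a = sum(teacher_a_gains) / len(teacher_a_gains)   # 2.0
avg_b = sum(teacher_b_gains) / len(teacher_b_gains)   # 5.0
# Identical students gained identically, yet the class averages differ:
# the gap reflects class composition, not teaching.
```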

Page 18: Solution: Grouping

Grouping allows teachers to be compared based on similar students, even when the number of those students is different.

Average growth by group:

Group | Teacher A | Teacher B
Low students | 1 | 1
High students | 6 | 6

Page 19: Addressing Bias: Grouping

How many groups?

• What bias are you addressing?

• Are there enough students in each group?

Using groups:

• Weighted average

• Rule-based (all groups must be above a cut-off)

• Professional judgment
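As a sketch of the grouping idea, the Teacher A/B example can be re-scored by group. The “low”/“high” labels and the equal weighting across groups are illustrative assumptions, not prescribed by the webinar:

```python
from collections import defaultdict

def group_averages(students):
    """students: list of (group, gain) pairs -> {group: average gain}."""
    by_group = defaultdict(list)
    for group, gain in students:
        by_group[group].append(gain)
    return {g: sum(v) / len(v) for g, v in by_group.items()}

# Teacher A/B data from the earlier slide, labeled by pre-test level.
teacher_a = [("low", 1), ("low", 1), ("low", 1), ("low", 1), ("high", 6)]
teacher_b = [("low", 1), ("high", 6), ("high", 6), ("high", 6), ("high", 6)]

avg_a = group_averages(teacher_a)   # {'low': 1.0, 'high': 6.0}
avg_b = group_averages(teacher_b)   # {'low': 1.0, 'high': 6.0}

# An equal-weight average across groups removes the composition bias:
score_a = sum(avg_a.values()) / len(avg_a)
score_b = sum(avg_b.values()) / len(avg_b)
```

With grouping, both teachers receive the same score, matching the intuition that their similar students grew by the same amount.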

Page 20: Comparability (Type 2)

Comparability across different DDMs, i.e., across different grades and subject matter:

• Are different DDMs held to the same standard of rigor?

• Does not require an identical number of students in each of the three groups (low, moderate, and high).

• Apply a common-sense judgment of fairness.

Page 21: One Option: Standardization

Standardization is a process of putting different measures on the same scale. For example:

• Most cars cost $25,000, give or take $5,000.

• Most apples cost $1.50, give or take $0.50.

• Getting a $5,000 discount on a car is about equal to what discount on an apple?

Technical terms: “most are” = mean; “give or take” = standard deviation.

Page 22: Guest Speaker

Jamie LaBillois, Executive Director of Instruction, Norwell Public Schools

Page 23: Developing Local Norms

Student A: English 15/20, Math 22/25, Art 116/150, Social Studies 6/10, Science 70/150, Music 35/35

We learned early on that we needed a process that would create one universal measurement unit to discuss student progress.

Page 24: Transform the Data…

Page 25: How?

Step One: Calculate the difference between post and pre (or use any approach from Technical Guide B).

Step Two: Find the mean (average) of the difference scores.

Step Three: Find the standard deviation of the difference scores.

Page 26: How?

Now we’re ready to “transform” the difference scores into a universal measurement system.

Step Four: Calculate the z-score of each individual difference score:

    z = (observation - mean) / standard deviation

Step Five: Calculate the percentile rank for each z-score.
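Steps One through Five can be sketched in a few lines of Python. The score lists are hypothetical, and the percentile-rank step here uses the normal CDF, which is one of several reasonable choices:

```python
from math import erf, sqrt

pre = [10, 12, 15, 18, 20]     # hypothetical pre-test scores
post = [14, 15, 19, 22, 27]    # hypothetical post-test scores

# Step One: difference scores.
diff = [b - a for a, b in zip(pre, post)]

# Steps Two and Three: mean and (population) standard deviation.
mean = sum(diff) / len(diff)
sd = (sum((d - mean) ** 2 for d in diff) / len(diff)) ** 0.5

# Step Four: z-score of each difference score.
z = [(d - mean) / sd for d in diff]

# Step Five: percentile rank of each z-score, via the normal CDF.
pct = [round(100 * 0.5 * (1 + erf(zi / sqrt(2)))) for zi in z]
```

Because every subject’s scores are transformed the same way, the resulting percentile ranks are comparable across measures with very different raw scales.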

Page 27: Developing Local Norms

Student A, raw scores: English 15/20, Math 22/25, Art 116/150, Social Studies 6/10, Science 70/150, Music 35/35

Student A, percentile ranks: English 62 %ile, Math 72 %ile, Art 59 %ile, Social Studies 71 %ile, Science 70 %ile, Music 61 %ile

Page 28: Examining an Educator’s Impact

Grade 4 DIBELS Oral Reading Fluency, median %ile per class:

Teacher 1: 65 %ile | Teacher 2: 71 %ile | Teacher 3: 59 %ile | Teacher 4: 59 %ile | Teacher 5: 62 %ile | Teacher 6: 57 %ile | Teacher 7: 29 %ile | Teacher 8: 50 %ile

Evaluator’s focus: Teacher 7

Page 29: Lessons Learned

• Growth vs. Achievement

• Robust Tool

• Timely Analysis

• Re-Assessment of Instruction

• Re-Assessment of Ability vs. Disability

• Development of Building-Based Evaluators

• Educator Engagement is Essential

Page 30: Ensuring Sufficient Variability

Technical Guide B’s two key questions:

• Is the DDM aligned to content?

• Does the DDM provide information to educators and evaluators?

A lack of variability reduces information.

Page 31: Looking for Variability

[Two bar charts of number of students (y-axis, 0-200) by growth category (low, moderate, high). “Good”: students are spread across all three categories. “Problematic”: most students fall in the “high” category.]

The second graph is problematic because it doesn’t give us information about the difference between average and high growth, because so many students fall into the “high” growth category.
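A simple variability check is to count students per growth category and see whether one category dominates. The labels, counts, and the 80% threshold below are illustrative assumptions:

```python
from collections import Counter

# Hypothetical growth labels for 200 students (the "Good" shape above).
growth_labels = ["low"] * 40 + ["moderate"] * 90 + ["high"] * 70
counts = Counter(growth_labels)

# If nearly everyone lands in one category, the DDM tells us little
# about the difference between moderate and high growth.
top_share = max(counts.values()) / sum(counts.values())
problematic = top_share > 0.8
```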

Page 32: Guest Speaker

Experience with constructing measures with greater variability

Page 33: Wrap-Up

Today, we discussed three strategies for evaluating the fairness of your DDMs:

1. Check for bias by computing the correlation between pre-test scores and gain scores. Remember: zero correlation indicates that all students have an equal chance to demonstrate growth.

2. Standardization can help you compare DDMs in different content areas.

3. Look for variability in student growth. A lack of variability reduces the amount of information available to educators about their students.

Page 34: Resources

Available now at http://www.doe.mass.edu/edeval/ddm/:

• Technical Guide B

• DDMs and Assessment Literacy Webinar Series

• Technical Assistance and Networking Sessions

• Core Course Objectives and Example DDMs

Coming soon:

• Using Current Assessments Guidance (Curriculum Summit)

• Model Contract Language

• DDM Pilot Plan Cohorts

Page 35: Register for Webinar Series Part 7

Part 7: Communicating Results

Date: December 5th, 2013
Time: 4-5pm EST (60 minutes)
Register: https://air-event500.webex.com/air-event500/onstage/g.php?d=597905353&t=a