
DVAS Training 10-03-05

Presentation Outcomes

• Learn rationale for value-added progress measures

• Receive conceptual overview of value-added analysis

• View sample value-added reports

• See how value-added information fits with school improvement

• Find out how Battelle for Kids can help

Rationale for Value-Added Progress Measures

Why are traditional achievement measures alone an insufficient way to assess student achievement?

The Changing Educational Landscape

• 1960’s: Mastery learning

• 1970’s: Behavioral objectives

• 1980’s: Minimum competencies

• 1990’s: Outcomes-based education

• 2000’s: Standards-based education

Looking at the changing educational landscape, a clear pattern exists:

The focus has moved from what goes into a child’s education to what comes out of the process

What’s the Upshot?

From Teaching Inputs

• Context (room, furniture, master schedule, course of study)

• Resources (number/quality of books, computers, materials)

• Capacities (knowledge of subject and teaching/learning processes, classroom control, lesson planning)

To Learning Outputs

• State content standards and local curriculum aligned to standards

• Annual paper/pencil tests to measure achievement

• State report cards at district/building level

• Good teaching = High student performance

The primary measure in an “output-focused” system is student scores on statewide achievement tests.

The Shift in Focus

Stair-Step Expectations

Most achievement measures imply:

• Achievement test scores are enough to show growth

• Students start at the same place

• Students progress at the same rate

Reality

• Students start at different places

• Students progress at different rates

• Educators need more than individual test scores to evaluate a school’s impact on student learning

Need for Progress Measures

To measure school effectiveness, we must pay attention to passage rates AND annual student progress

Question for Educators Today

How do we maximize student progress each year, regardless of where they start?

85% of the public believes student progress is the best measure of a school’s effectiveness.

— Phi Delta Kappa/Gallup Poll, 2005

Conceptual Overview of Value-Added Measures in Ohio

How is performance data used to produce value-added information?

Two Value-Added Systems in Ohio

• Project SOAR

• Ohio’s Accountability System

Project SOAR

• Operated by Battelle for Kids

• Began in 2002 with 42 school districts

• Now has 106 districts and 3 charter schools

• Provides analysis in 5 subjects for grades 3-10

• Uses state and non-state test data

• Uses a prediction-based value-added approach

• Expected growth is normative (“Average Growth”)

Ohio’s System

• Operated by the Ohio Department of Education

• Begins as a 4th grade pilot in 2006 in all districts

• Provides analysis in math & reading in grades 4-8

• Uses only state achievement tests

• Uses a mean gain value-added approach

• Expected growth is likely to be a fixed amount

What Do the Two Have in Common?

• Utilize the power of longitudinal data, linking each student’s assessment data together over time

• Compare students’ current test scores to baseline scores

• Provide value-added information in Web-based reports

• Use the statistical power of EVAAS™ to produce the value-added analysis

Why EVAAS™ in Ohio?

• EVAAS™ is the value-added methodology pioneered by Dr. William Sanders of SAS

• Applies the most sophisticated statistical methodologies available to ensure reliability

• Allows for the use of all student test data

• Provides valuable diagnostic information

• Offers approaches for handling different types of test data

• Identified by RAND and others as a preferred model

• Used statewide in Tennessee for 10 years

What is Value-Added?

Value-added, in its simplest form, is an accurate measure of the present (observed scores) minus an accurate measure of the past (baseline scores) for the same group of students.

Mean Observed Score - Mean Baseline Score = Value-Added

Math Scores

Student     Year 1 (3rd Grade)   Year 2 (4th Grade)
Student 1   350                  400
Student 2   370                  385
Student 3   360                  395
Student 4   375                  405
Student 5   365                  390

Mean Baseline Score: 364   Mean Observed Score: 395

395 - 364 = 31, which would be a crude measure of the value-added.
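The arithmetic above can be checked in a few lines of Python, using the five students’ scores from the slide:

```python
# Crude value-added calculation using the five students' math scores
# from the slide (same students in both years).
baseline = [350, 370, 360, 375, 365]   # Year 1, 3rd grade
observed = [400, 385, 395, 405, 390]   # Year 2, 4th grade

mean_baseline = sum(baseline) / len(baseline)   # 364.0
mean_observed = sum(observed) / len(observed)   # 395.0

value_added = mean_observed - mean_baseline
print(value_added)   # 31.0
```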

Why are Two Systems Needed?

• When both tests are on a common scale, like the Ohio achievement tests, the baseline can come from the prior year’s test scores (Mean Prior Score Approach - Ohio System)

• When the tests are on different scales, the baseline must be calculated (Mean Predicted Score Approach - Project SOAR)

Sophisticated statistics are required in both approaches to ensure that all students’ data are included, that the information is reliable, and to add predictive diagnostic power.

Different approaches are needed to provide reliable baseline scores from the different kinds of tests used in Ohio

What are Common Scales?

• Vertical scales increase in equal intervals as you move up grade levels

• Horizontal scales remain the same as you move up grade levels

Vertical scale example - score at the 50th percentile:

Grade:   3rd   4th   5th   6th   7th   8th
Score:   350   400   450   500   550   600

Ohio Achievement Tests (horizontal scale):

Grade:              3rd   4th   5th   6th   7th   8th
Proficient Score:   400   400   400   400   400   400

When you have common scales, the prior years’ scores can be used as the baseline.

How Are Test Data Used When They Are Not on a Common Scale?

• All available test data is collected and linked for each student

• Districts are grouped into pools based on common testing histories

• Relationships between and among all tests in the pool are used to create predicted baseline scores: what students like them would typically score on this year’s test

How Do You Create a Longitudinal Record?

• Collect all individual student data available for a minimum of three years

• Link each student’s annual test data together to create a longitudinal record

How Are Comparison Pools Created?

Districts are grouped into pools based on common testing histories at each grade-level cohort.

Example: Mike’s cohort testing history or pool

How Are the Relationships Between Tests Within a Pool Defined?

Relationships between and among all tests in the pool are calculated and can be represented as a number or correlation.

Example: using 2 years and 4 years of Mike’s testing history
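As a hedged illustration of representing a test-to-test relationship as a correlation, the sketch below computes a Pearson correlation between two tests in a pool. All scores here are hypothetical, not actual SOAR or state test data:

```python
# Hedged sketch: the relationship between two tests in a pool,
# summarized as a Pearson correlation. Scores are hypothetical.
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two score lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

grade3_math = [350, 370, 360, 375, 365]   # hypothetical prior-year scores
grade4_math = [390, 410, 395, 415, 400]   # hypothetical current-year scores
print(round(pearson(grade3_math, grade4_math), 2))
```

A correlation near 1 means performance on one test strongly predicts performance on the other, which is what makes the prediction approach workable.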

How Much Prior Data is Used?

Up to 5 years of student test data and the relationships between tests are used to calculate predicted baseline score for this year’s subject-area tests.

How Are Predicted Scores Calculated?

Using the test data for students with similar prior performance on common tests, and the tests’ relationships to each other, allows for the creation of statistically reliable predicted scores for each student in each subject.
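As a rough, hypothetical sketch of this idea (not the actual EVAAS™ model, which is far more sophisticated and draws on up to 5 years of data across subjects), one could predict each student’s current-year score from a prior-year score via a least-squares fit over the pool:

```python
# Hypothetical sketch only (NOT the EVAAS methodology): predict this
# year's score from last year's using a least-squares fit over the
# pool, then use the prediction as each student's baseline.

def fit_line(xs, ys):
    """Ordinary least squares; returns (slope, intercept)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sxx = sum((x - mean_x) ** 2 for x in xs)
    slope = sxy / sxx
    return slope, mean_y - slope * mean_x

# Hypothetical pool: prior-year and current-year scores for a cohort.
pool_prior   = [350, 370, 360, 375, 365, 355, 380]
pool_current = [398, 412, 402, 418, 408, 401, 422]

slope, intercept = fit_line(pool_prior, pool_current)

def predict(prior_score):
    """Predicted (baseline) score for a student with this prior score."""
    return slope * prior_score + intercept

mike_prior = 362   # hypothetical prior score for "Mike"
print(round(predict(mike_prior), 1))
```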

[Chart: predicted scores for Mike and 19 other students in the pool]

Are All Students Used in the Analysis?

Only students with enough prior data to create a predicted score are included.

How Do You Estimate the School’s Effect on Student Growth?

[Chart: for your school, each student’s predicted score alongside their observed score; the Mean Predicted Score serves as the “Baseline” and is compared to the Mean Observed Score]

Mean Student Score - Mean Predicted Score (with some statistical reliability factored in) = School Effect
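The school-effect formula can be sketched in a few lines. Scores below are hypothetical, and the statistical reliability adjustment used in the real analysis is omitted:

```python
# Minimal sketch of the school-effect calculation. Scores are
# hypothetical; the real analysis also factors in statistical
# reliability, which is omitted here.
predicted = [392, 388, 401, 395, 389]   # predicted (baseline) scores
observed  = [398, 391, 399, 404, 398]   # observed scores, same students

mean_predicted = sum(predicted) / len(predicted)   # 393.0
mean_observed  = sum(observed) / len(observed)     # 398.0

school_effect = mean_observed - mean_predicted
print(school_effect)   # 5.0
```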

Sample Value-Added Reports

What information do value-added reports provide that was previously unavailable?

Achievement & Progress

High Achievement, High Progress

High Achievement: High Mean Scores = 89% passage

High Progress: Positive School Effects

2005 School Value-Added Report for OPT Math

High Achievement, Low Progress

High Achievement: High Mean Scores = 85% passage

Low Progress: Negative School Effects

2005 School Value-Added Report for OPT Math

Low Achievement, High Progress

Low Achievement: Low Mean Scores = 69% passage

High Progress: Positive School Effects

2005 School Value-Added Report for OPT Math

Student A Report

Student B Report

Student B Projection

Connecting to School Improvement Efforts

How can value-added progress measures enhance school and district improvement at the:

• District Level

• School Level

District Improvement Efforts

• Identify patterns of progress across buildings, grade levels and subject areas

• Locate areas of strength to build upon

• Locate areas for improvement

District Value-Added Summary Report

2005 District Value-Added Summary Report, 4th Grade

School Search Report

2005 School Search 4th Grade Math

School Improvement Efforts

• Identify patterns of progress across grade levels, subject areas and student subgroups

• Locate areas of strength to build upon

• Locate areas for improvement

School Value-Added Report

2005 School Value-Added Report, Reading

Performance Diagnostic Report

2005 Performance Diagnostic for Reading, 5th Grade Means

School Diagnostic Report

2005 School Diagnostic for Reading, 4th Grade Means

Search for Students

By Subgroup or Achievement

Student Search

Identified At-Risk Students

Student Search Results

In Summary, Value-Added Information Shows…

• How much progress students make in each subject area and grade level

• How much progress students at different previous achievement levels have made

• How students’ progress in one curricular area or program compares to their progress in another

• Whether individual students are making adequate progress to meet state standards

Without data, all we have are opinions!

School Strategic Planning Cycle

Pre-School Start-of-the-Year Meetings

• Examine value-added and other school performance information by grade level and/or subject area

• Assess strengths and weaknesses and potential actions, grade level by grade level, subject by subject

• Celebrate progress

• Set 1-2 goals for each grade level, department, or team around strengths and weaknesses

• Create action plans and accountabilities

Grade level, department and/or team meet to work on team-specific goals

Implementation Checklist

• Select a lead person in the school district who understands value-added information and can access, interpret, and conduct presentations

• Train school leaders (principals and lead teachers) to access, interpret, and conduct presentations

• Have school leaders share value-added information with school staff members

• Have school staff use value-added information to assess annual progress and set goals for next year

Ohio’s Scale-Up Plan

For more information, contact:

www.BattelleforKids.org

(866) KIDS-555
