oerc june 2014 final ppt combined

OERC RESEARCH RELATED TO STUDENT GROWTH MEASURES AND EDUCATOR EFFECTIVENESS
Jill Lindsey, Ph.D., Wright State University
Marsha Lewis, Ph.D., Ohio University
Extending Your Knowledge Through Research That Works! | Columbus, OH | June 18, 2014

Upload: ohio-education-research-center

Post on 25-May-2015

Category: Education

TRANSCRIPT

Page 1: Oerc june 2014 final ppt combined

OERC RESEARCH RELATED TO STUDENT GROWTH

MEASURES AND EDUCATOR EFFECTIVENESS

Jill Lindsey, Ph.D., Wright State University

Marsha Lewis, Ph.D., Ohio University

Extending Your Knowledge Through Research That Works! | Columbus, OH | June 18, 2014

Page 2

OERC can examine statewide policy and practice questions

Follow and document early implementation in order to inform policy and practice

Research implementation for multiple years as start-up issues are resolved and implementation takes hold—look for what is working and what can be improved.

PURPOSE OF RESEARCH

Page 3

The time span of the findings offers insight into the changing landscape around teacher evaluation and student growth measures

2012: Teachers and principals philosophically supportive of new evaluation system and need to measure student growth

2013: HB 555

2014: Teachers and principals far less supportive and troubled by concerns related to use of different types of SGMs for evaluation

Common Themes

FOUR STUDIES SPANNING THREE YEARS

Page 4


METHODOLOGY

Structured interviews with superintendents and administration team members

Focus groups with teachers

Surveys of teachers in each pilot LEA

eTPES data analysis

Page 5

OTES/OPES Implementation Study (37 LEAs)

Extended Testing for Value-Added Reporting (23 LEAs)

Initial Use of Student Learning Objectives (30 LEAs)

Student Growth Measures Policy & Practice (13 LEAs)


FUNDED PROJECTS RELATED TO STUDENT GROWTH MEASURES

Page 6

Sequencing, planning, feedback, and student growth measures for teachers and principals

Preparation for evaluation

Experiences of teachers and principals evaluated using student growth measures

Processes and measures of student growth that districts adopted for use in teacher and principal evaluation systems


OTES/OPES IMPLEMENTATION STUDY

Page 7


EARLIEST FINDINGS

Generally positive about the new evaluation systems

Supported use of student growth measures in evaluation

Lack of trust & misunderstandings about value-added, vendor assessments, and local measures of student growth

Unfairness of using different kinds of measures and differing time cycles for different measures of student growth

Conversations around new evaluation system focus on instruction

Time required to complete evaluation took time away from working with students

Appreciative at being asked about their experiences and views

Page 8

Grant funds provided vendor testing for grades 1, 2, and 3 and high school subjects

Provided teacher-level value-added scores from vendor test results

Processes and challenges related to extended testing implementation

Role of roster verification

Use of SGMs in educator evaluation

Best practices/lessons learned


SGM EXTENDED TESTING MINI-GRANT

Page 9

Findings

Want reliable student growth measures

Lack assessment literacy

Unclear how vendors will provide data

Uncertain of roster verification timing and impact on VAM

LEAs opting to use lowest percentages in weights

Grateful for being asked about their experiences

Nine Dropouts' Reasons

Requirement to use extended testing results too soon, unfair, not part of grant

Cost of extended testing was too high

Too many changes, and too much on teachers’ plates

SGM MINI–GRANTS FOR EXTENDED TESTING

Page 10

INITIAL USE OF STUDENT LEARNING OBJECTIVES

Study examined fidelity of SLO use for: improving student performance, measuring academic growth, and evaluating teachers

Page 11

Training was not uniform across the state

Assessments varied widely across grade levels, buildings, and districts

Processes excessively time-consuming

More challenging for semester or quarter courses; limited time to complete the pre-test, teach, post-test cycle

Implementation hampered by too many concurrent changes: the Common Core transition, piloting new state tests and PARCC, and implementing OTES

Many emotional moments and gratitude for being invited to talk about experiences

Interviews

Surveys

Documents

eTPES Data

EARLY SLO THEMES

ALL DATA NOT YET ANALYZED

Page 12

SGM POLICY AND PRACTICE STUDY

OERC study of early adopter districts of Student Growth Measures

Designed to provide timely data to inform state policy and district practice: "What does this look like when implemented?"

Teachers' perceptions of SGM components

Do SGMs correlate with Performance on Standards? If not, why not?

The distribution of teacher and principal ratings

Page 13

Focus group themes:

Fairness questions (e.g., Category A teachers do not know OAA items in advance while Category C teachers develop their own assessments)

Principals' time consumed with teacher observation activities

Teachers have questions/misconceptions about value-added methodology

SGM POLICY AND PRACTICE STUDY

Page 14

SGM POLICY AND PRACTICE SURVEY

Deployed late February through mid-April, 2014

22% response rate (603 teacher respondents of 2,709 full-time teachers); N = 469 classroom teachers, 97 intervention specialists

Survey responses were similar to focus group findings

Of the four SGMs (Value-Added, SLOs, Vendor-Approved Assessments, Shared Attribution), more surveyed teachers think Student Learning Objectives "most accurately assess a teacher's instructional impact."

Page 15

Early stage of implementation

Uncontrollable factors

Unequal measures/accuracy of the measures

SLOs are teacher-developed, raising validity/reliability questions

Others see SLOs as most fair because they focus on the content taught and on results during the evaluation year

Approved vendor assessments may not match content standards

Value-added model was not formulated to measure individual teacher effectiveness


FAIRNESS CONCERNS WITH SGMS & EVALUATION

Page 16

Teachers who see value in SGMs:

Feel it is important to measure student growth

Recognize the need for accountability

Find SGMs a useful source of feedback for planning and adjusting to outcomes


SGM POLICY AND PRACTICE STUDY

Page 17

“I do think it is important to make sure a child makes adequate growth. However, there are factors that are out of my control (attendance, home support, etc.) that affect a child's learning and are not considered when calculating the yearly academic growth of a student.”

“It shows the effectiveness of a teacher and useful data to adjust your teaching.”


SGM POLICY AND PRACTICE STUDY

Page 18

Teacher–Student Data Link/Roster Verification is necessary to ensure SGM data quality.

Research Questions:

Are teachers actively participating in the verification of their own rosters and percentage of instructional time with students, as specified by Ohio's roster verification process guidelines?

Do principals and teachers have access to adequate training and technical assistance?

Do principals and/or teachers perceive any issues with roster verification?

What do Ohio educators view as the hallmarks of a good system?

TEACHER ROSTER VERIFICATION RESEARCH

Page 19


TEACHER ROSTER VERIFICATION SURVEY

Sent online survey to all teachers and principals who completed the link/roster verification process in spring 2013 and spring 2014

2013 survey: 5,984 teacher responses from 695 LEAs

2014 survey to-date: 6,778 teacher responses. Survey still in field.

Page 20

              2011   2013   2014* (prelim.)
Yes            46%    57%    59%
No             23%    25%    25%
Don't know     31%    17%    16%

Teachers – Do you think the linkage process accurately captured what was happening in your classroom (i.e., students you taught last year, their length of enrollment, and your percentage of instructional time with them)?

TEACHER ROSTER VERIFICATION SURVEY

Page 21


TEACHER ROSTER VERIFICATION SURVEY

For teachers who answered "No": Teachers – Explain why you think the student–teacher linkage process did not accurately capture what was happening in your classroom. (open-ended)

Themes:

Difficulty dividing time in various co-teaching situations

Unable to account for student absences

Teachers want to be able to report finer increments of shared instructional responsibility

Students' schedules changed too often / environment too dynamic to accurately estimate time

Page 22

                      2011   2013   2014* (prelim.)
Not at all confident   39%    32%    35%
Somewhat confident     55%    61%    58%
Very confident          6%     8%     7%

Teachers – Given your experience with the linkage process, how confident are you that the linkage process improves the accuracy of the teacher-level value-added data?

TEACHER ROSTER VERIFICATION SURVEY

Page 23

Concerns

Early in implementation—lack of trust and misunderstandings

Perceived unfairness of different kinds of measures

Time required to complete evaluation

Too many changes at the same time

Kudos

Support measuring student growth

Training for assessment literacy desired

Appreciate being consulted and heard

Roster verification process is improving


COMMON THEMES ACROSS TIME

Page 24

Build trust by continuing to include teachers and administrators in conversations and policies that impact them

Acknowledge concerns as legitimate

Provide professional development opportunities to correct misunderstandings and knowledge deficits

Streamline paperwork where possible; use adaptations from the field

Modify policy and roll-out timelines when possible


RECOMMENDATIONS

Page 26
Page 27

PLANNING FOR THE FUTURE, LEARNING FROM THE PAST: WHAT CAN SCHOOLS LEARN FROM COLLEGE AND CAREER PROFILES OF GRADUATES?

Joshua D. Hawley, Director, OERC and Associate Professor

John Glenn School of Public Affairs, The Ohio State University

Making Research Work for Education

Page 28

Ohio’s Constitution

A "thorough" and "efficient" education.

OHIO’S STANDARDS OF PUBLIC EDUCATION

Page 29

College for all: High School → College → Work

Education and Career:

High School
• CTE
• STEM

College
• AP/Dual Enrollment
• School + Work

Workforce Training
• Apprenticeship
• Military

LINKING SCHOOL TO WORK

Page 30


CHANGING VIEWS OF WORK REQUIRE NEW INFORMATION

As educators, we want to know about a variety of educational outcomes, not just college-going rates for students. At its broadest, we might consider the following domains:

College (traditional two- and four-year sectors)
Credentialed and non-credentialed workforce training
Apprenticeships
Military

Page 31

QUALITY OF OUTCOMES MATTERS

In this day and age, the quality of the outcomes matters a great deal, and, therefore, we are concerned with how well students are prepared to perform over time.

Concerns we typically have in this case:

Student remediation

Does student knowledge match what is required in college classes?

Are students prepared to pick a career? (I distinguish this from a job.)

What happens to kids that go directly from high school to work?

What happens to kids that drop out (both in terms of further education and work)?

Page 32


PILOT HIGH SCHOOL REPORTS

Using data from the Ohio Longitudinal Data Archive (OLDA), the OERC has been able to answer many of these questions, beginning with high schools, and present them in a format that schools can use. The report has four key question areas:

What are the employment outcomes of high school graduates?

What are the postsecondary education outcomes of high school graduates?

What is the quality of the postsecondary education high school graduates are pursuing?

What happens to individuals that do not graduate from high school (dropouts)?

Page 33


PILOT REPORTS: COLLEGE AND CAREER

Page 34

WHAT KIND OF EMPLOYMENT EXPERIENCES DO STUDENTS HAVE AFTER SCHOOL?

Page 35

WHAT HIGHER EDUCATION OUTCOMES DO STUDENTS HAVE?

Page 36

HOW WELL DO SCHOOLS COMPARE WITH EACH OTHER IN REMEDIAL EDUCATION?

Page 37

VISUAL LOOK

College and Career Report mockup: High School Name | County | State. "How do X year graduates from this school compare to others in Ohio?" Education and Career Outcomes are compared across School, District, and State.

(Note): Following the legal agreement covering data use by the OERC, individual cells with fewer than 10 people have been redacted. They are indicated by a *.

Report fields include: number of students that started high school in year X; number of graduates; number of dropouts; average high school GPA; percent of students in the class eligible for free or reduced lunch; average junior-year ACT scores for the class by subject (English, Math, Reading, Science, Composite); percent of graduates ready for college; average state scholarship awards for graduates; overall postsecondary rate for graduates (two-year college, four-year college, other vocational/workforce training); percent of graduates who attended an in-state vs. out-of-state college or university; trend in college-going rates for the school vs. state and district; working and school; number of students working in Ohio in the year following graduation; average annual earnings for individuals with and without a high school diploma; and industry of employment (e.g., Retail, Construction, Financial Services, Other) for graduates not in college and for dropouts.

*Readiness scores are calculated based on the individual ACT score over or below the "Remediation Free Standard."

Footer: Describe data origin. Refer to website that summarizes origin.

Page 38


COLLEGE OUTCOMES

How do X year graduates from this school compare to others in Ohio?

Education

(Note): Following the legal agreement covering data use by the OERC, individual cells with fewer than 10 people have been redacted. They are indicated by a *.

School District State

Number of Students that Started High School in X year

Number of Graduates

Number of Dropouts

Average High School GPA

Percent of students in this class eligible for free or reduced lunch

Average junior year ACT scores for this class by subject

English

Math

Reading

Science

Composite

Page 39


COLLEGE OUTCOMES (CONT.)

Percent of graduates ready for college

Average state scholarship awards for graduates

Overall post secondary rate for graduates from this class

Two year college

Four year college

Other vocational/workforce training

Percent of graduates from this class who attended an in‐state college or university

Percent of graduates from this class who attended an out of state college or university

Trend in college going rates for this school vs. state and district

Page 40


CAREER OUTCOMES

Number of students working in Ohio in the year following graduation

Average annual earnings for individuals with a high school diploma

Average annual earnings for individuals without a high school diploma

Industry of employment for graduates of high school not in college: Retail (20), Construction (20), Financial Services (60)

Industry of employment for high school dropouts: Retail (40), Construction (40), Financial Services (10), Other (10)

Page 41

Develop a formal high school college and career report for select districts (next up, Columbus City Schools; Battelle For Kids)

Complete Workforce Success Measures Project with the Office of Workforce Transformation (see OWT website for introduction: http://workforce.ohio.gov/ ).

Work with Ohio Department of Education and Board of Regents to answer questions about employment outcomes for K-12 and higher education.


FUTURE PLANS

Page 42

THANKS FOR YOUR ATTENTION!

[email protected] | oerc.osu.edu


Page 44
Page 45

THIRD GRADE READING GUARANTEE:

A CASE STUDY

Suzanne Franco, Professor, Wright State University
Jarrod Brumbaugh, Principal, Milton-Union Schools

Making Research Work for Education
Extending Your Knowledge Through Research That Works! | Columbus, OH | June 18, 2014

Page 46

Ohio Third Grade Reading Guarantee (TGRG) 2012

In 2012–13, 81% of Ohio’s 3rd graders were proficient or above

ODE offered competitive funding grant for developing TGRG 2013–2014 implementation

OERC funded a case study of a funded TGRG three-LEA consortium for 2013–2014

BACKGROUND


Page 47

Located near one another in midwestern Ohio, the three LEAs had not collaborated on previous initiatives

Orton-Gillingham Multi-Sensory training and instructional strategies

Professional Learning Community (PLC)

Parent Informational Opportunities


CONSORTIUM TGRG PLAN

Page 48

CONSORTIUM DEMOGRAPHICS

LEA  Typology                               Report Card Rating   % passed 3rd grade    % passed 3rd grade    Mobility  % White/       % Econ.
                                            (2010–2011)          reading (2011–2012)   reading (2012–2013)             Non-Hispanic   Disadvan.
1    2: Rural w/ avg. pvrty & small ADM     Exc.                 87.3                  85.9                  7.0       97             40
2    4: Small Town w/ high pvrty & avg ADM  Exc.                 90.3                  80.2                  11.4      87             53
3    2: Rural w/ avg. pvrty & small ADM     Exc. w/ Distinction  84.7                  84                    6.1       98             19

Page 49

Feedback and buy-in for the training, implementation, and PLC

Progress and monitoring tools used

Did reading skills improve for On-Target students? For Not-on-Target students?

What percentage of K–3 students were Not on Target in 2012–14?

RESEARCH QUESTIONS

Page 50

For each LEA:

Document analysis of historical data and end-of-year 2014 RIMPs

Interviews and focus groups with administrators (6) and teachers (12)

Observations of O/G training and classroom instruction

METHODOLOGY

Page 51


O/G TRAINING

Training Details
Two 5-day sessions the week after the school year ended
One 5-day session in November
Refresher course available summer 2014

Training Feedback
Teachers felt it was engaging but too long, or covered grade levels not of interest to them. They would like to repeat the training after one year of implementation.
Administrators from one LEA attended training. They felt that common language helped with classroom observations.

Page 52


IMPLEMENTATION

Implementation Details

LEA 1 – O/G not required due to receipt of grant funds in mid-September 2013. Used in RTI, Title 1, and other interventions.
KRAL is the identifier for K; state assessment tool for grades 1–3
DIBELS is the progress monitor, along with STAR and Study Island

LEA 2 – O/G not required (see above). Used in RTI, intervention, and Title 1
NWEA (2012) and DIBELS (2013) for K–3
DIBELS is the progress monitor

LEA 3 decided not to participate

Page 53

Not all supplies were available at the beginning of the year for all teachers, due to delay in receiving grant funds

Not all training was completed at the beginning of the year (new teachers)

Use of O/G not required; inconsistency a challenge for teams

Merging O/G with the LEA-approved reading curriculum was difficult

Parent Nights were not well attended; PLC not formed

IMPLEMENTATION FEEDBACK

Page 54


PROGRESS AND MONITORING TOOLS

LEA 1: DIBELS
LEA 2: NWEA (2012); DIBELS (2013)

Feedback

O/G assessment tools are not Ohio-approved; therefore the LEAs use DIBELS and NWEA to assess student progress

RIMPs not standardized among LEAs (an issue for students who move and for determining LEA or statewide impact)

For highly mobile student populations, the 30-day requirement for a RIMP is very difficult to meet

Too much testing for young students; test anxiety rising

Page 55

Successes

After School Program

Students respond well to Multi-Sensory

Teachers want more training

Challenges

Use of O/G not consistent

Costs to sustain

O/G assessments not state approved

RIMP forms could be improved; data should be collected for analyses

No information about other LEA TGRG plans

CONSORTIUM SUMMARY

Page 56

LEA 1
Grade 3 results to date
Changes in implementation for 2014–2015

LEA 2 Details
Grade 3 results to date
Changes in implementation for 2014–2015

2013–2014 RESULTS
2014–2015 PLANS

Page 57

Assessment tools aligned with TGRG programs funded by state need approval.

Primary students exhibit high anxiety regarding TGRG, impacting performance and fostering fear of school.

Required testing takes away from instruction time. Embrace testing that collects needed data for all accountability purposes, not just one initiative.


TESTING RECOMMENDATIONS

Page 58

Continue funding for TGRG development.

Continue monitoring LEA implementation of funded and non-funded TGRG plans, and share "lessons learned."

Revise RIMP format and collect RIMP data for longitudinal analyses of common deficiencies across the state.


TGRG POLICY RECOMMENDATIONS

Page 59

QUESTIONS

[email protected] [email protected]

[email protected]

Making Research Work for Education
Extending Your Knowledge Through Research That Works! | Columbus, OH | June 18, 2014

Page 60
Page 61

READY OR NOT?

Extending Your Knowledge Through Research That Works! | Columbus, OH | June 18, 2014

Ginny Rammel, Ph.D., Superintendent

Milton-Union Exempted Village Schools

Page 62

Research and study

Use of multiple forms of data

Create a culture of “calculated risk-takers”

Embed professional development

“EVERY STUDENT, EVERY DAY”

Page 63

Role model to all at all times

Establish high expectations

Collaborate, share

Trust, be truthful and supportive

Know your staff


CULTURE TAKES TIME TO CHANGE!

Page 64


EXPLORE OPPORTUNITIES

Through grants, pilot studies, action research

Connect with:
Personnel at colleges and universities, OERC

Educators from other districts

Members of professional organizations

Policymakers, legislators

Page 65

Milton-Union was involved in a number of grants:
RttT Mini-grant Value-Added
Student Growth Measures
Early Literacy and Reading Readiness
OERC case study

The more you and your staff research, study, and share data, the better the decisions you make. Collaborating and working together help to create a culture of "Every Student, Every Day."

EMBEDDED PD: BE AN ACTIVE PARTICIPANT

Page 66

Our culture, along with the research and data from our grant involvement, led to the development of our OTES instrument and a successful year of implementation.

This trust and openness flowed throughout recent negotiations.


RESULTS

Page 67

All initiatives impact one another and YOU:
OTES
OPES
Graduation requirements
Third Grade Reading Guarantee

Keep the main thing the main thing – is what I’m doing going to help students learn? Are we preparing students for “down-the-road” careers?


CHANGE WILL OCCUR WITH OR WITHOUT YOU!

Page 68

Do the research upfront

Study the data

Reflect, revise if necessary

Building project

Food service program


CALCULATED RISK-TAKERS

Page 69

All day, every day kindergarten

On-site Head-Start programs

Grouping students by quintiles

H.S. ACT EOC exams

Recognized as a U.S. Department of Education Green Ribbon School

Food service program ended the year in the black!


DATA SUPPORTS INITIATIVES

Page 70

How do we better prepare students for their futures – colleges, universities, employers?

How can we convey to young parents the importance of their role as teachers?

How can we differentiate education so all students are better served?

How can we better communicate the results of research and the sharing of data?


NEXT STEPS…

Page 71

[email protected]

[email protected] | oerc.osu.edu