Human Capital Policies in Education: Further Research on Teachers and Principals

5th Annual CALDER Conference, January 27th, 2012


Page 1: Human Capital Policies in Education: Further Research on Teachers and Principals

Human Capital Policies in Education: Further Research on Teachers and Principals

5th Annual CALDER Conference
January 27th, 2012

Page 2

Where You Come From Or Where You Go?

Distinguishing Between School Quality And The Effectiveness Of Teacher Preparation Programs

January 27, 2012

Kata Mihaly, RAND

J. R. Lockwood, RAND

Daniel McCaffrey, RAND

Tim Sass, Georgia State University

Page 3

Introduction

• Improving teacher effectiveness is one of four pillars of education reform under the Obama Administration

• States are using evidence-based techniques to evaluate teacher preparation program effectiveness

• One technique links student achievement to the preparation program where each student's teacher was trained and certified

• Among the many concerns is that school context could affect preparation program estimates

Page 4

Table 1 - Characteristics of Schools Where Graduates from a Sample Program in Florida were Hired (N=22)

Variable                   Mean      Std. Dev.   Min       Max
Black                      0.3243    0.3123      0         1
Hispanic                   0.2490    0.2158      0         0.8000
Female                     0.4791    0.1296      0         0.6000
Parent no English          0.3297    0.2323      0         0.8000
LEP                        0.1249    0.1314      0         0.4093
Free Lunch                 0.5000    0.2723      0         0.9465
Math gain score (norm)    -0.0276    0.3364     -0.6662    0.6895
New Teachers               0.4065    0.2566      0         1
Number of Prep Programs    1.6818    0.8937      1         4

Page 5

Introduction

• There are many potential problems with linking preparation programs to student achievement:
  • selection of teachers into and out of programs
  • selection of program graduates into teaching positions
  • how teacher performance is measured

• Here we consider the problem of distinguishing preparation program effects from the environment of the schools where graduates teach

• We estimate preparation program effectiveness using a value-added model of student achievement with data from Florida

Page 6

Overview of Research Questions

1. Can school fixed effects be included in the value added model?

2. If yes, does the inclusion of school fixed effects change preparation program estimates?

3. What are the implications of including school fixed effects on precision of estimates?

4. Are fixed effects suitable in this setting?

5. What is the impact of the sample restrictions?
  • Years of data
  • Inexperienced teachers

Page 7

Prior Research Comparing Value-Added of Teachers Across Preparation Programs

• Models with Program and School Fixed Effects
  • New York City – Boyd et al. (2008)
  • Florida – Sass (2008)
  • Kentucky – Kukla-Acevedo et al. (2009)

• HLM Models
  • Texas – Mellor et al. (2010)
  • Louisiana – Noell et al. (2009)

Page 8

Data

• Analyze recent graduates (< 3 years of experience) from Florida elementary education teacher preparation programs
  • teaching in grades 4 and 5 during the 2000/01–2004/05 period
  • 33 preparation programs
  • 1 to 496 teacher graduates in tested grades/subjects
  • graduates from a single program working in 1 to 271 schools
  • over 550,000 students

• Sample also includes experienced teachers and those certified out of state or in Florida through alternative pathways.

Page 9

Fixed Effects Identification

• School fixed effects estimation is only possible if all preparation programs are linked to one another through the schools where their graduates teach

• Preparation programs do not need to be linked directly
  • as long as there are some new teachers in the school who graduated from other programs

• Regional clustering could lead to stratification
  • Work of Boyd et al. (2005) on the “draw of home” suggests graduates tend to teach in schools close to where they grew up or where they went to college

Page 10

Figure 1 – Estimated Probability of a Preparation Program Graduate Teaching at a School with at Least One Graduate from Another Program, as a Function of Distance from Program to School

The negative relationship indicates graduates are more likely to teach in schools closer to where they graduated.

Page 11

Figure 2 – Preparation Program and School Connections

Shade of line represents the strength of the connection: the number of graduates from a program going to that school.

Page 12

Figure 3 – Preparation Program Network Using a 5-Year Data Window

All preparation programs are connected, so school fixed effect estimation is possible
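This connectivity claim can be checked mechanically: school fixed effects are identified only if the program–school bipartite graph forms a single connected component. A minimal sketch with union-find, on invented placements rather than the Florida data:

```python
# Check whether preparation programs are all linked through shared schools.
# Placements are invented examples, not the paper's data.

def connected_programs(placements):
    """placements: list of (program, school) pairs, one per teacher.
    True if every program lies in one connected component of the
    program-school bipartite graph."""
    parent = {}

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    def union(a, b):
        for node in (a, b):
            parent.setdefault(node, node)
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[ra] = rb

    for program, school in placements:
        union(("p", program), ("s", school))

    roots = {find(("p", p)) for p, _ in placements}
    return len(roots) == 1

# A and B share school 1; B and C share school 2 -> all connected.
print(connected_programs([("A", 1), ("B", 1), ("B", 2), ("C", 2)]))  # True
# D appears only in school 3 with no other program -> disconnected.
print(connected_programs([("A", 1), ("B", 1), ("D", 3)]))  # False
```

A disconnected component means its programs can only be compared to each other, not to the rest of the state.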

Page 13

Model – Preparation Program Effectiveness

• We estimate a model of student achievement gains as a function of student characteristics, teacher experience, grade and year indicators, and program fixed effects

• Program effects are estimated relative to the average preparation program in Florida using the Stata felsdvregdm command

• We rank programs on effectiveness, and divide the rankings into quartiles

• We compare the rankings and ranking quartiles with and without school fixed effects
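The normalization can be sketched outside Stata as well. Below is a minimal simulated example (all data and the covariate are invented, and the school fixed effects and full covariate set are omitted) of a gain-score regression with sum-to-zero program coding, which mimics how felsdvregdm reports each program relative to the average program:

```python
# Sketch: gain-score regression with program effects as deviations from
# the mean program. Simulated data only; not the paper's specification.
import numpy as np

rng = np.random.default_rng(0)
n = 2000
prog = rng.integers(0, 4, n)                 # 4 hypothetical programs
true_dev = np.array([0.2, 0.0, -0.1, -0.1])  # deviations from the mean
free_lunch = rng.integers(0, 2, n).astype(float)
gain = true_dev[prog] - 0.1 * free_lunch + rng.normal(0.0, 1.0, n)

# Sum-to-zero coding: one column per program except the last; the last
# program's effect is minus the sum of the others, so every coefficient
# is a deviation from the average program effect.
S = np.column_stack([(prog == j).astype(float) - (prog == 3) for j in range(3)])
X = np.column_stack([np.ones(n), free_lunch, S])
beta, *_ = np.linalg.lstsq(X, gain, rcond=None)
dev = np.append(beta[2:], -beta[2:].sum())   # one deviation per program
print(np.argsort(-dev))                      # programs ranked best to worst
```

The estimated deviations sum to zero by construction, so a program's sign says whether it is above or below the state average.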

Page 14

Table 2 – Top Tier Preparation Programs

Program ID   No School FE:      With School FE:
             Rank   Quartile    Rank   Quartile
20            1      1           6      1
32            2      1          32      4
17            3      1           3      1
4             4      1           9      2
7             5      1          13      2
28            6      1           2      1
13            7      1           7      1
12            8      1          14      2
19           10      2           4      1
31           18      3           1      1
24           20      3           5      1
26           27      4           8      1

Page 15

Table 3 – Bottom Tier Preparation Programs

Program ID   No School FE:      With School FE:
             Rank   Quartile    Rank   Quartile
32            2      1          32      4
14           14      2          29      4
11           26      4          10      2
26           27      4           8      1
22           28      4          30      4
15           29      4          28      4
23           30      4          26      4
27           31      4          31      4
21           32      4          27      4
33           33      4          33      4

Page 16

Results – Preparation Program Rankings

• Rankings are significantly affected by the inclusion of school fixed effects in the value added model

• Of the 12 programs in the top quartile of rankings in either specification, 8 programs are in a different quartile of rankings in the other specification

• The bottom quartile of program rankings is more stable, with 6 programs in this quartile for both specifications.

• There are 2 programs that switch from the bottom quartile of rankings in one specification to the top quartile in the other
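The quartile instability can be tallied directly from the rows of Table 2, taking the quartiles as listed there:

```python
# Rows of Table 2: (program id, rank and quartile without school FE,
# rank and quartile with school FE).
rows = [
    (20, 1, 1, 6, 1), (32, 2, 1, 32, 4), (17, 3, 1, 3, 1),
    (4, 4, 1, 9, 2), (7, 5, 1, 13, 2), (28, 6, 1, 2, 1),
    (13, 7, 1, 7, 1), (12, 8, 1, 14, 2), (19, 10, 2, 4, 1),
    (31, 18, 3, 1, 1), (24, 20, 3, 5, 1), (26, 27, 4, 8, 1),
]
changed = [pid for pid, _, q_nofe, _, q_fe in rows if q_nofe != q_fe]
print(len(changed), changed)  # 8 of the 12 programs change quartile
```

This reproduces the 8-of-12 figure in the bullet above.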

Page 17

Results – Variance Inflation

• Schools where all new teachers came from a single program do not help identify preparation program effects in the school fixed effects model; such schools account for ~32% of our sample of teachers

• Dropping these teachers can greatly inflate the standard errors of the estimated program effects for some programs

• The standard errors of the preparation program estimates increase by an average of 16.5% after including school fixed effects

• The variance inflation is severe for 10 of the 33 programs, with standard errors increasing by over 20%
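The reason single-program schools drop out can be seen in a toy design matrix: when all of a school's new teachers come from one program, that program's indicator coincides with the school dummy and the design matrix loses rank. A small invented example:

```python
# Collinearity demo with invented teachers: school 1 mixes programs A/B,
# school 2 hires only program C.
import numpy as np

teachers = [(1, "A"), (1, "B"), (2, "C"), (2, "C")]
schools = sorted({s for s, _ in teachers})
progs = sorted({p for _, p in teachers})
D_school = np.array([[s == t for t in schools] for s, _ in teachers], float)
D_prog = np.array([[p == q for q in progs] for _, p in teachers], float)
X = np.hstack([D_school, D_prog])

# 5 columns, but rank 3: one deficiency because both dummy sets sum to a
# constant, and one more because program C's column equals school 2's.
print(np.linalg.matrix_rank(X))  # 3
```

Program C's effect is therefore absorbed by school 2's fixed effect and contributes nothing to the program comparison.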

Page 18

Homogeneity Assumption

• School fixed effects can only yield consistent estimates of program effectiveness if the teachers and schools that create the connections among programs do not differ systematically from other teachers and schools in the state

• Three tests of the homogeneity assumption:
  1. Are schools with graduates from 1 program different from schools with graduates from more than 4 programs?
  2. Are teachers who teach in schools with graduates from 1 program different from teachers who teach in schools with graduates from more than 4 programs?
  3. Are central schools, the ones that help the most in connecting preparation programs, different from other schools in the state?
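Test 1 amounts to a two-sample comparison of school characteristics between the two groups. A sketch with a Welch t statistic on simulated data (the group sizes and free-lunch shares are invented, not the paper's):

```python
# Welch two-sample t statistic comparing a school characteristic between
# single-program schools and 4+-program schools. Simulated data only.
import numpy as np

rng = np.random.default_rng(1)
single_prog = rng.normal(0.40, 0.1, 40)  # % free lunch, 1-program schools
multi_prog = rng.normal(0.70, 0.1, 25)   # % free lunch, 4+-program schools

def welch_t(a, b):
    """t statistic allowing unequal variances across groups."""
    va, vb = a.var(ddof=1) / len(a), b.var(ddof=1) / len(b)
    return (a.mean() - b.mean()) / np.sqrt(va + vb)

t = welch_t(single_prog, multi_prog)
print(round(t, 1))  # large in magnitude when the groups differ
```

A large |t| for several characteristics at once is what leads the authors to conclude the homogeneity assumption is likely violated.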

Page 19

Figure 4 – Central School Locations

Central schools have a disproportionately high influence in identifying program effects

Page 20

Results – Homogeneity Assumption

• Three tests of the homogeneity assumption:

  1. Schools different?
     • YES: schools with new teachers from 4+ preparation programs are larger, with higher % black and Hispanic students, lower test scores, and higher % free lunch

  2. Teachers different?
     • YES: teachers in schools with graduates from 4+ preparation programs are, on average, more likely to be black or Hispanic and have lower test scores and lower SAT scores

  3. Central schools different?
     • YES: larger schools, with higher % Hispanic, immigrant, and LEP students

Homogeneity Assumption Likely Violated

Page 21

Years of Data

• The data window length affects program connections
  • programs will have a tie through a school if they both have a graduate teaching in the school sometime during the window

• As we lengthen the window, more programs will have ties, making estimation possible
  • however, this requires the assumption that both school and program effects are constant over the entire window

• Other implications:
  • Variance inflation
  • Homogeneity assumption

Page 22

Results – Years of Data

• Identification is robust to shorter window lengths
  • Even with 3 years of data the school fixed effects can be identified
  • Restricting to 2 years of data leaves 3 preparation programs disconnected

• Variance inflation is worse
  • Due to an increase in the proportion of teachers in schools with graduates from a single program, who are not used to estimate program effects

• Characteristics of schools that contribute to program estimates with school fixed effects are very similar
  • Somewhat larger schools and higher % Hispanic

Page 23

Sample of Teachers

• In the results reported so far, only inexperienced teachers (< 3 years of experience) were included in the analysis

• This restriction is warranted if the impact of the preparation program dissipates as the teacher gains experience on the job

• We can include experienced teachers in the sample and restrict program effects to exclude these teachers
  • but experienced teachers change the connections between schools and preparation programs

• Experienced teachers will help identify the school fixed effects
  • This can result in reduced variance of program effects
  • Our ability to compare programs could rely on differences between schools in their experienced teachers

Page 24

Results – Sample of Teachers

• Preparation program ranking quartiles are unaffected by the inclusion of experienced teachers in models without school fixed effects

• Ranking quartiles in models with school fixed effects change for 8 out of 33 programs
  • 3 of these 8 changes are by more than 2 quartiles

• Including experienced teachers lowers program effect variances in models with school fixed effects by 13% on average

Page 25

Summary and Conclusions

• Good News for School Fixed Effects Models:
  • Despite regional clustering, Florida preparation programs are connected, so the use of school fixed effects is feasible
  • There is significant variation in school characteristics for graduates of any preparation program, so the use of school fixed effects is desirable

• Bad News for Models with School Fixed Effects:
  • Including school fixed effects inflates the variance of estimated program effects
  • Homogeneity assumption likely violated

• Preparation program effectiveness rankings differ significantly with and without school fixed effects

Page 26

Summary and Conclusions

• A 3-year data window and the use of school fixed effects may provide a reasonable compromise between bias and variance inflation

• However, there is no clean empirical method to identify a model with no bias, or a model that yields program effect estimates with the smallest MSE

• States will need to make a choice knowing that the choice may affect preparation program rankings and might yield biased estimates unless untestable assumptions hold

• States may need to consider whether value-added modeling alone can provide useful information about preparation programs