Comparing Growth in Student Performance

David Stern, UC Berkeley Career Academy Support Network

Presentation to Educating for Careers / California Partnership Academies Conference, Sacramento, March 4, 2011



TRANSCRIPT

  • Comparing Growth in Student Performance

    David Stern, UC Berkeley Career Academy Support Network

    Presentation to Educating for Careers / California Partnership Academies Conference, Sacramento, March 4, 2011

  • What I'll explain

    Why value added is the most valid way to compare academy students' progress with that of other students in the same school and grade

    How to compute value added

    Example of application to career academies

  • What questions do you have at the start?

  • What is value added?

    Starts with matched data for individual students at 2 or more points in time

    Uses students' characteristics and previous performance to predict most recent performance

    Positive value added means a student's actual performance is better than predicted

    If academy students on average perform better than predicted, the academy has positive value added
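The definition above can be sketched numerically: fit a regression of current scores on prior scores, take each student's residual (actual minus predicted), and average the residuals for academy students. All scores and academy flags below are hypothetical, and a real analysis would use more predictors than the prior score alone.

```python
# Minimal value-added sketch with hypothetical data: predict the
# current score from the prior score, then average the residuals
# of academy students.

def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

# prior-year and current-year scores for six students (hypothetical)
prior      = [300, 320, 340, 360, 380, 400]
current    = [310, 322, 355, 358, 395, 410]
in_academy = [True, True, True, False, False, False]

a, b = fit_line(prior, current)

# residual = actual performance minus predicted performance
residuals = [y - (a + b * x) for x, y in zip(prior, current)]

# academy value added = mean residual of academy students
academy = [r for r, flag in zip(residuals, in_academy) if flag]
value_added = sum(academy) / len(academy)
print(round(value_added, 2))  # positive: academy students beat prediction
```

With these made-up numbers the academy's mean residual comes out positive, i.e. its students did slightly better than their prior scores predicted.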

  • [Figure] Correlation of .93 between academic performance index and % low-income students in California school districts

  • Value added is a better measure than

    Comparing the average performance of 2 groups of students without controlling for their previous performance, because one group may have been high performers to start with

    Comparing this year's 11th graders (for example) with last year's 11th graders, because these are different groups of students!

  • Creates better incentives

    Reduces the incentive for an academy to recruit or select students who are already performing well

    Recognizes academies for improving the performance of students no matter how they performed in the past

    Provides a valid basis on which to compare student progress, and then ask why

  • What NOT to do

    DON'T attach automatic rewards or punishments to estimates of value added; use them as evidence for further inquiry

    DON'T rely only on test scores; analyze a range of student outcomes, e.g. attendance, credits, GPA, discipline, etc.

    DON'T use just 2 points in time; analyze multiple years if possible, and do the analysis every year

  • Recent reports

    National Academies of Science: Getting Value out of Value Added, http://www.nap.edu/catalog.php?record_id=12820

    Economic Policy Institute: Problems with the Use of Student Test Scores to Evaluate Teachers, http://epi.3cdn.net/b9667271ee6c154195_t9m6iij8k.pdf

  • How it's done

    Need matched data for each student at 2 or more points in time

    Accurately identify academy and non-academy students in each time period

    Use a statistical regression model to predict the most recent performance, based on students' characteristics and previous performance

  • Example: comparing teachers

    Each point on the graph shows one student's English Language Arts test score in spring 2003 (horizontal axis) and spring 2004 (vertical axis) for an actual high school

    The regression line shows the predicted score in 2004, given the score in 2003

    Students who had teacher #30 generally scored higher than predicted in 2004; this teacher had positive value added

  • Scatterplot of 2003 and 2004 English Language Arts scores at one high school

  • Scatterplot of 2003 and 2004 scores, with regression line

    Dots above the line represent students who scored higher than predicted in 2004. Dots below the line represent students who scored lower than predicted.

  • Most students with teacher 30 scored higher in 2004 than predicted by their 2003 score. (Callouts on the figure point to one student whose 2004 score was higher than predicted and one whose score was lower than predicted.)
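The teacher comparison works the same way: fit one school-wide regression of 2004 scores on 2003 scores, then take each teacher's value added as the average residual of that teacher's students. The scores and teacher assignments below are hypothetical.

```python
# Per-teacher value added sketch (hypothetical data): one school-wide
# OLS line, then mean residual grouped by teacher.
from collections import defaultdict

# (2003 score, 2004 score, teacher id)
students = [
    (320, 335, 30), (350, 370, 30), (380, 400, 30),
    (320, 318, 12), (350, 345, 12), (380, 381, 12),
]

xs = [s[0] for s in students]
ys = [s[1] for s in students]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
# school-wide regression line: predicted_2004 = a + b * score_2003
b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
     / sum((x - mx) ** 2 for x in xs))
a = my - b * mx

by_teacher = defaultdict(list)
for x, y, t in students:
    by_teacher[t].append(y - (a + b * x))  # residual: actual minus predicted

# teacher value added = mean residual of that teacher's students
value_added = {t: sum(r) / len(r) for t, r in by_teacher.items()}
print({t: round(v, 2) for t, v in value_added.items()})
```

In this made-up example teacher 30's students beat the school-wide prediction on average (positive value added) while teacher 12's fall short of it.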

  • Example using academies, in a high school with 4 career academies and 4 other programs:

    Programs 2, 4, 5, and 8 are career academies

  • Parents' education differs across programs

  • Student ethnicity also differs

  • Students in programs 4, 5, and 8 are

    less likely to have college-educated parents

    less likely to be white.

    Comparisons of student performance should take such differences into account.

  • Grade 11 enrollments, 2009-10

    Analysis focused on students in grade 11 who were present in at least 75% of classes.

  • Mean GPA during junior year, 2009-10

  • Mean 11th grade test scores, spring 2010

  • Mean 8th grade test scores for 2009-10 juniors

  • Juniors in programs 4 and 5 had lower grades and test scores.

    But comparing 11th grade test scores is misleading, because students who entered programs 4 and 5 in high school were already scoring lower at the end of 8th grade.

    A more valid comparison would focus on CHANGE in performance during 2009-10.

  • Numbers of students by change in English language arts performance level during 2009-10

    Only program 8 had more students whose performance level went up than students whose performance level went down.

    Performance levels: far below basic, below basic, basic, proficient, advanced.
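The tally above amounts to a simple count over ordered performance levels. A sketch, using hypothetical students rather than the school's actual data:

```python
# Count students whose performance level rose, fell, or stayed the
# same within each program. Level names come from the slide; the
# student records are hypothetical.
from collections import Counter

LEVELS = ["far below basic", "below basic", "basic", "proficient", "advanced"]
rank = {lv: i for i, lv in enumerate(LEVELS)}  # ordered, lowest to highest

# (program, level before 2009-10, level after)
students = [
    (8, "basic", "proficient"),
    (8, "below basic", "basic"),
    (8, "proficient", "proficient"),
    (4, "basic", "below basic"),
    (4, "proficient", "basic"),
]

tally = Counter()
for program, before, after in students:
    change = rank[after] - rank[before]
    direction = "up" if change > 0 else "down" if change < 0 else "same"
    tally[(program, direction)] += 1

# in this toy data, program 8 has more "up" than "down"; program 4 the reverse
```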

  • Change in GPA from grade 8 to 11

    Programs 1, 3, and 8 had the students with the highest GPAs in 8th grade.

    GPA in 11th grade was lower than in 8th grade for students in these 3 programs.

  • Predicting 2010 test score based on 2009 score

    Dots above the line represent students who scored higher than predicted in 2010. Dots below the line represent students who scored lower than predicted.

  • Predicting 11th grade GPA based on 8th grade GPA

    Dots above the line represent students whose GPA was higher than predicted. Dots below the line represent students whose GPA was lower than predicted.

  • Regression analysis uses prior performance along with other student characteristics to estimate each student's predicted performance in 2009-10.

    In this analysis, programs 2-8 are compared to program 1.

    A positive regression coefficient says that, on average, students in that program exceeded prediction more than students in program 1 did.
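A minimal sketch of this kind of regression, using hypothetical scores for just two programs: indicator (dummy) variables for every program except program 1 make program 1 the baseline, so each dummy's coefficient is that program's value added relative to program 1. A real analysis would include more student characteristics and report standard errors.

```python
# Dummy-coded value-added regression sketch: regress 2010 scores on
# 2009 scores plus program indicators, program 1 as baseline.
# All data below is hypothetical.
import numpy as np

# (program, 2009 score, 2010 score)
data = [
    (1, 340, 350), (1, 360, 368), (1, 380, 388),
    (8, 340, 360), (8, 360, 378), (8, 380, 398),
]

# one dummy column per program other than the baseline, program 1
programs = sorted({p for p, _, _ in data if p != 1})

# design matrix: intercept, prior score, program dummies
X = np.array([
    [1.0, prior] + [1.0 if p == q else 0.0 for q in programs]
    for p, prior, _ in data
])
y = np.array([float(cur) for _, _, cur in data])

coef, *_ = np.linalg.lstsq(X, y, rcond=None)

# coef[2] is program 8's coefficient: how much its students exceeded
# prediction relative to program 1 (positive = positive value added)
print(round(float(coef[2]), 2))
```

With these toy numbers program 8's coefficient is positive, i.e. its students gained more over their predicted scores than program 1's did.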

  • Value added results for test scores

    Only program 8 had positive value added compared to program 1.

    The only statistically significant differences from program 1 were for programs 2 and 4, both negative. In these two programs, students scored significantly lower than predicted.

  • Value added results for GPA

    Programs 3, 6, and 8 were significantly different from program 1. Average GPA was lower than predicted in these three programs.

  • Questions for this school

    Why did juniors' GPA in 2009-10 fall below prediction in programs 3, 6, and 8?

    Why did juniors' test scores in English language arts fall below prediction in programs 2 and 4?

    It is important to see whether these patterns persist for more than one year.

  • Conclusion

    Academy National Standards of Practice: "It is important to gather data that reflects whether students are showing improvement and to report these accurately and fairly to maintain the academy's integrity."

    Measuring value added will keep academies in the forefront of evidence-based practice.

  • What questions do you have now?