
Final Report - LEAP Innovations · 2019-08-12

Personalized Learning in Practice An Evaluation of Breakthrough Schools: Chicago

Final Report

WISCONSIN EVALUATION COLLABORATIVE


Preface to Personalized Learning in Practice An Evaluation of Breakthrough Schools: Chicago

As personalized learning gains momentum across the United States, the need for implementation research has never been stronger. As part of an answer, LEAP Innovations and Next Generation Learning Challenges proudly share Personalized Learning in Practice: An Evaluation of Breakthrough Schools: Chicago, a rigorous study of the LEAP approach to personalized learning. Conducted by the Wisconsin Evaluation Collaborative at the University of Wisconsin, the study explores the earliest outcomes of deep, full-school shifts toward personalized learning, as well as the insights that the process reveals.

All six schools studied were members of Breakthrough Schools: Chicago, NGLC's multi-year program in which schools worked with LEAP to plan and execute whole-school shifts toward student-centered learning models. The six schools featured in the report received implementation grant funding, planning support, and extensive professional learning to transform instruction across their entire schools.

Though personalized learning is rooted in decades of research, it is still a new, evolving approach, and we are relatively early in the process of studying and refining the discipline. When a school commits significant resources and planning to reimagining and restructuring its approach to learning, what does the result look like? What does it take to get there? What difference does it make for students, educators, and communities? And how does it change when schools are able to learn from and share with each other?

We're very glad to help begin answering these questions, and we're excited to say the study finds promising potential outcomes in schools where there is deep implementation of personalized learning. It's important to note that the WEC evaluators observed LEAP-partnered classrooms throughout the 2016-17 school year and into the following school year. Since then, each school's practice has evolved and advanced considerably, as has LEAP's approach.
This being said, for all the schools across the country that have explored similar journeys with personalized learning in the years since, we hope this study serves as both a validation of their goals and a powerful resource as they bring those goals to life. Thanks to brave, trailblazing schools like those featured in this report, schools no longer have to navigate the wilderness alone as they chart their own paths with personalized learning.

It's also important to note that while more traditional, academic data on school success is undeniably important, the ambition and vision of these schools extended far beyond what standardized test scores can reflect. Personalized learning is about building education systems that embrace all young people in their full, complex humanity, and while we're proud of the report's promising quantitative findings, the richness of this study lies in the evaluators' interviews and observations. These qualitative insights into the implementation process, ranging from what it takes to successfully bring personalized learning to life to the difference the approach makes for students and educators, are relevant and affirming to all educators considering journeys like those of these brave school teams.

In addition to NGLC's grant to bring Breakthrough Schools: Chicago to life, this program was made possible by generous contributions from The Joyce Foundation, The Chicago Public Education Fund, the Bill & Melinda Gates Foundation, and Northern Trust. We're incredibly grateful for their partnership and support.


Please contact Annalee Good (annalee.good@wisc.edu) with questions related to this report.

PERSONALIZED LEARNING IN PRACTICE

AN EVALUATION OF BREAKTHROUGH SCHOOLS: CHICAGO

Executive Summary

Daniel Marlin, Annalee Good, Jennifer Vadas and Richard Halverson

INTRODUCTION: PERSONALIZED LEARNING IN PRACTICE

A growing number of U.S. schools are now experimenting with the implementation of personalized learning, an instructional approach in which teachers and students co-create a learning environment with students' interests and needs at the center. School districts and, in some cases, states are in turn considering promising practices in personalized learning implementation, including the adoption of frameworks and other strategies to replicate whole-school models with evidence of efficacy.

Against that backdrop, practitioners are grappling with critical questions about the implementation of personalized learning frameworks: What should schools prioritize when implementing personalized learning? How do certain practices play out differently in certain contexts? How do local and environmental considerations affect implementation and impact? What does fidelity to a framework look like, and how might it impact efficacy? What are the prospects for replication?

The framework evaluated in this report, which was developed by nonprofit LEAP Innovations in collaboration with a network of schools in Chicago, conceptualizes a model of personalized learning comprising four components of learning and instruction: Learner Focused, Learner Led, Learner Demonstrated, and Learner Connected.

In 2014, Chicago was one of two cities chosen for funding for Breakthrough Schools: Chicago (BSC), a multi-year initiative in which cohorts of schools worked with LEAP to implement whole-school models for personalized learning.


As part of the initiative, BSC cohorts received:

• Planning grants to develop proposed models, or "blueprints";
• An additional round of grant funding to implement their proposed models;
• Ongoing workshops and professional development;
• Participation in a community of practice with other schools implementing personalized learning;
• Access to national experts on personalized learning.

The BSC initiative is funded through a partnership among Next Generation Learning Challenges, EDUCAUSE, the Bill & Melinda Gates Foundation, The Chicago Public Education Fund, the Joyce Foundation, and Northern Trust.

AN EVALUATION OF BSC: PURPOSE AND DESIGN

Beginning in Fall 2016, the Wisconsin Evaluation Collaborative (WEC) at the University of Wisconsin-Madison conducted an evaluation of personalized learning at the six LEAP BSC schools of the 2016-17 cohort (Gwendolyn Brooks College Preparatory Academy, Chicago International Charter School Irving Park, Disney II Magnet School, Patrick Henry Elementary School, Robert Lindblom Math and Science Academy, and Joseph Lovett Elementary School).

Consistent with calls for more in-depth studies of implementation, the purpose of this evaluation project was to map and explore variables at play in the implementation of the LEAP Framework within the BSC schools, as well as potential patterns in student outcomes. Our guiding evaluation questions included:

1. What are the proposed personalized learning models in Breakthrough Schools: Chicago, and how are they aligned to the LEAP Framework?

2. What are patterns in practice and readiness in the first year of implementation at the school site level, and is implementation consistent with intended and proposed program models?

3. What are patterns in academic outcomes for students engaged in personalized learning in Breakthrough Schools: Chicago?

As with any educational innovation in its design, development, and implementation stages, evaluation is critically important as a systematic way to capture salient conditions, relationships, resources, and processes. Yet evaluating potential outcomes and impact at this stage can be challenging, due to limitations such as small sample sizes or variability in the "treatment" (i.e., what personalized learning actually looks like in schools).

Toward that end, we examined patterns in outcomes at the school and student levels with the goal of informing the field with a more granular understanding of personalized learning implementation variables, including educators' perspectives. Our mixed-method design was based on (a) extensive qualitative fieldwork, including classroom observations, interviews, survey text, and artifact collection; and (b) quantitative analysis of the LEAP Personalized Learning Survey and standardized test scores at the cohort, school, and grade level, using data provided by Chicago Public Schools. Descriptive analysis of academic outcomes provided insights into potentially promising patterns, but we ultimately determined that neither the descriptive nor the quasi-experimental analyses allowed causal claims about the impact of personalized learning on student outcomes.


We explored connections between the LEAP Framework, implementation patterns, and student outcomes in ways that, we hope, establish a foundation for future research on the effects of personalized learning on student learning. A detailed description of our evaluation design can be found in the full report.

SUMMARY OF FINDINGS: IMPLEMENTATION

A deeper look into implementation practices offers insights and potential strategies for schools and districts interested in personalized learning.

• In the schools where we observed deep implementation of personalized learning practices, we tended to also see promising outcomes in terms of student learning. Schools can feel confident that personalized learning is a promising approach to improving student learning.

• But conditions matter to implementation, and certain conditions matter more than others: time (e.g., a full planning year), funding (e.g., pay for substitute teachers to support teacher professional development), capacity (e.g., the degree to which students, families, and teachers develop understanding of personalized learning concepts), and the digital tools available to students and teachers (e.g., hardware and software that are compatible).

• Teacher collaboration, including across schools, was central to adopting a personalized learning model. Implementation benefited when schools prioritized time and space for teachers to regularly plan, implement, and reflect together on their personalized learning initiatives.

Summary of data collection and analysis

• 49 school-wide and classroom observations
• 6 observations of school-level meetings
• 24 interviews with 25 school-level staff; 6 conferring discussions
• Document analysis of "blueprints," curricular materials, etc.
• Analyses of LEAP Personalized Learning surveys
• Descriptive analysis using publicly available quantitative data from the Chicago Public Schools website
• Descriptive and quasi-experimental analysis of student-level data from Chicago Public Schools


• Examples of practices supporting or furthering implementation of personalized learning included:

o "Genius Hour," an example of project-based learning in which students select a long-term project based on their interests. Genius Hour is modeled on Google's idea that people should have time in the day to work on projects they love. For example, one student researched the Pokémon card game, while another created a presentation on how diseases spread. Students take notes while researching in a Genius Time Notebook, then present to their classmates and give peer feedback to one another.

o "Flex Fridays," in which elementary school students can choose different academic or nonacademic courses based on their interests. Some students even taught courses themselves, which also involved lesson planning. For example, Spanish-speaking students in one school taught Spanish language classes to their peers.

o "Colloquia," in which high school students sign up for three flex classes each week depending on what they need help with or are interested in.

o Purposeful use of multi-grade classrooms, allowing more advanced students to move more quickly through material while also providing necessary support to other students.

o Involving families in crafting a school-wide vision plan and working with students.

• Models of personalized learning in practice often changed over the course of implementation in response to shifts in surrounding conditions (e.g., funding, staff turnover, district calendar changes) and needs (e.g., students need more practice with online platforms, parents/caregivers need more frontloading on the concept of competency-based progression).

• Personalized learning practices are not implemented in isolation, but are best understood in relationship to one another in what we describe as "practice clusters." This evaluation suggests schools may want to consider which practices tend to work well together, purposefully plan to implement clusters to help embed personalized learning, and assess the actual implementation of clusters through relevant survey questions.

"Flexible learning required reflection on the part of students and teachers. Working with students on how to self-regulate, self-direct."

-Assistant Principal


SUMMARY OF FINDINGS: STUDENT ACADEMIC OUTCOMES

Although there are important considerations and real challenges in how best to estimate impact, this evaluation suggests a potentially promising influence of personalized learning on student outcomes.

Despite the inability to make strong causal claims with this evaluation, our analysis of impact, in combination with detailed descriptive analysis of academic performance, holds out the possibility that personalized learning improves students' academic achievement.

In particular, descriptive analyses of students’ academic outcomes show interesting and potentially promising patterns:

• The proportions of elementary school students meeting math and reading growth targets in BSC sites increased in the first year of participation in BSC, both overall and relative to non-BSC sites.

o 58% of elementary students in BSC sites initially met growth targets on the universal math assessment (NWEA MAP), increasing to 63% in the first year of being in the BSC cohort. This compares to 52% of students in non-BSC schools initially meeting growth targets, increasing to 55%.

o BSC and non-BSC schools started with approximately 58% of elementary students meeting growth targets on the NWEA MAP reading assessment. This increased to 61% after the first year of being in the BSC cohort, while non-BSC schools decreased to 56%.

• BSC schools implementing personalized learning in middle school grades, which started with already high average attainment levels in reading or math, maintained high attainment levels during the first year of being a BSC school.

• BSC schools implementing personalized learning in elementary or middle school grades, which started with low average attainment levels in reading or math, improved in the first year of being a BSC school.

Evaluations of dynamic programs are complex, and given the constraints we must be careful about making claims about the direct (and causal) effect of an initiative on something like academic achievement. That said, focused and detailed descriptive work can provide hints of where to look for potential evidence of impact.
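The kind of descriptive growth-target comparison reported above can be sketched in a few lines. This is a minimal, hypothetical illustration: the record structure, field names, and toy values are assumptions for demonstration, not CPS data or the evaluators' actual code.

```python
# Hypothetical sketch of a descriptive growth-target comparison.
# Field names ("cohort", "year", "met_growth_target") are assumed for
# illustration only; they do not reflect the actual CPS data schema.

def share_meeting_growth(records, cohort, year):
    """Proportion of students in a cohort/year who met their growth target."""
    group = [r for r in records if r["cohort"] == cohort and r["year"] == year]
    if not group:
        return None  # no students in this cell
    return sum(r["met_growth_target"] for r in group) / len(group)

# Toy data: two students per cell, purely illustrative.
records = [
    {"cohort": "BSC", "year": "baseline", "met_growth_target": True},
    {"cohort": "BSC", "year": "baseline", "met_growth_target": False},
    {"cohort": "BSC", "year": "year1", "met_growth_target": True},
    {"cohort": "BSC", "year": "year1", "met_growth_target": True},
]

print(share_meeting_growth(records, "BSC", "baseline"))  # 0.5
print(share_meeting_growth(records, "BSC", "year1"))     # 1.0
```

Comparing these shares across years and between BSC and non-BSC cells is purely descriptive; as the report stresses, a rise in the BSC share does not by itself support a causal claim.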

The school "attainment" level compares the average spring NWEA MAP test scale score for a particular school to the national average score. Schools are then ranked and assigned a corresponding percentile point.
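The ranking step behind the attainment metric can be illustrated with a short sketch. The exact convention used here (percent of schools with a strictly lower mean score) and all school names and scores are assumptions for illustration, not the report's actual method or data.

```python
# Illustrative sketch of an attainment-style percentile ranking:
# rank schools by mean spring scale score and assign each a percentile point.
# The ranking convention and all data below are hypothetical.
from statistics import mean

def attainment_percentiles(school_scores):
    """Map each school to the percent of schools with a strictly lower mean score."""
    means = {school: mean(scores) for school, scores in school_scores.items()}
    n = len(means)
    return {
        school: 100 * sum(m < means[school] for m in means.values()) / n
        for school in means
    }

school_scores = {
    "School A": [210, 215, 220],  # mean 215
    "School B": [200, 205, 210],  # mean 205
    "School C": [220, 225, 230],  # mean 225
}
print(attainment_percentiles(school_scores))  # School B ranks lowest (0.0)
```

A school's percentile can shift even when its own scores are flat, because the ranking depends on how every other school performs; this is one reason the report pairs attainment with growth measures.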


RECOMMENDATIONS FOR PERSONALIZED LEARNING IN PRACTICE

Implementing and evaluating personalized learning is complex work, but the evidence shows promising connections among the depth of implementation, surrounding conditions, and outcomes for students.

In addition to formative feedback provided to LEAP and the BSC, we also offer broad recommendations to the field, drawn from our evaluation findings:

• Give purposeful attention to the persistent surrounding conditions (e.g., policy contexts, staff turnover, available technology, budget structures) that impact implementation when designing models, funding structures, and professional development approaches.

• Focus on understanding the nuances of implementation and why it might look different at different schools, which can inform the planning process for schools considering personalized learning models.

• Structure and support opportunities for schools to educate one another on their own promising practices, drawing on the persistent theme of collaboration in the data.

• Consider which practices tend to work well together, and purposefully plan to implement practice clusters to help embed personalized learning.

• Draw on suggestive descriptive outcome data and explore the personalized learning practices of schools showing strong implementation and outcomes.

• As with other school-wide change models, defining and capturing the "dosage" of personalized learning is complicated by differences within and between schools. This variability lies not only in the scope of implementation (e.g., classroom, grade, or school level) but also in the nature of the programming itself (e.g., flexible seating combined with station rotation versus a 1:1 device initiative where all students use personalized learning plans to complete a project). Schools and districts must develop a common understanding or definition of personalized learning practices early in the implementation process.

• When considering a research or evaluation design:

o Think carefully about how to structure observations of classroom practice, given that some elements (e.g., "growth mindset") are challenging to observe and many are found in "practice clusters." Also consider how to leverage teachers' observations of one another's practice.

o Given the limitations of quasi-experimental estimates of impact, explore descriptive analyses of quantitative outcomes.

The purpose of this evaluation was to draw on rigorous, mixed-method approaches to understand the implementation of personalized learning in the context of the Breakthrough Schools: Chicago initiative, as well as to examine potential patterns in student outcomes. Our key insights can guide educators, schools, and districts interested in personalized learning, and strengthen the call for nuanced studies of implementation to better understand the conditions and practices that matter most to student learning.

Please see the full evaluation report for a detailed description of the analysis behind these key insights.


Personalized Learning in Practice

An Evaluation of Breakthrough Schools: Chicago

Final Report

Daniel Marlin [email protected]

Annalee Good annalee.good@wisc.edu

Jennifer Vadas [email protected]

Richard Halverson [email protected]


Table of Contents

List of Tables .......... iv
List of Diagrams and Figures .......... vi
Acknowledgments .......... vii
Evaluation Design .......... 1
Findings for Evaluation Question 1: What are the proposed personalized learning models in Breakthrough Schools: Chicago and how are they aligned to the LEAP Framework? .......... 5
Findings for Evaluation Question 2: What are patterns in practice and readiness in the first year of implementation at the school level, and is implementation consistent with intended and proposed program models? .......... 9

Implementation Conditions and Challenges .......... 9
Commonly Observed Personalized Learning Practices and LEAP Framework Components .......... 12
Practice Clusters .......... 16
Practice Clusters and LEAP Surveys .......... 20

Student Perceptions of Personalized Learning .......... 20
Teacher Perceptions of Personalized Learning .......... 26
Joint Perceptions of Personalized Learning .......... 29

Collaboration among Educators .......... 30
Findings for Evaluation Question 3: What are patterns in academic outcomes for Breakthrough Schools: Chicago students engaged in personalized learning? .......... 32

Evaluating Student Academic Outcomes in Breakthrough Schools: Chicago .......... 32
Data .......... 33
Patterns in Student Academic Performance .......... 34

Looking at School Attainment with Growth in Student Test Scores .......... 37
Distribution of Improvement in Student Test Scores .......... 39
Growth in Elementary and Middle School and by "Where Students Start" .......... 40
Elementary School (Grades 3–5) .......... 40
Middle School (Grades 6–8) .......... 42

Discussion: What does strong personalized learning implementation look like? .......... 45
Disney II .......... 45
Lovett .......... 45
Henry .......... 46
Lindblom, Brooks, and Irving Park .......... 46

Recommendations for Programming and Evaluation Design .......... 48
Appendix A: Qualitative Technical Appendix .......... 50

Qualitative Data Collection .......... 50
Qualitative Data Analysis .......... 52

Appendix B: Quantitative Technical Appendix .......... 55
Survey Analysis .......... 55
Quasi-experimental Analysis .......... 56

Description of the Selection into Treatment .......... 56
Selection of a Control Group .......... 57

Descriptive Analysis .......... 60
Notes on Analysis of Attainment .......... 60
Notes on Analysis of Growth .......... 62


Notes on Analysis of Outcomes by Pretest Quartile .......... 65
Appendix C: Excerpts from Original Evaluation Plan .......... 74
Appendix D: Data for Survey Figures .......... 76
Appendix E: School Profiles .......... 91

Gwendolyn Brooks College Preparatory Academy .......... 91
Personalized Learning in Context .......... 91
General Personalized Learning Survey Data .......... 93
Relationship to LEAP Framework .......... 94
Demographics and Academic Outcomes 2016-17 (from publicly available CPS data) .......... 100
Potential Areas for Reflection .......... 101

Irving Park .......... 102
Personalized Learning in Context .......... 102
General Survey Data .......... 104
Relationship to LEAP Framework .......... 105
Demographics and Academic Outcomes 2016-17 (from publicly available CPS data) .......... 115
Potential Areas for Reflection .......... 116

Disney II Magnet School .......... 118
Personalized Learning in Context .......... 118
General Survey Data .......... 120
Relationship to LEAP Framework .......... 122
Demographics and Outcomes 2016-17 .......... 132
Potential Areas for Reflection .......... 134

Patrick Henry Elementary School .......... 135
Personalized Learning in Context .......... 135
General Survey Data .......... 137
Relationship to LEAP Framework .......... 138
Demographics and Outcomes 2016-17 .......... 149
Potential Areas for Reflection .......... 150

Lindblom Math and Science Academy .......... 151
Personalized Learning in Context .......... 151
General Survey Data .......... 152
Relationship to LEAP Framework .......... 153
Demographics and Outcomes 2016-17 .......... 163
Potential Areas for Reflection .......... 164

Joseph Lovett Elementary School .......................................................................................................... 165 Personalized Learning in Context ...................................................................................................... 165 General Survey Data .......................................................................................................................... 167 Relationship to LEAP Framework ...................................................................................................... 169 Demographics and Outcomes 2016-17 ............................................................................................. 179 Potential Areas for Reflection............................................................................................................ 180

Final Report - LEAP Innovations · 2019-08-12 · Please contact Annalee Good (annalee.good@wisc.edu) with questions related to this report.


List of Tables

Table 1: Evaluation Activities ........ 2
Table 2: Evaluation Questions ........ 3
Table 3: Breakthrough Schools: Chicago ........ 3
Table 4: Personalized Learning Practices ........ 6
Table 5: Types of Challenges Identified ........ 10
Table 6: Personalized Learning Practices Observed in Each School ........ 13
Table 7: Summary of Observed Personalized Learning Practices in Each School ........ 13
Table 8: Activities Observed in Each School at Least Twice, by LEAP Framework Component ........ 14
Table 9: Infrequently Observed LEAP Framework Activities and Personalized Learning Practices ........ 15
Table 10: Commonly Observed Practice Clusters, by School ........ 17
Table 11: Personalized Learning Practices Most Frequently Included in Clusters ........ 19
Table 12: Changes in Agreement, Targeted Instruction, by School ........ 29
Table 13: Changes in Agreement, Competency-based Progression, by School ........ 30
Table 14: Number of BSC per Grade ........ 33
Table 15: Number of BSC versus Non-BSC Students, by Grade ........ 34
Table 16: BSC Math Attainment Percentiles, by Grade Level and Subject ........ 36
Table 17: Percentage of Full-year Students in BSC and Non-BSC Sites Who Met Expectations in Math and Reading Test Scores ........ 36
Table 18: Math School Attainment and Students Meeting Test Score Expectations ........ 38
Table 19: Reading School Attainment and Students Meeting Test Score Expectations ........ 39
Table 20: MAP Spring 2016 to Spring 2017 Lower Quartile, Median, and Upper Quartile in Student Growth Percentile for BSC, non-BSC, and Each School ........ 40
Table 21: Grades 3–5 School Quartile of Student Growth Percentile ........ 41
Table 22: Elementary (Grades 3–5) BSC and non-BSC Students Whose Math and Reading Scores Met Expectations by Pre-test Score Quartile ........ 42
Table 23: Grades 6–8 School Quartile of Student Growth Percentile ........ 43
Table 24: Grades 6–8 BSC and non-BSC Students Whose Math and Reading Scores Met Expectations by Pre-test Score Quartile ........ 43
Table A-1: Personalized learning practices described in blueprints ........ 50
Table A-2: Grades and subjects observed in each school ........ 51
Table A-3: Code tree ........ 52
Table B-1: Student (Grades 4-8) and teacher (Grades K-8) survey respondents ........ 55
Table B-2: Treatment effect estimates for combined grades for elementary schools, middle schools, and all schools, BSC versus non-BSC ........ 59
Table B-3: Treatment effect estimates by grade: BSC vs. non-BSC ........ 59
Table B-4: National School Attainment Percentile in BSC sites in math in spring 2016 and 2017; grade level ........ 61
Table B-5: National School Attainment Percentile in BSC sites in reading in spring 2016 and 2017 ........ 61
Table B-6: Percentage of full-year students in BSC and non-BSC sites who met their growth projection in reading and math in spring 2017 ........ 63
Table B-7: Percentage of full-year elementary (grade 3 to 5) students in BSC and non-BSC sites who met their growth projection in reading and math in spring 2016 and 2017 ........ 64
Table B-8: Percentage of full-year middle school students in BSC and non-BSC sites who met their growth projection in reading and math in spring 2016 and 2017 ........ 64
Table B-9: MAP Math Spring to Spring Lower Quartile, Median, and Upper Quartile in Student Growth Percentile for BSC, non-BSC, and each site ........ 66


Table B-10: MAP Reading Spring to Spring Lower Quartile, Median, and Upper Quartile in Student Growth Percentile for BSC, non-BSC, and each site ........ 66
Table B-11: Percentage of full-year elementary (grade 3 to 5) students in BSC and non-BSC sites who met their growth projection in math in spring 2016 and 2017 ........ 67
Table B-12: Elementary grades MAP Math Spring to Spring Lower Quartile, Median, and Upper Quartile in Student Growth Percentile for BSC, non-BSC, and each site ........ 68
Table B-13: Percentage of full-year elementary (grades 3 to 5) students in BSC and non-BSC sites who met their growth projection in reading in spring 2016 and 2017 ........ 69
Table B-14: Elementary grades MAP Reading Spring to Spring Lower Quartile, Median, and Upper Quartile in Student Growth Percentile for BSC, non-BSC, and each site ........ 70
Table B-15: Percentage of full-year middle school students in BSC and non-BSC sites who met their growth projection in math in spring 2016 and 2017 ........ 70
Table B-16: Middle school (grades 6 to 8) MAP Math Spring to Spring Lower Quartile, Median, and Upper Quartile in Student Growth Percentile for BSC, non-BSC, and each site ........ 71
Table B-17: Percentage of full-year middle school (grades 6 to 8) students in BSC and non-BSC sites who met their growth projection in reading in spring 2016 and 2017 ........ 72
Table B-18: Middle school (grades 6 to 8) MAP Reading Spring to Spring Lower Quartile, Median, and Upper Quartile in Student Growth Percentile for BSC, non-BSC, and each site ........ 73


List of Diagrams and Figures

Diagram 1: Key Conditions for Implementation ........ 10
Figure 1: Targeted Instruction, Student Perception, by School ........ 21
Figure 2: Competency-based Progression, Student Perception, by School ........ 21
Figure 3: Targeted Instruction, Student Perception, by School, Restricted Sample ........ 22
Figure 4: Competency-based Progression, Student Perception, by School, Restricted Sample ........ 23
Figure 5: Targeted Instruction, Student Perception, by Grade Level ........ 24
Figure 6: Competency-based Progression, Student Perception, by Grade Level ........ 24
Figure 7: Targeted Instruction, Student Perception, by Grade Level, Restricted Sample ........ 25
Figure 8: Competency-based Progression, Student Perception, by Grade Level, Restricted Sample ........ 26
Figure 9: Targeted Instruction, Teacher Perception, by School ........ 27
Figure 10: Competency-based Progression, Teacher Perception, by School ........ 27
Figure 11: Targeted Instruction, Teacher Perception, by Grade Level ........ 28
Figure 12: Competency-based Progression, Teacher Perception, by Grade Level ........ 29


Acknowledgments

The authors would like to thank the teachers and administrators in the Breakthrough Schools: Chicago cohort for their willingness to share their experiences and time. We also would like to thank colleagues at WEC, including Emily Cheng for assisting with quantitative modeling and analysis, Dr. Jed Richardson for assisting with quantitative modeling, and Karen Faster for editing the final report.


Personalized Learning in Practice: An Evaluation of Implementation, Practices, and Outcomes

A growing number of U.S. schools are adopting personalized learning, an instructional approach in which teachers and students co-create learning environments with students' interests and needs at the center. The framework for personalized learning evaluated in this report, developed by the nonprofit LEAP Innovations in collaboration with a network of schools in Chicago, conceptualizes a model of personalized learning composed of four components of learning and instruction: Learner Focused, Learner Led, Learner Demonstrated, and Learner Connected.

In 2014, Chicago was one of two cities chosen for funding, and Breakthrough Schools: Chicago (BSC) was established as a multi-year initiative in which cohorts of schools worked with LEAP to implement whole-school models for personalized learning.

Beginning in Fall 2016, the Wisconsin Evaluation Collaborative (WEC) at the University of Wisconsin-Madison conducted an evaluation of implementation and outcomes at the six LEAP Breakthrough Schools: Chicago in the 2016-17 cohort.

The WEC team studied six Chicago public and charter schools: Gwendolyn Brooks College Preparatory Academy, Chicago International Charter School Irving Park, Disney II Magnet School, Patrick Henry Elementary School, Robert Lindblom Math and Science Academy, and Joseph Lovett Elementary School. A seventh school did not continue into the 2017-18 school year.

The following report details the evaluation design, the personalized learning models schools proposed in their applications to become a BSC, how the schools implemented personalized learning, patterns in academic outcomes for BSC students engaged in personalized learning, and a set of recommendations to guide practitioners engaged in personalized learning initiatives.

Evaluation Design

Findings in this report draw upon a mixed-method evaluation of the implementation of personalized learning in the 2016-17 cohort of BSC and patterns in student outcomes in these same cohort schools:

• Qualitative design: See Appendix A for a detailed description of the qualitative methodology. In summary, the WEC team conducted extensive qualitative fieldwork, including classroom observations, interviews, and artifact collection. Highlights include:

o Starting in Fall 2016 and continuing through Fall 2017, WEC conducted five rounds of school site visits across the BSC cohort, including schoolwide and classroom observations, interviews with administrators and educators, and observations of and feedback from team meetings of teachers' professional learning communities. At these teacher meetings, we collected feedback by asking teachers to respond to two questions: "Was this a typical meeting?" and "How did this meeting build your capacity to implement personalized learning in your classroom?" During evaluation visits, team members occasionally spoke informally with students to get an idea of the activities they were working on and their perceptions of personalized learning in their classrooms, but never collected any identifiable information from them.

o LEAP and WEC held a conferring session with BSC school administrators and teachers in the summer of 2017 to review preliminary findings and capture reflections by school staff on preliminary patterns in the evaluation of implementation.

• Quantitative design: See Appendix B for a detailed description of the quantitative methodology. In summary, the evaluation included analysis of the following types of quantitative data:

o The LEAP Personalized Learning Survey, conducted by LEAP to gather student and teacher perceptions of personalized learning. LEAP staff suggested survey items for the analysis that would directly connect to the learning practices identified in the qualitative work.

o Demographic characteristics and standardized test scores at the cohort, school, and grade levels, with Spring 2016 as the pretest and Spring 2017 as the posttest.

While analyzing quantitative data, WEC staff engaged in extensive conversations with LEAP staff about potential statistical models to estimate LEAP's impact on academic outcomes. Descriptive analysis of academic outcomes provided insights into potentially promising patterns, but WEC ultimately determined that neither the descriptive nor the quasi-experimental analysis supports causal claims about personalized learning.
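For readers unfamiliar with this class of models, a covariate-adjusted value-added regression is one common starting point for such an analysis. The equation below is an illustrative sketch of that general form, not the exact specification WEC estimated:

```latex
% Illustrative value-added specification (a sketch, not WEC's actual model):
% a student's spring 2017 score regressed on the spring 2016 pretest,
% an indicator for BSC enrollment, and student covariates.
Y_{i,2017} = \beta_0 + \beta_1 Y_{i,2016} + \tau\,\mathrm{BSC}_i + X_i'\gamma + \varepsilon_i
```

Here \(Y\) denotes spring test scores, \(\mathrm{BSC}_i\) indicates enrollment in a Breakthrough School, \(X_i\) collects student covariates, and \(\tau\) is the coefficient of interest. Because schools applied and were selected into the program rather than being randomly assigned, \(\tau\) in a model of this kind describes an adjusted difference rather than a causal effect, consistent with the report's conclusion.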

The evaluation began in the summer of 2016. Table 1 summarizes evaluation activities for August 2016 through July 2018.

Table 1: Evaluation Activities

Data Collection and Analysis:
● 49 schoolwide and classroom observations (see Table 2)
● Six observations of school-level meetings
● 24 interviews with 25 school-level staff; six conferring discussions
● Document analysis of blueprints, curricular materials, etc.
● Analysis of personalized learning surveys
● Quantitative analysis of public data
● Quantitative analysis of student-level data from Chicago Public Schools

Communication and Reporting:
● Weekly check-in meetings between evaluation and programming teams
● Progress/interim reports in February and October 2017 and March 2018
● Personalized learning workshop and co-interpretation session with school staff in June 2017
● Ongoing discussions regarding survey analysis and quantitative modeling

Evaluation Design:
● Evaluation plan drafting/refining
● Modeling discussions for quantitative analyses with LEAP team and advisory board members

The evaluation plan in Appendix C details the initial evaluation design, guided by the evaluation questions in Table 2.


Table 2: Evaluation Questions

1. What are the proposed personalized learning models in BSC, and how are they aligned to the LEAP Framework?
   Data sources: document analysis; observations; interviews; LEAP personalized learning survey

2. What are patterns in practice and readiness in the first year of implementation at the school level, and is implementation consistent with intended and proposed program models?
   Data sources: document analysis; observations; interviews; surveys

3. What are patterns in academic outcomes for students engaged in personalized learning in Breakthrough Schools: Chicago?
   Data sources: quantitative administrative data; LEAP personalized learning survey

These questions guided the examination of each school, as well as of patterns across the entire 2016-17 BSC cohort of six schools that WEC staff observed. Table 3 includes general information about each school; more information about each school, along with a profile of its personalized learning practices, can be found in Appendix E.

Table 3: Breakthrough Schools: Chicago

School | Type | Grades in which personalized learning occurred(a)
Gwendolyn Brooks College Preparatory Academy | Selective enrollment secondary school | 7
Chicago International Charter School Irving Park | Kindergarten–8 charter school | 3–4, 6–8
Disney II Magnet School | Kindergarten–12 magnet school | Kindergarten–8
Patrick Henry Elementary School | Kindergarten–6 neighborhood school | 2–6
Robert Lindblom Math and Science Academy | Selective enrollment high school | Classroom-dependent
Joseph Lovett Elementary School | Kindergarten–8 neighborhood school | 2–5

(a) WEC staff identified these grades through interviews and then confirmed them with LEAP staff.

As part of their application to join the BSC cohort, schools submitted blueprints of their intended personalized learning activities. Beyond being an application requirement, the blueprints helped schools spell out their plans for incorporating personalized learning in a way that aligned with their pedagogical philosophies, as well as how they planned to communicate, fund, and structure personalized learning opportunities. All blueprints included the following elements:


1) mission, vision, core values, non-negotiables, and goals;
2) analysis of opportunities and gaps;
3) overview of "design anchors," principles to guide the school in its implementation of personalized learning;
4) student narrative describing how a typical student would experience personalized learning at the school;
5) elements of the school's proposed model for personalized learning;
6) pilot and plan; and
7) implementation plan.

To understand how schools intended to use personalized learning, the evaluation team undertook a thorough review of each school’s blueprint and drafted a logic model for each school. From this review, the team collected the personalized learning models and practices each school said it would pursue (both within the application section about the school’s model and elsewhere in the document) and added them as “tags” to the LEAP Framework within an online observation platform used by LEAP called Teachboost. The evaluation team then used these tags to code classroom observations and notes from interviews and teacher team meetings.1

1 In the months after WEC’s observations, LEAP Innovations updated some of the language in the LEAP Framework; this report incorporates the terms in use during the evaluation. The new language is available at http://leaplearningframework.org/.


Findings for Evaluation Question 1: What are the proposed personalized learning models in Breakthrough Schools: Chicago, and how are they aligned to the LEAP Framework?

Personalized learning practices in the BSC schools and the degree of their alignment to the LEAP Framework formed the basis of the evaluation’s qualitative analyses. LEAP Innovations uses the framework to help schools conceptualize and identify personalized learning strategies and practices to implement in their schools.

The LEAP Framework and Its Four Components

Learner Led = Students help shape their individual learning paths and monitor their own progress.

Learner Focused = Students consider their backgrounds, passions, personalities, and needs to experience learning that is relevant and inclusive.

Learner Demonstrated = Students progress based on competency, no longer limited by classrooms that rush them or ask them to wait.

Learner Connected = Students supplement classroom learning with real-world experiences, enabling them to thrive in a relationship-driven world.

LEAP Innovations developed its framework at the beginning of this evaluation process, and LEAP staff were interested to know the degree of alignment of BSC blueprints and practices with the framework. The framework provides scaffolding to support understanding of how and to what degree schools incorporate personalized learning and includes specific activities within all components except for Learner Connected.

Table 4 compiles the personalized learning practices from the BSC blueprints, defining each practice and listing the schools that proposed it. This summary provides a picture of schools' intended models and gauges the prevalence of certain practices in schools' conceptualizations of personalized learning. (The next section details the prevalence of certain practices in schools' implementation of personalized learning.) For example, all six BSC identified the same four personalized learning practices:

• Personalized/learner focused/targeted instruction

• Personal learning paths

• Competency or proficiency-based progression

• Flexible learning environments/schedules

Not surprisingly, LEAP Innovations’ suggested blueprint structure and elements highlighted certain personalized learning practices, including: “student agency,” “learner profiles,” “competency-based progression,” and “flexible learning environments.” The latter three in particular map directly onto the four most common personalized learning practices identified by BSC.

Takeaways for proposed models:
• Many of the BSC schools identified similar personalized learning practices when articulating their blueprints.
• Many of the personalized learning practices explicitly aligned to the LEAP Framework.


Table 4: Personalized Learning Practices

Personalized Learning Practice | Definition | Number of Schools
Personalized/learner focused/targeted instruction | Instruction is personalized for students based on their needs. | 6 (all schools)
Personal learning paths | Students have their own distinct learning paths or plans. | 6 (all schools)
Competency- or proficiency-based progression | Students progress in their learning based on competency and mastery of material. | 6 (all schools)
Flexible learning environments/schedules | Students have the opportunity to learn where and when they want. | 6 (all schools)
Learner profiles | Students have profiles of their interests and learning styles. | 3
Content area specialized classrooms | Classrooms are specialized by subject. | 1
Mixed-age grouping | Students learn in classrooms mixed by age or grade level. | 3
Collaborative learning models/labs | Students learn in classrooms or lab spaces designed to foster collaboration. | 3
Project-based learning | Students learn by working on projects suited to their learning needs, styles, and interests. | 3
Data-driven consultation | Staff and students use data to understand needs and develop learning targets. | 1
Students as teachers (peer instruction) | Students learn by teaching each other. | 1
Social emotional and/or noncognitive learning | Students develop nonacademic skills through personalization. | 4
Positive behavioral interventions and supports (PBIS) and restorative practices | Students engage in restorative practices as a result of personalization. | 3
Understanding by design | Educators set goals and work backward to identify supporting instructional practices (backward-mapping). | 2
Horizontal alignment | Personalization is aligned within a grade level. | 2
Vertical alignment | Personalized learning aligns across multiple grade levels; teachers develop sequences that allow students to build skills and knowledge as they advance. | 2
Gradual release | Teachers gradually provide students opportunity to engage in personalized learning, based on the degree to which students choose where they work and what they work on. | 2
Flexible staffing | Staff move between grade levels, subject areas, and/or classrooms. | 2
Participatory culture (around media)(a) | Students learn by creating their own media. | 0

(a) WEC included this item based on author Halverson's research on personalized learning; none of the schools explicitly included it within their blueprints.

Some practices BSC identified in their blueprints can be tied directly to the LEAP Framework. For example, in many cases, schools intended students to have personal learning paths. The personal learning paths in the blueprints took different forms at different schools, but typically involved worksheets or software that students use to select the order in which they completed required lessons or activities during a given timeframe. Following these paths, students assess and monitor their own progress, an item the LEAP Framework included in its Learner Led category:

    The teacher reminds students to update progress ("tracking and reflection") sheets, which include what they do each day, in each rotation, whether they finish it, and how they did.

Meanwhile, competency-based progression fits into the Learner Demonstrated category, as it can reflect students entering at levels appropriate to their prior knowledge, pacing through units, and receiving credit based on proficiency:

    A teacher pre-assesses the students every day and forms groups of students based on that. She thought it would be a lot of work at first, to adapt groups each day based on the pre-assessments, but [found] that the kids tend to stay in the same groups, especially at the upper end of proficiency.

On the other hand, some practices’ ties to the LEAP Framework depend on the content of the observation and context of the classroom. Project-based learning can represent a Learner Led approach if students express their interests, collaborate with others, and ask for help from teachers or peers, as with Lindblom school’s colloquium projects. It can also be Learner Focused, as students’ learning opportunities can be based on their individual needs …


    The unit all students are working on is the dystopian genre of fiction. Students could choose from 10 different books that represented different reading levels, with some guidance from teachers. They all were to work on a "summative project" that could either be a Socratic discussion with the teacher, or creating an experience (e.g., game) from a close read of the book for peers to go through.

… or Learner Demonstrated if students show evidence of their learning in multiple ways …

One student working at a table appears to be making some sort of physical board game with a path and spinner. Two other students work together on a board game that appears to include multiplication facts.

We will discuss this phenomenon further when reviewing personalized learning practices observed in tandem.


Findings for Evaluation Question 2: What are patterns in practice and readiness in the first year of implementation at the school level, and is implementation consistent with intended and proposed program models?

Implementation Conditions and Challenges

To understand the nature of implementation of personalized learning in the BSC, we first must understand the surrounding conditions and context, including the challenges schools face. WEC evaluators coded 11 types of challenges that schools identified (plus an "other" code). Table 5 lists these challenges and the schools that identified them. All six schools indicated that lack of "time," "digital tools," and "staffing" were obstacles to some extent. Five indicated that "funding/sustainability" and "capacity" (staff's ability to implement personalized learning) presented challenges.2 That a challenge is not listed for a particular school does not necessarily indicate school staff did not experience it; rather, staff simply did not identify it in conversations with evaluators or in the observed meetings.

2 WEC identified the challenge of funding/sustainability early on in the evaluation; subsequent work by LEAP, in collaboration with Afton Partners, shows that costs may be sustainable over the long term. See https://www.leapinnovations.org/our-research/financial-study/.

Takeaways for implementation: As with other schoolwide initiatives, implementation differed from school to school, with some underlying patterns:

• Certain conditions (i.e., time, funding, capacity, and digital tools) appear to be critical to implementing personalized learning effectively.

• Schools showed varying levels of fidelity to their blueprint plans, but often had to adapt practice to different conditions and needs.

• Certain personalized learning practices appeared in "clusters" across multiple schools, the most common of which was a combination of targeted instruction and competency-based progression.

• Teachers across all BSC expressed that they highly value collaboration as both a practice and an outcome of personalized learning.


Table 5: Types of Challenges Identified

| Challenge | Number of Schools |
|---|---|
| Time | 6 |
| Digital tools | 6 |
| Staffing | 6 |
| Capacity | 5 |
| Funding and sustainability | 5 |
| Difficulty level (too challenging for many students or too easy for certain groups of students) | 4 |
| Need to meet standards/testing requirements/grading | 4 |
| Difficulty in instilling student agency | 2 |
| Attendance | 2 |
| Teachers union | 2 |

Note: A challenge only needed to be mentioned in one interview or teacher team observation to be coded.

Some specific comments related to the detrimental effect of staff turnover on implementation; teacher buy-in is critical for effective personalized learning, and losing teachers who understand how to implement personalized learning results in a loss of capacity and a need to orient new teachers who may not appreciate personalized learning (or even be hostile to it). Similarly, any change in school leadership or administration could also stymie any progress a school has made toward personalized learning. Interviewees also identified the importance of parent/caregiver involvement (the “Learner Connected” component of the LEAP Framework).

The challenges that the BSC described reflect the inherent difficulties schools face in transitioning to a full personalized learning model. Schools cannot merely be randomly assigned to a treatment group and be told to implement personalized learning. Rather, four key conditions (Diagram 1) must be in place before a school can do so. These four conditions (related to the challenges listed above) influence the extent to which schools implemented personalized learning. These conditions can overlap, as described below:

Time

Because implementing personalized learning can be a major transition for schools, school staff must have enough time to plan, train teachers, and prepare students. Teachers and administrators reflected on the importance of having enough preparation time in the form of a planning year, as well as time outside of the classroom to collaborate and plan as teams during implementation. One principal discussed the difficulty of balancing professional development and instructional time. Certain personalized learning tools that students use, such as Lexia, a literacy program, can present a time issue as well; teachers at multiple schools discussed difficulty in "hitting their minutes," or meeting requirements to spend a certain amount of time on a topic.

[Diagram 1: Key Conditions for Implementation, depicting the four overlapping conditions: time, capacity, funding, and digital tools.]

Capacity

School-level staff consistently spoke about the importance of stakeholder capacity to understand both the overall concept of personalized learning and the specifics of its implementation. This capacity included the knowledge and skills of teachers, administrators, students, and caregivers. As one assistant principal said, teachers have to be comfortable making mistakes and receiving feedback. Teacher meetings fostered capacity; as one teacher wrote in a post-meeting feedback form, "this meeting aids in classroom management which allows personalized learning to happen." Additionally, as mentioned above, educator and administrator turnover can limit a school's capacity to implement personalized learning.

Funding

Especially in the context of tight budgets across the district, the availability of funds targeted for personalized learning affected timelines and implementation. Schools needed the ability to purchase digital tools, pay for substitute teachers so teachers could engage in professional development and planning, conduct teacher workshops, and hire personalized learning coaches. Teachers in one professional learning community meeting indicated that they did not have the materials they needed to conduct project-based learning. Budgets also affected time; one assistant principal said that budget cuts cost the school grade-level meeting days. Given that many teachers found collaboration helpful (as discussed below), losing such time to funding cuts could impede institutionalizing personalized learning. However, funding can also act as a facilitator; at the same school, the administration planned to use funding to decrease class sizes, which could conceivably make personalizing learning less burdensome.

Digital Tools

The nature of available digital tools influenced the ability of schools to fully implement the practices they identified in their blueprints. These tools included hardware (e.g., Chromebooks, tablets, etc.) and software/platforms that would allow teachers to personalize and manage student learning (e.g., Summit Basecamp, Google Classroom, Lexia, etc.). One principal said integrating new software with district software took extra time and work. Basecamp in particular was a source of frustration. Teachers at one school tied this frustration to a lack of time; not all teachers were trained on Basecamp initially, so the delayed training took time away from training on other aspects, such as project-based learning. One principal intimated that Basecamp is often used merely as a platform rather than a curriculum. However, a teacher in a different professional learning meeting at the same school said Basecamp "would help with differentiation" and "ease teacher workload." Other tools, such as those at Lovett that allow teachers to better track students' academic and attendance data, also appeared to facilitate personalized learning.

LEAP Innovations implicitly embedded this understanding of conditions through its process of selecting schools. This consideration affects the quantitative design (see the quantitative methodology section below).


Commonly Observed Personalized Learning Practices and LEAP Framework Components

With these conditions related to time, capacity, funding, and digital tools in mind, WEC researchers used observation data to test the alignment of implementation to the personalized learning practices each school identified. For this analysis, the team defined an observed personalized learning practice as one observed at least twice across different classroom settings within a school. (While researchers also coded observations of staff meetings using the personalized learning practices, those are not included here, as analysts were interested in the personalized learning practices in classroom practice.) The team chose this approach because the observations reflected only a moment in time in each school or classroom, so observers wanted to avoid reporting on frequencies; the team did not feel confident in saying that a school or teacher used a certain personalized learning practice more often than others. At the same time, the team wanted to be as certain as possible that the observed practices were not just one-time practices, but in fact were reasonably entrenched. Further, just because observers did not see a personalized learning practice during an observation does not mean it was not in place; for instance, horizontal or vertical alignment is hard to observe taking place in the course of everyday instruction.

Drawing from the personalized learning practices identified in Table 4, Table 6 notes whether (a) a school identified the personalized learning practice, (b) the WEC team actually observed it in that school, and (c) whether the team observed it even if it was not identified. The point of this analysis was to see the extent to which a school was doing what it said it would in its blueprint.


Table 6: Personalized Learning Practices Observed in Each School
ID+O = identified in blueprint and observed; ID-N = identified in blueprint but not observed; ADD = not identified in blueprint but observed. Where six codes appear in a row, they correspond, in order, to Brooks, Irving Park, Disney II, Henry, Lindblom, and Lovett.

| Personalized Learning Practice | Number of Schools | Observation Codes |
|---|---|---|
| Personalized/learner focused/targeted instruction | 6 | ID+O, ID+O, ID+O, ID+O, ID-N, ID+O |
| Personal learning paths | 6 | ID-N, ID+O, ID+O, ID+O, ID-N, ID+O |
| Competency- or proficiency-based progression | 6 | ID+O, ID+O, ID+O, ID+O, ID-N, ID+O |
| Flexible learning environments/schedules | 6 | ID+O, ID+O, ID+O, ID+O, ID-N, ID+O |
| Learner profiles | 3 | ID-N, ID-N, ID-N |
| Content area specialized classrooms | 1 | ADD, ID-N, ADD |
| Mixed-age grouping | 3 | ID-N, ID+O, ADD, ID+O |
| Collaborative learning models/labs | 3 | ADD, ID+O, ID-N, ADD, ID-N |
| Project-based learning | 3 | ID+O, ADD, ID+O, ID+O, ADD, ADD |
| Data-driven consultation | 1 | ADD, ADD, ID-N |
| Students as teachers (peer instruction) | 1 | ADD, ADD, ADD, ADD, ADD, ID+O |
| Social emotional and/or noncognitive learning | 4 | ADD, ID+O, ID-N, ID+O |
| Positive behavioral interventions and supports (PBIS) and restorative practices | 3 | ID+O, ID-N, ID-N |
| Understanding by design | 2 | ID-N, ID-N |
| Horizontal alignment | 2 | ID-N, ID+O |
| Vertical alignment | 2 | ID-N, ID-N |
| Gradual release | 2 | ID-N, ADD, ID-N |
| Flexible staffing | 2 | ADD, ADD, ID-N |
| Participatory culture (around new media) | 0 | |

Table 7 summarizes the data in Table 6.

Table 7: Summary of Observed Personalized Learning Practices in Each School

| School | Personalized learning practices identified | Identified practices observed at least twice | Additional practices observed |
|---|---|---|---|
| Brooks | 9 | 4 | 2 |
| Irving Park | 7 | 5 | 4 |
| Disney II | 8 | 7 | 4 |
| Henry | 12 | 7 | 3 |
| Lindblom | 6 | 0 | 5 |
| Lovett | 12 | 7 | 1 |
| Averages | 9 | 5 | 3.33 |

Table 7's final column indicates that schools used an average of 3.3 personalized learning practices that their blueprints did not identify; Table 6 indicates that project-based learning and peer instruction are the two most common of these. This finding may suggest that schools already incorporated these practices and did not view them as unique to personalized learning. Notably, Lindblom Math and Science Academy (a high school) showed a particular mismatch between the personalized learning practices identified in the school's blueprint and those actually observed.

As with the personalized learning practices, WEC staff also reviewed LEAP Framework activities observed in each school at least twice (Table 8). LEAP Innovations defined the activities listed within each framework component.

Table 8: Activities Observed in Each School at Least Twice, by LEAP Framework Component

| Framework Component | Activity | Number of Schools |
|---|---|---|
| Learner Led | Collaborate with learners to identify and include learner preferences and optimal learning conditions (e.g., modalities, technology use, the nature and duration of learning activities, pacing, grouping size, and when and where learning will take place) | 2 |
| Learner Led | Partner in setting learning goals and plans | 1 |
| Learner Led | Articulate their interests, strengths, and needs | 2 |
| Learner Led | Assess and monitor their own progress | 5 |
| Learner Led | Collaborate with others to achieve goals | 6 |
| Learner Led | Advocate for needed support from teachers, peers, technology and other sources | 4 |
| Learner Led | Reflect upon their learning and continually refine their strategies | 3 |
| Learner Led | Adopt a growth mindset | 0 |
| Learner Focused | Have learning opportunities that reflect an understanding of individual needs, interests, and strengths with respect to academic skills and needs | 6 |
| Learner Focused | Have learning opportunities that reflect an understanding of individual needs, interests, and strengths with respect to nonacademic skills and needs, including social, cultural and emotional | 4 |
| Learner Demonstrated | Enter at a level appropriate to their prior knowledge and learning needs | 2 |
| Learner Demonstrated | Have supports and pacing that fits their learning | 5 |
| Learner Demonstrated | Demonstrate proficiency when ready | 1 |
| Learner Demonstrated | Demonstrate evidence of learning in multiple ways | 5 |
| Learner Demonstrated | Receive credit (recognition of learning) based on demonstrated proficiency (not seat time) | 3 |
| Learner Connected | Learning transcends the classroom in relevant and accredited ways, connected to families and communities | 0 |


From this list, we can see that certain framework constructs are more commonly in place across schools. Schools attempt to reach students academically and nonacademically, and to support students by pacing their learning. Students collaborate, track their progress, ask for help, and demonstrate learning in multiple ways. These practices likely are easier to enact than others, which is probably why observers saw them more frequently as schools adopted personalized learning. (Anecdotally, the WEC team did hear that allowing students to proceed in their learning at the correct pace was challenging, but WEC team members saw evidence of attempts to institute such pacing.) With that in mind, this report next reviews LEAP Framework components and the personalized learning practices observed less frequently across the cohort (Table 9).

Table 9: Infrequently Observed LEAP Framework Activities and Personalized Learning Practices

| LEAP Framework Component and Activity | Personalized Learning Practices |
|---|---|
| Learner Led: Collaborate with learners to identify and include learner preferences and optimal learning conditions | Understanding by design |
| Learner Led: Partner in setting learning goals and plans | Learner profiles |
| Learner Led: Articulate their interests, strengths, and needs | Vertical alignment |
| Learner Led: Adopt a growth mindset | |
| Learner Demonstrated: Enter at a level appropriate to their prior knowledge and learning needs | |
| Learner Demonstrated: Demonstrate proficiency when ready | |

The fact that some of these items were tagged infrequently should not be alarming; it is much easier to observe evidence of students collaborating than “adopting a growth mindset,” or “entering at a level appropriate to their prior knowledge and learning needs.” Other gaps indicate areas in which schools could improve, such as a more institutionalized usage of learner profiles or making sure students have the opportunity to demonstrate proficiency when they are ready to do so. Future study could investigate whether such practices become more common or observable as personalized learning evolves within and across schools.

In addition to the elements of the LEAP Framework and the identified personalized learning practices from the schools' blueprints, observers tagged the types of resources students and teachers used in classrooms. Commonalities emerged across these resources, specifically with respect to the personalized learning technologies the BSC used. Lexia, ThinkThroughMath, and ST Math are common at the elementary schools, and middle schools typically utilize the Summit Basecamp platform. Also common are resources such as portable furniture that lets students create flexible learning environments; observers rarely saw students sitting at neatly arranged rows of desks.


Practice Clusters

As qualitative analysis commenced, observers noticed that personalized learning practices within a school were often tagged in tandem with one another. Any such co-occurrence was termed a "practice cluster"; for a pair of practices to be considered a cluster, the pair had to be observed in two or more schools in the BSC cohort. Next, to ground the practice clusters more formally, WEC linked them to the four components of the LEAP Framework: Learner Led, Learner Focused, Learner Demonstrated, and Learner Connected. Analysts reviewed the content of each observation within each practice cluster and tagged the observation to an observed activity and then to the LEAP Framework. For example, one straightforward observation was:

    Across the classroom, all students work on projects. Unless they are specifically working with the teacher, all students are on their Chromebooks, some sitting at desks, some on carpets.

This observation was tagged using the personalized learning practices flexible learning environments and project-based learning. The content of this observation appeared to match most closely to the observed activity collaborate with learners to identify and include learner preferences and optimal learning conditions, which falls under the Learner Led component (Table 8). Evaluators performed this process for each school individually and then reviewed the linkages across the BSC cohort.
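The pair-counting step described above can be sketched in a few lines of code. This is a minimal illustration of the method as described, not the study's actual analysis; the observation records and practice names below are invented for the example.

```python
from itertools import combinations

# Hypothetical observation records: (school, set of practices tagged in that observation).
observations = [
    ("School A", {"targeted instruction", "competency-based progression"}),
    ("School A", {"flexible learning environments", "project-based learning"}),
    ("School B", {"targeted instruction", "competency-based progression"}),
    ("School C", {"targeted instruction", "flexible learning environments"}),
]

# For each pair of practices tagged within the same observation,
# record the set of schools in which that pair appeared.
pair_schools = {}
for school, practices in observations:
    for pair in combinations(sorted(practices), 2):
        pair_schools.setdefault(pair, set()).add(school)

# A pair counts as a "practice cluster" only if it was observed
# in two or more schools.
clusters = {pair: schools for pair, schools in pair_schools.items() if len(schools) >= 2}
```

With these invented records, only targeted instruction + competency-based progression qualifies as a cluster, because it appears in two schools.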

For the most part, the connections between the practice clusters and the LEAP Framework components were uniform across all schools. That is, if a practice cluster aligned to the Learner Focused component in one school, it likely did so in other schools as well. However, the content of the observations within a cluster sometimes varied substantially by school, and thus the cluster would end up linked to multiple LEAP Framework components. For instance, one observation in the practice cluster personal learning paths + competency-based progression read in part:

    They have ST Math worksheets that they are to use to track what they're doing and how far they've progressed throughout the week, and at the end of the week, they're supposed to do reflections based on their work. The students apparently haven't been so good at that part yet, and need to do better. The students also describe how the teacher tracks their progress on ST Math.

This observation reflected that students in this classroom have supports and pacing that fits their learning, an activity under the Learner Demonstrated component of the LEAP Framework. However, at a different school, the same cluster looked in part like this:

    Students on the laptops are working on: their book notes in a google doc; reviewing slides for an assessment; working on a study guide with teacher in a discussion to review before taking an assessment; working on concepts related to the unit (i.e., on storytelling devices like conflict) in Basecamp that they can progress through on their own and test out of.

The content of this observation mapped more closely to have learning opportunities that reflect an understanding of individual needs, interests and strengths with respect to academic skills and needs, a Learner Focused activity. This inconsistency reflects some of the inherent "gray" area of the LEAP Framework—Learner Led, Learner Focused, Learner Demonstrated, and Learner Connected may not be discrete categories, but rather a continuum of practice that involves some overlap. Additionally, some practice clusters—for example, a combination of content-area specialized classrooms + mixed-age grouping—do not have a logical home within the LEAP Framework. Indeed, while analysts endeavored to fit the practice clusters into the LEAP Framework components by matching them to the activities, personalized learning practices that are harder to categorize or define may in fact transcend the LEAP Framework altogether.

Nonetheless, most clusters appear to fall into the same LEAP Framework components across the BSC cohort, as shown in Table 10. Further, only at Disney II was any pair of practices observed in more than one classroom; this phenomenon was likely a function of the short duration of most observations.

Table 10: Commonly Observed Practice Clusters

| Practice Cluster | Associated LEAP Framework Component | Number of Schools |
|---|---|---|
| Targeted instruction + competency-based progression | Learner Demonstrated | 5 |
| Targeted instruction + flexible learning environments | Learner Focused | 3 |
| Flexible learning environments + project-based learning | Learner Led | 3 |
| Content area specialized classrooms + flexible staffing | Learner Focused | 3 |
| Targeted instruction + collaborative learning models/labs | Learner Led, Learner Focused | 2 |
| Targeted instruction + data-driven consultation | Learner Focused | 2 |
| Personal learning paths + competency-based progression | Learner Focused, Learner Demonstrated | 2 |
| Competency-based progression + flexible learning environments | Learner Demonstrated | 2 |
| Competency-based progression + project-based learning | Learner Focused, Learner Demonstrated | 2 |
| Flexible learning environments + content area specialized classrooms | Learner Focused | 2 |
| Flexible learning environments + mixed-age grouping | Learner Focused | 2 |
| Flexible learning environments + collaborative learning models/labs | Learner Focused | 2 |
| Flexible learning environments + gradual release | Learner Focused | 2 |
| Content area specialized classrooms + mixed-age grouping | None | 2 |
| Content area specialized classrooms + horizontal alignment + flexible staffing | Learner Focused | 2 |

The most common cluster, as shown in Table 10, is targeted instruction + competency-based progression. A few examples of this practice cluster from observations are as follows:

● Teacher was at a half table with four students talking through the worksheet on their Chromebooks; a teachers’ aide was floating around checking in with students. Teacher said they group students by their summative scores on the last unit. Students who haven’t shown mastery spend additional time up with the teacher.

● Students in group doing “Group Practice” are using software called “IXL.” A student shows an observer the software, with a long list of topics (“IXL5.1 Which x satisfies an equation”) that shows where they are in progression and if they finish that topic they get a badge next to it on the list. When they are in the problem itself it shows their progress for that “S.1” topic in terms of a number. They were aware of how they are doing via a “smart score.” The teacher decides what they work on, they can just go onto the next one in the sequence if they finish and don’t want to interrupt the teacher to check on it.

● Seven students are working with the main teacher at a table in the front of the room in front of the white board. These students remain at that table the entire duration of the observation. These students had not finished a paragraph assignment (“TEAL” or Topic sentence, Evidence, Analysis, Link – so what?) so they were working on it together. They had to respond to the question, “Why are dystopian novels so popular with young people?”

Table 11 shows the personalized learning practices most commonly included in Table 10’s practice clusters.


Table 11: Personalized Learning Practices Most Frequently Included in Clusters

| Practice | Number of occurrences |
|---|---|
| Flexible learning environments | 7 |
| Targeted instruction | 4 |
| Competency-based progression | 4 |
| Content area specialized classrooms | 4 |
| Project-based learning | 2 |
| Flexible staffing | 2 |
| Mixed-age grouping | 2 |
| Collaborative learning models/labs | 2 |
| Data-driven consultation | 1 |
| Personal learning paths | 1 |
| Gradual release | 1 |
| Horizontal alignment | 1 |

Across clusters, the practice of flexible learning environments was often paired with other practices. (Indeed, given that targeted instruction, competency-based progression, and flexible learning environments were all mentioned in each blueprint, and observed in nearly all schools, it is perhaps not surprising that they appear frequently in practice clusters.) At a surface level, the prevalence of flexible learning environments may arise because it is easier to observe students engaged in learning in different settings than other, more subtle practices. However, just because students have flexibility as to where they do their work does not mean the work is truly personalized; otherwise, schools would only need to invest in beanbags and laptops. Rather, the value of exploring practice clusters involving flexible learning environments is being able to see what other personalized practices were taking place while students worked where (and when) they liked, and how that work aligned to LEAP Innovations’ conception of personalized learning as outlined in the LEAP Framework. From there, we can hope to garner a better understanding of personalized learning as it is practiced, which can help establish best practices as it becomes more institutionalized. For example, pairing gradual release with flexible learning environments allows students to take more responsibility for their learning in a deliberate fashion:

    Level 1 - can work at: your own personal desk/table
    Level 2 - can work at: your own personal desk/table; another table-top around the classroom
    Level 3 - can work at: your own personal desk/table; another table-top around the classroom; on the floor around the classroom
    Level 4 - can work at: your own personal desk/table; another table-top around the classroom; on the floor around the classroom; in the hallway or closet

Competency-based progression was observed frequently across clusters, most often with targeted instruction, with students receiving individual instruction or placed in groups based on academic knowledge or need. Indeed, educators may find competency-based progression most useful when they pair it with instruction. In one school where competency-based progression existed in tandem with project-based learning, students (with the help of teachers) were to choose a book from a list that included multiple reading levels and then work on a "summative project" based on the book. In another school, students' lesson plans (learning paths) were created based on their proficiency with the aid of software:

    They have ST Math worksheets that they are to use to track what they're doing and how far they've progressed throughout the week, and at the end of the week, they're supposed to do reflections based on their work. ... One student has progressed to the end and is now starting on 4th grade.

Yet another frequently observed practice within clusters is content area specialized classrooms. We might expect such classrooms in high schools, where classes are generally organized by subject, but the elementary BSC schools also employ them. For instance, in one elementary school, students in fourth and fifth grades are grouped by subject (a literacy block) rather than by age. In another, teachers switched classrooms when the subject being taught changed, underscoring the possible importance of flexible staffing to organizing classrooms by subject at the elementary level.

Practice Clusters and LEAP Surveys

LEAP Innovations administers surveys to students and teachers each semester, asking how much they agree with various statements related to personalized learning. WEC's quantitative analysis explored how student and teacher responses varied over time by grade and school.

Mapping survey responses to the observed practices provides a better sense of student and teacher perceptions of personalized learning. Once the WEC team identified the individual practices (Table 4), LEAP Innovations staff suggested student and teacher survey items to connect to each practice. The discussion below focuses on the practices in the most frequently observed practice cluster, present in five schools: targeted instruction + competency-based progression, which corresponds to the Learner Demonstrated component of the LEAP Framework. The discussion examines changes in student and teacher perception from Fall 2016 to Spring 2017. Student responses are analyzed in two ways: once with all student responses, and once with responses only from students whose teachers also responded to the personalized learning surveys in Fall 2016 and Spring 2017. This latter cohort is the "restricted student sample." Restricting the teacher sample in a similar way, by including only teachers who responded to the survey in both Fall 2016 and Spring 2017, is impractical because the already small sample size would prevent drawing meaningful conclusions. Student perceptions are shown in Figures 1–8; Figures 9–12 illustrate teacher perceptions based on their responses to two survey questions (the underlying data for the figures are in Appendix D). The narrative also explores relationships between student and teacher survey responses.
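The construction of such a restricted sample can be sketched as follows. This is a simplified illustration of the filtering logic, assuming hypothetical response records; the field names and data are not taken from the study.

```python
# Hypothetical survey response records.
# Student records: (student id, teacher id, term); teacher records: (teacher id, term).
student_responses = [
    ("s1", "t1", "Fall16"), ("s1", "t1", "Spring17"),
    ("s2", "t2", "Fall16"),
    ("s3", "t1", "Spring17"),
]
teacher_responses = [
    ("t1", "Fall16"), ("t1", "Spring17"),
    ("t2", "Fall16"),  # t2 did not respond in Spring17
]

# Teachers who responded in both Fall 2016 and Spring 2017.
fall = {t for t, term in teacher_responses if term == "Fall16"}
spring = {t for t, term in teacher_responses if term == "Spring17"}
both_terms = fall & spring

# Restricted student sample: responses from students whose teacher
# appears in both survey waves.
restricted = [r for r in student_responses if r[1] in both_terms]
```

Here only teacher t1 responded in both waves, so s2's response (linked to t2) drops out of the restricted sample.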

Student Perceptions of Personalized Learning

One statement that seems to fit the targeted instruction practice in this cluster is “My teacher asks me what I know about new topics before I begin working on them.” Figure 1 shows the levels of student agreement with this statement.


Figure 1: Targeted Instruction, Student Perception, by School “My teacher asks me what I know about new topics before I begin working on them.”

A statement that seems to match the competency-based progression part of the cluster is “I can move ahead to new topics as soon as I show what I have learned, even if other students are still working on it.” Levels of student agreement with the statement are shown in Figure 2.

Figure 2: Competency-based Progression, Student Perception, by School “I can move ahead to new topics as soon as I show what I have learned, even if other students are still

working on it.”

[Figures 1 and 2: stacked bar charts showing, for each school (Brooks, Disney II, Henry, Irving Park, Lindblom, Lovett) and each term (Fall 2016, Spring 2017, Fall 2017), the percent of students at each agreement level: Don't Agree, Agree a little, Mostly Agree, Agree a lot.]


As Figures 1 and 2 show, from Fall 2016 to Spring 2017, the proportions of students who agree with both statements increase at Lovett and Disney II (the black bars decrease in length from term to term). At Henry and Irving Park, agreement increases on targeted instruction and decreases on competency-based progression, while Lindblom shows the reverse. The proportions of students who “agree a lot” with both statements (the lightest gray bars) represent a clearer trend: at both Brooks and Lovett, the proportions of students at this level increase, while at both Irving Park and Lindblom, they decrease.

Figures 3 and 4 show the results for these two statements when restricting the sample to students with teachers who responded to the fall 2016 and spring 2017 surveys (the “restricted student sample”). Henry is missing student responses from Fall 2016, and Lindblom is missing responses from Fall 2017. (Appendix B includes more information on survey methodology.)

Figure 3: Targeted Instruction, Student Perception, by School, Restricted Sample “My teacher asks me what I know about new topics before I begin working on them.”



Figure 4: Competency-based Progression, Student Perception, by School, Restricted Sample
“I can move ahead to new topics as soon as I show what I have learned, even if other students are still working on it.”

In the sample restricted to students whose teachers answered the survey, changes in levels of agreement for Lindblom and Brooks are similar to those in Figures 1 and 2, with agreement about targeted instruction decreasing (bigger black bars) from Fall 2016 to Spring 2017 and increasing for competency-based progression. Agreement with both statements among Irving Park students again decreased. Disney II’s level of agreement remained nearly the same for targeted instruction and declined for competency-based progression. Lovett students in the restricted sample saw greater agreement about targeted instruction and less agreement about competency-based progression when Fall 2016 is compared to Spring 2017. Meanwhile, the proportions of students who “agree a lot” with both statements (the lightest gray bars) are only consistent at Brooks, where proportions increase for both questions; the proportions of “agree a lot” responses at each of the other schools increase for one statement but decrease for the other.

Figures 5 and 6 illustrate student survey responses to these questions by grade level.



Figure 5: Targeted Instruction, Student Perception, by Grade Level “My teacher asks me what I know about new topics before I begin working on them.”

Figure 6: Competency-based Progression, Student Perception, by Grade Level
“I can move ahead to new topics as soon as I show what I have learned, even if other students are still working on it.”

As the 2016-17 school year progressed, students in most grades reported their teachers asked them what they already knew about new topics before they started to work on them. However, only fourth-grade students exhibited increased agreement that they could move ahead to new topics as soon as they showed what they learned; the proportions of fourth-grade students who responded “don’t agree” decreased, and the proportions who responded “agree a lot” increased.

For the restricted student sample in Figure 7, agreement decreased from Fall 2016 to Spring 2017 about targeted instruction for fifth, seventh, and eighth grades. Compared to Fall 2016, more fourth- and sixth-graders agreed in Spring 2017 that their teachers asked what they knew about topics before the pupils started studying them, as represented by both a decrease in the proportions of “don’t agree” responses and an increase in the proportions of “agree a lot” responses.

Figure 7: Targeted Instruction, Student Perception, by Grade Level, Restricted Sample “My teacher asks me what I know about new topics before I begin working on them.”

As Figure 8 shows, fewer older students said in Spring 2017 that they could move ahead to new topics as soon as they demonstrated what they had learned, compared to Fall 2016. More fourth- and fifth-graders disagreed with the statement about competency-based progression. Across all grades, though, the proportion of “agree a lot” responses decreased from Fall 2016 to Spring 2017.



Figure 8: Competency-based Progression, Student Perception, by Grade Level, Restricted Sample

“I can move ahead to new topics as soon as I show what I have learned, even if other students are still working on it.”

Discrepancies between student perceptions captured in the survey and the practices the WEC team observed may be due to differences among students at different grade levels. The WEC team derived the practice clusters from observations of learning environments from kindergarten through high school, while the survey analysis covers only students in Grades 4–8. Additionally, the greater proportions of younger students showing agreement overall are consistent with our hypothesis that personalized learning may be more effective for students in lower grades, as discussed in the analysis of Evaluation Question 3.

Teacher Perceptions of Personalized Learning

Next, we turn to the survey responses of teachers from Kindergarten through Grade 8 on the items LEAP identified as aligning to the targeted instruction and competency-based progression practices that make up this frequently observed practice cluster. Brooks, Henry, and Lovett had very low levels of teacher participation in each of the three survey administrations. Lindblom had only one teacher respondent in Fall 2017 but higher participation in both Fall 2016 and Spring 2017.

For targeted instruction, no teachers disagreed with the statement “I use what students already know to direct their work on new topics/skills” (Figure 9). Strength of teacher agreement increased from Fall 2016 to Spring 2017 (longer lightest gray bars) at Disney II, Henry, and Irving Park.



Figure 9: Targeted Instruction, Teacher Perception, by School “I use what students already know to direct their work on new topics/skills.”

Perhaps not surprisingly, teachers report that they employ competency-based progression more than students perceive they do. As Figure 10 shows, at all schools except Brooks and Lindblom, strength of agreement (shown by smaller black bars and larger light gray bars) among teachers increased from Fall 2016 to Spring 2017.

Figure 10: Competency-based Progression, Teacher Perception, by School “If students master skills faster than others, they move ahead to a new topic, unit, or set of skills.”



Figures 11 and 12 show teacher responses by grade band; the underlying data include only grade ranges, so responses cannot be analyzed by individual grade. Teachers are bullish about their implementation of competency-based progression but less so about targeted instruction. Notably, agreement with both statements among Grades 1–3 teachers (the proportions who responded “agree a lot”) increases from Fall 2016 to Spring 2017. Again, this finding supports the idea that personalized learning might be most effective, and thus practice clusters more evident, in early grades.

Figure 11: Targeted Instruction, Teacher Perception, by Grade Level “I use what students already know to direct their work on new topics/skills.”



Figure 12: Competency-based Progression, Teacher Perception, by Grade Level “If students master skills faster than others, they move ahead to a new topic, unit, or set of skills.”


Joint Perceptions of Personalized Learning

Teacher and student agreement with statements from Fall 2016 to Spring 2017 related to the targeted instruction and competency-based progression practice cluster diverge at the school level. Table 12 shows the changes in the proportions of students and teachers who responded “Don’t Agree” and “Agree a lot” for the targeted instruction items. Table 13 shows the same changes for the competency-based progression items. The tables use “+/-” rather than terms such as “increase” or “decrease” to indicate whether agreement moved in a desirable direction.
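The “+/-” entries can be derived mechanically from changes in response proportions. The sketch below (with hypothetical percentages) encodes the convention that a decline in “Don’t Agree” and a rise in “Agree a lot” are both desirable:

```python
# Hypothetical sketch of how the +/- entries in Tables 12-13 could be derived
# from changes in response proportions between Fall 2016 and Spring 2017.
def direction(fall_pct: float, spring_pct: float, response: str) -> str:
    """Return '+' if the change is in the desirable direction, else '-'."""
    change = spring_pct - fall_pct
    if response == "Don't Agree":
        desirable = change < 0   # fewer respondents disagreeing is good
    elif response == "Agree a lot":
        desirable = change > 0   # more strong agreement is good
    else:
        raise ValueError("Only the two extreme responses are tracked")
    return "+" if desirable else "-"
```

For example, a school whose “Don’t Agree” share falls from 20% to 12% earns a “+”, while one whose “Agree a lot” share falls from 30% to 25% earns a “-”.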

Table 12: Changes in Agreement, Targeted Instruction, by School

              Don't Agree                               Agree a lot
School        All Students  Restricted  Teachers       All Students  Restricted  Teachers
Brooks        –             –           None           +             +           None
Disney II     +             +           None           +             +           +
Henry         +             n/a         None           +             n/a         +
Irving Park   +             –           None           –             +           +
Lindblom      –             –           None           –             –           –
Lovett        +             +           None           +             +           –



Table 13: Changes in Agreement, Competency-based Progression, by School

              Don't Agree                               Agree a lot
School        All Students  Restricted  Teachers       All Students  Restricted  Teachers
Brooks        +             +           None           +             +           –
Disney II     +             –           +              –             –           +
Henry         –             n/a         None           –             n/a         +
Irving Park   –             –           +              –             –           +
Lindblom      +             +           –              –             +           +
Lovett        +             –           None           +             –           +

From Fall 2016 to Spring 2017, Disney II, Henry, and Lindblom are the only schools at which the direction of student and teacher survey responses to the targeted instruction items aligned. No student-teacher alignment exists with the competency-based progression items. Further, only at Lindblom does restricting the student sample to students whose teachers responded to the Fall 2016 and Spring 2017 surveys improve alignment, and only for the items related to competency-based progression.

This approach of aligning practice clusters to the LEAP Framework and then locating relevant survey items is one type of analysis LEAP may consider performing with more survey items. The LEAP Framework can bridge observed practices with findings from surveys, as most survey constructs are categorized as Learner Focused, Learner Led, or Learner Demonstrated, to see if student and teacher perceptions match the practices observers are seeing in classrooms. This set of survey items, related to the most common practice cluster, shows some similar levels of agreement within student responses and teacher responses, though only a few associations between student and teacher agreement, relationships largely unimproved by restricting the student samples. To review all survey responses for all schools, see the individual school profiles in Appendix E.

Collaboration among Educators

Understanding teacher practice is key to understanding the potential of personalized learning to ultimately change student opportunities to learn and academic outcomes. The theory of action behind BSC is that teacher practice will change to better align with the principles laid out in the LEAP Framework, which in turn facilitate truly personalized learning for students.

To that end, one theme that arose from many of WEC’s interviews, observations of regular teacher meetings, and teachers’ post-meeting reflections was the importance of collaboration. Indeed, the research team’s curiosity about how teachers and administrators collaborated was the genesis of the decision to observe teacher meetings. Educators’ comments about collaboration may help teachers and administrators think differently about how to approach collaboration as it relates to personalized learning.

For instance, in their postmeeting reflections, teachers at Irving Park wrote:


• “It helped put ideas to personalize for some students. Collaborating [with] the entire middle school time is always helpful.”

• “This meeting helped build capacity to implement personalized learning through collaborating.”

• “Collaborating with teachers to bounce ideas off each other.”

An instructor at the same school told us that students were good at collaborating, even when they got stuck on certain material, which might suggest that the merits of collaboration filter down from the teacher level to the student level.

A Disney II administrator said the school’s learning lab (a personalized learning practice) helped build collaboration among educators, especially with new staff members. In a post-meeting reflection, a teacher wrote that there was a “collaborative sharing of ideas,” and another wrote that “collaboration always exists. Great Ideas are generated.” Another Disney II instructor was a little less sanguine about the utility of collaboration: “Pretty typical collaborative but not necessarily concrete. Exactly what are we doing [and] how?”

Henry administrators reflected that they changed the schedule to allow collaboration time for teachers, which fits with the flexible staffing/learning environment practices common to personalized learning settings. Specifically, Henry scheduled this time for clusters of teachers (kindergarten and Grade 1, Grades 2–4, and Grades 5–6). During an end-of-year all-staff meeting at Henry, grade-level teams sat together and created slides to show the rest of the staff how they incorporated personalized learning. (Henry also underwent Grade 3 staffing changes in Fall 2017, after the first year as a BSC: the third-grade team wanted to stay together, so one teacher moved with her students to fourth grade, one stayed in third, and one has a split third/fourth-grade classroom.)

Lovett also shifted scheduling, giving Grades 2–3 and 4–5 common prep time to foster collaboration among teachers. A teacher said that she tries to encourage her students to collaborate when they are learning literacy and reading skills, and during the observed Grades 4–5 teacher meeting, participants discussed how students collaborate using technology; both instances suggest that the ethos of collaboration has been passed down from teachers to students. In a later presentation, Lovett’s principal identified the school as a “highly collaborative school community” and touted teacher collaboration, citing evidence from the 5Essentials surveys of students and teachers.3

Given the emphasis on collaboration, LEAP Innovations may want to consider including survey questions on teacher collaboration as it relates to student personalized learning.

3 See https://consortium.uchicago.edu/surveys for more information.


Findings for Evaluation Question 3: What are patterns in academic outcomes for Breakthrough Schools: Chicago students engaged in personalized learning?

Evaluating Student Academic Outcomes in Breakthrough Schools: Chicago

In addition to identifying personalized learning practices and mapping their implementation, this evaluation examined student academic outcomes during the first year of BSC (in the 2016-17 school year). WEC’s analysis describes academic outcomes for BSC and non-BSC students by4

• using school and grade level statistics describing student growth and attainment in reading and math scores during the first BSC year and the prior year; and

• grouping students based on which quartiles their scores ranked in the pre-test during the first BSC year and the prior year.

These quantitative analyses use nationally benchmarked statistics to look at student performance in reading and math in BSC. Specifically, we compare performance in the first year of implementing BSC to past performance at the same school, as well as compare BSC schools to other, non-BSC schools in the district.

Takeaways related to student academic outcomes:
• Descriptive analyses of students’ academic outcomes show interesting and potentially promising patterns for BSC:
o BSC schools started with greater proportions of elementary school students meeting their growth targets than non-BSC schools, and had even greater proportions during the first year of being a BSC.
o BSC schools and grades starting with very high average student attainment levels in reading or math maintained those high attainment levels during the first year of being BSC.
o BSC schools and grades starting with medium or low average student attainment levels in reading or math increased their average student attainment levels during the first year of being BSC.
• The BSC selection process and the small number of schools in the study preclude claims of causality based on descriptive or quasi-experimental analyses.

From the beginning and throughout the evaluation, WEC (in regular discussion with LEAP Innovations) weighed the challenges of analyzing the impact of the BSC model on student academic outcomes. As discussed in Appendix B, although WEC did complete a quasi-experimental analysis of impact using propensity score matching, the selection process and small number of BSC schools meant that no causal claims could be made from these data. In addition to acknowledging these methodological limitations, it is important to understand that when schools dramatically change instructional strategies (such as shifting to an emphasis on personalized learning), the effect on academic performance in the first, transitional year of implementation can be weak or even negative.

4 Appendix B includes a more detailed description of the data, sample, and analytic processes.

With these constraints in mind, WEC aimed to provide stakeholders with a description of the data by summarizing it in meaningful ways and potentially identifying patterns that might emerge from it. After briefly describing the data WEC obtained and used, the following sections will summarize the descriptive analysis. Please see Appendix B for a more detailed description of quantitative methods.

Data

In alignment with the formal research partnership between WEC, LEAP Innovations, and CPS, WEC received limited access to student-level data (necessary for completion of the analysis) for students in Grades 3–8 enrolled in the 2015-16 and 2016-17 school years. The data include information on school enrollment, demographic characteristics (race, free-or-reduced-price lunch status, special education status, and English language learner status), and academic performance. Student data were handled with the highest regard for privacy and security, and in alignment with the privacy and security terms of the agreement, including but not limited to encryption of student data and the requirement that all with access to the data first obtain the appropriate background checks. WEC used school enrollment data to assign each student to a grade and school. A student then counted as a BSC student in a given year if the student spent the school year in one of the following school-grade combinations:

1) Brooks (Grade 7)
2) Irving Park (Grades 3, 4, 6, 7, and 8)
3) Disney II (Grades 3–6)
4) Henry (Grades 3–6)
5) Lovett (Grades 3–5)

WEC analysts derived this list of schools and grades from the school site visits and interviews. Five of the six schools are included; Robert Lindblom Math and Science Academy is not included because it is a high school. For any given grade, the number of schools in the analysis is even smaller than five, ranging from one to four, as shown in Table 14. Across the six grades, 1,224 students participated in the first year of BSC (Table 15).
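The school-grade membership rule above can be sketched as a simple lookup; the mapping mirrors the list above, and the function name and data structure are our own illustrative choices:

```python
# Hypothetical sketch: tag students as BSC based on school-grade enrollment.
# The mapping mirrors the school/grade list derived from WEC site visits.
BSC_GRADES = {
    "Brooks": {7},
    "Irving Park": {3, 4, 6, 7, 8},
    "Disney II": {3, 4, 5, 6},
    "Henry": {3, 4, 5, 6},
    "Lovett": {3, 4, 5},
}

def is_bsc_student(school: str, grade: int) -> bool:
    """A student counts as BSC if enrolled for the year in a listed school-grade."""
    return grade in BSC_GRADES.get(school, set())
```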

Table 14: Number of BSC Schools per Grade

Grade               3   4   5   6   7   8
Number of schools   4   4   3   3   2   1

Source: LEAP Innovations and WEC site visits


Table 15: Number of BSC versus Non-BSC Students, by Grade

           BSC Students   Non-BSC Students in District
Grade 3    231            24,258
Grade 4    224            23,598
Grade 5    159            22,204
Grade 6    194            22,000
Grade 7    220            21,091
Grade 8    196            21,653
Total      1,224          134,804

Source: Chicago Public Schools

For academic outcomes, WEC used end-of-year standardized test scores from the Measures of Academic Progress (MAP) assessments provided by the Northwest Evaluation Association (NWEA). MAP is a computer-adaptive test measuring basic skills in language arts and mathematics at each grade level. For the analysis of growth in students’ test scores, because of reduced participation in fall testing, WEC took the difference between the spring 2015-16 score (pretest) and the spring 2016-17 score (posttest). WEC also obtained nationally benchmarked statistics provided by NWEA for academic performance over time. For each student, NWEA calculates growth expectations from one MAP test to the next, which facilitates comparison of actual changes in scores to the expected improvements in scores. If actual changes in test scores meet or exceed expectations, the student score meets the growth target. NWEA also calculates student and school growth percentiles (SGP).
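A minimal sketch of the growth-target logic, assuming a simple tabular layout (the column names and RIT values are invented for illustration and are not the actual CPS/NWEA schema):

```python
# Hypothetical sketch: flag whether each student met their growth target by
# comparing actual pretest-to-posttest growth against the projected growth.
import pandas as pd

scores = pd.DataFrame({
    "student_id": [1, 2, 3],
    "spring_2016_rit": [201, 195, 210],   # pretest (Spring 2015-16)
    "spring_2017_rit": [212, 199, 216],   # posttest (Spring 2016-17)
    "expected_growth": [8, 7, 6],         # projected RIT growth (illustrative)
})

scores["actual_growth"] = scores["spring_2017_rit"] - scores["spring_2016_rit"]
# A student meets the growth target when actual growth >= expected growth.
scores["met_target"] = scores["actual_growth"] >= scores["expected_growth"]

# Share of students meeting expectations, as reported in Table 17
pct_met = 100 * scores["met_target"].mean()
```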

Finally, WEC obtained school attainment level data from the Chicago Public School website. School attainment is based on year-end NWEA MAP scores and nationally benchmarked statistics NWEA collected over time. For each school, NWEA compares the average spring MAP score for a particular grade to the national average score for that grade. NWEA then ranks the schools and assigns corresponding percentiles. These percentiles are school attainment levels.
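A simplified sketch of a percentile-rank calculation of this kind, with invented numbers; NWEA’s actual norming methodology is more involved:

```python
# Hypothetical sketch: rank one school's grade-level mean MAP score within a
# national sample of grade-level school means to get an attainment percentile.
# All values are made up for illustration.
national_means = [190.0, 195.5, 198.2, 201.0, 204.7, 208.3, 211.9, 215.0, 218.6]
school_mean = 210.0

# Percent of national sample schools scoring below this school's mean
below = sum(m < school_mean for m in national_means)
attainment_percentile = round(100 * below / len(national_means))
```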

Patterns in Student Academic Performance

The evaluation of academic outcomes starts with the analyses of school attainment measures to provide a context to anchor the analysis of student academic performance over time (growth).

The school “attainment” percentile compares the average spring MAP scale score for a particular school to the national average score. Schools are then ranked and assigned corresponding percentile points.


Under the assumption that classes of students are comparable from one year to the next and in the absence of large external changes affecting test scores, school attainment measures can give a good picture of school year-to-year progress. Moreover, looking at the progression of school-level performance is especially useful when recent initiatives have improved student-level test scores significantly and led to higher expectations for further student improvement, and thus for improved school attainment. Without controlling for the whole available history of test scores, the estimated impact of an initiative such as personalized learning on school performance can be biased downward.

For example, an intervention could aim to raise test scores in a low-performing school. If the intervention is highly effective, students experience large gains in test scores, and impact estimates are significantly positive. If the same benchmarked growth measures are used in following years, without controlling for other factors, school performance can look poor even though the school is still increasing the share of students who meet growth expectations (expected improvements in their test scores).

Hence, to look at schools’ overall progress, WEC compared school test score attainment levels over time and in comparison to attainment levels of schools in a nationally representative sample. All student growth and school attainment data for each school are presented in Appendix B’s descriptive analysis section.

Table 16 shows BSC attainment percentiles in math and reading by grade in the 2015-16 and 2016-17 school years. In math, all schools improved or were already highly ranked; Henry, Lovett, and Irving Park in particular had large improvements. The trend in reading is less uniform across schools: performance improved for most grades at Henry and Lovett and for Grades 3, 4, and 6 at Irving Park, while test scores were high prior to the intervention at Disney II and in Irving Park’s Grades 7 and 8.

NWEA provides “growth” expectations between pre- and post-MAP tests. We can compare the growth that actually occurred to the growth expectation. If the actual growth meets or exceeds the projected growth, the student met their growth target.


Table 16: BSC Attainment Percentiles, by Grade Level and Subject
An asterisk (*) marks a 2016-17 percentile that improved from Spring 2016 to Spring 2017. N is the number of tested students.

School        Subject   Year      Grade 3      Grade 4      Grade 5      Grade 6      Grade 7      Grade 8
Brooks        Math      '16-17                                                        95 (N=55)
Brooks        Reading   '16-17                                                        99 (N=55)
Irving Park   Math      '15-16    46 (N=54)    42 (N=28)                 77 (N=56)    93 (N=77)    87 (N=64)
Irving Park   Math      '16-17    67* (N=50)   48* (N=58)                89* (N=81)   89 (N=81)    88* (N=74)
Irving Park   Reading   '15-16    46 (N=55)    52 (N=28)                 85 (N=56)    90 (N=77)    93 (N=64)
Irving Park   Reading   '16-17    67* (N=50)   60* (N=58)                94* (N=82)   90 (N=51)    91 (N=74)
Disney II     Math      '15-16    89 (N=51)    96 (N=51)    98 (N=50)    97 (N=51)
Disney II     Math      '16-17    93* (N=49)   97* (N=54)   99* (N=54)   96 (N=55)
Disney II     Reading   '15-16    91 (N=51)    98 (N=51)    91 (N=50)    97 (N=51)
Disney II     Reading   '16-17    87 (N=49)    93 (N=54)    97* (N=54)   96 (N=55)
Henry         Math      '15-16    46 (N=79)    24 (N=72)    58 (N=71)    47 (N=56)
Henry         Math      '16-17    91* (N=61)   30* (N=54)   65* (N=57)   52* (N=47)
Henry         Reading   '15-16    35 (N=79)    36 (N=73)    53 (N=71)    34 (N=56)
Henry         Reading   '16-17    85* (N=61)   46* (N=54)   53 (N=57)    62* (N=46)
Lovett        Math      '15-16    21 (N=42)    18 (N=38)    33 (N=42)
Lovett        Math      '16-17    63* (N=34)   51* (N=36)   52* (N=37)
Lovett        Reading   '15-16    17 (N=42)    33 (N=38)    64 (N=42)
Lovett        Reading   '16-17    74* (N=34)   44* (N=36)   57 (N=37)

Source: Chicago Public Schools

NWEA also provides expectations for how each student’s scores should improve at each test period compared to the previous one. Once a test is administered, and if a pretest is available, the improvement that actually occurred can be compared to the expected growth. Table 17 shows the percentages of full-year students who met expectations in BSC schools overall, in non-BSC schools, and in each BSC school.

Table 17: Percentage of Full-year Students in BSC and Non-BSC Sites Who Met Expectations in Math and Reading Test Scores

All grades
              Math '15-16   Math '16-17   Reading '15-16   Reading '16-17
Non-BSC       55.9%         56.0%         60.4%            58.7%
BSC(a)        59.5%         59.1%         59.6%            59.3%
Difference(b)  3.6           3.1          -0.8              0.6
Brooks        n/a           45.8%         n/a              58.3%
Irving Park   60.6%         52.3%         62.2%            56.6%
Disney II     65.5%         70.1%         62.6%            64.3%
Henry         51.6%         53.5%         51.6%            55.7%
Lovett        66.7%         71.6%         68.5%            66.7%

Grades 3–5
              Math '15-16   Math '16-17   Reading '15-16   Reading '16-17
Non-BSC       52.2%         54.7%         58.5%            56.0%
BSC(a)        57.6%         62.7%         57.8%            60.7%
Difference(b)  5.4           8.0          -0.7              4.7
Irving Park   39.5%         42.9%         52.4%            54.7%
Disney II     66.5%         78.2%         57.2%            65.2%
Henry         53.7%         56.9%         55.0%            57.5%
Lovett        66.7%         71.6%         68.5%            66.7%

Grades 6–8
              Math '15-16   Math '16-17   Reading '15-16   Reading '16-17
Non-BSC       59.7%         57.3%         62.3%            61.6%
BSC(a)        63.2%         52.4%         62.8%            56.7%
Difference(b)  3.5          -4.9           0.5             -4.9
Brooks        n/a           45.8%         n/a              58.3%
Irving Park   69.4%         57.4%         66.3%            57.6%
Disney II     62.8%         47.3%         78.4%            61.8%
Henry         43.3%         40.0%         38.3%            48.2%

Source: Chicago Public Schools. (a) In 2016-17, these data exclude Brooks to make the BSC totals comparable across the two years. (b) Difference in percentage points (BSC – non-BSC).


In the 2017 math posttest, BSC students (59.1% met expectations) did better than non-BSC students (56.0%). In reading, BSC students’ scores (59.3% met expectations) were comparable to the rest of the district (58.7%). Disney II and Henry students were especially successful in improving their scores in both subjects. These improvements from Spring 2016 to Spring 2017 occurred mostly in the elementary grades. BSC Grades 6–8 overall started slightly above district averages in 2015-16 but ended below district averages after the first year of BSC.

Looking at School Attainment with Growth in Student Test Scores

To contextualize the decline in the percentages of middle school students’ test scores meeting expectations, overall school attainment measures must be reviewed. The next two tables combine, for each BSC school and its relevant grades, findings on the progression of school attainment with the percentages of students whose scores met expectations for growth. The rows in Tables 18 and 19 with unshaded values of “no” in the column labeled “Concerns” are school/grade combinations with no decrease in student growth or school attainment (all four third grades in Table 18, for example). These schools or grades are therefore those considered to have made good progress, all else being equal. Eight of 16 school/grade combinations made good progress in math (Table 18), and six of 16 made progress in reading (Table 19). The table rows with shaded Concerns cells (such as all the fourth grades in Table 19) show declines in the shares of students meeting growth expectations and/or in school attainment (eight school/grade combinations in math [Table 18] and 10 in reading [Table 19]).

The following is a detailed, school-specific explanation of Tables 18 and 19. The combination of school-level attainment percentiles and whether student scores met expectations in Table 18 shows that decreases in school attainment and lower-than-expected student math test scores are not a concern: the schools (Irving Park and Disney II) that did not improve scores as expected either already had high scores in 2015-16 or showed large improvement in 2016-17. At Irving Park, at least two of the four possibly concerning situations are not severe: sixth-grade math scores were high to begin with, and the attainment percentile increased. Grade 7 shows a moderate increase in the percentage of student scores that met expectations for improvement, despite a very high school attainment level. The small decrease in school attainment could be due to a small cohort effect, especially since public data show Irving Park's Grade 7 was at the 99th percentile in the NWEA National School Growth Percentile for Spring 2017, meaning that school's Grade 7 had a greater average improvement than 99 percent of the schools in the NWEA national sample with the same average Spring 2016 score. Irving Park's Grade 6 attainment had a large increase alongside a small decline in the share of student math scores meeting expectations. This is not concerning, however, because publicly available data show an increase of 12 points in the National School Growth Percentile. The large decline in the percentage of fourth-graders whose scores grew as expected, combined with moderate school attainment levels, is a concern, but again Irving Park's Grade 4 improved by seven points on the NWEA's national growth percentile scale. Finally, Irving Park's eighth grade experienced a very large decrease in the percentage of student math scores that met expectations and a large drop in the national ranking. However, this shift is combined with very high school attainment levels. The "noise" associated with already very high levels, especially in upper grades where ceiling effects are likely, means the very large decrease is not alarming. The scenario is similar for Disney II's sixth grade.


At Henry, fourth and fifth grades may suggest a potential concern: despite low math scores in Spring 2016, few students achieved their improvement targets in Spring 2017, and the national rankings for those grades dropped. Henry's attainment percentiles improved only moderately, perhaps due to a cohort effect. Finally, fifth grade at Lovett is not worrisome: despite low school attainment and decreases in the percentage of students whose scores met growth targets, the school's national growth percentile for Grade 5 was 99 in both years.

Table 18: Math School Attainment and Students Meeting Test Score Expectations

School Name | Grade | Attainment Level in 2015-16 | Attainment Percentile Change, 2015-16 to 2016-17 | Change in Scores Meeting Expectations, 2015-16 to 2016-17 | Concerns
Irving Park | 3 | medium | very large increase | moderate increase | No
Irving Park | 4 | medium | moderate increase | large decrease | Possibly
Irving Park | 6 | high | large increase | small decrease | No
Irving Park | 7 | very high | small decrease | moderate increase | No
Irving Park | 8 | very high | small increase | very large decrease | Possibly
Disney II | 3 | very high | small increase | moderate increase | No
Disney II | 4 | very high | small increase | very large increase | No
Disney II | 5 | very high | small increase | same | No
Disney II | 6 | very high | small decrease | large decrease | Possibly
Henry | 3 | medium | very large increase | large increase | No
Henry | 4 | low | moderate increase | large decrease | Yes
Henry | 5 | medium | moderate increase | moderate increase | No
Henry | 6 | medium | moderate increase | small decrease | Yes
Lovett | 3 | low | very large increase | very large increase | No
Lovett | 4 | very low | very large increase | moderate increase | No
Lovett | 5 | low | large increase | moderate decrease | No

Source: Chicago Public Schools

Likewise, in reading, the combination of school attainment and student test score expectation measures gives little cause for concern (Table 19). Irving Park started with a medium level of fourth-grade attainment in the base year and experienced a decline in the percentage of students whose scores met growth expectations in Spring 2017. However, that drop is associated with a moderate increase in school attainment and a jump in the national school growth ranking from 55 to 90. The share of Henry's fifth-graders meeting expectations decreased slightly, school attainment remained the same, and the national ranking fell. Finally, Lovett's fifth grade had high attainment levels and experienced a moderate decline in attainment and a large decline in the percentage of students who met improvement expectations. However, national rankings were very high in both years: 99 in 2015-16 and 97 in 2016-17.


Table 19: Reading School Attainment and Students Meeting Test Score Expectations

School Name | Grade | Reading Attainment Level in 2015-16 | Attainment Percentile Change, 2015-16 to 2016-17 | Change in Scores Meeting Expectations, 2015-16 to 2016-17 | Concerns
Irving Park | 3 | medium | very large increase | small increase | No
Irving Park | 4 | medium | moderate increase | moderate decrease | Possibly
Irving Park | 6 | very high | moderate increase | large decrease | No
Irving Park | 7 | very high | same | small decrease | No
Irving Park | 8 | very high | small decrease | large decrease | No
Disney II | 3 | very high | small decrease | large increase | No
Disney II | 4 | very high | moderate decrease | small decrease | No
Disney II | 5 | very high | moderate increase | large increase | No
Disney II | 6 | very high | small decrease | large decrease | No
Henry | 3 | low | very large increase | large increase | No
Henry | 4 | low | large increase | moderate decrease | No
Henry | 5 | medium | same | small decrease | Possibly
Henry | 6 | low | very large increase | large increase | No
Lovett | 3 | very low | very large increase | same | No
Lovett | 4 | low | large increase | moderate increase | No
Lovett | 5 | high | moderate decrease | large decrease | Possibly

Source: Chicago Public Schools

Distribution of Improvement in Student Test Scores

Here the report summarizes the distribution of student growth in test scores at the BSC/non-BSC level and for each school. Rather than focusing only on averages, Table 20 summarizes the median, lower quartile, and upper quartile of student SGPs in math and reading. By construction, those quartiles for a given population would be 25, 50, and 75. BSC schools are well above those values in all three quartiles in math and reading in both years. Moreover, during the first year of the BSC cohort, those quartiles either improved or stayed the same, showing good improvement in academic outcomes.

Overall, in both years, BSC schools outperformed non-BSC schools in math in all three quartiles. Compared to the base year of 2016, BSC schools improved in math at the median and in the upper quartile in 2017. In reading, BSC schools outperformed non-BSC schools only in the lower quartile, and only during the year personalized learning was instituted, despite Lovett's strong improvement in its upper quartile.


Table 20: MAP Spring 2016 to Spring 2017 Lower Quartile, Median, and Upper Quartile in Student Growth Percentile for BSC, non-BSC, and Each School

School | Subject | Lower Quartile ('15-16 / '16-17) | Median ('15-16 / '16-17) | Upper Quartile ('15-16 / '16-17)
Non-BSC | Math | 25 / 25 | 54 / 54 | 81 / 81
Non-BSC | Reading | 31 / 29 | 59 / 58 | 83 / 82
BSC | Math | 30 / 29 | 58 / 60 | 82 / 85
BSC | Reading | 31 / 34 | 58 / 58 | 83 / 82
Difference (BSC - non-BSC) | Math | +5 pp / +4 pp | +4 pp / +6 pp | +1 pp / +4 pp
Difference (BSC - non-BSC) | Reading | 0 pp / +5 pp | -1 pp / 0 pp | 0 pp / 0 pp
Brooks | Math | NA / 12 | NA / 41 | NA / 63
Brooks | Reading | NA / 36 | NA / 63 | NA / 80
Irving Park | Math | 30 / 28 | 56.5 / 50.5 | 81 / 82
Irving Park | Reading | 32 / 35 | 63 / 58 | 79 / 79.5
Disney II | Math | 39 / 40 | 64 / 70 | 84 / 91
Disney II | Reading | 35 / 39 | 58 / 64 | 84 / 82
Henry | Math | 24 / 20 | 49 / 52 | 75 / 81
Henry | Reading | 26 / 28 | 49 / 54 | 78 / 79
Lovett | Math | 34 / 44 | 67 / 67 | 94 / 93
Lovett | Reading | 37 / 33 | 74 / 68 | 74 / 94

Source: CPS student-level data. pp = percentage point. NA = no 2015-16 values reported for Brooks.

Growth in Elementary and Middle School and by “Where Students Start”

In theory, we would expect personalized learning to have a differential impact on students based on how they perform on the pretest (Spring 2016) relative to the rest of the students in their schools. For instance, in a nonpersonalized learning environment with limited resources, teachers could pitch instruction at the "middle" or average of the classroom's starting point in order to maximize classroom learning. Once personalized learning is implemented, one would expect students at the bottom and at the top of the pretest score distribution to benefit the most from the change in instruction.

Moreover, we might expect elementary students to be quicker to adopt a personalized approach to their learning, as they have less to "unlearn" from years of more traditional instruction. On the other hand, middle school students may find it difficult to transition into the independent, project-focused, and competency-based instructional styles typical of personalized learning.

To illustrate these potential differences, the statistics are next presented separately for elementary school students and middle school students. The statistics involve MAP scores from tests taken in Spring 2016 (pretest) and Spring 2017 (posttest).

Elementary School (Grades 3–5)

In this section we look at the distribution of school Student Growth Percentiles (SGPs). SGPs are constructed as follows: all students with the same pretest score are grouped and ranked in increasing order of their posttest score. Using this ranking, each student receives a growth percentile. For example, an SGP value of 36 means that the student grew more than 36 percent of the students sharing the same pretest score. By construction, for a given sample such as a district, the quartile SGPs take values of 25, 50, and 75: 25 percent of SGPs fall between 0 and 25, 25 percent between 25 and 50, 25 percent between 50 and 75, and 25 percent between 75 and 100. We will refer to those values as "normal" in what follows. Starting with Table 21, for elementary math, we see that the BSC SGP for the upper quartile started very high at 82 and ended even higher at 88. The rest of the district, also higher than normal, started at 79 and ended at 82 over the same period. For the lower quartile, unlike the rest of the district, BSC started well above the normal point at 29 in 2016 and rose to 30 in 2017. Finally, for the median, BSC started above the normal point (50) at 56 in 2015-16 and reached 64 in 2016-17. Hence, we see overall improvement in the distribution of student growth in test scores.
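The SGP construction just described can be sketched in a few lines of code. This is a simplified illustration only: operational SGP models (such as NWEA's) use quantile regression rather than exact pretest matching, and all function and variable names here are ours.

```python
import statistics
from collections import defaultdict

def student_growth_percentiles(records):
    """Simplified SGP: group students by identical pretest score and,
    within each group, score each student by the percentage of peers
    in that group whose posttest is strictly lower.

    `records` is a list of (student_id, pretest, posttest) tuples.
    """
    by_pretest = defaultdict(list)
    for sid, pre, post in records:
        by_pretest[pre].append((sid, post))

    sgps = {}
    for group in by_pretest.values():
        n = len(group)
        for sid, post in group:
            below = sum(other < post for _, other in group)
            sgps[sid] = 100.0 * below / n
    return sgps

def quartile_summary(sgp_values):
    """Lower quartile, median, and upper quartile of a set of SGPs;
    for an entire district these should sit near 25, 50, and 75,
    the report's "normal" reference points."""
    return tuple(statistics.quantiles(sorted(sgp_values), n=4))
```

For example, four students sharing a pretest of 200 with posttests of 210, 220, 230, and 240 receive SGPs of 0, 25, 50, and 75 under this construction.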

In particular, Henry and Irving Park seem to be improving mostly among top performers. Disney II improves greatly in all pretest categories but especially at the tails, as theorized above. Lovett at first seems to show no particular pattern among students who meet growth expectations, but its student growth percentiles were already very high in 2015-16: 34 for the lower quartile (versus a norm of 25), a median of 67 (versus 50), and 94 (versus 75) for the upper quartile.

Turning to reading, BSC schools seem to be improving in all three quartiles, but especially in the lower quartile and median. Disney II and Henry clearly follow that pattern. Lovett again shows no clear pattern, but it starts with very high values in 2015-16, especially for the median and upper quartile.

Table 21: Grades 3–5 School Quartile of Student Growth Percentile

School | Subject | Lower Quartile SGP ('15-16 / '16-17) | Median SGP ('15-16 / '16-17) | Upper Quartile SGP ('15-16 / '16-17)
Non-BSC | Math | 21 / 23 | 50 / 53 | 79 / 82
Non-BSC | Reading | 26 / 24 | 57 / 54 | 84 / 81
BSC | Math | 29 / 30 | 56 / 64 | 82 / 88
BSC | Reading | 28 / 31 | 54.5 / 59 | 84 / 82
Difference (BSC - non-BSC) | Math | +8 pp / +7 pp | +6 pp / +11 pp | +3 pp / +6 pp
Difference (BSC - non-BSC) | Reading | +2 pp / +7 pp | -2.5 pp / +5 pp | 0 pp / +1 pp
Irving Park | Math | 20 / 18 | 39 / 39 | 59 / 70
Irving Park | Reading | 22 / 23 | 50 / 51.5 | 74 / 76
Disney II | Math | 39 / 51 | 65 / 78 | 84.5 / 94.5
Disney II | Reading | 29 / 33 | 53 / 63 | 79.5 / 80
Henry | Math | 24 / 21 | 50 / 59.5 | 77 / 84
Henry | Reading | 26 / 29 | 52 / 54 | 82 / 81
Lovett | Math | 34 / 44 | 67 / 67 | 94 / 93
Lovett | Reading | 37 / 33 | 74 / 68 | 93 / 94

Source: Chicago Public Schools. Notes: Brooks is excluded because it does not house elementary grades; pp = percentage point.

Table 22 reports the percentages of elementary school students whose test scores met expectations in math and reading for Spring 2016 and Spring 2017, broken out by the quartile of the pretest distribution into which students fell.
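The pretest-quartile breakdown used in Tables 22 and 24 can be sketched as follows. This is a minimal illustration under assumptions of our own (the report does not specify how ties at a cutpoint were handled), and all names are hypothetical.

```python
import statistics

def pretest_quartile_bins(scores):
    """Assign each pretest score to a quartile bin (1 = lower quartile,
    4 = upper quartile).  Ties at a cutpoint fall into the lower bin,
    an arbitrary choice for illustration.
    """
    q1, q2, q3 = statistics.quantiles(scores, n=4)

    def bin_of(score):
        if score <= q1:
            return 1
        if score <= q2:
            return 2
        if score <= q3:
            return 3
        return 4

    return [bin_of(s) for s in scores]
```

Within each bin, one can then tabulate the share of students whose posttest scores met their growth expectations, as the tables do.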


Table 22: Elementary (Grades 3–5) BSC and non-BSC Students Whose Math and Reading Scores Met Expectations by Pre-test Score Quartile

School | Subject | Lower Quartile ('15-16 / '16-17) | Quartile 2 ('15-16 / '16-17) | Quartile 3 ('15-16 / '16-17) | Upper Quartile ('15-16 / '16-17)
Non-BSC | Math | 56.6% / 55.2% | 53.3% / 52.8% | 51.6% / 55.0% | 47.5% / 55.9%
Non-BSC | Reading | 60.6% / 57.5% | 61.1% / 58.5% | 58.5% / 56.1% | 53.9% / 51.8%
BSC | Math | 62.3% / 63.2% | 51.7% / 55.6% | 57.0% / 69.0% | 59.7% / 62.7%
BSC | Reading | 57.5% / 61.6% | 61.5% / 61.7% | 55.7% / 59.3% | 56.7% / 60.0%
Difference (BSC - non-BSC) | Math | 5.7 pp / 7.9 pp | -1.6 pp / 2.8 pp | 5.3 pp / 14.1 pp | 12.8 pp / 6.8 pp
Difference (BSC - non-BSC) | Reading | -3.2 pp / 4.1 pp | 0.5 pp / 3.2 pp | -2.8 pp / 3.2 pp | 2.9 pp / 8.2 pp
Brooks (a) | Math | NA / NA | NA / NA | NA / NA | NA / NA
Brooks (a) | Reading | NA / NA | NA / NA | NA / NA | NA / NA
Irving Park | Math | 52.6% / 45.8% | 47.6% / 34.6% | 21.7% / 37.9% | 38.9% / 53.9%
Irving Park | Reading | 47.4% / 77.3% | 66.7% / 51.6% | 45.5% / 44.0% | 50.0% / 50.0%
Disney II | Math | 57.1% / 75.0% | 60.0% / 69.1% | 75.0% / 85.0% | 73.0% / 84.2%
Disney II | Reading | 50.0% / 56.8% | 55.6% / 67.6% | 62.5% / 63.6% | 60.5% / 73.0%
Henry | Math | 61.4% / 54.0% | 40.0% / 50.9% | 55.0% / 65.5% | 57.9% / 56.6%
Henry | Reading | 53.6% / 56.4% | 59.7% / 55.6% | 53.3% / 59.3% | 53.6% / 58.9%
Lovett | Math | 77.8% / 82.6% | 65.5% / 65.4% | 64.3% / 85.7% | 59.3% / 52.0%
Lovett | Reading | 82.1% / 66.7% | 69.0% / 77.8% | 59.3% / 66.7% | 63.0% / 54.2%

(a) These data are included for the sake of completeness, but Ns are very small. pp = percentage point.

Middle School (Grades 6–8)

Table 23 shows the median and the lower and upper student growth quartiles. For math, the 2015-16 starting points are not as far above normal as in the elementary grades, and they improve only in the upper quartile, rising from 77 percentile points in Spring 2016 to 79 in Spring 2017. However, except for Henry, school attainment percentile levels are extremely high.

For middle school reading, student growth quartiles are above normal levels for both BSC and non-BSC schools. Yet the lower and upper quartiles increase while the percentage of students meeting expectations decreases in all four pretest quartiles.


Table 23: Grades 6–8 School Quartile of Student Growth Percentile

School | Subject | Lower Quartile SGP ('15-16 / '16-17) | Median SGP ('15-16 / '16-17) | Upper Quartile SGP ('15-16 / '16-17)
Non-BSC | Math | 29 / 27 | 58 / 55 | 83 / 81
Non-BSC | Reading | 36 / 35 | 61 / 61 | 82 / 82
BSC | Math | 30 / 25 | 55 / 51 | 77 / 79
BSC | Reading | 36 / 38 | 60 / 58 | 78 / 82
Difference (BSC - non-BSC) | Math | +1 pp / -2 pp | -3 pp / -4 pp | -6 pp / -2 pp
Difference (BSC - non-BSC) | Reading | 0 pp / +3 pp | -1 pp / -3 pp | -4 pp / 0 pp
Brooks | Math | NA / 12 | NA / 41 | NA / 63
Brooks | Reading | NA / 36 | NA / 63 | NA / 80
Irving Park | Math | 36 / 29 | 65 / 57 | 84 / 82
Irving Park | Reading | 38 / 40 | 67 / 58.5 | 82 / 81
Disney II | Math | 38 / 17 | 63 / 48 | 80 / 73
Disney II | Reading | 57 / 40 | 76 / 71 | 88 / 85
Henry | Math | 24 / 8 | 40.5 / 39 | 69 / 69
Henry | Reading | 23 / 17 | 43 / 48 | 62 / 76

Source: Chicago Public Schools. Notes: Lovett is excluded because it does not house middle grades that engaged in personalized learning in the study year; pp = percentage point.

Table 24 reports the changes in percentages of students meeting expectations in math and reading.

Table 24: Grades 6–8 BSC and non-BSC Students Whose Math and Reading Scores Met Expectations by Pre-test Score Quartile

School | Subject | Lower Quartile ('15-16 / '16-17) | Quartile 2 ('15-16 / '16-17) | Quartile 3 ('15-16 / '16-17) | Upper Quartile ('15-16 / '16-17)
Non-BSC | Math | 62.6% / 57.5% | 59.7% / 55.8% | 58.3% / 56.7% | 58.4% / 59.5%
Non-BSC | Reading | 67.3% / 66.0% | 65.5% / 65.3% | 61.9% / 62.3% | 55.0% / 53.0%
BSC (a) | Math | 62.3% / 57.8% | 59.5% / 59.7% | 61.7% / 51.3% | 69.4% / 42.2%
BSC (a) | Reading | 71.8% / 64.8% | 62.0% / 61.5% | 65.4% / 55.6% | 52.6% / 45.5%
Difference (BSC - non-BSC) | Math | -0.3 pp / 0.3 pp | -0.2 pp / 3.9 pp | 3.4 pp / -5.4 pp | 11.0 pp / -17.3 pp
Difference (BSC - non-BSC) | Reading | 4.5 pp / -1.2 pp | -3.5 pp / -3.8 pp | 3.5 pp / -6.7 pp | -2.4 pp / -7.5 pp
Brooks (b) | Math | NA / 41.7% | NA / 38.5% | NA / 54.6% | NA / 50.0%
Brooks (b) | Reading | NA / 63.6% | NA / 61.5% | NA / 54.6% | NA / 53.9%
Irving Park | Math | 65.3% / 64.4% | 56.5% / 63.8% | 67.9% / 54.9% | 88.9% / 48.2%
Irving Park | Reading | 70.5% / 68.2% | 64.0% / 54.9% | 71.4% / 60.4% | 60.0% / 48.0%
Disney II | Math | 75.0% / 61.5% | 60.0% / 50.0% | 66.7% / 42.9% | 50.0% / 35.7%
Disney II | Reading | 83.3% / 76.9% | 84.6% / 73.3% | 85.7% / 50.0% | 58.3% / 46.2%
Henry (b) | Math | 43.8% / 30.8% | 69.2% / 56.3% | 37.5% / 45.5% | 26.7% / 26.7%
Henry (b) | Reading | 66.7% / 42.9% | 37.5% / 75.0% | 26.7% / 42.9% | 21.4% / 35.7%

(a) In 2016-17, these exclude Brooks to make it comparable across the two years. (b) These data are included for the sake of completeness, but Ns are very small.
Source: Chicago Public Schools. Notes: Lovett is excluded because it did not house middle grades engaged in personalized learning during the study year; pp = percentage point.


In sum, although we cannot make causal claims from this descriptive analysis, the evaluation of BSC in Chicago does allow for an understanding of growth patterns on academic outcomes. These analyses indicate mostly neutral or positive movement in math and reading scores for BSC students. The descriptive analyses of school-level statistics show that incorporating an evolution of school attainment in the context of changes in student test scores can alter the conclusion one might draw from looking at growth statistics only. This possibility highlights the importance of future evaluations considering not just the starting positions in outcome measures but also students’ historical progression in those same measures.5

5 The quasi-experimental analyses control for student demographic characteristics and show overall neutral or positive effects, but are only available in the appendix because the necessary conditions required to make any causal claims were not met.


Discussion: What does strong personalized learning implementation look like?

What follows is a synthesis of distinctive practices seen at each of the schools in the BSC cohort, highlighting first those schools with consistent and comprehensive implementation of personalized learning across classrooms. Each exhibited personalized learning practices other schools could borrow, especially those with similar conceptions of personalized learning. This collection of promising practices gives school and district leaders, practitioners, and researchers a detailed picture of what strong personalized learning looks like within a school. Additional details about each school are in the school profiles found in Appendix E.

Disney II

In Disney II, several aspects of personalized learning were observed. The school uses the Summit Basecamp platform for personalized learning in the middle grades, and school administrators see personalized learning as happening in preschool. Despite identifying more challenges to personalized learning implementation than any other school (perhaps a result of the number of interviews conducted), Disney II still exhibited distinctive personalized learning practices, including the following:

• Learner Connected work. Schools tend to have difficulty with the “Learner Connected” piece of the LEAP Framework (“learning transcends the classroom in relevant and accredited ways, connected to families and communities.”) Disney II, however, showed specific efforts in this area. We observed parents working with second-grade students during our initial schoolwide observation. Additionally, the high school principal at Disney II explained in depth how staff engaged students, parents, and teachers to create a mission and vision statement by surveying students and letting students and parents control the process.

• Collaborative Learning Labs. Disney II utilized learning labs that cut across multiple subjects and grades, such as a second-grade lab that comprises multiple classrooms and English language arts and math. Learning labs also foster teacher collaboration.

Lovett

Lovett displayed a depth and breadth of personalized learning implementation, with some innovative practices other schools would benefit from exploring. For example:

• “Flex Friday.” Elementary school students can choose different academic or nonacademic courses based on their interests. Some students even teach courses themselves, which also involves lesson planning. For example, Spanish-speaking students in one school taught Spanish language classes to their peers.

• Flexible learning environments. When observing classrooms at Lovett, one immediately notices students engaging in personalized learning while sitting on risers or beanbags in the hallway, or on the floor working on computers.

Administrator: “It’s cool to see kids lesson plan.”

School administrator: “Preschool is perfect personalized learning: choice, excitement.”


• Data. Lovett is heavily data-driven. During two observed grade-level meetings, school leadership consistently used data (test scores, attendance) to discuss student progress. At an observed meeting with outside visitors, the principal used data to discuss Lovett’s performance.

• Tracking and reflection sheets. Students at Lovett have “tracking and reflection sheets” where they record the progress they make toward their learning goals.

• Mixed-age grouping. Students are combined into multigrade classrooms (for instance, Grades 2–3). This practice allows younger students to progress at a faster pace if they are more advanced and helps older students who may need additional support.

Henry

Henry has made a deliberate effort to incorporate personalized learning gradually, targeting third grade in 2016-17, and showed strong fidelity to its original Blueprint. Some of Henry’s strong practices included:

• Project-based learning. Perhaps the practice most aligned to personalized learning observed in any school was Henry’s “Genius Hour.” Students choose a long-term project to work on based on their interests. For instance, one student worked on a presentation about the card game Pokémon. Three other students practiced their PowerPoint presentations out loud with one another. In 2016-17, Henry piloted Genius Hour in one second-grade classroom and one third-grade classroom, with the goal of eventually expanding to all grades. Genius Hour fits the Learner Led, Learner Focused, and Learner Demonstrated categories of the LEAP Framework. Because students can also work in several environments and on different platforms, Genius Hour functions as a practice cluster.

• Gradual release. Starting in Fall 2017, third-grade classrooms included “levels of autonomy,” in which students gradually advance to working in different places within or outside their classrooms, one way Henry utilizes flexible learning environments and schedules. Additionally, in one observed classroom, students had choice schedules: checking in with their teacher each week, students chose what to work on from a menu of options (Lexia, etc.) and checked items off when they completed them. The menu was largely the same from week to week; the choice was more about when students did these activities than whether they did them or what the work looked like.

• Vertical alignment. At the end of 2016-17, Henry held a schoolwide meeting with all grades. Each grade group created a slide showing what they had done with respect to personalized learning throughout the year so that the other grades could get a sense of their approaches.

• Mixed-age grouping. In the fall of 2017, mixed-age grouping was present inside of “units,” in which students chose a subject to work on from a menu of topics outside of core areas.

Lindblom, Brooks, and Irving Park

Schoolwide implementation of personalized learning can be difficult to observe in secondary schools, where traditional pedagogic structures already tend to sort students by interest and ability. That said, the evaluation captured some distinctive practices at Lindblom, including:


• Colloquia. Teachers created colloquia, where students sign up for three flex classes each week based on their interests and passions. For instance, one classroom of students was planning Lindblom’s 100-year anniversary. Student-run colloquia included topics such as cosmetology and animation. The original design was to give back to the Englewood community, and some of the colloquia do that, while others are simply based on students’ interests and passions.

• Student assistants (peer instruction). Lindblom has a student assistant teacher program in which current students who have taken particular classes help instruct them. In one Advanced Placement class, the teacher explained that a student assistant had planned the day’s warm-up activity, taught a lesson on how to take notes, and read through the reflections students wrote at the end of the previous unit.

Brooks started by incorporating personalized learning into seventh grade in 2016-17, then added eighth grade in 2017-18 with the intention of expanding to the whole school within five years if funding supports it. Distinctive practices at Brooks include:

• Student assistants (peer instruction). As noted, Brooks’ principal mentioned that they “copied” this concept from Lindblom. Classroom assistants at Brooks are older students who want to go into education as a career, and who mentor and instruct younger students.

• Brooks has a set of explicit social/emotional tenets and expectations it encourages students to keep in mind.

Finally, Irving Park was home to several distinctive practices (such as project-based learning and peer instruction) and strong teachers who could be considered exemplars. Irving Park staff are aware of the school’s relative shortcomings (moving from a more teacher-directed to a student-directed approach, and engaging with the community). The school tried to emphasize flexible learning environments and schedules, but teachers also felt they needed at times to restrict flexibility and make classrooms more structured because students were not hitting their academic goals. The most distinctive personalized learning practice at Irving Park relates to positive behavioral interventions and supports, and restorative practices. For example, restorative practice was observed when a teacher urged a student to apologize to another. In a professional learning meeting, teachers described how they would emphasize empathy and compassion among students during Autism Awareness Week.


Recommendations for Programming and Evaluation Design

This evaluation provides insights toward the continuous improvement and scaling of personalized learning models via the BSC initiative. Drawing directly from the findings described above, our recommendations are relevant to how LEAP might best support and evaluate personalized learning in BSC, as well as more generally to educators and educational organizations looking to implement personalized learning. Connections between these recommendations and the evaluation questions are noted.

Evaluation Questions

1. What are the proposed personalized learning models in Breakthrough Schools: Chicago and how are they aligned to the LEAP Framework?

2. What are patterns in practice and readiness in the first year of implementation at the school site level, and is implementation consistent with intended and proposed program models?

3. What are patterns in academic outcomes for students engaged in personalized learning in Breakthrough Schools: Chicago?

These are summaries of more specific and formative recommendations provided to LEAP throughout the evaluation process, including technical recommendations for future analysis of impact.

● Give purposeful attention to the persistent surrounding conditions (e.g. policy contexts, staff turnover, available technology, budget structures) that impact implementation when designing models, funding structures and professional development approaches (Questions 1 and 2).

● Focus on understanding the nuances of implementation and why it might look different at different schools, which can inform the planning process for schools considering personalized learning models (Question 2).

● Structure and support opportunities for schools to educate one another on their own promising practices, drawing on the persistent theme of collaboration in the data (Question 2).

● Schools may want to consider which practices tend to work well together, and purposefully plan to implement practice clusters to help embed personalized learning (Question 2).

● Draw on suggestive descriptive outcome data and explore the personalized learning practices of schools showing strong implementation and outcomes (Questions 2 and 3).

● Similar to other school-wide change models, defining and capturing the “dosage” of personalized learning is complicated by differences within and between schools. This variability concerns not only the scope of implementation (e.g., classroom, grade, or school level) but also the nature of the programming itself (e.g., flexible seating combined with station rotation versus a 1:1 device initiative where all students use personalized learning plans to complete a project). Schools and districts must develop a common understanding or definition of personalized learning practices early in the implementation process (Questions 1 and 2).

● When considering a research or evaluation design:

o Think carefully about how to structure observations of classroom practice, given some elements (e.g., “growth mindset”) are challenging to observe and many are found in “practice clusters.” Also consider how to leverage teachers’ observations of one another’s practice (Question 2).

o Revisit the LEAP survey in terms of its format (e.g., length), administration (e.g., incentives to increase response rates), and use (e.g., targeted professional development with school staff on how to regularly use the survey data) (Questions 2 and 3).

o Given limitations of using quasi-experimental estimates of impact, explore descriptive analyses of quantitative outcomes (Question 3).


Appendix A: Qualitative Technical Appendix

Qualitative Data Collection

The WEC evaluation team conducted qualitative fieldwork including school visits, classroom observations, interviews, and artifact collection. In advance of the first round of school visits, WEC thoroughly reviewed each school’s Blueprint, or the intended personalized learning models and practices articulated by each school. Table A-1 provides a full list of the personalized learning practices WEC identified along with a general definition of each practice and the number of schools that identified each personalized learning practice within their blueprints.

Table A-1: Personalized learning practices described in blueprints

| Personalized Learning Practice | Definition | Number of Schools |
| Personalized/learner focused/targeted instruction | Instruction is personalized for each student based on his/her needs. | 6 |
| Personal learning paths | Each student has his/her own distinct learning path. | 6 |
| Competency- or proficiency-based progression | Students progress in their learning based on competency and mastery of material. | 6 |
| Flexible learning environments/schedules | Students have the opportunity to learn where and when they want. | 6 |
| Learner profiles | Each student has a profile of his/her interests and learning styles. | 3 |
| Content area specialized classrooms | Classrooms are specialized by subject. | 1 |
| Mixed-age grouping | Students learn in classrooms mixed by age or grade level. | 3 |
| Collaborative learning models/labs | Students learn in classrooms or lab spaces designed to foster collaboration. | 3 |
| Project-based learning | Students learn by working on projects suited to learning needs, styles, and interests. | 3 |
| Data-driven consultation | Staff and students use data to understand needs and develop learning targets. | 1 |
| Students as teachers (peer instruction) | Students learn by teaching each other. | 1 |
| Social emotional and/or noncognitive learning | Students develop nonacademic skills through personalization. | 4 |
| Positive behavioral interventions and supports (PBIS) and restorative practices | Students engage in restorative practices as a result of personalization. | 3 |
| Understanding by design | Educators set goals and work backward to identify supporting instructional practices (backward-mapping). | 2 |
| Horizontal alignment | Personalization is aligned within a grade level. | 2 |
| Vertical alignment | Personalized learning aligns across grade levels; teachers develop sequences that allow students to build skills and knowledge as they advance. | 2 |
| Gradual release | Teachers gradually provide students the opportunity to engage in personalized learning based on the degree to which students choose where they work and what they work on. | 2 |
| Flexible staffing | Staff move between grade levels, subject areas, and/or classrooms. | 2 |
| Participatory culture (around new media)* | Students learn by creating their own media. | 0 |

*WEC included this item based on author Halverson’s research on personalized learning; none of the schools explicitly included it within their blueprints.

Starting in the fall of 2016 and continuing through the fall of 2017, WEC conducted five rounds of school observations across the BSC cohort, including schoolwide and classroom observations. As WEC evaluators were not able to observe every grade in every school, Table A-2 shows the specific subjects and classrooms observed, in addition to initial schoolwide observations, to give context to findings and identify teachers to focus on in survey analyses. Instances of “multiple” reflect that students were working on more than one subject within the classroom.

Table A-2: Grades and subjects observed in each school

In addition to an initial whole-school observation at each site, WEC observed the following subjects:

● Brooks: ELA, Social Studies, STEM
● Irving Park: Math, and a class covering multiple subjects
● Disney II: Math, ELA, Science, Algebra, Human Geography
● Henry: Math, Literacy, and classes covering multiple subjects
● Lindblom: Algebra, AP World History, Choir, Engineering, “Colloquium,” Computer Lab
● Lovett: Reading, ELA, Literacy, Math

At each site, WEC first conducted semi-structured interviews with school administrators, as well as brief interviews with educators when time permitted before or after observing classes. In subsequent visits, WEC engaged in additional informal interviews with school administration and staff. During observations, evaluation team members occasionally and informally spoke with students to get a sense of the activities they were working on and their general perceptions and understanding of personalized learning in their classrooms, but never collected identifiable information from students. WEC evaluators attended and observed teacher group meetings and professional learning communities in the spring of 2017, capturing educators’ discussions and collecting targeted, written reflections from them at the conclusion of these meetings. The two reflection questions were “Was this a typical meeting?” and “How did this meeting build your capacity to implement personalized learning in your classroom?” Additionally, LEAP and WEC held a conferring session with BSC representatives (administrators and teachers) in the summer of 2017 to review preliminary findings and capture schools’ impressions of patterns in the qualitative work.

Qualitative Data Analysis

The guiding evaluation questions and the LEAP Framework’s Learner Led, Learner Focused, Learner Demonstrated, and Learner Connected components provided the basis for the qualitative analysis. To record and code observation data, WEC used Teachboost, a tool that LEAP has used in other schools (including Pilot Network Schools) to “tag” or code observations to the LEAP Framework. WEC also added models and personalized learning practices described in Table A-1 as additional tags to the LEAP Framework within Teachboost. The code tree was also used to code interviews, teacher meeting notes and documents using Excel. The full code tree is shown in Table A-3. (Some of the language in the LEAP Framework has been updated since WEC conducted its analyses; see http://leaplearningframework.org.) The set of “challenges” codes was developed iteratively based on initial interview and teacher meeting data collected.

Once the coding of data collected in the 2016-17 school year was complete, the evaluation team (Marlin, Good, Vadas, and Halverson) discussed emerging themes and patterns, including implications for the quantitative analysis. For example, initial observation and interview data further illustrated the complexity of defining both the nature and the magnitude of the “dosage” of personalized learning at the classroom, grade, or school level. Subsequent early briefings with LEAP and the conferring workshop with school staff in spring and summer of 2017 provided opportunities for participant or member checks of these early patterns. Once qualitative data collection was complete, WEC revisited all data for the patterns and themes in the code tree, including ongoing discussions around triangulating different sources of data against these patterns. For example, WEC and LEAP teams discussed the emergence of “practice clusters” across observation, interview, and survey data. WEC also conducted regular check-ins between the qualitative analysis and the quantitative analysis of student-level data, including around the grades in which WEC evaluators were seeing the deepest implementation and interesting patterns in outcomes.

Table A-3: Code tree

LEARNER LED
1.1 Collaborate with learners to identify and include learner preferences and optimal learning conditions (e.g., modalities, technology use, the nature and duration of learning activities, pacing, grouping size, and when/where learning will take place)
1.2 Partner in setting learning goals and plans
1.3 Articulate their interests, strengths, and needs
1.4 Assess and monitor their own progress
1.5 Collaborate with others to achieve goals
1.6 Advocate for needed support from teachers, peers, technology and other sources
1.7 Reflect upon their learning and continually refine their strategies
1.8 Adopt a growth mindset
1.9 Other (learner led)

LEARNER FOCUSED
2.1 Have learning opportunities that reflect an understanding of individual needs, interests and strengths with respect to academic skills and needs
2.2 Have learning opportunities that reflect an understanding of individual needs, interests and strengths with respect to nonacademic skills and needs, including social, cultural and emotional
2.3 Other (learner focused)

LEARNER DEMONSTRATED
3.1 Enter at a level appropriate to their prior knowledge and learning needs
3.2 Have supports and pacing that fit their learning
3.3 Demonstrate proficiency when ready
3.4 Demonstrate evidence of learning in multiple ways
3.5 Receive credit (recognition of learning) based on demonstrated proficiency (not seat time)
3.6 Other (learner demonstrated)

LEARNER CONNECTED
4 Learning transcends the classroom in relevant and accredited ways, connected to families and communities

RESOURCES
5.1 Hardware
5.2 Software
5.3 Internet connectivity
5.4 Non-digital instructional resources
5.5 Physical space
5.6 Other resources

PERSONALIZED LEARNING PRACTICES
6.1 Personalized/learner focused/targeted instruction
6.2 Personal learning paths
6.3 Competency- or proficiency-based progression
6.4 Flexible learning environments/schedules
6.5 Learner profiles
6.6 Content area specialized classrooms
6.7 Mixed-age grouping
6.8 Blended learning
6.9 Collaborative learning models/labs
6.10 Project-based learning
6.11 Data-driven consultation
6.12 Students as teachers (peer instruction)
6.13 Social emotional and/or noncognitive learning
6.14 Positive Behavioral Interventions and Supports (PBIS) and restorative practices
6.15 Understanding by design
6.16 Horizontal alignment
6.17 Vertical alignment
6.18 Gradual release
6.19 Flexible staffing
6.20 Participatory culture (around new media)
6.21 Other personalized learning practices

INDIVIDUAL STUDENTS
7.1 Capacity (knowledge, skills, time)
7.2 Practice
7.3 Other (student)

INDIVIDUAL STAFF/ADULTS
8.1 Capacity (knowledge, skills, time)
8.2 Practice
8.3 Consultation with students
8.4 Other (staff)

OTHER (general)
9

CHALLENGES
10.1 Time
10.2 Funding/sustainability
10.3 Capacity
10.4 Digital tools
10.5 Staffing
10.6 Difficult/too easy for students
10.7 Need to meet standards/testing requirements/grading
10.8 Lack of student agency (instructor-driven)
10.9 Parents/homes
10.10 Attendance
10.11 Sustainability
10.12 Union
10.13 Other


Appendix B: Quantitative Technical Appendix

Survey Analysis

LEAP Innovations administered surveys to students and teachers in Fall 2016, Spring 2017, and Fall 2017. For survey analysis, WEC identified both demographics of respondents and patterns in responses from fall 2016 to fall 2017 on selected survey items. While it would be optimal to review spring-to-spring survey patterns (as students would have had a full year with their teachers in personalized learning settings), WEC’s final round of observations took place in the fall of 2017, so it may be improper to use spring 2018 data for purposes of this evaluation. Descriptive respondent information is located in each school’s profile. For response patterns, WEC ran two sets of analyses for both the student and teacher surveys. For the first set of student survey results, WEC removed any grades identified (with LEAP’s affirmation) as not employing an explicit personalized learning model. LEAP staff indicated that they only would like the 4th-8th grade student surveys used in our analysis at this time. Thus, while elementary schools are implementing personalized learning practices for grades below 4th grade, survey results for student respondents in those grades are not included in our analyses of responses. Further, while two schools have high school respondents, students in those grades are also removed from our analysis set. With respect to the teacher survey, LEAP indicated that WEC should only analyze responses to the K-8 survey at this time. Student and teacher respondent information was anonymized to ensure privacy.

In analyzing teacher survey results, the first step was to isolate K-8 teachers. The Fall 2016 and Spring 2017 surveys only included grade ranges, not individual grades, but WEC still was able to remove the teachers who only taught high school grades or did not have any grade levels identified in their responses. In a second set of teacher analyses, we restricted the teacher sample to only those who completed the survey in fall 2016 and spring 2017 and aggregated all responses (which we termed the “restricted teacher sample”). We then ran a second student analysis consisting of only those students who had teachers in our restricted set (which we termed “restricted student sample”). Table B-1 shows the number of students and teachers in each sample in each survey term; further descriptive information on responses to the specific survey items in the report can be found in Appendix D.

Table B-1: Student (grades 4-8) and teacher (grades K-8) survey respondents

| Sample | Fall 2016 | Spring 2017 | Fall 2017 |
| All student sample | 867 | 945 | 869 |
| Restricted student sample | 420 | 443 | 226 |
| All teacher sample | 81 | 50 | 93 |
| Restricted teacher sample | 40 | 40 | 24 |

For the school profiles, we averaged unrestricted student and teacher responses in each survey administration to give “scores” for each Learner Focused, Learner Led, and Learner Demonstrated survey response, where 1 indicated the lowest agreement or frequency level. (LEAP categorizes student and teacher survey items according to the LEAP Framework.) We caution that questions should not be compared to each other; while most items include four possible response categories, some include more or fewer.
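The averaging step above can be sketched as follows. The item-to-component mapping and the response values are hypothetical, not LEAP’s actual survey items; the sketch only illustrates averaging ordinal responses (1 = lowest agreement or frequency) within each framework component:

```python
from statistics import mean

# Hypothetical respondents; each maps framework components to that
# respondent's ordinal item responses (1 = lowest agreement/frequency).
responses = [
    {"learner_led": [3, 4, 2], "learner_focused": [4, 4], "learner_demonstrated": [2, 3]},
    {"learner_led": [2, 2, 3], "learner_focused": [3, 4], "learner_demonstrated": [4, 4]},
]

def component_scores(responses):
    """Average all respondents' item responses within each framework component."""
    components = responses[0].keys()
    return {c: mean(v for r in responses for v in r[c]) for c in components}

scores = component_scores(responses)
# e.g. scores["learner_focused"] -> 3.75
```

As the report cautions, such scores are not comparable across items whose response scales differ in the number of categories.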


Quasi-experimental Analysis

This section provides more details and results of the quasi-experimental analysis. The analysis sample consists of all CPS students with both a pretest and a posttest score, no missing demographic characteristics, and registration in the same school on the 20th day of September 2016 and on the 20th day of May 2017. Both pretest (spring 2016) and posttest (spring 2017) scores are standardized using the district-level mean and standard deviation in each grade and subject.
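The grade-by-subject standardization can be sketched as follows. This is a minimal illustration with made-up scores; in the actual analysis the mean and standard deviation for each grade-subject cell come from all district test takers, and the use of a population (rather than sample) standard deviation here is an assumption:

```python
from statistics import mean, pstdev

# Illustrative (grade, subject, scale score) records.
records = [
    ("3", "math", 190.0), ("3", "math", 200.0), ("3", "math", 210.0),
    ("3", "reading", 188.0), ("3", "reading", 196.0),
]

def standardize(records):
    """Z-score each score against its own grade-by-subject cell."""
    cells = {}
    for grade, subject, score in records:
        cells.setdefault((grade, subject), []).append(score)
    stats = {k: (mean(v), pstdev(v)) for k, v in cells.items()}
    return [(g, s, (score - stats[(g, s)][0]) / stats[(g, s)][1])
            for g, s, score in records]

z = standardize(records)
# The grade-3 reading scores (188, 196) standardize to -1.0 and +1.0.
```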

Description of the Selection into Treatment

To choose the most appropriate quasi-experimental model, WEC needed to understand how the treatment was allocated. LEAP broke down the selection process in CPS into four main steps: 1) all principals were invited to information sessions about the LEAP Innovations tools, funding, and personalized learning framework; 2) a subset of schools self-selected to attend and participate in those information sessions; 3) if a school was interested in what LEAP had to offer, it could decide to apply to receive the treatment, and the application was fairly demanding, implying that schools that moved on to the application stage had to be motivated; 4) once applications were received, experts outside of Chicago reviewed and ranked applicant schools. Among other factors, to be selected, schools had to demonstrate solid leadership and be willing to implement personalized learning to a much greater degree than their then-current state. Successful applicants became the first cohort of BSC. Note that since schools were not selected based on how much personalized learning they were already offering, BSC schools varied in their starting points or “baseline” implementation levels.

Within the selection process, WEC identified two sources of selection bias: schools self-selected to apply and LEAP selected a subset of the applicant pool. Moreover, some schools were neighborhood schools and others were schools with application processes for students. This led to three additional potential sources of bias: students’ and their families’ self-selection into a school, specific population make-up of a school due to its location within the city of Chicago, and schools selecting amongst students who applied.

Selection of schools and students into treatment could in itself be overcome, especially in a large district like Chicago, but only with a substantial number of treated schools. Table 12 above shows that the number of treated schools never exceeds four in any given grade. On the plus side, we have a large number of students in BSC schools and many students in non-BSC schools in our data sets to support some statistical analyses. Early in the process, WEC proposed a difference-in-differences model as the a priori most promising model. However, it later seemed less desirable than propensity score matching because most BSC schools had a prior relationship with LEAP as Pilot Network schools.

Despite the limitations (a selective process and a small number of schools), WEC and LEAP decided to conduct an impact analysis. WEC then worked closely with LEAP to use the same quasi-experimental method carried out for the Pilot Schools Network, another LEAP initiative. Note that some BSC schools were also Pilot Network schools in one or more years prior to the 2016-17 school year. WEC and LEAP agreed, however, that any results from this analysis are to be interpreted, at best, as suggestive of a possible impact on student achievement. The absence of causal claims does not mean that there are no effects; it only means that, from a statistician’s standpoint, given the data, there is no way to isolate the effects of the treatment from other effects and provide an estimate of the relative size of the treatment impact. The details of the process and the result tables of the quasi-experimental analysis are thus presented only in this appendix.

Analysis Set and Sample Selection

Once all data sets were merged, we created indicator variables for treatment status and all demographic characteristics. The treatment indicator for participation in LEAP was assigned a value of one if a student was enrolled in one of the five BSC schools in the grades listed above, and a value of zero otherwise. Note that students in BSC schools in grades not receiving the LEAP treatment were removed from the analysis because we did not want them to serve as controls for the other BSC schools serving those grades.

We imposed three additional restrictions for inclusion in the analysis sample. First, students had to have non-missing values for all variables included in the model. Most observation losses were due to missing test scores, especially for the pretest, because we used the spring of the previous year. However, when test scores were normalized to have a mean of zero and a standard deviation of one for each grade and subject, we kept all observations with a non-missing value for that test. Second, we kept only students who were enrolled in the same CPS school for the whole first year of the LEAP treatment (2016-17). This sample excluding mobile students is preferable because it allows us to estimate the average treatment effect on the treated rather than the intent to treat: it is a measure of how the LEAP treatment affected students who actually received it. Mobility is a strong predictor of academic performance, as it is correlated with family and life circumstances beyond the school’s influence. Third, we did not include any non-BSC schools that were Pilot Network schools.
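The three restrictions can be sketched as a simple filter. The field names below are hypothetical (the report does not describe the CPS data layout); the logic is what matters: complete data, the same school all year, and no Pilot Network comparison schools:

```python
# Hypothetical student rows with illustrative field names.
students = [
    {"id": 1, "pre": 0.1, "post": 0.4, "demo_complete": True,
     "school_sep": "A", "school_may": "A", "pilot_network": False},
    {"id": 2, "pre": None, "post": 0.2, "demo_complete": True,
     "school_sep": "B", "school_may": "B", "pilot_network": False},  # missing pretest
    {"id": 3, "pre": 0.0, "post": 0.1, "demo_complete": True,
     "school_sep": "A", "school_may": "C", "pilot_network": False},  # mobile student
    {"id": 4, "pre": 0.3, "post": 0.5, "demo_complete": True,
     "school_sep": "D", "school_may": "D", "pilot_network": True},   # Pilot Network school
]

def analysis_sample(students):
    """Apply the three inclusion restrictions described in the text."""
    return [
        s for s in students
        if s["pre"] is not None and s["post"] is not None and s["demo_complete"]
        and s["school_sep"] == s["school_may"]   # enrolled in the same school all year
        and not s["pilot_network"]               # drop Pilot Network comparison schools
    ]

kept = analysis_sample(students)  # only student 1 survives all three filters
```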

Selection of a Control Group

This evaluation seeks to quantify the difference in academic outcomes between BSC students and non-BSC students. Due to the selection of schools into treatment, participants differ from non-participants, and subsequent differences in performance can be due to those preexisting conditions as well as to the treatment received by their schools. In order to control for as many preexisting conditions as possible, we used a quasi-experimental method called propensity score matching (PSM).

PSM analysis selects a group of students within each grade in both treated and control schools who are similar on observable characteristics. These observationally similar students in non-participating schools then serve as a comparison group when estimating impacts.

Estimating Probability of Students Participating in a BSC Grade and School

Once the analysis sample was built, we estimated, for every student in the sample, the probability of being a BSC student based on the student’s relevant observable characteristics.

For each grade and subject, we estimated the conditional probability of being assigned to treatment using a binary logistic regression:

P(treatment | Y_i^Pre, X_i = x_i) = 1 / (1 + e^-(γ·Y_i^Pre + X_i·β))


where Y_i^Pre is student i’s score on the pretest and X_i is student i’s vector of demographic characteristics (gender, race, English learner status, free and reduced-price lunch status, and special education status). Each student in both treated and control schools was then assigned an estimated probability of treatment, called a propensity score.
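The fitted model maps each student’s pretest and demographics to a propensity score. The sketch below simply evaluates the logistic formula; the coefficients are placeholders, since in the actual analysis γ and β are estimated by binary logistic regression separately for each grade and subject:

```python
import math

def propensity(pretest, demographics, gamma, beta):
    """P(treated | pretest, X) = 1 / (1 + exp(-(gamma * pretest + X . beta)))."""
    linear = gamma * pretest + sum(x * b for x, b in zip(demographics, beta))
    return 1.0 / (1.0 + math.exp(-linear))

# With all coefficients at zero, every student receives a score of 0.5,
# i.e., no observable characteristic predicts treatment.
p = propensity(pretest=0.2, demographics=[1, 0, 1], gamma=0.0, beta=[0.0, 0.0, 0.0])
```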

We then matched each treated student with all the non-treated students who had the same propensity score value. This restriction did not cost us many matches, given the very large number of non-treated students at our disposal, and it ensured that each treated student was compared only to control students with the same probability of receiving the treatment.

Once each treated student was matched with the corresponding group of non-treated students, we generated a weight based on the propensity score. Then, for each grade and subject, using PSM weights, we ran hierarchical linear models (HLM) with school random effects. Aside from the treatment indicator (receiving a value of one if the student was in a BSC school and grade and a value of zero otherwise), we controlled for all the variables used in the propensity score estimation. We added the other subject’s pretest and a quadratic term for the same subject’s pretest as additional controls. Finally, we ran HLM to estimate treatment impacts on reading and math posttest scores. These outcome models hence controlled for all the variables used in the match along with relevant variables known to impact posttest scores. WEC also conducted the same analysis further restricting the sample to schools in the same geographic network. Interestingly, the quality of the matching was not affected, and the results did not differ much from the whole-district results. Since WEC and LEAP agreed to keep the model close to the Pilot Network model, we present only the whole-district results in this report.
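The matching-and-weighting step can be sketched as follows, under one possible weighting scheme. The report does not specify how weights were derived from the propensity scores, so the equal-share control weights and the rounding used to define “the same value” below are assumptions, not WEC’s exact procedure:

```python
from collections import defaultdict

# Hypothetical (id, treated, propensity score) rows.
rows = [
    (1, True, 0.42), (2, False, 0.42), (3, False, 0.42),
    (4, True, 0.70), (5, False, 0.70), (6, False, 0.13),
]

def psm_weights(rows, digits=2):
    """Each treated student gets weight 1; its matched controls share 1 among them."""
    controls = defaultdict(list)
    for sid, treated, score in rows:
        if not treated:
            controls[round(score, digits)].append(sid)
    weights = {}
    for sid, treated, score in rows:
        key = round(score, digits)
        if treated and controls[key]:
            weights[sid] = 1.0
            for c in controls[key]:
                weights[c] = weights.get(c, 0.0) + 1.0 / len(controls[key])
    return weights

w = psm_weights(rows)  # control 6 matches no treated student and gets no weight
```

These weights would then be passed to the outcome model (here, HLM with school random effects) so that matched controls collectively carry the same weight as the treated students they resemble.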

Changes in Outcomes

Table B-2 below presents estimates of the treatment indicator variable with their corresponding estimated standard errors and p-values. A coefficient estimate of 0.05 means that, on average, treated students outperformed control students by 5 percent of a standard deviation in test scores. Each coefficient estimate has a corresponding p-value, a probability between zero and one that indicates the smallest significance level at which the coefficient would be judged statistically significant.

Overall, the results for combined grades (Table B-2) confirm our initial claims. We find that BSC students perform just as well as their non-BSC counterparts who are observationally similar in demographic characteristics and pretest performance.


Table B-2: Treatment effect estimates for combined grades (elementary, middle, and all schools), BSC versus non-BSC

| | Reading | Math |
| Treatment effect | 0.025 | 0.018 |
| Standard error | 0.062 | 0.078 |
| p-value | 0.691 | 0.081 |

When looking at grade-by-grade impacts (Table B-3), we can see that fifth-grade students in BSC schools outperformed control students in non-BSC schools by 33.4 percent of a standard deviation in math.

Although we cannot make any causal claims at this point, our analysis shows that first-year BSC students are performing just as well as non-BSC students who are observationally similar, and that any significant results are positive.

Table B-3: Treatment effect estimates by grade, BSC vs. non-BSC

| Grade | Reading effect | Reading SE | Reading p-value | Math effect | Math SE | Math p-value |
| 3 | 0.052 | 0.077 | 0.504 | 0.041 | 0.124 | 0.738 |
| 4 | 0.019 | 0.1024 | 0.854 | 0.030 | 0.127 | 0.813 |
| 5 | 0.098 | 0.132 | 0.456 | 0.334 | 0.132 | 0.012 |
| 6 | -0.017 | 0.1005 | 0.8645 | -0.030 | 0.1475 | 0.840 |
| 7 | -0.003 | 0.170 | 0.986 | -0.035 | 0.144 | 0.810 |
| 8 | -0.101 | 0.1442 | 0.486 | -0.224 | 0.175 | 0.2045 |

Descriptive Analysis

The following are additional informational notes on WEC’s process of conducting descriptive analysis of patterns in student academic outcomes, followed by supplemental tables from additional analysis.

Notes on Analysis of Attainment

We present attainment measures of schools before and after the implementation of BSC. Tables B-4 and B-5 show, for each grade, the evolution of each BSC school’s attainment percentile in math and reading at the end of spring 2016 (base year) and at the end of spring 2017 (the first year of BSC reforms).6 For each spring, the national school attainment percentile compares the average spring scale score of students in a given subject on the NWEA MAP to the national average score. Schools are then ranked within a nationally representative sample of schools administering the NWEA MAP and assigned a corresponding percentile point, which allows us to see where each school stands in the national distribution of schools.

6 We are using only one year prior to the treatment because NWEA changed the test norms in 2015-16.


Table B-4: National school attainment percentile in BSC sites in math, spring 2016 and spring 2017, by grade level (percentile, with N in parentheses)

| School | G3 15-16 | G3 16-17 | G4 15-16 | G4 16-17 | G5 15-16 | G5 16-17 | G6 15-16 | G6 16-17 | G7 15-16 | G7 16-17 | G8 15-16 | G8 16-17 |
| Brooks | NA | NA | NA | NA | NA | NA | NA | NA | NA | 95 (N=55) | NA | NA |
| Irving Park | 46 (N=54) | 67 (N=50) | 42 (N=28) | 48 (N=58) | NA | NA | 77 (N=56) | 89 (N=81) | 93 (N=77) | 89 (N=81) | 87 (N=64) | 88 (N=74) |
| Disney II | 89 (N=51) | 93 (N=49) | 96 (N=51) | 97 (N=54) | 98 (N=50) | 99 (N=54) | 97 (N=51) | 96 (N=55) | NA | NA | NA | NA |
| Henry | 46 (N=79) | 91 (N=61) | 24 (N=72) | 30 (N=54) | 58 (N=71) | 65 (N=57) | 47 (N=56) | 52 (N=47) | NA | NA | NA | NA |
| Lovett | 21 (N=42) | 63 (N=34) | 18 (N=38) | 51 (N=36) | 33 (N=42) | 52 (N=37) | NA | NA | NA | NA | NA | NA |

Source: Chicago Public Schools

Table B-5: National school attainment percentile in BSC sites in reading, spring 2016 and spring 2017, by grade level (percentile, with N in parentheses)

| School | G3 15-16 | G3 16-17 | G4 15-16 | G4 16-17 | G5 15-16 | G5 16-17 | G6 15-16 | G6 16-17 | G7 15-16 | G7 16-17 | G8 15-16 | G8 16-17 |
| Brooks | NA | NA | NA | NA | NA | NA | NA | NA | NA | 99 (N=55) | NA | NA |
| Irving Park | 46 (N=55) | 67 (N=50) | 52 (N=28) | 60 (N=58) | NA | NA | 85 (N=56) | 94 (N=82) | 90 (N=77) | 90 (N=51) | 93 (N=64) | 91 (N=74) |
| Disney II | 91 (N=51) | 87 (N=49) | 98 (N=51) | 93 (N=54) | 91 (N=50) | 97 (N=54) | 97 (N=51) | 96 (N=55) | NA | NA | NA | NA |
| Henry | 35 (N=79) | 85 (N=61) | 36 (N=73) | 46 (N=54) | 53 (N=71) | 53 (N=57) | 34 (N=56) | 62 (N=46) | NA | NA | NA | NA |
| Lovett | 17 (N=42) | 74 (N=34) | 33 (N=38) | 44 (N=36) | 64 (N=42) | 57 (N=37) | NA | NA | NA | NA | NA | NA |

Source: CPS assessment data on public website. https://cps.edu/SchoolData/Pages/SchoolData.aspx.

Table B-4 shows that in math, all schools in all grades are either improving on attainment measures or already very highly ranked nationally. Assuming no strong cohort effects, Lovett seems particularly successful in closing gaps, going from well below to above national averages in all grade levels. Henry also shows large improvements, especially in third grade. Irving Park shows a slight decline in grade seven but started with very high attainment levels, and it progressed greatly in grades three, four, and six.

In Table B-5, we can see that the evolution of BSC school attainment percentiles in reading is not as uniform as it is in math. Disney II is very highly ranked in both years and shows improvement in grade five. Irving Park has much higher attainment levels in the middle school grades than in the elementary grades in both years; it nonetheless shows great improvement in grades three, four, and six in the year it became a BSC school. Likewise, Henry and Lovett show great improvements.

Notes on Analysis of Growth

NWEA also provides growth expectations for each student at each test period. Once a test is administered, and if a pretest is available, we can thus compare the growth that actually occurred to the growth that was expected. If the actual growth is less than expected, the student did not meet their growth target; conversely, if the actual growth exceeds the projected growth, the student met their growth target. Table B-6 presents the percentage of BSC and non-BSC students who met their projected growth between the spring of 2016 and the spring of 2017. We can see that, in math, BSC schools are doing better than non-BSC schools. In reading, BSC schools are very comparable in performance to the rest of the district overall. Note that Disney II and Lovett are especially successful at growing their students in both subjects.
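The met/not-met comparison above can be sketched as follows. The growth values are made up, and treating growth exactly equal to the projection as “met” is an assumption, since the text leaves ties ambiguous:

```python
# Hypothetical (actual growth, projected growth) pairs in scale-score points.
growth = [(8.0, 6.5), (4.0, 6.0), (7.0, 7.0), (10.0, 9.5)]

def pct_met_projection(growth):
    """Share of students whose actual growth met or exceeded the projection."""
    met = sum(1 for actual, projected in growth if actual >= projected)
    return 100.0 * met / len(growth)

pct = pct_met_projection(growth)  # 3 of 4 students -> 75.0
```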


Table B-6: Percentage of full-year students in BSC and non-BSC sites who met their growth projection in reading and math, spring 2016 and spring 2017

| Group | N Math 15-16 | N Math 16-17 | % Math 15-16 | % Math 16-17 | N Read 15-16 | N Read 16-17 | % Read 15-16 | % Read 16-17 |
| Non-BSC | 127,983 | 124,293 | 55.85% | 55.99% | 127,383 | 123,354 | 60.38% | 58.73% |
| BSC* | 877 | 888 | 59.52% | 59.12% | 878 | 889 | 59.57% | 59.28% |
| Difference (BSC - non-BSC) | | | 3.67 pp | 3.13 pp | | | -0.81 pp | 0.55 pp |
| Brooks | NA | 48 | NA | 45.83% | NA | 48 | NA | 58.33% |
| Irving Park | 274 | 302 | 60.58% | 52.32% | 275 | 304 | 62.18% | 56.58% |
| Disney II | 203 | 211 | 65.52% | 70.14% | 203 | 210 | 62.56% | 64.29% |
| Henry | 289 | 273 | 51.56% | 53.48% | 289 | 273 | 51.56% | 55.68% |
| Lovett | 111 | 102 | 66.67% | 71.57% | 111 | 102 | 68.47% | 66.67% |

*In 2016-17, these totals exclude Brooks to make the figures comparable across the two years. Source: Chicago Public Schools

Tables B-7 and B-8 provide the same statistics but separate elementary and middle school students. The tables show that the positive differences are largely due to the elementary grades: the percentage-point differences between BSC and non-BSC students in both subjects are much greater in the lower grades. Moreover, those differences widened during the first year of BSC, indicating improvement, especially in reading. The middle school grades overall started slightly above the district average in the base year but ended below the district average after the first year of the LEAP BSC treatment (Table B-8).


Table B-7: Percentage of full-year elementary (grade 3 to 5) students in BSC and non-BSC sites who met their growth projection in reading and math in spring 2016 and 2017

School | N Math 15-16 | N Math 16-17 | % Math 15-16 | % Math 16-17 | N Read 15-16 | N Read 16-17 | % Read 15-16 | % Read 16-17
Non-BSC | 65,772 | 63,959 | 52.21% | 54.73% | 65,198 | 63,232 | 58.52% | 55.99%
BSC | 573 | 581 | 57.59% | 62.65% | 574 | 582 | 57.84% | 60.65%
Difference (BSC - non-BSC) | | | 5.38 pp | 7.92 pp | | | -0.68 pp | 4.66 pp
Brooks | NA | NA | NA | NA | NA | NA | NA | NA
Irving Park | 81 | 105 | 39.51% | 42.86% | 82 | 106 | 52.44% | 54.72%
Disney II | 152 | 156 | 66.45% | 78.21% | 152 | 155 | 57.24% | 65.16%
Henry | 229 | 218 | 53.71% | 56.88% | 229 | 219 | 55.02% | 57.53%
Lovett | 111 | 102 | 66.67% | 71.57% | 111 | 102 | 68.47% | 66.67%

Source: Chicago Public Schools

Table B-8: Percentage of full-year middle school students in BSC and non-BSC sites who met their growth projection in reading and math in spring 2016 and 2017

School | N Math 15-16 | N Math 16-17 | % Math 15-16 | % Math 16-17 | N Read 15-16 | N Read 16-17 | % Read 15-16 | % Read 16-17
Non-BSC | 62,221 | 60,334 | 59.69% | 57.33% | 62,195 | 60,122 | 62.34% | 61.60%
BSC* | 304 | 307 | 63.16% | 52.44% | 304 | 307 | 62.83% | 56.68%
Difference (BSC - non-BSC) | | | 3.47 pp | -4.89 pp | | | 0.49 pp | -4.92 pp
Brooks | NA | 48 | NA | 45.83% | NA | 48 | NA | 58.33%
Irving Park | 193 | 197 | 69.43% | 57.36% | 193 | 198 | 66.32% | 57.58%
Disney II | 51 | 55 | 62.75% | 47.27% | 51 | 55 | 78.43% | 61.82%
Henry | 60 | 55 | 43.33% | 40.00% | 60 | 54 | 38.33% | 48.15%
Lovett | NA | NA | NA | NA | NA | NA | NA | NA

*In 2016-17 these exclude Brooks to make it comparable across the two years. Source: CPS student level data

We categorize attainment levels in the base year as follows: an attainment percentile below 20 is considered very low, 20 to 40 is low, 40 to 60 is medium, 60 to 80 is high, and above 80 is very high. For attainment percentile changes and for differences in the percentage who met growth between 2015-16 and 2016-17, a difference of less than 1 percentage point is considered unchanged, a change between 1 and 5 is small, between 5 and 10 is moderate, between 10 and 20 is large, and any change greater than 20 is very large.
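These labeling rules can be encoded compactly. The sketch below is our own rendering of the thresholds stated in the text; the function names are hypothetical, and the handling of values falling exactly on a boundary is our choice, since the text does not specify it.

```python
# Encoding of the report's category thresholds (taken from the text).
# Boundary values (exactly 20, 40, ...) fall into the higher category
# here; the text itself is silent on ties.
import bisect

def attainment_label(percentile):
    """Base-year attainment percentile -> category label."""
    bounds = [20, 40, 60, 80]
    labels = ["very low", "low", "medium", "high", "very high"]
    return labels[bisect.bisect(bounds, percentile)]

def change_label(delta_pp):
    """Year-over-year change in percentage points -> magnitude label."""
    bounds = [1, 5, 10, 20]
    labels = ["unchanged", "small", "moderate", "large", "very large"]
    return labels[bisect.bisect(bounds, abs(delta_pp))]

print(attainment_label(35))  # prints: low
print(change_label(-4.92))   # prints: small
```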

In addition to looking at the percentage of students performing at or above their growth targets in BSC sites, we examine school quartiles of student growth percentiles in MAP in spring 2017. The quartile tables provide, for each school, the lower quartile, median, and upper quartile student growth percentile, summarizing the distribution of student growth relative to a nationally representative sample. A student growth percentile (SGP) positions a student's growth against the growth of all students sharing the same pre-test score; for example, a student with a growth percentile of 75 has made growth equal to or greater than 75 percent of their academic peers. At the level of a school or group of schools, a median of 60 means that 50 percent of the students in that group had an SGP of 60 or higher. Because it conditions on the pre-test score, the SGP is a better measure for taking starting positions into account. "Normal" values would be, by construction, 25 for the lower quartile, 50 for the median, and 75 for the upper quartile.
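The quartile summaries reported in Tables B-9 through B-18 amount to taking the 25th, 50th, and 75th percentiles of the SGP distribution for a school or group of schools. A minimal sketch using the standard library (the interpolation method is our choice, not stated in the report):

```python
# Sketch of the school-level SGP quartile summary: the lower quartile,
# median, and upper quartile of students' growth percentiles.
import statistics

def sgp_quartiles(sgps):
    """Return (lower quartile, median, upper quartile) of SGPs."""
    q1, med, q3 = statistics.quantiles(sgps, n=4, method="inclusive")
    return q1, med, q3

# A uniform spread of SGPs lands at the "normal" 25 / 50 / 75.
print(sgp_quartiles(list(range(101))))  # prints: (25.0, 50.0, 75.0)
```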

Overall, in both years, BSC schools outperform non-BSC schools in all three quartiles in math. Compared with the base year, BSC schools appear to have improved in math at the median and upper quartile. In reading, BSC schools outperform non-BSC schools only during the LEAP year and only in the lowest quartile, although Lovett shows a strong improvement in its upper quartile.

Again, while BSC schools overall perform better than non-BSC schools in both subjects, the advantage appears to be driven by the elementary grades rather than the middle grades. Note that Disney II and Lovett perform highly in both subjects in the elementary grades. Moreover, there are large improvements in the median and upper quartile in math.

Notes on Analysis of Outcomes by Pretest Quartile

To examine differential impacts by pre-test position, we first computed school-grade-level percentile ranks on the pretest for each student and grouped students into quartiles (from the lower quartile to the upper). For each pretest quartile, we present the percentages of students who met their growth targets in mathematics and reading.
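The grouping step can be sketched as follows. This is illustrative only: the function names and scores are hypothetical, and real data would need a tie-breaking rule for identical pretest scores.

```python
# Sketch of pretest-quartile assignment within one school-grade cell:
# rank students by pretest score, convert to percentile ranks, and
# bucket into quartiles Q1 (lowest) through Q4 (highest).

def percentile_ranks(scores):
    """Percentile rank (0-100) of each score within its group."""
    ranked = sorted(scores)
    n = len(scores)
    return [100.0 * ranked.index(s) / (n - 1) for s in scores]

def pretest_quartile(rank):
    """Map a 0-100 percentile rank to a pretest quartile."""
    if rank < 25:
        return "Q1"
    if rank < 50:
        return "Q2"
    if rank < 75:
        return "Q3"
    return "Q4"

scores = [180, 195, 202, 210, 188, 199, 220, 205]  # hypothetical scores
print([pretest_quartile(r) for r in percentile_ranks(scores)])
# prints: ['Q1', 'Q2', 'Q3', 'Q4', 'Q1', 'Q2', 'Q4', 'Q3']
```

The met-growth percentages in Tables B-11, B-13, B-15, and B-17 are then computed separately within each of these four buckets.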

Table B-9: MAP Math Spring to Spring Lower Quartile, Median, and Upper Quartile in Student Growth Percentile for BSC, non-BSC, and each site

School | N 15-16 | N 16-17 | Lower Quart. SGP 15-16 | Lower Quart. SGP 16-17 | Median SGP 15-16 | Median SGP 16-17 | Upper Quart. SGP 15-16 | Upper Quart. SGP 16-17
Non-BSC | 143,273 | 134,804 | 25 | 25 | 54 | 54 | 81 | 81
BSC | 924 | 988 | 30 | 29 | 58 | 60 | 82 | 85
Difference (BSC - non-BSC) | | | +5 pp | +4 pp | +4 pp | +6 pp | +1 pp | +4 pp
Brooks | NA | 55 | NA | 12 | NA | 41 | NA | 63
Irving Park | 282 | 327 | 30 | 28 | 56.5 | 50.5 | 81 | 82
Disney II | 204 | 215 | 39 | 40 | 64 | 70 | 84 | 91
Henry | 312 | 282 | 24 | 20 | 49 | 52 | 75 | 81
Lovett | 126 | 109 | 34 | 44 | 67 | 67 | 94 | 93

Source: CPS student level data.

Table B-10: MAP Reading Spring to Spring Lower Quartile, Median, and Upper Quartile in Student Growth Percentile for BSC, non-BSC and for each site.

School | N 15-16 | N 16-17 | Lower Quart. SGP 15-16 | Lower Quart. SGP 16-17 | Median SGP 15-16 | Median SGP 16-17 | Upper Quart. SGP 15-16 | Upper Quart. SGP 16-17
Non-BSC | 143,273 | 134,804 | 31 | 29 | 59 | 58 | 83 | 82
BSC | 924 | 988 | 31 | 34 | 58 | 58 | 83 | 82
Difference (BSC - non-BSC) | | | 0 pp | +5 pp | -1 pp | 0 pp | 0 pp | 0 pp
Brooks | NA | 55 | NA | 36 | NA | 63 | NA | 80
Irving Park | 282 | 327 | 32 | 35 | 63 | 58 | 79 | 79.5
Disney II | 204 | 215 | 35 | 39 | 58 | 64 | 84 | 82
Henry | 312 | 282 | 26 | 28 | 49 | 54 | 78 | 79
Lovett | 126 | 109 | 37 | 33 | 74 | 68 | 74 | 94

Source: CPS student level data

Table B-11: Percentage of full-year elementary (grades 3 to 5) students in BSC and non-BSC sites who met their growth projection in Math in spring 2016 and 2017, by pretest quartile.

Cells show the percentage of students in each pretest quartile (Q1 = lowest) who met their Math growth projection, with Ns in parentheses.

School | Q1 15-16 | Q1 16-17 | Q2 15-16 | Q2 16-17 | Q3 15-16 | Q3 16-17 | Q4 15-16 | Q4 16-17
Non-BSC | 56.63% (N=15,701) | 55.24% (N=15,318) | 53.29% (N=16,591) | 52.83% (N=16,082) | 51.61% (N=17,003) | 54.98% (N=16,532) | 47.54% (N=16,451) | 55.87% (N=15,990)
BSC | 62.32% (N=138) | 63.16% (N=133) | 51.72% (N=145) | 55.63% (N=151) | 56.95% (N=151) | 69.03% (N=155) | 59.71% (N=139) | 62.68% (N=142)
Difference (BSC - non-BSC) | 5.69 pp | 7.92 pp | -1.57 pp | 2.80 pp | 5.34 pp | 14.05 pp | 12.17 pp | 6.81 pp
Brooks* | NA | NA | NA | NA | NA | NA | NA | NA
Irving Park | 52.63% (N=19) | 45.83% (N=24) | 47.62% (N=21) | 34.62% (N=26) | 21.74% (N=23) | 37.93% (N=29) | 38.89% (N=18) | 53.85% (N=26)
Disney II | 57.14% (N=35) | 75.00% (N=36) | 60.00% (N=40) | 69.05% (N=42) | 75.00% (N=40) | 85.00% (N=40) | 72.97% (N=37) | 84.21% (N=38)
Henry | 61.40% (N=57) | 54.00% (N=50) | 40.00% (N=55) | 50.88% (N=57) | 55.00% (N=60) | 65.52% (N=58) | 57.89% (N=57) | 56.60% (N=53)
Lovett | 77.78% (N=27) | 82.61% (N=23) | 65.52% (N=29) | 65.38% (N=26) | 64.29% (N=28) | 85.71% (N=28) | 59.26% (N=27) | 52.00% (N=25)

Source: Chicago Public Schools *These data are included for sake of completeness but Ns are very small.

Table B-12: Elementary Grades MAP Math Spring to Spring Lower Quartile, Median, and Upper Quartile in Student Growth Percentile for BSC, non-BSC and for each site.

School | N 15-16 | N 16-17 | Lower Quart. SGP 15-16 | Lower Quart. SGP 16-17 | Median SGP 15-16 | Median SGP 16-17 | Upper Quart. SGP 15-16 | Upper Quart. SGP 16-17
Non-BSC | 74,475 | 70,060 | 21 | 23 | 50 | 53 | 79 | 82
BSC | 610 | 614 | 29 | 30 | 56 | 64 | 82 | 88
Difference (BSC - non-BSC) | | | +8 pp | +7 pp | +6 pp | +11 pp | +3 pp | +6 pp
Brooks | NA | NA | NA | NA | NA | NA | NA | NA
Irving Park | 83 | 119 | 20 | 18 | 39 | 39 | 59 | 70
Disney II | 153 | 160 | 39 | 51 | 65 | 78 | 84.5 | 94.5
Henry | 248 | 226 | 24 | 21 | 50 | 59.5 | 77 | 84
Lovett | 126 | 109 | 34 | 44 | 67 | 67 | 94 | 93

Table B-13: Percentage of full-year elementary (grades 3 to 5) students in BSC and non-BSC sites who met their growth projection in Reading in spring 2016 and 2017, by pretest quartile.

Cells show the percentage of students in each pretest quartile (Q1 = lowest) who met their Reading growth projection, with Ns in parentheses.

School | Q1 15-16 | Q1 16-17 | Q2 15-16 | Q2 16-17 | Q3 15-16 | Q3 16-17 | Q4 15-16 | Q4 16-17
Non-BSC | 60.62% (N=15,707) | 57.50% (N=15,189) | 61.08% (N=16,443) | 58.53% (N=15,895) | 58.54% (N=16,690) | 56.11% (N=16,326) | 53.89% (N=16,335) | 51.84% (N=15,789)
BSC | 57.45% (N=141) | 61.59% (N=138) | 61.54% (N=143) | 61.74% (N=149) | 55.70% (N=149) | 59.33% (N=150) | 56.74% (N=141) | 60.00% (N=145)
Difference (BSC - non-BSC) | -3.17 pp | 4.09 pp | 0.46 pp | 3.21 pp | -2.84 pp | 3.22 pp | 2.85 pp | 8.16 pp
Brooks | NA | NA | NA | NA | NA | NA | NA | NA
Irving Park | 47.37% (N=19) | 77.27% (N=22) | 66.67% (N=21) | 51.61% (N=31) | 45.45% (N=22) | 44.00% (N=25) | 50.00% (N=20) | 50.00% (N=28)
Disney II | 50.00% (N=38) | 56.76% (N=37) | 55.56% (N=36) | 67.57% (N=37) | 62.50% (N=40) | 63.64% (N=44) | 60.53% (N=38) | 72.97% (N=37)
Henry | 53.57% (N=56) | 56.36% (N=55) | 59.65% (N=57) | 55.56% (N=54) | 53.33% (N=60) | 59.26% (N=54) | 53.57% (N=56) | 58.93% (N=56)
Lovett | 82.14% (N=28) | 66.67% (N=24) | 68.97% (N=29) | 77.78% (N=27) | 59.26% (N=27) | 66.67% (N=27) | 62.96% (N=27) | 54.17% (N=24)

Source: CPS student level data


Table B-14: Elementary Grades MAP Reading Spring to Spring Lower Quartile, Median, and Upper Quartile in Student Growth Percentile for BSC, non-BSC and for each site

School | N 15-16 | N 16-17 | Lower Quart. SGP 15-16 | Lower Quart. SGP 16-17 | Median SGP 15-16 | Median SGP 16-17 | Upper Quart. SGP 15-16 | Upper Quart. SGP 16-17
Non-BSC | 74,475 | 70,060 | 26 | 24 | 57 | 54 | 84 | 81
BSC | 610 | 614 | 28 | 31 | 54.5 | 59 | 84 | 82
Difference (BSC - non-BSC) | | | +2 pp | +7 pp | -2.5 pp | +5 pp | 0 pp | +1 pp
Brooks | NA | NA | NA | NA | NA | NA | NA | NA
Irving Park | 83 | 119 | 22 | 23 | 50 | 51.5 | 74 | 76
Disney II | 153 | 160 | 29 | 33 | 53 | 63 | 79.5 | 80
Henry | 248 | 226 | 26 | 29 | 52 | 54 | 82 | 81
Lovett | 126 | 109 | 37 | 33 | 74 | 68 | 93 | 94

Table B-15: Percentage of full-year middle school students in BSC and non-BSC sites who met their growth projection in Math in spring 2016 and 2017, by pretest quartile

Cells show the percentage of students in each pretest quartile (Q1 = lowest) who met their Math growth projection, with Ns in parentheses.

School | Q1 15-16 | Q1 16-17 | Q2 15-16 | Q2 16-17 | Q3 15-16 | Q3 16-17 | Q4 15-16 | Q4 16-17
Non-BSC | 62.57% (N=14,888) | 57.46% (N=14,387) | 59.73% (N=15,683) | 55.75% (N=15,184) | 58.26% (N=16,099) | 56.72% (N=15,596) | 58.37% (N=15,545) | 59.46% (N=15,135)
BSC** | 62.34% (N=77) | 57.75% (N=71) | 59.46% (N=74) | 59.74% (N=77) | 61.73% (N=81) | 51.32% (N=76) | 69.44% (N=72) | 42.17% (N=83)
Difference (BSC - non-BSC) | -0.23 pp | 0.29 pp | -0.27 pp | 3.99 pp | 3.47 pp | -5.40 pp | 11.07 pp | -17.29 pp
Brooks* | NA | 41.67% (N=12) | NA | 38.46% (N=13) | NA | 54.55% (N=11) | NA | 50.00% (N=12)
Irving Park | 65.31% (N=49) | 64.44% (N=45) | 56.52% (N=46) | 63.83% (N=47) | 67.92% (N=53) | 54.90% (N=51) | 88.89% (N=45) | 48.15% (N=54)
Disney II | 75.00% (N=12) | 61.54% (N=13) | 60.00% (N=15) | 50.00% (N=14) | 66.67% (N=12) | 42.86% (N=14) | 50.00% (N=12) | 35.71% (N=14)
Henry | 43.75% (N=16) | 30.77% (N=13) | 69.23% (N=13) | 56.25% (N=16) | 37.50% (N=16) | 45.45% (N=11) | 26.67% (N=15) | 26.67% (N=15)
Lovett | NA | NA | NA | NA | NA | NA | NA | NA

Source: Chicago Public Schools *Note that these are included for sake of completeness but Ns are very small. **In 2016-17 these exclude Brooks to make it comparable across the two years.

Table B-16: Middle school (grades 6 to 8) MAP Math Spring to Spring Lower Quartile, Median, and Upper Quartile in Student Growth Percentile for BSC, non-BSC and for each site

School | N 15-16 | N 16-17 | Lower Quart. SGP 15-16 | Lower Quart. SGP 16-17 | Median SGP 15-16 | Median SGP 16-17 | Upper Quart. SGP 15-16 | Upper Quart. SGP 16-17
Non-BSC | 68,798 | 64,744 | 29 | 27 | 58 | 55 | 83 | 81
BSC | 562 | 319 | 30 | 25 | 55 | 51 | 77 | 79
Difference (BSC - non-BSC) | | | +1 pp | -2 pp | -3 pp | -4 pp | -6 pp | -2 pp
Brooks | NA | 55 | NA | 12 | NA | 41 | NA | 63
Irving Park | 83 | 208 | 36 | 29 | 65 | 57 | 84 | 82
Disney II | 153 | 55 | 38 | 17 | 63 | 48 | 80 | 73
Henry | 248 | 56 | 24 | 8 | 40.5 | 39 | 69 | 69
Lovett | NA | NA | NA | NA | NA | NA | NA | NA

Table B-17: Percentage of full-year middle school (grades 6 to 8) students in BSC and non-BSC sites who met their growth projection in Reading in spring 2016 and 2017, by pretest quartile

Cells show the percentage of students in each pretest quartile (Q1 = lowest) who met their Reading growth projection, with Ns in parentheses.

School | Q1 15-16 | Q1 16-17 | Q2 15-16 | Q2 16-17 | Q3 15-16 | Q3 16-17 | Q4 15-16 | Q4 16-17
Non-BSC | 67.29% (N=14,968) | 66.00% (N=14,406) | 65.48% (N=15,668) | 65.33% (N=15,141) | 61.85% (N=16,107) | 62.25% (N=15,508) | 54.99% (N=15,631) | 53.01% (N=15,061)
BSC** | 71.83% (N=71) | 64.79% (N=71) | 62.03% (N=79) | 61.54% (N=78) | 65.38% (N=78) | 55.56% (N=81) | 52.63% (N=76) | 45.45% (N=77)
Difference (BSC - non-BSC) | 4.54 pp | -1.21 pp | -3.45 pp | -3.79 pp | 3.53 pp | -6.69 pp | -2.36 pp | -7.56 pp
Brooks* | NA | 63.64% (N=11) | NA | 61.54% (N=13) | NA | 54.55% (N=11) | NA | 53.85% (N=13)
Irving Park | 70.45% (N=44) | 68.18% (N=44) | 64.00% (N=50) | 54.90% (N=51) | 71.43% (N=49) | 60.38% (N=53) | 60.00% (N=50) | 48.00% (N=50)
Disney II | 83.33% (N=12) | 76.92% (N=13) | 84.62% (N=13) | 73.33% (N=15) | 85.71% (N=14) | 50.00% (N=14) | 58.33% (N=12) | 46.15% (N=13)
Henry* | 66.67% (N=15) | 42.86% (N=14) | 37.50% (N=16) | 75.00% (N=12) | 26.67% (N=15) | 42.86% (N=14) | 21.43% (N=15) | 35.71% (N=14)
Lovett | NA | NA | NA | NA | NA | NA | NA | NA

*Note that these are included for sake of completeness but Ns are very small. **In 2016-17 these exclude Brooks to make it comparable across the two years. Source: CPS student level data.

Table B-18: Middle school (grades 6 to 8) MAP Reading Spring to Spring Lower Quartile, Median, and Upper Quartile in Student Growth Percentile for BSC, non-BSC and for each site.

School | N 15-16 | N 16-17 | Lower Quart. SGP 15-16 | Lower Quart. SGP 16-17 | Median SGP 15-16 | Median SGP 16-17 | Upper Quart. SGP 15-16 | Upper Quart. SGP 16-17
Non-BSC | 68,798 | 64,744 | 36 | 35 | 61 | 61 | 82 | 82
BSC | 562 | 319 | 36 | 38 | 60 | 58 | 78 | 82
Difference (BSC - non-BSC) | | | 0 pp | +3 pp | -1 pp | -3 pp | -4 pp | 0 pp
Brooks | NA | 55 | NA | 36 | NA | 63 | NA | 80
Irving Park | 199 | 208 | 38 | 40 | 67 | 58.5 | 82 | 81
Disney II | 51 | 55 | 57 | 40 | 76 | 71 | 88 | 85
Henry | 248 | 56 | 23 | 17 | 43 | 48 | 62 | 76
Lovett | NA | NA | NA | NA | NA | NA | NA | NA


Appendix C: Excerpts from Original Evaluation Plan

The following are excerpts from the original evaluation plan for Breakthrough Schools: Chicago, describing the potential questions that guided the evaluation:

• What are the proposed personalized learning models in Breakthrough Schools: Chicago and how are they aligned to the existing research base and LEAP frameworks?

• What are patterns in implementation and readiness at the school site level, and is implementation consistent with intended and proposed program models?

• What are patterns in outcomes related to student capacity and practice?
  o Participation (or dosage) in personalized learning opportunities
  o Impact of these opportunities on students':
    ▪ Academic achievement (e.g., grades, tests, rubrics)
    ▪ Engagement with school and learning (e.g., grades, attendance, surveys)

• What are patterns in outcomes related to school staff capacity and practice?
  o Participation (or dosage) rates in professional development opportunities
  o Impact of these opportunities on teacher capacity and practice

The following are the potential indicators and data sources by evaluation question mapped out in the original evaluation plan:

Evaluation Question # | Indicator | Data sources/types
1 | Shared structures and principles between personalized learning models and research base | ● Blueprints ● Literature review
1 | Shared structures and principles between personalized learning models and LEAP Framework | ● Blueprints ● LEAP framework ● School-specific assessment strategies and artifacts
2 | Frequency and nature of personalized learning submodels and activities in schools – both students and staff | ● Observations ● Interviews and focus groups ● LEAP personalized learning Survey ● Dosage (e.g., logins for digital tools, professional development participation numbers)
2 | Resources (tech, time, space, training) available for personalized learning activities in schools | ● Observations ● Interviews and focus groups ● LEAP personalized learning Survey
3 | Student capacity to apply and the actual application of principles of personalized learning | ● Observations ● Interviews and focus groups with adults
3 | Student academic achievement | ● Course grades and GPA ● Standardized assessments ● Digital tool vendor assessments?
3 | Student engagement in learning | ● Attendance ● Disciplinary ● Course grades and GPA ● CPS 5 Essentials of School Culture and Climate Survey ● Student talk/discussion related to learning
4 | School staff knowledge of personalized learning and the LEAP framework | ● Observations ● Interviews and focus groups ● LEAP personalized learning Staff Survey
4 | Changes in teachers' instructional approach | ● Observations ● Interviews and focus groups ● LEAP personalized learning Staff Survey

*Italicized data sources ideally would include comparison groups.


Appendix D: Data for Survey Figures

This appendix contains data for Figures 1-12 in the report.

Figure 1

Brooks

Don’t Agree Agree a little Mostly Agree Agree a lot

Fall 2016 n 5 16 22 9 % 9.62 30.77 42.31 17.31

Spring 2017 n 8 17 15 10 % 16.00 34.00 30.00 20.00

Fall 2017 n 7 17 13 14 % 13.73 33.33 25.49 27.45

Disney II

Don’t Agree Agree a little Mostly Agree Agree a lot

Fall 2016 n 26 58 114 113

% 8.36 18.65 36.66 36.33

Spring 2017 n 13 52 109 147

% 4.05 16.20 33.96 45.79

Fall 2017 n 25 45 107 124

% 8.31 14.95 35.55 41.20

Henry

Don’t Agree Agree a little Mostly Agree Agree a lot

Fall 2016 n 4 18 23 11

% 7.14 32.14 41.07 19.64

Spring 2017 n 9 41 67 44

% 5.59 25.47 41.61 27.33

Fall 2017 n 14 56 56 34

% 8.75 35.00 35.00 21.25

Irving Park

Don’t Agree Agree a little Mostly Agree Agree a lot

Fall 2016 n 19 57 104 60

% 7.92 23.75 43.33 25.00

Spring 2017 n 5 26 44 23

% 5.10 26.53 44.90 23.47

Fall 2017 n 7 48 49 30

% 5.22 35.82 36.57 22.39


Lindblom

Don’t Agree Agree a little Mostly Agree Agree a lot

Fall 2016 n 2 8 9 11

% 6.67 26.67 30.00 36.67

Spring 2017 n 17 38 38 29

% 13.93 31.15 31.15 23.77

Fall 2017 n 1 1 3 4

% 11.11 11.11 33.33 44.44

Lovett

Don’t Agree Agree a little Mostly Agree Agree a lot

Fall 2016 n 12 19 27 45

% 11.65 18.45 26.21 43.69

Spring 2017 n 11 12 32 60

% 9.57 10.43 27.83 52.17

Fall 2017 n 3 16 21 58

% 3.06 16.33 21.43 59.18

Figure 2

Brooks

Don’t Agree Agree a little Mostly Agree Agree a lot

Fall 2016 n 18 13 13 11

% 32.73 23.64 23.64 20.00

Spring 2017 n 13 12 15 11

% 25.49 23.53 29.41 21.57

Fall 2017 n 8 13 13 13

% 17.02 27.66 27.66 27.66

Disney II

Don’t Agree Agree a little Mostly Agree Agree a lot

Fall 2016 n 38 64 78 129

% 12.30 20.71 25.24 41.75

Spring 2017 n 35 63 123 95

% 11.08 19.94 38.92 30.06

Fall 2017 n 44 87 90 84

% 14.43 28.52 29.51 27.54


Henry

Don’t Agree Agree a little Mostly Agree Agree a lot

Fall 2016 n 4 14 16 24

% 6.90 24.14 27.59 41.38

Spring 2017 n 24 43 51 47

% 14.55 26.06 30.91 28.48

Fall 2017 n 15 58 36 64

% 8.67 33.53 20.81 36.99

Irving Park

Don’t Agree Agree a little Mostly Agree Agree a lot

Fall 2016 n 54 64 72 66

% 21.09 25.00 28.13 25.78

Spring 2017 n 36 29 25 10

% 36.00 29.00 25.00 10.00

Fall 2017 n 26 40 44 28

% 18.84 28.99 31.88 20.29

Lindblom

Don’t Agree Agree a little Mostly Agree Agree a lot

Fall 2016 n 14 6 6 4

% 46.67 20.00 20.00 13.33

Spring 2017 n 39 51 23 12

% 31.20 40.80 18.40 9.60

Fall 2017 n 3 4 1 0

% 37.50 50.00 12.50 0.00

Lovett

Don’t Agree Agree a little Mostly Agree Agree a lot

Fall 2016 n 19 23 32 30

% 18.27 22.12 30.77 28.85

Spring 2017 n 12 29 36 36

% 10.62 25.66 31.86 31.86

Fall 2017 n 9 23 31 41

% 8.65 22.12 29.81 39.42


Figure 3

Brooks

Don’t Agree Agree a little Mostly Agree Agree a lot

Fall 2016 n 2 7 13 4

% 7.69 26.92 50.00 15.38

Spring 2017 n 1 5 2 4

% 8.33 41.67 16.67 33.33

Fall 2017 n 1 1 0 0

% 50.00 50.00 0.00 0.00

Disney II

Don’t Agree Agree a little Mostly Agree Agree a lot

Fall 2016 n 3 14 42 58

% 2.56 11.97 35.90 49.57

Spring 2017 n 4 20 48 90

% 2.47 12.35 29.63 55.56

Fall 2017 n 1 4 24 23

% 1.92 7.69 46.15 44.23

Henry

Don’t Agree Agree a little Mostly Agree Agree a lot

Spring 2017 n 2 6 13 11

% 6.25 18.75 40.63 34.38

Fall 2017 n 4 19 13 3

% 10.26 48.72 33.33 7.69

Irving Park

Don’t Agree Agree a little Mostly Agree Agree a lot

Fall 2016 n 11 44 84 47

% 5.91 23.66 45.16 25.27

Spring 2017 n 5 14 36 20

% 6.67 18.67 48.00 26.67

Fall 2017 n 6 34 33 28

% 5.94 33.66 32.67 27.72

Lindblom

Don’t Agree Agree a little Mostly Agree Agree a lot

Fall 2016 n 2 7 8 10

% 7.41 25.93 29.63 37.04

Spring 2017 n 10 29 32 24

% 10.53 30.53 33.68 25.26


Lovett

Don’t Agree Agree a little Mostly Agree Agree a lot

Fall 2016 n 3 8 10 28

% 6.12 16.33 20.41 57.14

Spring 2017 n 1 4 12 29

% 2.17 8.70 26.09 63.04

Fall 2017 n 0 7 6 11

% 0.00 29.17 25.00 45.83

Figure 4

Brooks

Don’t Agree Agree a little Mostly Agree Agree a lot

Fall 2016 n 5 8 9 6

% 17.86 28.57 32.14 21.43

Spring 2017 n 2 3 4 4

% 15.38 23.08 30.77 30.77

Fall 2017 n 2 0 0 0

% 100.00 0.00 0.00 0.00

Disney II

Don’t Agree Agree a little Mostly Agree Agree a lot

Fall 2016 n 5 17 23 69

% 4.39 14.91 20.18 60.53

Spring 2017 n 11 32 67 53

% 6.75 19.63 41.10 32.52

Fall 2017 n 9 20 14 8

% 17.65 39.22 27.45 15.69

Henry

Don’t Agree Agree a little Mostly Agree Agree a lot

Spring 2017 n 0 7 10 16

% 0.00 21.21 30.30 48.48

Fall 2017 n 4 13 12 12

% 9.76 31.71 29.27 29.27


Irving Park

Don’t Agree Agree a little Mostly Agree Agree a lot

Fall 2016 n 37 41 58 55

% 19.37 21.47 30.37 28.80

Spring 2017 n 26 23 18 8

% 34.67 30.67 24.00 10.67

Fall 2017 n 16 29 32 25

% 15.69 28.43 31.37 24.51

Lindblom

Don’t Agree Agree a little Mostly Agree Agree a lot

Fall 2016 n 13 6 5 3

% 48.15 22.22 18.52 11.11

Spring 2017 n 28 37 19 12

% 29.17 38.54 19.79 12.50

Lovett

Don’t Agree Agree a little Mostly Agree Agree a lot

Fall 2016 n 3 10 11 24

% 6.25 20.83 22.92 50.00

Spring 2017 n 4 9 15 18

% 8.70 19.57 32.61 39.13

Fall 2017 n 0 4 8 12

% 0.00 16.67 33.33 50.00

Figure 5

4th Grade

Don’t Agree Agree a little Mostly Agree Agree a lot

Fall 2016 n 16 26 37 36

% 13.91 22.61 32.17 31.30

Spring 2017 n 15 47 41 63

% 9.04 28.31 24.70 37.95

Fall 2017 n 6 50 56 55

% 3.59 29.94 33.53 32.93


5th Grade

Don’t Agree Agree a little Mostly Agree Agree a lot

Fall 2016 n 5 26 38 56

% 4.00 20.80 30.40 44.80

Spring 2017 n 5 10 56 70

% 3.55 7.09 39.72 49.65

Fall 2017 n 12 25 51 46

% 8.96 18.66 38.06 34.33

6th Grade

Don’t Agree Agree a little Mostly Agree Agree a lot

Fall 2016 n 11 22 69 49

% 7.28 14.57 45.70 32.45

Spring 2017 n 6 38 80 75

% 3.02 19.10 40.20 37.69

Fall 2017 n 6 34 50 85

% 3.43 19.43 28.57 48.57

7th Grade

Don’t Agree Agree a little Mostly Agree Agree a lot

Fall 2016 n 16 56 80 47

% 8.04 28.14 40.20 23.62

Spring 2017 n 26 55 73 57

% 12.32 26.07 34.60 27.01

Fall 2017 n 16 49 53 49

% 9.58 29.34 31.74 29.34

8th Grade

Don’t Agree Agree a little Mostly Agree Agree a lot

Fall 2016 n 20 46 75 61

% 9.90 22.77 37.13 30.20

Spring 2017 n 11 36 55 48

% 7.33 24.00 36.67 32.00

Fall 2017 n 17 25 39 29

% 15.45 22.73 35.45 26.36


Figure 6

4th Grade

Don’t Agree Agree a little Mostly Agree Agree a lot

Fall 2016 n 27 39 28 31

% 21.60 31.20 22.40 24.80

Spring 2017 n 27 40 52 50

% 15.98 23.67 30.77 29.59

Fall 2017 n 28 65 42 34

% 16.57 38.46 24.85 20.12

5th Grade

Don’t Agree Agree a little Mostly Agree Agree a lot

Fall 2016 n 8 21 30 66

% 6.40 16.80 24.00 52.80

Spring 2017 n 9 30 55 49

% 6.29 20.98 38.46 34.27

Fall 2017 n 9 42 36 55

% 6.34 29.58 25.35 38.73

6th Grade

Don’t Agree Agree a little Mostly Agree Agree a lot

Fall 2016 n 26 33 44 54

% 16.56 21.02 28.03 34.39

Spring 2017 n 38 54 64 42

% 19.19 27.27 32.32 21.21

Fall 2017 n 24 39 50 71

% 13.04 21.20 27.17 38.59

7th Grade

Don’t Agree Agree a little Mostly Agree Agree a lot

Fall 2016 n 41 47 52 63

% 20.20 23.15 25.62 31.03

Spring 2017 n 53 59 67 33

% 25.00 27.83 31.60 15.57

Fall 2017 n 19 45 52 49

% 11.52 27.27 31.52 29.70


8th Grade

Don’t Agree Agree a little Mostly Agree Agree a lot

Fall 2016 n 45 44 63 50

% 22.28 21.78 31.19 24.75

Spring 2017 n 32 44 35 37

% 21.62 29.73 23.65 25.00

Fall 2017 n 25 34 35 21

% 21.74 29.57 30.43 18.26

Figure 7

4th Grade

Don’t Agree Agree a little Mostly Agree Agree a lot

Fall 2016 n 3 9 18 18

% 6.25 18.75 37.50 37.50

Spring 2017 n 1 11 15 37

% 1.56 17.19 23.44 57.81

Fall 2017 n 3 22 33 25

% 3.61 26.51 39.76 30.12

5th Grade

Don’t Agree Agree a little Mostly Agree Agree a lot

Fall 2016 n 1 8 16 45

% 1.43 11.43 22.86 64.29

Spring 2017 n 3 6 39 67

% 2.61 5.22 33.91 58.26

Fall 2017 n 2 8 10 12

% 6.25 25.00 31.25 37.50

6th Grade

Don’t Agree Agree a little Mostly Agree Agree a lot

Fall 2016 n 7 13 50 36

% 6.60 12.26 47.17 33.96

Spring 2017 n 5 22 40 36

% 4.85 21.36 38.83 34.95

Fall 2017 n 3 17 17 10

% 6.38 36.17 36.17 21.28


7th Grade

Don’t Agree Agree a little Mostly Agree Agree a lot

Fall 2016 n 5 23 37 17

% 6.10 28.05 45.12 20.73

Spring 2017 n 10 22 25 21

% 12.82 28.21 32.05 26.92

Fall 2017 n 1 16 10 14

% 2.44 39.02 24.39 34.15

8th Grade

Don’t Agree Agree a little Mostly Agree Agree a lot

Fall 2016 n 5 27 36 31

% 5.05 27.27 36.36 31.31

Spring 2017 n 4 17 24 17

% 6.45 27.42 38.71 27.42

Fall 2017 n 3 2 6 4

% 20.00 13.33 40.00 26.67

Figure 8

4th Grade

Don’t Agree Agree a little Mostly Agree Agree a lot

Fall 2016 n 3 13 9 21

% 6.52 28.26 19.57 45.65

Spring 2017 n 3 11 23 26

% 4.76 17.46 36.51 41.27

Fall 2017 n 13 32 24 13

% 15.85 39.02 29.27 15.85

5th Grade

Don’t Agree Agree a little Mostly Agree Agree a lot

Fall 2016 n 4 7 14 43

% 5.88 10.29 20.59 63.24

Spring 2017 n 5 21 49 42

% 4.27 17.95 41.88 35.90

Fall 2017 n 0 4 10 19

% 0.00 12.12 30.30 57.58

6th Grade


                Don’t Agree   Agree a little   Mostly Agree   Agree a lot
Fall 2016     n      17             23              28             44
              %    15.18          20.54           25.00          39.29
Spring 2017   n      24             30              28             22
              %    23.08          28.85           26.92          21.15
Fall 2017     n      13             17              14              4
              %    27.08          35.42           29.17           8.33

7th Grade
                Don’t Agree   Agree a little   Mostly Agree   Agree a lot
Fall 2016     n      18             20              21             25
              %    21.43          23.81           25.00          29.76
Spring 2017   n      22             24              22             12
              %    27.50          30.00           27.50          15.00
Fall 2017     n       3             10              12             16
              %     7.32          24.39           29.27          39.02

8th Grade
                Don’t Agree   Agree a little   Mostly Agree   Agree a lot
Fall 2016     n      21             19              34             24
              %    21.43          19.39           34.69          24.49
Spring 2017   n      17             25              11              9
              %    27.42          40.32           17.74          14.52
Fall 2017     n       2              3               6              5
              %    12.50          18.75           37.50          31.25

Figure 9

Brooks
                Agree a little   Mostly Agree   Agree a lot
Fall 2016     n        0                3             0
              %     0.00           100.00          0.00
Spring 2017   n        0                2             0
              %     0.00           100.00          0.00
Fall 2017     n        0                6             1
              %     0.00            85.71         14.29

Disney II
                Agree a little   Mostly Agree   Agree a lot
Fall 2016     n        3               11             4
              %    16.67            61.11         22.22
Spring 2017   n        1                7             3
              %     9.09            63.64         27.27
Fall 2017     n        6               14            12
              %    18.75            43.75         37.50

Henry
                Agree a little   Mostly Agree   Agree a lot
Fall 2016     n        0                1             3
              %     0.00            25.00         75.00
Spring 2017   n        0                0             5
              %     0.00             0.00        100.00
Fall 2017     n        0                2             7
              %     0.00            22.22         77.78

Irving Park
                Agree a little   Mostly Agree   Agree a lot
Fall 2016     n        4                9            11
              %    16.67            37.50         45.83
Spring 2017   n        2                6             9
              %    11.76            35.29         52.94
Fall 2017     n        2                8            11
              %     9.52            38.10         52.38

Lindblom
                Agree a little   Mostly Agree   Agree a lot
Fall 2016     n        5                8            10
              %    21.74            34.78         43.48
Spring 2017   n        2                4             0
              %    33.33            66.67          0.00
Fall 2017     n        0                1             0
              %     0.00           100.00          0.00

Lovett
                Agree a little   Mostly Agree   Agree a lot
Fall 2016     n        0                2             5
              %     0.00            28.57         71.43
Spring 2017   n        0                2             4
              %     0.00            33.33         66.67
Fall 2017     n        0                2             5
              %     0.00            28.57         71.43


Figure 10

Brooks
                Don’t Agree   Agree a little   Mostly Agree   Agree a lot
Fall 2016     n       0              0               1              2
              %     0.00           0.00           33.33          66.67
Spring 2017   n       0              1               0              1
              %     0.00          50.00            0.00          50.00
Fall 2017     n       1              1               4              1
              %    14.29          14.29           57.14          14.29

Disney II
                Don’t Agree   Agree a little   Mostly Agree   Agree a lot
Fall 2016     n       1              3              10              4
              %     5.56          16.67           55.56          22.22
Spring 2017   n       0              0               5              6
              %     0.00           0.00           45.45          54.55
Fall 2017     n       1              4              14             13
              %     3.13          12.50           43.75          40.63

Henry
                Don’t Agree   Agree a little   Mostly Agree   Agree a lot
Fall 2016     n       0              2               2              0
              %     0.00          50.00           50.00           0.00
Spring 2017   n       0              1               3              1
              %     0.00          20.00           60.00          20.00
Fall 2017     n       0              1               5              3
              %     0.00          11.11           55.56          33.33

Irving Park
                Don’t Agree   Agree a little   Mostly Agree   Agree a lot
Fall 2016     n       1              4               7             12
              %     4.17          16.67           29.17          50.00
Spring 2017   n       0              2               4             10
              %     0.00          12.50           25.00          62.50
Fall 2017     n       0              1              11              9
              %     0.00           4.76           52.38          42.86


Lindblom
                Don’t Agree   Agree a little   Mostly Agree   Agree a lot
Fall 2016     n       3             10               7              3
              %    13.04          43.48           30.43          13.04
Spring 2017   n       1              2               2              1
              %    16.67          33.33           33.33          16.67
Fall 2017     n       1              0               0              0
              %   100.00           0.00            0.00           0.00

Lovett
                Don’t Agree   Agree a little   Mostly Agree   Agree a lot
Fall 2016     n       0              0               3              4
              %     0.00           0.00           42.86          57.14
Spring 2017   n       0              0               1              5
              %     0.00           0.00           16.67          83.33
Fall 2017     n       0              0               1              6
              %     0.00           0.00           14.29          85.71

Figure 11

Grades 1-3
                Agree a little   Mostly Agree   Agree a lot
Fall 2016     n        0                5            10
              %     0.00            33.33         66.67
Spring 2017   n        1                3            10
              %     7.14            21.43         71.43
Fall 2017     n        1               12            15
              %     3.57            42.86         53.57

Grades 4-5
                Agree a little   Mostly Agree   Agree a lot
Fall 2016     n        0                4            12
              %     0.00            25.00         75.00
Spring 2017   n        1                5             8
              %     7.14            35.71         57.14
Fall 2017     n        1               10            20
              %     3.23            32.26         64.52


Grades 6-8
                Agree a little   Mostly Agree   Agree a lot
Fall 2016     n       10               22            17
              %    20.41            44.90         34.69
Spring 2017   n        2               17             8
              %     7.41            62.96         29.63
Fall 2017     n        7               20            16
              %    16.28            46.51         37.21

Figure 12

Grades 1-3
                Agree a little   Mostly Agree   Agree a lot
Fall 2016     n        4                6             5
              %    26.67            40.00         33.33
Spring 2017   n        2                4             7
              %    15.38            30.77         53.85
Fall 2017     n        2               13            13
              %     7.14            46.43         46.43

Grades 4-5
                Agree a little   Mostly Agree   Agree a lot
Fall 2016     n        3                6             7
              %    18.75            37.50         43.75
Spring 2017   n        1                4             9
              %     7.14            28.57         64.29
Fall 2017     n        2               16            13
              %     6.45            51.61         41.94

Grades 6-8
                Don’t Agree   Agree a little   Mostly Agree   Agree a lot
Fall 2016     n       5             12              16             16
              %    10.20          24.49           32.65          32.65
Spring 2017   n       1              4               7             15
              %     3.70          14.81           25.93          55.56
Fall 2017     n       3              4              21             15
              %     6.98           9.30           48.84          34.88


Appendix E: School Profiles

Gwendolyn Brooks College Preparatory Academy

Personalized Learning in Context

We observed several aspects of personalized learning being implemented at Brooks. For example, the school focuses on using formative assessments to determine what students need to learn, and students are aware of and familiar with this practice. While this is no doubt a common (and effective) practice, it also relates to one of the PL practices we identified, “data-driven consultation.” We also observed project-based learning at Brooks, evidence of which is presented below. Additionally, Brooks has a set of social/emotional tenets that it encourages students to keep in mind, and it identified social/emotional learning as a possible component of PL. Lastly, this school is conscious of what other BSC cohort schools are doing. For example, one staff member mentioned they “stole” the role of classroom assistants from another BSC; classroom assistants are older students who want to go into education as a career and who mentor and instruct younger students.

Interviewees specifically mentioned time and funding as challenges to implementing PL. The school’s aim is to implement PL schoolwide within five years, but that is dependent on having adequate funding and technology.¹

Selected Personalized Learning Practices and Examples from Observations, Interviews, and Team Meetings

Personalized/learner-focused/targeted instruction
Interview/PLC response: The teacher explains that students are currently grouped with other people with whom they work well. After the DFA [formative assessment], they will be grouped with students working on the same things as them. The teacher explains that they want to find a balance between collaboration and personalization; they had lost some collaboration because, during personalization, grouping was more homogeneous and students were largely working by themselves.

Competency- or proficiency-based progression
Interview response: The teacher says that all students have completed a check for understanding (CfU) that allowed them to proceed to a later concept, but they are not all working on the exact same thing within the worksheet. If they go beyond the class, they use Khan Academy to accelerate. They have to pass their district-wide algebra test at the end of the year.
Post-PLC feedback: We discuss the goals, content, and supports for our CBL pilot during these [meetings].
Example from practice: At the beginning of class, there were two problems up on the projector as a formative assessment. On the white board was a table with a “Got it” column (listing tasks for that day if students got the questions right, such as “correct #3-5 if needed” and “63-65”) and a “More practice” column (listing tasks if they got them wrong, such as “correct #1 and 2” and “multiple choice handout 7, 14, 17”). Students worked on those problems individually first, then the class went over the answers together. Students got a point for each one they got correct, which then determined which “intervention” or “acceleration” they would get.

¹ Note that Brooks, a secondary school, started incorporating PL in grade 7 in 2016-17 and in grade 8 in 2017-18. Therefore, while WEC visited Brooks as often as we visited the other Breakthrough Schools: Chicago, there may be less data present for this school compared to the others.

Flexible learning environments/schedules
Interview response: The principal said that the school bought new desks to support their flexible learning environments, but they were a bit bigger than expected.

Project-based learning
Interview with STEM director:
● Partnered with Lucas Education Research for the architecture/platform for the AP KIA Environmental Science course to facilitate a project-based model; this is a teacher-facing platform.
● Partnered with the Field Museum for place-based project learning.
Example from practice: Students’ first activity, “triangle inequality,” is listed at the top of the worksheet the teacher is passing out. The assignment needs to be completed by the end of class today (block schedule – 100-minute class), and students are to work in groups. The teacher gives every group three pieces of spaghetti. Their goal is to make some triangles and measure the lengths of the sides; then manipulate the spaghetti so that it cannot form a triangle, and measure the sides; then make a conjecture and test it to eventually work out how triangle inequalities function. The teacher asks students to read the directions very carefully and gives an estimated time to finish the assignment – about 40 minutes. She sets a timer for 20:00 to help students pace themselves; students should be about halfway done when it goes off.

Peer instruction
Interview/PLC response: The teacher mentions that Stacy (a senior classroom assistant) is doing pull-outs with students who are having trouble with grammar. The principal explains that the school “copied” the classroom assistant concept from Lindblom; the assistants are seniors who are interested in going into education. They made a 4-3-2-1 rubric by which their own work is graded.
Example from practice: At the end of the lesson, the learning assistant (a senior) taught questions 2 and 4 from the CFU [Check for Understanding], projected onto the white board. The teacher asked the two groups who were not being taught to work quietly. Even some students who were not at the front tables paid attention to her. Shortly after the student started instructing, the teacher had to cut her off because it was the end of the day.


General Personalized Learning Survey Data²

At Brooks, the number of teacher respondents was smaller than at other schools, as only seventh grade incorporated PL in 2016-17. Therefore, we will not report teacher survey responses in this profile.

Student respondents by gender

          Fall 2016       Spring 2017      Fall 2017        Total
Gender    n      %        n      %         n      %         n      %
Male      16    28.6%     15    28.8%      22    40.0%      53    32.5%
Female    40    71.4%     37    71.2%      33    60.0%      110   67.5%
Total     56              52               55               163

Student respondents by race/ethnicity

                          Fall 2016       Spring 2017      Fall 2017        Total
Race/ethnicity            n      %        n      %         n      %         n      %
Black/African-American    46    82.1%     41    78.9%      40    74.1%      127   78.4%
Hispanic                   7    12.5%      5     9.6%       8    14.8%      20    12.4%
White                      0     0.0%      0     0.0%       0     0.0%       0     0.0%
Other                      3     5.4%      6    11.5%       6    11.1%      15     9.3%
Total                     56              52               54               162

Teacher survey respondents

                Grade taught         Subject taught                                Implementing PL Strategies³
            n   6-8   9-12   Other   ELA   Math   Soc. Studies   Science   Other   B    R    C
Fall 2016   4    0      2      2      0     1          1            1        2     1    3    0
Spr 2017    2    2      2      0      0     1          1            1        0     0    2    0
Fall 2017   7    7      2      2      2     2          2            2        2     3    3    1

² Totals across tables may not be the same, as students self-report their demographic characteristics.
³ N = Not yet implementing. B = Beginning to implement a few. R = Regularly implementing for a portion of the time. C = Consistently implementing all of the time.


Relationship to LEAP Framework

The following juxtaposes qualitative data with survey data, specific to each of the four areas of the LEAP Framework.

Learner Focused

LEAP Framework item: Learning opportunities reflect an understanding of individual needs, interests, and strengths with respect to academic skills and needs

Practice cluster (targeted instruction + data-based consultation): The teacher likes Gooru [a software product] because it does not give the answer, just right or wrong. This makes real, authentic interventions more possible. Near the end of the class period, the teacher has the students finish their CFU (Check for Understanding) even if they were not done, so that he could know who needed intervention. It’s a series of 4 questions with bubbles students need to fill in.

Observation: The 8th grade STEM teachers were going to give their formative today, but they realized students are not ready, so they are waiting until Wednesday. They also are going to check notebooks later this week (to assess completeness and progress).

Related Survey Responses

We have highlighted the Learner Focused survey items in which students indicated greater agreement from fall to spring.

Learner Focused survey items – all student respondents⁴

Survey item                                                          Fall 2016   Spr 2017   Fall 2017

My teacher knows what I am interested in. 1.69 1.86 1.87

My teacher knows what things are easy for me to learn. 1.98 2.16 1.96

My teacher knows how I learn best. 1.95 1.94 1.93

My teacher notices if I have trouble learning something. 2.95 2.87 2.72

My teacher will always listen to my ideas. 2.51 2.52 2.54

My teacher knows about my life at home. 1.15 1.17 1.23

My teacher knows about how things are going with my friends. 1.29 1.62 1.38

My teacher knows about the activities I like to do outside of school. 1.29 1.67 1.35

⁴ For this and other sets of survey responses, we averaged responses, with a response of 1 indicating the least agreement or frequency level. Questions should not be compared with each other; while most items include 4 possible response categories, some include more or fewer.


My teacher knows who my friends from school are. 2.07 2.29 2.20

My teacher knows what I do on the weekend. 1.19 1.12 1.20

My teacher connects what we are learning to the world outside the classroom. 2.15 2.31 1.87

I get the chance to do school work I am good at. 2.60 2.53 2.53

I do school work that makes me want to try hard. 3.13 2.84 2.83

I learn about things I am interested in. 2.60 2.14 2.33

What I learn in class connects with my life outside of school. 1.85 1.94 1.91

My school work interests me. 2.36 2.06 2.00

I know what I am learning now will help me when I grow up. 2.71 2.45 2.80

I learn valuable skills. 3.13 3.04 3.02
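To make the averaging described in footnote 4 concrete, the item means above are weighted means of the response codes. A minimal sketch (ours, not the report’s code; the example counts reuse the 4th-grade Fall 2016 row of Figure 7):

```python
def likert_mean(counts):
    """Mean response on a 1..k scale, where counts[i] is the number of
    respondents who chose option i+1 (1 = least agreement, per footnote 4)."""
    codes = range(1, len(counts) + 1)
    return sum(code * n for code, n in zip(codes, counts)) / sum(counts)

# Example: 4th-grade Fall 2016 responses from Figure 7 (n = 3, 9, 18, 18)
print(round(likert_mean([3, 9, 18, 18]), 2))  # -> 3.06
```

Because items differ in their number of response options, these means are comparable across waves for the same item but not across items, as the footnote cautions.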

Learner Led

LEAP Framework item: Assess and monitor their own progress

Observation: The teacher started the class by passing out a four-week planner for the students to use to track their assignments. She discussed the assignment sheet with them, along with the grading rubric and its relationship to formative and summative assessments. The teacher said she would post it online, with hyperlinks to all of the assignments. She also posted answer keys to everything. The teacher then passed around an assignment sheet for geometry for the next 4 weeks (Continual Progress Report – CPR).

Interview/PLC: Students get credit for keeping track of assignments and homework.

LEAP Framework item: Collaborate with others to achieve goals

Practice cluster (flexible learning environments + project-based learning): The students were involved in a group project to develop a presentation about the Middle Ages. Even though desks were arranged in straight rows, students were organized into groups of four to complete the presentations.

Observation: A student goes to the teacher to ask how to test her conjecture with a partner. The teacher tells her to go to another group that is around the same point as she is. She compares her data with the data of students from a different group and fills in her worksheet.


LEAP Framework item: Reflect upon their learning and continually refine their strategies

Observation: One student figured out the spaghetti triangle activity – he was able to use pieces of spaghetti to create a set of three pieces that did NOT form a triangle. The students realized that the figure he made had the same dimensions as one of the triangles they had created before, one they thought would work. They had an internal debate over which was correct – could those side measurements form a triangle or not? They consulted with a student from a neighboring group, who told them that for each triangle that worked, small + medium > the longest side, and for each triangle that did not, small + medium < the longest side. The students in the group erased the set of data points that they initially thought would form a triangle.

Related Survey Responses

We have highlighted the Learner Led survey items in which students indicated greater agreement from fall to spring. In some cases, less agreement could be a positive development. For instance, if teachers set or keep track of learning goals less often, that could reflect increased student agency.

Learner Led survey items – all student respondents

Survey item                                                          Fall 2016   Spr 2017   Fall 2017

My teacher lets me include topics I like in the lessons and units that we study. 2.11 2.18 2.04

My teacher helps me figure out how I will get all my work done. 3.18 3.16 2.98

I talk with my teacher about the kinds of activities that help me learn best. 1.67 1.84 1.71

I talk to my teacher about how I can challenge myself at school. 1.74 1.89 1.75

I talk with my teacher about which subjects I am really good at. 1.72 1.74 1.70

I talk with my teacher about subjects I need help learning. 2.21 2.24 2.63

I tell my teacher what would help me learn. 2.36 2.03 1.92

I can decide which learning activities I will do. 2.33 2.17 1.71

I can choose which students I will do my work with. 3.02 2.90 2.83

I can decide which skills I work on. 2.42 2.45 2.05

I can decide how much time I will spend doing my schoolwork. 2.52 2.75 3.05

I can decide where I do my schoolwork. 2.63 2.80 2.90

I can decide the order in which I complete my assignments. 3.33 3.17 3.24

When I am doing school work, I know exactly what I am trying to learn. 3.18 3.25 2.83

I know what my learning goals are. 3.53 3.45 3.07


I know which skills I need to improve. 3.69 3.80 3.40

My teacher sets learning goals for me. 2.73 3.00 3.00

I set learning goals with my teacher. 2.67 2.46 2.48

I keep track of which of my learning goals I have met. 3.02 2.97 2.82

My teacher keeps track of which of my learning goals I have met. 2.68 2.79 2.46

I keep track of whether I have completed my school work. 3.51 3.56 3.61

I look at my test results to see which skills I still need to work on. 3.40 3.72 3.43

I look at schoolwork other than tests to see what skills I still need to work on. 3.10 3.46 3.14

I learn from mistakes in my work. 3.23 3.24 3.18

Most students in my class help each other learn. 2.96 2.78 2.50

My teacher helps me figure out what I can do to improve my school work. 3.00 3.04 2.70

When I am given the chance to revise my work, I do. 3.38 3.12 3.26

My teacher helps me see which things I am doing well on. 2.79 2.82 2.65

My teacher asks me to decide which skills I still need to work on. 2.47 2.67 2.43

I work with different groups of students during the day. 2.74 2.43 2.61

Other students give me feedback on my school work. 2.74 2.88 2.26

I give feedback to other students on their school work. 2.81 2.88 2.34

I work together with other students on school work. 3.40 3.36 3.03

To what extent does feedback from other students help you improve your work? 2.94 2.65 2.64

When I don’t understand something in class I ask other students for help. 2.85 2.75 2.50

I try to figure problems out on my own before I ask my teacher for help. 3.35 3.26 2.91

I know where to go for help when I have a question. 3.38 3.24 3.24

I can get help on my work in more than one way. 3.40 3.27 2.82

I don’t mind asking the teacher a question in class. 3.16 2.94 2.79


Learner Demonstrated

LEAP Framework item: Enter at a level appropriate to their prior knowledge and learning needs

Practice cluster (targeted instruction + competency-based progression): The teacher was at a half table with four students talking through a worksheet on their Chromebooks; a teachers’ aide was floating around checking in with students. The teacher said they group students by their summative scores on the last unit. Students who have not shown mastery spend additional time with the teacher.

LEAP Framework item: Have supports and pacing that fit their learning

Observation: When asked about the work, a student said she likes doing personalized learning this way because it is at her own pace.

LEAP Framework item: Demonstrate proficiency when ready

Observation: A student was asked about the assessment process and was able to describe in detail how they take both formative assessments, to figure out where they are and what else they need to work on, and summative assessments, to show mastery/proficiency. The student described Mastery Connect.

Interview/PLC: One teacher discussed a student who really wanted to take an assessment early, and she let him (she wanted to catch him while he was “on,” since he was having a good day).

LEAP Framework item: Receive credit (recognition of learning) based on demonstrated proficiency (not seat time)

Practice cluster (competency-based progression + project-based learning): The teacher explains that a spaghetti triangle activity is a DFA (daily formative assessment) that will lead to differentiation. Some students will get the activity; others will not. Once they are done, they will work in GeoGebra (an online tool) to manipulate triangles. (The teacher demonstrates how it works.) Eventually, this will lead to the same sort of project with quadrilaterals. One group finishes the activity – they figure out the patterns of sides (a, b, and c) that successfully create triangles. One student uses the word “cab” to remember how to do this in the future. The teacher tells the group to get Chromebooks, now that they have finished the worksheet. One student from that group goes to each other group and asks whether they have created the equation small + medium > large. Two groups have; two groups have figured out the pattern but did not create any equations.

Observation: A student answered an observer’s questions about the assessment system. She explained how the formative assessments let you know where you were, and that it is an adjustment to get used to “failing” when you haven’t shown proficiency.


Related Survey Responses

We have highlighted the Learner Demonstrated survey items in which students indicated greater agreement from fall to spring.

Learner Demonstrated survey items – all student respondents

Survey item                                                          Fall 2016   Spr 2017   Fall 2017

My teacher asks me what I know about new topics before I begin working on them. 2.67 2.54 2.67

If I can show I already understand a topic, my teacher lets me skip it. 1.39 1.54 1.35

My teacher quizzes us on what we know about new topics before we begin learning about them. 2.30 2.20 2.38

When we are learning something new, my teacher helps us understand how it fits in with what we’ve learned before. 3.07 3.16 2.96

When I feel stuck on a problem in class I can skip it and come back to it later. 2.51 2.85 2.36

When I feel stuck on a problem, my teacher or another adult helps me. 3.16 3.23 3.26

My teacher lets me work as fast or as slow as I want. 2.80 2.88 2.78

My teacher gives me as much time as I need to understand my school work. 2.77 2.60 2.38

I am allowed to finish my school work after school if I cannot finish it in class. 3.11 3.31 3.20

My teacher lets me take breaks from my work when I think I need it. 3.02 2.77 2.50

My teacher gives me harder work as I learn more about a topic. 2.98 3.08 2.62

I can move ahead to new topics as soon as I show what I have learned, even if other students are still working on it. 2.31 2.47 2.66

When I don’t understand a topic, I can spend more time on it while other students move on. 2.45 2.43 2.48

My teacher gives different assignments to students based on what they need to learn. 2.00 1.90 1.72

Students who don’t do well on a test are allowed to retake it. 3.59 3.73 3.83

How often have you noticed other students doing different assignments than you on the same topics? 1.91 1.94 2.12

As soon as I have mastered a skill, I can show my teacher that I have learned it. 2.98 2.98 3.00

I can decide how I will show my teacher that I have mastered a skill. 1.89 1.71 1.90

In this class students choose different ways to show that they have mastered the same skill. 1.83 1.60 1.79

My teacher lets me decide the best way to show what I have learned. 1.95 1.67 2.15


Demographics and Academic Outcomes 2016-17 (from publicly available CPS data)

Demographics

          % Black   % Hispanic   % White   % Other   % Bilingual   % Diverse Learner   % Free/Reduced Lunch
CPS        37.7%      46.5%       9.9%      5.9%       17.1%            13.7%                80.2%
BSC        45.9%      39.0%      10.7%      4.4%       10.6%             8.6%                65.0%
School     81.4%      15.8%       0.7%      2.1%        0.2%             6.0%                71.8%

NWEA MAP Attainment (Grades 3-8 combined)

          Reading                          Math
          Avg RIT Score   Percentile      Avg RIT Score   Percentile
CPS          214.6                           222.5
BSC          221.5                           230.5
School       235.5            99             242.2             95

NWEA MAP Growth (Grades 3-8 combined)

Reading
          Avg Pre-test   Avg Post-test   Avg RIT   % Making National   National School
          RIT Score      RIT Score       Growth    Average Growth      Growth Percentile
CPS          208.5          215.2          6.7           60.5
BSC          215.8          221.7          5.9           63.2
School       232.7          235.5          2.8           63.6                 68

Math
          Avg Pre-test   Avg Post-test   Avg RIT   % Making National   National School
          RIT Score      RIT Score       Growth    Average Growth      Growth Percentile
CPS          215.1          223.2          8.1           56.7
BSC          223.5          230.8          7.3           56.8
School       241.8          242.9          1.1           43.6                  8


SAT/PSAT

Test                 Avg EBRW score   Avg math score   Avg Composite   % College Ready
SAT – District            483              472              956             35.6%
SAT – School              585              548             1133             91.2%
PSAT 10 – District        447              448              896             41.8%
PSAT 10 – School          516              492             1008             88.9%
PSAT 9 – District         417              431              849             43.1%
PSAT 9 – School           495              494              989             92.1%

Accountability

          SQRP Total Points Earned   SY 2016-17 SQRP Rating   SY 2016-17 Accountability Status
School              4.5                     Level 1+                  Good Standing

Potential Areas for Reflection

● How are PL practices going to be vertically aligned as Brooks continues to add grades in which it incorporates PL?

● One question that emerged during our conferring meeting was how to test fidelity of implementation; Brooks could use the PL practices we identified from all schools’ Blueprints to assist with that process.


Irving Park

Personalized Learning in Context

CICS Irving Park is a K-8 charter school. In 2016-17, it had incorporated personalized learning in grades 3-4 and 6-8 to varying extents. In grades 6-8, Irving Park uses Summit Basecamp as its PL platform; staff reflected that implementation went smoothly.

• At Irving Park, we observed several of the PL practices we identified in the blueprint, and many appear to relate to the LEAP Framework, as shown below. For instance, personal learning paths are common throughout the school (though school leaders indicated that they may not be truly personalized yet). We also observed several instances of students reflecting on their learning.

• One potential weakness in Irving Park’s PL approach (based on observations and survey responses) is that it is strongly teacher-directed; that is, students may not have as much agency in deciding what to work on. School leaders acknowledged the need to continue building teacher capacity with respect to competency-based progression. Additionally, Irving Park is consciously trying to engage more directly with its community (i.e., the Learner Connected piece of the LEAP Framework).

Selected Personalized Learning Practices and Examples from Observations, Interviews, and Team Meetings

Personalized/learner-focused/targeted instruction
Interview/PLC response: Personalizing math in 5th grade – students got to choose between ThinkThrough Math and ST Math. Lexia is personalized, tailored to meet the needs of the students where they are.
Example from practice: An aide is working with one student on fractions. Three other students sit at a table working on ST Math.

Personal learning paths
Interview/PLC response: Students have choice within the PLP; if they finish independent practice, they can work on different things, and there are different sites they can go to to show their work. Once they get to a certain point, they get to choose what to work on.
Example from practice: The students in one group are working on drawing equal fractions within circles, a worksheet that is part of their PLPs. The PLP (which also includes the ST Math progress tracker) is scheduled out – there are worksheets they are to complete each day, and each page is labeled “Tuesday,” “Wednesday,” etc.

Competency- or proficiency-based progression
Interview/PLC response: Thinking about more fluid grouping (e.g., if students progress at a different pace through a math curriculum).

Flexible learning environments/schedules
Interview response: The school decided flexible learning was the way to go, so PD is focused on flexible learning.
PLC discussion: There is still choice around where students sit, but because students were not hitting their minutes, staff had to make it more structured.

Project-based learning
Example from practice: A student explained that the presentation had to begin with two slides about the famous earthquake. Then, the class challenge was to build an earthquake-proof structure and to talk about the design principles in the PowerPoint. The student said that the teacher had built a table that shook. Students used construction materials to build a small model structure that was supposed to stay up as the table shook. Students would take pictures of their structure and describe the experience in the PowerPoint.

Data-driven consultation

Interview response: An instructor normally meets with mentees. However, there is a student who is struggling. So instead of doing mentoring, she looked at data to see who else was struggling. If someone is feeling bad about themselves, that trumps mentoring.

Post-PLC feedback: We used data to inform our instruction (Skills locator, NWEA, RGR, etc.)

Peer instruction

The teacher asked, “would anyone like to teach?” Seven of the 12 students in one area raised their hands. The teacher asked one of the students to go to the teacher’s seat. The teacher then asked the student to talk through his answer. As the student began to read the problem back to the group, the teacher reminded him to show his work to the other students. The teacher then checked for understanding by inviting the other students to do a gesture with their wrists to indicate whether they got the same answer.

Positive Behavioral Interventions and Supports (PBIS) and restorative practices

In the PLC, educators talked about Autism Awareness Week and the importance of learning to show empathy and compassion. A lot of times, negative terms are used (e.g., this student has a “problem”).

General Survey Data1

Student respondents by grade

| Grade | Fall 2016 | Spring 2017 | Fall 2017 | Total |
| --- | --- | --- | --- | --- |
| 4 | 57 (21.2%) | 30 (28.3%) | 16 (11.0%) | 103 (19.8%) |
| 6 | 83 (30.9%) | 53 (50.0%) | 50 (34.3%) | 186 (35.7%) |
| 7 | 53 (19.7%) | 22 (20.8%) | 40 (27.4%) | 115 (22.1%) |
| 8 | 76 (28.3%) | 1 (0.9%) | 40 (27.4%) | 117 (22.5%) |
| Total | 269 | 106 | 146 | 521 |

Student respondents by gender

| Gender | Fall 2016 | Spring 2017 | Fall 2017 | Total |
| --- | --- | --- | --- | --- |
| Male | 133 (50.2%) | 55 (52.9%) | 66 (46.5%) | 254 (49.7%) |
| Female | 132 (49.8%) | 49 (47.1%) | 76 (53.5%) | 257 (50.3%) |
| Total | 265 | 104 | 142 | 511 |

Student respondents by race/ethnicity

| Race/ethnicity | Fall 2016 | Spring 2017 | Fall 2017 | Total |
| --- | --- | --- | --- | --- |
| Hispanic | 172 (67.7%) | 63 (61.2%) | 101 (72.1%) | 336 (67.6%) |
| White | 46 (18.1%) | 24 (23.3%) | 12 (8.6%) | 82 (16.5%) |
| Black/African-American | 20 (7.9%) | 9 (8.7%) | 10 (7.1%) | 39 (7.9%) |
| Other | 16 (6.3%) | 7 (6.8%) | 17 (12.1%) | 40 (8.1%) |
| Total | 254 | 103 | 140 | 497 |

1 Totals across tables may not be the same, as students self-report their demographic characteristics.

Teacher survey respondents

| | n | Grade taught (K / 1-3 / 4-5 / 6-8 / Other) | Subject taught (ELA / Math / Social Studies / Science / Other) | Implementing PL Strategies2 (B / R / C) |
| --- | --- | --- | --- | --- |
| Fall 2016 | 24 | 4 / 9 / 8 / 9 / 3 | 14 / 13 / 10 / 10 / 5 | 3 / 14 / 7 |
| Spr 2017 | 19 | 3 / 8 / 7 / 12 / 1 | 11 / 9 / 8 / 7 / 4 | 2 / 6 / 11 |
| Fall 2017 | 21 | 3 / 9 / 8 / 11 / 0 | 15 / 12 / 9 / 9 / 3 | 1 / 13 / 7 |

Relationship to LEAP Framework

The following juxtaposes qualitative data with survey data, specific to each of the four areas of the LEAP Framework.

Learner Focused

LEAP Framework Item Data type Content

Learning opportunities reflect an understanding of individual needs, interests, and strengths with respect to academic skills and needs

Practice cluster: Targeted instruction + collaborative learning models

According to the instruction written on the board, one group is doing “group practice: X satisfies an equation?” and one group is doing “independent Khan: testing solutions to equations, introduction to equations.” The two groups of students not with the teacher are working on laptops.

Practice cluster: Horizontal alignment + flexible staffing

Two teachers switch their classrooms. In the classroom next door, a similar station rotation was happening, except in reading, where students were using Lexia.

Observation

The teacher called out the names of 5 students she is going to meet with. She told them that they would get whiteboards and markers once they were ready and paying attention. One of the students worried that being a part of the small group would affect her grade; the teacher explained that if they were struggling with soccer, they would practice, so this is the same thing.

2 N = Not yet implementing. B = Beginning to implement a few. R = Regularly implementing for a portion of the time. C = Consistently implementing all of the time.


Interview/PLC Why would I [an instructor] stand in front of a class to do a lesson for students who learned it last year, or who need more or fewer problems?

Learning opportunities reflect an understanding of individual needs, interests, and strengths with respect to non-academic skills and needs, including social, cultural, and emotional

Practice cluster: Social/emotional/noncognitive learning + Positive Behavioral Interventions and Supports (PBIS) and restorative practices

The teacher stopped to spend some time with a group of two girls. While she talked with these girls, students in the other two groups began to have conversations and work off task. The teacher looked around and said “Surf’s up!” Some, but not the majority, of students responded, “shhhh.” The teacher reminded them to rotate at a level 5 (the best possible self-rating). Students rotated. The teacher again asked the students to self-assess how the rotation went: “On a scale of 1-5, how was that rotation for you personally? Show with your hands. Thanks for being honest. I’m giving blue group a point because they all were ready with their materials.”

Practice cluster: Targeted instruction + Positive Behavioral Interventions and Supports (PBIS) and restorative practices

Two girls are talking inappropriately – the instructor tells them to move farther apart, work on ST Math, and stop talking. She also says they will talk to the teacher about this later. The aide later talks to the student and asks her to apologize at transition time.

Observation One student who is part of the mini-lesson says he did something wrong; the teacher tells him not to be afraid to do something wrong.

Interview/PLC We discussed how to support specific students based on their needs.

Related Survey Responses

We have highlighted the Learner Focused survey items in which students indicated greater agreement from fall to spring.

Learner Focused survey items – all student respondents3

Survey item Fall 2016 Spr 2017 Fall 2017

My teacher knows what I am interested in. 2.66 2.52 2.69

My teacher knows what things are easy for me to learn. 2.75 2.76 2.85

My teacher knows how I learn best. 2.88 2.68 2.83

3 For this and other sets of survey responses, we averaged responses, with a response of 1 indicating the least agreement or frequency level. Questions should not be compared with each other; while most items include 4 possible response categories, some include more or fewer.
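As the footnote describes, each item score is a simple mean of per-student responses, where 1 indicates the least agreement. A minimal sketch of that averaging (the response data here are hypothetical, not the study's actual records):

```python
# Sketch: averaging Likert-style survey responses per item,
# where 1 = least agreement. Response lists are hypothetical.
from statistics import mean

responses = {
    "My teacher knows what I am interested in.": [3, 2, 4, 2, 3],
    "My teacher knows how I learn best.": [4, 3, 3, 2, 4],
}

for item, scores in responses.items():
    # Report the item mean to two decimals, as in the tables above.
    print(f"{item} {mean(scores):.2f}")
```

Because items can have different numbers of response categories, means are comparable across survey waves for the same item but not across items.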


My teacher notices if I have trouble learning something. 3.12 3.00 3.11

My teacher will always listen to my ideas. 3.12 2.87 3.06

My teacher knows about my life at home. 1.73 1.80 2.00

My teacher knows about how things are going with my friends. 2.03 2.09 2.46

My teacher knows about the activities I like to do outside of school. 2.16 2.41 2.62

My teacher knows who my friends from school are. 2.93 3.14 3.07

My teacher knows what I do on the weekend. 1.72 1.55 2.07

My teacher connects what we are learning to the world outside the classroom. 2.65 2.65 2.74

I get the chance to do school work I am good at. 2.77 2.69 2.82

I do school work that makes me want to try hard. 3.06 2.95 3.07

I learn about things I am interested in. 2.75 2.70 2.86

What I learn in class connects with my life outside of school. 2.38 2.40 2.52

My school work interests me. 2.64 2.49 2.75

I know what I am learning now will help me when I grow up. 3.21 3.08 3.16

I learn valuable skills. 3.22 3.18 3.16

Learner Focused Survey Items – all teacher respondents

Survey item Fall 2016 Spr 2017 Fall 2017

Think about the one-to-one meetings you have had with students this year. On average, how often do you discuss with individual students…

● Their interests? 3.82 3.75 3.81

● Their strengths? 4.45 4.50 4.29

● Their learning challenges? 4.64 4.50 4.29

● Their behavior? 4.41 4.19 4.48

● Their feelings or general well-being? 4.55 4.63 4.62

I learn about my students by discussing each student's progress with other adults in the school who are working with the student. 4.13 4.19 4.14

I learn about my students by discussing student progress with families (by phone, email or in person meetings). 3.79 3.81 4.10


I actively seek information about my students' outside interests. 4.04 4.31 4.10

I actively seek information about my students' families. 3.92 4.13 4.19

I know my students' family and home context. 3.67 3.81 4.14

I know my students' community context. 3.58 3.81 3.81

I know my students' learning interests. 3.92 4.31 3.90

I know which specific learning standards my students have met. 4.25 4.31 4.05

I know my students' strengths. 4.38 4.63 4.33

I understand my students' learning challenges. 4.33 4.63 4.29

I understand what motivates my students to learn. 4.00 4.38 4.05

How often do you...

● Incorporate student learning interests into your lessons? 3.79 4.13 3.57

● Assign school work to individual students based on non-academic data (e.g., learning preferences, work habits, social-emotional functioning)? 3.21 3.75 3.62

● Assign schoolwork to individual students based on academic data? 4.00 4.13 4.05

● Help students make a connection between what they are learning in the classroom with life outside of the classroom? 4.50 4.50 4.29

Learner Led

LEAP Framework Item Data type Content

Collaborate with learners to identify and include learner preferences and optimal learning conditions

Practice cluster: Flexible learning environments + gradual release

A teacher said that she was experimenting with flexible seating options based on which kids could handle them.

Interview/PLC

The assistant principal explained that students work together (with the teacher) to decide what group they are in, then grade their own work to decide how they're progressing.

Assess and monitor their own progress

Observation

A student greeter explains Summit. She indicates that she is mostly ahead, but there are a few areas (in yellow) that she still has not completed. There are numbers on each box that show the number of attempts each student makes. When asked what she is going to work on, she says she is going to try to complete the activities she had not completed yet.


Collaborate with others to achieve goals

Observation A few students are collaborating on a slideshow about the Chicago Cubs.

Interview/PLC This group of students is good at collaborating, especially when they are stuck.

Advocate for needed support from teachers, peers, technology and other sources

Interview/PLC

Students learn when it’s an appropriate time to advocate for themselves – far beyond just academics.

Reflect upon learning and continually refine their strategies

Observation

As the students are packing up, the teacher asks them what was the point of today, what was the key takeaway? The students share out loud what they learned. A student said, “We did definitions.” The teacher responded, “Why does that matter?” The student said, “Because we all see them in assessments.” The teacher responded, “Today you need to walk away knowing the difference between the last concept unit and this one.”

Interview/PLC Students are learning why they are falling behind – why do I not enjoy the subject? Is it because I am doing problems over and over? Maybe I can watch videos.

Interview/PLC Flexible learning required reflection on part of students and teachers (how students self-regulate, self-direct)

Related Survey Responses

We have highlighted the Learner Led survey items in which students indicated greater agreement from fall to spring. In some cases, less agreement could be a positive development. For instance, if teachers set or keep track of learning goals less often, that could reflect increased student agency.

Learner Led survey items - all student respondents

Survey item Fall 2016 Spr 2017 Fall 2017

My teacher lets me include topics I like in the lessons and units that we study. 2.75 2.50 2.95

My teacher helps me figure out how I will get all my work done. 3.42 3.14 3.58

I talk with my teacher about the kinds of activities that help me learn best. 2.44 2.20 2.56

I talk to my teacher about how I can challenge myself at school. 2.52 2.49 2.56

I talk with my teacher about which subjects I am really good at. 2.61 2.22 2.68

I talk with my teacher about subjects I need help learning. 2.93 2.58 2.88

I tell my teacher what would help me learn. 2.73 2.42 2.82


I can decide which learning activities I will do. 2.90 2.81 3.01

I can choose which students I will do my work with. 2.97 3.20 3.19

I can decide which skills I work on. 2.98 2.84 3.06

I can decide how much time I will spend doing my schoolwork. 2.90 2.79 2.98

I can decide where I do my schoolwork. 2.78 3.27 3.13

I can decide the order in which I complete my assignments. 3.18 3.15 3.35

When I am doing school work, I know exactly what I am trying to learn. 3.23 3.40 3.34

I know what my learning goals are. 3.59 3.48 3.55

I know which skills I need to improve. 3.56 3.59 3.53

My teacher sets learning goals for me. 3.08 3.10 3.05

I set learning goals with my teacher. 3.32 3.10 3.11

I keep track of which of my learning goals I have met. 3.39 3.19 3.42

My teacher keeps track of which of my learning goals I have met. 3.05 2.87 3.24

I keep track of whether I have completed my school work. 3.56 3.76 3.58

I look at my test results to see which skills I still need to work on. 3.52 3.52 3.51

I look at schoolwork other than tests to see what skills I still need to work on. 3.19 3.08 3.38

I learn from mistakes in my work. 3.34 3.03 3.27

Most students in my class help each other learn. 2.98 3.06 3.25

My teacher helps me figure out what I can do to improve my school work. 3.18 2.98 3.32

When I am given the chance to revise my work, I do. 3.20 3.01 3.28

My teacher helps me see which things I am doing well on. 3.19 2.94 3.23

My teacher asks me to decide which skills I still need to work on. 2.98 2.71 3.11

I work with different groups of students during the day. 3.05 3.00 3.16

Other students give me feedback on my school work. 2.62 2.85 2.99

I give feedback to other students on their school work. 2.72 2.95 2.98

I work together with other students on school work. 3.23 3.15 3.39

To what extent does feedback from other students help you improve your work? 2.82 2.90 3.00

When I don't understand something in class I ask other students for help. 2.87 2.98 3.08

I try to figure problems out on my own before I ask my teacher for help. 3.24 3.27 3.26


I know where to go for help when I have a question. 3.42 3.31 3.26

I can get help on my work in more than one way. 3.17 3.22 3.18

I don't mind asking the teacher a question in class. 3.18 3.13 3.24

Learner Led survey items - all teacher respondents

Survey item Fall 2016 Spr 2017 Fall 2017

Think about the one-to-one meetings you have had with students this year. On average, how often do you...

● Determine with students which learning activities will align with their individual interests? 3.14 3.50 3.19

● Ask students for input on the topics we will study? 3.14 3.25 3.14

● Talk with students about the kinds of activities that will help them learn best? 3.73 3.88 3.24

● Talk with students about how they can challenge themselves at school? 4.14 4.00 3.62

Students make choices on how to learn in class (e.g., read written material, complete a project, whether to work alone or in a group). 3.96 4.38 4.10

Students make choices on what to learn in class (e.g., which skills to practice, what topics to research, which subjects to work on first). 3.42 3.81 3.43

How much do you agree with the following statement? Students can explain how learning activities connect to their learning goals. 2.88 3.13 2.90

Students create goals for their own learning (e.g., which academic skills to improve, what behaviors or work habits to strengthen). 3.58 3.81 4.00

Students create their learning goals with me. 3.38 3.56 3.43

What proportion of your students... Have a unique set of learning goals? 2.63 2.56 2.24

Think about the one-to-one meetings you have had with students this year. On average, how often do you...

● Examine individual student assessment results together with the student? 3.68 3.31 3.48

● Discuss individual student overall progress? 4.14 4.56 3.71

● Provide feedback to individual students on their work? 3.86 4.06 3.95

● Discuss what individual students think are the reasons for their progress or lack of progress? 4.18 4.25 3.57

● Make adjustments to individual student learning goals and/or learning plans with the student? 4.09 3.88 3.52


● Discuss with individual students how they have grown as learners? 4.14 4.06 3.62

When you are providing feedback and support to individual students, on average, how often do you encourage students to reflect upon the effectiveness of their efforts and learning strategies? 4.25 4.31 4.00

Students keep track of their own learning progress. 4.13 4.63 4.05

Students have access to their own performance data. 4.08 4.44 4.19

Students engage in structured reflection on their own learning and progress (e.g., journals, reflection exercise, group sharing time). 4.00 3.56 3.71

Students self-assess their own work. 3.75 4.00 3.48

Students formally discuss how they will work as a team (e.g., determine roles/responsibilities, decision making process). 3.50 3.56 3.29

Students formally exchange feedback on their work with peers. 3.33 3.50 3.00

Students know which specific learning standards they have successfully met. 3.46 3.47 3.05

Students know which specific learning standards they still need to meet. 3.33 3.40 2.86

Students ask for help from peers before seeking my help. 3.71 3.87 3.62

Students seek my help when they need it. 4.08 4.33 4.10

Students know where to go for help (e.g., technology, peers, adults, and other sources). 4.17 4.53 4.29

Students know when they need help with schoolwork. 4.04 4.40 3.95

How many students in your class could tell a visitor exactly what learning goal(s) they are working on? 3.58 3.87 3.52

How true are the following about why you use small groups in your class? ... To complete special projects. 4.04 4.13 3.57

Students build on each other's ideas during discussion. 3.54 3.47 3.48

Students use data and text references to support their ideas. 3.25 3.33 3.14

Students show each other respect. 3.75 3.60 3.48

Students provide constructive feedback to their peers and to me. 3.42 3.13 3.19

Most students participate in the discussion at some point. 3.42 3.47 3.29

Learner Demonstrated

LEAP Framework Item Data type Content

Enter at a level appropriate to their prior knowledge and learning needs

Observation Laptops were open to a computer adaptive math program. Each student had a login for the program. The program provided customized math problems at the student's ability level.

Interview/PLC The students are in different groups based on entrance tickets.


Have supports and pacing that fit their learning

Practice cluster: Targeted instruction + competency-based progression

Students in group doing “Group Practice” are using IXL. There is a long list of topics ("IXL5.1 Which x satisfies an equation") that shows where students are in the progression, and if they finish that topic they get a badge next to it on the list. When they are in the problem itself, it shows their progress for that “S.1” topic in terms of a number. They were aware of how they are doing via a “smart score.” The teacher decides what they work on; they can just go onto the next one in the sequence if they finish and do not want to interrupt the teacher to check on it. The four students at the table explained how they were supposed to do “Group Practice” and help each other if needed. Some were talking to one another about content during the observation.

Practice cluster: Personal learning paths + competency-based progression

Students have ST Math worksheets that they are to use to track what they are doing and how far they've progressed throughout the week, and at the end of the week, they are supposed to do reflections based on their work. The students apparently have not been so good at that part yet, and need to do better. The students also describe how the teacher tracks their progress on ST Math, using “Gigi the Penguin” on a bulletin board with bar graphs of increasing scores. One student has progressed to the end and is now starting on 4th grade.

Observation

One student went over to his Chromebook and explained how they answer questions in the software and how it moves his progress bar at the bottom of the screen. When he gets to 100%, he moves up levels. There is also a board at the front of the room showing everyone's progress on these same bars.

Interview/PLC

Thinking about more fluid grouping (e.g., whether students progress at a different pace through a math curriculum)

ST Math is competency-based, so students progress at their own pace.

Demonstrate proficiency when ready

Practice cluster: Competency-based progression + peer instruction

The teacher introduced a rubric for evaluating public speaking. The rubric included appropriate eye contact, adequate volume, clear pronunciation, appropriate posture, and the ability to answer questions. The teacher mentioned that students could try again if they struggled the first time.

Related Survey Responses

We have highlighted the Learner Demonstrated survey items in which students indicated greater agreement from fall to spring.


Learner Demonstrated Survey Items – all student respondents

Survey item Fall 2016 Spr 2017 Fall 2017

My teacher asks me what I know about new topics before I begin working on them. 2.85 2.87 2.76

If I can show I already understand a topic, my teacher lets me skip it. 1.70 1.51 1.98

My teacher quizzes us on what we know about new topics before we begin learning about them. 2.48 2.31 2.56

When we are learning something new, my teacher helps us understand how it fits in with what we've learned before. 3.26 3.04 3.12

When I feel stuck on a problem in class I can skip it and come back to it later. 2.98 3.17 2.86

When I feel stuck on a problem, my teacher or another adult helps me. 3.38 3.29 3.34

My teacher lets me work as fast or as slow as I want. 2.90 2.91 2.80

My teacher gives me as much time as I need to understand my school work. 2.84 2.86 2.90

I am allowed to finish my school work after school if I cannot finish it in class. 3.17 3.05 3.24

My teacher lets me take breaks from my work when I think I need it. 2.57 2.29 2.47

My teacher gives me harder work as I learn more about a topic. 2.70 2.93 2.71

I can move ahead to new topics as soon as I show what I have learned, even if other students are still working on it. 2.59 2.09 2.54

When I don't understand a topic, I can spend more time on it while other students move on. 2.64 2.39 2.57

My teacher gives different assignments to students based on what they need to learn. 2.57 2.42 2.43

Students who don't do well on a test are allowed to retake it. 3.08 2.29 2.91

How often have you noticed other students doing different assignments than you on the same topics? 2.33 2.35 2.32

As soon as I have mastered a skill, I can show my teacher that I have learned it. 3.39 3.04 3.14

I can decide how I will show my teacher that I have mastered a skill. 2.87 2.55 2.75

In this class students choose different ways to show that they have mastered the same skill. 2.76 2.42 2.77

My teacher lets me decide the best way to show what I have learned. 2.99 2.56 2.86

Learner Demonstrated Survey Items – all teacher respondents

Survey item Fall 2016 Spr 2017 Fall 2017

I formally assess students' knowledge about a topic/skill before beginning a new unit of study. 2.88 2.76 3.05

I informally ask students what they know about a topic/subject before I start a new lesson or unit. 3.46 3.41 3.52

I assign work to my students based on their prior knowledge or data. 3.17 3.18 3.29

I use what students already know to direct their work on new topics/skills. 3.29 3.41 3.43

Students have the chance to revise their work after receiving feedback from others. 3.04 3.18 2.90

Students have the opportunity to see examples of work that exemplify excellence in a given subject or skill area. 3.08 3.29 3.33

Students who show that they have already mastered the knowledge and skills at the start of a topic or unit can skip it and move ahead. 3.04 2.81 3.05

Students can practice or review until they fully understand a topic/skill. 3.33 3.38 3.52

Students are allowed to have more time to finish work, even if other students have already moved ahead. 3.54 3.31 3.52

I give more challenging assignments to students who learn faster. 3.33 3.63 3.57

If students master skills faster than others, they move ahead to a new topic, unit or set of skills. 3.25 3.50 3.38

Students can demonstrate proficiency anytime during a unit to demonstrate their mastery of the concepts/skills. 3.29 3.31 3.00

Students have access to multiple options for demonstrating proficiency. 3.00 3.19 3.29

Students are given choices for how they can demonstrate their learning (e.g., project, paper, test, presentation). 2.75 3.00 2.95

Students can design or suggest new ways to demonstrate their own learning. 2.75 3.19 3.05

Students are clear about what they need to be able to do to demonstrate proficiency. 3.38 3.13 2.90

Students receive clear criteria for good performance before they begin working on schoolwork. 3.38 3.19 3.29

Demographics and Academic Outcomes 2016-17 (from publicly available CPS data)

Demographics

| | % Black | % Hispanic | % White | % Other | % Bilingual | % Diverse Learner | % Free/Reduced Lunch |
| --- | --- | --- | --- | --- | --- | --- | --- |
| CPS | 37.7% | 46.5% | 9.9% | 5.9% | 17.1% | 13.7% | 80.2% |
| BSC | 45.9% | 39.0% | 10.7% | 4.4% | 10.6% | 8.6% | 65.0% |
| School | 5.8% | 67.1% | 19.1% | 8.1% | 19.6% | 10.8% | 52.2% |

NWEA MAP Attainment (Grades 3-8 combined)

| | Reading: Average RIT Score | Reading: Percentile | Math: Average RIT Score | Math: Percentile |
| --- | --- | --- | --- | --- |
| CPS | 214.6 | | 222.6 | |
| BSC | 221.5 | | 230.5 | |
| School | 219.3 | 84 | 228.1 | 79 |

NWEA MAP Growth (Grades 3-8 combined)

Reading:

| | Avg Pre-test RIT Score | Avg Post-test RIT Score | Avg RIT Growth | % Making National Average Growth | National School Growth Percentile |
| --- | --- | --- | --- | --- | --- |
| CPS | 208.5 | 215.2 | 6.7 | 60.5 | |
| BSC | 215.8 | 221.7 | 5.9 | 63.2 | |
| School | 212.8 | 219.4 | 6.6 | 63.0 | 78 |

Math:

| | Avg Pre-test RIT Score | Avg Post-test RIT Score | Avg RIT Growth | % Making National Average Growth | National School Growth Percentile |
| --- | --- | --- | --- | --- | --- |
| CPS | 215.1 | 223.2 | 8.1 | 56.7 | |
| BSC | 223.5 | 230.8 | 7.3 | 56.8 | |
| School | 220.7 | 228.2 | 7.5 | 55.1 | 61 |
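The average RIT growth figures in the growth table are simply the post-test mean minus the pre-test mean. A minimal sketch of that arithmetic, using the CPS reading row as an example (not the evaluators' actual analysis code):

```python
# Sketch: average RIT growth = average post-test RIT - average pre-test RIT.
# Values below are the CPS reading row from the growth table above.
pre_test = 208.5
post_test = 215.2

avg_growth = round(post_test - pre_test, 1)
print(avg_growth)  # 6.7
```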

Accountability

| | SQRP Total Points Earned | SY 2016-17 SQRP Rating | SY 2016-17 Accountability Status |
| --- | --- | --- | --- |
| School | 3.7 | Level 1 | Not Applicable |

Potential Areas for Reflection

• As mentioned above, personalized learning at Irving Park appeared to be primarily teacher-directed. For instance, students may have a choice in how to design their projects, but teachers determine what counts as learning. This is borne out by our observations; we did not see many instances of teachers and students partnering on goals, or of students articulating their interests.


How could Irving Park leverage tools such as personalized learning paths and Summit to give students more agency in their learning?

• Incorporating PL at both elementary grades and middle grades in the same school can be challenging, especially with the shift to Summit as students get older. How is personalized learning vertically aligned across grades to make that transition as seamless as possible?


Disney II Magnet School

Personalized Learning in Context

Disney II is a magnet school serving grades PK-12, and is implementing PL schoolwide. It utilizes several aspects of personalized learning; one of the most visible is learning labs that cut across multiple subjects and grades, such as a second grade lab that includes multiple classrooms and ELA and math labs. Learning labs also foster teacher collaboration. Disney II even sees personalized learning as happening in preschool. It uses the Summit platform for PL in the middle grades.

Disney II also showed specific efforts with respect to Learner Connected work, a part of the LEAP Framework with which schools tend to have difficulty: we observed parents working with second grade students during our initial schoolwide observation. Additionally, staff at Disney II explained how they engaged students, parents, and teachers to create a mission and vision statement.

In our conferring meeting, Disney II raised questions about how project-based learning and peer instruction would be incorporated in the future. We also conducted several interviews with administrators and observed multiple PLCs in which several challenges were identified, the most common of which were time, funding (for flexible seating, for instance), digital tools, and PL work that was either too easy or too difficult for students. One administrator also was concerned with extending the model into the middle grades.

Selected Personalized Learning Practices and Examples from Observations, Interviews, and Team Meetings

Personalized Learning Practice

Interview/Professional Learning Community Response from Teachers

Example from Practice

Personalized/learner focused/targeted instruction

The educators discussed one student with an IEP who is failing one class. They said he should not be failing, given his IEP and the many supports in place. They talked about how to grade differently (on completion, or as a percentage correct out of however many questions he gets through). The student tries hard and should not be in the class he is failing.

One teacher calls a group of 4 students over to work with her. (The other teacher is working with a group of 6.) One teacher is working with one reading group, another teacher is working with another reading group, and the rest of the students are doing silent reading for 20 minutes. The students doing the silent reading are to write 4 post-it notes based on what they read. Then students will transition to reading groups in which they discuss and share what they read.

Personal learning paths

Interview response: At the middle school level, the students do not have as much control over content, but do over the pathway.

PLC discussion:

Page 134: Final Report - LEAP Innovations · 2019-08-12 · Please contact Annalee Good (annalee.good@wisc.edu) with questions related to this report. 4 • Examples of practices supporting


● They do pre-assessments that inform different lesson plans for kids to put them on a continuum or a “pathway”. Right now teachers identify the paths, but they want to be able to have students choose where they go.

● At parent-teacher conference, they walked parents through the PLP, and had the students translate, reflect, and explain.

Competency- or proficiency-based progression

In the platform, personalization comes from pacing: students can redo work, redo content, and master skills.

A student is working in a workbook. When asked if she chose to do that or if the teacher told her to, she said the teacher told her to, but her next rotation is “choice,” so then she will be able to work on what she wants. When the student finishes, she goes to grab “fast finishers,” but none are left, so the teacher tells her to go to a bin to pick up new ones, and the student does so. She then picks up a cup of colored pencils and sits on the floor to work on a new worksheet. She said that she can work on that because she finished her other work.

Flexible learning environments/schedules

Preschool is ideal; there are different centers, and the whole thing is choice time. There is a diverse set of things to do, and the teacher provides direction. “Preschool is perfect personalized learning: choice, excitement.”

Students got up and went to get out their yellow folders, then went to their "choice spaces", which were all over the room on the stools, couch, tables, etc. I was sitting on a stool in the closet/cubby space and a student came up to me and asked if I could please move since that was her choice space, which suggested that these were set and consistent each day.

Collaborative learning models/labs

There are lots of new staff members; the learning lab really brought staff together and helped build collaboration on the team.

Project-based learning

Interview response: Last year there was some project-based learning and some teachers continue to do some. They created their own approach for the project-based learning that they have done.

PLC discussion: Discussion about whether project-based learning is happening in 9th grade. There are so many kids not ready for project-based learning; they need executive functioning intervention during that time.

2 students at small tables are creating “number grid puzzles,” putting strips or blocks of laminated numbers in order. They’re not collaborating, but they are working on the same project.

Vertical alignment

The principal pointed out that a 4th grade teacher was in the classroom looking at the classroom routines and procedures to get at the vertical alignment piece. They also want to be sure to infuse social studies into reading now so that students are prepared for the SAT (even though that is a long way off).

Gradual release

After leaving a classroom, the principal explains that the teacher does gradual release – she does not do things like flexible seating right away.

Flexible staffing

After the bell rang, the next class was the same subject, but was a co-taught class.

General Survey Data1

Student respondents by grade

Grade

Fall 2016 Spring 2017 Fall 2017 Total

n % n % n % n %

4 35 10.4% 52 14.8% 55 16.1% 142 13.8%

5 36 10.7% 50 14.3% 51 14.9% 137 13.3%

6 44 13.1% 55 15.7% 55 16.1% 154 15.0%

7 105 31.3% 89 25.4% 91 26.6% 285 27.7%

8 116 34.5% 105 29.9% 90 26.3% 311 30.2%

Total 336 351 342 1029

Student respondents by gender

Gender

Fall 2016 Spring 2017 Fall 2017 Total

n % n % n % n %

Male 142 42.6% 174 50.0% 153 48.4% 469 47.0%

Female 191 57.4% 174 50.0% 163 51.6% 528 53.0%

Total 333 348 316 997

1 Totals across tables may not be the same, as students self-report their demographic characteristics.

Student respondents by race/ethnicity

Race/ethnicity

Fall 2016 Spring 2017 Fall 2017 Total

n % n % n % n %

Hispanic 141 42.3% 147 42.0% 124 39.2% 412 41.2%

White 135 40.5% 133 38.0% 104 32.9% 372 37.2%

Black/African-American 27 8.1% 29 8.3% 29 9.2% 85 8.5%

Other 30 9.0% 41 11.7% 59 18.7% 130 13.0%

Total 333 350 316 999

Teacher survey respondents

n

Grade taught (K, 1-3, 4-5, 6-8, 9-12, Other) | Subject taught (ELA, Math, Social Studies, Science, Other) | Implementing PL Strategies2 (B, R, C)

Fall 2016 18 0 0 3 3 7 3 9 7 2 4 2 4 7 7

Spr 2017 12 0 0 4 8 6 1 4 5 2 4 1 2 8 1

Fall 2017 45 6 11 13 22 3 5 16 15 16 15 10 7 19 7

2 N = Not yet implementing. B = Beginning to implement a few. R = Regularly implementing for a portion of the time. C = Consistently implementing all of the time.


Relationship to LEAP Framework

Learner Focused

LEAP Framework Item Data type Content

Learning opportunities reflect an understanding of individual needs, interests, and strengths with respect to academic skills and needs

Practice cluster: Personal learning plans + competency-based instruction + flexible learning environments + project-based learning

Twenty-two students are working independently on laptops. Most of those students are working at desks that have been clustered into tables. About five were in beanbags in the corner. Students on the laptops are working on:

● Their book notes in a Google Doc (either individual or as a group) that the teacher had created with a notes organizer and that is housed in the Summit unit;

● Reviewing slides for an assessment (and then teacher pulled some students to review them);

● Working on a study guide with the teacher in a discussion to review before taking an assessment; the discussion was vibrant and about what people have the right to do versus government surveillance. Students can take the assessment when the teacher says they are ready.

● Working on concepts related to the unit (i.e., on storytelling devices like conflict) in Basecamp that they can progress through on their own and test out of. The teacher noted that they started doing this type of activity because they realized that students were basically re-doing the same concepts at the beginning of every year, so this way students can test out and move on.

Practice cluster: Flexible learning environments + collaborative learning models/labs + horizontal alignment

In a combined 2nd grade classroom, two teachers and about 50 students were preparing for a "Goods and Services Fair" by talking through a large to-do list up on the board. The math lab takes up a large double classroom with white boards and rugs/camp chairs on both ends; round and half-moon tables throughout the middle for independent work; a long rectangular table in the center for support from a high school tutor; and rolling chairs at the tables.

Practice cluster: Content-area specialized classrooms + collaborative learning models/labs

Disney II has a 4th grade math lab.

Observation

The teacher announces that it is time to rotate to their third section. She calls Group 3 up to the front. Many students are now working in workbooks at desks, though they still clearly can choose where to sit. The teacher highlights to the class that Group 1, working on math jobs, has done a good job of focusing on their work.

Interview/PLC Teachers addressed plans to intervene with specific students. They also brainstormed plans for creating remedial programs for struggling students.

Learning opportunities reflect an understanding of individual needs, interests, and strengths with respect to non-academic skills and needs, including social, cultural, and emotional

Observation

The teacher took a couple of minutes to explain a new school schedule. The schedule change was partly to shift the advisory period from a free-for-all study hall to actual advising, such as social-emotional work. The teacher said they would be doing more activities from the Second Step curriculum, to which there was a groan from the students. They also would rework some of that time back into the classes so that the classes are 10 minutes longer. Then there was a check-in activity where each student filled out a written half sheet of paper on "The best thing that happened to me this weekend. And, the worst thing that happened to me this weekend."

Interview/PLC Teachers in the PLC discussed new ideas to help students 1 on 1, from resources to interventions.

Related Survey Responses

We have highlighted the Learner Focused survey items in which students indicated greater agreement from fall to spring.

Learner Focused survey items - all student respondents3

Survey item Fall 2016

Spr 2017

Fall 2017

My teacher knows what I am interested in. 2.50 2.50 2.45

My teacher knows what things are easy for me to learn. 2.85 2.88 2.75

My teacher knows how I learn best. 2.79 2.79 2.71

My teacher notices if I have trouble learning something. 3.05 3.03 3.08

My teacher will always listen to my ideas. 3.08 3.10 3.11

My teacher knows about my life at home. 1.72 1.95 1.76

My teacher knows about how things are going with my friends. 1.88 2.15 1.98

3 For this and other sets of survey responses, we averaged responses, with a response of 1 indicating the least agreement or frequency level. Questions should not be compared with each other; while most items include 4 possible response categories, some include more or fewer.


My teacher knows about the activities I like to do outside of school. 1.84 2.09 1.94

My teacher knows who my friends from school are. 2.68 2.85 2.85

My teacher knows what I do on the weekend. 1.53 1.69 1.62

My teacher connects what we are learning to the world outside the classroom. 2.67 2.91 2.84

I get the chance to do school work I am good at. 3.08 2.94 2.86

I do school work that makes me want to try hard. 3.14 3.05 3.02

I learn about things I am interested in. 2.87 2.81 2.88

What I learn in class connects with my life outside of school. 2.59 2.80 2.53

My school work interests me. 2.79 2.77 2.73

I know what I am learning now will help me when I grow up. 3.13 3.13 3.12

I learn valuable skills. 3.29 3.29 3.20

Learner Focused Survey Items – all teacher respondents

Survey item Fall 2016

Spr 2017

Fall 2017

Think about the one-to-one meetings you have had with students this year. On average, how often do you discuss with individual students...

● Their interests? 2.83 3.09 3.25

● Their strengths? 3.50 4.00 3.94

● Their learning challenges? 3.67 4.09 3.97

● Their behavior? 3.22 3.36 3.59

● Their feelings or general well-being? 3.28 3.73 3.97

I learn about my students by discussing each student's progress with other adults in the school who are working with the student.

3.67 3.72 3.72

I learn about my students by discussing student progress with families (by phone, email or in person meetings).

3.67 3.55 3.78

I actively seek information about my students' outside interests. 4.00 3.73 3.84

I actively seek information about my students' families. 3.94 3.73 3.78

I know my students' family and home context. 3.22 3.60 3.38

I know my students' community context. 3.22 3.30 3.19


I know my students' learning interests. 3.61 3.90 3.81

I know which specific learning standards my students have met. 3.72 4.60 3.75

I know my students' strengths. 4.22 4.60 4.19

I understand my students' learning challenges. 4.11 4.70 4.19

I understand what motivates my students to learn. 3.61 4.00 3.81

How often do you...

● Incorporate student learning interests into your lessons? 3.50 4.00 3.69

● Assign school work to individual students based on non-academic data (e.g., learning preferences, work habits, social-emotional functioning)?

3.00 3.00 3.03

● Assign schoolwork to individual students based on academic data? 3.94 4.10 3.72

● Help students make a connection between what they are learning in the classroom with life outside of the classroom?

4.06 4.20 4.09

Learner Led

LEAP Framework Item Data type Content

Assess and monitor their own progress

Practice cluster: Targeted instruction +

personal learning paths

Students appear to have chosen the activity that is appropriate for them and if they finish one activity can move on to another task they need to complete.

Observation

Students voluntarily decide if they are going to go with diverse learner teacher for more help on the lesson – they can go even if they are not classified as a diverse learner, and if they are a diverse learner who does not feel like they need more help, they do not have to go.

Interview/PLC The platform shows who is passing and who needs help. Students can set goals and monitor progress.

Collaborate with others to achieve goals

Observation

A teacher has the students compare their answers with partners as she walks around the classroom to see what other students are doing and make sure they are on task.

Advocate for needed support from teachers, peers, technology, and other sources

Observation

A student asks the teacher what he is supposed to do on a worksheet; she directs him to the instructions she has written on it and tells him to work with friends if he has questions.


Related Survey Responses

We have highlighted the Learner Led survey items in which students indicated greater agreement from fall to spring. In some cases, less agreement could be a positive development. For instance, if teachers set or keep track of learning goals less often, that could reflect increased student agency.

Learner Led survey items – all student respondents

Survey item Fall 2016

Spr 2017

Fall 2017

My teacher lets me include topics I like in the lessons and units that we study. 2.92 3.00 2.94

My teacher helps me figure out how I will get all my work done. 3.35 3.46 3.32

I talk with my teacher about the kinds of activities that help me learn best. 2.41 2.49 2.33

I talk to my teacher about how I can challenge myself at school. 2.54 2.60 2.44

I talk with my teacher about which subjects I am really good at. 2.53 2.61 2.43

I talk with my teacher about subjects I need help learning. 2.79 2.83 2.83

I tell my teacher what would help me learn. 2.72 2.75 2.59

I can decide which learning activities I will do. 2.98 3.00 2.95

I can choose which students I will do my work with. 3.39 3.41 3.18

I can decide which skills I work on. 3.19 3.14 3.00

I can decide how much time I will spend doing my schoolwork. 3.28 3.30 2.85

I can decide where I do my schoolwork. 3.37 3.34 3.24

I can decide the order in which I complete my assignments. 3.44 3.43 3.19

When I am doing school work, I know exactly what I am trying to learn. 3.38 3.40 3.20

I know what my learning goals are. 3.48 3.32 3.49

I know which skills I need to improve. 3.60 3.49 3.54

My teacher sets learning goals for me. 2.91 2.94 2.87

I set learning goals with my teacher. 3.06 3.11 2.86

I keep track of which of my learning goals I have met. 3.25 3.02 3.24

My teacher keeps track of which of my learning goals I have met. 2.92 2.88 2.83

I keep track of whether I have completed my school work. 3.61 3.57 3.51

I look at my test results to see which skills I still need to work on. 3.63 3.49 3.44

I look at schoolwork other than tests to see what skills I still need to work on. 3.41 3.27 3.23

I learn from mistakes in my work. 3.41 3.23 3.30


Most students in my class help each other learn. 3.14 3.09 2.92

My teacher helps me figure out what I can do to improve my school work. 3.14 3.06 3.16

When I am given the chance to revise my work, I do. 3.43 3.42 3.41

My teacher helps me see which things I am doing well on. 3.07 3.12 3.09

My teacher asks me to decide which skills I still need to work on. 2.97 2.87 2.77

I work with different groups of students during the day. 3.17 3.18 3.10

Other students give me feedback on my school work. 3.07 3.09 2.98

I give feedback to other students on their school work. 3.12 3.13 3.05

I work together with other students on school work. 3.47 3.48 3.38

To what extent does feedback from other students help you improve your work? 2.90 2.98 2.86

When I don’t understand something in class I ask other students for help. 3.09 3.15 2.99

I try to figure problems out on my own before I ask my teacher for help. 3.40 3.40 3.27

I know where to go for help when I have a question. 3.42 3.40 3.36

I can get help on my work in more than one way. 3.35 3.29 3.23

I don’t mind asking the teacher a question in class. 3.33 3.25 3.06

Learner Led survey items – all teacher respondents

Survey item

Fall 2016

Spr 2017

Fall 2017

Think about the one-to-one meetings you have had with students this year. On average, how often do you…

● Determine with students which learning activities will align with their individual interests? 2.89 3.20 2.97

● Ask students for input on the topics we will study? 2.71 3.30 2.94

● Talk with students about the kinds of activities that will help them learn best? 3.11 3.70 3.09

● Talk with students about how they can challenge themselves at school? 3.17 4.10 3.56

Students make choices on how to learn in class (e.g., read written material, complete a project, whether to work alone or in a group).

3.72 3.60 3.53

Students make choices on what to learn in class (e.g., which skills to practice, what topics to research, which subjects to work on first).

3.50 3.20 3.28

How much do you agree with the following statement? Students can explain how learning activities connect to their learning goals.

2.61 3.00 2.75


Students create goals for their own learning (e.g., which academic skills to improve, what behaviors or work habits to strengthen).

3.28 3.30 3.41

Students create their learning goals with me. 3.39 3.30 3.50

What proportion of your students… Have a unique set of learning goals? 2.44 2.30 2.19

Think about the one-to-one meetings you have had with students this year. On average, how often do you...

● Examine individual student assessment results together with the student? 3.39 3.60 3.03

● Discuss individual student overall progress? 3.83 4.20 3.50

● Provide feedback to individual students on their work? 3.67 4.30 4.09

● Discuss what individual students think are the reasons for their progress or lack of progress? 3.50 4.10 3.47

● Make adjustments to individual student learning goals and/or learning plans with the student? 3.11 3.40 3.34

● Discuss with individual students how they have grown as learners? 3.39 3.80 3.59

When you are providing feedback and support to individual students, on average, how often do you encourage students to reflect upon the effectiveness of their efforts and learning strategies?

3.67 3.50 3.66

Students keep track of their own learning progress. 3.50 3.80 3.63

Students have access to their own performance data. 4.11 4.40 3.63

Students engage in structured reflection on their own learning and progress (e.g., journals, reflection exercise, group sharing time).

3.56 3.10 3.47

Students self-assess their own work. 3.33 3.40 3.53

Students formally discuss how they will work as a team (e.g., determine roles/responsibilities, decision making process).

2.94 3.20 3.38

Students formally exchange feedback on their work with peers. 2.94 2.90 3.31

Students know which specific learning standards they have successfully met. 3.41 3.60 3.35

Students know which specific learning standards they still need to meet. 3.29 3.60 3.29

Students ask for help from peers before seeking my help. 2.71 3.60 3.45

Students seek my help when they need it. 3.94 4.20 4.13

Students know where to go for help (e.g., technology, peers, adults, and other sources). 3.76 4.10 3.90

Students know when they need help with schoolwork. 3.41 4.00 3.94

How many students in your class could tell a visitor exactly what learning goal(s) they are working on? 3.11 3.70 3.39

How true are the following about why you use small groups in your class? … To complete special projects. 3.33 3.70 3.63


Students build on each other's ideas during discussion. 3.18 3.40 3.58

Students use data and text references to support their ideas. 3.18 3.60 3.03

Students show each other respect. 3.41 3.50 3.71

Students provide constructive feedback to their peers and to me. 3.00 3.20 3.29

Most students participate in the discussion at some point. 3.29 3.60 3.45

Learner Demonstrated

LEAP Framework Item Data type Content

Enter at a level appropriate to their prior knowledge and learning needs

Practice cluster: Targeted instruction + personal learning paths + competency-based progression

Students in math are assigned to groups, and groups are constantly changing based on assessment data (pre-assessments). Pre-assessments are done frequently, sometimes weekly. Teachers develop pre-assessments and then create different lesson plans for students based on these, on a continuum or "pathway". One teacher noted that although teachers currently determine where students go in a lesson plan, they want to shift toward having students choose their pathway after the pre-assessment.

Observation The students are divided into 3 groups based on an assessment (not NWEA, because they are in 2nd grade). The groups are: Math with Me; Math Jobs + Fast Finisher; and Choice.

Interview/PLC

In a conversation with a teacher, she said she pre-assesses the students every day and forms groups based on that. She said she thought it would be a lot of work at first, to adapt groups each day based on the pre-assessments, but that the kids tend to stay in the same groups, especially at the upper end of proficiency.

Have supports and pacing that fit their learning

Practice cluster: Targeted instruction + competency-based progression

Seven students are working with the main teacher at a table in the front of the room in front of the white board. These students remain at that table the entire duration of the observation. These students had not finished a paragraph assignment ("TEAL" or Topic sentence, Evidence, Analysis, Link - so what?) so they were working on it together. They had to respond to the question, "Why are dystopian novels so popular with young people?"

Interview/PLC Math group: In the platform, personalization comes from pacing: students can redo work, redo content, and master skills.

Interview/PLC ELA group: Pacing (e.g., when does the teacher sit and do a seminar discussion with them) is something they still need to figure out.

Demonstrate evidence of learning in multiple

ways

Practice cluster: Personal learning paths + project-based learning

The unit all students are working on is the dystopian genre of fiction. Students could choose from 10 different books that represented different reading levels, with some guidance from teachers. They all were to work on a "summative project" that could either be a Socratic discussion with the teacher, or creating an experience (e.g., a game) from a close read of the book for peers to go through. There are 2 teachers in the room, 1 main teacher and 1 support teacher/coach. About 29 students are in the room. On the board it states the agenda is: review close read, writing (genre 3 or notes), discussion presentation, and small groups/project.

Observation

Two students were working next to each other at a table on tablets using Mathletics. They explained to me that once they log in, they get to choose from a menu of 6-7 “assignments” to work on. They click on one and then answer questions associated with that concept. Once they get at least three assignments done they get to race against someone in the game and can earn points to then be able to update their avatar with more outfits, tools, etc. The two students were working on the same problem together and helping each other get through it.

Related Survey Responses

We have highlighted the Learner Demonstrated survey items in which students indicated greater agreement from fall to spring.

Learner Demonstrated Survey Items – all student respondents

Survey item Fall 2016

Spr 2017

Fall 2017

My teacher asks me what I know about new topics before I begin working on them. 3.01 3.21 3.10

If I can show I already understand a topic, my teacher lets me skip it. 2.62 2.67 1.98

My teacher quizzes us on what we know about new topics before we begin learning about them.

2.91 3.08 2.78

When we are learning something new, my teacher helps us understand how it fits in with what we've learned before.

3.18 3.18 3.28

When I feel stuck on a problem in class I can skip it and come back to it later. 2.96 3.17 3.06

When I feel stuck on a problem, my teacher or another adult helps me. 3.43 3.36 3.35

My teacher lets me work as fast or as slow as I want. 3.00 2.85 2.71

My teacher gives me as much time as I need to understand my school work. 2.96 2.94 2.71

I am allowed to finish my school work after school if I cannot finish it in class. 3.38 3.43 3.49

My teacher lets me take breaks from my work when I think I need it. 2.40 2.42 2.25

My teacher gives me harder work as I learn more about a topic. 3.01 3.16 3.00


I can move ahead to new topics as soon as I show what I have learned, even if other students are still working on it.

2.96 2.88 2.70

When I don't understand a topic, I can spend more time on it while other students move on. 2.87 2.84 2.78

My teacher gives different assignments to students based on what they need to learn. 2.68 2.95 2.73

Students who don't do well on a test are allowed to retake it. 3.10 3.16 3.24

How often have you noticed other students doing different assignments than you on the same topics?

2.36 2.46 2.35

As soon as I have mastered a skill, I can show my teacher that I have learned it. 3.35 3.25 3.26

I can decide how I will show my teacher that I have mastered a skill. 2.91 2.79 2.72

In this class students choose different ways to show that they have mastered the same skill. 2.89 2.77 2.65

My teacher lets me decide the best way to show what I have learned. 3.00 2.84 2.75

Learner Demonstrated Survey Items – all teacher respondents

Survey item Fall 2016

Spr 2017

Fall 2017

I formally assess students' knowledge about a topic/skill before beginning a new unit of study.

2.89 3.09 2.94

I informally ask students what they know about a topic/subject before I start a new lesson or unit.

3.33 3.64 3.25

I assign work to my students based on their prior knowledge or data. 2.89 3.00 2.94

I use what students already know to direct their work on new topics/skills. 3.06 3.18 3.19

Students have the chance to revise their work after receiving feedback from others. 2.56 3.18 3.41

Students have the opportunity to see examples of work that exemplify excellence in a given subject or skill area.

2.67 3.18 3.13

Students who show that they have already mastered the knowledge and skills at the start of a topic or unit can skip it and move ahead.

2.94 3.27 3.00

Students can practice or review until they fully understand a topic/skill. 3.00 3.50 3.22

Students are allowed to have more time to finish work, even if other students have already moved ahead.

3.11 3.64 3.44

I give more challenging assignments to students who learn faster. 2.89 3.36 3.19

If students master skills faster than others, they move ahead to a new topic, unit or set of skills.

2.94 3.55 3.22


Students can demonstrate proficiency anytime during a unit to demonstrate their mastery of the concepts/skills.

3.00 3.27 3.13

Students have access to multiple options for demonstrating proficiency. 2.56 2.64 2.69

Students are given choices for how they can demonstrate their learning (e.g., project, paper, test, presentation).

2.56 2.45 2.38

Students can design or suggest new ways to demonstrate their own learning. 2.44 2.64 2.50

Students are clear about what they need to be able to do to demonstrate proficiency. 2.72 3.18 3.06

Students receive clear criteria for good performance before they begin working on schoolwork.

3.17 3.27 3.31

Learner Connected

LEAP Framework Item Data type Content

Learner Connected

Observation Parents are working with 2nd graders at a hallway table on literacy.

Interview/PLC

Disney II surveyed students, parents, and teachers and gathered all the information. The school created a profile of the ideal graduate, but it was too wordy, and staff realized at that point they were out of their league. Two parents own a branding firm, so a small group of parents, teachers, and students met on a Sunday, revised the mission, and came up with a vision statement (the mission statement is the final goal; the vision statement is how you get there), along with a values statement. The process was totally student- and parent-driven: the group broke into 3 smaller groups, which all ended up saying the same thing, founded on the idea of students being who they are. The school then held town hall meetings for preK-12 (6 at the high school campus and 1 at each campus with parents), presented all of this work, and revised the student handbook.

Demographics and Outcomes 2016-17 (from publicly available CPS data)

Demographics

Race/Ethnicity (% Black, % Hispanic, % White, % Other) | % Bilingual | % Diverse Learner | % Free/Reduced Lunch

CPS 37.7% 46.5% 9.9% 5.9% 17.1% 13.7% 80.2%

BSC 45.9% 39.0% 10.7% 4.4% 10.6% 8.6% 65.0%

Page 148: Final Report - LEAP Innovations · 2019-08-12 · Please contact Annalee Good (annalee.good@wisc.edu) with questions related to this report. 4 • Examples of practices supporting

133

School 15.6% 46.8% 29.8% 7.8% 6.1% 9.8% 46.6%

NWEA MAP Attainment (Grades 3-8 combined)

        Reading Avg RIT Score   Reading Percentile   Math Avg RIT Score   Math Percentile
CPS     214.6                   –                    222.6                –
BSC     221.5                   –                    230.5                –
School  224.2                   94                   234.7                92

NWEA MAP Growth (Grades 3-8 combined)

Reading
        Avg Pre-test RIT Score   Avg Post-test RIT Score   Avg RIT Growth   % Making National Average Growth   National School Growth Percentile
CPS     208.5                    215.2                     6.7              60.5                               –
BSC     215.8                    221.7                     5.9              63.2                               –
School  218.8                    224.4                     5.5              63.9                               73

Math
        Avg Pre-test RIT Score   Avg Post-test RIT Score   Avg RIT Growth   % Making National Average Growth   National School Growth Percentile
CPS     215.1                    223.2                     8.1              56.7                               –
BSC     223.5                    230.8                     7.3              56.8                               –
School  227.3                    234.8                     7.5              61.6                               72

Accountability

        SQRP Total Points Earned   SY 2016-17 SQRP Rating   SY 2016-17 Accountability Status
School  4.2                        Level 1+                 Good Standing

SAT/PSAT

Test Avg EBRW score Avg math score Avg Composite % College Ready

SAT – District 483 472 956 35.6%

SAT – School 493 466 959 35.2%


PSAT 10 – District 447 448 896 41.8%

PSAT 10 – School 461 438 899 44.1%

PSAT 9 – District 417 431 849 43.1%

PSAT 9 – School 439 430 869 54.1%

Potential Areas for Reflection

● Disney II could build out the Learner Led aspects of implementation, such as partnering with students to create goals; the gaps here suggest that learning is still largely teacher-driven and that student agency is limited. While Disney II has consciously tried to incorporate student voice into the school vision, how could it better do so with respect to academics?

● Teachers expressed some issues with Summit; how could Disney II train teachers on the platform without taking time away from other important initiatives, even those that also relate to PL?


Patrick Henry Elementary School

Personalized Learning in Context

Henry is a neighborhood elementary school that has made a deliberate effort to incorporate PL gradually. In 2016-17, the school implemented personalized learning in grades 2-6, with 5th and 6th grades using Summit Basecamp; however, Henry's work was largely targeted toward 3rd grade, and most of the observations conducted were of 3rd or 5th grade classrooms. By the fall of the second year of observations, mixed-age grouping was also present inside "units," in which students choose a subject to work on from a menu of topics outside of the core areas.

Henry’s “Genius Hour,” in which students choose a project to work on, is an exemplar of project-based learning, where this school is generally strong. There is also evidence of collaboration among students, and pacing may be another strength. Further, Henry’s “PLC” (a presentation wrap-up of the year in which teachers could collaborate and learn from others) was the only such presentation of which we are aware.

Interviewees and PLCs identified challenges including time, funding, digital tools, staffing, material that was not aligned to students' needs, the requirement to meet standards, and union resistance.

Selected Personalized Learning Practices and Examples from Observations, Interviews, and Team Meetings

Personalized Learning Practice | Interview/Professional Learning Community Response from Teachers | Example from Practice

Project-based learning

Piloting Genius Hour in one 3rd grade classroom and one 2nd grade classroom. Identifying students’ interests. Students create projects at the end. Hopefully, this will progress to all grades later.

In "Genius Hour," students get to choose a long-term project to work on. One student is working on a Pokemon PowerPoint on his own at a desk. Three boys are on a carpet practicing their completed PowerPoint presentations out loud to each other.

Personal learning paths

Through coaching from the core team, and guiding the staff, we’re going to prioritize voice and choice, and help develop what they started in course catalog in creating units students can choose from.

At the end of a "round" (about 17 minutes) the students can switch activities. They record this in their binder, which has a sheet with a "choice schedule". They check this choice schedule with their teacher each week. They choose what to work on from a menu of options (Lexia, etc) and check off when they have completed it. The menu of options is pretty much the same from week to week; the choice is more about when students do these activities, not if they do or what it looks like.

Competency- or proficiency-based progression

Henry is working on competency-based progression. Administration feels it needs to push this to build students' understanding of skill mastery.

The teacher had students work with assigned partners to find the answer using the rules they just discussed. When groups get the right answer, they can move on to "round 3."

Flexible learning environments/schedules

At the schoolwide meeting, the assistant principal started by talking about "Fun Friday" and showed a video with students discussing why they liked it. The first student talked about making friendship bracelets. Another student said they could do other things (like basketball). Another student said she always wanted to do jewelry.

Mixed-age grouping

Each unit group contains 3rd and 4th graders, and each of the subject areas includes the same literacy standards, though the 3rd graders are assessed on third grade standards and the 4th graders on fourth grade standards.

Vertical alignment

From the schoolwide meeting: Each grade level did something innovative. As a team, they will create a structure and format to replicate a schoolwide vision. They need to start to mesh everything together and ensure they have a common language.

Social/emotional and/or non-cognitive learning

As the students gather in the front of the room, the teacher suggests that they sit next to someone that’ll have a positive impact on them instead of someone they’ll be tempted to chat with.

Gradual release

The principal indicated that the units and the “levels of autonomy” were new for third grade (though they had used the “levels of autonomy” in the upper grades previously).

See “Levels of Autonomy” graphic below.


General Survey Data1

At Henry, third grade was most heavily invested in PL; however, the grades 4-8 student survey would not capture these students' feelings about PL. The student survey results reported here therefore may not come from the students we actually observed. Additionally, the number of teacher survey respondents is quite low, which limits the utility of the teacher survey responses we provide below.

Student respondents by grade

Grade   Fall 2016 (n, %)   Spring 2017 (n, %)   Fall 2017 (n, %)   Total (n, %)

4 1 1.6% 62 35.6% 79 41.4% 142 33.3%

5 58 93.6% 59 33.9% 59 30.9% 176 41.2%

6 3 4.8% 53 30.5% 53 27.8% 109 25.5%

Total 62 174 191 427

Student respondents by gender

Gender   Fall 2016 (n, %)   Spring 2017 (n, %)   Fall 2017 (n, %)   Total (n, %)

Male 28 47.5% 75 43.4% 83 45.1% 186 44.7%

1 Totals across tables may not be the same, as students self-report their demographic characteristics.


Female 31 52.5% 98 56.7% 101 54.9% 230 55.3%

Total 59 173 184 416

Student respondents by race/ethnicity

Race/ethnicity   Fall 2016 (n, %)   Spring 2017 (n, %)   Fall 2017 (n, %)   Total (n, %)

Hispanic 54 94.7% 140 83.3% 152 84.4% 346 85.4%

White 1 1.8% 11 6.6% 7 3.9% 19 4.7%

Black/African-American 2 3.5% 9 5.4% 5 2.8% 16 4.0%

Other 0 0.0% 8 4.8% 16 8.9% 24 5.9%

Total 57 168 180 405

Teacher survey respondents

            n | Grade taught (1-3, 4-5, 6-8) | Subject taught (ELA, Math, Social Studies, Science, Other) | Implementing PL Strategies2 (B, R, C)
Fall 2016   5 | 3, 2, 0                      | 4, 4, 4, 1, 2                                              | 1, 3, 1
Spr 2017    5 | 4, 1, 0                      | 4, 5, 4, 1, 1                                              | 0, 2, 3
Fall 2017   9 | 4, 8, 2                      | 6, 6, 5, 3, 1                                              | 0, 5, 4

Relationship to LEAP Framework

The following juxtaposes qualitative data with survey data, specific to each of the four areas of the LEAP Framework.

2 N = Not yet implementing. B = Beginning to implement a few. R = Regularly implementing for a portion of the time. C = Consistently implementing all of the time.


Learner Focused

LEAP Framework Item Data type Content

Learning opportunities reflect an understanding of individual needs, interests, and strengths with respect to academic skills and needs

Practice cluster: Targeted instruction + flexible learning environments
The teacher is collecting exit tickets (apparently from a math lesson). He's giving tickets back to students and asking them to try problems one more time. Students appear to be working on math, social studies, or literacy, most alone, but some in groups. Almost all students are at desks/tables, though a few are on the carpet working in Summit on literacy.

Observation
One student is working on a spelling program on his Chromebook. He explains to me that he needs help with it, and he isn't as far along as other students. He says he sometimes works with a different teacher, too. (It appears this is some sort of special education or remediation.)

Interview/PLC
In the beginning, the teacher only taught one topic per week. Now, it's easier to do re-teaching – the teacher can go back at a later date or go over a topic again if students need help. For instance, some students went to the distributive property lesson 3 times that week. She teaches a different strategy each time.

Learning opportunities reflect an understanding of individual needs, interests, and strengths with respect to non-academic skills and needs, including social, cultural, and emotional

Practice cluster: Flexible learning environments + social/emotional and/or non-cognitive learning
One classroom also has a "calm corner" at the front ("find your Yeti body").

Related Survey Responses

We have highlighted the Learner Focused survey items in which students indicated greater agreement from fall to spring.


Learner Focused survey items - all student respondents3

Survey item   Fall 2016   Spr 2017   Fall 2017

My teacher knows what I am interested in. 2.51 2.44 2.39

My teacher knows what things are easy for me to learn. 2.98 2.96 2.85

My teacher knows how I learn best. 2.81 2.96 2.79

My teacher notices if I have trouble learning something. 3.21 3.16 3.17

My teacher will always listen to my ideas. 3.14 2.92 2.92

My teacher knows about my life at home. 1.41 1.56 1.48

My teacher knows about how things are going with my friends. 2.18 2.23 2.09

My teacher knows about the activities I like to do outside of school. 2.16 1.87 1.78

My teacher knows who my friends from school are. 2.74 2.98 2.86

My teacher knows what I do on the weekend. 1.66 1.68 1.58

My teacher connects what we are learning to the world outside the classroom. 2.49 2.68 2.56

I get the chance to do school work I am good at. 2.16 2.53 2.54

I do school work that makes me want to try hard. 2.89 2.87 2.88

I learn about things I am interested in. 2.67 2.85 2.86

What I learn in class connects with my life outside of school. 2.34 2.41 2.20

My school work interests me. 2.79 2.78 2.82

I know what I am learning now will help me when I grow up. 3.20 3.31 3.27

I learn valuable skills. 2.91 3.35 3.25

Learner Focused Survey Items – all teacher respondents

Survey item   Fall 2016   Spr 2017   Fall 2017

Think about the one-to-one meetings you have had with students this year. On average, how often do you discuss with individual students...

● Their interests? 2.50 3.60 3.22

3 For this and other sets of survey responses, we averaged responses, with a response of 1 indicating the least agreement or frequency level. Questions should not be compared with each other; while most items include 4 possible response categories, some include more or fewer.


● Their strengths? 3.25 4.40 4.00

● Their learning challenges? 4.00 4.40 4.11

● Their behavior? 3.25 3.40 3.11

● Their feelings or general well-being? 3.25 4.60 4.00

I learn about my students by discussing each student's progress with other adults in the school who are working with the student. 3.67 4.60 3.78

I learn about my students by discussing student progress with families (by phone, email or in person meetings). 3.33 4.00 3.44

I actively seek information about my students' outside interests. 4.33 4.60 3.78

I actively seek information about my students' families. 3.67 4.40 3.56

I know my students' family and home context. 3.00 4.40 4.00

I know my students' community context. 4.00 4.80 3.89

I know my students' learning interests. 4.00 4.60 3.89

I know which specific learning standards my students have met. 3.00 3.80 4.33

I know my students' strengths. 3.67 4.60 4.38

I understand my students' learning challenges. 3.67 4.60 4.56

I understand what motivates my students to learn. 3.33 4.60 4.22

How often do you...

● Incorporate student learning interests into your lessons? 4.00 4.20 4.11

● Assign school work to individual students based on non-academic data (e.g., learning preferences, work habits, social-emotional functioning)? 3.00 3.60 3.22

● Assign schoolwork to individual students based on academic data? 4.00 4.60 4.33

● Help students make a connection between what they are learning in the classroom with life outside of the classroom? 4.67 4.40 4.67

Learner Led

LEAP Framework Item Data type Content

Collaborate with learners to identify and include learner preferences and optimal learning conditions

Practice cluster: Project-based learning + participatory culture
Genius hour – students have more choice of what they work on.

Practice cluster: Flexible learning environments + gradual release
There is a cabinet on which the students' names are posted with the levels of autonomy they have (gradual release). The levels of autonomy are:

● Level 1: can work at: your own personal desk/table

● Level 2: can work at: your own personal desk/table; another table-top around the classroom

● Level 3: can work at: your own personal desk/table; another table-top around the classroom; on the floor around the classroom;

● Level 4: can work at: your own personal desk/table; another table-top around the classroom; on the floor around the classroom; in the hallway or closet.

As of mid-October 2017, about 1/3 of the students were in each of the first 3 levels. No students were in Level 4 yet.

Interview/PLC Discussion of “Genius Hour.”

Partner in setting learning goals and plans

Interview/PLC 3rd grade has PL time – students create a choice board at the beginning of the week

Articulate their interests, strengths, and needs

Observation

After 10 minutes, the teacher announces that they’re switching to work on units. These are areas outside of core topics that students have chosen to work on, and many students go into other classrooms. This particular teacher teaches a unit on film study; one student tells me that his unit is “animals in captivity.” Students ranked their preference in a Google form (from 1-5), and the teachers put the students in groups based on that. At the end of the week, they’ll take an assessment in Google docs, which will inform small-group instruction during PL time during the following week.

Assess and monitor their own progress

Observation

A teacher asks students to open up their trackers (on Google Classroom) on Chromebooks and enter the assignments that match up with the standards. The assignments they're tracking relate to CCSS. (For example, "multiply and divide within 100" is 3.OA.C.7.) The teacher helps students find the standards within their trackers. She also reviews the students' work that she had assigned in the workbooks and assigns check/check-plus/check-minus, apparently based on completeness.

Observation A student checks Lexia to see how many minutes she needs to complete by the end of the week and writes it down on her choice schedule.


Interview/PLC The activities students are doing are based on a pre-assessment. Students pick their lessons based on their own analyses.

Collaborate with others to achieve goals

Observation

A teacher tells students to solve word problems on the associative property with partners, and tells them to go where they'd like. The students work with each other to solve the problems in different ways.

Reflect upon learning and refine their strategies

Observation

Up on one of the side walls was a board with different learning objectives related to math ("Ratios and percents," "Negative numbers," etc.), where students put their names next to the one they are working on. Next to that was a board with sticky notes on which students had described mistakes they had made in math and how they worked them out.

Interview/PLC During the week, students can switch projects. Some switch midweek, see it’s too hard, go back. They also self-assess: check, check-plus, check-minus.

Related Survey Responses

We have highlighted the Learner Led survey items in which students indicated greater agreement from fall to spring. In some cases, less agreement could be a positive development. For instance, if teachers set or keep track of learning goals less often, that could reflect increased student agency.

Learner Led survey items - all student respondents

Survey item   Fall 2016   Spr 2017   Fall 2017

My teacher lets me include topics I like in the lessons and units that we study. 2.55 2.83 2.69

My teacher helps me figure out how I will get all my work done. 3.41 3.38 3.31

I talk with my teacher about the kinds of activities that help me learn best. 2.44 2.23 2.29

I talk to my teacher about how I can challenge myself at school. 2.48 2.23 2.22

I talk with my teacher about which subjects I am really good at. 2.64 2.39 2.48

I talk with my teacher about subjects I need help learning. 3.04 2.90 2.86

I tell my teacher what would help me learn. 2.67 2.62 2.71

I can decide which learning activities I will do. 2.60 2.88 2.94

I can choose which students I will do my work with. 2.76 3.10 3.00

I can decide which skills I work on. 2.92 3.06 3.01

I can decide how much time I will spend doing my schoolwork. 2.93 2.81 2.76


I can decide where I do my schoolwork. 3.02 3.11 3.13

I can decide the order in which I complete my assignments. 2.91 3.25 3.13

When I am doing school work, I know exactly what I am trying to learn. 3.19 3.07 3.18

I know what my learning goals are. 3.62 3.42 3.38

I know which skills I need to improve. 3.49 3.50 3.52

My teacher sets learning goals for me. 2.55 2.82 2.91

I set learning goals with my teacher. 3.11 2.90 3.05

I keep track of which of my learning goals I have met. 3.23 3.02 3.24

My teacher keeps track of which of my learning goals I have met. 3.04 2.84 3.01

I keep track of whether I have completed my school work. 3.23 3.42 3.36

I look at my test results to see which skills I still need to work on. 3.57 3.36 3.40

I look at schoolwork other than tests to see what skills I still need to work on. 3.33 3.07 3.17

I learn from mistakes in my work. 3.38 3.22 3.22

Most students in my class help each other learn. 2.87 3.13 3.02

My teacher helps me figure out what I can do to improve my school work. 3.29 3.17 3.06

When I am given the chance to revise my work, I do. 3.09 3.11 3.21

My teacher helps me see which things I am doing well on. 3.07 3.10 3.23

My teacher asks me to decide which skills I still need to work on. 2.96 3.01 3.08

I work with different groups of students during the day. 3.09 3.06 2.92

Other students give me feedback on my school work. 2.37 2.64 2.76

I give feedback to other students on their school work. 2.39 2.70 2.70

I work together with other students on school work. 3.15 3.25 3.18

To what extent does feedback from other students help you improve your work? 2.78 2.77 2.66

When I don't understand something in class I ask other students for help. 2.77 3.01 2.98

I try to figure problems out on my own before I ask my teacher for help. 3.15 3.16 3.11

I know where to go for help when I have a question. 3.43 3.31 3.26

I can get help on my work in more than one way. 3.09 3.10 3.03

I don't mind asking the teacher a question in class. 3.08 2.90 2.99


Learner Led survey items - all teacher respondents

Survey item   Fall 2016   Spr 2017   Fall 2017

Think about the one-to-one meetings you have had with students this year. On average, how often do you...

● Determine with students which learning activities will align with their individual interests? 2.67 2.80 3.67

● Ask students for input on the topics we will study? 2.33 3.20 3.67

● Talk with students about the kinds of activities that will help them learn best? 3.00 3.20 4.22

● Talk with students about how they can challenge themselves at school? 3.00 3.40 3.67

Students make choices on how to learn in class (e.g., read written material, complete a project, whether to work alone or in a group). 3.33 4.60 4.22

Students make choices on what to learn in class (e.g., which skills to practice, what topics to research, which subjects to work on first). 3.67 4.00 3.89

How much do you agree with the following statement? Students can explain how learning activities connect to their learning goals. 1.67 2.80 2.78

Students create goals for their own learning (e.g., which academic skills to improve, what behaviors or work habits to strengthen). 2.00 3.60 4.00

Students create their learning goals with me. 2.00 3.60 3.78

What proportion of your students... Have a unique set of learning goals? 1.67 2.20 2.44

Think about the one-to-one meetings you have had with students this year. On average, how often do you...

● Examine individual student assessment results together with the student? 2.33 2.80 3.67

● Discuss individual student overall progress? 2.33 3.60 4.44

● Provide feedback to individual students on their work? 4.33 4.00 4.67

● Discuss what individual students think are the reasons for their progress or lack of progress? 4.00 3.80 4.56

● Make adjustments to individual student learning goals and/or learning plans with the student? 2.67 3.40 4.11

● Discuss with individual students how they have grown as learners? 3.33 4.20 4.22

When you are providing feedback and support to individual students, on average, how often do you encourage students to reflect upon the effectiveness of their efforts and learning strategies? 4.00 4.20 4.22

Students keep track of their own learning progress. 2.67 4.00 4.44


Students have access to their own performance data. 2.33 4.40 4.67

Students engage in structured reflection on their own learning and progress (e.g., journals, reflection exercise, group sharing time). 2.67 4.40 4.11

Students self-assess their own work. 2.67 4.20 4.00

Students formally discuss how they will work as a team (e.g., determine roles/responsibilities, decision making process). 2.67 3.80 3.56

Students formally exchange feedback on their work with peers. 2.33 4.00 3.56

Students know which specific learning standards they have successfully met. 1.33 3.00 3.56

Students know which specific learning standards they still need to meet. 1.33 2.80 3.67

Students ask for help from peers before seeking my help. 4.33 4.80 4.11

Students seek my help when they need it. 4.00 4.60 4.22

Students know where to go for help (e.g., technology, peers, adults, and other sources). 4.67 4.60 4.67

Students know when they need help with schoolwork. 4.00 4.60 4.33

How many students in your class could tell a visitor exactly what learning goal(s) they are working on? 2.33 2.80 3.78

How true are the following about why you use small groups in your class? … To complete special projects. 2.33 3.60 3.00

Students build on each other’s ideas during discussion. 3.67 3.60 3.56

Students use data and text references to support their ideas. 2.67 3.60 3.22

Students show each other respect. 3.67 3.80 3.78

Students provide constructive feedback to their peers and to me. 3.00 3.60 3.22

Most students participate in the discussion at some point. 3.33 3.20 3.44

Learner Demonstrated

LEAP Framework Item Data type Content

Enter at a level appropriate to their prior knowledge and learning needs

Practice cluster: Targeted instruction + competency-based progression
A teacher commences a small-group lesson on the associative property.

Have supports and pacing that fits their learning

Practice cluster: Competency-based progression + flexible learning environments

A group of girls is working in the hallway. They are also working on array problems, but theirs are more challenging than those of the other students in the class. They get to work on a different assignment because they take a formative test on Mondays that determines what they work on for the rest of the week. If they have trouble, they will work with the teacher on the lesson, so these are students that had done well on the formative assessment. The students are working together at the portable white board to solve the problems.

Interview/PLC Student ownership of learning through a platform that includes goal setting, mentoring, assessment, and pacing of curriculum.

Demonstrate evidence of learning in multiple ways

Observation

One student working at a table appears to be making some sort of physical board game with a path and spinner. Two other students were also working together on a board game that appeared to include multiplication facts.

Interview/PLC

Students can demonstrate understanding of a math standard/show their teacher what they know in 3 ways: 1. Math worksheets (basic goal – list of properties/strategies to incorporate); 2. Real life application project (“plan a party”); 3. Word problems.

Related Survey Responses

We have highlighted the Learner Demonstrated survey items in which students indicated greater agreement from fall to spring.

Learner Demonstrated Survey Items – all student respondents

Survey item   Fall 2016   Spr 2017   Fall 2017

My teacher asks me what I know about new topics before I begin working on them. 2.73 2.91 2.69

If I can show I already understand a topic, my teacher lets me skip it. 1.86 1.66 1.94

My teacher quizzes us on what we know about new topics before we begin learning about them. 3.04 2.55 2.49

When we are learning something new, my teacher helps us understand how it fits in with what we've learned before. 3.22 3.37 3.26

When I feel stuck on a problem in class I can skip it and come back to it later. 2.56 2.77 2.60

When I feel stuck on a problem, my teacher or another adult helps me. 3.34 3.34 3.12

My teacher lets me work as fast or as slow as I want. 2.68 2.67 2.62

My teacher gives me as much time as I need to understand my school work. 2.79 2.89 2.84

I am allowed to finish my school work after school if I cannot finish it in class. 2.79 3.08 2.83

My teacher lets me take breaks from my work when I think I need it. 1.95 2.05 2.09

My teacher gives me harder work as I learn more about a topic. 2.70 2.78 2.66


I can move ahead to new topics as soon as I show what I have learned, even if other students are still working on it. 3.03 2.73 2.86

When I don’t understand a topic, I can spend more time on it while other students move on. 2.82 2.51 2.51

My teacher gives different assignments to students based on what they need to learn. 2.61 2.64 2.63

Students who don’t do well on a test are allowed to retake it. 2.83 2.57 2.81

How often have you noticed other students doing different assignments than you on the same topics? 2.59 2.43 2.35

As soon as I have mastered a skill, I can show my teacher that I have learned it. 3.48 3.17 3.25

I can decide how I will show my teacher that I have mastered a skill. 3.02 2.88 2.80

In this class students choose different ways to show that they have mastered the same skill. 3.10 2.88 2.88

My teacher lets me decide the best way to show what I have learned. 3.07 3.07 3.02

Learner Demonstrated Survey Items – all teacher respondents

Survey item   Fall 2016   Spr 2017   Fall 2017

I formally assess students' knowledge about a topic/skill before beginning a new unit of study. 2.00 3.20 3.33

I informally ask students what they know about a topic/subject before I start a new lesson or unit. 3.75 4.00 3.78

I assign work to my students based on their prior knowledge or data. 3.50 3.60 3.56

I use what students already know to direct their work on new topics/skills. 3.75 4.00 3.78

Students have the chance to revise their work after receiving feedback from others. 2.75 3.80 3.11

Students have the opportunity to see examples of work that exemplify excellence in a given subject or skill area. 3.00 3.20 3.11

Students who show that they have already mastered the knowledge and skills at the start of a topic or unit can skip it and move ahead. 2.75 2.80 3.00

Students can practice or review until they fully understand a topic/skill. 3.50 3.40 3.56

Students are allowed to have more time to finish work, even if other students have already moved ahead. 3.75 3.40 3.67

I give more challenging assignments to students who learn faster. 3.25 3.40 3.33

If students master skills faster than others, they move ahead to a new topic, unit or set of skills. 2.50 3.00 3.22


Students can demonstrate proficiency anytime during a unit to demonstrate their mastery of the concepts/skills.

2.75 2.60 3.00

Students have access to multiple options for demonstrating proficiency. 2.50 2.80 2.89

Students are given choices for how they can demonstrate their learning (e.g., project, paper, test, presentation).

3.25 3.40 3.00

Students can design or suggest new ways to demonstrate their own learning. 2.75 3.20 2.78

Students are clear about what they need to be able to do to demonstrate proficiency. 2.25 2.80 3.56

Students receive clear criteria for good performance before they begin working on schoolwork.

3.00 3.40 3.56

Demographics and Outcomes 2016-17 (from publicly available CPS data)

Demographics

% Black % Hispanic % White % Other % Bilingual % Diverse Learner % Free/Reduced Lunch

CPS 37.7% 46.5% 9.9% 5.9% 17.1% 13.7% 80.2%

BSC 45.9% 39.0% 10.7% 4.4% 10.6% 8.6% 65.0%

School 3.5% 83.9% 7.1% 5.5% 53.7% 12.8% 86.3%

NWEA MAP Attainment (Grades 3-8 combined)

Reading Avg RIT Score, Reading Percentile, Math Avg RIT Score, Math Percentile

CPS 214.6 222.6

BSC 221.5 230.5

School 209.9 64 217.7 62

NWEA MAP Growth (Grades 3-8 combined)

Reading: Avg Pre-test RIT Score, Avg Post-test RIT Score, Avg RIT Growth, % Making National Average Growth, National School Growth Percentile (School row only)

Math: Avg Pre-test RIT Score, Avg Post-test RIT Score, Avg RIT Growth, % Making National Average Growth, National School Growth Percentile (School row only)

CPS 208.5 215.2 6.7 60.5 215.1 223.2 8.1 56.7

BSC 215.8 221.7 5.9 63.2 223.5 230.8 7.3 56.8

School 202.6 210.5 7.9 58.9 68 208.5 218.5 10.0 59.5 69

Accountability

SQRP Total Points Earned

SY 2016-17 SQRP Rating

SY 2016-17 Accountability Status

School 3.6 Level 1 Good Standing

Potential Areas for Reflection

• How can Henry further develop Learner Led practices? For instance, we had little evidence that students knew how to advocate for support from teachers, peers, or technology, or that they were able to articulate their interests, strengths, or needs on their own.

• How can Henry further develop Learner Connected practices (engaging with the outside community, parents, etc.)?

• Staff noted in our conferring session that they may be interested in using the PL practices WEC identified as a way to frame PL within the school.

Lindblom Math and Science Academy

Personalized Learning in Context

Lindblom is a selective enrollment high school. Observations captured personalized learning occurring with particular teachers or classrooms, but PL did not appear to be systematically integrated across all grades. Lindblom nonetheless has several distinct strengths. For instance, during our observations, we saw "learner led" practices more frequently than the other framework areas. Such practices came through in Lindblom's colloquia, where the school makes room for student interests and passions. Colloquium time can also be used for students to get extra support in a range of subjects. Further, Lindblom has "student assistants," older students who assist with teaching younger students. Another distinct practice was Lindblom's writing centers, in which students can receive help on their writing from peers.

Lindblom also uses standards-based grading, which school leadership identified as a challenge with respect to its integration with CPS grading requirements. During our conferring meeting, Lindblom staff expressed a desire for teachers to use data more transparently when interacting with students, and anticipated challenges with competency-based progression in the future. Other challenges identified by school administrators during interviews include time and digital tools (e.g., district software).

Selected Personalized Learning Practices and Examples from Observations, Interviews, and Team Meetings

Personalized Learning Practice

Interview/Professional Learning Community Response from

Teachers

Example from Practice

Peer instruction

Lindblom has student assistants who help teach courses and assist students who are having trouble.

Project-based learning

Students are working on long-term projects. Students can choose from: 1) creating their own project and submitting it; 2) paper cubes with equations on each side. They work on these outside of class. All projects address the same standards.

Students are working on their final project for the semester (they had just started the previous week). The goal is to create a computer program, but there is a complicating factor: they must create an external controller (i.e., they cannot just use a joystick). Students are going through the design phase, so most are still fleshing out their ideas, though they are starting to coalesce around certain ideas. They are tasked with requesting tech workshops from the teacher so that he can help them with any difficulties, so they need to send him their requests.

Content area specialized classrooms

Given that Lindblom is a high school, many classrooms observed were subject-specific elective courses that included multiple grades.

Mixed-age grouping

Collaborative learning models/labs

Students in a STEM course are also working with the music theory/composition class, who will come up with themes for their games.

General Survey Data1

At Lindblom, only a small number of students responded to the survey in the fall of 2017, so those results should be interpreted with caution. Additionally, due to the low teacher response rate in the fall of 2017, we do not report any of those results.

Student respondents by grade

Grade

Fall 2016 Spring 2017 Fall 2017 Total

n % n % n % n %

7 1 3.2% 67 46.9% 0 0.0% 68 36.8%

8 30 96.8% 76 53.2% 11 100.0% 117 63.2%

Total 31 143 11 185

Student respondents by gender

Gender

Fall 2016 Spring 2017 Fall 2017 Total

n % n % n % n %

Male 15 48.4% 54 38.3% 1 11.1% 70 38.7%

Female 16 51.6% 87 61.7% 8 88.9% 111 61.3%

Total 31 141 9 181

Student respondents by race/ethnicity

Race/ethnicity

Fall 2016 Spring 2017 Fall 2017 Total

n % n % n % n %

Black/African-American 23 74.2% 92 67.2% 6 66.7% 121 68.4%

Hispanic 4 12.9% 27 19.7% 2 22.2% 33 18.6%

White 1 3.2% 8 5.8% 0 0.0% 9 5.1%

Other 3 9.7% 10 7.3% 1 11.1% 14 7.9%

Total 31 137 9 177

1 Totals across tables may not be the same, as students self-report their demographic characteristics.

Teacher survey respondents (Grades 6-8 only)

n Subject taught (ELA Math Social Studies Science Other) Implementing PL Strategies2 (N B R C)

Fall 2016 23 4 3 1 1 15 0 11 8 4

Spr 2017 6 0 2 1 1 3 0 2 3 1

Fall 2017 1 0 1 0 0 0 0 1 0 0

Relationship to LEAP Framework

The following juxtaposes qualitative data with survey data, specific to each of the four areas of the LEAP Framework.

Learner Focused

LEAP Framework Item Data type Content

Learning opportunities reflect an understanding of individual needs, interests, and strengths with respect to academic AND nonacademic skills and needs

Observation

On Wednesdays they do social emotional learning (SEL) support and flex classes that students sign up for, based on their own needs, via computers (there are three flex sessions, and teachers need to teach two). Teachers have autonomy over what they teach. Sessions can be academic support or enrichment, but if a student is struggling, the teacher can pull them in.

Observation

Work tables in one classroom are heterogeneously mixed based on students' last unit tests: one high, two middle, and one low student per table (the teacher had to make some adjustments based on behavior).

Learning opportunities reflect an understanding of individual needs, interests, and strengths with respect to non-academic skills and needs, including social, cultural, and emotional needs

Practice cluster: Project-based learning + social emotional and/or non-cognitive learning

"Student ambassadors" classroom. Older students are mixed with younger ones who are having trouble, under the guise of school leadership. This combines supporting students who need more help (with things like study skills, etc.) with developing student leaders. The students are in groups and had worked on Ambassador Goals and Ambassador Norms. One norm that they decided on was doing a "community builder" (ice breaker) activity every class. Two teachers are facilitating the discussion for the students. The discussion is pretty free-flowing. A second norm is coming up with a reflection. One girl describes her experience in the DR last summer, where students would do "big love" or "shout outs" to other students. Third norm: the Vegas rule. What happens in 325 stays in 325. A poster with the goals includes: 1) support and get to know each other inside and outside of class; 2) be leaders and role models in a variety of ways; 3) remain open-minded and positive. One student mentions that not all students feel comfortable with all of the other students. The teacher explains this is the same way staff meetings work: the teacher knows who she wants to sit by, but the administration makes them sit with people from different departments. The students start to talk in their groups again. Based on this, the teacher asks the students to find someone new to introduce themselves to and share high points/low points from the week with.

2 N = Not yet implementing. B = Beginning to implement a few. R = Regularly implementing for a portion of the time. C = Consistently implementing all of the time.

Related Survey Responses

We have highlighted the Learner Focused survey items in which students indicated greater agreement from fall to spring.

Learner Focused survey items - all student respondents3

Survey item Fall 2016 Spr 2017 Fall 2017

My teacher knows what I am interested in. 2.52 2.10 2.71

My teacher knows what things are easy for me to learn. 2.45 2.37 2.57

My teacher knows how I learn best. 2.62 2.33 2.71

My teacher notices if I have trouble learning something. 3.07 2.69 2.86

My teacher will always listen to my ideas. 3.04 2.73 3.86

My teacher knows about my life at home. 1.18 1.35 1.57

My teacher knows about how things are going with my friends. 1.83 1.63 1.29

My teacher knows about the activities I like to do outside of school. 1.86 1.60 2.00

My teacher knows who my friends from school are. 2.52 2.38 1.43

3 For this and other sets of survey responses, we averaged responses, with a response of 1 indicating the least agreement or frequency level. Questions should not be compared with each other; while most items include 4 possible response categories, some include more or fewer.
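As the footnote above notes, each reported value is a simple item mean over the Likert-style responses. A minimal sketch of that computation (the response values below are hypothetical, for illustration only, not data from the study):

```python
# Mean of Likert-style survey responses for one item.
# A response of 1 indicates the least agreement or frequency.
# These ratings are hypothetical, not taken from the evaluation data.
responses = [4, 3, 2, 4, 3]  # one item's ratings from five students
item_mean = sum(responses) / len(responses)
print(round(item_mean, 2))  # 3.2
```

Because items differ in their number of response categories, means computed this way should not be compared across items.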

My teacher knows what I do on the weekend. 1.34 1.30 1.71

My teacher connects what we are learning to the world outside the classroom. 3.03 2.61 2.57

I get the chance to do school work I am good at. 2.90 2.61 3.57

I do school work that makes me want to try hard. 3.07 2.62 3.43

I learn about things I am interested in. 2.89 2.49 3.00

What I learn in class connects with my life outside of school. 2.62 2.41 2.43

My school work interests me. 2.55 2.12 3.00

I know what I am learning now will help me when I grow up. 2.90 2.53 2.71

I learn valuable skills. 3.21 2.86 3.29

Learner Focused Survey Items – all teacher respondents

Survey item Fall 2016 Spr 2017

Think about the one-to-one meetings you have had with students this year. On average, how often do you discuss with individual students...

● Their interests? 3.04 3.50

● Their strengths? 3.87 3.50

● Their learning challenges? 4.00 3.83

● Their behavior? 3.52 3.17

● Their feelings or general well-being? 3.74 3.83

I learn about my students by discussing each student's progress with other adults in the school who are working with the student. 3.52 3.50

I learn about my students by discussing student progress with families (by phone, email or in person meetings). 3.17 2.50

I actively seek information about my students' outside interests. 3.34 3.17

I actively seek information about my students' families. 2.96 2.33

I know my students' family and home context. 2.17 2.00

I know my students' community context. 2.65 2.83

I know my students' learning interests. 3.09 3.33

I know which specific learning standards my students have met. 4.26 3.67

I know my students' strengths. 4.09 3.50

I understand my students' learning challenges. 3.96 3.50

I understand what motivates my students to learn. 3.70 3.50

How often do you...

● Incorporate student learning interests into your lessons? 3.87 3.83

● Assign school work to individual students based on non-academic data (e.g., learning preferences, work habits, social-emotional functioning)? 3.00 3.00

● Assign schoolwork to individual students based on academic data? 3.39 2.33

● Help students make a connection between what they are learning in the classroom and life outside of the classroom? 4.13 4.00

Learner Led

LEAP Framework Item Data type Content

Collaborate with learners to identify and include learner preferences and optimal learning conditions

Practice cluster: Flexible learning environments + content area specialized classrooms + mixed-age grouping + project-based learning

Colloquium: where in the school day/week do we make room for the interests and passions of the students? Student interest/special topics/support. 150 minutes in addition to lunch (the "long block"). Three flex classes that students sign up for each week, depending on what they need help with. At the end of the day is advisory (social/emotional). One classroom of students was planning Lindblom's upcoming 100-year anniversary (in 2.5 years). Others include sailing, cosmetology, and animation (cosmetology and animation are student-run: students do the lesson planning and curriculum). The original design was to give back to the Englewood community. Some colloquia still do that; others are based on students' interests and passions.

Baxter Lab. First day of the semester. The project they are working on is called the Baxter Challenge. Students are going to be brainstorming and developing a solution for at-home dialysis, to make the project better and improve service. The students are meeting with mentors from Baxter. They are just working on the engineering project right now. Some students chose to do this; some were assigned. Students are used to not necessarily getting the colloquia they want after being here for a while. Mostly juniors and seniors.

Collaborate with others to achieve goals

Practice cluster: Project-based learning + peer instruction

A senior student is teaching a colloquium. High school students will go to Randolph Elementary to tutor younger students in math starting next week, so students are just meeting each other for the first time. (This is basically an orientation.) This is a program that the senior leading the colloquium had previously participated in, but it died out, and she wanted to bring it back. Before I arrived in the classroom, the student had conducted various ice-breakers and had the other students read two articles on math teaching (David Bornstein, "A Better Way to Teach Math," NY Times, and "Principles of Deeply Effective Math Teaching," homeschoolmath.net). There was a teacher in the room, but she was just observing and will order buses. When I arrived, the student was starting a Family Feud game for the students to participate in about what behaviors students should have when tutoring. The students were to slap her hand to buzz in. (One student slaps her hand as a joke before it starts.) The first correct answer is "be patient." It's the #1 answer, and the students all clap. The next student says "love" (of the material), a wrong answer. The #3 answer is "acknowledging successes"; the student teacher reminds the other students to be encouraging. One student says "responsibility." The student teacher asks, how so? She didn't put it up on the board, but she says it's a good answer. One student answers "respect," another good answer that she had not written. One other student gives a wrong answer. She prompts them: what are good things you can do to be a good tutor? Other correct answers were "try different approaches" and "be friendly."

Observation

Students are working in pairs creating a presentation under certain guidelines: the history of the invention and the changes it has gone through to get to this point. They will work through this and then do presentations in the next class.

Interview/PLC Before any formal assessments, students do work in groups

Advocate for needed support from teacher, peers, technology, and other sources

Practice cluster: Targeted instruction + collaborative learning models/labs + peer instruction

Designed to be like university writing centers, where peer writing tutors and writing center staff tutor students. Students can opt into coming in for extra help. The peer tutors are 11th and 12th grade students who get trained during the summer to help the adult tutors. Often the students just want extra practice in a certain subject.

Related Survey Responses

We have highlighted the Learner Led survey items in which students indicated greater agreement from fall to spring. In some cases, less agreement could be a positive development. For instance, if teachers set or keep track of learning goals less often, that could reflect increased student agency.

Learner Led survey items – all student respondents

Survey item Fall 2016 Spr 2017 Fall 2017

My teacher lets me include topics I like in the lessons and units that we study. 2.61 2.54 2.57

My teacher helps me figure out how I will get all my work done. 3.50 2.99 3.43

I talk with my teacher about the kinds of activities that help me learn best. 2.05 2.03 1.75

I talk to my teacher about how I can challenge myself at school. 1.95 2.11 2.50

I talk with my teacher about which subjects I am really good at. 2.10 2.10 1.67

I talk with my teacher about subjects I need help learning. 2.57 2.49 1.67

I tell my teacher what would help me learn. 2.19 2.27 2.33

I can decide which learning activities I will do. 2.14 2.15 2.83

I can choose which students I will do my work with. 2.82 2.68 2.67

I can decide which skills I work on. 2.21 2.29 2.71

I can decide how much time I will spend doing my schoolwork. 2.50 2.70 3.17

I can decide where I do my schoolwork. 2.14 2.70 2.33

I can decide the order in which I complete my assignments. 3.14 2.92 3.17

When I am doing school work, I know exactly what I am trying to learn. 3.32 2.94 3.25

I know what my learning goals are. 3.55 3.06 3.00

I know which skills I need to improve. 3.32 3.21 3.00

My teacher sets learning goals for me. 3.32 3.03 3.33

I set learning goals with my teacher. 2.45 2.42 1.67

I keep track of which of my learning goals I have met. 2.65 2.77 3.00

My teacher keeps track of which of my learning goals I have met. 2.95 2.51 3.67

I keep track of whether I have completed my school work. 3.30 3.25 4.00

I look at my test results to see which skills I still need to work on. 3.60 3.35 4.00

I look at schoolwork other than tests to see what skills I still need to work on. 3.30 3.05 3.25

I learn from mistakes in my work. 3.44 3.12 3.67

Most students in my class help each other learn. 3.04 2.91 3.83

My teacher helps me figure out what I can do to improve my school work. 3.04 2.71 3.33

When I am given the chance to revise my work, I do. 3.04 3.11 3.83

My teacher helps me see which things I am doing well on. 3.00 2.74 3.00

My teacher asks me to decide which skills I still need to work on. 2.88 2.58 3.00

I work with different groups of students during the day. 2.75 2.89 2.40

Other students give me feedback on my school work. 2.96 3.10 3.33

I give feedback to other students on their school work. 3.04 2.99 3.33

I work together with other students on school work. 3.63 3.29 3.80

To what extent does feedback from other students help you improve your work? 2.88 2.92 3.33

When I don’t understand something in class I ask other students for help. 3.08 2.76 4.00

I try to figure problems out on my own before I ask my teacher for help. 3.36 2.94 4.00

I know where to go for help when I have a question. 3.32 2.89 4.00

I can get help on my work in more than one way. 3.32 2.92 4.00

I don’t mind asking the teacher a question in class. 3.18 2.88 4.00

Learner Led survey items – all teacher respondents

Survey item Fall 2016 Spr 2017

Think about the one-to-one meetings you have had with students this year. On average, how often do you…

● Determine with students which learning activities will align with their individual interests? 3.09 2.33

● Ask students for input on the topics we will study? 2.83 2.50

● Talk with students about the kinds of activities that will help them learn best? 3.35 2.83

● Talk with students about how they can challenge themselves at school? 3.52 3.50

Students make choices on how to learn in class (e.g., read written material, complete a project, whether to work alone or in a group). 3.26 3.16

Students make choices on what to learn in class (e.g., which skills to practice, what topics to research, which subjects to work on first). 3.13 2.83

How much do you agree with the following statement? Students can explain how learning activities connect to their learning goals. 3.09 2.83

Students create goals for their own learning (e.g., which academic skills to improve, what behaviors or work habits to strengthen). 3.30 3.17

Students create their learning goals with me. 3.17 2.67

What proportion of your students... Have a unique set of learning goals? 2.30 2.33

Think about the one-to-one meetings you have had with students this year. On average, how often do you …

● Examine individual student assessment results together with the student? 3.73 2.67

● Discuss individual student overall progress? 3.91 3.17

● Provide feedback to individual students on their work? 4.13 4.17

● Discuss what individual students think are the reasons for their progress or lack of progress? 3.70 3.50

● Make adjustments to individual student learning goals and/or learning plans with the student? 3.56 2.83

● Discuss with individual students how they have grown as learners? 3.57 3.17

When you are providing feedback and support to individual students, on average, how often do you encourage students to reflect upon the effectiveness of their efforts and learning strategies? 4.04 3.50

Students keep track of their own learning progress. 3.74 3.50

Students have access to their own performance data. 4.48 4.67

Students engage in structured reflection on their own learning and progress (e.g., journals, reflection exercise, group sharing time). 3.52 3.17

Students self-assess their own work. 3.74 3.17

Students formally discuss how they will work as a team (e.g., determine roles/responsibilities, decision making process). 3.65 2.83

Students formally exchange feedback on their work with peers. 3.43 3.33

Students know which specific learning standards they have successfully met. 4.13 3.80

Students know which specific learning standards they still need to meet. 4.22 3.60

Students ask for help from peers before seeking my help. 3.35 3.20

Students seek my help when they need it. 4.22 3.60

Students know where to go for help (e.g., technology, peers, adults, and other sources). 4.30 4.00

Students know when they need help with schoolwork. 4.04 3.20

How many students in your class could tell a visitor exactly what learning goal(s) they are working on? 3.70 3.40

How true are the following about why you use small groups in your class? … To complete special projects. 4.00 3.50

Students build on each other's ideas during discussion. 3.48 3.60

Students use data and text references to support their ideas. 3.26 3.20

Students show each other respect. 3.70 3.60

Students provide constructive feedback to their peers and to me. 3.35 3.40

Most students participate in the discussion at some point. 3.43 3.40

Learner Demonstrated

LEAP Framework Item Data type Content

Demonstrate evidence of learning in multiple ways

Observation

Students were each using a computer (the class took place in a computer lab). They used various types of technology to create their presentations: PowerPoint, Google Images, Google Slides, other Google Docs, and Wikipedia.

Related Survey Responses

We have highlighted the Learner Demonstrated survey items in which students indicated greater agreement from fall to spring.

Learner Demonstrated Survey Items – all student respondents

Survey item Fall 2016 Spr 2017 Fall 2017

My teacher asks me what I know about new topics before I begin working on them. 2.97 2.65 3.11

If I can show I already understand a topic, my teacher lets me skip it. 1.54 1.45 1.50

My teacher quizzes us on what we know about new topics before we begin learning about them. 2.20 2.33 2.89

When we are learning something new, my teacher helps us understand how it fits in with what we've learned before. 3.33 2.91 3.33

When I feel stuck on a problem in class I can skip it and come back to it later. 2.38 2.42 3.00

When I feel stuck on a problem, my teacher or another adult helps me. 3.03 3.02 3.75

My teacher lets me work as fast or as slow as I want. 2.45 2.32 2.88

My teacher gives me as much time as I need to understand my school work. 2.52 2.43 3.13

I am allowed to finish my school work after school if I cannot finish it in class. 3.00 2.90 3.63

My teacher lets me take breaks from my work when I think I need it. 1.87 1.90 2.75

My teacher gives me harder work as I learn more about a topic. 3.17 2.95 3.50

I can move ahead to new topics as soon as I show what I have learned, even if other students are still working on it. 2.00 2.06 1.75

When I don't understand a topic, I can spend more time on it while other students move on. 1.83 2.07 2.63

My teacher gives different assignments to students based on what they need to learn. 1.52 1.90 1.50

Students who don't do well on a test are allowed to retake it. 3.63 3.59 4.00

How often have you noticed other students doing different assignments than you on the same topics? 1.43 1.95 1.63

As soon as I have mastered a skill, I can show my teacher that I have learned it. 3.27 2.75 3.57

I can decide how I will show my teacher that I have mastered a skill. 2.69 2.26 2.29

In this class students choose different ways to show that they have mastered the same skill. 2.69 2.28 1.86

My teacher lets me decide the best way to show what I have learned. 2.66 2.26 2.00

Learner Demonstrated Survey Items – all teacher respondents

Survey item Fall 2016 Spr 2017

I formally assess student's knowledge about a topic/skill before beginning a new unit of study. 3.09 2.60

I informally ask students what they know about a topic/subject before I start a new lesson or unit. 3.26 3.00

I assign work to my students based on their prior knowledge or data. 3.13 2.50

I use what students already know to direct their work on new topics/skills. 3.22 2.67

Students have the chance to revise their work after receiving feedback from others. 3.48 3.33

Students have the opportunity to see examples of work that exemplify excellence in a given subject or skill area. 3.18 3.17

Students who show that they have already mastered the knowledge and skills at the start of a topic or unit can skip it and move ahead. 1.87 2.17

Students can practice or review until they fully understand a topic/skill. 3.52 3.00

Students are allowed to have more time to finish work, even if other students have already moved ahead. 3.26 3.00

I give more challenging assignments to students who learn faster. 3.04 3.00

If students master skills faster than others, they move ahead to a new topic, unit or set of skills. 2.43 2.50

Students can demonstrate proficiency anytime during a unit to demonstrate their mastery of the concepts/skills. 3.00 2.00

Students have access to multiple options for demonstrating proficiency. 3.13 2.50

Students are given choices for how they can demonstrate their learning (e.g., project, paper, test, presentation). 2.70 1.50

Students can design or suggest new ways to demonstrate their own learning. 2.74 2.17

Students are clear about what they need to be able to do to demonstrate proficiency. 3.43 2.50

Students receive clear criteria for good performance before they begin working on schoolwork. 3.39 3.50

Demographics and Outcomes 2016-17 (from publicly available CPS data)

Demographics

% Black % Hispanic % White % Other % Bilingual % Diverse Learner % Free/Reduced Lunch

CPS 37.7% 46.5% 9.9% 5.9% 17.1% 13.7% 80.2%

BSC 45.9% 39.0% 10.7% 4.4% 10.6% 8.6% 65.0%

School 71.6% 24.4% 1.8% 2.2% 1.2% 5.6% 64.5%

NWEA MAP Attainment (Grades 3-8 combined)

Reading Avg RIT Score, Reading Percentile, Math Avg RIT Score, Math Percentile

CPS 214.6 222.6

BSC 221.5 230.5

School 234 99 244.5 97

NWEA MAP Growth (Grades 3-8 combined)

Reading: Avg Pre-test RIT Score, Avg Post-test RIT Score, Avg RIT Growth, % Making National Average Growth, National School Growth Percentile (School row only)

Math: Avg Pre-test RIT Score, Avg Post-test RIT Score, Avg RIT Growth, % Making National Average Growth, National School Growth Percentile (School row only)

CPS 208.5 215.2 6.7 60.5 215.1 223.2 8.1 56.7

BSC 215.8 221.7 5.9 63.2 223.5 230.8 7.3 56.8

School 231.4 234.1 2.7 66.1 54 241.7 244.5 2.8 46.2 28
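As a reading aid, the Avg RIT Growth column is the post-test average minus the pre-test average. A quick arithmetic check against the School row above (a sketch for interpretation only, not part of the evaluation's methodology):

```python
# Avg RIT growth = avg post-test RIT - avg pre-test RIT (School row above).
reading_pre, reading_post = 231.4, 234.1
math_pre, math_post = 241.7, 244.5
print(round(reading_post - reading_pre, 1))  # 2.7, matching the reading growth column
print(round(math_post - math_pre, 1))        # 2.8, matching the math growth column
```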

SAT/PSAT

Test Avg EBRW score Avg math score Avg Composite % College Ready

SAT – District 483 472 956 35.6%

SAT – School 572 549 1121 86.5%

PSAT 10 – District 447 448 896 41.8%

PSAT 10 – School 506 504 1010 83.3%

PSAT 9 – District 417 431 849 43.1%

PSAT 9 – School 479 487 965 88.8%

Accountability

         SQRP Total Points Earned   SY 2016-17 SQRP Rating   SY 2016-17 Accountability Status
School   4.6                        Level 1+                 Good Standing

Potential Areas for Reflection

● Given that high school tends to be a more personalized environment (students get to pick electives, and classrooms may be mixed-age based on ability level), what steps could Lindblom take to incorporate aspects of personalized learning into classrooms that are already sorted by subject and/or grade?

● Could more transparent interactions with students using data increase student buy-in to aspects of personalized learning such as competency-based progression?


Joseph Lovett Elementary School

Personalized Learning in Context

Lovett is a neighborhood elementary school that was incorporating personalized learning into grades 2-5 in 2016-17. Its academic performance has improved rapidly over the past several years, and leadership closely tracks the school’s academic metrics. For instance, during both grade-level meetings we observed, school leadership consistently used data (test scores, attendance) to discuss student progress. We also observed a meeting with outside visitors in which the principal used data to discuss Lovett’s performance.

We observed a number of practices at Lovett that were indicative of personalized learning. The most visually apparent was its flexible learning environments, in which students engage in personalized learning on their computers while sitting on risers or beanbags in the hallway or on the classroom floor. On “Flex Friday,” students can choose different courses, academic and non-academic. Students at Lovett have “tracking and reflection sheets” on which they indicate the progress they have made toward their learning goals. Additionally, students are combined into multi-grade classrooms (for instance, 2nd-3rd grade). This allows younger students who are more advanced to progress at a faster pace and helps older students who may need additional support.

One note of interest is that we observed fewer practice clusters at Lovett than at the other schools. Challenges identified by staff include student attendance, digital tools, staff capacity, and work that is often mismatched to student ability (either too easy or too hard).

Selected Personalized Learning Practices and Examples from Observations, Interviews, and Team Meetings

Personalized Learning Practice

Interview/Professional Learning Community Response from Teachers

Example from Practice

Personalized/learner-focused/targeted instruction

Interview response: Students are grouped on ability, test scores, conference data, RIT scores. Grouped the girls together because they were having trouble with “motive.” The younger students are actually the most advanced.

Post-PLC feedback: By identifying particular students who are “off-track,” I can personalize instruction and conference to get them “on-track.” The introduction of the Skills Navigator tool provides a way to target which skills each student needs to work on and which are already mastered. This obviously is a great help in planning instruction.

The teacher mostly instructs individually even though she is working with a group of students. The teacher then assigns students to reading support groups by group color.

Personal learning paths

A teacher said she will walk up to a student and ask what they are working on in Lexia; they will look at their trackers to see, but it is better than before. It is intervention and enrichment.

The teacher reminds students to update progress ("tracking and reflection") sheets, which include what they do each day, in each rotation, whether they finish it, and how they did.

Competency- or proficiency-based progression

Students have to show skill to progress with deeper content.

The teacher has uploaded 2 different versions of a story (the three little pigs, one more traditional and one less so). The teacher tells the students to read both books before they start working on the MyOn assignment. The students can use either MyOn or Compass for point of view. If they are less confident with the skill, they should start with Compass; if they feel more confident, they can start with MyOn. If they get to the assignment on MyOn and cannot figure out how to do it, they can always go back to Compass, since it is more of a teaching tool.

Flexible learning environments/schedules

There are 2-3 flex spaces near the classroom doors. Nooks by stairwells. Bleachers at each end of the hall are portable – teachers will do small-group instruction there.

● The students scatter. Those that go out into the hallway to work take beanbags with them. At this point, there are students from several classrooms working in the hallway. The teacher goes to the hallway to make sure her students are good before conferencing with individual students.

● Across the classroom, several students work on Chromebooks on either MyOn or Compass: four at desks, 3 together on the floor in a corner at the back of the room, and 1 alone on the carpet at the back of the room. Six other students do opinion writing worksheets.

Mixed-age grouping

Students are scattered across the classroom, most on the floor on carpets or spongy chairs. All of the students working on the floor are on their Chromebooks on ThinkThroughMath and the teacher is working with a group of 6 at a table at the front of the room on rounding. This is designated math time in this 2nd-3rd grade classroom. (In fact, the whole day has defined blocks of time for each subject, according to the schedule on the wall of the classroom.)

Page 182: Final Report - LEAP Innovations · 2019-08-12 · Please contact Annalee Good (annalee.good@wisc.edu) with questions related to this report. 4 • Examples of practices supporting

167

Project-based learning

Discussion about project-based learning. Is it useful/good? Yes. That is where you see students demonstrate what they learn. In the real world, you do not take a test every week. Kids collaborate, love tech. They love to find and present information. They are excited about using tech (Google slides, presentation software) to show what they know. They may argue but they get to the finish line.

Data-driven consultation

PLC discussion: A teacher discussed giving high-flyers a pre-test on how to do measurement (convert feet to inches) – average was around 1 out of 18. They need to be taught how to even use a ruler or read a graph. It was very eye-opening.

Post-PLC feedback: We regularly look over and analyze data in order to discuss trends (or troubleshoot and brainstorm). We often discuss teaching strategies, as well as edTech programs (i.e., usage, benefits, etc).

Peer instruction

On “Flex Friday,” kids can choose different courses, academic and non-academic. Had kids teach Spanish classes. (It is cool to see kids “lesson plan.”)

One student was working with other students and teaching them how to do math. She pointed out the math “bell ringer” activity on the board, and said that the teacher calls students to the board to solve the problem. She said they are helping each other.

Social emotional and/or noncognitive learning

This meeting helped me reflect on the conversations I should continue to have with students in terms of progress in school and attendance. We want to motivate our students as much as possible.

The teacher conferencing with the student tells the student that it is not that he did not feel like doing the work; it is that he did not persevere. She emphasizes the importance of perseverance, and uses that as a frame to set some goals.

General Survey Data1

Student respondents by grade

Grade

Fall 2016 Spring 2017 Fall 2017 Total

n % n % n % n %

4 43 38.1% 39 33.1% 40 33.3% 122 34.8%

5 36 31.9% 37 31.4% 46 38.3% 119 33.9%

6 34 30.1% 42 35.6% 34 28.3% 110 31.3%

Total 113 118 120 351

1 Totals across tables may not be the same, as students self-report their demographic characteristics.

Student respondents by gender

Gender

Fall 2016 Spring 2017 Fall 2017 Total

n % n % n % n %

Male 50 45.1% 62 52.5% 58 54.2% 170 50.6%

Female 61 55.0% 56 47.5% 49 45.8% 166 49.4%

Total 111 118 107 336

Student respondents by race/ethnicity

Race/ethnicity

Fall 2016 Spring 2017 Fall 2017 Total

n % n % n % n %

Black/African-American 79 73.2% 88 74.6% 77 71.3% 244 73.1%

Hispanic 18 16.7% 20 17.0% 20 18.5% 58 17.4%

White 2 1.9% 1 0.9% 1 0.9% 4 1.2%

Other 9 8.3% 9 7.6% 10 9.3% 28 8.4%

Total 108 118 108 334

Teacher survey respondents

             n   Grade taught        Subject taught                                   Implementing PL Strategies2
                 1-3   4-5   6-8     ELA   Math   Social Studies   Science   Other    R   C
Fall 2016    7   4     3     0       7     7      7                7         0        4   3
Spr 2017     6   3     3     2       6     6      5                5         1        2   4
Fall 2017    7   5     3     1       7     7      7                7         1        4   3

2 N = Not yet implementing. B = Beginning to implement a few. R = Regularly implementing for a portion of the time. C = Consistently implementing all of the time.

Relationship to LEAP Framework

The following juxtaposes qualitative data with survey data, specific to each of the four areas of the LEAP Framework.

Learner Focused

LEAP Framework Item Data type Content

Learning opportunities reflect an understanding of individual needs, interests, and strengths with respect to academic skills and needs

Practice cluster: Targeted instruction + flexible learning environments

The students are grouped into several groups, and the special ed instructor tells them where they would go (“literacy centers” involve fairy tales – compare/contrast, contractions, persuasive writing, read to self, and teacher and me). He also reminded the students what to pay attention to when they broke out into their various literacy groups. The students are assigned groups based on animals (brown bears, red robins, blue jays, purple pandas, orange ostriches, green gophers). The teacher tells them where to go and asks them to move as fast as they can at noise level 0, warning that students who do not will be moved to the yellow group.

In the hallway, six students are sitting on beanbags on their Chromebooks. One student explains that they are all “brown bears.” They are working in thelearningodyssey on comparing and contrasting.

Practice cluster: Flexible learning environments + mixed-age grouping

In a mixed 4th-5th grade classroom, students are sitting around the classroom at desks, at tables, or on the floor. Many are working on their Chromebooks on TTM. The teacher is working with a small group on the floor at the front of the classroom.

Observation

A teacher uses the data from ST Math/TTM to determine what students need help with and with whom she needs to work in small groups. The small groups are more fluid in math than reading based on what students need help with.

Interview/PLC Kids receive instruction based on what they need. One size fits all will not work.


Learning opportunities that reflect an understanding of individual needs, interests, and strengths with respect to non-academic skills and needs, including social, cultural and emotional

Observation

The teacher is very encouraging with the small group – she is pushing students who may not think they have the ability to do certain rounding problems. “I think you can do this.” “You can do it!” When students in this group get the right answers, they are very happy.

Interview/PLC Mondays are social/emotional.

Related Survey Responses

We have highlighted the Learner Focused survey items in which students indicated greater agreement from fall to spring.
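The highlighting criterion is a simple item-level check: an item qualifies when its spring average exceeds its fall average. A minimal sketch of that check (the two items and their values are taken from the table that follows; the selection code itself is our illustration, not the evaluators' actual tooling):

```python
# Flag survey items whose spring average exceeds the fall average.
# Values are (fall 2016, spring 2017) means from the table below.
items = {
    "My teacher knows what I am interested in.": (2.57, 2.82),
    "My teacher will always listen to my ideas.": (3.21, 3.19),
}

# Keep only items that showed a fall-to-spring gain.
gained = [item for item, (fall, spring) in items.items() if spring > fall]
```

Here only the first item shows a gain, so only it would be highlighted.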

Learner Focused survey items - all student respondents3

Survey item   Fall 2016   Spr 2017   Fall 2017

My teacher knows what I am interested in. 2.57 2.82 3.12

My teacher knows what things are easy for me to learn. 3.01 3.13 3.28

My teacher knows how I learn best. 2.97 3.09 3.29

My teacher notices if I have trouble learning something. 3.25 3.25 3.51

My teacher will always listen to my ideas. 3.21 3.19 3.12

My teacher knows about my life at home. 1.54 1.84 1.95

My teacher knows about how things are going with my friends. 2.19 2.52 2.78

My teacher knows about the activities I like to do outside of school. 1.84 2.28 2.53

My teacher knows who my friends from school are. 2.79 2.96 3.18

My teacher knows what I do on the weekend. 1.54 1.88 1.95

My teacher connects what we are learning to the world outside the classroom. 2.68 2.98 3.05

I get the chance to do school work I am good at. 2.75 2.83 2.95

I do school work that makes me want to try hard. 3.19 3.31 3.46

I learn about things I am interested in. 2.92 3.10 3.18

What I learn in class connects with my life outside of school. 2.45 2.90 2.98

My school work interests me. 2.86 2.88 3.10

3 For this and other sets of survey responses, we averaged responses, with a response of 1 indicating the least agreement or frequency level. Questions should not be compared with each other; while most items include 4 possible response categories, some include more or fewer.
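As the footnote describes, each reported value is a plain mean of coded responses, with 1 as the lowest agreement or frequency level. A minimal sketch, assuming integer response coding (the function name and sample responses are illustrative):

```python
# Average coded Likert-style responses, 1 = least agreement,
# rounded to two decimals as in the tables.
def item_average(responses):
    """responses: list of integer response codes for one survey item."""
    return round(sum(responses) / len(responses), 2)

# e.g., ten students answering one 4-point item:
avg = item_average([4, 3, 3, 2, 4, 3, 3, 4, 2, 3])
```

Because items differ in their number of response categories, averages like these are comparable across time points for the same item but not across items.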


I know what I am learning now will help me when I grow up. 3.38 3.42 3.57

I learn valuable skills. 3.34 3.45 3.49

Learner Focused Survey Items – all teacher respondents

Survey item   Fall 2016   Spr 2017   Fall 2017

Think about the one-to-one meetings you have had with students this year. On average, how often do you discuss with individual students...

● Their interests? 3.33 4.33 4.57

● Their strengths? 3.67 5.00 5.00

● Their learning challenges? 4.33 5.00 5.00

● Their behavior? 3.83 4.33 4.29

● Their feelings or general well-being? 4.33 4.83 4.86

I learn about my students by discussing each student's progress with other adults in the school who are working with the student. 2.71 4.17 4.29

I learn about my students by discussing student progress with families (by phone, email or in person meetings). 3.57 4.00 4.00

I actively seek information about my students' outside interests. 4.00 4.67 4.86

I actively seek information about my students' families. 3.43 4.00 4.43

I know my students' family and home context. 3.71 4.33 4.00

I know my students' community context. 4.43 4.67 4.43

I know my students' learning interests. 4.00 4.83 4.71

I know which specific learning standards my students have met. 4.71 4.67 4.43

I know my students' strengths. 4.86 4.83 4.43

I understand my students' learning challenges. 4.86 5.00 4.43

I understand what motivates my students to learn. 4.43 4.67 4.29

How often do you...

● Incorporate student learning interests into your lessons? 4.00 4.50 4.57

● Assign school work to individual students based on non-academic data (e.g., learning preferences, work habits, social-emotional functioning)? 3.43 3.83 4.43

● Assign schoolwork to individual students based on academic data? 4.71 4.83 4.86

● Help students make a connection between what they are learning in the classroom with life outside of the classroom? 4.43 4.67 4.86

Learner Led

LEAP Framework Item Data type Content

Collaborate with learners to identify and include learner preferences and optimal learning conditions

Observation

The teacher asks students by name where they want to work. Choices include MyOn - Read & Respond; Compass (nonfiction text features); writing (Tables 3 & 6); Word Work (suffixes) (Tables 1 & 2); Lexia (Hallway). Each table is designed for certain activities.

Interview/PLC Moved from teaching kids to having them figure out how best to learn for themselves.

Partner in setting learning goals and plans

Observation

The teacher calls one student up to her desk for a conference, and to set some goals. She has him look at his tracking sheet. It looks similar to the one from last week, and she asks him to tell her what he thinks about it, what his reflections and thoughts are, because, she tells him, it does not matter what she thinks. She tells him to be honest with himself and to set goals that will benefit him. He says it makes him unhappy. She asks why – he says he forgot to do something. He only completed one activity last week. The teacher writes the student’s goal (to finish his assignments) on a piece of paper with ways to achieve that goal. The teacher gives this student a high-five once they finish conferencing. She then calls up another student to conference with him. The conference proceeds in a similar manner.

Interview/PLC Empowered learners: Teacher-student conferencing – what is important to the child. Students are responsible for formulating their own activities.

Assess and monitor their own progress

Observation A student who was working on contractions closes his Chromebook and starts to fill out a reading tracking and reflection sheet.

Interview/PLC One teacher talked specifically about what to give to high-flyers. I have taught you the skill, now apply the skill. Flowing up to lead to measuring themselves; scaffolding.

Collaborate with others to achieve goals

Observation Three students are doing math together on a white board.

Interview/PLC With MyOn – pushing a book, or series of books. Students had to choose, work with a partner, and focus on a skill. The teacher tries to have them collaborate and hold each other accountable.

Related Survey Responses

We have highlighted the Learner Led survey items in which students indicated greater agreement from fall to spring. In some cases, less agreement could be a positive development. For instance, if teachers set or keep track of learning goals less often, that could reflect increased student agency.


Learner Led survey items - all student respondents

Survey item   Fall 2016   Spr 2017   Fall 2017

My teacher lets me include topics I like in the lessons and units that we study. 3.02 3.20 3.04

My teacher helps me figure out how I will get all my work done. 3.46 3.54 3.58

I talk with my teacher about the kinds of activities that help me learn best. 2.60 2.89 2.86

I talk to my teacher about how I can challenge myself at school. 2.39 2.78 2.97

I talk with my teacher about which subjects I am really good at. 2.67 2.98 3.09

I talk with my teacher about subjects I need help learning. 2.73 3.19 3.37

I tell my teacher what would help me learn. 2.73 2.98 3.08

I can decide which learning activities I will do. 2.72 3.04 3.08

I can choose which students I will do my work with. 3.24 3.33 3.29

I can decide which skills I work on. 2.60 3.05 2.91

I can decide how much time I will spend doing my schoolwork. 2.54 2.83 2.52

I can decide where I do my schoolwork. 3.12 3.40 3.30

I can decide the order in which I complete my assignments. 3.07 3.45 3.31

When I am doing school work, I know exactly what I am trying to learn. 2.97 3.22 3.29

I know what my learning goals are. 3.47 3.53 3.49

I know which skills I need to improve. 3.32 3.67 3.55

My teacher sets learning goals for me. 3.12 3.51 3.42

I set learning goals with my teacher. 2.82 3.32 3.10

I keep track of which of my learning goals I have met. 2.87 3.23 3.13

My teacher keeps track of which of my learning goals I have met. 3.15 3.40 3.28

I keep track of whether I have completed my school work. 3.26 3.29 3.46

I look at my test results to see which skills I still need to work on. 3.14 3.40 3.29

I look at schoolwork other than tests to see what skills I still need to work on. 3.04 3.17 3.27

I learn from mistakes in my work. 3.20 3.52 3.47

Most students in my class help each other learn. 3.14 3.41 3.21


My teacher helps me figure out what I can do to improve my school work. 3.00 3.45 3.34

When I am given the chance to revise my work, I do. 3.31 3.41 3.33

My teacher helps me see which things I am doing well on. 2.96 3.38 3.30

My teacher asks me to decide which skills I still need to work on. 2.95 3.15 3.10

I work with different groups of students during the day. 3.15 3.49 3.33

Other students give me feedback on my school work. 3.02 3.37 3.24

I give feedback to other students on their school work. 3.18 3.37 3.11

I work together with other students on school work. 3.33 3.47 3.50

To what extent does feedback from other students help you improve your work? 2.90 2.96 2.94

When I don't understand something in class I ask other students for help. 3.13 3.36 3.29

I try to figure problems out on my own before I ask my teacher for help. 3.23 3.31 3.32

I know where to go for help when I have a question. 3.32 3.44 3.48

I can get help on my work in more than one way. 3.21 3.26 3.39

I don't mind asking the teacher a question in class. 3.22 3.32 3.37

Learner Led survey items - all teacher respondents

Survey item   Fall 2016   Spr 2017   Fall 2017

Think about the one-to-one meetings you have had with students this year. On average, how often do you...

● Determine with students which learning activities will align with their individual interests? 3.00 3.50 4.43

● Ask students for input on the topics we will study? 3.17 3.67 4.57

● Talk with students about the kinds of activities that will help them learn best? 3.50 4.33 4.57

● Talk with students about how they can challenge themselves at school? 3.33 5.00 4.71

Students make choices on how to learn in class (e.g., read written material, complete a project, whether to work alone or in a group). 3.71 4.50 4.43

Students make choices on what to learn in class (e.g., which skills to practice, what topics to research, which subjects to work on first). 3.43 4.33 4.43

How much do you agree with the following statement? Students can explain how learning activities connect to their learning goals. 3.29 3.83 3.00


Students create goals for their own learning (e.g., which academic skills to improve, what behaviors or work habits to strengthen). 3.43 4.00 4.14

Students create their learning goals with me. 4.14 4.50 4.43

What proportion of your students... Have a unique set of learning goals? 2.43 3.00 2.71

Think about the one-to-one meetings you have had with students this year. On average, how often do you...

● Examine individual student assessment results together with the student? 3.50 4.17 4.43

● Discuss individual student overall progress? 4.33 4.67 4.43

● Provide feedback to individual students on their work? 4.33 5.00 4.57

● Discuss what individual students think are the reasons for their progress or lack of progress? 4.17 5.00 4.57

● Make adjustments to individual student learning goals and/or learning plans with the student? 3.83 4.67 4.43

● Discuss with individual students how they have grown as learners? 4.17 4.67 4.43

When you are providing feedback and support to individual students, on average, how often do you encourage students to reflect upon the effectiveness of their efforts and learning strategies? 4.00 4.67 4.71

Students keep track of their own learning progress. 4.14 4.83 4.57

Students have access to their own performance data. 3.86 4.50 4.57

Students engage in structured reflection on their own learning and progress (e.g., journals, reflection exercise, group sharing time). 3.57 4.33 4.57

Students self-assess their own work. 3.43 4.00 4.71

Students formally discuss how they will work as a team (e.g., determine roles/responsibilities, decision making process). 3.29 4.00 4.14

Students formally exchange feedback on their work with peers. 3.43 4.00 4.00

Students know which specific learning standards they have successfully met. 3.43 3.67 4.00

Students know which specific learning standards they still need to meet. 3.14 3.67 3.86

Students ask for help from peers before seeking my help. 4.14 4.83 4.57

Students seek my help when they need it. 4.29 4.67 4.57

Students know where to go for help (e.g., technology, peers, adults, and other sources). 4.57 5.00 4.71

Students know when they need help with schoolwork. 4.71 4.83 4.71

How many students in your class could tell a visitor exactly what learning goal(s) they are working on? 3.57 4.33 4.00

How true are the following about why you use small groups in your class? … To complete special projects. 4.71 4.33 4.57


Students build on each other’s ideas during discussion. 3.43 3.67 3.86

Students use data and text references to support their ideas. 3.29 4.00 3.86

Students show each other respect. 3.86 4.00 4.00

Students provide constructive feedback to their peers and to me. 3.43 3.67 3.86

Most students participate in the discussion at some point. 3.86 4.00 4.00

Learner Demonstrated

LEAP Framework Item Data type Content

Enter at a level appropriate to their prior knowledge and learning needs

Practice cluster: Targeted instruction + competency-based progression

The teacher goes to the 5 students on the carpet and asks whether they like Achilles or Midas better. All chime in, but the one student who had been leading the group still dominates the conversation. The teacher challenges these students on feelings/thoughts and explains that many Greek tragedies have similar themes. The teacher tries to get students to tie these themes to their lives.

The teacher returns to the group of 4 boys at the table, asking them questions about plot. He asks what characters’ motives are.

The other group of 3 boys works together to interpret pictures in their books about Midas. The teacher goes over to them. He sees that they are struggling and asks about their “somebody, wanted, but, so” worksheets. The teacher discusses the plot with them further, tying it into their lives and families.

Have supports and pacing that fit their learning

Observation

The teacher explains how the groups work – each day, they work on something different (rounds 1-4), but even within groups there is differentiation. For instance, the red group comprises the lower students, who receive more supports and have different graphic organizers. This is consistent across the pod. Of the other three teachers in the room, one is special ed and the other two are student teachers. Usually students are allowed to move between classrooms, but today the adjoining classroom is doing a video lesson, so they remain in this classroom for the most part. (A few go into the hallway.)

Interview/PLC A teacher describes “fix-up Fridays,” in which students can catch up on work they had not gotten done and the students who are farther ahead can do additional work.


Related Survey Responses

We have highlighted the Learner Demonstrated survey items in which students indicated greater agreement from fall to spring.

Learner Demonstrated Survey Items – all student respondents

Survey item   Fall 2016   Spr 2017   Fall 2017

My teacher asks me what I know about new topics before I begin working on them. 3.02 3.23 3.37

If I can show I already understand a topic, my teacher lets me skip it. 1.91 2.08 2.20

My teacher quizzes us on what we know about new topics before we begin learning about them. 2.58 2.61 2.84

When we are learning something new, my teacher helps us understand how it fits in with what we’ve learned before. 3.37 3.45 3.64

When I feel stuck on a problem in class I can skip it and come back to it later. 2.96 3.15 2.93

When I feel stuck on a problem, my teacher or another adult helps me. 3.31 3.42 3.59

My teacher lets me work as fast or as slow as I want. 2.25 2.30 2.07

My teacher gives me as much time as I need to understand my school work. 2.99 3.05 2.97

I am allowed to finish my school work after school if I cannot finish it in class. 2.88 3.00 2.87

My teacher lets me take breaks from my work when I think I need it. 2.19 2.46 2.21

My teacher gives me harder work as I learn more about a topic. 2.84 3.07 3.09

I can move ahead to new topics as soon as I show what I have learned, even if other students are still working on it. 2.70 2.85 3.00

When I don’t understand a topic, I can spend more time on it while other students move on. 2.70 2.80 2.69

My teacher gives different assignments to students based on what they need to learn. 3.06 3.03 3.25

Students who don’t do well on a test are allowed to retake it. 2.86 2.91 2.71

How often have you noticed other students doing different assignments than you on the same topics? 2.32 2.54 2.67

As soon as I have mastered a skill, I can show my teacher that I have learned it. 3.42 3.33 3.42

I can decide how I will show my teacher that I have mastered a skill. 2.91 2.80 3.01

In this class students choose different ways to show that they have mastered the same skill. 2.99 2.92 3.04

My teacher lets me decide the best way to show what I have learned. 3.05 3.15 3.07


Learner Demonstrated Survey Items – all teacher respondents

Survey item   Fall 2016   Spr 2017   Fall 2017

I formally assess student’s knowledge about a topic/skill before beginning a new unit of study. 3.14 3.50 3.57

I informally ask students what they know about a topic/subject before I start a new lesson or unit. 3.71 4.00 3.71

I assign work to my students based on their prior knowledge or data. 3.29 3.83 3.57

I use what students already know to direct their work on new topics/skills. 3.71 3.67 3.71

Students have the chance to revise their work after receiving feedback from others. 2.86 3.67 3.29

Students have the opportunity to see examples of work that exemplify excellence in a given subject or skill area. 3.29 3.67 3.71

Students who show that they have already mastered the knowledge and skills at the start of a topic or unit can skip it and move ahead. 3.57 3.33 3.86

Students can practice or review until they fully understand a topic/skill. 3.00 3.33 3.43

Students are allowed to have more time to finish work, even if other students have already moved ahead. 3.57 3.50 3.57

I give more challenging assignments to students who learn faster. 3.86 4.00 3.57

If students master skills faster than others, they move ahead to a new topic, unit or set of skills. 3.57 3.83 3.86

Students can demonstrate proficiency anytime during a unit to demonstrate their mastery of the concepts/skills. 3.00 3.00 3.43

Students have access to multiple options for demonstrating proficiency. 3.29 3.33 3.43

Students are given choices for how they can demonstrate their learning (e.g., project, paper, test, presentation). 3.00 3.50 3.29

Students can design or suggest new ways to demonstrate their own learning. 2.71 3.17 3.43

Students are clear about what they need to be able to do to demonstrate proficiency. 3.29 3.83 3.86

Students receive clear criteria for good performance before they begin working on schoolwork. 3.71 4.00 4.00


Demographics and Outcomes 2016-17 (from publicly available CPS data)

Demographics

| | % Black | % Hispanic | % White | % Other | % Bilingual | % Diverse Learner | % Free/Reduced Lunch |
| --- | --- | --- | --- | --- | --- | --- | --- |
| CPS | 37.7% | 46.5% | 9.9% | 5.9% | 17.1% | 13.7% | 80.2% |
| BSC | 45.9% | 39.0% | 10.7% | 4.4% | 10.6% | 8.6% | 65.0% |
| School | 83.2% | 16.1% | 0.2% | 0.5% | 5.9% | 11.6% | 92.1% |

NWEA MAP Attainment (Grades 3-8 combined)

| | Reading: Average RIT Score | Reading: Percentile | Math: Average RIT Score | Math: Percentile |
| --- | --- | --- | --- | --- |
| CPS | 214.6 | | 222.6 | |
| BSC | 221.5 | | 230.5 | |
| School | 213.6 | 61 | 220.1 | 48 |


NWEA MAP Growth (Grades 3-8 combined)

Reading

| | Avg Pre-test RIT Score | Avg Post-test RIT Score | Avg RIT Growth | % Making National Average Growth | National School Growth Percentile |
| --- | --- | --- | --- | --- | --- |
| CPS | 208.5 | 215.2 | 6.7 | 60.5 | |
| BSC | 215.8 | 221.7 | 5.9 | 63.2 | |
| School | 205.5 | 213.6 | 8.1 | 63.5 | 88 |

Math

| | Avg Pre-test RIT Score | Avg Post-test RIT Score | Avg RIT Growth | % Making National Average Growth | National School Growth Percentile |
| --- | --- | --- | --- | --- | --- |
| CPS | 215.1 | 223.2 | 8.1 | 56.7 | |
| BSC | 223.5 | 230.8 | 7.3 | 56.8 | |
| School | 209.9 | 220.2 | 10.3 | 62.5 | 88 |

Accountability

| | SQRP Total Points Earned | SY 2016-17 SQRP Rating | SY 2016-17 Accountability Status |
| --- | --- | --- | --- |
| School | 4.3 | Level 1+ | Good Standing |

Potential Areas for Reflection

● While Lovett is strong in PL across many categories, we observed fewer practice clusters at Lovett than at other schools in the BSC cohort. How could Lovett better combine PL-related practices to optimize personalized learning?

● How can Lovett take the next step to ensure PL remains sustainable?
