
  • Part B SPP/APR 2012 Indicator Analyses-(FFY 2010)

    TABLE OF CONTENTS

    INDICATOR 1: GRADUATION RATE ........................................................................... 1 Prepared by the National Dropout Prevention Center for Students with Disabilities

    INDICATOR 2: DROPOUT RATE................................................................................. 15 Prepared by the National Dropout Prevention Center for Students with Disabilities

    INDICATOR 3: ASSESSMENT .................................................................................... 27 Prepared by the National Center on Educational Outcomes

    INDICATOR 4: RATES OF SUSPENSION AND EXPULSION .................................... 54 Prepared by the Data Accountability Center

    INDICATOR 5: LEAST RESTRICTIVE ENVIRONMENT (LRE) ................................... 68 Prepared by the National Institute for Urban School Improvement-Leadscape

    INDICATOR 7: PRESCHOOL OUTCOMES ................................................................ 83 Prepared by the Early Childhood Outcomes Center

    INDICATOR 8: PARENT INVOLVEMENT .................................................................. 102 Prepared by the National Parent Technical Assistance Center (NPTAC) at PACER Center, Region 1 PTAC at Statewide Parent Advocacy Network, Regional 2 PTAC at Exceptional Children's Assistance Center, Region 3 PTAC at Partners Resource Network, Region 4 PTAC at Wisconsin FACETS, Region 5 PTAC at PEAC Parent Center, and Regional 6 PTAC at Matrix Parent Network and Resource Center

    INDICATORS 9, 10: DISPROPORTIONATE REPRESENTATION DUE TO INAPPROPRIATE IDENTIFICATION ......................................................................... 115 Prepared by the Data Accountability Center and the National Center on Response to Intervention

    INDICATOR 11: TIMELY INITIAL EVALUATIONS .................................................... 132 Prepared by the Data Accountability Center

    INDICATOR 12: EARLY CHILDHOOD TRANSITION ............................................... 144 Prepared by the National Early Childhood Technical Assistance Center

    INDICATOR 13: SECONDARY TRANSITION ........................................................... 154 Prepared by the National Secondary Transition Technical Assistance Center

    INDICATOR 14: POST-SCHOOL OUTCOMES ........................................................ 161 Prepared by the National Post-School Outcomes Center

  • Part B 2012 SPP/APR Indicator Analyses (FFY 2010)

    INDICATOR 15: TIMELY CORRECTION OF NONCOMPLIANCE ............................ 181 Prepared by the Data Accountability Center

    INDICATORS 16, 17, 18 AND 19: DISPUTE RESOLUTION UNDER PART B ......... 191 Prepared by the Center for Appropriate Dispute Resolution in Special Education (CADRE)

    INDICATOR 20: TIMELY AND ACCURATE DATA ................................................... 218 Prepared by the Data Accountability Center


    INDICATOR 1: GRADUATION RATE Prepared by NDPC-SD

    INTRODUCTION

    The National Dropout Prevention Center for Students with Disabilities (NDPC-SD) was assigned the task of compiling, analyzing, and summarizing the data for Indicator 1 (Graduation) from the FFY 2010 Annual Performance Reports (APRs) and amended State Performance Plans (SPPs), which states submitted to the Office of Special Education Programs (OSEP) on February 1, 2012. The text of the indicator is as follows:

    Percent of youth with IEPs graduating from high school with a regular diploma.

    This report summarizes NDPC-SD’s findings for Indicator 1 across the 50 states, commonwealths and territories, and the Bureau of Indian Education (BIE), for a total of 60 agencies. For the sake of convenience, in this report the term “states” is inclusive of the 50 states, the commonwealths and territories, and the BIE.

    MEASUREMENT

    The Part B Measurement Table indicates that states are to use the "Same data as used for reporting to the Department under Title I of the Elementary and Secondary Education Act (ESEA)." These data are reported in the Consolidated State Performance Report exiting data.

    Sampling is not permitted for this indicator, so states must report graduation information for all of their students with disabilities. States were instructed to "Report using the graduation rate calculation and timeline established by the Department under the ESEA" and to "Describe the results of the State's examination of the data for the year before the reporting year (e.g., for the FFY 2010 APR, use data from the 2009-2010 school year), and compare the results to the target for the 2009-10 school year. Provide the actual numbers used in the calculation." Additional instructions were to "Provide a narrative that describes the conditions youth must meet in order to graduate with a regular diploma and, if different, the conditions that youth with IEPs must meet in order to graduate with a regular diploma. If there is a difference, explain why." Finally, states' performance targets were to be the same as their annual graduation rate targets under Title I of the ESEA.


    IMPLICATIONS OF THE GRADUATION RATE MEASUREMENT

    The four-year adjusted cohort graduation rate defines a "graduate" as someone who receives a regular high school diploma in the standard number of years (specifically, four). Students who do not meet the criteria for graduating with a regular diploma cannot be included in the numerator of the calculation, but must be included in the denominator. The new calculation also excludes students who receive a modified or special diploma, a certificate, or a General Educational Development (GED) credential from being counted as graduates. It is adjusted to reflect transfers into and out of the cohort (i.e., out of the school), as well as loss of students to death.

    The equation below shows an example of the four-year graduation rate calculation for the cohort entering ninth grade for the first time in the fall of the 2006-07 school year and graduating by the end of the 2009-10 school year.

    Four-year graduation rate =

      (# of cohort members receiving a regular HS diploma by the end of the
       2009-10 school year)
      divided by
      (# of first-time 9th graders in fall 2006 (starting cohort)
       + transfers in - transfers out - emigrated out - deceased
       during school years 2006-07 through 2009-10)
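The calculation above can be sketched as a small function. This is a hypothetical illustration: the function name and the example numbers are invented, not drawn from any state's data system.

```python
def four_year_grad_rate(graduates, first_time_9th, transfers_in,
                        transfers_out, emigrated, deceased):
    """Four-year adjusted cohort graduation rate.

    graduates      : cohort members receiving a regular HS diploma by the
                     end of year 4 (e.g., the 2009-10 school year)
    first_time_9th : first-time 9th graders in year 1 (e.g., fall 2006)
    The denominator is adjusted for transfers in and out, emigration,
    and deaths during the four school years.
    """
    denominator = (first_time_9th + transfers_in
                   - transfers_out - emigrated - deceased)
    return graduates / denominator

# Invented example numbers:
rate = four_year_grad_rate(graduates=850, first_time_9th=1000,
                           transfers_in=120, transfers_out=90,
                           emigrated=10, deceased=2)
print(f"{rate:.1%}")  # prints 83.5%
```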

    States may obtain permission from the U.S. Department of Education to report one or more additional cohorts that span a different number of years (for example, a five-year cohort, or a five-year plus a six-year cohort). Because students with disabilities and students with limited English proficiency face additional obstacles to completing their coursework and examinations within the standard four-year timeframe, such extended cohort rates help ensure that these students are ultimately counted as graduates, even though they remain in school longer than the traditional four years. It should be noted that states are prohibited from using this provision exclusively for youth with disabilities and youth with limited English proficiency. Several states have taken advantage of this option, and the provision is likely to become more important in years to come, as many states have increased the academic credit and course requirements all students must meet to graduate.

    The requirement to follow every child in a cohort necessitates the use of longitudinal data systems that employ unique student identifiers. Most states have these in place, or are well on the way to developing such systems. A few states have had difficulty meeting this need and have had to request permission from the Department of Education to report using a different calculation method or data set.


    CALCULATION METHODS

    States were not required to implement the new adjusted cohort rate calculation until the 2010-11 school year, and many had not yet done so. In FFY 2010, only 20 states (33%) reported using the adjusted cohort calculation. Of the remaining 40 states, 30 (50%) reported a leaver rate, four states (7%) reported a cohort rate, three states (5%) reported an event rate, and three states (5%) reported using other calculations. Figures 1-5 show states' graduation rates, based on the type of calculation employed.

    Figure 1


    Figure 2

    Figure 3


    Figure 4

    Figure 5


    STATES’ PERFORMANCE ON THE INDICATOR

    As shown in Figure 6, 17 states (28%) met or exceeded their FFY 2010 graduation rate targets and 43 states (72%) did not. These results are down from FFY 2009, during which 25 states (42%) met their graduation rate targets. Of those that met their graduation target, 13 states (22%) also met their dropout rate target in FFY 2010.

    A factor that adversely impacted states' performance against their targets was that 35 states (58%) raised their graduation rate targets over the previous year's. As reported in the FFY 2009 APRs, targets ranged from 25.0% to 91.3%, with a mean of 71.2% and a median of 75.3%. In the current APRs, targets ranged from 22.0% to 90.0%, with a mean of 72.8% and a median of 80.0%.

    Figure 7 shows that more than half the states (33 states or 55%) made progress and improved their rates, whereas 24 states (40%) reported a decrease (slippage) in their graduation rates from FFY 2009. One state’s rate remained at the FFY 2009 level and two states were unable to make the comparison because they lacked comparable data.

    Despite this progress, across each of the four common methods of calculation (leaver, adjusted cohort, cohort, and event formulas), average graduation rates for students with disabilities appeared to decline during FFY 2010. Several factors contributed to this. Some of them resulted in an actual decrease in the rate, whereas others are artifacts of changes in targets or measurement between FFY 2009 and FFY 2010.

    One relatively minor factor in reducing the rates involves states that have very low numbers of students with disabilities. In these states, small fluctuations in the number of graduates from year to year can yield drastic swings in the graduation rate, thereby raising or lowering averages. Another fairly minor factor is the slight increase in the number of states that calculated an adjusted cohort rate (as opposed to an event or leaver rate). As indicated in Figures 1, 2, and 4, the adjusted cohort rates were generally lower than event or leaver rates. This year saw an increase in the number of adjusted cohort rates and a decline in the number of leaver rates, and as a likely consequence, a depression of the average graduation rates. Finally, at least one state reported that their FFY 2009 data was suspect, resulting in what appeared to be a substantial decrease in their rate from FFY 2009 to FFY 2010.

    In examining Figure 7, it is apparent that the amount of slippage in the states whose graduation rates declined from FFY 2009 was generally greater than the amount of progress made by the states that improved. The mean gain in states that made progress was 3.4% with a median of 2.3% (N=33), whereas the mean slippage in states whose rates declined was -12.0% with a median of -5.3% (N=24).
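Summary statistics of this kind can be reproduced with Python's statistics module. The per-state changes below are invented for illustration; they are not the actual FFY 2010 figures.

```python
import statistics

# Hypothetical year-over-year changes in graduation rate (percentage
# points); positive values are progress, negative values are slippage.
changes = [3.1, 2.3, 1.8, 0.9, -4.1, -5.3, -20.4]

gains = [c for c in changes if c > 0]
losses = [c for c in changes if c < 0]

print(f"progress: mean {statistics.mean(gains):.1f}, "
      f"median {statistics.median(gains):.1f} (N={len(gains)})")
print(f"slippage: mean {statistics.mean(losses):.1f}, "
      f"median {statistics.median(losses):.1f} (N={len(losses)})")
```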


    Figure 6

    Figure 7


    IMPROVEMENT STRATEGIES AND ACTIVITIES

    States were instructed to report the strategies, activities, timelines, and resources they employed in order to improve the special education graduation rate. The range of proposed activities was considerable, though many states described the use of data-based decision making to guide improvement activities and to identify at-risk youth.

    Most states acknowledged the connections between their activities for at least Indicators 1 and 2. Thirty-eight states (63%) reported the same set of activities for both indicators. Another nine states (15%) described activities common to both indicators. Many states clustered at least some, if not all, of their activities for Indicators 1, 2, 4, 13, and 14, indicators intimately tied to secondary transition. In these states, there was a concerted focus on promoting successful secondary transition practices as a means of keeping youth engaged in and participating in school-related activities. Additionally, 28 states (47%) reported activities aimed at engaging parents and families as partners in educating their children.

    The use of research-based/evidence-based strategies and interventions as well as “promising practices” around school completion continued among states. Twelve states (20%) mentioned statewide efforts to identify (and subsequently disseminate) effective practices in their LEAs that focused on school completion. A handful of states described various efforts to develop a toolkit or suite of resources that LEAs could use to develop and support local school completion initiatives.

    There are a number of evidence-based school-completion programs that have demonstrated efficacy for students with disabilities. The IES Practice Guide on Dropout Prevention (Dynarski et al., 2008) describes several of these approaches to keeping youth in school and discusses the degree of evidence supporting each. For example, it recommends the diagnostic use of data systems to support a realistic estimate of the number of students who drop out and to help identify individual students at high risk of dropping out. The practice guide also recommends assigning adult advocates to students at risk of dropping out, as well as providing academic support and enrichment to improve academic performance. Additional research is under way to evaluate the efficacy of many other promising practices that address school completion, so additional evidence-based practices are on the horizon.

    Table 1 lists several commonly described interventions and the number of states reporting their use in the APR.


    Table 1

    Evidence-based and promising practices reported in the FFY 2010 APRs

    Nature of intervention                      Number of states
    ------------------------------------------  ----------------
    Used research/evidence-based practices             48
    Response to Intervention                           44
    Positive Behavior Supports                         32
    Parental/family engagement efforts                 28
    Academic initiatives                               27
    Vocational education / CTE                         17
    Credit recovery programs                           11
    Mentoring programs                                  9
    Recovery/reentry programs                           6

    Statewide initiatives

    Thirty-seven states (62%) reported that school completion was a state priority, though only 24 (40%) reported that they were developing or implementing any sort of statewide initiative that would impact their graduation, dropout, and/or reentry/recovery rates.

    Georgia One statewide initiative continues in the State of Georgia, which has implemented its GraduateFIRST initiative since 2007. The program currently has three cohorts of schools, for a total of 131 schools, all of which have developed and are implementing local school completion initiatives for students with disabilities. One reason for the success of this program is the ongoing support and follow-up provided to each participating school via Georgia's network of collaboration coaches. The coaches, who were trained by NDPC-SD and State personnel under Georgia's State Personnel Development Grant (SPDG), are each assigned several schools in which they support the local work, serving as trainers, mentors, content resources, and cheerleaders for the ongoing work. Additionally, the program is described in a brief developed by the Regional Resource Center Program's Student Performance and Achievement Priority Team, available online at http://www.ndpc-sd.org/documents/12.Spotlight_GraduateFirst.pdf.


  • Part B SPP/APR 2012 Indicator Analyses-(FFY 2010) Page 10

    Kentucky Kentucky is also implementing a statewide initiative focused on school completion. The State’s continuous improvement monitoring process requires every district in which one or more students with disabilities drops out to conduct a root-cause analysis of their data at the district, school, and student level to identify the cause(s) of the dropout.

    While this effort is focused only on youth with disabilities, the Kentucky Department of Education also developed the Kentucky College and Career Readiness (CCR) delivery plan to address school completion for all students. The plan focuses accountability at the school/district level to increase the rate of its students who leave high school ready for college, career, or both. One of the strategies of the CCR delivery plan is the collection and use of data. This has resulted in the development of the Persistence to Graduation Tool, an early warning tool that identifies students who are at risk of dropping out. Accompanying the data tool is a suite of evidence-based practices to address any needs identified in the school.

    Alabama Alabama’s First Choice Initiative is a program designed to increase the graduation rate and to improve the post-school outcomes of Alabama youth with and without disabilities. It provides multiple pathways to graduation and provides a variety of safeguards and supports to assist struggling learners. The components of the program are: credit recovery, credit advancement (earning credit in non-traditional ways), graduation coaches for at-risk students, and multiple diploma options.

    NDPC-SD intensive states In collaboration with NDPC-SD, ten states (AR, BIE, LA, MI, MO, NE, NC, UT, WA, and WV) are currently working on statewide initiatives to improve their school completion rates. State Education Agency (SEA) and Local Education Agency (LEA) staff in these states are receiving training and technical assistance from NDPC-SD to help them develop model sites for dropout prevention initiatives or to address other state and local needs around school completion, including data-related needs. Additionally, the State of Georgia and the Miami-Dade County Public School District in Florida are continuing the work they initiated with NDPC-SD under its first round of OSEP funding.

    Nebraska Several states chose topics related to school completion for the Results portion of their OSEP continuous improvement visits in 2011. Among those states was Nebraska, which was already working intensively with the National Dropout Prevention Center for Students with Disabilities to develop, pilot, and disseminate a toolkit of resources and materials for schools to use in designing and developing local school completion initiatives. Nebraska wanted to leverage their work with NDPC-SD and reengage youth with disabilities who had dropped out of high school. Getting these youth back into educational programs can be an effective strategy for improving the post-secondary outcomes for these youth.

    In September 2011, Nebraska held its first stakeholder meeting, at which information about dropout, graduation, reentry/recovery, and other related topics was presented to and discussed with a broad stakeholder group. A product of that meeting was a 4-year strategic plan, which has the goal of developing, piloting, and disseminating (statewide) a reentry program for youth with disabilities in Nebraska.

    Among the strategies Nebraska has chosen to support this goal are:

    1) Increasing awareness at the state and local levels regarding dropout reentry strategies;

    2) Increasing the capacity of current programs focused on dropout prevention to target students with disabilities who have left school but remain eligible for special education;

    3) Developing partnerships with other entities that can have statewide impact on providing reentry services to students with disabilities; and

    4) Partnering with general education initiatives to increase graduation rates.

    The State has posted information about their efforts and progress on this work at the following link: http://www.education.ne.gov/sped/reentry.html.

    Examples of other improvement activities

    Data-based decision making Data-based decision making was a nearly ubiquitous activity, reported by 54 states (90%) in this APR in one form or another. States are examining their school completion data and considering that information when targeting technical assistance to LEAs, awarding LEA improvement grants, looking for effective practices, and identifying topics for professional development.

    Eleven states (18%) described work on an early warning system using their longitudinal data to identify youth who are at risk of dropping out of school. The data being employed include information about students’ attendance, behavior, grade retention, and academic performance on state assessments. In general, states that reviewed this sort of information about their students have experienced success in using it to inform their work. Examples of states that examined such risk and protective factors related to school completion include Alabama, Arkansas, Massachusetts, Michigan, Nebraska, and West Virginia.
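A minimal early-warning check over these data can be sketched as below. The four indicator domains (attendance, behavior, grade retention, assessment performance) come from the text; the field names and thresholds are invented for illustration and are not any state's actual criteria.

```python
# Hypothetical early-warning sketch; thresholds are invented.
def dropout_risk_flags(student):
    flags = []
    if student["attendance_rate"] < 0.90:        # chronic absence
        flags.append("attendance")
    if student["behavior_incidents"] >= 3:       # repeated referrals
        flags.append("behavior")
    if student["retained_grade"]:                # held back a grade
        flags.append("retention")
    if not student["assessment_proficient"]:     # below proficient
        flags.append("academics")
    return flags

print(dropout_risk_flags({
    "attendance_rate": 0.82,
    "behavior_incidents": 4,
    "retained_grade": False,
    "assessment_proficient": False,
}))  # prints ['attendance', 'behavior', 'academics']
```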

    While data-based decision making has a low level of supporting evidence in the educational literature, as discussed in the 2008 IES Practice Guide on Dropout Prevention, the practice is logical and essential for examining the factors within the school environment that contribute to dropout rates and for diagnosing the extent to which schools will need to implement strategies to address dropping out. In addition, the implementation of any improvement strategy must involve continually returning to the individual student data to monitor the success of the strategy and to adjust approaches as needed. It should also be noted that the dearth of supporting evidence reflects a lack of studies directly evaluating the effect this practice has on keeping youth in school, rather than any lack of validity.

    As discussed above, while the use of data analysis is critical in identifying areas of need, it is not a strategy or intervention, per se, for keeping youth in school, but rather a tool to support the greater effort. Once the students’ needs have been identified, it is necessary to provide rigorous instruction in academics, career skills, and self-advocacy in order to keep at-risk youth engaged in school and to foster their success.

    Identification of effective practices Kansas, Missouri, North Carolina, South Dakota, Tennessee, and Wisconsin were among the ten states that reported efforts to identify and examine the programs being implemented in their LEAs that had graduation rates above the state average. They are working to share these promising practices among the other districts in the state through various means, including websites, communities of practice, newsletters, and conference presentations.

    Eleven states (18%) indicated in their APRs that they are actively engaging in evaluation of their improvement activities to identify those which yield measurable improvements in the desired impact area. The states incorporating evaluation into their improvement activities are Georgia, Hawaii, Iowa, Illinois, Indiana, Kansas, Kentucky, Nebraska, Pennsylvania, South Dakota, and Vermont.

    Reentry programs Including Nebraska, six states (10%) described reentry/recovery programs in their APRs. While there are many such programs around the country, most operate at a local level rather than statewide, as Nebraska intends for its initiative, which makes them difficult to locate and identify. Reentry programs may be operating in many states, but because of their local nature, they simply do not get reported in states' APRs.

    Reentry programs generally involve a school system and a combination of one or more community agencies, businesses or business organizations, colleges or community colleges, or faith-based organizations. Their focus varies, depending on their genesis and the population they serve. One commonality is that reentry programs frequently offer options for credit recovery, a necessity if the goal is to obtain a high school diploma, as the majority of returning students are credit deficient. Another common characteristic of reentry programs is their flexibility. The needs of the populations they serve are often quite diverse, so flexibility in scheduling, venue for instruction, mode of instructional delivery, and entry into and exit from the program are all beneficial elements that help these programs serve their audiences effectively.

    COMMONALITIES AMONG STATES THAT MADE PROGRESS OR MET TARGETS

    Table 2 shows some of the school completion activities states engaged in and indicates whether they made progress from FFY 2009 or achieved their FFY 2010 targets for Indicator 1.

    Table 2
    States’ performance and some of their activities

    Improvement activity                                States making   States meeting
                                                        progress        graduation target
    --------------------------------------------------  -------------   -----------------
    Transition-related activities                            31                17
    Data-based decision making                               30                15
    Indicated graduation & dropout were a priority           22                11
    Using one or more evidence-based practices               28                16
    Statewide initiative related to school completion        16                 8

    Filtering the data to select states that made progress and engaged in all of the above activities leaves ten states (17%). Only five states (8%) met their graduation target and engaged in all of the activities in the table above. The same five states met their target, made progress, and engaged in the above improvement activities.
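The filtering described above amounts to a set-membership test over each state's reported activities. The state names and activity assignments below are invented for illustration; they are not the actual APR data.

```python
# Activities from Table 2, abbreviated; state records are invented.
ALL_ACTIVITIES = {"transition", "data_use", "priority",
                  "evidence_based", "statewide_initiative"}

states = {
    "State A": {"made_progress": True, "met_target": True,
                "activities": set(ALL_ACTIVITIES)},
    "State B": {"made_progress": True, "met_target": False,
                "activities": {"transition", "data_use"}},
    "State C": {"made_progress": False, "met_target": False,
                "activities": set(ALL_ACTIVITIES)},
}

# States that made progress AND engaged in every listed activity.
progress_and_all = [name for name, s in states.items()
                    if s["made_progress"]
                    and ALL_ACTIVITIES <= s["activities"]]
print(progress_and_all)  # prints ['State A']
```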

    CONCLUSIONS AND RECOMMENDATIONS

    The overall quality of states’ APRs for FFY 2010 was the best since the SPP/APR came into existence. States generally provided the required information about their definitions, calculations, and data in a clear form. The descriptions of improvement activities were generally more concise than in years past as well. As more states switch over to using the adjusted cohort rate calculation, it will continue to become easier to quantify states’ improvements and compare progress for the nation overall.

    While Indicators B1 and B2 are performance indicators (as opposed to compliance indicators), in these lean fiscal times increasing importance is being placed on identifying activities that will improve states’ graduation and dropout rates for students with disabilities. Judging which activities were most beneficial based solely on the brief information contained in the APR is difficult at best. Without knowing the particulars of each activity or intervention and how it was implemented within a state, and without impact data for the activity, there is no reliable way to determine what worked well and what did not.

    To advance “the work” of improving school completion rates in the nation, more states need to engage in meaningful evaluation of their SPP improvement activities and to report on what worked in particular contexts for their students with disabilities. Information of this nature can benefit other states struggling with similar issues. The Regional Resource Center Program has posted resources to support states in their evaluation of improvement activities at the following URL: http://www.rrcprogram.org/content/view/191/288/.



    INDICATOR 2: DROPOUT RATE Prepared by NDPC-SD

    INTRODUCTION

    The National Dropout Prevention Center for Students with Disabilities (NDPC-SD) was assigned the task of compiling, analyzing, and summarizing the data for Indicator 2—Dropout—from the FFY 2010 Annual Performance Reports (APRs) and the revised State Performance Plans (SPPs), which were submitted to the Office of Special Education Programs (OSEP) in February of 2012. The text of the indicator is as follows:

    Percent of youth with IEPs dropping out of high school.

    This report summarizes the NDPC-SD’s findings for Indicator 2 across the 50 states, commonwealths and territories, and the Bureau of Indian Education (BIE), for a total of 60 agencies. For the sake of convenience, in this report the term “states” is inclusive of the 50 states, the commonwealths and territories, as well as the BIE.

    MEASUREMENT

    The OSEP Part B Measurement Table for this submission indicates that "Sampling is not allowed." Additionally, it advises that states should provide state-level dropout data and that they should "Describe the results of the State’s examination of the data for the year before the reporting year (e.g., for the FFY 2010 APR, use data from 2009-2010), and compare the results to the target. Provide the actual numbers used in the calculation." States were also instructed to "Provide a narrative that describes what counts as dropping out for all youth and, if different, what counts as dropping out for youth with IEPs. If there is a difference, explain why."

    The Measurement Table further indicates that states must "Report using the dropout data used in the ESEA graduation rate calculation and follow the timeline established by the Department under the ESEA." The instructions for completing the Consolidated State Performance Report (for ESEA reporting) direct states to provide dropout rates calculated as the annual event school dropout rate (students leaving school in a single year), determined in accordance with the National Center for Education Statistics’ (NCES) Common Core of Data (CCD) for the previous school year.

    In the FFY 2010 APRs, most states followed the above guidance. The major exceptions were territories and commonwealths, which are not required to submit data under the ESEA. These states reported using their §618 exiting data.


    CALCULATION METHODS

    Though it is less of an issue now than in the past, comparisons of dropout rates among states are still confounded by the existence of multiple methods for calculating dropout rates and by the fact that different states employ different calculations to fit their circumstances. The dropout rates reported in the FFY 2010 APRs were generally calculated using one of three methods: an event rate calculation, a cohort rate calculation, or a leaver rate calculation.

    The NCES event rate, reported by the vast majority of states (47 states, or 78%), yields a basic snapshot of a single year’s group of dropouts. The cohort method, as its name suggests, follows a group or cohort of individual students from 9th through 12th grades; while it generally yields a higher dropout rate than the event calculation, it provides a more accurate picture of attrition from school over the course of four years. Nine states (15%) reported a cohort-based dropout rate. Leaver rates provide an estimate of the dropout rate for a cohort of students, and calculations of this type also generally result in higher rates than do event-rate calculations. This year, four states (7%) reported using a leaver rate.
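    As a rough illustration of how these calculation methods differ, the sketch below computes an event rate and a leaver-style rate from hypothetical counts. The figures are invented for illustration and are not drawn from any state’s APR:

```python
# Illustrative sketch only; the counts below are hypothetical, not from any APR.

def event_dropout_rate(dropouts_this_year, enrolled_this_year):
    """NCES-style annual event rate: a single year's dropouts as a
    percentage of that year's enrollment."""
    return 100.0 * dropouts_this_year / enrolled_this_year

def leaver_dropout_rate(dropouts, graduates):
    """Leaver-style rate: dropouts as a percentage of all students who
    left school (graduates plus dropouts), approximating cohort attrition."""
    return 100.0 * dropouts / (dropouts + graduates)

print(event_dropout_rate(120, 4000))  # 3.0
print(leaver_dropout_rate(120, 880))  # 12.0
```

    With the same 120 dropouts, the leaver calculation yields a much higher rate than the event calculation, which is consistent with the pattern described above.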

    Figures 1 – 3 show states’ dropout rates, based on the method employed in calculating their dropout rate for the FFY 2010 APR (using 2009-10 data).

    Figure 1


    Figure 2

    Figure 3


    STATES’ PERFORMANCE ON THE INDICATOR

    Because states are not required to specify dropout-rate targets under ESEA, they have continued using their SPP targets for improvement. In FFY 2010, 36 states (60%) met their SPP performance target for Indicator 2 and 24 states (40%) missed their target. These are nearly the same proportions as in FFY 2009, in which 35 states met their target and 25 states missed the target.

    In FFY 2010, 44 states had the same performance against their target as they did in FFY 2009—that is, they either met their target during FFY 2009 and FFY 2010, or missed their target during both federal fiscal years.

    Over the years of the SPP, states have generally improved at setting realistic, achievable targets for improvement. Most states’ performance was quite close to the target they had set, regardless of whether they met or missed that target. Only seven states (12%) performed more than five percentage points above or below their target. Figure 4 compares each state’s dropout rate with its target. Note: to meet the target on this indicator, a state must be at or below the target value it specified in its SPP.

    Figure 4


    As illustrated in Figure 5, 34 states (57%) made progress, lowering their dropout rate. The mean amount by which these states lowered their dropout rates was –1.8%, with a median value of –0.8%. This was an improvement over FFY 2009, during which only 17 states made progress. The mean amount of improvement in FFY 2009 was –3.4%, with a median value of –1.2%, so while fewer states made progress in FFY 2009, their progress was greater than that of states in FFY 2010.

    In FFY 2010, 22 states (37%) experienced slippage and saw dropout rates increase. The mean amount of increase in these states was 1.9%, with a median value of 0.9%. In four states (7%), dropout rates remained unchanged from the previous year. In contrast, in FFY 2009, dropout rates increased in 38 states, with a mean increase of 2.3% and a median value of 0.9%.

    Figure 5

    IMPROVEMENT STRATEGIES AND ACTIVITIES

    States were instructed to report the strategies, activities, timelines, and resources they employed in order to improve the special education graduation rate. The range of proposed activities was considerable, though many states described the use of data-based decision making to guide improvement activities and to identify at-risk youth.


    Most states acknowledged the connections between their activities for at least Indicators 1 and 2. Thirty-eight states (63%) reported the same set of activities for both indicators, and another nine states (15%) described some activities that were common to both. Many states clustered at least some, if not all, of their activities for Indicators 1, 2, 4, 13, and 14—indicators intimately tied to secondary transition. In these states, there was a concerted focus on promoting successful secondary transition practices as a means to keep youth engaged in and participating in school-related activities. Additionally, 28 states (47%) reported activities aimed at engaging parents and families in becoming partners in educating their children.

    The use of research-based/evidence-based strategies and interventions as well as “promising practices” around school completion continued among states. Twelve states (20%) mentioned statewide efforts to identify (and subsequently disseminate) effective practices in their Local Education Agencies (LEAs) that focused on school completion. A handful of states described various efforts to develop a toolkit or suite of resources that LEAs could use to develop and support local school completion initiatives.

    There are a number of evidence-based school-completion programs that have demonstrated efficacy for students with disabilities. The IES Practice Guide on Dropout Prevention (Dynarski, et al., 2008) describes several of these approaches to keeping youth in school and discusses the degrees of evidence supporting each. For example, it recommends the diagnostic use of data systems to support a realistic estimate of the number of students who drop out and to help identify individual students at high risk of dropping out. The practice guide also recommends assigning adult advocates to students at risk of dropping out as well as providing academic support and enrichment to improve academic performance. Additional research is under way to evaluate the efficacy of many of the other promising practices that address school completion, so additional evidence-based practices are on the horizon.

    Table 1 lists several commonly described interventions and the number of states reporting their use in the Annual Performance Report (APR).

    Table 1

    Evidence-based and promising practices reported in the FFY 2010 APRs

    Nature of intervention                                        Number of states
    Used research/evidence-based practices                              48
    Response to Intervention                                            44
    Positive Behavior Supports                                          32
    Parental engagement efforts                                         28
    Academic initiatives                                                27
    Vocational education / Career and Technical Education (CTE)         17
    Credit recovery programs                                            11
    Mentoring programs                                                   9
    Recovery/reentry programs                                            6

    Statewide initiatives

    Thirty-seven states (62%) reported that school completion was a state priority, though only 24 (40%) reported that they were developing or implementing any sort of statewide initiative that would impact their graduation, dropout, and/or reentry/recovery rates.

    Georgia

    One statewide initiative continues in the State of Georgia, which has implemented its GraduateFIRST initiative since 2007. The program currently has three cohorts of schools, for a total of 131 schools, all of which have developed and are implementing local school completion initiatives for students with disabilities. One reason for the success of this program is the ongoing support and follow-up provided to each participating school via Georgia’s network of collaboration coaches. The coaches, who were trained by NDPC-SD and State personnel under Georgia’s State Personnel Development Grant (SPDG), are each assigned several schools in which they support the local work, serving as trainers, mentors, content resources, and cheerleaders for the ongoing work. Additionally, the program is described in a brief developed by the Regional Resource Center Program’s Student Performance and Achievement Priority Team, which may be found at http://www.ndpc-sd.org/documents/12.Spotlight_GraduateFirst.pdf.

    Kentucky

    Kentucky is also implementing a statewide initiative focused on school completion. The State’s continuous improvement monitoring process requires every district in which one or more students with disabilities drops out to conduct a root-cause analysis of its data at the district, school, and student levels to identify the cause(s) of the dropout.

    While this effort is focused only on youth with disabilities, the Kentucky Department of Education also developed the Kentucky College and Career Readiness (CCR) delivery plan to address school completion for all students. The plan focuses accountability at the school/district level to increase the rate of students who leave high school ready for college, career, or both. One of the strategies of the CCR delivery plan is the collection and use of data. This has resulted in the development of the Persistence to Graduation Tool, an early warning tool that identifies students who are at risk of dropping out. Accompanying the data tool is a suite of evidence-based practices to address any needs identified in the school.

    Alabama

    Alabama’s First Choice Initiative is a program designed to increase the graduation rate and to improve the post-school outcomes of Alabama youth with and without disabilities. It provides multiple pathways to graduation as well as a variety of safeguards and supports to assist struggling learners. The components of the program are credit recovery, credit advancement (earning credit in non-traditional ways), graduation coaches for at-risk students, and multiple diploma options.

    NDPC-SD intensive states

    In collaboration with NDPC-SD, ten states (AR, BIE, LA, MI, MO, NE, NC, UT, WA, and WV) are currently working on statewide initiatives to improve their school completion rates. SEA and LEA staff in these states are receiving training and technical assistance from NDPC-SD to help them develop model sites for dropout prevention initiatives or to address other state and local needs around school completion, including data-related needs. Additionally, the State of Georgia and the Miami-Dade County Public School District in Florida are continuing the work they initiated with NDPC-SD under its first round of OSEP funding.

    Nebraska

    Several states chose topics related to school completion for the Results portion of their OSEP continuous improvement visits in 2011. Among those states was Nebraska, which was already working intensively with the National Dropout Prevention Center for Students with Disabilities to develop, pilot, and disseminate a toolkit of resources and materials for schools to use in designing and developing local school completion initiatives. Nebraska wanted to leverage its work with NDPC-SD and reengage youth with disabilities who had dropped out of high school. Getting these youth back into educational programs can be an effective strategy for improving their post-secondary outcomes.

    In September 2011, Nebraska held its first stakeholder meeting, at which information about dropout, graduation, reentry/recovery, and other related topics was presented to and discussed with a broad stakeholder group. A product of that meeting was a 4-year strategic plan, which has the goal of developing, piloting, and disseminating (statewide) a reentry program for youth with disabilities in Nebraska.

    Among the strategies Nebraska has chosen to support this goal are:

    1) Increasing awareness at the state and local levels regarding dropout reentry strategies;

    2) Increasing the capacity of current programs focused on dropout prevention to target students with disabilities who have left school but remain eligible for special education;

    3) Developing partnerships with other entities that can have a statewide impact on providing reentry services to students with disabilities; and

    4) Partnering with general education initiatives to increase graduation rates.

    The State has posted information about their efforts and progress on this work at the following link: http://www.education.ne.gov/sped/reentry.html.

    Examples of other improvement activities

    Data-based decision making

    Data-based decision making was a nearly ubiquitous activity, reported in one form or another by 54 states (90%) in this APR. States are examining their school completion data and considering that information when targeting technical assistance to LEAs, awarding LEA improvement grants, looking for effective practices, and identifying topics for professional development.

    Eleven states (18%) described work on an early warning system using their longitudinal data to identify youth who are at risk of dropping out of school. The data being employed include information about students’ attendance, behavior, grade retention, and academic performance on state assessments. In general, states that reviewed this sort of information about their students have experienced success in using it to inform their work. Examples of states that examined such risk and protective factors related to school completion include Alabama, Arkansas, Massachusetts, Michigan, Nebraska, and West Virginia.

    While data-based decision making has a low level of supporting evidence in the educational literature, as discussed in the 2008 IES Practice Guide on Dropout Prevention, the practice is logical and essential for examining the factors within the school environment that contribute to dropout and for diagnosing the extent to which schools will need to implement strategies to address dropping out. In addition, the implementation of any improvement strategy must involve continually returning to the individual student data to monitor the success of the strategy and to adjust approaches as needed. It should also be noted that the dearth of supporting evidence is more a result of the lack of studies that directly evaluate the effect this practice has on keeping youth in school than of any lack of validity.

    As discussed above, while the use of data analysis is critical in identifying areas of need, it is not a strategy or intervention, per se, for keeping youth in school, but rather a tool to support the greater effort. Once the students’ needs have been identified, it is necessary to provide rigorous instruction in academics, career skills, and self-advocacy in order to keep at-risk youth engaged in school and to foster their success.



    Identification of effective practices

    Kansas, Missouri, North Carolina, South Dakota, Tennessee, and Wisconsin were among the ten states that reported efforts to identify and examine the programs being implemented in their LEAs that had graduation rates above the state average. They are working to share these promising practices among the other districts in the state through various means, including websites, communities of practice, newsletters, and conference presentations.

    Eleven states (18%) indicated in their APRs that they are actively engaging in evaluation of their improvement activities to identify those which yield measurable improvements in the desired impact area. The states incorporating evaluation into their improvement activities are Georgia, Hawaii, Iowa, Illinois, Indiana, Kansas, Kentucky, Nebraska, Pennsylvania, South Dakota, and Vermont.

    Reentry programs

    Including Nebraska, six states (10%) described reentry/recovery programs in their APRs. While there are many such programs around the country, most operate on a local level rather than statewide, as Nebraska intends for its initiative, which makes them difficult to locate and identify. Reentry programs may be operating in many states, but because of their local nature, they simply do not get reported in states’ APRs.

    Reentry programs generally involve a school system and a combination of one or more community agencies, businesses or business organizations, colleges or community colleges, or faith-based organizations. Their focus varies, depending on their genesis and the population they serve. One commonality is that reentry programs frequently offer options for credit-recovery—a necessity if the goal is to obtain a high school diploma, as the majority of returning students are credit deficient. Another common characteristic of reentry programs is their flexibility. The needs of the populations they serve are often quite diverse, so flexibility in scheduling, venue for instruction, mode of instructional delivery, and entry/exit from the program are all beneficial elements that help them address their audiences effectively.

    COMMONALITIES AMONG STATES THAT MADE PROGRESS OR MET TARGETS

    Table 2 shows some of the school completion activities states engaged in and indicates whether they made progress from FFY 2009 or achieved their FFY 2010 targets for Indicator 2.

  • Part B SPP/APR 2012 Indicator Analyses-(FFY 2010) Page 25

    Table 2

    Performance of states that engaged in certain activities

    Improvement activity                                  Number of states     Number of states that
                                                          that made progress   met dropout target
    Transition-related activities                                32                   34
    Data-based decision making                                   33                   33
    Indicated graduation & dropout were a priority               22                   25
    Using one or more evidence-based practices                   28                   30
    Statewide initiative related to school completion            15                   16

    Filtering the data using the above criteria leaves eight states (13%) that made progress, met their dropout target, and engaged in all of the activities in the table above.

    CONCLUSIONS AND RECOMMENDATIONS

    The overall quality of states’ APRs for FFY 2010 was the best since the SPP/APR came into existence. States generally provided the required information about their definitions, calculations, and data in a clear form. The descriptions of improvement activities were generally more concise than in years past as well. As more states switch over to using the adjusted cohort rate calculation, it will continue to become easier to quantify states’ improvements and compare progress for the nation overall.

    While Indicators B1 and B2 are performance indicators (as opposed to compliance indicators), in these lean fiscal times, increasing importance is being placed on identifying activities that will improve states’ graduation and dropout rates for students with disabilities. Judging which activities were most beneficial based solely on the brief information contained in the APR is difficult at best. Without knowing the particulars of each activity or intervention, how it was implemented within a state, and some impact data for the activity, there is essentially no way to determine what worked well and what did not.

    To advance the work of improving school completion rates in the nation, more states need to engage in meaningful evaluation of their SPP improvement activities and to report on what worked in particular contexts for their students with disabilities. Information of this nature can benefit other states struggling with similar issues. The Regional Resource Center Program has posted resources to support states in their evaluation of improvement activities at the following URL: http://www.rrcprogram.org/content/view/191/288/.


    INDICATOR 3: ASSESSMENT
    Prepared by NCEO

    INTRODUCTION

    The National Center on Educational Outcomes (NCEO) analyzed the information provided by states for Part B Indicator 3 (Assessment), which includes both participation and performance of students with disabilities in statewide assessments. This indicator also includes a measure of the extent to which districts in a state are meeting the Adequate Yearly Progress (AYP) criterion for students with disabilities under the Elementary and Secondary Education Act (ESEA), as reauthorized by the No Child Left Behind Act (NCLB).

    Indicator 3 information in this report is based on Annual Performance Report data from 2010-2011 state assessments. States submitted their data in February 2012 using baseline information and targets (unless revised) submitted in their State Performance Plans (SPPs) first presented in December 2005.

    This report summarizes data and progress toward targets for the Indicator 3 subcomponents of (a) percent of districts meeting AYP, (b) state assessment participation, and (c) state assessment performance. All information contained in this report is an analysis or summary of state data for a given content area (or overall for AYP) across grades three through eight, and one tested grade in high school. Because states disaggregated data to varying degrees, not all states are represented in all data summaries. For example, some states disaggregated by grade band, or provided only information summed across grades. For AYP, some states provided this information only by content area, which could not be aggregated to an overall AYP rate.

    This report includes an overview of our methodology, followed by findings for each component of Part B Indicator 3 (AYP, Participation, and Performance). We conclude by addressing data slippage and progress as well as state Improvement Activities.

    DATA SOURCES AND MEASUREMENT APPROACHES

    We obtained APRs used for this report from the RRCP Web site in February, March, and April 2012. We entered data into working documents from original APR submissions and then, following the April week of clarification, we verified all data using revised APRs submitted in that month. In instances of disagreement, we used new data from revised APRs for analyses. For the analyses in this report, we used only the information that states reported in their APRs for 2010-2011 assessments.

    Three components comprise the data in Part B Indicator 3:

    • 3A is the percent of districts (based on those with a disability subgroup that meets the state’s minimum “n” size) that meet the state’s Adequate Yearly Progress (AYP) objectives for progress for the disability subgroup.


    • 3B is the participation rate for children with IEPs who participate in the various assessment options (Participation).

    • 3C is the proficiency rate (based on grade-level, modified or alternate achievement standards) for children with IEPs (Proficiency).

    3B (Participation) and 3C (Performance) have subcomponents:

    • The number of students with Individualized Education Programs (IEPs).
    • The number of students in a regular assessment with no accommodations.
    • The number of students in a regular assessment with accommodations.
    • The number of students in an alternate assessment measured against GRADE LEVEL achievement standards.
    • The number of students in an alternate assessment measured against MODIFIED achievement standards.
    • The number of students in an alternate assessment measured against ALTERNATE achievement standards.

    States provided data disaggregated to the level of these subcomponents, which for components 3B and 3C included the two content areas of Reading/English Language Arts and Mathematics. Some states disaggregated only by the specific grade levels tested, or by grade bands. Some states provided these content-specific data both disaggregated by grade and as an overall data point; most states provided only an overall data point.

    For Improvement Activities (IAs), states were directed to describe these for the year just completed (2010-2011) as well as projected changes for upcoming years. The analysis of 2010-2011 Improvement Activities used the OSEP coding scheme consisting of letters A–J, with J being “other” activities. The NCEO Improvement Activities coding process used 11 subcategories under J (“other”) to capture specific information about the types of activities undertaken by states (see Appendix A for examples of each of these sub-categories). These 11 sub-categories were the same as those used to code data from school years 2009-2010, 2008-2009, 2007-2008, and 2006-2007 and only slightly modified from those used to code 2005-2006 data. Consistent with the previous report, we omitted the J12 category in the current analysis. Quality was assured by having the primary coder review with a second reviewer those IAs that were difficult to classify into the categories. The review process addressed specific IAs in individual states, as well as the selection of exemplars for each of the IA categories.


    Percent of Districts Meeting State’s Adequate Yearly Progress Objective (Component 3A)

    Component 3A (AYP) is defined for states as:

    Percent = [(# of districts meeting the State’s AYP objectives for progress for the disability subgroup (i.e., children with IEPs)) divided by (total # of districts that have a disability subgroup that meets the State’s minimum “n” size in the State)] times 100.
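    The component 3A formula above can be sketched in a few lines of code; the district counts used here are hypothetical, not taken from any state’s APR:

```python
# Sketch of the Indicator 3A (AYP) calculation described above.
# The counts below are hypothetical examples.

def percent_districts_meeting_ayp(districts_meeting, districts_with_subgroup):
    """Percent of districts meeting AYP objectives for the disability
    subgroup. Only districts whose disability subgroup meets the state's
    minimum "n" size are counted in the denominator."""
    return 100.0 * districts_meeting / districts_with_subgroup

print(percent_districts_meeting_ayp(93, 150))  # 62.0
```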

    Figure 1 shows the ways in which regular and unique states provided AYP data on their APRs. Seven states indicated that the AYP requirements of ESEA did not apply to them; one regular state indicated that AYP did not apply because that state is a single district. Forty-two regular states and three unique states reported AYP data in their APRs in a way that could be aggregated across states. Seven states provided data broken down only by content area (four states) or by grade level (three states), which could not be aggregated into the overall rate required for Indicator 3A.

    Figure 1

    Ways in Which Regular and Unique States Provided AYP Data


    Of the 49 states with AYP information, 24 met their 2010-2011 targets for AYP and seven states did not, as shown in Table 1. The remaining 18 regular states and all 10 unique states were not included in this met/not met analysis because they did not provide an overall value for baseline data, targets, or 2010-2011 actual data. States that met targets were likely to have had lower than average baseline data and to have set lower than average targets (though average targets were lower than one year ago). Further, targets that were met were set much lower than targets that were not met. Finally, states that met targets reported lower than average actual data, and also reported lower actual data in 2010-2011 in comparison to the previous year. States that did not meet targets were likely to have higher than average baseline data, set higher than average targets, and reported higher than average actual data.

    Table 1

    Percentage of Districts Making AYP in 2010-11 within Regular States and State Entities that Provided Baseline, Target, and Actual Data

                  N    BASELINE (MEAN %)   TARGET (MEAN %)   ACTUAL DATA (MEAN %)
    OVERALL      31         51.8%              37.3%               62.4%
    MET          24         48.5%              26.8%               58.7%
    NOT MET       7         63.3%              73.4%               75.1%
    REGION 1      4         54.3%              48.7%               64.3%
    REGION 2      7         37.7%              25.2%               53.0%
    REGION 3      4         73.5%              39.6%               68.4%
    REGION 4      5         63.4%              63.6%               75.6%
    REGION 5      6         51.8%              37.0%               48.0%
    REGION 6      5         40.6%              17.7%               46.6%

    In five of the RRC regions, 2010-2011 target data were lower than baseline values. On the other hand, the mean actual data values were reported to be higher than targets in all six regions, and higher than average baselines in four regions.

    Figure 2 shows the percentage of districts making AYP in 2010-2011 for the 45 regular states and state entities that provided overall data. We sorted the data by current values and grouped states by whether they reported baseline information we could use in analysis. From a quick glance at the figure, the reader can see a wide range of reported change in values across states with both data points (baselines and targets). Many states (n=19) showed a net increase in the percentage of LEAs making AYP since baseline, while few states (n=9) showed a net decrease in the percentage of LEAs making AYP since baseline. The range in values was from 0% to 100%, with most states reporting more than 50% of districts within the state making AYP (28 states); 15 states reported less than 50% of districts making AYP.

    Figure 2

    Change in the Proportion of Districts Meeting AYP Since Baseline

    Figure 3 shows progress and slippage data and the wide range in these across states.

    Forty-five regular and unique states reported overall information for AYP in 2009-2010 and 2010-2011 that was used in progress/slippage comparisons. Of these, 25 states showed progress, ranging from 1% to 98.3%, with an average of 34.8% and a median of 31.2%; most states' progress fell between 1% and about 66%. Nine states experienced slippage, ranging from 10.7% to 62.8%, with an average of 25.6% and a median of 17.5%; most states' slippage fell between about 11% and 45%. Only two states with data for both 2010-2011 and 2009-2010 experienced no change in AYP across the two years. It appears that recent progress from 2009-2010 to 2010-2011 is responsible for much of the change from baseline shown in Figure 2.

    Figure 3

    Percentage of Progress or Slippage for AYP in Regular and Unique States from 2009-10 to 2010-11

    Note: AYP does not apply to eight states; these states are included in the ‘No Change’ states.

    PARTICIPATION OF STUDENTS WITH DISABILITIES IN STATE ASSESSMENTS (COMPONENT 3B)

    The participation rate for children with IEPs includes children who participated in the regular assessment with no accommodations, in the regular assessment with accommodations, in the alternate assessment based on grade-level achievement standards, in the alternate assessment based on modified achievement standards, and in the alternate assessment based on alternate achievement standards. Component 3B (participation rates) was calculated by obtaining a single number of assessment participants and dividing by the total number of students with IEPs enrolled, or by summing several numbers and then computing percentages as shown below:

    Participation rate numbers required for equations are:

    a. # of children with IEPs in assessed grades;
    b. # of children with IEPs in regular assessment with no accommodations (percent = [(b) divided by (a)] times 100);
    c. # of children with IEPs in regular assessment with accommodations (percent = [(c) divided by (a)] times 100);
    d. # of children with IEPs in alternate assessment against grade-level achievement standards (percent = [(d) divided by (a)] times 100);
    e. # of children with IEPs in alternate assessment against modified achievement standards (percent = [(e) divided by (a)] times 100); and
    f. # of children with IEPs in alternate assessment against alternate achievement standards (percent = [(f) divided by (a)] times 100).

    In addition to providing the above numbers, states also were asked to account for any children included in ‘a’ but not included in ‘b’, ‘c’, ‘d’, ‘e’, or ‘f’.
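The participation-rate arithmetic above can be sketched in a few lines. All counts and variable names below are hypothetical, used only to illustrate the computation; they are not drawn from any state's APR.

```python
# Sketch of the Component 3B participation-rate arithmetic.
# All counts are hypothetical, not taken from any state's APR.
counts = {
    "a_enrolled": 10000,           # (a) children with IEPs in assessed grades
    "b_regular_no_accomm": 4200,   # (b) regular assessment, no accommodations
    "c_regular_accomm": 3900,      # (c) regular assessment with accommodations
    "d_alt_grade_level": 300,      # (d) alternate, grade-level standards
    "e_alt_modified": 700,         # (e) alternate, modified standards
    "f_alt_alternate": 600,        # (f) alternate, alternate standards
}

total = counts["a_enrolled"]

# Per-category percentage: each count divided by (a), times 100.
percentages = {k: v / total * 100 for k, v in counts.items() if k != "a_enrolled"}

# The overall participation rate sums categories (b) through (f).
participants = sum(v for k, v in counts.items() if k != "a_enrolled")
participation_rate = participants / total * 100   # ~97.0 with these counts

# Children counted in (a) but in none of (b)-(f) must be accounted for separately.
unaccounted = total - participants                # 300 with these counts
```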

    Thirty-eight regular and nine unique states provided data for student participation on statewide reading assessments for students with disabilities in 2012 APRs. In this section, data and text will focus on participation in reading assessments; data for math assessments were nearly identical. The average participation rate on 2010-2011 assessments across all states (with sufficient data) was 96.86%. One regular state reported a participation rate of 100%. Thirteen additional states reported participation rates of 99.0% or more. Twenty-eight regular states, and five unique states, reported participation rates between 95.0% and 98.9%.

    Table 2 shows the percentage of students with IEPs participating in large-scale assessment in reading in 2010-2011 for 47 regular and unique states that provided baseline, target, and actual data. Thirty-six states met the targets they set for participation; 11 states did not meet their targets. States that met their targets for this indicator reported actual data that, on average, met targets and surpassed baseline data. States that did not meet their targets had actual data that did not meet target values but did surpass baselines.


    Table 2

    Percentage of Students With Disabilities Participating in Large-Scale Assessment Within Regular and Unique States that Provided Baseline, Target, and Actual Data

                  N    BASELINE    TARGET      ACTUAL DATA
                       (MEAN %)    (MEAN %)    (MEAN %)
    OVERALL      47    97.5%       98.0%       96.9%
    MET          36    96.9%       95.9%       98.5%
    NOT MET      11    82.1%       96.1%       91.5%
    REGION 1      4    97.9%       98.8%       98.3%
    REGION 2      7    95.6%       95.7%       98.1%
    REGION 3      8    97.6%       97.3%       98.9%
    REGION 4      6    98.2%       95.6%       98.1%
    REGION 5     10    96.7%       96.3%       98.4%
    REGION 6     12    91.1%       94.0%       92.4%

    In five of the six RRC regions, actual 2010-2011 data for the states included was higher than baseline values, and in four of these regions, actual data was higher than targets. This finding was identical to last year's. The states in Region 4, on average, experienced a drop in the percentage of students participating in statewide assessment compared to last year. The states in Region 5 reported actual performance (98.4%) identical to last year's, and the other regions demonstrated increases in actual data values. For the most part, states have made progress toward 100% participation in large-scale assessment for students with disabilities, as shown in Figure 4. Since the time states set baseline values, 26 states have made progress toward 100% participation for students with disabilities, one has seen no change, and 21 have seen a decrease in participation. Eight states did not report baseline information. Eight states have seen participation increase by more than 5 percentage points since baseline, to a maximum of 48.4 percentage points for unique states and 12.6 percentage points for regular states. Six states have seen their increase in participation since baseline push total rates from less than 95% to more than 95%.


    Figure 4

    Change in the Participation of Students with Disabilities in Large-Scale Assessment Since Baseline Within Regular and Unique States

    Figure 5 shows progress and slippage in participation rates. Fifty-two regular and unique states reported overall information for student participation in 2009-2010 and 2010-2011 used in progress/slippage comparisons. Of these 52 states, 27 showed progress, ranging from 0.02% to 32.9%, with an average of 3.8% progress and a median of 0.5%. Most states' progress ranged between 0.1% and 12%. Twenty states experienced slippage, ranging from 0.04% to 2%, with an average of 0.5% and a median of 0.3%. Excluding two apparent outliers, most of these states' slippage ranged between about 0.1% and 2%. Five states with sufficient data experienced no change in student participation across the last two years; the remaining eight of the 60 regular and unique states were missing data for one or both years. There was little change in progress and slippage since 2009-2010, when 26 states showed progress and 21 showed slippage.
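The progress/slippage classification used throughout these comparisons is simply a year-over-year difference in rates. A minimal sketch, using hypothetical state names and participation rates:

```python
# Classify states by change in participation rate across two years.
# All state names and rates here are hypothetical.
rates = {
    "State A": (96.5, 97.0),  # (2009-10 rate, 2010-11 rate)
    "State B": (98.2, 98.0),
    "State C": (95.0, 95.0),
}

progress, slippage, no_change = [], [], []
for state, (prior, current) in rates.items():
    diff = round(current - prior, 2)  # percentage-point change
    if diff > 0:
        progress.append((state, diff))    # progress: rate increased
    elif diff < 0:
        slippage.append((state, diff))    # slippage: rate decreased
    else:
        no_change.append(state)           # no change across the two years
```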


    Figure 5

    Percentage of Progress or Slippage for Student Participation in Large-Scale Assessment within Regular and Unique States

    PERFORMANCE OF STUDENTS WITH DISABILITIES ON STATE ASSESSMENTS (COMPONENT 3C)

    State assessment performance of students with IEPs comprises the rates of those children achieving proficiency on the regular assessment with no accommodations, the regular assessment with accommodations, the alternate assessment based on grade-level achievement standards, the alternate assessment based on modified achievement standards, and the alternate assessment based on alternate achievement standards. The calculation of the proficiency rate component (3C) of Indicator 3 includes the computation of the following rates:

    Proficiency Rate numbers required for equations are (Full academic year students only):

    a. # of children with IEPs in assessed grades;
    b. # of children with IEPs in assessed grades who are proficient or above as measured by the regular assessment with no accommodations (percent = [(b) divided by (a)] times 100);
    c. # of children with IEPs in assessed grades who are proficient or above as measured by the regular assessment with accommodations (percent = [(c) divided by (a)] times 100);
    d. # of children with IEPs in assessed grades who are proficient or above as measured by the alternate assessment against grade-level achievement standards (percent = [(d) divided by (a)] times 100);
    e. # of children with IEPs in assessed grades who are proficient or above as measured by the alternate assessment against modified achievement standards (percent = [(e) divided by (a)] times 100); and
    f. # of children with IEPs in assessed grades who are proficient or above as measured by the alternate assessment against alternate achievement standards (percent = [(f) divided by (a)] times 100).
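As with participation, the overall proficiency rate sums categories (b) through (f) and divides by (a). A small sketch with hypothetical counts:

```python
def proficiency_rate(proficient_counts, enrolled):
    """Component 3C sketch: percent of full-academic-year children with IEPs
    in assessed grades (a) who scored proficient or above, summed across the
    five assessment types (b) through (f)."""
    return sum(proficient_counts) / enrolled * 100

# Hypothetical proficient counts for categories (b)-(f), and enrollment (a).
rate = proficiency_rate([1500, 1300, 120, 260, 200], enrolled=10000)
```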

    Regular and unique states reported 2010-2011 assessment proficiency data in varying forms. Thirty-three states reported data for reading proficiency, and thirty-two reported math proficiency data. Because the proficiency sub-indicator differed between content areas, separate analyses were completed for each and are presented in this section.

    Reading

    Forty-six regular and unique states provided data for proficiency on statewide reading assessments for students with disabilities in 2012 APRs. Across these states, 2010-2011 reading proficiency rates ranged from 1.6% to 76.0%. Thirteen states reported proficiency rates of less than 25%, for an average of 15.5%. The largest group of states (n=21) reported proficiency rates between 25% and 50%; their average was 38.0%. Twelve states reported student proficiency rates of more than 50%, for an average of 60.9% per state. The overall average proportion of the states' students with disabilities who reached or exceeded proficiency in 2010-2011 was about 38%.

    Table 3 shows the percentage of students with IEPs scoring as proficient on large-scale reading assessments in 2010-2011 for 33 regular and unique states that provided baseline, target, and actual data. Across these states, the average rate of proficiency for students with disabilities has increased by 4.8 percentage points since baseline; however, current performance averages 18.5 percentage points below the states' mean target. Six states met or surpassed their targets, and 27 states did not meet their target for this sub-indicator in reading. States achieving performance targets in reading had a higher average baseline value and a higher average in actual data (from the 2010-2011 school year) than states that did not meet targets. States that did not meet targets had set more challenging targets than states that did meet targets. This distinction in target-setting is also evident in the gap between the overall target mean and the actual 2010-2011 reading performance average; the target is nearly 20 percentage points higher than this year's performance, on average.


    In five of the six RRC regions, average actual 2010-2011 proficiency rates for states with sufficient data were higher than baseline values, though below average targets in all regions. The states in Region 2 experienced, on average, a drop in the proportion of proficient students from baseline to the current year's performance. In four of the six regions, half or fewer of the region's states reported sufficient data for computation.

    Table 3

    Average Reading Proficiency Percentages in 2010-11 for Regular and Unique States that Provided Baseline, Target, and Actual Data

                  N    BASELINE    TARGET      ACTUAL DATA
                       (MEAN %)    (MEAN %)    (MEAN %)
    OVERALL      33    33.1%       56.4%       37.9%
    MET           6    36.8%       34.9%       41.6%
    NOT MET      27    32.3%       61.2%       37.1%
    REGION 1      3    29.3%       56.8%       33.6%
    REGION 2      4    53.0%       56.9%       41.1%
    REGION 3      6    39.5%       62.4%       49.6%
    REGION 4      4    33.3%       59.2%       43.1%
    REGION 5      5    45.0%       67.4%       50.4%
    REGION 6     11    17.9%       46.9%       23.9%

    Forty-one states provided data for both the baseline year and 2010-2011. As shown in Figure 6, 10 of these states showed slippage from baseline, for an average decrease of 11.0 percentage points. Thirty-one states showed progress between baseline and the 2010-2011 school year, for an average of 9.8 percentage points. Nine states reported progress of at least 10 percentage points over that period; these states reported an average gain in the reading proficiency rate for students with disabilities of 16.5 percentage points.


    Figure 6

    Change in the Proficiency of Students with Disabilities in Large-Scale Reading Assessment Since Baseline within Regular and Unique States

    Forty-six of the regular and unique states reported overall information for student reading proficiency in 2009-2010 and 2010-2011 that could be used in progress/slippage comparisons. Figure 7 shows these data and the narrow range of movement seen across states. The highest degree of slippage was about 11%, and the highest degree of progress was 14%. The 19 states with slippage showed an average decrease of 3.2%. One state showed no change in the percentage of students with disabilities scoring as proficient on its statewide reading assessment. The 26 states showing progress reported an average increase of 4.0%.


    Figure 7

    Percentage of Progress or Slippage for Student Proficiency in Large-Scale Reading Assessment within Regular and Unique States

    Math

    Forty-six states provided proficiency data for students with disabilities participating in the statewide mathematics assessment in 2010-2011. Across these states, 2010-2011 math proficiency rates ranged from 2% to 73%. The overall average proportion of the states' students with disabilities who reached or exceeded proficiency in 2010-2011 was about 35.7%. Thirteen states reported proficiency rates of less than 25%, for an average of 14.4%. The largest group of states (n=24) reported proficiency rates between 25% and 50%; their average was 37.7%. Nine states reported student proficiency rates of more than 50%, for an average of 61.3% per state.

    Table 4 shows the percentage of students with IEPs scoring as proficient on large-scale math assessments in 2010-2011 for 32 regular and unique states that provided baseline, target, and actual data. Across these states, the average rate of proficiency for students with disabilities has increased by 4.8 percentage points since baseline; however, current performance averages 17.5 percentage points below the states' mean target. Four states met or surpassed their targets, and 28 states did not meet their target for this sub-indicator in math. States achieving performance targets in math had a higher average baseline value and a higher average in actual data (from the 2010-2011 school year) than states that did not meet targets. States that did not meet targets had set more challenging targets than states that did meet targets. This distinction in target-setting is also evident in the gap between the overall target mean and the actual 2010-2011 math performance average; the target is 17.5 percentage points higher than this year's performance, on average.

    In five of the six RRC regions, average actual 2010-2011 proficiency rates for states with sufficient data were higher than baseline values, though below average targets in all regions. The states in Region 2 experienced, on average, a drop in the proportion of proficient students from baseline to the current year's performance. In four of the six regions, half or fewer of the region's states reported sufficient data for computation.

    Table 4

    Average Mathematics Proficiency Percentages in 2010-11 for Regular and Unique States that Provided Baseline, Target, and Actual Data

                  N    BASELINE    TARGET      ACTUAL DATA
                       (MEAN %)    (MEAN %)    (MEAN %)
    OVERALL      32    31.5%       53.8%       36.3%
    MET           4    38.3%       36.8%       45.6%
    NOT MET      28    30.5%       56.2%       35.0%
    REGION 1      3    27.7%       53.7%       31.7%
    REGION 2      4    44.3%       57.4%       41.6%
    REGION 3      6    39.0%       62.9%       51.2%
    REGION 4      4    37.0%       57.1%       46.4%
    REGION 5      5    37.8%       63.9%       44.5%
    REGION 6     10    17.6%       40.5%       18.6%

    Forty-one states provided data for both the baseline year and 2010-2011. As shown in Figure 8, 12 of these states showed slippage from baseline, for an average decrease of 10.1 percentage points. Twenty-nine states showed progress between baseline and the 2010-2011 school year, for an average of 10.7 percentage points. Twelve states reported progress of at least 10 percentage points over that period; these states reported an average gain in the proficiency rate for students with disabilities of 16.1 percentage points.


    Figure 8

    Change in the Proficiency of Students with Disabilities in Large-Scale Mathematics Assessment Since Baseline within Regular and Unique States

    Forty-five of the regular and unique states reported student math proficiency data in 2009-2010 and 2010-2011 that could be used in progress/slippage comparisons. Figure 9 shows these data. The 14 states with slippage showed an average decrease of 7.4 percentage points. The 27 states with progress reported an average increase of 3.8 percentage points. For the most part, differences across states in progress and slippage were small. Two exceptions to this general pattern of little change were an instance of slippage of 44.4 percentage points and an instance of progress of 25.5 percentage points; excluding these, the slippage-to-progress range would be about 28 percentage points.


    Figure 9 Percentage of Progress or Slippage for Student Proficiency in Large-Scale Mathematics

    Assessment within Regular and Unique States

    IMPROVEMENT ACTIVITIES

    The task for NCEO in presenting the improvement activities (IAs) for 2010-2011 was defined in the same way as it had been for 2008-2009 and 2009-2010. Rather than reporting on all IAs from all state APRs using various quantitative methods of analysis – the approach used in the past – NCEO reported in a qualitative manner on a subgroup of selected IAs that best fit the OSEP definition of each IA category. In the process of identifying IAs from various states, NCEO coders observed some issues or themes across the selected IAs. These observations are commented on here.

    Analysis Procedures

    The review of the APRs for improvement activities (IAs) followed the OSEP categories A through I and J1 through J11. One coder and one assistant coder from NCEO were involved in this process. First, we did a thorough read-through of all of the Indicator 3 Improvement Activities sections in state APRs. We identified IAs that represented the


    various types defined by the OSEP categories. On completion of this review, we made decisions about which states’ IAs would be identified to represent each category. Some decision rules the NCEO coders followed in selecting IA examples to represent the categories were:

    1. Identified IA examples that best fit the OSEP definition of each category.
    2. Sought to identify IA examples from as many states as possible.
    3. Attempted to draw out IA examples in APRs from states throughout all six regions of the U.S., as specified by OSEP in the Regional Resource Center Program.
    4. Selected no more than one IA category example from any one state, excluding instances where states had individual IAs that fit into multiple categories.

    The first decision rule was facilitated by requiring agreement between the two raters’ reviews of the identified IAs, and by requiring that the data demonstrate representation of various aspects of each IA category. The second rule resulted in IAs being drawn from 18 different states and entities, out of the 50 regular states and 10 unique state entities. The third rule yielded IAs from fairly similar numbers of states in each region (mean=3.5, median=3, range=2-5 states per region). The fact that two regions each had five IAs identified in their states and state entities may be attributable to: a) one region having unique state entities and the largest overall number of states and state entities, and b) the other region having the second-largest overall number of states. The final decision rule resulted in exactly 18 states or state entities representing 20 improvement activities. There were 16 instances in which IAs were coded in more than one category. The findings of the improvement activities review are exhibited in Appendix A.

    Themes in APR Improvement Activities

    In reviewing the APRs, the coders noticed some aspects of the text of the IAs that may serve as overall themes for how states wrote their IAs. Some of these themes were observed and detailed in previous years’ reports from NCEO to OSEP, and some seemed unique to the APRs covering 2010-2011. The four themes are described below.

    Technology

    States continued to expand their applications of technology. It appears that more use is being made of detailed student data for decision making at the state level. States were also working to align professional development offerings with needs identified in statewide and regional data. School district personnel were increasingly encouraged and trained to make use of local data to pinpoint improvement efforts, including professional development activities. In many cases states were making recommendations to the local level based on state analyses of local data. While not


    new, descriptions of these activities seem to be more common across reports each year.

    Professional Development

    In the area of professional development, communities of practice were a growing trend, with states seeking to encourage sharing of best practices among educators. Using the Internet for training (webinars) and document-sharing has become the norm. Increasingly, states were noting the use of online curriculum for students. Data-driven decision making also was a continuing theme. Although this began as an approach related to interventions for students, several states noted that they are aligning professional development programming with needs identified in analyses of regional or statewide data. Several states noted that they are using surveys to assess issues of policy and practice. This trend seems likely to continue, given the increasing availability of inexpensive online survey tools.

    Help for Students

    In activities directed at students, growth in assistive technology options may be behind increased reports from several states about improvement activities related to assistive technology banks, assistive technology training, and enhanced services for blind or deaf students. Programs and services specifically addressing transition to postsecondary education and careers also appeared to be on the increase. Some states noted that they had instituted exam preparation programs to help students with disabilities prepare for standardized tests.

    Assessments and Standards

    Trends were also apparent in states’ approaches to assessment. Adaptive online testing had been instituted by additional states for 2010-2011 assessments. Several states noted they had developed test item banks. Exam preparation programs, noted above, are also related to this theme.

    Fewer states seemed to be going it alone on standards and assessments. More states were participating in national/regional consortia or reported partnering with another state or looking at other states’ approaches. In addition, several additional states have adopted or plan to adopt the Common Core State Standards.

    CONCLUSIONS AND RECOMMENDATIONS

    State reports of assessment data and AYP data showed a wide range of slippage and progress across all states, and state explanations of these changes were similarly variable. It appears that states are focusing on issues that they thought were partially responsible for current rates of AYP, participation, or performance. In general, AYP rates within states appeared to have shifted downward, by significant margins in some states. Participation rates, on the other hand, appeared to have leveled off and are quite similar for reading and mathematics. As for performance, it appeared that many states were making gains on an annual or nearly annual basis, and data points for 2010-2011 were typically higher than state baseline values across most states and RRC regions. On average, there was a difference in student performance between content areas, with states reporting somewhat higher proficiency rates in reading than in math. States did, however, appear to make more progress in mathematics than in reading in 2010-2011. As states continue to tackle assessment issues with prescribed improvement activities and high participation rates, it is quite possible that increases in performance will continue. At the same time, states’ abilities to meet increasingly challenging AYP targets were waning during 2010-2011, and targets may need to be re-evaluated.


    Appendix A. State Improvement Activities: Examples by Category

    Description (Category Code) State Examples

    Improve data collection and reporting – improve the accuracy of data collection and school district/service agency accountability via technical assistance, public reporting/ dissemination, or collaboration across other data reporting systems. Developing or connecting data systems. (A)

    Improve data collection procedures for 616 and 618 reporting purposes. Collaboration continues with FSAIS and RP&E for assessment data. Activities were conducted during testing week in May 2011 to verify student participation and the administration of selected accommodations. Additionally, GDOE utilizes the Ready Results database to review and analyze assessment data. (GU)

    Improve systems administration and monitoring – refine/revise monitoring systems, including continuous improvement and focused monitoring. Improve systems administration. (B)

    Self-Assessment/Monitoring:

    The monitoring system in place during 2010-2011 was aligned with the SPP indicators. The system linked compliance, data and programs and services by requiring districts to review compliance in areas related to SPP indicators and to examine their data compared to state targets.

    Each district identified for monitoring had completed a self-assessment, reviewing its state assessment performance and participation rates against the annual state SPP targets, completing a protocol to identify needs for continuous improvement in curriculum and instruction, and reviewing compliance requirements related to participation in state assessments. The protocol for state assessment was adapted from a document used