Worthy Goals, Limited Success: Intervention Programs in California

© Copyright 2007 by EdSource, Inc.

REPORT ● FEBRUARY 2007

EdSource® is a not-for-profit 501(c)(3) organization established in California in 1977.

Independent and impartial, EdSource strives to advance the common good by developing and widely distributing trustworthy, useful information that clarifies complex K–12 education issues and promotes thoughtful decisions about California’s public school system.

WHEN CALIFORNIA POLICYMAKERS officially embraced school accountability in the spring of 1999, the central goal was an across-the-board increase in academic achievement in the state’s K–12 public schools. The Public Schools Accountability Act (PSAA) created a system of summarizing performance on state standardized tests and ranking schools accordingly, with monetary awards granted for meeting improvement targets. Policymakers also included an intervention program to help schools that were not meeting state goals.

California’s test scores have improved substantially over the last several years, but state-led interventions’ overall contribution to this improvement has been slight, according to official evaluations. On the one hand, they have provided about $1.3 billion in extra state revenues to low-performing schools, at the least giving them a chance to examine their own performance. On the other hand, the evaluations to date have shown more problems than progress. Despite this, the intervention concept remains a linchpin in the state’s standards-based reform efforts, with intervention programs affecting more than a quarter of California’s 9,000 public schools and nearly the same proportion of school districts.

In attempting to get it right, California has continually modified policies and processes for interventions since the PSAA was passed. In addition, the federal government has had its own accountability requirements and intervention strategies, particularly since the enactment of No Child Left Behind (NCLB) in 2002. Although the state and federal programs emphasize similar academic skills, they require a different approach to measuring performance and thus determining which schools need help. The result has been a tangle of programs, each with its own acronym, eligibility criteria, funding, and timeline.

In addition, the varied requirements of the programs create confusion, particularly because hundreds of schools are in multiple programs at the same time. Further, the rate of improvement in California simply does not match the pace at which federal performance targets rise under NCLB. These targets escalate until 2013–14, when all students are expected to be proficient on state academic content standards in English and math. At that point, many observers expect that without some change in policy, almost all California schools and districts that receive funding under NCLB will be in Program Improvement, the federal intervention program.

This report looks at the history of the state’s intervention programs and reviews the evaluations that have been completed. It contrasts those programs with federal requirements, including newly created interventions for school districts. And it looks to the future, providing data regarding the potential impact of NCLB throughout California and highlighting the state’s newest school improvement program.

California policymakers create a series of intervention programs

The intent of the PSAA was to create greater incentives for schools to increase student learning, as measured by results of the state’s standardized tests.



The law created a method of summarizing test score results into one number—an Academic Performance Index (API) score. Based on API scores, the state divides schools into 10 performance levels (deciles) and ranks them from 1 (lowest) to 10 (highest). The state also uses API scores to set annual improvement targets for schools and student subgroups, which are based on ethnicity and status as disabled, socioeconomically disadvantaged, or English learner. By publicizing API scores and decile rankings, policymakers try to pressure schools to improve toward specified achievement levels.
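
To make the ranking mechanics concrete, here is a minimal sketch in Python. It is an illustration only, not CDE’s published procedure; in practice the state ranks elementary, middle, and high schools separately.

    # Illustrative sketch: sort schools by API score and split them into ten
    # equal-sized groups, ranked 1 (lowest) to 10 (highest). Not CDE's code.
    def api_decile_ranks(api_scores):
        ordered = sorted(api_scores, key=api_scores.get)  # lowest API first
        n = len(ordered)
        return {school: (i * 10) // n + 1 for i, school in enumerate(ordered)}

    ranks = api_decile_ranks({"A": 480, "B": 620, "C": 745, "D": 810, "E": 655,
                              "F": 530, "G": 700, "H": 590, "I": 770, "J": 845})
    print(ranks["A"], ranks["J"])  # 1 10 -> lowest scorer lands in Decile 1, highest in Decile 10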

Eligibility for the state’s intervention programs is based on schools’ API scores. Over time, these programs have evolved and continue to be modified periodically. The first program—the Immediate Intervention/Underperforming Schools Program (II/USP)—was initiated in 1999. Two years later, legislators created School Assistance and Intervention Teams (SAITs) to aid II/USP schools that were not improving, along with schools that struggled to improve in a second program, the High Priority Schools Grant Program (HPSGP). The latter, authorized in 2001, was similar to II/USP in many ways but different in the school selection process, some program particulars, and funding.

On the whole, the schools in intervention programs under PSAA have served disproportionate numbers of students who are Latinos, African Americans, English learners, and/or from low-income families. Thus a broader intention behind the intervention programs has been to help close the achievement gap between these students and their white or Asian, English-fluent, and higher-income counterparts.

The state’s first intervention program (II/USP) under PSAA was voluntary

The Immediate Intervention/Underperforming Schools Program (II/USP) was the first major effort to intervene in schools subsequent to the state adoption of content standards, which specify what students are to know and be able to do in each subject and/or grade. The program provided funding for an external evaluator to work with the school community during a “planning year” to assess needs and develop an action plan for reform. Once the plan was developed, the state provided funding to implement the plan over two to three years. Varying levels of success, as measured by performance results, met with different consequences—but a school that made no progress faced state sanctions.

Three cohorts of 430 schools each were selected for II/USP from 1999 through 2001. No new cohorts have been authorized by legislation since then. Schools that were in the bottom half of the API rankings and did not meet their API growth targets in the prior year were eligible for the program. The California Department of Education asked districts whether eligible schools wished to volunteer to participate.

II/USP provided funds to carry out an action plan designed with the help of an external evaluator

Once a school was selected, its district received $50,000 to contract with an external evaluator, chosen from a list of state-approved entities and individuals. Evaluators tended to be private consulting groups, county offices of education, or persons affiliated with research organizations or universities. State policy dictated that the evaluator could not already be working for the district. The evaluator was supposed to be an outside, neutral party who was to identify weaknesses under the assumption that these schools would not see themselves clearly. Working with the school site council or a team of staff and community members, evaluators were to develop an action plan describing barriers to improvement and strategies to overcome them. The action plan also had to include an examination of student achievement data broken down by student subgroups.

Inside This Report

Evaluations of State Intervention Programs Find Little Overall Effect on Student Achievement ...... 5
California Layers the Federal Accountability Systems on Top of the State’s Approach ...... 8
Federal Requirements Force the State To Create Interventions for School Districts ...... 11
California’s Quality Education Investment Act Embodies a New Approach to Interventions ...... 15
On the Horizon: What Can California Expect Next? ...... 15

This report was written by: Brian Edwards, Susan Frey, Mary Perry
Researched by: Brian Edwards, Noli Brazil

EdSource thanks the William and Flora Hewlett Foundation for its investment in our core work.

Williams lawsuit also aimed to close the achievement gap

In addition to intervention programs under the Public Schools Accountability Act, legislators have responded to a lawsuit filed on behalf of students in low-performing schools. The settlement of Williams v. California in 2004 included accountability measures (such as empowering county superintendents to intervene in the lowest-performing schools) and about $1.2 billion to make facilities repairs, buy textbooks, create a statewide facilities inventory, and continue the High Priority Schools Grant Program (described on page 4).


Evaluators were usually strangers to the school personnel and had no decision-making power at the school. Although some evaluators formed good working relationships with their schools, others spent little time with them. There were a few reports of evaluators working with multiple schools and not tailoring action plans to individual schools. The external evaluator’s work was expected to conclude at the end of the planning year, and implementation of the action plan was left to the school.

After the State Board of Education approved the action plan at the end of the planning year, schools received $200 per pupil (or $50,000, whichever was greater) for each of two years to implement the plan. The state tried to send these funds well before the beginning of the school year to facilitate schools’ implementation efforts. Participating districts were required to match the implementation grant to bolster improvement efforts and demonstrate “buy-in” to the II/USP process.
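
As a quick illustration of that grant rule (hypothetical enrollments; the function name is ours):

    # II/USP implementation grant: $200 per pupil or $50,000, whichever is
    # greater, per implementation year. The $50,000 floor binds below 250 pupils.
    def iiusp_implementation_grant(enrollment):
        return max(200 * enrollment, 50_000)

    print(iiusp_implementation_grant(180))  # 50000 -> small school hits the floor
    print(iiusp_implementation_grant(250))  # 50000 -> break-even enrollment
    print(iiusp_implementation_grant(900))  # 180000 -> per-pupil rate governs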

II/USP schools that met all their targets in both “implementation years” graduated from the program and no longer received implementation funding. On the other hand, if a school did not meet its growth targets in the first year of implementation, its district governing board was required to intervene in some way, such as reassigning personnel or amending site-specific collective bargaining agreements if any existed. If a school did not meet its growth targets in either year but made “significant growth”—defined by the state board as improving by one API point in either year—the school received a third year of implementation funding.

Legislators created SAITs as an alternative to more severe sanctions

Under the original PSAA, a school that did not make significant growth faced serious sanctions. The State Superintendent of Public Instruction (SPI) was to assume all rights and responsibilities for the school from the governing district and could reassign the principal. Furthermore, the PSAA required the SPI to choose at least one other serious sanction from a list that included closing the school. Some policymakers saw II/USP as not giving struggling schools enough time to improve and as being too punitive.

In 2001—before any schools faced state takeover—policymakers in Sacramento amended the statute to allow for another option: the assignment of a school assistance and intervention team (SAIT). These teams are supposed to be composed of people who are experts in helping struggling schools. Many teams come from county offices of education or consortia of county offices, but there are also university-affiliated groups, research organizations, or private consultants, some of which operate for profit. (To see the state’s list of SAITs, go to: www.cde.ca.gov/ta/lp/sm/saitproviders.asp)

SAITs have 60 days to complete a report outlining corrective actions, which the local school board must then adopt within the next 30 days. Under the SAIT approach, an intervention team recommends corrective actions after it assesses the school’s implementation of “essential program components,” a set of nine factors the state considered necessary for improvement under the state accountability system. The components vary somewhat by grade level. (See the box on page 4 for a description of the nine components.)

In 2001 a total of 24 II/USP schools failed to achieve significant growth and were deemed “state-monitored” schools. Each was assigned a SAIT. The state did not take over any of these schools. As of November 2006, 253 II/USP schools had been assigned SAITs, and 109 of them had successfully exited the SAIT process.

SAITs are funded with a combination of state and district dollars. Both the state and district contribute to SAIT members’ salaries. In addition, the state provides $150 per pupil to implement the intervention team’s recommendations, and districts are required to match these funds with money or in-kind contributions.

SAITs have played an important role in II/USP and in the High Priority Schools Grant Program, California’s second-generation intervention program under PSAA.

The federal Comprehensive School Reform Demonstration program predated state efforts

The federal Comprehensive School Reform Demonstration program (CSRD) started one year before California created the Immediate Intervention/Underperforming Schools Program (II/USP). CSRD’s approach was to provide schools with extra resources so they could adopt established whole-school reform models. CSRD initially provided the state with enough funding for 80 schools to assess their needs and select a reform model from a federally approved list. The implementation funding was for three years.

The CSRD application process and programmatic requirements were similar to action plan requirements for II/USP. Given the similarities between the state and federal programs, California considered the initial 80 CSRD schools as part of II/USP. This subset of CSRD schools had to fulfill the requirements of both the federal and state programs.

The state continued receiving funding through CSRD, later renamed Comprehensive School Reform. Altogether 331 schools have received $238 million funneled through various state programs. The federal government discontinued the program in 2006.


The High Priority Schools Grant Program addresses some of II/USP’s weaknesses

In 2001, when II/USP was in its third year of implementation, state policymakers established the High Priority Schools Grant Program (HPSGP) as part of PSAA. (The SAIT concept was part of the same legislation.) The two intervention programs have much in common. Both provide supplemental resources for schools to implement a plan to raise student academic achievement. They both hold participating schools accountable for results. Both require schools to involve the community. And should HPSGP schools not achieve specified levels of growth on the API, they too face state sanctions, including state takeover or being required to work with a SAIT.

But the new program differs from II/USP in several ways, including the school selection process, some program particulars, and the amount of funding granted to schools.

High Priority Schools Grant Program differs from II/USP in the details

One difference between the programs is which schools participate. In HPSGP, schools in API deciles 1–5 are technically eligible for assistance; but priority for funding goes to schools with the lowest API scores. That has meant that primarily Decile 1 schools—those in the bottom 10% of the API rankings—have participated.

Although both II/USP and HPSGP are voluntary, the latter program puts more pressure on eligible schools to participate. If a school is “invited” to apply to HPSGP, the local governing board must discuss at a public meeting reasons to accept or deny the invitation and explain what will be done for the school if the board decides not to apply.

The two programs also differ in their planning requirements. HPSGP allows schools to use existing plans to serve as their action plans and enter implementation mode immediately—as long as the plan addresses a multitude of program requirements. That is in contrast to II/USP, which required schools to take a year to draw up a separate action plan. Part of the reason for the difference is that schools already in II/USP or the federal CSRD were allowed to participate in HPSGP, and a separate action plan would likely have been duplicative. (In fact, almost half the schools in the first cohort of HPSGP had also received some funding from II/USP or CSRD.)

Schools in HPSGP that choose to take a year for planning, however, receive grants of $50,000, the same amount given to II/USP schools from 1999 through 2001.

HPSGP has also differed from II/USP in the external evaluator component. II/USP forbade district personnel from serving as evaluators. Responding to feedback from the field on II/USP, state policymakers specifically allowed district staff to serve as evaluators for the first cohort of HPSGP schools. However, for the second cohort, the state is discouraging districts from playing that role while still urging districts to be actively involved in school improvement efforts, such as playing a role in a district/school liaison team and helping the school implement its action plan.

HPSGP also provides greater implementation funding than II/USP—$400 as opposed to $200 per pupil. (Schools that were already in II/USP when they joined HPSGP received an additional $200 per pupil to bring their funding to $400 per pupil.)


The California Department of Education has identified nine “essential program components” of successful schools

Early in the SAIT process, the intervention team assesses the school’s implementation of “essential program components,” a set of nine factors seen as necessary for a school to improve student achievement. The nine components vary somewhat by grade level, but in their basic form they include:

1. Use of English and math instructional materials that are aligned to state content standards, including materials to help struggling students.

2. Dedication of time to standards-aligned instruction in English/reading/language arts and math.

3. Participation in the School Administrator Training Program.

4. Employment of fully credentialed, highly qualified teachers and participation in the Math and Reading Professional Development Program (created in 2001 by Assembly Bill 466 and reauthorized in 2006 by Senate Bill 472).

5. Student achievement monitoring system (use of data to monitor student progress on curriculum-embedded assessments and modify instruction).

6. Ongoing instructional assistance and support for teachers (use of content experts and instructional coaches).

7. Monthly teacher collaboration by grade level (K–8) and department (9–12) facilitated by the principal.

8. Lesson- and course-pacing schedule (K–8) and master schedule flexibility for sufficient numbers of intervention courses (9–12).

9. Fiscal support: use of general and categorical funds of the school or district to support the school’s English/reading/language arts and math program goals.


Districts are expected to play a greater role in HPSGP

HPSGP seeks greater involvement of the district in the school improvement process by requiring districts to participate in the development of the action plan. They must also submit annual reports to the California Department of Education (CDE); and these reports must include participating schools’ use of instructional materials, courses offered, levels of parental involvement, teacher professional development, and principal’s experience.

In addition, under HPSGP, participating schools must send their teachers and principals to professional development programs recently created by the state. Teachers are expected to participate in the Math and Reading Professional Development Program training. Under that program, teachers receive training in the use of district-adopted instructional materials that are aligned to the state’s content standards. Similarly, principals of High Priority schools are supposed to attend the state’s Administrator Training Program, which is a 160-hour intensive institute made up of three modules: 1) Leadership and Support of Student Instructional Programs, 2) Leadership and Management for Instructional Improvement, and 3) Instructional Technology to Improve Pupil Performance.

Triggering of sanctions differs from II/USP schools

Finally, the triggering of sanctions is different. High Priority schools have more time to improve before sanctions can be applied, but the “significant growth” requirement calls for more API improvement. II/USP gave schools two years to meet API growth targets or at least make “significant growth” on the API (defined by the state board as one API point in either implementation year). HPSGP, on the other hand, gives schools three years to meet API targets or make significant growth, which is defined as a minimum total increase of 10 API points over three years, with growth being positive in two out of three years.
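
The two “significant growth” definitions are easy to conflate; here is a compact restatement of the rules as described above (a sketch of the stated criteria, not official CDE logic):

    # II/USP: at least one API point of growth in either implementation year.
    # HPSGP: at least 10 API points total over three years, with positive
    # growth in at least two of the three years. Illustrative sketch only.
    def iiusp_significant_growth(yearly_gains):      # two implementation years
        return any(gain >= 1 for gain in yearly_gains)

    def hpsgp_significant_growth(yearly_gains):      # three years
        positive_years = sum(1 for gain in yearly_gains if gain > 0)
        return sum(yearly_gains) >= 10 and positive_years >= 2

    print(iiusp_significant_growth([0, 1]))        # True
    print(hpsgp_significant_growth([6, -2, 7]))    # True: +11 total, positive in two years
    print(hpsgp_significant_growth([14, -1, -2]))  # False: positive in only one year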

High Priority schools that do not show significant growth are deemed “state-monitored” schools and can either be required to work with a SAIT or be taken over by the state. So far, all 53 schools that have not shown significant growth have been assigned SAITs from the list of state-approved teams. It is interesting to note that most HPSGP schools in this category began as II/USP or CSRD schools; few that participated only in HPSGP have failed to show the requisite growth.

Evaluations of state intervention programs find little overall effect on student achievement

Of the $1.3 billion invested in II/USP and HPSGP, a small part was set aside to assess the effectiveness of the two intervention programs. The state contracted with the American Institutes for Research (AIR) to evaluate both programs. AIR’s reports—published in June 2003 and September 2005 on II/USP and a Year 1 report in September 2006 on HPSGP—found that both programs produced only modest effects on student achievement for most schools.

AIR’s evaluations of II/USP identified three major problems: 1) the state did not target the schools most in need; 2) planning was divorced from implementation in participating schools; and 3) greater district involvement was needed. Those schools that did succeed had strong leadership and a spirit of collaboration among teachers, the researchers found.

Factors that hindered greater success for HPSGP schools, according to AIR, included a resource deficit faced by many participating schools, lack of district support, and flaws in program design and implementation.

Evaluation of II/USP raises issues regarding implementation and design

In its evaluation of II/USP, AIR found that small increases in participating schools’ rate of achievement growth during the planning year tended to dissipate in the implementation years. Although the researchers found a negligible gain for II/USP schools overall, that was partly because of averaging the scores of all schools. In addition, the researchers said that achievement trends in participating schools were more associated with what the local district was doing than with II/USP directly. (For where to find the full evaluation online, see “To Learn More” on page 13.)

AIR found that, as intended by the PSAA, schools spent their funds on goods and services directly related to instruction, such as professional development and release time for teachers, instructional materials, and personnel. However, the report pointed out a number of ways in which the program in reality often did not match policymakers’ vision. Going beyond that, AIR raised questions about the basic design of the program.

II/USP did not reflect policymakers’ vision in a number of ways

The actual implementation of programs often varies from what policy dictates, and II/USP is no exception. For example:


● The voluntary nature of II/USP, which was designed to ensure that schools had bought in to the program, was not realized. A majority of participating schools did not volunteer for the program. (Apparently their governing districts “volunteered” these schools without the schools’ knowledge.) However, that did not appear to have any long-term effect on improvement efforts or achievement gains.

● Some school personnel complained of a lack of sufficient information about II/USP, which they believed was an impediment either to buy-in at the school site or appropriate implementation.

● Schools’ planning was often divorced from implementation. This was in part because external evaluators tended to be involved only initially rather than continually tying implementation back to the needs assessment and action plan. The PSAA did not require evaluators to provide this longer-term assistance.

● Funding arrived late for many schools, hampering their planning and implementation activities.

● The threat of sanctions did not have the intended effect. AIR found that many participants were aware of the potential sanctions for II/USP schools, but they held mixed views on the ability of such threats to motivate improvement. Some educators saw the program as punitive rather than motivating, an attitude that the researchers deemed “disheartening.” In addition, many educators did not believe that the state would actually carry out severe sanctions. They believed that less serious consequences, such as required public hearings or the assignment of a state assistance team (SAIT), were more likely to occur. History has proved them right.

AIR found flaws in eligibility criteria, funding levels, and the district’s role

Going beyond implementation details, AIR found flaws in the II/USP policy itself. For example, the researchers pointed out that II/USP eligibility criteria did not target schools most in need. Schools near the middle of the API rankings that had missed only one subgroup growth target in one year were as eligible as the lowest-ranking schools that had repeatedly missed several targets. In addition, AIR reported that while planning grants appeared sufficient for most schools, most survey respondents believed that the implementation grants were not.

Another important issue that the researchers raised concerned the role of the district. AIR’s June 2003 report stated that despite “the powerful influence of district context in conditioning schools’ achievement growth…II/USP did little to harness and direct district influence or to hold districts accountable for ensuring the success of their II/USP schools.”

Some schools succeeded, and they had several things in common

Schools that succeeded did so by aligning goals, activities, and resources to create a coordinated and coherent instructional program, AIR found. Keys to such coherence included strong leadership from the principal and other site leaders and a spirit of collaboration and professional community among teachers, the researchers said.

In a research study published in June 2006—Similar Students, Different Results: Why Do Some Schools Do Better?—EdSource, AIR, Stanford University, and UC–Berkeley looked at differences in performance (as measured by API scores) among a group of elementary schools that were primarily in deciles 2 through 6 (with a majority in deciles 3 and 4). The researchers’ results reinforce AIR’s findings in the II/USP evaluation. Schools that had higher API scores were likely to use student-assessment data to inform instruction and to have strong leadership from the principal, a collaborative community of teachers, and an alignment of goals, activities, and resources toward teaching the state’s standards. In addition, principals from the more successful schools reported that their districts shared their intense focus on achievement.

Evaluation of HPSGP suggests ways to make the program more effective

In its Year 1 evaluation of HPSGP, AIR found that this program also had little impact on the achievement of students in the average participating school. Perhaps just as important, AIR found that after being in the program for three years, many participating schools were operating with fewer resources than similar schools. In addition, the evaluators noted that a lack of support of the High Priority schools by some districts as well as breakdowns in implementation of the program have to date compromised policymakers’ intent. It is important to note, however, that AIR will continue to analyze performance and resource data as well as survey responses from participating and similar nonparticipating schools. Its Year 2 report is due in late summer 2007.

AIR finds a modest positive effect

When AIR compared the test scores of High Priority schools to non-HPSGP schools, it looked only at HPSGP schools that received planning grants and on-time implementation funds and that had not participated in any other reform program (e.g., II/USP).

AIR found a very modest positive program effect. AIR tracked both groups’ achievement on 12 tests (two to five tests per year) during three years of program implementation. The tests were the norm-referenced component of the STAR program (SAT-9 and CAT/6 Survey) and the California Standards Tests in English and math. For elementary schools, the High Priority schools performed better by a statistically significant margin on 7 of 12 tests, about the same on three, and worse on two. For the seven tests on which the High Priority schools did better, the difference was slight—0.03 standard deviations, which translates to about 1.8 points on a test for which the average scale score was 308.2.
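
As a worked check of that conversion (our arithmetic, using only the figures AIR reported):

    # A 0.03 standard-deviation difference equal to about 1.8 scale-score
    # points implies a standard deviation of roughly 60 points on that test.
    effect_size = 0.03        # reported difference, in standard deviations
    point_difference = 1.8    # the same difference, in scale-score points
    print(round(point_difference / effect_size))  # 60 -> implied standard deviation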

For middle and high schools, the results were similar, with middle schools in HPSGP scoring better than comparison schools on 8 of 12 tests and high schools in the program scoring better on 6 of 12 tests. The performance differential was also very small—0.02 and 0.01 standard deviations, respectively.

Participating High Priority Schools had a staffing deficit at one point

AIR found that High Priority schools at one point in the program (2004–05) operated with a staffing deficit as compared to similar schools and the state average. When AIR conducted these resource analyses, it focused on HPSGP schools that had received funds only through HPSGP and not through II/USP or CSRD (“pure HPSGP” schools). The researchers found that the average number of full-time equivalent administrators, teachers, and pupil support staff per 100 students was 5.42, while comparison schools had 5.68 and the state average was 5.54. (Staff members who do not work at a specific school site, such as a teacher coach who helps teachers throughout a district, are not counted.) In a school of 1,000 students, this staffing differential would equate to one fewer employee than the average school and almost three fewer than a comparison school.
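
The one-versus-three differential follows directly from those per-100-student rates; a worked check (our arithmetic, not AIR’s):

    # FTE administrators, teachers, and pupil support staff per 100 students,
    # scaled to a hypothetical school of 1,000 students.
    hpsgp_rate, comparison_rate, state_rate = 5.42, 5.68, 5.54
    students = 1_000
    print(round((state_rate - hpsgp_rate) * students / 100, 1))       # 1.2 -> about one fewer staff member
    print(round((comparison_rate - hpsgp_rate) * students / 100, 1))  # 2.6 -> almost three fewer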

AIR found that many High Priority schools have used their program funds to address staffing needs despite guidance from CDE that the short-term funding of the program should not be spent on the long-term cost of permanent personnel.

AIR found disparities not just in staffing broadly speaking, but also in the teaching force in particular. Among High Priority schools, 90% of teachers were credentialed, which was better than comparison schools (87%) but worse than the state’s average school (94%).

On a related note, districts with schools in the program generally failed to comply with an assurance required as part of the application process. By the second year of implementation, the percentage of fully credentialed teachers in participating schools was supposed to at least match the districtwide average. AIR found that only 56% of the pure HPSGP schools were able to meet this mark, and the percentage did not go up in the third year of implementation.

District support, or lack thereof, affects reform efforts

As part of its Year 1 evaluation effort, AIR spent time at 16 schools in nine districts to interview staff and community members. Among these “case study” schools, district offices played varying roles in the reform effort. AIR viewed three districts as quite helpful, four as impeding their schools’ efforts, and two as neither a great help nor hindrance.

Districts helped by providing student assessment data and professional development and by recruiting and maintaining strong staffs. In contrast, districts that undermined school improvement efforts did not help secure stable school leadership, were focused on turning around district-level financial crises as opposed to helping individual schools improve, and did not offer targeted support for these struggling schools.


AIR suggests policy and implementation changes for HPSGP in the Year 1 report

AIR recommended several preliminary policy and implementation modifications, including:

● Enhancing the role of the district and holding it accountable for school improvement and for establishing and maintaining conditions for success.

● Enhancing CDE’s monitoring of nonachievement measures, such as participating schools’ spending levels and districts’ compliance with their agreements to provide resources.

● Clarifying the long-term role of external evaluators and incorporating some measurement of their effectiveness.

● Targeting “failure” early by monitoring the performance of HPSGP schools annually and identifying actions for schools that do not meet their API growth targets in a given year.

● Modifying the timing for distribution of funds to schools and establishing timelines so schools can plan effectively and make smooth transitions out of HPSGP.

● Issuing guidance for integrating the program’s objectives and API growth targets into schools’ Single Plan for Pupil Achievement in a meaningful way.


AIR finds flaws in implementation and possibly program design

AIR found during its site visits that about one-third of the 16 schools visited had been able to implement the main components as intended. These schools understood what the program required and allowed, and they created a relatively stable teaching staff and school leadership team able to spend funds strategically over a multiyear period.

But in about half of the schools visited, there were “substantial fundamental breakdowns in the implementation” of HPSGP caused by a lack of awareness of the program, variability in the use—and perceived effectiveness—of external evaluators, and disruptions in effective planning. AIR posited that implementation may not have gone as desired in part because of an absence of “preconditions” facilitating success, such as district support, stable school leadership, and resources at least on par with comparable schools.

A state-level implementation issue also arose. Some schools did not receive their funds in a timely manner, and some did not receive the intended $400 per pupil during at least part of the implementation period. With a fixed pot of money, CDE determined the number of eligible schools that could be funded based on their 2000–01 enrollments even though enrollments could, and did, change—substantially in some cases. With the total allotment the same every year for each school regardless of enrollment changes, some schools received about $100 per pupil in some years while others received up to $800 per pupil.
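
A simple illustration of how a fixed allotment pegged to 2000–01 enrollment produces that $100-to-$800 spread (hypothetical enrollment figures):

    # The annual allotment was fixed at $400 per pupil of 2000-01 enrollment,
    # so actual per-pupil funding drifted as enrollment changed.
    def actual_per_pupil(enrollment_2000_01, current_enrollment):
        allotment = 400 * enrollment_2000_01  # the same total every year
        return allotment / current_enrollment

    print(actual_per_pupil(500, 2_000))  # 100.0 -> enrollment quadrupled
    print(actual_per_pupil(500, 500))    # 400.0 -> enrollment unchanged
    print(actual_per_pupil(500, 250))    # 800.0 -> enrollment halved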

It is interesting to note that the researchers found that implementation problems per se were not a predictor of success. Some of the schools with implementation problems made little improvement, while others showed consistent growth in API scores.

In its report, AIR went beyond implementation issues to question the design of the policy. The research team suggested that the “lack of substantial HPSGP impact may result from the basic design of the program; i.e., a relatively short-term injection of funds may be insufficient to substantially affect school performance.” (For where to find the full Year 1 evaluation online, see “To Learn More” on page 13.)

California layers the federal accountability systems on top of the state’s approach

While the state was developing its accountability and intervention systems, the federal government had its own systems, as outlined in the Elementary and Secondary Education Act (ESEA). Prior to 2002, the federal system was relatively flexible. That meant that it had affected only a small subset of schools in California, and the state had expected to meet its requirements with the systems it had set up in the late 1990s. With the revised ESEA of 2002, which is known as the No Child Left Behind Act (NCLB), the federal government extended its reach considerably, applying very specific, rigorous performance targets to all schools. This made the federal systems much more visible.

Technically, California could have chosen not to go along with the federal approach if it also was willing to forgo a large piece of federal funding. Although some states considered passing up the money to avoid meeting the federal conditions, California policymakers never seriously weighed that option. Leaders in Sacramento decided to adopt the NCLB system and simultaneously maintain the accountability program the state had already established. There are several reasons California has chosen to run both systems:

● Adopting the federal approach ensures that Basic Grant funds under Title I of ESEA continue flowing to California. These funds are used to help schools with large percentages of children from low-income families. In 2006–07 California is receiving about $1.7 billion in Title I Basic Grant funding, which is about 2.5% of the $67.1 billion in total revenue for K–12 education this year.

● Elements of the state program (i.e., API scores) could be used to meet federal funding conditions.

● State policymakers could not be sure that the federal system would be maintained over the long term; if it went away and the state had abandoned its own systems, California would not have an accountability system.

State and federal programs differ in expectations and who is accountable

Despite some strong reasons for operating both systems, California educators and parents have been frustrated by several important differences between the two approaches to accountability. For example, the two systems measure performance differently, which creates different incentives for schools. This leads to situations in which some schools are “successful” in one system but “failing” in the other.

The two systems measure school performance differently

California’s API system focuses on improvement of students at nearly all performance levels, with more credit awarded to gains made by lower-performing students. Under this system, all schools are expected to either show progress or maintain high achievement. Targets are not subject-specific; excellent performance in one area can compensate for weak achievement in another.

The federal measure centers not on improvement but on specific goals. In every school, a specified percentage of students and all student subgroups must demonstrate proficiency on state standards in math and English every year. (Since 2006 the state and federal government recognize the same subgroups.) The yearly federal targets—known as “annual measurable objectives” (AMOs)—are subject-specific and vary by grade span (elementary, middle, high school). But for every grade span, they periodically increase until 2013–14, when all students must be proficient in English and math. For high schools, results from the grade-level California Standards Tests (CSTs) are not used for NCLB purposes. The only test that matters is the California High School Exit Exam (CAHSEE), which mostly covers material that students are supposed to have been exposed to well before they take the exam.

In addition to the AMOs, schools and subgroups face a rigorous testing participation rate minimum of 95%, and schools must meet specified API criteria involving minimum scores or improvement. High schools must also meet a certain graduation rate. All those factors—which some would argue form a more well-rounded approach to school accountability than one based on API scores alone—compose the federal performance measure, “adequate yearly progress” (AYP).

The measures used by the state and federal systems create different incentives. Under the API system, schools have an incentive to work with all students who could realistically move from one CST performance level to another, especially students who scored high in the “far below basic” performance level. Helping a student at that level move up—or preventing a student who scored low in the “below basic” level from slipping down a level—will help a school’s API score the most.

In contrast, under the AYP system, schools have an incentive, especially in the short term, to focus their attention on students who could realistically move from “basic” to “proficient” (or slide down in the opposite direction).

Even if schools ignore both sets of incentives, they might do well according to one measure and miss the other system’s targets. Every year, hundreds of California schools find themselves in this position.

Under the federal system, more alternative schools and fewer high schools are affected

Another significant difference is the set of schools whose performance is measured and that are subject to interventions. Under the original state system, only “mainstream” schools received API scores while “alternative” schools (those serving a majority of at-risk students) were held accountable to a different set of measures. Only schools in the main API system with low scores and/or low gains have been eligible for extra funding and the associated interventions. Elementary, middle, and high schools have been roughly proportionally represented in state intervention programs because the API system has separate rankings for each level.

In the federal system, all schools—whether “mainstream” or “alternative”—are measured for adequate yearly progress. Schools that receive Title I Basic Grant funding because they serve a high percentage of poor students are the only schools subject to interventions, however. Such interventions are triggered by repeatedly failing to make adequate yearly progress. (In other words, extra funds come because of the students served, but extra accountability requirements are the result of inadequate performance.)

High schools are also less likely to be identified for intervention under the federal system for two reasons. First, fewer high schools receive Basic Grants, which decreases the percentage of high schools that are eligible for the federal intervention program. Second, high schools are less likely than middle (but not elementary) schools to miss AYP. This perhaps occurs because the major performance measure is the CAHSEE, which is less of a grade-level test than the CSTs used in middle and elementary schools. (California uses the CAHSEE for AYP purposes because the math portion of that test is the only math exam that all 10th grade students take.)

The federal system holds districts and county offices accountable

While the state system focuses on holding schools accountable, the federal system places equal attention on local education agencies and the state as a whole, measuring their performance in much the same way as schools and subgroups. (“Local education agencies” refers to districts and county offices of education. County offices are held accountable only for the schools that they run, which are generally schools for expelled or adjudicated youths, or special-needs students.) This means that local education agencies can be placed in an intervention program. That program is much like the federal intervention program for schools, described next.

Finally, the interventions implemented when schools miss academic performance targets are different under the state and federal systems. The set of federal interventions is called “Program Improvement.”

Escalating performance targets and consequences characterize “Program Improvement” for schools under NCLB

Under Program Improvement for schools, the consequences escalate with every year that a school fails to make AYP. And only by making AYP in two consecutive years can a school exit Program Improvement. Just turning around a school in that position is difficult, but performance targets that increase nearly every year make the challenge even greater. (The California County Superintendents Educational Services Association, or CCSESA, divides itself into 11 regions in the state. Figure 1 shows the number of schools that are in Program Improvement or at risk of going into Program Improvement by those regions.)

Consequences escalate every year that adequate progress is not made

A school enters Program Improvement (PI) after it has missed the same AYP indicator (e.g., the annual measurable objective in math) in two consecutive years. Once in PI, the school must revise its plan for how its Title I Basic Grant funds are spent and implement the revised plan promptly. The school must also set aside at least 10% of its Basic Grant for teacher professional development. In addition, the governing district must pay for transportation for pupils who exercise their option under NCLB to transfer out of PI schools. The district must also provide technical assistance to help the school improve.
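
The entry rule turns on missing the same indicator twice in a row, not any two indicators; a sketch of that trigger (illustrative only, not CDE’s implementation):

    # A school enters PI after missing the same AYP indicator (e.g., the math
    # annual measurable objective) in two consecutive years. Illustrative only.
    def enters_program_improvement(missed_by_year):
        """missed_by_year[i] is the set of AYP indicators missed in year i."""
        return any(missed_by_year[i] & missed_by_year[i + 1]
                   for i in range(len(missed_by_year) - 1))

    # Missing different indicators in consecutive years does not trigger PI...
    print(enters_program_improvement([{"math AMO"}, {"English AMO"}]))  # False
    # ...but missing the same indicator two years running does.
    print(enters_program_improvement([{"math AMO"}, {"math AMO", "participation"}]))  # True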

For Title I schools that again fail to make AYP, the governing district must continue the activities described above and provide or pay for supplemental instructional services for students who request them. However, if the district itself is identified for PI (see page 11), it must hire an outside agency to provide those services.

After a school has missed AYP for a fourth year, the district must continue the activities described above and take some “corrective action” with the school, which could include replacing certain staff members, appointing an outside expert to help lead the school, or other actions. At this stage, districts are also encouraged to provide additional Title I resources to the school. Under NCLB, states must set aside a portion of their Title I monies to help local education agencies in these situations.

figure 1: About 23% of California schools are in Program Improvement

Region                                     Total      Schools in PI      Additional Schools at
                                           Schools    in 2006–07         Risk for PI in 2007–08
                                                      Number   Percent   Number   Percent

1. North Coast                             404        61       15%       36       9%
   (Del Norte, Humboldt, Mendocino, Lake, Sonoma)

2. Northeastern                            439        30       7%        22       5%
   (Siskiyou, Modoc, Trinity, Shasta, Lassen, Tehama, Plumas, Butte, Glenn)

3. Capital Service Region                  788        120      15%       31       4%
   (Colusa, Yolo, Sutter, Yuba, Sierra, Nevada, Placer, El Dorado, Sacramento, Alpine)

4. Bay                                     1,153      235      20%       51       4%
   (Marin, Napa, Solano, Contra Costa, Alameda, San Francisco, San Mateo)

5. South Bay                               608        124      20%       31       5%
   (Santa Clara, Santa Cruz, San Benito, Monterey)

6. Delta Sierra                            467        106      23%       26       6%
   (Amador, San Joaquin, Calaveras, Tuolumne, Stanislaus)

7. Central Valley                          749        262      35%       53       7%
   (Merced, Mariposa, Madera, Fresno, Kings, Tulare)

8. Costa Del Sur                           654        158      24%       47       7%
   (San Luis Obispo, Kern, Santa Barbara, Ventura)

9. Southern                                1,353      272      20%       47       3%
   (Orange, San Diego, Imperial)

10. RIMS                                   994        284      29%       72       7%
    (Riverside, Inyo, Mono, San Bernardino)

11. Los Angeles                            1,944      588      30%       125      6%
    (Los Angeles)

State Total                                9,553      2,240    23%       541      6%

Note: The percentages in this table reflect the percentages of the schools in each individual region that are in Program Improvement. Therefore, they do not add up to 100% for the state as a whole.

Data: California Department of Education (CDE) — EdSource 2/07



Specifically, states must reserve 4% of their Title I Basic Grants to assist struggling schools and LEAs. In 2005–06 California’s set-aside amounted to about $70 million. Most of the reserved funds go toward helping schools and LEAs that receive Basic Grants and have not been successful in state or federal intervention programs. However, about one-seventh of the money goes to the “Statewide System of School Support” (S4), which works with these schools. The S4 comprises three entities:

● The Regional System of District and School Support (RSDSS), which consists of 11 regional support centers organized around the 11 CCSESA regions. The RSDSS is the primary recipient of the funding.

● A federally funded Comprehensive Assistance Center.

● The California Department of Education (CDE).

If aid from this support network does not help the school improve enough and it fails to make AYP yet again, the district and school together must develop a plan to restructure the school, which can include contracting with an outside organization to run the school, reopening it as a charter, making large-scale staff replacements, turning over operation of the school to the state, or some “other restructuring.”

Finally, if a Title I school does not make adequate yearly progress for six years, the district and school must implement the restructuring plan. In 2004–05, 271 California schools had reached that level in the federal interventions system. (This is possible because the state measured adequate yearly progress, then based on API growth targets, prior to the passage of NCLB.) Although federal policymakers may have envisioned a major overhaul of personnel and/or governance at that point, many California districts have stopped short of drastically reforming the governance structure for relevant schools, according to a report by the Center on Education Policy (CEP), a nonpartisan, Washington, D.C.–based education policy research organization.

Center on Education Policy says many schools in the “restructuring” phase do not make major overhauls

States vary in how they monitor and support schools in restructuring, according to a March 2006 report by CEP. In some, the state education agency heavily influences what type of restructuring will occur with each school and signs off on the plans. In others, the state agency leaves everything up to the districts.

California is taking a middle-ground approach, with CDE offering tools to help districts decide on their course of action with relevant schools. These tools include regional workshops conducted by the S4 network as well as district-level self-assessments. These assessments look at a district’s support for schools in seven categories, such as standards-based curriculum, professional development, and fiscal operations. CDE also suggests that schools assess the extent to which they have the “essential program components” in place. (See page 4.) Finally, districts and schools together are asked to use a survey tool to examine and refine their practices concerning students with disabilities. After these self-assessments, districts and schools can use CDE-developed worksheets to think through whether the restructuring options they are considering will achieve the desired improvement.

From the case studies that CEP conducted, it became clear that these processes are forcing districts to ask themselves difficult questions. They are not lightly choosing one option among a list of five, but instead are wrestling with the details of their needs and the best way to address them.

California requires districts to report their restructuring choices to the state. Of the 271 schools in restructuring in 2004–05:
● 76% chose the “other restructuring” category,
● 28% replaced all or most of the staff,
● 14% contracted with an outside organization to run the school,
● 13% had no plan, and
● 2% reopened as charters.
(Percentages do not sum to 100% because some schools took advantage of more than one option.)

Examples of “other restructuring” include providing teachers with more professional development, using teacher coaches, creating more opportunities for teachers to analyze student achievement data and/or collaborate on instructional approaches, and adding extra periods for students struggling in math and English.

Federal requirements force the state to create interventions for school districts
Perhaps in recognition of the critical role that districts play in school performance, the federal accountability system includes measuring the academic performance of local education agencies and interventions for those that are not performing satisfactorily. The federal performance measurement and sequence of interventions for LEAs are similar to those for schools. But California has less experience with LEA accountability and is finding it challenging. A recently begun pilot program may help the state and regional support agencies learn how to efficiently help struggling districts get on track. A grant from the Bill & Melinda Gates Foundation will significantly broaden the scope of that effort.

Program Improvement for districts is similar to PI for schools
As is the case with schools, local education agencies must receive Title I funds to be eligible for Program Improvement, and about 95% of them statewide do. LEAs enter Program Improvement in much the same way schools do. If for each of two consecutive years the LEA does not make adequate yearly progress on the same indicator (e.g., the annual measurable objective in English), it enters Program Improvement—unless students in any of three specific grade spans (3–5, 6–8, or 10) have in either year met the AYP indicator that the district as a whole failed. Today, 165 of California’s 1,034 LEAs are in Program Improvement. In addition, CDE data indicate that 76 more LEAs are at risk of entering PI in 2007–08.
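The entry rule reads like a small decision procedure. The Python sketch below is one way to express it; the data layout and field names are illustrative assumptions, not CDE’s actual reporting format.

```python
# Sketch of the LEA Program Improvement entry rule described above.
# year1/year2 are two consecutive years of results; each maps an AYP
# indicator name (e.g., "ELA annual measurable objective") to that year's
# outcomes. All names and structures here are hypothetical.

GRADE_SPANS = ("3-5", "6-8", "10")

def enters_program_improvement(year1, year2):
    """Each year's value per indicator is a dict with 'lea' (bool: did the
    LEA as a whole make this indicator?) and 'spans' (grade span -> bool)."""
    for indicator in year1.keys() & year2.keys():
        if year1[indicator]["lea"] or year2[indicator]["lea"]:
            continue  # not the same indicator missed in both years
        # Exception: if any grade span met this indicator in either year,
        # the miss does not trigger Program Improvement.
        span_met = any(
            results[indicator]["spans"].get(span, False)
            for results in (year1, year2)
            for span in GRADE_SPANS
        )
        if not span_met:
            return True
    return False
```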

Assisting and sanctioning districts and county offices of education is a new and challenging frontier for California. Although the state has intervened in a few LEAs to aid recovery from fiscal crises, California does not have experience becoming involved in local agencies for purely academic reasons. The state’s size and the diversity of its districts make it difficult for CDE to intervene directly. Thus California has limited itself to providing some financial assistance and assigns technical-assistance responsibilities to people who are closer to the problem. When an LEA enters Program Improvement, the state expects the district to complete a self-evaluation to identify barriers to success.

figure 2: Local Education Agencies (LEAs) enter Program Improvement (PI) if they fail to make adequate progress

The timeline below is organized by the number of years the LEA did not make adequate yearly progress (AYP) and did not meet the AYP criteria in each grade span.

Years One and Two: The LEA did not make AYP and did not meet the AYP grade-span criteria. After the second consecutive year, the LEA enters PI.

Year Three (PI Year 1: Planning)
California Department of Education (CDE):
● Disseminate PI results, with assistance of the LEA, to the general public.
● Provide or arrange for technical assistance to the LEA.
LEA:
● Notify parents or guardians, with CDE assistance, of: the identification of the LEA as in PI; reasons for PI identification; how they can get involved in improving the LEA; and actions CDE will take to improve the LEA.
● Revise/develop an improvement plan within three months of identification.
● Consult with parents, guardians, school staff, and others in development of the plan.
● Implement the plan immediately in the current school year following plan development.
● Reserve not less than 10% of its Title I, Part A funds for high-quality professional development.

Year Four (PI Year 2: Plan Implementation)
CDE: Continue to provide technical assistance to the LEA.
LEA: Continue to implement the plan from Year 1.

Year Five (PI Year 3: Corrective Action)
CDE:
● Continue technical assistance to the LEA.
● Notify parents/guardians and the public of corrective action taken by CDE.
● Provide a public hearing to the LEA within 45 days following notice of corrective action.
● May take corrective action at any time during the improvement process, if necessary, but must take action during Year 3.
● Take at least one corrective action: defer programmatic funds or reduce administrative funds; institute new curriculum and professional development for staff; replace the LEA staff; remove individual schools from the jurisdiction of the LEA and arrange for their governance; appoint a trustee in place of the superintendent and school board; or abolish or restructure the LEA.
In conjunction with one of the above, CDE may authorize student transfers to a school not in PI in another LEA, with paid transportation.

Data: California Department of Education (CDE) EdSource 2/07


The state also requires the LEA to contract with a county office of education or other external entity to help get it on the path to improvement. California provides $50,000 to the LEA, and $10,000 for each Program Improvement school, to help the LEA implement recommended changes.
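The assistance formula above is simple enough to capture in a few lines; a minimal sketch, with hypothetical names:

```python
# The state assistance formula described above: $50,000 per LEA plus
# $10,000 for each of its Program Improvement schools (names hypothetical).

def lea_assistance(num_pi_schools, base=50_000, per_school=10_000):
    return base + per_school * num_pi_schools

print(lea_assistance(4))  # 90000 for an LEA with four PI schools
```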

As with schools, the interventions/sanctions get stronger each year a local education agency does not make adequate yearly progress. In the first year of Program Improvement, CDE must provide or arrange for technical assistance and help the LEA inform the community of its status. The LEA itself must revise its Title I plan in consultation with school staff and parents and begin implementing the revised plan. In addition, the LEA must set aside at least 10% of its Title I Basic Grant for teacher professional development.

If the LEA fails to make adequate yearly progress again, CDE must continue to provide technical assistance and the LEA must continue to implement the revised Title I plan.

A third year of not making adequate yearly progress leads to corrective action by the State Board of Education. The board must choose at least one sanction from a list of six that includes replacing LEA staff, assuming governance over individual schools, and abolishing or restructuring the LEA. In addition, the state may authorize student transfers to a non-Program Improvement school in another LEA, with paid transportation. (See Figure 2 for the full list of possible sanctions.)

The state is learning as it goes—with a pilot program as the first step
As the AIR evaluations and the Similar Students study discussed earlier show, district involvement appears important in any effort to help underperforming schools better serve their students.

With 165 of the state’s 1,034 LEAs in Program Improvement—and 76 more at risk of entering the program in the coming year—the state is trying to find a way to get these districts on track to meet their AYP requirements and provide better support to their schools.

As part of that attempt, CDE has created a pilot program in which district assistance and intervention teams (DAITs) from four county offices of education are helping districts at risk of entering the corrective action phase of Program Improvement. The DAITs are expected to help with needs assessment and mentor the district leadership. The DAIT process is supposed to be a collaborative and intense effort to improve district leadership and policies to support school improvement. Teams are expected to help districts prioritize key actions for improvement in seven areas: governance, alignment of curriculum and assessments to the state’s academic content standards, fiscal operations, parent and community involvement, human resources, data systems, and professional development.

To Learn More
To see AIR’s evaluations of II/USP and HPSGP, go to www.air.org/publications/pubs_ehd_school_reform.aspx and scroll to the bottom of the page.
WestEd, a nonprofit research, development, and service agency, has posted on its website an “R&D Alert” titled Focus on Turning Around Low-Performing Schools and Districts. The alert briefly summarizes research on strategies for helping underachieving schools get on track and discusses California’s district assistance and intervention team (DAIT) process. See: www.wested.org/cs/we/view/rs/831
The Legislative Analyst’s Office (LAO) has issued a report on California’s Alternative Schools Accountability Model (ASAM) that includes recommended policy changes. See Improving Alternative Education in California at: www.lao.ca.gov
To see the Bush Administration’s Building on Results: A Blueprint for Strengthening the No Child Left Behind Act (proposals for amending NCLB), go to: www.ed.gov/policy/elsec/leg/nclb/buildingonresults.pdf
The Aspen Institute, whose mission is to “foster enlightened leadership and open-minded dialogue” on a range of issues, has published a paper on NCLB reauthorization. See: www.aspeninstitute.org
For a more in-depth look at the state’s newest intervention program, see EdSource’s Quality Education Investment Act embodies a new approach to interventions (2/07), available to download for free at: www.edsource.org

Works Cited
Bitter, C., Perez, M., Parrish, T., Gonzalez, R., Socias, M., Salzfass, L., Chaney, K., Gubbins, P., Anand, P., Dawson, K., Yu, V., Delancey, D., and Esra, P. (2005). Evaluation Study of the Immediate Intervention/Underperforming Schools Program of the Public Schools Accountability Act of 1999. Palo Alto, Calif.: American Institutes for Research (AIR).
Harr, J.J., Parrish, T., Socias, M., Gubbins, P., and Spain, A. (2006). Evaluation Study of California’s High Priority Schools Grant Program: Year 1 Report. Palo Alto, Calif.: American Institutes for Research (AIR).
O’Day, J.A. and Bitter, C. (2003). Evaluation of the Immediate Intervention/Underperforming Schools Program and the High Achieving/Improving Schools Program of the California Public Schools Accountability Act of 1999. Palo Alto, Calif.: American Institutes for Research (AIR).


With AYP requirements escalating annually, the need for district assistance will only grow. Fortunately for California, the Gates Foundation has recently offered $15.5 million to help expand this pilot program. In November 2006, CDE announced that the Gates money would be used to expand from four to 15 teams. Each of the 11 new teams will serve one of the CCSESA regions. The work of the new teams will be informed by lessons learned by the original four, and the plan is for all 58 county offices to learn “best practices” from the group of 15 offices involved in the pilot project. Figure 3 shows how the local education agencies in PI in 2006–07—or at risk of entering PI because they did not make adequate yearly progress in the prior year—are distributed among the 11 CCSESA regions.

According to CDE and CCSESA, the timeline, in calendar years, for the Gates-funded project is as follows:
2006
● Evaluate work of initial four intervention teams and share best practices to refine 2007 outreach and support (completed).
2007
● Develop regional infrastructure in county offices of education.
● Establish intervention teams and expand into 15 districts.
● Provide data collection tools and an online database for team members to diagnose needs and apply appropriate interventions.
● Communicate the findings and best practices to curriculum and instruction personnel in all 58 counties.
2008
● Collect and evaluate data on the second year of intervention in pilot districts.
● Refine the improvement plan and practices based on findings.
2009
● Implement intervention practices on a statewide scale.
● Develop statewide training for all intervention teams.
● Publish a report that can be shared with other states.

figure 3: In two regions, about a third of the LEAs are in Program Improvement

Region (counties) | Total LEAs | LEAs in PI in 2006–07 | Additional LEAs at Risk for PI in 2007–08
1. North Coast (Del Norte, Humboldt, Mendocino, Lake, Sonoma) | 95 | 8 (8%) | 10 (11%)
2. Northeastern (Siskiyou, Modoc, Trinity, Shasta, Lassen, Tehama, Plumas, Butte, Glenn) | 128 | 5 (4%) | 5 (4%)
3. Capital Service Region (Colusa, Yolo, Sutter, Yuba, Sierra, Nevada, Placer, El Dorado, Sacramento, Alpine) | 96 | 7 (7%) | 6 (6%)
4. Bay (Marin, Napa, Solano, Contra Costa, Alameda, San Francisco, San Mateo) | 97 | 14 (14%) | 8 (8%)
5. South Bay (Santa Clara, Santa Cruz, San Benito, Monterey) | 82 | 12 (15%) | 4 (5%)
6. Delta Sierra (Amador, San Joaquin, Calaveras, Tuolumne, Stanislaus) | 63 | 7 (11%) | 6 (10%)
7. Central Valley (Merced, Mariposa, Madera, Fresno, Kings, Tulare) | 131 | 26 (20%) | 5 (4%)
8. Costa Del Sur (San Luis Obispo, Kern, Santa Barbara, Ventura) | 104 | 23 (22%) | 12 (12%)
9. Southern (Orange, San Diego, Imperial) | 88 | 14 (16%) | 5 (6%)
10. RIMS (Riverside, Inyo, Mono, San Bernardino) | 69 | 24 (35%) | 6 (9%)
11. Los Angeles (Los Angeles) | 81 | 25 (31%) | 9 (11%)
State Total | 1,034 | 165 (16%) | 76 (7%)

Note: The percentages in this table reflect the percentages of the local education agencies (LEAs) in each individual region that are in Program Improvement (PI). Therefore, they do not add up to 100% for the state as a whole.
Data: California Department of Education (CDE) EdSource 2/07



California’s Quality Education Investment Act (QEIA) embodies a new approach to interventions
As part of a settlement of a lawsuit over funding for K–12 schools and community colleges, the state in 2006 created a $2.7 billion, seven-year intervention program. The “Quality Education Investment Act” (QEIA)—to be implemented beginning in 2007–08—shares elements of the other state and federal intervention programs discussed above. But it is also different in many ways.

Like the other interventions, QEIA provides supplemental funds for schools and requires them to formally plan the use of those funds to improve student achievement. However, QEIA differs in the pool of eligible schools, the time given to schools to improve, the consequences for lack of progress, and the amount of funding provided. It also emphasizes resource measures that schools and districts must meet, such as small class sizes and experienced teachers.

QEIA is like the other state intervention programs in that its funding is triggered by performance rather than student demographics. It targets schools in the bottom 20% of the 2005 API rankings regardless of the students they serve or their rate of improvement.

QEIA provides more funding than previous state programs—$500 for each K–3 pupil, $900 for each student in grades 4–8, and $1,000 for each high school student. (It provides two-thirds of these amounts in 2007–08, which is considered a planning and preparation year.) That compares to $200 per pupil in II/USP and $400 in HPSGP.
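A minimal sketch of the per-pupil arithmetic above; the function and the example enrollments are hypothetical, not drawn from the report.

```python
# QEIA per-pupil funding rates described above, at full funding.
RATES = {"K-3": 500, "4-8": 900, "9-12": 1000}  # dollars per pupil
PLANNING_YEAR_FRACTION = 2 / 3  # 2007-08 provides two-thirds of these amounts

def qeia_grant(enrollment_by_span, planning_year=False):
    """Return a school's QEIA grant given enrollment counts per grade span."""
    total = sum(RATES[span] * count for span, count in enrollment_by_span.items())
    return total * PLANNING_YEAR_FRACTION if planning_year else total

# Example: a K-8 school with 300 K-3 students and 250 students in grades 4-8.
print(qeia_grant({"K-3": 300, "4-8": 250}))  # 375000 at full funding
print(qeia_grant({"K-3": 300, "4-8": 250}, planning_year=True))  # about 250000
```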

The authorizing legislation for QEIA is more explicit than previous state programs about coordinating funding. Participating schools are required to integrate QEIA funds into their Single Plan for Pupil Achievement, in which schools document how they plan to direct a multitude of funding sources toward improving student performance.

Unlike the other state and federal interventions, QEIA does not sanction schools for not making satisfactory progress. On the other hand, the program is a bit more prescriptive than other state intervention programs in that supplemental resources can be cut off if a school is not meeting benchmarks in teaching experience levels, class sizes, and professional development for teachers and principals. (The program does allow 15% of schools to take an “alternative” approach to school reform that does not necessarily involve meeting some of the specific benchmarks, but the approach must be research-based.) In addition, under both the standard and alternative programs, student performance benchmarks must be met.
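One way to read the benchmark rules above is as a compliance check. The sketch below is an interpretation with hypothetical field names; in particular, it treats missing the student performance benchmarks as ending funding under both tracks, which the report implies but does not spell out.

```python
# Interpretive sketch of QEIA's funding-continuation rules described above.
RESOURCE_BENCHMARKS = ("teacher_experience", "class_size", "professional_development")

def keeps_qeia_funding(school):
    """school: dict with 'alternative' (bool), 'met_performance' (bool), and
    'met' (benchmark name -> bool). Returns whether supplemental funds continue."""
    # Student performance benchmarks apply under both the standard and the
    # research-based "alternative" programs; treating failure as ending
    # funding is an assumption, not stated in the report.
    if not school["met_performance"]:
        return False
    # Alternative-program schools need not meet some specific resource benchmarks.
    if school["alternative"]:
        return True
    # Standard-program schools must meet all the resource benchmarks.
    return all(school["met"].get(b, False) for b in RESOURCE_BENCHMARKS)
```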

One key element of QEIA that is certainly different from II/USP is the expected high level of district involvement. Districts must sign a set of assurances regarding their participation in the program. For example, districts must ensure that each administrator in a funded school is confirmed to have “exemplary qualifications and experience” by the end of 2008–09 and each year thereafter and that administrators will receive leadership training.

On the horizon: What can California expect next?
Nearly all California schools have shown steady improvement on accountability measures since the enactment of the Public Schools Accountability Act in 1999. However, a substantial number of schools continue to struggle. In addition, the rate of improvement has not been strong enough to prevent an increase in the number of schools and districts entering the federal intervention program. California is not the only state experiencing this situation. As the No Child Left Behind Act comes up for reauthorization in the near future, federal lawmakers will likely hear calls for change—even from organizations that supported the Act in its very early years. The Bush Administration has already proposed amendments that Congress may take up in 2007.

Schools’ rate of improvement may not match increasing federal targets
Observers may not agree on the cause, but it is undeniable that the state’s schools have improved their API scores. This can be seen in many ways. In each of the past seven API cycles, from about a half to more than three-quarters of schools have met their schoolwide and subgroup API growth targets. And nearly all—99.6%—of schools have met their growth targets in at least one of the last seven cycles. Another way of tracking growth is by the median API score. This measure also indicates improvement: the median API for elementary schools climbed from 674 in 2000 to 751 in 2005. The medians for middle and high schools have also increased, from 656 to 714 (middle), and from 638 to 680 (high school).

Despite improvement in API scores, growth in the percentage of students scoring proficient and above on the California Standards Tests in English and math has not been fast enough to prevent the number of Program Improvement schools from swelling rapidly. From 2002–03 to 2006–07, that number grew from 1,201 to 2,240 (or 13% to 23% of all schools). In addition, 541 more schools are at risk of entering Program Improvement (PI) in 2007–08 because they did not make adequate yearly progress last year.

Will NCLB reauthorization bring about changes in federal accountability?
As federal lawmakers consider the reauthorization of NCLB, they may feel pressure to reconsider the intervention provisions of the Act because of the rapidly swelling ranks of PI schools and districts nationally.

Although most people agree with NCLB’s principles, not all like the Act’s details. Even among the segment that has supported NCLB’s provisions, some are calling for substantial amendments. For example, Michael Petrilli, vice president for National Programs and Policy for the conservative Fordham Foundation, stated in a January 2007 commentary: “I’ve gradually and reluctantly come to the conclusion that NCLB as enacted is fundamentally flawed and probably beyond repair.” Petrilli calls for eliminating several of the law’s highly visible provisions, asserting in part: “No more ‘cascade of sanctions’ for failing schools. No more federal guarantee of school choice for children not being well-served…. The states would decide when and how to intervene in failing schools.”

Although not proposing changes as drastic as those Petrilli has called for, the Bush Administration in early 2007 suggested changes to the Act that would affect federal interventions, among other provisions. Examples include:
● Augmenting Title I funding such that high schools would get more and elementary and middle schools would not get less.
● Increasing funding to support schools in Program Improvement.
● Requiring schools in PI to offer supplemental education services in the first year along with school choice, but allowing schools to target choice and supplemental services to students not yet scoring “proficient.”
● Requiring more substantial changes to the staff or governance of schools in the “restructuring” phase of PI.
● Providing funds for schools in restructuring to offer “scholarships” to low-income students to receive intensive tutoring or attend a private school or out-of-district public school.
● Authorizing districts with schools in restructuring to remove limitations on teacher transfers from collective bargaining agreements.
(See “To Learn More” above for where to find the full set of proposals.)

The chairman of the House Committee on Education and the Workforce, Representative George Miller, will play a leading role in NCLB-reauthorization discussions. He welcomes some of the administration’s ideas but finds the latter two proposals listed above unacceptable. Although the administration and Congress may tussle over policy direction, both Miller and Senator Edward Kennedy, chairman of the Senate’s Health, Education, Labor, and Pensions Committee, have NCLB reauthorization “on their lists for action” and are “laying the groundwork” for the process, according to the Jan. 10, 2007 edition of Education Week. And President George W. Bush has already expressed his desire for reauthorization to occur this year.

This is in contrast to the relatively recent conventional wisdom that held that federal policymakers might not tackle reauthorization until 2009. Not only is the timing of the reauthorization in question, but also its ultimate outcome.

Many prognosticators predict that the reauthorization will yield only small tweaks. However, the recently altered political landscape in Washington, D.C., and pressure for significant changes from groups that have until now been supportive of NCLB may bring about more than minor changes. California’s major education stakeholders have come to a consensus on changes they seek—a key one being acceptance of California’s “growth model” of measuring school performance, which would greatly affect which schools are targeted for intervention. These stakeholders have already begun actively pressing their case with federal officials.

Meanwhile, the state’s education community hopes to see continuing improvement in public schools generally and to discover the special combination of incentives and resources that will lift student achievement in the schools that to date have had trouble meeting performance benchmarks.

520 San Antonio Rd, Suite 200, Mountain View, CA 94040-1217 • 650/917-9481 • Fax: 650/917-9482 • [email protected]
www.edsource.org • www.californiaschoolfinance.org • www.ed-data.k12.ca.us

2006–07 EdSource Board of Directors
Davis Campbell, President: President, California School Boards Association Governance Institute
Lawrence O. Picus, Vice President: Professor, Rossier School of Education, University of Southern California
Martha Kanter, Fiscal Officer: Chancellor, Foothill-De Anza Community College District
John B. Mockler, Secretary: President, John Mockler & Associates, Inc.
Ray Bacchetti: Scholar in Residence, Carnegie Foundation for the Advancement of Teaching
Pam Brady: President-elect, California State PTA
Christopher Cross: Chair and CEO, Cross & Joftus, LLC
Gerald C. Hayward: Partner, Management Analysis & Planning, Inc.
Santiago Jackson: Assistant Superintendent, Division of Adult and Career Education, Los Angeles Unified School District
Jacqueline Jacobberger: President, League of Women Voters of California
Kelvin K. Lee: Superintendent (Retired), Dry Creek Joint Elementary School District
Donna Lilly: Co-president, American Association of University Women–California
Paul J. Markowitz: Teacher, Las Virgenes Unified School District
Ted Olsson: Manager (Retired), Corporate Community Relations, IBM Corporation
Amado M. Padilla: Professor of Education, Stanford University
Peter Schrag: Contributing Editor, Sacramento Bee
Don Shalvey: CEO and Co-founder, Aspire Public Schools
Krys Wulff: Director-at-large, American Association of University Women
Trish Williams, EdSource Executive Director
