
The Indiana Acuity Efficacy Study: Year 1 Results and Implications

Terry Spradlin
June 24, 2009

CCSSO National Conference on Student Assessment


About the Center for Evaluation & Education Policy

• The Center for Evaluation & Education Policy (CEEP) is a client-focused, self-funded research center associated with the School of Education at Indiana University

• CEEP promotes and supports rigorous program evaluation and nonpartisan policy research primarily, but not exclusively, for education, human service and non-profit organizations

• In the area of K-12 education policy, CEEP’s mission is to help inform, influence and shape sound policy through effective, nonpartisan research and analysis


CEEP Associates focus their broad spectrum of experience and capabilities to produce high-impact work within the following “Areas of Excellence”:

• Educational Evaluation
  o Early Childhood Education Evaluation
  o Literacy Evaluation
  o Math, Science and Technology Evaluation

• Education Policy Research & Technical Assistance

• Health, Human Services & Community Development Evaluation

Table of Contents

I. Indiana’s Comprehensive Assessment Plan

II. 2008-09 Testing Schedule (formative and summative)

III. Objectives of Efficacy Study in Indiana

IV. Study Design and Methods

V. Year 1 Findings (from Qualitative Analyses)

VI. Future Study Components


I. Indiana’s Comprehensive Assessment Plan

• Adopted by the Indiana State Board of Education on November 1, 2006

• Plan called for moving the Indiana Statewide Testing for Educational Progress-Plus (ISTEP+) from Fall to Spring and the implementation of formative/diagnostic assessments

• Features implemented during the 2008-09 school year:
  o Wireless Generation’s mClass Reading 3D and Math (Grades K-2 formative)
  o CTB/McGraw-Hill’s Acuity Assessment Program (Grades 3-8 formative)
  o Phase-out of the Graduation Qualifying Exam (GQE)
    • Class of 2011 is the last class required to pass the GQE
    • To be replaced with end-of-course assessments in core subject areas
  o Move of ISTEP+ from Fall to Spring (students in grades 3-10 were tested twice during the 2008-09 school year)


II. 2008-09 Fall Testing Schedule (Formative and Summative)


II. 2008-09 Spring Testing Schedule (Formative and Summative)


III. Objectives of Efficacy Study in Indiana

• Objectives of CEEP Study:

o Evaluate the effects of CTB/McGraw-Hill’s Acuity Assessment Program, a formative assessment system, on instructional practice and student achievement during the initial/pilot year of implementation in 510 schools

o Information intended to inform CTB and the IDOE about the kind of support needed to make the implementation of Acuity most effective during subsequent school years


IV. Study Design and Methods

Qualitative Methods

• Fall statewide online survey of administrators and testing coordinators from Acuity and non-Acuity schools to address:

o Why schools did or did not participate in Acuity, their interest in participating in the Acuity Program in future school years, and their present use of other formative assessment systems

o Administrators and testing coordinators’ expectations of their school’s participation in Acuity and its potential impact on instruction and student achievement outcomes, and the levels of their initial training and preparedness

o The survey window was open from September 2 to September 19, 2008

o Respondents included superintendents, principals, and testing coordinators
  • 237 respondents: 70% of school districts in Indiana were represented by at least one survey respondent
  • 99 respondents were from schools using Acuity, while the remainder were from non-Acuity schools


IV. Study Design and Methods (cont.)

Qualitative Methods

• Spring statewide online survey of Acuity schools to measure attitudes and perceptions of teachers and administrators on the benefits and outcomes of their participation in the Acuity Assessment Program during the 2008-09 school year

o The survey window was open from April 20 to May 8, 2009

o Respondents included teachers, testing coordinators, and principals
  • 731 respondents: 87% of school corporations participating in Acuity in Indiana were represented by at least one survey respondent


IV. Study Design and Methods (cont.)

Qualitative Methods (cont.)

• Intensive case study on the use of the Predictive and Diagnostic assessment results to:
  1) inform and enhance instruction
  2) drive improvement in student achievement outcomes as measured by the summative assessment

o The Project Team identified 12 school corporations for the case study that were reflective of Indiana’s student population based on prior ISTEP+ performance, free and reduced-price meal program eligibility data, school size, and local type (urban, suburban, and rural)

o Qualitative procedures including a questionnaire, focus group sessions, and onsite visits were used to inform the case study regarding the extent to which these corporations have implemented the Acuity Program and how they utilized Acuity data to inform or alter instruction


Quantitative Methods

• Completion of a Comparison-Group Study using regression and other statistical techniques to analyze quantitative data collected from the Acuity predictive and diagnostic assessments as well as from Indiana’s ISTEP+ summative assessment

o These analyses will be used to assess the degree to which use of the predictive and diagnostic assessments is associated with increased achievement on the state-required summative assessments in mathematics and English/language arts

o These analyses will account for the varying level of participation by pilot schools selecting among predictive and diagnostic assessments that best meet the needs of their students

o Comparison schools will be matched with Acuity schools using prior ISTEP+ performance, free and reduced-price meal program data, school size, and local type (urban, suburban, and rural)
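
The slides do not specify the matching algorithm, so the following is only a minimal sketch of one common approach: nearest-neighbor matching of each Acuity school to a non-Acuity school within the same local type, using standardized versions of the characteristics listed above. All column names (school_id, uses_acuity, prior_istep, frl_pct, enrollment, locale) are hypothetical.

```python
# Illustrative sketch only: the study's actual matching procedure is not given.
import pandas as pd
from sklearn.neighbors import NearestNeighbors

def match_comparison_schools(schools: pd.DataFrame) -> pd.DataFrame:
    """Match each Acuity school to its most similar non-Acuity school."""
    features = ["prior_istep", "frl_pct", "enrollment"]
    matches = []
    # Match within local type (urban/suburban/rural) so school context is held constant.
    for locale, group in schools.groupby("locale"):
        treated = group[group["uses_acuity"]]
        pool = group[~group["uses_acuity"]]
        if treated.empty or pool.empty:
            continue
        # Standardize features so no single characteristic dominates the distance.
        mean = group[features].mean()
        std = group[features].std(ddof=0).replace(0, 1)
        z_treated = (treated[features] - mean) / std
        z_pool = (pool[features] - mean) / std

        nn = NearestNeighbors(n_neighbors=1).fit(z_pool.values)
        _, idx = nn.kneighbors(z_treated.values)
        matches.append(pd.DataFrame({
            "acuity_school": treated["school_id"].values,
            "comparison_school": pool["school_id"].values[idx[:, 0]],
            "local_type": locale,
        }))
    return pd.concat(matches, ignore_index=True)
```

Matching within local type and on standardized prior performance, poverty, and size mirrors the matching variables named on this slide; the study itself may use a different technique.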


V. Year 1 Findings (from Qualitative Analyses)

A. The End-of-Year survey and comparisons to the Fall survey

B. Case Study Site Visits
  o Pressing Issues
  o Positive Educator Feedback
  o Educator Recommendations and Considerations

C. Teacher and Student Impressions of Acuity (Video)


A. End-of-Year survey and comparisons to Fall survey


Details of End-of-Year Survey

Spring (End-of-Year) statewide online survey
• 731 respondents included 460 (63%) teachers, 119 (16%) principals, 77 (11%) testing coordinators, and 75 (10%) other school personnel

16-question survey examining:
• Educator opinions regarding Acuity Assessment Program content, technology/user experience, professional development, and customer support after use of the system for a full school year

• In addition, the survey was intended to obtain suggestions for improvement of the program and to gauge views regarding the impact of the program on classroom instruction, general student achievement, and student achievement on ISTEP+

• Some questions from the Spring survey were comparable to questions asked in the Fall survey, so comparisons were also made between educator opinions at the beginning of the school year and in the Spring


Survey Results: Overall Educator Opinions of Formative Assessment


Frequency of Formative Assessment Use

• Question 3 of the Spring survey (“How often did you access the Acuity Assessment System?”) is comparable to Question 5 of the Fall survey: “How often do you plan to use the Acuity Assessment System?”

• Respondents most commonly used the program only during administration windows. However, larger shares of respondents than predicted in the Fall reported using the program once/week, multiple times/week, or once/month


Frequency of Use                        Fall Survey (%)   Spring Survey (%)
Once/week                                     3.1               14.2
Multiple times/week                           3.1               13.7
Once/month                                    4.1               15.9
Only during administration windows           53.1               39.8
Other (varied)                               36.6               16.4
Total Number of Respondents                    98                724
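
The presentation reports the Fall and Spring distributions side by side but does not state how (or whether) the shift was tested statistically. Purely as an illustration, the sketch below applies a chi-square test of independence to the table above, approximating response counts from the reported percentages and respondent totals.

```python
# Illustration only: counts are approximated from the reported percentages and
# totals, so the result is indicative rather than exact.
from scipy.stats import chi2_contingency

# Order: once/week, multiple times/week, once/month, admin windows only, other (varied).
fall_pct, fall_n = [3.1, 3.1, 4.1, 53.1, 36.6], 98
spring_pct, spring_n = [14.2, 13.7, 15.9, 39.8, 16.4], 724

fall_counts = [round(p / 100 * fall_n) for p in fall_pct]
spring_counts = [round(p / 100 * spring_n) for p in spring_pct]

# Test whether the frequency-of-use distribution differs between the two surveys.
chi2, p_value, dof, _ = chi2_contingency([fall_counts, spring_counts])
print(f"chi-square = {chi2:.1f}, dof = {dof}, p = {p_value:.3g}")
```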

Other Frequency of Use Issues

• Most Acuity schools (70%) used the online tools exclusively in their administration of the Acuity Assessment Program, and another 22% used both online and paper/pencil tests

• Educators are not using all components of the system as often as necessary to maximize its use and benefits (e.g., only 42% of respondents reported using the Instructional Resources and only 16% reported using the Item Bank)

• Approximately one-third of respondents indicated that they were still not comfortable using class roster and student reports


Perceived Impact on Classroom Instruction

• A majority of respondents (51%) indicated that participation in the Acuity Assessment Program helped to somewhat improve classroom instruction

• 32% felt participation had no impact on instruction
• 14% indicated that participation led to a decreased quality of instruction (due to the multiple assessments scheduled during the school year, both formative and summative, and the “interruption” to teaching time)


Perceived Impact on Instruction             Fall Survey (%)   Spring Survey (%)
Greatly decreased quality of instruction          1.0               2.8
Somewhat decreased quality of instruction         0                11.4
No impact on instruction                          5.1              32.3
Somewhat improved quality of instruction         68.7              51.0
Greatly improved quality of instruction          25.3               2.5
Total Number of Respondents                        99               718

Perceived Impact on Student Achievement

• A plurality (47%) of respondents indicated that participation in the Acuity Assessment Program during the 2008-09 school year had no impact on student achievement; 43% of respondents indicated that participation led to somewhat improved student achievement outcomes

• The opinions about the impact of Acuity on student achievement were not as favorable as the very optimistic outlook expressed by educators participating in the Fall survey


Perceived Impact on Achievement         Fall Survey (%)   Spring Survey (%)
Greatly decreased achievement                 0                 1.1
Somewhat decreased achievement                1.0               8.0
No impact on achievement                      4.0              46.5
Somewhat improved achievement                67.7              42.7
Greatly improved achievement                 27.3               1.7
Total Number of Respondents                    99               723

Perceived Impact on Student Performance on ISTEP+

• A plurality (49%) of respondents indicated that they believe participation in Acuity will lead to improved student performance on the spring 2009 ISTEP+ (note: Spring ISTEP+ had not been fully administered or scored at the time of the survey)

• 42% stated that participation would have no impact on student ISTEP+ scores
• As with student achievement in general, opinions about the impact of Acuity on student performance on ISTEP+ were not as favorable as the very optimistic outlook expressed by educators participating in the Fall survey


Perceived Impact on ISTEP+ Performance   Fall Survey (%)   Spring Survey (%)
Greatly decreased performance                  0                 1.0
Somewhat decreased performance                 1.0               4.9
No impact on performance                       8.1              41.8
Somewhat improved performance                 73.7              48.5
Greatly improved performance                  17.0               3.8
Total Number of Respondents                     99               717

Educators’ Views on the Impact of Acuity on Instruction and/or Student Achievement

(Q 11) Why do you think classroom instruction and/or student achievement declined, improved, or did not change? Of the 568 written responses:

• For those who believed Acuity led to improvement in classroom instruction and/or student achievement, the largest number of respondents (99) cited the reason “Teachers are better able to target teaching material based on demonstrated student needs”

• For those who believed Acuity led to no change in classroom instruction and/or student achievement, the largest number of respondents (66) cited the reason “Because the test is in its first year, its impact is yet to be determined”

• For those who believed Acuity led to a decline in classroom instruction and/or student achievement, the largest number of respondents (112) indicated that the volume of assessment during the school year was overwhelming for teachers and students


(Q 13) What was the most helpful component of the Acuity Assessment Program?

• The largest number of respondents (93 out of 503) indicated the reports were the most helpful component of the Acuity Assessment Program

• 77 respondents indicated the Instructional Resources were the most helpful component


Most Helpful Program Components

Component                                                       Frequency   Percent
Reports                                                              93       18.5
Instructional Resources                                              77       15.3
Provided data to drive instruction                                   60       11.9
Immediate feedback                                                   58       11.5
Custom tests                                                         38        7.6
ISTEP+ preparation                                                   35        7.0
Ability to assign customized, individual assignments                 27        5.4
Alignment with state standards                                       24        4.8
Provided data and resources that were useful for remediation         23        4.6
Total Number of Respondents                                         503

(Q 14) How can CTB/McGraw-Hill enhance the Acuity Assessment Program for future assessment administrations?

• The largest number of respondents (80 out of 417) indicated they would like the program to be more user-friendly (make program use less time-consuming and enhance features to assign skills and Instructional Resources to students and create custom tests)

• 75 respondents indicated they would like the alignment of questions with Indiana Academic Standards and their school’s curriculum to be improved


Suggested Program Improvements

Suggested Improvement                                                          Frequency   Percent
Make the program more user-friendly                                                80       19.2
Improve alignment of questions                                                     75       18.0
Improve the layout and formatting of the assessments                               66       15.8
Provide additional training                                                        58       11.5
Improve reporting features to make them easier to use and understand               30        7.2
ISTEP+ preparation                                                                  35        7.0
Improve the Instructional Resources                                                 27        6.5
Provide a larger bank of questions and more questions per standard/subskill         19        4.6
Increase instructions and audio for students with accommodations                    11        2.6
Total Number of Respondents                                                        417

B. Case Studies

Site Visit Suggestions and Implications


Pressing Issues

Provide more professional development

Simplify and improve reports

Ensure Item Bank includes sufficient number of items for each standard

Make Instructional Resources more engaging and automate the instructional resource assignment process

Adequate computer access in schools must be addressed

Align Diagnostic assessments with school curriculum and pacing guides

Improve developmental appropriateness of assessments

Provide clear guidance on the accommodations for students with special needs


Positive Educator Feedback

General Teacher Feedback:

The program is very user-friendly overall, and the immediate scoring is one of the program’s strengths

Students can easily log on, study, or take tests independently

Student practice (using Instructional Resource activities) is easy to set up, and teachers like the ability to have students practice skills often

Teachers are able to give students individualized practice (both high- and low-level students’ needs are met)

Teachers said they liked the online features of the instant reports


Positive Educator Feedback (cont.)

Miscellaneous:

Taking the tests online has reportedly been enjoyable for students and has improved many students’ computer skills

Most administrators indicated they would like to continue using formative assessment in future years and said it could replace many of the other assessments their students currently take

Teachers reportedly like that the questions are tied to Indiana Academic Standards, are aligned with ISTEP+, and provide standardized tests that can be used with Response to Intervention (RTI) for continuous progress monitoring

School personnel like that the program makes them look closely at their curriculum and ensure that it is standards-driven and appropriate

The program provides teachers and administrators with information about skills they need to address both at the classroom level and school-wide. Many school personnel like that the tool helps them drill down to the standard and indicator levels to identify student needs

The Instructional Resources reportedly provide a great resource for remediation efforts


Educator Recommendations & Considerations

• Reports

o Teachers commented that the reports could be simplified to better meet their needs as classroom teachers

o School personnel would also like to have access to a parent report and a website that provides links to educational websites parents could access at home to remediate their children’s skill deficits

• Information on Teaching Strategies

o Teachers would like information on teaching and learning strategies they can use to address the specific problems identified by the assessments

o Due to issues with adequate computer access in some schools, some teachers would like some suggestions/strategies for remediation that do not require the students to use computers

• Tips From Educators

o Students are given their score after taking the assessment, but they don’t necessarily know what that score means. Some teachers found it beneficial to write down students’ scores and provide feedback immediately when they return to class

o Some schools require every teacher to complete an “Acuity Data Analysis Organizer,” which is a form created to facilitate teacher involvement in the program. The form requires teachers to look at the assessment report for each class and list all standards in the appropriate category based upon percentage. Teachers are then asked to describe their plan of action: “What do you plan on doing with this class to improve on Critical Need Skills?”
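
The organizer described above is a paper form, but its core step of sorting each standard into a need category based on the class report’s percent correct is easy to mimic. The sketch below is purely illustrative; the cut scores and the category names other than “Critical Need” are assumptions, since the presentation does not give them.

```python
# Illustrative sketch of the organizer's categorization step. Thresholds and the
# category names other than "Critical Need" are hypothetical.
def categorize_standards(percent_correct_by_standard):
    """Sort each standard into a need category based on class percent correct."""
    categories = {"Critical Need": [], "Approaching Mastery": [], "Mastered": []}
    for standard, pct in percent_correct_by_standard.items():
        if pct < 60:
            categories["Critical Need"].append(standard)
        elif pct < 80:
            categories["Approaching Mastery"].append(standard)
        else:
            categories["Mastered"].append(standard)
    return categories

# Example with made-up class-level results keyed by hypothetical standard codes.
report = {"3.NS.1": 55.0, "3.NS.2": 72.5, "3.AT.1": 88.0}
for category, standards in categorize_standards(report).items():
    print(f"{category}: {', '.join(standards) or 'none'}")
```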


C. Teacher and Student Impressions of the Acuity Assessment Program (Video)


V. Year 1 Summary of Findings

Educators like the technology platform of the system and tests

No dispute about the relevance of the formative assessments to state academic standards and the summative accountability test

Some concern about alignment of particular formative assessments with the curriculum pacing guides of schools

Teachers want more training on using student and class reports to inform and improve instruction

Educators offered many suggestions for modification to Acuity; CTB appears to be listening


V. Year 1 Summary of Findings (cont.)

Educators expressed high levels of satisfaction with customer service and attentiveness by CTB

Overall, there is some evidence that participation is affecting instruction, though it is not compelling; a “just get it over with” mentality is pervasive in some schools

This is likely to change moving forward as user experience and training increase

Educators generally recognized the benefit of the formative assessment system, but impact on instruction and student achievement outcomes yet to be fully determined.


VI. Future Study Components

• Quantitative analysis of Indiana Acuity data

o What is the association between Acuity participation by Indiana schools and ISTEP performance?

o How is participation in the Acuity program associated with differential gains in ISTEP+ scores between schools that do and do not participate in Acuity?


VI. Future Study Components (cont.)

• Quantitative analysis of Indiana Acuity data

o Regression models will be specified to determine the association between Acuity predictive components and ISTEP+ math and English/language arts scores for each participating grade (see the illustrative sketch after the list below)

o Variables and covariates include:
  • School size (large, small)
  • School location (urban, suburban, rural)
  • Socio-economic status (high, medium, low)
  • Program participation
  • Student scores prior to Acuity participation
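
As one concrete, purely hypothetical reading of this specification, the sketch below fits a separate school-level OLS model per grade using the covariates listed above. The study’s actual model form and variable coding are not given on the slide, and all column names are assumptions.

```python
# Hypothetical sketch of the per-grade regression described above; the actual
# model specification and column names are not given in the presentation.
import pandas as pd
import statsmodels.formula.api as smf

def fit_grade_models(schools: pd.DataFrame) -> dict:
    """Fit one OLS model per grade; `schools` has one row per school per grade."""
    formula = (
        "istep_score ~ acuity_participation + prior_istep_score "
        "+ C(school_size) + C(local_type) + C(ses_level)"
    )
    results = {}
    for grade, grade_data in schools.groupby("grade"):
        results[grade] = smf.ols(formula, data=grade_data).fit()
    return results

# Example: the coefficient on acuity_participation estimates the association
# between program participation and ISTEP+ scores, holding the covariates fixed.
# models = fit_grade_models(school_grade_data)
# print(models[4].summary())
```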


CEEP Contact Information

Terry E. Spradlin, MPA
Associate Director for Education Policy

1900 East Tenth Street
Bloomington, Indiana 47406-7512

Phone: 812-855-4438
Fax: 812-856-5890
http://ceep.indiana.edu
