Michigan Department of Education Common Core Assessment Option Report



Report on Options for Assessments Aligned with the Common Core State Standards

Submitted to the Michigan Legislature

December 1, 2013


andy Archer, Content Area Literacy Consultant, Office of Education Improvement and Innovation
att Ayotte, Online Assessment Specialist, Office of Systems, Psychometrics, and Measurement Research
ephen Best, Assistant Director, Office of Education Improvement and Innovation
il Chase, Composition and Professional Development Manager, Office of Standards and Assessment
b Clemmons, Executive Director, School Reform Office
ug Collier, Financial Manager, Office of Assessment Business Operations
nce Dean, Director, Office of Standards and Assessment
egg Dionne, Supervisor, Curriculum and Instruction, Office of Education Improvement and Innovation
n Ellis, Communications and Process Specialist, Division of Accountability Services
e Fransted, Department Analyst, Office of Standards and Assessment
nda Forward, Director, Office of Education Improvement and Innovation
l Griffin, Consultant, Urban Education, Grant Requirements, and Credit Recovery, Office of Education Improvement and Innovation
m Griffiths, Test Administration Manager, Office of Standards and Assessment
my Henry, Consultant, Statewide System of Support, Office of Education Improvement and Innovation
th Anne Hodges, Mathematics Consultant, Office of Education Improvement and Innovation
nda Howley, Accessibility Specialist, Office of Standards and Assessment
ve Judd, Director, Office of Systems, Psychometrics, and Measurement Research
nessa Keesler, Deputy Superintendent, Division of Education Services
t King, Senior Project Management Specialist, Office of Systems, Psychometrics, and Measurement Research
m Korkoske, Financial Analyst, Office of Assessment Business Operations
seph Martineau, Deputy Superintendent, Division of Accountability Services
m Mathiot, Composition Editor, Office of Standards and Assessment
ndy Middlestead, Test Development Manager, Office of Standards and Assessment
m Mull, Graphic Designer, Office of Standards and Assessment
n Paul, English Learner Assessment Consultant, Office of Standards and Assessment
ren Ruple, Statewide System of Support Manager, Office of Education Improvement and Innovation
ul Stemmer, National Assessment of Educational Progress State Coordinator, Office of Standards and Assessment
eve Viger, Measurement Research and Psychometrics Manager, Office of Systems, Psychometrics, and Measurement Research
annon Vlassis, Graphic Designer, Office of Standards and Assessment
m Young, Interim Assessment Consultant, Secondary Level, Office of Standards and Assessment

TABLE OF CONTENTS

Introduction
Summative Assessment
    Content and Item Type Alignment
    Transparency and Governance
    Overall Design and Availability
    Test Security
    Scoring and Reporting
    Cost - Standard Product
    Constructed Response Standard Product Cost Implications
Interim Assessment
    Content and Item Type Alignment
    Transparency and Governance
    Overall Design and Availability
    Test Security
    Scoring and Reporting
    Cost - Standard Product
    Constructed Response Standard Product Cost Implications
Accessibility
Technical Requirements
Formative Assessment Resources
Local Implications
Summary Conclusions and Recommendations
References
Appendices
    Appendix A - Survey
    Appendix B - Survey Responses
    Appendix C - Survey Questions Cross Reference Table

ACKNOWLEDGEMENTS

The completion of this report would not have been possible without the dedicated efforts of the Michigan Department of Education (MDE) staff listed above in alphabetical order.

In addition, MDE would like to thank Brian Gong, Executive Director of the Center for Assessment at the National Center for the Improvement of Educational Assessment, for providing an independent review of a draft of this report for the purpose of ensuring its clarity and usefulness.

Finally, MDE would like to express its sincere appreciation to all the service providers that responded to the abbreviated Request for Information (RFI) represented by the survey that forms the basis for this report's content.


COMMON CORE Assessment Options Report

    INTRODUCTION

Michigan students and educators need a rich, next-generation assessment system that is suitable for the numerous, high-stakes purposes toward which it will be applied. The solutions described in this report must be considered in light of how the test results will be used, and the fact that every school, educator and community will feel real consequences of their use, both intended and possibly unintended. Michigan's transition to new, online assessments that include multiple measures designed to capture student achievement and growth is a powerful opportunity to improve the strength of our entire education system. This report represents an important source of information about the various options available to the state.

The Legislative Resolution

Both House Concurrent Resolution 11, passed by the Michigan House of Representatives in September 2013, and the substitute for House Concurrent Resolution 11, passed by the Senate on October 24, 2013 (subsequently adopted by the House in October 2013), included a requirement for the State Board of Education and MDE to develop and submit a report on options for assessments fully aligned with the Common Core State Standards (CCSS). The report was to be completed and submitted to both chambers of the Legislature by December 1, 2013 and be factual and unbiased. In addition, the final resolution expressed a preference for state assessments that are computer-adaptive, provide real-time results, are able to be given twice per year, and assist in the evaluation of individual teachers. Other requirements included availability by the 2014-15 school year in grades 3 through 11.

In order to comply with the final resolution, the primary requirement for assessment solutions described in this report is that they be adequately aligned with Michigan's college- and career-ready standards, in this case the CCSS. Some aspects of alignment (e.g., coverage of the mathematics and reading standards) are relatively straightforward. Other facets are more challenging to capture and have far-reaching implications for categories such as cost. An example of this is the use of Constructed-Response (CR) items: test questions that require students to develop a short or long written response. These are often significantly better than other types of items for measuring complex skills, such as research, problem solving or communicating reasoning, that are found in the CCSS. However, these types of items are often time-consuming for students to answer and are the most expensive and complicated to score. Because CR items have significant implications for a variety of categories presented in this report, references will be made to them in appropriate sections, and overall implications will be described in the summary conclusions and recommendations section.

    MDE Request for Information Process

In order to complete this project by December 1, 2013, the decision was made to develop a survey covering the primary topics of concern and permit any vendor registered to do business in Michigan through the state's Buy4Michigan web system to respond. Development of the survey commenced immediately following the approval of the Senate resolution on October 24, when it was apparent that the final resolution was highly likely to require this report. Through the Buy4Michigan website, 185 entities are registered under the category of educational examination and testing services. The survey was posted to the site; each registered entity received a notification email on October 30 indicating that an opportunity was available for them and that all replies were due in two weeks. Twelve service providers submitted responses, all of which were included in the report.

The survey questions were separated into three distinct categories, to capture information on the three primary types of assessment solutions that are essential elements of a balanced assessment system needed to support educational improvement. The goal of this was to learn what solutions were available or being developed for:

• Summative purposes (e.g., tests like MEAP for high-stakes accountability),

• Interim purposes (e.g., tests administered multiple times throughout the year to measure student growth with relative frequency), and

• Formative purposes (e.g., resources to support real-time measurement of student learning).

It was also important to ask about these different classes to ensure that no service provider with a viable solution in one category was excluded, rather than requiring that each vendor have something for all three categories. MDE is open to the idea that the strongest overall solution in the end may involve selecting the best in class for each type, although this concept introduces substantial risk on aspects such as the comparability of test scores.

Responding to the extensive survey in two weeks was undoubtedly a challenging task for service providers, as the questions were detailed and covered a wide range of topics. MDE is appreciative that so many qualified teams made the time to submit complete responses. One service provider, the Northwest Evaluation Association (NWEA), which currently has products deployed in a number of Michigan schools, chose not to complete the survey and instead submitted a letter explaining some aspects of their assessments and why they elected not to complete the survey. Since they did not submit a full response, NWEA is not included in the report. However, as they were the only vendor to submit such a letter, and many Michigan stakeholders are familiar with what they have to offer, MDE felt it was appropriate to include their letter in Appendix B with the completed survey information from the other entities. The table below provides a summary of the report development schedule.

Report Development Milestones

Senate passes resolution and survey development begins: October 24, 2013
Survey posted to Buy4Michigan website: October 30, 2013
Responses due from service providers: November 13, 2013
Report submitted to Legislature and State Board of Education: December 1, 2013

    Organization and Composition of the Report

Once responses were received, MDE staff members needed to review them all, compile them by category, and assign ratings. In order to complete this task, teams of staff with relevant subject matter expertise were assigned to each category with explicit instructions on how to view the responses and assign ratings. It was determined that a Consumer Reports type of display would be the most user-friendly. The tables displayed in the body of the report provide a snapshot of how each service provider completed the questions germane to each category. There are three important caveats about the ratings assigned:

• Due to the timeline, it was not possible to thoroughly evaluate the quality of evidence provided by each service provider. The highest rating is based on complete responses that included some evidence indicating they were likely to meet all requirements, the middle rating indicates unclear or partial meeting of requirements, and so on; development and rigorous vetting of scoring criteria could not be accommodated. Additionally, the decision was made to limit the number of rating categories to three, to help ensure that even if a longer timeline had been available and a more rigorous, fine-grained (e.g., 5 or 7 categories) scoring system developed, only minor changes in scoring would have likely resulted.

• Responses from service providers were not compared against each other, only against the content of the survey questions. Comparing responses across multiple survey questions related to each category would have required substantially more time in order to evaluate the quality of the response and accompanying evidence.

• It is important to remember that many of the solutions described in this report are under construction, so a true evaluation of their qualities will not be possible until after the first year of operational test administration.

Based on these caveats, it is essential to recognize that this report alone is not sufficient to determine which assessments would truly be viable with regard to measuring the full breadth of career- and college-ready standards, interfacing with state systems, not adding additional burdens to local districts and schools, and cost effectiveness.

A conscious decision was made not to consolidate the ratings for each category into an overall Executive Summary. This process would have diluted the responses provided by each service provider by not properly accounting for the many areas where solutions are partially available or in various stages of development. Based on this, each category should be reviewed on its own merits and given equal weight. Additionally, in a number of cases, the survey questions required a yes or no response, but the opportunity to provide comments for the purpose of further clarification was made available. This introduced nuances, or possible opportunities for negotiation in areas such as control over data or opportunities to have Michigan educators involved with test question development, that could not be captured equitably in each section's table or narrative. The survey responses from each service provider are included in their entirety, unaltered, in Appendix B, if any readers of this report are interested in exploring the comments that accompanied some responses.

In addition to the specific items listed in the final resolution, four key documents guided the development of the survey questions and helped shape the lenses through which the responses were viewed by Department staff. Two of the documents, the Standards for Educational and Psychological Testing and the Standards and Assessments Peer Review Guidance, have been important sources of requirements for technical quality and for ensuring that all state standards and assessment systems meet criteria specified under the federal Elementary and Secondary Education Act. Recently, two other documents have been produced to guide the development of high quality, next-generation assessments and thoroughly define the requirements and responsibilities of clients (e.g., states) and service providers in all aspects of bringing large-scale assessment programs to operational status. Respectively, these are the CCSSO Assessment Quality Principles and the Operational Best Practices for Statewide Large-Scale Assessment Programs, 2013 Edition.


SUMMATIVE: Content & Item Type Alignment

KEY (the graphical rating symbols did not survive text extraction):
• Appears to fully meet requirements based on responses provided
• Unclear if meets or appears to partially meet requirements based on responses provided
• Does not appear to meet requirements based on responses provided
• NR: No response or did not indicate having a summative product

INTRODUCTION

The Common Core State Standards are organized into five content areas: Mathematics, Reading, Writing, Listening and Speaking. They provide goals and benchmarks to ensure that students are achieving certain skills and knowledge by the end of each year. They were carefully written so that students leave high school with a deep understanding of the content and skills they need to be career- and college-ready. It is important, then, that the summative assessments accurately reflect the intended content emphasis and important understandings of each grade level, 3-8, and high school.

Multiple-choice and technology-enhanced item types are critical components of an assessment, and in order to truly assess the rigor and higher-order thinking skills required by the CCSS and career and college readiness, an assessment solution must offer a substantial number of constructed response items as well. Constructed response test questions are essential as they are the only item type that can truly measure certain areas of the CCSS such as writing, research, and problem solving skills. Constructed response items are discussed in detail in the cost section of this report, and their quantity is also covered in both the cost and scoring and reporting sections. Performance tasks provide insights into students' depth of knowledge on important content because they require students to engage in authentic problem solving and to persevere through the multiple steps of the task.

Service Provider Ratings: Content Alignment and Item Types

Content Alignment criteria:
• Content aligned to the CCSS
• Solution addresses all 5 content areas
• Solution addresses all grade levels (G3-G11)
• Qualifications for educators involved in alignment for content, diversity and special populations

Item Type criteria:
• Standard item types (multiple choice and constructed response) will be available
• Diverse set of technology-enhanced item types will be available
• Performance tasks/assessments will be available

Service providers rated: ACT Aspire; Amplify Education, Inc. (NR for all criteria); College Board; CTB/McGraw-Hill; Curriculum Associates LLC; Discovery Education Assessment (NR for all criteria); Houghton Mifflin Harcourt/Riverside; Measured Progress (NR for two criteria); PARCC; Scantron (NR for all criteria); Smarter Balanced; Triumph Learning (NR for all criteria).

[Individual rating symbols did not survive text extraction; the conclusion below summarizes the results.]

CONCLUSION

Of the 12 respondents, two of them, Measured Progress and PARCC, indicated that their solutions included all five subject areas for summative assessments and were able to demonstrate sufficient evidence that their solution is aligned with the CCSS. Smarter Balanced has all but speaking as part of their current solution.

In terms of item types, CTB/McGraw-Hill, PARCC, and Smarter Balanced were able to demonstrate item types that included standard item types, technology-enhanced item types, and performance tasks for all grade levels and content areas.



SUMMATIVE: Transparency & Governance

INTRODUCTION

It is essential that Michigan's educators and students have an assessment system that meets the unique needs of the state while providing opportunities for valid comparison with other states and large-scale assessment systems. This means that a balance must be found between customizability and compromise, with service providers (e.g., with off-the-shelf products) and other states (e.g., with multi-state consortia), in order to find the best solution for these two competing goals. Michigan is one of a few states that have significant experience with this challenge. Our current assessment programs include tests that are state-specific and completely customized (e.g., MEAP) and tests that are not customizable (e.g., the ACT, which is part of the Michigan Merit Examination), as they are administered across many states and therefore must be static for important reasons such as test security and comparability. Over the course of several months of testimony and debate around implementation of the Common Core State Standards, it was apparent that Michigan's ability to retain control over elements such as personally-identifiable student data was crucial. This section of the report includes ratings for responses to survey questions documenting opportunities for Michigan educators and MDE staff to have a direct and substantive influence in the development and operational implementation of the assessments.

The ratings in the table were made in light of the high-stakes purposes that summative tests are designed to inform. These types of tests are the ones where comparability across schools, districts and/or states is paramount, and historically, data from them have formed the foundation for accountability systems. In light of this, it is essential that opportunities exist for Michigan educators to provide input in the development of the products used in the accountability system. This is very important with regard to demonstrating validity, especially when these instruments will be used for accountability metrics and evaluations. As administrators of these systems, it is also critical that MDE staff have a formal governance role to assure the results of the assessments are defensible.

If readers are interested in reviewing the specific survey questions and a particular service provider's response, Appendix B contains the necessary information.

CONCLUSION

As documented in the table, it is evident that there is a limited number of options where the opportunity to strike a balance between a customizable solution for Michigan and a purely off-the-shelf product exists for summative assessments. Based on the responses to the survey questions on this topic, only the College Board, CTB/McGraw-Hill, Houghton Mifflin Harcourt/Riverside, Measured Progress, PARCC and Smarter Balanced appear to be developing solutions that permit robust opportunities for Michigan educator involvement in developing test questions. MDE input into essential areas of governance and design is only apparent with this same group of service providers, albeit with much more opportunity in the case of Houghton Mifflin Harcourt/Riverside, PARCC, and Smarter Balanced. The other service providers clearly indicated that significantly fewer opportunities existed for MDE input in these two areas.

It is also important to note that clear differences exist with regard to Michigan control of student data for these high-stakes summative tests. In light of that critical factor, only CTB/McGraw-Hill, Houghton Mifflin Harcourt/Riverside, PARCC, and Smarter Balanced would be recommended for further consideration based on the responses to this survey.

Service Provider Ratings: Transparency and Governance

Clear opportunities for Michigan educators to participate in:
• Test question development processes
• Bias/sensitivity and accessibility reviews of test questions
• Test question scoring processes

Clear opportunities for MDE involvement in:
• Test design
• Test question scoring, administration and reporting processes
• Technical quality processes

Additional criterion:
• Clear evidence the State of Michigan retains sole and exclusive ownership of all student data

Service providers rated: ACT Aspire; Amplify Education, Inc. (NR for all criteria); College Board; CTB/McGraw-Hill; Curriculum Associates LLC; Discovery Education Assessment (NR for all criteria); Houghton Mifflin Harcourt/Riverside; Measured Progress; PARCC; Scantron (NR for all criteria); Smarter Balanced; Triumph Learning (NR for all criteria).

[Individual rating symbols did not survive text extraction; the conclusion above summarizes the results.]



SUMMATIVE: Overall Design & Availability

INTRODUCTION

Michigan is committed to adopting online assessments, as well as providing paper-and-pencil test options while schools and districts continue to acquire and implement the technology required to administer online assessments. MDE believes strongly that the nature of computer-adaptive assessment, where each student receives a customized test event based on his or her performance, is the best solution for improving how student achievement and growth is measured. This is particularly true in the case of high-achieving students, and students that may be much lower than average due to factors such as disability, learning English, etc.

When reviewing the survey responses, MDE asked each respondent to note if their solution offered a computer-adaptive or computer-based test, as well as a paper-and-pencil option for administration. One key difference between computer-adaptive and computer-based assessments is that a computer-adaptive assessment scientifically selects items based on estimated student ability level, therefore a unique test is crafted for each student. A computer-based test is a fixed-form test (similar to paper-and-pencil solutions) where every student will see the same items. MDE also believes that offering a re-take opportunity for the summative assessments is a key component of our desired assessment system.
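The select-respond-update loop behind computer-adaptive testing described above can be illustrated with a brief sketch. This is not the algorithm of any vendor discussed in this report; it is a generic, greatly simplified example under the Rasch (one-parameter logistic) model, with made-up item names and a crude fixed-step ability update chosen for readability.

```python
import math

def rasch_probability(ability, difficulty):
    """Probability of a correct response under the Rasch (1PL) model."""
    return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

def pick_next_item(ability_estimate, item_bank, administered):
    """Pick the unused item whose difficulty is closest to the current
    ability estimate (the most informative choice under the 1PL model)."""
    candidates = [item for item in item_bank if item not in administered]
    return min(candidates,
               key=lambda item: abs(item_bank[item] - ability_estimate))

def update_ability(ability_estimate, correct, step=0.5):
    """Crude fixed-step update: move the estimate up after a correct
    response and down after an incorrect one. Operational systems use
    maximum-likelihood or Bayesian estimation instead."""
    return ability_estimate + step if correct else ability_estimate - step

# Hypothetical item bank mapping item names to difficulty parameters.
ITEM_BANK = {"q1": -1.0, "q2": 0.0, "q3": 1.0}
```

Real adaptive engines layer content coverage, item-exposure control, and stopping rules on top of this loop, but the core idea is the same: each response refines the ability estimate, which in turn steers the choice of the next question, so every student sees a different test.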

All but one of the solutions presented clearly offered an online administration option, although only three, Curriculum Associates, CTB/McGraw-Hill, and Smarter Balanced, offered an online computer-adaptive delivery of their assessment. Two of the three, CTB/McGraw-Hill and Smarter Balanced, clearly offered a comparable paper-pencil alternative to their proposed computer-adaptive assessment. Two of the solutions presented were not clear in their offerings, as they may have only had a computer-adaptive solution in certain grades or content areas.

Of the solutions presented, ACT Aspire and CTB/McGraw-Hill did not offer a re-take option for their summative assessment.

Based on the information provided in the survey, many of the solutions would be fully available by the 2014-2015 school year as desired. ACT Aspire, CTB/McGraw-Hill, Houghton Mifflin Harcourt/Riverside, PARCC, Scantron, and Smarter Balanced indicated they would have solutions ready within that timeframe.

CONCLUSION

Given the information provided, if Michigan desires to continue in the direction of adopting an online computer-adaptive assessment system with a comparable paper-pencil alternative and a re-take option, it appears that the Smarter Balanced solution would be the only one prepared to meet those requirements. If Michigan decides that a computer-adaptive solution is not a requirement, then the solutions from Houghton Mifflin Harcourt/Riverside and PARCC would also be suitable options.

Service Provider Ratings: Overall Test Design and Availability

Overall Test Design criteria:
• Solution will be available in a computer-adaptive modality
• Solution will be available in a computer-based modality
• Solution will include a comparable paper-pencil option
• Solution will offer a re-test option

Availability criterion:
• Solution will be fully available (including all item types) for the 2014-2015 school year

Service providers rated: ACT Aspire; Amplify Education, Inc. (NR for one criterion); College Board (NR for one criterion); CTB/McGraw-Hill; Curriculum Associates LLC; Discovery Education Assessment (NR for one criterion); Houghton Mifflin Harcourt/Riverside; Measured Progress; PARCC; Scantron (NR for one criterion); Smarter Balanced; Triumph Learning (NR for one criterion).

[Individual rating symbols did not survive text extraction; the conclusion above summarizes the results.]


Common Core Assessment Options Report, December 1, 2013

    SUMMATIVE

Test Security

    KEY: Appears to fully meet requirements based on responses provided

    Unclear if meets or appears to partially meet requirements based on responses provided

    Does not appear to meet requirements based on responses provided

    NR No response or did not indicate having a summative product

INTRODUCTION
School accountability (including designation as priority or focus schools) and educator evaluation are high-stakes uses of Michigan's next generation assessment. Test results from the next generation assessment must therefore be valid for such uses. A key to maintaining validity is the assurance that student performance reflects the learning that students have experienced rather than advance familiarity with test questions or receiving inappropriate assistance in obtaining a high score.

Critical to assuring that student performance reflects student learning are two issues:
• Keeping test questions secure.
• Timely monitoring for security breaches and an ability to respond appropriately.

This section focuses on survey questions providing evidence regarding how well each solution is able to address these two issues.

The number of test forms available for administration to students is critical to keeping test questions secure. A minimum standard is having at least one additional test form to administer to students in the event of a security breach. Even better is to have many forms available, such that multiple forms can be administered in the same classroom. In the optimal situation, each student would receive a unique test form, as is the case for Computer Adaptive Testing (CAT). Providers were identified as meeting this criterion if they meet the minimum standard of having at least one additional test form available in the event of a security breach.

Timely provision of security-related data to MDE is critical in being able to monitor for security breaches and respond appropriately. MDE will need to be provided with timely access to security-related data in order to perform forensic analyses on the data for potential security breaches. Timely analysis is needed to initiate and conduct investigations, and (if possible) provide for re-testing before the testing window closes in the case of a security breach. Providers were asked whether MDE would be provided with timely access to security-related data for analysis.

CONCLUSION
Because maintaining test security is so integral to appropriate high-stakes use of Michigan's next generation assessments, MDE qualified only those that clearly indicated that at least one test form is available for use in the event of a breach of test security and clearly indicated that security-related data would be provided to MDE in a timely manner. The only service providers meeting these criteria based on the responses provided are CTB/McGraw-Hill, PARCC, and Smarter Balanced.

Service Provider Assessment Integrity and Security

Criteria rated: (1) Multiple forms are used in operational testing, with others available for emergency or misadministration; (2) MDE will be provided timely and adequate information needed to monitor and investigate test administration, including student level data and psychometric data to perform forensic and security analyses.

ACT Aspire
Amplify Education, Inc. (NR, NR)
College Board
CTB/McGraw-Hill
Curriculum Associates LLC
Discovery Education Assessment (NR, NR)
Houghton Mifflin Harcourt/Riverside
Measured Progress (NR, NR)
PARCC
Scantron (NR, NR)
Smarter Balanced
Triumph Learning (NR, NR)


    KEY: Appears to fully meet requirements based on responses provided

    Unclear if meets or appears to partially meet requirements based on responses provided

    Does not appear to meet requirements based on responses provided

    NR No response or did not indicate having a summative product

    SUMMATIVE

Scoring & Reporting

INTRODUCTION
Future scoring and reporting functions of state testing programs need to provide (1) faster, virtually instant results back to the classroom; (2) more definition as to the depth of knowledge demonstrated by students on the content and standards being assessed; and (3) flexible reports that offer students, parents, teachers, and administrators the ability to access data specifically customized according to their individual learning, teaching, and/or evaluation needs.

To do this, we need systems designed to take the most efficient advantage of the data available.

Those who responded to the MDE request for information regarding their summative assessment offerings related to the Common Core State Standards were presented with a series of questions in two major areas regarding Scoring and Reporting. The two areas are Data Analysis Capabilities and Scoring, and Assessment Reporting.

In the area of Data Analysis Capabilities and Scoring, the focus was on vendor-provided products and data that would allow the MDE to run analyses verifying that vendor results were sufficient and accurate measures, as well as provide the MDE with additional opportunities for research and evaluation using the supplied data.

There was also emphasis on the amount of input the state would have into the design of student-level and aggregate data sets, statistical procedures, and scoring protocols. Having opportunities at the design level makes it possible to assure the service provider is implementing processes that are the most current and efficient, with an aim to obtaining the highest degree of reliability.

In Assessment Reporting, the areas examined include vendor provisions for:
• reporting at a level sufficient to provide necessary information to educators and MDE, and to satisfy federal and state requirements.
• reporting of assessment results that will be timely (i.e., significantly improved over results from current, paper-pencil tests). The immediacy with which reports can be obtained following testing is of constant concern to our stakeholders at all levels. It is critical that new systems take advantage of the opportunities made available by computer-delivered testing.

Service Provider Data Analysis Capabilities and Scoring / Assessment Reporting

Criteria rated: (1) MDE will have sufficient information for verification and analysis done in-house, using vendor-provided products and data; (2) MDE will have direct influence on student and aggregate level data structures, psychometric procedures, and scoring procedures and protocols; (3) Reporting will be at a level sufficient to provide necessary information to educators, MDE, and to satisfy federal and state requirements; (4) Reporting of assessment results will be timely (i.e., significantly improved over results from current, paper-pencil tests); (5) MDE and schools/districts will be provided with all data underlying the reports and will have the capability to perform further analysis if desired; (6) Students who test with State-approved accommodations will receive the same menus and types of score reports provided to students in the general population.

ACT Aspire
Amplify Education, Inc. (NR on all six criteria)
College Board
CTB/McGraw-Hill
Curriculum Associates LLC
Discovery Education Assessment (NR on all six criteria)
Houghton Mifflin Harcourt/Riverside
Measured Progress (NR on all six criteria)
PARCC (NR on one criterion)
Scantron (NR on all six criteria)
Smarter Balanced
Triumph Learning (NR on all six criteria)

• assurance that MDE and schools/districts will be provided with all data underlying the reports and will have the capability to perform further analyses if desired. Many schools want and need the capability to examine data in ways that serve their unique populations. This also assures that data will be available as needed to those involved in efforts where improvement is a critical priority.
• parity for students who test with State-approved accommodations, to the extent they will receive the same menus and types of score reports provided to students in the general population.

CONCLUSION
The symbols displayed in the table on these pages provide a visual representation of how service providers offering a summative assessment product appear to meet the requirements for scoring and reporting. Based on responses provided, CTB/McGraw-Hill and the Smarter Balanced Assessment Consortium appear to fully meet requirements in all scoring and reporting categories.


    SUMMATIVE

Cost (Standard Product)

INTRODUCTION
This table displays the average per-student cost for the standard summative products offered by each service provider. While most offered thorough solutions for most of the desired grade span (3-11) indicated in the solution, there was some degree of variability. Average cost was generated by taking the mean price for each modality (i.e., computer-based assessment/computer-adaptive assessment (CBA/CAT) or paper/pencil) across all grades. This table is provided as an informational snapshot, to which MDE staff did not attempt to assign ratings; therefore, no conclusions are provided for this section. While these proposed costs give some idea as to which products are likely to be more or less expensive in a general sense, the information gathered by the survey is insufficient to determine an accurate cost model. As noted in the introduction to the report, that level of detailed information can only be produced by going through the full, formal state procurement process. The Grade Levels column of this section's table indicates that service providers reported having items of each type for only those grades. Additional notes about this are included in the Exceptions column.

Additionally, a major driver of both cost and alignment is the number and type of constructed response items. Since the issues around these types of test questions are so pervasive, MDE staff determined it was necessary to display information about them in a separate table on pages 18-19.
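The averaging described above can be sketched as follows. This is a hypothetical illustration only; the function name and the per-grade prices are invented for demonstration and do not come from the survey.

```python
# Hypothetical sketch: the table's average per-student cost is the mean of a
# provider's per-grade prices within one modality (CBA/CAT or paper/pencil).
# All prices below are invented examples, not survey data.
def average_cost(prices_by_grade):
    """Return the mean per-student price across the grades a provider offers."""
    return sum(prices_by_grade.values()) / len(prices_by_grade)

# Invented per-grade CBA/CAT prices for a fictional provider (grades 3-5)
cba_cat_prices = {3: 21.00, 4: 22.00, 5: 23.00}
print(f"{average_cost(cba_cat_prices):.2f}")  # 22.00
```

A single mean per modality is a deliberately coarse summary; providers covering different grade spans are not directly comparable, which is one reason the report declines to draw cost conclusions from it.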

Summative Assessment Per Student Cost (Standard Product)

Average per-student cost is shown for each modality (CBA/CAT and P & P). The table also rated the Types of Test Questions Included for ELA and Math: Multiple Choice, Constructed Response, Technology Enhanced, and Performance Assessment.

ACT Aspire: CBA/CAT 22.00; P & P 28.00; Grades 3-10. Exception: No constructed response test questions at Grade 9.
Amplify Education, Inc.: NR
College Board: CBA/CAT NR; P & P 27.75; Grades 9-12. Exceptions: No constructed response test questions in Mathematics; no constructed response test questions in ELA grades 9 and 10.
CTB/McGraw-Hill: CBA/CAT 27.00; P & P 27.00; Grades 3-11.
Curriculum Associates LLC: CBA/CAT 11.00; P & P NA; Grades 3-12. Exception: No technology enhanced test questions in Grades 9-12.
Discovery Education Assessment: NR
Houghton Mifflin Harcourt/Riverside: CBA/CAT 20.00; P & P 25.00; Grades 3-12. Exception: No performance assessments available for ELA.
Measured Progress: NR
PARCC: CBA/CAT 30.00; P & P 34.00; Grades 3-11.
Scantron: CBA/CAT NR; P & P NR; Grades 3-12.
Smarter Balanced: CBA/CAT 22.50; P & P 15.62; Grades 3-8 and 11.
Triumph Learning: NR

    The Grade Levels column indicates that service providers reported having items of each type for only those grades. Additional notes about this are included in the Exceptions Column.

    KEY: Appears to include this type of test question based on responses provided

    Appears to include this type of question on some, but not all, subjects or grade levels. Please see the comment in the exception column

    Does not appear to include this type of test question based on responses provided

NR No response or did not indicate having a summative product


INTRODUCTION
While multiple-choice and technology-enhanced test questions are types of items that are well-understood, easy to score, and comparatively cheap to produce, fully assessing the rigor and higher-order thinking skills required by career- and college-ready standards requires something more substantive. Any assessment solution that seeks to demonstrate the capability to measure and provide rich student achievement and growth information on constructs such as writing, research, communicating reasoning, and problem solving to the degree described in the CCSS must offer test stimuli where students have the opportunity to do more than select a, b, or c.

Examples of this idea include asking a student to summarize a reading passage in his or her own words, or to write about the process he or she used to solve a math problem rather than just selecting the correct answer. Educators deserve strong data on how students are achieving and growing on these challenging topics. To attempt to learn more about what options are available now or in the near future to support this idea, the survey included questions specific to constructed-response items. Service providers were asked to list the number of constructed-response test questions that come with their standard product; those numbers are displayed in the following table.

CONCLUSION
ACT Aspire, CTB/McGraw-Hill, Houghton Mifflin Harcourt/Riverside, and Smarter Balanced appear to include enough constructed-response items to measure student achievement deeply.

SUMMATIVE

NOTES
As indicated in the text above and mentioned in other appropriate sections of this report, constructed-response test questions are considerably more expensive to score than other types of test questions and student responses. Therefore, the survey included an opportunity for service providers to indicate whether or not they were able to provide additional constructed-response items beyond what they offered in their standard package, along with a corresponding pricing structure. However, the portion of the survey seeking to gather information on this augmented option functioned differently depending on the method the service provider used to complete the survey. As MDE staff began examining responses to this section, it was immediately evident that service providers had interpreted it in dramatically different ways, and the resulting information was not consistent or reliable. Therefore, the decision was made to not include information from the survey questions on augmented options (questions 70-73 in Appendix A).

Summative Assessment
Constructed Response (CR) Test Questions Included in Per Student Cost Estimate

For each service provider, counts are given for Mathematics CR test questions and ELA CR test questions, in the order: Hand Scored Short Answer / Hand Scored Extended Response / AI Scored Short Answer(1) / AI Scored Extended Response(1).

ACT Aspire: Mathematics 0 / 4-5* / 1 / 0; ELA 0 / 1 / 1-2 / 1. *No hand scored extended response test questions in Grade 9 mathematics.
Amplify Education, Inc.: NR
College Board: Mathematics 0 / 0 / 0 / 0; ELA NR / 1** / NR / NR. **No hand scored extended response test questions in Grades 9-10 ELA.
CTB/McGraw-Hill: Mathematics 4 / 1 / 0 / 0; ELA 3 / 1 / 0 / 0.
Curriculum Associates LLC: Mathematics 0 / 0 / 0 / 0; ELA 0 / 0 / 0 / 0.
Discovery Education Assessment: NR
Houghton Mifflin Harcourt/Riverside: Mathematics 3 / 1 / 0 / 0; ELA 3 / 2 / 0 / 0.
Measured Progress: NR
PARCC: Not Specified
Scantron: NR
Smarter Balanced: Mathematics 0 / 4-7 / 0 / 0; ELA 5 / 1 / 0 / 0.
Triumph Learning: NR

(1) Artificial Intelligence

Constructed Response: Standard Product Cost Implications


Content & Item Type Alignment

    INTERIM

    KEY: Appears to fully meet requirements based on responses provided

    Unclear if meets or appears to partially meet requirements based on responses provided

    Does not appear to meet requirements based on responses provided

    NR No response or did not indicate having an interim product

INTRODUCTION
Interim assessments are given periodically throughout the school year. They provide information to educators about student learning and potential success with the summative assessments. The goal is to determine student achievement after instruction while there is still time to remediate areas in which students have done poorly. Michigan desires to have an interim assessment system that mirrors its summative counterpart and that uses an item pool that is independent from the summative assessment item pool.

While multiple-choice and technology-enhanced item types are critical components of an assessment, in order to truly assess the rigor and higher-order thinking skills required by the CCSS and career- and college-readiness standards, an assessment solution must offer a substantial number of constructed-response items as well. Please note that a detailed report on constructed-response items is included in the cost section of this report. The quantity of constructed-response items will also be covered in both the cost and the scoring and reporting sections of this report. Performance tasks provide insights into students' depth of knowledge on important content because they require students to engage in authentic problem solving and to persevere through the multiple steps of the task. Comments on content alignment are based on survey responses only.

CONCLUSION
Amplify Education Inc., CTB McGraw-Hill, Measured Progress, PARCC, and Triumph report having all five content areas represented in interim assessments. Of these solutions, Measured Progress and PARCC demonstrated through the survey that their solutions were aligned to the CCSS. Five solutions (Amplify Education Inc., CTB McGraw-Hill, Measured Progress, Smarter Balanced, and Triumph) report the ability to offer standard item types, technology enhanced items, and performance tasks for the interim assessments they are building.

Service Provider Content Alignment and Item Types

Criteria rated: (1) Content aligned to the CCSS; (2) Solution addresses all 5 content areas; (3) Solution addresses all grade levels (G3-G11); (4) Qualifications for educators involved in alignment for content, diversity, and special populations; (5) Standard item types (multiple choice and constructed response) will be available; (6) Diverse set of technology-enhanced item types will be available; (7) Performance tasks/assessments will be available.

ACT Aspire
Amplify Education, Inc.
College Board (NR on all criteria)
CTB/McGraw-Hill
Curriculum Associates LLC
Discovery Education Assessment
Houghton Mifflin Harcourt/Riverside
Measured Progress
PARCC (NR on three criteria)
Scantron
Smarter Balanced
Triumph Learning (NR on one criterion)


Transparency & Governance

    KEY: Appears to fully meet requirements based on responses provided

    Unclear if meets or appears to partially meet requirements based on responses provided

    Does not appear to meet requirements based on responses provided

    NR No response or did not indicate having an interim product

    INTERIM

INTRODUCTION
It is essential that Michigan's educators and students have an assessment system that meets the unique needs of the state while providing opportunities for valid comparison with other states and large-scale assessment systems. This means that a balance must be found between customizability and compromise, with service providers (e.g., with off-the-shelf products) and other states (e.g., with multi-state consortia), in order to find the best solution for these two competing goals. Michigan is one of a few states that have significant experience with this challenge. Our current assessment programs include tests that are state-specific and completely customized (e.g., MEAP) and tests that are not customizable (e.g., the ACT, which is part of the Michigan Merit Examination), as they are administered across many states and therefore must be static for important reasons such as test security and comparability. Over the course of the months of testimony and debate around implementation of the Common Core State Standards, it was readily apparent that retaining Michigan control over elements such as personally-identifiable student data was crucial. This section of the report includes ratings for responses to survey questions documenting opportunities for Michigan educators and MDE staff to have a direct and substantive influence in the development and operational implementation of the assessments.

The ratings in this section's table were made in light of the purposes that interim tests are typically designed to inform. For example, interim tests can be used to inform educator evaluations if all teachers in the same grade and subject administer them under the same conditions. Since it is likely that interim tests will be used for some accountability systems or purposes in Michigan schools, it is just as important that opportunities for involvement in test development and design be documented for them as for summative tests. If readers are interested in reviewing the specific survey questions and a particular service provider's responses, Appendix B contains the necessary information.

CONCLUSION
In addition to more service providers indicating that they have an aligned interim solution compared to this category for summative assessments, it is clear that more opportunities exist for Michigan participation in test question development and governance activities. Based on the responses to the survey questions on this topic, CTB/McGraw-Hill, Houghton Mifflin Harcourt/Riverside, Measured Progress, PARCC, Scantron, Smarter Balanced, and Triumph Learning all provide substantial opportunities for Michigan educator involvement in developing test questions. Lack of MDE input into essential areas of governance and design eliminates Measured Progress and Triumph Learning from the list. It is also important to note that clear differences exist with regard to Michigan control of student data for these interim tests. Since all the remaining service providers except Scantron affirm Michigan's control over student data, only CTB/McGraw-Hill, Houghton Mifflin Harcourt/Riverside, PARCC, and Smarter Balanced would be recommended for further consideration with regard to their interim assessment solutions.

Service Provider Transparency and Governance

Criteria rated. Clear opportunities for Michigan educators to participate in: (1) Test question development processes; (2) Bias/sensitivity and accessibility reviews of test questions; (3) Test question scoring processes. Clear opportunities for MDE involvement in: (4) Test design; (5) Test question scoring, administration, and reporting processes; (6) Technical quality processes. And: (7) Clear evidence the State of Michigan retains sole and exclusive ownership of all student data.

ACT Aspire
Amplify Education, Inc.
College Board (NR on all criteria)
CTB/McGraw-Hill
Curriculum Associates LLC
Discovery Education Assessment
Houghton Mifflin Harcourt/Riverside
Measured Progress
PARCC
Scantron (NR on one criterion)
Smarter Balanced
Triumph Learning (NR on three criteria)


Overall Design & Availability

    KEY: Appears to fully meet requirements based on responses provided

    Unclear if meets or appears to partially meet requirements based on responses provided

    Does not appear to meet requirements based on responses provided

    NR No response or did not indicate having an interim product

    INTERIM

INTRODUCTION
Michigan is committed to building an interim assessment system that is completely internet-based, as well as providing paper-and-pencil test options while schools and districts continue to acquire and implement the technology required to administer online assessments.

MDE believes strongly that the nature of computer-adaptive assessment, where each student receives a customized test event based on his or her performance, is the best solution for improving how student achievement and growth is measured. This is particularly true in the case of high-achieving students, and of students that may be much lower than average due to factors such as disability, learning English, etc.

Michigan also desires an interim assessment system that would provide a great amount of flexibility and applications for Michigan educators. This would require that an interim assessment is available to be given at least twice a year, which would allow it to be used as an end-of-course or a mid-year checkpoint, as examples, for educators and their students.

When reviewing the survey responses, MDE asked each respondent to note if their solution offered a computer-adaptive or computer-based test, as well as a paper-and-pencil option for administration. One key difference between computer-adaptive and computer-based assessments is that a computer-adaptive assessment scientifically selects items based on estimated student ability level; therefore, a unique test is crafted for each student. A computer-based test is a fixed-form test (similar to paper-and-pencil) where every student will see the same items.

All but one of the solutions presented clearly offered an online administration option, although only three, Curriculum Associates, CTB McGraw-Hill, and Smarter Balanced, offered an online computer-adaptive delivery of their assessment. Of those three, CTB McGraw-Hill and Smarter Balanced clearly offered multiple administration opportunities per year for their interim system.

Based on the information provided in the survey, many of the solutions would be fully available by the 2014-2015 school year as desired. ACT Aspire, CTB McGraw-Hill, Houghton-Mifflin Harcourt Riverside, PARCC, Scantron, and Smarter Balanced all would have solutions ready within that timeframe.

CONCLUSION
Given the information provided, if Michigan desires an interim assessment solution that could be administered in an online computer-adaptive system with a comparable paper-pencil alternative and offer multiple administrations per year, it appears that the solutions presented from CTB McGraw-Hill and Smarter Balanced would be prepared to meet those requirements. If Michigan decides that a computer-adaptive solution is not a requirement, then the solutions from ACT Aspire, Discovery Education, Houghton-Mifflin Harcourt Riverside, Measured Progress, PARCC, and Scantron would also be suitable options.

Service Provider Overall Test Design and Availability

Criteria rated: (1) Solution will be available in a computer-adaptive modality; (2) Solution will be available in a computer-based modality; (3) Solution will include a comparable paper-pencil option; (4) Interim solution(s) will have opportunity for multiple (at least twice per year) administrations; (5) Solution will be fully available (including all item types) for the 2014-2015 school year.

ACT Aspire
Amplify Education, Inc.
College Board (NR)
CTB/McGraw-Hill
Curriculum Associates LLC
Discovery Education Assessment
Houghton Mifflin Harcourt/Riverside
Measured Progress
PARCC
Scantron
Smarter Balanced
Triumph Learning


Test Security

    INTERIM

    KEY: Appears to fully meet requirements based on responses provided

    Unclear if meets or appears to partially meet requirements based on responses provided

    Does not appear to meet requirements based on responses provided

    NR No response or did not indicate having an interim product

INTRODUCTION
School accountability (including designation as priority or focus schools) and educator evaluation are high-stakes uses of Michigan's next generation assessment. Test results from the next generation assessment must therefore be valid for such uses. A key to maintaining validity is the assurance that student performance reflects the learning that students have experienced rather than advance familiarity with test questions or receiving inappropriate assistance in obtaining a high score.

Timely provision of security-related data to MDE is critical in being able to monitor for security breaches and respond appropriately. MDE will need to be provided with timely access to security-related data in order to perform forensic analyses on the data for potential security breaches. Timely analysis is needed to initiate and conduct investigations, and (if possible) provide for re-testing before the testing window closes in the case of a security breach. Providers were asked whether MDE would be provided with timely access to security-related data for analysis.

CONCLUSION
Because maintaining test security is so integral to appropriate high-stakes use of Michigan's next generation assessments, for interim assessments MDE qualified only those that clearly indicated that security-related data would be provided to MDE in a timely manner. The only service providers meeting these criteria based on responses provided were CTB/McGraw-Hill, Houghton Mifflin Harcourt/Riverside, Measured Progress, PARCC, and Smarter Balanced.

Service Provider Assessment Integrity and Security

Criterion rated: MDE will be provided timely and adequate information needed to monitor and investigate test administration, including student level data and psychometric data to perform forensic and security analyses.

ACT Aspire
Amplify Education, Inc.
College Board (NR)
CTB/McGraw-Hill
Curriculum Associates LLC
Discovery Education Assessment
Houghton Mifflin Harcourt/Riverside
Measured Progress
PARCC
Scantron
Smarter Balanced
Triumph Learning


Scoring & Reporting

    KEY: Appears to fully meet requirements based on responses provided

    Unclear if meets or appears to partially meet requirements based on responses provided

    Does not appear to meet requirements based on responses provided

    NR No response or did not indicate having an interim product

    INTERIM

    INTRODUCTION
    Future scoring and reporting functions of state testing programs need to provide (1) faster, virtually instant results back to the classroom; (2) more definition as to the depth of knowledge demonstrated by students on the content and standards being assessed; and (3) flexible testing reports that offer students, parents, teachers, and administrators the ability to access data specifically customized according to their individual learning, teaching, and/or evaluation needs.

    To do this, we need systems designed to take the most efficient advantage of the data available.

    Those who responded to the MDE's request for information regarding their interim assessment offerings related to the Common Core State Standards were presented with a series of questions in two major areas regarding Scoring and Reporting. The two areas are Data Analysis Capabilities and Scoring, and Assessment Reporting.

    In the area of Data Analysis Capabilities and Scoring, the focus was on vendor-provided products and data that would allow the MDE to run analyses verifying that the vendor results were sufficient and accurate measures, as well as provide the MDE with additional opportunities for research and evaluation using the supplied data.

    There was also emphasis on the amount of input the State would have into the design of student-level and aggregate data sets, statistical procedures, and scoring protocols. Having opportunities at the design level makes it possible to assure the service provider is implementing processes that are reliable, efficient, and valid for the intended purposes.

    In Assessment Reporting, the areas examined include vendor provisions for:

    reporting at a level sufficient to provide necessary information to educators, MDE, and to satisfy federal and state requirements.

    reporting of assessment results that will be timely (i.e., significantly improved over results from current, paper-pencil tests). The immediacy with which reports can be obtained following testing is of constant concern to our stakeholders at all levels. It is critical that new systems take advantage of the opportunities made available by computer-delivered testing.

    assurance that MDE and schools/districts will be provided with all data underlying the reports and will have the capability to perform further analysis if desired. Many schools want and need the capability to examine data in ways that serve their unique populations. This also assures that data will be available as needed to those involved in efforts where improvement is a critical priority.

    parity for students who test with state-approved accommodations, to the extent they will receive the same menus and types of score reports provided to students in the general population.

    CONCLUSION
    The symbols displayed in the table on these pages provide a visual representation of how service providers offering an interim assessment product appear to meet the requirements for scoring and reporting. Based on responses provided, CTB/McGraw-Hill and the Smarter Balanced Assessment Consortium appear to fully meet requirements in all scoring and reporting categories.

    Service Provider | Data Analysis Capabilities and Scoring | Assessment Reporting

    Data Analysis Capabilities and Scoring:
    (1) MDE will have sufficient information for verification and analysis done in-house, using vendor-provided products and data.
    (2) MDE will have direct influence on student and aggregate level data structures, psychometric procedures, and scoring procedures and protocols.

    Assessment Reporting:
    (3) Reporting will be at a level sufficient to provide necessary information to educators, MDE, and to satisfy federal and state requirements.
    (4) Reporting of assessment results will be timely (i.e., significantly improved over results from current, paper-pencil tests).
    (5) MDE and schools/districts will be provided with all data underlying the reports and will have the capability to perform further analysis if desired.
    (6) Students who test with State-approved accommodations will receive the same menus and types of score reports provided to students in the general population.

    ACT Aspire

    Amplify Education, Inc.

    College Board NR NR NR NR NR NR

    CTB/McGraw-Hill

    Curriculum Associates LLC

    Discovery Education Assessment

    Houghton Mifflin Harcourt/Riverside

    Measured Progress

    PARCC NR

    Scantron

    Smarter Balanced

    Triumph Learning NR NR


    KEY: Appears to fully meet requirements based on responses provided

    Appears to include this type of question in some, but not all, subjects or grade levels; please see the comment in the Exceptions column

    Does not appear to meet requirements based on responses provided

    NR No response or did not indicate having an interim product

    INTRODUCTION
    This table displays the average per-student cost for the standard interim products offered by each service provider. While most offered thorough solutions for most of the desired grade span (3-11) indicated in the resolution, there was some degree of variability. Average cost was generated by taking the mean price for the computer-based assessment/computer-adaptive assessment (CBA/CAT) solution offered across all grades. This table is provided as an informational snapshot, to which MDE staff did not attempt to assign ratings. While these proposed costs give some idea as to which products are likely to be more or less expensive in a general sense, the information gathered by the survey is insufficient to determine an accurate cost model. As noted in the introduction to the report, that level of detailed information can only be produced by going through the full, formal state procurement process. The Grade Levels column of this section's table indicates that service providers reported having items of each type for only those grades. Additional notes about this are included in the Exceptions column.

    Additionally, a major driver of both cost and alignment is the number and type of constructed-response items. Since issues around these types of test questions are so pervasive, MDE staff determined it was necessary to display information about them in a separate table on pages 32-33.
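The averaging method described above (the mean of a provider's CBA/CAT price across all grades offered) can be sketched as follows; the function name and the per-grade prices are illustrative assumptions, not figures drawn from the survey.

```python
# Sketch of the averaging method: the reported average per-student cost
# is the mean of a provider's CBA/CAT price across the grades it offers.
# The per-grade prices below are hypothetical, for illustration only.

def average_per_student_cost(prices_by_grade):
    """Mean CBA/CAT price across the grades a provider offers."""
    prices = list(prices_by_grade.values())
    return round(sum(prices) / len(prices), 2)

# Hypothetical provider offering grades 3-8 at two price points
example_prices = {3: 10.0, 4: 10.0, 5: 10.0, 6: 12.0, 7: 12.0, 8: 12.0}
print(average_per_student_cost(example_prices))  # 11.0
```

Because providers cover different grade spans, two providers with identical per-grade prices can still report different averages, which is one reason the table is only an informational snapshot.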

    INTERIM

    Interim Assessment Per-Student Cost (Standard Product)

    Columns: Service Provider | Average per-student cost (CBA/CAT) | Grade Levels | Types of Test Questions Included: Multiple Choice (ELA & Math), Constructed Response (ELA & Math), Technology Enhanced (ELA & Math), Performance Assessment (ELA & Math) | Exceptions

    ACT Aspire 7.00 Grades 3 - 12
    Exception: No technology-enhanced test questions in mathematics; no technology-enhanced test questions in ELA grades 9-10

    Amplify Education, Inc. 4.25 Grades 3 - 12

    College Board NR NR NR NR NR NR

    CTB/McGraw-Hill 13.00 Grades 3 - 12

    Curriculum Associates LLC 11.00 Grades 3 - 8

    Discovery Education Assessment 8.00 Grades 3 - 11

    Houghton Mifflin Harcourt/Riverside 10.00 Grades 3 - 12

    Measured Progress 5.70 Grades 3 - 11
    Exception: No performance assessments in grades 9-11

    PARCC NR NR NR NR NR NR

    Scantron 5.00 NR

    Smarter Balanced 4.80 Grades 3 - 11

    Triumph Learning 20.00 Grades 3 - 12

    The Grade Levels column indicates that service providers reported having items of each type for only those grades. Additional notes about this are included in the Exceptions Column.

    Cost: Standard Product


    INTRODUCTION
    While multiple-choice and technology-enhanced test questions are types of items that are well understood, easy to score, and comparatively cheap to produce, fully assessing the rigor and higher-order thinking skills required by career- and college-ready standards requires something more substantive. Any assessment solution that seeks to demonstrate the capability to measure and provide rich student achievement and growth information on constructs such as writing, research, communicating reasoning, and problem solving to the degree described in the CCSS must offer test stimuli where students have the opportunity to do more than select a, b, or c.

    Examples of this idea include asking a student to summarize a reading passage in his or her own words, or to write about the process he or she used to solve a math problem rather than just selecting the correct answer. MDE feels very strongly that educators deserve strong data on how students are achieving and growing on these challenging topics. To attempt to learn more about what options are available now or in the near future to support this idea, the survey included questions specific to constructed-response items. Service providers were asked to list the number of constructed-response test questions that came with their standard product; those numbers are displayed in the following table.

    INTERIM

    NOTES
    As indicated in the text above and mentioned in other appropriate sections of this report, constructed-response test questions are considerably more expensive to score than other types of test questions and student responses. Therefore, the survey included an opportunity for service providers to indicate whether or not they were able to provide additional constructed-response items beyond what they offered in their standard package, and a corresponding pricing structure. However, the portion of the survey seeking to gather information on this augmented option functioned differently depending on the method the service provider used to complete the survey. As a result, service providers interpreted the augmentation section differently and the information was not consistent or reliable. This was discovered as MDE staff began examining responses to this section; it was immediately evident that service providers interpreted this section in dramatically different ways. Therefore, the decision was made not to include information from the survey questions on augmented options (questions 70-73 in Appendix A).

    Interim Assessment: Constructed Response (CR) Test Questions Included in Per-Student Cost Estimate

    Columns: Service Provider | Mathematics CR Test Questions: Hand Scored Short Answer, Hand Scored Extended Response, AI Scored Short Answer¹, AI Scored Extended Response¹ | ELA CR Test Questions: Hand Scored Short Answer, Hand Scored Extended Response, AI Scored Short Answer¹, AI Scored Extended Response¹ | Exceptions

    ACT Aspire 0 0 0 0 0 0 0 0

    Amplify Education, Inc. 66 - 171 16 - 59 0 0 5 - 21 19 - 50 0 0

    College Board NR NR NR NR NR NR NR NR

    CTB/McGraw-Hill 12 - 34 2 - 8* 2 - 20 0 12 - 19 8 - 10 2 - 20 0
    *No hand-scored extended-response test questions for Grade 12 math

    Curriculum Associates LLC 0 0 0 0 0 0 0 0

    Discovery Education Assessment 0 0 0 0 0 0 0 0

    Houghton Mifflin Harcourt/Riverside 0 0 3 1 0 0 3 1

    Measured Progress** 16 8 0 0 0 8 0 0
    **Information is for Grades 3-8 only; NR for High School

    PARCC Not Specified (all columns)

    Scantron 0 0 0 0 0 0 0 0

    Smarter Balanced 2*** 4 - 5**** 0 3*** 5 1 0 0
    ***No short-answer test questions in math grades 3-8; extended-response test questions in math are hand scored for grades 3-8 and AI scored for grades 9-12

    Triumph Learning** 60 - 70 35 - 75 0 0 60 - 70 35 0 0
    **Information is for Grades 3-8 only; NR for High School

    ¹ Artificial Intelligence

    Constructed Response: Standard Product Cost Implications


    KEY: Appears to fully meet requirements based on responses provided

    Unclear if meets or appears to partially meet requirements based on responses provided

    Does not appear to meet requirements based on responses provided

    NR No response

    Accessibility

    INTRODUCTION
    Michigan is committed to the inclusion of ALL students, including students with disabilities (SWD) and English language learners (ELLs), in large-scale assessment and accountability systems. Assessment results should not be affected by disability, gender, ethnicity, or English language ability, and all students should have an opportunity to receive valid scores for summative and interim assessments. To ensure validity, assessments must promote an equitable opportunity for ELLs, SWDs, and general education students. The challenge of how to include all students in these assessments brings accessibility issues to the forefront. The purpose of the Accessibility category is to ensure that all students have the supports and tools they require in order to fully access Michigan's assessment system. There are two types of accessibility features: Assessment Accommodations and Universal Tools. Assessment Accommodations are used to change the way students access a test without changing the content being assessed. In other words, accommodations equalize entry to the test without giving the student an unfair advantage or altering the subject matter. For example, a blind student could access the test in Braille rather than print, and an English language learner may require that test questions be translated into their primary language.

    Universal Tools can be used by any student who needs minor supports, such as a highlighter, magnifying device, or notepad. A series of questions aimed at determining the availability of these accessibility features for summative and interim assessments was included in the survey. Please refer to Appendices A and C.

    Based on the results of the categorization process, the following is a list of the responders, in rank order from top to bottom, who had the most offerings meeting Michigan's accessibility requirements for their respective summative and/or interim assessments:

    Number of accessibility features meeting requirements

    Smarter Balanced 12

    PARCC 11

    ACT Aspire 6

    CTB/McGraw-Hill 5

    Measured Progress 5

    Amplify Education, Inc. 4

    Houghton Mif in Harcourt/Riverside 3

    Triumph Learning 3

    Discovery Education Assessment 2

    Scantron 2

    College Board 1

    Curriculum Associates LLC 1

    CONCLUSION
    Two of the respondents, Smarter Balanced and PARCC, provided sufficient evidence that their products meet all of Michigan's expectations for providing appropriate accommodations on their respective assessments for ELLs and SWDs. None of the respondents met all requirements for universally-provided tools. Smarter Balanced was the only respondent to report that they currently provide the required languages for translation.

    All respondents except for Triumph Learning indicated they meet Michigan's requirements for reporting valid scores for students using State-approved accommodations on their respective assessments.

    Columns: Service Provider | Accommodations for English language learners (ELLs): Embedded text-to-speech; English Glossing; Foreign Language Glossing; Full translation of test questions into a language other than English; Universal accommodations | Accommodations for Students with Disabilities (SWD): Embedded text-to-speech; Embedded video in ASL (human); Refreshable braille; Print-on-demand tactile graphics; Universal accommodations | Accessibility Tools: Universally-provided accessibility tools | Translation Languages: Minimally Spanish, Arabic | Accommodations Reported Scores: Official scores reported for State-approved accommodations

    ACT Aspire

    Amplify Education, Inc.

    College Board NR NR NR NR NR NR NR NR NR NR

    CTB/McGraw-Hill

    Curriculum Associates LLC

    Discovery Education Assessment NR

    Houghton Mifflin Harcourt/Riverside

    Measured Progress

    PARCC

    Scantron

    Smarter Balanced

    Triumph Learning N


    KEY: Appears to fully meet requirements based on responses provided

    Unclear if meets or appears to partially meet requirements based on responses provided

    Does not appear to meet requirements based on responses provided

    NR No response

    Technical Requirements

    INTRODUCTION
    Service providers were asked to indicate the device types and operating systems supported by their Computer Adaptive Testing solution. Service providers were also asked to provide the bandwidth requirement for each testing site. These factors have a significant effect on the level of school technology readiness as well as the overall cost to schools and districts.

    All of the service providers that responded, with the exception of Triumph Learning, indicated that their online testing system supports Windows XP/7 desktop and laptop testing devices. Since Windows XP is still widely used in Michigan schools, it is critical that the online testing system provide support for these devices.

    All service providers that responded indicated that their online testing system supports Mac OS X desktop and laptop testing devices. According to MTRAx, a technology readiness survey tool, OS X devices are also common among Michigan schools. Therefore, it is critical that the online testing system provides support for these devices.

    An increasing number of schools are adopting Chromebook devices for student instructional and assessment use. Seven of the responding service providers indicated that their online testing system supports the Chromebook as a testing device. iPads are also widely used in Michigan schools for instruction and assessment. Six of the responding service providers indicated that their online testing system supports the iPad as a testing device.

    Service providers were asked if MDE would have a formal decision-making role with the ability to have a direct influence on the operating systems and technology platforms supported by their online testing system. Only PARCC and Smarter Balanced indicated MDE would have influence on the operating systems supported. Houghton Mifflin Harcourt/Riverside, PARCC, and Smarter Balanced indicated MDE would have influence on the technology platforms supported.

    CONCLUSION
    Many Michigan schools and districts have begun deployment of a variety of student-level mobile devices, including Chromebooks and tablets. In many schools, these mobile devices are actually replacing the traditional computer lab configuration. Best practice calls for students to use the same device for both instruction and assessment. Therefore, the online testing system needs to support not only desktops and laptops but also Chromebooks and tablets (running iOS, Android, and Windows 8). Additionally, some schools have limited internet bandwidth available, which may limit the number of students that can test simultaneously.

    Of the service providers that responded, Discovery Education Assessment and Smarter Balanced appear to meet the overall criteria regarding technical requirements.

    Columns: Service Provider | The online testing system supports the use of: Windows XP/7 desktops/laptops; Windows 8 desktops/laptops; Mac OS X desktops/laptops; Chrome OS laptops (Chromebooks); iOS tablets (iPads); Android tablets; Windows 8 tablets

    ACT Aspire

    Amplify Education, Inc.

    College Board NR NR NR NR NR NR

    CTB/McGraw-Hill

    Curriculum Associates LLC

    Discovery Education Assessment

    Houghton Mifflin Harcourt/Riverside

    Measured Progress NR NR NR NR NR NR

    PARCC NR NR NR NR NR NR

    Scantron

    Smarter Balanced

    Triumph Learning

    [Chart: Bandwidth Required, in kilobits per second per student, for ACT Aspire, CTB/McGraw-Hill, Houghton Mifflin Harcourt/Riverside, Scantron, Smarter Balanced, Triumph Learning, and Curriculum Associates LLC; vertical axis 0-300]
    *Only those service providers that responded with a bandwidth requirement are displayed.
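One practical use of the per-student figures charted above is estimating how many students a site can test at once. A minimal sketch follows, assuming a hypothetical site bandwidth and per-student requirement; neither number is drawn from the survey responses.

```python
# Rough estimate of how many students a site can test simultaneously:
# divide the bandwidth available for testing by the per-student requirement.
# Both figures below are hypothetical, for illustration only.

def max_simultaneous_testers(site_kbps_available, kbps_per_student):
    """Largest whole number of students whose combined demand fits."""
    return int(site_kbps_available // kbps_per_student)

# Hypothetical site with a 10 Mbps (10,000 Kbps) connection and a
# provider requiring 50 Kbps per student
print(max_simultaneous_testers(10_000, 50))  # 200
```

In practice a site would reserve some headroom for other traffic, so the usable figure is lower than this ceiling.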


    Formative Assessment Resources

    KEY: Appears to fully meet requirements based on responses provided

    Unclear if meets or appears to partially meet requirements based on responses provided

    Does not appear to meet requirements based on responses provided

    NR No response

    INTRODUCTION
    The formative assessment process differs from summative and interim assessments in many ways. Fundamental to understanding these differences is knowing how and when formative assessment is used.

    In 2006, Michigan education representatives collaborated with other state education leaders, the Council of Chief State School Officers (CCSSO), and national and international experts on formative assessment to develop a widely used definition of formative assessment:

    Formative assessment is a process used by teachers and students during instruction that provides feedback to adjust ongoing teaching and learning to improve students' achievements of intended instructional outcomes. (CCSSO FAST SCASS 2006)

    The importance of this definition is that it is compatible with research showing such practices to be an important driver of student learning gains. At the core of the formative assessment process is that it takes place during instruction to support student learning while learning is developing. This is a distinct difference from summative and interim assessment, which are intended to assess students after an extended period of learning. Simply giving students an assessment in the classroom does not mean that the assessment is formative. Use of assessment evidence requires teachers to gain insights into individual student learning in relation to standards, to make instructional decisions, and to use descriptive feedback to guide next steps. In addition, during the formative assessment process, student involvement is an essential component. Teachers seek ways to involve the student in thinking about their thinking (metacognition) to use learning evidence to close the gap and get closer to the intended learning target.

    While formative assessment is not a new idea, teachers are not typically trained on it in depth. Simply putting resources and tools into teachers' hands is not sufficient. Sustained professional development is needed to apply sound formative assessment practices. This is why reviewing MDE staff included Professional Development as a rating category.

    CONCLUSION
    Based on a review of survey responses, it appears that CTB/McGraw-Hill, Discovery, Measured Progress, Smarter Balanced, and Scantron may meet all or most of the stated requirements. Each indicates an online repository, a compatible definition of formative assessment, availability of classroom tools and professional learning resources, and opportunities for Michigan educators to submit additional resources. However, closer examination of resources and services that support educator understanding and use of the formative assessment process is encouraged.

    Columns: Service Provider | Compatible Definition | Online Availability | Variety of Classroom resources/tools/strategies | Professional learning opportunities | Resources aligned to quality criteria | MI educator submission process | Cost Indicators

    ACT Aspire NR NR NR NR NR NR

    Amplify Education, Inc. No additional costs based on responses provided

    College Board NR NR NR NR NR NR NR

    CTB/McGraw-Hill Additional costs based on response provided

    Curriculum Associates LLC NR NR NR NR NR

    Discovery Education Assessment

    No additional costs based on responses