Cognitively-Based Assessment Enabled by Technology

Page 1: Cognitively-Based Assessment Enabled by Technology

Cognitively-Based Assessment Enabled by Technology

AERA 44.38, April 2001

UCLA Graduate School of Education & Information Studies
Center for the Study of Evaluation (CSE)

National Center for Research on Evaluation, Standards, and Student Testing (CRESST)

Eva L. Baker

Page 2: Cognitively-Based Assessment Enabled by Technology

Technology Principles for the Design and Use of Educational Information

Problem definition

Assessment

Data interpretation and representation

Examples and inferred principles

Key research

Page 3: Cognitively-Based Assessment Enabled by Technology

Problem

Global notions of assessment design: matched or aligned to standards, illustrating a preferred format, with normed interpretation

Naive view that mere access to data will improve performance

Policy now expects multiple purposes to be served by limited assessment(s)

One-at-a-time mentality

Assessment “systems” remain to be achieved

Page 4: Cognitively-Based Assessment Enabled by Technology

To be Productive in Technology-Based Assessment/Improvement

Systems

Design reusable components: tasks, data modules, scoring protocols, reporting (a sketch follows this slide)

Specify details guiding the integration of system elements

Plan for rapidly changing technology

Include in the system data elements, user models, and interpretive options
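
To make the reusability idea concrete, here is a minimal Python sketch (not CRESST's actual design; every name below is invented) of a task object, a swappable scoring protocol, and a reporting element that works with any scorer:

```python
from dataclasses import dataclass
from typing import Protocol

@dataclass
class Task:
    """One reusable assessment task, keyed to a cognitive demand."""
    task_id: str
    cognitive_demand: str   # e.g., "problem solving"
    prompt: str

class ScoringProtocol(Protocol):
    """Any scorer can be swapped in if it maps a response to a score."""
    def score(self, task: Task, response: str) -> float: ...

class KeywordScorer:
    """Toy expert-based scorer: credit for each expected concept mentioned."""
    def __init__(self, expected: list[str]) -> None:
        self.expected = expected

    def score(self, task: Task, response: str) -> float:
        hits = sum(1 for w in self.expected if w in response.lower())
        return hits / len(self.expected)

def report_line(task: Task, scorer: ScoringProtocol, response: str) -> str:
    """Reusable reporting element: same format no matter which scorer runs."""
    return f"{task.task_id} [{task.cognitive_demand}]: {scorer.score(task, response):.2f}"

task = Task("chem-01", "explanation", "Explain why salt dissolves in water.")
print(report_line(task, KeywordScorer(["polar", "ions"]), "Water is polar, so ions separate."))
```

Because the scorer is only a protocol, an extant automated system or a new expert-based one can be integrated later without touching the task or reporting elements.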

Page 5: Cognitively-Based Assessment Enabled by Technology

Assessment Design Strategy

Start with cognitive demands (a pipeline sketch follows this slide)

Guide task development, test integration, and scoring elements

Implement in subject matter domains or skills (soft or hard)

Monitor precursor or developmental sequence

Review for linguistic appropriateness

Determine key data elements or processes to be collected
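
A hedged sketch of this strategy as a pipeline, in Python. The demand families, task formats, and field names are invented placeholders, not CRESST's taxonomy; the point is only the ordering, with the cognitive demand fixed first and everything else derived from it:

```python
# Hypothetical design pipeline: fix the cognitive demand first, then derive
# task format, scoring elements, and the data to be collected from it.
DEMAND_SPECS = {
    "knowledge representation": {
        "task_format": "concept map",
        "scoring_elements": ["links", "node labels"],
        "data_elements": ["final map", "construction order"],
    },
    "problem solving": {
        "task_format": "multi-step scenario",
        "scoring_elements": ["strategy quality", "solution accuracy"],
        "data_elements": ["intermediate steps", "time per step"],
    },
}

def design_assessment(demand: str, domain: str) -> dict:
    spec = DEMAND_SPECS[demand]            # start with the cognitive demand
    return {
        "domain": domain,                  # implement in a subject-matter domain
        "linguistic_review": "pending",    # reviewed before any field use
        **spec,
    }

print(design_assessment("problem solving", "chemistry"))
```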

Page 6: Cognitively-Based Assessment Enabled by Technology

Families of Cognitive Demands: Both Domain-Dependent and Domain-Independent Features

Page 7: Cognitively-Based Assessment Enabled by Technology

Authoring Tools

Assessment tasks and tests

Data representation

Interpretation

Public reporting

Page 8: Cognitively-Based Assessment Enabled by Technology

CRESST Authoring System Plan: Part 1

Templates based on current model-based assessments (a template-and-review sketch follows this slide)

Web-based with expert and peer review

Automated scoring using extant- or expert-based systems

Correspondence with “content and performance standards” or other system goals
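
A minimal sketch, assuming Python, of how a Web-authored template might carry its standards linkage and an expert/peer review gate. The rule of one expert plus one peer approval is an assumption for illustration, not the plan's stated policy:

```python
from dataclasses import dataclass, field

@dataclass
class TemplateDraft:
    """A task template moving through expert and peer review before release."""
    template_id: str
    standard: str   # the content/performance standard it corresponds to
    reviews: list[tuple[str, bool]] = field(default_factory=list)

    def add_review(self, role: str, approved: bool) -> None:
        self.reviews.append((role, approved))

    @property
    def publishable(self) -> bool:
        approving_roles = {role for role, ok in self.reviews if ok}
        return {"expert", "peer"} <= approving_roles   # hypothetical gate

draft = TemplateDraft("explain-photosynthesis-v1", "Life Science, grades 5-8")
draft.add_review("expert", True)
draft.add_review("peer", True)
print(draft.publishable)  # True
```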

Page 9: Cognitively-Based Assessment Enabled by Technology

Principles for Assessment Design Today

Contain cost by automation

Start with pervasive rather than ephemeral elements (e.g., cognitive demands)

Implement in content and skill domains

Assess and correct linguistic complexity and other likely sources of construct-irrelevant variance

Generate reusable structures, including support by users (teachers, administrators, publishers)

Link to other existing system elements

Page 10: Cognitively-Based Assessment Enabled by Technology

Automation: Part 2

Depends on realization of “Learnome” maps of domains (a toy map is sketched after this slide)

Proofs of concept in literacy, geography, math, technical skills, chemistry

Selection of primitives or objects

Links to Web-enabled content classification

Default conditions supporting validity for purpose, reliability, and flexibility

Interactive user trials in real and controlled settings
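
The “Learnome” idea can be pictured as a directed graph of domain primitives whose edges encode the precursor (developmental) ordering. A toy Python sketch, with fraction primitives invented purely for illustration:

```python
# Toy "Learnome" fragment: each primitive lists its direct precursors.
LEARNOME_FRACTIONS = {
    "counting": [],
    "part-whole": ["counting"],
    "equivalent fractions": ["part-whole"],
    "fraction addition": ["equivalent fractions"],
}

def precursors(concept, seen=None):
    """Every primitive a learner is assumed to need before `concept`."""
    if seen is None:
        seen = set()
    for parent in LEARNOME_FRACTIONS[concept]:
        if parent not in seen:
            seen.add(parent)
            precursors(parent, seen)
    return seen

print(sorted(precursors("fraction addition")))
# ['counting', 'equivalent fractions', 'part-whole']
```

Once such a map exists, task generation and Web-enabled content classification can both key off the same primitives, which is what makes the automation plausible.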

Page 11: Cognitively-Based Assessment Enabled by Technology

Principles for a Rapidly Changing World

Automate design based on Learnome primitives

Technological support for test administration

Automate data collection for on-the-fly technical quality monitoring (a monitoring sketch follows this slide)

Create “add an egg” versions with talkies

Develop comparability indices
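
As one reading of “on-the-fly technical quality monitoring,” here is a Python sketch that recomputes Cronbach's alpha as responses accumulate and flags a form whose reliability drifts low. The threshold and minimum sample size are arbitrary choices, not values from the talk:

```python
import statistics

def cronbach_alpha(item_scores: list[list[float]]) -> float:
    """Alpha from rows of per-item scores (one row per examinee)."""
    k = len(item_scores[0])
    item_vars = [statistics.pvariance([row[i] for row in item_scores])
                 for i in range(k)]
    total_var = statistics.pvariance([sum(row) for row in item_scores])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

responses: list[list[float]] = []

def record(row: list[float], threshold: float = 0.7, min_n: int = 30) -> None:
    """Collect one examinee's item scores; warn once the sample is usable."""
    responses.append(row)
    if len(responses) >= min_n:
        alpha = cronbach_alpha(responses)
        if alpha < threshold:
            print(f"quality flag: alpha={alpha:.2f} below {threshold}")
```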

Page 12: Cognitively-Based Assessment Enabled by Technology

System Data Interpreter(s) and Reporting Systems

Early version (QSP): an intuitive data manager for novice users, supporting disaggregation, query-based analysis, and longitudinal stories for an individual, unit, institution, or program (a query sketch follows this slide)

Multiple purposes: feedback, evaluation, accountability, and individual diagnosis

Additional data: meeting requirements or supporting validity interpretations

Top-down, bottom-up

Massive differences in user knowledge requirements and expectations
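
A sketch of the disaggregation and longitudinal queries such an interpreter supports, in plain Python; the record fields and values are invented for illustration:

```python
from collections import defaultdict
from statistics import mean

# Toy records of the kind a QSP-style data manager might hold.
records = [
    {"school": "A", "grade": 4, "year": 2000, "score": 62},
    {"school": "A", "grade": 4, "year": 2001, "score": 68},
    {"school": "B", "grade": 4, "year": 2001, "score": 71},
]

def disaggregate(rows, *keys):
    """Query-style disaggregation: mean score per combination of keys."""
    groups = defaultdict(list)
    for r in rows:
        groups[tuple(r[k] for k in keys)].append(r["score"])
    return {group: mean(scores) for group, scores in groups.items()}

print(disaggregate(records, "school", "year"))  # unit-level view
print(disaggregate(records, "year"))            # longitudinal story
```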

Page 13: Cognitively-Based Assessment Enabled by Technology

New Version

Expanded user set

User-selected data elements and representations

Local flexibility expanded

Scenarios to simulate consequences of selected actions on groups, schools, or system

Page 14: Cognitively-Based Assessment Enabled by Technology

Report Card Generator

Automated representations of extant data elements (a generator sketch follows this slide)

Iconic, metaphorical, intuitive

Multiple media, including the Web

Institutional, program, and individual levels

Inexpensive and fast
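
One way to read “inexpensive and fast”: a generator that turns extant data elements directly into a Web-ready page. A Python sketch with an iconic star rendering; the layout, names, and numbers are illustrative only:

```python
STAR = "\u2605"  # iconic element: filled star

def report_card(name: str, measures: dict[str, float]) -> str:
    """Render proportions (0-1) as star ratings in a minimal HTML table."""
    rows = "\n".join(
        f"  <tr><td>{label}</td><td>{STAR * round(value * 5)}</td></tr>"
        for label, value in measures.items()
    )
    return f"<h1>{name}</h1>\n<table>\n{rows}\n</table>"

print(report_card("Jefferson Elementary", {"Reading": 0.8, "Math": 0.6}))
```

The same generator can serve institutions, programs, or individuals by changing only the name and the measures passed in.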

Page 15: Cognitively-Based Assessment Enabled by Technology

Current Example: Static Data Representation

Page 16: Cognitively-Based Assessment Enabled by Technology

Next Generation Reporting

Multiple metaphors

Intuitive, dynamic, and progressive

Extensible and portable

User selection of options based on personal mental model

Page 17: Cognitively-Based Assessment Enabled by Technology

http://vv.arts.ucla.edu/

Page 18: Cognitively-Based Assessment Enabled by Technology

Principles for Data Representation and Interpretation

Explicit user models: purposes and element preferences (a sketch follows this slide)

Responsive timing

Local automation of some functions

Representation flexibility

System supports for mental models and partial knowledge
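
A sketch of an explicit user model driving the choice of representation; the roles, purposes, and default rules below are assumptions for illustration, not the system's actual logic:

```python
from dataclasses import dataclass

@dataclass
class UserModel:
    """Explicit model of the report consumer."""
    role: str            # e.g., "parent", "teacher", "administrator"
    purpose: str         # e.g., "diagnosis", "accountability"
    preferred_view: str  # e.g., "icons", "table", "trend line"

def choose_representation(user: UserModel) -> str:
    # Support partial knowledge: novice audiences default to iconic views,
    # accountability uses denser tables; otherwise honor the stated preference.
    if user.role == "parent":
        return "icons"
    if user.purpose == "accountability":
        return "table"
    return user.preferred_view

print(choose_representation(UserModel("parent", "diagnosis", "table")))  # icons
```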

Page 19: Cognitively-Based Assessment Enabled by Technology

Key Research

Learnome mapping and primitive development

Limits of on-the-fly technical quality supports

Flexibility by mental model of user(s)

Updates of “prescription” selection and scenario building

Integrating the user in the representation