CRESST Conference
Los Angeles, CA
September 15, 2000

COMPUTER-BASED ASSESSMENT OF COLLABORATIVE PROBLEM SOLVING

Harry O'Neil, University of Southern California & National Center for Research on Evaluation, Standards, and Student Testing (CRESST)
Gloria Hsieh, University of Southern California
Gregory K. W. K. Chung, UCLA/CRESST


Page 1:

Page 2:

CRESST MODEL OF LEARNING

[Diagram: Learning linked to Content Understanding, Communication, Collaboration, Problem Solving, and Self-Regulation]

Page 3:

JUSTIFICATION: WORLD OF WORK

• The justification for collaborative problem solving as a core demand can be found in analyses of both the workplace and academic learning.
  – O'Neil, Allred, and Baker (1997) reviewed five major studies from the workplace readiness literature. Each of these studies identified the need for (a) higher order thinking skills, (b) teamwork, and (c) some form of technology fluency. In four of the studies, problem-solving skills were specifically identified as essential.

Page 4:

JUSTIFICATION: NATIONAL STANDARDS

• New standards (e.g., National Science Education Standards) suggest new assessment approaches rather than multiple-choice exams:

– Deeper or higher order learning

– More robust knowledge representations

– Integration of mathematics and science

– Integration of scientific information that students can apply to new problems in varied settings (i.e., transfer)

– Integration of content knowledge and problem solving

– More challenging science problems

– Learning conducted in groups

Page 5:

PROBLEM-SOLVING DEFINITION

• Problem solving is cognitive processing directed at achieving a goal when no solution method is obvious to the problem solver (Mayer & Wittrock, 1996)

• Problem-solving components

– Domain-specific knowledge (content understanding)

– Problem-solving strategy

• Domain-specific strategy in troubleshooting (e.g., malfunction probability [i.e., fix first the component that fails most often])

– Self-regulation (metacognition [planning, self-monitoring] + motivation [effort, self-efficacy])

Page 6:

PROBLEM SOLVING

[Diagram: Problem Solving comprises Content Understanding, Domain-Dependent Problem-Solving Strategies, and Self-Regulation; Self-Regulation comprises Metacognition (Planning, Self-Monitoring) and Motivation (Effort, Self-Efficacy)]

Page 7:

COMPUTER-BASED PROBLEM-SOLVING TASK (CAETI)

• Metacognition and motivation are assessed by a paper-and-pencil survey instrument (self-regulation)
• Create a knowledge map on environmental science (content understanding)
• Receive feedback on it
• Using a simulated Web site, search for information to improve the map (problem-solving strategy)
  – Relevance, searches, browsing
• Construct a final knowledge map
  – Serves as the outcome content-understanding measure
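The task above centers on scoring a student's knowledge map against an expert's map. A minimal sketch of one plausible scoring rule, counting the propositions (term–link–term triples) shared with the expert map; the helper and the sample data are illustrative, not CRESST's actual scorer:

```python
# Illustrative sketch: treat a knowledge map as a set of propositions
# (term, link, term) and score it by overlap with an expert map.
# This is one plausible rule, NOT CRESST's actual scoring algorithm.

def map_score(student_map, expert_map):
    """Count student propositions that also appear in the expert map."""
    return len(set(student_map) & set(expert_map))

# Hypothetical propositions built from the task's environmental science terms
expert = {
    ("photosynthesis", "requires", "sunlight"),
    ("photosynthesis", "produces", "oxygen"),
    ("bacteria", "causes", "decomposition"),
}
student = {
    ("photosynthesis", "produces", "oxygen"),
    ("oceans", "part of", "water cycle"),
}
print(map_score(student, expert))  # 1
```

Under a rule like this, the final map's score directly serves as the content-understanding outcome measure the slide describes.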

Page 8:

CRESST’S CONCEPT MAPPER

Page 9:

CORRELATION COEFFICIENTS: OUTCOME AND PROCESS VARIABLES (N = 38)

Process variable             Knowledge map score
Relevant information found   .08
Browsing                     .41**
Searching                    .29*
Focused browsing             .52***
Feedback                     .51***

*p < .07. **p < .01. ***p < .001.

Page 10:

CONCLUSIONS

• Computer-based problem-solving assessment is feasible
  – Process/product validity evidence is promising
• Allows real-time scoring/reporting to students and teachers
• Useful for program evaluation and diagnostic functions of testing
• What's next?
  – Generalizability study
  – Collaborative problem solving with a group task

Page 11:

TEAMWORK MODEL

• Taskwork Skills
  Taskwork skills influence how well a team performs on a particular task (e.g., whether or not a team of negotiators reaches an agreement with the other party).
• Teamwork Skills
  Teamwork skills, or team process skills, influence how effective an individual member will be as part of a team:
  – Adaptability: Recognizing problems and responding appropriately
  – Coordination: Organizing team activities to complete a task on time
  – Decision making: Using available information to make decisions
  – Interpersonal: Interacting cooperatively with other team members
  – Leadership: Providing direction for the team
  – Communication: Clear and accurate exchange of information

Page 12:

CRESST ASSESSMENT MODEL OF TEAMWORK

[Diagram: two simulations (a networked union-management negotiation and a networked concept map task) run on networked computers with pre-defined messages; a pre-defined process taxonomy drives real-time assessment and reporting]

Page 13:

CORRELATION BETWEEN TEAM PROCESSES AND OUTCOME MEASURES¹ (N = 26)

Team process      Performance score   Agreement²   Agreement type   Time in negotiations
Adaptability      .31                 -.24         .24              .72***
Coordination      .29                 -.08         .15              .65***
Decision making   .41*                -.04         .41*             .69***
Leadership        .31                 -.02         .19              .65***
Interpersonal     .03                 -.52**       -.05             .56***
Communication     .35                 -.20         .24              .80***

Note. *p < .05. **p < .01. ***p < .001.
¹ High school students.
² 0 = No agreement, 1 = Agreement.

Page 14:

Nonparametric (Spearman) Correlations Between Team Processes and Post Outcome Measures for Concept Map (N = 14)

Team process      Semantic content score   Organizational structure score
Adaptability      -.65**                   -.49*
Coordination      -.31                     -.31
Decision making   -.09                     -.01
Interpersonal     -.33                     -.25
Leadership        .23                      .37
Communication     -.36                     -.24

*p < .05. **p < .01 (two-tailed).

Chung, G. K. W. K., O'Neil, H. F., Jr., & Herl, H. E. (1999). The use of computer-based collaborative knowledge mapping to measure team processes and team outcomes. Computers in Human Behavior, 15, 463-493.
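The correlations reported here are Spearman rank correlations. For reference, the statistic is just the Pearson correlation computed on the ranks of each variable; a self-contained sketch with averaged ranks for ties:

```python
# Minimal Spearman rank correlation: rank each variable (ties get
# averaged ranks), then compute the Pearson correlation of the ranks.

def ranks(xs):
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average 1-based rank for a tied block
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(xs, ys):
    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

print(round(spearman([1, 2, 3, 4, 5], [2, 1, 4, 3, 5]), 2))  # 0.8
```

With samples as small as N = 14, a rank-based statistic like this is a sensible choice, since it makes no normality assumption about the outcome scores.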

Page 15:

PUZZLE

• Unfortunately, the concept mapping study (Chung et al., 1999) found that team processes did not predict team outcomes, unlike in the union-management negotiation task.
• We hypothesized that the lack of useful feedback in the concept mapping task and low prior knowledge may have influenced the results.

Page 16:

ONGOING RESEARCH

• We changed the nature of the task to provide more extensive feedback and to create a real "group" task
• Feedback will be knowledge-of-response feedback versus adaptive knowledge-of-response feedback
• A group task is a task where
  – no single individual possesses all the resources;
  – no single individual is likely to solve the problem or accomplish the task objective without at least some input from others (Cohen & Arechevala-Vargas, 1987)
• One student creates the concept map; the other student does the searches

Page 17:

KNOWLEDGE OF RESPONSE FEEDBACK (Schacter et al. study)

Your map has been scored against an expert's map in environmental science. The feedback tells you:

• How much you need to improve each concept in your map (i.e., A lot, Some, A little).

Use this feedback to help you search to improve your map.

A lot           Some             A little
Atmosphere      Climate          Evaporation
Bacteria        Carbon dioxide   Greenhouse gases
Decomposition   Photosynthesis
Oxygen          Sunlight
Waste           Water cycle
Respiration     Oceans
Nutrients       Consumer
Food chain      Producer

Adaptive Knowledge of Response (the above + the following)

Improvement: You have improved the "food chain" concept from needing "A lot of improvement" to the "Some improvement" category.

Strategy: It is most useful to search for information for the "A lot" and "Some" categories rather than the "A little" category. For example, search for information on "atmosphere" or "climate" first, rather than "evaporation."
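The adaptive variant layers an improvement message and a search-strategy hint on top of the plain category feedback. A hypothetical sketch of how such messages could be generated from per-concept improvement categories; the function and the message wording are invented for illustration, not taken from the Schacter et al. software:

```python
# Hypothetical sketch: turn per-concept improvement categories into
# knowledge-of-response feedback plus an adaptive strategy hint.
# Category names follow the slide; the logic is illustrative only.

def feedback(categories, old_categories=None):
    msgs = []
    if old_categories:
        # Adaptive part 1: report concepts that moved out of "A lot".
        for concept, cat in categories.items():
            if old_categories.get(concept) == "A lot" and cat != "A lot":
                msgs.append(f'You have improved the "{concept}" concept '
                            f'from "A lot" to "{cat}".')
    # Adaptive part 2: steer searches toward the weakest concepts.
    targets = [c for c, cat in categories.items() if cat in ("A lot", "Some")]
    if targets:
        msgs.append("Strategy: search first for information on "
                    + ", ".join(f'"{c}"' for c in targets) + ".")
    return msgs

before = {"food chain": "A lot", "climate": "Some", "evaporation": "A little"}
after = {"food chain": "Some", "climate": "Some", "evaporation": "A little"}
for m in feedback(after, before):
    print(m)
```

The design point the slides make is preserved here: the knowledge-of-response condition would receive only the categories, while the adaptive condition also gets the improvement and strategy messages.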

Page 18:

GENERAL LESSONS LEARNED

• Need a model of cognitive learning (the Big 5)
  – Need submodels of process
    • Problem solving = content understanding + problem-solving strategies + self-regulation
    • Teamwork = adaptability, coordination, decision making, interpersonal skill, leadership, and communication
• Diagnostic, low-stakes environments need real-time administration, scoring, and reporting
• The type of task and the feedback provided may be critical for assessment of collaborative problem solving

Page 19:

BACK-UP SLIDES


Page 20:

ASSESSMENTS FOR TYPES OF LEARNING

Type of learning → Assessment methodology

• Content Understanding (facts, concepts, procedures, principles): explanation tasks, concept mapping, multiple-choice, essays
• Problem Solving (domain-specific knowledge, domain-specific strategies, self-regulation): augmented concept mapping with search task, transfer tasks, motivation (effort, self-efficacy, anxiety), search strategies
• Teamwork and Collaboration (coordination, adaptability, decision making, leadership, interpersonal): collaborative simulation, self-report, observation
• Self-Regulation (planning, self-checking, self-efficacy, effort): self-report, observation, inference
• Communication (comprehension, expression, use of conventions): explanation scored for communication, multimode

Page 21:

DOMAIN SPECIFICATIONS EMBEDDED IN THE UNION/MANAGEMENT NEGOTIATION SOFTWARE

General domain specification              Specific example
Scenario                                  Role play a union-management negotiation by exchanging proposals in a mixed-motive context
Players                                   Student team (3 players) and one management team (computer software)
Students                                  Either expert or novice, individual or team
Management                                Computer software (O'Neil, Allred, & Dennis, 1992)
Priorities                                Offsetting
Moves                                     Reciprocal
Rounds                                    Offer from student team and counter-offer from management team
Subcompetencies                           Propose options; make reasonable compromises
Negotiation issues                        Three in number (e.g., salary), with offsetting priorities
Negotiation outcome measures              Agreement (yes/no), type of agreement (distributive vs. integrative), final counter-offer, time
Teamwork processes                        Adaptability, coordination, decision making, interpersonal, leadership, communication
Cognitive processes (domain-dependent)    Fixed-pie bias
Cognitive processes (domain-independent)  Metacognitive skills
Affective processes (domain-independent)  Effort, anxiety

Page 22:

Domain Specifications Embedded in the Knowledge Mapping Software

General domain specification   This software
Scenario                       Create a knowledge map on environmental science by exchanging messages in a collaborative context
Participants                   Student team (3 members)
Knowledge map terms            Predefined. 18 important ideas identified by content experts: atmosphere, bacteria, carbon dioxide, climate, consumer, decomposition, evaporation, food chain, greenhouse gases, nutrients, oceans, oxygen, photosynthesis, producer, respiration, sunlight, waste, and water cycle
Knowledge map links            Predefined. 7 important relationships identified by content experts: causes, influences, part of, produces, requires, used for, and uses
Type of learning               Content understanding, collaboration
Outcome measures               Semantic content score, organizational structure score, number of terms used, number of links used
Teamwork processes             Adaptability, coordination, decision making, interpersonal, leadership, communication
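Because both the terms and the link labels are predefined, any proposition a team proposes can be checked before it enters the map. A small sketch using the 18 terms and 7 link labels from the specification above; the validation helper itself is an assumption for illustration, not part of the described software:

```python
# The 18 terms and 7 link labels come from the domain specification;
# the validation helper is an illustrative sketch, not the actual
# CRESST knowledge mapping software.

TERMS = {
    "atmosphere", "bacteria", "carbon dioxide", "climate", "consumer",
    "decomposition", "evaporation", "food chain", "greenhouse gases",
    "nutrients", "oceans", "oxygen", "photosynthesis", "producer",
    "respiration", "sunlight", "waste", "water cycle",
}
LINKS = {"causes", "influences", "part of", "produces", "requires",
         "used for", "uses"}

def valid_proposition(source, link, target):
    """A proposition is admissible only if it uses predefined terms and links."""
    return source in TERMS and target in TERMS and link in LINKS

print(valid_proposition("photosynthesis", "produces", "oxygen"))  # True
print(valid_proposition("photosynthesis", "creates", "oxygen"))   # False
```

Constraining the vocabulary this way is what makes automated, real-time scoring of team maps tractable: every possible proposition is drawn from a finite, expert-defined space.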

Page 23:

BOOKMARKING APPLET

Page 24:

SAMPLE METACOGNITIVE ITEMS

The following questions refer to the ways people have used to describe themselves. Read each statement below and indicate how you generally think or feel. There are no right or wrong answers. Do not spend too much time on any one statement. Remember, give the answer that seems to describe how you generally think or feel.

                                                   Almost                    Almost
                                                   Never   Sometimes  Often  Always
a. I figure out my goals and what I need
   to do to accomplish them ....................     Ο        Ο         Ο      Ο
b. I almost always know how much of a task
   I have to complete ..........................     Ο        Ο         Ο      Ο

Note. Formatted as in Section E, Background Questionnaire: Canadian version of the International Adult Literacy Survey (1994). Item a is a planning item; item b is a self-checking item. Kosmicki (1993) reported alpha reliabilities of .86 and .78 for the 6-item versions of these scales, respectively.

Page 25:

TEAMWORK PROCESSES

Process          Definition                                             Example message
Adaptability     Recognizing problems and responding appropriately      "If we give on wages, they might give on pension."
Coordination     Organizing team activities to complete a task on time  "We need to hurry to complete this round."
Decision making  Using available information to make decisions          "We should go higher on health and welfare because it's their second priority."
Interpersonal    Interacting cooperatively with other team members      "We are doing a great job."
Leadership       Providing direction for the team                       "Send the offer."
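Because the messages students can send are predefined, each one can be tagged with its teamwork process in advance, and a team's process measures reduce to tallying the messages it sends. A minimal sketch using the example messages from the table; the tagging scheme and tally function are illustrative assumptions, not the actual CRESST implementation:

```python
# Illustrative sketch: each predefined message carries a teamwork-process
# tag, so scoring a team's process use is a simple tally of sent messages.
from collections import Counter

MESSAGE_TAGS = {
    "If we give on wages, they might give on pension.": "Adaptability",
    "We need to hurry to complete this round.": "Coordination",
    "We should go higher on health and welfare because it's their second priority.": "Decision making",
    "We are doing a great job.": "Interpersonal",
    "Send the offer.": "Leadership",
}

def process_scores(sent_messages):
    """Tally how often each teamwork process appears in a message log."""
    return Counter(MESSAGE_TAGS[m] for m in sent_messages)

log = ["Send the offer.", "We are doing a great job.", "Send the offer."]
print(process_scores(log))  # Counter({'Leadership': 2, 'Interpersonal': 1})
```

This pre-tagged design is what allows the teamwork process scales in the correlation tables to be scored automatically and in real time, with no human coding of free-form chat.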

Page 26:

SCREEN EXAMPLE

Page 27:

FEEDBACK FREQUENCY

• Lowering the percentage of feedback

– slows down the acquisition of concepts

– but facilitates the transfer of knowledge

Page 28:

TIMING OF FEEDBACK

• Delayed-Retention Effect (Delayed > Immediate)

• Classroom or Programmed Instruction Settings (Immediate > Delayed)

• Developmental difference:

– Younger children: Immediate > Delayed

– Older children: Delayed > Immediate

Page 29:

THREE CHARACTERISTICS OF FEEDBACK

• Complexity of feedback
  – What information is contained in the feedback messages
• Timing of feedback
  – When the feedback is given to students
• Representation of feedback
  – The form of the feedback presented (text vs. graphics)

Page 30:

CORRELATIONS BETWEEN TEAMWORK PROCESS SCALES AND OUTCOME MEASURES FOR UNION PARTICIPANTS (N = 48)

Team process      Performance score   Agreement   Agreement type   Time in negotiations
Adaptability       .26                 .09          .20             .53*
Coordination       .05                -.10         -.14             .38*
Decision making    .22                 .06          .35*            .54*
Leadership         .14                 .14          .02             .54*
Interpersonal     -.02                 .00         -.10             .28*
Communication      .17                 .04          .07             .61*

* p < .05.

Page 31:

THE NATURE OF TASKS

Interaction will be positively related to productivity under two conditions:

• Group Tasks
  – No single individual possesses all the resources
  – No single individual is likely to solve the problem or accomplish the task objectives without at least some input from others
• Ill-Structured Problems
  – No clear-cut answers or procedures for the problem