Innovations in Learning Technologies
Career Readiness Issues: Domain-Specific Creative Thinking

Harry O'Neil, CRESST/University of Southern California

2015 CRESST Conference: Integrating Technology, Games, and Assessment in Teaching and Learning
August 19, 2015
CRESST v3 8/12/15 1
Context
• College and Career Readiness
• Common Core State Standards
• Smarter Balanced
• State of California
• Career Readiness Issue
• Domain-Specific Creative Thinking
• Learning Technologies
– Games, simulations
– Instruction, assessment
Problem of Career Readiness
• Many high school and community college graduates lack the necessary knowledge, skills, and attributes (KSAs) to be productive members of a workforce that focuses on high-demand, high-skill, and high-wage careers.
• This development means that much more is expected of even entry-level members of the American workforce.
– Our K-12 schools and community colleges are expected to provide this type of workforce.
Changing World
• Longer lives
• Multiple careers
– 3 or 4 over a lifetime
• Globalization
• Urgent, unforeseen societal problems will always spring up and need to be solved
• Problems and solutions will diffuse more rapidly because of technology and the media
• Problems will be very complex, in unpredictable environments
• Companies need to be agile, innovative, and creative
Increase in Technology
• Results in an increase in cognitive complexity.
– Instead of performing simple, predictable procedural tasks, a worker becomes responsible for inferences, diagnosis, judgment, and decision making, often under severe time pressure.
• These skills are not currently explicitly taught or assessed in our schools.
Definitions of KSAs
• Sources include career readiness frameworks, studies, literature reviews, meetings of experts, etc.
• They range from the early-1990s SCANS work to the more recent CCSSO study, which drew on industry sources and international benchmarks.
• PISA defined collaborative problem solving, which is assessed in PISA 2015.
• NAEP is also designing a collaborative problem solving assessment, which will be added to future NAEP assessments.
Knowledge, Skills, & Attributes
• Knowledge is domain-specific
• Skills are either domain-specific or domain-independent
• Attributes are relatively domain-independent.
– Attributes are considered widely applicable but hard to train.
– The term attribute is usually considered interchangeable with the term competency.
Career Readiness Skills (edited)

State of California (2015):
• Utilize critical thinking
• Demonstrate creativity and innovation
• Communicate
• Develop an education and career plan aligned with personal goals
• Apply technology
• Practice personal health and understand financial literacy
• Act as a responsible citizen
• Model integrity, ethical leadership, and effective management
• Work productively in teams
• Employ valid and reliable research strategies
• Understand the environmental, social, and economic impacts of decisions

CCSSO (2013):
• Critical thinking
• Creativity & innovation
• Communicating effectively
• Problem solving

ACT (2014):
• Communicating effectively
• Metacognition & self-awareness
• Study skills & learning how to learn
• Time/goal management
• Working collaboratively
• Critical thinking
• Collaboration skills
• ICT

O'Neil (2014):
• Adaptive Problem Solving
• Communication
• Decision Making
Career Readiness: Attributes

CCSSO (2013):
• Agency (self-efficacy)
• Initiative
• Resilience
• Adaptability
• Leadership
• Ethical behavior & civic responsibility
• Social awareness & empathy
• Self-control

ACT (2014):
• Self-efficacy
• Fit
• Integrity
• Interests
• Personality
• Self-esteem
• Values

O'Neil (2014):
• Self-efficacy
• Creative thinking
• Metacognition
• Teamwork
A 21st Century Career Readiness Model
[Model diagram: Prerequisites (Domain Knowledge) support Attributes/Competencies (Teamwork, Creative Thinking, Metacognition, Adaptive Expertise, Adaptability, Situation Awareness) and Skills (Communication, Decision Making, Adaptive Problem Solving), which together yield 21st Century Readiness.]
What’s Missing
• Conceptual clarity for the KSAs
• How to measure
– The development of assessments to measure these KSAs in high schools and community colleges.
– How to use performance assessment approaches (e.g., portfolios, games/simulations, knowledge maps)
• Criteria (e.g., validity, fairness, transfer and generalizability, cost, and efficiency)
• Type of technology (e.g., paper-and-pencil, computer)
Prior Research: 21st Century Skills
• Adaptive problem solving involves the ability to invent solutions to problems that the problem solver has not encountered before. In adaptive problem solving, problem solvers must adapt their existing knowledge to fit the requirements of a novel problem (Mayer, 2014). Adaptive problem solving has also been conceptualized by O'Neil (2014) as being composed of content understanding, problem-solving strategies, and self-regulation.
21st Century Skills
• Communication is the timely and clear provision of information and the ability to know whom to contact, when to contact them, and how to report (Hussain, Bowers, & Blasko-Drabik, 2014).
• Decision Making involves the use of situation awareness information about the current situation to help evaluate a course of action and judge its effectiveness. It involves the ability to follow appropriate protocols, follow orders, and take the initiative to complete a mission (Hussain & Bowers, 2014).
• Teamwork is a trait of the individual that predisposes the individual to act as a team member. There are six teamwork processes: (a) adaptability, (b) coordination, (c) decision making, (d) interpersonal skills, (e) leadership, and (f) communication. A complementary definition is provided by Bowers and Cannon-Bowers (2014); their definition of teamwork includes knowledge of teamwork, leadership, mutual performance monitoring, backup behavior, communication, interpersonal skills, and positive teamwork attitudes.
21st Century Skills (cont'd)
• Metacognition is awareness of one's thinking and is composed of two components: planning and self-monitoring. Planning means that one must have a goal (either assigned or self-directed) and a plan to achieve the goal. Self-monitoring means one needs a self-checking mechanism to monitor goal achievement (O'Neil, 1999).
• Situation Awareness involves being aware of what is happening around you and understanding how information, events, and your own actions will affect your goals and objectives, both now and in the near future. More formally, situation awareness can be defined as the perception of elements in the environment within a volume of time and space, the comprehension of their meaning, and the projection of their status in the near future (Endsley, 1995, p. 36).
• Domain-specific creativity
Creative Thinking
• Creative thinking is typically scored on:
• Fluency (ability to produce a large number of ideas)
• Novelty/originality (the number of unusual categories of ideas, judged by expert raters or by statistical infrequency)
• Flexibility (production or use of different categories of ideas)
• Elaboration (ability to fill in details)
Creative Thinking (cont’d)
• Domain-general measures of ideational fluency (e.g., Torrance Tests of Creative Thinking, Torrance, 1974, 1999).
• The problems presented to respondents to be solved are very different from the kinds of problems that people encounter in everyday life.
– Name all the ways to use a newspaper
Domain-Specific Creativity
• Creative thinking was long considered domain-general
• It was assumed that individuals who score high on a test of creative thinking ability would be able to generate divergent or original ideas in a wide variety of domains
• More recently, however, scholars have presented evidence indicating that creative thinking is domain- or task-specific and not domain-general as long believed
• Measures of creative thinking ability typically provide scores on fluency (the number of ideas or solutions), flexibility (the number of different categories of ideas), originality (the number of unusual categories of ideas, judged by expert raters or by statistical infrequency), and/or elaboration (the number of details involved)
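The scoring dimensions above can be sketched in code. This is a minimal sketch under stated assumptions: the idea list, category labels, and norm frequencies are invented for illustration, not a real creativity rubric, and elaboration is omitted because counting details requires a human rater or an NLP model.

```python
# Minimal sketch of fluency, flexibility, and originality scoring.
# The ideas, categories, and norm frequencies are illustrative assumptions.
ideas = [
    ("wrap fish", "packaging"),
    ("line a birdcage", "pet care"),
    ("make a paper hat", "craft"),
    ("insulate a wall", "construction"),  # statistically infrequent category
]
# Hypothetical proportion of respondents producing each category
norm_frequency = {"packaging": 0.40, "pet care": 0.25,
                  "craft": 0.20, "construction": 0.02}

fluency = len(ideas)                          # number of ideas produced
flexibility = len({cat for _, cat in ideas})  # number of distinct categories
# Originality via statistical infrequency: ideas in categories produced by
# fewer than 5% of respondents
originality = sum(1 for _, cat in ideas if norm_frequency[cat] < 0.05)
print(fluency, flexibility, originality)  # → 4 4 1
```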
Domain-Specific Creativity
• In domain-general creative thinking measures, the problems presented to respondents to be solved are very different from the kinds of problems that people encounter in everyday life. For example, when responding to ideational fluency measures, respondents are asked to name all the ways to use a newspaper
• A few measures have been developed that are designed specifically to assess domain-specific creative thinking. The Ariel Real-Life Problem Solving measure (Milgram & Hong, 2000-2009), for example, provides respondents with the opportunity to use domain-specific creative thinking ability in a wide variety of specific real-life situations (e.g., bullying at recess)
Creative Thinking (cont'd)
• Domain-specific creative thinking (cont'd): an item requiring interpersonal problem-solving skills (a problem involving peers).
– "At recess time you see that children are hitting another child in your class. The child feels that the other children do not like him.
– What would you do if you were in his place? What are all the things that it is possible to do?"
Vary Cognitive Demands of the Scenarios to Test
• A construct validity approach
• Basic Level – relatively low need for 21st Century Skills
– Relatively low cognitive complexity
– Primarily procedural
– Limited decision making
– One "best" solution
– Multiple "adequate" solutions
Vary Cognitive Demands of the Scenarios in Order to Test
• Advanced Level – relatively high need for adaptivity
– High cognitive complexity
– Multiple competing priorities
– Unpredictable environments
– No "best" solution
– Many potentially "disastrous" solutions
– Trained procedures/principles may not work
– Creative solutions may be needed
Damage Control Simulation-Game

Problem: To date, there are no proven methodologies for assessing complex performance using games and simulations. To research and develop such a methodology, CRESST developed a damage control simulation-game for the Center for Naval Engineering (CNE) to serve as both a research platform and a tool for immediate use in the CNE schoolhouse.

Approach:
• Create a simulation-game that involves complex performance
• Analyze performance data using a Bayesian Network
• Validate by comparing Bayesian scores to human evaluators
• Use real-time analyses to enable feedback and instruction

Solution: The damage control simulation enables students to conduct damage control operations aboard a simulated DDG involving fires and flooding. Performance is examined using Bayesian Networks to enable both in-game feedback and instructional components and post-game scoring, reporting, and after-action review (AAR).

Technology: Data from the simulation are fed into a Bayesian Network which models Navy damage control doctrine. The Bayesian Network uses the performance data to determine probabilities of knowledge, skills, and future performance for a variety of damage control KSAs, including casualty management, communications, and situation awareness.

Support for instruction and assessment in multiple settings:
• DC Schoolhouse for pre-assessment
• DC Schoolhouse for demonstration, discussion, practice
• Shipboard training and practice for rating up
• Shipboard practice for prevention of skill decay
• Shipboard use by DCTT for drill planning and review

Performer: University of California, Los Angeles, National Center for Research on Evaluation, Standards, and Student Testing. POC: Bill Bewley, [email protected]
Transition sponsor: SWOS
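The Bayesian scoring approach can be illustrated with a single-skill update. This is a minimal sketch under stated assumptions: the evidence probabilities and the single "communications" KSA node are invented for illustration, whereas the real system models full Navy damage control doctrine across many nodes.

```python
# Minimal sketch of Bayesian scoring from in-game evidence.
# Probabilities and the single-node network are illustrative assumptions.

def update(prior, p_correct_if_skilled, p_correct_if_unskilled, correct):
    """Bayes' rule: update P(KSA mastered) after one observed action."""
    if correct:
        num = p_correct_if_skilled * prior
        den = num + p_correct_if_unskilled * (1 - prior)
    else:
        num = (1 - p_correct_if_skilled) * prior
        den = num + (1 - p_correct_if_unskilled) * (1 - prior)
    return num / den

p = 0.5  # prior belief that the "communications" KSA is mastered
for correct in [True, True, False, True]:  # stream of scored game actions
    p = update(p, p_correct_if_skilled=0.85,
               p_correct_if_unskilled=0.30, correct=correct)
print(round(p, 2))  # → 0.83
```

In the full system, one such posterior per KSA would drive both in-game feedback and the post-game report.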
Challenges
• Overlap in construct definitions
– Creativity is part of many constructs
• Lack of reliable and valid trait 21st Century Skills questionnaires for selection purposes
• Few state measures for diagnostic/remedial purposes
– Needed: in-scenario performance measures, e.g., state self-efficacy questionnaires
Challenges (cont’d)
• Trait vs. state
• Lack of psychometrics for simulations/games
• Domain independent vs. domain dependent
Next Steps: Career Readiness Foci
• Knowledge, Skills, and Attributes
• Assessment of KSA
• Contexts in training as well as education
• The transition from high school to the world of work as well as community college to the world of work
• Adapt, based on the CCSSO analysis and other sources, a modified version of the CRESST framework as the conceptual model of the KSAs needed for this world of work
• A focus on entry-level positions
• KSA of teams or groups as well as individuals
Bayes Nets for Diagnosis / Remediation
[Diagram: demographics, prior knowledge from a pre-assessment battery, and a player interaction model feed a Bayesian model of knowledge and performance dependencies; a domain ontology supplies content; identified knowledge and skill gaps and skill decay drive just-in-time training.]
Traits vs. States
• Traits are considered stable characteristics of a person and are relatively difficult to change. A trait is a predisposition to manifest a state. Students are asked to describe how they generally think or feel.
• States refer to the manifestation of traits in a situation. States (e.g., state worry) change in intensity and vary over time. Students are asked to describe either how they feel "right now" or how they felt while they were taking a test.
(Spielberger, 1975)
Adaptive Training Strategies
• Use authentic problems to train cognitive readiness
• Base training on an ontology
• Simulate/demonstrate strategies
• Provide practice (games and simulations)
• Gradually increase problem novelty and variability
• Measure transfer
• Prevent forgetting
– Over-learn at acquisition; predict when to re-teach (skill decay)
• Retention vs. transfer trade-off
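Predicting when to re-teach can be sketched with a standard exponential forgetting curve. This is a hedged sketch under stated assumptions: the stability parameter and the re-teach threshold are invented for illustration, not values from the CRESST work.

```python
import math

# Exponential (Ebbinghaus-style) forgetting curve as one way to schedule
# re-teaching; stability (in days) and the threshold are assumptions.
def retention(days_since_training, stability=10.0):
    """Predicted fraction of the skill retained after a delay."""
    return math.exp(-days_since_training / stability)

def days_until_reteach(threshold=0.7, stability=10.0):
    """Delay at which predicted retention falls to the re-teach threshold."""
    return stability * math.log(1 / threshold)

print(round(days_until_reteach(), 1))  # → 3.6
```

Over-learning at acquisition would appear here as a larger stability value, pushing the re-teach point later.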
Motivation is Based on…
• Interest
– Individuals work harder when they value what they learn and it is important to them
• Self-efficacy
– Individuals work hard when they perceive themselves as capable of doing well
• Attribution
– Individuals work hard when they believe their effort will pay off – attributing success and failure to personal effort, not luck or other uncontrollable factors
• Achievement Goals
– Individuals work hard when their goal is to understand rather than compete
Map/Ontology
• An ontology is a conceptual representation of a domain expressed in terms of concepts and the relationships among the concepts
– Supports eliciting expert knowledge, representing it, and sharing it
– It is used in a number of fields outside of K-12
• Medical, engineering, e-commerce, military
• Software tools are available
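A concept-and-relations ontology can be sketched as a set of triples. This is a minimal sketch: the damage-control concepts and relation names below are illustrative assumptions, not a real domain ontology.

```python
# Minimal sketch of an ontology as (concept, relation, concept) triples.
# Concepts and relations are illustrative assumptions.
ontology = [
    ("fire", "is-a", "casualty"),
    ("flooding", "is-a", "casualty"),
    ("set fire boundaries", "part-of", "fire response"),
    ("fire response", "addresses", "fire"),
]

def related(concept, relation):
    """Concepts linked from `concept` by `relation`."""
    return [tail for head, rel, tail in ontology
            if head == concept and rel == relation]

print(related("fire", "is-a"))  # → ['casualty']
```

Dedicated tools (e.g., Protégé and the OWL language) provide richer versions of the same idea.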
Measurement of Motivation
• Use of PISA/OECD questionnaires
– To measure self-efficacy, effort, and test anxiety as traits
• Constructs measured (PISA, 2000)
– Marsh et al., 2006
• PISA (Program for International Student Assessment)
– Testing program in 40+ countries covering math, literacy, science, problem solving, and motivation
Critical Thinking Definitions (Abrami et al., 2008)
• Critical thinking, or the ability to engage in purposeful, self-regulatory judgment
• The Delphi Committee identified six skills (interpretation, analysis, evaluation, inference, explanation, and self-regulation), 16 sub-skills, and 19 dispositions (including inquisitiveness, open-mindedness, understanding others, and so on)
• These skills include concepts such as interpreting, predicting, analyzing, and evaluating
Critical Thinking Definitions (cont’d)
• The CAAP Critical Thinking Test is a 32-item, 40-minute test that measures students’ skills in clarifying, analyzing, evaluating, and extending arguments. An argument is defined as a sequence of statements that includes a claim that one of the statements, the conclusion, follows from the other statements.
• Each passage is accompanied by a set of multiple-choice test items. A total score is provided for the Critical Thinking Test; no subscores are provided.
Critical Thinking Definitions (cont’d) (Wall Street Journal)
• “The ability to cross-examine evidence and logical argument. To sift through all the noise.” –Richard Arum, New York University sociology professor
• “Thinking about your thinking, while you’re thinking, in order to improve your thinking.” –Linda Elder, educational psychologist; president, Foundation for Critical Thinking
• “Do they make use of information that’s available in their journey to arrive at a conclusion or decision? How do they make use of that?” –Michael Desmarais, global head of recruiting, Goldman Sachs Group
Domain-Specific Creative Thinking Assessment Example within a SWOS MMTT Scenario
• Here is the summary of the training scenario in the Multi-Mission Tactical Trainer that you just completed. Three aircraft tracks were involved: a commercial aircraft (COMAIR), a track from a potentially hostile country called Orange (Orange Track 1), and another track (Orange Track 2) that appeared after about 9 minutes
• Orange Track 1 does not respond to electronic interrogation or radio queries and warnings, and it appears to be coming from country Orange, a potential enemy, so the Tactical Action Officer (TAO) orders an airplane to intercept in order to obtain a visual identification
• While Orange Track 1 is being investigated, Orange Track 2 appears, about 9 minutes after Orange Track 1 appeared
• You are the TAO. What should you do at this time? Write down all the possible options you considered and would consider now
• We would score respondents' answers by matching each trainee's responses to an expert's (e.g., a creative individual at SWOS) and also investigate machine scoring of fluency
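Machine scoring of fluency for such open-ended responses could be as simple as counting distinct options, with flexibility estimated against an expert's answer categories. This is a hedged sketch under stated assumptions: the category keywords and sample answers are invented for illustration, not an actual SWOS scoring key.

```python
# Hedged sketch of machine-scored fluency/flexibility for open-ended
# TAO responses. Expert category keywords are illustrative assumptions.
expert_categories = {"warn", "intercept", "query", "report", "maneuver"}

def fluency(responses):
    """Fluency: number of distinct ideas produced."""
    return len({r.strip().lower() for r in responses if r.strip()})

def flexibility(responses):
    """Flexibility: number of expert categories the ideas touch."""
    return len({c for c in expert_categories
                for r in responses if c in r.lower()})

answers = ["Query Orange Track 2 by radio",
           "Warn Orange Track 2",
           "Report the new track to the CO",
           "warn orange track 2"]  # duplicate of an earlier idea
print(fluency(answers), flexibility(answers))  # → 3 3
```

A real scorer would need more robust idea matching (e.g., paraphrase detection) rather than exact-string and keyword matching.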