
Measuring Diagnostic Competence: Exploring Error Analysis Ability and the Roles of

Procedural and Conceptual Knowledge

Rebecca McNeil

February 23, 2021

Outline

• Introduction

• Theoretical Background

• Methods

• Results

• Discussion

Defining Error Analysis

• Error analysis involves…

  • Identifying and correcting specific error patterns students exhibit in their work to promote further learning (Larrain & Kaiser, 2019)

  • Remediating mistakes students make based on a lack of conceptual or procedural understanding to improve student learning (Ketterlin-Geller & Yovanoff, 2009)

• A subdimension of diagnostic competence → a subdomain of pedagogical content knowledge (PCK), the necessary "subject matter knowledge for teaching" (Shulman, 1986, p. 9)

The “Problem”

• Under-developed PCK can lead to ineffective mathematics instruction (Capraro, et al., 2005)

• Many studies assess various aspects of PCK, including teachers' diagnostic competence and error analysis ability in mathematics

• But: few attend to the roles of procedural and conceptual knowledge in error analysis (e.g., Phelps-Gregory & Spitzer, 2018), and even fewer report psychometric characteristics

Objectives

• Primary objective: develop an instrument to measure error analysis ability (EAA) as a construct that involves elements of both procedural and conceptual knowledge

• In hopes of…

  1) informing future research on the measurement of EAA, and

  2) developing a diagnostic/learning tool for math teachers

The Importance of Error Analysis

• Error analysis is essential for preparing teachers for effective mathematics instruction & improving student learning/achievement (Cooper, 2009; McGuire, 2003; Morris, et al., 2009).

• However, evidence indicates that teachers struggle with error analysis (Riccomini, 2005), and…

  • underestimate the complexity of simple procedures,

  • make incorrect assumptions about student learning of procedures, and

  • may not adequately represent the knowledge students need to learn certain procedures/concepts (Schoenfeld, 1985)

• Also, preservice teachers…

  • possess mostly procedural knowledge of mathematics,

  • hold many of the same misconceptions as students,

  • struggle with effective planning and evaluation of students' mathematical reasoning (Capraro et al., 2005; Fuller, 1996; Graeber et al., 1989; Ryan & McCrae, 2005; Stacey et al., 2001), and

  • are often influenced more by their own mathematical thinking than by knowledge of student mathematical thinking (Wilson et al., 2013)

• Provided with more resources to practice error analysis, teachers could be better prepared for success in the classroom

Benefits of Error Analysis

• 1) Corrective Instruction

  • Facilitates student learning by allowing for corrective instruction (Brodie, 2014; Ketterlin-Geller & Yovanoff, 2009; Kingsdorf & Krawec, 2014; McGuire, 2013)

• 2) Formative Assessment

  • Teacher gains knowledge of student mathematical thinking to "meet them where they are"

  • Allows for teaching to the Zone of Proximal Development and appropriate scaffolding

• 3) Teacher Reflection/Evaluation

  • Teacher can reflectively evaluate their own mathematical thinking and address problem areas

• Can contribute significantly to teachers’ PCK & thus improve student learning (Hiebert et al., 2007; Santagata & Yeh, 2014)

Conceptual and Procedural Knowledge

• Procedural knowledge is tied to particular problem types; conceptual knowledge usually is not

• Student conceptual understanding & procedural fluency are both highlighted in math education (NCTM, 2000; NCTM, 2014)

  • But: teachers tend to focus on fact-based instruction rather than conceptual or procedural knowledge when providing remedial instruction (Woodward et al., 1999)

Conceptual Knowledge: knowledge of theories, models, and structures

Procedural Knowledge: knowledge of algorithms and subject-specific skills, methods, and techniques, and of determining when it is appropriate to use specific procedures

(Anderson & Krathwohl, 2001)

Conceptual, then Procedural? Vice Versa?

• Unidirectional: conceptual knowledge (CK) often seen as a precursor to procedural knowledge (PK)/procedural fluency

• Other evidence points to a bi-directional relationship (Riccomini, 2014; Rittle-Johnson, et al., 2015)

• Earlier research also supports the theory that CK and PK interact diachronically (Byrnes & Wasik, 1991; Inhelder & Piaget, 1980)

• Iterative/cyclical: Performing procedures → conceptual knowledge → additional procedural knowledge (Hatano & Inagaki, 1986)

Procedural & Conceptual Knowledge in EA

• Knowledge on a continuum: from most concrete to most abstract, with overlap between Conceptual and Procedural (Anderson et al., 2001)

• Thus: it is difficult to distinguish between conceptual and procedural errors, as observed in McNeil (2019), Riccomini (2014), and Rittle-Johnson et al. (2001)

[Knowledge continuum diagram: Factual → Conceptual → Procedural]

Theoretical Background

• Pedagogical Content Knowledge (Shulman, 1986)

  • Error analysis as an element of PCK (Herholdt & Sapire, 2014; Hill, et al., 2008)

• COACTIV teacher competence model (Brunner et al., 2013)

  • Key diagnostic skills for teachers: knowledge of student mathematical thinking, mathematical tasks, and student assessment – elements of PCK

• TEDS-M framework (Döhrmann, et al., 2014)

  • Draws from Weinert's (2001) theory of competence

  • Two facets in the model: cognitive abilities (including PCK) and affective-motivational characteristics

Diagnostic Competence in Math

• Defined with respect to three areas: dispositions, skills/thinking, and performance (Leuders, et al., 2018)

  • Diagnostic thinking: perception, interpretation, and decision-making ("P-I-D")

• In error situations (Hoth, et al., 2016; Larrain & Kaiser, 2019)

  • Math teachers' ability to gather information on student understandings/misconceptions & engage in continual analyses so as to provide appropriate pedagogical responses

• Similar to “diagnostic teaching” (Schoenfeld, 2011)

Error Analysis

• Modified P-I-D model for error analysis (Heinrichs, et al., 2018):

  1. Perceiving/identifying,

  2. Understanding/interpreting, and

  3. Deciding how to proceed

  • Adapted for preservice teacher preparation (Larrain & Kaiser, 2019)

• Other models/frameworks:

  • Three facets of teacher error competence (Seifried and Wuttke, 2010)

  • Reflective cycle of student error analysis (Lannin, et al., 2006)

  • Three-level error analysis problem structure for teachers (McGuire, 2003)

  • Seven-step process for completing error analysis (Howell, Fox, & Morehead, 1993)

Methods

• A measurement instrument was developed using the BEAR Assessment System (BAS; Wilson, 2005), which includes:

1. developing the construct map,

2. designing items,

3. developing outcome spaces, and

4. fitting a measurement model.

Construct Map

Items Design – Topic Area

• Preservice teachers have inadequate knowledge of rational numbers (Lester, 1984; Stacey, et al., 2001), & experience difficulty teaching students about them (Barnett, et al., 1994; Strother, et al., 2016)

• Teachers exhibit misconceptions about multiplication and division of rational numbers (Tirosh et al., 1989)

• Fractions are fundamental to both school math & real-world situations, foundational to success in algebra (Neagoy, 2011)

• Proficiency with fractions has been deemed severely underdeveloped in K-8 mathematics education (National Mathematics Advisory Panel, 2008)

Items Design

• 7 item bundles, 28 items total

• Stimulus for each bundle: 3 incorrectly solved math problems (a student work sample)

• 2 bundles featured inconsistent error patterns; 5 featured consistent error patterns

• A set of directions precedes each bundle

Item Example: Item Bundle 7 (Subtract vs. Multiply)

Outcome Space Example: Items

[Outcome space figure: response categories coded "dP" and "dC", "bc", and "a"]
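To make the outcome-space idea concrete, here is a minimal Python sketch of item-specific scoring. The category codes mirror the labels above ("dP", "dC", "bc", "a"), but the score values and category descriptions are illustrative assumptions, not the instrument's actual rubric.

```python
# Minimal sketch of item-specific outcome-space scoring (illustrative only;
# the real rubric's categories and point values are not reproduced here).
OUTCOME_SPACE_BUNDLE7 = {
    "a": 0,   # hypothetical: error not identified
    "bc": 1,  # hypothetical: error identified and partially interpreted
    "dP": 2,  # hypothetical: remediation grounded in procedural knowledge
    "dC": 2,  # hypothetical: remediation grounded in conceptual knowledge
}

def score_response(raw_code: str, outcome_space: dict) -> int:
    """Map a coded response to its partial-credit score for one item."""
    return outcome_space[raw_code]

print(score_response("dC", OUTCOME_SPACE_BUNDLE7))  # -> 2
```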

Data Collection

• 189 respondents recruited online via email and online forums

• Items and an exit survey administered via Qualtrics

• Demographic data: teaching credential status, number of years of math teaching experience, levels of K-12 math taught, & level of education

• Sample includes current, former, and preservice teachers, as well as some non-teachers

Measurement Model

• Response data scored according to item-specific outcome spaces

• Model: unidimensional partial credit model (Masters, 1982)

• Analysis conducted via ConQuest (Adams, et al., 2015)

• Examined EAP/PV reliability, evidence for validity
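As a minimal sketch of what the partial credit model computes, the function below returns the category probabilities for one item given a respondent's ability. The step-difficulty values in the example are invented for illustration; the study's actual estimates come from the ConQuest calibration.

```python
import numpy as np

def pcm_probs(theta: float, deltas) -> np.ndarray:
    """Category probabilities under Masters' (1982) partial credit model.

    theta  : latent ability for one respondent
    deltas : step difficulties delta_1..delta_m for one item
    Returns P(score = 0), ..., P(score = m).
    """
    # Score x has numerator exp(sum_{k<=x} (theta - delta_k)); score 0 uses
    # the empty sum, i.e. exp(0) = 1.
    numerators = np.exp(np.concatenate(([0.0], np.cumsum(theta - np.asarray(deltas)))))
    return numerators / numerators.sum()

# Illustrative 3-category (0/1/2) item with made-up step difficulties.
print(pcm_probs(0.5, [-0.4, 1.1]))
```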

Results

• Mean years teaching math = 4.33

• 10% credentialed, 25% preservice

• 22 items examined, bundles 5 and 6 excluded

• EAP/PV reliability = 0.71
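For context on the reliability figure: EAP/PV reliability is commonly expressed as the share of total person variance that is true (EAP) variance. Below is a sketch under that common formulation, with all inputs assumed to come from the fitted model; ConQuest's exact computation may differ in detail.

```python
import numpy as np

def eap_pv_reliability(eap_estimates, posterior_sds) -> float:
    """EAP/PV reliability as true-score variance over total variance,
    where total variance = variance of EAP estimates + mean posterior
    (measurement-error) variance. A common formulation, shown here
    for illustration.
    """
    eap_var = np.var(np.asarray(eap_estimates))
    error_var = np.mean(np.square(np.asarray(posterior_sds)))
    return eap_var / (eap_var + error_var)
```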

Validity Evidence

• Content validity: documented instrument development process via the BAS "four building blocks" (Wilson, 2005)

• Response processes: 5 cognitive interviews, exit survey

  • Interviews: most respondents mentioned either procedural or conceptual knowledge when remediating, only sometimes including both

  • Exit survey: flagged completion time and difficulty with bundle 5 (excluded)

Validity Evidence

• Relations to other variables

Validity Evidence

• Internal Structure: Instrument Level

Validity Evidence

• Internal Structure - Item Level:

  • Weighted infit values between 0.7 and 1.3
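For reference, the weighted (information-weighted) infit mean-square reported here can be computed as the sum of squared residuals divided by the sum of model-implied response variances; values near 1.0 indicate good fit. A minimal sketch, assuming observed scores, expectations, and variances are taken from the fitted partial credit model:

```python
import numpy as np

def weighted_infit(observed, expected, variances) -> float:
    """Information-weighted (infit) mean-square for one item:
    sum of squared residuals across respondents divided by the sum
    of model-implied response variances. The 0.7-1.3 band is the
    acceptance range reported on this slide.
    """
    residuals_sq = np.square(np.asarray(observed) - np.asarray(expected))
    return float(residuals_sq.sum() / np.asarray(variances).sum())
```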

Validity Evidence

• Consequences of Test-taking

  • Intended: Enjoyable and potentially useful

• “It was fun to consider how [to] explain fractions”

• “great questions” for focusing on student misunderstandings

• “I liked thinking about these problems”

• “Is there a training curriculum that can show how to identify possible error patterns from students?”

• Others: corrective instruction, formative assessment, teacher reflection

• Unintended: unproductive use or misuse of results

Implications

• The hypothesized construct structure appears to be supported

  • May be multidimensional

    • 2D: procedural error analysis (PEA) vs. conceptual error analysis (CEA)

    • 3D: PEA, CEA, and combined PEA/CEA

• Remediating items (Level 4): easier to provide PK than CK, with the exception of bundle 2

  • Further support for the separation of PEA and CEA

  • PK and CK may interact cyclically, but Level 4 (procedural) is easier to achieve than Level 4 (conceptual)

Limitations

• Mapping of Remediating items to both Levels 4 and 5

• Convenience sampling

• Missing data: out of 198 respondents, only 100 completed all items

Future Directions

• Use of other measurement models

  • Ordered Partition Model (Wilson, 1992)

    • EAA construct levels mapping to items

  • Multidimensional Random Coefficients Multinomial Logit Model (Adams et al., 1997)

    • Explore multidimensionality of EAA (3D?)

• Alternative conceptualization of EAA

  • Separate strands for PEA and CEA

Comments? Questions?

Thank you for your attendance!