2013-03-27 SITE TPACK symposium

MEASURING TPACK International symposium on TPACK Joke Voogt, Petra Fisser, Ayoub Kafyulilo, Douglas Agyei (University of Twente) Johan van Braak, Jo Tondeur, Natalie Pareja Roblin (Ghent University) Denise Schmidt-Crawford, Dale Niederhauser, Wei Wang (Iowa State University) SITE 2013, 27 March 2013, New Orleans


DESCRIPTION

Symposium at the 24th Annual International Conference of the Society for Information Technology & Teacher Education, 27 March 2013. This symposium discusses several ways in which (pre-service) teachers' TPACK can be measured. The first two studies unravel the TPACK survey (Schmidt et al., 2009), a self-report instrument to determine TPACK, and try to revalidate the survey in two different pre-service teacher education contexts: the US and the Netherlands. The third study triangulates findings from the TPACK survey with other instruments to better understand the development of TPACK that resulted from teachers' collaborative design of technology-integrated lessons. The last contribution focuses on measuring transfer of TPACK, as it studies how beginning teachers, who had TPACK training during their pre-service education, demonstrated TPACK in their practice. Similarities and differences in the ways TPACK was measured, and their implications, will be discussed.

TRANSCRIPT

Page 1: 2013-03-27 SITE TPACK symposium

MEASURING TPACK

International symposium on TPACK

Joke Voogt, Petra Fisser, Ayoub Kafyulilo, Douglas Agyei (University of Twente)

Johan van Braak, Jo Tondeur, Natalie Pareja Roblin (Ghent University)

Denise Schmidt-Crawford, Dale Niederhauser, Wei Wang (Iowa State University)

SITE 2013, 27 March 2013, New Orleans

Page 2: 2013-03-27 SITE TPACK symposium

Invited international symposium on TPACK

2010: Strategies for teacher professional development on TPACK

2011: Teachers' assessment of TPACK: Where are we and what is needed?

2012: Developing TPACK around the world: Probing the framework even as we apply it

2013: Measuring TPACK

Page 3: 2013-03-27 SITE TPACK symposium

Conceptualizing TPACK

Strategies to acquire TPACK

Measuring effects

Page 4: 2013-03-27 SITE TPACK symposium

Ghana

Iowa State

Tanzania

The Netherlands

Belgium

Page 5: 2013-03-27 SITE TPACK symposium

Part 1

Introduction to the symposium – Joke Voogt

Measuring TPACK: Further Validation of the Technological Pedagogical Content Knowledge (TPACK) Survey for Preservice Teachers - Denise Schmidt-Crawford, Wei Wang, Dale Niederhauser, Iowa State University

Unraveling the TPACK model: finding TPACK-Core - Petra Fisser & Joke Voogt, University of Twente, The Netherlands; Johan van Braak & Jo Tondeur, Ghent University, Belgium

Discussion with the Audience

Page 6: 2013-03-27 SITE TPACK symposium

MEASURING TPACK: Further Validation of the TPACK Survey for Preservice Teachers

Denise A. Schmidt-Crawford

Dale Niederhauser

Wei Wang

Center for Technology in Learning and Teaching

School of Education

Iowa State University

Page 7: 2013-03-27 SITE TPACK symposium

Validation of TPACK Survey

Schmidt, D. A., Baran, E., Thompson, A. D., Mishra, P., Koehler, M. J., & Shin, T. S. (2009). Technological Pedagogical Content Knowledge (TPACK): The development and validation of an assessment instrument for preservice teachers. Journal of Research on Technology in Education, 42(2), 123-149.

Characteristics:

• 47-item Likert survey

• Seven constructs

• Preservice teachers (elementary & early childhood education)

Page 8: 2013-03-27 SITE TPACK symposium

Sampling of Requests…

Page 9: 2013-03-27 SITE TPACK symposium

Sampling of Requests…

Page 10: 2013-03-27 SITE TPACK symposium

The Problem

• Other researchers using the survey were finding different patterns of results:

• Factors were not stable

• Items were loading on different factors

• Factors were not aligning with the conceptual model

Page 11: 2013-03-27 SITE TPACK symposium

Further Analysis

• Research Context:

• 3-credit, introduction to technology course (15 weeks)

• Required for elementary education and early childhood education majors

• Attend two 1-hour lectures and one 2-hour laboratory session every week

• Participants:

• 534 preservice teachers

• 82% elementary education majors, 16% early childhood education majors, 2% other

• 88% female; 12% male

• 23% freshmen, 40% sophomores, 30% juniors, 7% seniors

• 72% had not yet completed a practicum experience

• Research Procedures:

• Online survey administered at the end of the course (15-25 minutes to complete)

Page 12: 2013-03-27 SITE TPACK symposium

Data Analysis

• Principal components factor analysis (Varimax with Kaiser normalization)

• Internal consistency (Cronbach's alpha)
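The internal-consistency statistic named on this slide, Cronbach's alpha, is simple to compute from an item-score matrix. Below is a minimal numpy sketch on simulated Likert data; the respondent count, item count, and scores are invented for illustration, not taken from the study:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)    # variance of the sum score
    return k / (k - 1) * (1 - item_var / total_var)

# Simulated 5-point Likert responses: 50 respondents, 3 items of one construct,
# all driven by a common trait so the items correlate
rng = np.random.default_rng(0)
trait = rng.integers(1, 6, size=(50, 1))
scores = np.clip(trait + rng.integers(-1, 2, size=(50, 3)), 1, 5)

alpha = cronbach_alpha(scores)
print(round(alpha, 2))  # high, because the items share a common trait
```

Because the three items share a common trait with only small noise, the resulting alpha is well above the conventional .70 threshold the deck's reliability tables are judged against.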

Page 13: 2013-03-27 SITE TPACK symposium

TPACK as an Exploded Abstraction

[Diagram: the T, P and C circles of the TPACK Venn diagram, pulled apart]

Page 14: 2013-03-27 SITE TPACK symposium

Results

1. TK, PK, TPK, TCK factors remained the same.

TPACK Construct   Total Items   Eigen Value
TK                6             .877
PK                7             .921
TPK               9             .902
TCK               5             .879

Page 15: 2013-03-27 SITE TPACK symposium

Results

2. CK is messy!

TPACK Construct   Total Items   Combined Items   Eigen Value
CK                12            3                .854

Comment:

I can use a ____________ way of thinking.

I have various ways and strategies for developing my understanding of __________.

I have sufficient knowledge about _____________.

Page 16: 2013-03-27 SITE TPACK symposium

Results

3. PCK – Math item dropped out.

TPACK Construct   Total Items   Eigen Value
PCK               3             .865

Comment:

Indicated that the participants were not answering the "math" questions in ways that aligned with the other content areas.

Page 17: 2013-03-27 SITE TPACK symposium

Results

4. TPACK – Two factors emerged (content, general).

TPACK Construct   Total Items   Eigen Value
Content           4             .885
General           4             .917

Page 18: 2013-03-27 SITE TPACK symposium

Measuring TPACK

• Collecting information about preservice teachers' perception of what they know

• Direct measure of self-perception

• Indirect measure of knowledge

• Start using direct measures for some TPACK constructs

• e.g., CK - content-specific measures, PK - Praxis test

• Program evaluation - provides metrics at key places in the teacher education program

• What is working? What is not? (Interventions)

• Start looking at TPACK as a dynamic model - What kinds of experiences can we provide to build "overlap"?

Page 19: 2013-03-27 SITE TPACK symposium

Returning to the Problem

• Using the survey with 'other' populations (e.g., inservice teachers)

• Using the survey with a focus on a specific content area (e.g., math, science)

• Using the survey in different countries

• Validity & reliability are affected by population and content area

Page 20: 2013-03-27 SITE TPACK symposium

QUESTIONS?

Denise A. Schmidt-Crawford

[email protected]

Dale Niederhauser

[email protected]

Wei Wang

[email protected]

Center for Technology in Learning and Teaching

School of Education

Iowa State University

Page 21: 2013-03-27 SITE TPACK symposium

Unraveling the TPACK model: finding TPACK-Core

Joke Voogt, Petra Fisser, University of Twente

Johan van Braak, Jo Tondeur, Ghent University, Belgium

SITE, New Orleans, 27 March 2013

Page 22: 2013-03-27 SITE TPACK symposium

Aim of the study: Empirical exploration of the TPACK model

Can we reproduce the distinguished constructs of the TPACK conceptual framework, as represented in the Venn diagram, with our data?

If not:

can we unravel the model?

can we find new constructs?

can we develop a new instrument that measures the self-perception of (pre-service) teachers?

Page 23: 2013-03-27 SITE TPACK symposium

Why this study?

We became fascinated by

the attractiveness of the model

the acceptance of the model by teachers

but also by the complexity of the model (and what's behind it)

We worked on

Survey for pre-service teachers

Professional development for in-service teachers

Literature review (JCAL, 2012)

Discussions/debates/presentations

Page 24: 2013-03-27 SITE TPACK symposium

We all know the TPACK model:

"At the heart of good teaching with technology are three core components: content, pedagogy, and technology, plus the relationships among and between them." (Koehler & Mishra, 2006)

Page 25: 2013-03-27 SITE TPACK symposium

The context of the study

The Netherlands:

- Pre-service teachers
- Use of technology in the science domain
- Sample: 287 pre-service teachers; age 16-24; 24% male, 76% female; distributed over 4 years of study
- Instrument: TPACK Survey (Schmidt et al., 2009), all items

Flanders, Belgium:

- Teacher educators
- Use of technology in different domains
- Sample: 146 teacher educators; age 26-61; 29% male, 71% female; 1-38 years of experience as teacher educator
- Instrument: TPACK Survey (Schmidt et al., 2009), T-related items

Page 26: 2013-03-27 SITE TPACK symposium

Results (NL), reliability

Reliability of all TPACK items together: Cronbach's α = 0.92

Reliability for all categories within the TPACK Survey:

Domain   Cronbach's α
TK       .90
PK       .75
CK       .74
PCK      .63
TCK      .85
TPK      .72
TPCK     .83
Models   .73

Page 27: 2013-03-27 SITE TPACK symposium

Results (NL), factor analysis

Factor analysis

Can we measure TPACK by asking questions for each of the 7 TPACK domains? Are we measuring the 7 TPACK domains?

Exploratory factor analysis (PC, Varimax) revealed 11 factors, 68% of variance explained.

Further analysis of the factors led to forcing 7 factors, 58% of variance explained.

But… are these 7 factors the same as the 7 TPACK domains?
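The eigenvalue-based factor counting behind this slide can be sketched with plain numpy: take the eigenvalues of the item correlation matrix, count those above 1 (the Kaiser criterion that typically produces the initial, unforced solution), then compute the variance explained when the solution is forced to a fixed number of factors. The data below are simulated, not the NL sample, and the latent structure is invented:

```python
import numpy as np

# Simulated survey: 200 respondents, 30 Likert-like items driven by 4 latent traits
rng = np.random.default_rng(1)
latent = rng.normal(size=(200, 4))
X = latent @ rng.normal(size=(4, 30)) + rng.normal(scale=2.0, size=(200, 30))

R = np.corrcoef(X, rowvar=False)                  # item correlation matrix
eigenvalues = np.sort(np.linalg.eigvalsh(R))[::-1]

n_kaiser = int((eigenvalues > 1).sum())           # Kaiser criterion: eigenvalue > 1
k = 4                                             # "forcing" a fixed factor count
var_explained = eigenvalues[:k].sum() / eigenvalues.sum()

print(n_kaiser, round(var_explained, 2))
```

As on the slide, the Kaiser criterion tends to retain more factors than the known latent structure, and forcing a smaller count lowers the variance explained; a full replication would add the Varimax rotation (e.g., via the `factor_analyzer` package) to interpret loadings.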

Page 28: 2013-03-27 SITE TPACK symposium

Results (NL), factor analysis

Factor 1: Technological Knowledge (TK1-TK7), α = .90
Factor 2: Pedagogical Knowledge (PK1-PK7), α = .75
Factor 3: Pedagogical Science Content Knowledge (CK1, CK2, CK3, PCK1, PCK2), α = .80
Factor 4: Technological & Pedagogical enhanced Science Content Knowledge (TCK1-TCK6, TPK1, TPK2, TPCK2, TPCK3, TPCK4, TPCK6), α = .88
Factor 5: Critically applying learned TPACK (TPK3, TPK4, TPK5, TPCK1, TPCK5), α = .73
Factor 6: Role models of TPACK (Model1-Model4), α = .73
Factor 7: TPACK Leadership (TPCK7-TPCK10), α = .89

Page 29: 2013-03-27 SITE TPACK symposium

Results (NL), first findings

Yes: TK and PK (and “role models”)

No: CK, PCK, TCK, TPK and TPCK

CK and PCK are combined

TCK is combined with some of the TPK and some of the TPCK

items and form a “Core TPACK” dimension

The other TPK and TPCK items are combined and form a scale

“critically thinking about what you learned in your study before

applying it”

Except for 4 TPCK items that form a “TPACK Leadership” scale

Page 30: 2013-03-27 SITE TPACK symposium

Results (NL), focusing on the T-related items

Factor 1: Technological Knowledge (TK1-TK7), α = .90
Factor 2: Pedagogical Knowledge (PK1-PK7), α = .75
Factor 3: Pedagogical Science Content Knowledge (CK1, CK2, CK3, PCK1, PCK2), α = .80
Factor 4: Technological & Pedagogical enhanced Science Content Knowledge (TCK1-TCK6, TPK1, TPK2, TPCK2, TPCK3, TPCK4, TPCK6), α = .88
Factor 5: Critically applying learned TPACK (TPK3, TPK4, TPK5, TPCK1, TPCK5), α = .73
Factor 6: Role models of TPACK (Model1-Model4), α = .73
Factor 7: TPACK Leadership (TPCK7-TPCK10), α = .89

Page 31: 2013-03-27 SITE TPACK symposium

Using the NL-results in the Flanders study

Survey for teacher educators, with only the T-related items from the TPACK Survey. Specific science-related items were left out; all items were transformed to "your content area".

Reliability of all TPACK items together: Cronbach's α = 0.97

Reliability for all categories within the TPACK Survey:

Domain   Cronbach's α
TK       .95
TCK      .92
TPK      .83
TPCK     .96

Page 32: 2013-03-27 SITE TPACK symposium

Results (FL)

Goal: Confirmatory factor analysis on the NL data

First: repeating the factor analysis on the NL data with only the TPACK Survey items that were included in the FL survey:

Factor 1: TK (TK1-TK7), α = .90
Factor 2: TCK & TPK (TCK1-TCK4, TPK1, TPK2, (TPCK1)), α = .85
Factor 3: TPCK (TPCK1-TPCK6, (TPCK1)), α = .77

Page 33: 2013-03-27 SITE TPACK symposium

Results (FL)

Next, the confirmatory factor analysis:

Yes, there is a good fit. But: the correlations between the factors are also very high; a 1- or 2-factor solution might be possible*

Page 34: 2013-03-27 SITE TPACK symposium

Unraveling the TPACK model

When it comes to technology integration…

Factors:

TK, TPK/TCK, & TPCK

or… TK & TPK/TCK/TPCK?

or… TK/TPK/TCK/TPCK?

The integration of the domains as described by Koehler & Mishra goes beyond the 3 circles and the overlapping areas!

But what does that mean?

Page 35: 2013-03-27 SITE TPACK symposium

TK, TPK/TCK, & TPCK

TK items are very general: "I know how to solve my own technical problems", "I can learn technology easily", "I keep up with important new technologies"

TPK and TCK items are much more related to (the preparation of) lessons: "I can choose technologies that enhance the teaching approaches for a lesson" and "I can choose technologies that enhance the content for a lesson"

TPCK items are related to lessons and activities in the classroom: "I can teach lessons that appropriately combine science, technologies, and teaching approaches", "I can select technologies to use in my classroom that enhance what I teach, how I teach, and what students learn"

Page 36: 2013-03-27 SITE TPACK symposium

Getting closer to TPACK Core

And if you take a close look… it has been there the whole time!

"At the heart of good teaching with technology are three core components: content, pedagogy, and technology, plus the relationships among and between them." (Koehler & Mishra, 2006)

Propositions:

1. TK is conditional for TCK, TPK and TPCK (Voogt, Fisser, Gibson, Knezek & Tondeur, 2012) (& recent regression analysis)

2. The combination of TPK, TCK and TPCK is the heart (or the core) of the model (TPACK Core)

Page 37: 2013-03-27 SITE TPACK symposium

What does this mean for measuring TPACK?

Can we keep the survey items for TK, TCK, TPK and TPCK? Or do we need to develop a new set of items that measures TPACK Core?

We don't have the answer (yet)…

Page 38: 2013-03-27 SITE TPACK symposium

What does this mean for measuring TPACK?

What we do know:

Developing an instrument that is suitable for every situation is impossible

It is the specific context that matters most, and T, P and C are always context-dependent!

Measuring TPACK by a self-report survey is not enough

More measurement moments are needed

More instruments (observation, lesson plan rubric, etc.) are needed

Page 39: 2013-03-27 SITE TPACK symposium
Page 40: 2013-03-27 SITE TPACK symposium

More information?

Ideas about (measuring) TPACK Core? Please contact us!

Petra Fisser: [email protected]

And for the Dutch & Flemish people: http://www.tpack.nl

Page 41: 2013-03-27 SITE TPACK symposium

Part 2

Welcome back!

TPACK development in teacher design teams: assessing teachers' perceived and observed knowledge - Ayoub Kafyulilo, Dar es Salaam University College of Education, Tanzania; Petra Fisser & Joke Voogt, University of Twente, The Netherlands

Long term impact of TPACK: From pre-service teacher learning to professional and teaching practices - Douglas Agyei, University of Cape Coast, Ghana; Joke Voogt, University of Twente, The Netherlands

Discussant: Natalie Pareja Roblin - Ghent University, Belgium

Discussion with Audience

Page 42: 2013-03-27 SITE TPACK symposium

TPACK development in teacher design teams: Assessing teachers' perceived and observed knowledge

Ayoub Kafyulilo,

Dar es Salaam University College of Education

Petra Fisser and Joke Voogt,

University of Twente

Page 43: 2013-03-27 SITE TPACK symposium

Introduction

This study was conducted with in-service science teachers in Tanzania.

It adopted design teams as a professional development arrangement to develop teachers' technology integration knowledge and skills.

TPACK was used as a framework for describing the teachers' knowledge requirements for integrating technology in science teaching.

Page 44: 2013-03-27 SITE TPACK symposium

The Intervention

The study comprised four intervention activities:

The workshop

Lesson design in design teams

Lesson implementation in the classroom (mostly a projector and a laptop were used in teaching)

Reflection with peers (peer appraisal)

Page 45: 2013-03-27 SITE TPACK symposium

Lesson design in design teams

Page 46: 2013-03-27 SITE TPACK symposium

An example of a classroom set up with a projector, laptop

and a projection screen

Page 47: 2013-03-27 SITE TPACK symposium

Research questions

What is the in-service science teachers' perceived TPACK before and after the intervention?

What is the in-service science teachers' observed TPACK before and after the intervention?

Page 48: 2013-03-27 SITE TPACK symposium

Participants

The study adopted a case study design: the design teams were the cases and individual teachers were the units of analysis.

12 in-service science teachers participated in the study.

The 12 teachers formed three design teams (each with 4 teachers)

Page 49: 2013-03-27 SITE TPACK symposium

Instruments

Six data collection instruments were used in this study to collect self-reported and observed data.

Self-reported data were collected through:

TPACK survey

Reflection survey

Focus group discussion

Interview

Observation data were collected through:

Classroom observation checklist

Researcher's logbook

Page 50: 2013-03-27 SITE TPACK symposium

TPACK Survey (pre and post-intervention)

The TPACK survey was used before and after the intervention.

The instrument was adapted from Schmidt et al. (2009) and Graham et al. (2009) and used a 5-point Likert scale.

The reliability was Cronbach's α = 0.93.

Page 51: 2013-03-27 SITE TPACK symposium

Observation checklist

The observation checklist was administered before and during the intervention.

The items had a 3-point scale: "No" = absence, "No/Yes" = partial presence, and "Yes" = presence of the behavior.

Two people rated the observation checklist; the inter-rater reliability was Cohen's kappa = 0.87.
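Cohen's kappa, the inter-rater statistic reported here and for the interviews and focus groups below, corrects raw agreement for the agreement two raters would reach by chance. A small self-contained sketch; the ten ratings are invented for illustration, not the study's data:

```python
import numpy as np

def cohen_kappa(r1, r2):
    """Cohen's kappa: inter-rater agreement corrected for chance agreement."""
    r1, r2 = np.asarray(r1), np.asarray(r2)
    po = np.mean(r1 == r2)                             # observed agreement
    pe = sum(np.mean(r1 == c) * np.mean(r2 == c)       # chance agreement from
             for c in np.union1d(r1, r2))              # the raters' marginals
    return (po - pe) / (1 - pe)

# Invented ratings on the 3-point checklist scale ("No", "No/Yes", "Yes")
rater1 = ["Yes", "Yes", "No", "No/Yes", "Yes", "No", "No/Yes", "Yes", "No", "Yes"]
rater2 = ["Yes", "Yes", "No", "No/Yes", "Yes", "No", "Yes",    "Yes", "No", "Yes"]

kappa = cohen_kappa(rater1, rater2)
print(round(kappa, 2))
```

Note how a single disagreement out of ten drops kappa well below the 0.90 raw-agreement rate, because part of that agreement is attributed to chance.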

Page 52: 2013-03-27 SITE TPACK symposium

The reflection survey

The reflection survey was administered at the end of the intervention to assess the teachers' opinions about learning technology in design teams.

The overall reliability for items related to TPACK was Cronbach's α = 0.68.

Page 53: 2013-03-27 SITE TPACK symposium

Researcher’s logbook

The researcher's logbook was used to maintain a record of activities and events occurring during the intervention.

It was used during peer appraisal, TPACK training and lesson design.

Data collected through the logbook were important in describing the intervention processes.

Page 54: 2013-03-27 SITE TPACK symposium

Teachers’ interview

The interview was administered at the end of the intervention to assess the effectiveness of design teams for teachers' development of TPACK.

An example interview question was: What technology integration knowledge and skills did you develop in the design teams?

Four of the 12 interviews, randomly selected, were coded by a second person. The inter-coder reliability was Cohen's kappa = 0.83.

Page 55: 2013-03-27 SITE TPACK symposium

Focus group discussion

A focus group discussion (FGD) was administered at the end of the intervention.

An example FGD question was: How do you evaluate the results of your discussion in design teams, in terms of the products you made, decisions in the team, and new ideas and innovations?

Two randomly selected FGDs were coded by a second person. The inter-coder reliability was Cohen's kappa = 0.92.

Page 56: 2013-03-27 SITE TPACK symposium

Results: Teachers' perceived TPACK before and after the intervention

Before the intervention, teachers perceived their CK, PK and PCK as high, while TK, TCK, TPK and TPCK were perceived as low. After the intervention, all TPACK components were perceived as high.

A Wilcoxon signed-ranks test for two related samples showed that the gains in TK, PK, TCK, TPK and TPACK were significant at p ≤ 0.01, whereas the gains in CK and PCK were significant at p ≤ 0.05.

Results from the reflection survey showed that teachers developed TPACK through their participation in design teams.
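The pre/post comparison here uses the Wilcoxon signed-ranks test, the non-parametric paired test appropriate for ordinal Likert scores. The sketch below is a simplified implementation (normal approximation, two-sided p, no tie or continuity correction; a real analysis would use `scipy.stats.wilcoxon`), run on invented pre/post ratings rather than the study's data:

```python
import math
import numpy as np

def wilcoxon_signed_rank(pre, post):
    """Wilcoxon signed-ranks test for paired samples.

    Simplified sketch: normal approximation, two-sided p-value,
    no tie or continuity correction.
    """
    d = np.asarray(post, float) - np.asarray(pre, float)
    d = d[d != 0]                                   # drop zero differences
    n = len(d)
    ranks = np.empty(n)
    ranks[np.abs(d).argsort()] = np.arange(1, n + 1)
    for v in np.unique(np.abs(d)):                  # average ranks over ties
        tied = np.abs(d) == v
        ranks[tied] = ranks[tied].mean()
    w_plus = ranks[d > 0].sum()                     # rank sum of positive gains
    mu = n * (n + 1) / 4
    sigma = math.sqrt(n * (n + 1) * (2 * n + 1) / 24)
    z = (w_plus - mu) / sigma
    return w_plus, math.erfc(abs(z) / math.sqrt(2))

# Invented 5-point pre/post self-ratings for 12 teachers on one construct
pre  = [2, 1, 3, 2, 2, 1, 3, 2, 2, 1, 2, 3]
post = [4, 3, 4, 4, 3, 3, 4, 4, 3, 3, 4, 4]

w_plus, p = wilcoxon_signed_rank(pre, post)
print(w_plus, p)
```

With every teacher rating higher after the intervention, the positive-rank sum hits its maximum and the p-value falls well below 0.01, the significance level reported on the slide for the T-related gains.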

Page 57: 2013-03-27 SITE TPACK symposium

Results (Teachers’ observed TPACK)

Findings from the teacher observations showed a significant difference between pre- and post-intervention results.

Pre-intervention results showed low teacher TK, TCK, TPK, and TPACK (M < 1.5, SD ≤ 0.17) on the 3-point scale.

In the post-intervention results, all TPACK components were high (p ≤ 0.05).

Page 58: 2013-03-27 SITE TPACK symposium

Conclusions

The triangulation of the findings from self-reported and observed data showed:

limited teacher TK, TPK, TCK and TPACK before the intervention;

after the intervention, all TPACK components were high.

In this study, the self-reported data agree with the observed data. This differs from the findings of Alayyar (2011) and Kafyulilo et al. (2011), which showed a difference between observed and perceived TPACK.

Page 59: 2013-03-27 SITE TPACK symposium

Conclusions

Probably this has something to do with the instrument, the culture, and the level of the teachers.

Findings from both observed and self-reported data indicate that teachers' PK, CK and PCK were high before and after the intervention.

This may suggest that, in the context of Tanzania, technology integration efforts need to focus more on the technology-related components of TPACK rather than on the whole of TPACK.

Page 60: 2013-03-27 SITE TPACK symposium

Thanks for your attention

[email protected]

Page 61: 2013-03-27 SITE TPACK symposium

Douglas Agyei

&

Joke Voogt


Long term impact of TPACK: From pre-service teacher learning to professional and teaching practices

Page 62: 2013-03-27 SITE TPACK symposium

Motivation

Poor student achievement (in mathematics):

High failure rate (more than 86% fail to enter tertiary levels)

TIMSS 2003 & 2007 (43rd out of 44 & 46th out of 48)

Poor attitudes

Mathematics teaching:

Teacher-centred approach (hardly any hands-on activities, whole-class teaching, lots of notes being copied)

Low cognitive learning (concept formation at a more abstract level, heavy emphasis on assessment)

Page 63: 2013-03-27 SITE TPACK symposium

Intervention studies in 2009-2011

A longitudinal study to integrate technology in teaching mathematics (Ghana):

Two case studies of professional development (PD) in 2009 and 2010

Integration of the PD arrangement into a regular mathematics-specific IT course (2011)

TPACK framework

ICT (spreadsheets) to promote in-depth maths concept formation

Activity-Based Learning (ABL) to make lessons less teacher-centred

Page 64: 2013-03-27 SITE TPACK symposium

TPACK Conceptualization (Intervention Studies 2)

1. Make use of existing ICT tools (spreadsheet-specific)

2. Active involvement of learners (Activity-Based Learning, ABL)

3. Explore the connection between spreadsheets, ABL pedagogy and mathematical concepts

TPACK framework: interconnection of content, pedagogy & technology (Mishra & Koehler, 2006)

Page 65: 2013-03-27 SITE TPACK symposium

Outcome of the Intervention Studies

Developed TPACK of participants:

Self-assessment of TPACK

Lesson artefacts

Lesson observations

Three years into the project:

Mathematics teachers pursuing careers in different institutions

Various senior high schools/junior high schools in Ghana

Page 66: 2013-03-27 SITE TPACK symposium

Challenge and Data Collection

Measure the impact of the intervention studies:

Explore whether and how the beginning teachers integrate ICT (demonstrate TPACK) in their teaching practices

Gain insight into factors promoting (or hindering) the teachers' ICT integration (TPACK demonstration)

Data collection:

− Questionnaire (100)

− Interviews (20)

− Observations (8)

− Researchers' logbook

Page 67: 2013-03-27 SITE TPACK symposium

Results (1)- Self Report

Table 1: Mean scores of factors that influence teachers' TPACK use (N=100)

Condition                                     Mean   Std Dev
Skills and knowledge                          4.57   .355
Dissatisfaction with status quo               4.48   .283
Commitment                                    4.21   .287
Availability of time                          3.75   .562
Rewards and incentives                        3.17   .237
Participation (decision-making involvement)   3.02   .503
School culture                                2.05   .292
Resources (ICT facilities)                    1.71   .311

Page 68: 2013-03-27 SITE TPACK symposium

Results 2 : Lesson Observation

Table 2: Teacher lesson implementation (n=8)

Two teachers, Mathematics, personal laptop and projector: spreadsheet techniques (interactive demonstration)

Two teachers, ICT, personal laptop and projector: resources from the Internet (interactive demonstration)

Two teachers, Mathematics, personal laptop (rotating groups of students): spreadsheet techniques / resources from the Internet

Two teachers, Mathematics, no ICT facility: worksheet to support teamwork

Page 69: 2013-03-27 SITE TPACK symposium
Page 70: 2013-03-27 SITE TPACK symposium

Snapshot of a lesson on Linear Equations

[Figure: linear functions in the slope-intercept form, annotated TPCKmaths and TKss]

Page 71: 2013-03-27 SITE TPACK symposium

Consider the diagrams below.

[Figure: refraction diagram with labels: eye, water, coin, image of coin]

Snapshot of a lesson on Enlargement

Page 72: 2013-03-27 SITE TPACK symposium

[Figure: pin-hole camera diagram with labels: a boy, image of the boy]

Snapshot of a lesson on Enlargement (2)

Page 73: 2013-03-27 SITE TPACK symposium

Snapshot of a lesson on Introduction to computer networks (1)

Page 74: 2013-03-27 SITE TPACK symposium

Snapshot of a lesson on Introduction to computer networks (2)

Page 75: 2013-03-27 SITE TPACK symposium

Summary of Results & Conclusions

Teachers developed strong positive views about TPACK in the long term (a result of the pre-service preparation intervention studies)

The specific focus on ABL ("P") and spreadsheets ("T") in mathematics ("C") helped to develop deep connections between subject matter, instructional strategy and the ICT application, fostering TPACK in the long term (closer to Shulman's (1986) original conception of Pedagogical Content Knowledge)

Teachers develop TPACK in similar initiatives using other ICT applications and/or different subject matter

Teachers develop and extend pedagogical reasoning to support student learning

Using multiple data sources is a good way to assess TPACK in the long run

Teachers' acquired "knowledge and skill" and "dissatisfaction with the status quo" are key in promoting long-term TPACK

Lack of access to ICT infrastructure and unenthusiastic school cultures hinder TPACK in the long run

Page 77: 2013-03-27 SITE TPACK symposium

Symposium: Measuring TPACK

Discussant: Natalie Pareja Roblin

SITE Conference, New Orleans, 2013

Page 78: 2013-03-27 SITE TPACK symposium

TPACK: A growing concept

Page 79: 2013-03-27 SITE TPACK symposium

Main themes in these studies

Review of studies about TPACK published between 2005 and 2011 (n=55)

(Voogt et al., 2012)

• Development of the TPACK concept (14 studies)

• Measuring (student-)teachers’ TPACK (14 studies)

• Development of TPACK in specific subject domains (7 studies)

• Strategies to support the development of (student-) teachers’ TPACK (36 studies)

• TPACK and teacher beliefs (6 studies)

Page 80: 2013-03-27 SITE TPACK symposium

This symposium: Measuring TPACK

Pre-service teachers, in-service teachers

Student teachers, teacher trainers

[Venn diagram: T, P, C]

Integrative views

Transformative views

Page 81: 2013-03-27 SITE TPACK symposium

Towards a comprehensive approach for measuring TPACK

Page 82: 2013-03-27 SITE TPACK symposium

Integrating multiple instruments to measure TPACK

1. Perceived TPACK

• Self-assessment survey (from integrative to transformative views on TPACK)

• Interviews

• Teacher reflections

• ....

2. Enacted TPACK

• Observation checklist

• Lesson plans

• Researcher logbook

• .....

Page 83: 2013-03-27 SITE TPACK symposium

TPACK as a complex and “fuzzy” concept

• How can TPACK (and its constituting knowledge domains) be operationalized? Is it possible (and desirable) to pull apart the knowledge domains that constitute TPACK?

• If TPACK is considered as a “sliding framework”, as suggested by Cox and Graham (2009), is it possible to develop standardized instruments to measure it?

• How does qualitative data contribute to the understanding of (pre-service) teachers’ TPACK development? What does it add to survey data?

• How to best combine self-reported and observed TPACK measurements?

Page 84: 2013-03-27 SITE TPACK symposium

Examining the development of TPACK across time

Pre-service teachers

Beginning teachers

In-service teachers

Page 85: 2013-03-27 SITE TPACK symposium

• How does TPACK develop as student teachers step into the teaching profession and become experienced teachers?

• What factors (personal, institutional, systemic) facilitate and/or hinder TPACK development?

• How does the context (school characteristics, learner characteristics, access to technology, ICT policies, etc.) influence the ways in which teachers integrate technology (i.e., how TPACK is put into action)?

TPACK development as a dynamic and context-bound process

Page 86: 2013-03-27 SITE TPACK symposium

Towards a comprehensive approach for measuring TPACK: Moving forward...

Page 87: 2013-03-27 SITE TPACK symposium

Integrating multiple instruments: Recent initiatives

Assessing teachers’ pedagogical ICT competences (The Netherlands)

Page 88: 2013-03-27 SITE TPACK symposium

Assessing teachers’ pedagogical ICT competences

Self-perceived + Observed

Page 89: 2013-03-27 SITE TPACK symposium

Format of the video vignette

Introduction (+/- 2 minutes): subject; goal; nature of ICT use; perceived advantages/contributions of ICT

Practice (+/- 4 to 8 minutes): ICT applications; goals of ICT use; attractive/efficient/effective uses; pedagogical use of ICT (TPACK); teacher role; student role

Reflection (+/- 2 minutes): Why this lesson? Why this combination of T, P and C? Would this lesson be different without ICT? How do you know your (ICT) goals have been accomplished?

Page 90: 2013-03-27 SITE TPACK symposium

Examining TPACK development across time and contexts: Recent initiatives

From pre-service to practice: Understanding beginning teachers’ uses of technology

(Belgium, Flanders)

Page 91: 2013-03-27 SITE TPACK symposium

Understanding beginning teachers’ uses of technology

Longitudinal qualitative study in Flanders Focus on (institutional) factors supporting TPACK development

Page 92: 2013-03-27 SITE TPACK symposium

Study 1: Approaches to support TPACK development

TE1: From TK to...

TE2: From TK to TPK

TE3: From TK to TCK

Moving from stand-alone technology courses to integrated approaches that aim to support TPACK development

Tondeur, J., Pareja Roblin, N., van Braak, J., Fisser, P., Voogt, J. (2012). Technological pedagogical content knowledge in teacher education: in search of a new curriculum.

Educational Studies, DOI:10.1080/03055698.2012.713548

Page 93: 2013-03-27 SITE TPACK symposium

Study 2: Technology integration by BT

Pre-service education influences how beginning teachers (BT) integrate technology in their teaching practice. It contributes to:

- Developing TK

“The basic skills we did learn them”

- Getting to know various technology tools that could be used with educational purposes “[To learn about things] such as Klascement or Hot Potatoes was useful”

“If I had not learned it in my pre-service education, I think I would have never used it here”

- Learning how to teach with technology ((!) few opportunities)

“[We should learn] not only the application itself, but [also] how to use it

and how to integrate it [in your teaching]”

Page 94: 2013-03-27 SITE TPACK symposium

Study 2: Technology integration by BT

However, (the extent of) this influence depends on school characteristics:

- Access to technology

“It is not possible to sit behind 1 computer with 19 children”

- Clear ICT policies

“Everybody has one hour in the computer room. It is not compulsory, but the

school principal has strongly recommended it to us”

- Workload

“Making and trying out new things is difficult, especially at the start [of your

career] because you are busy with preparing your lessons”

- Support and mentoring systems

Page 95: 2013-03-27 SITE TPACK symposium

Measuring TPACK: Mission impossible?

How can TPACK (and its constituting knowledge domains) be operationalized? (integrative vs. transformative views on TPACK)

Is it possible to develop standardized instruments to measure TPACK? (generic vs. context- and content-specific instruments)

How to best combine self-reported and observed TPACK measurements? (self-perceived vs. observed TPACK measures)

How does the context influence the ways in which teachers integrate technology?

Page 96: 2013-03-27 SITE TPACK symposium

Thank you! [email protected]