Measuring Student Success: Using Data to Inform Redesign


DESCRIPTION

At a recent customer summit, Brian Buckley, Program Manager for Efficacy Research, presented the many techniques Pearson is employing to place efficacy at the center of its MyLab/Mastering digital solutions. He also discussed how educators can use data to inform course redesigns.

TRANSCRIPT

Page 1: Measuring Student Success: Using Data to Inform Redesign

Efficacy Program & Course Redesign

Brian Buckley, Program Manager, Efficacy Research

Colleen Kochannek, Executive Marketing Manager, English, Humanities & Social Sciences

Page 2: Measuring Student Success: Using Data to Inform Redesign

What is Efficacy?

ef·fi·ca·cy n. Power to produce a desired effect; effectiveness

ef·fi·ca·cy (Pearson) n. Power to change lives through learning

Page 3: Measuring Student Success: Using Data to Inform Redesign

Agenda

• Pearson’s commitment to efficacy

• Measuring MyLabs’ impact on learning

• How Pearson can help

• Using the results

• Early intervention tool

• Course redesign resources

Page 4: Measuring Student Success: Using Data to Inform Redesign

Committed to Student Success

“We’re setting out to become the efficacy company… we need to define ourselves by how effective we are, by the impact we make.”

- Marjorie Scardino, Pearson CEO, 1997-2012

“…we want to be able to demonstrate that everything we do as a company delivers an improved learning outcome.”

- John Fallon, Pearson CEO, 2013-

Page 5: Measuring Student Success: Using Data to Inform Redesign

Efficacy Framework

Each criteria area receives a rating and a rationale summary:

Outcomes / Impact
• Clarity of intended outcome
• Value for money
• Quality of design

Strength of evidence base
• Comprehensiveness of evidence
• Quality of evidence
• Effectiveness of evidence use

Quality of planning and implementation
• Clear and manageable plan of action
• Clear governance and lines of responsibility
• Monitoring and reporting system

Capacity to deliver
• Capacity and culture within Pearson team
• Customer capacity and culture
• Relationships with other stakeholders

Together, the ratings indicate the likelihood of impact.

Page 6: Measuring Student Success: Using Data to Inform Redesign

So what?

Pages 7-11: Measuring Student Success: Using Data to Inform Redesign

Efficacy

[Five image-only slides; no additional text captured in the transcript.]
Page 12: Measuring Student Success: Using Data to Inform Redesign

MyLab White Papers

Page 13: Measuring Student Success: Using Data to Inform Redesign

Evidence of Success

“I’ve helped put more computers in more schools than anybody else in the world, and I’m absolutely convinced that technology is by no means the most important thing. The most important thing is a person.”

- Steve Jobs

Page 14: Measuring Student Success: Using Data to Inform Redesign

Trends

• Recognize and embrace the educational value of technology integration

• Require MyLab for at least 10% of the final course grade

• Receive MyLab training and follow recommended best practices

• Provide students with an early introduction to MyLab
  • What it is, how to use it, and how it contributes to their grade
  • Use Pearson’s First Day of Class program

• Enable active class discussion by assigning pre-lecture homework

• Implement assessments to measure student learning gains
  • Pre- and post-tests, standardized exams, common final exams
  • Provide historical data: pass rates, comparable-exam scores, etc.

Page 15: Measuring Student Success: Using Data to Inform Redesign

Share your Story

Reading, Writing, and MyFoundationsLab

Colleen Kochannek, [email protected]

Sara Owen, [email protected]

Brian Buckley, [email protected]

Page 16: Measuring Student Success: Using Data to Inform Redesign

Using Data to Inform Redesign

Experimental Study: Introductory Physics

Question: Does homework copying adversely impact learning/retention?

Measure: Pretest scores, exam scores, and rates of copying

Page 17: Measuring Student Success: Using Data to Inform Redesign

Answer Submission Patterns

Palazzo, D., et al., “Patterns, correlates, and reduction of homework copying,” Phys. Rev. ST Phys. Educ. Res. 6, 010104 (2010)

[Figure: answer-submission timelines separating four groups: quick solvers, real-time solvers, delayed solvers, and copiers.]
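The study separated these groups by how quickly students submitted answers after opening a problem. As a loose illustration only, a per-attempt classifier might look like the sketch below; the field names and the 1/10/60-minute thresholds are assumptions chosen for illustration, not the paper’s actual classification rules.

```python
# Hypothetical sketch: bucketing one problem attempt by time-to-answer.
# Thresholds are illustrative assumptions, not Palazzo et al.'s definitions.

def classify_attempt(minutes_to_first_answer, correct_on_first_try):
    """Label a single submission by how quickly it arrived."""
    if correct_on_first_try and minutes_to_first_answer < 1:
        return "possible copier"    # correct faster than the problem can be read
    if minutes_to_first_answer < 10:
        return "quick solver"
    if minutes_to_first_answer < 60:
        return "real-time solver"
    return "delayed solver"         # came back to the problem much later

print(classify_attempt(0.5, True))   # possible copier
print(classify_attempt(25, True))    # real-time solver
```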

Page 18: Measuring Student Success: Using Data to Inform Redesign

Exam Scores and Copy Rates

Copiers differed from their peers by less than 1/3 of a standard deviation on the pretest, but by more than 2 standard deviations on the final exam.

Palazzo, D., et al., “Patterns, correlates, and reduction of homework copying,” Phys. Rev. ST Phys. Educ. Res. 6, 010104 (2010)

Page 19: Measuring Student Success: Using Data to Inform Redesign

Failure Rate and Copy Rates

Palazzo, D., et al., “Patterns, correlates, and reduction of homework copying,” Phys. Rev. ST Phys. Educ. Res. 6, 010104 (2010)

Page 20: Measuring Student Success: Using Data to Inform Redesign

Redesign

• Large lecture to studio physics

• More instructor contact

• Shorter, more frequent assignments

• Pass/fail to graded homework

Results…

Page 21: Measuring Student Success: Using Data to Inform Redesign

Reduction in Copy Rates

Palazzo, D., et al., “Patterns, correlates, and reduction of homework copying,” Phys. Rev. ST Phys. Educ. Res. 6, 010104 (2010)

Page 22: Measuring Student Success: Using Data to Inform Redesign

Decrease in D/F rate

Palazzo, D., et al., “Patterns, correlates, and reduction of homework copying,” Phys. Rev. ST Phys. Educ. Res. 6, 010104 (2010)

Page 23: Measuring Student Success: Using Data to Inform Redesign

Early Intervention Tool

Page 24: Measuring Student Success: Using Data to Inform Redesign

The Use of Fractal Dimension in the Categorization of Students’ Response Patterns and in the Prediction of Final Exam Scores (Random Walk)

William Galen & Rasil Warnakulasooriya

Statistical Learning Algorithms Team

Learning Technologies Group

November 2011

Page 25: Measuring Student Success: Using Data to Inform Redesign

How do we “see” a student in traditional assessment?

e.g., Score = 85%, Letter grade = “B”

How did the student get there?

Page 26: Measuring Student Success: Using Data to Inform Redesign

Correct on First Try Grading and Walks

A response graded incorrect on the first try is a step to the left: score = -1.

A response graded correct on the first try is a step to the right: score = +1.

Keep track of the net score = (# steps to the right) - (# steps to the left) after each step.
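In code, the walk is just a running sum over first-try outcomes. A minimal sketch (the function name and input format are mine, not from the presentation):

```python
# Minimal sketch: the "correct on first try" random walk for one student.
# Input: first-try outcomes in submission order (True = correct, False = not).

def net_score_walk(first_try_correct):
    """Running net score: +1 for each correct first try, -1 for each miss."""
    walk, net = [], 0
    for correct in first_try_correct:
        net += 1 if correct else -1   # step right (+1) or step left (-1)
        walk.append(net)              # net score recorded after each step
    return walk

# A guessing student hovers near zero; a skilled student climbs steadily.
print(net_score_walk([True, False, True, True, False, True]))  # [1, 0, 1, 2, 1, 2]
```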

Page 27: Measuring Student Success: Using Data to Inform Redesign

Distinguishing Response Patterns

[Figure: net score vs. responses submitted for two students. Student 57 (fractal D = 1.60) climbs steadily from 0 to about 300 over 600 responses, a student not showing random-walk behavior; the other student’s net score wanders between -25 and +10, a student showing random-walk behavior.]

How can we quantify the differences?

Page 28: Measuring Student Success: Using Data to Inform Redesign

Distinguishing Response Patterns

Page 29: Measuring Student Success: Using Data to Inform Redesign

Distinguishing Response Patterns

[Figure: the same two response patterns, now quantified. The student showing random-walk behavior has a fractal dimension of 1.94; Student 57, not showing random-walk behavior, has a fractal dimension of 1.60.]

Fractal dimensions calculated based on: Gneiting, T., Ševčíková, H., & Percival, D. B., “Estimators of Fractal Dimension: Assessing the Roughness of Time Series and Spatial Data,” University of Washington (Seattle) Technical Report No. 577, 2010.

Page 30: Measuring Student Success: Using Data to Inform Redesign

Distinguishing Response Patterns

The fractal dimension of a response pattern runs from 1 (a straight line) to 2 (filling a plane):

• Near 1: less irregularity in the response pattern, i.e., persistent behavior (an increase in net score at one instance is more likely to be followed by an increase at a future instance).

• Near 2: high irregularity in the response pattern, i.e., anti-persistent behavior (an increase in net score at one instance is more likely to be followed by a decrease at a future instance).
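The Gneiting, Ševčíková & Percival report cited above surveys several estimators, and the presentation does not say which one the team used. A minimal sketch of one of them, the variogram (Hurst-exponent) estimator, applied to a net-score series:

```python
import numpy as np

def fractal_dimension(series):
    """Variogram estimator of fractal dimension for a 1-D series.

    V(h) = mean squared increment at lag h; with V(h) ~ h^(2H),
    the graph of the series has fractal dimension D = 2 - H.
    """
    x = np.asarray(series, dtype=float)
    v1 = np.mean((x[1:] - x[:-1]) ** 2)    # V(1)
    v2 = np.mean((x[2:] - x[:-2]) ** 2)    # V(2)
    hurst = (np.log(v2) - np.log(v1)) / (2 * np.log(2))
    return 2 - hurst

rng = np.random.default_rng(0)
smooth = np.cumsum(np.ones(600))             # all-correct walk -> D near 1
ragged = rng.choice([-1, 1], 600).cumsum()   # coin-flip walk -> D well above 1
print(fractal_dimension(smooth), fractal_dimension(ragged))
```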

Page 31: Measuring Student Success: Using Data to Inform Redesign

Distinguishing Response Patterns

A Student with a Fractal Dimension of 1.32

[Figure: Student 27, net score climbing steadily from 0 to about 500 over 600 responses, shown alongside a random-walk pattern (net score wandering between -25 and +10) for comparison.]

Page 32: Measuring Student Success: Using Data to Inform Redesign

Distinguishing Response Patterns

A Student with a Fractal Dimension of 1.60

[Figure: Student 57, net score climbing to about 300 over 600 responses, shown alongside a random-walk pattern for comparison.]

Page 33: Measuring Student Success: Using Data to Inform Redesign

Distinguishing Response Patterns

A Student with a Fractal Dimension of 1.81

[Figure: Student 16, net score fluctuating between about -30 and +50 over 600 responses, shown alongside a random-walk pattern for comparison.]

Page 34: Measuring Student Success: Using Data to Inform Redesign

Visualizing the Response Patterns of the Entire Class

[Figure: net scores vs. responses submitted for the entire class, from start of course to end of course.]

• The majority of the students’ responses crisscross in one region; net score increases by about 40 points for every 100 responses.

• A band of students who are responding randomly at the course level breaks away from the majority about 1/3 of the way into the course.

• Two skilled students’ response patterns overlap over a period of time.

Page 35: Measuring Student Success: Using Data to Inform Redesign

A Tale of Two Students

[Figure: two students’ response patterns from start of course to mid-course. Both students have the same net score.]

Page 36: Measuring Student Success: Using Data to Inform Redesign

The Struggling Student can be Identified

[Figure: the two response patterns from start of course to mid-course, compared.]

• One student shows short runs of smoothness followed by short runs of irregularity. Learning behavior?

• The other shows long runs of smoothness followed by long runs of irregularity, then a tipping point (onset), at which an alert can be raised.

Page 37: Measuring Student Success: Using Data to Inform Redesign

Putting the Response Patterns of the Two Students in Perspective

[Figure: the two students’ response patterns shown against the entire class, from start of course to end of course.]

Page 38: Measuring Student Success: Using Data to Inform Redesign

Conditions for Triggering Alerts

[Figure: fractal dimension (1.0 to 2.0) vs. responses submitted, with the random-walk region marked near 2.0.]

• Alert: the student is random walking (fractal dimension stays in the random-walk region).

• Alert: no random walking, but the net score steadily decreases.

• No alert: not enough information yet.
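A sketch of how these three conditions might be wired together. The minimum-response threshold, the D >= 1.9 random-walk band, and the slope test are assumptions chosen for illustration; the slide gives no numeric cutoffs.

```python
import numpy as np

MIN_RESPONSES = 100    # assumed: below this, "not enough information"
RANDOM_WALK_D = 1.9    # assumed: dimensions this high count as random walking

def fractal_dimension(x):
    """Variogram estimator, as in the earlier sketch."""
    v1 = np.mean((x[1:] - x[:-1]) ** 2)
    v2 = np.mean((x[2:] - x[:-2]) ** 2)
    return 2 - (np.log(v2) - np.log(v1)) / (2 * np.log(2))

def should_alert(net_scores):
    """Return (alert?, reason) for one student's running net-score series."""
    x = np.asarray(net_scores, dtype=float)
    if x.size < MIN_RESPONSES:
        return False, "not enough information"
    if fractal_dimension(x) >= RANDOM_WALK_D:
        return True, "random walking"
    slope = np.polyfit(np.arange(x.size), x, 1)[0]   # trend of the net score
    if slope < 0:
        return True, "net score steadily decreasing"
    return False, "no alert"
```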

Page 39: Measuring Student Success: Using Data to Inform Redesign

[Figure: average fractal dimension over the first half of interactions vs. final exam score, binned in standard-deviation units as (…, -1], (-1, +1), and [+1, …); error bars show 95% confidence intervals (SEM).]

Students who scored above 1 std. dev. on the final exam had significantly smoother response patterns; students exhibiting irregular and inconsistent response patterns scored lower.

Thanks to Profs. Randall Hall and Leslie Butler for providing the exam score data.
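The aggregation behind this figure can be sketched as follows. The band edges come from the slide; the normal-approximation confidence interval and the variable names are my assumptions.

```python
import numpy as np

def band_summary(fractal_d, exam_z):
    """Mean fractal dimension and 95% CI half-width per exam-score band."""
    fractal_d, exam_z = np.asarray(fractal_d), np.asarray(exam_z)
    bands = {
        "(..., -1]": exam_z <= -1,
        "(-1, +1)":  (exam_z > -1) & (exam_z < 1),
        "[+1, ...)": exam_z >= 1,
    }
    summary = {}
    for label, mask in bands.items():
        d = fractal_d[mask]
        sem = d.std(ddof=1) / np.sqrt(d.size)    # standard error of the mean
        summary[label] = (d.mean(), 1.96 * sem)  # mean D, 95% CI half-width
    return summary
```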

Page 40: Measuring Student Success: Using Data to Inform Redesign

Questions?