Assessment for learning: Learning from assessment? Sally Jordan (@SallyJordan9) DPS Seminar, 19th November 2015


TRANSCRIPT

Page 1

Assessment for learning: Learning from assessment?

Sally Jordan (@SallyJordan9), DPS Seminar, 19th November 2015

Page 2

My background

● Longstanding interests in assessment and maths skills development;

● Introduced online interactive assessment into S151 Maths for Science (2002);

● Several CETL and other externally funded projects, in particular:

● Project which investigated the use of short-answer free-text questions;

● “Remote observation” of student engagement with e-assessment;

● I am passionate about using scientific methodology to find out what is going on in learning.

Page 3

Why have I used computer-marked assessment?

• In my work, the focus has been on ‘assessment for learning’, so feedback and giving students a second and third attempt is important (Gibbs & Simpson, 2004-5).

• We aim to ‘provide a tutor at the student’s elbow’ (Ross et al., 2006).

• My work has been at the limit of what is possible with computer-marked assessment (not just multiple-choice questions!).

• …and to learn more about what is going on… [learning analytics / assessment analytics]

Page 4

Analysis of student errors

● At the most basic – look for questions that students struggle with;

● Look at responses in more detail to learn more about the errors that students make;

● This can give insight into student misunderstandings.

● So what topics in Maths for Science do students find difficult?

Page 5

So what topics in Maths for Science do students find difficult?

Page 6

Analysis of student responses to individual questions

Gives information about student errors, linked to their misconceptions. Confidence in the findings is increased when:

• The questions require a ‘free-text’ (constructed) response;

• The questions are in summative use (so students are trying);

• Similar errors are seen in different variants.

See Jordan (2014)
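To make the method concrete, here is a minimal sketch (my own illustration, not the OU's actual analytics code; the record format is an assumption) of how responses to a computer-marked question can be tallied to surface the most common errors in each variant.

```python
# A minimal sketch of error analysis across question variants.
# Assumed input: (variant_id, response, is_correct) records - not the actual OU data format.
from collections import Counter, defaultdict

def common_errors(records, top_n=5):
    """Return the most frequent incorrect responses for each question variant."""
    errors = defaultdict(Counter)
    for variant_id, response, is_correct in records:
        if not is_correct:
            errors[variant_id][response.strip().lower()] += 1
    return {variant: counter.most_common(top_n) for variant, counter in errors.items()}

# Toy usage: an error that recurs across variants (e.g. "243" for the 3^(6/3) variant,
# "5000" for the 10^(4/2) variant) is stronger evidence of a misconception than a one-off slip.
records = [("3^(6/3)", "243", False), ("3^(6/3)", "9", True), ("10^(4/2)", "5000", False)]
print(common_errors(records))
```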

Page 7

Why is the answer 243? (instead of 9)

The question was:

Evaluate 3^(6/3)

Page 8

Why is the answer 243? (instead of 9)

The question was:

Evaluate 3^(6/3)

Students were evaluating 3^6/3 = 729/3 = 243 instead of 3^(6/3) = 3^2 = 9.

Page 9

For another variant the answer was 5000 instead of 100

The question was:

Evaluate 10^(4/2)

Students were evaluating 10^4/2 = 10000/2 = 5000 instead of 10^(4/2) = 10^2 = 100.
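The same misreading underlies both variants: an expression of the form a^(b/c) is being evaluated as (a^b)/c. A two-line check (Python is used here purely for the arithmetic) makes the contrast explicit.

```python
# Intended reading vs. the reading that produces the observed wrong answers
print(3 ** (6 / 3), 3 ** 6 / 3)      # 9.0 and 243.0
print(10 ** (4 / 2), 10 ** 4 / 2)    # 100.0 and 5000.0
```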

Page 10

Measuring student engagement… “750 students used my iCMA”

Page 11

Measuring student engagement…

Page 12

Measuring student engagement…

Page 13

When do students do iCMAs? (overall activity)

Page 14

When do students do iCMAs? (impact of deadlines)

Page 15

When do students do iCMAs? (typical patterns of use)
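As an illustration of the kind of analysis behind these charts, the sketch below buckets iCMA attempts by how many days before the deadline they were made. It assumes a table of submission timestamps and a deadline per iCMA; the column names and data are hypothetical, not the OU's actual schema.

```python
# A minimal sketch of "when do students do iCMAs?" analysis (hypothetical schema).
import pandas as pd

def days_before_deadline(submissions: pd.DataFrame, deadlines: dict) -> pd.Series:
    """Count submissions by the number of whole days before the relevant iCMA deadline."""
    df = submissions.copy()
    df["deadline"] = df["icma_id"].map(deadlines)
    df["days_before"] = (df["deadline"] - df["submitted_at"]).dt.days
    return df["days_before"].value_counts().sort_index()

# Toy usage: activity typically clusters in the last few days before the deadline.
subs = pd.DataFrame({
    "icma_id": ["icma41", "icma41", "icma41"],
    "submitted_at": pd.to_datetime(["2015-10-01", "2015-10-09", "2015-10-10"]),
})
print(days_before_deadline(subs, {"icma41": pd.Timestamp("2015-10-10")}))
```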

Page 16

Student engagement with feedback

Page 17

Student engagement with feedback (identical question)

[Charts: Module A and Module B]

Page 18

General conclusions

●Analysis of student responses to interactive computer-marked questions can give information about student misunderstandings and student engagement with assessment;

●Generally, students do what they believe their teachers expect them to do;

●Engagement with computer-marked assessment can act as a proxy for more general engagement with a module (and so act as an early warning if engagement is not as deep as we might wish).

Page 19

The future?

● Redecker, Punie and Ferrari (2012, p. 302) suggest that we should “transcend the testing paradigm”; data collected from student interaction in an online environment offers the possibility to assess students on their actual interactions rather than adding assessment separately.

Page 20

A short-answer question (PMatch)

https://students.open.ac.uk/openmark/s104-11b.icma48/

Page 21

A short-answer question (PMatch)

Page 22

A short-answer question (PMatch)

Page 23

Short-answer free-text questions: human-computer marking comparison

●A linguistically based system (IAT) was used to mark and give feedback on student responses of about ‘a sentence’ in length.

●The computer marking was compared with that of six human markers.

Page 24

Question | Responses in analysis | Human markers in agreement with question author: range for the 6 markers (%) | Human markers in agreement: mean for the 6 markers (%) | Computer marking in agreement with question author (%)
A | 189 | 97.4 to 100 | 98.9 | 99.5
B | 248 | 83.9 to 97.2 | 91.9 | 97.6
C | 150 | 80.7 to 94.0 | 86.9 | 94.7
D | 129 | 91.5 to 98.4 | 96.7 | 97.6
E | 92 | 92.4 to 97.8 | 95.1 | 98.9
F | 129 | 86.0 to 97.7 | 90.8 | 97.7
G | 132 | 66.7 to 90.2 | 83.2 | 89.4
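The agreement figures above can be read as simple percentage matches between a marker's decisions and the question author's decisions on the same responses. A minimal sketch of that calculation (my own illustration, with made-up marks) follows.

```python
# Percentage agreement between a marker and the question author (illustrative only).
def percent_agreement(author_marks, marker_marks):
    """Percentage of responses on which a marker agrees with the question author."""
    assert len(author_marks) == len(marker_marks)
    matches = sum(a == m for a, m in zip(author_marks, marker_marks))
    return 100.0 * matches / len(author_marks)

# Toy usage with six hypothetical human markers for one question:
author = [1, 1, 0, 1, 0, 1, 1, 0]          # 1 = mark awarded, 0 = not awarded
markers = [[1, 1, 0, 1, 0, 1, 1, 0] for _ in range(6)]
rates = [percent_agreement(author, m) for m in markers]
print(min(rates), max(rates), sum(rates) / len(rates))   # range and mean, as in the table
```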

Page 25

Short-answer free-text questions: computer-computer marking comparison

●An undergraduate student (not of computer science) developed answer matching using two algorithmically based systems, Java regular expressions and OpenMark PMatch;

●These are not simple ‘bag of words’ systems;

●Student responses were used in the development of the answer matching, as had been the case for the linguistically based IAT system;

●The results were compared.
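To give a flavour of algorithmic answer matching, here is a deliberately simplified keyword/regular-expression marker. It is written in Python rather than Java or OpenMark PMatch, and the question, patterns and accepted phrasings are hypothetical, not taken from the study.

```python
# Illustrative regular-expression answer matching (not the IAT, PMatch or the study's rules).
import re

# Accept responses that mention condensation and water vapour (allowing spelling variants),
# and reject responses that negate the key idea.
ACCEPT = [re.compile(r"\bcondens\w*", re.I), re.compile(r"\bwater\s+vapou?r\b", re.I)]
REJECT = [re.compile(r"\b(not|never)\s+condens\w*", re.I)]

def mark(response: str) -> bool:
    """Return True if the response matches every accept pattern and no reject pattern."""
    if any(p.search(response) for p in REJECT):
        return False
    return all(p.search(response) for p in ACCEPT)

print(mark("Water vapour condenses onto the cold surface"))   # True
print(mark("The water vapour does not condense"))             # False
```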

Page 26

Percentage of responses where computer marking was in agreement with the question author:

Question | Responses in set | IAT (computational linguistics) | OpenMark PMatch (algorithmic keyword matching) | Regular expressions (algorithmic keyword matching)
A | 189 | 99.5 | 99.5 | 98.9
B | 248 | 97.6 | 98.8 | 98.0
C | 150 | 94.7 | 94.7 | 90.7
D | 129 | 97.6 | 96.1 | 97.7
E | 92 | 98.9 | 96.7 | 96.7
F | 129 | 97.7 | 88.4 | 89.2
G | 132 | 89.4 | 87.9 | 88.6

Page 27

Word-length for Snowflake question

●With no restriction on the number of words allowed

Page 28

Word-length for Snowflake question

●With warning ‘Your answer should be no more than 20 words in length.’

Page 29

Word-length for Snowflake question

●With warning ‘Your answer should be no more than 20 words in length.’
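The word-length analysis itself is straightforward to reproduce: count the words in each free-text response and see how many exceed the 20-word guideline. The sketch below uses made-up responses.

```python
# Word-length distribution for free-text responses (made-up data).
responses = [
    "A snowflake forms when water vapour freezes onto a growing ice crystal in a cloud",
    "Ice",
]
lengths = [len(r.split()) for r in responses]
print(lengths)                                            # words per response
print(sum(1 for n in lengths if n > 20), "responses exceed 20 words")
```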

Page 30

Exploring the gender gap

●S207 (The Physical World) was a 60-credit OU level 2 (FHEQ level 5) module (replaced by S217 from autumn 2015).

●Men were significantly and consistently more likely to complete S207 than women.

●Of those who completed, men were more likely to pass.

●The difference in outcomes was larger in 2013-14.

●The effect is not present in our level 1 modules or in any other Science level 2 module except S282 (Astronomy), where women also do less well but the difference is not so stark.

●Women do slightly better on our level 3 physical science modules.

Page 31

So what’s causing the attainment gap?

Initial hypotheses:

●It has something to do with the type of assessment we are using.

●There are other demographic differences (e.g. previous educational qualifications) between our male and female students.

●There are other differences between men and women, e.g. in the amount of time they have available for study.

●It has something to do with role models and/or the nature of the support being offered.

●It has something more fundamental to do with the way we are teaching physics.

●On average, women and men handle certain physical concepts and skills of physics in a different way.

Page 32

Results of data analysis (concentrating on the 2013-14 presentation)

N (male) = 455; N (female) = 157
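The kind of check behind a claim such as "men were significantly more likely to complete" is an independence test on a 2×2 completion-by-gender table, as in the minimal sketch below. Only the cohort sizes (455 men, 157 women) come from the slide; the completion counts are placeholders, not real results.

```python
# Chi-squared test of independence for completion by gender (placeholder counts).
from scipy.stats import chi2_contingency

completed_men, completed_women = 300, 80             # hypothetical, for illustration only
table = [
    [completed_men, 455 - completed_men],             # men: completed, did not complete
    [completed_women, 157 - completed_women],         # women: completed, did not complete
]
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
```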

Page 33

Different outcomes for male and female students

Page 34

When do students stop submitting assignments?

Page 35

Performance on different parts of the exam. Part A is multiple-choice questions; Part B is short-answer questions; Part C is longer questions (with choice).

Page 36

Performance on different questions in the exam, for Part C (choice of 3 out of 7 long questions)

Page 37

Choice of different questions in the exam, for Part C (choice of 3 out of 7 long questions)

Page 38

Performance on iCMAs (interactive computer-marked assignments). Note that a wide range of questions is represented, but there is no obvious correlation between male/female performance and question type.

Page 39

Demographics

There are no obvious differences between the distributions of other demographic factors (e.g. age, previous educational qualifications) for men and women.

However, there is some indication that women with other characteristics, in particular:

o having fewer than two A levels
o not having English as a first language

may be particularly likely to withdraw.

There is also some evidence that women appreciate different aspects of student support/tuition than men.

Although the proportion of students with A levels is similar for men and women, we don’t know how many have A levels in maths and physics.

Page 40

Summary and some questions

●We have a significant attainment gap between men and women on our level 2 physics module; we have ruled out many possible explanations but further investigation is of vital importance.

●Block 2 (Describing motion) and Block 3 (Predicting motion) may be creating particular problems.

●Particular questions (not question types) may be causing particular problems.

●We know that most girls who do maths and physics A level go straight to University; does this mean that the female OU population has a relatively smaller proportion of students with A levels in maths and/or physics?

●Our level 1 modules prepare students for the content of S207/S217, but do they prepare students for the question types?

●Are our teaching materials and questions sufficiently clear for students for whom English is not their first language?

●Women appear to be more likely to give up if they are finding the module difficult. Is this because they have a different motivation for study, less time, or less confidence?

Page 41

What next? Your thoughts and suggestions please

●S217 has a different tuition and assessment strategy – review the impact

●Survey (male and female) students to find out more about their previous qualifications, whether they consider English to be their first language, and their perception of their preparedness to study S207/S217.

●Work with a statistician to model which demographic factors may be combining to contribute to success or the lack of it – for all students and across several universities.

●Combine the work with another project that is using free-text questions to establish the “Force Concept Inventory”.

Page 42

References

Clow, D. (2013). An overview of learning analytics. Teaching in Higher Education, 18(6), 683-695.

Ellis, C. (2013). Broadening the scope and increasing the usefulness of learning analytics: The case for assessment analytics. British Journal of Educational Technology, 44(4), 662-664.

Nicol, D. & Macfarlane‐Dick, D. (2006). Formative assessment and self‐regulated learning: A model and seven principles of good feedback practice. Studies in Higher Education, 31(2), 199-218.

Redecker, C., Punie, Y., & Ferrari, A. (2012). eAssessment for 21st century learning and skills. In A. Ravenscroft, S. Lindstaedt, C. D. Kloos & D. Hernandez-Leo (Eds.), 21st Century Learning for 21st Century Skills (pp. 292-305). Berlin: Springer.

Page 43

For more about what I’ve discussed

Jordan, S. (2011). Using interactive computer-based assessment to support beginning distance learners of science. Open Learning, 26(2), 147-164.

Jordan, S. (2012). Student engagement with assessment and feedback: Some lessons from short-answer free-text e-assessment questions. Computers & Education, 58(2), 818-834.

Jordan, S. (2013). Using e-assessment to learn about learning. In Proceedings of the 2013 International Computer Assisted Assessment (CAA) Conference, Southampton, 9th-10th July 2013. Retrieved from http://caaconference.co.uk/proceedings/

Jordan, S. (2014). Adult science learners’ mathematical mistakes: An analysis of student responses to computer-marked questions. European Journal of Science and Mathematics Education, 2(2), 63-87.

Page 44

For more on OU e-assessment systems

Butcher, P. G. (2008). Online assessment at the Open University using open source software: Moodle, OpenMark and more. In Proceedings of the 12th International Computer Assisted Assessment (CAA) Conference, Loughborough, 8th-9th July 2008. Retrieved from http://caaconference.co.uk/pastConferences/2008/proceedings

Hunt, T. J. (2012). Computer-marked assessment in Moodle: Past, present and future. In Proceedings of the 2012 International Computer Assisted Assessment (CAA) Conference, Southampton, 10th-11th July 2012. Retrieved from http://caaconference.co.uk/proceedings/

Ross, S. M., Jordan, S. E. & Butcher, P. G. (2006). Online instantaneous and targeted feedback for remote learners. In C. Bryan & K. Clegg (Eds.), Innovative Assessment in Higher Education (pp. 123-131). London: Routledge.

Sangwin, C. J. (2013). Computer Aided Assessment of Mathematics. Oxford: Oxford University Press.

Much of what I have said is discussed in more detail in:

Jordan, S. E. (2014). E-assessment for learning? Exploring the potential of computer-marked assessment and computer-generated feedback, from short-answer questions to assessment analytics. PhD thesis, The Open University. Retrieved from http://oro.open.ac.uk/4111

Page 45

Sally Jordan

email: [email protected]

twitter: @SallyJordan9

blog: http://www.open.ac.uk/blogs/SallyJordan/