
“Back to the Future: Electronic Marking of Objective Structured Clinical Examinations and Admission Interviews for the higher education market”

Dr Thomas JB Kropmans

Objective Structured Clinical Examinations and Multiple Mini Interviews (OSCE/MMI)

[Slide diagram: OSCE circuit with Stations 1–7, each occupied by an examiner, a patient and a student; waiting areas for students still to be examined and already examined; a secretary desk; and arrows showing the flow of students through the stations.]

[Slide diagram: Tablet/PDA → Web Server/Database → Web Server/Central Database]
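A minimal sketch of that tablet-to-server flow, assuming a hypothetical REST endpoint (`/api/marks`) on the central web server; the endpoint name and payload fields are illustrative, not taken from the presentation.

```python
# Hypothetical example: an examiner's tablet submits one station mark
# to the web server, which stores it in the central database.
import requests

mark = {
    "station": 3,
    "student_id": "S1024",
    "examiner_id": "E07",
    "checklist_score": 14,   # marks awarded out of 20
    "global_rating": 2,      # global grade on the 0-4 scale (Fail=0 ... Excellent=4)
}

response = requests.post("https://osce.example.org/api/marks", json=mark, timeout=5)
response.raise_for_status()  # a 2xx response means the mark reached the central database
```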

• Admission interviews (Students)
• Selection interviews (HR)
• Objective Structured Clinical Examinations (Health Care, Law)
• Workplace Based Assessment

Where to be used

• Only student ability to pass an examination/interview is being assessed
• Difficulty of the assessment is not taken into account
• Heterogeneity of measurement instruments (stations) limits comparability
• Comparison of student competence across institutions and assessment (OSCE/MMI) settings is required
• Universal adoption of a standardized instrument is recommended (Clinical Teacher, September 2014)

Without standard setting

• Of 1,000 forms, 300 contain errors
• Low Cronbach's Alpha (< 0.8 is poor)
• Pass mark of 50%?
• Introduction of the Online Marking Tool in December 2008

30% error rate (300 of 1,000 forms)

Purpose of a (formative or summative) clinical skills exam?

• Assess clinical skills or non-cognitive skills
• Discriminate between ‘good’ and ‘bad’ performance
• Provide adequate and timely feedback
• Reach a reliable ‘pass’ or ‘fail’ decision

Real time solution

Internal Consistency of Stations

Cronbach’s Alpha: internal consistency
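A minimal sketch of how the internal consistency of one station’s checklist could be computed; the marking matrix below is hypothetical, and only numpy is assumed.

```python
import numpy as np

def cronbach_alpha(item_scores: np.ndarray) -> float:
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of total scores)."""
    k = item_scores.shape[1]                          # number of checklist items
    item_var = item_scores.var(axis=0, ddof=1)        # variance of each item across students
    total_var = item_scores.sum(axis=1).var(ddof=1)   # variance of students' total scores
    return k / (k - 1) * (1 - item_var.sum() / total_var)

# Hypothetical example: 6 students marked on 4 checklist items (0-2 marks each).
scores = np.array([
    [2, 1, 2, 2],
    [1, 1, 1, 0],
    [2, 2, 2, 1],
    [0, 1, 0, 0],
    [2, 2, 1, 2],
    [1, 0, 1, 1],
])
print(f"Cronbach's alpha = {cronbach_alpha(scores):.2f}")  # values below ~0.8 flag poor internal consistency
```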

Issues with examiners

Skewed distribution: median = 74%; minimum score = 4/20 (20%) and maximum = 20/20 (100%)

Distribution of scores

[Histogram: number of students by score, ranging from 20% to 100%, with the median marked at 74%]

Professional impression of examiners

[Scatter plot: station scores (0.0–10.0) plotted against examiners’ global scores (0–4)]

Method 1: Single borderline score

Method 1: Global score (Fail=0, Borderline=1, Clear Pass=2, Good Pass=3, Excellent=4)


Borderline Regression Analysis

Improved pass/fail decisions
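A minimal sketch of the borderline regression method referred to above: station (checklist) scores are regressed on the examiners’ global grades, and the station’s cut score is the predicted checklist score at the Borderline grade (1). The ratings and scores below are hypothetical.

```python
import numpy as np

# Examiners' global grades (Fail=0, Borderline=1, Clear Pass=2, Good Pass=3, Excellent=4)
global_ratings = np.array([0, 1, 1, 2, 2, 2, 3, 3, 4, 4])
# Corresponding checklist scores for the same students at this station (out of 10)
station_scores = np.array([3.0, 4.5, 5.0, 6.0, 6.5, 7.0, 8.0, 8.5, 9.0, 9.5])

# Ordinary least-squares fit: station score = slope * global grade + intercept
slope, intercept = np.polyfit(global_ratings, station_scores, deg=1)

# The cut score is the regression prediction at the Borderline grade.
BORDERLINE = 1
cut_score = slope * BORDERLINE + intercept
print(f"Station cut score: {cut_score:.1f}/10 ({cut_score / 10:.0%})")
```

Because the cut score is derived per station from the examiners’ own global judgements, it moves with station difficulty instead of sitting at a fixed 50% pass mark.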

• 19 prestigious universities
• >200 OSCE/MMIs successfully administered
• Student and examiner results being analysed
• Cost reduction of 70%
• Error reduction of 30%

• 19% more NUIG students fail after introducing BRA
• Internal consistency varies from 0.45 to 0.85
• Cut-off score varies between stations from 40% to 75%
• Generalisability: Kappa between departments 0.4 to 0.9 (see the sketch below)

Results
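The last result quotes a kappa between departments of 0.4–0.9. A minimal sketch, assuming scikit-learn is available, of how such agreement on pass/fail decisions could be computed; the decisions below are hypothetical.

```python
from sklearn.metrics import cohen_kappa_score

# Pass/fail decisions two departments would reach on the same eight students
dept_a = ["pass", "pass", "fail", "pass", "fail", "pass", "pass", "fail"]
dept_b = ["pass", "fail", "fail", "pass", "fail", "pass", "pass", "pass"]

kappa = cohen_kappa_score(dept_a, dept_b)  # 1.0 = perfect agreement, 0 = chance-level agreement
print(f"Kappa = {kappa:.2f}")
```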
