SoM Assessment Quality Management Guide Sep 2016 Page 1
School of Medicine Undergraduate Assessment Quality Management Guide
2016
Table of Contents
1 Introduction
2 Framework for Assessment
3 Assessment Formats
3.1 Range of assessment by year
4 Assessment material generation
5 Standard setting methods
5.1 Angoff Method
5.2 Hofstee
5.3 Borderline Regression
5.4 Borderline Groups
5.5 Pre-set pass mark
5.6 Satisfactory engagement
6 Assessment question type balance
7 Assessment content review process (Emendation)
8 Assessment delivery
8.1 Where?
8.2 Who invigilates?
8.3 Who are the markers?
8.4 Use of External Examiners
8.5 Use of university approved plagiarism detection software
8.6 On the day assessment material security methods
9 Scoring Assessments
9.1 Award of Degree of MBChB with Distinction
10 Post assessment material performance analysis
11 Rules for Progression
12 External Examiners
13 Feedback to students after Assessments
14 GLOSSARY
Appendix A: Keele School of Medicine Assessment team
Appendix B: Flow chart showing assessment generation process
Appendix C: Angoff briefing
Appendix D: Example Year 5 Blueprint
Appendix E: Flow chart Assessment delivery
Appendix F: Post assessment performance analysis
Appendix G: School of Medicine Progress Committee
Appendix H: School of Medicine Assessment Committee
1 Introduction
This guide has been developed to outline the different activities that are undertaken by the school
regarding assessments. The School of Medicine has a dedicated assessment team (Appendix A)
that works in conjunction with a broader group of clinicians and academics.
The intended audience for this guide is School of Medicine Staff, students and External Examiners for
the MBChB programme. It provides supplementary information regarding the assessments on the
course, providing additional information that is not included in the Programme Specification /
Course Regulations documentation.
2 Framework for Assessment
The School of Medicine has a comprehensive assessment programme which can be viewed in the
Assessment Practices document. Key aspects of this policy are outlined in this assessment section.
Overall, assessment is designed to:
- Assist students to achieve the learning objectives of the medical programme.
- Facilitate the development in students of the learning skills necessary to maintain currency in later professional practice.
- Provide evidence of the extent to which students have achieved the learning objectives of the course.
- Employ assessment practices that reflect current, evidence-based, best practice.
- Align with the curriculum in both content and process, assessing knowledge, skills and attitudes in an integrated manner.
- Provide feedback to all students after summative assessments.
- Follow a process of blueprinting to ensure appropriate sampling of material, reflecting common international assessment practices.
3 Assessment Formats
The School uses a variety of assessment formats throughout the programme. These include written
and practical assessments. Examples of written assessments include Single Best Answer questions
(SBAs), Extended Matching questions (EMQs), and short answer questions known as Key Feature
Problems (KFPs). Examples of practical assessments include the Objective Structured Clinical
Examinations (OSCEs) and Objective Structured Skills Examinations (OSSEs). This list is not exhaustive;
other formats may be used to support specific years of the course.
Some assessments will be ‘low stakes’ as their primary purpose is to provide feedback to students on
their learning progress. Other assessments will be ‘high stakes’ or summative as their primary
purpose is to inform decision-making about a student’s capacity to proceed to the next year of the
course or to graduate. Feedback will still be offered after high stakes assessments in order to
encourage students to continually improve their performance. Feedback is provided in a variety of
ways, including via an online portal, small and large group sessions, and individual meetings with
tutors for students whose performance is unsatisfactory.
3.1 Range of assessment by year
Year | Assessment Name | % towards Year | Standard Setting Method | Possible Grades
1 | Publication based paper | 10 | Borderline Groups | Satisfactory / Unsatisfactory
1 | Student Selective Component | 10 | Pre-set pass mark | Satisfactory / Unsatisfactory
1 | OSSE (16 stations) | 25 | Borderline Regression | Satisfactory / Unsatisfactory
1 | Knowledge Based (SBA / EMQ / KFP) | 55 | Hofstee | Satisfactory / Must Improve / Unsatisfactory
1 | Portfolio / Reflection / MSF | 0 | Satisfactory engagement | Satisfactory / Unsatisfactory
2 | Data Interpretation Paper | 10 | Hofstee | Satisfactory / Unsatisfactory
2 | Student Selective Component | 10 | Pre-set pass mark | Satisfactory / Unsatisfactory
2 | OSSE (16 stations) | 25 | Borderline Regression | Satisfactory / Unsatisfactory
2 | Knowledge Based (SBA / EMQ / KFP) | 55 | Hofstee | Satisfactory / Unsatisfactory
2 | Portfolio / Reflection / MSF | 0 | Satisfactory engagement | Satisfactory / Unsatisfactory
3 | Critical Appraisal Paper | 5 | Borderline Groups | Satisfactory / Unsatisfactory
3 | OSCE (12 stations) | 35 | Borderline Regression | Satisfactory / Unsatisfactory
3 | Knowledge Paper | 60 | Angoff | Satisfactory / Unsatisfactory
3 | SSC (2 x 4 weeks) | 10 (towards Year 4) | Pre-set pass mark | Satisfactory / Unsatisfactory
3 | Portfolio / Reflection / MSF | 0 | Satisfactory engagement | Satisfactory / Unsatisfactory
4 | OSCAR | 0 | Pre-set pass mark | Satisfactory / Unsatisfactory
4 | Knowledge | 45 | Angoff | Satisfactory / Unsatisfactory
4 | OSCE (16 stations) | 45 | Borderline Regression | Satisfactory / Unsatisfactory
4 | SSC (4 weeks) | 5 (towards Year 5) | Pre-set pass mark | Satisfactory / Unsatisfactory
4 | Portfolio / Reflection / MSF | 0 | Satisfactory engagement | Satisfactory / Unsatisfactory
5 | OSCE (14 stations) | 95 | Borderline Regression | Satisfactory / Unsatisfactory
5 | Portfolio / Reflection / MSF | 0 | Satisfactory engagement | Satisfactory / Unsatisfactory
5 | Elective | 0 | Satisfactory engagement | Satisfactory / Unsatisfactory
4 Assessment material generation
New assessment material is developed both for use "in house" and for sharing within the
MSCAA database. The School uses the Common Content questions set by the MSCAA team in the
Year 4 (finals) knowledge paper. Question writing workshops, where new material is generated
and refined, are run by the Assessment Tutors.
OSSE and OSCE Stations are produced in house and all material (both for practical and written
exams) is reviewed by an Emendation group, which is made up of Assessment Tutors, Year Leads and
Teaching Staff for the relevant year.
Assessment material is protected with password(s) and stored on a secure drive.
See Appendix B: Assessment material generation flow chart
5 Standard setting methods
Standard setting is a critical part of medical educational assessment as it determines the minimally
acceptable level of competence in order to allow progression to the next stage of the course, or to
graduate. The boundary between the acceptable and unacceptable level of competence is known as
the ‘cut score’. The School seeks to follow current evidence-based best practice in standard setting.
It uses a range of methods to ascertain what the cut score should be for different assessments. The
current methods employed by the School are summarised below.
5.1 Angoff Method
The (Modified) Angoff method is used pre-assessment to standard set the Years 3 and 4
knowledge-based papers.
Judges (these are members of staff who have a substantive role in teaching and assessment for that
particular year group of students) are briefed on the Angoff method and asked to visualise the
characteristics of a student who is only just competent to progress to the next stage. They are then
asked to review the whole paper and, for each question, to estimate the proportion of borderline or
"minimally acceptable" candidates they would expect to answer the question correctly. These
judgements are submitted and collated with those of all the other judges. A detailed group
discussion then takes place to compare the judgements of individual judges question by question.
After discussion, judges are given the opportunity to revise their estimates if they wish.
See Appendix C: Briefing material supplied to Angoff participants
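In the standard Angoff calculation the final cut score is the average, across judges, of each judge's mean per-question estimate. The sketch below illustrates that arithmetic only; the judge data are invented and the School's exact computation is not specified above.

```python
# Angoff sketch: each judge estimates, per question, the proportion of
# "minimally acceptable" students expected to answer correctly. The cut
# score is the mean of the judges' mean estimates, as a percentage.
def angoff_cut_score(judgements):
    n_items = len(judgements[0])
    judge_means = [sum(j) / n_items for j in judgements]   # per-judge average
    return 100 * sum(judge_means) / len(judge_means)

# Three judges' (post-discussion) estimates for a four-question paper:
judges = [
    [0.80, 0.55, 0.40, 0.70],
    [0.75, 0.60, 0.35, 0.65],
    [0.85, 0.50, 0.45, 0.75],
]
print(f"Cut score: {angoff_cut_score(judges):.1f}%")
```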
5.2 Hofstee
The Hofstee method is used post-assessment to standard set the Years 1 and 2 knowledge-based
papers.
The judges are required to answer four questions:
- What is the maximum acceptable pass score?
- What is the minimum acceptable pass score?
- What is the minimum acceptable fail rate?
- What is the maximum acceptable fail rate?
These four values define a box on a graph of fail rate against pass score. The cumulative candidate
performance curve (the fail rate produced by each possible pass score) is then plotted and should
pass through this box. The top-left and bottom-right corners of the box are joined by a straight line;
the pass score is read off at the point where this line crosses the candidate performance curve.
5.3 Borderline Regression
Borderline Regression is used post-assessment to standard set for the OSSE Years 1 and 2, and the
OSCE in Years 3, 4 and 5.
For each station, the checklist scores of all candidates are plotted against the overall global
judgement each candidate received, and a regression line is fitted to the data. The passing score for
the station is the value of the regression line at the 'borderline' judgement.
5.4 Borderline Groups
Borderline Groups is used to standard set the Year 1 Publication based paper and the Year 3 Critical
Appraisal Paper.
In this method, candidates are scored on each domain of an assessment against marking guidelines.
[Figure: standard-setting plot for the Year 2 Knowledge paper (scored out of 299).]
In addition to scoring performance, the marker is asked to give a global rating for each candidate on
a five-point scale:
1. Unsatisfactory
2. Borderline
3. Satisfactory
4. Good
5. Excellent
The median score of the candidates rated Borderline is then calculated. The SEM for that
assessment is also calculated, and 1 x SEM is added to this median to give the pass mark for that
assessment.
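The calculation above can be sketched as follows (scores, ratings and SEM invented for illustration):

```python
import statistics

# Borderline Groups sketch: pass mark = median score of candidates whose
# global rating was 'Borderline' (2 on the five-point scale), plus 1 x SEM.
def borderline_groups_pass(scores, ratings, sem):
    borderline = [s for s, r in zip(scores, ratings) if r == 2]
    return statistics.median(borderline) + sem

scores  = [55, 48, 52, 70, 61]
ratings = [2, 1, 2, 4, 3]        # 1=Unsatisfactory ... 5=Excellent
print(borderline_groups_pass(scores, ratings, sem=2.5))
```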
5.5 Pre-set pass mark
The pre-set pass mark is an agreed pass mark (53%) that is used for all SSCs (Years 1 to 4).
5.6 Satisfactory engagement
Satisfactory engagement is required for Years 1 - 5 portfolio, reflection, MSF and elective and is
defined as:
- Attendance at PBL and small group meetings;
- Attendance at practical and clinical learning sessions;
- Evidence of satisfactory learning progress as determined by discussion of the Learning Portfolio with the Tutor/Appraiser;
- Timely submission of formative assessment;
- Reflection on performance in formative assessment; and
- Timely submission of MSF reports within each year.
6 Assessment question type balance
Each assessment is reviewed to ensure that the sampling of assessment material reflects the
blueprint. Assessment blueprints provide an overview of the specific competencies and skills
measured on any given assessment to ensure appropriate sampling of material taught throughout
each year of the course.
See Appendix D: Blueprint
7 Assessment content review process (Emendation)
As part of the emendation process, the emendation team will:
- Review the paper(s) / stations to ensure that the content is accurate and appropriate for the year group concerned.
- Make sure questions and instructions are clear.
- Make sure questions are not misleading to students.
- Make sure the answers we have, or would expect from students, are correct in line with current best practice.
- Make sure we are not asking the same thing multiple times.
- Make sure there is appropriate sampling of the blueprint that aligns with the curriculum for that year.
The emendation team will propose changes in response to the above. All changes are then reviewed
by the Assessments Team in conjunction with Year Leads and External Examiners.
8 Assessment delivery
8.1 Where?
Written assessments are delivered in University approved venues, such as the Sports Hall, where the
University holds written examinations for all its other courses.
OSSE / OSCE Examinations take place in School of Medicine Buildings at Keele, Royal Stoke Hospital
Site and Royal Shrewsbury Hospital.
See Appendix E: Flow chart Assessment delivery
8.2 Who invigilates?
Written assessments are invigilated by University employed invigilators.
OSSE / OSCE examinations are invigilated by members of the Assessment Team / Year Leads and
other appointed personnel as appropriate.
8.3 Who are the markers?
Written assessments are marked by members of the Teaching Team appropriate for the year group
they are marking. All OSCE examiners are required to have undertaken the OSCE training session run
by Staff Development and are then required to observe an OSCE session before marking a station on
their own. All KFP markers mark in pairs; a policy of “round the table” marking is in operation so that
any ambiguities in the marking scheme that may become apparent after students have taken the
assessment can be quickly resolved by consensus.
OSSE / OSCE Assessments are marked by trained examiners, who are academic / clinical staff
involved in teaching students.
8.4 Use of External Examiners
http://www.keele.ac.uk/qa/externalexaminers/
Each year of the course has at least two External Examiners responsible for their assigned year.
There is a Senior External Examiner for the School who oversees the 5 years of the course, normally
taking a particular interest in one year group or aspect of the course during each academic year.
External Examiners are sent questions for review, attend OSSE / OSCE assessments as observers and
attend Examination Boards.
8.5 Use of university approved plagiarism detection software
All in-course assessments are submitted by students via university approved plagiarism detection
software. The Academic Conduct Officer for the School deals with any suspected cases of plagiarism
in line with University policies.
8.6 On the day assessment material security methods
Written papers are all stored in a locked, secure area prior to transporting them to the venue.
For practical examinations, students are held in quarantine before or after the exam so that
students who have taken the exam cannot communicate with those who have not. No form of
communication device may be used in the quarantine areas.
9 Scoring Assessments
The Standard Error of Measurement (SEM) will be calculated for each paper and used to determine
boundaries between scores. The SEM will not be recalculated for re-sits, as the smaller number of
candidates makes the estimate less reliable; instead, the SEM from the main assessment is applied.
Table 1: Determining grades in summative assessment.

Grade | Score | Interpretation
S | Years 1, 2, 3 and 4: more than Cut Score(a) + 1 SEM | Satisfactory
Must Improve | Year 1: between Cut Score(a) - 1 SEM and Cut Score(a) + 1 SEM | Satisfactory but must improve
U | Year 1: less than Cut Score(a) - 1 SEM; Years 2, 3 and 4: less than Cut Score(a) + 1 SEM | Unsatisfactory

a. The Cut Score is determined at Standard Setting as the score expected of a student who just
meets the threshold for progression. The Standard Error of Measurement (SEM) reflects the
imprecision of the Cut Score.
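The boundaries in Table 1 can be sketched as a small function. Grade labels and the handling of scores that fall exactly on a boundary are assumptions for illustration, not School policy:

```python
# Sketch of Table 1: grades built from the cut score and 1 x SEM.
# Year 1 has a 'Must Improve' band; Years 2-4 are pass/fail about
# cut score + 1 SEM.
def grade(score, cut_score, sem, year):
    if score > cut_score + sem:
        return "Satisfactory"
    if year == 1 and score >= cut_score - sem:   # boundary handling assumed
        return "Must Improve"
    return "Unsatisfactory"

print(grade(64, cut_score=60, sem=3, year=1))
print(grade(58, cut_score=60, sem=3, year=1))
print(grade(58, cut_score=60, sem=3, year=2))
```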
9.1 Award of Degree of MBChB with Distinction
The major purpose of the summative assessment is to allow those students who have displayed the
relevant competencies to progress in the course and to graduate; conversely, those who have not
displayed these competencies are referred for remediation or exclusion from the course. However,
some students will excel in these assessments and this will be recognised by the award of a degree
of MBChB with Distinction.
Distinction points are awarded to those students attaining a high overall score in the Summative
examinations for that year.
A student becomes ineligible for the award of Distinction points within a year if any summative
assessments are unsatisfactory at the first attempt. Also, a student who defers an examination for
any reason will not automatically be eligible for Distinction points in that Year.
Distinction points may be awarded by an examination board for a student who defers for a health or
other unavoidable reason, using the cohort data from the first sitting of the exam.
The award of a degree with Distinction is conditional on satisfactory completion of the Portfolio and
the demonstration of a high level of professional practice as determined by the Examination Board.
The guidance for the Examination Board on the award of Distinction points is:
- 4 Distinction points are required from the course for the award of the degree of MBChB with Distinction.
- One of these points must be obtained in the examinations in either Year 4 or Year 5.
- One Distinction point may be obtained in each of Years 1 and 2. Up to two Distinction points can be gained in each of Years 3, 4 and 5.
- For graduate entry students entering the course in Years 2 or 3, excellence in a prior degree may allow award of Distinction points at the discretion of the School (maximum of 1 for Year 2 entrants, 2 for Year 3 entrants).
- Distinction points will be awarded by the examination board at the end of each year based on suggested thresholds. The exact cut point for the award of Distinction points will be based on the recommended thresholds but may be modified according to the performance of the examination in that particular year.
The recommended thresholds for consideration by the examination board are:
- In each of Years 1 and 2: 1 Distinction point for a student who is ranked in the top 15% of the year.
- In each of Years 3, 4 and 5: 1 Distinction point for students in the top 15% of the year, and two Distinction points for students in the top 5% of the year.
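The recommended thresholds can be illustrated as follows. The ranking convention (rank 1 is the top student) is an assumption for illustration:

```python
# Distinction points sketch from the recommended thresholds above:
# Years 1-2: 1 point for the top 15%; Years 3-5: 2 points for the top 5%,
# otherwise 1 point for the top 15%.
def distinction_points(rank, cohort_size, year):
    fraction = rank / cohort_size        # rank 1 = top student (assumed)
    if year >= 3 and fraction <= 0.05:
        return 2
    if fraction <= 0.15:
        return 1
    return 0

print(distinction_points(rank=5,  cohort_size=100, year=4))
print(distinction_points(rank=10, cohort_size=100, year=1))
print(distinction_points(rank=20, cohort_size=100, year=2))
```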
The Year 5 Exam Board has final approval of the award of Distinction points and will generate the
list of students who will graduate with Distinction each year.
10 Post assessment material performance analysis
The School’s psychometrician performs extensive analysis of how each assessment has performed.
Both the assessment as a whole and individual assessment items are scrutinised using recognised
psychometric techniques. This information is fed back to those responsible for designing the
assessments in order to inform future development of questions.
See Appendix F: Post assessment material performance analysis
11 Rules for Progression
Rules for progression at first attempt of Year assessments:
1. A SATISFACTORY final grade in every assessment group allows a student to progress to the next
academic Year of the course.
2. A final UNSATISFACTORY grade requires the student to resit the assessment.
Rules for progression at resit:
1. A SATISFACTORY final grade in every assessment group allows a student to progress to the
next academic Year of the course, provided the combined Year mark is greater than or equal
to 53.
2. An UNSATISFACTORY grade at resit, in any year of the course, will result in exclusion from the
course, subject to consideration by the Progress Committee.
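For clarity, the resit progression rule can be sketched as below. Grade labels and function names are illustrative assumptions:

```python
# Resit progression sketch: a student progresses only if every assessment
# group is Satisfactory AND the combined Year mark is at least 53.
def progresses_at_resit(group_grades, combined_year_mark):
    return (all(g == "Satisfactory" for g in group_grades)
            and combined_year_mark >= 53)

print(progresses_at_resit(["Satisfactory", "Satisfactory"], 61))    # True
print(progresses_at_resit(["Satisfactory", "Unsatisfactory"], 61))  # False
```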
12 External Examiners
USE OF EXTERNAL EXAMINERS: http://www.keele.ac.uk/qa/externalexaminers/
The role of external examiners in UK medical schools is important as it opens to external scrutiny the
process by which assessment materials are developed, selected, applied, and how student scores
and progress decisions are made.
The practice at Keele is to use a small number of expert assessors to provide scrutiny and advice
on all steps of the assessment process, rather than confining their participation to observing clinical
examinations. There will be at least one external examiner for each year, with content expertise
most relevant to the content of the year, and with assessment development expertise. They may
participate in each of the following phases:
1. Assessment item writing and standard setting;
2. Perusal of examination papers;
3. Observation of practical/clinical examinations;
4. Examination Board meetings; and
5. A meeting with students to gain independent feedback on the examinations.
Examination Board
The Medical School, as part of the University's regulations, holds Examination Board meetings.
Representatives from the senior management of the Medical School, relevant unit and Year leaders,
and relevant external examiners attend the Board and ensure that the award of assessment and
examination scores is carried out with due process. This Board confirms the assessment outcomes
awarded. The University Regulations governing the examinations for the degree of MBChB are
available on the School of Medicine website.
13 Feedback to students after Assessments
The School regards feedback after assessments as an important part of the learning process.
Feedback is provided in a variety of ways, including via an online portal, small and large group
sessions, and individual meetings with tutors for students whose performance is unsatisfactory. In
addition, students meet regularly with their Professional Development Tutors to discuss their
learning progress.

For Years 3 - 5 clinical assessments (Objective Structured Clinical Examinations – OSCEs), examiners
provide written and audio comments which may be accessed by students via the feedback website.
Students can see a detailed breakdown of the results, with the information displayed in a variety of
formats to suit the needs and preferences of the students. Students are able to access audio clips of
examiners giving them specific feedback about their strengths and areas for development for
particular questions. For the OSSEs in Years 1 and 2, students also have access to a simpler version of
the website.
Online feedback is also available for written assessments. Students can view a detailed breakdown
of their performance for specific parts of the course, and compare their performance with their peer
group as a whole.
In addition to exam feedback arranged by the Assessments office, students may be referred to the
Enhanced Professional and Academic Support Service by Exam Board or in some cases by the Year
lead. A struggling student will be assigned an EPASS tutor who will act as a signpost to relevant help
and may in some cases offer help personally.
14 GLOSSARY
SPEEDWELL: Exam software used for computer marking
MSCAA: The Medical Schools Council Assessment Alliance (MSCAA) is a partnership to improve
undergraduate assessment practice through collaboration between all 33 undergraduate medical
schools in the UK.
Benchmarking: to measure the quality of something by comparing it with something else of an
accepted standard
Multi quest: Multiple choice test maker software
EPASS: Enhanced Professional and Academic Support Service
Appendix A: Keele School of Medicine Assessment team
[Organisation chart, reproduced as a list of roles:]
- Director of Assessments
- Deputy Director of Assessment and Assessments Tutor (SBA)
- Psychometrician
- Assessments Tutor (KFPs)
- Assessments Tutor (OSCE)
- Year Leads
- UG Assessments Manager
- E-Learning & E-Assessments Co-ordinator
- Assessments Office Coordinator
- Clinical Assessment Coordinator
- Assessments Administrator (x2)
- Professional Development & E-Portfolio Administrator
Appendix B: Flow chart showing assessment generation process
[Flow chart, reproduced as a sequence of steps:]
1. Blueprint (assessment tutors and year leads), aligned to GMC Tomorrow's Doctors outcomes / Themes / Units / Blocks.
2. Question / item generation (https://mscaa.shef.ac.uk/en/dashboard).
3. Question selection from SPEEDWELL / MSCAA: check of psychometrics (assessment tutors and year leads); balance check.
4. Exam Paper Review Board: papers sent in advance; input from those teaching, block leads / unit leads / skills team. Changes made by a smaller review board.
5. External Examiner review (https://www.keele.ac.uk/qa/externalexaminers/guidanceandinduction/): confirms the paper matches our own curriculum and that standards are comparable to their own institution. Changes sent to Year Leads / Assessment Tutors for final sign-off; response sent to the External Examiner regarding their comments.
6. Knowledge Paper Standard Setting Meeting, using benchmarking data and an awareness of factors such as Common Content data and institutional knowledge of cohorts.
7. Printing of written papers; images checked by year leads. OSCE marking criteria uploaded onto iPads; quality and accuracy checks made.
8. Question banks updated with any changes to questions.
Appendix C: Angoff briefing
Read each question on the PDF attached to this email. We need you to make three decisions about each question.

1. How relevant is the material that is being assessed to the work of a doctor starting their F1 placement? Is it essential (absolutely core knowledge), important (between the two other categories) or acceptable (slightly more supplementary knowledge, 'nice to know', but you could cope if an F1 doctor did not know it)?

2. How difficult is the material that is being assessed? Rate it at the level of the whole national cohort of F1 doctors [from all UK medical schools] starting their placements this August. Would this group find the question easy (more than two thirds would get the question right), moderate (half to two thirds would get the question right) or hard (fewer than half would get the question right)? Drop-down boxes on the Excel sheet have been provided to make this easier for you.

3. Think of the F1 doctors who are just about acceptable to be allowed to practise anywhere in the UK. They are safe in the core, essential areas, and can effectively do the job (perhaps more slowly or with less fluency), but they have significant specific areas for improvement that would need to be addressed in the F1 year. Think of 100 of these doctors, who have graduated from any UK medical school. What percentage of these doctors would answer this question correctly?

Remember that if they were to guess, they have a 1 in 5 chance of choosing the correct answer, so your answer should not be below 20% for the SBAs.

For the KFPs, you will see the number of marks available for that question – you need to indicate how many marks you think the just about acceptable student would get on each KFP question.
Please remember to follow these important points:
We want your opinion and judgement. It is subjective by its nature and that is fine. However, you must be able to justify each of your judgements so it needs to be a serious, considered process. Could you (hypothetically) stand up in court, or a GMC hearing, and justify your decision-making?
Don’t feel that you should ‘play safe’ by always keeping your judgements in the middle ground. The questions vary quite a lot, so we would expect your judgements to vary considerably from question to question. When we discuss the questions, we will ask different members of the panel to justify their judgements. We will choose judges whose responses are in the middle of the range as well as those from either extreme.
It is extremely important that you do not think about what the Keele students have been taught (e.g. “They will do well in that question as I gave them a lecture on it…”) or about certain Keele students who you know. Instead, you should always be thinking about the national picture. How would F1 doctors who graduated from Aberdeen, Brighton, Cambridge etc. cope with these questions? This is because the GMC requires us to set standards which are comparable with other medical schools.
Appendix D: Example Year 5 Blueprint
The blueprint records, for each station: a description, the assessment focus, the number of stations, and the body systems covered (Respiratory; CVS; Urology/renal; Gastrointestinal/hepatology; Endo/Metab; Sexual + repro health; Psychiatry; Neurosciences; Musc/Skeletal; Haematology; Other). Stations are titled as either GP or secondary care.

History-taking in difficult circumstances (1 station)
For example, in the context of: memory impairment, hearing impairment, learning disability, intoxication, language difficulties.
Focus: Can the candidate make a safe and correct diagnosis despite the communication challenges?

Clinical examinations (real patients or SPs) (2 stations)
Should be realistic and integrate two or more systems.
Focus: Can the candidate make a safe diagnosis and management plan?

Acute illness assessment + management (2 stations)
Common and important scenarios for F1 doctors.
Focus: Can the candidate make a safe diagnosis and management plan?

Responding to results of investigations (1 station)
Common and important scenarios for F1 doctors.
Focus: Can the candidate recognise important results which could compromise the safety of a patient and take prompt and safe action?

Management in difficult circumstances (1 station)
For example: breaking bad news; negotiation skills, e.g. patient self-discharging, cancelled/delayed operation, angry relative, dying patient.
Focus: Can the candidate manage the situation safely despite the communication challenges?

Safe Prescribing (3 stations)
Including writing up of prescriptions for drugs, oxygen or fluids, or explanation of a change of medication regime.
Focus: Can the candidate prescribe safely in a time-pressured environment?

Professionalism + Patient Safety (1 station)
Including assessment of capacity, confidentiality, duty of candour, other ethical issues.
Focus: Can the candidate manage the situation safely despite the ethical challenges?

Communication with healthcare team (1 station)
For example: handover, negotiating a request for an investigation, requesting involvement of a different specialty.
Focus: Can the candidate communicate in a professional, effective and safe manner?

Discharge planning (1 station)
Ensuring a patient is safe to be discharged.
Focus: Can the candidate make a safe management plan and communicate it effectively?

Practical Procedures in difficult circumstances (1 station)
For example: with a patient who is drunk, aggressive or extremely anxious, or when under time pressure.
Focus: Can the candidate complete the procedure competently despite the challenges?

Body-system coverage across the circuit: Respiratory 1 or 2; CVS 1 or 2; Urology/renal 1; Gastrointestinal/hepatology 1 or 2; Endo/Metab 1 or 2; Sexual + repro health 1; Psychiatry 1; Neurosciences 1 or 2; Musc/Skeletal 1; Haematology 1; Other 0 or 1. Total: 14 stations.
Appendix E: Flow chart of assessment delivery (OSCE/OSSE and knowledge papers)

OSCE/OSSE:
- Examiners are trained in advance and must also observe before examining (http://www.keele.ac.uk/medicine/staffinformation/staffdevelopment/).
- Student briefing and examiner briefing (examples provided).
- Clinical invigilator and support staff on the circuit.
- iPad data are backed up with the use of iPods and paper mark-sheet back-ups.
- Marks are downloaded from iPads into Excel.
- Formulas / standard setting applied; pass/fail determined for each station.

Knowledge papers:
- Invigilated by the central University (http://www.keele.ac.uk/regulations/regulation8/).
- SBA / EMQ papers are marked by MultiQuest software; question performance data are sent to year leads for a QA check.
- KFPs are team marked / double marked against comprehensive marking schemes.
- Marks are entered into Excel (a two-person role); checks are made by year leads.
- Standard setting applied; outcomes decided on progression criteria; checks made by year leads / Director.

Both routes then feed into:
- The master exam board spreadsheet; checks made by the UG Assessments Manager.
- Exam boards, run by year group: assessment performance is discussed and student progression decisions are made, with key teaching staff, year leads, external examiners, assessment tutors and senior management in attendance. See http://www.keele.ac.uk/regulations/ for appeals / exam board University regulations, the Progress Committee and the Health and Conduct Committee.
- Feedback to students.
- Assessments wash-up meeting, held by year: evaluation of assessment material supported by data; questions about assessments are included in evaluations and considered by the QA office, with year leads responding.
Appendix F: Post assessment performance analysis
Overview of the question / item analysis that takes place after assessments:
Knowledge Papers:
Simple statistics are calculated as below, and histograms of performance with overlaid normal curves are also produced.
For each of the MCQ paper, the KFP paper and the total, the following are reported: N; Marks Available; Mean; Median; Std. Deviation; Minimum; Maximum.
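A minimal sketch of these summary statistics, using Python's standard-library statistics module; the mark data are invented for illustration.

```python
# Compute the summary statistics reported for each paper (N, Mean, Median,
# Std. Deviation, Minimum, Maximum) from a list of candidate marks.
import statistics

marks = [41, 55, 48, 62, 39, 57, 60, 44, 52, 49]  # hypothetical MCQ totals

summary = {
    "N": len(marks),
    "Mean": statistics.mean(marks),
    "Median": statistics.median(marks),
    "Std. Deviation": statistics.stdev(marks),  # sample standard deviation
    "Minimum": min(marks),
    "Maximum": max(marks),
}
for name, value in summary.items():
    if isinstance(value, float):
        print(f"{name}: {value:.2f}")
    else:
        print(f"{name}: {value}")
```

The same figures can then be plotted as a histogram with a fitted normal curve, as described above.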
Graphs of individual SBA / EMQ question performance are also produced, together with each item's correlation to the rest of the paper. Examples below:
Question | Point Bi-Serial (discrimination) | No. of candidates | No. passed | Facility (easiness) | Corrected Item-Total Correlation (discrimination) | Cronbach's Alpha if Item Deleted (internal consistency / reliability) | % choosing A / B / C / D / E / Blank
P1Q1 | 0.21 | 123 | 53 | 0.43 | 0.15 | 0.739 | 1.63 / 14.63 / 43.09 / 4.88 / 35.77 / 0.00
P1Q2 | 0.32 | 123 | 68 | 0.55 | 0.26 | 0.735 | 2.44 / 0.00 / 55.28 / 33.33 / 8.94 / 0.00
P1Q3 | 0.21 | 123 | 84 | 0.68 | 0.16 | 0.738 | 8.94 / 68.29 / 8.94 / 13.01 / 0.81 / 0.00
P1Q4 | 0.12 | 123 | 68 | 0.55 | 0.06 | 0.741 | 3.25 / 4.07 / 29.27 / 8.13 / 55.28 / 0.00
P1Q5 | 0.18 | 123 | 68 | 0.55 | 0.12 | 0.740 | 0.00 / 8.94 / 2.44 / 55.28 / 33.33 / 0.00
P1Q6 | 0.09 | 123 | 67 | 0.54 | 0.03 | 0.743 | 8.94 / 4.07 / 8.94 / 22.76 / 54.47 / 0.81
P1Q7 | 0.13 | 123 | 76 | 0.62 | 0.07 | 0.741 | 11.38 / 12.20 / 7.32 / 6.50 / 61.79 / 0.81
P1Q8 | 0.21 | 123 | 82 | 0.67 | 0.16 | 0.739 | 17.89 / 0.81 / 2.44 / 11.38 / 66.67 / 0.81
P1Q9 | 0.02 | 123 | 90 | 0.73 | -0.04 | 0.744 | 2.44 / 11.38 / 73.17 / 0.00 / 12.20 / 0.81
P1Q10 | 0.18 | 123 | 51 | 0.41 | 0.13 | 0.740 | 13.82 / 8.94 / 13.01 / 41.46 / 22.76 / 0.00
P1Q11 | 0.10 | 123 | 76 | 0.62 | 0.04 | 0.742 | 61.79 / 9.76 / 12.20 / 12.20 / 3.25 / 0.81
P1Q12 | 0.21 | 123 | 75 | 0.61 | 0.15 | 0.739 | 10.57 / 8.13 / 14.63 / 60.98 / 5.69 / 0.00
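The discrimination and reliability indices in the table above (facility, corrected item-total correlation, Cronbach's alpha if item deleted) can be illustrated with a small script. The 0/1 response matrix below is invented, and this is not the School's actual analysis software.

```python
# Item analysis from a binary response matrix: rows = items, columns =
# candidates (1 = correct, 0 = incorrect).
import statistics

def pearson(x, y):
    """Pearson correlation of two equal-length sequences."""
    mx, my = statistics.mean(x), statistics.mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def cronbach_alpha(items):
    """Cronbach's alpha from a list of per-item score lists."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]
    item_vars = sum(statistics.pvariance(i) for i in items)
    return k / (k - 1) * (1 - item_vars / statistics.pvariance(totals))

responses = [
    [1, 1, 0, 1, 0, 1, 1, 0],
    [1, 0, 0, 1, 0, 1, 0, 0],
    [1, 1, 1, 1, 0, 1, 1, 1],
    [0, 1, 0, 1, 0, 1, 1, 0],
]

totals = [sum(col) for col in zip(*responses)]
for i, item in enumerate(responses, start=1):
    facility = sum(item) / len(item)              # proportion answering correctly
    rest = [t - s for t, s in zip(totals, item)]  # total score minus this item
    corrected_r = pearson(item, rest)             # corrected item-total correlation
    others = [it for j, it in enumerate(responses) if j != i - 1]
    alpha_if_deleted = cronbach_alpha(others)     # alpha of the remaining items
    print(f"Q{i}: facility={facility:.2f}, r={corrected_r:.2f}, "
          f"alpha_if_deleted={alpha_if_deleted:.2f}")
```

The point bi-serial in the table is the analogous correlation of the binary item score with the total score; an item whose "alpha if deleted" exceeds the paper's overall alpha is weakening reliability and is flagged for review.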
Question | No. of candidates | Marks Available | Minimum | Maximum | Average Score | Av mark / Facility (%) | Corrected Item-Total Correlation | Cronbach's Alpha if Item Deleted | Priority | Suggestions
K1 a | 123 | 3 | 0.5 | 3 | 1.764 | 58.80 | 0.38 | 0.66 | |
K1 b | 123 | 6 | 2 | 6 | 5.122 | 85.37 | 0.28 | 0.67 | 2 | Too easy?
K1 c | 123 | 3 | 0 | 3 | 2.524 | 84.13 | 0.15 | 0.68 | 2 | Too easy? Poor discrimination with overall score
K1 d | 123 | 2 | 0 | 2 | 1.642 | 82.10 | 0.34 | 0.67 | 2 | Too easy?
K1 e | 123 | 2 | 0 | 2 | 1.74 | 87.00 | 0.21 | 0.68 | 2 | Too easy?
K1 f | 123 | 3 | 0 | 3 | 2.264 | 75.47 | 0.37 | 0.66 | |
K1 g | 123 | 1 | 0 | 2 | 0.927 | 92.70 | 0.31 | 0.67 | 1 | Too easy?
K2 a | 123 | 3 | 0 | 3 | 2.02 | 67.33 | 0.36 | 0.66 | |
K2 b | 123 | 2 | 0.5 | 2 | 1.732 | 86.60 | 0.23 | 0.67 | 2 | Too easy?
(As above: Facility indicates easiness; Corrected Item-Total Correlation indicates discrimination; Cronbach's Alpha if Item Deleted indicates internal consistency / reliability.)
OSCE:
Data are reviewed for each individual station to see how it performed and how it contributed to the exam as a whole. These data are made available to Assessment Tutors and Year Leads when selecting questions for subsequent years' assessments.
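The "Cut Score (BLR)" column in the station table comes from borderline regression (Section 5.3): each candidate's checklist score is regressed on the examiner's global grade, and the cut score is the predicted score at the borderline grade. A hedged sketch, with invented data and an assumed grade coding (0 = fail, 1 = borderline, 2 = pass, 3 = good, 4 = excellent):

```python
# Borderline regression: ordinary least-squares fit of station score on
# global grade, evaluated at the borderline grade to give the cut score.
import statistics

def blr_cut_score(scores, grades, borderline_grade=1):
    """Return the predicted station score at the borderline grade."""
    mg, ms = statistics.mean(grades), statistics.mean(scores)
    slope = (sum((g - mg) * (s - ms) for g, s in zip(grades, scores))
             / sum((g - mg) ** 2 for g in grades))
    intercept = ms - slope * mg
    return intercept + slope * borderline_grade

scores = [11, 14, 15, 18, 19, 22, 24, 26]   # hypothetical station marks out of 27
grades = [0, 1, 1, 2, 2, 3, 3, 4]           # examiners' global grades
print(round(blr_cut_score(scores, grades), 2))  # prints 14.71
```

The regression slope corresponds to the station table's "Inter-Grade Discrimination" column: the average increase in mark per step up in global grade.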
Columns: Station Number; Content; Marks Available; Cut Score (BLR); Average Score; SD; Minimum Score; Maximum Score; Median; Facility (%); Cronbach's Alpha if item deleted (overall α = 0.67); Number of Failures; Between-Group Variation (Effect Size; p value) for Exam Location Effects; Between-Group Variation (Effect Size; p value) for Placement Location Effects; R² (Total Station vs Category SS Score); Corrected Item-Total vs Total Score; Inter-Grade Discrimination (Average Increase in Mark per Category SS Increase).

Stn | Content | Marks | Cut (BLR) | Mean | SD | Min | Max | Median | Facility % | α if deleted | Fails | Exam loc (ES; p) | Placement loc (ES; p) | R² | Corr. item-total | Inter-grade disc.
1 | Epilepsy and Asthma Medication | 27 | 15.8 | 19.39 | 3.76 | 10 | 27 | 20 | 71.81 | 0.65 | 22 | 0.031; 0.045 | 0.062; 0.017 | 0.92 | 0.22 | 3.34
2 | Parkinson's Disease | 27 | 14.7 | 19.28 | 4.58 | 7 | 27 | 20 | 71.41 | 0.68 | 25 | 0.048; 0.013 | 0.008; 0.601 | 0.94 | 0.03 | 3.79
3 | Abdominal Stoma Exam | 27 | 13.5 | 21.35 | 3.64 | 12 | 27 | 21 | 79.07 | 0.66 | 2 | 0.016; 0.147 | 0.010; 0.543 | 0.89 | 0.17 | 4.29
4 | Acute COPD | 27 | 14.9 | 18.82 | 3.76 | 8 | 27 | 18 | 69.70 | 0.64 | 14 | 0.018; 0.126 | 0.054; 0.029 | 0.90 | 0.28 | 3.46
5 | Administration Antibiotics | 27 | 15.75 | 19.53 | 4.74 | 8 | 27 | 20 | 72.33 | 0.64 | 23 | 0.040; 0.023 | 0.004; 0.766 | 0.91 | 0.29 | 3.47
6 | ABCDE | 27 | 14.79 | 18.32 | 3.60 | 10 | 27 | 18 | 67.85 | 0.62 | 17 | 0.054; 0.008 | 0.072; 0.009 | 0.88 | 0.46 | 3.34
7 | Palliative Care | 27 | 13.63 | 20.38 | 4.21 | 7 | 27 | 20 | 75.48 | 0.61 | 9 | 0.017; 0.136 | 0.056; 0.026 | 0.90 | 0.46 | 4.27
8 | Prescribing IV Fluids | 27 | 14.87 | 18.76 | 3.86 | 9 | 27 | 19 | 69.48 | 0.63 | 11 | 0.098; <0.001 | 0.082; 0.004 | 0.85 | 0.35 | 3.33
9 | CVS Exam | 23 | 12.88 | 17.11 | 3.09 | 9 | 23 | 17 | 74.39 | 0.66 | 6 | 0.001; 0.739 | 0.010; 0.519 | 0.87 | 0.16 | 2.9
10 | MSK Knee | 27 | 15.28 | 20.68 | 3.52 | 13 | 27 | 20 | 76.59 | 0.63 | 10 | 0.017; 0.138 | 0.009; 0.573 | 0.90 | 0.34 | 3.52
11 | Medication History Review | 27 | 14.41 | 19.35 | 4.49 | 6 | 27 | 19 | 71.67 | 0.63 | 14 | 0.026; 0.067 | 0.044; 0.058 | 0.92 | 0.37 | 3.83
12 | Skill Cannulation | 23 | 14.25 | 17.7 | 2.76 | 10 | 23 | 18 | 76.96 | 0.64 | 16 | 0.017; 0.141 | 0.044; 0.058 | 0.85 | 0.34 | 2.66
13 | Psychotic Depression | 27 | 15.97 | 21.35 | 3.35 | 10 | 27 | 21.5 | 79.07 | 0.63 | 6 | 0.216; <0.001 | 0.164; <0.001 | 0.89 | 0.36 | 3.16
14 | History Abdominal Pain | 31 | 17.96 | 22.7 | 4.39 | 13 | 31 | 23 | 73.23 | 0.66 | 20 | 0.003; 0.504 | 0.003; 0.838 | 0.89 | 0.19 | 3.69
Appendix G:
School of Medicine Progress committee: see http://medicine2.keele.ac.uk/qa/students.html
Appendix H: School of Medicine Assessment Committee
Membership, frequency & reporting
TITLE: Assessment Committee
MEMBERSHIP:
Director of Assessments (Chair)
Deputy Director of Assessments
Director of Academic Undergraduate Studies
Director of Curriculum
1 (Assessment) Lead for each of the 5 Modules
Director of Skills
Lead for Student Selected Components (SSC)
Hospital Dean
Education Office Manager
Undergraduate Manager
2 student members (1 senior, 1 junior)
Lay member
Head of School (ex officio)
Director of Undergraduate Programmes (ex officio)
Director of Academic General Practice (ex officio)
FREQUENCY OF MEETINGS: Quarterly
RECEIVE REPORTS FROM:
a) Progress Committee b) Examination Boards for Years 1-5
REPORTS TO: Undergraduate Course Committee
ADMINISTERED BY: Assessment Office administrator
TERMS OF REFERENCE: Assessment Committee
1. Maintain a strategic overview of assessment across all five modules of the curriculum. Ensure that decisions taken by Module teams do not duplicate, or adversely impact on, assessments in other Modules.
2. Receive reports from the Modules on exam success rates/failures and quality assurance, and from the external examiners. Ensure a consistent approach to issues raised by any of the reports, including responses to external examiner reports.
3. Oversee the methods of standard setting used in all assessments, ensuring a consistent approach across Modules. Where appropriate, ensure that course regulations concerning the award of grades are reviewed and updated.
4. Make decisions concerning the methods used in awarding distinction points and which assessments will
be included in the process.
5. Consider matters of resource for running assessments, ensuring that where necessary assessments are modified to make best use of resources.
6. Determine the appropriate criteria for progress decisions in the light of experience of the new curriculum
and where appropriate update the course regulations.
7. Review assessment data longitudinally to ensure that policies are adjusted in the light of experience. Examples might include examining the standards required over time to ensure consistency, or following poorly performing students to determine whether decisions to exclude them from the course could be made at an earlier stage.
8. Oversee the maintenance of the Speedwell question bank to ensure consistent tagging and co-
operation between Modules with regards to assessment material.
9. Make decisions concerning the form and timing of student feedback and work to ensure that feedback
on the course is of the highest possible standard within the available resources. Ensure clear
communication with the student body, via the modules, about these processes.
10. Ensure that adequate formative assessment material is available and that formative assessment takes place.