
EDUCATION AND TRAINING

Assessment and feedback in emergency medicine training: Views of Australasian emergency trainees

Simon Craig,1 George Braitberg,1,2 Caroline Nicolas,2 Geoff White2 and Diana Egerton-Warburton1

1Southern Health, and 2Monash University, Clayton, Victoria, Australia

Abstract

The aim of the present study is to describe ACEM trainees’ perspectives on assessment and feedback during their training. From May to July 2009, an anonymous Web-based survey on training and supervision in emergency medicine was conducted, addressing trainees’ perceptions of mandatory assessments (primary examination, fellowship examination and mandatory trainee research requirement) and feedback at work. Qualitative data were analysed using grounded theory methodology – themes were identified by close examination of full text responses. In total, 622 trainees responded to the survey (response rate of 37%). Trainees report that general clinical supervision is adequate; however, direct supervision at the bedside and feedback could be significantly improved. They perceive that the primary examination is necessary, although they feel it is irrelevant to their development as emergency trainees and are keen for more clinically applied knowledge to be tested. They dislike mandatory trainee research, feeling inadequately supported and distracted from other aspects of their training. The fellowship examination was overall thought to be fair; however, there were concerns with the time pressures and restrictions of the written component of the examination. Additionally, the structured clinical examination was popular, whereas short cases and long cases were very unpopular. ACEM trainees’ views of training may help inform curriculum development, and might assist those providing education to improve local training programs.

Key words: education, emergency medicine, professional.

Introduction

The ACEM is responsible for emergency medicine training across Australia and New Zealand. Training takes a minimum of 7 years (Fig. 1). During this time, trainees must complete three major assessments – the primary examination,2 the fellowship examination,3 and the trainee research requirement4,5 (Table 1). Although the training process is well documented,1 little has been reported about ACEM trainees’ perceptions and experiences of assessment, supervision and feedback.

Correspondence: Dr Simon Craig, 246 Clayton Road, Clayton, Vic. 3168, Australia. Email: [email protected]

Simon Craig, FACEM, GCHPE, Emergency Physician/Registrar Education Coordinator; George Braitberg, FACEM, FACMT, Dip Epi Biostats, Professor of Emergency Medicine; Caroline Nicolas, BSocSci(Hons), Research Coordinator; Geoff White, PhD, MEd Studies, DipEd, BAppSci, Senior Lecturer; Diana Egerton-Warburton, FACEM, M Clin Epi, Emergency Physician/Director of Emergency Medicine Training.

doi: 10.1111/j.1742-6723.2010.01353.x  Emergency Medicine Australasia (2010) 22, 537–547

© 2010 The Authors. EMA © 2010 Australasian College for Emergency Medicine and Australasian Society for Emergency Medicine.

Trainees are also regularly asked to reflect on their progress and receive formal feedback every 3 to 6 months from their supervisors and Directors of Emergency Medicine Training (DEMTs). This feedback is documented through In-Training Assessment forms.6

During a clinical shift, teaching ‘on the floor’ usually takes the form of ad hoc case presentations, with a junior doctor seeing a patient independently, followed by discussion of the case with a senior clinician. Direct observation of trainees taking histories and performing physical examination is infrequent.7

The primary aim of the present paper is to present ACEM trainees’ perceptions and experiences of assessment, supervision and feedback. We designed a cross-sectional survey of ACEM trainees to describe these perceptions. The results of this survey may provide data to inform decisions on curriculum development and trainee assessment. The findings might also be applicable to other areas of clinical training.

Figure 1. Overview of ACEM training programme (used with permission).1

Table 1. Major assessments in ACEM training

Primary examination
• Basic sciences – anatomy, pathology, pharmacology and physiology.
• Each subject is tested with a 60-question multiple choice exam and a 10 min structured viva.
• Any number of subjects (a single subject or up to all four) might be sat at a single examination.2

Regulation 4.10/trainee research
• ‘Trainees must either have published a paper in a recognised peer-reviewed journal or have presented a paper at a scientific meeting approved by the Board of Censors.’4
• Each project is adjudicated by Fellows of the ACEM who are appointed by the Board of Education.
• Trainees are not eligible to sit the fellowship examination until the requirement has been satisfactorily completed.5

Fellowship examination
• Aims to determine whether a candidate is ‘ready to practice independently at the level of a consultant emergency physician.’3
• Based on a comprehensive syllabus.
• Three written components – 60 multiple choice questions in 90 min, 8 visual aid questions over 1 h, and 8 short answer questions over 2 h.
• Three clinical components – one long case, four short cases, and a 6-station structured clinical examination.
• Invitation to the clinical part of the examination is dependent upon achieving a minimum standard in the written component.
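As context for the comments about time pressure in the written papers reported later, the time per question implied by the written components in Table 1 can be worked out by simple division. The sketch below is an illustrative addition; the exam format figures are from Table 1, but the per-item times are derived here, not quoted by ACEM.

```python
# Approximate time per item implied by the written components listed in Table 1.
# Exam format figures are from Table 1; the per-item times are simple derived
# arithmetic, not figures published by ACEM.
components = {
    "Multiple choice": (60, 90),    # (number of questions, total minutes)
    "Visual aid (VAQ)": (8, 60),
    "Short answer (SAQ)": (8, 120),
}
for name, (n_items, minutes) in components.items():
    print(f"{name}: {minutes / n_items:.1f} min per question")
# Multiple choice: 1.5 min per question
# Visual aid (VAQ): 7.5 min per question
# Short answer (SAQ): 15.0 min per question
```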

Methods

A cross-sectional survey was used to explore ACEM trainees’ perceptions and experiences of assessment, supervision and feedback. The survey was part of a larger study on ACEM trainees’ experiences of emergency medicine training.

An online survey comprising 38 questions was developed using SurveyMonkey, after reviews of similar published surveys involving speciality trainees.7 Focus group discussions were held with local ACEM trainees and fellows, and suggestions were also sought from the ACEM Board of Education and the Trainee Committee. The survey was finalized after pilot testing by emergency trainees from Southern Health, Clayton, Australia, and took approximately 20 min to complete.

Specific questions regarding training and ACEM requirements allowed for free text responses. The questions relevant to assessment, supervision and feedback are presented in Table 2.

All 1666 ACEM trainees were emailed a request to participate in the study on three occasions between May and July 2009. The email contained study background information and a link to the survey. Participation was voluntary, and completion of the survey provided a link to a prize draw for an Apple iPhone.

Trainees listed as inactive on the College’s database were excluded. There were no other exclusion criteria.

The survey was approved by the Southern Health Human Research Ethics Committee, the Standing Committee on Ethics in Research Involving Humans at Monash University, and the ACEM Scientific Committee. Informed consent was not specifically sought, as participation was voluntary.

Data analysis

Quantitative data were downloaded from the survey website and analysed using GraphPad InStat Version 3.10 (GraphPad Software, La Jolla, California, 2009).

Qualitative analysis was performed using grounded theory methodology. One author (SC) read through all free text answers to identify patterns of responding. Six main areas of interest were subsequently identified: supervision, feedback, DEMT–trainee interactions, primary examination, regulation 4.10/trainee research and fellowship examination. Data were analysed and presented according to themes and categories that emerged on close examination of full text responses.8 Themes were identified in view of their high frequency of occurrence. A second author (CN) conducted cross-verification of 30% of the data. The data were read repeatedly. Data with similar content and meaning were grouped categorically. Analysis was continued until all possible themes were identified (thematic saturation). Categories were refined through comparable responses, based on the level of concordance of responses.9 We attempted to adhere to previously published recommendations for improving the rigour of qualitative research – particularly emphasizing aspects of credibility, transferability, dependability and confirmability.10

Table 2. Survey questions relevant to trainees’ views of supervision and feedback

• How would you rate the SUPERVISION you receive in your day to day clinical work? (Poor/fair/good/excellent)
• How would you rate the FEEDBACK you receive in your day to day clinical work? (Poor/fair/good/excellent)
• How useful do you find the feedback you receive at DEMT/trainee meetings? (Not at all useful/useful/very useful)
• Do you have any other comments about emergency medicine training/teaching/supervision? (Free text)
• If YOU were running ACEM training, what changes (if any) would you make to the training programme? (Free text)
• If YOU were responsible for the primary examination, what changes (if any) would you make? (Free text)
• If YOU were responsible for the research project (4.10), what changes (if any) would you make? (Free text)
• If YOU were responsible for the fellowship examination, what changes (if any) would you make? (Free text)

ACEM, Australasian College for Emergency Medicine; DEMT, Directors of Emergency Medicine Training.
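The Methods describe cross-verification of 30% of the coded data by a second author but do not report an agreement statistic. Cohen’s kappa is one common way to quantify coder agreement on such a subset; the sketch below is illustrative only, and the coder data in it are hypothetical rather than drawn from the study.

```python
# Illustrative sketch: Cohen's kappa for two coders' theme assignments.
# The study reports 30% cross-verification but no agreement statistic;
# the lists below are hypothetical examples, not study data.
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa for two equal-length lists of categorical codes."""
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    expected = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / n**2
    return (observed - expected) / (1 - expected)

a = ["supervision", "feedback", "feedback", "primary", "4.10",
     "fellowship", "feedback", "supervision", "primary", "4.10"]
b = ["supervision", "feedback", "primary", "primary", "4.10",
     "fellowship", "feedback", "supervision", "primary", "feedback"]
print(f"kappa = {cohens_kappa(a, b):.2f}")  # ~0.74 for this hypothetical example
```

By convention, kappa values above roughly 0.6 are usually read as substantial agreement, although such thresholds are guides rather than rules.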

Results

A total of 622 trainees responded to the survey (response rate 37%) and provided 2338 free-text responses. In total, 410 related to supervision and feedback, 189 to the fellowship examination, 408 to the primary examination and 398 to regulation 4.10.

Demographic details, including the proportion of respondents who provided free text comments on training and supervision, are provided in Table 3. The majority of respondents were men (56%), trained in metropolitan hospitals (61%) and had not yet fulfilled regulation 4.10 (82%) or passed the fellowship examination (93%).

Quantitative data related to the first three questions are presented in Figure 2: 45% of respondents rated clinical supervision as good, 39% rated clinical feedback as good, and 63.9% found DEMT feedback useful. However, more than 50% of trainees found clinical feedback to be either poor (17%) or fair (37%).

Qualitative analysis examined the following issues: supervision, feedback, DEMT–trainee interactions, the primary examination, regulation 4.10/trainee research and the fellowship examination. A selection of representative quotes from survey respondents is provided in Tables 4 and 5.

Table 3. Demographic details of survey respondents. For each row, figures are: number with free-text comments (%), number with no free-text comments (%), total respondents (%).

Sex
• Male: 216 (35), 131 (21), 347 (56)
• Female: 193 (31), 80 (13), 273 (44)

Training location
• New South Wales: 95 (15), 50 (8), 145 (23)
• Victoria: 90 (14), 45 (7), 135 (22)
• Queensland: 83 (13), 41 (7), 124 (20)
• Western Australia: 41 (7), 25 (4), 66 (11)
• New Zealand: 41 (7), 22 (4), 63 (10)
• South Australia: 24 (4), 15 (2), 39 (6)
• Northern Territory: 13 (2), 8 (1), 21 (3)
• Australian Capital Territory: 11 (2), 3 (0), 14 (2)
• Tasmania: 9 (1), 0 (0), 9 (1)
• Overseas/not stated: 2 (0), 4 (1), 6 (1)

Hospital of current employment
• Metropolitan – major referral/tertiary: 250 (40), 128 (21), 378 (61)
• Metropolitan – urban district: 92 (15), 47 (7), 139 (22)
• Rural/regional: 64 (11), 33 (5), 97 (16)
• Private hospital: 3 (0), 3 (0), 6 (1)

Status with respect to major ACEM assessments
Primary examination
• Passed primary: 178 (29), 222 (36), 400 (64)
• Primary not passed: 91 (15), 131 (21), 222 (36)
Regulation 4.10 (research paper/presentation)
• Regulation 4.10 fulfilled: 62 (10), 53 (8), 115 (18)
• Regulation 4.10 not fulfilled: 159 (26), 348 (56), 507 (82)
Fellowship examination
• Fellowship examination passed: 22 (4), 17 (3), 39 (7)
• Fellowship examination not passed: 145 (23), 438 (70), 583 (93)

Total: 409 (66), 213 (34), 622 (100)

ACEM, Australasian College for Emergency Medicine.


Supervision (214 comments)

Most comments suggested that current levels of direct bedside supervision are perceived to be inadequate (88 comments). Trainee perceptions on general clinical supervision were mixed, but were more positively oriented than comments on bedside supervision. Supervision was perceived as insufficient by 44 trainees, variable by another 16 and sufficient by 29 trainees.

Eleven comments highlighted trainees’ concerns regarding the need to supervise junior staff. There were varying views of supervision in non-ED rotations: four respondents reported better levels of supervision than while working in emergency, and one suggested that supervision was worse.

Feedback (101 comments)

As evident in Figure 3, trainees found day-to-day feedback to be unsatisfactory, primarily for being infrequent, focusing on the negative aspects of trainees’ performance or lacking sufficient specificity to guide trainees’ development.

DEMT/trainee interactions (95 comments)

The DEMT feedback is considered satisfactory; however, many trainees feel that meetings occur infrequently, and the feedback given is too general to guide their ongoing development. There were four reports of interpersonal difficulties between DEMTs and trainees.

Other comments suggested that selection and training of DEMTs should be more rigorous, that some DEMTs had insufficient direct knowledge of individual trainees, and that trainees require the opportunity to provide feedback on DEMT performance.

Primary examination (408 comments)

More than 90% of the 171 trainee comments on the relevance of the primary examination expressed concern that the syllabus is not relevant to emergency medicine training.

Figure 2. Quantitative analysis of responses in relation to supervision, feedback, and Directors of Emergency Medicine Training (DEMT) meetings: number of responses rating clinical supervision and clinical feedback as poor, fair, good or excellent, and DEMT feedback as not at all useful, useful or very useful.


Positive comments agreed with the necessity of an examination in the basic sciences and did not suggest any changes. Negative comments focused on the perceived irrelevance of the examination and concerns with the extensive syllabus. Pathology was the subject most commonly mentioned (19 comments), with four respondents requesting that the subject be removed. Most comments on the other subjects requested more clinical relevance.

The primary examination was thought to negatively impact on emergency medicine training in two ways – either as a barrier to entering advanced training (ten comments), or by preventing trainees from acquiring practical clinical knowledge (four comments).

Trainees were also concerned that any number of subjects could be sat at a single examination. Two trainees requested the option of online examinations or examinations being scheduled at more times during the year, whereas another eight (all of whom had completed the primary examination) suggested that there should be a minimum number of subjects sat.

Of the 25 responses focused on teaching and preparation for the primary examination, the major issues were requests for online resources from ACEM, improved teaching and the provision of an accessible course for primary examination candidates. Other respondents noted variable local support for the examination, in terms of teaching or study leave.

In total, 11 of the 20 comments on alternatives to the current primary examination suggested the addition of a more clinical subject. Two of these suggested that the short and long cases be relocated from the fellowship examination.

Regulation 4.10/trainee research (398 comments)

Trainees appeared frustrated with the 4.10 requirement, with 87% of responses being negatively oriented. They noted that there is variable enthusiasm and interest in research, and many felt ‘forced’ to perform research in order to fulfil their training obligations. The research projects themselves were thought to be of poor quality, making little contribution to the medical literature.

In total, 39 comments suggested the need for increased resources and support from local departments and ACEM for trainees undertaking research. Specific suggestions included protected time, closer supervision and advice, funding, and a departmental interest in research. These trainees felt that the 4.10 project distracted them from learning about the clinical aspects of emergency medicine, and interfered with studying for the fellowship examination. Additionally, the project affected choice of rotations, in that it tied a trainee to a particular department until the research is completed.

Table 4. A representative selection of trainee comments on supervision, feedback and DEMT–trainee interactions

Supervision
‘I would like to have more direct supervision and formal teaching even for simple things. No one has observed me suturing since early internship - what if I’m really bad at it and I don’t even know?? . . . But the consultants have to be motivated to teach and supervise, and if the department is too busy that just doesn’t happen.’
Provisional trainee, Victoria.

‘Supervision of trainees is somewhat variable and at times scant, when compared to nursing eg presence of clinical educators in department with a purely educational role: frequently consultant tied up on phone/admin issues, unable to share breadth of clinical experiences which is a waste of an invaluable resource.’
Third year advanced trainee, Queensland.

Feedback
‘As a general rule, feedback is reserved for when improvement needs to occur, it is uncommon for a job well done to be commented on (true for all of med not just ED).’
Fourth year advanced trainee, Western Australia.

‘What feedback? Consistent over several hospitals.’
First year advanced trainee, Tasmania.

‘I think we do need more feedback – especially as we head closer to becoming ED consultants. It is just expected that we feel prepared . . . but even after all the experience we have gained, it would be nice to have our skills or weaknesses acknowledged.’
Fourth year advanced trainee, New South Wales.

DEMT–trainee interactions
‘They don’t exist in my current hospital! DEMT has multiple responsibilities and is unable to spend time with trainees!’
Second year advanced trainee, New South Wales.

‘Never given specific constructive criticism . . . I would be surprised if there were no areas that need improvement! It feels like we’re going through the motions.’
>4 years advanced trainee, New South Wales.

DEMT, Directors of Emergency Medicine Training.

Although 76 respondents suggested removal of the project from the curriculum, trainees also suggested many alternatives (140 comments). These included making the project optional, alternative methods of ensuring adequate training in research methodology and critical appraisal, or various non-research alternatives (administration, education, ultrasound, rural work, toxicology).

Fellowship examination (189 comments)

The most common administrative concerns were the requirement for trainees who had passed the written examination but failed the clinical component to resit the whole examination (29 comments), and conditions for eligibility to sit the fellowship examination. Eleven respondents wanted the opportunity to sit the examination earlier in their advanced training, describing disappointment in finding large gaps in their knowledge only at the end of advanced training.

Most trainees view the examination as fair, although some perceive that learning ‘how to play the game’ is as important as learning the syllabus. Training and teaching for the examination varies across different EDs. Respondents requested centralized lectures and tutorials, or an emergency-specific exam-directed textbook.

Regarding specific exam components, trainees were concerned with time pressures in the written examination – particularly the visual aid question (VAQ) and short answer question (SAQ) components. The clinical exam had positive comments only for the structured clinical examination (SCE), and only negative comments for the short and long cases.

Discussion

Supervision and feedback

The majority of trainees feel that their supervision is adequate. However, free-text comments suggest that improvements could be made, particularly in direct supervision at the bedside.

Table 5. A representative selection of trainee comments on the primary examination, regulation 4.10, and the fellowship examination

Primary examination
‘Make the primary exams more clinically relevant as they seem very IRRELEVANT when you are sitting them and would rather be learning about information and procedures you need in your daily practice.’
Provisional trainee, Victoria.

Regulation 4.10/trainee research
‘Encouraging research for the sake of it just promotes BAD research. . . . Do we really have time for doing a pointless research project, the majority of which have NO IMPACT AT ALL on people’s day to day practice? People actively pursue simple, quick projects for the sake of getting a box ticked, this is hardly “good research” and forcing people who have no interest in research at all to do it merely creates anger, frustration and resentment. You can’t force people to be interested in research . . .’
Fourth year advanced trainee, Victoria.

Fellowship examination
‘I would like to be able to sit the fellowship exam sooner rather than having to wait to complete 3 years of full time equivalent training. As I am part time this makes it an awfully long haul and I want the motivation to gain that knowledge now.’
First year advanced trainee, South Australia.

‘The amount of effort I and my study group expended on learning one-off, exam-only technique, that we would never use in day to day work, for long & short cases was ridiculous, and served NO PURPOSE WHATSOEVER in preparing me for work as a Consultant. It was simply a hoop-jumping exercise, purely for the exam, that has very little application to real-life ED work . . . Do you really think that me learning how to do a demonstration rheumatological examination of the hands (which we perfected in our exam practice) will help me be a better consultant? . . .’
Fourth year advanced trainee, Victoria.


Feedback has been defined as ‘ongoing appraisal of performance based on direct observation aimed at changing or sustaining a behaviour’.11 It guides learning, and is valued highly by learners.12

The present study demonstrates that trainees perceive feedback (both on the floor and from DEMTs) to be infrequent, focused on negative aspects of performance, and rarely based upon observed trainee behaviour. The lack of specific information leads to generalized feedback which appears to have little relevance to ongoing trainee development.

The Royal Australasian College of Physicians (RACP) has introduced the regular mini-clinical evaluation exercise (mini-CEX) and multisource feedback as mandatory parts of basic physician training in Australasia.13 This ensures direct observation of practice and structured feedback to trainees on a regular basis throughout training. A checklist-based assessment has previously been developed for the purpose of direct observation of emergency medicine residents.14

Introduction of a similar system to emergency medicine training might provide trainees with more structured feedback based on their actual performance – a change which appears to be desired by many trainees.

Primary examination

The most prominent concern trainees had with the primary examination was its emphasis on pure basic sciences, leading to the perception that the syllabus is irrelevant to their day-to-day practice at a time when they are trying to ‘learn the ropes’ of emergency medicine.

Many trainees are keen for more clinically based knowledge to be tested. Potential areas of assessment could include (but are certainly not limited to) interpretation of ECGs, plain radiography, blood gases and pathology results, or demonstration of practical skills such as suturing technique, advanced cardiac life support, or physical examination.

Testing of such skills before entering advanced training would equip trainees with a basic minimum standard in these areas. However, this would require a significant departure from the current primary examination process, which, in the authors’ view, is unlikely to occur in the near future.

Figure 3. Trainee comments regarding satisfaction with various aspects of feedback in day-to-day clinical work: number of satisfied versus unsatisfied responses for general comments on clinical feedback, comments on the frequency of clinical feedback, and comments on the content of clinical feedback.

Fellowship examination

In the period 1996–2003, the pass rate of 61% for the fellowship examination led some authors to question the adequacy of emergency medicine training in Australasia.15 Subsequently, efforts have been made to improve feedback to trainees by providing comprehensive examination reports, specific definitions for the written examination, access to past papers, and a comprehensive syllabus.5,16 Despite these efforts, the pass rate has not changed appreciably in recent years – in the period 2007–2009, it remained at 61%.17

Interestingly, for the clinical components of the examination, only positive comments were obtained for the SCE, whereas only negative comments were given for the short and long cases. Currently, the short and long cases are the only parts of the examination where candidates are assessed on their clinical skills with real patients. These have been criticized in the educational literature because of low reliability, particularly in high-stakes examinations, where a reliability coefficient (an estimate of the reproducibility of assessment data in the presence of random measurement error) is recommended to be at least 0.8.18,19 A single long case has an estimated reliability coefficient ranging from 0.24 (ref. 20) to 0.34 (ref. 21), whereas four long cases are thought to achieve a coefficient of approximately 0.7.22
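The step from a single-case coefficient to a projected multi-case coefficient is conventionally made with the Spearman–Brown prophecy formula. The working below is an illustrative addition (the formula is not quoted in the paper itself) using the values cited above:

$$\rho_n = \frac{n\,\rho_1}{1 + (n-1)\,\rho_1}, \qquad \rho_4 = \frac{4 \times 0.34}{1 + 3 \times 0.34} \approx 0.67,$$

which is consistent with the quoted figure of approximately 0.7 for four long cases. Rearranging to find the number of cases needed to reach the recommended 0.8,

$$n = \frac{\rho_n (1-\rho_1)}{\rho_1 (1-\rho_n)} = \frac{0.8 \times 0.66}{0.34 \times 0.2} \approx 7.8,$$

that is, on the order of eight long cases per candidate, which helps explain why the cited literature questions the long case in high-stakes settings.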

The use of SCEs is increasingly prominent among postgraduate examinations, and is well accepted by ACEM trainees. Compared with the 1 h, six-station SCE in the ACEM fellowship examination, the emergency medicine fellowship examination in the UK has a 14-station Objective Structured Clinical Examination (OSCE) that runs for over 2 h, and the American Board of Emergency Medicine uses seven simulated-patient encounters which run over approximately 5 h.23

Some trainees wish to sit the fellowship examination earlier during their training. It is widely held that assessment (such as examinations) drives learning.24 Enabling trainees to sit the fellowship examination earlier during training might lead to earlier development of the knowledge and skills required by senior emergency medicine clinicians. However, sitting without adequate preparation might result in an even lower pass rate.

Trainees were critical of the requirement to resit the written components of the examination if they had previously been passed. There is precedent for allowing passed components to stand in other specialty training programmes, such as the RACP, where the clinical component of the examination must be passed within 5 years of passing the written component,13 and in the UK equivalent of the emergency medicine fellowship examination, where the three sections of the examination are regarded as modules which must be passed independently.25 Additionally, the four subjects of the ACEM primary examination can be completed independently – once a subject is passed, the candidate is not required to be examined on it in subsequent examinations.2

Direct observation of trainees in the workplace was suggested by some respondents as an alternative to parts of the current fellowship examination. This has high face validity;26 however, it is resource-intensive and unlikely to be introduced in the near future.27

Trainees working in hospitals with well-organized exam-directed training and access to ACEM examiners appear to be at an advantage. Many trainees work in smaller centres with less structured teaching. This inequality is perceived to apply to preparation for both the primary and fellowship examinations, and could be addressed by developing online learning materials accessible to all trainees through the ACEM website.

Trainee research requirement

Most trainees feel that regulation 4.10 is a ‘hurdle’ that must be overcome, and they reported feeling ‘forced’ to do research that generates minimal useful information. Additionally, the respondents felt that they were being distracted from other more clinical aspects of training. Another major concern from some trainees is the impediment that the 4.10 project places in the way of progression to the fellowship examination.

It is unknown what proportion of ACEM trainees intend to pursue research in the future. Surveys of USA emergency medicine programmes suggest that only 5–7% of trainees are planning a career that emphasizes research.28 If the majority of trainees will not become researchers, should we insist that they perform a research project during their training?

Concerns with the quality of research output might also be valid. A review of 57 research articles submitted to Emergency Medicine between 1998 and 2001 found that over one-third failed to further knowledge, described research that was not original, or had findings that were not clinically or socially significant.29

Support (from local departments and ACEM) for trainees undertaking the project is variable. Some trainees feel disadvantaged when local assistance is not readily available. Interest in research and statistical expertise varies between EDs. Expecting trainees to produce quality research in addition to full-time work might not be realistic, which has led some programmes in the USA to introduce protected research time for trainees.30

Alternatives to regulation 4.10 were suggested by survey respondents, including research courses or examination of critical appraisal skills. These alternatives might be more accurately blueprinted to the stated learning objectives, as well as being more acceptable to ACEM trainees.

Recent changes to ACEM training

In the last two years, a number of initiatives have been established that might address some of the concerns of ACEM trainees. These include a review of the primary examination curriculum,31 revisions to regulation 4.10 to allow trainees to fulfil the requirement by completing two postgraduate university subjects in epidemiology, biostatistics, research methods or evidence-based medicine,32 and the formation of the Training and Assessment Review Working Group.33 It remains to be seen whether these modifications result in improved educational outcomes, or greater trainee satisfaction.

Limitations

The major limitation of our study is the relatively low response rate (37%), which increases vulnerability to non-responder bias: trainees who chose not to complete the survey might have differed systematically from those who did respond. Consequently, the survey results might not be representative of the views of ACEM trainees overall on assessment and feedback in emergency training.

Internet-based surveys have advantages over postal or telephone surveys in that they reduce costs, allow for rapid responses, and enable automatic transfer of data into a database, which reduces the risk of data entry errors. However, this might be at the expense of lower overall response rates.34

Because of ethical and confidentiality considerations, our survey was emailed by a third party (ACEM). All 1666 trainees were emailed the request to participate in the survey; however, it is unlikely that every trainee received the invitation. Given this, the true response rate is likely to be higher than the stated 37%.
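To illustrate the point about undelivered invitations, the sketch below recomputes the response rate under a few hypothetical delivery-failure rates. The 622 responses and 1666 invitations are from the paper; the failure rates are assumptions for illustration only.

```python
# Illustrative only: how undelivered invitations change the effective response rate.
# The counts of responses and invitations are from the paper; the delivery-failure
# fractions are hypothetical assumptions, not reported data.
responses = 622
invited = 1666
for undelivered_fraction in (0.0, 0.05, 0.10):
    delivered = invited * (1 - undelivered_fraction)
    print(f"{undelivered_fraction:.0%} undelivered: response rate {responses / delivered:.1%}")
# 0% undelivered: response rate 37.3%
# 5% undelivered: response rate 39.3%
# 10% undelivered: response rate 41.5%
```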

Two reminder emails were sent via ACEM, with an interval of one month between emails. Because of ACEM policies, no further reminders were allowed to be sent. It is unknown whether additional reminders would have added greatly to the response rate, although it has been reported that up to five reminders can lead to a response rate of over 50%.35,36

As the survey was distributed by a third party (ACEM), we are unable to provide specific demographic details on non-responders. However, our demographic details are similar to those reported for the population of ACEM fellows and trainees.37

Response bias is possible, with those trainees with strong negative views being more likely to provide written responses to the survey questions.

Conclusion

This is the first survey of Australasian emergency medicine trainees seeking their views on all aspects of training. There is a perceived lack of bedside supervision and a generally negative view of mandatory trainee research. Respondents noted concerns with specific examination components, and inconsistency in the administration of their examinations, with the primary examination offered in individual subjects whereas passing the fellowship examination is ‘all or nothing’.

Acknowledgement

This research was made possible with the assistance of an Emerging Researcher Fellowship awarded by Southern Health.

Author contributions

The paper was written primarily by SC, CN and GB. Survey design was by SC, GB, DEW and GW. Data analysis was by SC and CN. All authors were involved in the critical revision of the paper for intellectual content and its final approval before submission. SC 50%, CN 15%, GB 15%, GW 10%, DEW 10%.

Conflict of Interest

Professor George Braitberg and Dr Diana Egerton-Warburton are Section Editors for Emergency Medicine Australasia. Simon Craig is a member of the ACEM Training and Assessment Review Working Group (TARWG). George Braitberg is a member of the ACEM Educational Advisory Group (EAG). The research conducted and views expressed in this paper are those of the authors, and do not reflect the views of the above-mentioned groups.

Accepted 8 September 2010

S Craig et al.

546 © 2010 The AuthorsEMA © 2010 Australasian College for Emergency Medicine and Australasian Society for Emergency Medicine

References

1. ACEM. Australasian College of Emergency Medicine Training and Examination Handbook. West Melbourne: Australasian College for Emergency Medicine, 2009.
2. ACEM. Primary examination. ACEM Training and Examination Handbook. West Melbourne: Australasian College of Emergency Medicine, 2009; 75–80.
3. Rogers IR, Leach D, Brookes JG. Why don’t trainees pass the emergency medicine fellowship examination? Emerg. Med. Australas. 2004; 16: 336–42.
4. ACEM. Publication/presentation of a paper. ACEM Training and Examination Handbook. West Melbourne: Australasian College of Emergency Medicine, 2009; 65–74.
5. ACEM. Fellowship examination. ACEM Training and Examination Handbook. West Melbourne: Australasian College of Emergency Medicine, 2009; 86–94.
6. ACEM. Regulation 4.15 – supervision and accreditation of training. ACEM Training and Examination Handbook. West Melbourne: Australasian College of Emergency Medicine, 2009; 21–3.
7. Burdick WP, Schoffstall J. Observation of emergency medicine residents at the bedside: how often does it happen? Acad. Emerg. Med. 1995; 2: 909–13.
8. Liamputtong P. Grounded theory research. Qualitative Research Methods, 3rd edn. South Melbourne: Oxford University Press, 2009; 206–24.
9. Charmaz K. The grounded theory method: an explication and interpretation. In: Emerson RM, ed. Contemporary Field Research: A Collection of Readings. Boston: Little, Brown and Company, 1983; 109–28.
10. Guba EG. Criteria for assessing the trustworthiness of naturalistic inquiries. Educ. Comm. Tech. J. 1981; 29: 75–91.
11. Richardson BK. Feedback. Acad. Emerg. Med. 2004; 11 (12): e1–5.
12. Gordon J. ABC of learning and teaching in medicine: one-to-one teaching and feedback. Br. Med. J. 2003; 326 (7388): 543–5.
13. RACP. Requirements for Physician Training Australia: 2008 Handbook. Sydney: Royal Australasian College of Physicians, 2007; 15.
14. Shayne P, Gallahue F, Rinnert S et al. Reliability of a core competency checklist assessment in the emergency department: the Standardized Direct Observation Assessment Tool. Acad. Emerg. Med. 2006; 13: 727–32.
15. Cameron P, O’Reilly G. Is the ACEM training programme adequate? Emerg. Med. Australas. 2004; 16: 271–3.
16. ACEM. Learning & examination processes. ACEM Training and Examination Handbook. West Melbourne: Australasian College of Emergency Medicine, 2009; 108–32.
17. ACEM. Reports from the 33rd to 44th ACEM Fellowship Examinations. West Melbourne: Australasian College for Emergency Medicine, 2004–2009.
18. Downing S. Reliability: on the reproducibility of assessment data. Med. Educ. 2004; 38: 1006–12.
19. Wass V, McGibbon D, Van der Vleuten C. Composite undergraduate clinical examinations: how should the components be combined to maximize reliability? Med. Educ. 2001; 35: 326–30.
20. Norcini JJ. The death of the long case? Br. Med. J. 2002; 324: 408–9.
21. Wass V, Jones R, Van der Vleuten CPM. Standardized or real patients to test clinical competence? The long case revisited. Med. Educ. 2001; 35: 321–5.
22. Wilkinson T, Campbell P, Judd S. Reliability of the long case. Med. Educ. 2008; 42: 887–93.
23. ABEM. American Board of Emergency Medicine. Certification, Policies and Procedures 2009. East Lansing: American Board of Emergency Medicine, 2009.
24. Schuwirth LWT, van der Vleuten CPM. ABC of learning and teaching in medicine: written assessment. Br. Med. J. 2003; 326: 643–5.
25. CEM. Fellowship Examination of the College of Emergency Medicine. Regulations and Guidance Notes (Effective from Spring 2009). London: College of Emergency Medicine (UK), 2009.
26. Celenza A, Rogers IR. Qualitative evaluation of a formal bedside clinical teaching programme in an emergency department. Emerg. Med. J. 2006; 23: 769–73.
27. Hazell W, Rogers IR. Update on the fellowship exam and its relation to modern educational principles and clinical competency. Emerg. Med. Australas. 2005; 17: 263–5.
28. Neacy K, Stern SA, Kim HM, Dronen SC. Resident perception of academic skills training and impact on academic career choice. Acad. Emerg. Med. 2000; 7 (12): 1408–15.
29. Taylor DM, Brown AFT. Analysis of the study design and manuscript deficiencies in research articles submitted to Emergency Medicine. Emerg. Med. (Fremantle) 2001; 13: 444–50.
30. Sloan EP, Bunney B, Silva J, Osborne ZP. A structured process for fulfilling the resident research requirement. Acad. Emerg. Med. 2001; 8 (5): A582.
31. ACEM. Primary Examination Curriculum – Documents for Review. West Melbourne: Australasian College of Emergency Medicine, 2009.
32. ACEM. Trainee research requirement (Regulation 4.10). ACEM Training and Examination Handbook. West Melbourne: Australasian College of Emergency Medicine, 2010; 48.
33. ACEM. Australasian College of Emergency Medicine Website (Homepage). [Cited 15 April 2010.] Available from URL: http://www.acem.org.au
34. Alves DW, Szucs PA. The demographics of e-mail for emergency medicine research. Am. J. Emerg. Med. 2001; 19 (3): 192–5.
35. Braithwaite D, Emery J, De Lusignan S, Sutton S. Using the Internet to conduct surveys of health professionals: a valid alternative? Fam. Pract. 2003; 20: 545–51.
36. Leece P, Bhandari M, Sprague S et al. Internet versus mailed questionnaires: a controlled comparison (2). J. Med. Internet Res. 2004; 24: e30.
37. ACEM. ACEM Annual Report 2009. West Melbourne: Australasian College of Emergency Medicine, 2009; 25–30, 32–38.
