

Direct Observation of Clinical Practice in Emergency Medicine Education

Simon Craig, FACEM

Abstract

This review aims to summarize the current literature on the effects of direct clinical observation of residents in emergency departments (EDs) on learners, patients, and departmental functioning. A systematic literature search was conducted in Medline and ERIC, covering the years 1980–2009. Keywords were used to identify postgraduate medical staff working in the ED; direct observation of these trainees by supervising staff; and reports of outcomes relating to Kirkpatrick's levels of reaction, learning, behavior, and institutional change. From an initial 11,433 abstracts and titles, 193 full-text articles were retrieved for further study. Application of inclusion and exclusion criteria yielded seven that were relevant to the topic. These studies comprised a range of methods: descriptive, qualitative evaluation, cohort studies, and a cross-sectional survey. Learner reaction was very enthusiastic. Positive changes in behavior due to feedback provided during direct observation were suggested by two studies. A single study evaluated trainees' perceptions of patient outcomes and noted that thorough assessments and improved management decisions may come at the expense of slower patient throughput and diversion of senior staff from direct patient care. Three studies noted the resource-intensive nature of direct observation. Direct observation of clinical practice may be useful in ED education; however, further research is required to evaluate its effects.

ACADEMIC EMERGENCY MEDICINE 2011; 18:60–67 © 2011 by the Society for Academic Emergency Medicine

"If musicians learned to play their instruments as physicians learn to interview patients, the procedure would consist of presenting in lectures or maybe in a demonstration or two the theory and mechanisms of the music-producing ability of the instrument and telling him to produce a melody. The instructor, of course, would not be present to observe or listen to the student's efforts, but would be satisfied with the student's subsequent verbal report of what came out of the instrument."

–George Engel, after visiting 70 medical schools in North America1

Emergency departments (EDs) are important places of learning for junior doctors. A recent Australian study examining intern activities suggests that in an 8-week period, on average, an intern would see more than 200 patients, perform approximately 300 procedures, and consult senior ED staff on almost 700 occasions.2 When senior ED staff are consulted, a junior doctor's management plan is often altered.3

Traditionally, planned education for emergency medicine (EM) residents has consisted of tutorials and lectures. Even with protected teaching time (where residents are excused from clinical duties), attendance at teaching is variable due to shift work, holidays, and study leave. This leads to difficulty in ensuring uniform coverage of core topics and to concerns that learners are becoming disengaged.4

During a clinical shift, teaching "on the floor" usually takes the form of ad hoc case presentations,5 with a resident seeing a patient independently, followed by discussion of the case with a senior clinician, either one to one6 or in a forum such as a "board round,"7 where cases are presented to a group of colleagues. However, in the author's experience, it is difficult to obtain information on particular aspects of a resident's interaction with a patient. Thoroughness of history and physical examination, empathy, communication skills, procedural skills, and decision-making at the bedside are all unseen, yet critically important for patient care.

The importance of these skills has recently been acknowledged by the Australasian College for Emergency Medicine (ACEM), resulting in alterations to in-training assessment forms.8 These are completed at the end of each clinical rotation (3- to 6-month period) and are used to assess a range of competencies. They include knowledge and basic skills, clinical judgment, practical skills, professional relationships and communication, ability to perform under stress and different workloads, sense of responsibility and work ethic, and various other aspects of performance.9 It would appear that assessment of specific factors, such as "interaction with patients and families," "compassion/empathy," and "gentleness of technique," would be impossible without direct observation of practice.

ISSN 1069-6563 © 2010 by the Society for Academic Emergency Medicine. doi: 10.1111/j.1553-2712.2010.00964.x

From Monash Medical Centre, Clayton, Victoria, Australia. Received May 30, 2010; revision received June 18, 2010; accepted June 22, 2010. Conflict of interest: None. Supervising Editor: Richard Lammers, MD. Address for correspondence and reprints: Simon Craig, FACEM; e-mail: [email protected].

Previous work has demonstrated that senior medical staff will often alter patient management plans determined by residents. Sacchetti et al.3 found that one in every 25 patients had a "major" change from the residents' proposed plan of care after review of the patient by an attending emergency physician (EP). These changes included altered disposition, detection of unsuspected pathology, or a marked revision of intended treatment. Another 33% of all patients had minor changes made to their management after review by an EP.3

The difference between management plans determined by junior and senior physicians may have significant implications. Wyatt et al.10 found that actual survival of patients admitted to four Scottish hospitals with traumatic injuries was significantly higher if a patient was managed by an emergency consultant than if managed by a junior doctor. This was despite the patients managed by EPs having more severe injuries and a higher statistical likelihood of death.

As well as an association with improved patient outcomes, increasing seniority of ED clinical staff is associated with improved ED efficiency. This has recently been demonstrated by Harvey et al.,11 who found that a residents' strike was associated with reduced ED waiting times and more rapid decisions for admission and discharge.

Direct observation is emerging as a potentially important educational technique in EM. Deficiencies in performance of clinical skills or procedures can be identified and appropriate feedback given.12 Potential benefits include provision of higher-quality care, improved patient safety, and enhanced confidence of junior medical staff. However, published surveys indicate that many EM residents are rarely observed in their day-to-day work.13

Emergency departments are often chaotic, overcrowded places, with large numbers of patients suffering undifferentiated illnesses, some with critical illness. Supervising EPs are subject to multiple distractions and are busy overseeing junior doctors and nursing staff and managing the "flow" of the department, as well as their "own" patients.

Given these constraints, it is likely that some models of clinical supervision have not been fully applied in the ED setting. Medical staff working in other, less hectic environments, such as the operating room or the intensive care unit, may have different experiences of direct supervision. It may be, however, that some methods of direct supervision of junior staff, initially used in a more controlled setting, have some applicability to postgraduate education in EM.

Primary and Secondary Review Aims

The primary aim of this review was to summarize the current literature on the effects of direct clinical observation of residents in EDs on learners, patients, and departmental functioning. Secondary aims were to 1) describe the characteristics of, and learner reaction to, published models of direct observation that have been applied to postgraduate medical education in EDs; 2) identify other models of direct observation that have been applied to postgraduate medical education in selected areas of acute medical training outside of EM (anesthesia, intensive care); 3) develop recommendations for current practice of direct observation in postgraduate training in EM; and 4) suggest areas for further research.

METHODS

Study Design

This was a literature review utilizing Medline and ERIC.

Search Strategy and Sources of Papers

All English-language articles published in peer-reviewed journals from 1980 to 2009 were included in the search. Table 1 presents an overview of the search terms used. From the articles generated from the search, reference lists were consulted to identify additional papers not discovered during the initial search. Titles and abstracts of all papers identified using the database searches were reviewed to ascertain whether they related to direct observation of clinical practice in the setting of EM, anesthesia, or intensive care. The full text of selected articles was retrieved whenever possible. The inclusion and exclusion criteria were then applied, leading to the final selection of articles.

Selection Parameters

Studies involving postgraduate medical staff working and training in the ED, ranging from interns (in their first postgraduate year) up to senior EM residents, were included. To allow valid comparison with the training of EM residents, other areas of medicine with similar exposure to resuscitation and stabilization were selected. Although all areas of acute hospital medicine have some relevance to EM, this review selected postgraduate junior medical staff working in anesthesia and intensive care; doctors in these areas frequently apply knowledge and skills that closely overlap with those used in the ED.

Studies examining direct observation of medical students, specialist physicians, or postgraduate medical education in areas outside emergency medicine, intensive care, or anesthesia were excluded. Additionally, studies examining postgraduate education of other health care professionals (e.g., nursing, physiotherapy) were excluded.

Interventions reporting the effects of a supervising physician directly observing clinical practice by junior medical staff were selected for this review. Studies of clinical supervision or other educational interventions that did not involve direct observation of practice were excluded.


Kirkpatrick's model of educational outcomes14 was used to classify and analyze outcomes. The model describes four nonhierarchical levels of learning (Figure 1) and has been widely used in other reviews of educational interventions. Studies that reported on the prevalence of direct observation but did not refer to any educational outcomes were excluded from the review.

Included Study Designs

In the part of the review examining direct observation in the ED, all study designs were included, to obtain a comprehensive understanding of the application of direct observation in this setting. For studies exploring direct observation in the operating room and intensive care unit, the review was restricted to those that included outcome data at Kirkpatrick's levels 2, 3, and 4. Although learner reaction (Kirkpatrick's level 1) is important, it does not directly relate to learning. Therefore, it may not be sufficient to justify the application of a relatively scarce educational resource: an EP. The use of specialist physicians to observe and educate residents is unlikely to be revenue-neutral, and managers may require evidence that the additional cost leads to changes in knowledge, resident behavior, or patient results. Additionally, the expectations and reactions of EM learners, residents who have chosen a field with a broad range of presentations and acuity, may differ from those working in other clinical areas. Therefore, studies of postgraduate trainees in anesthesia and intensive care were excluded if they reported data related only to Kirkpatrick's level 1.

Quality Assessment

For the purposes of quality assessment, the studies are separated into the following groups: 1) qualitative and descriptive/observational studies, 2) cross-sectional surveys, and 3) cohort studies.

Qualitative and Descriptive/Observational Studies. Qualitative and descriptive/observational studies were assessed using a previously designed critical review form for qualitative studies developed by Letts et al.15 Review items include study purpose, need for the study, study design, sampling, data collection and analysis, overall rigor, and conclusions/implications.

Cross-sectional Surveys and Cohort Studies. The quality analysis of cross-sectional surveys and cohort studies was based upon the relevant checklists from an international collaborative group, STrengthening the Reporting of OBservational studies in Epidemiology (STROBE).16,17 These checklists address specific aspects of title, abstract, introduction, methods, results, discussion, and funding sources.

RESULTS

Data Extraction

A total of 193 full-text articles were retrieved, and the selection criteria outlined above were applied. No additional original articles were identified by examination of the reference lists of the retrieved full-text articles (Figure 2). An overview of the seven articles selected as meeting all requirements for inclusion is presented in Table 2.12,18–23

Quality Assessment

The quality of the included studies varied considerably. Common study weaknesses of the descriptive and qualitative studies were the lack of an identified theoretical perspective, no mention of whether sampling occurred until redundancy was achieved, and the lack of a clear decision trail. The cross-sectional survey and cohort studies were of higher quality.

Table 1. Search Terms Used

Population/Participants: trainee* OR registrar* OR resident* OR intern OR interns OR HMO* OR hospital medical officer* OR SHO* OR senior hospital officer* OR "physicians" OR "graduate medical education" OR "medical services" OR "medicine" OR casualty OR emergency OR emergency* OR A&E OR accident and emergency OR anaesthetics OR anesthesia OR anaes* OR anes* OR theat* OR intensive care OR ICU OR intensive care unit OR ITU

Intervention: Mini-CEX OR DOPS OR SDOT OR observ* OR feedback OR superv* OR teach* OR direct OR bedside OR bed-side OR immediate OR clinical OR "clinical experience" OR "practicum supervision" OR "clinical teaching health professions" OR "work experience programs" OR "experiential learning" OR "graduate medical education" OR "internship programs" OR "medical education" OR "competence" OR "evaluation" OR "job performance" OR "personnel evaluation" OR "professional personnel"

Outcomes: react* OR knowledge OR skill* OR attitud* OR learn* OR teach* OR behav* OR result* OR organisat* OR organizat* OR chang*

Figure 1. Kirkpatrick's model of educational outcomes.

Level 1: Reaction (student's reaction to the educational experience)
Level 2: Learning (changes in knowledge, attitudes, or skills)
Level 3: Behavior (changes in practice and application of learning to practice)
Level 4: Results (change at the level of the learner and the organization)


Effects of Direct Observation on EM Residents, Patients, and EDs

An overview of the results for the four papers relating to the primary aim is presented in Table 3.12,18,19,21 The available evidence suggests that direct observation of clinical practice is beneficial in terms of trainees' learning: positive changes in knowledge, skills, or attitudes were cited in all studies. However, these benefits were not demonstrated by examination or test results, but were based on trainee self-report.

Two studies report changes in behavior, as evidenced by repeated observation of practice demonstrating altered performance. Neither study evaluated whether these changes in behavior were sustained.

Only one study attempted to assess the effect on patient care. The perceived effect of the program on patient care was determined from the viewpoint of the trainees and supervisors involved in the program. It was thought that a more thorough, careful assessment and plan was put in place, but at the expense of decreased efficiency (leading to prolonged waiting times for other patients). This was not supported by quantitative data. Three of the five studies highlighted the resource-intensive nature of direct observation of clinical practice: the use of senior staff to supervise and teach rather than directly deliver patient care.

Published Models of Direct Observation in the ED

Table 4 summarizes the four papers that describe characteristics of various models of direct observation in the ED.12,18,19,23 The published models mostly rely on senior EPs to supervise postgraduate EM trainees. One study, examining death notification skills of trainees, used ED volunteers who are specifically involved in the care of bereaved families.

Three of the five studies reported the use of (and provided examples of) checklists, which were filled in by the observer during the clinical encounter. One example of such a checklist can be found in Shayne et al.23 Although some learners were initially apprehensive, all studies that measured learner reaction reported an enthusiastic response to direct observation.

Direct Observation in Anesthesia and Intensive Care

A single paper was identified examining the effect of the presence of a consultant anesthesiologist on patient outcomes in the setting of emergency intubation.24 This was a prospective cohort study examining 322 consecutive patients who required emergency tracheal intubation by anesthesia trainees. Instances where a consultant anesthetist was present were compared to instances where there was no direct senior supervision. Respiratory therapists who were assisting with the intubation recorded data on patient demographics, clinical management, and immediate complications. Additional data on patient outcomes were extracted from the patients' medical records.

There were no differences between the groups at baseline. Supervision of anesthesia trainees by a consultant anesthetist was associated with a significant decrease in the rate of immediate complications (6.1% vs. 21.7%, p < 0.0001), particularly aspiration (0.9% vs. 5.8%, p = 0.037). There were no differences in patient outcomes at 28 days. The paper does not report any information on learning experiences, changes in trainee behavior, or changes in organizational practice.

DISCUSSION

When compared to interns and residents, attending EPs are more efficient at patient care,11 and their presence has been associated with improved patient outcomes.10 A similar association of senior staff presence with improved patient outcomes has been noted for emergency airway management by anesthesia trainees.24 Direct supervision of residents has high face validity: trainees are observed performing tasks in a clinical environment.

This review has identified a small number of studies that describe the effects of direct observation as an educational intervention in the ED. The papers in this review comprise descriptive and observational studies, qualitative studies, one survey, and two cohort studies. No study fulfilled all the criteria in the relevant checklist for methodologic rigor. All reported learning outcomes are "positive," based on the self-report of participants. It is a weakness of the current literature in this area that more definite information on the effects of learning through direct observation of clinical practice is lacking.

Figure 2. Flow diagram of search yield.


Table 2. Overview of Articles Describing Effects of Direct Observation of Trainees

Benenson and Pollack, 200318
Summary: Prospective observational study of emergency trainee performance in actual death notifications.
Observers: Trained evaluators (ED-employed patient representatives involved in caring for bereaved families) used a checklist to assess trainees.
Feedback provided to trainee: Unsatisfactory performance prompted constructive feedback from supervising doctors.
Learner reaction: Not reported.

Celenza and Rogers, 200619
Summary: Qualitative evaluation of the introduction of "clinical teaching shifts" to a West Australian ED.
Observers: Consultant and registrar formally allocated to teaching and learning roles for a 5-hour period; opportunistic teaching at the bedside, determined by case mix. No specific data collected during the period of direct observation.
Feedback provided to trainee: Opportunistic teaching and feedback provided at the bedside, determined by case mix.
Learner reaction: Prior to commencement, learners were apprehensive; however, a positive response was reported at the end of the program. Sessions were perceived as too long (2–3 hours suggested).

Cydulka et al., 199612
Summary: Descriptive study of the introduction of formal direct observation and evaluation of trainees for up to 4 hours during a clinical shift, four times each academic year.
Observers: EP (either residency coordinator or faculty advisor) used a checklist to assess trainee performance.
Feedback provided to trainee: After the observed patient encounter, the trainee and supervisor would discuss the clinical findings and develop a treatment plan; immediate feedback provided to the trainee.
Learner reaction: Example: "I was really dreading this day … when you were coming to follow me around, but as it turned out—nervous as I was—I really enjoyed the time, attention, and teaching."

Dorfsman and Wolfson20
Summary: Program of structured direct observation of EM residents during clinical shifts in the ED; 4- to 5-hour shifts following and directly observing an ED resident.
Observers: Three different EPs acted as observers over an 18-month period; checklist filled in by observers during the clinical encounter.
Feedback provided to trainee: Strengths and weaknesses noted, with protected time for feedback at the end of the session.
Learner reaction: Perceived as extremely helpful and nonthreatening; requested the opportunity for further teaching shifts; suggested adding shifts where residents shadow an EP, seeing how they are able to efficiently manage their time.

Jouriles et al., 200221
Summary: Secondary data analysis of direct observation data forms completed over an 8-year period.
Observers: EPs used a checklist-based scoring system to assess interpersonal skills of ED residents.
Feedback provided to trainee: Not reported.
Learner reaction: Positive ("considered to be a valuable experience").

Lloyd et al., 200022
Summary: Descriptive study of the use of direct observation of SHO-patient consultations, followed by feedback from a more senior doctor.
Observers: Senior ED trainees who had been specifically trained in giving feedback performed direct observation of consultations by ED senior house officers; a purpose-designed feedback chart was used to record observations.
Feedback provided to trainee: Verbal feedback and a copy of the feedback chart provided to the SHO after the consultation.
Learner reaction: SHOs felt that they had benefited from the teaching experience and welcomed the feedback they received.

Shayne et al., 200223
Summary: Descriptive study of the introduction of protected clinical teaching time and a bedside clinical evaluation instrument in an EM training program.
Observers: Introduction of the clinical evaluation exercise (CEE), providing a "snapshot" of a trainee's performance in a clinical encounter; data gathered through direct observation and recorded using a specific evaluation form.
Feedback provided to trainee: Formal written and oral feedback from supervising EP.
Learner reaction: Not reported.

SHO = senior house officer.


None of the studies were able to demonstrate a sustained change in behavior. This may be due to the difficulty in attributing behavior change to a single educational intervention. Additionally, it has been recognized for many years that people, once they are aware they are being observed, intentionally or unintentionally modify their behavior. This "Hawthorne effect"25 may result in trainees attempting to perform at their "best," rather than practice at their usual level. It has been suggested that by making direct observation "routine" and repetitive, this effect may be minimized.6

Emergency clinicians have previously raised concerns about the emphasis in the medical education literature on learner reaction at the expense of the "correct" end points of quality patient care or clinical outcome measures.26

Direct observation and bedside supervision appear to have conflicting effects on the ED as a whole: individual patient care may improve due to earlier involvement of senior staff and improved decision-making, but departmental efficiency may be reduced due to the diversion of senior staff to direct clinical supervision and the decreased speed with which patients are seen. The resulting delay in care may be detrimental to patients.

Despite this, over time there may be a greater increase in efficiency and clinical skills for those trainees who are provided with higher levels of supervision and feedback. However, there is no perfect measure for these aspects of performance, and results may be confounded by multiple other factors, such as trainee motivation, didactic teaching, and clinical experiences.

Table 3. Reported Effects of Direct Observation of Clinical Practice in the ED

Benenson and Pollack, 200318
Effects on learning (knowledge, skills, attitudes): NA.
Effects on behavior of emergency trainees: Feedback after direct observation led to improved performance in those initially assessed as "unsatisfactory."
Effects on patients: NA.
Effects on the ED: NA.

Celenza and Rogers, 200619
Effects on learning: Trainees report learning clinical reasoning, clinical knowledge, and professional and communication skills. Obstacles identified included the busy department, interruptions, and sessions perceived as "too long."*
Effects on behavior of emergency trainees: NA.
Effects on patients: More rapid decision-making and management decisions; senior "second opinion"; more thorough assessment; patients feeling better cared for.
Effects on the ED: Prolonged waiting times for other patients*; slower patient processing*; resource-intensive ("nonclinical" use of senior staff).*

Cydulka et al., 199612
Effects on learning: Opportunity for extended discussion and teaching at the bedside; immediate corrective advice for identified problems/deficiencies; identification of the underlying cause of "slow" or "underperforming" trainees.
Effects on behavior of emergency trainees: Increased attention to detail with regard to clinical assessment and preparation for procedures.
Effects on patients: NA.
Effects on the ED: Resource-intensive use of EPs.*

Jouriles et al., 200221
Effects on learning: Direct observation may be useful to assess trainees' interpersonal skills; however, it needs further evaluation. Raw scores of trainees' performance did not discriminate between those with poor or good interpersonal skills*; standardization of these scores provided more accurate information.
Effects on behavior of emergency trainees: NA.
Effects on patients: NA.
Effects on the ED: NA.

NA = no information reported. *Negative effects.


It remains to be seen whether an "up-front investment" in trainees results in better patient outcomes.

All studies that mentioned effects on the ED highlighted the high cost of direct observation in terms of senior clinician time, which comes at the expense of either direct patient care or other administrative tasks. When significant investment of clinician time is required for minimal proven clinical benefit, and given the current financial state of many hospitals, managers are unlikely to write a blank check to continue a program on purely educational grounds.

Future research in this area must attempt to identify clinically relevant outcomes that can be attributed to the educational intervention. This is likely to be difficult: patient outcomes are influenced by multiple health systems and clinical teams, premorbid health conditions, and other factors.27 Despite this difficulty, it is important to demonstrate positive effects on patient care or ED processes to ensure ongoing support for successful educational programs in a resource-poor environment.

Models of Direct Clinical Supervision in the ED

Direct clinical supervision has the potential to positively affect patient outcome and trainee development,

especially when combined with focused feedback.28 The published models of direct clinical supervision range from checklist-based assessment and specific feedback to a "clinical teaching shift," which is geared toward enhancing the learning experience of trainees in a more fluid way.

All published models were positively received by trainees, although initial learner apprehension was reported. Where apprehension was noted, it was quickly replaced by appreciation for the individualized attention and excellent learning opportunities that direct observation offered. None of the identified studies compared different models of direct clinical supervision, so at this stage no recommendations can be made on a particular model; this could be a focus for future research.

Future Research

There is little published research on the effects of direct observation in the ED, and there are several potential avenues for further research. First, we should examine effects other than learner reactions. It is important to collect supplementary data to determine whether direct observation in the ED has a positive effect on learning, trainee behavior, patient outcomes, or ED processes. Positive results in these areas may increase the likelihood of wider adoption of this educational method. Second, comparisons of different models of direct observation (checklist-based assessment and feedback, opportunistic teaching, or a combination) are needed to determine the optimal method of delivery. Third, comparisons between direct observation and feedback by peers and by supervisors are also needed.

LIMITATIONS

The literature search and review was performed by a single author and included only English-language articles. It is possible that some important articles were missed during the review process; however, attempts were made to minimize this by checking the reference lists of retrieved papers and by the use of two separate database searches (ERIC and Medline).

The search terms and inclusion criteria were strictly applied. This may have led to underinclusion of other studies that are potentially relevant to EM. However, the aim of this review was to provide information pertinent to the implementation and effects of direct observation in EM. A comprehensive review of the use of direct observation in postgraduate medical education has been published elsewhere and found that reports on validity and description of educational outcomes were scarce.29

CONCLUSIONS

This article summarizes the current literature on the effects of direct, clinical observation of residents in EDs on learners, patients, and departmental functioning. This educational method is resource-intensive, requiring senior staff to be free of a patient load and able to directly observe and supervise an intern or resident. Many papers identify positive learner reaction, which

Table 4
Overview of Published Models of Direct Observation

Benenson and Pollack, 2003 (18)
Intervention: Trained evaluators (ED-employed patient representatives) involved in caring for bereaved families.
Data collection: Checklist filled in by observers during clinical encounter.

Celenza and Rogers, 2006 (19)
Intervention: Consultant and registrar formally allocated to teaching and learning roles for a 5-hour period. Opportunistic teaching at the bedside, determined by case mix.
Data collection: No specific data collected during period of direct observation. Learner and teacher reactions assessed using questionnaire.

Cydulka et al., 1996 (12)
Intervention: EP (either residency coordinator or faculty advisor). After observed patient encounter, the trainee and supervisor would discuss the clinical findings and develop a treatment plan. Immediate feedback provided to trainee.
Data collection: Checklist filled in by observers during clinical encounter.

Shayne et al., 2002 (23)
Intervention: Introduction of the clinical evaluation exercise (CEE), providing a "snapshot" of a trainee's performance in a clinical encounter. Data gathered through direct observation, and immediate feedback provided to the trainee.
Data collection: Checklist filled in by observers during clinical encounter.



may be used to justify the introduction of such a program. Additional outcomes include a perception of more thorough clinical assessment and improved management decisions, but these are achieved at the expense of slower overall patient processing. In a health care environment where resources are insufficient to meet all clinical priorities, further research is required to determine whether such programs have positive effects on learning, ED efficiency, or, most importantly, measurable patient outcomes.

References

1. Gordon J. ABC of learning and teaching in medicine: one to one teaching and feedback. Br Med J. 2003; 326:543–5.

2. Zhu JN, Weiland TJ, Taylor DM, Dent AW. An observational study of emergency department intern activities. Med J Aust. 2008; 188:514–9.

3. Sacchetti A, Carraccio C, Harris RH. Resident management of emergency department patients: is closer attending supervision needed? Ann Emerg Med. 1992; 21:749–52.

4. Carley S, Mackway-Jones K. Developing a virtual learning course in emergency medicine for F2 doctors. Emerg Med J. 2007; 24:525–8.

5. Kilroy DA. Clinical supervision in the emergency department: a critical incident study. Emerg Med J. 2006; 23:105–8.

6. Yoon P. Direct observation in postgraduate emergency medicine training. Israeli J Emerg Med. 2005; 5:25–9.

7. Carley S, Morris H, Kilroy D. Clinical teaching in emergency medicine: the board round at Hope Hospital emergency department. Emerg Med J. 2007; 24:659–61.

8. ACEM. Australasian College of Emergency Medicine Training and Examination Handbook. Melbourne, Australia: ACEM, 2008.

9. ACEM. Australasian College of Emergency Medicine In-training Assessment Form (Form GT-11), in Australasian College of Emergency Medicine Training and Examination Handbook. Melbourne, Australia: ACEM, 2008.

10. Wyatt JP, Henry J, Beard D. The association between seniority of accident and emergency doctor and outcome following trauma. Injury. 1999; 30:165–8.

11. Harvey M, Al Shaar M, Cave G, Wallace M, Brydon P. Correlation of physician seniority with increased emergency department efficiency during a resident doctors' strike. N Zeal Med J. 2008; 121:59–68.

12. Cydulka RK, Emerman CL, Jouriles NJ. Evaluation of resident performance and intensive bedside teaching during direct observation. Acad Emerg Med. 1996; 3:345–51.

13. Burdick WP, Schoffstall J. Observation of emergency medicine residents at the bedside: how often does it happen? Acad Emerg Med. 1995; 2:909–13.

14. Kirkpatrick DL. Evaluating Training Programs: The Four Levels. San Francisco, CA: Berrett-Koehler, 1994.

15. Letts L, Wilkins S, Law M, Stewart D, Bosch J, Westmorland M. Critical Review Form - Qualitative Studies (Version 2.0). McMaster University. Available at: http://www.srs-mcmaster.ca/Portals/20/pdf/ebp/qualreview_form1.doc. Accessed Oct 15, 2010.

16. STROBE. STROBE checklist for cross-sectional studies, version 4. STrengthening the Reporting of OBservational studies in Epidemiology. Available at: http://www.strobe-statement.org/index.php?id=available-checklists. Accessed Oct 15, 2010.

17. STROBE. STROBE checklist for cohort studies, version 4. STrengthening the Reporting of OBservational studies in Epidemiology. Available at: http://www.strobe-statement.org/index.php?id=available-checklists. Accessed Oct 15, 2010.

18. Benenson RS, Pollack ML. Evaluation of emergency medicine resident death notification skills by direct observation. Acad Emerg Med. 2003; 10:219–23.

19. Celenza A, Rogers IR. Qualitative evaluation of a formal bedside clinical teaching programme in an emergency department. Emerg Med J. 2006; 23:769–73.

20. Dorfsman ML, Wolfson AB. Direct observation of residents in the emergency department: a structured educational program. Acad Emerg Med. 2009; 16:343–51.

21. Jouriles NJ, Emerman CL, Cydulka RK. Direct observation for assessing emergency medicine core competencies: interpersonal skills. Acad Emerg Med. 2002; 9:1338–41.

22. Lloyd G, Skarratts D, Robinson N, Reid C. Communication skills training for emergency department senior house officers: a qualitative study. J Accid Emerg Med. 2000; 17:246–50.

23. Shayne P, Heilpern K, Ander D, Palmer-Smith V; Emory University Department of Emergency Medicine Education Committee. Protected clinical teaching time and a bedside clinical evaluation instrument in an emergency medicine training program. Acad Emerg Med. 2002; 9:1342–9.

24. Schmidt UH, Kumwilaisak K, Bittner E, George E, Hess D. Effects of supervision by attending anesthesiologists on complications of emergency tracheal intubation. Anesthesiology. 2008; 109:973–7.

25. Roethlisberger F, Dickson W. Management and the Worker: An Account of a Research Program Conducted by Western Electric Company, Hawthorne Works, Chicago. Cambridge, MA: Harvard University Press, 1939.

26. Gisondi M. Improving ED bedside teaching and resident evaluation. Presentation to Stanford EM Faculty Development Session. May 21, 2003.

27. Norcini JJ. ABC of learning and teaching in medicine: work based assessment. Br Med J. 2003; 326:753–5.

28. Kilminster S, Cottrell D, Grant J, Jolly B. AMEE Guide No. 27: Effective educational and clinical supervision. Med Teach. 2007; 29:2–19.

29. Kogan JR, Holmboe ES, Hauer KE. Tools for direct observation and assessment of clinical skills of medical trainees: a systematic review. JAMA. 2009; 302:1316–26.
