Review Article

Workplace-based assessment: applications and educational impact

Salman Yousuf Guraya

College of Medicine, Taibah University, Almadinah Almunawwarah, Tareeq Jamiat, P.O. Box 30054, Kingdom of Saudi Arabia

Submitted: 6 Nov 2014
Accepted: 6 Nov 2015

Malays J Med Sci. Nov-Dec 2015; 22(6): 5-10

www.mjms.usm.my © Penerbit Universiti Sains Malaysia, 2015. For permission, please email: mjms.usm@gmail.com

Abstract

Workplace-based assessment (WPBA) refers to a group of assessment modalities that evaluate trainees' performance in clinical settings. The hallmark of WPBA is the observation of the trainee's performance in the real workplace environment along with relevant feedback, thus fostering reflective practice. WPBA consists of observation of clinical performance (mini-clinical evaluation exercise, direct observation of procedural skills), discussion of clinical cases (case based discussion), and feedback from peers, coworkers, and patients (multisource feedback). This literature review was conducted on the databases of MEDLINE, EMBASE, CINAHL, and The Cochrane Library. Data were retrieved by connecting Medical Subject Headings (MeSH) keywords: 'workplace based assessment' AND 'mini-clinical evaluation exercise' AND 'direct observation of procedural skills' AND 'case based discussion' AND 'multi-source feedback'. Additional studies were searched in the reference lists of all included articles. As WPBA is gaining popularity, there is a growing need for continuing faculty development and greater evidence regarding the validity and reliability of these instruments, which will allow academia to embed this strategy in existing curricula. As of today, there are conflicting reports about the educational impact of WPBA in terms of its validity and reliability. This review draws upon the spectrum of WPBA tools, their designs and applications, and an account of the existing educational impact of this emerging assessment strategy in medical education. Finally, the review reports some educational impact of WPBA on learning and emphasises the need for more empirical research to endorse its application worldwide.

Keywords: workplace based assessment, mini-clinical evaluation exercise, direct observation of procedural skills, case based discussion, multisource feedback, medical education

Introduction

In traditional apprenticeship-training models, trainees confront a wide range of health-care problems and, as other physicians practise, they are required to apply their professional capabilities in a competent and skilful manner (1). It is essential that assessment models safeguard the safety of patients as well as offer the opportunity of contextual feedback to the trainee. As plausible solutions, a wealth of assessment tools have been described, many of which are modifications of the conventional clinical long case examination. WPBA entails the evaluation of daily clinical practices employed in the working situation (2). Simply, it is an "assessment of what doctors actually do in practice" (3). WPBA encompasses a wide range of assessment strategies that evaluate trainees in clinical settings and provide feedback. WPBA has allowed the transition away from the use of numbers-based experience toward a more structured format of assessment (4).

WPBA has been adopted by the UK General Medical Council (GMC) and the Academy of Medical Royal Colleges (AoMRC) for the assessment of performance in postgraduate medical education (5). Likewise, WPBA is also being used and gaining popularity in undergraduate medical education (6). The GMC describes WPBA as assessment for learning (formative), rather than as assessment of learning (summative) (7). Despite this, WPBA gathers objective evaluation of a trainee's competence and performance, providing a generic blueprint of summative functions. This review explores multiple dimensions of WPBA with a view to looking into its educational impact on medical trainees and physicians.

Study Design

This search was conducted on the databases of MEDLINE, EMBASE, CINAHL, and The Cochrane Library. Data were retrieved by connecting Medical Subject Headings (MeSH) keywords: 'workplace based assessment' AND 'mini-clinical evaluation exercise' AND 'direct observation of procedural skills' AND 'case based discussion' AND 'multi-source feedback'. Additional studies were searched in the reference lists of all included articles. The body of information from the literature showed a range of categories of WPBA.
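As a small illustration of the search strategy, the snippet below simply assembles the five MeSH terms into a single Boolean string of the kind that might be pasted into a database search box. It is a sketch for clarity only; the actual searches were run directly in each database interface, not through a script.

```python
# Illustrative only: assemble the Boolean MeSH search string used in this review.
# The real searches were performed manually in MEDLINE, EMBASE, CINAHL,
# and The Cochrane Library, not through this code.
terms = [
    "workplace based assessment",
    "mini-clinical evaluation exercise",
    "direct observation of procedural skills",
    "case based discussion",
    "multi-source feedback",
]

# Quote each term and join them with the Boolean AND operator.
query = " AND ".join(f'"{term}"' for term in terms)
print(query)
# "workplace based assessment" AND "mini-clinical evaluation exercise" AND ...
```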

Categories of WPBA

A number of WPBA strategies exist, all aiming to assess various facets of trainees' performance. Their categories and specific indications are summarised in Table 1.

Table 1: Categories of workplace-based assessment and their objectives

No.  Tasks                                           Tools
1    Observation of clinical performance             Mini-clinical evaluation exercise
                                                     Direct observation of procedural skills
2    Discussion of clinical cases                    Case based discussion
3    Feedback from peers, coworkers, and patients    Multisource feedback (360° assessment)
                                                     Mini peer assessment tool
                                                     Team assessment of behaviours
                                                     Patient satisfaction questionnaire

Observation of clinical performance

Mini-clinical evaluation exercise (mCEX)

In the mCEX, an assessor evaluates a trainee-patient interaction in any healthcare institution. Such clinical encounters are expected to last for about 15 minutes, and the trainee is expected to conduct a focused history and/or physical examination within this stipulated time (8). At the end, the trainee suggests a diagnosis and management plan, the performance is graded using a structured evaluation form, and then constructive feedback is provided. Assessors use a nine-point Likert scale ranging from 'unsatisfactory' to 'superior' (9). This provides six domain-specific ratings and one final rating of clinical competence. The trainee undertakes around six clinical assessments during the year, with a different assessor for each session.

Nair et al surveyed a group of international medical graduates for the acceptability, reliability, and feasibility of the mCEX and reported that about 50% of graduates were either satisfied or very satisfied with this assessment strategy for learning (10). Other studies examined the educational impact of the mCEX in anaesthesia training and showed that the majority of trainees (and their evaluators) were satisfied with the schedule of assessments and the quality of feedback offered (11).
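As a rough sketch of how a structured mCEX form might be captured and summarised, the example below records six domain ratings plus an overall rating on the nine-point scale and flags low-scoring domains for feedback. The domain names and the flagging threshold are illustrative assumptions; the article does not specify them.

```python
from dataclasses import dataclass, field
from statistics import mean

# Illustrative mCEX record: six domain ratings plus one overall rating,
# each on the nine-point scale (1 = unsatisfactory, 9 = superior).
# Domain names below are assumed for illustration only.
@dataclass
class MiniCexEncounter:
    trainee: str
    assessor: str
    ratings: dict = field(default_factory=dict)  # domain -> rating (1..9)
    overall: int = 0
    feedback: str = ""

    def unsatisfactory_domains(self, threshold: int = 3) -> list:
        """Domains rated at or below the threshold, flagged for targeted feedback."""
        return [d for d, score in self.ratings.items() if score <= threshold]

encounter = MiniCexEncounter(
    trainee="Trainee A",
    assessor="Assessor 1",
    ratings={
        "history taking": 6,
        "physical examination": 5,
        "professionalism": 7,
        "clinical judgement": 4,
        "counselling": 6,
        "organisation/efficiency": 5,
    },
    overall=5,
    feedback="Focused history; tighten the differential before proposing management.",
)

print(mean(encounter.ratings.values()))    # average of the six domain ratings
print(encounter.unsatisfactory_domains())  # [] at the default threshold in this example
```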

Direct observation of procedural skills (DOPS)

DOPS was introduced by the Royal College of Physicians and now forms an integral component of WPBA for doctors in the foundation year and those in specialist training (12). It was specifically designed to assess procedural skills involving real patients in a single encounter. During DOPS, an assessor evaluates a trainee conducting a procedure as a part of his/her routine practical training against set criteria, which is followed by a face-to-face feedback session (13). DOPS provides scored evaluations of practical procedures and clinical examinations. This method of assessment has been shown to be valid, reliable and feasible in evaluating postgraduate medical registrars in the UK (14).

DOPS is considered a valuable learning opportunity for trainees to enhance performance in a skill. Intricate working between trainee and assessor is required for its timely and effective functioning. DOPS assessments are tailor-made to be conveniently integrated into trainees' and assessors' daily routine and are hence considered highly feasible (15). A feedback survey answered by 25 of the 27 pre-registration house officers completing the assessments showed that the majority (70%) were helped by direct observation in improving their clinical skills (5). However, a study conducted in 2009 showed that a number of interns agreed that DOPS might enhance their clinical acumen, but this evidence has not been reported scientifically, and the number of respondents was very small (16).

Discussion of clinical cases

Case-based discussion (CbD)

A CbD focuses entirely on the doctor's real work and at all times explores exactly what was done, why, and how any decision, investigation or intervention was decided upon (17). The trainee selects the timing, the records, and the assessor. A few days before the assessment, a case is chosen with particular curriculum objectives in mind and then discussed using focused questions designed to elicit responses that will indicate knowledge, skills, attitudes and behaviours relevant to those domains. After the discussion, the assessor rates the quality of the performance and then provides constructive feedback. On average, trainees are assessed six times during the year. A working plan for a typical CbD is as follows:

a. Planning
   • Trainee selects two medical records
   • Assessor selects a medical record for discussion and assessment
   • Trainee and assessor map out potential curriculum domains and specific competencies
   • Assessor prepares questions for discussion

b. Discussion
   • Assessor ensures that medical records are available during the discussion
   • Discussion starts with a reminder of the patient's medical record by the assessor
   • Assessor explores the trainee's clinical reasoning and professional judgment
   • Discussion is focused on the case, determining the trainee's diagnostic and management skills

c. Feedback
   • Assessor provides effective and constructive feedback to the trainee

CbD evaluates what the trainees actually did rather than what they think they might do. This is the most striking difference between CbD and the objective structured clinical examination (OSCE), which assesses physician performance under examination conditions (18). CbD has been demonstrated to have significant face and content validity (19). In addition, it has been demonstrated that, with sufficient sampling, good levels of reliability (20) and validity with assessor training can be achieved (21). The innate nature of CbD demands that the doctors' own patients (cases) are used for a conversation or discussion, which provides the main impetus to assess the trainee's applied knowledge, clinical reasoning and decision-making. CbD can explore a full range of holistic, balanced and justifiable decisions in complex situations, such as recognising dilemmas, managing a complex case within a given range of options, deciding on a course of action, explaining the course of action, and reflecting on the final outcomes.

Multisource feedback (360° assessment)

Mini-peer assessment tool (mPAT)

The mPAT encompasses the integration of ideas about a trainee's performance in a range of competence domains from their colleagues. This assessment strategy gathers confidential feedback from eight peers evaluating 16 aspects from the following domains (22):

• Diagnosis and appropriate application of available investigative tools
• Management of time
• Management of stress, fatigue, and workload
• Effective communication
• Knowledge of one's own limitations

Foundation doctors in the UK are required to complete at least two mPATs per year. Archer et al explored the validity of the mPAT by a mapping exercise against the UK Foundation Curriculum trainees' clinical performance (23). They administered a 16-item questionnaire scored on a six-point scale on two separate occasions during the pilot study. The participants' responses were evaluated to identify the internal structural framework, potential points of leniency and various measurement variables. The analysis of these variables generated two main factors: clinical competence and humanistic values. The research illustrated that, as a component of an assessment programme, the mPAT has the potential to provide an effective and reliable tool for collating colleague opinions in comprehensively assessing Foundation trainees.
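A minimal sketch of the collation step implied here: eight peer raters each score 16 items on a six-point scale, and the returns are averaged per item and overall. The random scores and the simple averaging are illustrative assumptions; the published mini-PAT defines its own items, handling of missing responses, and reporting format.

```python
import random
from statistics import mean

# Illustrative collation of mini-PAT style returns:
# 8 peer raters, 16 items, each scored on a six-point scale (1-6).
NUM_RATERS = 8
NUM_ITEMS = 16

random.seed(0)
# responses[r][i] = rating given by rater r to item i (simulated data for the sketch)
responses = [[random.randint(1, 6) for _ in range(NUM_ITEMS)] for _ in range(NUM_RATERS)]

# Mean score per item across raters, then an overall mean across items.
item_means = [mean(r[i] for r in responses) for i in range(NUM_ITEMS)]
overall_mean = mean(item_means)

print([round(m, 2) for m in item_means])
print(round(overall_mean, 2))
```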

Team assessment of behaviours (TAB)

TAB is a form of multisource feedback assessment for trainee doctors in the UK Foundation Curriculum (24). TAB has the following four domains based on the GMC's guidance on professional behaviour:

• Developing and maintaining professional rapport and relationships with patients
• Communicating by effective verbal skills
• Working in a team and as a team leader
• Ensuring accessibility and availability

TAB is used as a formative as well as a summative tool to help people improve their performance. This assessment tool entails a minimum of 10 returns for a valid, reliable evaluation. The recommended mix of raters is specified, since ratings vary significantly by staff group (25).

Patient satisfaction questionnaire (PSQ)

The PSQ can provide formative feedback on a doctor's professional performance within a process of appraisal (26). A structured questionnaire is used to obtain patients' feedback. Physicians are expected to obtain feedback at least once every five years, to reflect on the feedback they receive, and to use it to inform their further professional development, where appropriate (27). When patients assess physicians or larger health care systems, the demographics of the patients and the questionnaire administration methods (e.g., postal, telephone or use of proxy respondents) can potentially influence final ratings (28-30). When colleagues judge the performance of other physicians, the rater's personal impression, the duration and nature of the rater's relationship with the examinee, and the rater's familiarity with the doctor's practice can jeopardise the entire assessment process (31).

Many reports in the existing literature suggest that multisource feedback can objectively assess key competencies such as communication skills, interpersonal skills, collegiality, professional expertise, and the ability to progress in the medical field (21,32,33). Multisource feedback, however, has its own limitations. A number of studies have shown that responses tend to be skewed towards positive assessments of doctor performance (34-36). Others have expressed dissatisfaction with the ability of multisource feedback, patient feedback in particular, to identify underperforming doctors (37).

Acute Care Assessment Tool (ACAT)

The ACAT assesses the performance of a trainee after, for instance, a night shift in acute medicine.

Strengths and weaknesses of WPBA

A study of medical students indicated that WPBA was useful for increasing contact time with the supervisors (38). In another study, students reported that allocating a tutor in WPBA was the most effective means of judging competence (39). Through WPBA, learners establish a professional relationship with their tutors, who become authentic sources of effective feedback for the learners. The outright strength of WPBA is its formative potential for assessment. The essential impetus needed to achieve 'assessment for learning' is the provision of feedback by the assessor, enabling the trainee to steer his or her learning towards the intended learning achievements (40). Evidence-based research has shown that systematic feedback delivered by a credible source can enhance clinical performance (41). The most striking evidence of performance improvement by WPBA is derived from studies exploring the feasibility and validity of multisource feedback (5). Research from the psychology literature has shown that multisource feedback can lead to gradual enhancements in professional competence (42). Another study in medical education showed that doctors getting specific feedback from peers, colleagues and patients can use the data to implement modifications in their clinical practice (43). The direct observation of the trainee's performance at the workplace is only made useful by the associated feedback, thus triggering reflection (44). The feasibility of WPBA is greater than that of conventional assessments, as it is applied and conducted during the course of the day-to-day routine. Although WPBA demands prior training of the faculty, additional time, and student knowledge and sensitisation, institutions do not need any dedicated infrastructure to establish this assessment strategy. However, a survey reporting the views of 539 surgical trainees on the Intercollegiate Surgical Curriculum Programme (16) showed that 60% of respondents felt that the programme adversely affected training opportunities because of the long time required to complete the assessments. More than 90% stated that the programme had a neutral or negative impact on their training overall.

WPBA, per se, cannot replace traditional methods of assessment but carries potential as an add-on method, especially for in-training or formative assessment. Trainees who perform well in initial encounters may become overconfident, and this may be a hurdle in motivating them towards future improvements (44). Since WPBA demands a lot of time, trainees tend to seek less senior assessors. There is evidence to suggest that more senior and expert staff may provide lower but more accurate ratings of performance (45). At the same time, WPBA requires assessor training, particularly in objective evaluation and providing effective feedback. Sensitisation and introductory seminars about WPBA may be the first step in grooming the staff in the appropriate functionality of this assessment tool.

Conclusion

WPBA involves the evaluation of performance and the provision of feedback to doctors in their everyday activities. These assessment tools purport to provide valuable insight to trainees, assessors and academics, and are found to have some educational impact on learning. However, further evidence-based interventional and experimental models are needed to establish its significant educational impact in medical education.

Acknowledgement

None.

Conflict of interests

None.

Funds

None.

Correspondence

Professor Salman Yousuf Guraya
FRCS, MMedEd (Dundee)
College of Medicine
Taibah University
Almadinah Almunawwarah
Tareeq Jamiat, P.O. Box 30054
Kingdom of Saudi Arabia
Tel: +966 148460008
Fax: +966 148471403
Email: salmanguraya@gmail.com

References

1. Norcini JJ, McKinley DW. Assessment methods in medical education. Teach Teach Educ. 2007;23(3):239–250.

2. Swanwick T, Chana N. Workplace assessment for licensing in general practice. Br J Gen Pract. 2005;55(515):461–467.

3. Swanwick T, Chana N. Workplace-based assessment. Br J Hosp Med. 2009;70(5):290–293.

4. Nesbitt A, Baird F, Canning B, Griffin A, Sturrock A. Student perception of workplace-based assessment. Clin Teach. 2013;10(6):399–404. doi: 10.1111/tct.12057.

5. Miller A, Archer J. Impact of workplace based assessment on doctors' education and performance: a systematic review. BMJ. 2010;341:c5064. doi: 10.1136/bmj.c5064.

6. Boursicot K, Etheridge L, Setna Z, Sturrock A, Ker J, Smee S, et al. Performance in assessment: Consensus statement and recommendations from the Ottawa conference. Med Teach. 2011;33(5):370–383. doi: 10.3109/0142159X.2011.565831.

7. Franche R-L, Cullen K, Clarke J, Irvin E, Sinclair S, Frank J. Workplace-based return-to-work interventions: a systematic review of the quantitative literature. J Occup Rehabil. 2005;15(4):607–631. doi: 10.1007/s10926-005-8038-8.

8. Augustine K, McCoubrie P, Wilkinson J, McKnight L. Workplace-based assessment in radiology—where to now? Clin Radiol. 2010;65(4):325–332. doi: 10.1016/j.crad.2009.

9. Hawkins RE, Margolis MJ, Durning SJ, Norcini JJ. Constructing a validity argument for the mini-clinical evaluation exercise: A review of the research. Acad Med. 2010;85(9):1453–1461. doi: 10.1097/ACM.0b013e3181eac3e6.

10. Nair BR, Alexander HG, McGrath BP, Parvathy MS, Kilsby EC, Wenzel J, et al. The mini clinical evaluation exercise (mini-CEX) for assessing clinical performance of international medical graduates. Med J Aust. 2008;189(3):159–161.

11. Weller J, Jolly B, Misur M, Merry A, Jones A, Crossley JM, et al. Mini-clinical evaluation exercise in anaesthesia training. Br J Anaesth. 2009;102(5):633–641. doi: 10.1093/bja/aep055.

12. Bindal N, Goodyear H, Bindal T, Wall D. DOPS assessment: A study to evaluate the experience and opinions of trainees and assessors. Med Teach. 2013;35(6):e1230–e1234. doi: 10.3109/0142159X.2012.746447.

13. Pelgrim E, Kramer A, Mokkink H, Van den Elsen L, Grol R, Van der Vleuten C. In-training assessment using direct observation of single-patient encounters: a literature review. Adv Health Sci Educ. 2011;16(1):131–142.

14. Wilkinson JR, Crossley JG, Wragg A, Mills P, Cowan G, Wade W. Implementing workplace-based assessment across the medical specialties in the United Kingdom. Med Educ. 2008;42(4):364–373. doi: 10.1111/j.1365-2923.2008.03010.x.

15. Morris A, Hewitt J, Roberts C. Practical experience of using directly observed procedures, mini clinical evaluation examinations, and peer observation in pre-registration house officer (FY1) trainees. Postgrad Med J. 2006;82(966):285–288. doi: 10.1136/pgmj.2005.040477.

16. Pereira EA, Dean BJ. British surgeons' experiences of mandatory online workplace-based assessment. J R Soc Med. 2009;102(7):287–293. doi: 10.1258/jrsm.2009.080398.

17. Brown N, Holsgrove G, Teeluckdharry S. Case-based discussion. Adv Psychiatr Treat. 2011;17(2):85–90.

18. Guraya S, Alzobydi A, Salman S. Objective structured clinical examination: Examiners' bias and recommendations to improve its reliability. J Med Med Sci. 2010;1:269–272.

19. Jennett PA, Scott SM, Atkinson MA, Crutcher RA, Hogan DB, Elford RW, et al. Patient charts and physician office management decisions: chart audit and chart stimulated recall. J Contin Educ Health Prof. 1995;15(1):31–39.


20. Norman GR, Davis DA, Lamb S, Hanna E, Caulford P, Kaigas T. Competency assessment of primary care physicians as part of a peer review program. JAMA. 1993;270(9):1046–1051. doi: 10.1001/jama.1993.03510090030007.

21. Epstein RM, Hundert EM. Defining and assessing professional competence. JAMA. 2002;287(2):226–235. doi: 10.1001/jama.287.2.226.

22. Burkill G. Work-based assessment for trainees—more than just a few new tools? Clin Radiol. 2008;63(1):12–14. doi: 10.1016/j.crad.2007.07.014.

23. Archer J, Norcini J, Southgate L, Heard S, Davies H. mini-PAT (Peer Assessment Tool): a valid component of a national assessment programme in the UK? Adv Health Sci Educ Theory Pract. 2008;13(2):181–192. doi: 10.1007/s10459-006-9033-3.

24. Wall D, Singh D, Whitehouse A, Hassell A, Howes J. Self-assessment by trainees using self-TAB as part of the team assessment of behaviour multisource feedback tool. Med Teach. 2012;34(2):165–167. doi: 10.3109/0142159X.2012.644840.

25. Bullock AD, Hassell A, Markham WA, Wall DW, Whitehouse AB. How ratings vary by staff group in multi-source feedback assessment of junior doctors. Med Educ. 2009;43(6):516–520. doi: 10.1111/j.1365-2923.2009.03333.x.

26. Campbell J, Richards S, Dickens A, Greco M, Narayanan A, Brearley S. Assessing the professional performance of UK doctors: an evaluation of the utility of the General Medical Council patient and colleague questionnaires. Qual Saf Health Care. 2008;17(3):187–193. doi: 10.1136/qshc.2007.024679.

27. Wright C, Richards SH, Hill JJ, Roberts MJ, Norman GR, Greco M, et al. Multisource feedback in evaluating the performance of doctors: the example of the UK General Medical Council patient and colleague questionnaires. Acad Med. 2012;87(12):1668–1678. doi: 10.1097/ACM.0b013e3182724cc0.

28. Woods SE, Bivins R, Oteng K, Engel A. The influence of ethnicity on patient satisfaction. Ethn Health. 2005;10(3):235–242.

29. Campbell JL, Roberts M, Wright C, Hill J, Greco M, Taylor M, et al. Factors associated with variability in the assessment of UK doctors' professionalism: analysis of survey results. BMJ. 2011;343:d6212. doi: 10.1136/bmj.d6212.

30. Lipner RS, Blank LL, Leas BF, Fortna GS. The value of patient and peer ratings in recertification. Acad Med. 2002;77(10 Suppl):S64–S66.

31. Mackillop LH, Crossley J, Vivekananda-Schmidt P, Wade W, Armitage M. A single generic multi-source feedback tool for revalidation of all UK career-grade doctors: Does one size fit all? Med Teach. 2011;33(2):e75–e83. doi: 10.3109/0142159X.2010.535870.

32. Violato C, Lockyer J, Fidler H. Multisource feedback: a method of assessing surgical practice. BMJ. 2003;326(7388):546–548. doi: 10.1136/bmj.326.7388.546.

33. Brown C. Multi-source feedback. In: Malik A, Bhugra D, Brittlebank A, editors. Workplace-Based Assessments in Psychiatry. London: The Royal College of Psychiatrists; 2011. p. 68–75.

34. Archer J, McGraw M, Davies H. Assuring validity of multisource feedback in a national programme. Arch Dis Child. 2010;95(5):330–335. doi: 10.1136/adc.2008.146209.

35. Campbell J, Narayanan A, Burford B, Greco M. Validation of a multi-source feedback tool for use in general practice. Educ Prim Care. 2010;21(3):165–179.

36. Richards SH, Campbell JL, Walshaw E, Dickens A, Greco M. A multi-method analysis of free-text comments from the UK General Medical Council Colleague Questionnaires. Med Educ. 2009;43(8):757–766. doi: 10.1111/j.1365-2923.2009.03416.x.

37. Archer JC, McAvoy P. Factors that might undermine the validity of patient and multi-source feedback. Med Educ. 2011;45(9):886–893. doi: 10.1111/j.1365-2923.2011.04023.x.

38. Sabey A, Harris M. Training in hospitals: what do GP specialist trainees think of workplace-based assessments? Educ Prim Care. 2011;22(2):90–99.

39. Archer JC. State of the science in health professional education: effective feedback. Med Educ. 2010;44(1):101–108. doi: 10.1111/j.1365-2923.2009.03546.x.

40. Norcini J, Burch V. Workplace-based assessment as an educational tool: AMEE Guide No. 31. Med Teach. 2007;29(9-10):855–871. doi: 10.1080/01421590701775453.

41. Veloski J, Boex JR, Grasberger MJ, Evans A, Wolfson DB. Systematic review of the literature on assessment, feedback and physicians' clinical performance: BEME Guide No. 7. Med Teach. 2006;28(2):117–128. doi: 10.1080/01421590600622665.

42. Smither JW, London M, Reilly RR. Does performance improve following multisource feedback? A theoretical model, meta-analysis, and review of empirical findings. Pers Psychol. 2005;58(1):33–66. doi: 10.1111/j.1744-6570.2005.514_1.x.

43. Fidler H, Lockyer JM, Toews J, Violato C. Changing physicians' practices: the effect of individual feedback. Acad Med. 1999;74(6):702–714.

44. Singh T, Modi JN. Workplace-based assessment: A step to promote competency based postgraduate training. Indian Pediatr. 2013;50(6):553–559.

45. Hill F, Kendall K, Galbraith K, Crossley J. Implementing the undergraduate mini-CEX: a tailored approach at Southampton University. Med Educ. 2009;43(4):326–334. doi: 10.1111/j.1365-2923.2008.03275.x.
