Blueprinting for the assessment of health care professionals

Hossam Hamdy, Arabian Gulf University, Kingdom of Bahrain
An important approach that has evolved and been emphasised in the assessment of health professionals is the blueprint approach to assessment construction.1 Like ‘portfolio’, a term that has been borrowed by medical education from the arts, ‘blueprint’ has been borrowed from architecture. It indicates that a process of assessment needs to be conducted according to a replicable plan. This fundamental procedure, as a precursor to test construction and item choice, ensures that test content is mapped carefully against learning objectives to produce a ‘valid examination’.2 It generates congruence between the subject matter delivered during instruction, or the competencies expected to be acquired by the student, and the items that appear in the test.3
An important aim of blueprinting is to reduce two major validity threats. The first is ‘construct under-representation’ – that is, the under-sampling or biased sampling of the curriculum or course content.4 For example, in a written examination, this threat may occur when there are too few items covering a topic to achieve a balanced representation of the curriculum. In an objective structured clinical examination (OSCE), it can occur when too few cases are in a specific domain, or when stations on rare or inappropriately difficult problems are included. The second threat, ‘construct-irrelevant variance’, can occur as a result of flawed item formats, items that are too easy or too hard, or the choice of inappropriate test modalities (for example, test methods that minimise the observation of clinical practice, when that is the focus of the assessment). Rater bias and inadequate sampling of student behaviours – for example, a tendency to focus exclusively on a limited set of aspects of the performance – are further examples of this type of threat.5 It is a threat because a student’s performance on the sample of items in the test is generally taken also to indicate his or her performance across a broader domain of interest. Therefore, if the sample is not representative of that broader
© Blackwell Publishing Ltd 2006. THE CLINICAL TEACHER 2006; 3: 175–179
domain of interest, the examination results will be biased. This occurs, for example, in a finals MD examination, when too many questions are focused mainly on one system. In addition to ensuring adequate relevance and sampling, blueprinting helps to identify test instruments appropriate to the constructs and contents of the assessment.
Although assessment blueprinting is an efficient method for helping with the test construction process, its application has been overlooked by medical teachers. A recent study by Bridge et al.3 showed that only 15 per cent of curriculum administrators at 144 US and Canadian medical schools required course directors or instructors to develop assessment blueprints.
THE PROCESS
Essentially, most assessment blueprints are constructed as a spreadsheet or series of grids that reflect all the parameters of the
curriculum, including the precise clinical context in which this curriculum operates. For example, an urban versus a rural context may have a major impact on the level of resolution of clinical problems that can be achieved by students.
The first step in blueprint development is to define the purpose and focus of the assessment. This step is critical in the assessment of health professionals in various specialties and at various stages in their professional development. For example, in a blueprint for a medical undergraduate programme final assessment, consideration of the expected future job responsibilities of graduates should guide the identification of the competencies graduates need to acquire at the point where they exit the programme. This will help in building examinations that are tailored to the expectation that examinees are able to perform the task competently given their level of training. These competencies can be further deconstructed into components of related knowledge, skills and attitude. Such a process of identification and analysis has an important educational impact, allowing curriculum planners and teachers to review critically and revise priorities in curriculum content, learning strategies and student assessment. In addition, this simplifies the standard-setting process and motivates improvement in both teaching and learning. Examples of such ‘core’ competencies are the General Medical Council’s (UK) framework on good medical practice;6 the Association of American Medical Colleges’ learning objectives for medical students;7 the Accreditation Council for Graduate Medical Education’s outcomes project;8 and the ‘objectives for the qualifying examination’ of the Medical Council of Canada.9
The complexity of activities undertaken by health professionals, different practice contexts, and different patients’ problems across a range of age groups will need an assessment process that encompasses all these elements. This can usually be achieved only with the use of suitable multiple assessment instruments in order to develop a highly valid and reliable examination.
The following concrete example explains how an assessment blueprint might be developed. Let us consider developing an examination for the purpose of assessing medical students’ competencies at the end of an undergraduate medical education programme. We shall consider, in this example, the competency domain of ‘patient care’. Three basic questions need to be asked and answered. The first question, ‘What to assess?’, brings the response ‘problems and tasks’. Core health problems could be identified in relation to different body systems, age groups and practice contexts – for example, ‘emergency or non-emergency’. Clinical tasks related to patient care could be further subdivided into history taking, physical examination, clinical reasoning, diagnosis, investigations, treatment and follow-up. The second question is, ‘What is the expected degree of resolution of each case or task appropriate to the examinee’s level?’ In our case, this resolution would be at the level expected of a final-year medical
student. The third question is, ‘Which components of the task will be assessed, and what is/are the test instrument(s) to be used?’ Almost all clinical tasks involve the three components of knowledge, skills and attitude, but for some tasks one of these components may take priority over the others.
Miller’s Pyramid10 for the assessment of clinical competences provides a framework that will help in defining a clear and reproducible focus for the assessment (see Figure 1). It describes two major domains: cognition – ‘knowledge’ – and behaviour – ‘performance’. Students’ knowledge could be assessed at the recall level (‘knows’) or at the level of its application (‘knows how’), and student behaviour in skills and attitudes at the ‘shows how’ and ‘does’ levels. This taxonomy helps in selecting the best possible assessment instrument.11 Written tests are usually employed for the assessment of cognition, preferably at the higher level of application of knowledge and problem-solving (‘knows how’) rather than at a level that requires simple recall. Well-constructed A-type multiple choice questions (MCQs), extended matching questions (EMQs), key feature questions12 and other constructed-response short-answer questions will enable the clinical teacher to sample a wide range of relevant constructs and content.
At the ‘shows how’ level, an OSCE using simulated patients or demonstration of procedures on models could be used. Assessment at the ‘does’ level is more difficult, but it can be observed through interaction with real patients – for example, in the direct observation clinical encounter examination (DOCEE)13 and the mini clinical evaluation exercise (Mini-CEX).14 These examinations have a higher degree of authenticity and reduce the tendency to fractionate the competency, which is a problem when using short-station OSCEs. Here, the knowledge, skills and professional behaviour components of a clinical task are examined in a single package.
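The mapping from Miller’s levels to the instrument families named above can be written down directly. The Python sketch below is only an illustrative summary of the text, not an authoritative catalogue of instruments:

```python
# Illustrative mapping from Miller's pyramid levels to the instrument
# families mentioned in the text; the lists are examples, not exhaustive.
INSTRUMENTS_BY_LEVEL = {
    "knows": ["A-type MCQ"],                       # simple recall
    "knows how": ["A-type MCQ", "EMQ",
                  "key feature question", "MEQ"],  # applied knowledge
    "shows how": ["OSCE (simulated patients)",
                  "OSCE (procedures on models)"],  # demonstrated skill
    "does": ["DOCEE", "Mini-CEX"],                 # real-patient practice
}

def suggest_instruments(level):
    """Return candidate instruments for a Miller level (empty if unknown)."""
    return INSTRUMENTS_BY_LEVEL.get(level, [])
```

For example, `suggest_instruments("does")` returns the two real-patient formats discussed above.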
Figure 1. A simple model of competence.
Source: Miller GE. The assessment of clinical skills/competence/performance. Acad Med 1990;65(Suppl.):S63–S67.
Purpose of the exam: summative (e.g. final MD examination, end of clerkship, etc.)
Competency domain: patient care

The grid sets problems (rows) against the remaining blueprint dimensions (columns): System | Problem | Practice context (1–2) | Age (1–3) | Clinical task (1–6) | Competency level (1–4) | Test instrument. Crosses mark the cells selected for testing; the example rows and the instruments chosen for them are:

Gastrointestinal – upper GI bleeding: A-type MCQ, key feature question
Gastrointestinal – projectile vomiting (pyloric stenosis): A-type MCQ, R-type MCQ
Gastrointestinal – blood in the stool (rectal cancer): MEQ, OSCE (X-ray)
Cardiovascular – cardiac arrest: OSCE (CPR)
Cardiovascular – chest pain: written, OSCE (ECG, simulated patient)

Key:
Problem: presentation or pathology
Practice context: 1 = Emergency; 2 = Non-emergency
Age: 1 = Child; 2 = Adult; 3 = Elderly
Clinical task: 1 = History taking; 2 = Physical examination; 3 = Diagnosis; 4 = Investigation; 5 = Treatment; 6 = Follow-up
Competency level: 1 = Knows; 2 = Knows how; 3 = Shows how; 4 = Does
Test instrument: 1 = Written (a. A-type MCQs; b. R-type EMQs; c. Key feature questions; d. Modified essay question (MEQ)); 2 = OSCE (simulated patients/procedures); 3 = Clinical encounter (real patients)

Figure 2. Example of test blueprint (the cell-by-cell crosses of the original grid are not reproduced).
An assessment blueprint, constructed as a grid or table of specifications, should reflect the parameters discussed above. It could be developed showing the problems, age of the patient and practice context on one dimension, and the clinical task to be tested on a second. Adding the dimension of competency level will help in identifying the appropriate test instrument (see Figure 2).
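The table of specifications described here can be represented as a simple data structure. The Python sketch below is a minimal illustration; the problems and instruments echo the worked example, but the specific cell assignments are hypothetical:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Cell:
    """One assessable cell of the blueprint grid."""
    system: str      # body system, e.g. "GI"
    problem: str     # presentation or pathology
    context: str     # "emergency" or "non-emergency"
    age: str         # "child", "adult" or "elderly"
    task: str        # clinical task, e.g. "diagnosis"
    level: str       # "knows", "knows how", "shows how" or "does"
    instrument: str  # test instrument, e.g. "A-type MCQ"

# Hypothetical cells for illustration; a real blueprint is filled in
# by the course team against the curriculum.
blueprint = [
    Cell("GI", "upper GI bleeding", "emergency", "adult",
         "diagnosis", "knows how", "A-type MCQ"),
    Cell("GI", "upper GI bleeding", "emergency", "adult",
         "investigation", "knows how", "key feature question"),
    Cell("CVS", "cardiac arrest", "emergency", "adult",
         "treatment", "shows how", "OSCE"),
]

# The two dimensions of the printed grid: problems down the side,
# clinical tasks across the top.
rows = sorted({(c.system, c.problem) for c in blueprint})
cols = sorted({c.task for c in blueprint})
```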
Once the grid has been developed, the cells marked with crosses represent the competency domain available for sampling in any given test. Depending on the practicability of delivering the assessments, a sample of the cells will drive the nature and construction of all the test items. For example, OSCE stations could be selected from a question bank, if available, or new test items developed to assess a particular cell of the matrix. A similar approach can be applied to develop blueprinting in relation to a course in basic medical science, where objectives, concepts and topics are identified and matched with expected student performance and the optimal assessment instrument.3
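Drawing the sample itself can also be sketched in code. In the hypothetical Python below, `balanced_sample` caps the number of cells taken per body system, guarding against the over-weighting of one system discussed earlier; the function name and the quota of one cell per system are illustrative choices, not an established procedure:

```python
import random
from collections import defaultdict

def balanced_sample(cells, per_system, seed=0):
    """Pick up to `per_system` cells from each body system, so that no
    single system dominates the paper (the under-representation threat)."""
    rng = random.Random(seed)  # fixed seed keeps the draw reproducible
    by_system = defaultdict(list)
    for cell in cells:
        by_system[cell["system"]].append(cell)
    chosen = []
    for _, group in sorted(by_system.items()):
        rng.shuffle(group)
        chosen.extend(group[:per_system])
    return chosen

# Each dict stands for one crossed cell of the grid (illustrative values).
cells = [
    {"system": "GI", "problem": "upper GI bleeding", "instrument": "A-type MCQ"},
    {"system": "GI", "problem": "rectal cancer", "instrument": "MEQ"},
    {"system": "CVS", "problem": "chest pain", "instrument": "OSCE"},
]
paper = balanced_sample(cells, per_system=1)  # one cell from each system
```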
In Figure 2, which represents an example of an assessment blueprint, upper GI bleeding will be presented as an emergency problem in an adult. The task to be tested could be diagnosis or investigation. The competency level would be ‘knows how’, and the best test instrument could be A-type MCQs or a key feature question. The multi-dimensional matrix of the blueprint, with its categories and sub-categories, is also a useful approach when organising and developing question banks.
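If bank items are tagged with the same blueprint categories, retrieval by cell becomes a simple filter. The sketch below assumes a hypothetical bank of tagged items; `items_for` and the tag names are illustrative, not an established format:

```python
def items_for(bank, **criteria):
    """Return bank items whose tags match every given criterion,
    e.g. items_for(bank, system="GI", task="diagnosis")."""
    return [item for item in bank
            if all(item.get(key) == value for key, value in criteria.items())]

# Hypothetical question bank tagged with blueprint categories.
bank = [
    {"id": "Q1", "system": "GI", "task": "diagnosis", "level": "knows how"},
    {"id": "Q2", "system": "GI", "task": "treatment", "level": "knows how"},
    {"id": "Q3", "system": "CVS", "task": "diagnosis", "level": "shows how"},
]

gi_diagnosis = items_for(bank, system="GI", task="diagnosis")  # matches Q1 only
```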
Assessment blueprinting is not a complicated procedure once the grid has been designed, but it is important to keep the matrix simple by avoiding the use of too many dimensions. This will keep it practical and user-friendly.
REFERENCES
1. Nunnally JC. Psychometric theory, 2nd edn. New York: McGraw-Hill, 1978.
2. Dauphinee D. Determining the content of certification examinations. In: Newble D, Jolly B, Wakeford R, eds. The certification and recertification of doctors: issues in the assessment of clinical competence. Cambridge: Cambridge University Press, 1994:92–104.
3. Bridge DP, Musial J, Frank R, Roe T. Measurement practices: methods for developing content-valid student examinations. Med Teach 2003;25:414–421.
4. Messick S. Validity. In: Linn RL, ed. Educational measurement, 3rd edn. New York: American Council on Education/Macmillan, 1989:13–104.
5. Downing SM, Haladyna TM. Validity threats: overcoming interference with proposed interpretations of assessment data. Med Educ 2004;38:327–333.
6. General Medical Council. Good medical practice, 2nd edn. London: GMC, 1998.
7. Medical School Objectives Project (MSOP), AAMC report: learning objectives for medical student education: guidelines for medical schools. Available at: http://www.aamc.org/meded/msop/msop.
8. Accreditation Council for Graduate Medical Education (ACGME) Outcome Project. Available at: http://www.acgme.org/Outcome.
9. Medical Council of Canada. Objectives for the qualifying examination. Available at: http://www.mcc.ca/Objectives_online.
10. Miller GE. The assessment of clinical skills/competence/performance. Acad Med 1990;65(Suppl.):S63–S67.
11. Crossley J, Humphris G, Jolly B. Assessing health professionals. Med Educ 2002;36:800–804.
12. Case S, Swanson D. An item writing manual. Available from: National Board of Medical Examiners, Philadelphia, USA (http://www.nbme.org).
13. Hamdy H, Prasad K, Williams R, Salih FA. Reliability and validity of the direct observation clinical encounter examination (DOCEE). Med Educ 2003;37:205–212.
14. Norcini JJ, Blank LL, Arnold GK, Kimball HR. The Mini-CEX (clinical evaluation exercise): a preliminary investigation. Ann Intern Med 1995;123:795–799.