
The Journal of Continuing Education in the Health Professions, Volume 16, pp. 159-166. Printed in the U.S.A. Copyright © 1996 The Alliance for Continuing Medical Education, the Society of Medical College Directors of Continuing Medical Education, and the Council on CME, Association for Hospital Medical Education. All rights reserved.

Original Article

Implications for Undergraduate and Graduate Education Derived from Quantitative Research in Continuing Medical Education: Lessons Learned from an Automobile

DAVE DAVIS, MD, FCFP
Continuing Education, Faculty of Medicine, University of Toronto, Toronto, ON, and Chedoke-McMaster Hospitals, Hamilton, ON

MARY ANN THOMSON, BHSc (PT), MSc
School of Rehabilitation Science, Faculty of Health Sciences, McMaster University, Hamilton, ON

Abstract: The purpose of this paper is to highlight key factors influencing the effectiveness of continuing medical education (CME) and to explore implications for undergraduate and graduate medical education. We briefly present the results of a recent systematic review of CME on physician performance and on health care outcomes and explore the earlier phases of medical education from three perspectives: needs assessment, type of educational intervention (format), and evaluation. First, it appears that needs assessment may play an increasingly important role in graduate and undergraduate education, especially given the diversity of students and residents in their prior training and experience. Second, educational interventions including practice-based systems such as reminders are more effective than more traditional interventions such as conferences (didactic presentations) and printed materials, at least in the CME setting. Lectures and printed materials may be an efficient way of delivering content, but alone may be insufficient to change practice even in undergraduate education. Third, the issue of evaluation, from both the micro (student) and macro (program) levels, has implications for undergraduate and residency education. In CME, the issue of competency assessment has begun to be addressed in a rigorous and planned manner, derived in part from the area of remedial education. The paper concludes with a call for a similar attempt to answer the basic question “Do undergraduate and graduate education really work?”

Key Words: Automobile, continuing medical education (CME), graduate education, quantitative research, undergraduate education

This paper attempts to derive lessons for the education of undergraduate medical students and residents gained from research in continuing medical education (CME): the last, longest, and arguably most complex phase of medical education. The paper’s subtitle, “Lessons Learned from an Automobile,” relates to the mid-1980s Honda dashboard set-up, with its two prominently displayed air circulation buttons.* The first of these buttons, representing the vent for fresh air as a straight arrow, may serve as a reminder of the traditional way in which we view the medical education continuum: 3 or 4 years in undergraduate programs, several years in graduate education, and then a lifetime of CME activity. The second button, representing recycled air (an arrow turned backwards on itself), may represent, for the purposes of this paper, the lessons that research in CME has for earlier educational phases.

*The analogy of the Honda Accord in medical education was first used at a meeting of the Association of Canadian Medical Colleges, Standing Committee on CME by one of us (Dave Davis, MD) in October, 1988.

Reprint requests: Dave Davis, MD, FCFP, Continuing Education, 150 College Street, Suite 121, Toronto, ON M5S 3E2.

The paper deals with the subject of lessons learned from CME in several areas: first, the characteristics of the literature that we use to define the impact of CME; second, the key lessons that such literature has for medical educators; and, third, the implications for undergraduate education (UGE) and graduate (residency) education (GE) programs and curricula. Finally, the article will conclude with some caveats and cautions about future research and development priorities.

Results of CME Research: Impact Studies

The continuing medical education (CME) literature exists in several forms and is derived from a wide variety of sources, including traditional biomedical and adult education, continuing professional education, and educational and social psychology. Through a process of searching in and retrieval from related commercial databases, much of this literature is indexed in the Research and Development Resource Base in CME (RDRB/CME).1 The RDRB/CME is a bibliographic database containing references to over 6000 articles relevant to continuing health professional education. Of these, approximately one third describe the impact of specific CME interventions, such as didactic presentations, interactive workshops, or newsletters.

To summarize this body of evidence in a way that addresses the question "what is the impact of CME?" is a major effort, demanding a process of study selection, assessment of study quality, data extraction, and analysis. There are several examples of such reviews of this body of research,2-9 the last of which9 summarizes the literature of 99 randomized controlled trials of educational interventions designed to improve physician performance or health care outcomes. Inasmuch as this review builds on those that come earlier, its findings are used as the substrate to develop implications for undergraduate medical and residency education programs. The 99 trials described 160 interventions, of which almost two thirds displayed an improvement in at least one major outcome measure. Seventy percent demonstrated a change in physician performance, while almost half (48%) of the interventions targeted at health care outcomes produced a positive change.

There were three major findings with significant curricular or programmatic implications. First, CME outcomes were greatly affected by the types and formats of interventions used, particularly when they represented a very broad definition of CME, and by the number of times they were applied. Second, CME outcomes appeared to be more effective when close attention was paid to needs assessment, a determination of the gap between ideal and real performance. Third, these educational interventions comprising CME, each with its own set of complexities and practice variability, were subjected, successfully, to evaluation.

Formats in CME

The review described and summarized successful and unsuccessful educational interventions, success being determined by the study's impact on changing physician performance and health care outcomes (Table 1). Here, effective change strategies included practice-linked interventions such as reminders,10 patient-mediated interventions,11 and outreach visits (such as those represented by academic detailers, who are trained to promote rational prescribing behaviors8), and (to a lesser extent) community-based maneuvers such as opinion leaders or educational influentials.12 Multifaceted activities, especially those using three or more interventions, were also effective.9 Audit and feedback as a maneuver was somewhat less effective in changing physician performance. However, when the feedback was personalized and given by an individual in some authority (e.g., a faculty-level supervisor),13 there was a positive effect in several studies. Finally, and most importantly relative to the impact of educational interventions, formal CME conferences or activities, without enabling or practice-reinforcing strategies, had relatively little impact.9


Table 1. Effective vs Ineffective Change Strategies

Ineffective:
  Educational materials, especially mailed, unsolicited print resources
  Formal CME, especially didactic courses

Moderately Effective:
  Audit/feedback, enhanced if feedback individualized
  Opinion leaders

Highly Effective:
  Academic detailing
  Practice-linked methods, such as reminders and patient education
  Multifaceted interventions

It should be noted that these studies, by their quantitative nature, did not analyze the perceptions of physicians regarding their learning and change: the studies thus provide a macro view of clinical practice, not a qualitative, adult learning perspective.

There appears to be at least one theoretical construct to explain those interventions that are effective in changing physician performance and differentiate them from those that are not. Successful interventions appear to have three characteristics, described in an earlier review of 50 randomized controlled trials,7 based on work by Green et al.14 They predispose the physicians to change (e.g., by disseminating information), they enable the change by facilitating it in the practice environment (e.g., by patient education), and, once made, they reinforce the change (e.g., by reminders). This paradigm, proposed by Green et al.14 as the PRECEDE model, is derived from the field of health promotion and is one with close parallels to CME and physician performance change.

Needs Assessment

A further feature of the 99-study review was the issue of needs determination. Here, the studies were reviewed to determine the degree to which the CME intervention was based on identified gaps in performance, targeted to those with suboptimal behaviors, and attempted to correct barriers to change in the practice setting. These studies were contrasted with those in which no needs determination process was identified in the study, or with those that were based on clinical guidelines developed by national or local groups. The former studies displayed a more effective pattern of change in physician performance or health care outcomes than the others, a phenomenon that held true even in those studies in which a local consensus process was used to implement the guidelines.

These studies and others15 call into question the abilities of physicians to truly judge their own competencies. Derived from the remedial literature, there are nonetheless ways in which such competency or gap assessments can be made, from the perspective of competence16 or from that of physician performance.17

Evaluation

The final feature of the review article derives not so much from the detail of its findings, but rather from its concept: though educational processes in CME are complex, subject to widely differing internal and external variables, their impacts on defined areas can be subjected to assessment and their outcomes measured.

As reflected in this review, evaluation in CME has adopted a broad and comprehensive profile of end points or outcomes as targets, ranging from simple, postcourse “happiness indexes” to health status. In contrast, undergraduate education (UGE) and graduate education (GE) still focus mostly on competency assessment as the major marker of educational success in these latter two program areas. Only recently in UGE and GE has attention been turned to more performance- or health care outcome-based end points for these programs.

Implications for Undergraduate and Graduate Education

Adapting some of this information to the residency and undergraduate setting generates some thoughts and suggestions in the areas of format, needs assessment, and evaluation of educational impact.

Formats in Undergraduate and Residency Education

The first cluster of recommendations arises from the question of the formats best used for teaching and learning in UGE and GE. The review clearly indicates the ineffectiveness of the traditional didactic lecture in changing physician performance, arguably the most important of the findings in this area. It seems clear that we need to use a wider variety of learning resources in these earlier settings, particularly those that the physician himself or herself will use in practice. Such things as distance education technology or computers make good, empirical sense: if one uses them in undergraduate and residency settings, one may use them, probably with more facility, in CME. There is already evidence that physicians exposed to self-directed learning, often coordinated through small-group, problem-based strategies, continue their practice of self-directed learning, keeping up to date, later in practice.18

Further, adapting some of the more successful CME interventions lends itself to practice-based methods in the settings of clerkship and residency education. Audit and feedback, especially when performed by a respected peer or faculty-based supervisor, appears to be an effective means by which to alter performance, as do computer-generated reminders in the area of disease prevention and health promotion.10 Undergraduate and graduate planning might do well to choose the community-based sites, the future location of undergraduate and residency education, in which these capabilities exist.

Finally, the impact of two or three effective interventions in the same subject area leads naturally to a discussion of undergraduate curricular maps, on which are plotted themes and objectives, presented in a variety of ways, and reinforcing or building on knowledge. Such models would build on those of Green et al.14 and others who suggest successful change strategies that incorporate predisposing, enabling, and reinforcing attributes. Residency curricula, more dependent on the haphazard confluence of patient visits, lend themselves more to the model of academic half-days and the development of private learning logs that can track and document the learning in subject or topic areas of relevance to the developing clinician.

Competency and Needs Assessment in the Curriculum

The second cluster of recommendations relates to the subject of learner-driven needs assessment, until now the domain of CME, and clearly not the focus of earlier undergraduate or residency curricula, in which planning is the sole responsibility of the program director or undergraduate dean. However, the era has passed in which all medical students or residents come from similar backgrounds and may be considered empty slates on which a medical curriculum may be written. Residents, too, display widely varying areas of competence and need. Since there is evidence that educational changes occur in the face of a definable gap, the consideration of flexible, modular curricula, marked by competency testing and exemptions, may permit undergraduates and residents to spend time in areas of actual learning need, rather than previously determined curricular "want."

The process and methods of needs assessment, from the CME literature perspective, appear to be at least bifold, comprising an equal mixture of subjective learner "wants" combined with faculty input, the latter hopefully based on objective data. The notion of determining unperceived needs exists in the face of rather hard evidence, derived from at least one of the randomized controlled trials (RCTs),15 that practicing physicians and, presumably, their earlier counterparts may not always know what they need to know. Major tools for the assessment of need have also come from CME, in this case, from the problem of the “discompetent physician,” a topic of increasing focus in CME.16 Here, it has been demonstrated that a wide variety of tools must be used to triangulate learning gaps and deficiencies. These include multiple choice questions, simulated patient visits, chart-stimulated recalls, problem-based orals, and others. Each test must pass its own test, namely, that of demonstrated reliability, validity, credibility, comprehensiveness, and feasibility. Further, the competency assessment field has seen the emergence of a trained bank of assessors to help evaluate the competence of their peers, at some time, effort, and cost. This sort of effort might well be emulated in faculty development.

Finally, work in the area of discompetence has developed a picture of the competence of the practicing physician, until now not well articulated for audiences composed of undergraduate planners and teachers. This vision of the competent generalist-graduate may be the most useful tool since the emergence of the General Professional Education of the Physician (GPEP) report over a decade ago.19

The Issue of Program Evaluation

Third, the issue of program evaluation and outcomes assessment in CME has lessons for earlier components of the continuum. Although these earlier stages are marked by significant attempts at student/resident competency assessment, little in the way of program or outcomes assessment is attempted. Further, at the other end of the evaluation spectrum, there appears to be better attention paid on the part of UGE planners to the perceptions of the learner. One may respond by pointing to the broad CME experience in evaluations. If the short, interrupted, and disconnected activities of CME can be evaluated programmatically and assessed in the real-world setting (with all of the contaminating and cointervening factors of practice life), then so can undergraduate and graduate education. While we may do this in individual schools and individual programs, the process is sporadic and permits the observer to perceive no coherent plan at work. The questions “does UGE work?” or “does residency work?” appear not to be raised in the literature, in parallel to the similar question in CME posed by researchers in the 1980s.5 Further, with few exceptions,20,21 there exist few consistent reviews of the literature of undergraduate and residency education. Finally, this sort of evaluation can give rise to more comprehensive and longitudinal views of the effect of medical education: there is no reason why we cannot begin to look for the influence of undergraduate learning and residency training on the practice patterns of physicians.

The phrase “evaluate CME” has also caused a rethinking of what we mean by CME; it is no longer “CME as the short course.” In the same way, it may be possible to rethink the earlier levels of education, creating a new mission, new objectives, and new curricula. In such a vision of education, the objectives, evaluation criteria, and tools are the important conceptual and actual keystones of the curriculum, not the long list of lectures that currently forms the work of most undergraduate learners or the rotations of the residency supervisor.

Summary: Cautious Advice to the Planners of Undergraduate and Graduate Education - Driving the Honda Accord

This brief paper is derived from a review of studies of the effect of individual CME interventions on physician performance and health care outcomes. As such, it suffers from several methodological difficulties (e.g., lack of detail about the specific educational intervention) that make interpretation difficult and translation into the earlier phases of the continuum problematic.


Table 2. Lessons from CME: Recommendations to Undergraduate and Residency Educational Planning

1. Needs assessment/flexible curricula based on demonstration of mastery
2. Student and resident assessment based on:
   a) reliable tools
   b) knowledge plus skills plus attitudes
   c) the efforts of trained (cross-continuum) assessors
   d) a clear, articulated picture of competence
3. The use of learning resources and formats that:
   a) resemble those of use to practicing physicians
   b) display proven effectiveness
4. Program evaluation of specific undergraduate and graduate interventions in a comprehensive, planned, and proactive manner
5. Explore/refine factors contributing to discompetence:
   a) in the undergraduate setting
   b) at admissions
6. Undergraduate and graduate curriculum, resources, and experience based on an understanding of physician change model(s)

In most instances, despite these difficulties in interpretation, cautious suggestions may be made to undergraduate and graduate planners.

First, there may be some publication bias at work here, in which only positive studies are published. Further, this review may have ignored important studies that did not fit the selected criteria but that could have elucidated areas of question or uncertainty.

Second, and most importantly, these studies are RCTs, macro views of the actions of physicians, not focused on learners’ motivations, patient problems, and learning patterns. Fox and his colleagues demonstrated that true learning for physicians occurs in an interactive and iterative mode, as exemplified in the Change Study for the Society of Medical College Directors of CME.22 The study was conceived to help understand how and why physicians make changes in their practices and the role of learning or education (CME) in the process of change. It employed approximately 400 interviews and confirmed that change was ubiquitous; that it was promoted by many factors in the physician’s environment, including those derived from patient and peer contact, from desires for competency, and from personal experience; and that many learning resources could be used in facilitating the change process. This view inside the learning process is much different from that provided by the macro, quantitative, and external perspective of the RCT review.

Third, the studies of lectures or didactic teaching sessions do not include students, and in only some instances do they include residents. Thus, the findings regarding the weaknesses of printed materials or didactic conferences must be modified when applied to the undergraduate setting. Certainly, from the faculty perspective, lectures are an efficient means of transmitting information. However, even if they are an effective tool for the transmission of practice-translatable knowledge, questions may be raised about the longer-term consequences of too much dependence on the lecture as a curricular vehicle. Students may become accustomed to this passive mode of learning and unskilled in the use of other learning resources. Thus, the traditional lecture may reinforce a passive-learner role, not conducive to effective, practice-based CME skills.

Fourth, implications for undergraduate and graduate curricular reform also include an awareness of the practice environment as a force for change and the teaching of self-directed learning and critical appraisal skills so that knowledge derived from research can be adopted by practitioners. In fact, the plethora of learning resources and aids, from the conference and printed material to the academic detailer, appears likely to grow, given the current state of CME and health care practice. Given this growth, the development of self-directed learning skills in undergraduate and graduate curricula and clinical experiences appears mandatory, not optional: the question is not when but how much and which technologies to introduce. Further, incorporation of discussion of the learning environment (e.g., managed care) in undergraduate programs and residency is a necessary and practical step, made easier in new community/ambulatory care settings.

In conclusion, this particular review of the literature of CME, though methodologically flawed, has much to offer the planners and learners of the earlier phases of medical education (Table 2). In many ways, this flow of information may be visualized as the Honda Accord “recycled air” figure alluded to in the opening paragraphs of this article. Less conceptually, this process may also be viewed as a dialogue between all planners and learners from medical school admission to retirement from practice. This dialogue possesses clear and important implications for the participants in North America’s health care system: the committed educator, the competent clinician, and the patient.

References

1. Office of Continuing Education. Research and development resource base in continuing medical education: annual report of the RDRB/CME, 1994/95. Toronto: Faculty of Medicine, University of Toronto, 1995.

2. Lloyd JS, Abrahamson S. Effectiveness of continuing medical education: a review of the evidence. Eval Health Prof 1979; 2:251-280.

3. Bertram DA, Brooks-Bertram PA. The evaluation of continuing medical education: a literature review. Health Educ Monogr 1977; 5:330-362.

4. Beaudry JS. The effectiveness of continuing medical education: a quantitative synthesis. J Cont Educ Health Prof 1989; 9:285-307.

5. Haynes RB, Davis DA, McKibbon A, Tugwell P. A critical appraisal of the efficacy of continuing medical education. JAMA 1984; 251:61-64.

6. Oxman AD, Thomson MA, Haynes RB, Davis DA. No magic bullets: a systematic review of 102 trials of interventions to improve professional practice. Can Med Assoc J 1995; 153:1423-1431.

7. Davis DA, Thomson MA, Oxman AD, Haynes RB. Evidence for the effectiveness of CME: a review of 50 randomized controlled trials. JAMA 1992; 268:1111-1117.

8. Avorn J, Soumerai SB. Improving drug-therapy decisions through educational outreach: a randomized controlled trial of academically based “detailing.” N Engl J Med 1983; 308:1457-1463.

9. Davis DA, Thomson MA, Oxman AD, Haynes RB. Changing physician performance: a systematic review of the effect of continuing medical education strategies. JAMA 1995; 274:700-705.

10. Tierney WM, Hui SL, McDonald CJ. Delayed feedback of physician performance versus immediate reminders to perform preventive care. Med Care 1986; 24:659-666.

11. Vinicor F, Cohen SJ, Mazzuca SA, et al. Diabeds: a randomized trial of the effects of physician and/or patient education on diabetes patient outcomes. J Chronic Dis 1987; 40:345-356.

12. Lomas J, Enkin M, Anderson GM, Hannah WJ, Vayda E, Singer J. Opinion leaders vs audit and feedback to implement practice guidelines. Delivery after previous cesarean section. JAMA 1991; 265:2202-2207.

13. Pinkerton RE. Resident physician performance in a continuing education format: does newly acquired information improve patient care? JAMA 1980; 244:2183-2185.


14. Green L, Kreuter M, Deeds S, Partridge K. Health education planning: a diagnostic approach. Palo Alto, CA: Mayfield Press, 1980.

15. Sibley JC, Sackett DL, Neufeld V, Gerrard B, Rudnick KV, Fraser W. A randomized trial of continuing medical education. N Engl J Med 1982; 306:511-515.

16. Norman GR, Davis DA, Painvin A, Lindsay E, Rath D, Rapbeer M. Comprehensive assessment of clinical competence of family/general physicians using multiple measures. RIME Proc 1989; 28:75-80.

17. Davis DA, Lindsay E, Mazmanian PE. The effectiveness of CME interventions. In: Davis DA, Fox RD, eds. The physician as learner. Chicago: American Medical Association, 1994.

18. Shin JH, Haynes RB, Johnston ME. Effect of problem-based, self-directed undergraduate education on life-long learning. Can Med Assoc J 1993; 148:969-976.

19. Association of American Medical Colleges. The general professional education of the physician. Washington, DC: AAMC, 1985.

20. Albanese MA, Mitchell S. Problem-based learning: a review of literature on its outcomes and implementation issues. Acad Med 1993; 68:52-81.

21. Norman GR, Schmidt HG. The psychological basis of problem-based learning: a review of the evidence. Acad Med 1992; 67:557-565.

22. Fox RD, Mazmanian PE, Putnam RW. Changing and learning in the lives of physicians. New York: Praeger, 1989.

23. Premi JN. Individualized continuing medical education. In: Davis DA, Fox RD, eds. The physician as learner. Chicago: American Medical Association, 1994:203-216.
