
The Journal of Continuing Education in the Health Professions, Volume 11, pp. 85-86. Printed in the U.S.A. All rights reserved. Copyright © 1991 The Alliance for Continuing Education and the Society of Medical College Directors of Continuing Medical Education.

Guest Editorial

Meta-Analyses and Metaphors in CME Literature DAVE DAVIS, M.D., F.C.F.P. Chair, Continuing Education Professor, Family Medicine McMaster University Hamilton, Ontario

This issue of JCEHP contains several thoughtful and well-articulated examples of research in continuing education for health professionals and one example of a literature review in CME - a “meta-analysis” in this decade of rapid literature access, review and synthesis. This particular review, by McLaughlin and Donaldson, is the most recent of several such CME literature analyses, beginning with Bertram and Brooks-Bertram’s overview article in 1977.¹ Each of them is a building block in the theoretical and practical construct of medical practice education. They have added to, and, to the extent to which they are used or quoted, shaped the field. The purpose of this brief editorial is to review the definition and purposes of such meta-analyses; to discuss some of the relevant methodology; and, finally, to explore the outcomes and meaning of such reviews to the field of CME research.

First, definitions are in order. Meinert narrowly restricts the term meta-analysis to statistical analyses of two or more trials of the same treatment “for the purpose of drawing a global conclusion concerning the safety and efficacy of that treatment.”² Others would argue that, while this is a valid and useful notion in clinical trials, the realm of continuing education for health professionals deals not with one treatment, but many, and exists in a pluralistic, multi-impact milieu, demanding both qualitative and quantitative measures. Further, the purposes of the CME researcher are richer than those of a biomedical colleague. A rough list of the objectives of such reviews would include: determination of the “state of the art” of CME research; identifying strengths, goals and future directions; explanation of the boundaries and content of the field; improvement in the quality of research itself, particularly in the design and reporting of studies; and, finally, the development, testing and modification of theories of medical or health care practice education.
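Meinert's narrow sense of the term can be made concrete. As a purely hypothetical illustration (the figures below are invented, not drawn from any review discussed here), the statistical core of his definition - combining two or more trials of the same treatment into one global estimate - is often an inverse-variance weighted average, which might be sketched as:

```python
import math

def pooled_effect(effects, variances):
    """Fixed-effect, inverse-variance pooling of per-study effect
    estimates; returns the pooled estimate and a 95% confidence
    interval. Studies with smaller variance receive more weight."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))  # standard error of the pooled estimate
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Invented effect estimates and variances from three trials of one treatment
effects = [0.30, 0.45, 0.25]
variances = [0.04, 0.09, 0.06]
est, (lo, hi) = pooled_effect(effects, variances)
```

The global conclusion Meinert describes is then a statement about `est` and whether its interval excludes zero; the CME researcher's objection, as above, is that education rarely offers a single "treatment" to pool in this way.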

Secondly, the methodologies useful in meta-analysis augment and direct the field. A few examples follow. Earlier review articles, grappling with the question of outcome measures, adopted Dixon's³ highly practical construct of evaluation criteria. McLaughlin and Donaldson refine this schema to first-order outcomes - more directly the expected product of continuing education interventions - and those that are second-order outcomes, i.e., those affected by variables such as practice site, patients or colleagues in the delivery of health care. While still in need of further definition, this distinction highlights the directives of the qualitative researcher, as evidenced best by the recent Society of Medical College Directors' Change Study.⁴ Even more in need of definition than outcome measures, the notion of 'interventions' (the biomedical researcher's 'treatment') has been refined somewhat by McLaughlin and Donaldson to include a definition of, and distinction between, the terms 'methods' and 'instructional technique'. In another example, Beaudry⁵ provided us with a useful comparative device - the effect size - adapted from biostatistical literature.
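Beaudry's comparative device can be illustrated with a small sketch. The scores below are invented for illustration; the standardized mean difference (one common form of effect size) expresses the gap between an intervention group and a control group in units of their pooled standard deviation, allowing studies with different instruments to be compared:

```python
import math

def standardized_mean_difference(treatment, control):
    """Effect size as (mean_t - mean_c) / pooled standard deviation,
    using the sample (n - 1) variance for each group."""
    n_t, n_c = len(treatment), len(control)
    mean_t = sum(treatment) / n_t
    mean_c = sum(control) / n_c
    var_t = sum((x - mean_t) ** 2 for x in treatment) / (n_t - 1)
    var_c = sum((x - mean_c) ** 2 for x in control) / (n_c - 1)
    pooled_sd = math.sqrt(((n_t - 1) * var_t + (n_c - 1) * var_c)
                          / (n_t + n_c - 2))
    return (mean_t - mean_c) / pooled_sd

# Invented post-test scores from a hypothetical CME intervention study
cme_group = [78, 85, 82, 90, 88]
control_group = [70, 75, 72, 80, 74]
d = standardized_mean_difference(cme_group, control_group)
```

A positive value favours the intervention; it is this dimensionless quantity, rather than raw test scores, that a reviewer can average across heterogeneous studies.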

Thirdly, there are the findings and meanings of meta-analyses. A regu- lar reader of this journal needs no reminder of the former: that CME works particularly well in improving physician competence, less well in changing performance and, with weak penetration, in changing patient outcomes. We know, for example, that practice-based interventions, like performance feedback, work better than didactic measures not based on learning needs. What may not be so clear, to those of us dealing with the trees of CME research, is the shape, or even the existence of the forest: that meta-analyses are done at all in CME research is a statement of the worth and the maturity of the field.

It is clear that the definitions of interventions or outcomes need to be further refined. It is clear that our methodologies need further testing. It is clear that a prospective plan, an agenda for research, needs to be developed. Nevertheless, the meaning of the meta-analytical statement is that CME research - a melding of research in health services, educational psychology, adult and continuing professional education, and related fields - is here to stay.

References

1. Bertram DA, Brooks-Bertram PA. The evaluation of continuing medical education: A literature review. Health Educ Monog, 1977;5:330-362.

2. Meinert CL. Meta-analysis: Science or religion? Controlled Clinical Trials, 1989;10:257-263.

3. Dixon J. Evaluation criteria in studies of continuing education in the health professions: A critical review and a suggested strategy. Eval Health Prof, 1978;1:47-65.

4. Fox RD, Mazmanian PE, Putnam RW, eds. Change and learning in the lives of physicians. New York: Praeger, 1989.

5. Beaudry J. The effectiveness of continuing medical education: A quantitative synthesis. J Cont Educ Health Prof, 1989;9:285-307.
