


Original Research

Impact on Clinical Behavior of Face-to-Face Continuing Medical Education Blended with Online Spaced Education: A Randomized Controlled Trial

TIMOTHY SHAW, PHD; ANDREA LONG, EDM; SANJIV CHOPRA, MD; B. PRICE KERFOOT, MD, EDM

Background: Spaced education (SE) is a novel, evidence-based form of online learning. We investigated whether an SE program following a face-to-face continuing medical education (CME) course could enhance the course’s impact on providers’ clinical behaviors.

Methods: This randomized controlled trial was conducted from March 2009 to April 2010, immediately following the Current Clinical Issues in Primary Care (Pri-Med) CME conference in Houston, Texas. Enrolled providers were randomized to receive the SE program immediately after the live CME event or 18 weeks later (wait-list controls). The SE program consisted of 40 validated questions and explanations covering 4 clinical topics. The repetition intervals were adapted to each provider based on his or her performance (8- and 16-day intervals for incorrect and correct answers, respectively). Questions were retired when answered correctly twice in a row. At week 18, a behavior change survey instrument was administered simultaneously to providers in both cohorts.

Results: Seventy-four percent of participants (181/246) completed the SE program. Of these, 97% (176/181) submitted the behavior change survey. Across all 4 clinical topics, providers who received SE reported significantly greater change in their global clinical behaviors as a result of the CME program (p-values .013 to <.001; effect sizes 0.4 to 0.9). Ninety-eight percent (175/179) requested to participate in future SE supplements to live CME courses. Eighty-seven percent (156/179) agreed or strongly agreed that the SE program enhanced the impact of the live CME conference.

Discussion: Online spaced education following a live CME course can significantly increase the impact of a face-to-face course on providers’ self-reported global clinical behaviors.

Key Words: education, medical, continuing, technology

Disclosures: Dr. Kerfoot receives career development support from the United States Agency for Healthcare Research and Quality, the American Urological Association Foundation (Linthicum, MD), and Astellas Pharma US, Inc. Dr. Kerfoot owns equity in and is a board member of Spaced Education Inc. The views expressed in this article are those of the author and do not necessarily reflect the position and policy of the United States Federal Government, Harvard Medical School, or the Department of Veterans Affairs. No official endorsement should be inferred.

Dr. Shaw: Director, Workforce Education and Development Group, University of Sydney, Australia; Ms. Long: Manager of Distance Learning and CME Online, Department of Continuing Medical Education, Harvard Medical School; Dr. Chopra: Faculty Dean for Continuing Education, Harvard Medical School; Dr. Kerfoot: Staff surgeon, Veterans Affairs Boston Healthcare System, and Associate Professor, Harvard Medical School.

Correspondence: Timothy Shaw, Workforce Education and Development Group (A01), The University of Sydney, Sydney, NSW, 2006 Australia; e-mail: [email protected].

© 2011 The Alliance for Continuing Medical Education, the Society for Academic Continuing Medical Education, and the Council on Continuing Medical Education, Association for Hospital Medical Education. Published online in Wiley Online Library (wileyonlinelibrary.com). DOI: 10.1002/chp.20113

Traditional types of continuing medical education (CME), such as live conferences or grand rounds, often have only a modest impact on clinicians’ knowledge retention and clinical behavior.1,2 Recent studies indicate that blended programs combining face-to-face activities with online learning activities can be more effective than either intervention alone.3 This raises the possibility that the impact of traditional face-to-face CME activities can be improved through the addition of adjunct online programs.

Spaced education (SE) is a novel, evidence-based form of online education that has been demonstrated in randomized trials to improve knowledge acquisition,4 boost retention,5,6 and change behavior.7,8 SE involves participants’ receiving short multiple-choice questions and feedback via e-mail in a repeating pattern over a number of weeks. The methodology is based on 2 core psychological research findings: the spacing and testing effects. The spacing effect refers to the finding that educational encounters that are repeated over time increase the acquisition and retention of knowledge.9 The testing effect refers to the finding that the

JOURNAL OF CONTINUING EDUCATION IN THE HEALTH PROFESSIONS, 31(2):103–108, 2011


Shaw et al.

process of testing does not merely measure knowledge, but actually alters the learning process itself to significantly improve knowledge retention.10,11

Each spaced education item consists of an evaluative component (a clinically relevant multiple-choice question) and an educational component (the correct answer and a detailed explanation of the answer). Participants submit an answer, receive immediate feedback, and compare their performance with peers. To harness the educational benefits of the spacing effect, the spaced education item is then repeated over intervals of time ranging from 1–12 weeks. An adaptive algorithm tailors the length of the spacing intervals and number of repetitions of the content for each learner based on his or her performance. If a participant answers a question incorrectly, it repeats 8 days later; if the participant answers correctly, it repeats 16 days later. Each question must be answered correctly twice to be retired. In a randomized trial, this adaptive algorithm was found to increase learning efficiency by more than 35%.12

A recent pilot study combining a face-to-face program with an adjunct spaced education course indicated that such a combination enhanced learning and was well-accepted by participating clinicians.13 This current study looked to expand on that pilot to determine if supplementing a face-to-face CME event with a spaced education follow-up program can impact on participants’ self-reported clinical practice.

Educational Intervention and Methods

Study Participants

Health care providers attending the Current Clinical Issues in Primary Care (Pri-Med) CME conference in Houston, Texas, in March 2009 were invited via e-mail to enroll in the study. There were no exclusion criteria. Institutional review board approval was obtained for the study from Harvard Medical School.

Structure of the Live CME Event

Pri-Med, the live CME event, was held from March 19–21, 2009, and was attended by approximately 3000 clinicians from multiple institutions. The 3-day conference covered topics relevant to day-to-day practice, such as primary care, treating older patients, women’s health, pediatrics, and diabetes management. The program comprised 15 sessions, offering a total of 50 lectures and 3 keynote addresses. The program was developed and presented by faculty at Harvard Medical School and Baylor College of Medicine.

Development of the Spaced Education Items and Surveys

The SE items were derived from online modules that had been developed and validated at the Harvard Medical School Department of Continuing Education. Each SE item contained an evaluative component (a multiple-choice question) and an educational component (the correct answer and explanations of the correct and incorrect answers). The topics covered the

diagnosis and management of hypertension (6 questions), human immunodeficiency virus infection (12 questions), nonalcoholic steatohepatitis or NASH (12 questions), and diabetes (10 questions). The conference presentations were reviewed to ensure that the content of the spaced education program was covered in the face-to-face sessions.

Behavior change among participants was measured via 5-point Likert-type questions. Specifically, we assessed the degree to which the CME program changed participants’ global clinical practice patterns (ie, “Please indicate the degree to which the PriMed-Harvard CME Program has changed your clinical practice patterns for the following clinical conditions: diabetes, hypertension, HIV, and NASH”) and their global confidence in managing patients with these four clinical conditions (ie, “Please rate the degree to which the PriMed-Harvard CME Program has changed your confidence in managing patients with the following clinical conditions: diabetes, hypertension, HIV, and NASH”). In an exploratory analysis to assess the impact of SE on targeted clinical behaviors, we also asked enrollees to rate on a 7-point Likert-type scale whether 4 specific patient-management behaviors changed as a result of the live CME event: use of beta-blockers in patients with hypertension, use of diuretics in patients with hypertension, treatment of cardiovascular risk factors in patients with diabetes, and management of a 63-year-old male with diabetes who had not reached his therapeutic goal despite taking 2 non-insulin medications. These questions were selected by the authors as representing key take-home messages from the program.

Participants were also asked, using a 7-point Likert-type scale, the degree to which the online program enhanced the impact of the live CME course and whether or not they would wish to participate in further SE courses should they be offered as supplements to face-to-face programs in the future.

Study Design

This randomized controlled trial was conducted from March 2009 to April 2010. Enrollees were stratified by medical degree (MD versus non-MD) and block randomized (block size = 8) to one of two cohorts. Participants were stratified prior to block randomization to ensure equal distribution of participants with medical degrees in both of the randomized groups. Block randomization was used to ensure a relatively equal distribution of participants between cohorts. A block size of 8 was used to ensure that the numbers in each cohort would be equal, plus or minus 4. Providers in cohort 1 received the SE program immediately following the live CME event, while providers in cohort 2 (wait-list controls) started the SE program 18 weeks later. A wait-list control methodology was used so that the effect of SE in one cohort could be compared with a control cohort that did not receive the intervention at the same time. Ultimately, however, both cohorts would receive the benefit of the intervention. The education material delivered via spaced education was



Spaced Education Enhances Live CME Program

Mr. U. is a 34-year-old man who presents with a three-day history of fever, sore throat, and skin rash.

Current History

• His symptoms developed acutely and have been moderately severe.

• He is fatigued and has a mild headache with the fever but denies other associated constitutional or HEENT complaints.

• He has been taking acetaminophen around-the-clock with some relief.

• He denies recent exposure to anyone with a similar illness.

Past History

• The patient has been in generally good health. He denies chronic medical problems or prior hospitalizations, surgeries, or significant injuries.

• He has a remote history of chlamydia infection and viral hepatitis but does not know the type.

• He reports having a negative HIV antibody test two years prior at an anonymous test site.

• He is a gay man with multiple partners, some of whom are not well known to him.

• He uses condoms “fairly regularly.”

• His last sexual encounter, which involved receptive anal intercourse, was about two weeks ago.

• He drinks alcohol occasionally but denies other drug use.

Medications – none
Allergies – none

Physical Examination

Vitals: T 103.4 F, HR 104 and regular, BP 128/82, RR 14.

General: Acutely ill-appearing man in moderate distress.

HEENT: Moderate tender cervical adenopathy. No sinus tenderness. Tympanic membranes normal. Exudative pharyngitis with ulcer noted on left tonsil.

Chest: Clear to percussion and auscultation.

Cardiovascular: Without gallops or murmurs.

Abdomen: Soft, nontender. Without mass or organomegaly.

Extremities: Without edema. No evidence of arthritis or bursitis.

Neurologic: Grossly nonfocal.

Skin: Diffuse erythematous macular eruption, which is greatest on torso.

Laboratory Results

Lab Test     Value                            Normal Range   Units
WBC          3.1 (20% atypical lymphocytes)   4–10           K/mL
Hct          40.2                             40–54 (men)    %
Plt          125                              150–450        K/mL
BUN          20                               9–25           mg/dL
Cr           0.8                              0.8–1.3        mg/dL
ALT          54                               7–52           U/L
AST          58                               9–30           U/L
Alk Phos     100                              36–118         U/L
Total Bili   1.2                              0.2–1.2        mg/dL

The most likely diagnosis in this patient is:

• Secondary syphilis
• Primary HIV infection
• Established HIV infection
• Hepatitis A

Feedback
Primary (Acute) HIV infection, also known as “acute HIV infection” or “seroconversion syndrome,” presents two to three weeks after exposure as an undifferentiated viral illness. The most common clinical manifestations include fever, generalized adenopathy, pharyngitis, and rash. Laboratory abnormalities may include leukopenia and increased liver function tests. Primary HIV infection should be included in the differential diagnosis of any acute viral syndrome in a patient who is behaviorally at risk. It should also be considered if an illness is prolonged or atypical or if an illness presents as a mononucleosis-like syndrome (fever, exudative pharyngitis, cervical adenopathy, and atypical lymphocytosis) with a negative heterophile test.

EXHIBIT 1. Example HIV Spaced Education Question

identical for providers in each cohort. At week 18, the behavior change survey was administered simultaneously to providers in both cohorts (see FIGURE 1).
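The stratified block randomization described under Study Design (stratification by degree, block size 8, two cohorts) can be sketched as follows. This is an illustrative Python sketch, not the software used in the trial; the function name, the participant representation, and the seeding are assumptions introduced here.

```python
import random

def stratified_block_randomize(participants, block_size=8, seed=0):
    """Assign participants to cohort 1 or 2 with stratified block
    randomization: split by stratum (here, MD vs non-MD), then fill
    shuffled blocks containing equal cohort-1 and cohort-2 slots.
    `participants` is a list of (id, is_md) tuples."""
    rng = random.Random(seed)
    assignment = {}
    strata = {True: [], False: []}          # MD vs non-MD strata
    for pid, is_md in participants:
        strata[is_md].append(pid)
    for stratum in strata.values():
        for start in range(0, len(stratum), block_size):
            block = stratum[start:start + block_size]
            # Each full block holds block_size/2 slots per cohort; a
            # partial final block is what allows the small imbalance
            # ("equal, plus or minus 4") noted in the text.
            slots = [1] * (block_size // 2) + [2] * (block_size // 2)
            rng.shuffle(slots)
            for pid, cohort in zip(block, slots):
                assignment[pid] = cohort
    return assignment
```

Within each stratum the imbalance between cohorts can never exceed half a block, which is the rationale for the "plus or minus 4" bound stated in the text for a block size of 8.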

The 40-question adaptive SE course was structured so that providers were sent 2 items (questions/explanations) every other day. Questions relating to each content area were sent consecutively in the order of hypertension, human immunodeficiency virus infection, non-alcoholic steatohepatitis, and diabetes (an example question is included in EXHIBIT 1).

FIGURE 1. Structure of the Randomized Controlled Trial




If an item was answered incorrectly, it was repeated 8 days later. If an item was not answered within 8 days of its arrival, it was marked as answered incorrectly and was cycled back to the provider again. If an item was answered correctly, it was repeated 16 days later. If an item was answered correctly twice in a row, it was retired and no longer repeated. On any given day, new items would only be sent to a provider if no repetitions of prior items were scheduled for that day. The length of the adaptive SE program thus varied based on each provider’s performance in the SE program. At the end of the SE program, providers were asked to complete a short online end-of-program evaluation and, upon doing so, received CME credit for the SE program.
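The repetition rules above can be expressed as a small scheduler. The sketch below is a simplification under stated assumptions (every item is treated as answered on the day it arrives, so the unanswered-after-8-days rule never fires, and answers are uniformly correct or incorrect), but it implements the 8- and 16-day intervals, retirement after two consecutive correct answers, and the rule that new items go out only on days with no repetitions due.

```python
from collections import deque

def simulate_spaced_course(n_items=40, always_correct=True, max_days=400):
    """Simulate the adaptive SE schedule: two new items every other
    day; correct answers repeat 16 days later, incorrect ones 8 days
    later; an item retires after two consecutive correct answers.
    Returns the day the last item is retired, or None if the course
    does not finish within max_days."""
    new_items = deque(range(n_items))
    due = {}       # item -> day of its next repetition
    streak = {}    # item -> current run of consecutive correct answers
    retired = set()
    for day in range(0, max_days, 2):      # items go out every other day
        todays = [item for item, d in due.items() if d <= day]
        if todays:
            # Repetitions take priority; no new items are sent today.
            for item in todays:
                streak[item] = streak[item] + 1 if always_correct else 0
                if streak[item] >= 2:
                    retired.add(item)      # two in a row: retire it
                    del due[item]
                else:
                    due[item] = day + (16 if always_correct else 8)
        else:
            # No repetitions due: send up to two new items.
            for _ in range(2):
                if new_items:
                    item = new_items.popleft()
                    streak[item] = 1 if always_correct else 0
                    due[item] = day + (16 if always_correct else 8)
        if len(retired) == n_items and not new_items:
            return day
    return None
```

Running the simulation with uniformly correct answers shows how course length grows with the item count, while a learner who keeps answering incorrectly never retires an item, consistent with the adaptive design described in the text.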

Outcome Measure and Statistical Analysis

The primary outcome measure was the difference between cohorts in self-reported behavior change from the live CME conference. Secondary outcome measures were self-reported confidence changes, the perceived effectiveness of the SE program, and providers’ interest in receiving future SE programs.

Analyses were conducted on the data of those providers who completed the SE program. Completion was defined as retiring 32 or more of the 40 SE items. Enrollees who answered one or more SE questions were defined as participants. Two-tailed t-tests were utilized to evaluate the statistical significance of differences in survey scores between cohorts. A P-value threshold of .05 was used for determining significance. Intervention effect sizes for learning were measured by means of Cohen’s d.14 Cohen’s d expresses the difference between the means in terms of standard deviation units, with 0.2 generally considered as a small effect, 0.5 as a moderate effect, and 0.8 as a large effect.15 Statistical analyses were performed with SPSS for Windows 18.0 (Chicago, IL).
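For reference, Cohen's d with a pooled standard deviation can be computed as below. The group sizes in the example are an assumption for illustration (the paper does not report per-item cell sizes), so the result only approximates the rounded values in TABLE 2.

```python
import math

def cohens_d(mean1, sd1, n1, mean2, sd2, n2):
    """Cohen's d: the difference in means divided by the pooled SD
    (0.2 small, 0.5 moderate, 0.8 large, per the text)."""
    pooled_var = ((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2) / (n1 + n2 - 2)
    return (mean1 - mean2) / math.sqrt(pooled_var)

# Diabetes row of TABLE 2 (means 3.7 vs 3.1; SDs 1.0 and 1.1), assuming
# two equal groups of 88 survey respondents (176 total, an assumption):
d_diabetes = cohens_d(3.7, 1.0, 88, 3.1, 1.1, 88)   # ~0.57
```

Applying the same function to the HIV row (3.1 vs 1.9; SDs 1.3 and 1.2) gives roughly 1.0, matching the table; small discrepancies from the published values presumably reflect the exact group sizes and unrounded means used in the original analysis.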

Results

Three hundred conference attendees enrolled in the trial and were randomized between cohorts. The characteristics of providers in each cohort were similar (TABLE 1). Two hundred forty-six enrollees (82%) participated in the SE program and, of these, 181 (74%) completed the SE program. SE completion rates were similar across cohorts (FIGURE 2).

Ninety-seven percent of SE completers (176/181) submitted the behavior change survey. These providers’ demographic characteristics were similar between cohorts (data not shown). Across all 4 clinical conditions, providers who received SE (cohort 1) reported significantly greater change in response to the questions relating to their global practice patterns as a result of the CME program, compared to wait-list control (cohort 2) providers (p-value range .013 to <.001; TABLE 2). Cohen effect sizes ranged from 0.4 to 0.9. Similarly, providers who received SE (cohort 1) reported significantly increased confidence resulting from the

TABLE 1. Characteristics of Randomized Enrollees

                           Cohort 1        Cohort 2
Enrollees randomized       151             149
Gender
  Male                     56 (37%)        50 (34%)
  Female                   95 (63%)        99 (66%)
Degree
  MD                       68 (45%)        67 (45%)
  DO                       6 (4%)          4 (3%)
  NP                       48 (32%)        48 (32%)
  RN                       3 (2%)          4 (3%)
  PA                       20 (13%)        20 (13%)
  Other                    6 (4%)          6 (4%)
Specialty
  Family/general practice  84 (56%)        79 (53%)
  Internal medicine        21 (14%)        23 (15%)
  Medicine specialty       13 (9%)         18 (12%)
  Pediatrics               8 (5%)          6 (4%)
  Surgery                  10 (7%)         5 (3%)
  Other                    15 (10%)        18 (12%)
Age, mean (SD)             50.9 (12.0)     49.8 (11.6)

Note: Percentages may not add to 100% due to rounding.

FIGURE 2. Modified CONSORT Flow Chart for the Randomized Controlled Trial

CME program across all four clinical conditions, compared to wait-list controls (p values all ≤ .001). Cohen effect sizes ranged from 0.5 to 1.0. In our exploratory analysis of changes in the targeted assessment of specific clinical measures, no significant difference in ratings was found between cohorts.

Ninety-nine percent of SE completers (179/181) submitted the end-of-program evaluation. Of these, 98% (175/179) indicated a wish to participate in further SE courses should they be offered as supplements to face-to-face programs in the future. Eighty-seven percent (156/179) agreed or strongly agreed that the SE program enhanced the impact of the live




TABLE 2. Impact of Spaced Education on Providers’ Clinical Practice Behaviors

Please rate the degree to which the PriMed-Harvard CME Program has changed your clinical practice patterns for the following clinical conditions:
(1) No Change At All to (5) Great Change

Condition      Cohort 1: SE    Cohort 2: no SE   Effect size   p value
Diabetes       3.7 (SD 1.0)    3.1 (SD 1.1)      0.5           <.001
Hypertension   3.4 (SD 1.1)    3.0 (SD 1.1)      0.4           .013
HIV            3.1 (SD 1.3)    1.9 (SD 1.2)      1.0           <.001
NASH           3.8 (SD 1.1)    2.7 (SD 1.3)      0.9           <.001

CME conference. Cohort 1 providers recommended that SE programs be started a mean 2.5 weeks (SD 5.7) after a live CME conference, while cohort 2 providers recommended 5.1 weeks (SD 5.6, p < .01).

Discussion

This randomized trial demonstrates that spaced education significantly enhances the effect of a traditional face-to-face program on clinicians’ reported assessment of global clinical behaviors. In addition, the study reinforces the findings that spaced education is well received by physicians as a method of CME delivery. The results of this study are of particular significance in the current context of CME, in which enormous emphasis is now being placed internationally on increasing the impact of CME on clinical practice behaviors,16 while, at the same time, attendance at lecture-style CME events that have modest impact on behavior remains popular.17–19 These findings add to the growing research evidence showing that blended learning programs that combine face-to-face activities and online programs can be more effective than a single face-to-face intervention alone.3

Our findings in this study raise a number of additional questions that require further research: How effective is spaced education as a replacement for those who could not attend a face-to-face meeting? What is the optimal timing between a face-to-face meeting and starting an adjunct spaced education program? How does SE impact on specific aspects of management as opposed to generalized self-reported behavior change? There are several limitations to this study, including its focus on a single face-to-face CME event and its reliance on self-reported outcome measures, which have been questioned in terms of how they relate to actual clinical behavior change.20 It appears that 18% of enrollees did not receive the spaced education e-mails due to technical issues (eg, spam blockers) or chose not to participate in the study, since they did not submit an answer to a single spaced education question. Our exploratory analysis revealed that the increase in global self-reported behavior changes in cohort 1 was not founded on the 4 specific patient-management behaviors that we assessed. This finding may be due to a number of factors, including the fact that our chosen subset of specific clinical behaviors was not encountered by the participants or did not contribute to their reported increase in global confidence. Strengths of the study include the large number of participants from multiple institutions, the randomized controlled design, and the use of validated educational interventions.

In summary, our trial demonstrates that the impact of traditional face-to-face CME on the self-reported global clinical behaviors of participants can be significantly improved through the addition of an online spaced education program.

Acknowledgments

The authors would like to thank Ronald Rouse, David Bozzi, and Jason Alvarez of the Harvard Medical School Center for Educational Technology for the development of the spaced education delivery platforms utilized in this trial.

Lessons for Practice

• Online learning programs, such as spaced education, can be used to enhance the impact of traditional face-to-face CME activities.

• Online learning programs can be well received by clinicians if clinically relevant.

References

1. Davis D, O’Brien MA, Freemantle N, Wolf FM, Mazmanian P, Taylor-Vaisey A. Impact of formal continuing medical education: do conferences, workshops, rounds, and other traditional continuing education activities change physician behavior or health care outcomes? JAMA. 1999;282(9):867–874.




2. Forsetlund L, Bjørndal A, Rashidian A, et al. Continuing education meetings and workshops: effects on professional practice and health care outcomes. Cochrane Database of Systematic Reviews. 2009;(2).

3. US Department of Education, Office of Planning, Evaluation, and Policy Development. Evaluation of Evidence-Based Practices in Online Learning: A Meta-Analysis and Review of Online Learning Studies. Washington, DC: US Dept of Education; 2009.

4. Kerfoot BP, Kearney MC, Connelly D, Ritchey ML. Interactive spaced education to assess and improve knowledge of clinical practice guidelines: a randomized controlled trial. Ann Surg. 2009;249(5):744–749.

5. Kerfoot BP. Learning benefits of on-line spaced education persist for 2 years. J Urol. 2009;181(6):2671–2673.

6. Kerfoot BP, Fu Y, Baker H, Connelly D, Ritchey ML, Genega EM. Online spaced education generates transfer and improves long-term retention of diagnostic skills: a randomized controlled trial. J Am Coll Surg. 2010;211:331–337.

7. Matzie KA, Kerfoot BP, Hafler JP, Breen EM. Spaced education improves the feedback that surgical residents give to medical students: a randomized trial. Am J Surg. 2009;197(2):252–257.

8. Kerfoot BP, Lawler EV, Sokolovskaya G, Gagnon D, Conlin PR. Durable improvements in prostate cancer screening from online spaced education: a randomized controlled trial. Am J Prev Med. 2010;39:472–478.

9. Pashler H, Rohrer D, Cepeda NJ, Carpenter SK. Enhancing learning and retarding forgetting: choices and consequences. Psychon Bull Rev. 2007;14(2):187–193.

10. Larsen DP, Butler AC, Roediger HL III. Test-enhanced learning in medical education. Med Educ. 2008;42(10):959–966.

11. Karpicke JD, Roediger HL III. The critical importance of retrieval for learning. Science. 2008;319(5865):966–968.

12. Kerfoot BP. Adaptive spaced education improves learning efficiency: a randomized controlled trial. J Urol. 2010;183(2):678–681.

13. Long A, Kerfoot BP, Chopra S, Shaw T. Online spaced education to supplement live courses. Med Educ. 2010;44:519–520.

14. Cohen J. Statistical Power Analysis for the Behavioral Sciences. 2nd ed. Hillsdale, NJ: Erlbaum; 1988.

15. Maxwell SE, Delaney HD. Designing Experiments and Analyzing Data: A Model Comparison Approach. Belmont, CA: Wadsworth; 1990.

16. ACCME. Accreditation Council for Continuing Medical Education Web site. www.accme.org/index.cfm/fa/home.home/home.cfm. Accessed August 3, 2010.

17. Price DW, Overton CC, Duncan JP, et al. Results of the first national Kaiser Permanente continuing medical education needs assessment survey. The Permanente Journal. 2002;6(2):76–84.

18. Mamary E, Charles P. Promoting self-directed learning for continuing medical education. Med Teach. 2003;25(2):188–198.

19. Nylenna M, Aasland OG. Doctors’ learning habits: CME activities among Norwegian physicians over the last decade. BMC Med Educ. 2007;7:10.

20. Davis DA, Mazmanian PE, Fordis M, Van Harrison R, Thorpe KE, Perrier L. Accuracy of physician self-assessment compared with observed measures of competence: a systematic review. JAMA. 2006;296(9):1094–1102.
