
Page 1: Evaluation of a Continuing Medical Education system

Evaluation of a Continuing Medical Education system integrated with an EHR in an academic hospital in Argentina

Damian Borbolla, MD

Page 2: Evaluation of a Continuing Medical Education system

Continuous building of skills, knowledge, and attitudes

Page 3: Evaluation of a Continuing Medical Education system

Strategies

There is no conclusive definition of the effectiveness of CME

Peck C, McCall M, McLaren B, Rotem T. Continuing medical education and continuing professional development: international comparisons. BMJ. 2000 Feb 12;320(7232):432-5. Review. PubMed PMID: 10669451

Page 4: Evaluation of a Continuing Medical Education system

But this is the situation…

- CME approaches are ineffective in changing behavior, and physicians don't like them
- Approaches that are effective, in terms of retention of information and changing behavior: live media (not print), multimedia, multiple exposures
- Do we need a different approach?

Stephens MB, McKenna M, Carrington K. Adult learning models for large-group continuing medical education activities. Fam Med. 2011 May;43(5):334-7. PubMed PMID: 21557103

Page 5: Evaluation of a Continuing Medical Education system

Adult learning theory principles

1- Need to know
2- Self-concept
3- Foundation
4- Readiness to learn
5- Orientation to learning
6- Motivation to learn

Pedagogy ≠ Andragogy

Patient encounter and theory principles

Knowles M. The adult learner: the definitive classic in adult education and human resource development. Seventh edition. Amsterdam: Elsevier; 2011

Page 6: Evaluation of a Continuing Medical Education system

Need to know

- Adults need to know the reason they are learning something
- Being in the presence of a patient with specific characteristics, and having information about that patient, will make the physician want to read about it
- Patient encounters awaken information needs (IN):
  - Recognized IN: every 3 patients (Gorman 1995)
  - Unrecognized IN: knowledge-based information, one of the most common types of information needed by physicians (Osheroff 1991)
  - Unrecognized needs: information-gathering vs. information-seeking activities (Gorman, Helfand 1995)
- Infobuttons: data extracted from the EHR are used to anticipate the needs of the clinician (see the sketch below)
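The slides do not show how the infobutton request itself is assembled. As a minimal sketch, the Python snippet below packs patient context already available in the EHR (problem code, age, sex, and the triggering task) into a context-aware knowledge retrieval URL. The parameter names follow the HL7 Infobutton URL-based convention, but the endpoint and the calling code are assumptions, not the system's actual implementation.

```python
from urllib.parse import urlencode

# Hypothetical knowledge-resource endpoint; the real resolver would be
# configured per institution.
KNOWLEDGE_RESOURCE_URL = "https://example.org/infobutton/search"

def build_infobutton_request(problem_code: str, problem_name: str,
                             code_system: str, age_years: int, sex: str,
                             task: str = "PROBLISTREV") -> str:
    """Build a context-aware knowledge retrieval (infobutton-style) URL
    from data already present in the EHR encounter."""
    params = {
        # Main search criterion: the patient's coded problem or diagnosis.
        "mainSearchCriteria.v.c": problem_code,
        "mainSearchCriteria.v.cs": code_system,   # e.g. SNOMED CT OID
        "mainSearchCriteria.v.dn": problem_name,
        # Patient context used to tailor the content.
        "age.v.v": str(age_years),
        "age.v.u": "a",                           # UCUM unit for years
        "patientPerson.administrativeGenderCode.c": sex,
        # Clinical task that triggered the request (problem-list review).
        "taskContext.c.c": task,
    }
    return f"{KNOWLEDGE_RESOURCE_URL}?{urlencode(params)}"

# Example: a 67-year-old woman with type 2 diabetes (SNOMED CT 44054006).
print(build_infobutton_request("44054006", "Type 2 diabetes mellitus",
                               "2.16.840.1.113883.6.96", 67, "F"))
```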

Page 7: Evaluation of a Continuing Medical Education system

Readiness to learn

- Relevance to their work
- Similar to "just in time" (Nissen 2004)

Infobuttons and effectiveness?

Page 8: Evaluation of a Continuing Medical Education system

Orientation to learning

- Problem-centered rather than content-oriented
- CONTEXT

Page 9: Evaluation of a Continuing Medical Education system

Hospital Italiano de Buenos Aires (HIBA)

- 2 hospitals
- 24 clinics
- Approx. 2,000 physicians
- In-house developed HIS
- Fully implemented ambulatory EHR

Page 10: Evaluation of a Continuing Medical Education system

CME system

- Developed with the CME committee and the HIBA University
- Working together with the pediatrics, internal medicine, and family medicine departments
- First 2 years for content and system development
- Ready to implement

Page 11: Evaluation of a Continuing Medical Education system

CME system

- Patient characteristics:
  - Age
  - Sex
  - Problems or diagnoses
  - Medications
- Physician characteristics:
  - Specialty
- Context:
  - Ambulatory setting
  - Inpatient
  - ED
- Knowledge content: selected according to the characteristics above (see the matching sketch below)
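The slides list the matching dimensions but not the matching logic. The sketch below is one hypothetical way to encode it: the CME committee tags each information piece with trigger criteria, and the system filters the library against the encounter context. All class and field names are assumptions for illustration.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Encounter:
    """Context available in the EHR when the physician opens the chart."""
    age: int
    sex: str                                  # "M" / "F"
    problems: set = field(default_factory=set)        # coded diagnoses
    medications: set = field(default_factory=set)
    specialty: str = ""                       # physician specialty
    setting: str = "ambulatory"               # "ambulatory", "inpatient", "ED"

@dataclass
class InformationPiece:
    """A CME content item tagged by the committee with trigger criteria."""
    title: str
    problems: set                             # problems that trigger it
    min_age: int = 0
    max_age: int = 200
    sex: Optional[str] = None                 # None = any sex
    specialties: set = field(default_factory=set)     # empty = any specialty
    settings: set = field(default_factory=set)        # empty = any setting

    def matches(self, e: Encounter) -> bool:
        return (bool(self.problems & e.problems)
                and self.min_age <= e.age <= self.max_age
                and (self.sex is None or self.sex == e.sex)
                and (not self.specialties or e.specialty in self.specialties)
                and (not self.settings or e.setting in self.settings))

def select_content(library, encounter):
    """Return the information pieces to display for this encounter."""
    return [piece for piece in library if piece.matches(encounter)]
```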

Page 12: Evaluation of a Continuing Medical Education system

CME Evaluation Model

Level 1 (Participation): The number of physicians and others who registered and attended
Level 2 (Satisfaction): The degree to which the expectations of the participants about the setting and delivery of the CME activity were met
Level 3 (Learning): Changes in the knowledge, skills, and attitudes of the participants; the development of competence
Level 4 (Performance): Changes in practice performance as a result of the application of what was learned
Level 5 (Patient health): Changes in the health status of patients due to changes in practice behavior
Level 6 (Population health): Changes in the health status of a population of patients due to changes in practice behavior

Mazmanian PE, Davis DA, Galbraith R; American College of Chest Physicians Health and Science Policy Committee. Continuing medical education effect on clinical outcomes: effectiveness of continuing medical education: American College of Chest Physicians Evidence-Based Educational Guidelines. Chest. 2009 Mar;135(3 Suppl):49S-55S. Review. PubMed PMID: 19265076

Page 13: Evaluation of a Continuing Medical Education system

Proposed sequential system evaluation

- First level: Will physicians use the CME system?
- Second level: Will physicians be satisfied with the use of the CME system?
- Third level: Will physicians' knowledge change with the use of the CME system?

Second phase:

- Will the CME system change physicians' performance?
- Will the CME system change patients' health?
- Will the CME system change population health?

Page 14: Evaluation of a Continuing Medical Education system

Proposed design

- RCT
  - For the third-level evaluation; levels 1 and 2 assessed in the intervention group
  - Randomization: physicians will be grouped into geographic clusters by practice location to minimize contamination and to maximize the efficiency of intervention delivery; clusters of similar size will be randomly allocated to intervention or control (see the sketch below)
- Controlled before-and-after study (quasi-experimental)
  - Control group for …
- Before-and-after design without control
  - The same physicians serve as their own controls
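A minimal sketch of the cluster allocation described above, assuming physician records with a practice-location field (a hypothetical structure). Whole clusters are considered in random order and each is assigned to the currently smaller arm, so randomization stays at the cluster level and arm sizes remain comparable.

```python
import random
from collections import defaultdict

def cluster_randomize(physicians, seed=2024):
    """Allocate whole geographic clusters of physicians to intervention
    or control.  `physicians` is a list of {"id": ..., "location": ...}
    records (hypothetical structure)."""
    clusters = defaultdict(list)
    for p in physicians:
        clusters[p["location"]].append(p["id"])   # group by practice location

    rng = random.Random(seed)
    cluster_list = list(clusters.items())
    rng.shuffle(cluster_list)                     # random cluster order

    arms = {"intervention": [], "control": []}
    for location, ids in cluster_list:
        # Assign the whole cluster to the smaller arm: randomizing by cluster
        # avoids contamination between colleagues at the same site, and
        # greedy balancing keeps the arms similar in size.
        smaller = min(arms, key=lambda a: len(arms[a]))
        arms[smaller].extend(ids)
    return arms
```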

Page 15: Evaluation of a Continuing Medical Education system

Timeframe - RCT

[Timeline figure: replanning, pilot, sample calculation, randomization; intervention group and control group; JSPLL, using the system, FT; outcomes: use and satisfaction (levels 1 & 2), level 3]

Page 16: Evaluation of a Continuing Medical Education system

Timeframe – Controlled B-A

[Timeline figure: replanning, pilot, sample calculation; intervention group and control group; IT, JSPLL, using the system, FT; outcomes: use and satisfaction (levels 1 & 2), level 3]

Page 17: Evaluation of a Continuing Medical Education system

Timeframe – B-A

[Timeline figure: replanning, pilot, sample calculation; intervention group only; IT, JSPLL, using the system, FT; outcomes: use and satisfaction (levels 1 & 2), level 3]

The test includes questions related to the information pieces and questions not related to them; the same subjects serve as their own controls.

Page 18: Evaluation of a Continuing Medical Education system

Intervention

Page 19: Evaluation of a Continuing Medical Education system

Instruments

- Knowledge test:
  - IT and AT: the same or a similar test
  - Multiple-choice questions
  - Developed by specialists in the CME committee
  - A group of questions related to the information pieces and other questions not related to them (scoring sketch below)
- Jefferson Scale of Physician Lifelong Learning (JSPLL):
  - Developed to measure physicians' orientation to lifelong learning
  - Possible confounder
- Satisfaction survey:
  - Satisfaction with the system interface: QUIS
  - Satisfaction with the educational strategy
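Because the level-3 analysis rests on the split between related and unrelated questions, a small scoring sketch may help; the answer dictionaries and the set of related question IDs are hypothetical structures, not the actual test format.

```python
def score_test(answers, answer_key, related_ids):
    """Score one physician's test separately for questions that are and are
    not linked to the delivered information pieces.

    answers     -- {question_id: chosen_option}
    answer_key  -- {question_id: correct_option}
    related_ids -- set of question_ids tied to information pieces"""
    related = sum(1 for q, a in answers.items()
                  if q in related_ids and a == answer_key[q])
    unrelated = sum(1 for q, a in answers.items()
                    if q not in related_ids and a == answer_key[q])
    return {"related_correct": related, "unrelated_correct": unrelated}
```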

Page 20: Evaluation of a Continuing Medical Education system

Pilot Study

- Characteristics: 20 physicians
- They will complete:
  - Before starting to use the system:
    - Knowledge test
    - Jefferson Scale of Physician Lifelong Learning (JSPLL)
  - After 2 … of using the system:
    - Satisfaction
    - Analysis of the use of the system

Page 21: Evaluation of a Continuing Medical Education system

Pilot Study

- Before the start of the protocol, a pilot study will be performed with the aims of:
  - Testing the instruments
  - Estimating physicians' baseline knowledge for the protocol's sample size calculation (see the sketch below):
    - Variance (to check whether it is too large)
    - Ceiling effect (to check whether baseline knowledge is too high)
  - Piloting the recruitment process
  - Validating the instrument for satisfaction with the educational approach
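As an illustration of how the pilot estimates could feed the sample size calculation for the paired before-after comparison, the sketch below uses the standard normal-approximation formula; the inputs (SD of the score differences, minimum difference worth detecting) would come from the pilot and are placeholders here.

```python
from math import ceil
from scipy.stats import norm

def paired_sample_size(sd_diff, min_detectable_diff, alpha=0.05, power=0.80):
    """Approximate number of physicians needed for a paired t-test, given
    the SD of before-after score differences and the smallest difference
    worth detecting (two-sided test)."""
    z_alpha = norm.ppf(1 - alpha / 2)
    z_beta = norm.ppf(power)
    n = ((z_alpha + z_beta) * sd_diff / min_detectable_diff) ** 2
    return ceil(n)

# Example with placeholder pilot values: SD of differences = 10 points,
# smallest meaningful change = 5 points -> about 32 physicians.
print(paired_sample_size(10, 5))
```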

Page 22: Evaluation of a Continuing Medical Education system

Analysis level 1

- System usage
  - Proportion of information pieces accessed or read over the total information pieces shown (see the sketch below)
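A sketch of how this proportion could be computed from the system's usage log, assuming a hypothetical event format with one record per information piece shown or accessed.

```python
def usage_proportion(log_events):
    """Level 1 metric: information pieces accessed (or read) over the total
    information pieces shown.

    log_events: iterable of {"piece_id": ..., "action": "shown"|"accessed"}
    (hypothetical log structure)."""
    shown = {e["piece_id"] for e in log_events if e["action"] == "shown"}
    accessed = {e["piece_id"] for e in log_events if e["action"] == "accessed"}
    return len(shown & accessed) / len(shown) if shown else 0.0
```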

Page 23: Evaluation of a Continuing Medical Education system

Analysis level 2

- Satisfaction with the system and the educational strategy
  - Satisfaction with the system interface: QUIS
  - Educational strategy evaluation tool: validation with the pilot study
    - Feasibility
    - Endorsement
    - Reliability: test-retest (intraclass correlation)
    - Internal consistency: Cronbach's alpha (sketch below)
    - Score

Pelayo M, Cebrián D, Areosa A, Agra Y, Izquierdo JV, Buendía F. Effects of online palliative care training on knowledge, attitude and satisfaction of primary care physicians. BMC Fam Pract. 2011 May 23;12:37. PubMed PMID: 21605381
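For the internal-consistency check, a short sketch of Cronbach's alpha over a physicians-by-items score matrix from the pilot questionnaire; the matrix itself is assumed, and a statistics package could of course be used instead.

```python
import numpy as np

def cronbach_alpha(item_scores):
    """Cronbach's alpha for a respondents-by-items score matrix
    (rows: pilot physicians, columns: questionnaire items)."""
    x = np.asarray(item_scores, dtype=float)
    k = x.shape[1]                                # number of items
    item_variances = x.var(axis=0, ddof=1)        # variance of each item
    total_variance = x.sum(axis=1).var(ddof=1)    # variance of total scores
    return k / (k - 1) * (1 - item_variances.sum() / total_variance)
```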

Page 24: Evaluation of a Continuing Medical Education system

Analysis level 3 (B-A)

- Will the CME system improve physicians' knowledge?
- Outcome variable: scores on the before and after tests
- Intervention group:
  - Paired t-test for questions related to the information pieces: expecting to find a difference due to the intervention
  - Paired t-test for questions not related to the information pieces: expecting to find no difference
- Control group:
  - Paired t-test for both types of questions: expecting to find no difference
- RCT:
  - t-test of total scores between the 2 groups (analysis sketch below)
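A sketch of the planned comparisons with SciPy: paired t-tests within the before-after designs, and an independent-samples t-test of total scores between arms for the RCT. The score arrays are placeholders for the study data.

```python
from scipy import stats

def before_after_analysis(before_related, after_related,
                          before_unrelated, after_unrelated):
    """Paired t-tests for the before-after design: a change is expected only
    on questions related to the delivered information pieces."""
    return {
        "related": stats.ttest_rel(after_related, before_related),
        "unrelated": stats.ttest_rel(after_unrelated, before_unrelated),
    }

def rct_analysis(total_scores_intervention, total_scores_control):
    """Independent-samples t-test of total test scores between the 2 arms."""
    return stats.ttest_ind(total_scores_intervention, total_scores_control)
```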