TRANSCRIPT
Maryellen E. Gusic MD
Associate Dean, Clinical Education
Professor of Pediatrics
Penn State College of Medicine
“…we cannot value something that we cannot share, exchange, examine.”
Lee Shulman 1990
Acknowledgements
Connie Baldwin PhD, University of Rochester Medical Center
Latha Chandran MD, MPH, Stony Brook University Medical Center
Co-leaders of the Academic Pediatric Association Educational Scholars Program
Do educators in academic health centers have time for scholarship? Is their contribution to the quality of future physicians valued?
Barchi and Lowery, Academic Medicine, 2000: “The growing emphasis on delivery of clinical services and the concomitant decrease in time for tenured and clinician-educator faculty to teach and do scholarly work jeopardizes both the potential for continued discovery and the education of the next generation of medical scholars.”
Are educators under-developed as academicians?
Promotion criteria for clinician educators, examined by Beasley et al. in 1997
Importance of criteria for assessment (scale of 1-7):
- teaching skills (6.3)
- clinical skills (5.8)
- development of educational programs (5.3)
- nonresearch scholarship (5.1)
- education research (4.5)
Tools used to evaluate teaching: awards, peer evaluation, learner evaluation, teaching portfolio
Academic advancement is slower for clinician educators
Thomas et al., Academic Medicine, 2004
- Odds of being at a higher rank were 85% lower for academic clinicians and 69% lower for teacher-clinicians than for basic researchers (adjusted for age, gender, time in rank, and work satisfaction)
- Satisfaction with progress toward academic promotion was 92% lower for academic clinicians and 87% lower for teacher-clinicians
- The rigor of the promotion process is lessened by the paucity of valid evaluation methods for teaching and clinical practice
There are problems with the current systems of recognition for clinician-educators
Levinson and Rubenstein, 2000
- Lack of reliable measures of teaching excellence
- Lack of valid methods that measure outcomes of teaching and educational programs
- Lack of congruence between job responsibilities and the criteria by which faculty are judged for promotion
Judgments must be based on explicit criteria
- Faculty members, department chairs, and P&T committee chairs and members may have differing definitions of excellence
- In addition, there may be differing opinions/perceptions of the relative value of educational contributions in the P&T process
- Work is often discounted because it is not documented adequately or not understood by P&T committee members
First step: Expanding the definition of scholarship
- In 1990, Boyer challenged the concept that teaching is simply an expected task performed by all academic physicians
- He expanded the definition of scholarship to include the scholarship of application, integration, and teaching in addition to the scholarship of discovery
- Reality: the scholarship of discovery is often the most valued realm in academic institutions
“The elusiveness of the scholarship of teaching”
Glassick, Academic Medicine, 2000
Adoption of Boyer’s expanded definition of scholarship has been challenged by the lack of:
- agreement about the meaning of this category of scholarship
- agreement about how quality should be measured
Excellent teaching is not the same as the scholarship of teaching.
Glassick created an “equal playing field” by establishing common criteria for scholarship:
- Clear goals
- Adequate preparation
- Appropriate methods
- Significant results
- Effective presentation
- Reflective critique
One solution used by academic health centers: the creation of various promotion tracks
Nora et al., Academic Medicine, 2000
Challenges of different tracks:
- Perceived value/status
- Tenure eligibility
- Congruence of expectations for performance with the assigned activities of faculty members
- Ability to change tracks as careers evolve over time
Separate promotion tracks are less important than “appropriate methods to evaluate” performance
Beasley et al., JAMA, 1997
Promotion committees must accept an expanded definition of scholarship
- Criteria for promotion must include the scholarship of teaching
- Educational “credits” are more difficult to document than research “credits”
- Documentation standards must allow for methods that establish the quality and impact of the work of educators
Challenges of traditionally accepted academic documents
- The CV mainly documents educational quantity (countable data)
- The CV does not typically allow flexibility to document measures of the quality and impact of educational activities
- It is a challenge for educators to provide evidence that demonstrates a scholarly approach using traditional formats
- The use of grants and publications as the only markers of scholarship is inadequate in capturing the work of educators
Educator Portfolios (EPs) show the quantity, quality, and impact of an educator’s work
- A documentation template that allows faculty to make their educational activities and accomplishments visible, to establish impact, and to prove value
EPs have multiple uses:
- For use in the P&T process
- For annual performance review
- For negotiating for a new position, a raise, or time for educational work
- For goal setting and meeting with a mentor/advisor
- For writing a biographical sketch or grant proposal
- For updating your CV
- For award nominations
- For applying for a new job
Developmental vs Promotional EPs
Developmental EPs:
- Formative document
- Provides broad perspective
- Helps to strategically plan a career and intentionally plan educational work
- Tracks progress over time
- Aids in reflective practice
- Serves as a communication tool with mentors
- Foundation for developing a promotional EP
Promotional EPs:
- Summative document
- Highlights and summarizes major accomplishments and key achievements
- Short, focused presentation
- Personal statement to provide context for review of work
- Summarized evidence of quality and effectiveness
The use of EPs in the P&T process in US medical schools
Simpson et al., Academic Medicine, 2004
- 400% increase since 1992 in the number of schools using portfolios in promotion packets
Observations:
- Dissemination of work was an important factor for inclusion
- Infrequent use of outcome measures or internal/external review of educational work
- Consistency in the categories included in EPs, but limited consensus on the types of evidence used to prove quality and impact
- Interviews of faculty responsible for appointments/promotions revealed that excellence was not explicitly defined:
“We know what we want to look for…but it is not really codified…”
“We gave up defining scholarship because it was eating up so much time and we could not get consensus. We have just been going ahead with the art and the ‘we know it when we see it’ approach.”
The lack of accepted common terminology, lack of standards for documentation, and lack of guidelines and criteria for the evaluation of the content of EPs limit their success in accomplishing this goal.
Documentation standards for educators were explored in 2006 in a Consensus Conference on Educational Scholarship, convened by the AAMC Group on Educational Affairs.
Affirmation of 5 categories of educational activity and accomplishment:
- Teaching
- Curriculum
- Advising and/or mentoring
- Educational leadership and/or administration
- Learner assessment
Excellence requires “Q2 Engage”:
- Quantity: measures of the types and frequencies of activities and roles
- Quality: evidence of effectiveness using comparative measures
- Engagement: evidence of engagement with the community of educators
Engagement is measured through a scholarly approach and scholarship
- Use of a scholarly approach is demonstrated through evidence that one’s work builds on the work of others
- Scholarship requires “P3”: Public display, Peer review, Dissemination (creating a platform upon which others can build)
A scholarly approach is proactive and reflective
- Evidence of a systematic approach using best practices or information from the literature
- Reflective practice: using self-assessment and information from others to enhance future educational efforts
Dissemination of scholarly products allows peer review
- Peer review uses accepted criteria of evaluation
- To be considered scholarship, products must be presented in a peer-reviewed venue or repository
- Allows use of the product by others
- Allows others to build upon the work of the scholar
Next step: Development of an accepted set of standards by which to value the work of educators
- Faculty would better understand expectations for performance and the criteria by which they are judged
- Self-assessment would allow faculty to build skills in an organized fashion
- Educational programs would improve: development, implementation, and evaluation of programs would consider guidelines for excellence and a scholarly approach
- Faculty and evaluators would share a common language
- Education would be seen and valued as a viable career track in academic medicine
Criteria for the evaluation of educators can be refined
Fincher et al., Academic Medicine, 2000
- The work of educators must be evaluated to be recognized and rewarded
- The effectiveness of teaching must be “rigorously substantiated”
- The results of educational leadership must be “demonstrable and broadly felt”
- The advancement of learning must be measured to assess educational methods and programs
Although more widely used, EPs lack a widely accepted, standardized format. EPs remain difficult to assess in the absence of recognized standards for documentation and evaluation.
Academic Pediatric Association (APA) Educational Scholars Program: an EP “test tube”
- The ESP is a national faculty development program for pediatric educators
- We developed an EP template for use by our scholars
- The template provides structure for systematically presenting numeric and narrative data
The APA EP template was peer reviewed and published on MedEdPortal
- Inclusion of the 5 standard domains
- Additional items:
- Educational philosophy statement: evolves from an understanding of theory and best practices combined with experience and reflection on teaching
- Five-year goals as an educator
- Evidence of scholarly accomplishment
http://www.ambpeds.org/site/education/education_faculty_dev_template.htm
We have also created a systematic tool for analysis of EPs: The APA EP Analysis Tool
Peer reviewed and published on MedEdPortal
http://www.ambpeds.org/site/education/education_faculty_dev_template.htm
The use of a parallel template for the portfolio and the analysis tool allows valid and reliable evaluation.
The analysis tool:
- Allows reproducible analysis for use across disciplines and across institutions
- Promotes the same methodology used in the evaluation of researchers
Principles which guided the development of each:
- Use of measurable outcomes to demonstrate impact
- Quantitative and qualitative measures to ensure objective analysis
The analysis tool was developed through a formal consensus-building process
Academic Medicine, 2009
- Multiple rounds of item development and selection: L. Chandran, C. Baldwin, T. Turner, E. Zenni, L. Lane, D. Balmer, M. Bar-on, D. Rauch, D. Indyk, L. Gruppen
- Enhancement of the template to improve the quality of information available for review
- Creation of a set of instructions for use of the tool to promote reliable application of standards
Tool development sequence (from figure):
- List of >100 quantitative items; list of 52 qualitative items
- Tool 1.1: 43 items selected and combined
- Tool 1.2: 48 items tested, refined, and reconciled
- Tool 2.0: 36 items (MedEdPortal approval)
Inter-rater reliability testing and EP template revision (from figure):
- Template development: initial EP template
- Step 1: 27 EPs, 6 raters
- Step 2: 5 EPs, 4 raters (EP template revision)
- Step 3: 3 EPs, 3 + 2 raters
- Step 4: 2 EPs, 8 raters
- Step 5: 15-20 EPs, 8 raters
Analysis tool item summary
- 18 quantitative items, including index scores that combine related measures
- Weights used for index scores are calibrated across the tool to ensure equivalence
- 18 qualitative items measured using a three-point scale (novice/intermediate/expert)
- The intermediate rating is defined with verbal specifications
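As an illustration only, the mechanics described above (weighted index scores combining related quantitative measures, and qualitative ratings recorded numerically so they carry equivalent weight) might be sketched as follows. The actual APA tool's items and weights are published on MedEdPortal and are not reproduced here; every name, measure, and weight below is hypothetical.

```python
# Hypothetical sketch of the two item types described above.
# None of these names, measures, or weights come from the actual APA tool.

QUAL_SCALE = {"novice": 1, "intermediate": 2, "expert": 3}

def index_score(measures, weights):
    """Combine related quantitative measures into one weighted index,
    normalized by the weight sum so indices are comparable across the tool."""
    return sum(w * m for m, w in zip(measures, weights)) / sum(weights)

def qualitative_score(rating):
    """Record a verbal qualitative rating numerically so that it has
    equivalence with the quantitative items."""
    return QUAL_SCALE[rating]

# Hypothetical example: a teaching index built from three related measures,
# plus a qualitative rating of the educator's reflective practice.
teaching_index = index_score(measures=[3, 2, 1], weights=[0.5, 0.3, 0.2])
reflection = qualitative_score("intermediate")
print(teaching_index, reflection)
```

The point of the normalization is the calibration noted above: because each index is divided by its weight sum, indices built from different numbers of measures land on the same scale and can be compared across the tool.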
Guidelines for the choice of standards
- A quantitative measure was used if it was valid, important, and could be reliably measured
- A qualitative measure was used to capture information that was not readily quantifiable
- A structured reporting format was required for qualitative assessment
- Accepted constructs were applied to enhance the credibility of qualitative standards: Miller’s criteria for learner assessment strategies; the GNOME model for curriculum design
Measurement of scholarly activity: scholarly approach to education
- The entire EP is reviewed, with special attention to the educational philosophy, five-year goals, and the narrative comments which follow each domain
- Evidence of reflective practice and use of “best practices” from the literature
- Special consideration of the educator’s focal educational effort, assessed using the framework for excellence established by Glassick
Evaluation of a scholarly approach: EP content is analyzed for:
- Evidence of systematic planning
- Consultation with the literature/best practices
- Rigorous measurement of educational quality and outcomes
- Products/methods assessed through peer review: presentations, publications, adoption of products by others
Products of educational scholarship
- Include peer-reviewed publications, presentations, and disseminated educational products adopted by others
- Public dissemination, peer review, and a platform for others to build upon
Lessons learned in the development of the analysis tool
- Focused selection of essential items makes the tool practical
- Quantitative items are based on judgments of quality, not just numbers
- Specification of qualitative ratings is critical to achieve concordance
- Qualitative items must be recorded numerically to give them equivalence with quantitative items
- The ability to use the tool depends on the quality of the data submitted; information must be documented meticulously
The goals of our current project:
- To create a set of general principles and specific criteria for faculty evaluation, regardless of the template used to document educational activities
- To promote a common understanding using a common and established vocabulary for excellence
- To allow individual institutions to set and apply fair and rational standards for consistent decision-making
- To encourage continued conversation among the community of educators
- To offer a sample tool based on the principles discussed
Purpose of the current project
- To establish a sound foundation for the academic promotion and advancement of educators
- To provide a framework for the systematic analysis of educator performance
We do not expect a national consensus about precise criteria for the advancement of educators. Principles must be applied with consideration of the individual culture of each institution:
- Expectations for faculty performance
- Needs of the educational mission
“Successful educators…need resources to fulfill the educational mission.” (Simpson et al., Summary Report from the Consensus Conference on Educational Scholarship, 2007)
“We must evolve continuously our organizational structures, human resources activities, political coalitions, and symbols to support scholarship in education.” (Fincher et al., Academic Medicine, 2000)
Development of a sound rating system will require the institution to develop and implement:
- Accurate and complete data sources
- Definition and acceptance of specific criteria for evaluation
- Consistent application of criteria
- Inclusion of quantitative and qualitative measures
Additional topics for conversation
- Institutional requirements/preferences need to be considered in developing the rating system
- Should each domain be assigned an equal value?
- Should faculty members be expected to be active in, or demonstrate excellence in, each domain? It is unlikely that a faculty member will display equivalent performance in all of the domains included in an EP
- How many areas of excellence are required for advancement?
- How should the value of each domain be established? National consensus plus local considerations
- Scholarly approach vs products of scholarship: while both are important parts of establishing the credentials of an educator, products of scholarship carry more value than the use of a scholarly approach
Principles for educator evaluation
- Evaluations must be based on objective criteria
- Use both quantitative and qualitative measures
- Expect educators to plan systematically to help learners achieve specific, evaluable learning objectives
- Expect scholarly activity from all faculty
- Evaluate scholarship rigorously
- Expect variation among educators
- Inform faculty of the criteria
- Educate those who evaluate educators to recognize superior performance