Toolbox for Evaluating Educators: Resources to Facilitate Performance Appraisal

Association of American Medical Colleges (AAMC) Task Force on Educator Evaluation

Table of Contents

Glossary
Executive Summary
User’s Guide
Indicators for Evaluation in Five Domains:
    Teaching
    Learner Assessment
    Curriculum Development
    Mentoring and Advising
    Educational Leadership and Administration
Appendices
Reference List

TASK FORCE MEMBERS

Maryellen E. Gusic, MD (Task Force Chair)
Contact: (317) 278-6513, [email protected]

Members:
Jonathan Amiel, MD
Constance Baldwin, PhD
Latha Chandran, MD, MPH
Nancy Lowitt, MD, MEd
Brian Mavis, PhD
Kathe Nelson, MD
Patricia O’Sullivan, EdD
Jamie Padmore, MSc
Suzanne Rose, MD, MSEd
Henry Strobel, PhD
Craig Timm, MD

Consultants:
Ruth-Marie E. Fincher, MD
Lois Nora, MD, JD, MPH
Deborah Simpson, PhD
Tom Viggiano, MD, MEd


Glossary

DECANAL POSITIONS
Administrative roles within the medical school dean’s office are referred to as decanal positions. Many of these positions include Assistant Dean or Associate Dean as part of the title. Examples of these positions are Associate Dean for Undergraduate Medical Education, Assistant Dean for Student Affairs, Associate Dean for Faculty Affairs, Assistant Dean for Graduate Medical Education and Associate Dean for Research. (Sonnino, 2012)

DOMAIN
For the purposes of this toolbox, the activities and accomplishments of faculty with significant involvement in medical education are considered in terms of five major categories. These five categories, or domains, are: (a) Teaching, (b) Learner Assessment, (c) Curriculum Development, (d) Mentoring and Advising, and (e) Educational Leadership and Administration. These domains are based on the literature and were affirmed by the 2006 AAMC Group on Educational Affairs Consensus Conference on Educational Scholarship (Simpson et al., 2007). An educator will likely focus their work in one or more of these domains.

ENGAGEMENT (with the EDUCATIONAL COMMUNITY)
In providing evidence of a scholarly approach to one’s educational work, reference to the work done by the broader community of educators is an important component. Evidence of engagement with the educational community is made apparent in two ways. First, educators should demonstrate that their work draws on the scholarship of others, i.e., the current body of knowledge. Second, educators should provide evidence of contributing to this body of knowledge through presentations and publications of their findings, lessons learned, or best practices in education. Dissemination allows peer review as well as the use of one’s work by others. Thus, drawing from and adding to our shared knowledge base provides evidence of engagement with the community of educators. (Simpson et al., 2007)

ENGAGEMENT (with LEARNERS)
This term is often used to describe an outcome related to the satisfaction or reaction of learners in an educational experience. A high level of learner engagement implies that learners have made a commitment to their learning, as evidenced by active participation in an educational experience. This includes involvement in assignments and activities, continued active participation in the face of challenges, and expression of satisfaction with the learning experience. Engaged learners demonstrate motivation beyond grades or other typical achievement indicators: they are motivated because they want to apply what they have learned. Learner engagement demonstrates successful teaching in classroom settings or successful leadership in professional development settings. (Geocaris, 1997)

GLASSICK’S CRITERIA
In an effort to define excellence in scholarly work, Glassick identified six criteria that provide a standard for assessing the quality of all types of scholarship: educational scholarship as well as scholarship in basic, clinical and social science research. “Scholars whose work is published or rewarded must have clear goals, be adequately prepared, use appropriate methods, achieve [significant] results, [communicate effectively], and reflectively critique their work” (Glassick, 2000). (See Scholarly Approach)

INDICATORS (for Evaluation)
Specific measurable standards for each of Glassick’s criteria (see Glassick’s Criteria) are provided for each domain of an educator’s activity (see Domain) in the Toolbox. These standards are presented as broad indicators and detailed indicators, and they describe the characteristics of an educator’s activities and accomplishments that should be assessed to evaluate performance in a domain. An educator must present evidence in his/her portfolio (see Portfolio) to demonstrate achievement of these standards. The broad indicators are useful for decision-makers making summative judgments about the performance of a faculty member. The detailed indicators are provided for individuals writing letters of recommendation and for other expert reviewers who advise decision-makers.

KIRKPATRICK’S MODEL
This model for the evaluation of training and education programs was developed by Donald Kirkpatrick (Kirkpatrick & Kirkpatrick, 2006). The approach categorizes outcomes into four levels of results of increasing complexity and impact. The first level is satisfaction/reaction and represents learners’ thoughts and feelings about the educational experience; it is typically measured in terms of satisfaction. The second level is learning, measured as a change in knowledge or skills as a result of participation in the educational program. The third level is application, measured by the extent to which new knowledge or skills learned in the educational program are transferred to other settings or used in daily practice. The fourth level is impact: the extent to which the educational program promotes change within the broader institution. For example, a program on hand hygiene can be well received (satisfaction) by participants who also achieve high scores on tests of knowledge and skills related to this topic (learning). The extent to which participants consistently practice hand hygiene (application) would also be important, and an effective hand washing program would ultimately be expected to reduce infection rates (impact).

LEADERSHIP FRAMES
Bolman and Deal (2003) described a model for leadership using four frames. Their model acknowledges that there is a wide range of effective leadership behaviors, and that these behaviors represent four aspects of one’s work as a leader: structural, human resource, political and symbolic. The structural approach to leadership is problem-based and focuses on strategy, experimentation and adaptation. The human resource frame is characterized by supporting, advocating for, and empowering others to enhance human capital. The political frame emphasizes building networks and coalitions among stakeholders. The symbolic aspect of one’s leadership is inspirational and involves communicating a clear vision to motivate and engage others.

PORTFOLIO
An educator portfolio is a compilation of evidence documenting one’s work as an educator. It is a part of one’s professional dossier. The portfolio documents the quantity as well as the quality of a faculty member’s educational accomplishments. A portfolio should be organized by domain of activity (see Domain): teaching, learner assessment, curriculum development, mentoring/advising, and educational leadership/administration. The evidence included in one’s educator portfolio can take many forms but should be aligned with the criteria being used to evaluate the educator’s performance. Examples of the types of evidence include course evaluations, course syllabi, newly developed educational resources, indicators related to learner achievement, licensure examination scores, outcomes related to institutional goals or benchmarks, testimonials from peers in other settings, etc. In many cases, there are specific institutional guidelines for the format and structure of a portfolio.

SCHOLARLY APPROACH
In his 1990 book, “Scholarship Reconsidered,” Ernest Boyer outlined a new paradigm for understanding the significance of faculty activities, encouraging us to "...break out of the tired old teaching versus research debate and define, in more creative ways, what it means to be a scholar" (Boyer, 1990, p. xii). Subsequently, Glassick (2000) defined six criteria (see Glassick’s Criteria) that provide a structure for determining the extent to which an intellectually rigorous approach informs the work of a scholar. The criteria include the need for clearly articulated goals, adequate preparation, use of appropriate methods, measurement of significant results, effective presentation of findings and reflective critique. The criteria apply to one’s work as an educator and, together, they define a scholarly approach.

SCHOLARSHIP
Boyer (1990) expanded the traditional notion of faculty scholarship (see Scholarly Approach) as synonymous with research (the scholarship of discovery). In addition to the scholarship of discovery, Boyer defined scholarship to include the scholarship of integration (the synthesis of information across disciplines or content areas), the scholarship of application (using expertise to address real world needs) and the scholarship of education (studying processes of teaching and information dissemination). For an educator, scholarship is demonstrated when products are peer-reviewed and disseminated and thereby contribute to knowledge within a field of inquiry. (See Engagement with the Educational Community)

SMART GOALS
SMART is an acronym used to describe the desirable attributes of educational learning goals: specific, measurable, attainable, realistic and timely. Consider the goal: Increase USMLE Step 1 scores in pharmacology to meet the national mean within the next two years. Specific goals are definable and clearly stated, as in the example above. A measurable goal includes a metric or numeric target to define goal attainment. An attainable goal acknowledges the available resources and the challenges present, so that one can focus efforts to achieve the stated outcome. The outcome specified in the goal should be results-based rather than process-focused, and the timeline for achieving the goal should be specified. (Doran, 1981; Spencer and Jordan, 2001)
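The SMART attributes lend themselves to a simple structured checklist. As an illustration only (not part of the Toolbox), the following hypothetical Python sketch captures the glossary’s worked example as such a checklist:

```python
from dataclasses import dataclass

@dataclass
class SmartGoal:
    """One educational goal, annotated with the five SMART attributes."""
    statement: str
    specific: bool     # definable and clearly stated
    measurable: bool   # includes a metric or numeric target
    attainable: bool   # acknowledges available resources and challenges
    realistic: bool    # results-based rather than process-focused
    timely: bool       # timeline for achievement is specified

    def is_smart(self) -> bool:
        # A goal is SMART only when all five attributes are satisfied.
        return all([self.specific, self.measurable, self.attainable,
                    self.realistic, self.timely])

# The worked example from this glossary entry:
goal = SmartGoal(
    statement=("Increase USMLE Step 1 scores in pharmacology to meet "
               "the national mean within the next two years."),
    specific=True, measurable=True, attainable=True,
    realistic=True, timely=True,
)
assert goal.is_smart()
```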


References for Glossary
1. Bolman LG, Deal TE. Reframing Organizations: Artistry, Choice and Leadership. San Francisco, CA: Jossey-Bass; 2003.
2. Boyer EL. Scholarship Reconsidered: Priorities of the Professoriate. San Francisco, CA: Jossey-Bass; 1990.
3. Doran GT. There’s a S.M.A.R.T. way to write management’s goals and objectives. Management Review. 1981;70(11):35-36.
4. Geocaris C. Increasing student engagement: a mystery solved. Teaching for Authentic Student Performance. 1997;54(4):72-75.
5. Glassick CE. Boyer’s expanded definitions of scholarship, the standards for assessing scholarship and the elusiveness of the scholarship of teaching. Acad Med. 2000;75:877-880.
6. Kirkpatrick DL, Kirkpatrick JD. Evaluating Training Programs: The Four Levels. 3rd ed. San Francisco, CA: Berrett-Koehler Publishers; 2006.
7. Simpson D, Fincher RM, Hafler JP, Irby DM, Richards BF, Rosenfeld GC, Viggiano TR. Advancing educators and education by defining the components and evidence associated with educational scholarship. Med Educ. 2007;41:1002-1009.
8. Sonnino RE. Achieving Decanal Positions in Medical Schools. Association of Women Surgeons. https://www.womensurgeons.org/CDR/DecanalPositions.asp. Accessed December 1, 2012.
9. Spencer J, Jordan R. Educational outcomes and leadership to meet the needs of modern health care. Quality in Health Care. 2001;10(Suppl 2):ii38-ii45.


Executive Summary

Purpose:
• To provide clear, yet flexible standards for evaluating the contributions of faculty whose careers focus on education
• To provide an evidence-based approach to support fair, rigorous decisions for the academic advancement of faculty educators

Targeted users:
• PRIMARY USERS: Those who make decisions about faculty advancement (e.g., chairs, rank and tenure committees, academic leaders, awards committees) and their consultants
• SECONDARY USERS: Faculty and those who help faculty plan a career as an educator, prepare promotion materials, and/or design faculty development programs

The activities and accomplishments of an educator can be evaluated in FIVE domains, and each domain can be judged using FOUR constructs.

Five domains of educator activity and accomplishments:
• Teaching, Learner Assessment, Curriculum Development, Mentoring and Advising, and Educational Leadership and Administration

Four constructs for evaluation in each domain:
• Quantity may be judged by number of learners, number of sessions, hours of effort, or other countable factors
• Quality can be appraised in many ways, including:
  o learner and peer evaluation
  o evidence of learning or application of learning in practice
  o measures of impact (e.g., peer reviewed products, dissemination to other settings)
  Evaluation of quality is often facilitated by expert reviews (e.g., letters of recommendation, assessment by educational consultants).
• A scholarly approach in education, as in any other field, is demonstrated by documenting clear goals, adequate preparation (e.g., work is informed by the literature and best practices in the field), appropriate methods, significant results, effective presentation, and reflective critique (Glassick’s criteria9,10).
• Scholarship is demonstrated when an educator creates products that are judged through a peer review process and are then made available for use/adaptation by others (e.g., published or presented to a wide audience19,20). Evidence of use of scholarly products by others is an added but not required element to demonstrate scholarship.

Indicators for Evaluation in Five Domains of Educator Activity
• Introductory paragraphs: Define each domain and explain how to approach its evaluation


• Table of Indicators for Evaluation:
  o Glassick’s criteria, which provide the organizing framework
  o Broad Indicators to measure quality, a scholarly approach and scholarship
  o Detailed Indicators for use in reviews by consultants and educational experts

How decision makers should use the Toolbox:
• Identify the activity domains relevant to an individual faculty member’s work.
• Use “Broad Indicators” from the Toolbox tables to evaluate the faculty member’s performance in each relevant domain. (Consultants or expert reviewers can use the “Detailed Indicators” to provide a summative evaluation for decision-making committees.)
• Aggregate the evaluation of each domain and develop a composite summary across domains, if appropriate, applying judgments about the relative value of each domain.

Please note: Some indicators may not be relevant to the educator’s activities. One should expect documentation sufficient to support an assessment of quality, but few if any individual portfolios will provide evidence for ALL indicators listed in the domain tables.

The Toolbox is not prescriptive! Institutions have flexibility to define their expectations for:
• The quantity of educational activities and how these will be combined with measures of quality in the evaluation of educators;
• How an educator’s use of a scholarly approach to education will be weighed;
• The level of engagement in educational scholarship;
• The requirements for promotion of faculty in each track and rank, including expectations about level of activity, number of domains of activity, and level of accomplishment.


User’s Guide

Purpose of the Toolbox:
The Toolbox is designed to offer practical resources for decision makers to help them develop consistent, efficient evaluation processes to assess educators’ activities and accomplishments fairly and rigorously. It provides clear, yet flexible standards for evaluating the contributions of faculty with a career focus on education. Experts who advise promotion committees should find the indicators in the Toolbox useful to guide and structure their detailed reviews of faculty portfolios. Educators and their mentors may use the Toolbox as they plan the scope of the educator’s activities, choose areas of emphasis for maximum impact, and effectively document these activities.

Primary Users:
• Faculty and committees charged with decision-making (promotion and awards/honors)
• Consultants to these committees and external reviewers

Secondary Users:
• Faculty affairs and education deans
• Career mentors of educators
• Educators planning their careers

Background:
In the past two decades, significant progress has been made in defining and justifying the value of scholarship in education7,10,14,19,20,21, and numerous programs to develop faculty as educational scholars have been established11,17,23. However, educators at many academic health centers continue to struggle with advancement and promotion because the process for evaluating their contributions is often cumbersome and accepted standards for evaluation are vague or lacking3,22,24. The basis for evaluating educators has been strengthened by defining educational scholarship4,7,13 and by developing templates for faculty to document their educational contributions using educator portfolios12,20. The final step in helping institutions value faculty members’ educational contributions is the development of a sound framework for evaluating those contributions. This Toolbox was created to fill that void.

Development of the Toolbox:
The Toolbox is the product of a 3-year process by a national AAMC task force with broad representation from the membership of the AAMC Groups on Educational and Faculty Affairs (GEA and GFA) and from other AAMC stakeholder Groups and Councils. The task force includes educators who teach across the continuum of medical education at academic health centers and teaching hospitals. This resource is based on the literature and builds on the findings of the 2006 AAMC GEA Consensus Conference on Educational Scholarship20,21 and on educator evaluation tools developed by the Academic Pediatric Association’s Educational Scholars Program1,2,5,6,12. The Toolbox resulted from an iterative, consensus-building process, including discussion and feedback with AAMC stakeholder groups at regional and national meetings between 2008 and 2011.


Organization of the Toolbox:
The Executive Summary describes the purpose, format and intended use of the Toolbox. Sections that address indicators for evaluation for the five domains of educational activity comprise the body of the Toolbox. Each domain is defined, and approaches to evaluation of work in the domain are described, including measuring quality, use of a scholarly approach, and scholarship. A Table of Indicators for Evaluation is formatted identically for each domain. An indicator is an observable or measurable element of performance that shows quality or impact.

• Glassick’s six criteria for excellence organize each table; they provide a useful, uniform approach to evaluation of each domain.
• Broad Indicators are likely to be useful to decision-making committees, which are tasked to make summative judgments about the performance of a faculty member.
• Detailed Indicators are provided for writers of recommendation letters and for expert consultants who are asked to comment in detail on the accomplishments highlighted in an educator’s portfolio.

The Broad and Detailed Indicators are intended to provide a comprehensive and specific list of standards for reviewing the documentation in a portfolio. While the reviewer should expect documentation that is sufficient to support an assessment of quality in a domain, not all indicators are likely to be addressed in an individual’s portfolio. Some indicators may not be relevant to the educational activities of a particular faculty member.

Key components of educator performance:
The Toolbox uses the five domains of educator activity that were re-affirmed at the AAMC Group on Educational Affairs consensus conference on educational scholarship20,21. Most faculty educators begin their careers working in one or two domains and may add others as their careers evolve. An educator’s impact increases when he/she engages with the broader community of educators by learning from and building on colleagues’ work (i.e., a scholarly approach), and/or by contributing peer reviewed, publicly disseminated products (i.e., scholarship).

Evaluating an educator’s activities:
An educator’s contributions in each domain can be judged with reference to:
• Quantity (e.g., learner numbers, effort)
• Quality (e.g., strong evaluations and learning outcomes)
• Scholarly approach (applying knowledge of the relevant education literature and best practices)
• Scholarship (producing peer reviewed, publicly disseminated products that are available for use or adaptation by others)

Scholarship is demonstrated through 3 P’s: creation of products that are Peer reviewed and made Public, thereby creating a Platform upon which others can learn and build21. Evidence that others have in fact used a scholarly product is valued, but not obligatory.


Glassick’s criteria9,10 provide a structure to assess quality, a scholarly approach and scholarship:

1. Clear goals
2. Adequate preparation
3. Appropriate methods
4. Significant results
5. Effective presentation
6. Reflective critique

Kirkpatrick’s16 Four Levels can be adapted and used to define significant results:

Level 1 (Satisfaction/reaction): Did participants and stakeholders like it?
Level 2 (Learning): Was there a change in knowledge, skills, attitudes, and/or behavior?
Level 3 (Application): Was the desired performance demonstrated in other settings?
Level 4 (Impact): What was the outcome/effect on educational programs/processes, both within and outside the institution?
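Because the four levels are ordered, the highest level of documented evidence is often the headline result. The sketch below is illustrative only (hypothetical Python, not part of the Toolbox); it tags the Glossary’s hand-hygiene example with Kirkpatrick levels:

```python
from enum import IntEnum

class Kirkpatrick(IntEnum):
    """Kirkpatrick's four levels, in increasing complexity and impact."""
    SATISFACTION = 1  # Did participants and stakeholders like it?
    LEARNING = 2      # Change in knowledge, skills, attitudes, behavior?
    APPLICATION = 3   # Desired performance demonstrated in other settings?
    IMPACT = 4        # Effect on programs/processes within/outside the institution?

# The hand-hygiene example from the Glossary, each piece of evidence
# tagged with the level it demonstrates (descriptions are invented):
evidence = {
    "Participants rated the program highly": Kirkpatrick.SATISFACTION,
    "High post-test scores on hand-hygiene knowledge and skills": Kirkpatrick.LEARNING,
    "Participants consistently practice hand hygiene on the wards": Kirkpatrick.APPLICATION,
    "Hospital infection rates fell after the program": Kirkpatrick.IMPACT,
}

# Highest level of results the documented evidence supports:
print(max(evidence.values()).name)  # -> IMPACT
```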

How to use the Toolbox:

Primary Users: Faculty and committees charged with faculty evaluation and decision-making
• Identify which of the five domains are relevant to an individual faculty member’s work.
• For each domain, use Broad Indicators from the Toolbox tables to evaluate the candidate’s performance with respect to each of Glassick’s six criteria.
• Summarize the evaluation of each domain and develop a composite summary across domains, if appropriate. Decision makers will need to apply judgments about the relative value of each domain in their evaluations (see the sketch after this list).
• When needed, ask external reviewers, educational consultants or other experts to use the Detailed Indicators to review the portfolio for quality (e.g., evaluation of current disciplinary content, educational methodologies, technological innovation).
  o Reviewers may be asked to analyze information in the portfolio in depth and select examples of the candidate’s performance that demonstrate quality in each relevant domain.
    - Appendix A provides two sample letters from reviewers in support of candidates for promotion.
    - Appendix B is a sample summary of a candidate’s portfolio compiled by an educational consultant.
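The composite-summary step above can be pictured as a weighted aggregation. The sketch below is illustrative only (hypothetical Python): the Toolbox deliberately prescribes neither ratings nor domain weights, so both are invented local values here.

```python
# Hypothetical composite summary across domains. The Toolbox does not
# prescribe weights or ratings; both must reflect local academic values.

# Per-domain ratings a committee might assign (e.g., 1 = marginal ... 5 = outstanding),
# covering only the domains relevant to this faculty member's work:
ratings = {"Teaching": 5, "Curriculum Development": 4, "Mentoring and Advising": 3}

# Locally agreed relative value of each domain for this candidate's track:
weights = {"Teaching": 0.5, "Curriculum Development": 0.3, "Mentoring and Advising": 0.2}

# Weighted average over the relevant domains:
composite = (sum(ratings[d] * weights[d] for d in ratings)
             / sum(weights[d] for d in ratings))
print(f"Composite summary: {composite:.2f} / 5")  # -> 4.30 / 5
```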



Secondary Users: Educators who will be seeking promotion (and their career mentors)
• Review your educator portfolio in parallel with the Toolbox tables. The tables tell you what “counts” most for promotion and advancement. Don’t be intimidated by the large number of indicators in the Toolbox tables: these are meant to be very comprehensive lists. No one could document everything in these tables.
• In relevant domains, highlight the specific indicators in the Toolbox that can be addressed by evidence in your portfolio and those that might be addressed if more data were available. For example, candidates seeking to strengthen their documentation might:
  o Request copies of past learner evaluations to document teaching excellence.
  o Ask an educational expert to attend and review a lecture or a problem-based learning session, and include the review in a letter of recommendation.
  o Describe a scholarly approach to an educational project using Glassick’s criteria.
• If important areas in the Toolbox tables are not documented in your portfolio, develop a plan so your portfolio will be stronger in the future (e.g., participating in professional development activities, using a scholarly approach to plan projects in your educational activities).

Across institutions, promotion committee guidelines vary. The Toolbox can guide institutions seeking to develop or apply their own evaluation metrics, based on local academic values. Whatever evaluation approach they use, decision makers must apply a rational and transparent process, based on agreement about:
• Which indicators/how many indicators define higher and lower levels of performance in each domain;
• The relative weights of the five domains;
• How many domains of educational activity are expected at each academic level in each track;
• How to evaluate faculty who contribute to more than one institutional mission (education, research, clinical care, and/or administration).

Any evaluation process needs to be applied consistently and fairly to all candidates for promotion. A key goal of the Task Force on Educator Evaluation throughout this project has been to promote fair, objective, and rigorous decision-making processes for educators.

Reflective critique:
1) The Toolbox is designed to be comprehensive. It is not appropriate to expect an educator to have a record of performance that covers all of the domains or all of the criteria in a domain. Evaluation of an educator should be focused on his/her specific activity domains.
2) The Toolbox must be applied within an institutional context to be useful. Medical schools and health science centers differ in their priorities and expectations for their faculty (e.g., the expected balance among teaching, research and service; the opportunities available to faculty for innovation in curriculum development or learner assessment). No two institutions are the same in their expectations of faculty at specific academic levels in specific career tracks. Hence the Toolbox can serve as a guide to the development of promotion processes, but it is not a prescriptive resource.


3) The Toolbox does not offer metrics for evaluation. Conversion of indicators for evaluation to a numerical rating system (if desired) must reflect local academic values. The tables of indicators are NOT intended to be used as checklists for rating faculty. The Toolbox creators have not defined relative weights for the five domains, or relative weights within each domain for Glassick’s six criteria.


DOMAIN: TEACHING27-39

Teaching is defined as any organized activity that fosters learning, together with the creation of associated instructional materials. Teaching targets learners at all levels of medical education, including faculty and practitioners. It may involve any number of learners, take forms such as lectures, workshops, small group discussions and patient-centered teaching, and occur in various settings (e.g., classroom, clinical, laboratory, and virtual environments). Development of curricula (defined as a longitudinal set of educational activities) is considered under the Curriculum Development domain of the Toolbox.

Evaluation of sustained contributions in teaching requires judgment about quantity (number, duration and scope of teaching activities). For each teaching activity, Glassick’s criteria9,10 provide a structure to assess quality (teaching has been effective, with positive reviews); scholarly approach (application of literature and best practice models); and scholarship (peer reviewed publications, presentations and products, and/or evidence of adoption by others). The Broad Indicators below are likely to be useful to decision makers tasked to make summative judgments about the performance of a faculty member. The Detailed Indicators are for writers of recommendation letters and other expert reviewers who advise decision makers; they are intended to enable greater clarity and specificity in reviews of an educator portfolio. While the reviewer should expect documentation that is sufficient to support an assessment of quality in this domain, not all indicators are likely to be addressed. Some indicators may not be relevant to the educational activities of a particular faculty member.

Clear goals

Broad Indicators (for decision makers):
Learning objectives for the teaching session(s) are:
• Stated clearly
• Specified to measure learners’ performance
• At appropriate level for targeted learners

Detailed Indicators (for expert reviewers and consultants to decision makers):
Learning objectives are:
• Based on documented needs of learners
• Specific, measurable, achievable, realistic, and timely (SMART)
• In multiple domains (e.g., knowledge, skills, attitudes, and/or behaviors)

Adequate preparation

Broad Indicators:
• Congruence with institutional/program goals and integration with other components of the curriculum
• Subject matter is presented at depth and breadth matched to learners’ needs and time available
• Use of best practices from the literature, professional development activities and personal experience
• Resource planning

Detailed Indicators:
• Material is up-to-date and evidence based
• Material is comprehensive and is logically integrated with other curricular components
• Resources needed for teaching are specified and available
• Adequate preparation for use of technology

Appropriate methods

Broad Indicators:
• Teaching methods aligned with learning objectives
• Methods are feasible, practical, ethical
• Innovative teaching methods used to achieve objectives

Detailed Indicators:
• Employs suitable range and variety of teaching strategies supported by learning theory, by best practices and/or by literature review
• Uses interactive approaches and promotes self-directed learning
• Uses methods that promote critical thinking and reasoning skills
• Provides evidence of innovation (e.g., novel strategies to promote learning)
• Teaching methods include ways to monitor learners’ progress
• If technology is used, it: aids learning of the content; is easy to navigate; is interactive (e.g., teacher to learner, peer to peer, learner to content, learner to technology)

Significant results

Broad Indicators:
• Satisfaction/reaction
• Learning: Measures of knowledge, skills, attitudes, and/or behaviors
• Application: Desired performance demonstrated in other settings
• Impact: On educational programs and processes within and/or outside the institution

Detailed Indicators:
Satisfaction/Reaction
• Rating of teaching by learners, peers or experts
• Comparison of learner ratings to ratings of other teachers (internal, external)
Learning
• Evidence of learning based on measurable changes in knowledge, skills, attitudes, behaviors
• Comparison of learner performance to established benchmarks and/or to other learners’ performance in previous years
Application
• Demonstration of skills/behaviors learned from the teaching activity in subsequent settings or curricular components
Impact
• Positive evaluation by knowledgeable peers, educational leaders, curriculum committee
• Recognition by internal/external awards or incentives

Effective presentation

Broad Indicators:
Recognized as valuable by others (internally or externally) through:
• Peer review
• Dissemination (presentations/publications) and/or
• Use by others

Detailed Indicators:
• Peer reviewed publications/presentations of teaching strategies or instructional materials
• Invitations to provide faculty development, conduct workshops, or do presentations about teaching locally, at other institutions, or in other regional, national or international venues
• Invitations to peer review other teachers locally or at other institutions
• Breadth of dissemination and adoption of teaching methods/materials: local, regional, national, international

Reflective critique

Broad Indicators:
• Reflection and results of evaluations used for ongoing improvement

Detailed Indicators:
• Critical analysis of teaching activity using all information from others and from self-assessment
• Evidence of ongoing improvement of teaching activity based on critical analysis and reflection
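To make the paired structure of these indicators concrete, the hypothetical Python sketch below (illustrative only; the field names and entries are invented) shows how a committee might record which Broad and Detailed Indicators a portfolio documents for one criterion:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class CriterionReview:
    """Evidence recorded against one of Glassick's criteria in one domain."""
    domain: str
    criterion: str
    broad_indicators_met: List[str] = field(default_factory=list)     # for decision makers
    detailed_indicators_met: List[str] = field(default_factory=list)  # for expert reviewers

review = CriterionReview(
    domain="Teaching",
    criterion="Clear goals",
    broad_indicators_met=["Objectives stated clearly",
                          "At appropriate level for targeted learners"],
    detailed_indicators_met=["Objectives are SMART"],
)

# Not every indicator must be documented; reviewers note only what the
# portfolio actually supports.
print(f"{review.domain} / {review.criterion}: "
      f"{len(review.broad_indicators_met)} broad, "
      f"{len(review.detailed_indicators_met)} detailed indicator(s) documented")
```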


DOMAIN: LEARNER ASSESSMENT40-46

Learner assessment is defined as all activities associated with measuring the knowledge, skills, attitudes and behaviors of learners so that judgments can be made about their performance. The information from assessments indicates how well the learner has achieved pre-specified expectations for performance. This information has impact on the learner and also serves important administrative purposes, such as making progress decisions about the learner.

Evaluation of sustained contributions in learner assessment requires judgment about quantity (number of assessments and breadth of the faculty member’s role and effort in the development and implementation of the assessment). For each learner assessment activity, Glassick’s criteria9,10 provide a structure to evaluate quality (assessments measure what they are supposed to measure, include sufficient relevant samples of a learner’s performance, and yield information that has impact on the learner and the institution); scholarly approach (application of literature and best practice models); and scholarship (peer reviewed publications, presentations, and products, and/or evidence of adoption by others). The Broad Indicators below are likely to be useful to decision makers tasked to make summative judgments about the performance of a faculty member. The Detailed Indicators are for writers of recommendation letters and other expert reviewers who advise decision makers; they are intended to enable greater clarity and specificity in reviews of an educator portfolio. While the reviewer should expect documentation that is sufficient to support an assessment of quality in this domain, not all indicators are likely to be addressed. Some indicators may not be relevant to the educational activities of a particular faculty member.

Clear goals

Broad Indicators:
Learner assessments:
• Are appropriate for the content and level of learning objectives/competencies
• Define expectations for learner’s performance

Detailed Indicators:
• Content of assessment matches learning objectives or competencies for the learning activity
• Expected learning outcomes specify knowledge, skills, attitudes and/or behaviors
• Assessments are matched to the expected level of learner performance/competency
• Purpose of assessment (midpoint feedback, grading, or certification) is clear to the learner

Adequate preparation

Broad Indicators:
• Congruence with institutional/program goals and integration with the institution’s system of assessment
• Use of best practices from the literature, professional development activities and personal experience
• Resource planning (facilities, faculty, schedules)

Detailed Indicators:
• Content and format of the assessment are up-to-date
• Blueprint for assessment is based on learning objectives and includes topics to be covered and performance outcomes to be assessed
• Assessments planned are consistent and/or integrated with assessments used in other courses as well as with the institutional assessment plan (if one exists)
• Training materials are sufficient to allow implementation and analysis
• Requirements and expectations for raters or graders are adequately described
• Learners are oriented to the format of assessment
• Resources needed for assessment are specified and available
• Individuals involved (learners, staff, faculty) are adequately prepared to use resources

Appropriate methods

Broad Indicators:
• Assessment format aligned with learning objectives
• Assessment process is consistent and uses accurate scoring methods
• Assessment occurs in setting suitable for demonstration of relevant learning
• Sufficient sample of the learner’s performance collected to assure accurate capture of real ability/competency
• Methods are useful, feasible, practical, ethical
• Use of innovative assessment methods to measure performance

Detailed Indicators:
• Employs suitable range and variety of assessments supported by assessment theory, by best practices and/or by literature review (e.g., cognitive tests, skills tests, attitude assessments, behavior assessments)
• Evidence for validity of assessment provided43:
  o Content
  o Administration
  o Internal structure (item analyses/reliability)
  o Statistical analysis relating performance to other variables
  o Consequences (effect on learner and/or program)
• Scoring process and methods are easily achievable and yield useful data
• Assessment guide provides information about the development, administration, scoring, interpretation, and presentation of results, and indicates how assessments have been modified over time in response to learner and educator feedback
• Assessment data provided take into account consequences such as:
  o Formative feedback to learners to guide a plan for learner improvement
  o Information for summative decisions
  o Information to inform effectiveness of instruction

Significant results

Broad Indicators:
• Satisfaction/reaction: Assessment evidence provides meaningful feedback about quality and implementation of assessment
• Learning: Measures knowledge, skills, attitudes, and/or behaviors
• Application: Desired performance demonstrated in other settings
• Impact: On progress decisions about learners and on educational programs and/or programs of assessment within and/or outside the institution

Detailed Indicators:
Satisfaction/Reaction
• Rating of assessment strategies by learners, peers or experts
• Questions/tasks are clear and meaningful
• Assessment deemed relevant, timely and constructive
• Comparison of learner ratings to ratings of other assessment methods (internal, external)
• Assessment tools considered fair by meeting accepted standards (e.g., reliability and validity)
Learning
• Measure is sensitive to changes in knowledge, skills, attitudes, behaviors
• Comparison of learner performance to established benchmarks and/or to other learners’ performance in previous years
Application
• Transfer of assessment approach to another learning experience or program
Impact
• Information from assessment strategy meets the needs of program leaders and committees
• Learners use data to inform learning plans for performance improvement
• Data are used to inform improvement of education or assessment programs
• Positive evaluation by knowledgeable peers, educational leaders, curriculum committee
• Recognition by internal/external awards or incentives

Effective presentation

Broad Indicators:
Recognized as valuable by others (internally/externally) through:
• Peer review
• Dissemination (presentations/publications) and/or
• Use by others

Detailed Indicators:
• Display of assessment results is clear to the audience and appropriate for the purpose
• Results linked to targeted objectives/competencies and matched to course, program, or institutional categories (e.g., competency domains)
• Tables, charts and graphics (e.g., dashboards) are used to clearly communicate results to the targeted audience
• Results presented with appropriate comparisons
• Peer reviewed publications/presentations about assessment or of assessment tools
• Invitations to provide faculty development, conduct workshops, or do presentations about assessment locally, at other institutions, or in other regional, national, or international venues
• Breadth of dissemination and adoption of assessment methods/materials and/or guides: local, regional, national, international

Reflective critique

Broad Indicators:
• Reflection and results used for ongoing improvement of the assessment itself and/or the program of assessment

Detailed Indicators:
• Critical analysis of assessment tool/method using information from others and from self-assessment
• Evidence of ongoing improvement of assessment based on critical analysis and reflection
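As one concrete instance of the “internal structure (item analyses/reliability)” evidence listed above, the sketch below (illustrative only; the scores are invented) computes Cronbach’s alpha, a common reliability coefficient, in plain Python:

```python
# Cronbach's alpha: one common reliability statistic used as "internal
# structure" validity evidence. Scores are hypothetical:
# rows = learners, columns = items (1 = correct, 0 = incorrect).
scores = [
    [1, 1, 1, 0],
    [1, 0, 1, 1],
    [0, 0, 1, 0],
    [1, 1, 1, 1],
    [0, 1, 0, 0],
]

def variance(xs):
    # Sample variance (divides by n - 1).
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

k = len(scores[0])                                  # number of items
item_vars = [variance([row[i] for row in scores]) for i in range(k)]
total_var = variance([sum(row) for row in scores])  # variance of total scores

alpha = (k / (k - 1)) * (1 - sum(item_vars) / total_var)
print(f"Cronbach's alpha = {alpha:.2f}")  # -> 0.52 for these invented data
```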


DOMAIN: CURRICULUM DEVELOPMENT47-51

Curriculum is defined as a longitudinal set of systematically designed, sequenced and evaluated educational activities. A curriculum can target learners at any level, from undergraduate through continuing professional development, and may be delivered in many formats.

Evaluation of sustained contributions in curriculum development requires judgment about quantity (number, duration and scope of each curriculum; breadth of the faculty member’s role and effort). For each curriculum development activity, Glassick’s criteria9,10 provide a structure to assess quality (the curriculum has demonstrated effectiveness with positive reviews); scholarly approach (application of literature and best practice models); and scholarship (peer reviewed publications, presentations and products, and/or evidence of adoption by others). The Broad Indicators below are likely to be useful to decision makers tasked to make summative judgments about the performance of a faculty member. The Detailed Indicators are for writers of recommendation letters and other expert reviewers who advise decision makers; they are intended to enable greater clarity and specificity in reviews of an educator portfolio. While the reviewer should expect documentation that is sufficient to support an assessment of quality in this domain, not all indicators are likely to be addressed. Some indicators may not be relevant to the educational activities of a particular faculty member.

Clear goals

Broad Indicators:
Learning objectives for the curriculum are:
• Stated clearly
• Specified to measure learners’ performance
• At appropriate level for targeted learners

Detailed Indicators:
Learning objectives are:
• Based on documented needs of learners
• Specific, measurable, achievable, realistic, and timely (SMART)
• In multiple learning domains (e.g., knowledge, skills, attitudes and/or behaviors)

Adequate preparation

Broad Indicators:
• Needs assessment done, if required
• Congruence with institutional/program goals and integration with other components of the curriculum
• Use of best practices and approaches from the literature, professional development activities and personal experience
• Systematic approach to identifying and acquiring resources needed to implement the curriculum

Detailed Indicators:
• Rationale for curriculum development is supported by an identified gap, problem, and/or opportunity to improve
• Curriculum targeted to the specific needs of learners
• Curriculum design and evaluation are based on accepted frameworks49,51
• Material is presented at depth and breadth matched to learners’ needs
• Material is up-to-date and evidence based
• Review of the literature and available resources influences curriculum development and evaluation
• Tools/guidelines that accompany the curriculum provide sufficient detail for other individuals or institutions to implement it
• Resources needed for curriculum implementation are specified and available
• Time allocation for the curriculum is appropriate
• Stakeholder buy-in is obtained
• Adequate preparation for use of technology

Appropriate methods

Broad Indicators:
• Teaching, learner assessment, and curriculum evaluation methods are aligned with curriculum objectives
• Methods are feasible, practical, ethical
• Innovative teaching and assessment methods are used and aligned with objectives

Detailed Indicators:
Instructional methods
• Employs suitable range and variety of teaching strategies supported by learning theory, by best practices and/or by literature review
• Uses interactive approaches and promotes self-directed learning
• Uses methods that promote critical thinking and reasoning skills
• Provides evidence of innovation (e.g., novel strategies to promote learning and critical reasoning skills)
• Teaching methods include ways to monitor learners’ progress
• If technology is used, it: aids learning of the content; is easy to navigate; is interactive (e.g., teacher to learner, peer to peer, learner to content, learner to technology)
Curriculum evaluation
• Is linked to learning objectives for the curriculum
• Employs multiple data sources
• Incorporates assessment of instructional methods, teachers and learning
Learner assessment
• Learner can obtain feedback about performance (formative and summative)
• Assessment of knowledge, skills, attitudes, and behaviors (as appropriate)

Significant results

Broad Indicators:
• Satisfaction/reaction
• Learning: Measures of knowledge, skills, attitudes, and/or behaviors
• Application: Desired performance demonstrated in other settings
• Impact: On education programs and processes within and/or outside the institution

Detailed Indicators:
Satisfaction/Reaction
• Rating of the curriculum by learners, by faculty who teach components of the curriculum, and by peers or experts
• Comparison of learner ratings to ratings of other curricular components (internal, external)
Learning
• Evidence of learning based on measurable changes in knowledge, skills, attitudes, behaviors
• Comparison of learner performance to established benchmarks and/or to other learners’ performance in previous years
Application
• Demonstration of learned skills/behaviors from the curriculum in other settings or curricular components
Impact
• Positive evaluation by knowledgeable peers, educational leaders, curriculum committees
• Curriculum highly rated in accreditation review
• Recognition by internal/external awards or incentives

Effective presentation

Broad Indicators:
Recognized as valuable by others (internally or externally) through:
• Peer review
• Dissemination (presentations/publications) and/or
• Use by others

Detailed Indicators:
• Peer reviewed publications/presentations of the curriculum
• Invitations to provide faculty development, conduct workshops or do presentations to help others with curriculum development locally, at other institutions, or in other regional, national or international venues
• Invitations to peer review curricula in other educational programs within or outside the institution
• Breadth of dissemination and adoption of the curriculum’s teaching methods/materials, assessment methods/tools, evaluation methodology, and/or guides and processes: local, regional, national, international

Reflective critique

Broad Indicators:
• Reflection and evaluation results used for ongoing improvement

Detailed Indicators:
• Critical analysis of the curriculum using all information from others and from self-assessment
• Evidence of ongoing improvement of the curriculum based on critical analysis and reflection


DOMAIN: MENTORING AND ADVISING52-60 Mentoring is a process in which an experienced professional gives a person with relatively less experience guidance, teaching and development to achieve broad professional goals. Advising differs from mentoring in that it is specific to a circumscribed goal. Ideally, mentoring and advising relationships are active and reciprocal, providing the mentee/advisee with developmentally and contextually appropriate guidance and the mentor/advisor with personal and professional satisfaction. Evaluation of sustained contributions in mentoring and advising requires judgment about quantity (number, duration and scope of relationships, breadth of the faculty member’s effort). For each mentoring/advising activity, Glassick’s9,10 criteria provide a structure to assess quality (effectiveness of mentor/advisor and demonstrated effectiveness with positive reviews and positive outcomes emerging from relationship); scholarly approach (application of literature and best practice models); and scholarship (peer reviewed publications, presentations and products and/or evidence of adoption by others). The Broad Indicators listed in column 2 below, are likely to be useful to decision makers tasked to make summative judgments about the performance of a faculty member. Column 3 provides Detailed Indicators for use by writers of recommendation letters and other expert reviewers who advise decision makers. The Detailed Indicators are intended to enable greater clarity and specificity in reviews of an educator portfolio. While the reviewer should expect documentation that is sufficient to support an assessment of quality in this domain, not all indicators are likely to be addressed. Some indicators may not be relevant to the educational activities of a particular faculty member.

Glassick’s Criteria

Broad Indicators (for decision makers)

Detailed Indicators (for expert reviewers and consultants to decision makers)

Clear goals • Clear and contextually appropriate vision for mentee’s/advisee’s career

• Mutually agreed-upon goals for the relationship

• Evolution of goals over time

• Vision is aligned with SMART (specific, measurable, achievable, realistic, and timely) goals for the mentee/advisee

• Focus on mentee’s/advisee’s goals • Reframing of mentee’s/advisee’s goals over time aligned

with vision, progress, ongoing needs and context Adequate preparation

• Knowledge of: o Stages of mentee’s/advisee’s

career trajectory o Milestones required for

mentee’s/advisee’s professional advancement

• Identification of mentee’s/advisee’s priorities, strengths, and needs

• Delineation of tasks, strategies and necessary resources for mentee/advisee to achieve each milestone and to advance professionally

Page 25: Toolbox for Evaluating Educators: Resources to Facilitate

25

Glassick’s Criteria

Broad Indicators (for decision makers)

Detailed Indicators (for expert reviewers and consultants to decision makers)

o Available and needed resources to meet vision and associated goals

• Use of best practices from the literature, professional development activities and personal experience

Appropriate methods

• Methods aligned with mentee’s/advisee’s needs and goals

• Methods aligned with goals for relationship

• Methods are ethical and evolve as mentee/advisee advances professionally

• Innovative methods used to achieve goals for relationship and to assist mentee/advisee in meeting goals

• Uses collaboratively established goals, timelines and milestones to monitor progress

• Facilitates goal achievement through process of support, challenge and vision56 and ongoing constructive feedback

• Provides stage-appropriate support: initiation, building, sustaining and disengagement

• Utilizes periodic “check-ins” to assess effectiveness of relationship and success in meeting goals

Significant results

• Satisfaction/reaction • Learning: Measures knowledge,

skills, attitudes and/or behaviors of mentee/advisee

• Application: Relationship with mentor/advisor contributes to accomplishments and evolving professional identity of mentee/advisee

• Impact: Accomplishments of mentee/advisee have impact within and/or outside the

Satisfaction/Reaction • Rating by mentee/advisee, peers or experts of overall

effectiveness of mentor/advisor (ability to provide support and guidance towards meeting goals/vision)

• Comparison of ratings to ratings of other mentors/advisors (internal, external)

• Sustained productive relationship between mentor/advisor and mentee/advisee

Learning • Advisees/mentees attribute to relationship with

mentor/advisor their acquisition of education-related knowledge and skills, attitudes and professional academic

Page 26: Toolbox for Evaluating Educators: Resources to Facilitate

26

institution skills and behaviors associated with career success52 Application • Accomplishment of mentee’s/advisee’s professional goals • Achievement of professional advancement relative to

institutional/discipline norms Impact: • Mentee/advisee engages in service to the

institution/professional organization • Mentee/advisee or mentor/advisor bring visibility and

recognition to institution through outcomes of mentoring relationship

• Mentee/advisee and/or mentor/advisor receive funding that support mentoring activities/programs

• Mentee/advisee contributes as a mentor/advisor to others (internal and/or external)

Recognition of mentor/mentee by internal/external rewards or incentives

Effective presentation

Recognized as valuable by others (internally or externally) through: • Peer review • Dissemination

(Presentations/publications) and/or

• Use by others

• Peer-reviewed publications or presentations of mentoring/advising models, methods and/or materials

• Invitation to provide faculty development, conduct workshops or do presentations to help others with mentoring locally, at other institutions, or in other regional, national, or international venues

• Breadth of dissemination and adoption of models, methods, materials, guides or policies: local, regional, national, international

Reflective critique

Broad Indicators:
• Reflection and results of evaluations used for ongoing improvement

Detailed Indicators:
• Critical analysis of mentoring/advising activities and relationship, using all information from others and from self-assessment
• Evidence of ongoing improvement of mentoring/advising based on critical analysis and reflection


DOMAIN: EDUCATIONAL LEADERSHIP AND ADMINISTRATION61-66

Educational leaders achieve transformative results by leading others to advance educational programs, initiatives, and/or groups. Examples include leaders of education committees, clerkships and courses, training and professional development programs, and decanal positions. Leaders in medical education must be evaluated for leadership and administrative skills in addition to program outcomes. Evaluation of sustained contributions in educational leadership and administration requires judgment about quantity (number, duration and scope of leadership roles). For each leadership position, Glassick’s criteria9,10 provide a structure to assess quality (leader and program have demonstrated effectiveness with positive reviews); scholarly approach (application of literature and best practice models); and scholarship (peer-reviewed publications, presentations, and products and/or evidence of adoption by others).

The Broad Indicators listed below are likely to be useful to decision makers tasked with making summative judgments about the performance of a faculty member. The Detailed Indicators are for use by writers of recommendation letters and by expert reviewers who advise decision makers; they are intended to enable greater clarity and specificity in reviews of an educator portfolio. While the reviewer should expect documentation sufficient to support an assessment of quality in this domain, not all indicators are likely to be addressed, and some may not be relevant to the educational activities of a particular faculty member.

Clear goals

Broad Indicators (for decision makers):
• Articulated vision
• Goal setting aligned with vision
• Goals congruent with institutional goals

Detailed Indicators (for expert reviewers and consultants to decision makers):
• Clear communication of purpose for the program/initiative/group
• Anticipated outcomes or products defined at the start of an initiative

Adequate preparation

Broad Indicators:
• Development of timeline with milestones and deliverables
• Selection and development of team
• Motivating stakeholders to collaborate in realizing the vision

Detailed Indicators:
• Use of best practices and approaches from the literature, professional development activities and personal experience
• Ongoing leadership skills development to enhance effectiveness
• Cultivation of a flexible and “ready-for-change” culture
• Support and resources obtained for educational programs:
  o Stakeholder endorsement and faculty buy-in regarding vision and goals
  o Financial (e.g., salary lines, supplies/expenses, grants)
  o Logistical (e.g., time, facilities)
• Systematic approach to identifying and acquiring resources needed to implement projects

Appropriate methods

Broad Indicators:
• Development and management of resources and processes
• Methods that are feasible, practical, and ethical
• Creative and innovative solutions used to achieve goals
• Evaluation aligned with goals

Detailed Indicators:
• Attention to all leadership frames61
Structural
  o Clear, efficient processes for program implementation and evaluation
  o Allocation of resources to enhance sustainability
Human resource
  o Communications are informative, inspire trust, and motivate others
  o Skillful delegation and empowerment of others
  o Fosters innovative ideas/solutions from team members
  o Facilitates social and professional linkages between individuals
  o Mentoring and professional development of team members
Political
  o Skillful negotiation with all stakeholders
  o Productive coalition building
  o Effective conflict resolution; perspective maintained in times of difficulty
  o Successes of team members reported to higher levels of leadership
  o Public advocacy for program/initiative
  o Actions in accordance with ethics, values, integrity and self-awareness
Symbolic
  o Development of a collective sense of identity and commitment to the program/initiative
  o Communication of the program’s purpose and successes in public venues
  o Praise/rewards for excellent performance in public venues

Significant results

Broad Indicators:
• Satisfaction/reaction
• Impact: on participants/stakeholders and on educational programs and initiatives within and/or outside the institution

Detailed Indicators:
Satisfaction/Reaction
• High level of engagement of participants
• High ratings of leader and program/initiative by participants and stakeholders
Impact
• Progress towards achievement of program’s/initiative’s goals
• Improved performance outcomes of team members
• Improved outcomes for learners and/or institution
• Sustainability of programs/initiatives (and accreditation, if applicable)
• Recognition by internal/external awards or incentives


Effective presentation

Broad Indicators:
Recognized as valuable by others (internally/externally) through:
• Peer review
• Dissemination (presentations/publications) and/or
• Use by others

Detailed Indicators:
• Display of program/initiative results is clear, understandable to the audience, and appropriate for the purpose
• Peer-reviewed publications/presentations related to the programs/initiatives and/or leadership
• Invitations to provide faculty development about leadership locally, at other institutions, or in other regional, national or international venues
• Invitations to conduct workshops or give presentations to help others improve their programs/initiatives
• Invitations to consult with or peer review other programs locally or at other institutions or organizations
• Breadth of dissemination and adoption of program models and methods: local, regional, national, international

Reflective critique

Broad Indicators:
• Reflection and results used for ongoing improvement of self, participants, and programs/initiatives

Detailed Indicators:
• Critical analysis of leader and programs/initiatives using information from others and from self-assessment
• Evidence of ongoing improvement of leadership and/or programs/initiatives based on critical analysis and reflection


Appendix A: Two Sample Promotion Letters Written by External Reviewers

SAMPLE LETTER #1
RE: Clinical Educator, MD (a.k.a. Dr. CE) -- Promotion to Professor in the Clinician Educator Path

Dear Dr. XX, Chair, Committee on Rank and Tenure, MM Medical School:

I fully support Dr. CE’s candidacy for promotion to the rank of Professor in the Clinician Educator track at MM Medical School. I first met Dr. CE in the mid-1990s when she began redesigning the Out Patient Clinic Conferences for Internal Medicine residents. Our consultations on education have continued, focusing on Dr. CE’s curriculum innovations and scholarship in chronic disease management and, most recently, on translating her findings into iPod Touch applications for use as point-of-care teaching tools for diabetes management. Because my own experience and expertise relate to Dr. CE’s education-related work, I will focus my comments on these roles.

Educators typically excel in teaching a particular level of trainee and in one or two areas: teaching, curriculum, advising/mentorship, learner assessment, and/or educational leadership. I offer comments on several of these areas to illustrate the breadth of Dr. CE’s accomplishments as an educator and to support my recommendation for her promotion to professor.

• EXCELLENCE IN TEACHING: Often faculty can demonstrate excellence in teaching one level of trainee (e.g., medical student, fellow) or in a limited number of teaching methods. Dr. CE has emerged as a strong medical student teacher, now rating at average in a highly rated department. She also has a sustained record of excellence in resident education, confirmed by multiple teaching awards and consistently high teaching ratings. This record of excellence is outlined in her CV through awards and via ratings in her Educator’s Portfolio.

• CURRICULUM DEVELOPMENT & EDUCATIONAL INNOVATION: This is truly an area of excellence, as Dr. CE is among the earliest and most effective users of e-based instructional delivery platforms: from our open-source Learning Management System to iTunes U podcasts and now her newest chronic care application for mobile devices (e.g., iPad), based on her laminated diabetes pocket cards. Evidence of learning using these tools is strong in both our medical student and resident curricula. M4 students’ pre/post assessments in diabetes management show significant gains in their ability to manage diabetes care. The Medicine PGY1 & 3 curricula on chronic disease management and the Ambulatory Clinical QI curricula have resulted in significant improvements in residents’ knowledge and skills.

• EDUCATIONAL SCHOLARSHIP: Dr. CE has shared her education-related work in numerous peer-reviewed venues, including the Society of General Internal Medicine, the Academy of Healthcare Improvement (AHI), the International Forum on Quality & Safety in Health Care, the American College of Physicians, the ACCC Congress, the Association of American Medical Colleges, and the State Hospital Association. In addition to the online publications of her work associated with these presentations, her other publication venues reflect the convergence of her work in education and medicine/health care improvement, including the Journal of General Internal Medicine and the State Medical Journal. With multiple invited and peer-reviewed presentations and lectureships, she also demonstrates a continuous record of dissemination at regional and national levels.

As I have often written in my promotion support letters, a candidate for academic promotion to senior faculty rank must also demonstrate “value added.” Dr. CE’s continuous record of excellence in education as a teacher, curriculum designer, and innovator in using emerging e-technology for instruction is outstanding. She brings evidence-based rigor to the design, delivery, and evaluation of strategies to address challenges that bridge clinical care and medical education. As an active citizen throughout her career, Dr. CE has served on departmental, school-wide, and hospital committees aligned with her focus on chronic care, including her current service on the medical school’s Transition Care Team. These value-added contributions exemplify what we seek in our professors, and thus Dr. CE has my strongest support for promotion to professor. If I can provide any additional information and/or answer any questions, please do not hesitate to contact me.

Sincerely,
Xxxxxx


SAMPLE LETTER #2
RE: Basic Science Educator, PhD (a.k.a. Dr. BSE) -- Promotion to Associate Professor (non-tenure track)

Dear Promotions Committee members:

I am delighted to support Basic Science Educator, PhD, most enthusiastically as a candidate for promotion from assistant professor to associate professor of xxxxxx (non-tenure track). Dr. BSE is an outstanding educator in the School of Medicine. He is also emerging as an effective educational researcher and is earning an enviable national reputation as an educational scholar. He unquestionably meets (or exceeds) the criteria for promotion to associate professor. Recently, he was selected through a rigorous peer-reviewed process as one of the ten inaugural members of the School of Medicine’s Academy of Medical Education Scholars.

Dr. BSE earned his PhD in 20xx from the Department of xxxxx at XXXX. His interest in and talent for teaching emerged early and continue to grow. His contributions as a teacher and educational scholar have been remarkable.

Teaching and Educational Administration

School of Medicine: Dr. BSE is a key educator in the department, school, and institution. He has earned the respect of his colleagues as well as students and has taught in four of our five health professions schools as well as in the summer educational enrichment program.

Dr. BSE has taught first-year medical students in the xxxxxx course since 2000; he was named director in 2007. He prepares lectures, handout materials, and examination questions, and he spends untold hours teaching students in the xxxxxx lab and tutoring them one-on-one. His handouts are exceptional and reflect extensive knowledge and use of web-based materials. He is readily available to students and is often sought out for clarification, counseling, or advice. Based on my personal observation, review of student evaluations, and informal discussions with peers, he is a cornerstone of the departmental teaching programs. His evaluations are truly outstanding.

While there is an historical tendency for faculty to teach (and to be recognized for teaching) primarily within their school of primary appointment, Dr. BSE has transcended departmental and school barriers and contributes significantly to teaching in the Schools of Dentistry, Allied Health Sciences, and Graduate Studies as well as in the summer educational enrichment program.

In recognition of his outstanding teaching, he has received many teaching awards, including Excellence in Teaching (Course xxxx) in 2002, 2003, and 2005. He was selected as advisor by the Classes of 2007 and 2011. He has also been recognized by his peers with the Faculty Senate Distinguished Faculty Award for Basic Science Teaching.


Educational Research and Scholarship

Dr. BSE’s professional passion rests with teaching and all that is entailed in becoming a master teacher and educational scholar. He consistently analyzes his own performance and that of colleagues in an effort to learn from his and others’ experience. He reflects on students’ evaluations, makes changes as appropriate based on them, and seeks and responds to feedback from peers. In addition, Dr. BSE has participated in nearly every teaching skills workshop offered by our educator skills development program as he continuously strives to improve his skills as a teacher and educator.

He is a true educational scholar. He has produced a well-received WebCT course. He recognized the need for a compendium text to accompany the didactic and laboratory components of the Allied Health Sciences xxxxx courses, and he wrote a comprehensive compendium for the course. Dr. BSE is actively involved in the community of teachers of xxxx and other educators nationally, as indicated by his participation in the recent meeting of his discipline’s premier society and his peer-reviewed presentations at meetings of the International Association of Medical Science Educators (IAMSE) and the Association of American Medical Colleges’ Group on Educational Affairs. He develops instructional materials that are suitable for peer review by national repositories, such as MedEdPORTAL, and subsequent public dissemination.

Dr. BSE is earning a national reputation as an educational scholar, as indicated by 11 peer-reviewed presentations at regional, national, or international medical education meetings. He has published in the peer-reviewed medical education literature and has written two book chapters. His educator’s portfolio is exemplary: in it, Dr. BSE documents his educational contributions, evidence of their quality and impact, and related educational scholarship.

Dr. BSE fills a critical niche for the Department of xxxxx, the School of Medicine, and the college as a whole. His commitment and enthusiasm are palpable and infectious. He has earned the respect of students and colleagues, and he selflessly devotes himself to continuous improvement of our educational programs by listening to feedback from peers and learners and by assessing outcomes. He is an outstanding educator and has earned my most enthusiastic and unqualified support as an exceptional candidate for promotion to associate professor.

Sincerely yours,
XXXXX


Appendix B: Educational Consultant’s Summary of a Portfolio: Promotion to Professor in the Clinician Educator Track

Review of Educational Activities and Products for Dr. XXXX

For each domain of educational activity, evidence is summarized under Quantity, Quality, and Impact.

TEACHING

Quantity:
• Extensive teaching activity at the levels of students, residents, fellows, and faculty
• Lectures on all 5 domains of education
• Lectures and modules for national learners
• Online modules for specialty society
COMMENT: Impressive breadth and quantity of contributions

Quality:
• Many honors and awards for teaching
• Leading role in development of enduring materials for national specialty organization
• Invited to teach at prestigious national faculty development program
COMMENT: Strong evidence of high quality

Impact:
• Development of many online teaching modules with dissemination to national audiences of physicians and educators
• Online textbook for clinical educators: >4500 page hits in 2 years, 1634 downloads in 2009
OVERALL: Evidence of high impact in a diversity of venues: institutional, regional, and national

LEARNER ASSESSMENT

Quantity:
• Question writer for specialty board and program for renewal of certification in specialty

Quality:
• Repeated invitations to serve on national exam writing committees

OVERALL: Evidence of national reputation and impact

CURRICULUM DEVELOPMENT (a special area of strength for this educator)

Quantity:
• Curriculum for specialty fellowship program
• Curriculum for specialty residency program
• Faculty development and CME at XX Medical School and XX Medical Center
• Online specialty curriculum resource
• Curriculum committee memberships for students, residents, and fellows
COMMENT: Huge effort in this domain

Quality:
• Won numerous grants to develop courses and curricula
• Invitations to develop the online specialty curriculum and the specialty’s national faculty development program for educators
COMMENT: Very high quality in curriculum development

OVERALL: Evidence of high quality and impact, institutionally and nationally

MENTORING AND ADVISING

Quantity:
• Mentoring of 7 residents
• Mentoring of 15 fellows as program director
• National mentoring through specialty’s national faculty development program for educators
COMMENT: Quantity of mentees is high for candidate’s age

Quality:
• All but 2 of 15 mentored fellows are now in academic positions
• Numerous fellows have received awards and grants
• Has included fellows in publications
COMMENT: Strong evidence of quality as a mentor

OVERALL: Strong mentoring impact through fellowship program

EDUCATIONAL LEADERSHIP AND ADMINISTRATION (the strongest domain for this candidate)

Quantity:
Institutional:
• Fellowship director
• Associate residency program director
• Co-director, master teacher fellowship
• Co-director, medical school’s educational scholars fellowship program
• Director, Managed Care, Quality and Outcomes Course (residents)
• Developed a new Division of Medical Education in department
• Extensive committee work as chair and as member
National:
• Specialty’s national faculty development program for educators:
  o Executive Committee
  o Cohort Leader
  o Curriculum and Evaluation Committee
  o Research project leader
• Invited member, specialty’s fellowship accreditation committee
• Specialty’s health literacy project advisory group and others
COMMENT: Many contributions locally and nationally as an educational leader

Quality:
Institutional:
• Fellowship director: full accreditation of program with dissemination of program models
• Clear record of growth and promotion to higher levels of responsibility at XX Medical School
• Responsible for development of numerous new programs for educators at XX Medical School
National:
• Highly effective as a national educational leader in 2 specialty-specific organizations and in a regional educational group of the AAMC
• Published products from work on a curriculum committee for specialty’s national faculty development program for educators
COMMENT: Exceptional quality

OVERALL: Outstanding impact in many programs, locally and nationally

ACROSS DOMAINS: SCHOLARLY APPROACH

Quantity:
See domains above. Also:
• Reviewing for journals
• Editorial board of specialty’s online journal
• Reviewing for professional academic organizations
• Extensive engagement in personal professional development

Quality:
• Use of best practice models and peer-reviewed literature in teaching and curriculum development
• Impressive innovation in curriculum development and program development
• Commitment to lifelong learning/personal QI: MPH and MEd degrees; high-quality professional development activities sustained over 15 years
• Conducts both quantitative and qualitative studies and publishes in varied journals
COMMENT: Strong evidence of exceptional quality

Impact:
• Sustained commitment to innovation at all levels of home institution
• Extensive contributions to the greater educational community via national professional organizations
OVERALL: Very impressive contributions to the education community

SCHOLARSHIP (PR = peer reviewed)

Quantity:
• 9 PR articles (4 first-authored)
• 4 non-PR articles
• 1 online textbook for clinical educators
• 12 book chapters
• 51 PR national presentations
• 80 regional/local presentations
• 17 grants (mostly small and local) to support educational development
COMMENT: Extensive dissemination through presentations

Quality:
• Excellent reviews of online textbook for educators
• Grant support: excellent record of success in obtaining support for educational activities
COMMENT: Quality demonstrated through acceptance of many PR presentations and grant funding

OVERALL: Dissemination strong in presentations and online enduring materials


Reference List

General References on Educational Scholarship
1. Baldwin, C, Chandran, L, & Gusic, M. Educator evaluation guidelines. MedEdPORTAL; 2012. Available from: www.mededportal.org/publication/9072.
2. Baldwin, C, Chandran, L, & Gusic, M. Guidelines for evaluating the educational performance of medical school faculty: priming a national conversation. Teach Learn Med. 2011; 23(3):285-97.
3. Beasley, BW, Wright, SM, Cofrancesco, J Jr, Babbott, SF, Thomas, PA, & Bass, EB. Promotion criteria for clinician educators in the United States and Canada: a survey of promotion committee chairpersons. JAMA. 1997; 278(9):723-8.
4. Boyer, EL. (1990). Scholarship reconsidered: priorities of the professoriate. San Francisco, CA: Jossey-Bass.
5. Chandran, L, Gusic, M, Baldwin, C, Turner, T, Zenni, E, Lane, J, et al. APA Educator Portfolio Analysis Tool. Approved by MedEdPORTAL; 2009. Available from: https://www.mededportal.org/publication/1659. Accessed 11-03-2011.
6. Chandran, L, Gusic, M, Baldwin, CD, Turner, T, Zenni, E, Lane, L,…Gruppen, LD. Evaluating the performance of medical educators: a novel analysis tool to demonstrate the quality and impact of educational activities. Acad Med. 2009; 84:58-66.
7. Fincher, RE, Simpson, DE, Mennin, SP, Rosenfeld, GC, Rothman, A, McGrew, MC,…Turnbull, JM. Scholarship in teaching: an imperative for the 21st century. Acad Med. 2000; 75:887-894.
8. Fitzgerald, HE, Burack, C, & Seifer, SD. (2010). Handbook of engaged scholarship, volumes 1 and 2. East Lansing, MI: Michigan State University Press.
9. Glassick, CE, Huber, MT, & Maeroff, GI. (1997). Scholarship assessed: evaluation of the professoriate. San Francisco, CA: Jossey-Bass.
10. Glassick, CE. Boyer’s expanded definition of scholarship, the standards for assessing scholarship and the elusiveness of the scholarship of teaching. Acad Med. 2000; 75:877-880.
11. Gruppen, LD, Simpson, D, & Searle, NS. Educational fellowship programs: common themes and overarching issues. Acad Med. 2006; 81:990-994.
12. Gusic, ME, Chandran, L, Balmer, DF, D’Alessandro, DM, & Baldwin, CD. Educator portfolio template of the pediatric academic societies’ educational scholars program. Approved by MedEdPORTAL; 2007. Available from: https://www.mededportal.org/publication/626. Accessed 11-03-2011.
13. Hafler, JP, Morzinski, JA, Blanco, MA, & Fincher, RE. (2012). Educational scholarship. In: Morgenstern, BZ et al. (Eds.), The guidebook for clerkship directors (4th Edition). Syracuse, NY: Gegensatz Press.
14. Hutchings, P, Huber, MT, & Ciccone, A. (2011). The scholarship of teaching and learning reconsidered: institutional integration and impact. San Francisco, CA: Jossey-Bass.
15. Hutchings, P, & Shulman, LS. The scholarship of teaching: new elaborations and developments. Change. 1999; 31(5):10-15.
16. Kirkpatrick, DL, & Kirkpatrick, JD. (2006). Evaluating training programs: the four levels (3rd Ed.). San Francisco, CA: Berrett-Koehler.
17. McLean, M, Cilliers, F, & Van Wyk, JM. Faculty development: yesterday, today and tomorrow. Med Teach. 2008; 30:555-584.
18. Shulman, LS. Signature pedagogies in the professions. Daedalus. 2005; 134(3):52-59.
19. Shulman, LS. Teaching as community property: putting an end to pedagogical solitude. Change. 1993; 25(6):6-7.
20. Simpson, D, Fincher, RM, Hafler, JP, Irby, DM, Richards, BF, Rosenfeld, GC,…Viggiano, TR. Advancing educators and education by defining the components and evidence associated with educational scholarship. Med Educ. 2007; 41:1002-1009.
21. Simpson, D, Fincher, RM, & Hafler, JP. Advancing educators and education: defining the components and evidence of educational scholarship. Available at: http://tinyurl.com/7evolwy. Accessed December 22, 2011.
22. Simpson, D, Hafler, J, Brown, D, & Wilkerson, L. Documentation systems for educators seeking academic promotion in US medical schools. Acad Med. 2004; 79:783-790.
23. Steinert, Y, Mann, K, Centeno, A, Dolmans, D, Spencer, J, Gelula, M,…Prideaux, D. A systematic review of faculty development initiatives designed to improve teaching effectiveness in medical education: BEME Guide No. 8. Med Teach. 2006; 28:497-526.
24. Thomas, PA, Diener-West, M, Canto, MI, Martin, DR, Post, WS, & Streiff, MB. Results of an academic promotion and career path survey of faculty at the Johns Hopkins University School of Medicine. Acad Med. 2004; 79:258-264.
25. Viggiano, TR, Shub, C, & Giere, RW. The Mayo Clinic’s clinician educator award: a program to encourage educational innovation and scholarship. Acad Med. 2000; 75(9):940-943.
26. Whitcomb, M. The medical school’s faculty is its most important asset. Acad Med. 2003; 78:117-118.

Teaching
27. Alguire, PC, DeWitt, DE, Pinsky, LE, & Ferenchick, GS. (2010). Teaching in your office (2nd Edition), in the Teaching Medicine series. Philadelphia, PA: ACP Press.
28. Ende, J. (2010). Theory and practice of teaching medicine. Philadelphia, PA: ACP Press.
29. Geraci, S, Buranosky, R, Babbott, SF, & Hollander, H. AAIM report on master teachers and clinician educators, Part 5: academic documentation and tenure. Am J Med. 2010; 123:1151-1154.
30. Geraci, SA, Babbott, SF, Hollander, H, Buranosky, R, Devine, DR, Kovach, RA,…Berkowitz, L. AAIM report on master teachers and clinician educators, Part 1: needs and skills. Am J Med. 2010; 123:769-773.
31. Geraci, SA, Devine, DR, Babbott, SF, Hollander, H, Buranosky, R, & Kovach, RA. AAIM report on master teachers and clinician educators, Part 3: finances and resourcing. Am J Med. 2010; 123:963-967.
32. Geraci, SA, Hollander, H, Babbott, SF, Buranosky, R, Devine, DR, & Kovach, RA. AAIM report on master teachers and clinician educators, Part 4: faculty role and scholarship. Am J Med. 2010; 123:1065-1069.
33. Geraci, SA, Kovach, RA, Babbott, SF, Hollander, H, Buranosky, R, Devine, DR,…Berkowitz, L. AAIM report on master teachers and clinician educators, Part 2: faculty development and training. Am J Med. 2010; 123:869-872.
34. Jeffries, WB, & Huggett, KN (Eds.). (2010). An introduction to medical teaching. Dordrecht: Springer.
35. Mastascusa, EJ, Snyder, WJ, & Hoyt, BS. (2011). Effective instruction for STEM disciplines: from learning theory to college teaching. San Francisco, CA: Jossey-Bass.
36. Pangaro, L. (2010). Leadership careers in medical education, in the Teaching Medicine series. Philadelphia, PA: ACP Press.
37. Skeff, KM, & Stratos, GA. (2010). Methods of teaching medicine. Philadelphia, PA: ACP Press.
38. Turner, TL, Palazzi, DL, & Ward, MA. The clinician-educator’s handbook. Available from: http://www.bumc.bu.edu/facdev-medicine/files/2010/05/ClinicianEducators.pdf. Accessed 2-10-12.
39. Wiese, J. (2010). Teaching in the hospital. Philadelphia, PA: ACP Press.

Learner Assessment
40. Cook, DA, & Beckman, TJ. Current concepts in validity and reliability for psychometric instruments: theory and application. Am J Med. 2006; 119:166.e7-166.e16.
41. Downing, SM, & Haladyna, TM. Validity threats: overcoming interference with proposed interpretations of assessment data. Med Educ. 2004; 38:327-333.
42. Downing, SM. Reliability: on the reproducibility of assessment data. Med Educ. 2004; 38:1006-1012.
43. Downing, SM. Validity: on the meaningful interpretation of assessment data. Med Educ. 2003; 37:830-837.
44. Epstein, R. Assessment in medical education. N Engl J Med. 2007; 356:387-396.
45. Holmboe, ES, & Hawkins, RE. (2008). Practical guide to the evaluation of clinical competence. Philadelphia, PA: Mosby/Elsevier.
46. American Psychological Association. (1999). Standards for educational and psychological testing. Washington, DC: American Psychological Association.

Curriculum Development
47. Calahane, M, & Lotfipour, F. (2012). Clerkship curriculum. In: Morgenstern, BZ et al. (Eds.), The guidebook for clerkship directors (4th Edition). Syracuse, NY: Gegensatz Press.
48. Chessman, A, & Anderson, K. Creating a clerkship curriculum. In: Fincher, RE (Ed.), Guidebook for clerkship directors (3rd Edition). Alliance for Clinical Education. Available from: http://familymed.uthscsa.edu/ACE/chapter3.htm. Accessed March 17, 2012.
49. Kern, DE, Thomas, PA, & Hughes, MT. (2009). Curriculum development for medical education: a six-step approach (2nd Ed.). Baltimore, MD: Johns Hopkins University Press.
50. Piskurich, GM. (2006). Rapid instructional design: learning ID fast and right. San Francisco, CA: Pfeiffer (John Wiley & Sons).
51. Roberts, KB, DeWitt, TG, Goldberg, RL, & Scheiner, AP. A program to develop residents as teachers. Arch Pediatr Adolesc Med. 1994; 148:405-10.

Mentoring and Advising
52. Bland, CJ, Schmitz, CC, Stritter, FT, Henry, RC, & Aluise, JJ. (1990). Successful faculty in academic medicine: essential skills and how to acquire them. New York, NY: Springer Publishing Company.
53. Bland, CJ, Taylor, AL, Shollen, SL, Weber-Main, AM, & Mulcahy, PA. (2009). Faculty success through mentoring: a guide for mentors, mentees, and leaders. Lanham, MD: Rowman & Littlefield.
54. Bower, D, Morzinski, J, Diehr, S, & Simpson, D. Support-challenge-vision: a model for faculty mentoring. Med Teach. 1998; 20(6):595-597.
55. Carr, PL, Bickel, J, & Inui, T. (2003). Taking root in a forest clearing: a resource guide for medical faculty. Boston, MA: Boston University School of Medicine.
56. Daloz, MA. (1990). Mentoring: guiding the journey of adult learners. San Francisco, CA: Jossey-Bass.
57. Hitchcock, MA, Bland, CJ, Hekelman, FP, & Blumenthal, MG. Professional networks: the influence of colleagues on the academic success of faculty. Acad Med. 1995; 70:1108-16.
58. Humphrey, HJ. (2010). Mentorship in academic medicine, in the Teaching Medicine series. Philadelphia, PA: ACP Press.
59. Morzinski, JA, & Fisher, JC. A nationwide study of the influence of faculty development programs on colleague relationships. Acad Med. 2002; 77:402-406.
60. Ragins, BR, & Kram, KE. (2007). The handbook of mentoring at work: theory, research, and practice. Thousand Oaks, CA: Sage Publications.

Educational Leadership and Administration
61. Bolman, LG, & Deal, TE. (1997). Reframing organizations. San Francisco, CA: Jossey-Bass.
62. Cooke, M, Irby, DM, & O’Brien, BC. (2010). Educating physicians: a call for reform of medical school and residency. San Francisco, CA: Jossey-Bass.
63. Kotter, JP. (1996). Leading change. Boston, MA: Harvard Business School Press.
64. Kouzes, JM, & Posner, BZ. (2012). The leadership challenge: how to make extraordinary things happen in organizations (5th Edition). San Francisco, CA: Jossey-Bass.
65. Morgenstern, BZ et al. (Eds.). (2012). The guidebook for clerkship directors (4th Edition). Syracuse, NY: Gegensatz Press.
66. Senge, P. (2006). The fifth discipline: the art and practice of the learning organization (2nd Edition). New York, NY: Crown Publishing Group/Random House.