
METROWEST EVALUATION INSTITUTE

Summary Report 2011

Submitted To: MetroWest Health Foundation

Submitted By: Anita M. Baker, Ed.D.

March 2012

Evaluation Services 101 E. Blair Tr Lambertville, NJ 08530

My participation has definitely enhanced my own evaluative thinking and I participate in most of the programs that involve formal evaluations. MWEI participant, fall 2011


INTRODUCTION

The MetroWest Evaluation Institute initiative is an extension of the MetroWest Health Foundation's capacity building work. As stated in the original RFP, the MetroWest Evaluation Institute (MWEI) was designed to give grantee organizations the knowledge, skills and tools to evaluate, improve and communicate their work. It was initiated, in part, in response to a research study by the Innovation Network titled "State of Evaluation 2010: Evaluation Practice and Capacity in the Nonprofit Sector," which found that only 13% of nonprofit organizations had any full-time employees dedicated to evaluation, and that most surveyed nonprofits (81%) agreed that limited staff expertise was a challenge. The specific concept for the MWEI was developed by a Foundation Outcome Measurement and Evaluation task force convened in 2010. The task force, comprised of grantees, trustees and staff, examined ways the Foundation could assist grantees with evaluation and determined that direct training would be useful. The Foundation then queried other philanthropic organizations about strategies and ultimately developed a competitive search for a team to develop and conduct the institute.

The MetroWest Evaluation Institute (MWEI) was implemented during Fall 2011 at the MetroWest Health Foundation (MetroWest) offices in Framingham, Massachusetts. The training curriculum was designed to provide comprehensive evaluation training and coaching to inspire evaluative thinking and increase the capacity of participants to incorporate evaluation into their regular organization work. The program was designed by Anita M. Baker, Ed.D., an independent evaluation consultant who has developed and conducted two other similar projects in Rochester, New York and Hartford, Connecticut. It was delivered by Baker (lead trainer) and three other evaluation/management consultants (Victoria Dougherty, Jodi Paroff and Tony Hall) to representatives from selected nonprofit organizations who are the focus of this report (see following for full descriptions of participants).

Key Learning Objectives

1. Understanding the importance of evaluation for learning, program improvement, and communicating with stakeholders; planning for and initiating evaluation design; using logic for program planning and evaluation; incorporating evaluative thinking in organizational work.

2. Understanding the different methods and instruments for collecting and analyzing quantitative data and drawing conclusions.

3. Understanding the different methods and instruments for collecting and analyzing qualitative data and drawing conclusions.

4. Selecting data collection strategies for evaluation design; projecting necessary level of effort, costs and products of evaluation.


In the past, MetroWest has offered single workshops to help build the skills and management capacity of their grantees. This time the training period included four separate teaching sessions, each with opportunities and requirements to practice and apply new skills. Participants also completed readings assigned in advance of each training session, and homework assignments to demonstrate their understanding of evaluation-related and evaluative thinking concepts. In addition, evaluation coaching was offered to each participating team in-between each formal teaching session. The Evaluation Institute culminated with the development of rigorous evaluation designs focused on selected programs of the participating agencies; these in particular demonstrated enhanced evaluation capacity. The designs were presented and discussed at a final session, and are expected to be implemented during 2012.

Throughout the initial training period, participants provided feedback regarding the Evaluation Institute in the form of responses to brief assessment e-surveys after every session. They also responded to a final, follow-up survey conducted six weeks after the final Institute session, where they reflected again on the delivery and importance of MWEI and provided more details regarding outcomes and project status updates. This report presents a brief description of MWEI and provides details about who participated, how and what training was provided, and what resulted. The final section of the report describes next steps and issues for further consideration.


MWEI TRAINING 2011

The purpose of the MWEI training was to inform participants about evaluation planning, data collection and analysis, and use, and to provide participants with opportunities to immediately apply what they learned. Participants included key decision-makers (e.g., Executive Directors, Program Directors, other designees) and other staff from five nonprofit, grantee organizations from the Framingham area and MetroWest community. The training curriculum was adapted specifically for MWEI from the Participatory Evaluation Essentials manual (Baker and Bruner, 2010), which had been developed through the Bruner Foundation supported Rochester Effectiveness Partnership, REP (1997 – 2003), and used successfully to guide the Hartford Foundation for Public Giving's Building Evaluation Capacity (BEC) training program (2006 – present). The MWEI training took place at the MetroWest Health Foundation office from September through December 2011, and concluded with successful development of evaluation designs and completion of organizational evaluative thinking assessments by all participating teams.

Training Implementation

A total of five sessions about evaluation planning, methodology and use were conducted during the training period (September – December 2011). This included four 5-hour sessions and one 4-hour final design display session (24 hours total). Each training session included some lecture-style presentation, opportunities for individuals and groups to try out new material and to work on applications for their own organizations, and opportunities for individuals within and across teams to confer regarding their work, evaluation and evaluative thinking.

Participants were engaged in evaluation-related learning throughout the training period. All sessions were preceded by a reading assignment and followed by homework that resulted in usable products such as logic models, surveys and other data collection instruments, administration and analysis plans, and evaluative thinking action plans. Combined, the homework assignments led to an evaluation design scheduled for implementation in 2012. The evaluation training team also introduced the concept of evaluative thinking and its applications throughout the training, especially in the first session when individual organization assessments were conducted, and in the final session when organizations summarized plans to enhance and sustain these important functions for the long term. The topics, activities and homework covered at each session of the training period for the class of 2011 are shown in Table 1.


Table 1: MetroWest Evaluation Institute 2011, Session Descriptions

Session I (9/14/11)
Pre-Reading Assignments: AIDS Project Hartford Evaluation Report; "Setting Targets – A Manhattan Specific Example"
Session Content: EVALUATION BASICS – Intro/Importance; Terminology/Background, Context; Evaluation Questions; Evaluation Stakeholders; Evaluation Design; Evaluation Logic; Evaluative Thinking
Activities: Ice breaker; Tech Check; Developing evaluation questions; Evaluation planning; Logic model scramble; Outcomes & indicator match up; Evaluative Thinking Assessment
Homework: 1st design work – identify selected program and evaluation questions; complete logic model for selected program

Session II (10/6/11)
Pre-Reading Assignments: "Using Evaluation to Plan for the Zombie Apocalypse"; excerpts re survey and record review data from sample evaluation reports including AIDS Project Hartford, Future Care; mainstream media article using survey and record review college access and success data
Session Content: COLLECTION AND ANALYSIS OF DATA FROM SURVEYS AND RECORD REVIEWS; STRATEGIC USE OF DATA – Data Collection Overview; Surveys and Record Reviews Details; Strategic Data Use and Data Management; Analysis Planning
Activities: Gallery Walk (logic models, evaluation questions, initial design thoughts); Survey Goof; Survey and Record Review Data Analysis – COA project; Data Use Planning Matrix
Homework: Revise own survey or develop new one for project; develop record review protocols; develop administration and analysis plans for surveys or record review; complete data matrix

Session III (11/3/11)
Pre-Reading Assignments: Excerpts re interview and observation data from sample evaluation reports (AIDS Project Hartford, Future Care); review of evaluation journal article using observations
Session Content: COLLECTION AND ANALYSIS OF DATA FROM INTERVIEWS AND OBSERVATIONS; INITIAL EVALUATION DESIGN PLANNING – Interviews and Observations; Analysis of Qualitative Data; Evaluation Design Development – Data Collection Decisions
Activities: Analysis of open-ended survey questions; interview analysis activity; design session
Homework: Worksheets – Compare and Contrast Surveys and Interviews, Observation and Record Review Scenarios; work on design


Table 1 (Continued): MetroWest Evaluation Institute 2011, Session Descriptions

Session IV (12/1/11)
Pre-Reading Assignments: "An Ecological Understanding of Evaluation Use, A Case Study of the Active for Life Evaluation" (RWJF); "Formative and Summative Evaluation of the St. Paul Technology for Literacy Center (TLC): A Utilization Focused Method" (Patton)
Session Content: STAKEHOLDERS, REPORTING, DESIGN, ANCHORING EVALUATION – Putting it Together: Evaluation Designs including Level of Effort (LOE) Projections, Timelines, Budgets; Working with Stakeholders (Boards, Staff); Communicating Evaluation Results – Evaluation Reporting; Planning for the MWEI Design Display Session; Budgeting/Paying For and Commissioning Evaluation; Developing Plans to Sustain (Anchor) Evaluation Capacity and Evaluative Thinking
Activities: Design session; identifying stakeholders; developing report outline; working on an anchoring plan
Homework: Work on all evaluation design sections; finalize evaluation design; finalize report outline; identify key stakeholders; finalize anchoring plan

Session V (12/15/11)
Session Content: Final Design Presentations*; Final Wrap-Up and Summary
Activities: Present evaluation designs to peers and other guests at conference, using a tri-fold presentation board; answer questions about design choices; final discussion about sustaining evaluation capacity (anchoring evaluation)

* Note, the final session included all participants as well as MetroWest Health Foundation President and CEO, Martin Cohen, Rebecca Donham, Senior Program Officer, and other Foundation staff and invited guests.


In addition to the training sessions, the evaluation training team also provided individual technical assistance in-between scheduled training sessions for all participants, as needed, via email or phone calls. This individualized technical assistance was conducted to help participants complete their homework or directly apply what they had learned in their own organizations. Some participants used this support to revise an existing survey, assess existing data collection strategies, or review the evaluation design being proposed for one of their programs. Some participants also requested assistance with logic model development or outcomes specification for their programs.

Fall 2011 Participants

A total of nine organizations were considered for the MWEI class of 2011. Five organizations were selected to participate, and the Foundation involved a program officer as well.

Table 2: MetroWest Evaluation Institute 2011, Selected Organizations

BayPath Elder Services
Primary Service Area: Home care, direct services, and advocacy for elders
Team: Director; SCO Manager/Housing Supervisor

Assabet Valley Collaborative
Primary Service Area: Education service agency providing special education programs, transportation, professional development
Team: Program Directors (2); Clinical Coordinator

Advocates, Inc.
Primary Service Area: Direct services and advocacy for persons with disabilities, offering a broad range of community-based services
Team: VP Quality/Risk Management; Sr. Director; Chief Compliance Officer

Jewish Family Services of Metrowest
Primary Service Area: Social, health and community services
Team: Chief Operating Officer; Program Director; Program Specialist

MetroWest Health Foundation
Primary Service Area: Grantmaking and program initiatives to meet the unmet health needs of twenty-five communities in the MetroWest area of Massachusetts
Team: Program Officer

Wayside Youth and Family Support Network
Primary Service Area: Human services agency providing family-based outreach services, residential treatment programs, and community-based counseling
Team: VP Day & Residential Services; VP Community Services


MetroWest Health Foundation officers invited participation for the first Evaluation Institute. They identified organizations that provide health services in the region and that also met the following qualifications/requirements.

Staff size was large enough that the absence of staff members would not disrupt daily service.

Organizational leadership was not in transition.

A minimum of two staff members, including at least one senior leader, could be identified.

Senior-level officials (i.e., those with decision-making authority) attended and fully participated in the MWEI training. They were specifically involved to increase the potential for both extending and sustaining evaluation capacity and evaluative thinking in the organizations. As shown in Table 2 (previous), the organizations involved individuals from various positions according to their own needs for training (e.g., Chief Compliance Officer, Program Director, Chief Operating Officer, Clinical Coordinator). The Foundation also involved a Program Officer in the training as a full participant with homework and project responsibilities, so that evaluation learning could be extended to the Foundation as well. Additionally, though not a full participant, Rebecca Donham, MetroWest Senior Program Officer, attended all sessions and provided insight and guidance throughout the training period.


TRAINING RESULTS

This section of the report presents a summary of findings about the fall 2011 Institute for MWEI participants, and projections about future use of the training. Findings were compiled from post-session assessments administered after each training session, final participant surveys administered six weeks after the final design display session,1 and a follow-up interview with MetroWest Foundation Senior Program Officer Rebecca Donham, who oversaw the Institute.

Participants from all 6 organizations attended regularly and demonstrated that they were learning about evaluation and practicing evaluative thinking. Most importantly, every participating team conducted initial assessments of evaluative thinking, began to formulate plans to enhance evaluative thinking, and developed comprehensive evaluation designs for programs they selected. Details about skill development and learning, perceived importance of MWEI, initial or projected implementation of evaluation designs, other projected uses of the training, as well as feedback and suggestions regarding ongoing needs for support are presented here.

Participants Developed Important Skills

As a result of the initial training, all participants demonstrated understanding of and the ability to apply key evaluation-related skills. This included exposure to and experience with the following:

Logic Models & Evaluation Design/Planning: developing logic models, using logic models to inform program evaluation, specifying evaluation questions, developing an evaluation design, choosing methods to address evaluation questions, projecting level of effort for evaluation, projecting costs for evaluation, and reviewing evaluation designs

Data Collection and Data Analysis Planning: developing surveys, planning administration of a survey, analyzing survey data; developing interview guides/protocols, planning to conduct interviews, planning to analyze interview data, analyzing interview data; developing observation protocols, planning to conduct observations, planning to analyze observations; developing record review protocols, planning to collect record review data, planning for analysis and analyzing record review data

Participants came to the training from different backgrounds and many had at least some prior knowledge of the Institute concepts, especially logic models, surveys and interviews. Specifically, about half of the participants indicated they knew about logic models before MWEI, and about one third of the participants knew something about developing surveys and conducting interviews. Many of those entering with this knowledge also reported learning more during the training. Despite the varied knowledge of and experience with topics related to evaluation, participants definitely enhanced and added to their skill sets through MWEI. On a three-point scale ranging from not at all, to a little, to a lot, a substantial majority of participants reported that MWEI helped them a lot to develop important evaluation skills (see Table 3). All other participants indicated the training had helped at least a little or, for a few on certain topics, that they already knew all that was covered on the topic.

1 A total of 13 of the 14 surveys administered were fully answered.

Table 3: Percent of Class of 2011 Participants Reporting that MWEI Training Helped Them Develop the Following Skills
(Source: Q8, final survey)

Skill: MWEI Helped 'A Lot'

Develop an evaluation design 100%

Review evaluation designs 84%

Specify evaluation questions 82%

Choose methods to address evaluation questions 82%

Project level of effort for evaluation 67%

Develop a survey 67%

Plan for survey administration 60%

Plan for survey analysis 75%

Develop an interview guide or protocol 67%

Plan to analyze interview data 58%

Develop a record review protocol 64%

Plan to analyze record review data 67%

*Note these percentages are based only on those who indicated they did not know the information before MWEI. The total number of respondents = 13.


(Detailed survey results are included in Appendix Table 3 at the end of this report.) By the end of the initial training period, everyone shared a common language about evaluation, and every group demonstrated they could apply what they knew to the development of evaluation designs.

Participants Rated the Training Favorably

MWEI participants completed anonymous electronic assessments of the training after every session. Specifically, they were asked to indicate whether they learned about session topics that were new to them, to provide an overall rating for the session, and to report how much they thought the session would help with their individual design project and with their regular work. As shown in Figure 1, participant feedback regarding the sessions overall was consistently positive. On a four-point scale ranging from not so good to excellent, most participants (90%) rated each session except Session 1 as excellent or very good.

Figure 1: MWEI Session Ratings: 2011 (percent of participants who rated each session as excellent or very good)

Session 1 contained more theory than the other sessions, introducing a common vocabulary for evaluation and evaluative thinking, as well as opportunities for discussion rather than hands-on opportunities for applied skills. Senior Program Officer Rebecca Donham qualified the differences between Session 1 and later sessions: "The first day is always more listening. It is always hard to start with language and the terms we need to know. But the balance was just right. We started with more content, but by the end it was much more hands-on and less presentation but more practice."

Most participants indicated that each session would help a lot with their MWEI evaluation design project. More than half of the participants consistently reported that what they learned in the training, with the exception of the first training session, would also help a lot with their regular work (see Figure 2).

Figure 2: MWEI Session Ratings: Applications of Session Information (percent of participants who said each session will help a lot with their regular work and with their evaluation project)

Reflecting back, participants were especially positive about the training, with the exception of the pre-reading assignments.

All 13 survey respondents rated the content presentations favorably– 100% indicated they were very good.

All 13 survey respondents rated the activities favorably (good or very good) including 10 (77%) who said they were very good.

All 13 survey respondents rated the slides and the Participatory Evaluation Essentials manual favorably including 9 (69%) who said they were very good.

All 13 survey respondents rated the handouts and the activities favorably including 10 and 8 (77% and 62% respectively) who said they were very good.



A total of 9 of the 13 survey respondents rated the pre-reading assignments favorably, including 4 who said they were very good and 5 who said they were good. All the others indicated they were fair, but this is the one area that could definitely be strengthened if future Institutes are conducted.

Additionally, all but 1 participant indicated they found the trainers very available when they had independent consultation needs.

Participants Reported That the Training was Worthwhile and Important

In addition to the ratings described above, the final follow-up survey gave Class of 2011 Institute participants an opportunity to reflect on the usefulness of the training overall and on individual features. Again, all participants indicated the training was worthwhile, including 9 (75%) who said it was very worthwhile for them personally. Additionally, all participants indicated the training was worthwhile to their agency, including 7 (58%) who said it was very worthwhile, 4 (33%) who said it was worthwhile and 1 (8%) who said it was at least somewhat worthwhile (see Table 4 below).

Table 4: How Worthwhile was the MWEI Experience? (n = 12)

Not at all / Somewhat Worthwhile / Worthwhile / Very Worthwhile
. . . for you personally? 0 / 0 / 25% / 75%
. . . for your agency? 0 / 8% / 33% / 58%

Please note one participant did not answer this question.

As clarified by a participant, "I viewed this as a priority to better the agency overall."

Other survey responses show that participants valued the opportunity to learn about evaluation, and they also valued the opportunities to learn and connect with other professionals during the training (see Table 5). With the exception of one participant who did not find the presentation boards important, all participants indicated that all key aspects of the training were at least somewhat important. Specifically,

All 13 survey respondents agreed the opportunity to learn about evaluation was very important.

All 13 survey respondents indicated the requirement to develop an actual evaluation for a selected program was important including 12 (92%) who said it was very important.

All 13 survey respondents indicated opportunities for consultation from the MWEI trainers were important including 10 (77%) who said it was very important and 3 who said it was somewhat important.


All 13 survey respondents indicated opportunities to interact with colleagues in other organizations were important, including 9 (69%) who said it was very important and 4 who said it was somewhat important.

All 13 survey respondents indicated opportunities to interact with peers from their own organizations were important including 8 (62%) who said it was very important and 5 who said it was somewhat important.

A total of 12 survey respondents indicated the requirement to put the final presentation boards together was important, including 7 (54%) who said it was very important and 5 who said it was somewhat important.

Table 5: Percent who Reported That the Following Opportunities Were Important (N=13)
(Source: Q10 final e-survey)

Opportunity: Somewhat Important / Very Important / TOTAL*
Opportunities to learn about evaluation: 0 / 100% / 100%
Requirement to design an actual evaluation for a selected program: 8% / 92% / 100%
Opportunities for consultations from MWEI evaluator/trainers: 23% / 77% / 100%
Opportunities to interact with colleagues in other organizations: 31% / 69% / 100%
Opportunities to interact with peers within own organization: 38% / 62% / 100%
Putting together presentation board for the final conference: 39% / 54% / 93%

Participants Successfully Developed Evaluation Designs

The final project for the MWEI training was development of evaluation designs. These designs had to conform to standard professional evaluation practice and showed that MWEI participants were able to apply what they learned. Each design described the subject program and why it was selected, specified evaluation questions, and specified which data collection strategies would be used to obtain data to answer the evaluation questions. The designs also included projections of level of effort (i.e., who would do each task and how much time in days or hours would be reserved for them), proposed timelines for evaluation activities (i.e., when, in months/days/seasons, evaluation activities would happen), and plans for use of the evaluation results. During 2012, participants are expected to implement these designs, further develop administration and analysis plans as needed, collect and analyze data according to their plans, and develop reports about their findings. Further details about each project are provided in the Appendix to this report (Program Descriptions and Evaluation Questions, Appendix Table 1; Selected Evaluation Data Collection Methods: MWEI Class of 2011, Appendix Table 2).

All participants were required to use more than one data collection method, but each team identified which methods made the most sense to obtain salient information to address their questions. Each organization prepared a specific plan to show how each data collection strategy would be used to answer each evaluation question. During the final design display session, participants were able to explain their design choices, including why a program was selected and why a specific data collection strategy would work to address their needs. They also presented draft or final instruments and administration plans for their data collection, and evidence that they had considered data analysis.

Participants Reported MWEI Helped Them Address Evaluation Challenges

The final Institute assessment question on the follow-up survey asked participants to reflect on how MWEI had helped them. Again, almost all participants indicated the Institute had helped them a lot with the key issues the Institute was designed to address. Specifically,

All 13 survey respondents agreed MWEI helped them design and conduct better evaluations of their programs including 11 (92%) who said the institute helped them a lot.

All 13 survey respondents agreed MWEI will or has helped them identify or work with an outside evaluator including 10 (77%) who said the institute helped them a lot.

Further, when we asked participants to comment on whether they would continue using what they learned, all agreed they would. Interestingly, though, their comments were evenly split regarding how that continued use would happen. About half of the participants indicated their organizations would continue using the new evaluation capacity for at least about the next five years. The other half also thought the information would be continually used, but only if the training participants stay at the organization.


Evaluative Thinking Is Being Enhanced

Evaluative thinking is a type of reflective practice that incorporates use of systematically collected data to inform organizational actions. Key components of evaluative thinking include:

Asking questions of substance and determining what data are needed to address the questions

Gathering appropriate data in systematic ways

Analyzing data and sharing findings

Developing strategies to act on findings

In addition to program development and delivery, evaluative thinking can be applied to various organizational functions such as mission development, human resource decision-making, or communications/marketing.

All MWEI participating organizations conducted initial Excel-based assessments of evaluative thinking in their own organizations (using a tool developed by the Bruner Foundation, www.Brunerfoundation.org). On the final survey, all 13 respondents indicated that participating in MWEI had helped them use evaluative thinking skills in multiple aspects of their work, including 10 participants (77%) who indicated it had helped a lot. Additionally, all survey respondents indicated conducting the assessment using the automated tool had been at least somewhat helpful, including more than half who said it had been helpful (23%) or very helpful (31%).


NEXT STEPS

Conducting an actual evaluation design was a central element of the MWEI training and the component most likely to extend evaluative thinking beyond fall 2011 for participants. This section reports on evaluation design implementation as of February 2012, and presents issues for further consideration as the Foundation considers future check-ins with grantees and design of future Institutes.

Evaluation Design Implementation and Future Check-ins

As stated previously, each of the participating teams developed an evaluation design as the culminating project for the initial training period.2 All organizations were encouraged to implement those plans throughout 2012. By mid-February 2012, survey responses indicated that all but one of the seven teams had initiated their work, and half of the participants were actively collecting data. Additionally, one team reported completing one component of their data collection, but as it was still relatively early in the year and data collection was mostly not complete, no teams had begun data analysis or reporting. In response to the question about what else is needed to ensure the projects get underway, participants indicated they need more time in their work days, since this is mostly additional work, and they need to get other colleagues involved. For example, one respondent said, "We are collecting data, inputting data, and analyzing data as we go along. Unfortunately, the work continues to be a priority for me, but not for my colleagues. More help and more hours in the day would be nice."

2 Please note that the participants from Bay Path Elder Services each developed an evaluation design, so the total number of projects was 7.

Plans for an implementation "check-in" have been discussed. The Foundation is very clear about its importance: "Training can be as thorough as possible, but once grantees are on their own and back to their jobs, there may be glitches. I do think a check-in or additional support is critical," noted Donham. The Institute trainers also recognize the importance of this and concur with Ms. Donham that additional support would be useful. (In fact, the lead trainer, Baker, did develop a full follow-up strategy for MWEI, but it was unclear until after this first group began implementation what would best constitute useful follow-up.) Questions were included on the final survey regarding follow-up, and participants had varying ideas (see Table 6). About half the participants indicated that each of the suggested follow-up strategies would be at least worthwhile, except meeting as a cohort in late May or June (an earlier meeting was appealing to more).

Table 6: Percent who Reported That the Following Opportunities Would Be Worthwhile as Follow-up (N=10)

Follow-up opportunity: Not at all Worthwhile / Somewhat Worthwhile / Worthwhile / Very Worthwhile
Receiving one-on-one help remotely via email and/or phone (2-3 times over next few months): 10% / 40% / 20% / 30%
Receiving one-on-one help in person (1 or 2 times over next few months): 30% / 20% / 30% / 20%
Meeting as a cohort at a mid-point later this Spring to discuss project implementation to date, sharing trouble shooting, and receiving coaching: 10% / 40% / 30% / 20%
Meeting as a cohort in late May or June to discuss participants' evaluation project results or progress: 10% / 60% / 0 / 30%

Additional plans should be made soon regarding continued supports. These needs and the following issues should continue to guide ongoing/final Institute plans.

Issues for Further Consideration: Challenges and Suggestions for Future

The final survey also asked participants about challenges that have been identified in similar evaluation capacity building efforts. Specifically, participants reported whether any of the following were serious challenges: time to work together as a team, getting homework done, communicating with the trainers, competing priorities, and complexity of the training. Summarized responses are presented in Table 7.

Table 7: Feedback About Challenges from Participants (N=13)
(Source: Q16 final survey)

Challenge: Not a Challenge / Somewhat Challenging / Very Challenging
Competing priorities at our agency: 8% / 54% / 39%
Time to work together as a team: 31% / 54% / 15%
Getting all the homework done: 23% / 62% / 15%
Communicating with the trainers: 92% / 8% / 0
Training content was too complicated: 92% / 8% / 0


As is always the case when undertaking comprehensive, intensive training, it is hard to strike a balance between learning new things and responding to everyday demands. The Evaluation Institute had to contend with this too. As shown in Table 7, many participants found it hard to address their regular work and organizational needs and keep up with MWEI work. A total of 5 of the participants (39%) described this as very challenging, and many others said it was at least somewhat challenging. As stated in the previous section, this challenge has continued as the participants work to implement their projects. Time was also a concern for the majority of participants, although most identified it as only somewhat challenging. Homework was tricky, too, but participants were diligent about getting it done. Fortunately, neither the complexity of the training concepts nor communication with the trainers was identified as challenging by any participants.

In addition to the question about challenges, there were also questions on how to improve the Institute. As with the follow-up plans, there was little agreement on how to improve the structure of the Evaluation Institute, or whether the training needed restructuring at all.

All but 1 participant agreed the ratio of formal presentation to discussions/activities was about right. (The one dissenting participant thought there was too much activity time.)

A total of 7 participants thought the length of the sessions was about right, but 5 others thought they were not long enough. Only 1 participant indicated they were too long.

A total of 7 participants indicated there were not enough sessions, and the other 6 thought the 5 sessions were about right. No one thought there were too many sessions.

Almost everyone (10 of 13) indicated that the amount of time between sessions was about right, but 2 participants thought it was too long and 1 thought the sessions were too close together.

All participants thought the trainers were available, including 12 of 13 who thought they were very available, and all of the participants indicated the Foundation was supportive of their organization’s commitment to learn about evaluation.

Further clarifications were provided through comments.

[The trainers] were extremely responsive and helpful whenever we had questions – AND [the trainers] were extremely knowledgeable and genuinely appeared interested in all topics by participants.

[Regarding other topics to cover] Covered everything, just not enough time. Would have been great to have full days, with a pace that allowed more depth and more discussion and questions. Also a more concentrated course, so things stayed fresh between sessions. For those of us with many competing priorities, it was hard to stay “connected” to the topics between sessions.

[Other improvement suggestions] Perhaps a longer institute with even more information. This could have been a legitimate class with graduate credits and/or CEUs! AND I would have liked more – Data analysis, evaluation writing, methods for presenting to stakeholders.


When asked for ways to improve the training, many participants asked for more time on task and said four sessions was not enough.

All but one participant said having 6 training sessions instead of 4 would be worthwhile, including 4 who said it would be very worthwhile, 3 who said it would be worthwhile, and 4 others who said it would be at least somewhat worthwhile.

All participants said that receiving individualized monthly coaching for 6 - 12 months following the initial training would be worthwhile, with 5 indicating it would be worthwhile or very worthwhile and the others (7 participants) indicating it would be only somewhat worthwhile.

Only about one-third of the participants thought it would be worth it to continue meeting as a cohort after the initial training, and 1 participant indicated it wouldn’t be worthwhile at all.

A few of their final comments are also worth sharing as further clarification regarding Institute alterations.

I think an extra half hour at the end of each session would have been nice so that we could talk to the trainers about our projects in person.

There was a great deal of information/materials to absorb in few sessions. I would add one more session to the schedule.

Could have spent more time on developing the analysis plan, as this was the hardest to grasp – especially the part on all the planning that has to be done ahead of collecting the data.

The MWEI fall training 2011 ended smoothly and all teams finished their final session with a clearer sense of how to use evaluative thinking at work and with comprehensive evaluation designs and the instruments to carry them out. The following, however, will deserve ongoing or initial attention for participants:

Ensuring that participants get support as they analyze real data from their own organizations and successfully learn how to plan for and conduct evaluation data analyses

Helping participants stay focused on rigorous evaluation of their projects while also managing other organizational demands

Helping participants summarize findings for external communication

Supporting efforts to enhance evaluative thinking at participating organizations.

The Foundation's interest in developing staff evaluation expertise for its grantees was clearly addressed through MWEI. Provision of additional Institutes, if there is additional need among grantees, is encouraged. Specific recommendations for additional institutes follow.


Future MetroWest Evaluation Institutes

In addition to the suggestions for support of the first MWEI participants, the following recommendations for future Evaluation Institutes are respectfully submitted. These recommendations are based on responses/suggestions from participants in the first MWEI and on the lead trainer's more than 15 years of experience with similar efforts. The recommendations include both revisions to the structure/format/content of MWEI and suggested additions. Specifically:

Convert the initial training into a 7-session Institute to enhance capacity.

Keep the group size small (4 – 6 participating organizations) and trainer/participant ratio similar (about 1:2).

After making very minor modifications3 to the curriculum and activities, deliver the content and related activities for Sessions 1 through 4 as in the first MWEI (4 -6 hour sessions), but add two additional sessions.

The two additional sessions should focus on areas where groups usually need additional assistance, and which were identified by the first MWEI participants as areas of ongoing support needs. This would include a session on instrument development and analysis planning which could be done as a series of individual consultations rather than a group session. The second additional session would focus on reporting and presentations and would include additional team planning time. Taken together these two additional sessions could be conducted in just 6 hours total for the participants.

As before, the final (7th) session would be an opportunity to showcase the work of the participating organizations.

Consider adding an optional six-month support project to strengthen capacity building investments.

Hold monthly consultation meetings for individual organizations to obtain one-on-one coaching and guidance related to implementation of their specific project, especially data analysis and reporting. Conduct one or two of these sessions at the organization locations, and include options for basic evaluation training for additional staff.

Conduct a final “Summer” session where study results are presented in a final showcase.

Invite alumni to attend the Summer showcase, collect and analyze data from them regarding ongoing evaluation capacity use and needs, and consider other opportunities to support alumni evaluation capacity development/use.

3 Discontinue the pre-reading assignments, but definitely keep homework assignments.


APPENDIX



Appendix Table 1: Programs and Evaluation Questions Selected by MWEI 2011 Participants

ORGANIZATION Program Evaluation Questions

Advocates, Inc.

Clinical Risk Assessment Protocol. Advocates provides a broad range of services for people with disabilities, including mental health residential and day programs and employment services. Staff have a consistent method for assessing client risk across the organization, and Advocates' evaluation will focus on implementation of this Risk Assessment Protocol. Every client in the Residential, Day Habilitation, and Employment programs has been assigned a risk rating, and every client with an elevated risk level has a Joint Crisis Plan and/or a Staff Action Plan. Staff should know the risk ratings of their clients and how to use the crisis plans for their clients who have them.

1) Are the policies/procedures developed by the divisions commensurate with the requirements of the clinical risk oversight document?

2) Have the programs implemented the risk systems as prescribed by the division-specific clinical risk protocols?

Assabet Valley Collaborative (AVC)

Professional Development Workshops AVC serves the region through community-based programs that reach beyond the scope of school-based services to support families and students with special needs. AVC provides services to meet the needs of the 13 member school districts to promote student success and community integration. The Professional Development workshop series was designed to provide specific training on alternate approaches and wrap around services for students with serious mental health issues, for school educators and clinicians (social workers, mental health and guidance counselors) who can directly improve the quality of services and outcomes for students and families.

1) How and to what extent has participation in the workshop series affected participating clinicians? a. increased their knowledge b. enabled them to add to their clinical skills c. increased the potential use of new skills and knowledge with students

2) How and to what extent has clinicians’ practice changed with families and students since participating in the workshops? a. changes in reasons for and numbers of referrals b. reduction of inappropriate punitive punishment of students with serious MH issues (or increased use of alternative strategies for students with serious MH issues).

Bay Path Elder Services

Congregate/Supportive Housing program at Memorial House. Bay Path offers home care and related services enabling people to live independently and comfortably in their homes as they age. The congregate housing program seeks to reduce social isolation by providing activity programs, evening meals, and socializing opportunities for clients within the building. Clients are residents within the building who are either 60 years of age or older, or under 60 years of age with a disability. Program goals include improved identification of tenants' bio-psycho-social needs, increased span of time "aging in place" by residing in the community versus a nursing home, and improved access to Case Management services.

1) Is this program effective, i.e., meeting the goals?
2) How will the changing needs of the tenants affect the services and programs that are offered?
3) Are the outcomes as listed in the program logic model achievable?


Bay Path Elder Services

Person-Centered Planning (PCP). Bay Path offers home care and related services enabling people to live independently and comfortably in their homes as they age. PCP is a pilot program funded by the Executive Office of Elder Affairs, and goals include empowering consumers to reflect upon their hopes and dreams, and allowing consumers to choose different types of services and providers while piloting a cost-neutral project. Program participants must be 60 years old or older, nursing-home eligible, Medicare eligible, reside in one of 14 towns west of Boston, and be enrolled in the Enhanced Community Options program. PCP allows older adults control over how to spend care-plan dollars, and allows for traditional and non-traditional services.

1) Since the implementation of Person-Centered Care, how many of each of the following has occurred? New services, New providers, Different types of providers

2) Since the implementation of Person-Centered Care: How satisfied are participants?

3) Since the implementation of Person-Centered Care, what are the benefits and challenges experienced by vendors?

Jewish Family Services (JFS)

Jewish Family Assistance Network JFS provides vital social, health and community services to alleviate suffering, enhance lives, and support people in need. The Jewish Family Assistance Network program serves 150 individuals and families each year from the Metrowest area experiencing short-term financial crisis and in need of intensive case management, referral and cash assistance to avoid further decline. The program goals include doubling the caseload to 300 cases per year, as well as offering money management skills and participation in financial literacy classes to those who indicate need. Identified families will receive a comprehensive array of coordinated services, including referrals to other organizations, action plans and/or cash assistance. The stated goals of the program are (a) clients will successfully follow and complete their action plan, (b) clients will be able to meet basic needs without assistance, (c) clients will have basic financial literacy.

1) To what extent were client’s basic needs met by the Family Assistance Network?

2) Which three interventions were most successful in helping clients meet their basic needs?

MetroWest Health Foundation

Racial and Ethnic Health Disparities Initiative MetroWest encourages and fosters leadership on critical health issues in the region. The Racial and Ethnic Health Disparities Initiative uses grantmaking to remove barriers to accessing health care for Hispanic and Brazilian populations, including insurance enrollment and links to affordable primary care. It also provides educational efforts to health providers and area teens in an effort to reduce teen pregnancy amongst Hispanic and Brazilian teens in Framingham.

1) To what extent were grantees able to move towards the Foundation’s set outcomes?

2) How were they able to address barriers to meeting these goals?

3) What specific strategies were most effective in meeting the goals of the Foundation’s strategic plan?


Wayside Youth and Family Support Network

Prescott Street, ShortStop, Tempo Young Adult Resource Center Program Goals include Assist young adults, especially those who are aging out of state services, with behavioral health conditions and serious life challenges (homelessness, lack of family supports, little or no employment, etc.) to develop skills and access resources to make transitions to healthy adulthood; provide opportunities for peer leadership and advocacy, and provide a safe, non-stigmatizing, achievement oriented atmosphere that makes seeking help attractive to young adults. ShortStop and Prescott are mature programs, while Tempo is only 4 years old. Prescott serves young adults, 18-25, referred by DMH for residential treatment; other self-referred young adults for activities and information. This program has three levels of service: residential group home, transitional support services (support for youth who live independently in supported housing apartments) and outreach for youth living at home or other settings. ShortStop serves homeless young adults, ages 18-22, in need of housing, transitional living education and support at program. Tempo serves any young adult, 17-24, who needs resources for housing, employment, education, life skills, health care, behavioral health treatment, legal aid, parenting support, conflict resolution, community services, social activities, leadership opportunities, or help with planning a future.

1) What characteristics of Tempo attract young adults? What characteristics are not attractive?

2) Are young adults who come to Tempo likely to request help accessing behavioral health treatment? To attend an initial session of treatment? To continue in treatment for at least 8 sessions?

3) How many contacts with Tempo staff does it take on average for a young adult to request help with behavioral health treatment? What factors impact this?

4) What specific interventions are most successful at ShortStop (as viewed by the young adults) in assisting them to find employment or decide to return to school?

5) What specific elements of the STEPS program keep youth coming back?

6) What specific staff or peer interventions are most effective (as viewed by the youth) in helping them feel motivated to maintain a healthy lifestyle that includes work and lifelong learning?


Appendix Table 2: Selected Evaluation Data Collection Methods: MWEI Class of 2011

Organization Surveys Interviews Observations Record Reviews

Advocates, Inc. (Clinical Risk Assessment Protocol Evaluation)

Assabet Valley Collaborative (Professional Development Workshop Evaluation)

Bay Path Elder Services (Congregate/Supportive Housing program evaluation)

Bay Path Elder Services (Person-Centered Planning evaluation)

Jewish Family Services (Jewish Family Assistance Network evaluation)

MetroWest Health Foundation (Evaluation of Racial and Ethnic Health Disparities Initiative grantmaking)

Wayside Youth and Family Support Network (Evaluation of 3 programs: Prescott Street, ShortStop, Tempo Young Adult Resource Center)


Appendix Table 3 (Table 3 Detail): To what extent did involvement with MWEI help participants learn the following skills? (N=13)

Skill: I Knew This When I Started (% n); Not at All* (% n); A Little (% n); A Lot (% n)

Logic models & Evaluation Design/Planning

Develop a logic model 53.8% 7 7.7%* 1 23.1% 3 46.2% 6

Use a logic model to inform program evaluation 38.5 5 7.7* 1 30.8 4 53.8 7

Specify evaluation questions 8.3 1 8.3* 1 16.7 2 75.0 9

Develop an evaluation design 7.7 1 7.7* 1 0 0 84.6 11

Choose methods to address evaluation questions 7.7 1 7.7 1 15.4 2 69.2 9

Project level of effort for evaluation 0 0 7.7 1 23.1 3 61.5 8

Project costs for evaluation 0 0 7.7 1 46.2 6 38.5 5

Review evaluation designs 0 0 7.7 1 7.7 1 84.6 11

Data Collection and Data Analysis Planning

Develop a survey 30.8 4 7.7* 1 23.1 3 46.2 6

Plan for administration of a survey 25.0 3 0 0 33.3 4 50.0 6

Plan for survey analysis 15.4 2 0 0 23.1 3 69.2 9

Develop an interview guide/protocol 7.7 1 7.7* 1 30.8 4 61.5 8

Plan to conduct interviews 30.8 4 0 0 61.5 8 23.1 3

Plan to analyze interview data 15.4 2 7.7* 1 38.5 5 53.8 7

Develop an observation protocol 15.4 2 15.4* 2 38.5 5 38.5 5

Plan to conduct observations 7.7 1 7.7* 1 69.2 9 15.4 2

Plan to analyze observations 7.7 1 15.4* 2 23.1 3 53.8 7

Develop a record review protocol 15.4 2 15.4* 2 30.8 4 53.8 7

Plan to collect record review data 15.4 2 7.7* 1 30.8 4 61.5 8

Plan to analyze record review data 15.4 2 7.7* 1 30.8 4 61.5 8

* Nothing New