IMPLEMENTING THE ETOBICOKE CHILDREN’S CENTRE BALANCED SCORECARD
Akshay Rajaram, Project Coordinator, The Etobicoke Children’s Centre
Ewa Deszynski, Executive Director, The Etobicoke Children’s Centre

Date: June 30, 2015
Grant: CBEC-1832
Table of Contents

SUMMARY OF ACTIVITIES
NEXT STEPS
APPENDIX I – SWOT ANALYSES
APPENDIX II – MINUTES FROM CONVERSATIONS WITH OTHER COMMUNITY-BASED CHILD AND YOUTH MENTAL HEALTH AGENCIES
APPENDIX III – FOCUS GROUP REPORT SUMMARY
APPENDIX IV – ALIGNMENT BETWEEN THE ECC STRATEGIC PLAN AND BALANCED SCORECARD
APPENDIX V – GLOSSARY OF KEY SCORECARD TERMS
APPENDIX VI – LIST OF GOVERNANCE AND PROGRAMMATIC INDICATORS
APPENDIX VII – EXAMPLES OF REALIGNMENTS FOR CLIENT SATISFACTION AND EXPERIENCE SURVEY FOR COLLABORATIVE PROBLEM SOLVING GROUP FOR PARENTS/CAREGIVERS
APPENDIX VIII – CLIENT SATISFACTION AND EXPERIENCE SURVEYS FOR WEST END SERVICES FOR ABUSE AND TRAUMA (WESAT) TEEN GIRLS’ GROUP (YOUTH PARTICIPANTS)
APPENDIX IX – PERFORMANCE MEASUREMENT AND EVALUATION CYCLE
APPENDIX X – DASHBOARD AND INDICATOR SUMMARY VIEWS
SUMMARY OF ACTIVITIES
In October 2014, The ECC completed the development of the first iteration of its agency-wide balanced scorecard. As the organization transitioned into the implementation phase, the authors identified key priorities centred on two indicators: Functioning (an outcomes indicator in the Client domain) and Client Satisfaction (an indicator in the Stakeholder Satisfaction domain). The activities summarized below describe the work undertaken to collect, analyze, and report on data for each indicator as part of a pilot scorecard.
1. October 2014 – January 2015: Client Satisfaction Environmental Scan
Following the publication of the results of the agency’s client satisfaction and experience (CSE)
2014 survey blitz, the Evaluation and Quality Improvement (EQI) team met with the six agency
programs1 to discuss opportunities for improving data collection and integrity. The findings emerging
from these discussions were captured in SWOT analyses (see Appendix I). To build on these findings, we
engaged in conversations with two Toronto agencies providing mental health services for children and
youth. These two community partners have had extensive experience in the area of CSE and provided
further insight into other models of collecting client satisfaction data (see Appendix II).
After completing the environmental scan, the EQI team formulated an agency-wide CSE project plan. The plan comprised two phases, both coordinated and executed by a CSE Research Assistant. Phase I would involve reviewing the CSE surveys in use at the agency and developing a revised CSE survey for each service modality. Phase II would involve piloting the revised CSE surveys in selected service modalities.
1 The six programs are Administrative and Support services, Applied Behavioural Analysis (ABA), Community Counselling and Consultative Services (CCCS), Day School Milieu Treatment Program (DSMTP), Management, and the Toronto Partnership for Autism Services (TPAS).
2. November 1, 2014 – March 30, 2015: Focus Groups
With the development of a performance measurement framework and the push towards
introducing standardized outcome measures for all service modalities of the agency, the authors
recognized the implications for change management and the potential resistance to implementation. To
address these issues systematically, the authors enlisted the expertise of Dr. Tara Black. Dr. Black and
her team at the University of Toronto (Faculty of Social Work) conducted focus groups to develop a
greater awareness of the staff’s thoughts, opinions, and feelings concerning the agency’s approach to
performance measurement and evaluation (see Appendix III for the summary report). One of the
primary recommendations was the adoption of “reciprocal adaptation.” This approach has been shown
to enhance staff buy-in with respect to the implementation of evidence-based practices,2 and the
authors are committed to following the tenets of the approach in implementing standardized measures
and other performance measurement and evaluation activities.
3. December 2014 – June 2015: The ECC Board Engagement and Review of The ECC Balanced Scorecard
In December 2014, an ad-hoc committee of the Board of Directors was formed to provide
recommendations regarding the Board’s use of the balanced scorecard to address the following
questions:
1) How can the Board use the balanced scorecard to monitor the Centre’s organizational health? Is it
meeting its mission and objectives?
2) Is the Board doing the right things from a strategic perspective?
3) How effective is the Board’s governance?
Between January 2015 and April 2015, the authors conducted a number of mapping exercises
with a focus on aligning the strategic directions identified in the agency’s strategic plan and the
2 Aarons, G. A., & Palinkas, L. A. (2007). Implementation of evidence-based practice in child welfare: Service provider perspectives. Administration and Policy in Mental Health and Mental Health Services Research, 34(4), 411-419.
indicators in the balanced scorecard (see Appendix IV). After sharing and discussing the results of this
mapping exercise with the ad-hoc committee, the committee chair and the authors participated in a
teleconference with the Centre of Excellence to align the balanced scorecard from a governance
perspective. The Centre of Excellence offered two key recommendations: 1) ensure that terms and
definitions are commonly understood and consistently used, and 2) identify which indicators of the
scorecard relate to governance or to programmatic activities. Collectively, these recommendations help ensure that front line staff, management, and the Board use the same language and keep attention focused on the areas of the balanced scorecard that bear directly on governance and strategic decision-making. Following this meeting, the ad-hoc committee reviewed and approved a glossary of performance measurement terms as applied to the agency’s balanced scorecard (see Appendix V). The committee also identified the indicators it will recommend to the Board as “governance” indicators (see Appendix VI). These indicators will be used to ensure that The ECC is meeting its mission and its obligations to its stakeholders, including clients, staff, and funders.
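For illustration only, the governance/programmatic tagging described above can be thought of as a simple lookup that filters which indicators reach the Board. The indicator names below appear in this report, but their governance/programmatic assignments here are hypothetical placeholders, not the committee’s decisions; the actual classification is in Appendix VI.

```python
# Illustrative sketch of a governance/programmatic indicator classification.
# The assignments below are hypothetical placeholders, NOT the ad-hoc
# committee's actual classification (see Appendix VI for that).
indicators = {
    "Functioning": "governance",
    "Client Satisfaction": "governance",
    "Wait Times": "programmatic",
    "Service Duration": "programmatic",
}

# Only governance indicators would be surfaced at Board meetings.
board_view = sorted(k for k, v in indicators.items() if v == "governance")
# board_view == ["Client Satisfaction", "Functioning"]
```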
4. February 2015 – April 2015: Client Satisfaction Phase I and Phase II (pilot)
In March 2015, the EQI team welcomed a CSE research assistant to execute Phase I and Phase II
of the agency’s CSE plan. After being oriented to the work in this area, the research assistant collected
the CSE surveys in use at the time and realigned the questions with the questions that were asked as
part of the survey used in the agency’s 2014 survey blitz (see Appendix VII for an example of this
realignment). Following this realignment, the research assistant met with the group facilitators or
service providers to discuss the revised survey and to gather information about the process of CSE data
collection. In total, the research assistant conducted nine consultations across 20 service modalities.
Following these meetings, the research assistant debriefed results with the Project Coordinator and a
second draft for each survey was developed based on the feedback of the group facilitators or service
providers. At the time of the writing of this report, revised surveys were launched for two group services
(see Appendix VIII for one of the surveys) and data has been collected and analyzed. The EQI team is in
the process of scheduling meetings with the facilitators of these groups to share the findings and to
formulate recommendations for the next group cycle. The overall organization of these activities is
consistent with the agency’s desire to implement feedback loops (see Appendix IX) with respect to
performance measurement and program evaluation.
NEXT STEPS
The ECC has invested board, management and front line resources towards organizational
planning and performance and is fully committed to implementing the balanced scorecard. The
following are next steps and actions:
Balanced Scorecard and The ECC Board of Directors
The Board ad-hoc committee will present governance indicators to The ECC Board for approval
in September 2015. Following approval, the EQI team will report data for these indicators at Board
meetings using a combination of dashboard and indicator summary views (see Appendix X). The Board
will use these data to help make strategic decisions and assess the agency’s performance in relation to
its strategic plan and mission. The Board ad-hoc group will also become a standing committee (the Strategic Governance Committee) beginning in September 2015 to monitor progress on the continued implementation of the indicators and to support the Board in understanding and using the data.
Client Functioning
Over the next fiscal year, under the direction of the leadership team and with the support of the
EQI team, clinical programs will implement standardized measures for those service modalities that are
not currently using measures to monitor and evaluate client outcomes. In the case of certain services
(e.g. individual therapy and family counselling, walk-in clinic, and intensive child and family counselling),
the EQI team has already identified standardized measures (e.g. Outcome and Session Rating Scales) and
will offer staff training in September 2015.
Client Satisfaction
Between July 2015 and March 2016, revised CSE surveys will be approved and launched for the
remaining 18 service modalities. For three service modalities, Individual Therapy and/or Family
Counselling, Intensive Child and Family Counselling services, and the Day School Milieu Treatment
Program, a post-service telephone model will be implemented to assess client satisfaction and
experience. This post-service model involves a research assistant phoning clients and supporting them in responding to the CSE survey. As with other modalities, the survey responses will be aggregated and presented to leadership and staff on a quarterly basis, with follow-up for improvements.
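The quarterly reporting step described above amounts to grouping survey responses by quarter and item, then summarizing each item for presentation. The sketch below is a minimal illustration; the field names and the 1–5 scale are assumptions for the example, not the agency’s actual survey schema.

```python
from collections import defaultdict
from statistics import mean

# Each response: (quarter, survey item, score on an assumed 1-5 scale).
# The items and scores are illustrative, not actual agency data.
responses = [
    ("2015-Q3", "felt_heard", 5),
    ("2015-Q3", "felt_heard", 4),
    ("2015-Q3", "would_recommend", 3),
    ("2015-Q4", "felt_heard", 5),
]

# Group scores by (quarter, item).
summary: dict[tuple[str, str], list[int]] = defaultdict(list)
for quarter, item, score in responses:
    summary[(quarter, item)].append(score)

# Average score per item per quarter, ready to present to leadership/staff.
report = {key: round(mean(scores), 2) for key, scores in summary.items()}
# e.g. report[("2015-Q3", "felt_heard")] == 4.5
```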
Knowledge Exchange and Transfer
The ECC Executive Director is actively involved with the Toronto lead agency, East Metro Youth
Services, in supporting their implementation plan to achieve the goals of Moving on Mental Health. The
ECC Executive Director will chair a sub-group of MCYS service providers tasked with providing recommendations for Data and Performance Management across Toronto. The learnings from the work at The ECC, supported and facilitated by the Centre of Excellence grant, will inform the implementation plan for Toronto.
Sustainability
At the time of the writing of this report, the agency is recruiting and training a full-time Coordinator of Evaluation and Quality Improvement and a research assistant. These individuals will be responsible for developing and executing a work plan that incorporates the next steps and action items described above. They will also continue to build on the achievements of the performance measurement team, whose work includes training, implementing, monitoring, and providing feedback to staff at the agency. The performance measurement team will be part of the ongoing EQI team.
APPENDIX I – SWOT ANALYSES
Blitz Model

The Blitz model involves surveying all clients, or a segment of the client population, visiting the agency within a pre-defined time period.

Strengths
• Fewer resources required
• Faster data analysis

Weaknesses
• Places heavy burden on Admin
• Low response rates
• Results not meaningful enough to be transferable to learning and adaptation

Opportunities
• Use of technology (e.g. tablets, mobile apps)
• Use of the website to collect data

Threats
• New Ministry requirements (data elements) that do not align well with process
• Model does not support scorecard Client Satisfaction indicator

Post-Service Model

Strengths
• Higher response rates
• Minimal resources
• No clinical or admin staff involvement
• Program-specific data

Weaknesses
• Data stored in external database; potential difficulties with integration

Opportunities
• CoE implementation funding
• Implementation of Scorecard satisfaction indicators
• CYMH Agency I currently using model

Threats
• Sustained funding for human resources to collect and analyse data and provide feedback to staff
APPENDIX II – MINUTES FROM CONVERSATIONS WITH OTHER COMMUNITY-BASED CHILD AND YOUTH
MENTAL HEALTH (CYMH) AGENCIES
Minutes of conversation with Director of Program Evaluation Services (Toronto CYMH Agency I)

Participants: Director of Program Evaluation Services (Toronto CYMH Agency I) and Project Coordinator (The Etobicoke Children’s Centre)
Date: October 28, 2014
Topic: Client satisfaction and experience and the use of a post-service model

Overview:
• Integrated client satisfaction process into the flow of service
• Conduct telephone interviews with registered clients at end of service
• Hand out paper forms at the end of service to non-registered prevention clients (group clients)

1. Who conducts the telephone interview? / Are there any other resources required?
• Telephone interviews are conducted by a research assistant who works one evening per week.
o We find that evenings are the best time to reach families, although for some families daytime phone calls are required.
• Pull the client data out of our client information system periodically (currently every three months).
• The only other resource needed is a headset so the RA can type client responses directly into a computer program.

2. Are you using the same/similar TRCSE (Toronto Region Client Satisfaction and Experience) tool to gather the data?
• We are using identical client satisfaction tools as The ECC to gather data.

3. How long after the end of service is the interview conducted?
• Within a 3-month period. Doing it sooner would be better, so looking at pulling the data monthly.

4. Is there a process to collect the client’s consent to complete the telephone interview? If so, when is the consent collected? Does it expire?
• The client is asked for written consent during the service agreement for us to contact them at the end of service.
• The parent/guardian is also asked to provide verbal consent prior to the interview for their and their children’s participation in the interview.
• The children are asked to give their verbal assent prior to the interview.

5. Are clients reminded during service that they will receive a phone call requesting their feedback on satisfaction and experience?
• It might be helpful to remind clients to expect a phone call asking for their feedback on the service, but we currently do not do that. Reminders may increase participation.
6. Is the collected data stored in the client’s record or an external database? Do you aim to anonymize the data in any way?
• The data collected is not stored in the client’s record or in the centralized client information system.
• It is stored in an external database that resides with the Program Evaluation department.
o The programs and clinicians do not have access to this database.
• Currently the data for treatment clients is not anonymized, as we have some plans to link it to the BCFPI and CAFAS data. The paper data for prevention clients is anonymized.
7. What is the response rate like?
• The response rate for the non-registered prevention clients is close to 100%.

This year (2014)
• For the telephone surveys with treatment clients, we have not yet summarized this year’s (2014) results, but they are certainly a lot better than we had achieved with our previous mail-out surveys.
• The impression is that this year’s response rates are somewhat lower than last year’s. Of the families reached, only 59.5% of the children/youth completed the survey.
• The main issue encountered was reaching a large number of families, for a variety of reasons: the phone number was incorrect or not in service; the family no longer lived there; the phone number was for the non-custodial parent; the guardian was not home; or the guardian was ill. A few children and youth refused to participate, but in the majority of cases we had trouble getting hold of them, as they seemed to be participating in various after-school activities and were not home.

Last year (2013)
• 43% of parents/guardians were successfully reached, and all agreed to complete the survey (100% compliance rate).
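The reach and compliance figures above come from straightforward arithmetic over the call log. The sketch below uses hypothetical counts chosen to reproduce the 2013 percentages; the actual denominators are not given in the minutes.

```python
def rate(numerator: int, denominator: int) -> float:
    """Return a percentage, rounded to one decimal place."""
    return round(100 * numerator / denominator, 1)

# Hypothetical call log for one pull from the client information system.
# These counts are illustrative, not the agency's actual figures.
families_on_list = 200    # closed files pulled for follow-up
families_reached = 86     # at least one successful phone contact
surveys_completed = 86    # parent/guardian agreed and finished the survey

reach_rate = rate(families_reached, families_on_list)        # 43.0
compliance_rate = rate(surveys_completed, families_reached)  # 100.0
```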
8. Are staff (e.g. frontline, management, senior leadership) satisfied with this post-service telephone interview model?
• Overall, management and staff are relatively satisfied with this post-service telephone interview model for a variety of reasons:
o Response rates are not too bad, clinical and admin staff do not have to be involved in the process, the cost for a part-time research assistant to conduct the interviews and enter data is minimal, and the resources required to retrieve the data from the client information system are minimal.
• Would like to increase response rates further, particularly for children/youth. There has been some discussion about possibly using email as an alternate method of contacting clients, or perhaps having a computer or an iPad available for feedback, either at the end of service or in-service or both.
o May explore these options more in the future.
• Hoping that the other two projects that participated in the ministry-funded client satisfaction studies will be willing to share the computer program and iPad app that they developed.
• For in-service feedback, currently have comment cards in the lobby at each site, and clients can place their card in a locked box.
Minutes of conversation with Director of Research and Evaluation (Toronto CYMH Agency II)

Participants: Director of Research and Evaluation (Toronto CYMH Agency II) and Project Coordinator (The Etobicoke Children’s Centre)
Date: January 8, 2015
Topic: Client satisfaction and use of “blitz” survey model

• Out of TRCSE – Toronto Child and Youth Mental Health Agency I was doing telephone follow-up, and The ECC/Toronto Child and Youth Mental Health Agency II were conducting a blitz survey model.
• CYMH Agency II has 20 or so clinical staff
o Front desk enters appointments (centrally)
• Rooted in the division of clinical/administrative, plus CYSIS being difficult
• CYMH Agency II has stayed with centralized model (2015)
o On Thursday – clinical staff asked to enter appointments on centralized calendar
o Appointments are entered by admin into the system so that the following Monday the schedule is known
- Know clients are coming and who is coming
- Able to prepare packages and give to clients
- This was done in March 2013 as part of the TRCSE project
- 60% response rate
- Huge resourcing – burden on R&E department
• Most recent time of doing blitz (May 5 – June 6, 2014)
o All of the clients attending appointments except clients attending the first appointment
o 363 open, 253 clients attending appointments during that 30-day period
- Difference = closed, dropped off, etc.
- Very difficult to track
o Completed it prior to their appointment and returned surveys to front desk in envelope
- Little ballot attached for gift card
o Response rate = 24% (N=60)
o Gives multiple perspectives but no end of service
o Problem with the end of service is tracking; resources to call/interview
• Sent modified version of the Ministry Client Feedback Form (see toolkit)
o Started sending in May 2014
- All end-of-service clients
- Empirical response rate = cannot say, because of measurement difficulties
o Couple come back every week
- Identified a couple of issues
- But it is not representative data
- Functionality = understanding miscommunication problems, looking for patterns
o For Day School and Residential – send specific client satisfaction forms
o For groups – they get a standardized measure and a feedback questionnaire which is mapped onto the program logic model
• CYMH Agency II accreditation coming in October 2015
o Do not want to change anything
o Going to continue with blitz model
APPENDIX III – FOCUS GROUP REPORT SUMMARY
THE ETOBICOKE
CHILDREN’S CENTRE
FOCUS GROUP
SUMMARY REPORT
2/27/2015
An Analysis of Staff Perspectives on Performance Measurement and Evaluation
In partnership with The Etobicoke Children’s Centre and
the University of Toronto.
Dr. Tara Black
Megan Pratt
The Etobicoke Children’s Centre Focus Group Summary Report
An Analysis of Staff Perceptions on Performance Measurement and Evaluation
Introduction
In collaboration with the University of Toronto, six focus groups were conducted at The
Etobicoke Children’s Centre (The ECC) in the fall of 2014 to develop a greater awareness of The ECC
staff’s thoughts and feelings concerning the agency’s approach to performance measurement and
evaluation.
During the focus group sessions, the researchers did not interject opinions or manage
perspectives on the topic of performance measurement and evaluation. Researchers encouraged
participants to be honest in their responses without fear of reprisal. Confidentiality agreements were
signed to ensure participant anonymity.
The research team transcribed interviews verbatim from audio recordings. Resulting qualitative
data was thematically analyzed and grouped into categories of general understanding, strengths,
limitations, fears, needs, and recommendations. Researchers included negative case analysis to
supplement majority opinions and themes in results.
This summary report provides a brief overview of the study’s findings. Complete methodology
and results are available in the full report.
Organizational Change
Across the field of children’s mental health, there is growing recognition that evidence-based practice (EBP) is important for effective service delivery. Provinces, as well as local agencies, have the responsibility to monitor outcomes and assess whether current services sufficiently meet children’s mental health needs (Waddell, McEwan, Shepherd, Offord, & Hua, 2005). To meet EBP standards, agencies can measure and evaluate existing practices to demonstrate the effectiveness of their programs. Optimal integration of performance measurement and evaluation processes at the agency level requires sustained support for workers and staff (Kimber, Barwick, & Fearing, 2012). Adapting EBP policies to real-world settings involves developing partnerships with service providers and increasing staff buy-in, since reluctance among practitioners to adopt EBP can pose a major obstacle to implementation efforts (Aarons & Palinkas, 2007; Kimber et al., 2012; McCrae, Scannapieco, Leake, Potter, & Menefee, 2014). Greater staff engagement in the planning and development of policy implementation can lead to more long-term gains and better client outcomes (Ignatowicz et al., 2014).
Literature on EBP implementation emphasizes the need to understand barriers and assess multiple levels of system readiness in preparing for organizational change, including staff motivation to change (Aarons & Palinkas, 2007; McCrae et al., 2014). Organizational culture and climate change necessitates understanding the characteristics of individual practitioners within agencies, including their willingness to change, their perceptions of the change process, and their knowledge of EBP (Kimber et
al., 2012). To achieve such an understanding of practitioners at The ECC, this study analyzes focus group data on worker perceptions of performance measurement and evaluation at the agency. Qualitative interviewing about the change process was chosen as the method for data collection, as it is conducive to developing transparent, trusting partnerships with service providers and decreasing staff anxiety (Kimber et al., 2012). By applying data generated in this report, The ECC management can develop bottom-up, strategic plans to best support workers in successfully transitioning to a balanced scorecard framework for performance measurement and evaluation.
Focus Group Major Themes
General Understanding
- Agency Level
- Personal Evaluation

Strengths
- Positive Outlooks on Performance Measurement/Evaluation, Getting Better Feedback
- Comfort with Existing Performance Measurement and Evaluation Structures

Limitations
- Performance Measurement/Evaluation as not Accommodating Cognitive, Cultural, and Language Barriers of Some Clients
- Discomfort with the Subjectivity/Objectivity of Performance Measurement/Evaluation
- Notion that there is not Enough Time for Performance Measurement/Evaluation, Worker Time Could be Better Spent
- Performance Measurement/Evaluation as Having Minimal Day-to-Day Relevance

Fears
- Performance Measurement/Evaluation in General
- Adopting Business Models and Frameworks
- Not Knowing Who Will Access Data and Outcomes, Sense that an Outsider is Imposing, Developing, and Maintaining Performance Measurement/Evaluation at the Agency
- Performance Measurement/Evaluation Will Highlight Problems and Societal Factors Beyond the Control of Employees
- Performance Measurement/Evaluation Will Focus on Negative Aspects of Practice
- Programs Being Cut and Jobs Being Lost

Needs
- Better Structure/Definition/Understanding of Performance Measurement/Evaluation at the Agency
- Define Success Fairly and Accurately
- More/Better Measurement Tools, More Training to Ensure Consistent Use of Tools
- More Supervisor Guidance and Involvement in Performance Measurement/Evaluation Process
- More Clarity of Worker Competencies
- Goals that are Attainable, Relevant, Concrete, and Have an Associated Plan of Action
- More Continuity and Follow-Up on Performance Measurement/Evaluation
- Improved Client Satisfaction Surveys
- Individualized Approaches to Performance Measurement/Evaluation
- Involve Client Stakeholders in Performance Measurement/Evaluation
- Better Communication and Dissemination of Information
Recommendations
The following recommendations were developed through synthesis of focus group findings,
management perspectives, and relevant, best practice research.
1) RECIPROCAL ADAPTATION:
Enhance staff buy-in by adopting a flexible approach to agency change that promotes
reciprocal adaptation, defined as “the perceived need for the EBP to be adapted and the
need for providers to adapt their perceptions and behaviors” to accommodate performance
measurement and evaluation (Aarons & Palinkas, 2007).
Create knowledge exchange activities: e.g., sharing ideas and asking for feedback during workshops
2) COMMUNICATION:
Develop a two-way communications strategy that engages and informs all staff members on
important agency policies, protocols, and advancements.
Explain rationales (and adapt expectations as necessary) for staff daily routines, resources,
and productivity requirements, through ongoing dialogue with relevant stakeholders
(Aarons & Palinkas, 2007; Kimber et al., 2012).
Positively engage client stakeholders in the change process by publicly reporting findings
and soliciting their experiences and perceptions of performance measurement and
evaluation (Aarons & Palinkas, 2007).
Invite all stakeholders to knowledge exchange meetings: e.g., presenting preliminary findings
3) ESTABLISH WORKING GROUPS:
Involve staff in the clinical transformation process by establishing working groups and
working group leaders to encourage peer to peer consultation and meaningful participation
in the change process (Kimber et al., 2012).
Create a performance measurement and evaluation working group that meets monthly with representatives from each department.
4) RESOURCE ALLOCATION:
Engage professional consultants and/or managers who can provide ongoing oversight,
coaching, and resources to aid staff throughout the implementation of performance
measurement and evaluation (Aarons & Palinkas, 2007).
Develop and implement staff training workshops on applying and interpreting relevant
performance measurement tools by department.
Schedule dedicated time for workers to complete performance measurements and
evaluations.
Set a recurring agenda item by department to discuss performance measurement feedback
5) METHODS:
Conduct client satisfaction surveys by telephone to help accommodate cultural, language,
and cognitive barriers of clients and increase rate of return.
Utilize electronic methods for data collection, analysis, and dissemination of findings, as
much as possible (e.g. CANS Electronic Form; posting reports on agency website).
Create a section on the website for knowledge exchange (i.e., posting reports)
References
Aarons, G. A., & Palinkas, L. A. (2007). Implementation of evidence-based practice in child welfare:
Service provider perspectives. Administration and Policy in Mental Health and Mental Health Services
Research, 34(4), 411-419.
Ignatowicz, A., Greenfield, G., Pappas, Y., Car, J., Majeed, A., & Harris, M. (2014). Achieving provider
engagement: Providers’ perceptions of implementing and delivering integrated care. Qualitative
Health Research, 24(12), 1711-1720.
Kimber, M., Barwick, M., & Fearing, G. (2012). Becoming an evidence-based service provider: Staff
perceptions and experiences of organizational change. The Journal of Behavioral Health Services &
Research, 39(3), 314-332.
McCrae, J. S., Scannapieco, M., Leake, R., Potter, C. C., & Menefee, D. (2014). Who's on board? Child
welfare worker reports of buy-in and readiness for organizational change. Children and Youth
Services Review, 37, 28-35.
Waddell, C., McEwan, K., Shepherd, C. A., Offord, D. R., & Hua, J. M. (2005). A public health strategy to
improve the mental health of Canadian children. Canadian Journal of Psychiatry, 50(4), 226-233.
APPENDIX IV – ALIGNMENT BETWEEN THE ECC STRATEGIC PLAN AND BALANCED SCORECARD
Strategic Agenda: 2012 – 2015
The Etobicoke Children’s Centre, in partnership with individuals, families and communities, provides equitable, caring, respectful and knowledgeable mental health services that are responsive to the diverse needs of children, families and our communities. Our community and stakeholders experience us as flexible and responsive, a respected and respectful partner and service provider, and as an organization that is committed to promoting collaboration and innovation. Most importantly, we are trusted to deliver positive results – as one partner put it, “Our children are in good hands with The Etobicoke Children’s Centre.” It is on this continuing strong foundation that we seek to advance our mission.

Our Mission: Strengthening Children and Families through Excellence in Mental Health and Autism Services

Organizations like ours work in a constantly changing environment. Looking forward, we see a number of emerging trends to which we must respond. Our communities are becoming more diverse, and the needs of children and families more complex. Funding opportunities and expectations are also changing, with greater focus on funding targeted to demonstrated and proven needs and evidence-based practices. There is also increased emphasis on client engagement, clear service delivery mechanisms and pathways, and collaboration and formal partnerships, with the intent of ‘right sizing’ the delivery system, eliminating duplication, and strengthening capacity to deliver the highest quality and method of service to the greatest number in greatest need at a manageable cost. We must ensure sustainable growth that is aligned with our demonstrated core strengths and resources. We are – and must remain – ‘change friendly.’

The Etobicoke Children’s Centre will, during the next three years, pursue the following four Strategic Priorities:
1. Service Excellence The core of our work is providing exemplary service to children and their families. We are committed to exploring ways to expand our programs and services, extend our reach, and continually improve and renew our approaches. We will do this by:
Strategic Action: Aligning, positioning and expanding our services strategically and responsibly.
Balanced Scorecard Indicator(s): Client Functioning, Unique Clients Served, Service Duration, Wait Times, Program Development and Expansion, Special Projects

Strategic Action: Embedding evaluative practices at all levels of the organization.
Balanced Scorecard Indicator(s): Organizational Planning and Performance

Strategic Action: Encouraging service innovation.
Balanced Scorecard Indicator(s): Service Duration, Wait Times, Special Projects

Strategic Action: Cultivating partnerships which will ensure clear pathways to effective and needed services.
Balanced Scorecard Indicator(s): Innovation in Practice
2. Strengthen Stakeholder and Community Connections Essential to our success is our ability to adapt to continually changing community needs. We will do this by:
Strategic Action: Cultivating and strengthening client engagement at all levels of the organization
Balanced Scorecard Indicator(s): Client Satisfaction and Experience

Strategic Action: Responding strategically to community demographics and community service needs
Balanced Scorecard Indicator(s): No corresponding indicators

Strategic Action: Increasing awareness of our services
Balanced Scorecard Indicator(s): No corresponding indicators
3. Nurture Innovative Partnerships Building on our reputation as a collaborative leader, we will continue to lead and support alignment at the system level. We will do this by:
Strategic Action: Pursuing and nurturing partnerships that address important mutual capacity-building needs
Balanced Scorecard Indicator(s): Innovation in Practice

Strategic Action: Supporting and participating in learning communities
Balanced Scorecard Indicator(s): No corresponding indicators
4. Invest in the Future In order to ensure the sustainability of our important work, we must continue to invest in a number of areas including employees, organizational systems, governance and leadership. We will do this by:
Strategic Action: Enhancing Employee Well-being. Our employees are our most important asset. We will foster a healthy, growth-oriented, supportive and accountable workplace by: investing in and supporting professional development and professional practices which are aligned with strategic directions; strengthening employee engagement; and developing leadership capacity and strengthening succession planning.
Balanced Scorecard Indicator(s): Professional Development, Leadership Planning, Staff Satisfaction, Human Resources Management – Staff

Strategic Action: Strengthening Organizational Systems. Strong organizational structures are key to supporting and growing The Centre’s work. Improving infrastructure and administration will ensure that all staff have the tools and supports necessary to excel. We will further our organizational strength by: examining and transforming organizational and system structures which will support continual improvement and the achievement of excellence; re-visioning organizational accountabilities; and improving performance and management structures.
Balanced Scorecard Indicator(s): Stewardship, Organizational Planning and Performance, Human Resources – Staff

Strategic Action: Strong Governance. Strong governance is key to our relationship with and accountability to our funders, community and stakeholders. We will advance our governance by: strengthening Board capacity and performance; strategic engagement with inter/intra-sectoral Boards; and diversifying and expanding our resources.
Balanced Scorecard Indicator(s): Governance, Stewardship
Strategic actions without indicators:
1. Responding strategically to community demographics and community service needs
2. Increasing awareness of our services
3. Supporting and participating in learning communities

“Orphan Indicators” (indicators not associated with a strategic priority/action):

Resources
1. Financial – Funding Received
2. Financial – Expenditures Incurred
3. Human Resources – Non-Staff
4. Human Resources – Time

Key Processes
5. Risk Management
6. Human Resources Management – Student/Volunteer

Stakeholder Satisfaction
7. Funder Satisfaction
APPENDIX V – GLOSSARY OF KEY SCORECARD TERMS
Domain – A broad area of focus (e.g. clients/customers, resources/financial, etc.) encompassing smaller indicators.
The ECC examples: Clients, Resources, Key Processes, Innovation and Leadership, and Stakeholder Satisfaction

Performance Indicator – A set of quantifiable data elements that are measured to gauge or compare performance, or to evaluate success in a particular domain; indicators must be “SMART” (Specific, Measurable, Achievable, Realistic, Timely).
The ECC examples: Client Functioning, Client Satisfaction, etc.

“Programmatic Indicators” – Of or pertaining to the program and/or the operations of a particular program.

“Governance Indicators” – Of or pertaining to the governance and/or stewardship of the organization.

Data Element – The fundamental building blocks of indicators. Each indicator will comprise a different number of data elements. Because indicators must be measurable, data elements by extension must also be measurable.
The ECC example: Every unique service modality will contribute data to the “Client Functioning” indicator. Each of these contributions is reflected by a data element.

Measure – A tool or process used to assess the degree, extent, quality or change of or in something.
The ECC example: Different measures are used by each service modality to measure changes in client functioning or outcomes.
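The glossary above describes a simple hierarchy: domains contain indicators, and indicators are built from measurable data elements. The following Python sketch is illustrative only; the class names, fields, and the sum-based roll-up are assumptions for demonstration, not part of the ECC scorecard itself.

```python
from dataclasses import dataclass, field
from typing import List

# Illustrative data model for the glossary terms above (names are assumed).
@dataclass
class DataElement:
    name: str      # e.g. one service modality's contribution
    value: float   # data elements must be measurable

@dataclass
class Indicator:
    name: str                                        # e.g. "Client Functioning"
    elements: List[DataElement] = field(default_factory=list)

    def result(self) -> float:
        # One simple way to roll data elements up into an indicator value.
        return sum(e.value for e in self.elements)

@dataclass
class Domain:
    name: str                                        # e.g. "Clients"
    indicators: List[Indicator] = field(default_factory=list)

# Example: two hypothetical service modalities contributing data elements
# to the "Client Functioning" indicator within the "Clients" domain.
functioning = Indicator("Client Functioning", [
    DataElement("Counselling", 42.0),
    DataElement("WESAT Teen Girls' Group", 18.0),
])
clients = Domain("Clients", [functioning])
print(functioning.result())  # 60.0
```

The point of the sketch is only the containment relationship (domain → indicator → data element); how data elements are actually combined would depend on each indicator's definition.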
APPENDIX VI – LIST OF GOVERNANCE AND PROGRAMMATIC INDICATORS
Categorization of scorecard indicators

Governance (only) / Programmatic (only):
1) Unique Clients Served
2) Service Duration
3) Human Resources – Staff
4) Human Resources – Non-Staff
5) Human Resources – Time
6) Human Resources – Professional Development
7) Human Resources Management – Staff (key process)
8) Human Resources Management – Learners (key process)
9) Program Development and Expansion
10) Leadership Development
11) Special Projects

Governance and Programmatic (i.e. both):
1) Client Functioning
2) Wait Times
3) Funding Received
4) Expenditures Incurred
5) Governance for Accreditation (key process)
6) Stewardship
7) Risk Management
8) Organizational Planning and Performance
9) Innovation in Practice
10) Client Satisfaction and Experience
11) Staff Satisfaction
12) Community Partner Satisfaction
13) Funder Satisfaction
APPENDIX VII – EXAMPLES OF REALIGNMENTS FOR CLIENT SATISFACTION AND EXPERIENCE SURVEY
FOR COLLABORATIVE PROBLEM SOLVING GROUP FOR PARENTS/CAREGIVERS
- “Staff” to “group leader”; “Service” to “group”
- Currently use the “Problem Solving Together” survey (and “TK-COT” as a pre- and post-measure)
  o Add #4: “Did you get useful feedback on the work you had done between groups?”
  o Need to discuss #5: “Has the way you see your child’s behaviour changed?” Very important to them, about changing perceptions. Not captured in CSE #19-20
  o Need to discuss #6: “Has the group helped you to better deal with your child?” Not captured
  o #9 & #10: Not appropriate; it is a group setting, therefore individual needs are not the focal point
  o #13: “Services” is too vague (bus services, pizza services?)
  o #14 & #15: Parents are not involved in planning, and there is no schedule
- Add a new question regarding breaking down barriers to services
  o Marietta and Margaret to come up with a question, something along the lines of “If there were stresses in going to the group, were they addressed?”

Examples of revised questions:
- The wait to start the group was reasonable
- The group was available at a location that was convenient
- The group was available at times that were convenient
- The ECC was friendly and welcoming
- It was easy to talk with the group leader
- It was easy to share our views with the group leader
- The group leader listened to us
- The group leader talked to us in a way that was easy to understand
APPENDIX VIII – CLIENT SATISFACTION AND EXPERIENCE SURVEYS FOR WEST END SERVICES FOR ABUSE AND TRAUMA (WESAT) TEEN GIRLS’ GROUP (YOUTH PARTICIPANTS)
The name of the group you are attending:
How many group sessions have you attended so far at The ECC?
Please choose the answer that is best for you:
Strongly Agree | Agree | Somewhat Agree | Not Sure | Somewhat Disagree | Disagree | Strongly Disagree
1. The wait time before I started the group was okay
2. The group was available at a location that is convenient
3. The group was available at times that are convenient
4. The staff at reception were friendly and welcoming
5. It was easy to talk with my group facilitators
6. It was easy to share my views with my group facilitators
7. I trust my group facilitators
8. My group facilitators listened to me
9. My group facilitators talked to me in a way that was easy to understand
10. My group facilitators understood my needs
11. My group facilitators understood my strengths and abilities
12. My group facilitators were sensitive to my cultural background
13. My group facilitators were able to help me
14. I got the support I needed from the group
15. I was involved in planning the group topics
16. It was clear what would happen in the group
17. My group facilitators were knowledgeable about other services when I had questions
18. I learned useful skills in the group
19. I am hopeful that I am better able to deal with my problems as a result of attending this group
20. I am hopeful that things will get better for me
21. Overall, I am satisfied with the group
22. I would recommend the group to a friend
Without anyone knowing who you are, may we use your words in reports or printed materials? YES NO
What was good about the group?
What could make the group better?
Do you have needs for other youth mental health services that The ECC does not provide? YES NO
If yes, please identify:
What do you think are some of the most pressing youth issues or needs in your community?
APPENDIX IX – PERFORMANCE MEASUREMENT AND EVALUATION CYCLE
Feedback loops are an important part of performance measurement and evaluation. As a result, The ECC is working to create and sustain the type of feedback loop captured in the diagram below.
1. Run service – Services (e.g. counselling, groups, etc.) are provided on an ongoing basis. Service providers administer the standardized measures for their service and collect the raw data.
2. Collect and analyze data – The EQI team collects the raw data, enters it into the system (e.g. MS Excel, MS Access) and analyzes the data.
3. Present to program staff – The EQI team presents the analyzed data to the service providers, their supervisors and the director of the program.
4. Review service – As a team, these individuals interpret the data and provide context for the findings.
5. Formulate recommendations – The program staff formulate recommendations with respect to improving the service (i.e. increasing safety, improving timeliness of treatment, increasing efficiency, improving effectiveness, improving equity, or increasing client-centeredness).
6. Write final report – The EQI team synthesizes the data analyses, discussion, and recommendations into a final report.
7. Implement recommendations – The program staff implement the recommendations and the cycle continues.
[Cycle diagram: Run service → Collect and analyze data → Present to program staff → Review service → Formulate recommendations → Write final report → Implement recommendations → (repeat)]
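Steps 2 and 3 of the cycle can be illustrated with a short sketch. This is a hypothetical example only: it assumes the raw data take the form of pre/post standardized-measure scores keyed by service, and the function names, data layout, and score values are invented, not the EQI team's actual tooling (which the report notes may be MS Excel or MS Access).

```python
from statistics import mean

# Hypothetical raw data: (pre, post) standardized-measure score pairs per
# service, as collected in step 1 of the cycle. All values are invented.
raw_data = {
    "Counselling": [(55, 42), (61, 50), (48, 47)],
    "Teen Girls' Group": [(70, 58), (66, 60)],
}

def analyze(scores):
    """Step 2: summarize average pre, post, and change for one service."""
    pre = mean(p for p, _ in scores)
    post = mean(q for _, q in scores)
    return {"avg_pre": pre, "avg_post": post, "avg_change": post - pre}

# Step 3: produce per-service summaries to present to program staff.
summary = {service: analyze(scores) for service, scores in raw_data.items()}
for service, stats in summary.items():
    print(f"{service}: average change = {stats['avg_change']:+.1f}")
```

Steps 4 through 7 (interpretation, recommendations, reporting, implementation) are human activities around output like this, which is why the loop is drawn as a cycle rather than a pipeline.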
APPENDIX X – DASHBOARD AND INDICATOR SUMMARY VIEWS
The Dashboard View is analogous to an executive summary. It presents each indicator, the target value (if any), and the result for the reporting period identified at the top of the page (there are five possible reporting periods: the four quarters of the fiscal year and the fiscal year itself). It allows viewers to absorb a high-level summary of the Centre’s performance for the reporting period.
The “Indicator Summary” view presents additional information about each indicator (i.e. its definition, target, and relevance to the strategic vision) and describes the results in more detail. The “Results and Analysis” section of the Indicator Summary page compares results for the reporting period with those of the previous reporting period and presents the findings graphically.
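The period-over-period comparison described above can be sketched as follows. Only the five-period structure (Q1 through Q4 plus the fiscal year) and the idea of an optional target come from the report; the indicator name, values, and the "lower is better" target rule are invented for illustration.

```python
# Hypothetical indicator results by reporting period (all values invented).
results = {
    "Wait Times": {"Q1": 34, "Q2": 31, "Q3": 29, "Q4": 27},
}
# Assumed target: for this illustration, lower wait times are better.
TARGETS = {"Wait Times": 30}

def indicator_summary(name, period, previous):
    """Build an Indicator Summary-style comparison of two reporting periods."""
    current = results[name][period]
    prior = results[name][previous]
    return {
        "indicator": name,
        "period": period,
        "result": current,
        "change_vs_previous": current - prior,
        "met_target": (name not in TARGETS) or current <= TARGETS[name],
    }

summary = indicator_summary("Wait Times", "Q4", "Q3")
print(summary["change_vs_previous"])  # -2
print(summary["met_target"])          # True
```

In practice the direction of "better" differs by indicator (e.g. higher satisfaction, lower wait times), so a real scorecard would store that direction alongside each target rather than hard-coding it as here.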