Not Just an AIRS Standard: Program Evaluation & Quality Assurance
DESCRIPTION
Not Just an AIRS Standard: Program Evaluation & Quality Assurance. Robert McKown, CIRS, Director of Evaluation and Accountability; Sherri Vainavicz, CIRS, Manager, UW’s 2-1-1; Heart of West Michigan United Way, Grand Rapids, Michigan. Workshop Objectives.
TRANSCRIPT
Not Just an AIRS Standard: Program Evaluation & Quality Assurance
Robert McKown, CIRS, Director of Evaluation and Accountability
Sherri Vainavicz, CIRS, Manager, UW’s 2-1-1
Heart of West Michigan United Way, Grand Rapids, Michigan
5/23/2013
Workshop Objectives
• Understand the value of evaluating Information and Referral (I&R) services
• Develop and use outcome indicators to measure and demonstrate the impact of I&R services
• Recognize relevant AIRS standards and accreditation criteria
• Understand and utilize quality assurance measures for managing and strengthening I&R services
What do you hope to learn today?
Community Impact
What does your I&R service provide your community?
What is the “impact”?
How do you know this?
Program Evaluation
“…the systematic collection of information about the activities, characteristics, and outcomes of programs to make judgments about the program, improve effectiveness, and/or inform decisions about future programming”
Source: Salvatore Alaimo, PhD – Grand Valley State University
Program Evaluation is…
A means for organizational learning (Patton, 1997)
Time and effort well spent ensuring:
o The effectiveness of programs
o The organization’s ability to adapt to a changing environment
Program Evaluation is not…
An episodic event (it is an ongoing development process)
Something you do only to satisfy a funder
Something you do only to promote your work
A test or a punishment
Purpose of Evaluation
• Ensure you are attaining what you intend to accomplish
• Obtain feedback to assess strengths and to identify where to focus efforts on building a stronger program
• Cost vs. efficiency benefits
• Accountability to the public and funding entities
Terminology
Outcomes – benefits or changes for program participants
Outputs – direct products (volume) of program activities
Activities – what the program does
Inputs – all of the resources necessary to deliver the program
Logic Model
Inputs • Staff, volunteers, money, facilities, equipment, systems, supplies
Activities • Counseling, mentoring, feeding, sheltering, building, entertaining, educating
Outputs • Number of workshops, number of hours, number of participants
Outcomes • New knowledge, gained skills, changed attitudes, modified behavior, improved condition
MISSION
Quality Assurance Tools
• Follow-up
• Agency feedback
• Secret shopper
• Database quality checks
• Call accounting data (abandonment, average time to answer, average time on call, call trends/scheduling)
• Monitoring calls
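The call accounting metrics named above (abandonment, average time to answer, average time on call) reduce to simple aggregation over call logs. A minimal sketch in Python, assuming a hypothetical list of call records; the field names `answered`, `wait_seconds`, and `talk_seconds` are illustrative, not taken from any particular phone system:

```python
# Sketch: computing basic call accounting metrics from hypothetical call records.
# Field names (answered, wait_seconds, talk_seconds) are invented for illustration.

calls = [
    {"answered": True,  "wait_seconds": 12, "talk_seconds": 240},
    {"answered": True,  "wait_seconds": 45, "talk_seconds": 180},
    {"answered": False, "wait_seconds": 90, "talk_seconds": 0},   # abandoned call
    {"answered": True,  "wait_seconds": 8,  "talk_seconds": 300},
]

answered = [c for c in calls if c["answered"]]

# Abandonment: share of offered calls the caller gave up on before being answered.
abandonment_rate = (len(calls) - len(answered)) / len(calls)

# Averages are computed over answered calls only.
avg_speed_of_answer = sum(c["wait_seconds"] for c in answered) / len(answered)
avg_time_on_call = sum(c["talk_seconds"] for c in answered) / len(answered)

print(f"Abandonment rate: {abandonment_rate:.0%}")
print(f"Average speed of answer: {avg_speed_of_answer:.1f} s")
print(f"Average time on call: {avg_time_on_call:.1f} s")
```

Call trends for scheduling would come from the same records by grouping counts per hour or weekday rather than averaging.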
Be Practical
Quality assurance tools can work together. No single quality assurance tool measures or demonstrates every component of the I&R program. A gap or question identified by one tool may be filled or answered, affirmed or contradicted, by another tool in your quality assurance tool kit.
Relevant AIRS Standards
Standards
The I&R service has a process for examining its viability as an organization, the effectiveness of its services, its appropriate involvement in the community and its overall impact on the people it serves. (Standard 29 – Quality Indicator 1)
Accreditation Standards
…method for tracking call volume, average speed of answer, abandoned calls, average call handling time and incoming call patterns (Standard 29 – Quality Indicator 2)
…creates internal reports to assess operational effectiveness (Standard 29 – Quality Indicator 3)
Standards
…conducts an annual evaluation of I&R activities (including the resource database and website) that involve inquirers, service providers… (Standard 29 – Quality Indicator 4)
The I&R conducts regular customer satisfaction surveys (Standard 29 – Quality Indicator 5)
Standards
The I&R service involves inquirers, service providers and others…in the evaluation process; and modifies the program in response to evaluation… (Standard 29 – Quality Indicator 6)
Follow-up Overview
Definition: Follow-up
A telephone call to I&R inquirers to gather information about their 2-1-1 experience
Allows for evaluating program effectiveness
Results used to make better strategic decisions about service delivery
Measurements: What Does a Follow-up Measure?
3 Areas of Evaluation
Operations • Strengths of service delivery • Areas for service delivery improvement
Outcomes • Benefits to callers
Caller Satisfaction • Subjective level of satisfaction with service
Benchmarks
Operations • 80% will report being provided with appropriate referrals
Outcomes • 70% will report they received requested services from the referral agency
Caller Satisfaction • 92% will report that they are satisfied or very satisfied with the I&R services they received
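Checking follow-up results against these three benchmarks is a mechanical comparison of response rates to targets. A minimal sketch, where the response tallies are invented for illustration; only the three target percentages come from the slide above:

```python
# Sketch: comparing follow-up survey tallies to the three benchmarks.
# Targets are from the benchmarks above; the response counts are invented.

benchmarks = {
    "appropriate_referrals": 0.80,  # Operations
    "received_services":     0.70,  # Outcomes
    "satisfied":             0.92,  # Caller Satisfaction
}

# (yes responses, total asked) per measure -- hypothetical data
responses = {
    "appropriate_referrals": (164, 200),
    "received_services":     (138, 200),
    "satisfied":             (187, 200),
}

for measure, target in benchmarks.items():
    yes, total = responses[measure]
    rate = yes / total
    status = "met" if rate >= target else "below benchmark"
    print(f"{measure}: {rate:.0%} (target {target:.0%}) - {status}")
```

A report built this way flags at a glance which of the three evaluation areas needs attention in the next period.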
Analyze / Report Collected Data
Demographic Data • Describes the overall characteristics of callers followed up on • Age, gender, city, nature of request, etc.
Quantitative Data • Frequencies • Percentages • Means / averages
Qualitative Data • Responses to open-ended questions
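The quantitative pieces above (frequencies, percentages, means) amount to simple tabulation over follow-up records. A minimal sketch using Python's standard library; the caller records and field names are invented for illustration:

```python
# Sketch: frequencies, percentages, and means from hypothetical follow-up records.
from collections import Counter
from statistics import mean

records = [
    {"city": "Grand Rapids", "age": 34, "request": "housing"},
    {"city": "Grand Rapids", "age": 52, "request": "utilities"},
    {"city": "Wyoming",      "age": 41, "request": "housing"},
    {"city": "Kentwood",     "age": 29, "request": "food"},
]

freq = Counter(r["request"] for r in records)          # frequencies per request type
pct = {k: v / len(records) for k, v in freq.items()}   # percentages of all callers
avg_age = mean(r["age"] for r in records)              # means / averages

print(freq.most_common())
print({k: f"{v:.0%}" for k, v in pct.items()})
print(f"Average age: {avg_age}")
```

Qualitative data (open-ended responses) does not reduce this way; it is typically coded into themes by hand before being counted.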
How Follow-up Data Is Used
Identification of community service gaps
Identification of incorrect or outdated agency/database information
Identification of reasons callers are not receiving services
Identification of I&R program strengths and potential staff training needs
Agency Survey
Definition: Agency Survey
A questionnaire mailed to a sample of community agencies to gain their perception of, and experience with, the I&R program
What Does the Agency Survey Measure?
Accuracy of referrals
The perception of, and experience with, the I&R program from the perspective of agencies
Agency Survey Process
Survey link mailed to 20% of the local agencies in the database
Agencies asked to track the referral source for new clients for one month and to identify those referred by the I&R program (previous surveys)
Agencies complete the survey
Analyze the results
Silent Monitoring
Definition: Silent Monitoring
Observation of an I&R call to determine and measure how well established call standards and elements are met
What Does Silent Monitoring Measure?
Quality of the I&R communication, whether essential elements were completed, familiarity with the phone system and database, and general performance of the I&R specialist
Identifies best practices and strengths
Identifies gaps in knowledge about community resources and other areas for staff development
Silent Monitoring Process
Callers are provided a message that their conversation may be monitored
The I&R manager logs on to listen to calls on an I&R specialist’s phone extension, or listens to recorded calls
The I&R manager listens and records which call elements were completed during the call
The I&R manager shares the observations with the I&R specialist
The I&R manager and team look for trends to identify strengths or gaps
Benchmarks
Average score on silent monitoring of 80% of the possible total score (88 out of a possible 110 points)
1% of calls monitored
Silent Monitoring Outcomes
Average score on silent monitoring: 91 (83% of possible total score)
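The benchmark and outcome figures are both percentages of the 110 possible monitoring points: 88/110 is exactly 80%, and the observed average of 91/110 is about 82.7%, reported as 83%. A quick check of that arithmetic:

```python
# Sketch: silent-monitoring score as a percentage of the 110 possible points.
MAX_POINTS = 110

benchmark_points = 88   # benchmark: 80% of the possible total
observed_average = 91   # reported average score

benchmark_pct = benchmark_points / MAX_POINTS   # exactly 0.80
observed_pct = observed_average / MAX_POINTS    # ~0.827, reported as 83%

print(f"Benchmark: {benchmark_pct:.0%}, Observed: {observed_pct:.0%}")
print("Benchmark met" if observed_average >= benchmark_points else "Below benchmark")
```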
Applying Quality Assurance
Adjustments and Changes Identified by Participants
Adjustments and changes made by the presenters’ organization
Describe a change in policy or procedure in your program that was based on evaluation.
What was measured, and what was the change?
Adjustments
Agency presentations and site visits
Schedules adjusted to ensure the right number of staff at the right time
Increased silent monitoring to gain a more objective measure, in response to agency survey input that referrals are not as accurate as desired
Added temporary staff to update the resource database
Hired someone with bilingual skills when filling a vacant position
Found additional resources for staff
Group Analysis of Quality Assurance Data
Dashboard to identify:
o Strengths
o Gaps
o Next steps
o Solutions
How to Find an Evaluator…
American Evaluation Association (AEA) evaluator search – http://www.eval.org/find_an_evaluator/evaluator_search.asp
Local affiliates of AEA
Michigan Association for Evaluation
Local colleges and universities
For More Information Contact:
Robert McKown, Sr. Director of Evaluation & Accountability, (616) [email protected]
Sherri Vainavicz, 2-1-1 Program Manager, (616) [email protected]
Thank you!