
Page 1: Evaluation for change

Evaluation for change: Measures for success

Suzanne Pope, Director Evaluation

FACS Analysis & Research

19 June 2014

Page 2: Evaluation for change

• Introduction to FACSAR – who we are and what we do

• What do we mean by evaluation?

• What does evaluation mean to you?

• How evaluation can be a tool to measure your impact and achievements

Workshop Overview


Page 3: Evaluation for change

FACSAR was formed to provide integrated analysis and research functions.

Service Intent

• Client- and community-focused analysis and research services

• Forward-looking and integrated analysis responses considering a range of data, analysis and modelling inputs

Services

Data Analysis
• Analyses data from a range of sources
• Data validation
• Identify strategic data gaps to inform policy
• Ad-hoc data requests
• Statistical reporting

Modelling
• Model development
• Demand forecasting
• Trend forecasts
• Future capacity-to-serve analysis
• Ad-hoc data requests
• Scenario testing

Research
• Undertake literature reviews
• Manage longitudinal research studies
• Conduct market research
• Synthesise existing research and undertake meta-analysis

Evaluation
• Baseline analysis and establishment
• Post-project evaluation analysis (pre-post comparison; see the sketch after this list)
• Benefits realisation

Integration
• Collates and analyses a range of research, data and modelling outputs
• Applies a strategic lens to the analysis with the client objectives front of mind
• Focused on answering questions with strategic insights rather than data
• Answering questions asked… and unasked
• Making policy linkages to enable a more considered and comprehensive response
• Making connections across projects to maximise insight sharing across FACS
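As a hedged illustration of the pre-post comparison mentioned under the Evaluation service: the sketch below compares hypothetical baseline and follow-up scores for the same clients using a paired t-test. The data and scale are invented for illustration; this is a minimal sketch, not FACSAR's actual method.

```python
# Minimal pre-post comparison on hypothetical paired data:
# wellbeing scores for the same clients before and after a program.
from scipy import stats

baseline = [12, 15, 9, 14, 11, 16, 10, 13]   # scores at intake (invented)
followup = [14, 18, 11, 15, 14, 19, 12, 15]  # same clients afterwards (invented)

# Mean change per client; positive means improvement on this scale.
changes = [f - b for b, f in zip(baseline, followup)]
mean_change = sum(changes) / len(changes)

# Paired t-test: how surprising is this average change if the
# program had no effect?
t_stat, p_value = stats.ttest_rel(followup, baseline)

print(f"mean change: {mean_change:.2f}")
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```

Note that a simple pre-post comparison cannot rule out change that would have happened anyway, which is one reason baseline establishment and more rigorous evaluation designs matter.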

Page 4: Evaluation for change

Evaluation Unit

Design & Implementation
• Strategies for major reforms
• Lead significant projects
• Assist divisions to identify evaluation priorities

Governance
• Independent Advisory Committee
• Evaluation schedule
• Compliance with NSW Framework

Quality Assurance
• Assist program areas with evaluation questions and specifications; critique reports
• Communicate findings
• Meta-analysis

Capacity Development
• Coaching and networking
• Resources and guidelines
• Learning and development

Page 5: Evaluation for change

• There is a vast body of research on the efficacy of specific interventions, but it is not comprehensive, complete or consistent.

• Understanding return on investment requires a focus on outcomes, yet contracts and service delivery have typically been measured by process and output measures rather than by meaningful outcomes.

• Much of the field is not suited to the traditional medical research model of randomised controlled trials and the like.

• Alternative research approaches that can effectively measure complex investment/return variables and relationships are only just beginning to emerge.

What is evidence to support practice?


Page 6: Evaluation for change

• How do you know if your program is working?

• Is it meeting its goals (for clients and the community)? Identify areas for improvement

• Provide evidence of benefits (accountability, funding)

• Help make decisions: is the program needed?

Why evaluate?


Page 7: Evaluation for change

Monitoring:

• Ongoing ‘collection and analysis of information (data) in relation to a program that is able to provide key stakeholders with an indication of progress against stated goals and objectives’.

• Monitoring has traditionally focussed on processes (activities and outputs), but can also include outcomes.

Review:

• Program reviews tend to be less methodologically rigorous than evaluations and focus more on program outputs and efficiencies.

• Usually draw on retrospective performance data, but may include some additional data collection (e.g. interviews with key stakeholders)

Evaluation:

• “Systematic and objective process to make judgements about the ‘merit or worth’ of a program, usually in relation to its effectiveness, efficiency and appropriateness” (NSW Treasury Program Evaluation Unit, 2013).

• Evaluation builds on monitoring, but is more in-depth and usually examines the level of outcomes achieved; intended and unintended effects; approaches that worked well; and reasons for success or failure.

Monitoring, Review, Evaluation


Page 8: Evaluation for change

A comprehensive approach to continuous improvement aims to integrate the functions of monitoring, review and evaluation

Monitoring versus Evaluation


[Diagram: Monitoring, Review and Evaluation on a continuum. Evaluation – more intensive; involves additional data collection; one-off activity. Monitoring – less intensive; undertaken on an ongoing basis. Review sits between the two.]

Page 9: Evaluation for change

• OUTPUTS (may be confused with outcomes) – something that can be measured

For example: a procedural manual; a care plan; a service commodity (e.g. a consultation)

• OUTCOMES – ‘intangible’ changes made ‘tangible’ by measuring things about them

For example: change in cognition (perception, knowledge, awareness); behaviour (skills, capabilities, choices); or conditions (security, stability, sustainability)

Evaluation terms cont.


Page 10: Evaluation for change

• INPUTS – the raw materials: resources and people

For example: equipment & facilities; people & funding; legislation & published research

• ACTIVITIES (what you do with inputs) – an action or intervention (a single activity)

For example: assessing a client; developing a care plan; delivering a package of care

What do we mean – Evaluation terms


Page 11: Evaluation for change

INDICATORS – measurements that ‘indicate’: we measure them because we think they show us that something has happened

They come from a wide range of sources:

- Survey questions, interview questions

- Performance reporting elements

- Case plans, assessment measurements

Evaluation terms cont.

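As a hedged illustration of turning one such source into an indicator, the sketch below computes the share of survey respondents giving a positive answer. The question, responses and coding are all hypothetical.

```python
# Hypothetical survey-based indicator: share of respondents who
# agree they feel safer after the program (invented data).
responses = ["agree", "strongly agree", "neutral", "agree",
             "disagree", "strongly agree", "agree", "neutral"]

POSITIVE = {"agree", "strongly agree"}

n_positive = sum(1 for r in responses if r in POSITIVE)
indicator = n_positive / len(responses)

# An indicator only 'indicates' that something has happened;
# it is evidence, not proof.
print(f"{indicator:.0%} of respondents report feeling safer")
```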

Page 12: Evaluation for change

Roadmap for change:

- identifies program drivers

- makes explicit your assumptions

- highlights the connection between what you plan to do AND how you think it will result in the change you want to see

Evaluation in action


Inputs → Activities → Outputs → Outcomes

Page 13: Evaluation for change

• The number one stumbling block when planning to evaluate a program is a lack of clarity about how a program is intended to ‘work’

Challenges of evaluation


Page 14: Evaluation for change

• Inputs: financial, human and material resources

• Activities: action steps necessary to produce project or program outputs

• Outputs: program deliverables or products, goods or services delivered to program recipients

• Outcomes: results or changes for individuals, groups, communities, organisations or systems (i.e. what happens after outputs are delivered by a program)

Developing Program Logic


Inputs → Activities → Outputs → Outcomes
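To make the chain concrete, here is a minimal sketch of a program logic captured as plain data; the program and every entry are hypothetical, invented for illustration.

```python
# A hypothetical program logic as plain data (all entries invented).
program_logic = {
    "inputs":     ["funding", "case workers", "facilities"],
    "activities": ["assess clients", "develop care plans",
                   "deliver care packages"],
    "outputs":    ["assessments completed", "care plans in place",
                   "care packages delivered"],
    "outcomes":   ["clients report improved wellbeing",
                   "more stable housing for clients"],
}

# Reading the stages in order makes the assumptions explicit:
# IF these inputs resource these activities, THEN these outputs
# follow, AND we expect these outcomes to result.
for stage in ("inputs", "activities", "outputs", "outcomes"):
    print(f"{stage}: {', '.join(program_logic[stage])}")
```

Writing the logic down this way forces each assumed link in the chain into the open, which is exactly what the roadmap for change on page 12 asks for.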

Page 15: Evaluation for change

How can we help you?

1. On your terms – to get your views, your ideas, a sense of your needs and issues.


Page 16: Evaluation for change

• What do you do now?

• What would you like to do better?

• What would help you get there?

What and how…


Page 17: Evaluation for change

• Using evaluation as a tool for continuous improvement

• To measure performance

• To demonstrate client benefits

Discussion point 1


Page 18: Evaluation for change

• Can data and evidence of effectiveness help you stand out in the marketplace?

• How can evidence-based practice respond to a consumer-driven market?

Discussion point 2


Page 19: Evaluation for change

• Participatory and inclusive research and evaluation – how to engage the consumer and carer perspective

• Consent and engagement – using consumer and carer feedback for service improvement

• From rhetoric to reality

Discussion point 3


Page 20: Evaluation for change

Evaluation toolkit

• http://www.dpc.nsw.gov.au/programs_and_services/policy_makers_toolkit/evaluation_toolkit

NSW Government Evaluation Framework

• http://www.dpc.nsw.gov.au/__data/assets/pdf_file/0009/155844/NSW_Government_Evaluation_Framework_August_2013.pdf

Better Evaluation

• http://betterevaluation.org/

Evaluation resources
