
Page 1

Evaluation design and implementation

Puja Myles
Puja.myles@nottingham.ac.uk

Page 2

Session outline

• Evaluation frameworks
• CDC framework for evaluation
• Theory of change and logic models
• RE-AIM framework
• Maxwell’s quality assessment framework
• Practical exercise: Using a logframe matrix and decision models for evaluation planning/design

Page 3

What is an evaluation framework?

[Diagram: an evaluation framework shown as a sequence of numbered steps (Step 1 to Step 4), with Step 2 labelled ‘Deciding and measuring health outcomes’.]

Page 4

CDC framework for evaluation

Step 1: Engage stakeholders
Step 2: Describe the program
Step 3: Focus the evaluation design
Step 4: Gather credible evidence
Step 5: Justify conclusions
Step 6: Ensure use and share lessons learned

Page 5

Step 1: Engage stakeholders

Key stakeholders:
• People involved in programme operations (funders, managers, administrators)
• People served or affected by the programme (clients, family members, elected officials, sceptics)
• Primary users of the evaluation (a subset of all the stakeholders identified; these are the people who can act on findings and bring about change)

Page 6

Role of stakeholders

• Clarify the programme objectives
• Help you elucidate the underpinning theory of change
• Help design and carry out the evaluation
• Help frame recommendations for practice based on findings
• Initiate change/act on recommendations

In other words, stakeholders ensure that the evaluation is meaningful.

Page 7

Step 2: Describing the programme-1

• Mission and objectives of the programme
• The problems addressed by the programme (nature and magnitude of the problem; populations affected)
• How the programme intends to address the problem (theory of change)
• Expected effects of the programme

Page 8

Step 2: Describing the programme-2

• Activities
• Resources
• Context (setting and environmental influences, e.g. political/historical/social)
• Logic model

Page 9

Theory of change

• This approach sets out the series of outcomes expected to unfold from the various components of the intervention, as a basis for planning the evaluation strategy.

• Can be visualised as a sequential process of ‘if-then’ statements, as in the sketch below.
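As a toy illustration (a hypothetical smoking-cessation programme, invented for this sketch and not taken from the slides), an ‘if-then’ chain can be written out step by step:

```python
# A toy 'if-then' theory-of-change chain for a hypothetical
# smoking-cessation programme; every step is invented for illustration.
chain = [
    "nurses are trained to give brief advice",
    "patients receive brief advice at appointments",
    "more patients make quit attempts",
    "smoking prevalence falls",
]

# Each outcome is expected to lead to the next: if step N, then step N+1.
for cause, effect in zip(chain, chain[1:]):
    print(f"If {cause}, then {effect}.")
```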

Page 10

Logic model/logframe matrix

• A practical approach to understanding the theory of change for a given intervention

• Can be used with stakeholders

Page 11

An example logframe matrix

The matrix has four columns: Narrative summary, Verifiable indicators, Means of verification, and Assumptions. Each column is completed for four rows:

• Goal (Why are we doing this?)
• Purpose (What will we achieve?)
• Outputs (What immediate outcomes will we achieve?)
• Activities (What will we do?)
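A minimal sketch of how a completed logframe might be captured row by row as structured data, for use in the practical exercise. The programme, indicators, and assumptions below are hypothetical examples, not content from the slides:

```python
# An illustrative logframe matrix captured as structured data.
# All entries are hypothetical examples, not taken from the slides.
from dataclasses import dataclass

@dataclass
class LogframeRow:
    level: str                  # Goal, Purpose, Outputs, or Activities
    narrative_summary: str      # What are we doing, and why?
    verifiable_indicators: str  # How will we know we have achieved it?
    means_of_verification: str  # Where will the evidence come from?
    assumptions: str            # What must hold for the logic to follow?

logframe = [
    LogframeRow(
        level="Goal",
        narrative_summary="Reduce smoking prevalence in the district",
        verifiable_indicators="Prevalence falls by 5 percentage points in 3 years",
        means_of_verification="Annual health survey",
        assumptions="No major change in tobacco pricing policy",
    ),
    LogframeRow(
        level="Activities",
        narrative_summary="Train practice nurses to deliver brief advice",
        verifiable_indicators="Number of nurses trained per quarter",
        means_of_verification="Training attendance records",
        assumptions="Practices release nurses to attend training",
    ),
]

for row in logframe:
    print(f"{row.level}: {row.narrative_summary}")
```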

Page 12

Step 3: Focusing the evaluation design

Things to consider:
• Purpose of evaluation (feasibility, effectiveness, change, empowerment, sponsor requirement)
• Evaluation questions (merit, cost-effectiveness, equity, quality)
• Feasibility
• Ethics

Page 13

Study designs

Ovretveit (1998) outlined six basic evaluation designs:

• Descriptive
• Audit
• Outcome (the before-after comparison; quasi-experimental design)
• Comparative experimental
• Randomised controlled experimental
• Intervention to a service (impact on providers and patients)

Page 14

CDC framework: Steps 4-6

Step 4: Gather credible evidence (which outcomes, and how will you measure them)

Step 5: Justify conclusions (attribution versus contribution; alternative explanations such as bias, chance, confounding)

Step 6: Ensure use and share lessons learned (stakeholder involvement; participatory approaches)

Page 15

RE-AIM framework for measuring public health impact

Glasgow et al (1999):
• Reach (uptake; who benefits; who is left out)
• Efficacy (include behavioural outcomes and participant-centred quality of life measures; consider both positive and negative outcomes)
• Adoption (proportion and representativeness of settings): use direct observation, interviews, surveys
• Implementation (the extent to which a programme is delivered as intended); audit
• Maintenance: long-term maintenance of behaviour change (for both clients and service providers)
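To make the dimensions concrete, reach, adoption, and implementation are often summarised as simple proportions. A minimal sketch with invented counts (every figure below is hypothetical; a real evaluation would also examine representativeness, not just proportions):

```python
# Illustrative RE-AIM-style proportions from hypothetical counts.
# All figures are invented for this example.

eligible_individuals = 2000  # people who could have taken part
participants = 640           # people who actually took part

eligible_settings = 40       # e.g. clinics invited to deliver the programme
adopting_settings = 22       # settings that actually delivered it

protocol_components = 10     # components in the intended programme
components_delivered = 8     # components actually delivered (from audit)

reach = participants / eligible_individuals
adoption = adopting_settings / eligible_settings
implementation = components_delivered / protocol_components

print(f"Reach:          {reach:.0%}")           # 32%
print(f"Adoption:       {adoption:.0%}")        # 55%
print(f"Implementation: {implementation:.0%}")  # 80%
```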

Page 16

Assessing Quality

Maxwell’s dimensions of health care quality:

• Access to services
• Relevance to need (for the whole community)
• Effectiveness (for individual patients)
• Equity (fairness)
• Social acceptability
• Efficiency and economy