Steps in Implementing an Impact Evaluation
DIME – Fragile States, Dubai, May 31 – June 4
Steps in Implementing an Impact Evaluation
Victor Orozco, Development Impact Evaluation Initiative (DIME)
Steps
- Build capacity
- Set the learning agenda
- Design the impact evaluation
- Plan for IE implementation
- Conduct the baseline
- Roll out the intervention
- Collect follow-up data
- Analyze the data
- Feed results into policy
Step 1: Build capacity for IE
Objectives:
- Become informed consumers of impact evaluation
- Set the learning agenda
- Use IE as an internal management tool to improve the program over time
How?
- Training
- Learning by doing
Step 2: Set the learning agenda
Objective: Get answers to relevant policy and operational questions
How?
- Dialectic discussion involving key policy makers and program managers
- Technical facilitation to structure the framework of analysis
- Focus on a few critical policy (what) and operational (how-to) questions
- Discuss the agenda with the authorizing environment and constituencies
Cont. Step 2: Questions
Operational questions: design choices of the program
- Institutional arrangements, delivery mechanisms, packages, pricing/incentives
- Management purpose
- Use randomized trials to test alternatives
- Measure effects on short-term outcomes (months): take-up rates, use, adoption
- Scale up the better implementation modalities
Policy questions: effectiveness of the program
- Accountability purpose
- Use random assignment or the next best method
- Measure effects over the medium to long term
- Scale up/down, negotiate the budget, inform policy
Step 3: Design the IE
Exploit opportunities:
- Will roll-out take time?
- Is the allocated budget insufficient to cover everyone?
- Are there quantitative eligibility rules?
- If the program has universal access, does it have imperfect take-up?
Set the scale:
- Pilot to try out an intervention
- Large scale with a representative sample: more costly, externally valid
- Large scale with a purposeful sample: less costly, indicative
Do a power calculation to determine the minimum sample size
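The power calculation above can be sketched with the standard normal-approximation formula for a two-arm comparison of means. The effect size, power, and significance level below are illustrative assumptions, not values from the slides.

```python
# Sketch of a power calculation for minimum sample size per arm
# (two-arm trial, comparing means). Uses only the Python standard library.
from math import ceil
from statistics import NormalDist

def min_n_per_arm(mde, power=0.80, alpha=0.05):
    """Normal-approximation sample size per arm to detect an effect of
    `mde` standard deviations with the given power and significance level."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided test
    z_beta = NormalDist().inv_cdf(power)
    return ceil(2 * ((z_alpha + z_beta) / mde) ** 2)

# Detecting a 0.2 SD effect with 80% power at the 5% level:
print(min_n_per_arm(0.2))  # -> 393 per arm
```

Note how the required sample grows with the inverse square of the minimum detectable effect: halving the effect you want to detect roughly quadruples the sample size.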
Cont. Step 3
Select the "best" method for each of your questions:
- Feasible
- Requires the fewest assumptions
Ethics:
- Do not deny access to something for which there is irrefutable evidence
- Test interventions before scale-up when you have no solid evidence
Step 4: Planning implementation
Budget cost items:
- Staff time (project funds) and training (DIME)
- Analytical services and field coordination (DIME)
- Data collection (project funds)
- Discussions and dissemination (shared)
Timeline:
- Use it to organize activities and responsibilities, and work backwards to know when to start
Team:
- Government (program manager, economist/statistician); WB project team (task manager or substitute); research team (lead researcher, co-researchers, field coordinator); data collection agency
Step 5: Assignment to treatment and control
- The smallest unit of assignment is the unit of intervention: for training and credit, individuals and groups; for a municipal registration system, the municipality
- Create a listing of treatment units assigned to the intervention and control units that are not
- Explain the assignment to responsible parties to avoid contamination
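The listing step above can be sketched as a simple random assignment at the unit of intervention. The unit names and the 50/50 split are hypothetical; a fixed seed keeps the assignment reproducible and auditable.

```python
# Minimal sketch of creating the treatment/control listing by random
# assignment. Unit names are hypothetical placeholders.
import random

units = [f"municipality_{i:03d}" for i in range(1, 101)]  # hypothetical units

rng = random.Random(2024)  # fixed seed: the assignment can be re-derived later
shuffled = units[:]
rng.shuffle(shuffled)

half = len(shuffled) // 2
treatment = sorted(shuffled[:half])   # listing of units assigned to the intervention
control = sorted(shuffled[half:])     # listing of units that are not

print(len(treatment), len(control))
```

In practice assignment is often stratified (e.g. by region or baseline size) so that each stratum is split between arms, but the basic listing logic is the same.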
Step 6: Baseline data
Quality assurance: the IE team (not the data collection agency) should
- Design the questionnaire and sample
- Define terms of reference for the data collection agency
- Train enumerators
- Conduct a pilot
- Supervise data collection
Do not collect data before your design is ready and agreed
Cont. Step 6: Baseline data
Contract a data collection agency:
- Bureau of Statistics: integrate with existing data
- Ministry concerned: Ministry of Agriculture/Water Resources/Rural Development
- Private agency
Analyze the baseline data and feed back into program and evaluation design if needed
Check for balance between the treatment and control groups: do they have similar average characteristics?
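The balance check above can be sketched as a two-sample t-test on a baseline characteristic. The data here are simulated for illustration; under successful randomization both arms come from the same distribution, so large differences should be rare.

```python
# Sketch of a baseline balance check: compare average characteristics of
# treatment and control with a two-sample t-test. Data are simulated.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical baseline income for 200 units per arm, drawn from the same
# distribution, as randomization intends.
income_treat = rng.normal(loc=500, scale=100, size=200)
income_ctrl = rng.normal(loc=500, scale=100, size=200)

t_stat, p_value = stats.ttest_ind(income_treat, income_ctrl)
print(f"mean diff = {income_treat.mean() - income_ctrl.mean():.1f}, p = {p_value:.2f}")
# A large p-value (no significant difference) is consistent with balance.
```

In a real evaluation this test is repeated for each key baseline covariate; with many covariates, a few "significant" differences are expected by chance alone.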
Step 7: Roll out the intervention
Conduct intensive monitoring of the roll-out to ensure the evaluation is not compromised:
- What if both treatment and control receive the intervention?
- What if all of the control group receive some other intervention?
Step 8: Follow-up data
- Collect follow-up data with the same sample and questionnaire as the baseline
- At appropriate intervals
Step 9: Estimate program effects
- Randomization: compare average outcomes for the treatment and control groups
- Other methods: use the relevant econometric analysis, test assumptions, check robustness
- Are the effects statistically significant? A basic statistical test tells whether differences are due to the program or to noisy data
- Are they significant in real terms? If a program is costly and its effects are small, it may not be worthwhile
- Are they sustainable? Is the trajectory of results sustained?
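Under randomization, the estimation step above reduces to a difference in average outcomes plus a significance test. The simulated data and the true effect of 10 are illustrative assumptions, not results from any real program.

```python
# Sketch of estimating a program effect under randomization: the estimate is
# the difference in mean outcomes, and a t-test assesses significance.
# Data are simulated with a hypothetical true effect of 10.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n = 400
outcome_ctrl = rng.normal(loc=100, scale=15, size=n)
outcome_treat = rng.normal(loc=110, scale=15, size=n)  # true effect = 10

effect = outcome_treat.mean() - outcome_ctrl.mean()
t_stat, p_value = stats.ttest_ind(outcome_treat, outcome_ctrl)

print(f"estimated effect = {effect:.1f}, p = {p_value:.4f}")
# Statistical significance alone is not enough: also judge whether the effect
# is meaningful relative to the program's cost, and whether it persists.
```

The same comparison is often run as a regression of the outcome on a treatment dummy, which gives an identical point estimate and makes it easy to add covariates or clustered standard errors.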
Step 10: Discuss, disseminate and feed back into policy
Are you thinking about this only now?
- Discuss the policy implications of the results: what actions should be taken, and how to present them to higher-ups to justify changes, budget, or scale-up
- Talk to policy makers and disseminate to a wider audience: if no one knows about it, it won't make a difference
- Make sure the information gets into the right policy discussions: real-time discussions, workshops, reports, policy briefs
Final step: Iterate
What do you need to learn next?