
Page 1

Dr Ian Goldman, Head of Evaluation and Research

Presentation to Mineral Resources Portfolio Committee, 19 June 2013

The Presidency Department of Performance Monitoring and Evaluation

Update on national evaluation system and call for evaluations

Page 2

Outline of Presentation

Reminder of evaluation system
Update on evaluations and the system
Example of findings from ECD
Issues emerging

Status of evaluations underway and recommended

Implications for portfolio committees


Page 3

Key messages

Evaluations provide a very important tool for portfolio committees to get an in-depth look at how policies and programmes are performing, and how they need to change

It is important for improving performance that committees use this information and so hold departments accountable – not to punish them, but to ensure they are problem-solving, improving the effectiveness and impact of their work, and not wasting public funds

Where Portfolio Committees have concerns about existing or new policies or programmes, they can ask departments to undertake rigorous independent evaluations – retrospectively, or as diagnostic evaluations prior to a new programme or policy

DPME will ensure committees are informed of all evaluations being undertaken and report regularly to the Chairs


Page 4

1. Reminder of the evaluation system


Page 5

1.3 Performance Area: Monitoring and Evaluation

1.3.1 Indicator name: Use of monitoring and evaluation outputs

Indicator definition: Extent to which the department uses monitoring and evaluation information.

Secondary Data: AGSA findings on predetermined objectives – reported information not reliable.

Question: Which set of statements best reflects the department’s use of M&E outputs?

Statement | Evidence | Performance level

Department does not have an M&E Policy/Framework or does not have capacity to generate information. | Not required | Level 1

Monitoring reports are available but are not used regularly by top management and programme managers to track progress and inform improvement. | Quarterly monitoring reports; minutes of top management meetings or programme meetings to assess use of reports | Level 2

Monitoring reports are regularly used by top management and programme managers to track progress and inform improvement. | Quarterly monitoring reports; minutes of top management meetings or programme meetings to assess use of reports | Level 3

All of the above in Level 3, plus: evaluations of major programmes are conducted periodically and the results are used to inform changes to programme plans, business processes, the APP and the strategic plan. | All of the above in Level 3, plus: evaluation reports; changes to programmes and plans | Level 4

Page 6

Score in M&E (based on self-assessments by 103 national and provincial departments)

Page 7

Problem

Evaluation is applied sporadically and not informing planning, policy-making and budgeting sufficiently, so we are missing the opportunity to improve Government’s effectiveness, efficiency, impact and sustainability.


Page 8

Why evaluate?

Improving policy or programme performance (evaluation for continuous improvement): this aims to provide feedback to programme managers.

Improving decision-making: Should the intervention be continued? Should how it is implemented be changed? Should increased budget be allocated?

Evaluation for improving accountability: where is public spending going? Is this spending making a difference?

Evaluation for generating knowledge (for learning): increasing knowledge about what works and what does not with regards to a public policy, programme, function or organization.

Page 9

Different types of evaluations related to questions around the outcome model:

Diagnostic evaluation: what is the underlying situation and the root causes of the problem?
Design evaluation: does the theory of change seem strong?
Implementation evaluation: what is happening and why?
Impact evaluation: has the intervention had impact at outcome and impact level, and why?
Economic evaluation: what are the cost-benefits?

Page 10

Key aspects of the approach

Departments submit proposals for interventions to evaluate (policies, programmes, projects), as they have to own the evaluation and implement the findings. Third parties, e.g. Treasury or Parliament, can propose evaluations, but departments should normally submit

Selection by the cross-government Evaluation Technical Working Group – based on importance (either by scale, or because the intervention is strategic or innovative)

Evaluations must be made public unless there are security concerns
All evaluation reports go to Cabinet (which approves the Plan)
To ensure independence:
Evaluations are implemented as a partnership between the department(s) and DPME
The Steering Committee, not the department, makes decisions on the evaluation
External service providers undertake the evaluation, reporting to the Steering Committee
To ensure quality:
Peer reviewers (normally 2) per evaluation
Evaluation panel, standards, guidelines, training etc.
Quality assessment once completed
Jointly funded by the department and DPME, in some cases with donors
There must be an improvement plan, which is monitored


Page 11

Evaluation process


Call for evaluations for 2014/15 – March 2013
Depts submit concepts for evals – 30 June
Selection by Eval Tech Working Group – July
Work starts on refining concept – Aug/Sept
Plan submitted into Cluster/Cabinet system – Sept
Cabinet approves Plan – Nov/Dec
Finalising TORs, procurement – Jan-May 2014
Evaluation commissioned – Feb-July
Evaluation completed – Sept 2014 to March 2015
Results to Cluster and Cabinet – 1-2 months after
Report public, to Parliament and website – immediate
Management response/quality assessment – 1 month after completion
Improvement plan drafted – within 4 months of approval
Monitoring of the improvement plan – Years 1 to 4
Communication of results

Page 12

2. Update on evaluations and the system


Page 13

Progress with National Evaluation System

2012/13 National Evaluation Plan approved June 2012, 2013/14 NEP in November 2012; call out now for the 2014/15 to 2016/17 NEP
2012/13: 8 evaluations; 2013/14: 16 evaluations
ECD evaluation completed June last year and on the DPME website
23 evaluations underway or being scoped from 2012/13 and 2013/14
Over the next 3-4 months the remaining 7 evaluations from 2012/13 will be completed
2013/14 evaluations are in various stages of preparation or implementation, including key ones like Government Coordination Systems and the Outcomes Approach
An audit of evaluations from 2006 identified 83 evaluations, which have been quality assessed; 71 passed and will be up on the DPME website by June
In the process we have developed a quality assessment tool which can be applied to evaluations, and an evaluation repository is being created


Page 14

Progress (2)

>10 guidelines and templates, ranging from TORs to improvement plans
Standards for evaluations and competences drafted and out for consultation; the standards have guided the quality assessment tool
2 courses developed, over 200 government staff trained
Evaluation panel developed with 42 organisations, which simplifies procurement
Gauteng and Western Cape provinces have developed provincial evaluation plans; DPME is working with other provinces who wish to develop PEPs, starting with Free State
1 department has developed a departmental evaluation plan (dti)


Page 15

3. What are we finding?


Page 16

Findings on Early Childhood Development (ECD)

Report on DPME website
Need to focus on children from conception, not from birth, which requires changes in the Children's Act
Very small numbers of the youngest children (0-2 years old) are in formal early child care and education (ECCE) centres; that, plus the emphasis on pregnancy, means greater involvement of Health
Poorer children still don't have sufficient access; need to prioritise
Current provision privileges children who can access centre-based services and whose families can afford fees, rather than home- and community-based provision
Need to widen the set of services available
The inter-sectoral coordination mechanism for providing ECD and associated services needs to be strengthened
Improvement Plan being implemented


Page 17

Next evaluations to be public

Grade R
Business Process Outsourcing

Both are coming up with significant findings which have major implications for how the programmes are designed


Page 18

Evaluation challenges emerging

Overall the system is working well but there are some challenges. These include:

Poor programme plans (for the government programmes which are being implemented), making them difficult to evaluate – need for minimum standards for programme plans; DPME is developing a guideline on this for release in June

Poor communication channels from some DGs - programme managers often not aware of the possibility of conducting evaluations on their programmes

Some senior managers wary of evaluation and don’t see it as an opportunity to improve their performance

Making sure evaluations proposed are strategic ones and that key sectors are covered

Sometimes departments not budgeting for evaluations and expecting DPME to provide all the money

Departments not planning ahead – very important for impact evaluations in particular, where there is a need to plan 3+ years ahead; this also affects how rollout happens

Some policy makers wanting to dictate the sample to make things look good – invalidates the evaluation

Reluctance to roll out in a carefully planned way which facilitates impact evaluation. To be clear on impact, one must compare the situation with and without the intervention


Page 19

4. Status of evaluations


Page 20

Evaluations underway or completed

Department | Title of evaluation | Progress

DSD/DBE/DoH | Diagnostic Review of Early Childhood Development | Completed June 2012. Improvement Plan being implemented
Trade and Industry | Implementation/design evaluation of the Business Process Services Programme | Final report approved
Basic Education | Impact Evaluation of Grade R | Final report submitted. Complete in June
Rural Development | Implementation Evaluation of the Recapitalisation and Development Programme | Underway. Complete in July
Rural Development | Implementation Evaluation of the Comprehensive Rural Development Programme | Underway. Some problems. Complete July
Health | Implementation Evaluation of Nutrition Interventions addressing under-5s | Underway. Complete in August
Human Settlements | Implementation Evaluation of the Urban Settlements Development Grant | Service provider appointed. Complete December 2013
Human Settlements | Implementation Evaluation of the Integrated Residential Development Programme | Underway. Complete Jan 2014
Basic Education | Impact Evaluation of the National School Nutrition Programme | Stopped. Aim to restart

Page 21

Evaluations for 2013/14 (in various stages of preparation or implementation)

1. Evaluation of Export Marketing Investment Assistance incentive programme (DTI).
2. Evaluation of Support Programme for Industrial Innovation (DTI).
3. Impact evaluation of Technology and Human Resources for Industry programme (DTI).
4. Evaluation of Military Veterans Economic Empowerment Programme (Military Veterans).
5. Impact evaluation on Tax Compliance Cost of Small Businesses (SARS).
6. Impact evaluation of the Comprehensive Agriculture Support Programme (DAFF).
7. Evaluation of MAFISA, brought forward from 2014/15.
8. Evaluation of the Socio-Economic Impact of Restitution programme (DRDLR).


Page 22

Evaluations for 2013/14 (continued)

9. Evaluation of the Quality of the Senior Certificate (DBE).
10. Setting the Baseline for Impact Evaluation of the Informal Settlements targeted for upgrading (DHS).
11. Evaluating interventions by the Department of Human Settlements to facilitate access to the city (DHS).
12. Provision of state subsidised housing and asset poverty for households and local municipalities (DHS).
13. Impact evaluation of the Community Works Programme (DCOG).
14. Evaluation of the National Advanced Manufacturing Technology Strategy (DST).
15. Impact Evaluation of the Outcomes Approach (DPME).
16. Impact/implementation evaluation of government coordination systems, including the cluster system (Presidency) – already underway.


Page 23

Evaluations Recommended for 2014/15 and 2015/16

5 of a possible 15 evaluations for 2014/15. These will be reviewed as the plan is rolled forward next year, and additional evaluations included:
Revitalisation of irrigation schemes (DRDLR)
Evaluation of the Funza Lushaka Bursary Scheme (DBE)
Impact evaluation of Ilima-Letsema (DAFF)
Policy evaluation on support to small-scale farmers (DAFF/DRDLR)

4 of a possible 15 evaluations for 2015/16. These will be reviewed as the plan is rolled forward next year, and additional evaluations included:
Evaluation of LandCare (DAFF)
Evaluation of the National Rural Youth Service (DRDLR)
Evaluation of the implementation of the new school curriculum (DBE)
Evaluation of the impact of implementation of the national evaluation system on programme performance (DPME)


Page 24

The call for 2014/15 to 2016/17

So again we are calling for national evaluations that are:
Focused on priorities, notably the 12 outcomes/NDP
Large (over R500m or covering 10% of the population)
Or strategic/innovative, important for learning lessons, or very much in the public eye (hot topics)

DPME will make available R750k, but the cost could be from R1m to R4m depending on scale. Make sure that if departments are thinking of an evaluation, they budget for it. In exceptional cases DPME may fund it all, but that means other evaluations would have to be funded entirely by the respective departments.

Think ahead for the outer two years, especially for impact evaluations where a baseline may be needed now, and where you may need to plan rollout to facilitate impact evaluation.


Page 25

5. Implications for Portfolio Committees


Page 26

Relevance to portfolio committees

The repository will provide 71 evaluations which can be a source of evidence
Once the final report is approved, departments are given one month to provide a management response to the findings and recommendations
After Cabinet approval a letter will be sent from DPME to the relevant Portfolio Committee with a copy of the evaluation, suggesting that the relevant department be asked to come and present to the Committee
Provides an opportunity for committees to interrogate what departments are doing, and to ask deeper questions as to whether what departments are doing is having an impact and is effective, efficient, relevant and sustainable
Once the management response is received, departments develop improvement plans
Results from 2012/13 evaluations should be out between now and December, and some 2013/14 evaluations will also complete in the same period
Committees could request departments to brief them on progress with evaluations, their results, and the development and implementation of improvement plans based on the results
Committees can make suggestions to departments regarding priority areas for evaluation. The call is out now for proposals for evaluations for 2014/15 to 2016/17 – Portfolio Committees can ask departments to evaluate specific policies or programmes (but the closing date for submissions is 30 June)


Page 27

Conclusions (1)

Interest is growing – more departments getting involved, more provinces, and more types of evaluation
Development of design evaluation will potentially have a very big impact – it will build capacity in departments to undertake these evaluations
The story is travelling and SA is now being quoted around the world
A challenge may emerge now as the evaluation reports start being finalised and the focus shifts to improvement plans – close monitoring of the development and implementation of improvement plans is needed to ensure that evaluations add value
Parliament could play a key oversight role in this regard – committees could request departments to present the evaluation results, their improvement plans, and progress reports against the improvement plans


Page 28

Conclusions (2)

Evaluations provide a very important tool for portfolio committees to get an in-depth look at how policies and programmes are performing, and how they need to change

It is important for improving performance that committees use this information and so hold departments accountable – not to punish them, but to ensure they are problem-solving, improving the effectiveness and impact of their work, and not wasting public funds

Where Portfolio Committees have concerns about existing or new policies or programmes, they can ask departments to undertake rigorous independent evaluations – retrospectively, or as diagnostic evaluations prior to a new programme or policy (closing date for 2014/15 is 30 June)

DPME will ensure committees are informed of all evaluations being undertaken and report regularly to the Chairs
