
Title: TERMS OF REFERENCE FOR SUPPORT EVALUATOR1 FOR THE PBEA DEVELOPMENTAL EVALUATION

Purpose: To recruit a suitable consultant/evaluator for the role of support evaluator for the DE exercise to be conducted in two PBEA participating countries

Evaluation Timeline: July 2014 through December 2015

Reporting to: TBD

I. Background

1. The Peacebuilding, Education and Advocacy Programme (PBEA) is a four-year (2012-2015) programme funded by the Government of the Netherlands (GoN), currently being implemented in 14 countries. The strategic vision of the programme is to “strengthen resilience, social cohesion and human security in conflict-affected contexts,” with the strategic result of “strengthening policies and practices in education for peacebuilding.” The strategic result will be achieved through five outcomes:

a. Increase inclusion of education into peacebuilding and conflict-reduction policies, analyses, and implementation;

b. Increase institutional capacities to supply conflict-sensitive education;c. Increase capacity of children, parents, teachers and other duty-bearers to prevent, reduce

and cope with conflict and promote peace;d. Increase access for children to quality, relevant, conflict-sensitive education that

contributes to peace; and,e. Contribute to the generation and use of evidence and knowledge on policies and

programming on linkages between education, conflict and peacebuilding

2. UNICEF commissioned an evaluability assessment of the PBEA in 2013. This assessment was a systematic process used to determine the programme’s readiness to be evaluated by examining the coherence and logic of the design, and the capacity of the management systems and governance structures to implement, monitor and measure results. The assessment also examined whether there was a shared understanding of the desired results and/or outcomes, and proffered advice, through findings and recommendations, on how the programme might be strengthened, and on the design for the end-of-programme evaluation.

3. The evaluability assessment categorized country programmes into three levels of ‘evaluation readiness’, with the most advanced programmes deemed ‘evaluation ready’ (having most of the elements required to execute a credible end-of-programme evaluation). The least ‘evaluation ready’ programmes require substantial inputs and support to bring them to a level where an evaluation effort would yield information that would enable credible findings/conclusions. The differential progress that country programmes have made towards implementation, diverse country contexts, the complexity of the PBEA, and indeed the advice of the evaluability assessment, point to a need for a well-thought-out and non-conventional end-of-programme evaluation effort.

4. The evaluability assessment also noted that using the conventional ex-post facto evaluation design – aggregating the contribution of education to peacebuilding across country programmes and selecting a sample of programmes for in-depth field work – would not represent PBEA results (or the lack thereof) adequately, nor would it capture the lessons that PBEA programming has to offer. Instead, a “bottom-up” evaluation design was deemed more suitable to capture the diversity of interventions and themes, broad variations in country programme profiles, and variations in navigating programming complexity.

1 The TOR makes reference to the Lead Evaluator. Two Support Evaluators recruited through this exercise will work with the Lead Evaluator as a three-person team. However, each individual will be recruited separately, with three separate contracts being issued.

5. In addition to the advice on the evaluation approach, the evaluability assessment also made strong observations on the need to balance accountability to funding authorities with seizing the opportunity to capture emergent learning. Hence, two major evaluative activities are proposed for the PBEA: an exercise to assist in the systematic documentation of the lessons of PBEA development and implementation over a period of 18 months or so, and a summative evaluation during the final quarter of the programme. The first evaluative activity will be based on the somewhat evolving thinking of ‘developmental evaluation’ (DE) – recruiting suitable personnel for the first evaluative activity is the subject of these terms of reference.

II. Purpose of the DE and its rationale

6. DE is an approach that injects evaluative thinking and supports adaptive learning in complex initiatives. This design combines the rigor of evaluation methodologies with the flexibility and creativity that is required in seeking solutions to development problems, typically involving innovation, high levels of uncertainty, and tackling social complexity (Patton, 2008; Gamble, 2008; Dozois, Langlois and Blanchet-Cohen, 2010). DE seems to be an appropriate design for capturing the learning that ensues in a programme as complex as the PBEA.

7. To commence in 2014 in two of the country programmes that are still in the early stages of implementation, this component will be executed by a three-person evaluation team. One consultant will be a senior (experienced) lead evaluator who has experience conducting or supervising a DE, and who will conceptualize and guide the technical execution of the evaluation. The other members of the team will be two support evaluators, each working from within a country programme as an integral part of the country programme team.

III. Scope of the evaluation and methodology

8. Scope: As previously stated, the evaluation’s purpose is to systematically capture the learning that can be infused into the programme to heighten its chances for success. Hence, the evaluation will provide comprehensive and evidence-based answers to two overarching questions, namely,

a. Are the programme impact pathways stipulated for the PBEA feasible/credible and likely to produce the intended results and/or outcomes? and,

b. What new learning and/or improvements were effected to improve attainment of results and/or outcomes?

The evaluation will cover two to three key outcomes that will be selected by the country programme team for intensive monitoring and study through the developmental evaluation. The evaluation will be based on a learning framework which will develop sub-questions that are customized to the programming context, as well as criteria for weighing the evidence on each question.

9. Methodology: Given that it is an adaptive, context-specific approach, the methodology of a DE is usually informed largely by the theme/subject matter under investigation and the context. Its practice offers a great opportunity for innovation and experimenting with new ideas, even in terms of the approach and methodology. However, DE primers (Dozois, 2010) have identified entry points, practices and organizing tools that are emerging as part of the methodology for a DE investigation. Below are some of the steps in building the methodology for the proposed DE, adapted from Dozois (2010) and tailored to the context of the PBEA.


Orienting the evaluation team: Evaluators undertake investigative work early in the initiative in order to build a deeper understanding of the identified problem or opportunity, resources, stakeholders and broader context. This could be a good starting point for the developmental evaluation of the PBEA.

Building relationships: The quality of relationships determines the degree to which the team can access information and influence change. For this reason, the methodology should consider a mapping of relationships that are critical to execute the DE, and a strategy to keep people engaged in the evaluation.

Orienting the implementation team (the country programme team): Related to the point above, a key part of the evaluators’ role is to help stakeholders test their assumptions, articulate and refine their models, extend their understanding, and cultivate a culture that supports learning. These activities will likely help the PBEA country teams to develop and maintain an adaptive orientation in complex and unknown territory.

Developing a learning framework: A learning framework is a good tool for DE practice. Developed in collaboration with key stakeholders, a learning framework (slightly different from an evaluation framework) will guide the evaluation by mapping out potential areas for learning (identifying both opportunities and challenges), identifying the data and/or evidence that is required to make decisions, and articulating feedback mechanisms.

Observing: Evaluators carefully observe the unfolding situation in order to help the group identify leverage points, assess their efforts, and stay true to the core intent and principles of their initiative. Evaluators should watch (i) key developmental moments; (ii) group structure; (iii) group dynamics; (iv) action or inaction; and (v) opportunities and threats. PBEA should be able to benefit from a sustained evaluation effort that can bring a better calibration of the impacts of micro-level solutions on a macro-level conflict.

Sense-making: Sense-making is largely about making sense of the data that has been collected. The evaluator’s role is to help the group identify patterns, integrate new information, consider the implications of their observations, and propose solutions.

Intervening: As a member of the programme team, evaluators actively help to shape the work by: (i) asking questions; (ii) facilitating discussion; (iii) sourcing or providing information; (iv) modeling solutions; and (v) making new connections.

10. The Lead Evaluator (with inputs from the Support Evaluator) will finalize the methodology for the DE, based on these rudimentary steps, and enrich it with his/her knowledge and experience. Additional resources for building an evaluation methodology centered on reflective practice include guidance on evaluating peacebuilding interventions (OECD/DAC, 2012; Reimann, Chigas, and Woodrow, 2012; and Rogers, 2012).

IV. Tasks, responsibilities, and management accountabilities

The lead evaluator (LE), contracted by the UNICEF Evaluation Office, will be responsible for leading the evaluation efforts from a global perspective. He/she will have overall responsibility for the technical guidance of the DE and its quality, and will be responsible for the following:

a. Conceptualize and develop the DE design and methodology, including the evaluation/learning framework, and his/her own work plan;

b. Provide inputs in the recruitment of the support evaluators, as well as guide and supervise them;

c. Develop a DE module and other materials as he/she sees fit. The module is a guidebook that the support evaluators can use and/or refer to in their duty stations;

d. Orient and/or coach the evaluation team and review work plans for country-level evaluation efforts, including agreeing with support evaluators on a set of deliverables;

e. Undertake quality assurance missions to the participating countries;


f. Provide on-going desk-based quality assurance of all outputs, including reviewing periodic country outputs/reports and the final report of the evaluation;

g. In conjunction with the support evaluators, provide milestone reports to the Chief of Education (or PBEA focal point) in each participating country; and,

h. Ensure that the evaluation manager (Evaluation Specialist in UNICEF New York) is regularly informed of the progress of the evaluation, possible causes of delay and issues to resolve.

11. Two support evaluators (to be contracted through a different but related process) will be responsible for the following:

a. Adapting the methodology of the DE to the country context, clearing it with the lead evaluator, and finalizing the inception report for the country evaluation;

b. Developing and implementing a work plan for evaluation activities;

c. Directing the execution of the DE and assuring its quality at the country level;

d. Monitoring and tracking the data that is required to inform decisions on the evaluation;

e. Compiling progress reports according to an agreed schedule, clearing them with the lead evaluator, and presenting them to the Chief of Education/Monitoring & Evaluation Chief (or designated PBEA focal point);

f. Making agreed inputs into other milestone reports, including developing PowerPoint presentations (draft and final report for the country-level evaluation); and

g. Organizing two dissemination events where he/she will present the evaluation findings.

12. The Country Office leadership (presumably the Education Chief) will be responsible for:

a. Co-facilitating the recruitment of support evaluators;

b. Providing supervisory support in the day-to-day execution of evaluation activities, but not technical supervision; and,

c. Reviewing the progress reports and all deliverables.

13. The Evaluation Specialist and evaluation manager (in UNICEF’s Evaluation Office in New York) will be responsible for:

a. Coordinating, directing and supervising all activities of the Lead Evaluator;

b. Consulting with the M&E Specialist (PBEA, at HQ) on a regular basis;

c. Providing updates to the PBEA Programme Manager at agreed intervals; and,

d. Assuring the quality of all deliverables, as well as giving final approval.

V. Key skills, academic/technical background, and experience required

14. This invitation is extended to evaluation practitioners and/or consultants with relevant experience in the monitoring and evaluation of international development programmes, to submit a bid for support evaluator to execute the developmental evaluation study at the country level. The successful consultant should offer the following range of skills and experience:

a. Demonstrated strategic thinking skills;

b. Technical knowledge, skills and expertise in evaluation concepts and approaches, and in evaluating complexity in particular;

c. Programming and/or evaluation experience in the broad area of education in emergencies and/or working with vulnerable populations in conflict-affected countries (evaluation experience in peacebuilding programmes will be assessed as an added advantage);

d. Strong analytical skills to support both qualitative and quantitative research;

e. Facilitation skills, particularly in the design and execution of stakeholder consultations and team-building exercises;

f. Active listening and time management skills;


g. Excellent oral and written communication and report-writing skills in English (proficiency and the ability to communicate in one other UN language will be assessed as an added advantage); and,

h. Computer literacy in Word, Excel and PowerPoint.

15. The consultancy is estimated at 18 months, stretching from July 2014 to December 2015. Consultancy fees will be payable at P-3 to P-4 level, depending on the qualifications and experience of the consultant. A salary will be paid to the evaluator on a monthly basis, and upon submission of satisfactory work products each month. Within the contract period, the evaluator will be expected to undertake an orientation trip to UNICEF New York for the purpose of consultations, and may be required to attend the PBEA annual global meeting in Istanbul, Turkey.

VI. Timeline, time allocation and deliverables

Task | Output/deliverable | Deadline
Adapt and/or customize the DE design to the country context (learning framework, methodology, work planning, reporting format) | Draft inception report | July 20, 2014
Participate in orientation and training of support evaluators (and team building) | Final inception report and work plans | July 30, 2014
Mission to UNICEF HQ, New York, for consultations with EO, PBEA and the PMT | Standard trip report attached to October report | July 30, 2014
Mission to PBEA Global Review Meeting in Istanbul, Turkey | Detailed trip report attached to October report | 30 Oct, 2014
Monthly report (reporting format to be determined) | Monthly report on the 30th day of each month | July 2014 through Dec 2015
Final report writing/compilation | Draft and final reports | Oct 30, 2015
Presentations at dissemination events; wrapping up | PPT presentation | Nov 2015, Dec 2015
TOTAL: 18 months

16. In summary, the Support Evaluator is responsible for the following deliverables:

a. Country-specific approach and methodology report: This report will include, among others, the customized evaluation approach and methodology, the country-level evaluation work plan, and the country evaluation module and other materials for the orientation and planning of country-level evaluation efforts with the country PBEA team;

b. Country-level evaluation reports/outputs: These will be developed by the support evaluators - the lead evaluator will provide quality assurance/review;

c. Final evaluation synthesis report for the DE: First, second and final drafts, according to the UNICEF House Style and UNICEF standards for evaluation reports - these will be originated by the support evaluators - the lead evaluator will provide quality assurance/review; and,

d. PowerPoint presentation for country evaluation report: Presentation to be originated by support evaluators. Support evaluator will also be expected to organize two dissemination events where he/she will present the evaluation findings.

VII. Risk management and/or mitigation

17. The most critical risk for the evaluation pertains to balancing the need to produce results with the readiness of different parts of UNICEF to execute a DE and get the full benefit of what it has to offer. To that end, careful consideration should be given to the following:

i) Clear delineation of the role of the support evaluator: There is always the risk that the support evaluator assumes the role of M&E officer for the entire education portfolio, or worse still, another pair of hands for the entire country office. The Evaluation Office and the two COs should come to an agreement that the role of the support evaluator is primarily ‘evaluator’, who will assume the monitoring role only for PBEA activities.

ii) Management arrangements: Related to the point on the role of the support evaluator is the issue of who he/she reports to. In a typical independent evaluation, the evaluator would be supervised by the evaluation manager (in the Evaluation Office). The arrangements proposed in Section IV above should be negotiated between the Evaluation Office and the Country Office leadership (involving the Chief of Education and the Deputy Representative, or higher).

VIII. Expression of interest

18. All interested and eligible consultants should send an application packet including the following:

a. A completed ‘expression of interest’ form (see Appendix A);

b. Completed responses to the questions in Appendix B, including the professional fee/rate per person day;

c. An updated CV/resume and a completed Personal History Profile (P11)2, if the UNICEF Evaluation Office does not have it on file already; and,

d. A sample evaluation report, with a clear indication of the applicant’s contribution to the report.

The application packet should be transmitted via email to Kathleen Letshabo:
Email: [email protected]
Subject: Expression of Interest - PBEA Developmental Evaluation (Support Evaluator)

IX. Closing date

The closing date for this EoI is 16 June, 2014, 12:01 am (midnight), New York City time.

2 http://www.unicef.org/about/employ/files/P11.doc


Appendix A

Expression of Interest Form

Support Evaluator for PBEA Developmental Evaluation

Please fill in page 1 of the form in its entirety and submit it to us electronically or via fax.

First Name:

Last Name:

Salutation: Mr. / Ms. / Mrs. / Dr.

Job Title:

Mobile: (please include country & city code)
Fax: (please include country & city code)
Email address:

Address:

City: State: Postal Code: Country:

Alternate contact


Appendix B

Summary of applicant’s experience and preparedness for the DE

Using questions 1-7, submit a narrative detailing your experience and preparedness to be a support evaluator in a developmental evaluation. Your responses should cover all the issues below, plus additional information where you deem necessary. Your response SHOULD NOT EXCEED 6 PAGES.

1. Provide information which will enable us to determine whether you have relevant experience to conduct the proposed developmental evaluation. Information should include:

a. The number of years of experience in evaluation research;

b. The number of evaluations you have carried out as lead investigator and as a member of an evaluation team, and the number of evaluations commissioned by UN agencies or comparable organizations;

c. A description of your technical competencies, skills and expertise in evaluation concepts and approaches, and analytical skills to support both qualitative and quantitative research;

2. Provide information which will enable us to determine whether you have relevant specialized knowledge in the areas that are critical to this work. Information should include:

a. A description of programming experience in the education sector and peacebuilding programmes, or related sub-disciplines or themes;

b. A description of evaluation or programming work in complex settings, such as education in emergencies programmes and/or with vulnerable populations in conflict-affected countries (evaluation experience in peacebuilding programmes will be assessed as an added advantage);

c. A description of actions/decisions that demonstrate your leadership, management, and/or strategic thinking skills;

d. A description of ‘other’ skills or experiences that may be required to execute this evaluation, including facilitation skills, design and execution of stakeholder consultations, conducting team building exercises, time management skills, etc;

3. From your reading of the PBEA Evaluability Assessment Report3 and the concept note in Appendix C, what are the most critical issues that the developmental evaluation needs to address? (Your response to this question should not exceed 1 page.)

4. Provide any additional experience that may be critical to the success of the proposed work, including but not limited to: affiliation with universities and/or professional bodies; contributions to communities of practice and/or blogs; and other information you deem relevant to this work that would give you an advantage over others competing for the same consultancy.

5. Possible country programmes where a support evaluator is to be embedded include Chad, Myanmar, Palestine, Somalia (Puntland) and Yemen.

a. Indicate your motivation for considering this consultancy, your willingness to work under the modality of ‘being embedded in a country programme’ for 18 months, and why you think you would be a match for that type of work arrangement;

b. Rank the five duty stations in order of preference (from highest to lowest). Indicate if there are any that you would not consider at all.

6. Please provide your professional fee rate per person day.

7. Confirm the following:

a. You have NO on-going litigation with the UN; you are NOT currently removed/invalidated or suspended by the United Nations or UN system organizations;

3 http://www.unicef.org/evaldatabase/files/PBEA_Evaluability_Assessment_Final_Report.pdf


Appendix C

PBEA DEVELOPMENTAL EVALUATION: ORIGINAL CONCEPT NOTE

I. Background

1. The Peacebuilding, Education and Advocacy Programme (PBEA) is a four-year (2012-2015) programme funded by the Government of the Netherlands (GoN), currently being implemented in 14 countries. The strategic vision of the programme is to “strengthen resilience, social cohesion and human security in conflict-affected contexts,” with the strategic result of “strengthening policies and practices in education for peacebuilding.” The strategic result will be achieved through five outcomes:

• Increase inclusion of education into peacebuilding and conflict-reduction policies, analyses, and implementation;
• Increase institutional capacities to supply conflict-sensitive education;
• Increase capacity of children, parents, teachers and other duty-bearers to prevent, reduce and cope with conflict and promote peace;
• Increase access for children to quality, relevant, conflict-sensitive education that contributes to peace; and,
• Contribute to the generation and use of evidence and knowledge on policies and programming on linkages between education, conflict and peacebuilding.

2. A unified Global Results Framework (GRF) was developed to guide an assessment of global corporate accountabilities. Based on the general guidance of the five strategic objectives, country programmes and other implementation units were expected to develop context specific programmes and adapt the five outcome results framework to their contexts. The country programmes integrated their results into the GRF via operational matrices outlining key objectives, indicators, and activities.

3. The PBEA is managed by the Programme Management Team (PMT) housed in the Education Section (Programme Division), working closely with the Humanitarian Action and Transition Unit (HATIS) and other divisions, sections, and units. The PMT provides overall leadership for the programme while implementation is carried out mainly through individual country-level programmes with technical support from regional offices and the PMT.

4. UNICEF commissioned an evaluability assessment of the PBEA in 2013. This assessment was a systematic process to determine the programme’s readiness to be evaluated by examining the coherence and logic of the design, and the capacity of the management systems and governance structures to implement, monitor and measure results. The assessment also examined whether there was a shared understanding of the desired results and/or outcomes, and proffered advice, through findings and recommendations, on how the programme might be strengthened, and on the most suitable design for the end-of-programme evaluation.

5. The assessment categorized country programmes into three levels of ‘evaluation readiness’, with the most advanced programmes deemed ‘evaluation ready’ (having most of the elements required to execute a credible end-of-programme evaluation). The least ‘evaluation ready’ programmes were flagged as requiring ‘substantial inputs and support’ to get them to a point where the end-of-programme evaluation would yield useful information.

6. The evaluability assessment also noted that using the conventional ex-post facto evaluation design – aggregating the contribution of education to peacebuilding across country programmes and selecting a sample of programmes for in-depth field work – would not adequately capture the complexity represented in PBEA implementation, nor would it capture the lessons that PBEA programming has to offer. Instead, a “bottom-up” evaluation strategy was recommended as more suitable to capture the diversity of interventions and themes, broad variations in country programme profiles, and variations in navigating programming complexity. In addition, the evaluability assessment also made strong observations on the need to balance accountability to funding authorities with capturing emergent learning.

II. The problem, and proposed solution

7. The concept of peacebuilding has evolved from a broad definition since its earlier articulation in the UN system in the 1990s, to a more refined set of priorities. The provision of basic services, including primary education, is explicitly stated as one of these priorities. For instance, access to quality education is often a key component and measurement of basic safety and human security, while restoring schools and reconstructing education systems can increase government legitimacy. Civic education programmes have often been used to build conflict-management capacities and to acquire the dialogue skills which are necessary to foster reconciliation and support political processes. Education can also contribute to economic revitalization, provide constructive opportunities for the empowerment of women and girls and disenfranchised youth, and has a strong potential to transform accepted norms around gender and power, to prevent violence and to serve as a peace dividend.

8. Emanating from the increased recognition and acknowledgement of the linkages between education and peacebuilding, the PBEA is a programme that attempts to harness the power of education to contribute to peace. However, the diversity of country contexts, the level of complexity displayed in programming options, and the differential progress that country programmes have made towards PBEA implementation (and their evaluation readiness) all point to an end-of-programme evaluation strategy that will codify emergent learning in a manner that can benefit UNICEF as we move from this first generation of education and peacebuilding programmes towards ‘programming for resilience’.

9. Hence, we propose a ‘developmental evaluation’ (DE) - an approach that injects evaluative thinking and supports adaptive learning in complex initiatives. This design combines the rigor of monitoring and evaluation practice with the flexibility and creativity that is required in seeking solutions to development problems, typically involving innovation, high levels of uncertainty, and tackling social complexity (Patton, 2008; Gamble, 2008; Dozois, Langlois and Blanchet-Cohen, 2010). While we acknowledge that DE will require a lot more inputs from the Evaluation Office (some of which may seem programmatic), we submit that DE is an appropriate design for capturing the learning that ensues in a programme as complex as the PBEA.

10. Our proposal is for the DE component to commence during the first quarter of 2014 in two of the country programmes that are still in the early stages of implementation. The evaluation will be executed by a three-person team: an experienced lead evaluator who will conceptualize and guide the technical execution of the evaluation from his/her base, as well as two support evaluators, one of whom will be embedded with each of the two country programme teams4. Hence, the purpose of this concept note is to provide a rationale for the developmental evaluation, broad strokes of the methodological approach, as well as the roles and responsibilities of different actors.

III. Rationale for developmental evaluation and implications for practice

11. According to Patton (2008) and Gamble (2008), DE is suitable in:

• Innovative interventions, requiring real-time learning, processing and feeding back new learning to create the desired solutions;

4 A PBEA end-of-programme summative evaluation – an ex-post facto assessment of programme performance that will enable aggregation of programme results as per UNICEF accountabilities to the donor - is planned to commence in 2015. The evaluation will feature a desk-based document analysis and synthesis of results from all 14 programmes, in-depth field work in three of the country programmes that were deemed “evaluation ready” by the evaluability assessment, as well as integrate lessons from the developmental evaluation which is the subject of this concept paper. The Evaluation Office will commission an independent evaluation firm to conduct the summative evaluation.


• Highly emergent implementation environments, where there is consistent need to restate, reaffirm, or even modify the goals of an intervention as new learning comes in;

• Highly complex interventions, characterized by non-linearity in terms of design and execution, as well as the relationship between input/output and outcome variables; and

• Socially complex themes or areas of inquiry, requiring collaboration among stakeholders in different parts of the system, as well as external stakeholders.

Most PBEA interventions operate under several of these conditions or situations, hence the programme is deemed to be a good ‘fit’ for a DE.

12. However, PBEA is not an “innovation” in a classical sense of the term. And, while there is an expectation that it will generate innovative education solutions towards building sustainable peace in schools, homes, communities, and education institutions, the goals of the programme have already been defined (see para. 1 above). In practice, DE leaves a lot of room for the evaluator and the programme team to define and modify goals in response to new learning, and thereby determine what ‘success’ of the innovation and/or intervention is.

13. This means that one of the challenging and yet critical tasks of the developmental evaluation will be to align goals for the DE with accountabilities for results in the two country programmes, while at the same time ensuring that programme implementers are empowered to act on new learning by effecting the necessary modification to the programme. To that end, the PBEA evaluability assessment concluded the PBEA had articulated good monitoring indicators at the activity and output levels, but had yet to articulate clear evaluation indicators at the outcome or goal levels (individual behavior change, policy implementation and social change)5. This DE offers an opportunity and space to propose what success will look like, in terms of outcomes that can be realized as well as ‘evaluation indicators’.

14. Considering also that DE is a complex and time-intensive undertaking, a certain level of organizational/system “readiness to support learning” is required for this approach to be effective. It also requires sensitization at the country and regional offices to provide a comprehensive understanding of the methodology. The following questions (adapted from Dozois et al., 2010) are useful in assessing the readiness of PBEA country programmes for DE:

• Is there buy-in for developmental evaluation, or champions in respective country programmes who can help cultivate buy-in?
• Does the culture of programme teams support learning – i.e., how do programme teams handle failure? How do they handle feedback? Are they willing to take risks?
• Is there clear recognition of the resources and effort that are required, and are there appropriate resources?
• Does leadership within programme teams understand the need for participatory processes in the way in which decisions are made, or the need to alter power dynamics?
• Are programme teams willing to adapt their structures (e.g., rules, routines, procedures) to accommodate new ways of operating?
• Are there any major issues that could interfere with the process (e.g., in-fighting among staff, an unclear mandate, unstable financial support)?

15. In terms of the readiness of PBEA implementing units to execute DE and get the full benefit of what it has to offer (or indeed the readiness of UNICEF as a whole), a reconceptualization of evaluation is required to ensure that there is full understanding of what a DE effort entails. The Evaluation Office and the Education Section will have to work with all levels to promote understanding of this approach and ensure buy-in and ownership. More importantly, while traditional evaluation practice is governed by distinctive notions of independence or objectivity, we need to negotiate a new set of working relationships and management arrangements characterized by objectivity and transparency in order to create an environment that enables meaningful learning.

5 PBEA Evaluability Assessment (para. 57). (http://www.unicef.org/evaldatabase/files/PBEA_Evaluability_Assessment_Final_Report_November_2013.pdf)


16. With regard to the competencies required to execute the DE approach, the DE literature suggests that while the skillset associated with traditional evaluation practice is definitely an asset, traditional evaluators may be less equipped to operate within the kind of ambiguity that is associated with innovation and emergent initiatives (Patton, 2008; Gamble, 2008; Dozois et al., 2010). Consequently, the skills required of the evaluation team will be those deemed necessary in navigating complexity, namely some experience and/or leadership in the education and peacebuilding domain, strategic thinking, pattern recognition, and relationship building. Strategic thinking is necessary to identify high-level principles and purposes and cultivate actionable focus, while pattern recognition is essential in identifying categories for managing complexity.

17. On the issue of the positioning of evaluators, lessons from DE practitioners indicate that each type of positioning presents advantages and disadvantages. Table 1 below, adapted from Dozois et al., 2010, highlights issues associated with the different options we have for positioning our evaluators, and implications for the PBEA. Consequently, the lead evaluator will be positioned externally. He/she will be required to make technical inputs on a regular basis, and to conduct two one-week missions to each of the two programme countries. The two support evaluators will be embedded with the country programme teams. However, thorough consideration should be given to this decision in terms of determining the suitable level/experience of embedded evaluators, their functions and tasks, the likelihood that they can exercise their duties, as well as the management arrangements6.

Table 1: Positioning of developmental evaluators

Positioning: Developmental Evaluator as internal staff member

Advantages and disadvantages:
Advantages: Internal staff members…
• have tacit, insider knowledge that an outsider may never acquire;
• are situated where the action is, giving them opportunities to observe the situation as it unfolds;
• have established certain relationships; this can be a significant advantage.
Disadvantages: Internal staff members…
• often struggle with credibility as evaluators, particularly if their previous roles have not been focused specifically on evaluation;
• may be seen as less objective;
• may have their capacity to “speak truth to power” compromised by their need to get along, or by job security;
• are usually expected to fulfill other roles, hence their position as evaluators is sometimes diluted;
• may have adverse relationships or be caught up in office politics; this can be a significant disadvantage.

Implications for PBEA:
Initial proposals…
• The two support evaluators should be positioned as internal evaluators, even though they may be recruited from outside UNICEF.
• As internal staff members, support evaluators are expected to gain access to knowledge, people and systems, to enhance their understanding of the PBEA.
To be mitigated…
• There could be issues with the credibility of support evaluators if they are considered too junior in UNICEF terms (P3 or lower).
• Clear ToRs should be issued to ensure that support evaluators are assigned responsibilities and tasks that are directly linked to the evaluation.

Positioning: Developmental Evaluator as external consultant

Advantages and disadvantages:
Advantages: External consultants…
• are generally considered to be more objective or neutral; they are not party to internal politics;
• have more points of comparison to contribute, and a broader network of information and influence to draw upon;
• can focus on the task more easily because they do not have other obligations within the organization.
Disadvantages: External consultants…
• have less insider knowledge, which is sometimes a critical requirement;
• have to work harder to build relationships and get access to important pieces of information;
• are generally more expensive, and the initiative may have to limit contact time as a result.

Implications for PBEA:
Initial proposals…
• The lead evaluator should be recruited as an external, independent consultant.
• He/she should be positioned as an external consultant to maintain objectivity and some measure of independence on the technical aspects of the DE.
• The lead evaluator is expected to gain sufficient insider knowledge through interaction with the two support evaluators.

6 Discussed in Section V of the Terms of Reference for the Lead Evaluator.


IV. Scoping and executing developmental evaluation (methodology and other considerations)

18. Scope: As previously stated, the evaluation has the dual purpose of reporting on results and systematically capturing the learning that can be infused into the programme to heighten its chances for success. Hence, the evaluation will provide comprehensive and evidence-based answers to two overarching questions, namely,

a. Are the programme impact pathways stipulated for the PBEA feasible/credible and likely to produce the intended results and/or outcomes? and,

b. What new learning and/or improvements were effected to improve attainment of results and/or outcomes?

Based on the questions above, the evaluation will cover 1-2 key conflict drivers and corresponding programme outcomes. Additional monitoring will be provided for the selected outcomes. Evaluators will develop a learning framework (and/or theory of change) and criteria for weighing the evidence, to be validated with country teams. Country teams will also be afforded the opportunity to develop sub-questions that are customized to the programming context.

19. DE primers have identified entry points, practices and tools that have been found useful in organizing a DE investigation (Dozois, 2010). Table 2 (adapted from Dozois, 2010) summarizes some conventional steps to consider in the execution of a DE, as well as how these might be interpreted in the context of the PBEA.

Table 2: Executing the DE: possible actions

ENTRY POINTS/INITIAL ACTIONS

Practice/Action: Orienting the evaluation team
Description: Evaluators undertake investigative work early in the initiative in order to build a deeper understanding of the identified problem/opportunity, resources, stakeholders, and broader context.
Implications for PBEA: Time should be set aside for evaluators to review the PBEA proposal document, the global results framework, programme documents for the two selected countries, and the evaluability assessment report. They should also set up meetings with a range of stakeholders (Evaluation Office, PMT, TWG, regional focal points, and country teams).

Practice/Action: Building relationships
Description: The quality of relationships determines the degree to which the team can access information and influence change. For this reason, relationship building is critical to building a strong foundation for DE work.
Implications for PBEA: The most important relationship will be that between the evaluators and the respective regional focal points and country teams. This should be managed carefully by the PBEA Programme Manager and Evaluation Manager (Evaluation Office).

Practice/Action: Developing a learning framework
Description: It is important to develop a learning framework early in the process. Co-created with key stakeholders, a learning framework helps to guide development by mapping key challenges and opportunities, highlighting potential areas for learning, and identifying feedback mechanisms.
Implications for PBEA: A learning framework is highly desirable in the case of the PBEA. It will be the guiding document of the evaluation (and a contract of sorts between the evaluation team and the programme team).

Practice/Action: Developing the rest of the methodology
Description: Including the learning framework above, the methodology should articulate the criteria to be used to assess evidence, delineate the scope of the evaluation, and identify tools for collecting evidence.
Implications for PBEA: OECD/DAC evaluation criteria should be reinterpreted for the PBEA. Criteria could also be expanded to reflect human rights principles (such as participation, equality, empowerment, etc.). See Table 2.3 in the UNEG Guidance Note (2011).

SOME CRITICAL PRACTICES FOR CONSIDERATION

Practice/Action: Orienting the implementation team
Description: A key part of a DE's role is to help stakeholders surface and test their assumptions, articulate and refine their models, extend their understanding, and cultivate a culture that supports learning. These activities help implementing teams to develop and maintain an adaptive orientation in complex and unknown territory.
Implications for PBEA: Beginning with validating the learning framework mentioned above, orienting the implementing team should include ensuring that the team possesses a deep understanding of the conflict analysis. The country team (including the evaluator) should be on the same page on the most critical results for the programme, activate a strategy for incorporating new learning, and agree on roles and responsibilities.

Practice/Action: Watching
Description: Evaluators carefully observe the unfolding situation in order to help the group identify leverage points, assess their efforts, and stay true to the core intent and principles of their initiative. Evaluators intentionally watch (1) key developmental moments; (2) group dynamics; (3) structure; (4) action/inaction; and (5) threats and opportunities.
Implications for PBEA: The core competencies outlined in Para. 16 are especially important to execute the DE in a coherent way and to organize emergent learning such that it is accessible to the whole country team. We expect that the lead evaluator will undertake initial coaching and preparation of the support evaluators to ensure that they are ready to execute the tasks.

Practice/Action: Sense-making
Description: Sense-making is largely participatory in developmental evaluation: the evaluator's role is to help the group identify patterns, integrate new information, consider the implications of what they're seeing, and propose solutions.
Implications for PBEA: Evaluative skills are important here; so is reflective practice. It is important to come into the evaluation with a well-considered set of analytical tools, including innovating some tools if necessary.

Practice/Action: Intervening
Description: As a member of the team, evaluators actively help to shape the work by (1) asking questions; (2) facilitating discussion; (3) sourcing or providing information; (4) modeling; (5) pausing the action; (6) reminding; and (7) connecting.
Implications for PBEA: This is the most important component of the DE. Strategic thinking and communication skills are required for this aspect. Also, this is likely the aspect that has to be managed carefully, in that the technical skills have to be complemented with the evaluator's credibility.

20. The Lead Evaluator will be expected to build on some of these lessons, adapt and/or interpret them for the PBEA, or propose a completely different set of steps and/or methods to execute the developmental evaluation. Additional resources for building an evaluation methodology centered on reflective practice include guidance on evaluating peacebuilding interventions (OECD/DAC, 2012), UNEG guidance (UNEG, 2011), and CDA working papers (Reimann, Chigas, and Woodrow, 2012; and Rogers, 2012).

V. Tasks, responsibilities, and management arrangements

21. The Lead Evaluator will have overall responsibility for the technical guidance of the DE and its quality. He/she will conceptualize and develop the DE approach, guide and supervise the technical inputs of the support evaluators, and be responsible for quality assurance of all outputs, including the final report of the evaluation. The Lead Evaluator will also be responsible for compiling progress reports (with inputs from support evaluators) and for timely submission of deliverables to the evaluation manager in the Evaluation Office in NY.

22. The two support evaluators will be responsible for the execution of the DE and its quality at the country level. They will also be responsible for developing a work plan for executing the evaluation in-country, assuming responsibility for the data for monitoring progress of the PBEA outcomes that are the subject of the DE, and producing an agreed set of deliverables. In consultation with the PBEA Focal Point, Support Evaluators will also be responsible for reporting progress and results to the Lead Evaluator and the Chief of Education in-country.

23. The Chief of Education in each participating country will co-facilitate the recruitment of the support evaluator and provide supervisory support in the day-to-day execution of the evaluation activities. The Chief of Education and the PBEA Focal Point will sign off on all reports that are generated in-country before they are forwarded to the Lead Evaluator. The Deputy Representative and Senior Evaluation Specialist will be charged with the responsibility of adjudicating all disagreements.

24. The Evaluation Specialist (the evaluation manager in the Evaluation Office, NY) will, in consultation with the PBEA Manager (and the PBEA M&E Specialist, NY), recruit the Lead Evaluator and co-facilitate the recruitment of the Support Evaluators. She will also be primarily responsible for coordinating, directing, and supervising all activities of the Lead Evaluator. She will also consult with the PBEA M&E Specialist on a regular basis and provide updates to the PBEA Programme Manager at agreed intervals, as well as approve all deliverables (see Appendix A, Section VI).

VI. Risk management and/or mitigation

25. The most critical risk for the evaluation pertains to balancing the need to produce results with the readiness of different parts of UNICEF to execute a DE and get the full benefit of what it has to offer. To that end, careful consideration should be given to the following:

i) Clear delineation of the role of the support evaluator: There is always the risk that the support evaluator assumes the role of M&E officer for the entire education portfolio, or worse still, another pair of hands for the entire country office. The Evaluation Office and the two COs should come to an agreement that the role of the support evaluator is primarily ‘evaluator’, who will assume the monitoring role only for PBEA activities.

ii) Management arrangements: Related to the point on the role of the support evaluator is the issue of who he/she reports to. In a typical independent evaluation, the evaluator would be supervised by the evaluation manager (in the Evaluation Office). We anticipate a protracted discussion on management arrangements, and suggest that the arrangements proposed in Section V above be negotiated between the Evaluation Office and the Country Office leadership (involving the Chief of Education and the Deputy Representative, or higher).

VII. References

Dozois, E., Langlois, M., & Blanchet-Cohen, N. (2010). DE 201: A practitioner's guide to developmental evaluation. Montreal, Canada: J.W. McConnell Foundation and the International Institute for Child Rights and Development.

Gamble, J. (2008). A developmental evaluation primer. Montreal, Canada: J.W. McConnell Foundation.

OECD/DAC. (2012). Evaluating peacebuilding activities in settings of conflict and fragility: Improving learning for results. Paris: OECD/DAC.

Patton, M. Q. (2008). Utilization-focused evaluation (4th ed.). CA: Sage Publications.

Reimann, C. (2012). Evaluability assessments in peacebuilding programming. Working Papers on Program Review and Evaluation, No. 3. Cambridge, MA: CDA Collaborative Learning Projects.

Reimann, C., Chigas, D., & Woodrow, P. (2012). An alternative to formal evaluation of peacebuilding: Programme quality assessments. Working Papers on Program Review and Evaluation, No. 4. Cambridge, MA: CDA Collaborative Learning Projects.

Rogers, M. (2012). Evaluating relevance in peacebuilding programs. Working Papers on Program Review and Evaluation, No. 1. Cambridge, MA: CDA Collaborative Learning Projects.

UNEG. (2011). Integrating human rights and gender equality in evaluation: Towards UNEG guidance. New York, NY: United Nations Evaluation Group.
