Evaluation of Results-Based Management in UNDP — web.undp.org/execbrd/pdf/RBM_Evaluation.pdf · 2007. 12. 26.



ACHIEVING RESULTS

EVALUATION OF RESULTS-BASED MANAGEMENT AT UNDP

Evaluation Office, December 2007
United Nations Development Programme


Copyright © UNDP 2007, all rights reserved.
Manufactured in the United States of America

Design: Suazion Inc. (NY, suazion.com) Production: A.K. Office Supplies (NY)

Team Leader Derek Poate, ITAD

International Consultants Paul Balogun, Munhamo Chisvo, Robert Lahey, John Mayne

National Consultants Arcadie Barbarosie (Moldova), Seheir Kansouh-Habib (Egypt), Farsidah Lubis (Indonesia), Kenneth Mwansa (Zambia), Oscar Yujnovsky (Argentina)

Evaluation Office Task Manager S. Nanthikesan

Advisory Panel Sulley Gariba, Former President, International Development Evaluation Association; Head, Institute for Policy Alternatives, Ghana

Peter Van der Knaap, Director of Policy Evaluation, Netherlands Court of Audit

Jehan Raheem, Professor, Brandeis University, Massachusetts, United States; former Founding Director, Evaluation Office, UNDP

Odetta R. Ramsingh, Director-General, Public Service Commission, South Africa

Research Support Elizabeth K. Lang Nayma Qayum

Editor Margo Alderton

EVALUATION TEAM

EVALUATION OF RESULTS-BASED MANAGEMENT IN UNDP


FOREWORD

This report presents the assessment of an independent evaluation conducted by the UNDP Evaluation Office of the adoption of results-based management as an approach. Results-based management was introduced in UNDP in 1999 and considerable experience has been gained by the organization since its introduction. The Executive Board, recognizing the need to take stock and build on this experience, approved in its annual session in 2006 the inclusion of this evaluation in the Evaluation Office's work plan.

The main purpose of this evaluation is to examine the degree to which results-based management has fostered a results culture within the organization, enhanced capacity to make better management decisions, and strengthened UNDP's contribution to development results. The intent of this study is not to assess whether results-based management systems are in place in UNDP, as compliance on management systems is reviewed by audit. This evaluation was careful not to duplicate the extensive studies and reviews of results-based management in UNDP carried out both by the organization and by partners. The study did not, therefore, focus on how results-based management systems are used in reporting on UNDP's performance to the Executive Board or on the quality of the results frameworks used or indicators selected. The core analysis of the evaluation is directed at how results-based management has been used to enhance UNDP's contribution to development effectiveness.

The scope of the evaluation extended to all geographic regions, covering country, regional, global and corporate levels of programming and organizational work. UNDP's experience in results-based management from 1999 to the present was covered. Evidence for the assessment was drawn from case studies in five countries (Argentina, Egypt, Indonesia, Moldova and Zambia), interviews and focus-group discussions in UNDP headquarters in New York, an electronic survey of programme staff in country offices in all regions to which there were 365 responses, and a desk review of related evaluative literature.

UNDP was among the earliest UN organizations to introduce results-based management, and the evaluation finds that its experience has not diverged significantly from that of other public-sector agencies. There are, however, some specific challenges that UNDP faces. The evaluation concludes that UNDP is largely managing for outputs rather than outcomes and that the linkages between outputs and intended outcomes are not clearly articulated. The introduction of corporate systems and tools, which has had some efficiency benefits, has not, however, strengthened the culture of results in the organization or improved programmatic focus at the country level. The current approach of defining and reporting against centrally defined outcomes tends to undermine UNDP's responsiveness and alignment to nationally defined outcomes and priorities.

The evaluation makes a number of recommendations to address these and other challenges. There are important recommendations relating to developing a culture of results at all levels through strong leadership, changes in incentives, investment in capacity, streamlining systems, and investing in and responding to performance audit and evaluation. Stronger oversight by the regional bureaux is recommended, moving from the current emphasis on process compliance to a substantive engagement with country offices on the content of country programmes and their contribution to development results. The evaluation recommends a shift to a lighter corporate results framework in the Strategic Plan that provides clear 'boundaries' regarding priority areas defined by the Executive Board while allowing and enabling country offices to define development outcomes with national partners, in line with national priorities as identified in the United Nations Development Assistance Framework (UNDAF).

We are very grateful to the Executive Board members, governments and civil society representatives in the case-study countries who were very generous with their time and ideas. I would like to express our particular gratitude to all the resident representatives, UNDP staff and members of the UN country teams in the countries visited by the team, as well as the colleagues in New York who provided vital feedback to the team to enable them to reach their conclusions.

The report is the result of the dedication and intense team work of a number of people. The Evaluation Office is deeply grateful to the team that prepared the report. The team leader, Derek Poate, developed the methodology for the evaluation and led the drafting of the report. Other members of the team were Paul Balogun, Munhamo Chisvo and Robert Lahey, each of whom participated in one or more case-study missions and contributed to the evolution of the main report. The team benefited from the involvement of John Mayne, who has extensively studied results-based management in the UN system. The case studies were strengthened by the work of national experts including Arcadie Barbarosie (Moldova), Seheir Kansouh-Habib (Egypt), Farsidah Lubis (Indonesia), Kenneth Mwansa (Zambia) and Oscar Yujnovsky (Argentina).

The Evaluation Office invited independent experts to join an advisory panel for the evaluation. The members of the panel were Sulley Gariba (former President of the International Development Evaluation Association and Head of the Institute for Policy Alternatives, Ghana), Peter Van der Knaap (Director of Policy Evaluation, Netherlands Court of Audit), Jehan Raheem (former Director of the UNDP Evaluation Office and Professor at Brandeis University, United States) and Odetta R. Ramsingh (Director-General of the Public Service Commission, South Africa). The final report benefited from the comments and suggestions of the advisory panel.

The evaluation was ably task managed by Suppiramaniam Nanthikesan, Evaluation Advisor, in our office. Other colleagues in the office made important contributions to the report, including Evaluation Advisor Oscar Garcia, Michelle Sy, who handled administrative support, and Anish Pradhan, who provided information technology support to the electronic survey and to the publication process. Research support was provided by Elizabeth K. Lang and Nayma Qayum.

As the report points out, results-based management is a journey, not a destination, and results-based management is just one piece in the complex set of efforts directed at achieving intended results. I hope that this evaluation will be useful in helping UNDP chart a course in this journey to manage better for results that enhance development effectiveness in the countries where UNDP works.

Saraswathi Menon
Director, Evaluation Office


CONTENTS

Acronyms and Abbreviations

Executive Summary

1. Introduction

1.1 Methodology
1.2 The evolving aid context
1.3 What entails results-based management in UNDP?
1.4 Structure of the report

2. Results-based Management in UNDP

2.1 The evolution of results-based management in UNDP
2.2 Intended effects on the organization

2.2.1 Setting strategic goals
2.2.2 Aligning results
2.2.3 Aligning funding
2.2.4 Aligning human resources
2.2.5 Monitoring for results
2.2.6 Adjustment and learning
2.2.7 Accountability
2.2.8 Guidance and capacity building for results-based management

3. Findings from the Evaluation

3.1 Results orientation
3.1.1 Evolution of strategic goals and effects on the organization
3.1.2 Alignment with strategic goals at the country programme level

3.2 Managing for results
3.2.1 Monitoring at the country programme level
3.2.2 Adjustment of resources
3.2.3 Relations with the regional bureaux and substantive corporate oversight

3.3 Evaluation, learning and accountability
3.3.1 Role and use of evaluation
3.3.2 Learning and technical support
3.3.3 Accountability
3.3.4 Definition of clear roles and responsibilities (accountability relationship)
3.3.5 Clear performance expectations and reward systems (transparent incentive mechanism)
3.3.6 Credible and timely measurement and reporting of the results achieved (giving account)
3.3.7 Mechanism to hold to account
3.3.8 Accountability for outcomes

4. Discussion

4.1 Implications of the wider environment
4.2 A culture of results is being developed
4.3 Factors helping and hindering a culture of results
4.4 Staff survey on results-based management
4.5 Early gains, lost momentum
4.6 Managing for outcomes was correct but unsupported
4.7 Results-based management is more than tools and systems




5. Conclusions and Recommendations

5.1 Recommendations

Annexes

1. Terms of reference
2. People consulted
3. Evaluation team and advisory panel members
4. References
5. Tables
6. Results-based management in development organizations
7. Sources of funding to UNDP
8. Summary of results from the staff survey on results-based management
9. Results-based management benchmarks assessment
10. Findings from the country case studies

Boxes, Tables and Figures

Box 1. Results-based management is a challenge faced by other development partners
Box 2. Managing for results in associated funds and programmes
Box 3. Broad, permissive outcome statement
Box 4. Making outcomes evaluable
Box 5. Monitoring and evaluation system components
Box 6. Office of Audit and Performance Review

Table 1. Key events in the development of results-based management
Table 2. Prospective cultural changes in UNDP
Table 3. Percentage of programmatic funding by source
Table 4. Key features in implementation versus outcome monitoring
Table 5. Key strategy documents and evolution of UNDP goals
Table 6. Core fund allocation strategy
Table 7. Targets in the Moldova Strategic Notes (2004-2007)
Table 8. Factors supporting and holding back a results focus
Table 9. Themes associated with managing for results

Figure 1. Theory of change for results-based management
Figure 2. Force field analysis of factors for and against a results focus in UNDP Zambia


ACRONYMS AND ABBREVIATIONS

ATLAS UNDP’s enterprise resource programme

BDP Bureau for Development Policy

CCA Common Country Assessment

CCF Country Cooperation Framework

CPAP Country Programme Action Plan

CPD Country Programme Document

M&E Monitoring and Evaluation

MDG Millennium Development Goal

MfDR Managing for Development Results

MYFF Multi-year Funding Framework

NHDR National Human Development Report

OECD-DAC Organisation for Economic Co-operation and Development-Development Assistance Committee

OSG Operations Support Group

PRINCE 2 Projects in Controlled Environments (a process-based method for effective project management)

RBA Regional Bureau for Africa

RBAP Regional Bureau for Asia and the Pacific

RBEC Regional Bureau for Europe and CIS

RBLAC Regional Bureau for Latin America and the Caribbean

RCA Results Competency Assessment

ROAR Results-oriented Annual Report

RSC Regional Support Centre

SURF Sub-regional Facility

UNCT UN Country Team

UNCDF UN Capital Development Fund

UNDAF UN Development Assistance Framework

UNFPA UN Population Fund

UNIFEM UN Development Fund for Women

UNV UN Volunteers

VDA Virtual Development Academy



EXECUTIVE SUMMARY

INTRODUCTION

This report sets out the findings of an evaluation of the adoption and use of results-based management. The evaluation has focused on the organizational strategy, vision and expectations of the results-based management approach; the design, implementation and use of the system to operationalize this approach; as well as the results of this effort. The scope of the study covers the period 1999-2006, all geographic regions, and the adoption of results-based management at the programme, country, regional and corporate levels.

An inception report prepared by the evaluation team proposed a theory-based methodology. The theory identifies a causal process with five key elements. This five-stage process provides the structure for enquiries:

- Setting a strategic framework to describe the desired results
- Developing programmes aligned to the strategic results framework
- Measurement and analysis of the actual results achieved
- Use of information to improve design and delivery of programmes
- Reporting on performance as part of the accountability process

Information was gathered through interviews with staff in UNDP and other United Nations organizations at headquarters; review of UNDP documentation and that of other organizations; visits to five country programmes (in Argentina, Egypt, Indonesia, Moldova and Zambia), where interviews were held with a wide range of stakeholders in the United Nations system, government, civil society and other development partners; and through an electronic survey of UNDP staff in country offices. The choice of countries did not include conflict-affected situations, a specification in the original terms of reference, and this limits the ability to generalize, in the conclusions and recommendations, about the use and impact of results-based management in those contexts. A full list of people consulted in all locations is presented in the report.

THE EVOLVING DEVELOPMENT CONTEXT

The 1990s marked a shift from aid flows being determined by geopolitical considerations to having a focus on the promotion of sustainable human development. That shift was accompanied by a steady decline in official development assistance and increasing pressure to demonstrate the effectiveness of aid. UNDP adopted results-based management with the clear intent of reversing the declining resource base, assuring predictability of programme funding and demonstrating a performance focus to donors.

Since then, the role of UNDP has changed from mainly funding and implementing downstream activities to emphasizing upstream activities involving advocacy, policy support and capacity strengthening, using national execution as the predominant mode of delivering assistance. This has been reinforced by the changing environment for development cooperation. The increasing need for country-based joint assistance strategies, as was emphasized by the Paris Declaration, has led to new aid modalities, such as direct budget support and sector-wide approaches. The decline in resources has been overcome, and programmatic services delivered by UNDP increased from $2 billion in 1999 to $4.36 billion in 2005.

INTRODUCTION OF RESULTS-BASED MANAGEMENT

For results-based management to be successful, organizations need to develop and nurture a culture of results where enquiry, evidence and learning are valued as essential to good management. The use of results information in managing and informing decision-making is usually seen as the main aim of introducing results-based management. Managers are expected to: understand why projects and other activities contribute to the outcomes sought—the theory of change, set meaningful performance expectations, measure and analyze results, learn from this evidence to adjust delivery and modify or confirm programme design, and report on the performance achieved against expectations. When results-based management was introduced in UNDP, it was seen as involving all the features previously listed, and the importance of building a 'culture of results' was recognized.

In many respects, results-based management was a logical continuation of management reforms from the 1990s. A large number of tools and systems within UNDP are now associated with results-based management, but these emerged without a guiding design over a 10-year period of innovation, redesign and change, which has at times been unsettling for country office staff.

Results-based management is not without its critics. Political scientists argue that, in trying to set clear, concrete objectives and targets, results-based management can conflict with the need to keep objectives sufficiently imprecise to gain widespread support. Another criticism is that many of the developmental results sought by UNDP cannot easily be measured. As a result, results-based management forces measurement and reporting of less important results, especially outputs. When an organization overemphasizes any set of performance indicators and targets, staff tend to become preoccupied with those indicators and targets rather than the wider results.

It is important to take these concerns into account when developing and, especially, managing the results-based management regime.

GOALS AND PROGRAMME FOCUS

One of the most visible elements of the results-based management approach was the adoption of multi-year funding frameworks (MYFFs), which have strategic goals designed to help focus the programme and improve communication with external stakeholders. Alignment of country programmes with strategic goals was further promoted by a shift in focus from project outputs to outcomes. In a parallel initiative, a series of Administrator's business plans introduced plans to change the culture of the organization and put forward the 'balanced scorecard' as a tool to report against a broad range of physical and financial indicators of operational change.

This evaluation has found that the goals of UNDP in the strategic frameworks have changed in presentation, but the underlying areas of work have remained almost the same as before. The focus areas under the goals have been rationalized and simplified, but it is hard to identify substantive change to the scope of activities at the country level. Managers and staff in country offices believe that the MYFFs have helped to bring focus and improve positioning and advocacy. They have been a positive tool in conjunction with the reformed United Nations Development Assistance Framework (UNDAF) and country programme document to foster dialogue about country results. However, their effect on country portfolios has been limited to encouraging the removal of 'outlier' activities. Projects have just been mapped to the new frameworks.

The reasons for these limited effects are several. The MYFF sub-goals and service lines were very general and were not implemented with strictly monitored 'hard boundary' rules about what could and could not be justified under them. Intended outcomes for country programmes also tended to be vague, rather than rigorous statements forcing consideration of what projects would best contribute to their achievement. In other words, outcomes were defined to cover the existing portfolio rather than drive decisions as to how the portfolio should be refined and developed.

Results-based management assumes that managers have the flexibility to allocate programme resources to maximize results. That assumption does not generally hold true for UNDP. Core programme resources, the one area where managers have some financial flexibility, may have increased during the evaluation period, but core programme resources have become a progressively smaller percentage of total programme funds. Country offices have thus become more dependent on non-core programme financing, which is relatively inflexible, and often find that the core programme resources are only sufficient to act as seed money.

Flexibility to change country office organizational structures has the potential to foster closer linkages to results. Development and investment in thematic practice areas was intended to foster skills. Twenty-five offices have appointed monitoring and evaluation (M&E) specialists and 10 have an M&E unit. There is evidence that country offices organize their staff for delivery of results, with many structuring programme staff teams around broad programme areas. The Results and Competency Assessment for staff has the potential to foster a results and accountability culture. But in practice, mobilization of resources and delivery are more powerful drivers of individual performance among programme staff than achievement of outcome-level results. In its current modality, the Results and Competency Assessment does not support managing for results.

MONITORING AND ADJUSTMENT

The results-oriented annual report was introduced to monitor country programmes, but it has become primarily a tool for reporting to senior management—of little use for country office or regional management—and is viewed as the data source for the annual MYFF report to the Executive Board. That report, in turn, is criticized by Board members as being too vague and containing little substantive detail on results. With the advent of results-based management, the focus of results shifted to outcomes, but apart from the results-oriented annual report, no specific tools were developed to help monitor results. The Atlas system is steadily gaining in importance, but its primary focus is financial management and project monitoring.

The removal, in 2002, of many mandatory project monitoring tools and procedures was considered a risk, but it was done in the hope that country offices would continue with project monitoring as well as moving into outcome-level monitoring. The evidence suggests that the move may have led to a decline in project-level M&E capacity in some country offices, but it has stimulated the creation of diverse M&E approaches in others—especially where there is a staff member dedicated to M&E. While some progress has been made in country offices towards monitoring outcomes, approaches fail to explain how projects are contributing to programme outcomes.

Evidence from countries visited also shows that country offices face challenges in defining outcomes that are owned by other partners, although the UNDAF/country programme document/country programme action plan process is a help, as are joint assistance strategies. This reflects the reality that many partner governments and donors have their own M&E systems, which may or may not use results-based management. The clearest indicator of this is the absence of joint outcome evaluations and the limited use by UNDP of M&E evidence derived from other systems of partners.

The oversight and management roles of the regional bureaux do not focus on results. Formal points of interaction with country offices are concentrated in the development of new country programmes, but the interaction focuses on process conformity rather than on substance. Thereafter, the engagement is generally driven by resource mobilization and delivery. Routine interaction between country offices and the respective bureaux takes place through desk officers in the regional bureaux, who are an important link in understanding country performance but are overworked, junior in status, and often lack resources and technical skills.

In order to address shortcomings in the results-oriented annual report, the regional bureaux have approached information management in different ways, often drawing on the new information tools. But the quantitative information in Atlas and other tools is not accompanied by qualitative information that explains performance. Some regional bureaux argue that the shift in focus to outcomes has not been matched by M&E tools to measure performance. The current system is dominated by self-reporting through the results-oriented annual report and the Results and Competency Assessment, without adequate mechanisms for periodic validation of quality.

Overall, the evaluation finds that there is little evidence to indicate that results are being used systematically to inform adjustments to the country portfolios.

EVALUATION, LEARNING AND ACCOUNTABILITY

The adoption of results-based management in 1999 was followed, in 2002, by the introduction of outcome-level evaluations. Adoption of outcome-level evaluations in the countries visited has been slow. In general, they have been under-budgeted and poorly timed to influence the content of country programme documents. Independent evaluation is an important element of results-based management to validate performance. Yet country evaluations conducted by the UNDP Evaluation Office have emphasized learning over accountability and have not measured performance against stated intentions. Validation occurs only through infrequent audits. However, the role of evaluation was strengthened in 2006, when the Executive Board endorsed a new evaluation policy.

Little use is made of results for the purpose of learning at the country level, and staff would like to see more time allocated to this process. At UNDP headquarters, the Bureau for Development Policy's (BDP) practice architecture is used for technical support, but this architecture has poor linkages to build on lessons emerging from evaluations and has produced few products clearly tailored to business processes.

The UNDP accountability framework does not support results-based management. Roles and responsibilities are generally clear, but country programme outcomes and indicators are not subject to quality assurance and there is little independent validation. Individual targets in the Results and Competency Assessment are self-selected and are often applied retrospectively and poorly linked to incentives. Despite the intended shift to managing for outcomes, individual staff remain tied to a project orientation and accountability for outputs.

There is no evidence that the Resident Representative/Country Director is held accountable for managing for outcomes, and there is considerable scepticism within UNDP over whether this is feasible, despite evidence of moves towards such an approach in sister organizations such as UNFPA.

MANAGING FOR RESULTS IN THE WIDER CONTEXT

UNDP works in a multilateral context, in which its mandate emphasizes the centrality of national ownership and the role of UNDP in building national capacity. This implies the need to work through national systems, where feasible. UNDP programmes invest significant attention in supporting national systems for tracking progress towards the Millennium Development Goals and other development results. But there is little evidence of engagement with national planning and results/performance systems at the sector or the programme level.

This suggests two important things. First, UNDP has missed opportunities to harmonize with the results-based management approaches of national partner governments. This reduces the scope for national capacity development and for enhancing national ownership. Second, UNDP has not fully considered the implications of its results-based management approach for broader United Nations reform initiatives such as the Paris Declaration and the Monterrey Consensus, which emphasize closer alignment and harmonization with governments and development partners.


E X E C U T I V E S U M M A R Y x i

The shift in strategic direction and the new results framework brought early gains to the programme through a clearer expression of UNDP roles and functions. Staff have found strategic frameworks to be valuable in discussions with governments and development partners. But the goals in the results frameworks for the two MYFFs and the draft strategic plan are too broad to help UNDP focus support in areas where it has a comparative advantage. The strategic objectives have not been used to define what is and what is not appropriate for country programmes to support.

Sharpening the focus in this regard would require a change in the relationship between regional bureaux and country offices. The focus would have to shift away from oversight of processes and resources to a greater emphasis on substantive content. The low level of core funding and high reliance on non-core funds by many country programmes means that the management spotlight needs to be on the degree to which Resident Representatives manage to refocus in this situation.

In the inception report for this evaluation, a table of benchmarks based on international performance standards for results-based management was introduced as an evaluation tool. The report assesses each benchmark, drawing on the findings of the present study and relevant independent studies. Unfortunately, in their present form, it is not possible to use the benchmarks to assess the status of UNDP relative to other United Nations organizations or to the wider population of public-sector organizations that use results-based management approaches. But progress has been made in most areas. Of the 21 categories assessed, two are assessed as 'fully achieved', 16 as 'partially achieved', and three as 'not achieved'. The large number of benchmarks assessed as 'partially achieved' reflects the positive work of UNDP towards creating an architecture to manage for results, but it also reflects that too many elements of the approach are not functioning effectively.

CONCLUSIONS

Overall, UNDP has established a cycle of setting and revising corporate goals, introduced improved office systems to manage project finances, institutionalized the need to report on corporate and individual performance, and raised awareness about results throughout the organization.

Conclusion 1: The experience of UNDP with introducing results-based management is similar to that of other organizations.

UNDP was one of the first United Nations organizations to move to a results-based management approach. Review of the literature strongly suggests that the experience of UNDP did not diverge significantly from that of many other public-sector organizations. The present evaluation identified a number of areas where greater progress could have been made, but even under perfect conditions, it is unlikely that UNDP could have fully institutionalized a results-based management approach within eight years. Subsequent conclusions and recommendations therefore focus on the key challenges for UNDP.

Conclusion 2: UNDP has a weak culture of results.

Adopting results-based management was a logical continuation of the management reforms that occurred in the 1990s and a response to pressure to improve performance across the United Nations as a whole. Significant progress has been made on a number of fronts, sensitizing staff to results and creating the tools to enable a fast and efficient flow of information. Managing for results has proved harder to achieve. In particular, the strong emphasis on resource mobilization and delivery; a culture that does not support risk-taking; systems that do not provide adequate information at the country programme level; a lack of clear lines of accountability; and a lack of a staff incentive structure all work against building a strong culture of results.


Conclusion 3: The corporatist approach has had only a limited effect on development effectiveness at the country level.

UNDP adopted a systems approach to stimulate managing for results, which meant that change was to be driven by the implementation of centrally designed and prescribed systems. These were developed primarily to enable aggregate reporting of UNDP performance to the Executive Board while at the same time creating a clearer focus for the programme.

In practice, the corporate goals and service lines set by headquarters have proved too numerous, with very permissive definitions. This has led to country offices manipulating their programmes to fit into corporate service lines, diverted attention away from country needs and made reporting to the Executive Board more about process than substance. There is little evidence that this approach has significantly affected the shape of country-level programmes, but there is significant evidence that it has imposed unnecessary transaction costs at the country level. A notable omission is the lack of oversight systems that focus on tracking whether programmes use results to adjust resources such as people, money and partnerships in order to improve future results.

Conclusion 4: Results-based management hasbeen misinterpreted as not supporting thedecentralized way in which UNDP works.

Conclusion 4: Results-based management has been misinterpreted as not supporting the decentralized way in which UNDP works.

Decentralization has been accompanied by delegation of authority over the country programme. Under current procedures, country programmes are not scrutinized for development potential by regional management, an abdication of responsibility. As a result, evaluation and auditing are the only means to check that country programmes are contributing to corporate goals.

The ‘top-down’ approach has inadvertently fuelled concerns that having corporate goals is a means of imposing upon programmes at the country level. The role of results-based management is not to constrain the ways in which programmes are negotiated at the country level but to provide a framework, so that UNDP works within its mandate and ensures that resources are aligned with achieving results. Once programmes are agreed upon at the country level, results-based management should provide standards for dialogue about how to set realistic outcomes, select objective indicators that demonstrate progress towards development objectives, and jointly monitor progress.

Conclusion 5: Results-based management systems are not helping build a results-based culture.

There are strong perceptions within UNDP that financial administration and management systems have improved. However, there is little evidence that these systems have led to an increased focus on managing for outcomes. ATLAS and PRINCE2 both deal with information at the project level, and the project is at the core of their designs. The Results and Competency Assessment does not effectively incorporate key results that reflect successful management for results by individuals. There are also concerns that systems have become too complex and time-consuming.

Results systems have been designed mainly to meet the demand for data for reporting to the Executive Board rather than to manage outcomes. Yet UNDP has not developed a system for reporting on its contribution towards development results. This reflects a number of issues. The corporate-level results frameworks have never included high-level goals with substantive, measurable and agreed-upon indicators against which to assess global progress. The goal-level reporting by UNDP contrasts starkly with the objectivity of reporting against the Millennium Development Goals. UNDP has developed a reporting system that aggregates whether or not results will be delivered when expected. This approach has limitations: the country programme outcomes against which UNDP is expected to deliver are poorly defined; the logic linking outputs delivered by UNDP with achievement of the outcomes is often not explained; and therefore, this reporting system fails to report on UNDP performance relative to what it is accountable for.

Conclusion 6: Managing for results requires leadership.

The importance of leadership to drive results-based management forward has been noted in several parts of the present report. A good example of effective leadership was the role of the previous Administrator in fighting declining resources. A strong personal commitment was supported by: a single, simple and consistent message on resource mobilization that was communicated to both internal and external audiences; development of systems to track, measure and report managers’ success in mobilizing resources; and a clear perceived link between successful resource mobilization and advancement within the organization.

The same drive and visible, consistent senior-level support is needed for results-based management. Four relationships stand out as the most critical: at the Executive Board, to ensure the programme is held to account for development results; between the Administrator or Associate Administrator and the Bureau Directors; between Directors of Regional Bureaux and Resident Representatives or Country Directors; and between Resident Representatives or Country Directors and staff within country offices.

RECOMMENDATIONS

Managing for results is a dynamic process, and many of the issues raised in this report are known to UNDP management and are receiving attention. There is genuine interest and support at the country level for a better focus on results.

Recommendation 1: Strengthen leadership and direction.

The first and overarching recommendation addresses the need to capitalize on what has been achieved to date and establish a stronger culture of results. The success of this is not dependent upon tools and systems, but on leadership and direction. Sustained commitment by top management, the Administrator and the Associate Administrator, is required.

Strong leadership is necessary. Attention to the following issues is also needed: a shift in the accountability framework from process and compliance to results; outspoken commitment by senior management, especially the Directors of Regional Bureaux; a change in dialogue throughout the organization that prioritizes management for development results and addresses how this will be balanced against competing demands such as resource mobilization; time and space for staff to give feedback on and learn from experiences; a shift in organizational practices to take risks and manage for outcomes rather than outputs; and improved capacity to measure results.

Recommendation 2: Global goals, local solutions—Sharpen the role of the strategic results framework.

Management should adopt a results framework that distinguishes more clearly between corporate goals and country programme outcomes.

For the four UNDP focus areas, objectives should be based on the key results areas, with indicators of substantive development change comparable to those used for the Millennium Development Goals. The corporate key results areas contain the basis of what could be measurable goal-level objectives, for example: promoting inclusive growth; promoting gender equality; fostering inclusive participation (in governance); and empowering the poor, women and youth. This approach will take time. The Executive Board and UNDP should start with those key results areas where internationally agreed-upon indicators already exist.

Identifying and reporting on UNDP contributions should not be an obstacle, any more than it is for organizations reporting country progress against the Millennium Development Goals. The development of robust models that link country programme outcomes and UNDP contributions with the achievement of these high-level objectives is key.

The current practice of setting corporate outcome-level objectives and indicators within the strategic plan should end. Instead, outcome objectives and indicators should be set at the country programme level, where they should be linked to UNDAF outcome objectives in the context of agreed-upon national development objectives. Comparable outcome objectives should be set within the regional and global programmes.

The above change would reinforce the decentralized nature of UNDP activities and build on United Nations reforms. The change would have to be supported by a shift in the oversight roles of the regional bureaux, senior management and the Executive Board away from compliance with procedures towards ensuring that country programmes implement robust, results-based management approaches and are designed to contribute to the UNDP focus areas.

Recommendation 3: Support managing for outcomes at country offices.

Managing for outcomes means that managers learn from results and empirical evidence and use that evidence to adjust either the projects under their control or the composition of the portfolio of projects to maximize the contribution of the organization to that outcome.

Implementing such an approach requires that UNDP consider the wider environment at the country level when defining outcomes. There is a need for improved guidance on how to balance demands on the results-based management system to meet internal UNDP needs with those imposed by the wider environment within which UNDP operates at the programmatic level. This includes dealing with three core issues raised in this report:

- Ownership of results at the country level

- The implications of harmonizing other partners’ results-based management approaches and systems

- UNDP accountability for managing for results

The positive effects of some of the newly developed UNDP systems are noted above, with the caution that they are based predominantly on managing projects. Introduction of new management and reporting systems will impose significant costs on country programme teams, and the country-level perception is that there has been insufficient appreciation at the corporate level of the impact of these costs.

Country offices want to be effective and need support in several ways:

- A streamlining of systems, aiming for a more user-friendly, integrated approach with better prioritization and introduction of new requirements across the organization

- Improved practical tools and guidelines to plan how projects will contribute to programme outcomes and to improve the specification of indicators

- A large-scale capacity-development programme to improve staff knowledge and skills

- Improved design of programmes based on proven models of intervention that can be tailored to country circumstances, managed, monitored and evaluated

- Introduction of quality assurance to examine country programmes and assess evaluability

- Expanded use of country office outcome evaluation plans geared to joint evaluations with government and development partners

- Revision of the results-oriented annual report to improve the evidence base and structure of the report

Recommendation 4: Expand investment and use of evaluation and performance audit.

Improving country programmes requires attention to detail and development of sound objectives and indicators. A quality assurance process is recommended as an ex ante way of scrutinizing country programmes. This needs to be supported by independent review of processes and compliance, along the lines of the current enhanced audits.

The structure of results proposed here places more responsibility on country offices to develop programmes that respond to country needs and contribute towards global goals. It also frees them from having to fit into centrally determined service lines. The test, therefore, is whether the programmes that are developed contribute to the goals of UNDP. This will require a stronger evaluation function that addresses both learning and accountability. The 2006 evaluation policy is a step in the right direction. The challenge now is implementation that supports accountability and the new results management guidance.

The above recommendations are intended to be mutually reinforcing and ought to be viewed as a whole. Some recommendations focus on the overall framework rather than specific tools or issues. Dealing with leadership, the results framework, programme focus and accountability of the regional bureaux are the highest priorities, followed by tools to help country offices chart contributions to outcomes, and quality assurance systems for programme review.


Chapter 1

INTRODUCTION

This report sets out the findings of an evaluation of the adoption and use of results-based management. UNDP adopted results-based management in 1999. There is considerable organizational interest in reviewing UNDP’s experience, as evidenced by the number of reviews being conducted on related issues.1 In addition, recent evaluations conducted by the UNDP Evaluation Office have highlighted persistent issues and gaps related to the operation of results-based management systems and approaches at the country office level.

The Concept Note for the evaluation states that the evaluation will focus on the organizational strategy, vision and expectations of the results-based management approach; the design, implementation and use of the system to operationalize this approach; as well as the results of this management approach.2 In doing so, the evaluation aims to provide feedback on UNDP’s efforts to strengthen the existing results-based management practices and make forward-looking recommendations. The evaluation will not examine the tools of results-based management in any detail, as these have been reviewed in other studies.

The scope of the study is quite broad. It covers the period 1999 to 2006, all geographic regions, and the adoption of results-based management at the programme, country, regional and corporate levels. Through the choice of countries that were visited for this evaluation,3 this study has assessed the results-based management approach under the diverse development conditions in which UNDP functions—including countries with very high aid dependence and varying capacities for M&E. This review also takes note of any specific characteristics peculiar to the specialized funds: UN Volunteers (UNV), the UN Development Fund for Women (UNIFEM) and the UN Capital Development Fund (UNCDF).4

1.1 METHODOLOGY

An Inception Report prepared by the evaluation team proposed a theory-based methodology and data collection using three research methodologies:5

- A desk review of relevant secondary material

- Five case studies of countries and their regional bureaux

- Interviews with staff in headquarters and a survey of staff in all country offices to gather data on issues emerging from the desk review and pilot-country case study

The theory of change for results-based management was derived from literature on the introduction of results-based management systems and is presented diagrammatically in Figure 1.6 It identifies a causal process with five key elements. This five-stage process provides the structure for enquiries.


1 MSI, ‘Background Paper on Results-Based Management in Development Organizations’, Management Systems International, July 2006, p 6; Danish Ministry of Foreign Affairs, ‘Assessing Results Management at UNDP’, 15 June 2006; UNDP Administrator’s Office, ‘Management and Workflow Review—Phase I’, January 2006.

2 UNDP, ‘Concept Note: Evaluation of Result Based Management at UNDP’, UNDP Evaluation Office, New York, NY, February 2007. See also the Terms of Reference in Annex 1.

3 Argentina, Egypt (pilot country), Indonesia, Moldova and Zambia.

4 All three were added to the scope during the pilot country visit.

5 Poate D, Balogun P, ‘Evaluation of Result Based Management at UNDP’, Draft Inception Report, March 2007.

6 ‘RBM in UNDP: Overview and General Principles’, not dated.


1. Set out a strategic framework that describes the objectives and desired results of the organization and the strategies to be used to achieve those results.

2. Develop programmes and sub-programmes in the organization aligned to the strategic results framework, showing more specifics on the results expected—resources, outputs, and the logic, sequence and timing of outcomes expected to lead to the accomplishment of the programme objectives—and how the results are to be measured.

3. Measure and analyze results achieved and the contribution being made by the programme to the expected results through both ongoing monitoring and periodic evaluations.

4. Use the results information gathered to improve the design and delivery of programmes.

5. Report on the levels of performance achieved as part of the accountability process.

Figure 1. Theory of change for results-based management

[Diagram: a causal chain running from programme orientation through primary and secondary institutional effects to the goal of managing for results and improved performance (organizational and development effectiveness). Stages: set strategic goals for UNDP; goals create a framework to define programmes, organizational structure and management arrangements; plan specific results to achieve the strategic goals; align resources (people, money and partnerships) to achieve results; monitor implementation for progress and performance; timely information goes to decision-makers; knowledgeable interpretation; decision-maker responds; efficient adjustment of resources (people, money and partnerships) to achieve results; evaluate strategic goals. Institutional incentives, organizational learning, and external factors and risks influence the chain.]

The diagram follows the key principles for results-based management at UNDP. The stages in the diagram identify key effects, starting with a clearer orientation of the programme as a whole, followed by realignment of resources towards results, efficient adjustment of resources and links with knowledge and institutional learning. An overarching condition is having an organizational climate or culture that encourages managing for results. This issue is developed further with practical examples in Annex 6.

The implicit goal is improved performance (interpreted here as organizational effectiveness and contribution to development effectiveness). The pathway by which results-based management processes improve management decision making and stimulate development effectiveness is not clearly specified in the literature. This is methodologically challenging compared with the situation in many other agencies, since UNDP programmes are usually implemented by development partners who, in management terms, are at arm’s length from UNDP, and UNDP’s contributions are often small in scale and work through policy advice and advocacy, as well as money.

Evaluating results-based management systems and processes within UNDP poses two significant challenges:

- At present, there are no internationally agreed-upon standards that define what should be included within a results-based management system and how such components should operate in practice.

- Initial discussions with headquarters stakeholders and review of discussion on the Practice Networks indicate that there is not a consensus within UNDP on what the results-based management system is, how it should operate and the intended effects.

The evaluation team therefore defined a set of benchmarks, drawing on three sources of performance expectations, to assess whether systems are in place and the expected processes are being used in their operation. First, the evaluation team adopted the benchmarks developed during the UN Joint Inspection Unit’s review of results-based management within the United Nations in 2004. These benchmarks primarily focus on the existence and operation of relevant systems. Second, the team developed benchmarks building on material presented in the Joint Venture on Managing for Development Results’ Source Book. These benchmarks, which are less precise in nature, mostly focus on assessing whether systems are applied using a sound approach. Third, the team developed additional benchmarks using the objectives set out in the Paris Declaration on Aid Effectiveness, dealing primarily with issues such as use of country systems, alignment and harmonization. These provide a more forward-looking orientation to help ground the evaluation in the evolving context within which UNDP operates. The benchmarks are set out in Annex 9 together with findings from the evaluation, as discussed in Chapter 5.

Extensive consideration was given to the selection of case-study countries. A purposive approach was adopted, dictated by looking for cases that would provide evidence to test against the theory, rather than using a randomized sampling approach. Specific criteria that were assessed include:

- Countries where key informants believe that the office has used results-based management approaches to some extent. All managers will use information to make decisions, but the focus of the evaluation is on the extent to which information from the results-based management systems is used.

- Size of the UNDP programme—previous analyses and interviews with UNDP headquarters staff confirm that the size of the programme dictates whether or not somebody will be dedicated full time within the UNDP country office to ensure the implementation and smooth functioning of the results-based management systems.

- A broad range of development contexts (such as high aid dependence and level of income), taking account of audit reports on the processes followed.

- Representation across the regional bureaux, on the assumption that the management styles across the regional bureaux vary and this may have affected how results-based management has been institutionalized.

There was neither time nor enough information to select case studies that could be considered a ‘representative sample’ of the entire population.7

Therefore, following case-study theory, and in order to avoid bias, the evaluation generalized from the findings of the case studies to the theory underpinning results-based management, as outlined in the theory of change. The unit of analysis for the case studies was the country programme and the relevant regional bureau. The final countries selected were Argentina, Egypt, Indonesia, Moldova and Zambia. This selection allowed for a better understanding of how decision making is affected by results-based management across the chain of accountability within the organization. However, the evaluation did not examine results-based management in conflict-affected situations, a specification in the original terms of reference, which limits the ability to generalize in the conclusions and recommendations about the use and impact of results-based management in such contexts.

The primary data collection methods were individual interviews with stakeholders both within and outside of UNDP, and group exercises to elicit country office views on key issues. Annex 2 lists all people interviewed. This was supplemented by an analysis at headquarters of how results-based management systems were developed, how that development was coordinated with the development of other key systems (mainly the human resources, knowledge management, and financial allocation and monitoring systems), and how their implementation was supported across the organization.

Additional data was collected via an electronic survey of all management and programming staff in country offices (excluding the five countries visited). The survey was similar in style to the regular global staff survey, asking respondents if they agree or disagree with a series of statements about how results-based management functions and the culture of results. A full list of the questions appears in Annex 8 together with a summary of results. The questionnaire was sent to approximately 1,700 staff; 365 replied.8

1.2 THE EVOLVING AID CONTEXT

In recent years within the international development community, there has been enhanced cooperation to reduce poverty and work to increase development effectiveness. The MDGs have set new standards for multilateral organizations, donors and partner countries. For at least a decade, the international community has been developing partnership approaches to development assistance, such as sector approaches and poverty reduction strategies. Some of these approaches, such as general budget support, have been the subject of intense debate regarding their effectiveness. Aid effectiveness has also been a subject of increased attention over the last few years, with agreements to work towards better harmonization, alignment and results. At the same time, policy makers have tried to reach beyond development assistance to consider the impact of other policy measures and private-sector activities, especially the impact of foreign direct investment and remittances.

7 Yin RK, ‘Case Study Research: Design and Methods’, 3rd Edition, Applied Social Research Methods Series, Volume 5, Sage Publications, 2003.

8 The anonymity of the survey system means that details of who did or did not reply are not known.

The 1990s saw a shift in the thinking and practice of development cooperation, from aid flows being determined by national strategic considerations to a focus on the promotion of sustainable human development.9 This shift also resulted in a steady decline in official development assistance and increasing pressure from the public in donor countries to demonstrate the effectiveness of aid. In response, some bilateral organizations (led by USAID), and the public sector in some donor countries (such as Australia, Canada and New Zealand), began to adopt a results-based management approach that was widely used in the private sector. To reverse the declining resource base, assure predictability of programme funding and demonstrate a performance focus to the donors, UNDP also adopted results-based management.

Since adopting this new management approach, UNDP has had to deal with both quantitative and qualitative changes affecting the aid environment, including:

- Expanded operational capacity—Programmatic services delivered around the globe by UNDP increased from USD 2 billion in 1999 to USD 4.36 billion in 2005.

- The changing role of UNDP—The organization has shifted from mainly funding and implementing downstream activities to emphasizing upstream activities involving advocacy, policy support and capacity strengthening, and adopting National Execution as a predominant mode of delivering assistance.

- The changing environment for development cooperation—The increasing need for country-based joint assistance strategies, as emphasized by the Paris Declaration, has also led to new aid modalities, such as direct budget support and sector-wide approaches. This environment reinforces the shift to upstream activities.

1.3 WHAT DOES RESULTS-BASED MANAGEMENT ENTAIL IN UNDP?

Different organizations define results-based management in different ways, yet there is a strong common denominator among definitions. All reflect the underlying idea of learning from empirical evidence based on past experience and using that information to manage. The Organisation for Economic Co-operation and Development–Development Assistance Committee (OECD–DAC) Managing for Development Results Sourcebook10 puts it well:

“Results-based management asks managers to regularly think through the extent to which their implementation activities and outputs have a reasonable probability of attaining the outcomes desired, and to make continuous adjustments as needed to ensure that outcomes are achieved.”

For results-based management to be successful, organizations need to develop and nurture a culture of results where enquiry, evidence and learning are considered essential to good management. The use of results information in managing is usually seen as the main aim of introducing results-based management. In results-based management, managers are expected to:

9 UNDP, ‘The Multi-Year Funding Framework Report by the Administrator’, UNDP Executive Board, DP/1999/CRP 4, January Session 1999.

10 OECD and World Bank, Emerging Good Practice in Managing for Development Results, First Issue, Source Book, 2006, p 9, available online at http://www.oecd.org/dataoecd/35/10/36853468.pdf.


CHAPTER 1. INTRODUCTION

- Understand why the programme and projects are believed to contribute to the outcomes sought—the theory of change.

- Set meaningful performance expectations/targets for key results (outputs and outcomes).

- Measure and analyze results, and assess the contribution being made by the programme to the observed outcomes/impact.

- Deliberately learn from this evidence and analysis to adjust delivery and, periodically, modify or confirm programme design.

- Report on the performance achieved against expectations—outcomes accomplished and the contribution being made by the programme, i.e. what difference it is making.

When results-based management was introduced in UNDP, it was seen as involving all the above features, and the importance of a culture of results was well recognized. This is evident in a series of notes produced in 2000.11 A longer treatment of these issues is given in Annex 6.

An important aspect of UNDP is the nature of working in a multilateral and decentralized setting, where work is planned and managed at the country level in response to diverse country needs, yet done so within a global corporate environment that provides technical and management support. Arrangements need to be made so that planned results clearly respond to country needs yet are focused within a corporate framework that enables UNDP to add value within its areas of competence. The decision on those areas of competence is an issue for the Executive Board, not an element of the results-based management system.

The challenges in implementing results-based management in an organization are many (see Box 1).12 Perhaps key is, as the UNDP Overview and Principles document noted,13 the importance of emphasizing ‘management and learning’ over reporting and systems, in order to foster a ‘culture of performance’. Developing results frameworks, measuring results and reporting results in an organization will clearly involve systems. If a culture of performance can be developed, then the main purpose of results-based management will not be lost. But without strong efforts to develop and support such a culture, the systems become the dominant feature. Senior managers have a special role to play in fostering this climate of results through clear leadership and demonstrating that results and results management do matter. Reviews of experience with results-based management show that:

“Leadership support for results-based management reforms is important. Without strong advocacy from senior managers, results-based management systems are unlikely to be institutionalized broadly or effectively within an agency. Leaders can send strong messages of support for results-based management to their staff by giving speeches, sending out agency-wide notices, participating in results-based management-oriented workshops, providing adequate budgetary support, etc.” 15

Box 1. Results-based management is a challenge faced by other development partners

The World Bank 2006 Annual Report on Operations Evaluation reviewed progress with managing for results.14 The report found:

- The World Bank has instituted policies and procedures to manage better for results.
- These have not yet translated into improved practices at the operational level.
- World Bank managers and operational staff struggle to link goals to operations.
- Performance indicators are often inadequate.
- Many staff are unclear about how to use performance information in their day-to-day work.
- World Bank culture acts as a disincentive to managing for results.

These findings resonate with the challenges faced by UNDP, described in this report.

11 UNDP, ‘RBM in UNDP: Overview and Principles’, 2000; UNDP, ‘RBM in UNDP: Technical Notes’, 2000; UNDP, ‘RBM: Concepts and Methodology’, 2000.

12 See for example Mayne J, ‘Challenges and Lessons in Implementing Results-Based Management’, Evaluation, 2007, Volume 13, Issue 1, 89-107.

13 ‘RBM in UNDP: Overview and General Principles’, p 6, available online at http://www.undp.org/eo/documents/methodology/rbm/RBM-Overview-GP.doc.

14 World Bank, ‘Annual Report on Operations Evaluation’, IEG, 2006. Quote from website, www.worldbank.org/IEG.

Not everyone considers results-based management a good system. Critics of using results-based management in public management point to a number of aspects of public-sector life that militate against a rational approach to managing. A case is often made that trying to manage by numbers in a political context is, at best, unrealistic and can be dysfunctional. Political scientists argue that, in trying to set clear and concrete objectives and targets, results-based management can conflict with the need to keep objectives suitably fuzzy in order to gain widespread support.

This is true to some extent, but in the end, UNDP has to make choices about funding specific programmes consistent with national priorities. Clarity in objectives can only help their design and delivery.

Other critics of results-based management argue that many of the developmental results sought by UNDP and other public-sector organizations cannot be measured. As a result, results-based management forces measurement and reporting of other, less important results, especially outputs.

But many, if not most, results sought can be measured, especially if the evaluation team considers measurement in the public sector to be a means of reducing the uncertainty about what is happening, rather than of definitively proving something. Flexibility in measurement approaches would allow a wide variety of means to be used to increase understanding about the performance of a programme from different perspectives.

Focusing on any set of performance indicators can distort behaviour as people work to reach targets. Arguably, this is a characteristic of the drive for resource mobilization and delivery in UNDP. While this is a real problem, there are many ways to counter this tendency, such as focusing on outcomes not outputs, reviewing measures regularly, using a balancing set of indicators, and developing indicators in an inclusive manner.

There are legitimate concerns over results-based management, and organizations should be aware of the possible downsides of implementing it. More details and specific references are given in Annex 6. It is important to take these concerns into account in developing and managing the results-based management regime.

The Overview and Principles document foresaw this potential problem when it noted that “RBM [results-based management] is a learning process … evolving over a considerable period of time and incorporating flexibility to make changes as experiences are gained.”16 Results-based management is at the same time conceptually quite simple—seeking and using results information to assist management and accountability—yet a significant challenge to implement in organizations, since it requires culture change and persistence. It is a journey, not a destination, requiring ongoing attention and commitment.

15 Binnendijk A, ‘Results-Based Management in the Development Cooperation Agencies: A Review of Experience’, Background Report, DAC OECD Working Party on Aid Evaluation, Paris, France, 2001, p 134. Available online at http://www.oecd.org/dataoecd/17/1/1886527.pdf.

16 ‘RBM in UNDP: Overview and General Principles’, p 5, available online at http://www.undp.org/eo/documents/methodology/rbm/RBM-Overview-GP.doc.



1.4 STRUCTURE OF THE REPORT

This report is structured in five chapters. Following this introduction, Chapter 2 provides an overview of results-based management and examines how it was introduced to UNDP and what tools and systems were developed. Chapter 3 presents the main findings from the country visits and interviews at UNDP Headquarters. Chapter 4 interprets the findings and tries to explain the progress that has been made to date. Chapter 5 includes conclusions and recommendations.

Key points

- The central feature of a results-based management system is managers using information to guide management decisions.
- UNDP documents show that the importance of change to a culture of results was well recognized when results-based management was introduced.



2.1 THE EVOLUTION OF RESULTS-BASED MANAGEMENT IN UNDP

The history of results-based management in UNDP can be traced to the mid-1990s. Many people in the organization see the process as a logical evolution of earlier initiatives, such as programme planning. For the purposes of this evaluation, the Administrator’s Annual Report of 1997,17 which calls for the establishment of “an overall planning and results management system in UNDP”, is taken as a defining point. Box 2 highlights some characteristics of the approach to results-based management taken by specialized funds and programmes.

The Administrator’s Annual Report clearly states the intention to develop plans that would “be aligned with the budget for the 1998-1999 biennium, and the organization would be able to manage results against planned goals and targets and with the appropriate resources assigned to achieve those results.”18 However, there is no single document in which the organization described how its overall approaches and systems would be changed to institutionalize a results-based management culture, or how the organization would measure the degree to which the intended changes in management practice had been achieved. The various tools and systems that comprise results-based management emerged over several years and are set out in detail in Tables 1 and 2 in Annex 5.19 The main building blocks are summarized in Table 1.

UNDP invested heavily in the development of new systems relevant to results-based management between 1998 and 2007, a period characterized by a high degree of change.

Chapter 2

RESULTS-BASED MANAGEMENT IN UNDP

Table 1. Key events in the development of results-based management

Year       Key event
1998-1999  Strategic results frameworks piloted, then adopted across all country programmes
2000       First Multi-year Funding Framework (MYFF); first Results-Oriented Annual Reports (ROAR); balanced scorecard introduced across all country offices
2002       Revision of the Results Competency Assessment (RCA); Handbook on M&E for Results; reorganization of practice areas to match the MYFF
2004       ATLAS (enterprise resource planning tool) introduced

17 ‘Annual Report of the Administrator’, DP/1997/16/Add.7, 1997.

18 Ibid, p 36.

19 In the survey conducted for this evaluation, staff identified the following systems to be part of results-based management: MYFF, CPAP results framework, annual work plan targets, ROAR, CO balanced scorecard, outcome evaluations, project evaluations, RCA performance targets and assessment, executive snapshot, Global Staff Survey results, partner survey results, and ATLAS.



Box 2. Managing for results in associated funds and programmes

Within the UNDP group (UNCDF, UNV and UNIFEM), organizations have responded to results-based management in differing ways. The text below reflects interviews held with the funds and a short review of documents on managing for results within their organizations.

UNCDF—A specialized fund within UNDP. For the first time, UNCDF is being incorporated in the same Strategic Plan as UNDP (for the period 2008-2011). In the recent past, UNCDF has developed a strategic view of the Fund’s comparative advantage in the context of the One UN vision. This has confirmed the Fund’s focus on two areas of work, namely Decentralization and Local Development and Microfinance/Inclusive Finance. There is a conscious sense of wanting to be leading reform, not catching up.

The Fund has a policy of trying to harmonize with UNDP and agree upon joint indicators and targets.20 Internal documents and interviews suggest that the Fund is looking at value-for-money questions, such as ‘What are the costs of achieving their targets?’ The Fund appears to be using results information: comparing countries, looking for best practices, tracking new issues that arise, and figuring out where to invest—all examples of a results culture. A proposal to move towards supporting national programmes and sector-wide approaches as an aid modality was circulated in February 200721 and has influenced the formulation of outcome indicators for the Strategic Plan. The paper describes the two focal areas of the programmes in a results framework that has five core results and 15 outcome groups. The outcomes are realistic, specific and measurable. In support of these objectives, the Fund has set out planned intervention maps that show how financial and technical support from the Fund can lead to Programme Purpose (Outcomes).

UNIFEM—The United Nations Fund for Women.22 In 1998, UNIFEM adopted several standardized tools to put results-orientation into practice, including the mandatory requirement of developing logical frameworks for all programmes and the revision of the guidelines for periodic reporting to focus on results. Terminology is not identical to UNDP’s but draws on its definitions and those of others, such as the OECD-DAC and specialist organizations including Save the Children. UNIFEM shares some characteristics with UNDP. UNIFEM’s work in empowering women and promoting women’s human rights means that processes, or ‘how things are done’, can be as important as the final results of a project or programme. In addition, much of the ‘soft assistance’ that UNIFEM provides in terms of advocacy, policy advice/dialogue, and facilitation/brokerage of information, etc. is geared towards creating or consolidating processes that can facilitate women’s empowerment.

An independent review in 2002 identified degrees of ambiguity and/or fragmentation among UNIFEM practices and policies in terms of what, when and how progress should be tracked. This led to simplification of the existing results framework, but the main change came with the 2004-2007 MYFF. The strategic results framework has just four goals and four outcomes (the outcomes cut across all four goals) that are realistic, specific and measurable. These are accompanied by a set of indicators that are to a large extent linked back to the MDGs and identify international sources such as UNAIDS and UNFPA. An evaluation of the programme was underway during fieldwork for this evaluation.

UNV—The United Nations focal point for promoting and harnessing volunteerism for effective development. UNV has possibly been more closely allied to the UNDP approach than the other two organizations. The second MYFF influenced how UNV reports results, as the annual reports to the Board structure discussion around the five goals found in the MYFF.23 UNV has been involved in both reporting against the MYFF and developing the new strategic plan, where its main concern was to ensure that ‘civic engagement’ was included as one of the ‘operational principles for development effectiveness’. One challenge has been that UNV cannot report against significant aspects of its work (i.e. promotion of volunteerism) because the MYFF does not cater for that as an objective.

In 2005 and 2006, UNV developed its own business model and supporting results framework, which have been aligned with the MYFF and were approved by the June 2006 UNDP/UNFPA Executive Board. They were developed in a participatory manner with staff and partners. The UNV results framework is currently designed at the corporate level and needs further work to ensure consistent understanding of the results across staff within the organization and internal coherence. The focus is now shifting to how to operationalize the framework at the business unit and project levels, which will include definition of suitable indicators and strengthening analysis of how UNV contributes to achievement of results. Major technical challenges will include the fact that UNV’s contribution is mostly at the community level, and therefore not ordinarily picked up by national monitoring systems, and the fact that UNV volunteers work within projects administered by others, making it more difficult to identify UNV’s contribution.

20 ‘Proposed Partnership Framework for UNDP and UNCDF’, DP/2007/11; ‘Progress Report on the UNDP-UNCDF Strategic Partnership’, DP/2007/34.

21 UNCDF, ‘Moving to Sector-wide Approach for Decentralization and Local Development and Inclusive Finance—A Strategic Shift for UNDP and UNCDF in a Direct Budget Support Environment’, 2007.

22 The work of UNIFEM is mandated through two international agreements: the Beijing Platform for Action resulting from the Fourth World Conference on Women in 1995, and the Convention on the Elimination of All Forms of Discrimination



Annex 5, Table 1 shows that at least one key initiative relevant to results-based management has been launched across the organization every year. Each of these initiatives has demanded that country offices do at least one of the following:

- Participate in the design of new systems

- Change their business processes to reflect new systems introduced

- Change the information that is reported to corporate level on their performance

2.2 INTENDED EFFECTS ON THE ORGANIZATION

In the absence of a single statement of strategy against which results-based management can be evaluated, the approach taken here is to identify the following themes of reform as a structure for analysis: setting strategic goals, aligning results and resources with those goals, monitoring for results, adjustment and learning, and evaluation and accountability.

2.2.1 SETTING STRATEGIC GOALS

The intended purpose of setting strategic goals has been to allow greater focus of programmes at the country level. This has been consistent during the past 10 years, as reflected in the quotes below from 1997 and 2007:

“To be effective, UNDP cannot attempt to do everything, even within its SHD [sustainable human development] framework. Given the diversity of national situations in programme countries, achieving focus within the framework must be accomplished primarily at the country level. While respecting the need for country-level flexibility, broad parameters and corporate strategic objectives must be established globally to maximise the capabilities, impact and substantive accountability of the organisation as a whole.”24

“The Strategic Plan, 2008-2011, seeks to take results-based management a step further by providing an instrument that: (a) clearly articulates UNDP priorities, objectives, targets and performance indicators; (b) creates a solid basis for internal resource allocation; and (c) sets a stronger platform for comprehensive results management. ... This simplified framework will increase UNDP’s focus, clarify its areas of comparative advantage, and facilitate the measurement and reporting on results. For each focus area, UNDP will spell out its key results areas and outcomes, with a view to further strengthening alignment.”25

These quotes clearly recognize that the focus of programmes emanates from country-level needs, yet the instrument to implement this focus was the global strategic framework. This creates a tension between corporate and country offices, and an effective results-based management system has to balance these conflicting pressures. Annex 6 addresses this issue in more detail under a note on working in a decentralized structure.

The MYFFs, and presumably the new Strategic Plan, are also reported to have a prominent secondary role in communicating with external stakeholders, as a concise explanation of the programme. Intended uses include addressing:

- Questions from the differing Executive Board constituencies over which goals UNDP should focus upon. For example, the fact that the MYFFs and Strategic Plan, to some degree, reflect where the demand for UNDP services is allows the organization to work in areas that some Executive Board members consider sensitive. The clearest example quoted has been the use of the MYFFs in supporting UNDP’s increased focus in the governance area.

Against Women (CEDAW), known as the women’s bill of rights. The spirit of these agreements has been affirmed by the Millennium Declaration and the eight MDGs for 2015, which combat poverty, hunger, disease, illiteracy and gender inequality, and build partnerships for development. In addition, Security Council resolution 1325 on women, peace and security is a crucial reference for UNIFEM’s work in support of women in conflict and post-conflict situations.

23 UNIFEM, ‘How Are We Doing? Tracking UNIFEM Progress in Achieving Results for Management and Learning’, A Briefing Note, 2002; UNIFEM, ‘Multi-year Funding Framework 2004-07’, DP/2004/5.

24 ‘Annual Report of the Administrator’, DP/1997/16/Add.7, 1997, p 30.

25 From ‘Frequently Asked Questions about the Strategic Plan’, internal guidance on UNDP intranet site concerned with development of the new Strategic Plan.

- Concerns within the wider UN family over whether UNDP was moving into areas that would be better served by other organizations within the United Nations.

- Donors’ questions about the ‘value added’ of UNDP and, accordingly, a need to demonstrate results to donors. This was particularly important given the relative decline in UNDP funding during the 1990s and thus the need to demonstrate to donors the value of increasing their commitments to the organization.

2.2.2 ALIGNING RESULTS

Within the broad goals set within the MYFFs and the Strategic Plan, UNDP has embarked on two (until recently) parallel strands of work related to the definition of results by the organization.

The first has been the move at the country programme level from managing project inputs to managing a portfolio of projects and other UNDP support26 to deliver at the outcome level. This move started in 1998 with the piloting of strategic results frameworks, which were structured around delivery of outcomes, in a limited number of country programmes. This tool was then rolled out across all country programmes in early 1999. All those interviewed during this evaluation who were involved in development of strategic results frameworks stated that the intended purpose of the tool was to foster a strategic management approach at the country programme level, in which programme managers would clearly be aware of how they expected UNDP projects and other support to contribute to delivery of the agreed outcome. This intention is also clearly set out in the quote below:

“A further general lesson that emerged was the importance of stressing management over measurement. The fundamental goal of results-based management is to improve development effectiveness, which requires helping managers to better manage. In comparing RBM [results-based management] systems, the distinction is sometimes made between managing by results and managing for results. The former is principally oriented towards accountability and external reporting; the latter focuses on a cycle of planning, periodic performance and organizational learning. In implementing RBM, UNDP made a deliberate decision to emphasise management and learning. … RBM must explicitly aim at changing the way the organization is managed, fostering a strategic orientation and culture of performance. Improved external reporting was approached as very important, but a secondary benefit.”27

Support for a strategic approach at the country level has also come from reforms to the UNDAF process, which have started to work through UNDP country programmes in recent years. The introduction of the UNDP CPD and CPAP, using a format common to the Executive Committee (ExCom) agencies,28 has helped to clarify the alignment of UNDP’s country programme with national policies and harmonization with partner UN organizations. In particular, joint programming for the UNDAF provides a stimulus for UNDP to ensure that its programmes reflect strategic areas from the MYFF and do not conflict with those of other UN organizations.

The second strand of work has been the introduction and use of the balanced scorecard in 2000. This tool was originally introduced for monitoring implementation and the results of the internal management reforms proposed in the Administrator’s Business Plans, 2000-2003, which were intended to drive cultural change within UNDP, as summarized in Table 2.29

26 Advocacy, policy dialogue and institutional strengthening, and field presence.

27 UNDP, ‘Development Effectiveness Report’, 2000, p 23-24.

28 The ExCom agencies are UNDP, UNFPA, UNICEF and WFP.

These changes complement results-based management and support an environment that focuses on results.

2.2.3 ALIGNING FUNDING

The concept of results-based management assumes that resources will follow results—in other words, as results are aligned with goals, resources would be managed to achieve those results. In practice, the extent to which alignment with goals and results from the results-based management system was intended to influence the allocation of financial resources was severely limited. This partly reflects the view of significant constituencies within the Executive Board that funding should primarily reflect needs rather than results. Major sources of funds during the period under evaluation are shown in Table 3.

The sources and uses of funds in UNDP at the country office and corporate levels are complex, with limitations on management flexibility. The main distinction is between core and non-core funding. The targeting and allocation of core resources is managed as a resource supply to programmes and operations. Shortfalls in funding to achieve development results have to be made up by extra-budgetary income and non-core sources.30

Taking core funding first, there are three core programmatic budgets: TRACs 1.1.1, 1.1.2 and 1.1.3.31

Table 2. Prospective cultural changes in UNDP

Today                                   →  Tomorrow
Project driven                          →  Policy driven
Process orientation                     →  Results orientation
Low-level specialized expertise         →  Clear competency profile
Low knowledge-based capacity            →  Innovative and information technology networked capacity
Risk aversion                           →  Risk taking
Introverted, sceptical of partnerships  →  Outward looking, partnerships oriented
Cumbersome decision making              →  Flexible and real-time decision making
Bureaucratic culture                    →  Merit-rewarding and initiative-driven culture
Weak management accountability          →  Responsive leadership management

29 UNDP, ‘The Way Forward. The Administrator’s Business Plans, 2000-2003’, 1999, para 28.

30 ‘Assessment of Programming Arrangements, 2004-2007’, paper presented to the Executive Board, DP/2007/8, available online at http://www.undp.org/execbrd/word/dp07-8.doc.

31 More commonly spoken of as TRACs 1, 2 and 3, for simplicity.



- The TRAC 1.1.1 budget represents the minimum level of resources targeted to be available for an individual programme country during a given financial period. It is calculated in accordance with the Board-approved distribution methodology, using per capita gross national income and population as the primary criteria.

- TRAC 1.1.2 resources are in the first instance earmarked by region. These are subsequently allocated by the regional bureaux on an annual basis between country programmes. In theory, allocation should be on the basis of the quality of the planned UNDP-assisted programmes. TRAC 1.1.2 earmarking for a given region is equal to two-thirds of the total TRAC 1.1.1 earmarking for all countries in that region. The allocation formula for TRAC 1.1.2 assignment for an individual country was initially expressed as a percentage of the country’s TRAC 1.1.1 earmarking, and ranged from 0 to 100 percent (averaging 66.67 percent).32

- The TRAC 1.1.3 facility was established to provide the Administrator with the capacity to respond quickly and flexibly to the needs of countries in special development situations. This budget, which has grown significantly, is mostly used to support work in crisis situations.
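The TRAC earmarking arithmetic described above can be sketched as a short illustrative calculation. The country names, amounts and percentage shares below are hypothetical, and the snippet is a simplification of the Board-approved methodology, not UNDP’s actual allocation formula:

```python
# Illustrative sketch of the TRAC 1.1.2 earmarking arithmetic described
# in the text. All figures and country names are hypothetical.

def regional_trac_112_pool(trac_111_by_country):
    """Regional TRAC 1.1.2 pool = two-thirds of the region's total
    TRAC 1.1.1 earmarkings."""
    return sum(trac_111_by_country.values()) * 2 / 3

# Hypothetical TRAC 1.1.1 earmarkings for one region (USD millions)
trac_111 = {"Country A": 9.0, "Country B": 6.0, "Country C": 3.0}

pool = regional_trac_112_pool(trac_111)  # 12.0

# A country's TRAC 1.1.2 assignment was initially expressed as a
# percentage (0-100) of its own TRAC 1.1.1 earmarking; weighted across
# the region these shares averaged 66.67 percent, so the assignments
# exhaust the regional pool.
share_percent = {"Country A": 100, "Country B": 50, "Country C": 0}
trac_112 = {c: amount * share_percent[c] / 100
            for c, amount in trac_111.items()}

print(pool)      # 12.0
print(trac_112)  # {'Country A': 9.0, 'Country B': 3.0, 'Country C': 0.0}
```

In this sketch the individual assignments (9 + 3 + 0 = 12) sum to the two-thirds pool, consistent with the averages quoted in the text.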

There is no prior prescription on how TRAC 1.1.1 funds are allocated between programmes at the country level, but it is only for TRAC 1.1.2 resources that the possibility of assigning resources against country-level results exists.

32 Temporary changes were made to the TRAC 1.1.2 allocation system through Executive Board decision 2005/26.

Table 3. Percentage of programmatic funding by source

Source of funds                    2000    2006

Core funding
  TRAC 1 & 2                         17      11
  TRAC 3                              1       1
  Other                               3       1
  Subtotal                           21      13

Non-core programmatic funding
  Thematic trust funds                0       1
  Trust funds                        17      24
  Government cost sharing            47      35
  Donor cost sharing                 10      21
  Other                               5       6

Total                               100     100
Total value (USD, million)        1,862   4,049

Source: Compiled by Bureau of Management, UNDP

Non-core resources are either all earmarked, as in the case of the Thematic Trust Funds and donor cost sharing, or allocated at the country level, as in the case of partner government cost sharing.

UNDP’s overall financial framework reflects a complex mix of funding allocations according to: substantive programming components at the country, global and inter-country levels; line-item inputs (i.e. economist lines); and organization unit funding (e.g., the Human Development Report Office and the Office of Development Studies). As such, the financial framework is not directly aligned with the UNDP goals or practices, as defined in the MYFFs.

It is only at the lower levels of budgeting and programme planning, principally at the country programme level and below, that alignment with the MYFF goals and results takes place. However, UNDP does not actively monitor or set targets at the country programme level for the degree to which programmatic resources are aligned around delivery of outcomes or the programme as a whole.

2.2.4 ALIGNING HUMAN RESOURCES

There are three areas in which UNDP could have aligned human resources with delivery of its intended results. These are:

• Structuring within operational units to deliver specific outcomes

• Aligning the staff performance appraisal system, the RCA, with delivery of results

• Aligning the practice areas with delivery of specific goals within the MYFFs

During the period under evaluation, there were no central prescriptions on how the country offices

should be structured, although guidance issued in 2002 suggests organizing programme staff around outcomes. The current approach of the Management Change Team (an internal change-management service) is to help country offices align with the strategic plan and work efficiently with office-wide systems such as ATLAS.

The annual RCA was revised in 2002 to strengthen linkage with delivery of results in a logical sequence:

1. Annual strategic targets are set for the country office based on the strategic results framework and objectives and indicators in the balanced scorecard.

2. Once annual strategic results have been defined, teams or sub-units develop work plans that aim to deliver against the defined strategic targets.

3. Staff members agree upon a number of individual results against which they will be judged. Those most relevant to the results-based management system are:

a. Five key individual results that are supposed to identify the primary contributions of that staff member: resource mobilization, resource delivery, learning, support to policy and output delivery

b. A key result relating to ‘learning and growth’ for individual staff members33

The RCA is intended to focus the work of staff around strategic outcomes but is designed and used on an individual basis and not as an input to team management.

The development of practice-area expertise was originally intended to ensure that UNDP had substantive policy capacity in the thematic areas where demand is greatest. Practice areas aligned with the demands identified in the MYFF 2000-2003 were launched in 2002.34

33 It is important to note that these are self-selected results.
34 Executive Board decision 2003/8.



2.2.5 MONITORING FOR RESULTS

Monitoring is the conduit by which information about results is fed back to management. During the evaluation period, there was a significant expansion in the number of tools that report on performance at the country level. In practice, however, monitoring was directed primarily at an external audience.

“At headquarters, strategic results frameworks assist managers to judge whether the overall results of UNDP assistance worldwide meet the goals, principles and standards set out in the Mission Statement, Business Plans and Executive Board decisions as well as in operational and thematic policies. As such, they are intended to improve UNDP’s substantive accountability to national stakeholders and the Executive Board and, for the first time, lay the

basis for a funding strategy to support approved programmes based on results that are clearly identified and monitored.”35

The first clear directive about monitoring under results-based management was given in 2002. Corporate guidance and prescriptions on M&E systems were significantly revised to enhance their results orientation. These changes included:

• Removal of many of the previous mandatory M&E requirements at the project level, shifting attention to programme outcomes

• Publication of the Handbook on Monitoring and Evaluating for Results (commonly referred to as the Yellow Book)

Table 4 shows the planned shift of emphasis this new policy brought.

Table 4. Key features in implementation versus outcome monitoring36

Elements of implementation monitoring (traditionally used for projects) → elements of outcome monitoring (used for a range of interventions and strategies):

• Description of the problem or situation before the intervention → baseline data to describe the problem or situation before the intervention
• Benchmarks for activities and immediate outputs → indicators for outcomes
• Data collection on inputs, activities and immediate outputs → data collection on outputs and how/whether they contribute towards achievement of outcomes
• (No equivalent) → more focus on perceptions of change among stakeholders and more focus on ‘soft’ assistance
• Systematic reporting on provision of inputs, etc. → systematic reporting with more qualitative and quantitative information on the progress of outcomes
• Directly linked to a discrete intervention (or series of interventions) → done in conjunction with strategic partners
• Designed to provide information on administrative, implementation and management issues as opposed to broader development effectiveness issues → captures information on success or failure of UNDP partnership strategy in achieving desired outcomes

35 UNDP, ‘RBM in UNDP: Overview and General Principles’, 2000, p 7.
36 UNDP, ‘Handbook on Monitoring and Evaluating for Results’, 2002, p 11.



The changes summarized in Table 4 were pre-empted by the ROAR, which was introduced in 2000. The subsequent M&E policy revision in 2002 was a radical but belated attempt to cater to the ROAR as the primary country office reporting tool for results and ameliorate rising concerns about the apparent expansion of reports and systems in country offices. Furthermore, other changes were taking place in the architecture of planning and reporting systems that were driven as much by UN reform as by UNDP’s approach to results. Annex 5 Table 2 brings together the full range of planning and reporting tools currently in use.

2.2.6 ADJUSTMENT AND LEARNING

Within the logic of programme management, there are two main occasions when the country office would take a structured approach to adjustment of the programme. The first is during the development of the CPD, which includes development of the strategic results framework covering the forthcoming programming period; inclusion of the strategic results framework in the new CPD is a direct response to the introduction of results-based management. The second is during the annual review process for the outcomes, which was itself introduced in response to results-based management.

The greatest change in approaches to learning derived from introducing results-based management was the move from M&E mainly at the level of the project to M&E at the level of the outcome. This shift is neatly encapsulated in the following quote from the UNDP’s Guidelines for Outcome Evaluators:

“An outcome evaluation aims to improve understanding of the outcome itself—its status and the factors that influence or contribute to its change. It does not look at the process of inputs, activities and other bureaucratic efforts but shifts

attention to the substantive development results (outputs and outcomes) that they are aimed at affecting. It also provides real-time answers about the outcome rather than waiting until a project is completed and the outputs produced to ask questions. These answers may be part of a ‘questioning continuum’.”37

The institution of the UNDP ‘practices’ enables the organization to provide substantive support to programme countries. The objective is to encourage an internal culture of knowledge sharing and substantive skills development, capitalizing on the experience inherent in its network. As UNDP increasingly oriented itself towards policy advisory services and capacity development, it recognized the need to strengthen its substantive knowledge base in its key practice areas. Key initiatives were to:

• Establish practices in areas of need and enhance staff participation in the practices

• Strengthen and sustain policy and substantive support services

• Increase learning and training

• Upgrade ICT for knowledge management

2.2.7 ACCOUNTABILITY

Results-based management asks managers to focus on the outcomes to be achieved, track the outputs and sequence of outcomes being achieved and, based on a theory of change for the programme, adjust their activities and outputs to maximize the likelihood that the desired outcomes are realized. It recognizes that outcomes by definition are results over which managers do not have control.

The organizational structure of UNDP, with corporate headquarters, regional bureaux and country offices, suggests that management arrangements would reflect that structure in terms of control and accountability. In 1996, the

37 UNDP, ‘Guidelines for Outcome Evaluators’, Monitoring and Evaluation Companion Series, #1, UNDP EvaluationOffice, New York, NY, 2002.



Executive Board accepted the recommendations in the 1997 Annual Report of the Administrator,38

including the shift to an ex-post accountability framework, in which country offices managed programmes, finances, administration and personnel. This was in turn based on the accountability review.39 Regional bureaux were to assume a new role of holistic oversight of country office performance: assembling and maintaining a complete, up-to-date overview of UNDP operations, providing corporate management with consolidated monitoring reports, and monitoring country office compliance with defined indicators of management performance, including delivery, resource mobilization and the Resident Coordinator function.

In 2007, UNDP implemented a new Accountability Framework as an integral part of its Strategic Plan. The Framework addresses accountability at the organizational, manager’s and individual levels. The roll-out of the Accountability Framework provides an opportunity to support a stronger results-based management focus in UNDP by moving accountability beyond process and outputs.

2.2.8 GUIDANCE AND CAPACITY BUILDING FOR RESULTS-BASED MANAGEMENT

A range of guidance has been issued to support implementation of the various components of the results-based management system.

• Initial guidance issued in 2000 included:

– Results-based Management in UNDP: Overview and General Principles

– Results-based Management in UNDP: Technical Note

– Results-based Management in UNDP: Selecting Indicators

• The Handbook on Monitoring and Evaluating for Results (2002)

• Guidelines for Outcome Evaluators (2002)

• UNDP’s Programme Management Guide, which has now been superseded by the new Results Management Guidance (2006). On-line training on results-based management has been available since 2006 but is not yet a mandatory and integral part of the orientation programme for new or incumbent staff.

• Guidance issued on an annual basis on how to enter data into the computerized reporting system for the ROARs. Such guidance often included rules on the definition of outcomes, etc., and tended to reinforce how the system was supposed to operate.

• Access to technical advice from the corporate level through Sub-regional Facilities (SURFs), now called Regional Support Centres (RSCs).

• A course called Managing for Development Results run by the Virtual Development Academy.

The practice has been for UNDP to run workshops to assist in the introduction of new tools. Thereafter, the practice networks support further implementation.

38 ‘Annual Report of the Administrator (1997)’, DP/1997/16/Add.7.
39 ‘Successor Programming Arrangements, Report of the Administrator (1995)’, Annual Session of the Executive Board, 5-16 June, New York, DP/1995/32.



Key points

• The tools and systems for results-based management evolved without a comprehensive design, creating a 10-year period of rolling innovation, redesign and change.

• The primary purpose of strategic objectives has been to help focus the programme, whilst also improving communication with external stakeholders.

• Alignment of programmes to strategic goals was promoted by a shift of results focus from outputs to outcomes. In parallel, the balanced scorecard was a response to the Administrator’s Business Plans to change the culture of the organization and report against a broad range of physical and financial indicators of operational change.

• Very little flexibility was given in core financial resources to manage for results.

• Greater flexibility and closer linkages to results were created through scope to change country office organizational structures, the adoption of the RCA, and development of practice areas to foster thematic skills.

• Developments in reporting were led by the ROAR, but monitoring lagged behind in terms of policy and tools to help monitor progress towards country outcomes.

• Adjustment of programmes was expected to take place mainly through annual and country programme cycles. No specific provisions were made for more frequent interventions.

• Links to learning were supposed to occur through the shift from project to programme outcome evaluations and interactions with the practice areas and networks.

• An accountability framework is under development with provisions for individual, managerial and organizational accountability.


Chapter 3

FINDINGS FROM THE EVALUATION

3.1 RESULTS ORIENTATION

3.1.1 EVOLUTION OF STRATEGIC GOALS AND EFFECTS ON THE ORGANIZATION

The goals of an organization define the framework within which programmes and organizational structure are developed. They also provide a reference point for stakeholders and other parties with which the organization has a relationship. This has been a particularly important issue for UNDP, where statements about the use of goals have been intended to focus the programme.

Table 5 lays out the evolution of the goal statements of UNDP between the first MYFF and the forthcoming Strategic Plan. Aside from goals 6 and 7, which relate to internal management within UNDP and the United Nations in the first MYFF, the substantive development goals have remained almost the same over the last three strategic documents.

Governance (defined as an enabling environment for sustainable human development under the first MYFF), poverty reduction and the environment are constant across all three documents and periods, with only the language changing to reflect the advent of the MDGs. Gender was dropped as an explicit single goal in the second MYFF and instead became one of six ‘drivers of development effectiveness’.

The specific goal on HIV/AIDS found in the second MYFF will be dropped in the new Strategic Plan, but as noted under the Strategic Plan’s Frequently Asked Questions:

“As a co-sponsor of UNAIDS, UNDP will continue to be part of the worldwide effort to respond to the HIV/AIDS epidemic. Based on the established division of labour, UNDP is designated as the lead organization on behalf of the UN system for addressing ‘AIDS as it relates to development, governance, mainstreaming, legislation, human rights and gender.’ UNDP’s work in executing these responsibilities is now reflected within the Democratic Governance and Poverty Reduction and MDG development focus areas.”40

With the exception of the statement to achieve the MDGs, all the goals represent processes or areas of activity, without inherent targets or predefined indicators of performance. This makes progress towards them hard to evaluate.

The statements of goals are supported by areas of work at a lower level, called strategic areas in MYFF 1, service lines in MYFF 2, and key result areas in the Strategic Plan. Annex 5 Table 3 presents a comparison across these areas of work. This analysis shows that although the wording has changed and some components of MYFF 1, such as area-based and downstream work, have been dropped, there has been little substantive change to the scope of activities. One major innovation associated with the new draft Strategic Plan is the proposal to build corporate outcome indicators into the results framework, but at the time of writing this has yet to be approved by the Executive Board.

Regarding the use of the strategic frameworks as a communication tool, senior managers in UNDP and partner organizations believe that


40 Available online at http://intra.undp.org/results/strategic_plan_faq.shtml.



the frameworks have had a positive effect on the presentation of the programme to stakeholders and third parties.

3.1.2 ALIGNMENT WITH STRATEGIC GOALS AT THE COUNTRY PROGRAMME LEVEL

It is clear that the development of strategic frameworks was intended to help focus the programme, but the extent to which focus meant substantive change, or improved specification, is less clear. Increasing focus implies reducing the range of activities being supported. This raises important questions about how UNDP interacts with government (as the owner of the planned results) and development partners. One purpose behind the drive for focus was to enable UNDP to foster areas of competence and avoid overlap with other UN partners. Few changes were

expected from the first MYFF, as that was largely drawn up around current programmes and designed to make the presentation of the portfolio more coherent and logical. There was more scope with MYFF 2.

Analysis of the five case-study countries reveals some signs of changing programmes, but a disparity exists between the perceptions of staff and management and the structures of the country portfolios. According to a recent survey of country offices, the large majority of Resident Coordinators and Deputy Resident Representatives believe the practices and service lines introduced in 2002 have benefited their work by providing greater focus, improving knowledge management and focusing on results. This increased focus has facilitated positioning and advocacy at the

Table 5. Key strategy documents and evolution of UNDP goals

MYFF 2000-2003
1. To create an enabling environment for sustainable human development
2. To eradicate extreme poverty and reduce substantially overall poverty [WSSD Commitment 2]
3. To protect and regenerate the global environment and natural resources asset base for sustainable human development
4. To achieve gender equality and advance the status of women, especially through their own empowerment
5. To prevent or reduce the incidence of complex emergencies and natural, environmental, technological and other human-induced disasters, and to accelerate the process of sustainable recovery
6. To provide effective UNDP support to the United Nations Agenda for Development
7. To achieve excellence in the management of UNDP operations

MYFF 2004-2007
1. Achieving the MDGs and reducing human poverty
2. Fostering democratic governance
3. Energy and environment for sustainable development
4. Crisis prevention and recovery
5. Responding to HIV/AIDS

Strategic Plan 2008-2011
1. Poverty reduction and the MDGs
2. Democratic governance
3. Crisis prevention and recovery
4. Environment and sustainable development



national level, especially by identifying areas of legitimate involvement by UNDP.41 This view was echoed in the countries visited.

The main evidence in support of increased focus is the steady reduction in the number of outcomes, especially since MYFF 2. For example, the Indonesia programme has reduced outcomes from 28 to 5. But much of the change was driven by structural guidance from headquarters limiting country programme outcomes to 10 or fewer under MYFF 2. In three of the five countries, close inspection of the portfolio and discussions with staff revealed that the underlying areas of support have remained largely unchanged. The exceptions are Zambia, where there has been a clear withdrawal from agriculture, rural development and food security, and employment generation and sustainable livelihoods; and Argentina, where the programme saw a dramatic change of orientation following the economic crisis of 2001-2002.

All of the case-study countries characterized this process as a remapping of projects against the evolving strategic frameworks, rather than substantive change. This process has become increasingly elaborate as the language of objectives has evolved and the framework of results has become steadily more complex.42 For example, in Egypt, the programme in 1993 was configured under two areas of concentration and nine themes. By 2007, the nine UNDP Egypt outcomes were mapped into a framework of five UNDAF outcomes, four MYFF goals, nine service lines and nine core results. It is only with the advent of CPDs and the annual CPAP that a practical tool has emerged that enables UNDP and national partners to agree on the focus of

results in a way that is coherent with the wider span of UN support at the country level. As more countries develop second-generation UNDAFs, the clarity that the UNDAF/CPD/CPAP structure brings has the potential to have a strong positive effect on both UNDP’s drive for focus and national ownership of results.

In the five case-study countries, changes to the composition of the portfolio of projects are more likely to arise from other pressures, such as: carry-over of projects from previous planning periods, the need and opportunity to mobilize resources (examined in the next section), and relationships with other development partners.43 Most importantly, country office staff have observed that outcomes are broad permissive statements that act as umbrellas in the planning process rather than objectives that drive the choice of projects. A 2007 Country Programme Outcome Evaluation for Egypt concluded that greater focus was needed within the outcome areas, a point echoed in the case study for Zambia. There is clear evidence that countries have responded to strategic guidance by removing outlier projects. For example, in Egypt, projects in support of science, space technology and infrastructure have been terminated.

The extent to which corporate strategic plans have focused the portfolio was examined by the Operations Support Group (OSG) in 2006. That analysis concluded that 87.5 percent of a random sample of projects were fully or substantially aligned to the 2004-2007 MYFF. The 12.5 percent unaligned actually accounted for 35.7 percent of expenditure (a finding that suggests most unaligned projects had substantial infrastructure and procurement components).44 However, this

41 UNDP, ‘UNDP Management Review Phase II – Report of the Management Review Team’, Internal UNDP Report, 2007, para 73.

42 Terminology has, at various times, included some or all of: goals, sub-goals, key result areas, outcomes, practice areas, service lines, and core results.

43 Examples were given in Indonesia and Zambia of circumstances where the World Bank assumed the lead from UNDP in post-tsunami reconstruction and decentralization due to a stronger presence on the ground and larger resources. Other instances include rationalization of projects that are within the mandate of other UN agencies, such as support to counter female genital mutilation in Egypt.

44 ‘Alignment Analysis’, OSG/ExO (AB), 30 Oct 2006.



analysis raises a question about the quality of outcomes. The statements of goals in Table 5 and the areas of work (service lines, etc.) in Annex 5 Table 3 reveal that the wording tends towards phrases that describe activities to be carried out, rather than measurable objectives. They are broad, permissive statements within which a wide range of project activities can be clustered (see Boxes 3 and 4).

Concern over the quality of outcomes has been noted by the Evaluation Office as part of a review

of M&E practices. It found, inter alia, that most results orientation was predominantly at the project output level and that trying to use outcomes as the focus of evaluation raises difficulties in linking the status of outcomes to interventions and questions the relevance of current approaches to strategic and operational planning in country offices.45 The study was able to cite some instances of good practice at the country level, and similar examples were also found in the case-study countries.46

Box 3. Broad, permissive outcome statement

The Indonesia country office has taken the UNDAF sub-outcomes as its overarching framework. The sub-outcomes are broad and cover the full spectrum of UNDP’s previous activities, with no influence at all on UNDP’s portfolio. For example, sub-outcome 1.4 reads: “By 2010, increased opportunities for achieving sustainable livelihoods in the poorest provinces of Indonesia through development and implementation of appropriate participatory policies and programmes.” The implication is that, under this outcome, anything to do with policy or a programme that increases opportunities for achieving sustainable development can fit. (Indonesia case study)

Box 4. Making outcomes evaluable

An example of training material from the European Commission, which uses a similar definition of outcomes as UNDP, illustrates how careful specification can help planners develop more measurable outcomes.

UNDP definition
Outcomes are actual or intended changes in development conditions that UNDP interventions are seeking to support. They describe a change in development conditions between the completion of outputs and the achievement of impact. For example:

• Improved national capacity to monitor human and income poverty and inequality
• Increased access of the poor to finance (formal, informal, micro)
• Reduction in the level of domestic violence against women

European Commission training guidance
The outcome is the effect that project outputs will have on the beneficiary, institution or system. It defines the project’s success. Test the outcome by asking whether it is a realistic statement, specific to project outputs, and measurable at reasonable cost. For example:

• Improved performance in the national statistics office for timely production of human and income poverty statistics of verifiable quality to international standards
• Increased utilization of formal, informal and micro finance by poor households in (designated) area or for (designated) purpose
• Reduction in the (reported or estimated) level of domestic violence against women in designated areas

45 UNDP, ‘An Assessment of UNDP Practice at Monitoring and Evaluation at the Country Level’, UNDP Evaluation Office, New York, NY, February 2005, para 4.19, 4.21 and Summary.

46 See, for example, Table 7 in the Egypt Case Study, which provides an illustration of improving outcomes and indicators, UNDP Evaluation Office Internal Report.



The five study countries present a wide spectrum of funding status.47 At one extreme, Argentina and Egypt have very low levels of core funding (less than 1 percent and less than 4 percent respectively) and rely on mobilizing funds from donors and the government for virtually all of their programme. The proportion of core funds was higher in Indonesia (13 percent), Moldova (38 percent) and Zambia (65 percent). However, evidence from the five countries was consistent that resource mobilization and delivery were the major internal criteria used to assess overall performance. A key issue therefore is the strategies to use core funds and to mobilize funds in support of outcomes. Table 6 summarizes the explanations given by the five countries for how core funding is managed. None of the countries incorporates results in its decision process.

Mobilization of non-core funds creates a potential conflict for country programme management. Many countries are dependent on fund mobilization to generate a ‘critical mass’ of programme activities and staff their office. As a result, mobilization tends to be a driving force behind the work of programme staff.48

Examples exist, such as in the Indonesia office, where there has been success in matching non-core funds to outcome objectives. But more

often, acquisition of resources is opportunistic and reflects the availability of funds from donors and the government rather than being driven by programme outcome objectives.

The 2007 Country Programme Evaluation of Egypt concludes:

“The current partnership strategy is strongly driven by resource mobilization ends and not the other way around. Diversification of UNDP positioning strategy can hurt the total size of the profile in terms of resources, but is likely to deepen impact of UNDP interventions at the policy level towards achieving the MDGs especially forging alliances around difficult but common platforms in governance areas. There is the temptation to invest in areas that carry prospects of non-core resources, resulting in UNDP spreading itself too thin at the expense of focusing on its strategic strengths and expertise. Projects in the pipeline need to be re-evaluated with this caveat in mind.”49

This same issue was raised by the Evaluation Office in 2006:

“Conflict of interest and confusion of roles may arise where UNDP seeks to combine the roles of

Table 6. Core fund allocation strategy

Strategies for core funds reported by the five case-study countries (Argentina, Egypt, Indonesia, Moldova, Zambia):

• Use as seed money (one country)
• Allocate according to needs (two countries)
• Allocate according to fund-raising potential (one country)
• No formal strategy (one country)

47 Funding analysis reflects experience over a variety of years, as available at the CO: Argentina 2005-2008; Egypt 2002-2006; Indonesia 2001-2005; Moldova 2002-2006; and Zambia 2004-2006.

48 55 percent of staff agreed with the statement “Because most of our funds are raised through cost sharing or from donors, we have little scope in allocating resources across our programme or within outcome areas according to results;” 62 percent agreed with the statement “In my office, country programme staff are under more pressure to raise resources and ensure timely delivery than on enhancing the contribution by UNDP to achievement of the outcomes.”

49 Afaf Abu-Hasabo et al., ‘Evaluation of Egypt Country Programme (2002-2006)’, UNDP Egypt, March 2007.


C H A P T E R 3 . F I N D I N G S F R O M T H E E V A L U A T I O N2 6

policy coordinator, donor, rights advocate,neutral broker and project implementer. Inparticular, areas where UNDP has a comparativeadvantage but is unlikely to mobilize externalresources—such as donor coordination, neutralbroker activities and advocacy for human rights—can be crowded out by activities for which cost-sharing assistance is more readily available andgovernment consent easier to secure.”50

Reforms to human resources management over the period were driven by the Administrator’s Business Plans and the aim to reorient UNDP upstream to policy engagement rather than processing of projects.

The experiences in the five countries share some common threads, but are largely different. Changes in Argentina were linked to the expansion in the country programme. These changes did enable some realignment of staff to substantive programme areas but were not linked to results-based management as such. The organizational structure of the Egypt Country Office was changed by each of the previous two Resident Coordinators and is currently being reassessed by the new incumbent. Unique among the five study countries, Egypt adopted a flat structure in 2002 with responsibilities for projects allocated to individuals without a structure based around programmes or outputs. Various systems were put in place to manage for outcomes, but these have not been maintained and staff feel this has contributed to a lack of strategic direction during the transition to the present management team.

Three of the five country offices underwent reprofiling exercises with varying degrees of reorganization between operations and programming staff, including the development of new job descriptions and the need for new skills. Indonesia also adopted a flat structure but with units grouped around programme areas and staffed according to programme size. Zambia reported missing an opportunity during reprofiling, with existing staff being fitted into new job descriptions rather than new skills being found or developed. Moldova underwent reprofiling and has continued with further reform, culminating in a change process and restructuring in 2006. This is the nearest example of a results orientation, with the drivers being:

- Aligning staff around delivery against key UNDAF outcome areas relevant to UNDP

- Addressing the burdens imposed by the rapid increase in programme resourcing

- Strengthening UNDP’s resource mobilization capacity51

The RCA has been adopted in all offices with varying degrees of success. It is generally seen as strengthening the potential for accountability through the link from staff member to supervisor to senior manager. RCAs do not yet include measures of whether staff are managing for outcomes. For this reason, the Zambia office has found the RCA to be a more powerful tool for operations staff, whose procedures foster stronger accountability and feedback, than for programme staff, where the focus should be on managing for outcomes.

Specific criticisms were raised about three aspects of the RCA, both in the country offices and at headquarters:

- First, although staff performance assessment is rated on a five-point scale, in practice only three points are used, as the lower scores of four or five trigger management action, which all parties seek to avoid. This results in clustering of staff and poor differentiation in assessment.

- Second, there is no difference in the financial reward for good performance between grade three and the highest grade of one, which reduces incentive for higher performance.

- Third, the targets set for performance assessment are self-selected and, in an estimated 70 percent of cases, are agreed retrospectively during the year of assessment.

50 ‘Annual Report of the Administrator on Evaluation in 2005’, DP/2006/27, para 67.

51 UNDP, ‘Moldova Case Study Report’, UNDP Evaluation Office, Internal Report, para. 4.33.


3.2 MANAGING FOR RESULTS

3.2.1 MONITORING AT THE COUNTRY PROGRAMME LEVEL

Monitoring at the country level presents a mixed picture. To a large extent, the expectations set out in Table 4 of Annex 5 have not been achieved, but the five case-study countries reveal some local initiatives as country managers try to grapple with the challenge.

Table 4 in Annex 5 brings together responses from the case-study countries about progress towards outcome monitoring. The findings show that progress is uneven, but there are consistent signs of greater use of baseline data and setting indicators for outcomes. There is also evidence of progress towards more collaborative work with partners and greater use of national surveys and tools, such as the preparation of the National Human Development Report (NHDR). The weakest areas are the intermediate steps that demonstrate how UNDP-supported projects are delivering contributions towards outcome objectives. This type of analysis will not be derived from national statistics and requires both a clear causal pathway against which progress can be charted and data collection that is specific to project activities. Similarly, the country visits reveal little progress towards data collection about perceptions of change among stakeholders and systematic qualitative information.

The shift to outcome monitoring and the 2002 monitoring policy have contributed to a growth of local initiatives as managers look for ways to improve their control of the project cycle. These initiatives are particularly strong in those countries that have invested in dedicated M&E staff or M&E units: Argentina, Egypt and Indonesia are examples in the study sample.52

- In Argentina, there has been a focus on developing tools for the whole project cycle, with particular attention to monitoring and risk assessment. A new ex-ante evaluation procedure has been introduced, with procedures such as review by the Local Project Approval Committee.53 Some of these procedures are being taken up by the Regional Bureau for Latin America and the Caribbean (RBLAC) as part of regional guidelines.

- The Egypt country office has retained the long-standing quarterly and annual project reports, joint annual tripartite reviews and project self-evaluations that were made non-obligatory in the 2002 policy. Standard approaches to M&E have been introduced in project documents and the office reports M&E becoming stronger—borne out by a review of how outcomes and indicators for programmes have improved over the period. The office still finds it hard to link project monitoring to programme outcomes for the ROAR.

52 Currently, 25 country offices have M&E specialists and 10 have a dedicated M&E Unit.

53 The ex-ante evaluation procedure is referred to elsewhere in this report as ‘appraisal’, following normal project cycle convention.

Key points

- UNDP’s goals in the strategic frameworks have changed in presentation but the underlying areas of work have remained almost the same.
- The focus areas under the goals have been rationalized and simplified, but it is hard to identify substantive change to the scope of activities at country level.
- Managers and staff in country offices believe the MYFFs have helped to bring focus and improved positioning and advocacy. They have helped remove outliers on country portfolios, but have otherwise had little effect. Projects have just been mapped to the new frameworks.
- Outcome statements for country programmes tend to be broad and permissive rather than providing a test for how well a project contributes.
- Mobilization of resources is, in practice, a more powerful driver of individual performance among programme staff than achievement of results.
- There is some evidence that country offices organize their staff for delivery of results.
- The RCA is welcomed as a tool but in its current modality does not support managing for results.

- Indonesia has developed a number of tools, most prominent of which is Touchstone, locally developed project and programme management software. A database brings together information to demonstrate the way in which individual projects contribute to programme outcomes. (See Box 5 for a range of instruments in use.)

- Less progress has been made towards monitoring in the Moldova programme. The national setting was formerly not very conducive—national planning only started in 2002 with the uptake of the Medium Term Expenditure Framework, and few government systems are in operation. The country office monitors projects through project reports and annual programme reviews, though stakeholders report it is difficult to get government counterparts to devote the time necessary for in-depth discussion of issues.

- Zambia appears to have made most progress towards working through government systems and with the United Nations and other development partners. This is a reflection of the country having a large budget support programme for aid. Harmonized working was fostered by a joint assistance strategy aligned with the Fifth National Development Plan (which has an M&E component). UNDP’s programme document is directly aligned to the plan as well. The programme works through sector advisory groups and participates in joint programme annual reviews.

A common complaint is the absence of tools to relate progress with projects to contributions to programme outcomes. This disconnect contributes to the finding that more progress has been made with reporting than in the use of reports. One consistent message from the country case studies is that the ROAR is primarily a reporting tool to headquarters, with little operational value at country level. The shift from projects to the ROAR undermined the evidence about results in UNDP projects and failed to provide new tools to track progress from projects towards outcomes. Staff were questioned about the ROAR in the staff survey on results-based management. This produced the counter-intuitive finding that 62 percent of staff considered the ROAR to be an effective outcome monitoring tool, but among managers (Deputy Resident Representatives and above), 59 percent think it is not effective—a finding more in line with observations in the case-study countries.

Box 5. Monitoring and evaluation system components

The Indonesia Country Office’s internal M&E functions are supported by the appointment of M&E focal points in each programme unit who interact with the Planning, Monitoring and Evaluation Unit in bi-weekly coordination meetings.

In addition to the standard corporate tools such as the MYFF, the ROAR and outcome evaluations, a range of other instruments for monitoring are operational. These include:

- Touchstone (project management guide)
- Project Data Base System (has the format of the ROAR; tracks results quarterly)
- A five-year M&E Plan
- Unit/programme workplans
- Unit-level M&E Guides (e.g., Crisis Prevention and Recovery Unit and Environment)
- Quarterly review meetings with the government
- Bi-monthly meetings with the State Secretariat, the National Development Planning Agency and the Ministry for Foreign Affairs
- Joint field visits and joint monitoring of selected projects with government counterparts
- Management team meetings
- Unit head team meetings
- Bi-weekly meetings with M&E focal points
- Annual Results Reports
- Reporting on progress towards the achievement of MYFF and balanced scorecard targets

Another channel of performance reporting from countries is the Resident Coordinator’s annual report.54 This is described in the current guidelines as “an essential element of the accountability and results framework for UN operational activities.” The report follows a structured format with some narrative and tabular data. It brings together information about performance during the past year in the following format: an introductory letter highlighting progress in UN reform, major results achieved, good practices and lessons learned; a results and use of funds table; a UN Country Team (UNCT) workplan matrix; and good and bad coordination practices and procedures. A sample of these reports was examined from each of the case-study countries and an example of reporting against one UNDAF outcome is reproduced in Table 5 of Annex 5. The report covers key issues of performance, but the information is limited to observations about delivery of outputs, with little or no discussion of contribution to outcomes. The nature of the presentation differs between countries, and reveals the varied quality of outcomes and indicators already noted in this report. The example in Annex 5 illustrates how difficult it is for a reader to track what progress is being made and to understand how the UNCT will contribute to development outcomes.

3.2.2 ADJUSTMENT OF RESOURCES

Adjusting work in response to results is the cornerstone of an effective results-based management system. This study has failed to find any convincing evidence that results are influencing management. All country offices recognized that adjustments are made to projects from time to time, but there were no obvious examples of information about results guiding decisions about finance or human resources, especially not at the level of programme outcomes. Some practical examples were noted in Moldova:

- There has been a balance between support at the central and local government levels. The balance between these two streams of work has been a continued focus of the programme, affected by UNDP’s assessment of the prospects of creating sustainable capacity at the central level.

- In its strengthening of the government’s poverty-monitoring capacity, UNDP initially gap-filled this function by directly contracting and managing consultants. UNDP realized that this was unsustainable and used the 2005 outcome evaluation to help convince the government to start creating its own capacity in this area. UNDP supported this process by building the capacity of the new staff.

- The Moldova programme is closing its project supporting strengthening of the Parliament based on evidence that support is unlikely to be effective, since there is insufficient support from the partner.

- In its project supporting re-integration of victims of human trafficking, monitoring highlighted that sustained impact depended upon helping women find jobs. Therefore, the subsequent phase of the project has focused upon this challenge.

54 Copies dating back to 2002 can be found via the UNDP internal website at: http://www.undg.org/index.cfm?P=490.


These examples illustrate practical responses to information from a range of sources. The formal results-based management tools may or may not have been prominent, but managers have to make decisions with whatever information is on hand—in this case, a mixture of monitoring and periodic evaluations.

3.2.3 RELATIONS WITH THE REGIONAL BUREAUX AND SUBSTANTIVE CORPORATE OVERSIGHT

Despite the intention in the mid-1990s for regional bureaux to undertake holistic oversight of country office performance, the management relationship between country offices and headquarters has remained ill-defined.55

Most significantly, the regional bureaux have not been considered accountable for the development effectiveness of country operations in their region. One of the reasons for this is the intention to decentralize management and accountability to the country level and the limited number of entry points at which regional bureaux can intervene in country programmes. Projects are planned and approved within country offices, so one of the main entry points for the regional bureaux is the CPD approval process, which occurs every four years. Even this is of limited scope as the CPD is, on paper at least, a government programme presented to the Executive Board by UNDP.

In practice, the regions have approached their role of substantive oversight in a variety of ways, some more imaginative than others.56 Interaction was more likely to occur around operational issues, and good performance was traditionally seen as resource mobilization and delivery. Indonesia found the interaction with the Regional Bureau for Asia and the Pacific (RBAP) over the development of the 2006 CPD to be ‘timely and conscientious’, although most interaction was about process and procedural compliance rather than substantive content.

All the regions have had to adjust their relationships to accommodate the new monitoring tools: ATLAS, the balanced scorecard, the Global Staff Survey, the RCA and the Dashboard have changed the information environment for regional directors. These tools deal predominantly with financial and process information, and mostly at the level of projects.

The Moldova and Regional Bureau for Europe and the CIS (RBEC) experience is interesting. In response to ambiguity over the precise meaning of oversight, RBEC has developed its own tool to maintain oversight of country-level operations and effectiveness. This is the Strategic Note, which was introduced in 2002. The Note is a six-page document, drafted annually, that briefly sets out progress on resource and programme issues in the past year and major actions in the coming year. More importantly, the Strategic Note includes mutually agreed targets on what the country office will deliver in the coming year and what has been delivered against the targets agreed for the past year. Targets set for Moldova between 2004 and 2007 are in Table 7 and show that the dominant focus of oversight is on resource mobilization and delivery.

The Moldova Resident Representative reports that there has been no substantive discussion between the Moldova country office and RBEC over development results. Instead, discussion focuses on resource mobilization and delivery. In common with annual targets set in the balanced scorecard and ROAR, targets need only be agreed upon in April-June, four to six months into the planning year, which considerably weakens their value as tools that drive intended behaviour.

The Regional Bureau for Latin America and the Caribbean (RBLAC) and the Regional Bureau for Africa (RBA) have developed a Composite Performance Ranking index to track country performance. RBA argues that this has to some extent informed allocation of TRAC 2 funds. However, there is a sense of information overload, as much of the data in ATLAS and other tools is project-based and hard to interpret without supporting qualitative explanation.

55 UNDP, ‘Annual Report of the Administrator and Related Matters’, 1997, DP/1997/16/Add.7. Addendum: Change Management: UNDP 2001.

56 A definition of oversight has now been given in the 2006 Results Management Guide.

The tools provide a strong quantitative basis for oversight. Less strong is the management interface for understanding qualitative issues. Regional bureaux typically have a desk officer or programme officer responsible for the day-to-day relationship with each country office. Sometimes these relationships work well, but there are a number of weaknesses:

- Staffing constraints mean that, in some regions, each desk officer has to deal with as many as eight countries.

- Travel budgets restrict country visits to, at most, two a year.

- Desk officers are typically employed at grades P3 or P4. Resident Representatives will usually be D1, and in the strongly hierarchical culture of UNDP, that grade disparity makes it difficult for the desk officer to raise critical questions. This results in issues being moved up to the level of regional director or deputy director.57

- Desk officers do not always have the technical skills to intervene in substantive programming issues.

57 The absence of a deputy regional director in RBA is believed to have undermined the oversight role in the recent past and led directly to RCA target setting for RRs and DRRs not being completed for 2007.

Table 7. Targets in the Moldova Strategic Notes (2004-2007)

2004
- Delivery: 100% of USD 2.66 million (of which TRAC is USD 1.48 million)
- Resources mobilized: USD 1.48 million

2005
- Core delivery: 100% of USD 1.77 million
- Non-core delivery: USD 4.2 million
- CCA and UNDAF finalized
- A national monitoring system in place, with DevInfo at its heart, helping to track progress towards Poverty Reduction Strategy Papers and MDG targets
- One outcome evaluation/best practices ‘knowledge management product’ on local development

2006
- Non-core delivery: USD 3.75 million
- Resources mobilized: USD 6.0 million
- One outcome evaluation (Human Rights & Access to Justice)
- CPD approved by the Executive Board
- NHDR on quality of economic growth for human development

2007
- Non-core delivery: USD 5.0 million
- Resources mobilized: USD 8.0 million
- One programme evaluation conducted (Joint Programme Support to Strategic Policies)
- One study on non-governmental organizations published
- One joint programme launched (Disaggregated Statistical Data)

Tools such as ATLAS are changing the relationship between regional bureaux and country offices. As project management information expands in content and coverage, the imbalance of greater attention to finance and process rather than results may be reduced. However, much of the progress reporting in ATLAS and the ROAR is still self-assessment and focused on project performance. Validity only gets checked at the country level once every four years when an expanded audit is carried out.58 The management response to the Office of Audit and Performance Review for 2004 for UNDP Zambia shows that the expanded audit by the Office is a useful process for course correction, but some of the recommendations may have been overtaken by events. Hence the frequency of such ex-post audits, though appropriate from a cost perspective, may not be appropriate to foster strategic managing for results (see Box 6).

Box 6. Office of Audit and Performance Review

The Office of Audit and Performance Review works on a decentralized basis through about 45 staff in total. Some 35 country office audits were completed in 2005-2006. Their extended or full-scope audits deal with good management, including results-based management. Staff have not received any specific training in results-based management per se. The results-based management Guide is used as their audit criteria.

The approach adopted is to look at outcome coherence with the UNDAF, outcome specification and relation to outputs, targets, and measurement approaches. They make a compliance check to see if the annual review and update of projects has been done and followed up on.

MYFF reports (the ROAR) are examined and their quality, consistency with previous statements and expectations, and the evidence behind statements made are assessed.

In the Office of Audit and Performance Review’s judgement, results planning and measurement are often of poor quality: output/outcome distinctions are not clear, results are not measurable, and indicators are poor. They believe there is a need to have a procedure for outcome verification.

Key points

- Some progress has been made towards outcome monitoring, and country offices have shown interesting initiatives, but there is little explanation of how projects are contributing to programme outcomes.
- The absence of firm procedures has led to creative diversity in M&E, especially where the country office has a dedicated staff member.
- The ROAR is primarily an upward reporting tool with little utility in the country office or for regional management.
- There is no clear and convincing evidence that results are being used systematically to inform adjustments to the country portfolios.
- Strong and effective decentralization has been accompanied by an ill-defined role of oversight for the regional bureaux.
- Diverse approaches at the regional level, often drawing on the new information tools, have been implemented for interaction with country programmes. The quantitative information in ATLAS and other tools is predominantly project-focused and not matched by qualitative information that explains performance.
- Desk officers in the regional bureaux are an important link in understanding country performance but are overworked, under-resourced, junior in status and often lack technical skills.
- The dominance of self-reporting is not adequately balanced by the current system of periodic expanded audits.

58 Although the Office of Audit and Performance Review carries out full-scope audits every year, a different sample of countries is selected each year. This works out to a frequency of one full-scope audit per country every four years, on average.

3.3 EVALUATION, LEARNING AND ACCOUNTABILITY

3.3.1 ROLE AND USE OF EVALUATION

In 2002, UNDP’s formal project-level evaluation requirements were dropped and outcome-level evaluations were introduced. UNDP’s new Evaluation Policy of 2006 now makes it mandatory that country offices fund and carry out outcome evaluations and that there be a follow-up management response for all evaluations.

Among the five case-study countries, outcome evaluations have been carried out in all except Argentina, where thematic ‘cluster’ evaluations have been conducted. Zambia has retained aspects of the old evaluation practice by conducting mid-term and terminal evaluations of projects, but in the other countries, these only take place at the instigation of a development partner. Egypt was the only country to have undertaken a country programme evaluation.

Resources for evaluations are one of the issues that have arisen. Underfunding has led to late implementation. In the case of Egypt, findings were delivered too late to influence the CPD process. In Moldova, no allowance was made for the lack of project-level evaluation as a data resource; as a result, the duration of consultant time was too short to gather evidence. Poor quality of consultants was highlighted in several countries and related to budget constraints.

None of the countries had clear evidence that evaluation findings had influenced programmes.

In Egypt, concerns were raised over unclear accountability for the management response and follow-up to evaluations. The closest example of a positive management response was found in Zambia, for an audit. The Office of Audit and Performance Review Full Scope Audit of 2004 (report released in 2006) contains a matrix stating management’s response.59 The response shows concrete suggestions or progress already made by the country office towards adjusting operations on the basis of the recommendations in the audit.60 The matrix also indicates the responsible manager, the implementation date, level of priority and the cause.

The findings at the country level show that country offices are adapting to the new evaluation policy. In some respects, it is too early to pass judgement on issues of accountability and management response to evaluation findings. But viewed within the wider context of the move away from project evaluation and the overall focus of the Evaluation Office at headquarters, the evidence suggests a decline in the quality and depth of the evaluation base to support managing for results.

The UNDP Evaluation Office was peer reviewed under the auspices of the OECD-DAC Network on Development Evaluation in 2005.61 The findings were generally positive and noted the difficulties of evaluation as UNDP moves into softer areas of policy advice. However, the review highlighted some critical issues:

- Evaluability of programmes has been severely and consistently constrained by the performance within UNDP’s results-based management systems (para 92 et seq)

- Follow-up systems to evaluations were found to be weak (para 144)

- The relative emphasis has been greater on learning as opposed to accountability (page 32)

59 Office of Audit and Performance Review, ‘Full Scope Audit Report: UNDP Country Office in Zambia’, 2006, p 30.

60 The response section also allows management to explain/defend progress made vis-à-vis challenges encountered.

61 OECD-DAC, ‘Peer Review: UNDP Evaluation Office’, December 2005. Conducted as part of Peer Assessment of Evaluation in Multilateral Organizations by OECD-DAC.


Recommendations made to strengthen the role of evaluation as a ‘reality check’ for the programme and to strengthen evaluation against intended results have been reflected in the Evaluation Policy endorsed by the Executive Board in June 2006. Specific actions include the commitment to issue management responses for all evaluations; the Evaluation Office’s maintenance of a system to track management responses and to report to senior management cases where there are concerns over implementation of the management commitments; and adoption of a policy of disclosure for all evaluations.

3.3.2 LEARNING AND TECHNICAL SUPPORT

Learning and technical support are supposed to be provided through interaction between country office staff and the practice forums and networks. The practice forums are not only for communication but also have a wider range of organizational functions, including knowledge management, advocacy, partnership building, learning and professional development, and providing programme countries access to substantive resources (such as rosters of experts and co-financing possibilities).

Together with the knowledge networks, the SURFs and RSCs were planned to help transform UNDP into a knowledge-based organization. They provide technical and policy advice, referrals, comparative experiences and issues-based applied research from specialists with in-depth, multi-disciplinary knowledge of the regions in which they operate.

Findings from the country offices highlight difficulties the countries have had in putting this support into practice. The value of SURFs’ and RSCs’ cyclical support in the design of new projects is recognized, but learning from results at the country level is less clearly articulated. The two strongest examples come from Argentina and Indonesia.

The Argentina programme used a cluster evaluation of four projects linked to the Medical Inputs Procurement System to reassess its approach.62

One project was subsequently closed, the others were reoriented to new objectives, and the lessons learned were applied to project planning in other provinces. In Indonesia, the 2005 Annual Report describes a process of reducing geographical coverage to focus on provinces with a low human development index and susceptibility to conflict or disasters. The office also cites lesson learning as the force behind a decision to collaborate more with the private sector.

No evidence was found of interaction with the practice areas for lesson learning. Interviews at headquarters with the capacity development, democratic governance and poverty practice areas in BDP reveal a variety of approaches being followed. Provision of advice appears to be demand driven, and staff who were interviewed acknowledged that horizontal communication across the practice areas and with other headquarters units, such as the Evaluation Office and the regional bureaux, could be stronger. Too much interaction hinges on personal relationships. A clear tension exists between the view that technical support needs to be responsive to demands and able to cope with a variety of requests from the country offices, and the view that BDP should be more prescriptive about intervention models that demonstrate how to design UNDP-supported actions that lead to programme outcomes.63

The Management Review Team looked at the practice areas in 2006. While the basic approach is in place, implementation of the concept still remains a challenge, as shown in the quote below:

"Having successfully introduced the practice architecture, the next challenge will be to strengthen and refine the practice approach and

62 These procurement projects comprised a significant part of country office resource mobilization at the time.
63 One point of view is that staff in country offices lack the tools to manage for outcomes and, in particular, lack support to plan and implement sound interventions. Some staff argue that only three areas of work have well-developed intervention models with supporting technical advice: capacity development, HIV/AIDS, and support to elections.

its implementation. This is critical, given that country offices, despite their satisfaction with the overall practice architecture, have not received consistent and high quality support across all service lines, and that support has generally not included advice to UNCTs. Concerns have been voiced repeatedly over:

- The lack of systematic and coherent delivery of policy advisory services;
- Inadequate definition of roles and responsibilities, and unclear expectations on the part of clients;
- A high cost and lack of flexibility in the financial model;
- Too broad a scope in terms of themes addressed and products and services offered;
- A disconnect between knowledge products and services and business processes; and
- Products and services not always adapted to national development context."64

3.3.3 ACCOUNTABILITY

A sound accountability framework would be expected to consist of at least four basic features:65

1. Definition of clear roles and responsibilities (accountability relationship)

2. Clear performance expectations and reward systems (transparent incentive mechanism)

3. Credible and timely measurement and reporting of the results achieved (giving account)

4. Mechanism to hold to account (fair review of results, 360-degree feedback, reward achievement or appropriate consequences for underachievement, resolve disputes, apply the incentive system, or adjust if necessary)

3.3.4 DEFINITION OF CLEAR ROLES AND RESPONSIBILITIES (ACCOUNTABILITY RELATIONSHIP)

UNDP undertook a global exercise to re-profile country offices to ensure organizational structures and human resources matched the new workflows and processes introduced by results-based management. This was a good attempt to define more clearly the roles and responsibilities of country office staff. Three of the five case-study countries participated, as described earlier. Job descriptions were modified and staff went through a transparent re-recruitment exercise. Since then, however, job descriptions of existing staff have remained static while new results-based management tools and guidelines have been released, adding new roles and responsibilities for staff. The main changes in tools include the phased rollout of ATLAS starting in 2004, together with gradual development of additional features; the introduction of PRINCE2 in 2006; and the release of the Results Management User Guide in 2006. These have changed workflows and processes, often with significant impact on staff workload, but with no incentives for use of the tools. The nature of the work and the type of skills needed have also changed as UNDP refocuses on upstream policy and legislative reforms and institutional strengthening. Yet the results-based management procedures and reporting tools take up a large percentage of staff time.66

A critical issue in defining responsibilities concerns whether development results are focused on project outputs or on programme outcomes. Results-based management was accompanied by a shift of attention to outcomes, but this has never been reflected in the definition of responsibilities. Staff questioned during the country case studies indicated that there is a high degree of consensus

64 UNDP, 'UNDP Management Review Phase II—Report of the Management Review Team', UNDP Evaluation Office, New York, NY, 2007, para 73. Internal UNDP report.

65 Early drafts of new proposals for an accountability framework prepared by the Bureau of Management were in circulation during this evaluation.

66 The view was expressed that senior staff spend a disproportionate amount of time approving transactions in ATLAS. The main point is that results-based management increased, rather than reduced, the work to be done at country offices, yet many of these country offices have a shrinking allocation of core funds and therefore have to cut down on staff funded by the core budget.

on individual responsibilities and accountability for project outputs, but no consensus and varying views about accountability for outcomes. The staff survey found that 61 percent of staff agreed that roles and responsibilities at all levels are clearly set out and known to staff; 79 percent agreed that the Resident Representative/Country Director is accountable for achievement of country programme outcomes (see Annex 8).

3.3.5 CLEAR PERFORMANCE EXPECTATIONS AND REWARD SYSTEMS (TRANSPARENT INCENTIVE MECHANISM)

At the level of the Resident Representative/Country Director, the balanced scorecard is considered the predominant oversight instrument for setting targets and assessing country office performance. Senior country office management participates in setting some of the balanced scorecard targets, while others are set by headquarters. The RCA is also at the core of performance evaluation of country office top management. Global Staff Survey results are consciously monitored by senior management, and internal adjustments are made to improve the relationship between staff and management so that the indicators improve. In addition, the partnership survey, together with many other criteria, contributes to the performance evaluation of the Resident Representative.

Resident Representatives and Country Directors set targets annually, both for their individual RCAs and for the country office balanced scorecard, a process noted above. The balanced scorecard, Global Staff Survey and Global Partnership Survey are increasingly being used to benchmark country office performance against others in the same region or beyond. Hence there is always a desire by country offices to measure up to, or outdo, others in their group.

Below the level of top management, the balanced scorecard is not cascaded down to units within the country office, nor is it used for headquarters units such as OSG or the Bureau of Management. The main instrument for cascading corporate targets to lower-level staff is the RCA, described earlier in this report.

However, the link between corporate targets and individual staff RCA targets, especially for development results, is weakened by the absence of unit-level workplans. The link between corporate and individual staff targets is clearer and stronger for quantitative targets of resource mobilization, delivery and financial accountability. The quality of targets in RCAs gives the impression that UNDP is preoccupied with mobilization and delivery of non-core funds, whilst development results are secondary.

3.3.6 CREDIBLE AND TIMELY MEASUREMENT AND REPORTING OF THE RESULTS ACHIEVED (GIVING ACCOUNT)

In terms of results related to financial and process targets, ATLAS and the Dashboard have enhanced the timeliness of measurement and reporting. The interlinkage of ATLAS and the RCA with the balanced scorecard provides for some transparency in results reporting within the balanced scorecard. The main gap, however, relates to timely and credible measurement and reporting of development results, currently not handled satisfactorily either through the ROAR or through outcome and country-programme evaluation. Without corresponding pressure from top management for results, the systems tend to exacerbate both a project focus and a reporting culture.

3.3.7 MECHANISM TO HOLD TO ACCOUNT

The end-of-year review of RCA results is participatory, and corporate RCA Guidelines give staff an opportunity to challenge their performance ratings through the Career Review Group. UNDP has three principal mechanisms for holding staff to account: promotion, annual salary adjustment, and contract renewal. Employees funded by core resources have two-year (renewable) contracts, while those paid from extra-budgetary resources usually have one-year contracts.

Top management is aware of the consequences of poor performance through promotion prospects. Employees strive to achieve so that their contracts may be renewed. In theory, these

processes should function adequately. In reality, this accountability framework is weak for a number of reasons.

First, staff contracts and the tenure of office of the Resident Representative/Country Director are both shorter than the period of the CPD and not necessarily in phase with it. Hence accountability for results implicitly concerns only short-term targets, such as resource mobilization, delivery and project outputs, rather than longer-term development outcomes. Since programme staff in many offices are funded from extra-budgetary resources, their accountability for results is aligned more to the outputs of projects than to delivering at the outcome level.

Second, oversight by the regional bureaux of the process of measuring and reporting results is weak, and hence scores in RCAs, the Global Staff Survey, the balanced scorecard and the ROAR are not immune to strategic manipulation.67 The quality and reliability of the ROAR are only independently evaluated at the time of the Full Scope Audit.

3.3.8 ACCOUNTABILITY FOR OUTCOMES

There are several challenges in considering accountability for development outcomes. First, there are influences other than the programme itself, such as other programmes and social and economic factors. Second, many outcomes of interest take a number of years to bring about. There is a need to reconcile the planning and reporting period with the longer time frames often required for outcomes to occur. Third, UNDP is, in most instances, only a small contributor of resources in comparison to donors and government. So the attribution of outcomes to UNDP's efforts needs to be carefully constructed.

This issue has been examined in other settings, and both New Zealand and Canada have concluded that a revised concept of accountability is needed to take into account the fact that outcomes are not controlled by managers, and that managing for outputs alone is incompatible with results-based management. In both countries, the case has been made that managers need to be accountable for influencing outcomes rather than achieving outcomes, and for adjusting activities and outputs as a result of tracking performance to date.68

There is some evidence that similar ideas are being put into practice within UNFPA. In its 2007 Strategic Plan, UNFPA describes an 'accountability for outcomes' as being accountable, in relation to the above elements, for:

- Ensuring financial controls
- Achieving and monitoring outputs
- Monitoring outcomes (global trends and outcome indicators)
- Ensuring outputs contribute to outcomes69

67 It is possible, for instance, for senior managers to persuade staff to rate them favourably in the Global Staff Survey in exchange for better RCA scores, which would also boost their own RCA/balanced scorecard scores.

68 See Annex 6 references for Baehler (2003) and Auditor General of Canada (2002).
69 UNFPA, 'UNFPA Draft Strategic Plan 2008-2011: Accelerating Progress and National Ownership of the ICPD Programme of Action', Executive Board Informal Meeting, 16 May 2007.

Key points

- The 2002 evaluation guidance has led to outcome-level evaluations and country-programme evaluations. In the countries visited, these were not adequately budgeted and were poorly timed to influence new CPD planning.
- Country evaluations conducted by the Evaluation Office have emphasized learning over accountability and have not measured performance against stated intentions.
- Results are rarely used for learning at the country level. The practice architecture in BDP is appreciated for technical support, but has poor linkages with functions such as evaluation and has produced few products clearly tailored to business processes.
- The accountability framework linked to results-based management is weak. Roles and responsibilities are generally clear, but targets are self-selected and poorly linked to incentives.
- Tools such as ATLAS and the balanced scorecard have greatly improved timeliness and access to information but are geared towards resources and process. The ROAR lacks substance on results and is rarely assessed for quality.
- Individuals are tied to a project orientation and accountability for outputs. All accountability for outcomes is vested in the Resident Representative/Country Director.
- Results-based management is about managing for outcomes that are not within management control. Accountability can be set for individual staff to manage for outcomes; this approach has been taken up by UNFPA.
- In the 2006 Evaluation Policy endorsed by the Executive Board, commitments were made to strengthen follow-up systems for evaluations.

Chapter 4

DISCUSSION

4.1 IMPLICATIONS OF THE WIDER ENVIRONMENT

While this evaluation focused on UNDP's internal arrangements, the wider context within which the results-based management approach has developed has had significant implications. UNDP works in a multilateral context, in which its mandate emphasizes the centrality of national ownership and the role of UNDP in building national capacity.

This implies the need to work through national systems, where feasible. The evidence from the five case-study countries suggests that UNDP programmes invest significant attention in supporting national statistical and poverty monitoring systems for tracking progress towards the MDGs and other development results. However, the case studies show little evidence that staff make comparable efforts to engage with national planning and results/performance systems at the sector/programme level, where budgets are allocated and medium-term planning and objective setting occur.

This suggests two important things. First, UNDP does not systematically look for opportunities to harmonize its results-based management approach with the results-based management approaches of national partner governments (when present). Not only does this reflect a shortcoming in the planning process, but it also signifies missed opportunities for national capacity development and for further enhancing national ownership. Second, UNDP has not fully considered the implications of its results-based management approach for broader UN reform initiatives, such as the Paris Declaration and the Monterrey Consensus, which reinforce greater alignment with governments and harmonization between development partners. This requires guidance on how to balance demands on the results-based management system to meet internal needs relative to those imposed by the broader environment within which UNDP operates. This evaluation suggests that country programmes are aware of these challenges but have struggled, with little support from the corporate level, to find solutions to three issues:

- Who do results really belong to?
- The implications of working through national systems and harmonizing with other partners' results-based management approaches and systems.
- What should UNDP be accountable for? While UNDP accountability is clearly set out in legal agreements, accountability for results remains unclear. This applies at several levels. For example, what accountability for project outputs should lie with programme staff if the major approach to project implementation is national execution? If results-based management means a focus on results at the outcome level, then how can staff be held accountable for managing towards this?

UNDP has not revised its corporate guidance to clarify such issues. The guidance has not been adjusted to reflect the shift in results-based management approaches towards how to manage for results, rather than by results. Nor has the guidance been revised to clarify the significance of cost-sharing as a source of funding. The case studies suggest that funding partners at the country level, particularly donors, persist in applying their own performance assessment requirements, imposing additional transaction costs for both UNDP and national implementing partners.

Country programme staff perceive corporate initiatives as focused on meeting the corporate agenda, particularly the demand for reporting and better financial administration. This, in turn, reflects a corporate response to the wider environment.

More broadly, there has been a general shift in demand from assessing aid effectiveness to assessing development effectiveness. Within UNDP, this has most clearly been seen in sustained demand from some constituencies within the Executive Board for an aggregated measure of UNDP's contribution to development effectiveness. The development of the MYFFs/Strategic Plan and related systems suggests that trying to meet this demand has been UNDP's focus at the corporate level, with little attention to supporting the development of stronger management decision making based on results at the programmatic level.

4.2 A CULTURE OF RESULTS IS BEING DEVELOPED

A number of authors and reports have looked at the issue of a 'results culture': what it is and how to get there.70 Based on this literature, an organization with a strong culture of results:

- Engages in self-reflection and self-examination, seeking evidence on what is being achieved
- Engages in results-based learning, with adequate time and opportunity
- Encourages experimentation and change, including risk taking

Thus, a weaker culture of results might, for example:

- Gather results information, but limit its use mainly to reporting
- Acknowledge the need to learn, but not provide the time or structured occasions to learn
- Undergo change only with great effort
- Claim it is results-focused, but discourage challenge and questioning of the status quo
- Talk about the importance of results, but frown on risk taking and mistakes
- Talk about the importance of results, but value following process and delivering outputs

The evidence from interviews in the study countries and at headquarters is that UNDP has made some progress. However, the overall conclusion is that the organization still has a long way to go if it is to build a strong and sustainable results-based culture. This conclusion is in line with that of an assessment of results management at UNDP, which noted the need to enhance the culture of the organization: "…an ongoing change management effort to embed a results-based culture in the organization is required."71 A report comparing results-based management efforts at a number of multilateral development organizations, including UNDP, concluded that "…[these] multilateral development institutions need to work to amend their internal incentive structures in favour of results."72

4.3 FACTORS HELPING AND HINDERING A CULTURE OF RESULTS

The evaluation team conducted force-field analysis exercises in the study countries to seek the views of staff on factors that affect a results focus in UNDP.73 An example from Zambia is

70 See text and bibliography in Annex 6.
71 Dalberg Global Development Advisors, 'Assessing Results Management at UNDP', commissioned by the Danish Ministry of Foreign Affairs, 2006, p 20.
72 Flint M, 'Easier Said Than Done: A Review of Results-Based Management in Multilateral Development Institutions', Department for International Development (DFID), London, UK, 2002, p 50.
73 The 'force-field' is a visual sorting tool used to categorize issues and stimulate discussion.

reproduced as Figure 2. These exercises found a broad consensus around the features summarized in Table 8.

It is interesting how clearly the importance of the government emerges. It can be a positive, as a force seeking results, and a negative, as a constraint where planning and M&E capacity is weak. The tension between country offices and headquarters is also clear, with the belief that the results-based management approach is heavily driven by headquarters' needs.

Figure 2. Force-field analysis of factors for and against a results focus in UNDP Zambia

Factors supporting change: re-profiling and devolution of power to units; teamwork in some units; trusted role of UNDP as a neutral partner; interlinked RBM tools (Dashboard, RCA, BSC, ATLAS, etc.); joint annual reviews; joint planning with partners; on-the-job training; a results-oriented new RR; management practice networks and RSCs; donor pressure; UNDP's coordination role and benchmarking vis-à-vis other UN organizations.

Factors holding back change: centralized power structures; too many changing systems; tools with no feedback loop; skills gap in strategic posts; too many meetings (over 60 percent of staff time); dwindling UNDP core resources; staff development time (5 percent) too short for mandatory courses; lack of confidence in junior officers; the perception that UNDP is a donor; turnover of counterparts; focus on delivery to the exclusion of development results; weak government ownership; unresolved extra-budgetary issues.

Table 8. Factors supporting and holding back a results focus

Factors supporting a results focus:
- Government and development partners seeking results
- Growing professional competence
- Stronger project management
- Appointment of M&E specialists or units
- Increased delegation of responsibilities
- Interlinked results-based management tools

Factors holding back a results focus:
- Lack of planning and M&E in government
- Too much focus on the UNDP headquarters agenda
- Focus on delivery rather than results
- UNDP comparative advantage not clear
- Unclear guidelines and frequent changes
- Tools tailored to headquarters needs
- Lack of coordination and poor communication

4.4 STAFF SURVEY ON RESULTS-BASED MANAGEMENT

A wider sample of views was sought in an electronic survey of all staff in the country offices not visited for the evaluation. The results are summarized in Annex 8. The responses reveal a number of interesting views.

- Overall, staff do not think there is a strong culture and leadership for results, although managers (Deputy Resident Representatives and above) take the more positive view that UNDP does encourage risks in pursuit of results and has an adequate budget for results-based management.
- Staff and managers believe that programmes are well focused, organized to deliver outcomes, planned with reference to evidence, and have well understood outputs and outcomes. There is a high degree of optimism.
- The ROAR is clearly seen as the primary tool for outcome monitoring, although managers think it is less effective than staff do. All agree that UNDP monitoring and reporting is not well harmonized with other development partners, nor does it make use of government systems (both important features of working in a decentralized way, driven by country needs, and reflected in the Paris Declaration).
- Staff and managers think adjustment is done collectively in discussion with stakeholders (another important feature of delivering through partners) and that there is adequate scope for managers to manage. All also agree that results affect neither the Biennial Support Budget nor programme allocation (including TRAC 2), and that cost sharing reduces the scope to allocate resources according to results.
- Responses about evaluation and accountability are less straightforward. The majority of staff agree that roles and responsibilities are well known. Most agree that pressure to mobilize resources and deliver on time is greater than pressure to achieve results. Staff think resource mobilization is a more important factor for advancement in their RCA, but managers disagree. The Resident Representative/Country Director is acknowledged by all staff to be accountable for outcomes.
- Staff believe they have received adequate training and support from RSCs/SURFs for design and indicators, but are not given enough time to learn from results and evaluations. Managers disagree about support from RSCs and headquarters and think adequate time for learning is made available.
- Both staff and managers think that UNDP's rewards systems do not provide real incentives to strengthen a results culture.

Other supportive evidence includes the 2006 Global Staff Survey finding that "responses to the question 'My office works consistently towards achieving long-term objectives' have hit a low point, with the bulk of the decline coming in country offices."

4.5 EARLY GAINS, LOST MOMENTUM

The shift in strategic direction and the new results framework brought early gains to the programme through a clearer expression of UNDP roles and functions. This fostered an improved relationship with the Executive Board that helped prevent further decline in resources. There is good evidence that staff have found strategic frameworks valuable in discussions with both government and development partners. But the clarity of expression in the first MYFF that helped bring these gains was not taken further in subsequent rounds. MYFF 2 was a missed opportunity to improve programme focus, an original aim of the corporate strategy. The draft Strategic Plan is still under debate, but does not take focus much further.

The goals in the results frameworks for the two MYFFs and the draft Strategic Plan are too broad to focus UNDP's support in areas where it has a

comparative advantage. UNDP has attempted, in each succeeding results framework, to reduce the number of 'strategic areas'74 under which it makes a contribution. This approach has had limited impact upon programme composition at the country level beyond some attrition of outlier projects. Therefore, it has not been a successful approach to enhancing focus across the organization in declared areas of comparative advantage.

This is because the strategic objectives have not been used as a set of 'hard boundary' rules defining what is and is not possible for country programmes to support. Instead, this approach has increased work for staff in country programmes, as retrofitting the programme into each new corporate results framework has become a clerical exercise rather than an approach having real influence on future programme composition.

Enhancing focus would require two things: first, an operational definition of focus for UNDP programmes; second, a change in the relationship between regional bureaux and country offices. To date, this relationship has mainly focused on oversight of processes and resources, with little systematic discussion of programmes' substantive content. This has meant that programme focus is not discussed. However, the low level of core funding and high reliance on non-core funds mean that the management spotlight needs to be on the degree to which Resident Representatives manage to refocus in this situation. This is where interaction between regional bureaux and country offices should concentrate.

In the Inception Report for the evaluation, a table of benchmarks was introduced as an evaluation tool, drawing on performance standards for results-based management abstracted from three sources: the Joint Inspection Unit of the UN, the Paris Declaration on Aid Harmonization and Alignment, and the OECD-DAC Source Book on Managing for Development Results. Those benchmarks provide an organizing framework for the evaluation assessment. Annex 9 sets out in detail the assessment of each benchmark, drawing on the findings of this study and relevant independent studies.

Progress has been made in most areas. Two of the 21 categories are assessed as fully achieved, 16 are partially achieved and 3 are not achieved. The large number of partially achieved benchmarks reflects the positive work of UNDP in creating the architecture to manage for results. But it also reflects this evaluation's finding that too many elements of the approach are not functioning satisfactorily. Most importantly, results performance is not informing decisions about programmes or resources across the organization. The evaluation case studies identified managers who have a strong results orientation for decision making, but they make little use of information drawn from UNDP's results-based management systems to support their work and decision making.

It is not possible to use the benchmarks to assess UNDP's status relative to other UN organizations, or to the wider population of public-sector organizations that use results-based management approaches. This is because benchmarking across organizations requires a database that collates individual organizational performance against a common standard, and the current benchmarks have not been set up in this way.

The literature highlights a number of areas on which an organization committed to embedding results-based management should focus while introducing the approach. As shown in Table 9, UNDP has addressed many of the themes that are normally recommended, but not in a consistent and sustained manner. This has resulted in a culture of results akin to the 'weak' culture described in section 4.2.

74 Terminology used for describing the corporate-wide strategic areas has changed with each results framework. Under MYFF2, the service lines have defined these areas.


Table 9. Themes associated with managing for results75
(For each theme that should be expected, the bullets summarize what is found in UNDP.)

Demonstrated senior management leadership and commitment:
- The MYFF initiative was, to some extent, an initiative in parallel with the Administrator's Business Plans during 2000-2003 and contributed to a disconnect between the balanced scorecard and the ROAR.
- Results-based management is not closely associated with any one Administrator. Current results-based management initiatives, though innovative and positive, are associated more with middle/senior management than with top management.

Informed demand for results information:
- All managers acknowledge the importance of results, but management is driven consistently by resource mobilization and delivery targets rather than by results. Some managers claim that this is because there are no systems to measure results well, but then do not commit resources to solving that problem.

Supportive organizational systems, practices and procedures:
- Current systems do not provide rewards for risk taking or for achieving results. There is no evidence that the organization values enquiry and reflection.
- Management at the project level is geared towards outputs, and although managers have the scope and flexibility to adjust operations, the evaluation found no evidence of results being a significant consideration in that process.

A results-oriented accountability regime:
- Accountability is disjointed, with a narrow focus on the Resident Representative; accountability at lower levels is limited to managing for outputs.

A capacity to learn and adapt:
- There is little formal learning, although the practice networks are lively and widely used. Staff think that not enough time is set aside for learning, though managers differ in this view.

Results measurement and results management capacity:
- There is no recognized institutional home for results-based management in UNDP, nor is there a core of excellence in the organization. Some countries have appointed M&E specialists, but there is no corporate policy or incentive to do this.

75 For further elaboration and a bibliography, see Annex 6, 'Results-based management in development organizations.'

4.6 MANAGING FOR OUTCOMES WAS CORRECT BUT UNSUPPORTED

The move to manage for outcomes was sensible, but the structure needs to be revisited, given the experience and escalation in UN reform, changing aid-delivery modalities with the growth of sector and general budget support, and the Paris Declaration. A critical question is: whose outcomes need to be based on country-owned objectives with which the United Nations and other development partners are aligned? This has proved challenging in several ways. Processes to foster alignment between UN organizations' objectives and national objectives have evolved substantially during the period. New rounds of the UNDAF show evidence of improved ownership and alignment. But this has led to a complex multiplicity of higher-level goals: MDGs, country national plans, UNDAF outcomes, UNDP strategic outcomes, and UNDP country outcomes. UNDP's strategies have not simplified this framework or guided resources and monitoring. As a result, the UNDP corporate outcome statement is more of a bland programme aspiration than an accountable management objective. UNDP country offices are at pains to set outcomes that are distinct from the UNDAF and other UN organizations, yet in reality UNDP should probably be contributing towards joint outcomes.

The attempt to shift monitoring focus from outputs to outcomes failed for several reasons: projects are a natural unit of management around which resources are configured; project tools are simple and effective; lines of responsibility are straightforward at the project level; and there was an established body of experience at managing projects in UNDP. Collaboration with other development partners takes place mainly at the project level, perpetuating the project as the natural unit of analysis. For projects to contribute to outcomes, there needs to be a convincing chain of results or causal path. Despite familiarity with tools such as the logframe, no new methods were developed to help country staff plan and demonstrate these linkages and handle projects collectively towards a common monitorable outcome.

Quality and technical support issues also failed the system. The ROAR was developed for outcome monitoring, but the initial design was too complicated and it became diverted into MYFF reporting, an example of Executive Board pressure shifting the system away from country needs and from managing for results. Selection of indicators and targets is hard (especially in the newer 'soft' areas of policy engagement), and little guidance was available. The UNDP results framework has involved non-standard and changing terminology, which has been difficult to work with in non-English-speaking contexts. Results-based management is demanding from a quality perspective, with the need for well-stated outcomes, objectively verifiable indicators, timely and accurate reporting for management action, and accountability. Poor initial specification of outcomes and indicators can negate the rest of the system, yet no provisions were made for quality assurance and independent scrutiny beyond infrequent expanded audits.

Faced with these problems (and bearing in mind that development outcomes are slow to emerge, hard to attribute to small UNDP interventions, and in areas difficult to measure objectively), continuing with project-level quarterly and annual reporting on outputs was as good a tool as any and has helped maintain a basis of evidence. Furthermore, project managers have freedom to intervene. Limited resource flexibility tends to mean that adjustments take place within projects rather than within outcomes. In view of the pre-eminence of projects, it is not surprising that accountability presently sits at the project output level. This may be satisfactory for programme staff, but at the Assistant Resident Representative/Deputy Resident Representative/Country Director level, there is a need to move to managing for outcomes.

Evaluation has been underused as a tool of results-based management, although it is becoming more prominent with the advent of outcome and country programme evaluations. It is increasingly inappropriate for UNDP to evaluate outcomes in isolation from government and other development partners. A forward-looking stance for results-based management is that stakeholders, as a group, agree to manage inputs and contributions to a common result above the level of the project output. This leads logically to joint evaluations. Joint programme reviews are already found in countries with high levels of sector programmes and budget support.



A first step would be joint UNDAF outcome evaluations, an area where UNDP is well placed for an active role.

4.7 RESULTS-BASED MANAGEMENT IS MORE THAN TOOLS AND SYSTEMS

To a large extent, systems are in place for effective results-based management, but the challenge lies in how they have been implemented and the degree to which they have helped foster a 'results culture' within UNDP. This culture of results is emerging, but the question is whether putting in more systems will improve the situation, given clear evidence in Chapter 3 of limited capacity at the country office level and the low degree to which results are considered in management decision making. As a result, systems have had the effect of reinforcing a reporting culture rather than a performance culture.

Results-based management cannot work if project-level M&E systems are not operating effectively. The decision to remove mandatory M&E requirements in 2002 was a risk, and the result is that M&E capacity is weak in many offices. ATLAS and PRINCE2 are positive efforts to build capacity at the project level, but there remains a danger that they push towards a process and reporting approach rather than managing for results.

Systems can change behaviour if supervised well, as seen with financial management under ATLAS. But systems do not provide the types of data that managers at the country office level really need in order to manage more strategically. In practice, the lack of good data in the reporting system arises because those responsible for inputting the data do not see it as something important for which they are accountable; it is just one more imposition from headquarters.


CHAPTER 5. CONCLUSIONS AND RECOMMENDATIONS

The findings from this study have been summarized in short text boxes at the end of major sections, and Chapter 4 has drawn together core strands from the analysis. Elements of a results focus within UNDP predate the introduction of results-based management, which for this evaluation is taken to have occurred in 1997, when the Administrator's Annual Report called for the establishment of "an overall planning and results management system in UNDP." This was operationalized in 1999 with the introduction of strategic results frameworks across all programme countries and development of the first MYFF. Chapter 2 discusses the subsequent eight-year period of rolling innovation, redesign and change. It is important to understand that this evolution was not guided by a comprehensive design and that there is little consensus within UNDP on what the results-based management approach and systems include.

What is clear is that, over the period, UNDP has established a cycle of setting and revising corporate goals, has introduced improved office systems to manage project finances, has institutionalized the need to report on corporate and individual performance, and has raised awareness about results throughout the organization.

Conclusion 1: The experience of UNDP with introducing results-based management is similar to that of other organizations.

UNDP was one of the first UN organizations to move to a results-based management approach, but the information does not exist to rank its achievements and status relative to other organizations. Review of the literature discussing experiences with results-based management (see Annex 6) strongly suggests that UNDP's experience has not diverged significantly from that of many other public-sector organizations.76

Overall, this evaluation identifies a significant number of areas where greater progress could have been made, but even with perfect knowledge and the required management commitment, it is unlikely that UNDP could have fully institutionalized a results-based management approach within eight years. The subsequent conclusions and recommendations therefore focus on the key challenges for UNDP and draw on wider experience of how these may be successfully addressed.

Conclusion 2: UNDP has a weak culture of results.

International experience suggests that an organization with a strong culture of results:

- Engages in self-reflection and self-examination, seeking evidence on what is being achieved

- Engages in results-based learning, with adequate time and opportunity

- Encourages experimentation and change, including risk taking

Adopting results-based management was a logical continuation of management reforms during the 1990s, and UNDP probably had little option in view of pressure to improve performance from the Executive Board and across the United Nations as a whole. Significant progress has been made on a number of fronts: sensitizing staff to results, and creating the tools to enable a fast and efficient flow of information. Despite considerable investment in the development of systems, managing for results has proved harder to achieve. In particular, the strong emphasis on resource mobilization and delivery, a culture supporting a low level of risk taking, systems that do not provide information relevant to managing for results at the country programme level, the lack of clear lines of accountability, and the lack of a staff incentive structure to judge performance on managing for development results all work against building a strong culture of results.

76 See Box 1, which summarizes challenges identified by the World Bank 2006 Annual Report on Operations Evaluation on operationalizing managing for results.

Conclusion 3: The corporatist approach has had only a limited effect on development effectiveness at the country level.

UNDP adopted a systems approach to stimulate managing for results, which meant that change was to be driven by the implementation of centrally designed and prescribed systems. The MYFF strategic plans were used to set corporate results frameworks with complex structures of service lines that tried to reflect the diversity of country programmes. These were primarily developed to enable aggregate reporting of UNDP performance to the Executive Board while at the same time creating a clearer focus for the programme.

UNDP has not developed corporate oversight systems that track the degree to which country programmes implement a results-based management approach, focusing instead on development of the systems required for upwards corporate reporting and oversight of processes. Notable is the lack of oversight systems that track whether programmes use results to adjust resources (people, money and partnerships) to improve future results.

In practice, the corporate service lines set by headquarters have proved too numerous, with very permissive definitions. This has led country offices to manipulate their programmes to fit corporate service lines, diverting attention away from country needs, and has made reporting to the Executive Board more about process than substance. There is also little evidence that this approach has significantly affected the shape of country-level programmes, but there is evidence that it has imposed unnecessary transaction costs at the country level.

There is little evidence indicating a significant role for results-based management systems in the strategic allocation of resources (people and money) within UNDP.

Conclusion 4: Results-based management has been misinterpreted as not supporting the decentralized way in which UNDP works.

UNDP works in a strongly decentralized way, yet the results frameworks in the MYFF were not geared to country processes. Emerging new systems under the UNDAF/CPD/CPAP reforms are seen to have the potential to create objectives for UN organizations that are aligned with national plans and responsive to country needs. Working through these structures, UNDP country offices are now able to define realistic outcome objectives that are within UNDP's mandate, aligned to the UNDAF and harmonized with other development partners.

Decentralization has been accompanied by delegation of authority over the CPD. Under current procedures, country programmes are not scrutinized for development potential by regional management, an abdication of responsibility. As a result, evaluation and audit provide the only means of checking that country programmes are contributing to corporate goals.

The corporatist top-down approach has inadvertently fuelled concerns that having corporate goals is a means of imposing programmes at the country level. Decisions about the nature and content of country programmes are inevitably reached through a political process between the Resident Representative/Country Director, the national government and those funding specific projects. The role of results-based management is not to constrain that process but to provide a framework so that UNDP works within its mandate, or areas of competence, and to ensure that adequate resources are aligned behind achieving results agreed among partners. Once programmes are agreed upon at the country level, results-based management should provide standards as a basis for dialogue about how to craft realistic outcomes, select objective indicators that can demonstrate progress towards development objectives, and jointly monitor progress.

Conclusion 5: Results-based management systems are not helping build a results-based culture.

There are strong perceptions within UNDP that systems related to financial administration and management have improved. Training in PRINCE2 and ATLAS may also be strengthening project administration and management skills in some offices, where these have declined in the last decade.

However, there is little evidence that systems have led to an increased focus on development results (managing for outcomes). ATLAS and PRINCE2 both deal with information at the project level, and the project is at the core of their designs. The RCA does not effectively incorporate key results that reflect successful management for results by individuals. There are also concerns that systems have become overly complex and time-consuming.

Design and use of results systems have mainly focused on producing data to meet reporting commitments to the Executive Board, rather than on managing for outcomes, which is central to achieving a results orientation in UNDP at the programmatic level. Even so, UNDP has failed to develop a system for reporting on its contribution towards development results that meets the demands of constituencies within the Executive Board. This reflects a number of issues.

The corporate-level results frameworks have never included high-level goals with substantive, measurable and agreed indicators against which to assess global progress towards meeting the goals. The contrast between goal-level reporting by UNDP and the objectivity of reporting against the MDGs is stark. UNDP also stands in contrast to many other UN organizations, which can rely on objective results data reported through internationally developed systems to discuss whether or not progress is being made in their key mandated areas.

UNDP has developed a reporting system that aggregates whether or not results will be delivered when expected. This approach has limitations. First, the country-programme outcomes against which UNDP will deliver are poorly defined, and there is insufficient consistency across country programmes on the definition of an outcome. Second, the logic linking outputs delivered by UNDP with achievement of the outcomes, and with the higher-level objectives found in the corporate strategic results frameworks, is often not explicitly defined and accessible. Third, and the main point of interest for the Executive Board, this approach fails to report on UNDP's performance against outcomes for which it is accountable.

Conclusion 6: Managing for results requires leadership.

The importance of leadership in driving results-based management forward has been noted several times in this report. A good example of effective leadership comes from the role of the previous Administrator in fighting the decline in resources. Staff are quick to acknowledge the Administrator's success at shifting the focus of managers. A strong personal commitment was supported by: a single, simple and consistent message on resource mobilization, used for both internal and external audiences; development of systems to track, measure and report managers' success at resource mobilization; and a clear perceived link between successful resource mobilization and advancement within the organization.



The same drive and visible, consistent senior-level support is needed for results-based management. Four relationships stand out as the most critical: at the Executive Board, to ensure the programme is held to account for development results; between the Administrator or Associate Administrator and the directors of bureaux; between directors of regional bureaux and Resident Representatives or Country Directors; and by Resident Representatives/Country Directors within country offices.

5.1 RECOMMENDATIONS

Managing for results is a dynamic process, and many of the issues raised in this report are known to UNDP management and are receiving attention. There is genuine interest and support at the country level for a better focus on results. As noted at the beginning of this report, results-based management is a journey, not a destination. The recommendations here are designed to help UNDP navigate that journey.

Recommendation 1: Strengthen leadership and direction.

The first and overarching recommendation addresses the need to capitalize on what has been achieved to date and establish a stronger culture of results. Success here depends not on tools and systems but on leadership and direction. Sustained commitment by top management, the Administrator and the Associate Administrator, is required.

Strong leadership is necessary, with attention to UNDP's results throughout management processes. That commitment needs to cascade down through the critical management relationships highlighted earlier. Without it, changes to systems will merely reinforce a reporting culture.77

Leadership is a necessary but not a sufficient condition. Attention to the issues summarized in Table 9 is also needed: a shift in the accountability framework from process and compliance to results; outspoken commitment by senior management, especially the directors of regional bureaux; a change in dialogue throughout the organization that prioritizes managing for development results and addresses how this will be balanced against competing demands such as resource mobilization; time and space for staff to give feedback on and learn from experience; a shift in organizational practices to take risks and manage for outcomes rather than outputs; and improved capacity to measure results.

Recommendation 2: Global goals, local solutions. Sharpen the role of the strategic results framework.

Management should adopt a results framework that distinguishes more clearly between corporate goals and country programme outcomes.

UNDP's operational role has four focus areas: the crosscutting and multi-sectoral challenges of poverty reduction, democratic governance, crisis prevention and recovery, and environment and sustainable development. For these focus areas, objectives should be based on the key results areas, with indicators of substantive development change comparable to those used for the MDGs. The corporate key results areas contain the basis of what could be measurable goal-level objectives, for example: promoting inclusive growth; promoting gender equality; fostering inclusive participation (in governance); and empowering the poor, women and youth. This approach will take time. The Executive Board and UNDP should start with those key results areas where internationally agreed indicators already exist. This will be a major challenge for the programme, but it will provide clear guidance to country programmes about UNDP's overall objectives and help ensure that debates on performance are about development rather than the quality of reporting.

77 Experience around the world underscores the important role of top management in creating a focus on results. Examples are quoted in Annex 6.

Identifying and reporting on UNDP contributions should not be an obstacle, any more than it is for organizations reporting country progress against the MDGs. Key to this would be the development of robust models linking country programme outcomes and UNDP contributions with achievement of these high-level objectives. This approach would also strengthen the quality of information reported, since senior management would have a greater interest in ensuring that the information is accurate, as it would also be part of the internal UNDP accountability framework.

The current practice of setting corporate outcome-level objectives and indicators within the strategic plan should end. Instead, outcome objectives and indicators should be set at the country programme level, where they should be linked to UNDAF outcome objectives in the context of agreed national development objectives. Comparable outcome objectives should be set within the regional and global programmes.

This change would reinforce the decentralized nature of UNDP activities and build on UN reforms. It would have to be supported by a shift in the oversight roles of the regional bureaux, senior management and the Executive Board, away from compliance with procedures towards ensuring that country programmes implement robust results-based management approaches and are designed to contribute to the UNDP focus areas.

Recommendation 3: Support managing for outcomes at country offices.

Managing for outcomes means that managers learn from results and empirical evidence and use that evidence to adjust either the projects under their control or the composition of the portfolio of projects to maximize the contribution of the organization to that outcome. At a minimum, this means that there is a clear link between projects and changes at the outcome level. More sophisticated approaches require managers to demonstrate that they are monitoring the assumptions and risks associated with the projects, adjusting project outputs and the project portfolio if the assumptions do not hold, and managing risks.

Implementing such an approach requires that UNDP consider the wider environment at the country level when defining outcomes. Improved guidance is needed on how to balance the demands placed on the results-based management system by internal UNDP needs against those imposed by the wider environment within which UNDP operates at the programmatic level. This includes dealing with three core issues raised in this report:

- Ownership of results at the country level

- The implications of harmonizing with other partners' results-based management approaches and systems

- UNDP accountability for managing for results

The positive effects of some of the newly developed UNDP systems are noted previously in this report, with the caution that they are based predominantly on managing projects. Introduction of new management and reporting systems will impose significant costs on country programme teams, and the country-level perception is that there has been insufficient appreciation at the corporate level of the impact of these costs.

Country offices want to be effective and need support in several ways:

- A streamlining of systems, aiming for a more user-friendly, integrated approach with better prioritization and introduction of new requirements across the organization.

- Improved practical tools and guidelines to plan how projects will contribute to programme outcomes and to improve the specification of indicators. These are needed to change the configuration of country programmes so that programme objectives drive the selection of the projects that are implemented. This will also help resolve the tension between resource mobilization and development effectiveness.

- A large-scale capacity-development programme to improve staff knowledge and skills.

- Programmes designed around proven models of intervention that can be tailored to country circumstances, managed, monitored and evaluated. The BDP should develop documented intervention models of good practice, based on lessons from the programme and from development partners, making greater use of evaluations as a source of learning.

- Quality assurance to examine country programmes and assess evaluability. Periodic programme-wide reviews of programme objectives, indicators and measurement arrangements can be used to provide neutral feedback to managers.

- Expanded use of country office outcome evaluation plans geared to joint evaluations with government and development partners.

- Recognition that working in a multisectoral environment increasingly requires country programmes to operate within the context of medium-term expenditure frameworks, programme budgets and sector-wide programmes. Future outcome evaluations, when possible, should be joint evaluations with government and development partners. As the tools and guidelines are developed, the ROAR should be revised to improve the evidence base and structure of the report.

Recommendation 4: Expand investment in and use of evaluation and performance audit.

Improving country programmes requires attention to detail and the development of sound objectives and indicators. A quality assurance process is recommended as an ex ante way of scrutinizing country programmes. This needs to be supported by independent review of processes and compliance, along the lines of the current enhanced audits.

The structure of results proposed here places more responsibility on country offices to develop programmes that respond to country needs and contribute towards global goals. It also frees them from having to fit into centrally determined service lines. The test, therefore, is whether the programmes that are developed contribute to the goals of UNDP. This will require a stronger evaluation function that addresses both learning and accountability. The 2006 Evaluation Policy is a step in the right direction. The challenge now is implementation that supports accountability and the new results-management guidance.

These recommendations are intended to be mutually reinforcing and ought to be viewed as a whole. Some focus on the overall framework rather than specific tools or issues. Dealing with leadership, the results framework, programme focus and the accountability of the regional bureaux is the highest priority, followed by tools to help country offices chart contributions to outcomes, and quality assurance systems for programme review.


Annex 1

TERMS OF REFERENCE
EVALUATION OF RESULTS-BASED MANAGEMENT AT UNDP

CONTEXT AND PURPOSE OF THE EVALUATION

1. UNDP adopted a results-based management approach in 1999, with agreement of the first MYFF. The goal of results-based management is to enhance UNDP's contribution to development effectiveness. There is considerable organizational interest in reviewing UNDP experience in results-based management, as evidenced by the number of reviews that have been conducted on related issues.1

In addition, recent evaluations conducted by the UNDP Evaluation Office have highlighted persistent issues and gaps related to the operation of results-based management systems and approaches at the country office level. Recognizing the need to take stock of UNDP results-based management experience, the Executive Board approved the UNDP evaluation agenda, which included the evaluation of results-based management at UNDP, at its June 2006 session.

2. Though a number of studies of results-based management have been conducted by different organizational units within UNDP and within the wider UN system,2 these are mostly based on desk studies and interviews with select stakeholders at UNDP headquarters. Moreover, these studies have focused on assessing whether a results-based management system is in place and the quality of the results information reported. This evaluation aims to complement these studies rather than duplicate them, and is expected to provide input to the processes underway in the organization to improve results-based management.

3. Therefore, the primary intention is not to assess whether results-based management systems are in place (this is the function of the expanded audits) or how they are used in reporting on UNDP performance to the Executive Board and external audiences (this has been covered extensively in other reviews). Nor is the intention to focus on assessing the quality of the results frameworks used or indicators selected, as this has been commented upon extensively in other studies. Finally, the evaluation does not seek to evaluate the effectiveness or impact of individual projects or programmes at the country level. It limits itself to attempting to identify whether or not the contribution of results-based management to enhancing such results and impacts can be identified, based on results and impact information already produced by the country programmes and partners.

4. The main purpose of the evaluation will be to examine the degree to which the results-based management approach adopted by UNDP since 1999 has fostered a results culture within the organization, enhanced capacity to make better management decisions, improved focus and efficiency, and strengthened UNDP's contribution to delivering development results.

1 MSI, 'Background Paper on Results-Based Management in Development Organizations', Management Systems International, July 2006, p 6; Danish Ministry of Foreign Affairs, 'Assessing Results Management at UNDP', 15 June 2006; UNDP Administrator's Office, 'Management and Workflow Review—Phase I', January 2006.

2 Ibid.; UN, 'Implementation of Results-Based Management in the UN Organizations', Joint Inspection Unit, 2004; OECD, 'Results-Based Management in Development Co-operation Agencies: A Review of Experience', DAC Working Party on Aid Effectiveness, 2001.

AUDIENCE AND USE OF THE EVALUATION

5. The evaluation will be presented to the Executive Board in January 2008, and the findings of this study will feed into the ongoing efforts to develop UNDP's Strategic Plan.

SCOPE AND EVALUATION CRITERIA

6. The evaluation will cover the period 1999-2006, although for the purpose of assessing the effectiveness of results-based management, the evaluation will examine management approaches that preceded the introduction of results-based management.

7. It will cover all geographic regions and evaluate results-based management at the programme, country, regional and corporate levels. At the country level, the evaluation will assess the results-based management approach under the diverse development conditions in which UNDP functions—including, but not restricted to, programme size, development context such as aid dependence, and varying capacities for monitoring and evaluation (M&E).

8. The evaluation will address the following: What was results-based management expected to achieve? How was it adapted to suit the changing context of UNDP and the aid environment? What were the intended and unintended results of the results-based management approach at UNDP? What worked (and did not) and why?

9. The evaluation will, to the extent possible, evaluate against the evaluation criteria of relevance, efficiency and effectiveness. It is considered that the other commonly applied criteria—impact, value for money, client satisfaction and sustainability—are not relevant within the context and availability of data for this evaluation.

APPROACH

10. A theory-based approach, based on an underlying model of organizational change, will be used. This will allow the evaluation to build a cumulative picture of progress along a change pathway.

11. The basic theory underpinning results-based management is based on a systems understanding of how organizations operate. The way in which results-based management might influence changing behaviour within the organization can therefore be represented as a logic model or theory of change, using simple assumptions about stimulus and response. To be used as a model for evaluation, a theory of change should have certain characteristics:3

- The postulated relationships must appear coherent and logical

- The planned change must be 'doable' (resources and expertise)

- The theory of change must be 'testable' (amenable to empirical investigation and verification)

12. The theory of change that will be used has been derived from literature on the introduction of a results-based management system and is presented diagrammatically in Figure 1.4 The diagram follows the key principles for results-based management at UNDP. It starts with setting strategic goals, then continues with planning results to achieve the goals, monitoring implementation for progress and performance, taking corrective management action, and evaluating results. The stages in the diagram identify key effects, starting with a clearer orientation of UNDP initiatives, followed by realignment of resources towards results, efficient adjustment of resources, and links with knowledge and institutional learning.

3 These characteristics summarize the evaluability of the theory.

4 See 'RBM in UNDP: Overview and General Principles', 2000.

Figure 1. Theory of change for results-based management

[The figure depicts the change pathway: set strategic goals for UNDP; goals create a framework to define programmes, organizational structure and management arrangements; plan specific results to achieve the strategic goals; align resources (people, money and partnerships) to achieve results; monitor implementation for progress and performance; timely information goes to decision-makers; knowledgeable interpretation; decision-maker responds; efficient adjustment of resources (people, money and partnerships) to achieve results; evaluate strategic goals. These stages correspond to programme orientation, primary institutional effects and secondary institutional effects, leading to the goal of managing for results and improved performance (organizational and development effectiveness), with institutional incentives, organizational learning, and external factors and risks shown as influences on the pathway.]



13. The implicit goal is improved performance (interpreted here as development effectiveness). Details of the pathway by which results-based management processes improve management decision making and enhance an agency's contribution to development effectiveness are not clearly specified in the literature. This is also methodologically more challenging compared with the situation in many other agencies, since UNDP programmes are usually implemented by development partners who, in management terms, are at arm's length from UNDP, and UNDP's contributions are often in terms of soft assistance rather than money. This issue will be explored further during the evaluation.

14. Under a systems approach, results-based management should be just one of a number of sub-systems operating within UNDP. The evaluation will therefore examine how this particular sub-system has interacted with other key sub-systems and the impacts of these interactions on how the results-based management sub-system is actually used. The results-based management sub-system may be defined as including:

- The MYFF and ROAR, which focus on development results

- The Balanced Scorecard and Dashboard, which focus primarily on tracking organizational efficiency

15. Supplementing use of the theory-based approach, the evaluation will also seek to assess whether systems are in place and the expected processes are being used in their operation against benchmarks. Benchmarks should be drawn from the following: the benchmarks developed during the UN Joint Inspection Unit's review of results-based management within the United Nations in 2004; material presented in the Joint Venture on Managing for Development Results' Source Book; and the objectives set out in the Paris Declaration on Aid Effectiveness.

METHODOLOGY

16. Data will be collected using three research methodologies. These are:

- A desk review and analysis of relevant secondary material

- A survey of all country offices, aiming to gather data on issues emerging from the desk review and pilot regional case study

- A number of case studies

17. The unit of analysis will be the country office up to the relevant regional bureau and Executive Office. This will allow a better understanding of how decision making is affected by results-based management across the chain of accountability within the organization. This will then be supplemented by an analysis at headquarters level of how results-based management systems were developed, how their implementation was supported across the organization, and how development of the results-based management systems was coordinated with development of other key systems (mainly the human resources, knowledge management, and financial allocation and monitoring systems). This approach to defining the unit of analysis is illustrated in Figure 2.

INCEPTION PHASE AND DESK REVIEW

18. The inception phase will include a headquarters-based mapping of issues raised in the terms of reference and preparation of background documentation. A research or technical assistant based in New York will prepare detailed syntheses and extracts of the documentation collected. This activity will support an initial visit by the evaluation team to UNDP headquarters to meet with headquarters units, including the Evaluation Office, management, regional bureaux, Operations Support Group, Bureau of Management and Bureau for Development Policy. This visit will also provide an opportunity to select the five countries for inclusion in the case studies. At the end of the Inception Phase, the evaluation team will prepare an Inception Report elaborating on the terms of reference and describing how the evaluation will be carried out, refining and specifying the expectations, methodology, roles and responsibilities, and timeframe. This Inception Report will be reviewed with the Evaluation Office, Expert Panel and other UNDP stakeholders and, if necessary, amended before the launch of the main evaluative work.

IMPLEMENTATION, INCLUDING COUNTRY VISITS

19. The implementation phase will include the following activities:

- Consultations at headquarters aimed at gathering information on the development and implementation of relevant systems and the role of management and the regional bureaux in the management and oversight of country programmes.

- Visits by members of the evaluation team to the five country programmes. Visits will last for approximately seven to eight working days and will include meetings with the government and non-governmental organizations, as well as the UN Country Team.

- Development and use of a web-based survey questionnaire aimed at relevant staff in UNDP country offices.

FINALIZATION AND REPORT

20. An advisory group consisting of three to four international experts in results-based management and development evaluation will act as the external reviewers, and the Evaluation Office will provide the internal review. The review process will involve the following: assessing the feasibility of the terms of reference; assessing the soundness of the methodology proposed by the team; verifying satisfactory operationalization of the terms of reference; addressing the reliability and appropriateness of the evidence used in the study; and ensuring findings and recommendations are relevant, based on solid evidence, and within the mandate of UNDP.

21. The preliminary findings for case studies will be presented to the stakeholders in mission countries for verification of facts and evidence as part of the country missions. Short reports detailing information and findings from the country visits will be produced and sent to the country teams and regional bureaux to allow discussion and checking of their factual accuracy.

Figure 2. Clusters of the evaluation enquiries

[The figure shows the three clusters of enquiry: the desk review and staff survey; headquarters interviews covering the Administrator/Associate Administrator and headquarters units (OSG, BOM, BDP, BCPR, Management Review Team and others); and the region/country case studies (a pilot region case study and four further case study regions), each covering the Assistant Administrator, Deputy Director, CPA and UNCT.]

22. The evaluation team will meet subsequent to the country visits to identify major findings prior to drafting of the main Evaluation Report. Preliminary findings and conclusions will be informally presented to UNDP management and also at the June 2007 Executive Board session. The Evaluation Office will then assume responsibility for ensuring that the draft main Evaluation Report is circulated among all key stakeholders (within UNDP, the Evaluation Office and the advisory group) to ensure that findings and recommendations are relevant, based on solid evidence, and within the mandate of UNDP. The team will respond to all comments presented by the reviewers and will provide a rationale for any comments with which it disagrees.

23. Final responsibility for the content of the Main Report shall lie with the UNDP Evaluation Office. The final Evaluation Report, along with the management's response to the conclusions and findings, will be presented at the January 2008 Executive Board session.

CONDUCT OF EVALUATION

24. The evaluation will be conducted in accordance with UN Evaluation Group Norms and Standards. These are available online at: http://www.uneval.org/docs/ACFFC9F.pdf.

EVALUATION TEAM

25. An international team of independent consultants, supported by national experts and research/technical assistance as needed, will undertake the evaluation. There will be four to five international team members with an array of experience linked to results-based management. A research assistant will be posted with the Evaluation Office for the preliminary desk review and to support the evaluation team. All team members will be selected and recruited by the Evaluation Office.


Annex 2

PEOPLE CONSULTED

NEW YORK

UNDP

Akopyan, David, Planning Specialist, Office of Planning & Budgeting, BOM
Al Soswa, Amat, Assistant Administrator & Regional Director, RBAS
Bazile-Finley, Jocelline, Director, Office of Planning & Budgeting, BOM
Biha-Keflemariam, Giovanie, Chief, Office of Planning & Budgeting, BOM
Bilgrami, Razina, Deputy Chief / Programme Adviser, South & West Asia Division, RBAP
Chungyalpa, Kunzang, Chief, Country Operations Division, RBAS
Clapp, David, Programme Adviser, RBAS
Cravero, Kathleen, Assistant Administrator & Director, BCPR
de Langen, Maikke, Business Analyst, BOM/CBS
Elizondo, Ligia, Director, OSG
Eriksson, Thomas, Office of Corporate Planning, BOM
Fartash, Parviz, Senior Programme Adviser & RBM Focal Point, RBEC
Fianu, Martin, Senior Adviser & Chief of Staff, RBA
Gatto, Susana, Coordinator, RBLAC
Gjuzi, Albana, Programme Manager, Western Balkans Cluster, RBEC
Gleeson, Brian, Former Director, Office of Human Resources, BOM
Grynspan, Rebeca, Assistant Administrator & Director, RBLAC
Gwaradzimba, Fadzai, Former Resident Representative, The Gambia, RBA
Gyles-McDonnough, Michelle, Programme Adviser, OSG
Hage, Juliette, Programme Adviser, RBAS
Jahan, Selim, Cluster Leader, Poverty Group, BDP
Jenks, Bruce, Assistant Administrator & Director, BRSP
Jones, Terrence, Director, Capacity Development Group, BDP
Kandasamy, Salleppan, Chief a.i., Office of Audit & Performance Review
Karim, Moin, Programme Adviser, RBAS
Karl, Judith, Chief, Central Strategy & Policy Cluster, BCPR
Karlsrud, John, Policy & Strategy Analyst, RBA
Khoury, Antoine, Chief, Internal Audit Section, Office of Audit & Performance Review
Komatsubara, Shigeki, Country Programme Adviser & Focal Point, Zambia, RBA
Lawry-White, Janey, Monitoring & Evaluation Specialist, BCPR
Loemban Tobing, Martin, Team Leader, Oversight of Resources & Performance Assessment, Office of Planning & Budgeting, BOM
Melkert, Ad, Associate Administrator, UNDP
Mishra, Tapan, Chief a.i., Learning Resource Centre
Mostashari, Ali, Operations Specialist, RBA
Murali, B., Programme Specialist, RBAS
Muttukumaru, Romesh, Deputy Director, BRSP
Navarro, Napoleon, Programme Specialist, RBAP
Norris, Pippa, Director, Democratic Governance Group, BDP
Ohiorhenuan, John, Deputy Director, BCPR
Oliveira, Marielza, Programme Manager, Oversight & Support, RBLAC
Pasha, Hafiz, Assistant Administrator & Regional Director, RBAP
Pronyk, Jason, Programme Specialist, OSG
Ramachandran, Selva, Chief, Regional Support Unit, RBAP
Reza, Rini, Programme Adviser, RBAP




Rodriques, Stephen, Programme Specialist, OSG
Rogivue, Jean-Claude, Chief, South & West Asia, RBAP
Ruedas, Marta, Deputy Assistant Administrator & Deputy Regional Director, RBEC
Russell, Andrew, Deputy Director, OSG
Saleh, Turhan, Country Director, Nigeria
Sassenrath, Yves, Operations Manager, BRSP
Solovieva, Alexandra, Programme Specialist, RBAP
Srivastava, Sudha, Chief, Programme & Operations Support Cluster, BCPR
Suokko, Maria, Programme Specialist, RBAP
Svendsen, Mads, Management Consulting Team
Tamesis, Pauline, Practice Manager, Democratic Governance Group, BDP
Van der Vaeren, Claire, Chief, Southeast Asia & the Pacific, RBAP
Vinton, Louisa, Senior Programme Manager & Team Leader, Western CIS & Caucasus, RBEC
Wandel, Jens, Director, BOM/CBS
Younus, Mohammed, Programme Adviser, RBAS
Yuge, Akiko, Assistant Administrator & Director, BOM

OTHER UN ORGANIZATIONS AND BILATERAL DONORS

Buchholz, Kai, Policy Specialist, UNDG
Erken, Arthur, Associate Director, UNDG
Fegan-Wyles, Sally, Director, UNDG
Flaman, Richard, Consultant to OSG
Hauge, Arild, Chief, M&E Inspection, OIOS, UN Secretariat
Huisman, Jan, Deputy Chief of Service, Finance Management & Support Services, UN Secretariat
Kadirgamar, Chandi, Evaluation Adviser, UNCDF
Keijzers, Henriette, Deputy Executive Secretary, UNCDF
Marcelino, Elena, Innovation and Learning Specialist, UNIFEM
McCouch, Robert, Programme Evaluation Officer, OIOS, UN Secretariat
Morgan, Richard, Deputy Director, Division of Policy & Planning, UNICEF
O'Brien, Brendan, Chief, Strategy and Policy Unit, UNFPA
Sandler, Joanne, Deputy Director, Programmes, UNIFEM
Usmani, Farah, Strategic Planning Adviser, Strategic Planning Office, UNFPA
von der Mosel, Katrin, Head, Evaluation Unit, UNV
Wang, Vivienne, Strategic Planning Office, UNFPA
Weiser, Janet, Chief, Evaluation and Media Research Office, UN Secretariat

ARGENTINA

UNDP

Aquilino, Natalia, Monitoring & Evaluation Officer
Barral, María Laura, Monitoring & Evaluation Consultant
Bertranou, Julián, Governance Cluster Coordinator
Bohorquez, Paola, Programme Assistant
Botino, Gabriel, Programme Assistant
Bril, Tomás, Programme Assistant
del Río, Cecilia, M&E Associate
Ferraris, Verónica, Consultant
García, Virginia, Communications Associate
Gatto, Susanna, Coordinator, RBLAC
Gentilini, Mariano, Operations Manager
Iglesias, Guillermo, Finance Assistant
Irizar, Manuel, Programme Assistant
Joensen, Vivian, Human Resources Associate
Kotzer, Daniel, Social Development Cluster Coordinator
Leivi, Milena, Programme Assistant
Lopez, Beatriz, Administration Assistant
Martinez, Carlos Felipe, Resident Representative/Resident Coordinator
Momeño, Ivan, Project Coordinator – Italian Cooperation
Montaña, Silvia, General Services Coordinator
Novak, Daniel, Productive Development Cluster Coordinator
Oleandro, Silvina, IT Coordinator



Oliviera, Marielza, Desk Officer, Argentina, RBLAC
Pallares, Ulises, Project Assistant - Italian Cooperation
Rucks, Silvia, Former DRR in UNDP Argentina Country Office
Tomasini, Daniel, Environment Cluster Coordinator
Vinocur, Pablo, Programme Coordinator
Wawrzyk, María Angelica, Finance Coordinator

DONORS

Caminal, Diego, Inter-American Development Bank
Longobardi, Felice, Director, Italian Cooperation
Piani, Valeria, Director Assistant, Italian Cooperation

GOVERNMENT

Ávalos, Pedro, Projects National Director, Ministry of Foreign Affairs
Franganillo, Luis, Advisor, Ministry of Foreign Affairs
Lacueva Barragán, Marcela, Project Coordinator - ARG/06/010 Bridging the Gap: Using Collaborative Planning to Strengthen Ties between Local Government and Civil Society in Argentina, Cabinet Chief
Nilus, Pamela, Project Coordinator - ARG/04/007 Citizenship Audit, Cabinet Chief
Novik, Marta, Under Secretary for Technical Programming Labor Studies, Ministry of Labor, Employment and Social Security
Ojea Quintana, Rodolfo, Under Secretary for International Coordination and Cooperation, Ministry of Foreign Affairs
Otero, Martín, Under Secretary Administration and Finance, Health Ministry, Buenos Aires City
Oyhanarte, Marta, Under Secretary for Institutional Reform and Democracy Strengthening, Cabinet Chief
Swarckberg, Frida, Project Coordinator - ARG/04/043 Employment Observatory, Ministry of Labor, Employment and Social Security

EGYPT

UNDP

Arafa, Naglaa, Programme Analyst
Asfour, Yasmine, Programme Assistant
Bayoumi, Mohamed, Environment Specialist (ARR)
El Ghoneimy, Maha, Programme Assistant
El Hazek, Dalia, Programme Assistant
El Sharkawi, Amin, ARR
Ezzeldin, Karin, Communications Associate
Farid, Nada, Programme Assistant
Galbiati, Simona, Programme Officer
Guirguis, Samia, Environment Programme Manager (retired)
Hedeya, Rania, Programme Analyst
Mahmoud, Yasmine, UNV Assistant
Mobarek, Hamed, Policy Adviser
Mokbel, Hala, Finance Associate
Niazi, Shahdan, Programme Assistant
Nofal, Mayar, Programme Assistant
Rawley, James, Resident Coordinator & Resident Representative
Rifaat, Noha, Results-based Management Officer
Sabri, Sarah, Programme Assistant
Vigilante, Antonio, Previous Resident Coordinator
Waly, Ghada, ARR/Specialist
Weng, Zenab, Programme Associate/ATLAS FP
Zeitoun, Nahla, Research & Policy Associate

OTHER

Abdelhamid, Doha, Country Representative, IDEAS
Abdel-Malek, Talaat, Executive Director, PEMA
Al Jawaldeh, Ayoub, Deputy Country Director, WFP
Ali Shahein, Noha, International Cooperation Specialist, IDSC
Armanious, Usama Tharwat, Second Secretary, Ministry of Foreign Affairs
Atif, Gawaher, Senior Adviser, UNSIC
Bonilla, Javier Menendez, First Secretary Social Affairs, EU



El Gammal, Yasser, Human Development Coordinator, World Bank
El Hamaky, Yomna M. H., Chairman of Economics Department, Ain Shams University
El Hilaly, Hanaa, General Manager, SFD
El Shawarby, Sherine, Macro-economic Policies and Poverty, World Bank
Eladawy, Maha, New Social Contract Project, IDSC
Elmoselhy, Ali, Minister, Ministry of Social Solidarity
Gaballah, Noha, Unit Manager, MCIT
Genena, Tarek, Environment Consultant
Gomaa, Salwa, Consultant
Guindi, Manal, Development Officer, CIDA
Hassan, Farkhonda, Secretary General, National Council for Women
Helmy, Omneya, Director General, National Council for Women
Manoncourt, Erma, Representative, UNICEF
Mansi, Walid, Manager M&E, SFD
Morsy, Mayar, Programme Coordinator, UNIFEM
Nassar, Heba, Vice Dean Environment & Community Affairs, Cairo University
Osman, Magued, Chairman, IDSC
Parajuli, Bishow, Representative & Country Director, WFP
Sobhy, Hoda M., Project Coordinator, National Council for Women
Spada, Marco, Director, Italian Embassy
Steit, Samiha Abou, Adviser to the Secretary General, National Council for Women
Vaughan-Games, Darcy, Consultant
Youssef, Amany, Acting Deputy General Manager, SFD
Zaineldine, Ayman A., Deputy Assistant Minister for International Cooperation, Ministry of Foreign Affairs

INDONESIA

UNDP

Aberg, Caroline, Programme Officer, Governance Unit
Afwan, Ismah, Programme Officer, CPRU
Bjorkman, Hakan, Country Director
Handoko, Assistant Resident Representative, Operations Unit
Lacsana, Yanti T., Programme Manager, MDG Support Unit
Lazarus, Dennis, Deputy Resident Representative, Operations Unit
Manurung, Martha, Resource Management Analyst (Resource Management Unit), Operations Unit
Pratama, Ari Yahya, Programme Officer, MDG Support Unit
Programme/Project Managers/Officers on Results Culture in UNDP Staff Workshop
Purba, Sirman, Programme Monitoring and Evaluation Officer, PMEU
Rahmatsyah, Teuku, Programme Planning and Coordination Officer, PMEU
Raju, Krishnaveny, Human Resources Specialist, Operations Unit
Rianom, Ariyanti, Planning, Monitoring and Evaluation Analyst/Learning Manager, PMEU
Sayoko, Priyo Budhi, Assistant Resident Representative, Unit Manager, Environment Unit
Sinandang, Kristanto, Senior Programme Officer, CPRU
Song, Gi Soon, Programme Capacity Building & Management Manager, Governance Unit
Syebubakar, Abdurrahman, Team Leader, MDGs & Human Development, MDG Support Unit
Widagdo, Nurina, Assistant Resident Representative, Head of Unit, Governance Unit
Widjaya, Irene, Knowledge Management Associate, PMEU

GOVERNMENT

Al-Farisi, Salman, Director, Directorate for Development, Economic and Environment Affairs, Ministry for Foreign Affairs
Arrya Tirto Sumarto, Staff of the Sub-Division for UNDP, Multilateral Technical Cooperation, Bureau for International Technical Cooperation, State Secretariat
Kuswoyo, Ade, Staff, Bureau for Multilateral Economic Cooperation, National Development Planning Board (BAPPENAS)
Pangaribuan, Yan Piter, Deputy Director, Bureau for Multilateral Economic Cooperation, National Development Planning Board (BAPPENAS)
Rianto, Dauna, Head of the Sub-Division for UNDP, Bureau for International Technical Cooperation, State Secretariat
Rukma, Arwandriya, Head of the Division for Multilateral Technical Cooperation, Bureau for International Technical Cooperation, State Secretariat
Simatupang, Delthy, Director, Bureau for Multilateral Economic Cooperation, National Development Planning Board (BAPPENAS)
Sumirat, Agung Cahaya, Directorate of UN Economic Development and Environment Affairs, Ministry for Foreign Affairs

OTHER

Alie, Nahruddin, Programme Officer, UNIDO
Fengler, Wolfgang, Senior Economist, World Bank
Frielink, Barend, Principal Programmes Coordination Specialist, ADB
Gunawan, Ade, Programme Officer, Association for Community Empowerment (ACE)
Hartini, Titik, Executive Director, Association for Community Empowerment (ACE)
Lun, Borithy, Operations and Finance Advisor, Partnership for Governance Reform, Indonesia (PGRI)
Makalew, Richard J., PhD, National Programme Officer, Population and Development Strategies, UNFPA
Miftah, Akhmad, Programme Officer, Association for Community Empowerment (ACE)
Mishra, Satish, Managing Director, Strategic Asia
Moretto, Sakura, Project Officer, Economic Regional Cooperation/Good Governance, EU
Nakamura, Toshihiro, Assistant Resident Representative, UNDP, Sierra Leone (Head of the Planning, Monitoring and Evaluation Unit, UNDP, Indonesia, January 2005 to February 2007)
Santoso Ismail, Martha, Assistant Representative, UNFPA
Sobary, Mohamad, Executive Director, Partnership for Governance Reform, Indonesia (PGRI)
Soeprijadi, Piet, Deputy Executive Director, Partnership for Governance Reform, Indonesia (PGRI)
Suhardi, Edi, Public Relations and Resource Mobilization Senior Manager, Partnership for Governance Reform, Indonesia (PGRI)

MOLDOVA

UNDP

Artaza, Ignacio, Deputy Resident Representative, a.i. Resident Representative
Barcari, Liudmila, UN Coordination Officer
Chirica, Nadejda, Programme Budget Associate
Dumitrasco, Angela, Portfolio Manager
Filatov, Vasile, Programme Officer—Governance Team Leader
Guzun, Ion, Project Assistant
Niculita, Aliona, Programme Officer—Local and Governance
Osovschi, Margareta, Operations Manager
Peleah, Mihail, Portfolio Manager
Plesinger, Jan, Portfolio Manager
Profir, Tatiana, Human Resources
Sandu, Maya, Programme Coordinator
Vremis, Vitalie, Portfolio Manager

OTHER

Albu, Viorel, Project Manager, Better Opportunities for Youths and Women Project
Badrajan, Valentina, Chief, Aid Coordination Section, Office of the 1st Deputy Prime Minister
Botan, Igor, Director, ADEPT
Burkly, Michael, Supervisory General Development Officer, USAID
Catrinescu, Natalia, Head of General Department for Macroeconomic Policies and Development Programmes, MET
Cazacu, Diana, Project Management Assistant, USAID



Ciobanu, Svetlana, Director, Regional Sustainable Development Centre, Ungheni
Cosuleanu, Ion, Project Manager, E-governance Project
Cosuleanu, Valeriu, Junior Professional Associate, World Bank
Dalman, Ada, Social Assistant, Social Regeneration Centre, Ungheni
Enciu, Constantin, National Consultant, Human Rights Outcome Evaluation
Girbu, Viorel, Principal Consultant, Aid Coordination Section, Office of the 1st Deputy Prime Minister
Gutu, Oxana, Project Manager, European Commission
Laur, Elena, Project Officer, M&E, UNICEF
Martinenco, Lucia, Project Manager, JPPM Project
Martiniuc, Corneliu, Project Officer, HIV in the Uniformed Services Project
Melnic, Vlad, Project Manager, Civil Society Strengthening Project
Munteanu, Victor, Programme Director, Law, Soros Foundation
Oancea, Vasile, Administrator, Social Regeneration Centre, Ungheni
Petrusevschi, Margareta, Consultant
Pouezat, Bruno, Ex-Resident Representative/Resident Coordinator Moldova (2002-2007), UNDP, Azerbaijan
Racu, Ana, Project Manager, Support to Implementation of the National Human Rights Action Plan Project
Salah, Mohamed, Programme Officer, UNICEF
Secareanu, Stefan, Chair of Parliamentary Committee on Human Rights, Parliament of Moldova
Skvortova, Alla, Head of Section, DFID
Sterbet, Valeria, Judge, Constitutional Court
Tatarov, Sergiu, Head of Experts Group, Public Administration Reform Unit, Office of the 1st Deputy Prime Minister
Tcaciuc, Tamara, Social Assistant, Social Regeneration Centre, Ungheni
Torres, Ray, Representative, UNICEF
Ursu, Mircea, Programme Coordinator, Integrated Local Development Programme
Vasiloi, Rosian, Chief, General Director's Office, Border Guards Service

ZAMBIA

UNDP

Blaser, Jeremias, Assistant Resident Representative, Governance

Brainerd, Gbemi, Project Support Officer (General Services)

Chansa, Barbara, Programme Associate

Chibozu, Dellah, Maintenance Clerk

Chirwa, Eldah, National Economist

Chuma, Aeneas C., UN Resident Coordinator

Hannan, Abdul, Deputy Resident Representative (Programmes)

Jallow, Abdoulie Sireh, Economic Advisor

Kaira, Michael B., Finance Officer

Kasonso, Margaret, Personal Assistant

Kumwenda, Dr. Rosemary, HIV and AIDS Advisor

Muchanga, Amos, Programme Analyst - Environment

Mukonde, Morgan, Head Administration

Mulenga, Leah, Programme Associate

Musenge, Oswald, Registry Clerk

Mwansa, Violet, Switch Board Operator (Receptionist)

Mwanza, Mbiko, Finance Clerk

Mwiya, Nancy, Finance Clerk

Nasilele-Saasa, Hazel W., Programme Associate

Ngómbe, Assan, Programme Analyst

Nyirenda, Enelesi, Finance Clerk

Phiri, Arthur, Programme Analyst

Sibandi, Dorica, Human Resources Associate

Sinyama, Laura, Programme Associate

Siyumbwa, Fanny, Common Services Coordinator

Sletzion, Bereket, Deputy Resident Representative (Operations)

Soko, Mike, Senior Governance Advisor

Songiso, Sonny N., Finance Clerk

Thornicroft, Hazel, Finance Assistant

Yerokun, Dellia M., Programme Analyst

Yoyo Sichinga, Mashike, Administrative Assistant (Operations & Travel)


OTHER UN AGENCIES AND BILATERAL DONORS

Bhebhe, T., African Development Bank

Eiji, Inui, JICA

Finnegan, Gerry, ILO Representative, ILO

Lundstol, Olav, First Secretary – Country Economist, Royal Norwegian Embassy

McDowal, Bruce Lawson, Deputy Head of Programmes, DFID

Norrby, Charlotta, Swedish Embassy

Valerio Parmigian, First Secretary, Italian Embassy

Sievers, Peter, Counsellor Development, Royal Danish Embassy

Sozi, Dr. Catherine, Country Coordinator, UNAIDS

Sylwander, Lotta R., UNICEF Representative, UNICEF

Vanden Dool, Robert, Royal Netherlands Embassy

GOVERNMENT AND CIVIL SOCIETY INSTITUTIONS

Chikwenya, Nicholas, Assistant Director Planning, Ministry of Health

Chirwa Mambilima, Justice Irene, Chairperson, Electoral Commission of Zambia

Chiwangu, Dr. Paulina, Monitoring and Coordination Advisor (UNV), National AIDS Council

Cholwe, Patrick, Assistant Director, Ministry of Finance & National Planning

Imbwae, Gertrude, Permanent Secretary, Ministry of Justice

Isaac, Pricilla M., Deputy Director (E & VE), Electoral Commission of Zambia

Kagulula, Solomon, Principal Planner, Ministry of Health

Mateo, Tommy, Programme Officer (Research & Policy), Civil Society for Poverty Reduction

Mpepo, Besinati P., Executive Director, Civil Society for Poverty Reduction

Mulapesi, Grace M., Commissioner, Electoral Commission of Zambia

Mulenga, Oswald, Director Monitoring & Evaluation, National AIDS Council

Mwansa, John, Acting Director, Ministry of Finance & National Planning

Mwenda, Josephine, Principal Planning Officer, Ministry of Finance & National Planning

Mwinga, Doris Katai K., Clerk of the National Assembly, National Assembly


Annex 3

EVALUATION TEAM AND ADVISORY PANEL

TEAM LEADER

Derek Poate, ITAD

INTERNATIONAL CONSULTANTS

Paul Balogun (UK)
Munhamo Chisvo (Zimbabwe)
Robert Lahey (Canada)
John Mayne (Canada)

NATIONAL CONSULTANTS

Arcadie Barbarosie (Moldova)
Seheir Kansouh-Habib (Egypt)
Farsidah Lubis (Indonesia)
Kenneth Mwansa (Zambia)
Oscar Yujnovsky (Argentina)

EVALUATION OFFICE TASK MANAGER

S. Nanthikesan

ADVISORY PANEL

Sulley Gariba, Former President, International Development Evaluation Association; Head, Institute for Policy Alternatives, Ghana

Peter Van der Knaap, Director of Policy Evaluation, Netherlands Court of Audit

Jehan Raheem, Professor, Brandeis University, Massachusetts, United States; former Founding Director, Evaluation Office, UNDP

Odetta R. Ramsingh, Director-General, Public Service Commission, South Africa

RESEARCH SUPPORT

Elizabeth K. Lang
Nayma Qayum



Annex 4

REFERENCES

Afaf Abu-Hasabo et al., ‘Evaluation of Egypt Country Programme (2002-2006)’, UNDP Egypt, March 2007.

Aucoin P, Heintzman R, ‘The Dialectics of Accountability for Performance in Public Sector Management Reform’, in Governance in the Twenty-first Century: Revitalizing the Public Service, B. G. Peters and D. J. Savoie, Eds., McGill-Queen’s University Press, 2000.

Auditor General of Canada, ‘Managing Departments for Results and Managing Horizontal Issues for Results’, Report of the Auditor General of Canada to the House of Commons, Ottawa, Ontario, December 2000.

Auditor General of Canada, ‘Modernizing Accountability in the Public Sector’, Report of the Auditor General of Canada to the House of Commons, Chapter 9, Ottawa, Ontario, 2002. Available online at http://www.oag-bvg.gc.ca/domino/reports.nsf/html/20021209ce.html/$file/20021209ce.pdf.

Auditor General of Canada, ‘Moving Towards Managing for Results: Report of the Auditor General of Canada to the House of Commons’, Ottawa, Canada, 1997, Chapter 11. Available online at: http://www.oag-bvg.gc.ca/domino/reports.nsf/html/ch9711e.html.

Baehler K, ‘Managing for Outcomes: Accountability and Thrust’, Australian Journal of Public Administration, 2003, 62(4): 23-34.

Barrados M, Mayne J, ‘Can Public Sector Organizations Learn?’, OECD Journal on Budgeting, 2003, 3(3): 87-103.

Behn R, ‘Rethinking Democratic Accountability’, Brookings Institution, 2000.

Behn R, ‘The Psychological Barriers to Performance Management, or Why Isn’t Everyone Jumping on the Performance-management Bandwagon’, Public Performance & Management Review, 2002, 26(1): 5-25.

Binnendijk A, ‘Results-Based Management in the Development Cooperation Agencies: A Review of Experience’, Background Report, DAC OECD Working Party on Aid Evaluation, Paris, France, 2001. Available online at: http://www.oecd.org/dataoecd/17/1/1886527.pdf.

Botcheva L, White CR, Huffman LC, ‘Learning Culture and Outcomes Measurement Practices in Community Agencies’, American Journal of Evaluation, 2002, 23(4): 421-434.

Bouckaert G, Peters BG, ‘Performance Measurement and Management: The Achilles’ Heel in Administrative Modernization’, Public Performance and Management Review, 2002, 25(4): 359-362.

Burgess K, Burton C, Parston G, ‘Accountability for Results’, Public Services Productivity Panel, London, England, 2002. Retrieved 20 April 2004 at http://www.hm-treasury.gov.uk/Documents/Public_Spending_and_Services/Public_Services_Productivity_Panel/pss_psp_accproj.cfm.

Clark ID, Swain H, ‘Distinguishing the Real from the Surreal in Management Reform’, Canadian Public Administration, 2005, 48(4).

Cousins B, Goh S, Clark S, Lee L, ‘Integrating Evaluative Inquiry into the Organizational Culture: A Review and Synthesis of the Knowledge Base’, Canadian Journal of Program Evaluation, 2004, 19(2): 99-141.

Dalberg Global Development Advisors, ‘Assessing Results Management at UNDP’, commissioned by the Danish Ministry of Foreign Affairs, New York, NY, 2006.

Danish Ministry of Foreign Affairs, ‘Assessing Results Management at UNDP’, 15 June 2006.

David T, ‘Becoming a Learning Organization’, Marguerite Casey Foundation, 2002. Available online at http://www.tdavid.net/pdf/becoming_learning.pdf.



Dubnick MJ, ‘Accountability Matters’, Shani Conference, University of Haifa, Israel, 2004. Available online at http://www.andromeda.rutgers.edu/~dubnick/papers/haifashani.pdf.

Flint M, ‘Easier Said Than Done: A Review of Results-Based Management in Multilateral Development Institutions’, UK Department for International Development (DFID), London, United Kingdom, 2002.

Fontaine Ortiz E, Tang G, ‘Results-based Management in the United Nations in the Context of the Reform Process’, Joint Inspection Unit, United Nations, Geneva, Switzerland, JIU/REP/2006/6.

Fontaine Ortiz E, Kuyama S, Münch W, and Tang G, ‘Managing for Results in the UN System Part 1: Implementation of Results-based Management in the United Nations Organizations’, Joint Inspection Unit, United Nations, Geneva, Switzerland, 2004, para 42-46.

‘Frequently Asked Questions about the Strategic Plan’, internal guidance on UNDP intranet site concerned with development of the new Strategic Plan.

General Accounting Office, ‘Results-Oriented Cultures: Insights for U.S. Agencies from Other Countries’ Performance Management Initiatives’, US General Accounting Office, Washington, DC, 2002. Available online at http://www.gao.gov/new.items/d02862.pdf.

General Accounting Office, ‘An Evaluation Culture and Collaborative Partnerships Help Build Agency Capacity. Program Evaluation’, Washington, DC, 2003. Available online at http://www.gao.gov/new.items/d03454.pdf.

Goh S, ‘The Learning Organization: An Empirical Test of a Normative Perspective’, International Journal of Organizational Theory and Behaviour, 2001, 4(3&4): 329-355.

Greiling D, ‘Performance Measurement: A Remedy for Increasing the Efficiency of Public Services?’, International Journal of Productivity and Performance Management, 2006, 55(6): 448-465.

Hatry H, ‘We Need a New Concept of Accountability’, The Public Manager, 1997, 26: 37-38.

Hernandez G, Visher M, ‘Creating a Culture of Inquiry: Changing Methods–and Minds–on the Use of Evaluation in Nonprofit Organizations’, The James Irvine Foundation, 2001.

Hood C, ‘Gaming in Targetworld: The Targets Approach to Managing British Public Services’, Public Administration Review, 2006, 66(4): 515-521.

Hood C, ‘Public Service Management by Numbers: Why Does it Vary? Where Has it Come From? What Are the Gaps and the Puzzles?’, Public Money and Management, 2007, 27(2): 95-102.

Longhurst R, ‘Review of the Role and Quality of the United Nations Development Assistance Frameworks (UNDAFs)’, Overseas Development Institute, London, England, May 2006, p 26.

Mayne J, ‘Challenges and Lessons in Implementing Results-Based Management’, Evaluation, 2007, Volume 13, Issue 1, p 89-107.

Mayne J, ‘Evaluation for Accountability: Reality or Myth?’, in Making Accountability Work: Dilemmas for Evaluation and for Audit, M-L Bemelmans-Videc, J Lonsdale, B Perrin, Eds., Transaction Publishers, New Brunswick, NJ, 2007.

Michael D, ‘Governing by Learning: Boundaries, Myths and Metaphors’, Futures, 1993, January/February: 81-89.

Moynihan DP, ‘Goal-Based Learning and the Future of Performance Management’, Public Administration Review, 2005, 65(2): 203-216.

MSI, ‘Background Paper on Results-Based Management in Development Organizations’, Management Systems International, July 2006.

Norman R, ‘Managing Through Measurement or Meaning? Lessons from Experience with New Zealand’s Public Sector Performance Management Systems’, International Review of Administrative Sciences, 2002.

Office of Audit and Performance Review, ‘Full Scope Audit Report: UNDP Country Office in Zambia’, 2006, p 30.


OECD-DAC, ‘Managing for Development Results, Principles in Action: Sourcebook on Emerging Good Practices (Final - March 2006)’, OECD-DAC, Paris, France, 2006. Available online at http://www.mfdr.org/Sourcebook/1stSourcebook.html.

OECD, ‘Results-Based Management in Development Co-operation Agencies: A Review of Experience’, DAC Working Party on Aid Effectiveness, 2001.

Pal LA, Teplova T, ‘Rubik’s Cube? Aligning Organizational Culture, Performance Measurement, and Horizontal Management’, Carleton University, Ottawa, Ontario, 2003. Available online at http://www.ppx.ca/Research/PPX-Research%20-%20Pal-Teplova%2005-15-03[1].pdf.

Perrin B, ‘Effective Use and Misuse of Performance Measurement’, American Journal of Evaluation, 1998, 19(3): 367-379.

‘Progress Report on the UNDP-UNCDF Strategic Partnership’, DP/2007/34.

Radin BA, ‘Challenging the Performance Movement: Accountability, Complexity and Democratic Values’, Georgetown University Press, Washington, DC, 2006.

Smutylo T, ‘Building an Evaluative Culture’, International Programme for Development Evaluation Training, World Bank and Carleton University, Ottawa, Ontario, 2005.

Thomas P, ‘Steering by Remote Control: Planning, Performance Measurement, Accountability and Learning’, FMI Journal, 2003, 14:3 (Summer): 31-34.

Thomas P, ‘Performance Measurement and Management in the Public Sector’, Optimum, 2005, 35(2): 16-26. Available online at http://www.optimumonline.ca/print.phtml?id=225.

United Nations, ‘Implementation of Results-Based Management in the UN Organizations’, Joint Inspection Unit, 2004.

UNCDF, ‘Results-oriented Annual Report of the United Nations Capital Development Fund’, 2007.

UNCDF, ‘Moving to Sector-wide Approach for Decentralization and Local Development and Inclusive Finance—A Strategic Shift for UNDP and UNCDF in a Direct Budget Support Environment’, 2007.

UNDP, ‘Action Brief for the Operations Group: Key Principles and Approach to the Roll Out of the Accountability Framework in UNDP’, Bureau of Management, 2007.

UNDP Administrator’s Office, ‘Management and Workflow Review—Phase I’, New York, NY, January 2006.

UNDP, ‘Alignment Analysis OSG/ExO (AB)’, New York, NY, 30 Oct 2006.

UNDP, ‘An Assessment of UNDP Practices in Monitoring and Evaluation at the Country Level’, UNDP Evaluation Office, New York, NY, February 2005, para 7.4 and 7.5.

UNDP, ‘Annual Report of the Administrator (1997)’, DP/1997/16/Add.7, New York, NY.

UNDP, ‘Annual Report of the Administrator on Evaluation in 2005’, DP/2006/27, New York, NY.

UNDP, ‘Assessment of Programming Arrangements, 2004-2007’, paper presented to the Executive Board, DP/2007/8. Available online at: http://www.undp.org/execbrd/word/dp07-8.doc.

UNDP, ‘Change Management: UNDP 2001’, 1997.

UNDP, ‘Development Effectiveness Report’, New York, NY, 2000, pp 23-24.

UNDP, ‘Evaluation of Gender Mainstreaming’, UNDP Evaluation Office, New York, NY, 2006, p vi.

UNDP, ‘Guidelines for Outcome Evaluators’, Monitoring and Evaluation Companion Series, #1, UNDP Evaluation Office, New York, NY, 2002.

UNDP, ‘Handbook on Monitoring and Evaluating for Results’, 2000, p 11.

UNDP, ‘RBM in UNDP: Overview and Principles’, New York, NY, 2000.

UNDP, ‘The Multi Year Funding Framework Report by the Administrator’, UNDP Executive Board, DP/1999/CRP 4, New York, NY, January Session 1999.

UNDP, ‘Proposed Partnership Framework for UNDP and UNCDF’, DP/2007/11, New York, NY, 2007.

UNDP, ‘RBM: Concepts and Methodology’, New York, NY, 2000.

UNDP, ‘RBM in UNDP: Overview and General Principles’, 2000.


UNDP, ‘RBM in UNDP: Technical Notes’, New York, NY, 2000.

UNDP, ‘UNDP Management Review Phase II—Report of the Management Review Team’, New York, NY, 2007, para 73.

UNDP, ‘The Way Forward: The Administrator’s Business Plans, 2000-2003’, 1999, para 28.

UNFPA, ‘UNFPA Draft Strategic Plan 2008-2011. Accelerating Progress and National Ownership of the ICPD Program of Action’, Executive Board Informal Meeting, 16 May 2007.

UNIFEM, ‘How Are We Doing? Tracking UNIFEM Progress in Achieving Results for Management and Learning’, A Briefing Note, 2002.

UNIFEM, ‘Multi-year Funding Framework 2004-07’, DP/2004/5.

White H, ‘The Road to Nowhere? Results Based Management in International Cooperation’, Euforic. Available online at http://www.euforic.org/detail_page.phtml?lang=en&page=resource_briefing_evaluat_edit3.

World Bank Roundtable, ‘Moving from Outputs to Outcomes: Practical Advice from Governments Around the World’, prepared by B. Perrin for the World Bank and the IBM Centre for The Business of Government, Managing for Performance and Results Series, Washington, DC, 2006. Available online at: http://www.worldbank.org/oed/outcomesroundtable.

Wye C, ‘Performance Management for Career Executives: A “Start Where You Are, Use What You Have” Guide’, in Managing for Results 2005, JM Kamensky and A Morales, Eds., Rowman and Littlefield, Oxford, England, 2005.

Yin RK, ‘Case Study Research: Design and Methods’, 3rd Edition, Applied Social Research Methods Series, Volume 5, Sage Publications, 2003.


Annex 5

TABLES

Table 1. Timeline of key events in UNDP’s adoption of results-based management
Table 2. Results reporting and data capture systems in UNDP (2007)
Table 3. Comparison of areas of work across strategic documents (2000-2007)
Table 4. Evidence of outcome monitoring in five countries
Table 5. Example of outcome reporting from the Indonesia Resident Coordinator Annual Report for 2006

Table 1. Timeline of key events in UNDP’s adoption of results-based management

1997: A comprehensive change management process, ‘UNDP 2001’, was launched in May 1996. The focus was on overhauling the way UNDP does business, rather than revisiting the mission and mandate, which were covered under Executive Board Decision 94/14. Recommendations made to the Executive Board in DP/1997/16/Add.7 aimed to achieve a client-focused, speedy, learning and growing organization. The following principles specific to results-based management in the action plan were included in this document:
- Shift to an ex-post accountability framework, in which country programmes manage programmes, finances, administration and personnel.
- CCFs should reflect country-owned strategies within the UNDP programming framework, with measurable targets for both impact and results.
- Enhance programme focus at the country level by establishing broad corporate strategic objectives.
- Develop overall corporate performance indicators as a basis for establishing objective management performance criteria.
- Clarify broad strategic areas in which UNDP would mainly work.
- Evaluation Office should complete development and testing of programme performance indicators for use as a basis for better monitoring, feedback and organizational learning.
- Complete revision and simplification of programming management procedures.
- Build on recent pilots; decide at the senior level to move ahead with a system for performance management and measurement in programming, finance and administration. (Note: this was the work that led to the Balanced Scorecard.)
- Identify and assign clear responsibility for administering and coordinating the performance management and measurement system. The Office of Audit and Performance Review and the Evaluation Office should take responsibility for development of performance indicators at corporate and programme levels.
- Focus the functions of the regional bureaux on the holistic oversight of country office performance, a function at present not covered by any unit, and on management support and quality control.
- Regional Bureau Directors should report to the Administrator through the Associate Administrator. This includes assembling and maintaining a complete, up-to-date overview of UNDP operations, providing corporate management with consolidated monitoring reports, and monitoring country office compliance with defined indicators of management performance, including delivery, resource mobilization and the Resident Coordinator function.
- Establish the Executive Committee as a collegial body to take management decisions on strategic direction, corporate planning and policy definition.


1998: Use of strategic frameworks piloted in selected UNDP country programmes.

1999: SRFs produced across all UNDP country offices. The principle of managing for outcomes rather than project outputs established, although the key initial change identified is the move from managing inputs to managing for project outputs.

2000:
- The Way Forward: The Administrator’s Business Plans 2000-2003 (DP/2000/8) presented to the Executive Board. Outlines how UNDP will be transformed into a more results-orientated organization.
- The first corporate MYFF 2000-2003 is introduced. The MYFF comprised two basic elements: a corporate-level SRF that aggregated information across the SRFs produced by the country offices and an integrated resource framework for the organization.
- First ROAR produced by country offices based on the 1999 SRFs. This was reporting on performance in the year before the 1st MYFF became operational.
- Balanced Scorecard introduced across all UNDP country offices in December, focused on measuring how successful offices are at introducing new ways of doing business described in The Administrator’s Business Plan 2000-2003. Balanced Scorecard focused around four perspectives: client satisfaction, internal efficiency, learning & growth and financial resources.

2001: The SRF is internalized into the country programme outline, so with each new country programme formulation, UNDP will establish intended outcomes and outputs in dialogue with the government and other national stakeholders.

2002:
- Most mandatory requirements for project-level M&E are abolished in UNDP and a shift to monitoring at the level of the outcome promoted. Guidance issued at corporate level in the ‘Handbook on Monitoring and Evaluating for Results’ that outlined principles that should be used and how they should be applied.
- Introduction of the RCA in its present approach, with explicit focus on assessment of performance against results.
- UNDP experience with aligning and assessing country programmes with the SRF leads to identification of six core practice areas: democratic governance, poverty reduction, crisis prevention and recovery, energy and environment, information and communications technology, and HIV/AIDS. Organization-wide knowledge management and technical expertise to support the work of the country offices is reorganized around these six practice areas.

2003: Publication of assessment of the 1st MYFF. Key issues and results flagged include:
- Reduction in the number of outcomes from an average of 14 per country programme in 2000 to less than 10 in 2002 is an indicator of increased focus.
- Decline in the percentage of outcomes drawn from UNDP sample outcomes and increase in outcomes agreed between the country office and partner government from 2000 onwards.
- Problems in the quality of formulation of outcomes and outputs in country SRFs.
- Updates on progress tended to be limited to non-key achievements and linked more to outcome indicators than outcomes.


2004:
- Second corporate-level MYFF (2004-2007) becomes active. Concept of service lines (30) introduced, which would act as a soft boundary rule on what UNDP should support at country level. Concept of core results (90) introduced, partially to address problems encountered by country offices when identifying outcomes in their SRFs.
- Requirement for setting of annual outcome-level targets introduced. Also expected that country offices would establish baselines at outcome level and also estimated budget by outcome.
- ATLAS is introduced across all UNDP country offices (January). ATLAS is a computerized ERP system that aims to allow integration of all project-level information in a single system, i.e. budgeting, financial transactions and potentially performance (M&E) information. ATLAS also allows aggregation of all relevant project data by outcome.

2005: ATLAS used as a basis for reporting against the new MYFF, based on self-assessment of progress against agreed annual outcome-level targets.

2006:
- Balanced Scorecard revised to include a 5th perspective tracking results performance that draws data from the ROAR reports. This represents a move to link MYFF and Balanced Scorecard results reporting.
- New guidance on Results Management issued at the corporate level that replaces the UNDP programming manual and Handbook on Monitoring and Evaluating for Results. Aspiration is that practice at country office level will reflect the new guidance by end 2007.

2007: Development of the ‘Strategic Plan’ for 2008-2011, which will succeed the MYFF 2004-2007.


Table 2. Results reporting and data capture systems in UNDP (2007)

Corporate Level

MYFF (Planning). Introduced: 2000. Use: set goals and parameters for focus. Updated: every 4 years. Data capture: ROARs.

MYFF Report (Reporting). Introduced: 2000. Use: report on performance to the Executive Board. Updated: annual. Data capture: ROARs.

UN Country Team

CCA (Planning). Introduced: 2000. Use: the analytical foundation of the UNDAF. Updated: in advance of the UNDAF. Data capture: the CCA provides the analytical foundation for the UNDAF; draws on PRSP monitoring and NHDR, for example.

UNDAF (Planning). Introduced: piloted in 1997/1998, rolled out in 2000. Use: the programming instrument by which all UN organizations work with governments to define a mutually agreed upon assistance framework. Updated: beginning of programming cycle. Data capture: CCA.

Country Programme Level

CPD (Planning). Introduced: 2003. Use: the programming instrument in which a government and UNDP set out their planned cooperation over a multi-year period. Responds to both the UNDP MYFF and country UNDAF. Approved by the Executive Board. Includes SRF and intended outcomes. Updated: beginning of programming cycle. Data capture: draws on CCA and UNDAF processes. The CPD replaced the CCF in 2003.

CPAP (Planning). Introduced: 2006. Use: formal agreement between UNDP and the government to execute the country programme. The CPAP further refines the programme design and capacity development strategies outlined more strategically in the CPD. It details the programme, the major results expected and the strategies for achieving these results, and clarifies the arrangements for programme/project implementation and management. Updated: beginning of programming cycle. Data capture: based on the CPD. Introduced in 2003, but only mandatory since 2006.

ROAR (Reporting). Introduced: 2000. Use: annual assessment of performance against intended outcomes. ROARs also report on other aspects of performance, as defined in annual guidance issued. During MYFF 2004-2007 the ROAR also reported against the emphasis given to each of six ‘drivers of development effectiveness’ while pursuing results, with further information in the ‘strategic outlook’ section. Updated: annual. Data capture: no defined process before 2003. From 2003, M&E reformed to reflect outcome orientation, so the ROAR should be based on evidence from annual project reviews, outcome monitoring and project and outcome evaluations. Synthesized during the annual programme review (called the annual CPAP review once the CPAP was introduced).

Country Office Balanced Scorecard (Reporting). Introduced: 2000. Use: targets set annually (mostly by headquarters) to monitor management performance at the country office level. Updated: annual. Data capture: draws on staff and partner surveys and data from financial systems (from ATLAS since 2004). Since 2006, the 5th perspective draws on ROAR performance data.


Table 2 (continued)

Outcome Level

ATLAS Project Tree (Planning). Introduced: 2004. Use: maps projects and finance against individual outcomes. Updated: as needed. Data capture: ATLAS.

ROAR (Reporting). Introduced: 2000. Use: annual targets set. Updated: annual. Data capture: annual review; setting of annual target against outcome introduced in 2002.

Project Level

Project document (Planning). Use: the project document serves two essential and related purposes: it specifies the goals and expected results of UNDP intervention, and it is the vehicle through which UNDP provides financial and technical support to achieve these results. Updated: start of project.

Project Annual Work Plan (AWP) (Planning). Introduced: 2002. Use: a detailed report on planned activities and results of activities, prepared jointly and signed by the country office and implementing partner; focus on outputs with narrative on progress towards achieving CP outputs. An ‘umbrella AWP’ identifies all projects according to specific partner. Updated: annual.

Annual Project Report (APR) (Reporting). Introduced: 2002. Use: product of an annual review involving all key project stakeholders and the implementing partner, focused on the extent to which progress is being made towards outputs and on whether these remain aligned to appropriate outcomes. All evaluations of the project or related outcomes should be considered, along with their respective management responses and action plans. Updated: annual.


Table 3. Comparison of areas of work across strategic documents (2000-2007)

MYFF 2004-2007 service line 1.1, MDG reporting
- MYFF 2000-2003 strategic areas of support: I.1.1. Promote public awareness and policy dialogue on sustainable human development issues (e.g., through human development reports, regional human development reports, NHDRs and national long-term perspective studies); I.3.1. Promote participation in development planning and other decision-making processes at subnational level; III.1.2. Strengthen capacity of national and sectoral environmental planning agencies to manage the environment and natural resources base.
- Corresponding key result area in the Strategic Plan: 3. Supporting the participatory preparation and implementation of MDG-based national development strategies. [5]

Service line 1.2, Policy reform to achieve the MDGs
- Strategic areas of support: I.1.1, I.3.1 and III.1.2 (as for service line 1.1).
- Key result area: 3. Supporting the participatory preparation and implementation of MDG-based national development strategies.

Service line 1.3, Local poverty initiatives including microfinance
- Strategic area of support: II.2.5. Promote entrepreneurship and access of the poor to microfinance services.
- Key result area: 1. Promoting inclusive growth and gender equality. [6]

Service line 1.4, Globalization benefiting the poor
- Strategic area of support: I.1.3. Promote equitable management of globalization with emphasis on social protection for the interests of the poor.
- Key result area: 2. Fostering inclusive globalization.

Service line 1.5, Private sector development
- Strategic area of support: I.1.2. Promote private sector development.
- Key result area: 1. Promoting inclusive growth and gender equality.

Service line 1.6, Gender mainstreaming
- Strategic area of support: see all of Goal IV.
- Key result area: 1. Promoting inclusive growth and gender equality. [7]

[5] Reporting is addressed along with planning, monitoring and evaluation under this key result area.
[6] There is no specific outcome or reference to micro-finance under this key result area.
[7] Gender mainstreaming is also addressed at the outcome level in all poverty key result areas and under the other three development focus areas as well.


Table 3 (cont’d). Comparison of areas of work across strategic documents (2000-2007)

MYFF 2004-2007 service line 1.7: Civil society empowerment
MYFF 2000-2003 strategic areas of support: I.1.2. Strengthen institutional capacity of electoral commissions, systems and processes, and reform electoral laws. I.2.4. Strengthen a culture of good governance, including support to reform initiatives and the promotion of consensus building and tolerance between different political and social parties. I.3.1. Promote participation in development planning and other decision-making processes at subnational level. I.2.6. Support the capacity of the poor and civil society for self-organization and development of alliances (e.g., community organizations, trade unions, farmers’ associations and political parties).
Strategic Plan key result area: 3. Supporting the participatory preparation and implementation of MDG-based national development strategies.[8]

MYFF 2004-2007 service line 1.8: Making ICTD work for the poor
MYFF 2000-2003 strategic area of support: No corresponding specific area of support.
Strategic Plan key result area: No corresponding poverty key result area.[9]

MYFF 2004-2007 service line 2.1: Policy support
MYFF 2000-2003 strategic area of support: 2.1.1. Promote pro-poor macroeconomic and sectoral policies and national anti-poverty plans that are adequately financed (e.g., 20/20 initiative).
Strategic Plan key result area: 3. Promoting knowledge and practices about democratic governance grounded in UN values.

MYFF 2004-2007 service line 2.2: Parliamentary development
MYFF 2000-2003 strategic area of support: I.2.1. Develop institutional capacity of parliamentary structures, systems and processes.
Strategic Plan key result area: 2. Strengthening responsive governing institutions.

MYFF 2004-2007 service line 2.3: Electoral systems
MYFF 2000-2003 strategic areas of support: I.2.2. Strengthen institutional capacity of electoral commissions, systems and processes, and reform electoral laws. IV.1.5. Promote voter education for women and gender-sensitivity of elected and appointed officials regarding women’s rights.
Strategic Plan key result area: 1. Fostering inclusive participation.

MYFF 2004-2007 service line 2.4: Justice and human rights
MYFF 2000-2003 strategic areas of support: I.2.3. Reform and strengthen the system of justice, including legal structures and procedures. IV.3.4. Build the knowledge and capacities of law enforcement agents and other officials (judges, lawyers, police, immigration officers, jail wardens) regarding women’s human rights under national and international law and treaties.
Strategic Plan key result areas: 2. Strengthening responsive governing institutions (justice); 3. Promoting knowledge and practices about democratic governance grounded in UN values (human rights).

8 Civic engagement is also addressed as a strategic outcome under the democratic governance key result area “fostering inclusive participation.”
9 Under democratic governance, the inclusive participation key result area includes an outcome on inclusive communications and e-governance for accountability and transparency.


Table 3 (cont’d). Comparison of areas of work across strategic documents (2000-2007)

MYFF 2004-2007 service line 2.5: E-governance
MYFF 2000-2003 strategic area of support: No corresponding specific area of support.
Strategic Plan key result area: 1. Fostering inclusive participation.[10]

MYFF 2004-2007 service line 2.6: Decentralization and local governance
MYFF 2000-2003 strategic areas of support: I.3.1. Promote participation in development planning and other decision-making processes at subnational level. I.3.2. Support development of sound decentralization policies, including increased allocation of resources to the subnational and national levels. I.3.3. Develop capacity of local authorities.
Strategic Plan key result area: No corresponding governance key result area.[11]

MYFF 2004-2007 service line 2.7: Public administration reform and anti-corruption
MYFF 2000-2003 strategic areas of support: I.4.1. Promote an efficient public sector that improves (economic) management and provides open access to services. I.4.2. Support awareness initiatives and national programmes that combat corruption and enhance integrity and accountability in the management of public and private resources.
Strategic Plan key result areas: 2. Strengthening responsive governing institutions (public administration); 3. Promoting knowledge and practices about democratic governance grounded in UN values (anti-corruption).

MYFF 2004-2007 service line 3.1: Frameworks and strategies
MYFF 2000-2003 strategic areas of support: III.1.1. Develop and implement legal and regulatory frameworks and policies that link sustainable environment and management of natural resources to critical areas of development. III.1.2. Strengthen capacity of national and sectoral environmental planning agencies to manage the environment and natural resources base. III.1.3. Strengthen national and local capacities for collection, analysis and dissemination of environmental information and statistics. III.1.4. Develop mechanisms for effective mobilization of financial resources for national action in environmental and natural resource management. III.3.4. Development, promotion and exchange of sound environmental practices and technologies (such as those on climate change).
Strategic Plan key result areas: 1. Mainstreaming environment and energy;[12] 3. Adapting to climate change.

MYFF 2004-2007 service line 3.2: Effective water governance
MYFF 2000-2003 strategic area of support: III.2.1. Implement national and local programmes that promote sustainable management of energy, land, water, forest and other biological resources.
Strategic Plan key result area: 1. Mainstreaming environment and energy.[13]

10 This key result area includes an outcome on inclusive communications and e-governance.
11 Key result areas 1-3 make reference to ‘national, regional and global levels’.
12 This issue is mainstreamed under this key result area.
13 This issue is mainstreamed under this key result area.


Table 3 (cont’d). Comparison of areas of work across strategic documents (2000-2007)

MYFF 2004-2007 service line 3.3: Access to energy services
MYFF 2000-2003 strategic area of support: III.2.1. Implement national and local programmes that promote sustainable management of energy, land, water, forest and other biological resources.
Strategic Plan key result area: 4. Expanding access to environment and energy services for the poor.

MYFF 2004-2007 service line 3.4: Sustainable land management
MYFF 2000-2003 strategic area of support: III.2.1 (as above).
Strategic Plan key result areas: 1. Mainstreaming environment and energy;[14] 2. Catalyzing environmental finance.[15]

MYFF 2004-2007 service line 3.5: Conservation of biodiversity
MYFF 2000-2003 strategic area of support: III.2.1 (as above).
Strategic Plan key result areas: 1. Mainstreaming environment and energy;[16] 2. Catalyzing environmental finance.[17]

MYFF 2004-2007 service line 3.6: Control of ozone-depleting substances and Persistent Organic Pollutants
MYFF 2000-2003 strategic area of support: No corresponding specific area of support.
Strategic Plan key result area: 2. Catalyzing environmental finance.[18]

MYFF 2004-2007 service line 4.1: Conflict prevention and peacebuilding
MYFF 2000-2003 strategic areas of support: V.2.1. Promote preventive development and a culture of peace. V.2.2. Support the implementation of peace agreements.
Strategic Plan key result area: 1. Reducing the risk of conflicts and natural disasters.

MYFF 2004-2007 service line 4.2: Recovery
MYFF 2000-2003 strategic area of support: V.3.1. Strengthening social capital by ensuring support for affected populations, including refugees and the displaced, and their access to sustainable livelihoods and socioeconomic recovery through integrated area-based approaches and/or specific reintegration programmes.
Strategic Plan key result area: 2. Restoring the foundations for development after crisis.

MYFF 2004-2007 service line 4.3: Small arms, disarmament
MYFF 2000-2003 strategic area of support: V.2.3. Strengthen public security, civil protection and policing and promote disarmament and demobilization of ex-combatants, and conversion of military assets to civilian use.
Strategic Plan key result area: No corresponding CPR key result area.[19]

14 This issue is mainstreamed under this key result area.
15 This key result area includes sustainable environment and management.
16 This issue is mainstreamed under this key result area.
17 This key result area includes biodiversity conservation in list of environmental concerns.
18 This key result area includes an outcome dealing with Montreal Protocol financing.
19 Key result area on restoring foundations for development after crisis encompasses these issues under outcome ‘security situation stabilized’.


Table 3 (cont’d). Comparison of areas of work across strategic documents (2000-2007)

MYFF 2004-2007 service line 4.4: Mine action
MYFF 2000-2003 strategic area of support: V.3.2. Promote ratification and implementation of the Ottawa Convention to ban land mines, and build national capacity for comprehensive mine action programmes, ensuring application of standards and appropriate technologies.
Strategic Plan key result area: No corresponding CPR key result area.[20]

MYFF 2004-2007 service line 4.5: Natural disaster reduction
MYFF 2000-2003 strategic areas of support: V.3.1. Support implementation of the Yokohama Strategy for a Safer World: Guidelines for Natural Disaster Prevention, Preparedness and Mitigation, and mainstream vulnerability analysis and hazard-mapping into all development policies. V.3.2. Develop institutional capacity for disaster prevention, preparedness and mitigation, including preparation of national/local plans, improved early warning systems, trained human resources and increased interregional/national information exchanges.
Strategic Plan key result areas: 1. Reducing the risk of conflicts and natural disasters (prevention); 2. Restoring the foundations for development after crisis (response).

MYFF 2004-2007 service line 4.6: Special initiatives for countries in transition
MYFF 2000-2003 strategic area of support: No corresponding specific area of support.
Strategic Plan key result area: 2. Restoring the foundations for development after crisis.

MYFF 2004-2007 service line 5.1: HIV/AIDS and human development
MYFF 2000-2003 strategic area of support: II.1.3. Strengthen capacity of governments and vulnerable groups to take preventive measures and reduce the impact of health epidemics, such as HIV/AIDS.
Strategic Plan key result area: 4. Mitigating the impact of AIDS on human development (poverty focus area).

MYFF 2004-2007 service line 5.2: Governance of HIV/AIDS responses
MYFF 2000-2003 strategic area of support: II.1.3 (as above).
Strategic Plan key result area: 4. Strengthening governance of AIDS responses (governance focus area).

MYFF 2004-2007 service line 5.3: HIV/AIDS, human rights and gender
MYFF 2000-2003 strategic area of support: II.1.3 (as above).
Strategic Plan key result area: 4. Mitigating the impact of AIDS on human development (poverty focus area).[21]

20 Key result area on restoring foundations for development after crisis encompasses these issues under outcome ‘security situation stabilized’.
21 This key result area includes one outcome dealing with human rights and another outcome dealing with gender-related vulnerability.


Table 4. Evidence of outcome monitoring in five countries

Baseline data to describe the problem or situation before the intervention

Indicators for outcomes

Data collection on outputs and how/whether they contribute towards achievement of outcomes

Aspect of Outcome Monitoring

The Human Development Index (a UNDP product) provides some data at the national and provincial levels. There are new initiatives to carry out pre-appraisal in a systematic fashion, but all at the project level.

At the highest level, indicators for outcomes are used for reporting (progress against targets) in the annual ROAR.

Data are collected on outputs, but rarely on how they contribute to outcomes. The annual planning and target setting exercise for ROAR may make a link (reinforced by the quarterly monitoring of ROAR), but only implicitly. ‘Attribution’ is an issue for evaluation and no ‘outcome’ evaluations are being conducted.

Baseline data appears as a brief situation statement and has become more detailed in the most recent CPD.

Indicators are shown for outcomes since 2003.

The working features on ATLAS, for instance, are 90 percent related to financial reporting and 10 percent related to substantive project management.

The CCA provides a rigorous analysis of the baseline development situation. The depth of the situation analysis in the CPD is much lighter given the restrictive page length of the document, so the three documents have to be read together. The development baseline situation is not necessarily the same as the baseline situation for the outcome specified. No outcome baseline assessment was undertaken for SRF 2001-2003 and 2004-2005.

The current CPD/CPAP (2006-2010) is the first to explicitly attach an annex on the SRF with a column on “outcome indicators, baseline situation and targets.”

Some of the indicators for outcomes specified in the current CPAP (2006-2010) are set too high, at long-term development goal (impact) level, and would be difficult to link to the contribution of project activities that UNDP plans to undertake. An example is the component on “strengthening human development to achieve the MDGs”, where outcome indicators relate to national poverty statistics.

Indicators for other outcomes, for example, under the governance component, are not SMART, e.g., “participatory processes and mechanisms in democratic governance adopted.”

This has improved significantly under the current CPAP (2006-2010). The Project Database developed by the PMEU captures project-specific information on outputs and their contribution to outcomes on a quarterly basis, and this information is used to eventually prepare the end of year MYFF Report (ROAR). The progress report format in ATLAS is not used. Joint CPAP Review Meetings are held quarterly as well as annually, and the latter are attended by members of other UN organizations as well.

The level of baseline data around individual outcomes is mixed. This partly reflects the fact that work is mainly organized around projects and programmatic areas. It also reflects a comparative lack of funds to carry out necessary pre-appraisal and variation in levels of expertise in the office.

Indicators for outcomes are defined, but are only used for reporting against in the annual ROAR.

The working features on ATLAS, for instance, are 90 percent related to financial reporting and 10 percent related to substantive project management. Performance reports in ATLAS are seen as a requirement for headquarters and are not used in discussing progress with partners. Data are collected on outputs, but rarely on how they contribute to outcomes.

Baseline data appeared as a brief situation statement in CCF1 and CCF2 and has become more detailed in the most recent CPD, which is now based on the Fifth National Development Plan; that plan has an M&E framework and key performance indicators in Table 38.2 on page 372 for all sectors.

Based on the Fifth National Development Plan. Baseline indicators are shown for outcomes with 2005 as the baseline and 2010 as the target year.

The working features on ATLAS, for instance, are 90 percent related to financial reporting and 10 percent related to substantive project management. But for the new CPD, the national sources of data will be used for outcome results assessment.

Argentina | Egypt | Indonesia | Moldova | Zambia


Table 4 (cont’d)

More focus on perceptions of change among stakeholders and more focus on ‘soft’ assistance

Systematic reporting with more qualitative and quantitative information on the progress of outcomes

Done in conjunction with strategic partners

Aspect of Outcome Monitoring

More focus on ‘substantive’ projects and changing relationship with national counterparts, but focus still largely on delivery of inputs and outputs.

The ROAR provides information on progress against targets, though it has been more of a reporting tool. The recent quarterly monitoring by the country office does, though, assist the ability to document, review and update targets, an annual exercise.

More dialogue with national counterparts around monitoring, but still at activity and output level.

Policy or behavioural change indicators may not be included in national statistics.

The ROAR is more of a reporting than monitoring tool.

Support to the preparation of the NHDR and MDG reporting has strengthened national systems and introduced a sense of results focus on the part of the government.

Country office is also drawing from periodic national surveys (such as the National Demographic Health Survey (DHS)), joint planning, Annual Tripartite Reviews and national expertise.

Policy or behavioural change indicators are not included in national M&E systems. UNDP is assisting the National AIDS Commission in developing and implementing an effective M&E system that draws on local expertise and capacities resident at the district through to national level.

In the absence of qualitative data in national M&E systems, individual cooperating partners commission their own impact studies, often at high cost.

The Project Database systematized output reporting mainly to simplify compilation of the ROAR. The ROAR is more of a reporting than monitoring tool. There was no feedback given by the Regional Bureau to the country office on the 2006 ROAR.

Support to the preparation of the NHDR and MDG reporting has strengthened national systems. A DevInfo MDG monitoring tool is being developed as a key monitoring tool for UNDAF 2006-2010. Monitoring of progress towards achievement of the UNDAF outcomes was planned to be undertaken through annual UNDAF outcome reviews, a mid-term UNDAF review and annual field visits by UNDAF outcome groups, but that plan has not been thoroughly followed. Joint CPAP reviews (quarterly, half yearly and annually) are drawing on national expertise and involve field visits.

No. Most focus is delivery of inputs and outputs.

No.

No.

Policy or behavioural change indicators may not be included in national statistics.

The ROAR is more of a reporting than monitoring tool. However, the NHDR and the MDG reports are more at outcome level than output level.

Support to the preparation of the NHDR and MDG reporting has strengthened national systems and introduced a sense of results focus on the part of the government.

Country office is also drawing from periodic national surveys, such as the National Demographic Health Survey (DHS). Joint planning and Annual Programme Reviews draw on national expertise and operational information.[22]

Argentina | Egypt | Indonesia | Moldova | Zambia

22 The National HIV/AIDS Joint Programme Annual Reviews for 2004, 2005 and 2006 relied mostly on information from government, civil society and partner implementing agencies at the national level.


Table 4 (cont’d)

Captures information on success or failure of UNDP partnership strategy in achieving desired outcomes

Aspect of Outcome Monitoring

Any information monitored at project level and reflects inputs and output monitoring.

Results of small projects are at a level too low to be captured in national or regional statistics.

Quarterly CPAP Review Meetings, Bimonthly Technical Meetings, and Quarterly Joint Field Visits are able to capture and discuss technical and administrative glitches that affect smooth implementation and achievement of outputs and, due to their focus on projects, to a smaller extent outcomes.

No. Results of small projects are at a level too low to be captured in national or regional statistics. But results were reflected in the Joint Programme Annual Review Reports.

Argentina | Egypt | Indonesia | Moldova | Zambia


23 UNDP, ‘Indonesia Resident Coordinator Annual Report for 2006’, Indonesia.
24 Formatting or referencing of performance indicators and actual outputs to link to expected outputs has not been altered from the original document.


UNDAF Outcome 2

Sub-outcome: Promotion of democracy and participation through civic education in the formal and non-formal sectors and mass media (lead agency UNESCO).

n Series of training on the importance of press freedom for government officials, members of parliament and judiciary officials.

n Training for media professionals on civic/participatory journalism, with focus at district levels.

n Strengthening of public-service broadcasting through series of training on management and journalism skills.

n Support for community radios at district and sub-district levels to enhance public participation in voicing their aspirations.

n Seminar on importance of the freedom of the press in strengthening democratization.

n Training for media professionals to promote democracy, transparency, and accountability.

Sub-outcome: Partnership for governance reform (lead agency UNDP).

n Support to local governments

n Training organized in at least 5 major cities in Indonesia, with minimum total participants of 250 people.

n Training for media professionals organized in at least 5 districts in Indonesia, with minimum total participants of 100 people.

n Training for public broadcasters (TVRI and RRI) in central office in Jakarta as well as at provincial levels.

n Minimum 5 trainings will be organized, with number of total participants of 60.

n At least one training will be organized, with total participants of 30 people (15 managers and 15 reporters).

n At least two seminars will be organized (one for Aceh and one for North Sumatera), with number of total participants about 200.

n At least 50 journalists from Aceh and from North Sumatera will be trained.

• In 10 Kabupaten/Kota in 5 Provinces

• Number of multi-stakeholder forums

• Increased capacity of associations (exact indicators TBC)

• 6 telecentres

n Trainings were organized in the cities of Jambi, Palembang, Medan, Batam, and Bandung, with about 120 people total participants. (UNESCO)

n 300 radio and TV journalism students graduated from the School for Broadcast Media (SBM).

n Series of trainings were organized in Aceh, focused on radio reporters from about 20 radio stations in the province of Nanggroe Aceh Darussalam. (UNESCO)

n One training was organized for 15 TVRI reporters. (UNESCO)

n Trainings were organized in three radio stations in Kendal, Pati and Bantul with about 20 radio people as total participants. (UNESCO)

n Not materialized due to the unavailability of funds.

n Not materialized due to the unavailability of funds.

(i) Poverty Reduction Strategy and action plans have been drafted in 10 districts in 5 provinces

(ii) Participatory development planning and health

Table 5. Example of outcome reporting from the Indonesia Resident Coordinator Annual Report for 2006[23]

Outcome Area | Expected Outputs | Performance Indicators[24] | Actual Outputs at Year End


mainstreaming pro-poor policy in local development planning.

n Promotion of multi-stakeholder forums for local development issues.

n Support to associations of local government.

n Increased availability of information and access to communications in rural areas.

n Support to completion of regulations that define the Law 32/2004 as presidential decree and acceptance by national and regional stakeholders.

Sub-outcome: National MDG Monitoring and Sectoral Needs Assessment and Plan 2006-2015 (lead agency UNDP).

n Pilot MDG joint UN and interested donor programme in NTT province.

Sub-outcome: Review of the National Action Plan for Human Rights followed by programme of UN support to the implementation of the action plan and monitoring (lead agency UNDP).

n Recruitment of senior UN human rights advisor.

n UN joint human rights action plan prepared.

n Support to establishment of Communications and Information Centre established in the

established serving at least 1,200 people

• Regulations finalized

n Joint programme document prepared

n Plan prepared and endorsed by UN Country Team, Communications and Information Centre established

• Minimum 10 new signatories

• GC secretariat run by signatories

• Two workshops on implementation of GC principles

n Indonesian guidelines on GC principles and implementation published

• Increased level of civic awareness among citizens and number of citizens participating in political processes

• Secretariat in place

• Electoral system, processes and mechanisms enhanced; number of national, regional and local parliaments participating in project

n Plan produced and endorsed by the UN Country Team

sector through multi-stakeholder forums have been developed in 5 provinces.

(iii) 8 telecentres have been established in 6 provinces, serving 2500 direct users and 60 community groups.

(iv) Final draft has been sent to the President's office.

n Needs assessment has been conducted. A concept note has been prepared for piloting an area-based preparatory assistance project.

n Interviews with short-listed candidates have been conducted, the panel has taken a final decision and recruitment of the advisor is under process.

n No development due to absence of HR Advisor

n Cappler project document has been revised and an architect identified; curriculum development initiated

1. 37 new signatories joined the Global Compact

2. Outreach events were conducted in Jakarta and Palembang

3. Local Network for Global Compact established

Outcome Area | Expected Outputs | Performance Indicators | Actual Outputs at Year End

Table 5 (cont’d)


Department of Law and Human Rights. Advisor recruited.

Sub-outcome: Promoting good corporate governance through the Global Compact (lead agency UNDP). Support to Global Compact in Indonesia to:

n Increase participation.

n Strengthen secretariat in APINDO to run Global Compact activities.

n Raise awareness.

Sub-outcome: Programme to increase participation of Civil Society Organizations in the development of national pro-poor planning and democratic processes (lead agency UNDP).

n Support to ‘Democracy in Action’ initiatives.

n Deepening Democracy Secretariat established.

n Enhanced capacity of essential democratic institutions.

Sub-outcome: Review of UN conventions, treaties and protocols and UN action plan to support the government in implementation and the parliament in the ratification process (lead agency ILO).

n Production of a five-year plan of action to support Indonesia’s commitments to UN instruments.

4. 1 workshop on ‘orientation to Global Compact’ conducted for signatories

(i) Efforts to raise public awareness of peace, pluralism, national unity, and civic rights have been carried out, involving youths, religious organizations, mass media organizations, academia, and government officials as target group.

(ii) Deepening Democracy Secretariat has not been established yet.

(iii) Electoral support decided to focus on Pilkada and support for process to draft required legislation in preparation for 2009 elections.

n Preliminary work for the review has been conducted.

n UNIFEM supported

• Inter-governmental cooperation in implementing CEDAW principles.

• Non-governmental organizations to promote and monitor the implementation.

Outcome Area | Expected Outputs | Performance Indicators | Actual Outputs at Year End

Table 5 (cont’d)


ANNEX 6. RESULTS-BASED MANAGEMENT IN DEVELOPMENT ORGANIZATIONS: BUILDING A CULTURE OF RESULTS

INTRODUCTION

There are numerous references in UNDP documents to the need for a transformation to a new culture of results:

n “Establish an enabling environment for change to a results-based organization...”[25]

n “A culture of accountability for results is at the heart of recreating UNDP...”[26]

n “RBM [results-based management] must aim at … fostering a … culture of performance.”[27]

Thus in the minds of those developing the UNDP results-based management approach—and consistent with other observers on what is needed for effective results-based management—fostering a ‘culture of results’ is at the heart of implementing results-based management. This note addresses what a culture of results entails and how such a culture can be developed and maintained.

A CULTURE OF RESULTS

A number of authors and reports (see references in Annex 4) have looked at the issue of a results culture, what it is and how to get there. Based on this literature, an organization with a strong culture of results:

n Engages in self-reflection and self-examination:
• Deliberately seeks evidence on what it is achieving[28]
• Uses results information to challenge and support what it is doing[29]
• Values candor, challenge and genuine dialogue[30]

n Engages in results-based learning:
• Makes time to learn[31]
• Learns from mistakes and weak performance[32]
• Encourages knowledge transfer[33]


25 UNDP, ‘Change Management: UNDP 2001’, 1997.
26 UNDP, ‘The Way Forward: The Administrator’s Business Plans, 2000-2003’, 2001.
27 UNDP, ‘RBM in UNDP: Overview and General Principles’, 2000.
28 Botcheva L, White CR, Huffman LC, ‘Learning Culture and Outcomes Measurement Practices in Community Agencies’, American Journal of Evaluation, 2002, 23(4): 421-434; General Accounting Office, ‘An Evaluation Culture and Collaborative Partnerships Help Build Agency Capacity. Program Evaluation’, Washington, DC, 2003; and Smutylo T, ‘Building an Evaluative Culture’, International Programme for Development Evaluation Training, World Bank and Carleton University, Ottawa, Ontario, 2005.
29 Hernandez G, Visher M, ‘Creating a Culture of Inquiry: Changing Methods–and Minds–on the Use of Evaluation in Nonprofit Organizations’, The James Irvine Foundation, 2001.
30 David T, ‘Becoming a Learning Organization’, Marguerite Casey Foundation, 2002.
31 Ibid.
32 Barrados M, Mayne J, ‘Can Public Sector Organizations Learn?’, OECD Journal on Budgeting, 2003, 3(3): 87-103; and Goh S, ‘The Learning Organization: An Empirical Test of a Normative Perspective’, International Journal of Organizational Theory and Behaviour, 2001, 4(3&4): 329-355.
33 David T, ‘Becoming a Learning Organization’, Marguerite Casey Foundation, 2002; Goh S, ‘The Learning Organization: An Empirical Test of a Normative Perspective’, International Journal of Organizational Theory and Behaviour, 2001, 4(3&4): 329-355; and Hernandez G, Visher M, ‘Creating a Culture of Inquiry: Changing Methods–and Minds–on the Use of Evaluation in Nonprofit Organizations’, The James Irvine Foundation, 2001.

Evaluation of Results-Based Management in UNDP, web.undp.org/execbrd/pdf/RBM_Evaluation.pdf, 26 December 2007


■ Encourages experimentation and change:

  • Supports deliberate risk taking³⁴

  • Seeks out new ways of doing business³⁵

Thus, a weaker culture of results might, for example:

■ Gather results information, but limit its use mainly to reporting

■ Acknowledge the need to learn, but not provide the time or structured occasions to learn

■ Undergo change only with great effort

■ Claim it is results focused, but discourage challenge and questioning of the status quo

■ Talk about the importance of results, but frown on risk taking and mistakes

■ Talk about the importance of results, but value following process and delivering outputs

A CULTURE OF RESULTS AND UNDP

Based on the headquarters interviews, there is limited evidence of a results culture in UNDP, especially when contrasted with interviews at several sister organizations. Other evidence supports this perception.

In summarizing the Global Staff Survey, it was noted that "responses to the question 'My office works consistently towards achieving long-term objectives' have hit a low point, with the bulk of the decline coming in COs [country offices]."

The Dalberg Global Development Advisors assessment of results-based management at UNDP noted the need to enhance the culture of the organization: "First, an ongoing change management effort to embed a results-based culture in the organization is required."³⁶ In a 2002 report comparing results-based management efforts at a number of multilateral development agencies, including UNDP, Flint concluded that "… [these] multilateral development institutions need to work to amend their internal incentive structures in favour of results."³⁷

While it is difficult to generalize, it appears that there is not a strong results culture in UNDP. Results-based management is seen mainly as a reporting regime, rather than a results-informed management regime.

FOSTERING A CULTURE OF RESULTS

Fostering a culture of results is a significant challenge for an organization. A number of factors are needed to build such a 'culture of inquiry':³⁸

■ Demonstrated senior management leadership and commitment

■ Informed demand for results information

■ Supportive organizational systems, practices and procedures

■ A results-oriented accountability regime

■ A capacity to learn and adapt

■ Results measurement and results management capacity

34 Pal LA, Teplova T, 'Rubik's Cube? Aligning Organizational Culture, Performance Measurement, and Horizontal Management', Carleton University, Ottawa, Ontario, 2003.
35 General Accounting Office, 'An Evaluation Culture and Collaborative Partnerships Help Build Agency Capacity. Program Evaluation', Washington, DC, 2003; Goh S, 'The Learning Organization: An Empirical Test of a Normative Perspective', International Journal of Organizational Theory and Behaviour, 2001, 4(3&4): 329-355; and Smutylo T, 'Building an Evaluative Culture', International Programme for Development Evaluation Training, World Bank and Carleton University, Ottawa, Ontario, 2005.
36 Dalberg Global Development Advisors, 'Assessing Results Management at UNDP', commissioned by the Danish Ministry of Foreign Affairs, New York, NY, 2006, p 20.
37 Flint M, 'Easier Said Than Done: A Review of Results-Based Management in Multilateral Development Institutions', UK Department for International Development (DFID), London, United Kingdom, 2002, p 50.
38 Adapted from Auditor General of Canada (2000) and OECD-DAC (2006).


In each of these areas, based on the literature and identified good practices, there are a number of approaches that can be used to foster a culture of inquiry.⁴⁴

Demonstrated senior management leadership and commitment. All discussions of implementing results-based management identify strong senior leadership as essential. Providing visible

Senior Leadership

This report summarizes the discussion at a two-day workshop held in December 2004 at the World Bank in Washington, DC, with participants from both developed (Canada, Ireland, Netherlands, United Kingdom, United States) and developing countries (Chile, Colombia, Egypt, Mexico, Spain, Tanzania, Uganda).³⁹ In addition to the final report from the workshop, the country papers are available on the web site. Among the conclusions of the participants:

The Use of Both Top-Down and Bottom-Up Support. It was clear from the experiences of countries with an outcome approach that support from both the top political and administrative levels, as well as from middle management and staff within government, is essential for the approach to work. We consider each of these below.

The Role of Political and Senior Management Commitment and Direction. A common theme reinforced by experiences in many different jurisdictions is the necessity of top-level support for an outcome orientation. As the previous section indicated, a political imperative to produce and to be able to demonstrate results that are of central importance to government is a prerequisite for any reform effort.

Thus, support from the top is essential to provide legitimacy and priority to an outcome orientation. This requires an expressed and ongoing commitment from senior-level officials as well as from the political level. Such commitment can provide direction and coordination to the reform effort as well as the necessary clout and profile to ensure attention and action. Top-level support can aid in garnering the necessary resources and system-wide supports and in providing overall coordination. As with any other major organizational change effort, senior-level commitment is required to address the inevitable challenges that are sure to come up, to continue the momentum, and to make adjustments and changes to the approach as needed.

This commitment needs to be backed by actions as well as words. Otherwise, it is not likely to be taken seriously. For example, how much attention is given to an outcome approach vis-à-vis other priorities? How is it resourced and supported? How much recognition is given to those who undertake an outcome approach? And perhaps most important of all, how is it used? Does it represent an actual shift in how management and policy are carried out, or is it perceived as just a paper exercise?⁴⁰

"Senior-level support and commitment to an outcome focus has been provided in a number of different ways. For example, in Egypt the process was led personally by the minister of finance, who met directly with those most closely involved on a bimonthly basis. In Colombia, the president himself provides the leadership, talking about results wherever he goes. The outcome approach in Mexico is closely related to the president's political agenda. But in other countries, such as the United States, support comes from across the political spectrum. In some countries, legislative or even constitutional changes may be needed to facilitate the new outcome focus, whereas this is not necessary in other jurisdictions."⁴¹

A study by the Auditor General of Canada⁴² reviewed the results-based management experiences of a number of organizations in the United States and the Canadian federal government that had made significant progress in results-based management. Among its findings:

"Managing for results takes hold when senior management visibly supports the approach and is poised to take advantage of opportunities to move towards a focus on results. Our cases and the review of the experience of other jurisdictions suggest that senior management commitment and leadership is the most common and perhaps most important feature of successful managing for results."⁴³

39 Perrin B, 'World Bank Roundtable—Moving from Outputs to Outcomes: Practical Advice from Governments Around the World', World Bank and the IBM Centre for The Business of Government, Managing for Performance and Results Series, Washington, DC, 2006. Available online at: http://www.worldbank.org/oed/outcomesroundtable/.
40 Ibid, p 23.
41 Ibid, p 24.
42 Auditor General of Canada, 'Moving Towards Managing for Results', Report of the Auditor General of Canada to the House of Commons, Chapter 11, Ottawa, Ontario, 1997. Available online at: http://www.oag-bvg.gc.ca/domino/reports.nsf/html/ch9711e.html.
43 Ibid, pp 11-16.
44 Discussed here are those practices directly aimed at building a culture of results.


and accountable leadership means actions such as identifying and supporting results-based management champions, walking the talk (providing leadership consistent with results-based management), and demonstrating the benefits of results-based management.

Informed demand for results information. A culture of results can be greatly enhanced if managers at all levels, especially senior levels, consistently and routinely ask for results information in planning, implementing and reviewing contexts. In this way, results information becomes a routine and natural part of managing the organization.

Supportive organizational systems, practices and procedures. Having the right formal and informal incentives in place is essential to fostering a culture of results. Rewarding good managing for results, such as undertaking self-evaluation, taking informed risks, experimenting, and sharing information on results, demonstrates that the organization does indeed value inquiry and reflection. Managers seeking to achieve outcomes need to be able to adjust their operations as they learn what is working and what is not.⁴⁵ Managing only for planned outputs does not foster a culture of inquiry about the impacts of delivering those outputs.

A results-oriented accountability regime. If managers are accountable only for following procedures and delivering planned outputs, there is little incentive to seek evidence on the outcomes being achieved. If managers are instead held strictly accountable for whether or not they achieve outcomes, which they do not control, they will seek to ensure they are held to account only for outputs. The accountability regime therefore needs to be adapted to include the idea of influencing outcomes, being accountable for outcomes, and rewarding good managing for outcomes.⁴⁶

A capacity to learn and adapt. Learning from empirical evidence on past performance is what a results culture is all about. Deliberate efforts are needed to build a capacity for, and acceptance of, learning in an organization. Creating institutionalized learning events,⁴⁷ providing group learning opportunities,⁴⁸ supportive information sharing and communication structures,⁴⁹ making the time to learn and providing adequate resources to do so,⁵⁰ seeing mistakes as opportunities to learn⁵¹ and focusing on best practices⁵² are all ways to help foster a culture of learning.

Results measurement and results management capacity. Building a culture of results in an organization requires the capacity to articulate and measure results, and a capacity to understand how results information can be used to help managers manage. Some level of in-house professional results-based management support is usually required to assist managers and staff. Senior managers and managers need to understand results-based management and how

45 General Accounting Office, 'Results-Oriented Cultures: Insights for U.S. Agencies from Other Countries' Performance Management Initiatives', US General Accounting Office, Washington, DC, 2002.
46 Auditor General of Canada, 'Modernizing Accountability in the Public Sector', Report of the Auditor General of Canada to the House of Commons, Chapter 9, Ottawa, Ontario, 2002; and Baehler K, 'Managing for Outcomes: Accountability and Thrust', Australian Journal of Public Administration, 2003, 62(4): 23-34.
47 Barrados M, Mayne J, 'Can Public Sector Organizations Learn?', OECD Journal on Budgeting, 2003, 3(3): 87-103; and Moynihan DP, 'Goal-Based Learning and the Future of Performance Management', Public Administration Review, 2005, 65(2): 203.
48 David T, 'Becoming a Learning Organization', Marguerite Casey Foundation, 2002.
49 Cousins B, Goh S, Clark S, Lee L, 'Integrating Evaluative Inquiry into the Organizational Culture: A Review and Synthesis of the Knowledge Base', Canadian Journal of Program Evaluation, 2004, 19(2): 99-141.
50 David T, 'Becoming a Learning Organization', Marguerite Casey Foundation, 2002.
51 Barrados M, Mayne J, 'Can Public Sector Organizations Learn?', OECD Journal on Budgeting, 2003, 3(3): 87-103; and Michael D, 'Governing by Learning: Boundaries, Myths and Metaphors', Futures, 1993, January/February: 81-89.
52 Pal LA, Teplova T, 'Rubik's Cube? Aligning Organizational Culture, Performance Measurement, and Horizontal Management', Carleton University, Ottawa, Ontario, 2003.


to support it. This capacity can be enhanced through training, using peer champions, and providing senior managers with the kinds of results questions they can routinely ask. The Joint Inspection Unit benchmarks for results-based management stress the need for adequate results-based management capacity.⁵³

CONCLUSIONS

Developing a culture of results in an organization will not happen through good intentions and osmosis. Many UN organizations face this challenge. It requires deliberate efforts by the organization, especially its senior managers, to encourage and support such a culture. It needs to be clear to managers and staff that results information is valued and expected to be a regular part of planning, budgeting, implementation and review.

RESULTS-BASED MANAGEMENT IN A DECENTRALIZED STRUCTURE AND THE ROLE OF A CORPORATE STRATEGY

UNDP, like many other UN organizations, faces a potential tension between having a corporate headquarters vision of the organization and the need for its country offices to design programmes reflecting national priorities.

This tension is highlighted even more in a results-based approach to planning and managing. Efforts to set clear and measurable intended results at the corporate level for all of UNDP could conflict with efforts at the country level to set clear results that reflect national priorities. While using both a top-down and a bottom-up approach to developing results frameworks is usually seen to be good practice,⁵⁴ the challenge is to get these approaches to meet consistently in the middle.

As identified in interviews, UNDP's Strategic Plan serves several ends. It provides:

■ A clear vision of where UNDP overall is going

■ Identification of key priorities

■ A map of the development results expected

At the same time, the Strategic Plan needs to allow country offices to be responsive to national priorities. The challenge is to find the right balance.

UNDP is under pressure from Member States and from the One UN initiative to be more focused. In several of the smaller UN organizations interviewed (UNCDF and UNFPA), the need to become more focused was emphasized and the actions taken in that regard were described. In this light, a key role of the UNDP Strategic Plan would be setting clear priorities or boundaries on what business UNDP is in. Clarity in this regard would allow country offices to turn aside requests from national governments that did not fit UNDP's current mandate as agreed upon by Member States. Some interviewees alluded to this, arguing that country offices welcomed (or would welcome) such clear direction.

Within a well-defined mandate, the Strategic Plan can also identify the overall development results UNDP hopes to achieve. This is where the link with country office plans is critical. Too much detail at the corporate level would undermine country efforts to reflect national priorities. In the interviews, UNICEF argued that there need not be a tension between these levels. In its case, corporate priorities were general enough to allow country programming to reflect national priorities, while at the same time specific enough to provide clear direction with respect to what business UNICEF was in. At the MDG level, common goals can be set for countries. Similarly, for UNDP, the Strategic Plan can set common outcomes at a level that would allow countries to 'fit in' their own

53 Fontaine Ortiz E, Tang G, 'Results-based Management in the United Nations in the Context of the Reform Process', Joint Inspection Unit, United Nations, Geneva, Switzerland, JIU/REP/2006/6.
54 Perrin B, 'World Bank Roundtable—Moving from Outputs to Outcomes: Practical Advice from Governments Around the World', World Bank and the IBM Centre for The Business of Government, Managing for Performance and Results Series, Washington, DC, 2006, p 7. Available online at: http://www.worldbank.org/oed/outcomesroundtable/.


development goals. This also allows UNDP to report back to its Executive Board against the common outcomes agreed to.

Thus, the UNDP corporate Strategic Plan can set clearly defined boundaries on what business UNDP is in and provide overall direction on the development results UNDP seeks over its planning period. This would allow country offices to develop programming plans based on national priorities within the UNDP framework.

ACCOUNTABILITY IN A RESULTS-BASED MANAGEMENT REGIME

Traditionally, accountability systems are based on being accountable for following proper procedures and using approved resources. When results management regimes are initially introduced, there is an additional focus on being accountable for delivering planned outputs. These are all activities over which managers have, or should have, control. Thus it is reasonable to expect them to be held to account for their actions.

Results-based management, on the other hand, asks managers to focus on the outcomes to be achieved, to track the outputs and sequence of outcomes being achieved and, based on a theory of change for the programme, to adjust their activities and outputs to maximize the likelihood that the desired outcomes are realized. Results-based management asks managers to learn from prior experience and adjust their operations as required. And it recognizes that outcomes by definition are results over which managers do not have control; they are results that managers and their programmes, through their activities and outputs, influence and contribute to.

Thus, accountability focused solely on process, resource utilization and output achievement is somewhat at odds with results-based management. As several writers have noted, there is a need to augment accountability systems to take into account the focus on outcomes.⁵⁵

There are several challenges in considering accountability for outcomes. First, influencing factors other than the programme are at play, such as other programmes and social and economic factors. Second, many outcomes of interest take a number of years to bring about, frequently beyond the biennium. There is a need to reconcile the two-year planning and reporting period with the longer time frames often needed for outcomes to occur.

Baehler⁵⁶ discusses the first issue in light of New Zealand's shift in focus towards outcomes and away from outputs. She concludes that rather than being accountable for outcomes per se, managers should be accountable for managing for outcomes, and all that implies. The Auditor General of Canada⁵⁷ argues the need for a revised concept of accountability that takes into account the fact that outcomes are not controlled by managers. She argues that managers would need to be accountable for contributing to (influencing) outcomes rather than achieving outcomes per se, and for learning, i.e., for having adjusted activities and outputs as a result of tracking performance to date.

UNDP⁵⁸ is implementing an Accountability Framework as an integral part of its Strategic Plan. The Accountability Framework addresses

55 Aucoin P, Heintzman R, 'The Dialectics of Accountability for Performance in Public Sector Management Reform', in Governance in the Twenty-first Century: Revitalizing the Public Service, B. G. Peters and D. J. Savoie, Eds., McGill-Queen's University Press, 2000; Behn R, 'Rethinking Democratic Accountability', Brookings Institution, 2000; Burgess K, Burton C, Parston G, 'Accountability for Results', Public Services Productivity Panel, London, England, 2002; Dubnick MJ, 'Accountability Matters', Shani Conference, University of Haifa, Israel, 2004; Hatry H, 'We Need a New Concept of Accountability', The Public Manager, 1997, 26: 37-38; and Mayne J, 'Evaluation for Accountability: Reality or Myth?', in Making Accountability Work: Dilemmas for Evaluation and for Audit, M-L Bemelmans-Videc, J Lonsdale, B Perrin, Eds., Transaction Publishers, New Brunswick, NJ, 2007.

56 Baehler K, 'Managing for Outcomes: Accountability and Thrust', Australian Journal of Public Administration, 2003, 62(4): 23-34.
57 Auditor General of Canada, 'Modernizing Accountability in the Public Sector', Report of the Auditor General of Canada to the House of Commons, Chapter 9, Ottawa, Ontario, 2002.
58 UNDP, 'Action Brief for the Operations Group: Key Principles and Approach to the Roll Out of the Accountability Framework in UNDP', Bureau of Management, 2007.


accountability at the organizational level, the manager's level and the individual level. The roll out of the Accountability Framework provides a good occasion to support a greater results-based management focus in UNDP, by moving what people are accountable for beyond process and outputs. Following proper processes and delivering outputs with approved resources are still important, but need to be seen in a broader framework where one is accountable, for example, for:

■ Following proper processes

■ Delivering planned and modified outputs with approved resources

■ Measuring the planned outcomes of interest

■ Demonstrating the contribution being made by UNDP to the accomplishment of the planned outcomes

■ Demonstrating what was learned over the biennium in delivering the outputs, and what changes were made as a result

Demonstrating learning and contribution would require some tracking of at least immediate outcomes, to provide an indication of whether the chain of results that underlies the programme's theory of change is in fact being realized. This chain of results needs to reflect the fact that UNDP delivers through partners; ensuring that partners have the capacity to deliver and monitor for results is therefore itself an important result. If things seem to be happening as expected, then no changes are likely required: the theory of change is being confirmed and there is a basis for arguing that a contribution is being made to the intended outcomes. Otherwise, some changes are called for, since the theory of change is not being realized; perhaps the partnering arrangement is not well structured or the delivered programmes are not working as expected. One would expect learning to occur and changes to be made in the activities and outputs produced to maximize the likelihood that the programme is contributing to the achievement of the outcomes. Describing and explaining the reasons for such changes demonstrates that learning and good results-based management are occurring.

In essence, following Baehler,⁵⁹ accountability for outcomes can reasonably be interpreted as being accountable for good results-based management, rather than for achievement of outcomes per se.

There is some evidence that ideas close to these are being put into practice within UNFPA. UNFPA⁶⁰ describes 'accountability for outcomes' in its Strategic Plan as, in relation to the above elements, being accountable for:

■ Ensuring financial controls

■ Achieving and monitoring outputs

■ Monitoring outcomes (global trends and outcome indicators)

■ Ensuring outputs contribute to outcomes

It would be useful for UNDP to discuss this approach to accountability for outcomes with UNFPA.

The second challenge mentioned above was linking the often longer time required for outcomes to be achieved with the shorter planning and reporting period. Wye, reflecting on many years of experience in the United States, argues that this should not be a problem.⁶¹ One should be able to "provide a narrative explanation of the reporting cycle and the issues raised, and use available data to comply with the reporting schedule."⁶² In other words, write a narrative statement on when the outcomes will be achieved, and develop a milestone tracking

59 Baehler K, 'Managing for Outcomes: Accountability and Thrust', Australian Journal of Public Administration, 2003, 62(4): 23-34.
60 UNFPA, 'UNFPA Draft Strategic Plan 2008-2011: Accelerating Progress and National Ownership of the ICPD Programme of Action', Executive Board Informal Meeting, 16 May 2007.
61 Wye C, 'Performance Management for Career Executives: A "Start Where You Are, Use What You Have" Guide', in Managing for Results 2005, JM Kamensky and A Morales, Eds., Rowman and Littlefield, Oxford, England, 2005.
62 Ibid, p 68.


system to report progress in the interim years. Many of the outcomes UNDP is seeking will take many years to accomplish. Restricting the consideration of outcomes and reporting to the two-year planning and budgeting period, or even the four-year strategic planning period, does not support a results-based management orientation. At the same time, it is quite reasonable to expect UNDP to indicate the extent to which expected intermediate outcomes along a programme's results chain are being realized, as a useful way to track progress towards longer-term outcomes.

CHALLENGES TO RESULTS-BASED MANAGEMENT

Not everyone regards results-based management as a good thing. Critics of this approach to public management point to a number of public-sector characteristics that militate against such a rational approach to managing. Some of the most common criticisms, and responses to them, include:

■ Trying to manage by numbers in a political context is, at best, unrealistic and can be dysfunctional.⁶³ For example, political scientists argue that in trying to set clear and concrete objectives and targets, results-based management runs up against the real need, in a political environment such as UNDP's, to keep objectives suitably fuzzy so as to gain the widest support. This is true to some extent, but in the end UNDP has to fund specific programmes consistent with specific national priorities. Clarity can only help their design and delivery. And by remaining flexible, the measurement of results can include a range of end results sought by different parties.

■ In a similar vein, trade-offs are inevitable and are best handled through the regular political process, not via some rational analysis of pros and cons.⁶⁴

This is also true, but results-based management is not intended to replace the management process or the political process. Rather, it should be seen as one means of informing debate and decision making, not as making decisions.

■ Many of the results sought by UNDP and other public sector organizations cannot be measured.⁶⁵ As a result, results-based management forces measurement and reporting of other, less important results. Norman found this view expressed in his review of New Zealand's results-based management experiences.⁶⁶

But many, if not most, results sought can be measured, especially if we consider measurement in the public sector to be a means of reducing uncertainty about what is happening rather than definitively proving something. Flexibility in measurement approaches would allow a wide variety of means to be used to increase understanding about the performance of a programme from different perspectives.

■ It is not plausible to hold organizations to account for outcomes over which they have limited control.⁶⁷

63 Thomas P, 'Performance Measurement and Management in the Public Sector', Optimum, 2005, 35(2): 16-26; Radin BA, 'Challenging the Performance Movement: Accountability, Complexity and Democratic Values', Georgetown University Press, Washington, DC, 2006; and Hood C, 'Public Service Management by Numbers: Why Does it Vary? Where Has it Come From? What Are the Gaps and the Puzzles?', Public Money and Management, 2007, 27(2): 95-102.
64 Thomas P, 'Performance Measurement and Management in the Public Sector', Optimum, 2005, 35(2): 16-26; and Radin BA, 'Challenging the Performance Movement: Accountability, Complexity and Democratic Values', Georgetown University Press, Washington, DC, 2006.
65 Clark ID, Swain H, 'Distinguishing the Real from the Surreal in Management Reform', Canadian Public Administration, 2005, 48(4); and Radin BA, 'Challenging the Performance Movement: Accountability, Complexity and Democratic Values', Georgetown University Press, Washington, DC, 2006.
66 Norman R, 'Managing through Measurement or Meaning? Lessons from Experience with New Zealand's Public Sector Performance Management Systems', International Review of Administrative Sciences, 2002.
67 Clark ID, Swain H, 'Distinguishing the Real from the Surreal in Management Reform', Canadian Public Administration, 2005, 48(4); and Radin BA, 'Challenging the Performance Movement: Accountability, Complexity and Democratic Values', Georgetown University Press, Washington, DC, 2006.


However, it is plausible to hold organizations to account for influencing outcomes and for managing for results. This is how accountability for outcomes should be interpreted, and UNFPA is moving in this direction.

■ Focusing on any set of performance indicators ends up causing perverse behaviour ('gaming') as people work to make the numbers go up.⁶⁸

While this is a real problem, there are many ways to counter the tendency, such as focusing on outcomes rather than outputs, reviewing measures regularly, using a balanced set of indicators, and developing indicators in an inclusive manner.

There are legitimate concerns over results-based management, and organizations should be aware of the possible downsides of implementing it. The key point is perhaps the need to take these concerns into account in developing and, especially, in managing the results-based management regime.

68 Clark ID, Swain H, 'Distinguishing the Real from the Surreal in Management Reform', Canadian Public Administration, 2005, 48(4); Perrin B, 'Effective Use and Misuse of Performance Measurement', American Journal of Evaluation, 1998, 19(3): 367-379; Radin BA, 'Challenging the Performance Movement: Accountability, Complexity and Democratic Values', Georgetown University Press, Washington, DC, 2006; Hood C, 'Gaming in Targetworld: The Targets Approach to Managing British Public Services', Public Administration Review, 2006, 66(4): 515-521; and Hood C, 'Public Service Management by Numbers: Why Does it Vary? Where Has it Come From? What Are the Gaps and the Puzzles?', Public Money and Management, 2007, 27(2): 95-102.


Annex 7

SOURCES AND TYPES OF FUNDING TO UNDP

A. Overall analysis. UNDP overall (percentages)

Programme Expenditure 2000-2006
                            2000  2001  2002  2003  2004  2005  2006
Core
  TRAC 1/2                    17    15    14    14    13    12    11
  TRAC 3                       1     1     1     1     1     1     1
  Other                        3     2     2     2     2     1     1
Total Core                    21    18    17    17    16    14    13
Non-Core
  Government Cost Sharing     47    48    40    43    37    32    35
  Donor Cost Sharing          10    10    13    13    18    23    21
  Thematic Trust Funds         -     -     1     1     1     1     1
  Trust Funds                 17    16    22    20    21    24    24
  GEF                          3     6     6     6     5     5     6
  MP/CAP21                     2     2     2     1     1     1     1
  Other Development PAF        -     -     -     -     0     0     0
  Other                        -     -     -     -     -     -     -
Total Non-Core                79    82    84    84    83    86    88
Total Core + Non-Core (USD, thousands):
  1,862,047  2,030,908  2,140,582  2,402,634  2,870,369  3,662,506  4,048,946

Note: Due to rounding, the sum of core and non-core values in these tables may not equal 100 percent.
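The rounding effect flagged in the note is visible in the 2006 column above, where the rounded core (13 percent) and non-core (88 percent) shares sum to 101. A minimal sketch of how this happens, using hypothetical unrounded shares (the true unrounded UNDP figures are not published in this annex):

```python
import math

def round_half_up(x: float) -> int:
    # Round half up, as report tables typically do (Python's built-in
    # round() uses banker's rounding and would behave differently on .5)
    return math.floor(x + 0.5)

# Hypothetical unrounded core/non-core shares that sum to exactly 100
shares = [12.5, 87.5]
rounded = [round_half_up(s) for s in shares]
print(rounded)       # [13, 88]
print(sum(rounded))  # 101 -- both components round up, so the total exceeds 100
```

Each component is rounded independently, so the rounded values can sum to slightly more or less than 100 even though the underlying shares sum exactly to 100.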


B. Regional breakdown. Regional Bureau for Africa (percentages)

Programme Expenditure 2000-2006
                            2000  2001  2002  2003  2004  2005  2006
Core
  TRAC 1/2                    47    47    47    45    44    26    24
  TRAC 3                       2     1     1     1     2     1     1
  Other                        4     4     3     3     0     0     1
Total Core                    53    52    51    49    46    27    26
Non-Core
  Government Cost Sharing      3     2     1     3     1     3     5
  Donor Cost Sharing          10    15    14    17    27    33    29
  Thematic Trust Funds         -     -     2     3     1     1     1
  Trust Funds                 28    24    23    21    19    31    35
  GEF                          5     7     7     7     5     4     4
  MP/CAP21                     1     2     1     1     1     1     0
  Other Development PAF        -     -     -     -     0     1     0
  Other                        -     -     -     -     -     -     -
Total Non-Core                47    50    48    52    54    74    74
Total Core + Non-Core (USD, thousands):
  287,802  292,246  294,050  353,846  394,035  738,000  918,703


B. Regional breakdown. Regional Bureau for Asia and the Pacific (percentages)

Programme Expenditure 2000-2006
                            2000  2001  2002  2003  2004  2005  2006
Core
  TRAC 1/2                    45    39    28    25    19    17    19
  TRAC 3                       1     1     1     0     0     1     1
  Other                        5     2     2     2     0     0     0
Total Core                    51    42    31    27    19    18    19
Non-Core
  Government Cost Sharing      5    16     7     9     4     2     3
  Donor Cost Sharing          10     5    17    21    34    37    34
  Thematic Trust Funds         -     -     1     1     1     1     0
  Trust Funds                 17    15    31    30    34    35    34
  GEF                          7    15    10     9     5     5     7
  MP/CAP21                     9     8     4     3     2     2     2
  Other Development PAF        -     -     -     -     0     0     0
  Other                        -     -     -     -     -     -     -
Total Non-Core                48    59    70    73    80    82    80
Total Core + Non-Core (USD, thousands):
  273,686  290,553  375,157  405,806  663,550  812,226  730,316


B. Regional breakdown. Regional Bureau of Arab States (percentages)

Programme Expenditure 2000-2006
                            2000  2001  2002  2003  2004  2005  2006
Core
  TRAC 1/2                    13    10     8     8    14    11     8
  TRAC 3                       2     1     1     1     3     1     1
  Other                        1     1     1     1     0     0     0
Total Core                    16    12    10    10    17    12     9
Non-Core
  Government Cost Sharing     18    20    15    17    21    19    23
  Donor Cost Sharing           3     4     4     4    16    24    33
  Thematic Trust Funds         -     -     0     1     0     1     0
  Trust Funds                 57    58    66    63    40    38    31
  GEF                          5     5     4     5     5     5     4
  MP/CAP21                     2     1     1     1     0     0     0
  Other Development PAF        -     -     -     -     0     0     0
  Other                        -     -     -     -     -     -     -
Total Non-Core                85    88    90    91    82    87    91
Total Core + Non-Core (USD, thousands):
  211,962  224,017  284,962  292,899  234,229  326,145  382,867


B. Regional breakdown. Regional Bureau for Europe and the CIS (percentages)

Programme Expenditure 2000-2006
                            2000  2001  2002  2003  2004  2005  2006
Core
  TRAC 1/2                    15    15    15    15    16    14    13
  TRAC 3                       3     2     0     1     5   (2)     1
  Other                        2     1     1     1     0     0     0
Total Core                    20    18    16    17    21    12    14
Non-Core
  Government Cost Sharing     20    24    20    22    21    24    24
  Donor Cost Sharing          35    37    35    29    27    34    24
  Thematic Trust Funds         -     -     2     2     1     1     1
  Trust Funds                 16    11    15    19    21    22    26
  GEF                          8    10    11    11     8     8    10
  MP/CAP21                     0     1     1     0     0     0     0
  Other Development PAF        -     -     -     -     1     1     1
  Other                        -     -     -     -     -     -     -
Total Non-Core                79    83    84    83    79    90    86
Total Core + Non-Core (USD, thousands):
  122,520  142,307  146,007  177,922  196,398  281,273  310,840


B. Regional breakdown. Regional Bureau for Latin America and the Caribbean (percentages)

Programme Expenditure 2000-2006
                            2000  2001  2002  2003  2004  2005  2006
Core
  TRAC 1/2                     2     2     1     2     2     2     2
  TRAC 3                       0     0     0     0     0     0     0
  Other                        0     0     0     0     0     0     0
Total Core                     2     2     1     2     2     2     2
Non-Core
  Government Cost Sharing     88    84    81    84    84    83    84
  Donor Cost Sharing           7     7    10     8     9     9     7
  Thematic Trust Funds         -     -     0     0     0     0     0
  Trust Funds                  2     3     3     2     2     2     4
  GEF                          1     3     3     3     2     3     2
  MP/CAP21                     0     1     1     1     1     1     0
  Other Development PAF        -     -     -     -     0     0     0
  Other                        -     -     -     -     -     -     -
Total Non-Core                98    98    98    98    98    98    97
Total Core + Non-Core (USD, thousands):
  886,427  999,410  937,953  1,069,437  1,128,173  1,227,976  1,417,215


B. Regional breakdown. Bureau for Development Policy (percentages)

Programme Expenditure 2000-2006
                            2000  2001  2002  2003  2004  2005  2006
Core
  TRAC 1/2                     -     -     -     -     0     1     0
  TRAC 3                       5     5     3     4     2     4     2
  Other                       26    19    23    32    18    13    16
Total Core                    31    24    26    36    20    18    18
Non-Core
  Government Cost Sharing      -     -   (2)     -     -     -     0
  Donor Cost Sharing          20    15    14    10     5     7     6
  Thematic Trust Funds         -     -     2     5     2     2     3
  Trust Funds                 41    48    49    39    53    55    52
  GEF                          6     8     7     9    16    16    20
  MP/CAP21                     3     5     4     1     4     2     2
  Other Development PAF        -     -     -     -     -     -     -
  Other                        -     -     -     -     -     -     -
Total Non-Core                70    76    74    64    80    82    83
Total Core + Non-Core (USD, thousands):
  79,650  82,375  102,453  102,724  253,984  276,866  289,005


C. Breakdown for Country Case Studies. Zambia (percentages)

Programme Expenditure 2000-2006
                            2000  2001  2002  2003  2004  2005  2006
Core
  TRAC 1/2                    50    52    75    67    85    82    48
  TRAC 3                       0     0     0     1     2     1     0
  Other                        7     3     6    14     8     5     2
Total Core                    57    55    81    82    93    88    50
Non-Core
  Government Cost Sharing      0     0   (2)     0     0     0     0
  Donor Cost Sharing           5    20    21     7     0     0    38
  Thematic Trust Funds         0     0     1     8     3     1     0
  Trust Funds                  0     0     2   (2)     0     0     1
  GEF                          0     0     0     2     3     5    10
  MP/CAP21                     0     0     0     0     0     0     0
  Other Development PAF        0     0     0     0     2     5     1
  Other                       39    25   (3)     2     0     2     0
Total Non-Core                44    45    19    17     8    13    50
Total Core + Non-Core (USD, thousands):
  5,552  5,346  3,681  3,635  5,016  6,326  17,450


C. Breakdown for Country Case Studies. Moldova (percentages)

Programme Expenditure 2000-2006
                            2000  2001  2002  2003  2004  2005  2006
Core
  TRAC 1/2                    32    37    45    65    51    27    33
  TRAC 3                       0     0     0     0     0     0     1
  Other                        4     1     1     3     5     3     2
Total Core                    36    38    46    68    56    30    35
Non-Core
  Government Cost Sharing      1     1     0     0     4     0     2
  Donor Cost Sharing          50    58    39     9     5    19    21
  Thematic Trust Funds         0     0    10    11     2     1     6
  Trust Funds                  4     1     2     5    23    46    33
  GEF                          7     3     3     2     4     1     0
  MP/CAP21                     3     0     0     0     0     1     1
  Other Development PAF        0     0     0     0     4     1     0
  Other                        0     0     0     4     2     2     1
Total Non-Core                64    63    54    31    44    71    64
Total Core + Non-Core (USD, thousands):
  1,857  2,415  1,804  1,576  2,897  6,708  6,977


C. Breakdown for Country Case Studies. Indonesia (percentages)

Programme Expenditure 2000-2006
                            2000  2001  2002  2003  2004  2005  2006
Core
  TRAC 1/2                    24    22    16    18    13    10     9
  TRAC 3                       -     -     -     -     -     1     1
  Other                        4     1     0     3     1     1     1
Total Core                    28    23    16    21    14    12    11
Non-Core
  Government Cost Sharing      6    43     2     2     1     0     0
  Donor Cost Sharing           8  (11)    19    19     9    28    65
  Thematic Trust Funds         -     -     -     0     0    35    10
  Trust Funds                  -     -     -     -     -     -     -
  GEF                          3     3     2     1     0     0     0
  MP/CAP21                     2     3     3     3     7     1     1
  Other Development PAF        -     -     -     -     -     -     -
  Other                       54    40    57    54    69    24    13
Total Non-Core                73    78    83    79    86    88    89
Total Core + Non-Core (USD, thousands):
  11,458  17,613  26,076  32,620  58,921  72,331  96,805


C. Breakdown for Country Case Studies. Egypt (percentages)

Programme Expenditure 2000-2006
                            2000  2001  2002  2003  2004  2005  2006
Core
  TRAC 1/2                    12     7    10     6    83    80     3
  TRAC 3                       0     0     0     0     0     0     0
  Other                      (2)     1     0     1    15    17     1
Total Core                    10     8    10     7    98    97     4
Non-Core
  Government Cost Sharing     66    72    73    68     2     2    73
  Donor Cost Sharing           9     4     4     6     0     0    10
  Thematic Trust Funds         0     0     0     1     0     0     0
  Trust Funds                  0     0     1     5     0     0     7
  GEF                         12    13    11    11     0     0     5
  MP/CAP21                     3     3     0     2     0     0     0
  Other Development PAF        0     0     0     0     0     0     0
  Other                        0     0     0     0     0     0     1
Total Non-Core                90    92    89    93     2     2    96
Total Core + Non-Core (USD, thousands):
  18,877  23,072  15,139  19,573  1,419,808  1,728,887  49,134


C. Breakdown for Country Case Studies. Argentina (percentages)

Programme Expenditure 2000-2006
                            2000  2001  2002  2003  2004  2005  2006
Core
  TRAC 1/2                     0     0     1     0     0     0     0
  TRAC 3                       0     0     0     0     0     0     0
  Other                        0     0     0     0     0     0     0
Total Core                     0     0     1     0     0     0     0
Non-Core
  Government Cost Sharing     99    97    97    98    95    95    99
  Donor Cost Sharing           0     2     1     0     2     2     0
  Thematic Trust Funds         0     0     0     0     0     0     0
  Trust Funds                  0     0     0     0     0     0     0
  GEF                          0     0     1     0     0     0     0
  MP/CAP21                     1     1     1     1     0     0     0
  Other Development PAF        0     0     0     0     0     0     0
  Other                        0     0     0     0     3     2     0
Total Non-Core               100   100    99   100   100   100   100
Total Core + Non-Core (USD, thousands):
  180,940  136,894  58,873  136,777  245,178  321,881  271,989


Annex 8

SUMMARY OF RESULTS FROM THE STAFF SURVEY ON RESULTS-BASED MANAGEMENT

Figures are the percentage of respondents (A)greeing or (D)isagreeing with each statement, shown as All (n=365) / RC/RR/DRR (n=52) / Others (n=313).

Culture and leadership

3. UNDP encourages risk taking and mistakes in the pursuit of results [A: 35 / 52 / 32; D: 65 / 48 / 68]
4. In UNDP it is more important to achieve results, than to follow process and deliver outputs [D: 66 / 64 / 66]
5. In my country office an adequate budget is made available for operating the results-based management system [A: 46 / 56 / 44; D: 54 / 44 / 56]

Programme focus

6. The main value of the service lines is in allowing us to focus our programme by saying no to partners in non-strategic areas [A: 53 / 58 / 52]
7. UNDP outcomes in my country are developed through a process that brings ownership by all stakeholders (government, other UN organizations, development partners, civil society) [A: 76 / 87 / 74]
8. It is normal in our country office that policy and planning decisions are informed by empirical evidence on past performance [A: 66 / 77 / 64]
9. The organization in my office is structured to deliver the CPAP outcomes [A: 72 / 79 / 71]
10. I can confidently explain to my colleagues and development partners the difference between an output and an outcome [A: 93 / 96 / 93]
11. I can explain clearly how outputs contribute to programme outcomes [A: 94 / 100 / 93]
12. The focus of management in my country is the achievement of outcomes rather than implementation of individual projects [A: 56 / 69 / 54]

Monitoring and reporting

13. The ROAR is an effective outcome monitoring tool [A: 59 / 41 / 62; D: 41 / 59 / 38]
14. The country office Balanced Scorecard is more important than the ROAR in managing for results at country programme level [D: 62 / 73 / 60]
15. Monitoring and reporting are well harmonized with other development partners and make use of country reporting systems [D: 67 / 94 / 63]

Adjustment and learning

16. Stakeholders and managers collectively analyze performance and decide on action [A: 61 / 65 / 60]
17. Development managers have the latitude, flexibility and authority to arrange resources (financial and personnel) as required to achieve the desired outcomes [A: 57 / 73 / 54]
18. There is a clear link between allocation of the Biennial Support Budget and four-yearly Programme Allocation, and evidence of results in our country programme [D: 61 / 69 / 60]
19. Because most of our funds are raised through cost sharing or from donors, we have little scope in allocating resources across our programme or within outcome areas according to results [A: 56 / 55 / 56]
20. Whether positive or negative, performance information is used to foster learning [A: 72 / 82 / 70]
21. There is effective follow-up and actions on management response to evaluations [A: 61 / 73 / 59]

Evaluation and accountability

22. Roles and responsibilities at all levels in my country office are clearly set out and known to staff [A: 61 / 83 / 57]
23. Under the RCA, the key factor in UNDP enhancing promotion and advancement prospects is demonstrating a proven ability to raise resources and in delivery [A: 49 / 37 / 51; D: 51 / 63 / 49]
24. The RC/RR/CD is accountable for achievement of country programme outcomes [A: 79 / 74 / 80]
25. The RC/RR/CD can only be held accountable for delivery of UNDP outputs [D: 61 / 45 / 64]
26. In my office, country programme staff are under more pressure to raise resources and ensure timely delivery than on enhancing the contribution by UNDP to achievement of the outcomes [A: 64 / 53 / 66]

Support systems

27. I can easily find guidelines and support from the RSCs and headquarters to help design objectives and indicators for projects and programmes [A: 55 / 35 / 58; D: 45 / 65 / 42]
28. The training I have received has equipped me with the ability to plan and manage for outcomes [A: 60 / 67 / 59]
29. In our country office adequate time and structured occasions are made available to learn from results and evaluations [D: 58 / 39 / 61]
30. UNDP's rewards systems provide real incentives for strengthening a results culture within the organization [D: 76 / 81 / 75]


Annex 9

RESULTS-BASED MANAGEMENT BENCHMARKS ASSESSMENT

Table 1. Rationale for judgement against benchmarks69

69 Sources of benchmarks: JIU – Fontaine Ortiz E, Kuyama A, Münch W, Tang G, ‘Managing for Results in the UN System Part 1: Implementation of Results-based Management in the United Nations Organizations’, Joint Inspection Unit, United Nations, Geneva, Switzerland, 2004. Paris – Paris Declaration on Aid Effectiveness, 2005. MfDR – OECD, ‘Emerging Good Practice in Managing for Development Results – Source Book (2006)’, Joint Venture on Managing for Development Results, OECD DAC.

70 Fontaine Ortiz E, Kuyama A, Münch W, Tang G, ‘Managing for Results in the UN System Part 1: Implementation of Results-based Management in the United Nations Organizations’, Joint Inspection Unit, United Nations, Geneva, Switzerland, 2004.

PLANNING – GOALS

Benchmark 3: Long-term objectives have been clearly formulated for the organization (Joint Inspection Unit [JIU])

JIU70 states that "A key step for RBM [results-based management] is to identify the long-term goals and objectives to be pursued by the organization. They derive from the statutory instruments, the mission statement and the related mandates contained in the pertinent resolutions and decisions. The main actions to be carried out are to:
(1) Adopt a long-term planning instrument for the organization (corporate strategic framework).
(2) Identify the internationally agreed goals that closely relate to the organization's mission, and to which a contribution from the organization is expected in view of its specific mandate and sphere of competence.
(3) Define clearly the long-term objectives for the organization that would contribute to the attainment of the identified goals; the objectives should be specific, measurable, achievable, relevant and time-bound (SMART) and constitute, therefore, the critical results to be accomplished or assessed by the organization over the period of time covered by its strategic framework.
(4) Ensure a participatory process in the development of the corporate strategic framework."

Partially achieved.
(1) MYFF 1 and 2 and the proposed Strategic Plan represent the long-term planning instrument.
(2) Internationally agreed goals that closely relate to the organization's mission, and to which a contribution from the organization is expected in view of its specific mandate and sphere of competence, should be identified. The MDGs and reporting against them are not explicitly incorporated into the MYFF planning and reporting framework, although selection of the practice areas and explanation of the assumed linkage to the MDGs is found in the MYFFs.
(3) Long-term objectives are defined in MYFF 1 and 2 and also in the Strategic Plan. In practice, these objectives do not define targets that drive planning at operational level. Instead, they are expected to be used as a set of boundary rules defining areas in which UNDP should and should not work. To date, definition of SMART indicators has been the major weakness. For example, the conclusion from MSI was that "A performance monitoring plan with performance indicators, definitions, units of measure, and specified data collection methodologies has not been developed to measure MYFF goals, service lines or core results."71
(4) The process for development of the corporate strategic framework is not participatory, but in theory it is an aggregation of objectives and results agreed by UNDP at country level that have been derived using a participatory process.

Goals reflect the mandate of UNDP and its comparative advantage (Paris 3)

Achieved.
MYFF 1 and 2 and the proposed Strategic Plan clearly reflect UNDP's mandate and—despite changes between plans, which have been aimed at simplifying the presentation—the plans have been consistent on what UNDP's mandate is.

MYFF 1 and 2 and the Strategic Plan are intended to be statements of what is seen at corporate level as UNDP's comparative advantage. However, little evidence is presented to substantiate the assertion of what that comparative advantage is.

Goals reflect development objectives of member countries (Paris 3)

Achieved.
Dealt with when UNDAF and UNDP outcomes are agreed with partner governments.

Goals are developed through a process that brings ownership by all stakeholders (Managing for Development Results [MfDR] 1)

Not achieved.
Not relevant—see discussion of outcomes below.

Results lead directly to strategic goals, adapted to partner country context (Paris 7) (MfDR)

Results can be attributed plausibly to programmes (MfDR)

Partially achieved.
Not achieved at corporate level; see the conclusion from MSI that "A performance monitoring plan with performance indicators, definitions, units of measure, and specified data collection methodologies has not been developed to measure MYFF goals, service lines or core results."72

At the country level, UNDP should be able to substantiate how results lead directly to strategic goals, adapted to partner country context. This would be through the UNDAF/CPD/CPAP planning processes. Evidence from the evaluation case studies is that results frameworks do not convincingly link results directly with achievement of the goals specified. Findings of the evaluation case studies agree with those of others, including:


71 MSI, ‘Background Paper on Results-Based Management in Development Organizations’, Management Systems International, July 2006, p 6.

72 Ibid, p 6.

PLANNING – RESULTS AND RESOURCES


A UNDP assessment72 concluded that:

"One of the challenges to improving the strength of practice identified is the quality or evaluability of the planning and programming frameworks upon which M&E rely. As noted in the review of the pilot UNDAF matrices, and highlighted in the review of outcome evaluations, the lack of clearly defined and measurable objectives, supported by appropriate indicators and data streams, has restrained the accurate assessment of results for sector and inter-sectoral learning and corporate accountability.

As a further, more unique and internal constraint, it was raised that the planning horizon in UNDP may be too short to be able to meaningfully assess progress with outcomes. Whereas outcomes may easily take 5-10 years to materialize, most programmes are designed for a 2-4 year duration, intentionally being kept within the timeframe of the CPD/CCFs. The challenge for country offices is then to identify methodologies, standards and indicators that meaningfully capture performance and attribution within the timeframe of their operational horizon."

Longhurst73 concluded the following about UNDAF results matrices:
- Poor identification of risks and assumptions.
- Lack of targets and timelines.
- Outputs not linked to those accountable.
- Indicators not properly identified, often vague and very general. (This is not a matter of whether qualitative or quantitative—there should be a balance—but whether the correct indicator has been chosen. A quantitative element does help in later M&E.)
- Broadness of outcomes, so that there was no suitable practical indicator to show whether the outcome had been achieved or not. If national priorities are broad, then the outcomes will be as well.
- Commitment to disaggregating results by gender, which emphasizes the general view that gender mainstreaming needs to be strengthened in the UNDAF.

Benchmark 4: The organization's programmes are well aligned with its long-term objectives (JIU)

JIU74 states that to achieve this, the following conditions are required:
(1) A clear definition of the cascade of objectives at each level of the organization's programme structure.
(2) Ensuring that objectives are logically consistent among levels, reflecting cause-effect linkages.
(3) Regrouping and reformulating the programmes to provide better focus for the work of the organization within the framework of the identified long-term objectives, thereby avoiding strategic disconnect in programme planning.
(4) Adoption of results-based management tools and approaches to the specificities of various operational entities.

Partially achieved.


72 UNDP, ‘An Assessment of UNDP Practices in Monitoring and Evaluation at the Country Level’, UNDP Evaluation Office, New York, NY, February 2005, para 7.4-7.5.

73 Longhurst R, ‘Review of the Role and Quality of the United Nations Development Assistance Frameworks (UNDAFs)’, Overseas Development Institute, London, England, May 2006, p 26.

74 Fontaine Ortiz E, Kuyama A, Münch W, Tang G, ‘Managing for Results in the UN System Part 1: Implementation of Results-based Management in the United Nations Organizations’, Joint Inspection Unit, United Nations, Geneva, Switzerland, 2004.


(1) A clear definition of the cascade of objectives at each level of UNDP's programme structure is rarely achieved.75
(2) Ensuring that objectives are logically consistent among levels, reflecting cause-effect linkages, is not consistently achieved.76
(3) Evidence from the evaluation case studies is that regrouping and reformulating the programmes to provide better focus for the work of the organization within the framework of the identified long-term objectives, thereby avoiding strategic disconnect in programme planning, does not generally happen. Instead, changes in the strategic objectives and framework trigger remapping of projects and programmes, rather than a re-appraisal of the programmes and portfolio of projects.
(4) Adoption of results-based management tools and approaches to the specificities of various operational entities has happened to varying extent. See evidence in the evaluation case studies, and also the following conclusion from UNDP:77

"The pre-2001 project-orientated framework for M&E remains strongly embedded in many country practices, overlain at a higher level by elements of the new (post-2001) results-orientated approach. The variation in practice and experience relates primarily to the extent to which country offices have sought to not integrate and localize elements of old and new within their specific partnership and country contexts."

Benchmark 5: The organization's resources are well aligned with its long-term objectives (JIU)

JIU78 states that to achieve this, the following conditions are required:
(1) Ensuring coherence and compatibility between budgeting and programming decisions (e.g., any budget cuts should correspond to specific and identified programme cuts).
(2) Development of effective cost accounting systems that allow linking expenditures to expected results.
(3) Adoption of a programming instrument linking resources to results.
(4) In case of a short programming cycle (2-3 years), merging budgeting with programming and appropriating the necessary resources.
(5) In case of medium-term programming (4 years or more), approving a targeted overall level of resources and appropriating on an annual or biennial basis.
(6) Identifying under-performing, obsolete or marginal programmes and activities over time and shifting resources not only to proven efficient and relevant ones, but also to those programmes considered to be of the highest priority.

Partially achieved.
(1) At present, at the corporate level, ensuring coherence and compatibility between budgeting and programming decisions (e.g., any budget cuts should correspond to specific and identified programme cuts) is difficult, since the programming and budgeting planning cycles are separate. A proposal to integrate them into one cycle and link programming and budgeting more explicitly was agreed upon by the Executive Board in 2006, for implementation from 2008.
(2) Development of effective cost accounting systems that allow linking expenditures to expected results started with the introduction of ATLAS in 2004, but progress has been slow: while it is agreed that the potential to use these data in a strategic manner exists, many senior managers have not yet adopted this practice.

75 UNDP, ‘An Assessment of UNDP Practices in Monitoring and Evaluation at the Country Level’, UNDP Evaluation Office, New York, NY, February 2005; and Longhurst R, ‘Review of the Role and Quality of the United Nations Development Assistance Frameworks (UNDAFs)’, Overseas Development Institute, London, England, May 2006.

76 Ibid.

77 UNDP, ‘An Assessment of UNDP Practices in Monitoring and Evaluation at the Country Level’, UNDP Evaluation Office, New York, NY, February 2005, para 4.3.

78 Fontaine Ortiz E, Kuyama A, Münch W, Tang G, ‘Managing for Results in the UN System Part 1: Implementation of Results-based Management in the United Nations Organizations’, Joint Inspection Unit, United Nations, Geneva, Switzerland, 2004, 47-55.

(3) Adoption of a programming instrument linking resources to results. Achieved with the introduction of ATLAS.
(4) In case of a short programming cycle (2-3 years), merging budgeting with programming and appropriating the necessary resources. In theory, achieved through the introduction of the CPAP from 2006.
(5) In case of medium-term programming (4 years or more), approving a targeted overall level of resources and appropriating on an annual or biennial basis. Achieved at the country level through the CPDs.
(6) Identifying under-performing, obsolete or marginal programmes and activities over time and shifting resources not only to proven efficient and relevant ones, but also to those programmes considered to be of the highest priority. There is no transparent and systematic system to achieve this. Note that most TRAC funding is allocated to country programmes on an entitlement, not a priority/results, basis. There are varying levels of constraint on allocating non-TRAC resourcing at the corporate level, since priorities are somewhat set by donors. There is also evidence that even where there has been the opportunity to allocate funds on the basis of priorities and results (i.e., within thematic trust funds and TRAC 1.1.2), this has not been done. At the country level, the evaluation case studies suggest that there is no systematic tool for doing this.

In terms of staffing:

MSI79 concludes that "UNDP's headquarters' structure is not organized to support the implementation of the service lines. For example, there are not active or designated service line managers for many of the service lines, and programming models, best practices and toolkits have not been developed and validated."

The Management Review states:80

"According to a recent survey of COs [country offices],81 94% of RCs and DRRs indicated that the practices and service lines introduced in 2002 have benefited their work by providing greater focus, improved knowledge management and a focus on results, and through facilitating positioning and advocacy at the national level. Having successfully introduced the practice architecture, the next challenge will be to strengthen and refine the practice approach and its implementation. This is critical, given that country offices, despite their satisfaction with the overall practice architecture, have not received consistent and high quality support across all service lines, and that support has generally not included advice to UN Country Teams. Concerns have been voiced repeatedly over:
- The lack of systematic and coherent delivery of policy advisory services;
- Inadequate definition of roles and responsibilities;
- A high cost and lack of flexibility in the financial model;
- Too broad a scope in terms of themes addressed and products and services offered; and
- A disconnect between knowledge products and services and business processes."

The Management Review Team82 also states:

"In recent years, COs [country offices] have been asked to be more flexible, to do more with less, while programmatic resources and core staffing levels in some offices have fallen. In some cases, sudden increases in the volume of resources have forced COs to quickly step up efficiency and performance. Consultations at the GMTM in January 2006 and ad-hoc surveys have highlighted both the pressure felt in COs for increased accountability and delivery of results, and the related constraints that make this goal hard to achieve. Operational capacity constraints also prevent COs from reaching goals in delivery, and thus contribute to the build-up of unspent balances. The most commonly cited constraints fall mainly into two categories, i.e., operations-related processes and systems (procurement, ATLAS, reporting, etc.) and staff capacity/knowledge (resident capacities, staff profiles, training and learning, etc.), and are largely attributed to a lack of adequate human and financial resources. While Central Services Bureaux are seen as providing some of the needed support (including through the Regional Service Centers), this is not perceived to be enough or in the areas needed to make up for the shortfall in capacity."

79 MSI, 'Background Paper on Results-Based Management in Development Organizations', Management Systems International, July 2006, p 7.
80 UNDP, 'UNDP Management Review Phase II – Report of the Management Review Team', UNDP, March 2007, para 73.
81 This survey was undertaken within the framework of Phase II of the Management Review.
82 UNDP, 'UNDP Management Review Phase II – Report of the Management Review Team', UNDP, March 2007, para 63.

Benchmark: Outcomes are developed through a process that brings ownership by all stakeholders (MfDR 1)

Assessment: Partially achieved. Outcomes are supposed to be developed in a participatory fashion as part of the UNDAF/CPD/CPAP development process. 76% of UNDP staff responding to the evaluation survey agree that outcomes bring ownership. This finding is supported by the Egypt and Indonesia case-study evidence, but in Moldova ownership is poor, partially due to lack of capacity within the government and its level of substantive engagement in the process.

Benchmark: Development managers have the latitude, flexibility and authority to arrange resources as required to achieve the desired outcomes (MfDR 4)

Assessment: Partially achieved. 57% of UNDP staff responding to the evaluation survey agree that managers have the latitude, flexibility and authority to arrange resources as required to achieve the desired outcomes. Evidence from the evaluation case studies suggests that managers (RR/DRR level) have the necessary discretion and power at country level, but do not have enough control of the resource base to be able to use that discretion and power fully.

Benchmark: Resources are committed in a predictable fashion

Assessment: Partially achieved. TRAC 1.1 and 1.2 resources are allocated on a predictable basis to country programmes. However, non-TRAC funding is far greater in volume and is much less predictable. In the case of donor or government cost-sharing, funding is only predictable once an agreement is formally made. Trust fund resources are normally allocated on an annual basis and must be used within the financial year. The inability to roll over TRAC and trust fund resources from one year to the next creates pressure to disburse, even when it is recognized that the expenditure will be ineffective.

MONITORING AND REPORTING

Benchmark 6: An effective performance monitoring system is in place (JIU) (periodic assessments of performance are made against defined targets)

The JIU83 states that to achieve this, the following conditions are required:
(1) Adoption of clear provisions for the supervisors to systematically verify that tasks assigned to meet the objectives and targets are being successfully carried out.
(2) Identification of the type of data and information needed to be collected for performance monitoring.
(3) Assignment of clear responsibilities among staff and managers for performance monitoring.
(4) Linking future resource disbursements for programmes to the discharge of their performance monitoring requirements.
(5) Refining the quality of the defined results and indicators through the process.
(6) Using both qualitative and quantitative indicators, as appropriate, and identifying standard or key indicators to measure performance at the corporate level.
(7) Establishment of baselines and targets against which progress could be measured over a certain period of time.
(8) Simplification of performance measurement, including through the initial use of relatively few results statements and performance indicators.
(9) Development of a clear information and communication strategy to guide, inter alia, the selection of the performance monitoring information system to be used, and ensure coherence in systems throughout the organization.
(10) Weighing carefully the return on investment expected from various options to select performance monitoring information systems.
(11) Ensuring that performance information systems are supported by a reliable telecommunications infrastructure.

83 Fontaine Ortiz E, Kuyama A, Münch W, Tang G, 'Managing for Results in the UN System Part 1: Implementation of Results-based Management in the United Nations Organizations', Joint Inspection Unit, United Nations, Geneva, Switzerland, 2004.

Assessment: Partially achieved.
(1) Adoption of clear provisions for the supervisors to systematically verify that tasks assigned to meet the objectives and targets are being successfully carried out is achieved. This operates through the RCA.
(2) Identification of the type of data and information needed to be collected for performance monitoring is partially achieved. As the references above show, definition of appropriate indicators is relatively poor.
(3) Assignment of clear responsibilities among staff and managers for performance monitoring is partially achieved. The role of regional bureaux in oversight of country offices has not been well defined during the evaluation period, as confirmed by the evaluation case studies. Within country programmes, evaluation case studies confirm the findings of UNDP:84

"The strength of implementation of plans and associated systems was found to reflect in large part the nature and quality of internal organizational arrangements for M&E. To date, very few COs [country offices] were found to have M&E officers, although the majority (just over 100 offices to date) have nominated M&E focal points. In the latter case, these individuals range in terms of function and location within the office from DRR to JPO, thus reflecting a spectrum of ability and decision-making authority. Further, the location of the M&E officer/focal point was also found to have an effect over the nature of the function, whether located more on the programme or operations side."

(4) Linking future resource disbursements for programmes to the discharge of their performance monitoring requirements is focused at the level of delivery.
(5) Refining the quality of the defined results and indicators through the process happens in some country offices.
(6) Using both qualitative and quantitative indicators, as appropriate, and identifying standard or key indicators to measure performance at the corporate level: intermittent application of this approach.
(7) Establishment of baselines and targets against which progress could be measured over a certain period of time became mandatory at outcome level from 2003.
(8) Simplification of performance measurement, including through the initial use of relatively few results statements and performance indicators, is not suggested in UNDP guidance.


84 UNDP, 'An Assessment of UNDP Practices in Monitoring and Evaluation at the Country Level', UNDP Evaluation Office, New York, NY, February 2005, para 4.13.


(9) Development of a clear information and communication strategy to guide, inter alia, the selection of the performance monitoring information system to be used, and ensure coherence in systems throughout the organization. Not found within UNDP. Note the conclusion from the MSI85 study:

"UNDP has developed numerous monitoring systems—including the Balanced Scorecard, the Partner Survey, the Staff Survey, project evaluations, and country assessments—but these tools do not always communicate and complement one another, in particular in relation to the MYFF objectives. It is not clear what program and management decisions are guided by MYFF performance data."

(10) Weighing carefully the return on investment expected from various options to select performance monitoring information systems has not been done.
(11) Ensuring that performance information systems are supported by a reliable telecommunications infrastructure. Implementation of robust telecommunications systems was one focus of the Administrator's Business Plans, 2000-2003.

Benchmark: Qualitative and quantitative arrangements are in place making use of country-level mechanisms (Paris 10) (MfDR)

Assessment: Not achieved. 67% of those responding to the evaluation survey disagree with this statement.

Benchmark: Monitoring and reporting are harmonized with other donors (MfDR)

Assessment: Not achieved. 67% of those responding to the evaluation survey disagree with this statement. Evidence from evaluation case studies indicates some harmonization around (driven by) other donors' systems (e.g., the Egypt Female Genital Mutilation project). The finding is supported by that of a UNDP study:86

"The interaction between UNDP and the state and non-state actors in the field of M&E at the country level was found to be limited. With few exceptions, UNDP's M&E activities were found to be focused solely on its own programmes, despite often managing and supporting M&E related initiatives."

ADJUSTMENT AND LEARNING

Benchmark: Whether positive or negative, performance information is used to support constructive and proactive management decision making (MfDR 5)

Assessment: Partially achieved. Regional bureaux will respond if poor performance is scored in the ROAR of a country programme but, in general, oversight focuses almost exclusively on resource mobilization and delivery (see evaluation case studies). Country programmes see arrangements with the regional bureau as being for reporting rather than management.

There is little evidence from the evaluation case studies of use of performance information at outcome level, but there is evidence of use at project level. A UNDP review found:87

85 MSI, 'Background Paper on Results-Based Management in Development Organizations', Management Systems International, July 2006, p 7.
86 UNDP, 'An Assessment of UNDP Practices in Monitoring and Evaluation at the Country Level', UNDP Evaluation Office, New York, NY, February 2005, para 4.4.
87 Ibid., para 5.1-5.2.


"The issue of use relates primarily to the questions of what purposes M&E information is and can be utilized for, and to what effect(s). The findings identify five main uses: first, internal CO [country office] operational planning and management (to feed into staff meetings, retreats, performance assessment); second, corporate reporting (e.g. SRF/MYFF, ad hoc headquarters requests for information); third, partner/counterpart liaison, consultations and decision-making (e.g. steering committee meetings, TPR); fourth, resource mobilization (project briefs, proposals, 'success stories'); and fifth, as substantive contributions in participation in Government/donor consultation groupings and UN inter-agency working groups. Less evidence was found on dissemination of lessons learned for global and cross-national learning.

In most of the cases, information derived from the monitoring function has been found to be confined to a small cadre of operational staff within the arrangement of implementing agency, UNDP and government, servicing the input and activity tracking functions. The notion of participation in this context often hinged only upon the involvement of these 'internal' stakeholders, and thus relates, even unintentionally, largely to the control function of projects. However, where regular feedback mechanisms were found, where steering committees and Tripartite Reviews are systematically practiced and located within broader programmatic structures, the information generated was found to be of wider operational and strategic relevance."

Benchmark: Stakeholders and managers collectively analyze performance and decide on action

Assessment: Partially achieved. In the evaluation survey, 61% agreed with this statement, although the focus was at project level.

Benchmark: Development managers have the latitude, flexibility and authority to arrange resources as required to achieve the desired outcomes (MfDR 4)

Assessment: Partially achieved. In the evaluation survey, 57% agreed with this statement, although the focus was at project level.

Benchmark: Whether positive or negative, performance information is used to foster learning (MfDR 5)

Assessment: Partially achieved. In the evaluation survey, 72% agreed with this statement, but 58% say there is inadequate time and resources for learning.

Benchmark 9: A knowledge-management strategy is developed to support results-based management (JIU)

Assessment: Partially achieved. There is a knowledge management strategy, seen through the practice networks, but this is not geared to results-based management. The evolving role of the RSC is under debate.


EVALUATION AND ACCOUNTABILITY

Benchmark 7: Evaluation findings are used effectively (JIU)

Assessment: Partially achieved. Findings from the evaluation case studies agree with those of the UNDP study:88

"In terms of UNDP and its primary partners own use, the findings from the country assessments and the RBEC workshop suggested the majority of evaluations conducted have enabled senior management to determine and respond to aspects of quality of the delivery, strengths and weaknesses, potential gaps and identified future needs. Country offices stated that they were using the findings of the evaluations for strategic and programmatic adjustment and development. This includes the drafting of new SRF and Country Programme documents, and in one case the complete reassessment of work arrangements, approach, staff needs and advocacy in the area attended by the evaluation. The wide scope of outcome evaluation, beyond the traditional sectoral boundaries, has in other cases identified the lack of cross-fertilization of knowledge and ideas within UNDP and beyond. In another, post-conflict, case, the evaluation was recognized as the first ever comprehensive written account of what UNDP set out to do, and what has effectively been achieved in relation to other donor/NGO contributions.

The format or presentation of knowledge generated by evaluation was found to impact the extent to which it was likely to be taken up by diverse audiences. In the majority of cases, the findings and recommendations were presented orally to UNDP staff and one or two key stakeholders, and through the written reports. Only in a few cases was there evidence of an extensive dissemination process. With few exceptions, country offices and their partners felt that there was a broader audience that could benefit from the information generated. This audience was generally perceived to be the public or the end-users to whom the outcomes ultimately pertain."

Benchmark 2: Performance-oriented systems of accountability replace traditional, compliance-based systems (JIU)

Assessment: Partially achieved. The RCA was modified in 2002 to reflect a more results-oriented approach. Opinion was evenly divided in the evaluation survey on the degree to which this had shifted the focus from compliance to results. However, 64% reported that mobilizing resources and delivery were more important than results.

Benchmark 3: Accountability is applicable at all levels, from the top down. The executive heads and the heads of major organizational units are therefore the first to be held accountable for the results that they are expected to deliver (JIU)

Assessment: Not achieved. Evidence from the evaluation case studies strongly indicates that accountability focuses almost exclusively at the level of the Resident Coordinator/Resident Representative. Programme staff do perceive that they are accountable for delivery of project outputs, but nobody is identified as accountable for ensuring that projects and programmes are managed to maximize their contributions to achievement of the outcome, which is the key aspect expected in the results-based management system.

88 UNDP, 'An Assessment of UNDP Practices in Monitoring and Evaluation at the Country Level', UNDP Evaluation Office, New York, NY, February 2005, para 5.4-5.5.

Benchmark: Assessment of performance and accountability for results takes into account both contextual factors and risks and makes adjustments accordingly (MfDR 5)

Assessment: Partially achieved. Technically this is covered in the ROAR, Balanced Scorecard and RCA reporting. In practice, this is not a concern for most staff, since targets are formally agreed between four and six months into the annual programming period and can therefore be set to reflect ex-post changes.


Annex 10

FINDINGS FROM THE COUNTRY CASE STUDIES

ARGENTINA

PERCEIVED STRENGTHS OF THE RESULTS-BASED MANAGEMENT SYSTEM

The Argentina Country Office has built a strategic direction for results-based management within a broad set of organizational and process reforms that have been ongoing during the last two to three years within the country office. Within a relatively short period of time, a transition process has been developed, launched and is generating change.

The results-based management model that has been developed presents 'results' and results-based management in a holistic manner. It focuses on aligning three levels of performance/results: the development programme level and project level; the institutional level; and the cross-agency level involving UN coordination. This helps underscore the importance of not only measuring results/impacts, but also understanding the process. That is, observing the 'how' in order to be able to provide advice and lessons learned to help others change behaviour.

Managing for results has brought a more systematic and disciplined move to project development and monitoring, and a series of new tools and processes has been introduced and shared with national counterparts.

The country office and project-level systems have introduced a platform with which to pursue results-based management capacity building with partners, and coordination and harmonization with other UN organizations and donors.

The dedicated M&E Unit (now three full-time officers) has been identified as an important supporting element to the results-based management exercise in the Argentina Country Office.

Associated with this are the training sessions offered to UNDP staff and project officers, recognized as critical elements of results-based management capacity building. There is wide use and some improved understanding of the language of results-based management, though staff indicate a need to understand better how to operationalize concepts such as indicators, targets and outcomes.

PERCEIVED WEAKNESSES OF THE RESULTS-BASED MANAGEMENT SYSTEM

There are, however, limitations to what the introduction of results-based management has produced to date in UNDP Argentina. The country office is clearly in transition in its implementation of the results-based management model, and while the model identifies a number of tools and processes, some are still under development and others are not being implemented to the extent that results-based management normally implies. Of note is the absence of outcome monitoring and outcome evaluation.

The further introduction of results-based management in the country office appears to be at a crossroads, as there does not appear to be a consensus on how to further enhance the implementation of results-based management within the country office. The current project-level focus at this time seems to be overlooking the broader dimensions of the results-based management framework.



Despite the umbrella of 'outcomes' offered by the Country Programme and SRF, management still focuses primarily on the project, its activities and outputs. The work done to link projects to outcomes is really a mapping exercise. The assignment of projects to outcome categories is somewhat subjective, though it has allowed for the identification of outlier projects that have been targeted for removal upon completion of the cycle. While this is not necessarily problematic, it does suggest that much of this is administrative in nature, serving reporting needs rather than results-based management per se.

A weakness of the current system is that there are neither dedicated funds nor priority for outcome evaluation.

An underlying problem the country office faces is that the structure of funding available to it is at odds with managing on the basis of performance. Only a small proportion of funds have any flexibility in their use by the country office, and the necessity to mobilize funds to support the country office operation is a powerful incentive to prioritize funding over performance. In effect, the old measure of performance and 'success' ('mobilizing resources') is generally at odds with a results-based management approach.

Last, two related aspects of the nature of UNDP's activities make monitoring results a challenge. First, UNDP is an extremely small player financially, which makes it hard to attribute change to UNDP activities. Second, the upstream aspects of the country office's more substantive work, such as policy engagement, represent a difficult area in which to identify and measure outcomes. That said, both of these objections can and should be dealt with.

EGYPT

INFLUENCE OF CORPORATE STRATEGIC GOALS ON COUNTRY PROGRAMME

The two MYFFs led to structural reorganizations of the portfolio in response to corporate priorities, but the core orientation of the programme has not changed markedly from the definition in the first CCF in 1997.

The first UNDAF had no material constraining effect on the programme. The second UNDAF, completed in 2007, still has broad and permissive outcome domains. The extent to which the structure will influence UNDP will depend on the implementation of the proposed UNDAF outcome groups.

ALIGNMENT OF RESOURCES, ORGANIZATION AND MANAGEMENT

A steady change can be seen in the articulation of the programme over the period under review. Outcomes have been reworded, baseline situations are stated and outcome indicators are becoming more precise. The objectives structure of the portfolio has become increasingly complex, with the projects under the new CPD being fitted into UNDP country outcomes, core results, service lines, corporate goals and UNDAF outcomes. But the composition of the portfolio has changed little. The main change is the removal, through natural attrition, of outlier projects.

The office was organized around thematic teams during the 1990s. That was changed to a flat structure, not related to outcomes but retaining thematic teams, up to 2007. Plans are currently being developed to return to a team structure geared toward outcomes.

Owing to the changing structure of objectives, it is difficult to track changes in the configuration of financial resources around objectives. The 2007 Country Programme Evaluation concluded that the programme strategy has been driven by resource mobilization and not by results.

MONITORING SYSTEMS

Steady progress has been made in using monitoring tools. Support from the results-based management officer has been instrumental in this achievement. There are no routine mechanisms to review and improve the quality of objectives and indicators. Headquarters-promoted tools either prioritize financial management (ATLAS) or are geared towards reporting upwards (ROAR). Yet the primary unit of monitoring at the country level is the project.

Strong support to monitoring national outcomes can be seen through the NHDR, though this is not always recognized as a contribution to results-based management. Development assistance to Egypt is mostly projectized rather than delivered through budget support and, partly as a result, harmonization of M&E systems with the Government of Egypt and with development partners has developed on a project-by-project basis.

ADJUSTMENT AND LEARNING

Interviews with programme staff produced clear examples of systematic use of information to manage at the project level. This was particularly prominent in key projects with other development partners, such as addressing Female Genital Mutilation with the European Union and CIDA, and was reinforced by the project cycle management procedures used by those partners. Evidence about managing at an outcome level is less compelling. Some programme staff hold discussions around the ROAR, but there is no clear link back to decisions about programme strategy and composition. Links to regional service centres are strong in specific areas, such as the environment portfolio and, in general, the development of new projects.

EVALUATION AND ACCOUNTABILITY

The office has continued project-level evaluation even though the requirement was dropped in 2002. Five evaluations have been undertaken at the outcome level during 2006-2007, but these were delayed as they were under-budgeted and had to be financed by cost sharing across projects.89 Furthermore, a full country programme evaluation for the 2002-2006 cycle was also conducted. There is no evidence yet about how management and the regional bureau will use the findings.

There is a lack of clarity around aspects of the roles and responsibilities of staff and management in three key areas:

- Accountabilities for planning and implementing a functioning results-based management system
- Accountabilities for the use of information in the country office and at higher levels
- Accountabilities for the performance of projects, programmes and the country office itself

The programme can be characterized as having islands of excellence, such as the Female Genital Mutilation project, where monitoring, adjustment and evaluation are rigorous and effective, but those standards are not applied across the portfolio as a whole.

INDONESIA

PERCEIVED STRENGTHS OF THE RESULTS-BASED MANAGEMENT SYSTEM

Both the Strategic Results Framework (SRF) and the Integrated Resource Framework are considered useful tools by the country office in that they define and narrow down the mandate of UNDP. For an organization that does not have a lot of resources, having corporate policy on paper helps when negotiating with governments and other counterparts who might request projects totally unconnected with the UNDP mandate. The articulation of a clear mandate also helps carve a niche for the organization in an environment characterized by competition for development space and for resources. Before the introduction of the practice areas and service lines, UNDP had lost its identity, especially as the word 'development' was subject to very broad interpretation and varied between country offices.

In CPAP 2006-2010, UNDP Indonesia set programme targets for, and allocated funds to, each outcome for the first time. According to staff, this practice ensured rational allocation of resources across outcome themes such as governance, as staff often challenged whether the resources they were committing to a particular outcome would be sufficient to achieve the outcome.

89 An Assessment of Development Results was also undertaken by the UNDP Evaluation Office in 2004.

UNDP sets annual targets in the CPAP that are reported in the ROAR. However, the quarterly reviews of the CPAP assist UNDP, the government and other partners in tracking progress towards achievement of outputs and outcomes.

The introduction of new tools (especially ATLAS, the Balanced Scorecard and the RCA) has improved upward reporting and global transparency and benchmarking. The country office can compare and contrast its performance with other UNDP field offices, and this motivates improved performance.

Despite its limitations, the new RCA is appreciated by some staff as an attempt by the organization to link individual staff performance to the annual targets of the organization. Its link to learning goals and long-term staff development initiatives is particularly useful in obtaining a corporate view of training needs and planning for these activities.

The Global Staff Survey has contributed towards achieving a 360-degree feedback system. Management in the Indonesia office takes the results of the Global Staff Survey seriously.

PERCEIVED WEAKNESSES OF THE RESULTS-BASED MANAGEMENT SYSTEM

Whilst the theoretical underpinnings of the concept of 'managing for results' are sound, the roll-out plan for UNDP was not explicit and was not accompanied by adequate headquarters guidance and resources to support country offices. Based on the experience of Indonesia, it would appear that staff orientation, training and capacity building for results-based management was one of the weakest points of the UNDP results-based management roll-out process. The whole process relied too heavily on a small number of regional face-to-face courses and on intranet-based guidelines and training programmes. The use of the intranet should have been balanced with face-to-face courses at the national level. Face-to-face courses at the regional level were too few to reach a critical mass or to impart sufficient knowledge of the concept of managing for results. Implementation of results-based management within the country office is thus characterized by information asymmetry.

The training was confined to the regional level on the premise that it would reach a sample of staff who would, in turn, train others at the country level. Unfortunately, those trained at the regional level were not supported with resources to offer substantive training to the rest of the staff in the country office. Furthermore, because of the short duration of contracts, two of the three trained staff have already left the organization. It is important to note that most training is developed centrally and imposed on staff. Staff complained of redundant modules in courses such as Prince 2.

The fact that appreciation of results-based management concepts and tools varied across staff within one country office compromised the impact of results-based management on internal organizational efficiency and development effectiveness.90

The Service Lines were expected to do much to narrow the focus of country programmes. However, this has sometimes not happened because of their poor wording. In Indonesia, the programme on deepening democracy was placed under the service line on policy for lack of a matching service line. Service lines for crisis recovery are also so broad that they cannot restrict country offices to focusing on specific key deliverables.

90 This feedback came from a short workshop with UNDP staff in Indonesia that explored the factors holding back, and those supporting, a results culture.



Some of the modules for ATLAS, such as Project Management, continue to be a work in progress and underused. This, in part, explains why UNDP Indonesia has gone beyond available corporate tools on project management to develop one composite module with all features.

Tools for measurement of the development effectiveness of UNDP support have been lacking from the UNDP results-based management model, or they have simply been difficult to put in place. The latter tends to be the excuse for concentrating on financial mobilization and delivery targets as the backbone of results-based management in UNDP. The ROAR is used mainly for upward reporting to headquarters and remains a self-assessment tool with little, if any, feedback from headquarters. Without feedback, the instrument is not being used effectively by the regional bureau in its oversight function. The Balanced Scorecard has been effective in driving financial performance and learning-and-growth achievements upwards, but it is a subjective assessment instrument. The idea of introducing outcome evaluations was noble, but the absence of enforcement of quality standards and of corporate rules concerning management response compromises their potential impact in terms of changing the corporate culture at UNDP.

Another missing element is a strong accountability framework. Some accountability exists, but a comprehensive system tying everyone from top management down to the most junior officer at the country level has been absent. There is no custodian with the power to hold UNDP accountable. Commitment to results, in terms of development effectiveness, has been eroded by a preoccupation with resource mobilization, which has kept the organization afloat. This has tended to further reinforce the project approach, as most donors want to fund projects that have visibility at the community level. For as long as UNDP continues to focus on non-core funds, which are often projectized, development impact will remain elusive to measure.

Incentives are an integral part of any performance-enhancement strategy. So far, UNDP has no mechanism to reward above-average performers competitively and promptly. The staff RCA is a good instrument, but it is weakened by the fact that UNDP does not yet have a mechanism to recognize, in financial terms, those who exceed average performance.

MOLDOVA

INFLUENCE OF CORPORATE STRATEGIC GOALS ON THE COUNTRY PROGRAMME

There is no evidence that the MYFFs have influenced the focus and composition of the country programme in Moldova. The second MYFF's main effect may have been a symbolic reiteration from UNDP headquarters of the need to deliver results.

Results set at the outcome level were defined purely to meet headquarters' requirements, and reporting against them was seen as meeting a headquarters reporting requirement. When the programme attempted to use the results and indicators as a management tool, it found that they were not useful. The fact that targets set in the Balanced Scorecard and ROAR are often not agreed upon with headquarters until June, six months into the implementation period, reinforces the perception that the targets are a formulaic reporting burden rather than a tool to frame and drive planning and performance at the country office level.

Whilst outcomes may be defined in consultation with government and donors, there is no evidence that outcomes are owned by anybody other than UNDP. While outcomes have been used for structuring some discussion and analysis, this has been focused within the programme team. There is no evidence of outcomes being used with other stakeholders to discuss their different contributions to the delivery of joint outcomes. In terms of the government, the lack of terminology for discussing results-based management or M&E in Romanian, and the almost complete lack of formal M&E systems within the government, was a challenge that the programme has not surmounted. In terms of donors, only in the field of Public Administration Reform has there been the possibility of structuring discussion of results around a common outcome, and this opportunity has not yet been grasped. Elsewhere, cost-sharing donors have relied upon their own M&E systems.

ALIGNMENT OF RESOURCES, ORGANIZATION AND MANAGEMENT

There is no evidence that projects identified and other activities, such as advocacy, are organized around outcomes. Instead, revisions in outcomes trigger a remapping exercise in which projects are retrofitted under outcomes. Nor is there evidence that the definition of outcomes influences the identification of new projects.

The programme has been very successful at meeting its resource mobilization and delivery targets during the past five years. This has mainly been achieved through projects that management acknowledges are not focused on the areas of key interest to UNDP or where it has a policy-related comparative advantage. Rather, these donor cost-sharing projects were won because of UNDP project administration capacity relative to alternative providers, its relative flexibility and, to a lesser extent, a perceived focus on results from its projects. However, while there is evidence that the programme has tried to align resource mobilization with the broad areas of engagement identified in the programme, these efforts have had only limited success to date.

The Resident Representative considers increased professional programme staffing and a number of other investments in this area as the major evidence of an increased results focus in the country office. The number of professional programme staff in the office expanded from three in 2002 to eight by the end of 2006. The programme section was also reorganized in an attempt to lessen the hierarchical culture and to expand the roles and responsibilities of staff to engage in substantive work. Programme teams were reorganized around the major programme areas, although not around specific outcomes, which would have been difficult given the small number of staff.

MONITORING SYSTEMS

With the removal of many of the mandatory monitoring requirements in 2002, the office moved to a very light project-monitoring system based around the development of annual work plans and quarterly reporting against them to Project Steering Committees. The office does not use annual project reviews, and the use of tripartite reviews was also discontinued. Above the level of the project, a system of annual programme reviews was instituted.

The office has found it difficult to use reports generated in ATLAS for monitoring with the Project Steering Committees, since the presentation of information is not user friendly: not enough narrative information on performance is included, and reports are in English. The office has therefore reverted to the use of short, informal narrative reports when discussing performance with the Project Steering Committees. However, there is little evidence that monitoring focuses above the level of inputs and outputs.

There is no formal monitoring system focused at the level of the outcome. The annual programme review meetings have, to date, mostly focused on the presentation of a few key results, but the office is uncertain how these meetings might be better used to allow substantive discussion of results.

ADJUSTMENT AND LEARNING

There were examples of adjustment of the programme or within individual projects. However, there is little evidence linking these adjustments to information on results derived from the formal results-based management systems.

Stakeholders report that Annual Programme Reviews have normally lasted for two hours, which allows only 10 to 15 minutes of presentation on key results within each outcome area. This approach does not allow time for substantive discussion. The country office has experimented with increasing the level of substantive discussion during the Programme Review process, such as its presentation of the results of the Local Governance outcome evaluation at the 2005 review meeting. However, how to restructure the country programme review process to enhance its value as a learning and action approach remains an unresolved issue.

EVALUATION AND ACCOUNTABILITY

Only one project evaluation has been carried out since 2002, although three outcome evaluations have been completed. While it is no longer mandatory to carry out project evaluations, reviews by the Evaluation Office have shown that a lack of project evaluations adversely affects the quality of outcome evaluations. The experience of UNDP Moldova in using outcome evaluations to guide programme development has been mixed, partly because of the quality of the evaluations.

Staff believe that they can be held accountable for the delivery of project outputs. In terms of accountability and responsibilities in the office, there are three main findings:

• There is consensus on accountability and responsibilities at the level of the project.

• The roles and accountabilities of the programme team leaders are not well defined. This suggests that the wish to move from a flat structure, in which all programme staff report to the Programme Coordinator, to a system with teams organized around the two major programme areas has not yet been realized.

• There is no consensus on roles and accountabilities within the country office for the delivery of UNDP's contribution to its outcomes.

PRESENT STATUS

The office wishes to move to a greater focus on the delivery of results in its work, and over the past few years management has worked to ensure that there are sufficient staff to implement the needed change. However, the country office faces the following major challenges:

• To survive, the office must continue to meet challenging resource mobilization and delivery targets, which are closely monitored from headquarters. Balancing these needs against what a results focus suggests should be done is an ongoing challenge.

• Systems designed at corporate headquarters have not supported the development of a results orientation and have instead imposed additional transaction costs, with little perceptible benefit at the country level.

• There is little evidence of support or advice from the wider UNDP on how the country office might more effectively introduce a results focus while managing its competing priorities.

Despite the umbrella of 'outcomes' offered by the UNDAF and the CPD, management still focuses primarily on the project, its activities and its outputs. The work done to link projects to outcomes is really a mapping exercise, and there is no apparent demonstration of 'results' information being used to decide which projects to select or reject on the basis of their performance. The assignment of projects to outcome categories is somewhat subjective, though it has allowed for the identification of outlier projects that have been targeted for removal upon completion of the cycle. The outcome formulation is broad and permissive, allowing wide scope for future projects to align with a particular outcome. While this is not necessarily problematic, it does suggest that much of this work is administrative in nature, serving reporting needs rather than results-based management per se.

The plethora of modifications by headquarters to the monitoring and reporting systems being introduced (i.e., changes to a particular system or to the interpretation of some element of implementation) has resulted in frustration and confusion among staff. One of the system-related weaknesses expressed by staff is that some systems do not relate easily to one another. Most frequently mentioned was ATLAS, which provides financial and performance reporting on a project basis. The office found that the information produced is unsuitable for engaging with partners and has therefore reverted to producing separate narrative reports for engaging with other stakeholders.

There is some evidence that the country office does adjust the programme, but little evidence that this adjustment is based on an assessment of results.

Last, two related aspects of the nature of UNDP activities make monitoring results a challenge. First, UNDP is an extremely small player financially, which makes it hard to attribute change to UNDP activities. Second, a significant element of the programme's work concerns policy engagement, a difficult area in which to identify and measure outcomes.

ZAMBIA

PERCEIVED STRENGTHS OF THE RESULTS-BASED MANAGEMENT SYSTEM

The setting of corporate goals (practice areas) has helped narrow the focus of country office programmes to a smaller set of thematic areas and services where the organization has a clear comparative advantage. In the case of Zambia, the five practice areas identified in the second MYFF were validated as priority areas where the government and bilateral and multilateral institutions would like UNDP to provide future assistance.

UNDP sets annual targets in the CPAP that are reported in the ROAR, and this assists the organization in tracking progress towards the achievement of outputs and outcomes. Furthermore, it helps the country office plan its resources with a results focus and adjust the programme as necessary.

The introduction of new tools (especially ATLAS and the Dashboard) has improved upward reporting and global transparency and benchmarking. The country office can compare and contrast its performance with that of the rest of the UNDP field offices, and this motivates improved performance.

Despite its limitations, the new RCA is appreciated by some staff as an attempt by the organization to link individual staff performance to the annual targets of the organization. Its link to learning goals and long-term staff development initiatives is particularly useful in obtaining a corporate view of training needs and planning for these activities.

The Global Staff Survey has provided some 360-degree feedback to management. Management in the Zambia office is taking the results of the Global Staff Survey seriously. The office has introduced annual management and staff retreats for joint planning and has put in place mechanisms to improve the flow of information between management and staff. One of these mechanisms has been to share the minutes of the weekly Country Management Team and sectional meetings with staff by uploading them to its website.

PERCEIVED WEAKNESSES OF THE RESULTS-BASED MANAGEMENT SYSTEM

Whilst the theoretical underpinnings of the concept of 'managing for results' are sound, the roll-out plan for UNDP was not explicit, nor was it accompanied by adequate headquarters guidance and resources to country offices. Based on Zambia's experience, it would appear that staff orientation, training and capacity building for results-based management was one of the weakest points of the UNDP results-based management roll-out process. The whole process relied too heavily on a small number of regional face-to-face courses and on intranet-based guidelines and training programmes. The use of the intranet should have been balanced with face-to-face courses at the national level. Face-to-face courses at the regional level were too few to reach a critical mass or to impart sufficient knowledge of the concept. Implementation of results-based management within the country office is thus characterized by information asymmetry. This is manifest in the variable quality of outcomes per thematic area, for example in the current CPD.



One example of an outcome that needs improvement is found in the programme component on HIV and AIDS: "mainstreaming and implementation of multi-sectoral and community responses to HIV and AIDS at subnational, provincial and national levels strengthened." This is more of an activity or process than a specific, measurable and time-bound outcome.

Staff training was confined to the regional level on the premise that it would reach a sample of staff who would, in turn, train others at the country level. Unfortunately, those trained at the regional level were not supported with resources to offer substantive training to the rest of the staff in the country office. Instead, they held no more than 'feedback meetings'. The concept of regional 'Training of Trainers' courses was a noble idea, allowing headquarters to concentrate on a nucleus of trainees who would, in turn, offer training to a larger group back in their country offices. However, the approach fell into the usual trap of most Training of Trainers programmes: emphasis on the first layer of training (i.e., the training of the trainers) that overshadows the second layer (multiplier courses within the country offices). Yet it is the second layer that is most important in terms of achieving the desired outreach and enabling uniform application of results-based management skills in country offices. The fact that appreciation of results-based management concepts and tools varied across staff within one country office compromised the impact of results-based management on internal organizational efficiency and development effectiveness.

The above was exacerbated by the fact that different aspects (modules) of the UNDP results-based management system and tools were rolled out at different stages. Also, most systems were introduced not as final products but as works in progress, and improvements were made in the course of results-based management implementation. This trapped the organization in a system-churning process, characterized by changing global policy (MYFF 1, MYFF 2 and now the draft Strategic Plan), evolving results-based management guidelines, and the introduction of new results-based management tools. ATLAS, an enterprise resource planning tool that was piloted in 2003, was rolled out in January 2004 as a work in progress. A new RCA was introduced with new guidelines during the period of results-based management implementation. The Balanced Scorecard was also introduced and adapted in the course of implementation. Prince 2, which originated in the UK Government, is also being rolled out. All these developments present challenges at the country level, where the country office has to adapt not only to internal changes but also to external changes in the aid architecture, driven especially by UN reforms and the Paris Declaration.

Some of the modules for ATLAS, such as Project Management, continue to be a work in progress and are underused. Staff in the UNDP Zambia office confirmed that, while almost all the financial, human resources and procurement modules in ATLAS were now being fully used, only a small proportion of the project management module was in use. At the time of the review, an estimated 90 percent of staff time invested in ATLAS was spent on financial transactions and only 10 percent on substantive programme issues. This is, more or less, the same pattern observed in Egypt.

The accountability framework for development results is particularly weak: little investment has been made, or seems likely in the foreseeable future, although this problem is appreciated by headquarters. The results of expanded audits or outcome evaluations are not linked to other results-based management tools, yet they are very important, especially for monitoring the achievement of results. UNDP has become lost in resource mobilization and delivery, thus reducing results-based management tools to the management of inputs, and only in quantitative terms. Results-based management has thus become a resource mobilization and utilization management tool as opposed to 'managing for development results'.