Thinking about Governance Assessment: A Working Framework
Session II, Lusaka, January 20, 2003
Francesca Recanatini, WBI


Outline of the Session
- Introducing a working framework
- Conceptual design
- Empirical tools
- Implementation process
- Sampling and field work
- Analysis and use of the data
- A few country-specific illustrations

Objectives
- Which are the key elements of a governance assessment?
- Which empirical tools and approaches are already available?
- How can we select among them?
- How can such assessments be used for policy purposes?

Governance assessment: one or many approaches?
The characteristics of a governance assessment are a function of the objective of the assessment.

Key starting points
1. What is the purpose of the assessment?
- Research and analysis
- Awareness raising
- Policy and action planning
- Capacity building
- Monitoring
2. What is the focus of the assessment?
- Governance as a whole
- Corruption
- Performance of a specific agency or sector
- Quality of a specific public service delivered

Suppose we have determined the final purpose of the assessment and its focus. What next?

An example: Peru 2002
Issue: the government wanted to monitor progress in terms of:
- Transparency of public administration activities
- Civil society participation and voice
- Quality of public services

Peru 2002, cont.
Purpose of the assessment: monitoring
Focus of the assessment:
- Transparency
- Citizens' participation and voice
- Quality of public services
What next?

How to think about governance assessments?
Four dimensions:
- Conceptual
- Empirical
- Process / capacity building
- Analytical and policy

Conceptual dimension
- Clear definition of the variable we focus on and of its manifestations
- Translation of the definition into observable and measurable components
- Selection of the methodological approach
- Understanding of the links between governance and performance outcomes and development outcomes

Empirical dimension
- Focus on institutions vs. individuals
- Experiential, objective data
- More than one type of respondent
- Adaptation of empirical tools to the country's reality
- Careful definition of the sample and of the field work details (a sampling sketch follows below)
- Open-ended vs. closed-ended questions
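The sampling point is the easiest to make concrete. Below is a minimal sketch, not from the presentation itself, of one common procedure (proportional stratified random sampling of households); the region names, frame sizes, and total sample size are hypothetical.

```python
import random

# Hypothetical sampling frame: household IDs grouped by region (the strata).
frame = {
    "Region A": list(range(0, 6000)),
    "Region B": list(range(6000, 10000)),
    "Region C": list(range(10000, 13000)),
}

def stratified_sample(frame, total_n, seed=42):
    """Allocate the sample across strata in proportion to size, then draw at random."""
    rng = random.Random(seed)  # fixed seed: the same procedure reproduces the same draw
    population = sum(len(ids) for ids in frame.values())
    sample = {}
    for region, ids in frame.items():
        n = round(total_n * len(ids) / population)  # proportional allocation
        sample[region] = rng.sample(ids, n)
    return sample

for region, drawn in stratified_sample(frame, total_n=1200).items():
    print(region, len(drawn))  # e.g. 554, 369, and 277 households respectively
```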
Process / capacity-building dimension
To increase impact and sustainability:
- Consultative and participatory approach to discussing the purpose, use, and features of the assessment
- Engagement of local NGOs and academic institutions to adapt and revise the tools
- Public dissemination of the results
- Joint design of the policy recommendations

Analytical and policy dimension
- Distill the key links between manifestations of governance and: the quality of services, growth, specific characteristics of the public sector
- Results should be used as one input for policy purposes

In sum, a working framework
A governance assessment links four elements: the conceptual dimension, the empirical tools and sample, the implementation process, and the analysis and use of the data.

Peru 2002
- Purpose of the assessment: monitoring
- Final users: government and civil society
- Key features: comparability across time; ability to identify progress
- Type of information needed: agency-specific
- Approach: objective, and based on citizens' feedback

Peru 2002
Conceptual dimension:
- Transparency in the management of resources
- Quality of basic health and education services
- Quality of complaint and feedback mechanisms
Empirical tool:
- Score card / questionnaire administered to households
- Focus on agency-specific information
- Objective, experiential data
- Closed-ended questions

Peru 2002
Process / capacity building:
- Partnership between WBI and the National Statistical Office on methodological issues
- Data and results publicly available
Analytical dimension:
- Monitoring of the indices' performance over time
- Links between the indices of performance and measures of poverty

Peru 2002
Decisions taken:
- To develop two yearly indicators: an index of transparency and civil society participation, and an index of the quality of public services (a sketch of such an index follows below)
- To focus on households/users only
- To promote a partnership between the National Statistical Agency and citizens
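To make the yearly index concrete, here is a minimal sketch of one way such an indicator could be aggregated from closed-ended score-card responses. The service items, the 1-7 scale direction, the 0-100 rescaling, and the equal weighting are illustrative assumptions, not the actual Peruvian methodology.

```python
# Hypothetical score-card records: each household rates services on a 1-7 scale,
# where 1 = very poor and 7 = very good (assumed direction, for illustration only).
responses = [
    {"health": 5, "education": 6, "water": 3},
    {"health": 4, "education": 5, "water": 2},
    {"health": 6, "education": 6, "water": 4},
]

def rescale(score, lo=1, hi=7):
    """Map a 1-7 rating onto 0-100 so items and years are directly comparable."""
    return 100 * (score - lo) / (hi - lo)

def quality_index(records):
    """Equal-weighted average across households, then across items (one possible choice)."""
    items = records[0].keys()
    per_item = {i: sum(rescale(r[i]) for r in records) / len(records) for i in items}
    return per_item, sum(per_item.values()) / len(per_item)

per_item, overall = quality_index(responses)
print({i: round(v, 1) for i, v in per_item.items()})  # service-by-service scores
print(round(overall, 1))  # a single yearly figure that can be tracked over time
```

Recomputing the same index each year on a comparable sample is what gives the monitoring property the Peru assessment was after.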
Governance Assessment: A Comparative Approach
Session III, Lusaka, January 2003
V. Rao, F. Recanatini, M. Woolcock

Suppose we have defined the final purpose of the assessment and its objective/focus. What next?

We need to choose:
- The methodology
- The empirical tools
- The process to implement the assessment
- The respondents and the sample
- The type of analysis
- The follow-up activities
(These choices are taken up in turn in Sessions III through VI.)

Methodologies, methods, data
- Key: define the variable/concept of interest
- Finding answers may require single or multiple methods and data forms
- Methodologies are the particular combination and sequence of methods used to answer the question(s)
- Methods can be qualitative and/or quantitative
- Data can also be qualitative and/or quantitative

Selecting the methodology
- Qualitative methods
- Quantitative methods
- Mixed methods
To each method corresponds a set of empirical tools that we can use.

Purely qualitative methods
- Tools: focus group discussions, interviews, case studies
- Problems: non-representative; lack of counterfactuals, so causality is unclear; small samples
- Advantages: open-ended; capture context and history

Sources of qualitative data ("texts")
- Historical records, political reports, letters, legal documents
- Media (print, radio, and television)
- Open-ended responses to survey questions
- Observation (ethnography)
- Interviews: key informants, focus groups
- Participatory approaches

Purely quantitative methods
- Problems: structured questions; top-down; reflect the biases of the researcher
- Advantages: large, representative samples; clear methods for inferring causality

Sources of quantitative data ("numbers")
- Household and other surveys (e.g., census, household surveys)
- Opinion polls (e.g., Gallup, marketing research)
- Data from official files (e.g., membership lists, government reports)
- Indexes created from multiple sources (e.g., Kaufmann's index of governance)

Qualitative vs. quantitative?
There is an assumption that different standards apply:
- Qualitative approaches are seen as inductive, valid, subjective, about process ("how"), and as generating ideas
- Quantitative approaches are seen as deductive, reliable, objective, about effects ("whether"), and as testing ideas

Integrating qualitative and quantitative approaches
Iterative and/or sequential mixing of approaches to:
- Complement strengths and compensate for weaknesses
- Address problems of missing or inadequate data

Common goals of quantitative and qualitative methods
- Comparability across time and space
- Sensible aggregation and disaggregation across different scales and units of analysis (how do we find a common metric, like prices? see the sketch below)
- Sensible integration of different data forms and sources
- Enhanced validity: the selected measures approximate reality; identification (and other) errors are minimized
- Enhanced reliability: the same procedures, applied by a different team, should give the same results
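The "common metric" question is, at bottom, a normalization problem. A minimal sketch, assuming min-max rescaling (one common choice among several): indicators measured on incompatible scales are mapped onto 0-1, with a consistent direction, before being compared or aggregated. The indicator names, ranges, and values are invented for illustration.

```python
# Hypothetical indicators on incompatible scales.
indicators = {
    "bribe_frequency":    {"value": 2.4, "lo": 1, "hi": 7},    # 1-7 survey scale
    "days_to_get_permit": {"value": 45,  "lo": 5, "hi": 120},  # administrative records
    "budget_leakage_pct": {"value": 18,  "lo": 0, "hi": 100},  # expenditure tracking
}

def to_common_metric(value, lo, hi, higher_is_worse=True):
    """Min-max rescale to 0-1; orient so that 1 always means 'worse governance'."""
    x = (value - lo) / (hi - lo)
    return x if higher_is_worse else 1 - x

common = {name: round(to_common_metric(**spec), 2) for name, spec in indicators.items()}
print(common)  # all three now live on one 0-1 scale and can be sensibly aggregated
```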
Examples of existing empirical tools
- Qualitative methods: budget use monitoring, video observations, judicial investigations, case studies
- Quantitative methods: opinion polls, sector-specific surveys, multi-sector surveys

Existing empirical tools
BEEPS, IGR, public official surveys, PET, QSDS, score cards, investment climate surveys, EC audits, PER, CFAA, CPAR, GAC, case studies, HIPC expenditure tracking, ROSC

Mixed methods take the best of both worlds. Advantages:
1. Quantitative questions are informed by qualitative investigation
2. Hypotheses generated by qualitative work are tested for generalizability by quantitative work
3. Depth is supplemented by breadth: "thick" understanding with generalizability
4. History, context, process, and the identification of causal links
5. Participation

Remember the problems of mixed methods, though:
- High cost and long duration
- Large teams and coordination problems
- Usually poorly done; more research is required to understand how the methods compare

Integrating qualitative and quantitative approaches
- Sequential qual/quan: parallel tracks and independent efforts; increases data variance; most common in large, lengthy projects or in studies requiring a quick turn-around (e.g., the Indonesia crisis)
- Iterative qual/quan (exchange, dialogue):
  1. Context (qual)
  2. Design of the survey (quan)
  3. Pre-test of the survey (qual)
  4. Implementation of the survey (quan)
  5. Data analysis (quan and qual)
- Missing/weak quantitative data: generating new data; building on existing data; thinking quan, acting qual; identifying valid instruments

An operational example: the Guatemala Poverty Assessment
A sequential design in a large, lengthy study:
- Quantitative: an expanded household survey (LSMS) with the first social capital module; it found large differences by region, gender, income, and ethnicity, and pervasive elite capture
- Qualitative: 10 villages covering 5 different ethnic groups; perceptions of exclusion and of access to services; fear of reprisal and of children being stolen; the legacy of shocks (political and natural); links back to the household (LSMS) data

Selecting among instruments
PER? Score cards? BEEPS? IGR? PET? QSDS? CFAA? CPAR? GAC? Case studies? New tools? Public official surveys? HIPC expenditure tracking? EC audits? ROSC?
Which are the key dimensions for a tool comparison?
- Conceptual framework
- Approach: objective/quantitative, subjective/qualitative, or mixed method
- Measurement precision
- Cost effectiveness

Key dimensions, cont.
- Comparability of the data: across countries, over time, across empirical tools
- Final user of the data: different agencies, different stakeholders
- Type of respondent: citizens, government officials, enterprises; civil society, the private sector, the state

Linking the tools to the respondents
[Diagram mapping the tools (PET, QSDS, PER, CFAA, CPAR, score cards, GAC, IGR, BEEPS, investment climate surveys) to the types of respondent each draws on.]

Key dimensions, cont.
- Effectiveness for judicial actions
- Ability to identify general challenges
- Ability to identify priorities for reform
- Quality of the information on a specific institutional dimension or a specific subject

Linking the tools to the blueprint
[Diagram arranging PER, HIPC expenditure tracking, ROSC, CPAR, EC audits, CFAA, IGR, GAC, cross-country governance indicators, BEEPS, investment climate surveys, score cards, QSDS, public official surveys, and PETs along the blueprint.]

Examples of variables measured: users/households
- Quality of specific public services
- Cost and time to obtain a service
- Information available on basic rights
- Quality of public agencies
- Experience with inappropriate procedures and behavior

Examples of questions used, from governance diagnostic surveys and score cards (user/household survey):
- "I am going to read to you a list of problems; please rate how serious these problems are. We will use a scale from 1 to 7, in which 1 means that this is a really bad problem and 7 means that it is not so bad."
- "Do you know the process you need to follow in order to report a case of corruption?"
- "If you considered conducting procedures at some of the institutions listed and decided not to, could you please tell me which ones and why?"
- "Please evaluate the overall quality of the following public services..."
- "During the last year, did you have any reason to make a complaint about any of the public services?"
- "In the last two years, have you known of any case of corruption? If yes, have you reported it?"
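A minimal sketch of how answers to closed-ended items like the ones above might be tabulated. The response data and the "serious" cut-off are invented; note that in the wording above, 1 means a really bad problem and 7 means not so bad, so low scores flag priorities.

```python
# Hypothetical ratings: "how serious is this problem, 1 (really bad) to 7 (not so bad)?"
ratings = {"corruption": [1, 2, 2, 3, 1], "water supply": [5, 6, 4, 5, 7]}

# Hypothetical answers to the last question pair above.
knew_of_case = [True, True, False, True, False]
reported_it  = [False, True, False, False, False]  # asked only when knew_of_case is True

for problem, scores in ratings.items():
    serious = sum(s <= 3 for s in scores) / len(scores)  # share rating it 3 or lower
    print(f"{problem}: {serious:.0%} consider it a serious problem")

aware = sum(knew_of_case)
print(f"reporting rate among those aware of a case: {sum(reported_it) / aware:.0%}")
```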
Examples of variables measured: enterprises
- Quality of specific services and procedures
- Cost and time to comply with permits and licenses
- Information available on basic rights
- Quality of public agencies
- Experience with inappropriate procedures and behavior

Examples of questions used, from governance diagnostic surveys (enterprise survey):
- "Please evaluate the overall quality of the following public services..."
- "How much time during the week does your administrative staff spend dealing with bureaucracy in general?"
- "Please mention how many hours a week, on average, your administrative staff spent dealing with bureaucracy over the last year."
- "Given the cost that bureaucracy imposes on the firm, tell me: during the past two years, did your firm decide not to make an investment that it had planned to make?"
- "What percentage of its total revenues did your firm pay for security last year?"

Examples of variables measured: public officials
- Quality of the rules and procedures
- Transparency of budget and employment decisions
- Information available on procedures
- Quality of management
- Experience with inappropriate procedures and behavior within their office

[Several further slides present examples from the governance diagnostic and public official surveys.]

Reserved slides

The power of diagnostic data and key dimensions for analysis
1. Unbundle corruption by type: administrative corruption, capture of the state, bidding, theft of goods and public resources, purchase of licenses and regulations
2. Identify both weak institutions (in need of reform) and strong institutions (examples of good governance)
3. Assess the cost of each type of corruption to different groups of stakeholders
4. Identify the key determinants of good governance
5. Develop policy recommendations

Corruption and illicit payments for:
- The formation of policies, laws, and regulations (state capture and undue elite influence)
- The implementation of policies and regulations (administrative/bureaucratic corruption)
- The allocation of resources and investment decisions (procurement/budget diversion)
- The allocation of labor (nepotism/patronage)
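The closing typology suggests one simple way to "unbundle" corruption in survey data: map individual items to the four types and report the incidence of each separately. A minimal sketch; the item names and firm responses are hypothetical, and the pairing of items to types follows the typology above.

```python
# Hypothetical mapping from survey items to the four corruption types above.
typology = {
    "paid_to_influence_a_law":   "state capture / undue elite influence",
    "paid_to_speed_up_a_permit": "administrative / bureaucratic corruption",
    "paid_to_win_a_contract":    "procurement / budget diversion",
    "hired_through_connections": "nepotism / patronage",
}

# Hypothetical firm responses: 1 = practice reported, 0 = not reported.
firms = [
    {"paid_to_influence_a_law": 0, "paid_to_speed_up_a_permit": 1,
     "paid_to_win_a_contract": 1, "hired_through_connections": 0},
    {"paid_to_influence_a_law": 1, "paid_to_speed_up_a_permit": 1,
     "paid_to_win_a_contract": 0, "hired_through_connections": 1},
]

# Unbundled incidence: the share of firms reporting each type of corruption.
for item, label in typology.items():
    share = sum(f[item] for f in firms) / len(firms)
    print(f"{label}: {share:.0%}")
```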