
2/7/12 Supporting Effective Evaluations: A Guide to Developing Performance Meas…

www.tbs-sct.gc.ca/cee/dpms-esmr/dpms-esmr06-eng.asp


Supporting Effective Evaluations: A Guide to Developing Performance Measurement Strategies

6.0 Performance Measurement Strategy Framework

6.1 Overview of the Performance Measurement Strategy Framework

The PM Strategy Framework identifies the indicators required to monitor and gauge the performance of a program. Its purpose is to support program managers in:

- continuously monitoring and assessing the results of programs as well as the efficiency of their management;
- making informed decisions and taking appropriate, timely action with respect to programs;
- providing effective and relevant departmental reporting on programs; and
- ensuring that the information gathered will effectively support an evaluation.

Program managers should consult with heads of evaluation on the selection of indicators to help ensure that the indicators selected will effectively support an evaluation of the program. See Section 2.3 of this guide for more information on the roles and responsibilities of program managers and heads of evaluation.

6.2 Content of the Program Performance Measurement Strategy Framework

Table 3 summarizes the major components of the PM Strategy Framework. The framework should include the program's title as shown in the departmental PAA as well as the PAA elements that are directly linked to the program, i.e. the program activities, subactivities and/or sub-subactivities. It should also include the program's outputs, immediate and intermediate outcomes (as defined in the logic model), as well as one or more indicators for each output and outcome. For each indicator, provide:

- the data source(s);
- the frequency of data collection;
- baseline data;
- targets and timelines for when targets will be achieved;
- the organization, unit and position responsible for data collection; and
- the data management system used.

Table 3: Sample Table for the Performance Measurement Strategy Framework

| PAA Elements Linked to the Program | Program outputs and outcomes | Indicator | Data source | Frequency | Baseline | Target | Date to achieve target | Organization and position responsible for data collection | Data management system |
|---|---|---|---|---|---|---|---|---|---|
| | Output 1 | Indicator 1 | | | | | | | |
| | Output 2 | Indicator 2 | | | | | | | |
| | | Indicator 3 | | | | | | | |
| | Outcome 1 | Indicator 4 | | | | | | | |
| | | Indicator 5 | | | | | | | |
| | Outcome 2 | Indicator 6 | | | | | | | |
| | | Indicator 7 | | | | | | | |
| | | Indicator 8 | | | | | | | |

Rows may be added for additional outputs and outcomes.
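Each indicator row of Table 3 can be modeled as a simple record. The sketch below is illustrative only; the class and field names are our own, not from the guide, and the example values are fictitious:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class IndicatorEntry:
    """One row of the PM Strategy Framework table (hypothetical field names)."""
    output_or_outcome: str               # e.g. "Output 1" or "Outcome 1" from the logic model
    indicator: str                       # the performance indicator itself
    data_source: str                     # where the data come from
    frequency: str                       # e.g. "quarterly", "annual"
    baseline: Optional[str]              # starting point for comparison; may not exist yet
    target: Optional[str]                # qualitative or quantitative target
    date_to_achieve_target: Optional[str]
    responsible: str                     # organization, unit and position collecting the data
    data_management_system: str          # system used to store and manage the data

# Fictitious example entry:
entry = IndicatorEntry(
    output_or_outcome="Output 1",
    indicator="Number of written complaints received",
    data_source="Program administrative database",
    frequency="quarterly",
    baseline="120 complaints (2010-11)",
    target="Fewer than 100 complaints",
    date_to_achieve_target="2013-03-31",
    responsible="Program Operations Unit, Senior Analyst",
    data_management_system="Departmental case-tracking system",
)
print(entry.indicator)
```

Keeping the record explicit in this way makes it easy to spot rows with missing baselines or targets, which Step 5 below requires to be flagged with timelines.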

6.3 Performance Measurement Frameworks and the Performance Measurement Strategy Framework

The Policy on Management, Resources and Results Structures (MRRS) requires the development of a departmental Performance Measurement Framework (PMF), which sets out the expected results and the performance measures to be reported for programs identified in the PAA. The PMF is intended to communicate the overarching framework through which a department will collect and track performance information about the intended results of the department and its programs. The indicators in the departmental PMF are limited in number and focus on supporting departmental monitoring and reporting.

The PM Strategy Framework is used to identify and plan how performance information will be collected to support ongoing monitoring of the program and its evaluation. It is intended to more effectively support both day-to-day program monitoring and delivery and the eventual evaluation of that program. Accordingly, the PM Strategy Framework may include expected results, outputs and supporting performance indicators beyond the limits established for the expected results and performance indicators to be included in the MRRS PMF (see Figure 2 below). Unlike the MRRS PMF, the PM Strategy Framework has no imposed limit on the number of indicators that can be included; however, successful implementation of the PM Strategy is more likely if indicators are kept to a reasonable number.

Figure 2: Comparison of Requirements: Performance Measurement Frameworks and Performance Measurement Strategies


As illustrated in Figure 2, the indicators in the PM Strategy Framework focus on supporting ongoing program monitoring and evaluation activities and therefore align with and complement the indicators included in the departmental PMF. In instances where the program is shown as a distinct program in the PAA and indicators have been identified in the departmental PMF, the PM Strategy Framework should include, at a minimum, the indicators reported in the departmental PMF. When a PM Strategy is developed for a new program that is not represented in the departmental PAA, the outcomes, outputs and related indicators developed for the PM Strategy Framework should be considered for inclusion in the departmental MRRS.[10]

6.4 Accountabilities and Reporting

The PM Strategy Framework should be accompanied by a short text that describes:

- the reporting commitments and how they will be met, including who will analyze the data, who will prepare the reports, to whom they will be submitted, by when, what information will be included, the purpose of the reports and how they will be used to improve performance; and
- if relevant, the potential challenges associated with data collection and reporting, as well as mitigating strategies for addressing these challenges (e.g. there may not be a system that can be used for data management).

6.5 Considerations when Developing the Performance Measurement Strategy Framework

The chart below provides guidance on how to develop the PM Strategy Framework.[11]

1. Start with the MRRS PMF: Review the MRRS PMF that the department developed in accordance with the Policy on MRRS. Include, as appropriate, the performance indicators from the MRRS PMF in the PM Strategy Framework.

Comments: The PM Strategy Framework should not be developed in isolation. In accordance with the Policy on MRRS, all programs represented in a department's PAA must contribute to its strategic outcome(s). As such, when developing the PM Strategy Framework, the performance indicators should complement those already established in the departmental PMF.

2. Identify performance indicators: Develop or select at least one performance indicator for each output and each outcome (immediate, intermediate and ultimate) that has been identified in the program logic model. Keep in mind that, in addition to day-to-day program monitoring, the performance indicators will also be used for evaluation purposes. As such, it is recommended that program managers also consider the core issues[12] for evaluation (i.e. relevance and performance) and consult with heads of evaluation when developing the performance indicators.

Comments: There are two types of indicators: quantitative and qualitative.

- Quantitative performance indicators are composed of a number and a unit. The number indicates the magnitude (how much) and the unit gives the number its meaning (what), e.g. the number of written complaints received.
- Qualitative indicators are expressed in expository form, e.g. an assessment of research quality. As much as possible, qualitative indicators should be condensed into a rating scale, e.g. research quality is rated as "excellent," "average" or "below average." This will allow for comparability over time.

Choosing appropriate performance indicators is essential for effectively evaluating and monitoring a program's progress. Appropriate indicators are characterized as follows:

- Valid: the indicators measure what they intend to measure.
- Reliable: the data collected should be the same if collected repeatedly under the same conditions at the same point in time.
- Affordable: cost-effective data collection (and analysis) methods can be developed.
- Available: data for the indicators are readily and consistently available to track changes in the indicator.
- Relevant: the indicator clearly links back to the program outcomes.

Keep the number of performance indicators to a manageable size. A small set of good performance indicators is more likely to be implemented than a long list of indicators.
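The guide's advice to condense qualitative indicators into a rating scale, so that results are comparable over time, can be illustrated with a small sketch. The scale values and function name below are hypothetical, not prescribed by the guide:

```python
# Map a qualitative assessment onto an ordinal scale (hypothetical
# three-point scale taken from the guide's research-quality example).
RATING_SCALE = {"below average": 1, "average": 2, "excellent": 3}

def rate(assessment: str) -> int:
    """Convert a qualitative rating to its ordinal value."""
    return RATING_SCALE[assessment.lower()]

# Comparability over time: did assessed research quality improve?
ratings_by_year = {"2010": "average", "2011": "excellent"}
improved = rate(ratings_by_year["2011"]) > rate(ratings_by_year["2010"])
print(improved)  # "excellent" outranks "average" on the ordinal scale
```

Once qualitative judgments are on an ordinal scale, year-over-year comparisons become a simple numeric check rather than a re-reading of narrative assessments.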

3. Identify data sources: Identify the data sources and the system that will be used for data management.

Comments: There are a number of possible data sources:

- Administrative data: information that is already being collected through program files or databases, or that could be collected with adjustments to regular programming processes or funding agreements;
- Primary performance data: specialized data collection exercises, such as focus groups, expert panels or surveys, in which information needs to be collected; and
- Secondary data: information that has been collected for other purposes, such as national statistics on health or economic status (e.g. Statistics Canada data).

Use readily available information. Take advantage of any available sources of information at your disposal.

4. Define frequency and responsibility for data collection: Identify the frequency of data collection and the person(s) or office responsible for the activity.

Comments: Describe how often performance information will be gathered. Depending on the performance indicator, it may make sense to collect data on an ongoing, monthly, quarterly, annual or other basis.

When planning the frequency and scheduling of data collection, an important factor to consider is management's need for timely information for decision making.

In assigning responsibility, it is important to take into account not only which parties have the easiest access to the data or sources of data but also the capacities and systems that will need to be put in place to facilitate data collection.

5. Establish baselines, targets and timelines.

Comments: Performance data can only be used for monitoring and evaluation if there is something to which they can be compared. A baseline serves as the starting point for comparison. Performance targets are then required to assess the relative contribution of the program and ensure that appropriate information is being collected.

Baseline data for each indicator should be provided by the program when the PM Strategy Framework is developed. Often, and particularly for indicators related to higher-level outcomes, this information will have already been collected by program managers as part of their initial needs assessment and to support the development of their business case.

Targets are either qualitative or quantitative, depending on the indicator. Some sources and reference points for identifying targets include:

- baseline data based on past performance or previous iterations of the program;
- the achievements of other, similar programs that are considered leaders in the field (benchmarking);
- generally accepted industry or business standards; and
- publicly stated targets (e.g. set by the government or in the federal budget).

If a target cannot be established, or if the program is unable to establish baseline data at the outset of the program, explicit timelines for when these will be established, as well as who is responsible for establishing them, should be stated.
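Step 5's underlying logic, that performance data are only usable when compared against a baseline and a target, can be sketched as a simple progress check. The function name and numbers below are illustrative, not from the guide:

```python
def progress_toward_target(baseline: float, target: float, current: float) -> float:
    """Fraction of the distance from baseline to target covered so far.

    Returns 0.0 for no progress and 1.0 (or more) once the target is met.
    Raises ValueError when baseline equals target, since there is then
    nothing meaningful to compare against.
    """
    if baseline == target:
        raise ValueError("baseline and target must differ to measure progress")
    return (current - baseline) / (target - baseline)

# Illustrative quantitative indicator: reduce complaints from a baseline
# of 120 to a target of 100; 110 complaints means halfway there.
print(progress_toward_target(baseline=120, target=100, current=110))  # 0.5
```

Note that the formula works whether the target is above or below the baseline, which is why both "increase" and "reduce" indicators can share one progress measure.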


Date Modified: 2010-09-29

6. Consult and verify: As a final step, program managers should consult[13] with heads of evaluation and other specialists in performance measurement to validate the performance indicators and confirm that the resources required to collect data are budgeted for. Consultation with other relevant program stakeholders (e.g. information management personnel) is also encouraged.

Comments: This step complies with the Directive on the Evaluation Function, which holds heads of evaluation responsible for reviewing and providing advice on the PM Strategies for all new and ongoing direct program spending, including all ongoing programs of grants and contributions, to ensure that they effectively support an evaluation of relevance and performance (Section 6.1.4, subsection a).

It is highly recommended that program managers work with the head of evaluation to ensure that the performance indicators selected will provide adequate data to support an evaluation of the program. The choice of consultation mechanism remains at the discretion of the program manager and head of evaluation.

Key questions to be considered during the consultation and verification stage include:

- Does the selection of indicators align with and complement the indicators in the departmental PMF?
- Are the indicators valid measures of the outputs and outcomes (i.e. do they measure what they intend to measure)?
- Are the indicators reliable (i.e. would the data recorded be the same if collected repeatedly under the same conditions at the same point in time)?
- Is it likely that the required data will be provided in a timely manner?
- Is the cost reasonable and clearly budgeted for?

If the answer to any of the above questions is "no," adjustments need to be made to the indicators or to the resources allocated for implementation of the PM Strategy.
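The verification questions above amount to a go/no-go checklist: any "no" answer triggers an adjustment. A hypothetical sketch of that gate (the question keys and example answers are our own):

```python
# Hypothetical Step 6 verification gate: a "no" on any question means
# the indicators or the allocated resources need adjustment.
checklist = {
    "aligns_with_departmental_pmf": True,
    "valid_measures_of_outputs_and_outcomes": True,
    "reliable_under_repeated_collection": True,
    "timely_data_likely": False,  # e.g. the data source reports with a long lag
    "cost_reasonable_and_budgeted": True,
}

def needs_adjustment(answers: dict) -> list:
    """Return the questions answered 'no'; an empty list means proceed."""
    return [question for question, ok in answers.items() if not ok]

print(needs_adjustment(checklist))  # ['timely_data_likely']
```

Recording the answers explicitly, rather than treating consultation as a one-off conversation, also gives the head of evaluation a concrete artifact to review.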

Implementation

The full benefits of the PM Strategy Framework can only be realized when it is implemented.[14]

As such, program managers should ensure that the necessary resources (financial and human) and infrastructure (e.g. data management systems) are in place for implementation. Program managers should begin working at the design stage of the program to create databases, reporting templates and other supporting tools required for effective implementation. Following the first year of program implementation, they should undertake a review of the PM Strategy to ensure that the appropriate information is being collected and captured to meet both program management and evaluation needs.

Keep in mind that, according to the Directive on the Evaluation Function, the head of evaluation is responsible for "submitting to the Departmental Evaluation Committee an annual report on the state of performance measurement of programs in support of evaluation" (Section 6.1.4, subsection d). Given that the PM Strategy Framework represents a key source of evidence for this annual report, its implementation is especially important.