
It’s the learning, not the result which counts most in evaluation
Randall Pearce, THINK: Insight & Advice
The 7th Australasian Better Boards Conference, 5 July 2013


TRANSCRIPT

Page 1

It’s the learning, not the result which counts most in evaluation

Randall Pearce, THINK: Insight & Advice

The 7th Australasian Better Boards Conference, 5 July 2013

Page 2

Evaluation – What is it?

‘A systematic way of answering questions about projects, policies and programs’

1. Is it needed or worthwhile?
2. Is it having an effect?
3. At what cost?
4. How could it be improved?
5. Are there better alternatives?
6. Are there any unintended consequences?

Page 3

NFP Evaluation – What it isn’t

Page 4

Who evaluates?

Do you measure the impact of your work?

Yes: 75%
No: 25%

Source: New Philanthropy Capital 2012

Page 5

Why do (or don’t) they evaluate?

Attitudes toward impact evaluation among charities, top two box scores (n=1,000):

• Funders tend to ignore impact measurement: 17.5%
• Data collection interferes with client relationships: 21.2%
• Impact measurement detracts from priorities: 26.2%
• You don't need evaluation to tell you what works: 29.5%
• Funding is main reason for conducting evaluations: 34.1%
• There is too much pressure to measure results: 37.5%
• There is not enough funding for impact evaluation: 60.2%
• Charities should report negative results too: 72.3%
• Measuring impact makes NFPs more effective: 77.5%

Source: New Philanthropy Capital

Page 6

What do they gain?

Benefits gained through evaluation (n=755); the source chart gave two figures per item:

• Other: 0.9% / 0.9%
• Increased publicity: 2.1% / 3.7%
• Don't know: 4.4% / 6.5%
• Improved strategy: 8.7% / 9.1%
• Increased funding: 9.8% / 10.7%
• Able to show how you make a difference: 18.4% / 24.6%

Source: New Philanthropy Capital

Page 7

Who should evaluate?

Internal evaluator
• Advantages: better overall and informal knowledge; less threatening/known to staff; less costly
• Disadvantages: may be less objective; evaluation could be a part-time responsibility; may not be trained in evaluation

External evaluator
• Advantages: more objective; able to dedicate time and attention; greater evaluation expertise
• Disadvantages: needs to learn about the organisation and culture; unfamiliar to staff; more costly

Page 8

When to conduct evaluation?
Form, stage of program and purpose:

Proactive evaluation
• Stage: before a program starts
• Purpose: to synthesise information to inform program design

Clarificative evaluation
• Stage: during program development
• Purpose: to clarify the program design and how it operates

Interactive evaluation
• Stage: during program delivery
• Purpose: to improve program delivery; involves stakeholders in the evaluation

Monitoring evaluation
• Stage: once the program is in a settled state
• Purpose: to monitor program progress for accountability and improvement

Impact evaluation
• Stage: during or after program implementation
• Purpose: to assess what has been achieved, to learn and be accountable

Source: K Roberts (adapted from Owen and Rogers, 2006)

Page 9

Dispelling myths

Theory of change?

• Not needed, because the evaluator will reconstruct the logic of the actual program, not the theoretical model:
  • Foundational activities
  • Activities
  • Outputs
  • Immediate outcomes
  • Intermediate outcomes
  • Long-term outcomes
  • Organisational goals

Page 10

Dispelling myths

Mountain of data?

• Most data is just information… we are looking for insight into what it means

• Historical data is more valuable than a mountain of current data

• Your evaluator should identify the few ‘dashboard’ measures that you will need to evaluate

• Once an evaluation has been conducted you can use the dashboard forever

Page 11

Dispelling myths

A wad of cash?

• Think of what is at stake versus the internal budget allocation – any activity with a value in excess of $200K should be evaluated

• Governments and foundations often allow for 10% to be spent on evaluation

• There are many ways that NFPs can reduce the cost of evaluations

Page 12

Using the results of evaluation

• Share them… as widely as you can
• Some evaluators will agree to write a summary which protects the egos of those involved

• Action learning/research is a participative approach based on a four-part cycle: taking action, measuring the results of the action, interpreting the results, and planning what change needs to take place for the next cycle of action

• The best projects conclude with a Summit workshop

Page 13

Beyond program impact evaluation

Page 14

Learning along the way

Documentation

• Documents successes and failures
• Summary of key documents in one place
• Timeline/sequence of events
• Isolates key measures for the future
• Supports performance appraisal for staff and board
• Helps orient staff, volunteers and contractors

Page 15

Learning along the way

Full cost accounting

• Full costs and expenses need to be calculated to arrive at the true financial picture

• Need to include:
  • Budget allocation
  • Cash donations
  • In-kind services
  • Pro-bono services

INCOME
Cash contributions
  Member contributions                 $348,000.00
  Donors                             $1,168,000.00
Sub-total cash contributions         $1,516,000.00
Sub-total in-kind contributions        $926,346.00
TOTAL INCOME                         $2,442,346.00

EXPENSE
Sub-total operations expense           $881,346.00
Sub-total communications               $803,500.00
Sub-total field and coordination       $470,970.00
TOTAL EXPENSE                        $2,145,816.00
GST adjustment                        ($97,600.00)
NET EXPENSE                          $2,048,216.00
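The arithmetic behind the sample statement can be cross-checked in a few lines. A minimal Python sketch, with figures copied from the statement above (variable names are my own, and the GST adjustment is shown in parentheses in the statement, i.e. a deduction):

```python
# Cross-check of the sample full-cost statement.
member_contributions = 348_000.00
donors = 1_168_000.00
subtotal_cash = member_contributions + donors      # $1,516,000.00
subtotal_in_kind = 926_346.00
total_income = subtotal_cash + subtotal_in_kind    # $2,442,346.00

total_expense = 2_145_816.00   # as stated in the statement
gst_adjustment = 97_600.00     # parenthesised, so deducted
net_expense = total_expense - gst_adjustment       # $2,048,216.00

print(total_income, net_expense)
```

The point of the exercise is the one the slide makes: only when cash, in-kind and pro-bono contributions are all counted does the true financial picture emerge.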

Page 16

Learning along the way

Full value assessment

• Captures all non-financial outputs in addition to financial information

• For example, while social media produces a host of measures, there are no financial equivalents as there are in traditional media (i.e. TARPs)

• Need to identify data sources for year-on-year comparison in future

Page 17

Learning along the way

Organisational behaviour and governance

• Qualitative research reveals issues around organisational behaviour and governance which can impact outcomes

• Project governance can be examined independent of personalities to pinpoint areas for change/improvement

Page 18

Learning along the way

Relationship building

• The evaluation process has been described as ‘cathartic’ by key players

• Helps defuse tensions that build up during a campaign

• Provides stakeholders a voice/builds goodwill for the future

• Aids communication ‘across the political/media divide’

Page 19

Over to you…

Questions

Page 20

For more information, contact:Randall Pearce

+61 2 9358 [email protected]

NOTE: For a copy of this presentation, please provide your business card at the end of the session or email