Evidence of impact and return on investment
Measuring Impact
Determining Return on Investment
Reana Rossouw – Next Generation Consultants
Our context
• Started in 2009 – global benchmarking – 40 models applied internationally
• Conclusions:
  • Africa needs its own solution – applicable to both the investment and development sectors
  • Practitioners need capacity, support and knowledge
  • Practitioners do not need complex methodologies requiring specialist skills, licensed software, specific hardware or expensive solutions
  • The industry needs a transparent, comparable and flexible solution that contextualises and takes into consideration the complexities, relationships and fundamental development principles of our specific development context
Why the pressure to measure?
• The debate about impact and return on investment is playing out in three arenas:
  • In private/corporate foundations and CSI divisions – aiming to be more strategic about their social/community investments and development programmes
  • In nonprofit organisations, in response to pressure from investors, corporates, foundations and government – to be more accountable for the resources received and the programme outcomes expected
  • Among all other role-players, such as government agencies, intermediary organisations and international development agencies – seeking to improve development effectiveness and lessen dependency on development aid
Why now?
• Funders are increasingly asking for demonstrable results – to understand the difference they make, directly and indirectly, in the economy, society and the environment
• This trend is accelerating, and the sector is increasingly looking to pay for results – and to learn from what they, and those they fund, do
• It is not just donors that care about impact. In an age of competition, transparency and recessionary economies, there is growing competition for resources
Impact Assessment - Variety of purposes
• One can and should use impact data to make programme decisions across investment and development portfolios
• One can and should use impact data to make funding decisions within investment and development portfolios
• It is about building a unifying measurement standard, as well as a conceptual framework for understanding which interventions achieve the biggest impact per Rand invested. It is therefore not just about comparing one investment or development portfolio to another, or one organisation to another, but also about determining which programme, organisation or investment yields the highest return/impact for the most effective use of resources
• We therefore require a combination of the breadth of quantitative and the depth of qualitative evaluation/assessment methods
Constraints to impact assessments
1. Funders are not clear about the value and uses of impact assessment. The value proposition is not clear, while the financial costs are significant
2. Most organisations do not know how to do impact assessments, or how to use the findings. This is a technical and analytical skill that is new to the sector
3. Organisations do not have the systems, skills, resources, tools, processes or practices to conduct impact assessments
4. Identifying indicators to measure change/impact requires research, expertise, development knowledge and extensive experience in a development context
• In candid moments, organisations speak of the risk that assessments will generate "bad findings" or show negative impact – that is, findings showing that the organisation or intervention is not having an impact, or worse. The main concern is that negative or low impact assessment findings will lead to reduced funding
• Closely related, donors are typically at an even more primitive stage in their own understanding and use of impact assessment than their grantees. Research further indicates that donors neither provide sufficient funding for impact assessments nor understand how to analyse and interpret their outcomes (data)
Why should we be doing it?
• To decide – whether or not to fund an intervention, to continue or expand an intervention, or to replicate and scale up interventions
• To learn – how to adapt a successful intervention to suit another context, and to improve future implementation, investment decisions and the outcomes of interventions
• To reassure funders, investors and other stakeholders about progress and change
• To inform all stakeholders about whether or not, and in what ways, a programme is benefiting society
Key Findings
• Across all industry sectors and role-players, measuring impact is seen as useful and important:
  • 98% – impact measurement makes funders more effective
  • 97% – impact measurement makes development organisations more effective
  • 90% of funders agree the focus on measuring overall impact has increased in the past five years
  • 90% of funders plan to increase impact measurement in the next three years
  • 50% of funders conduct some form of measurement, including monitoring, evaluation and site visits
  • 10% of funders report on the outcomes of monitoring, evaluation and impact assessment
Key Findings
• Funders should support grantees to increase impact measurement:
  • 90% rate evidence of impact as extremely important
  • 90% provide no funding for impact assessment
  • 50% are thinking about providing funding for impact measurement
  • 10% provide funding for impact measurement
  • 90% think funders should provide funding for impact measurement
Key Findings
• Challenges for impact measurement – both funders and grantees
Challenges
• Lack of resources/funding to measure impact
• Not knowing how to measure
• Not knowing what to measure
• Impact measurement not linked to overall funding strategy
• Impact measurement not linked to strategic objectives/intent/ outcomes of programmes
• Number of assessment tools and complex task of developing, quantifying and qualifying indicators
Solutions
• Discussions about how to measure impact
• Developing standard approaches, tools and methodologies for impact assessment
• Training, capacity building and guidance on how to develop and use measurement tools
• Discussions amongst stakeholders about collaborative approaches and sharing results of impact measurement
• Funding of impact assessments
What Impact Assessment is all about
• Impact
  • Identify, interpret, improve, investigate, involve, inform
• To provide evidence
• To demonstrate performance
• To prove accountability
• To show programme/investment effectiveness
• To demonstrate shared value
• To empower and capacitate all stakeholders
• Ultimately – to alleviate, reduce and eradicate poverty and contribute to sustainable development
Calculating Impact – The Process
1. Information sources – strategies, applications, contracts, evaluation reports, site inspections, engagement
2. Primary, secondary and tertiary assessment
3. Data score sheet – identifying and calculating impact and return
4. Impact per stakeholder (qualitative and quantitative)
5. Return for investor
6. Dimensions of impact and return
7. Analysis, interpretation and triangulation of data
8. Impact per programme, per focus area, per stakeholder group
9. Return per programme, per focus area
10. Cost–benefit analysis; shared value (X:Y)
11. Recommendations – strategic, operational and programmatic
Evidence of Impact and Return
How much was spent – where – on what?
Case Study 1: Multichoice
[Charts: total impact per programme; total impact per focus area]
Case Study 1: Multichoice
[Charts: total return per brand; total return by investment]
Case Study 2: Pioneer Foods
[Charts: spend per focus area; impacts per focus area]
Case Study 2: Pioneer Foods
Impact across the triple bottom line
[Charts: impact scores per programme for Economic Impact, Social Impact and Environmental Impact (programme labels include FA-Limani, Aqua Noir, Heart, MOT)]
Case Study 2: Pioneer Foods
Impact over time
[Charts: impact scores per programme for Short-Term, Medium-Term and Long-Term Impact]
Case Study 2: Pioneer Foods - Impact per focus area
[Charts: impact scores per focus area – SED Environment (Paardeberg, WWF), SED Food Security, PFECT Education (MOT, PFECT Bursary Scheme), Enterprise Development (Aqua Noir, Katmakoep)]
Case Study 3: BHP Billiton Metalloys – Programme performance
[Charts: programme performance scores for the top five, middle four and bottom five programmes (including Raizcorp Business Incubation, Wings for Life, PC Literacy for Educators and the Inkululeko Project)]
Case Study 3: BHP Billiton Metalloys – Impact across focus areas per programme
Health – average 56:3
• Developmental (community impact high): –
• Strategic (business impact high): HIV and Me 94:7
• Charitable (community impact low): Inkululeko 52:1, CANSA 44:2, Reach for a Dream 34:3
• Commercial (business impact low): –

Education – average 57:2
• Developmental (community impact high): Eureka School 76:2, Meyerton High School 69:1
• Strategic (business impact high): Dr Malan School 64:3, PC Literacy for Educators 52:4
• Charitable (community impact low): Springfield Primary 46:1, Sibonile School 46:2
• Commercial (business impact low): Wings for Life 54:3, CASME 47:3

Poverty Alleviation – average 58:4
• Developmental (community impact high): –
• Strategic (business impact high): Kotulong Community Centre 62:4, Raizcorp Business Incubation 60:4
• Charitable (community impact low): MCD Training 52:3
• Commercial (business impact low): –
Impact vs. Return: Programmes vs. Focus Areas
Programmes (impact:return)
• Community impact above average: Eureka School 76:2, Meyerton High School 69:1, Dr Malan School 64:3
• Business impact above average: HIV and Me 94:7, Kotulong Community Centre 62:4, Raizcorp Business Incubation 60:4
• Community impact below average: Wings for Life 54:3, MCD Training Centre 52:3, Adv. of Science and Maths 47:3, Reach for a Dream 34:3, Sibonile School 46:2, CANSA 44:2, Inkululeko Project 52:1, Springfield Primary 46:1
• Business impact below average: PC Literacy for Educators 52:4
• Total impact 852; total return 43; overall average 56:3

Focus areas (impact:return)
• Above average: Education 454:19
• Below average: Health 224:13, Poverty Alleviation 174:11
• Total impact 852; total return 43; overall average 284:14
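The focus-area and overall totals above follow directly from the per-programme impact:return scores. A minimal sketch, assuming the aggregation rule is a simple sum per focus area (the rule is inferred, but it reproduces the reported figures exactly):

```python
# Per-programme (impact, return) scores as listed in the case study slides.
programmes = {
    "Health": {"HIV and Me": (94, 7), "Inkululeko": (52, 1),
               "CANSA": (44, 2), "Reach for a Dream": (34, 3)},
    "Education": {"Eureka School": (76, 2), "Meyerton High School": (69, 1),
                  "Dr Malan School": (64, 3), "PC Literacy for Educators": (52, 4),
                  "Springfield Primary": (46, 1), "Sibonile School": (46, 2),
                  "Wings for Life": (54, 3), "CASME": (47, 3)},
    "Poverty Alleviation": {"Kotulong Community Centre": (62, 4),
                            "Raizcorp Business Incubation": (60, 4),
                            "MCD Training": (52, 3)},
}

def totals(scores):
    """Sum the impact and return scores for one focus area."""
    impact = sum(i for i, _ in scores.values())
    ret = sum(r for _, r in scores.values())
    return impact, ret

for area, scores in programmes.items():
    print(area, totals(scores))  # e.g. Education -> (454, 19)

total_impact = sum(totals(s)[0] for s in programmes.values())
total_return = sum(totals(s)[1] for s in programmes.values())
print(total_impact, total_return)  # 852 43, matching the slide totals
```

Summing the three focus areas (454:19, 224:13, 174:11) gives the overall 852:43, and dividing by three focus areas gives the 284:14 average reported above.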
Case Study 3: BHP Billiton Metalloys - Impact vs. Cost
Individual cost per impact and return on investment for the community investment and development programmes
Total spend: R62 678 764.00 | Programmes: 15 | Total impact: 852 | Total return: 43
Cost per impact: R4 933.39
Cost per return: R78 841.21

Individual cost per impact per focus area:
Focus Area          | Spend          | Programmes | Total Impacts | Cost per Impact
Health              | R1 364 500.00  | 3          | 217           | R2 096.00
Education           | R36 314 264.10 | 8          | 472           | R9 617.00
Poverty Alleviation | R25 000 000.00 | 3          | 158           | R52 742.60

Individual cost per return per focus area:
Focus Area          | Spend          | Programmes | Total Returns | Cost per Return
Health              | R1 364 500.00  | 3          | 15            | R30 322.22
Education           | R36 314 264.10 | 8          | 27            | R168 121.59
Poverty Alleviation | R25 000 000.00 | 3          | 11            | R757 575.75
The cost-per-impact table shows that to achieve a single impact in the community, R4 933 needs to be spent per programme. Similarly, R78 841 needs to be spent to achieve a single return on investment for BHP Billiton Metalloys South Africa.
The per-focus-area tables indicate that the Poverty Alleviation focus area delivers the least community impact, whilst the Health and Education portfolios delivered more cost-effective impact for communities.
They also show that the Health portfolio not only had the highest return on investment for BHP Billiton Metalloys, but was also the most cost-effective portfolio. The Education portfolio delivered the second-highest return on investment, and Poverty Alleviation was not only the most expensive but had the least community impact and the least return on investment.
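A minimal sketch of the arithmetic behind the per-focus-area figures. From the reported numbers, the calculation appears to be spend ÷ number of programmes ÷ total impacts (or returns), i.e. the average spend per programme divided by the units of change it generated; that inferred rule reproduces every table entry:

```python
# (spend in Rand, number of programmes, total impacts, total returns)
# as reported in the Case Study 3 tables.
focus_areas = {
    "Health": (1_364_500.00, 3, 217, 15),
    "Education": (36_314_264.10, 8, 472, 27),
    "Poverty Alleviation": (25_000_000.00, 3, 158, 11),
}

def cost_per(spend, n_programmes, units):
    """Average spend per programme divided by units (impacts or returns)."""
    return spend / n_programmes / units

for area, (spend, n, impacts, returns) in focus_areas.items():
    print(area,
          round(cost_per(spend, n, impacts), 2),   # cost per impact
          round(cost_per(spend, n, returns), 2))   # cost per return
```

For example, Health: R1 364 500 ÷ 3 programmes ÷ 217 impacts ≈ R2 096 per impact, matching the table.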
The impact of Impact Assessments
What we say
• Overall, participants found the Impact Assessment Index model most helpful in quantifying and qualifying the results and impact of company social investment programmes, in project planning for future contributions programmes, and in communicating the importance and results of such programmes to both internal and external stakeholders
• The standardised, analytical structure of the model provided participants not only with concrete data on the impact of their community involvement and investment activities, but also with a common language to use when communicating with peers, non-profit partners and other business functions in their companies
• Participants found this helped increase buy-in for community involvement programmes and improved the effectiveness of those programmes
What our clients say - practitioners
• Improved financial analysis and reporting, as well as management capacity
• Improved operational efficiency, information management processes and systems, strategy setting, operational execution and programme insight
• Improved board governance, oversight, understanding, knowledge and capacity, as well as improved risk management
• Improved M&E, communication, stakeholder relationships and partnerships
• Improved operational processes – from application requirements and due diligence to reporting, grantmaking criteria, responsiveness to community needs, and the knowledge and expertise of staff
• Improved access to further (increased) investment
What our clients say – boards and SET committees
• It provides a detailed framework for future M&E practices
• It provides detail on how to increase impact and return
• It confirms and validates investment decisions
• It provides opportunities for cooperation and partnerships, and increased leveraging of resources
• It provides evidence of accountability and responsibility
• It provides insight and evidence of best vs. worst practice, and case studies for success stories, and informs marketing, branding and communication strategies
• It contributes to better risk management, maximisation of positive impact and minimisation of negative impact
• It provides assurance and verification of return on investment (business benefit)
• It provides evidence of the impact and return of shared value
What our clients say – intermediaries and beneficiaries
• Intermediaries
  • We feel comfortable with the transparency of the process
  • The process has added value to our own work – especially M&E and reporting practices
  • The process has increased our effectiveness and performance; increased learning and knowledge; built internal capacity; and increased our credibility
  • We have learnt the value of qualitative indicators and to consider impact more broadly, and we are now more convinced of the actual value of our programme – both to funders and beneficiaries
  • We were assured and audited independently by someone who can verify our claims – it validated our own beliefs
  • It secured increased funding for programmes, built internal capacity and increased our own sustainability
• Beneficiaries
  • We had an opportunity to talk without being judged – we could be honest
  • We learnt to document our own work and the contribution we made
  • We feel we are trusted and heard, and someone asks our opinion
  • We had an opportunity to share and learn
  • We now know what works and what doesn't
  • Our needs and priorities are considered
What our competitors say – other consultants
• The outcome of the process informs sustainability and integrated reports
• The detailed stakeholder engagement process provides insight never before documented or considered in evaluations
• The impact assessment process not only provides guidance for future strategies and programmes; it identifies areas requiring attention, confirms whether the needs of beneficiaries are met, monitors relationships, turns lessons learnt into detailed actions on issues that need to be addressed and improved, and informs future best practice
What we have learnt (1)
• One size of evaluation does not fit all
• Funders should take extra time in their planning to learn which evaluation techniques will work with indigenous populations and specific communities, and more importantly in specific social/community contexts
• Trying to define and measure empirical changes is difficult, but not impossible
• Understanding, defining, qualifying and quantifying long-term change is an incremental effort and an ongoing process
• Although evaluations are significant for organisational value and programme outcomes, requiring intermediaries to perform impact assessments is a challenge
• A suggested solution is to consider monitoring and evaluation results as one – but not the only – source of information, and to couple them with knowledge, experience, strategy and context to obtain a complete development picture
What we have learnt (2)
• Impact assessments describe two relationships:
  • between evaluation and the funder's development and investment approach/strategy
  • and between the intermediary evaluation and the beneficiary results/outcomes
• Funders may have to change their funding approach to better accommodate intermediaries' capacity to conduct evaluations, or provide funding to help them develop it
• Funders need to be clear about their overall investment strategies with regard to development goals, and how individual investment portfolios fit within that model
• Intermediary and funder evaluations should be linked
• Both partners face the same issues of inadequate resources, coordination and expertise for conducting evaluations
• THE BIGGEST MISSING LINK IS PREDEFINED THEORIES OF CHANGE AND SCIENTIFIC DEVELOPMENT MODELS
What we have learnt (3)
• ANY resource CAN be measured
  • Including books, wheelchairs, buildings, time – cash and non-cash, products, services
• The same project can deliver varied results for different funders
• The same type of intervention can now be compared
  • Dreamfields vs. Supersport Let's Play
• How and what you spend the money on (inside the programme) has a direct influence on the impact and return
• Cost–benefit analysis is critical, and a completely absent practice
• The strategy and focus areas have to clearly define the return and impact required – upfront
• Research, engagement and the co-development of indicators to measure impact become critical tools
• Sustainability has to be clearly defined – it means different things to different people in different development contexts
• Internal monitoring and external evaluation processes have to be established and adhered to
• IMPACT ASSESSMENT DOES NOT REPLACE EVALUATION AND MONITORING
What we have learnt (4)
• You have to consider the impact of the impact assessment on many levels
• As a result of our work, we can now categorically state that most programmes have only short-term impact:
  • Those that have medium-term impact are not necessarily sustainable
  • The long-term impact is mostly social only, as opposed to the economic impact that really contributes to poverty alleviation and eradication
• It is possible to determine impact and return
• And it is easy to spot best practice
• The real value lies in independent, verifiable assurance of social investment expenditure, programme results, outcomes, impact and return
What we have learnt (5)
• Some programmes are too expensive and do not yield the expected value, return or impact
• We can now compare interventions in investment portfolios
  • One programme costs R1 200 per teacher and yields 18 impacts
  • One programme costs R268 000 per teacher and yields 8 impacts
• Enterprise development requires specific skills, an increased risk propensity and extensive resources – in essence, it costs about R1 million to create a single job
• Most programmes yield negative impact as well
  • Negative impact should be seen as a very valuable lesson
  • One programme provides ICT labs at schools without understanding the cost implications (i.e. increased energy consumption, or increased costs for consumables such as ink, paper, licences and security)
• The consequences of interventions should be considered as well
  • Unintended and indirect impact, as well as negative impact, speaks to programme design
  • One programme provides bursaries without taking into consideration the social disconnect that follows between the recipient and the community/family, or that additional support may be required, such as travel, books or extra tutorial classes
• Return on investment is hardly ever defined, considered, measured or reported on
  • This perpetuates the perception that CSI delivers "PR" benefits only
• The impact on intermediaries is never considered
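The per-teacher comparison above can be expressed as a simple cost-per-impact calculation. A minimal sketch, using the two figures quoted on this slide (the "impact" counts are the scores from the assessment, not independently verified here):

```python
def cost_per_impact(cost_per_teacher, impacts):
    """Rand spent per teacher for each counted impact."""
    return cost_per_teacher / impacts

print(cost_per_impact(1_200, 18))      # roughly R67 per impact
print(cost_per_impact(268_000, 8))     # R33 500 per impact
```

On these numbers, the cheaper programme is around 500 times more cost-effective per impact, which is the kind of comparison the assessment makes possible.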
What I now know…
• We all have impact – but it is not necessarily positive, intended or sustainable impact
• WHAT do we want to achieve?
• Sometimes it is our own (practitioners') fault that we don't have higher impact, as we decide what, who and how to fund or not to fund
• The most sustainable projects/programmes with the highest impact have social, socio-economic and ECONOMIC impacts
• The most sustainable programmes provide impact across the triple bottom line – i.e. economic, social and environmental impact
• Sometimes there is negative impact – i.e. dependencies are created
• Mostly there is only short-term impact – which makes our interventions UNSUSTAINABLE
• Very seldom is there a direct correlation between, or an assessment of, WHAT WAS PROMISED VS. WHAT WAS ACTUALLY DELIVERED
What I did not expect – the bad news
• Poor design of programmes
• Lack of evidence – suitable development models for specific contexts
• Lack of data/information/evidence – baseline studies and indicators to measure outcomes
• Lack of engagement – determining real community needs
• Lack of skills – basic M&E is an immature practice
• Lack of governance and risk management
• Lack of defined theories of change
• Lack of qualitative indicators
• Lack of knowledge – the reach and extent of impact on various stakeholder groups
• Lack of transparency – strategic/programme objectives and intent vs. actual programme outcomes and impact
• Lack of programme integration, and the persistence of silo mentalities
• Lack of coordination and collaboration, and persistent competition
• Shared value is a new concept – it needs to be explored, explained, detailed and documented
• In most cases, total spend and cost are under-reported and evidence of change is over-reported
Unexpected learnings
• ICT interventions and bursaries – not convinced they are working
• Providing infrastructure – schools, clinics, hospitals, libraries, roads, bridges, boreholes – needs government support and partnerships to ensure continuity
• Enterprise development is VERY expensive
• Education does not deliver value or evidence of success/impact
• Health portfolios should be connected and integrated – HIV/Aids, TB and cancer
• Food and water security and environmental portfolios deserve more focus and attention
• Missed opportunities – using CSI to build and capture business value/benefit (ROI)
• The importance of EVP programmes for employees and management
• Only now starting to realise and understand the true transformational value CSI contributes to competitiveness and differentiation for business
What I did not expect – the good news
• We can now prove the link/value/contribution of CSI to sustainability strategies (Pioneer Foods and BHP Billiton Aluminium) – how CSI can mitigate risk, for instance by contributing to enhanced food security and nutrition, offsetting carbon emissions, or providing future suppliers
• We can now prove the link and value between core business strategies and CSI – for instance the value CSI adds to:
  • The brand (Multichoice/Supersport) – We Care
  • Licence to operate and BEE/Charter requirements (Nedbank/BHP Billiton) – financial inclusion, financial literacy and financial services for the unbanked, or (Mixit) bridging the digital divide
  • Customer loyalty (DSTV, Edcon and Nedbank) – increased subscriptions, sales, revenue, income and profitability
  • Savings (Pioneer Foods) in retail, supplier and business negotiations
  • Employee retention, recruitment and motivation (Nedbank/MNet EVP)
  • Shareholder mandate (Transnet)
Going Forward and Doing it better
• Impact assessment can help funders, intermediaries and beneficiaries to:
  • Plan how their work will make a difference, and determine how much of a difference they are making
  • Understand what does or does not work, and why, and detect unintended consequences
  • Build a (scientific, as opposed to anecdotal) evidence base to share with others, thus influencing and informing debate and increasing the sector's body of knowledge
  • Challenge yourself and others by looking critically at your/their work in order to improve, to replicate good work, or to innovate and develop new processes, products and services with greater impact
  • Inspire and motivate staff, trustees and other stakeholders – including volunteers, beneficiaries, policy makers, other practitioners, funders and investors – to build relationships with others, communicate shared value and increase the impact of development efforts
Impact Investment Index
• The III model is analytical, based on quality management principles
• It is a well-organised, concise way to present data that is both simple to use and to explain
• The model is flexible, allowing for adaptation on a company-by-company and project basis
• It has a process for adapting to new challenges and includes a built-in process for continuous learning and improvement
• There are readily available tools, libraries and templates that provide an off-the-shelf approach, flexible to individual company needs
• The model provides a common language to apply across the business and around the world, which simplifies communication
• Measurement language can be shared with non-profit partners
• The model accepts and aligns with GRI and SRI indexes, which provides global credibility and allows for consistency in reporting against other functional areas within the corporate enterprise
• Next Generation's benchmarking and inclusive culture encourages sharing, so clients can learn from each other and improve performance
In closing – About impact assessments
• Involve stakeholders
  • Establish the scope and identify the key stakeholders impacted
• Understand what changes
  • Map the outcomes – identify the indicators to measure impact
• Value the things that matter
  • Look for evidence – don't forget baseline studies, engagement and M&E reports
• Only include what is material
  • Focus on agreed outcomes first, then incidental impact
• Do not over-claim
  • Calculate the impact based on evidence
• Be transparent
  • Report the outcome – the good and the BAD news
• Verify the result
• Share the learning
Community Impact · Business Return
If you can count and define indicators, you can measure impact and return
Thank You!
Questions?
www.nextgeneration.co.za