FOME Symposium 2015 | Workshop 8: Current evaluation practices and perspectives on more effective...
Towards more effective evaluation approaches for media development
Current projects
Evaluating Communication for Development: supporting adaptive and accountable development
Mobilising Media for Sustainable Outcomes in the Pacific Region
PhD Research Question & Design
Research Question: How can the impacts of Australian media assistance on social change and governance be most effectively evaluated and understood?
Phase 1: Document analysis of media assistance evaluations, 2002-2012 (total of 47)
Phase 2: Interviews with media assistance evaluators (consultants and in-house, total of 10)
Phase 3: Case study of the Cambodian Communication Assistance Project (CCAP), an AusAID-funded project managed by ABC International Development.
Findings: Key challenges
• Conceptual ambiguities
• Bureaucratic systems
• Complex epistemological and political undercurrents in the evaluation discipline
1. Conceptual ambiguities
2. Bureaucratic systems
Proceduralization:
"the codification of approaches that are meant to accomplish positive outcomes into mechanical checklists and templates that not only fail to achieve their intent but actually lead to even worse outcomes." (Anderson, Brown and Jean 2012: 67)
2. Bureaucratic systems
Authored by | Total | Commissioned (/required) by donor | Commissioned (/required) by project
External consultant | 27 | 19 | 8
Donor | 5 | 5 | –
Project | 5 | 2 | 3
Consultant + Donor | 2 | 2 | –
Consultant + Project | 1 | – | 1
Donor + Project | 1 | 1 | –
Unknown | 6 | 6 | –
Total | 47 | 35 | 12
2. Bureaucratic systems
• ‘Nuancing’ reports
• Being ‘circumspect’
• Pre-cooking evaluations
Dependence on Independence
2. Bureaucratic systems
• Simplicity and complexity
• Proving vs improving
• Positivism vs participation
3. Evaluation epistemologies
Adapted from Manyozo (2012)
Media Development: Focus on industry. Aims to improve governance.
Media for Development: Focus on content. Aims to educate and inform.
Participatory Communication: Focus on dialogue. Aims to foster participation and self-determination.
Conceptual ambiguities
Prototype
C4D Design & Evaluation in the Pacific
• Workshops, field visits, pilot
– Capacity building
– ‘Innovation’
– Proportionate
– Piloting the ‘LEAD4innovation toolkit’ or C4D IDEAS Toolkit
– Includes a facilitator’s guide
Rainbow Framework Toolkit Module Objectives and Activities
MANAGE an evaluation or evaluation system
Module 2: Stakeholder Mapping. Who has an interest or role in our project and evaluation? How can we communicate with them?
Module 3: Ethics
What are the potential ethical issues in media and communication projects and evaluation? How can we prevent these from becoming problems?
REPORT & SUPPORT USE of findings
Module 4: Planning Data Use
What is our macro-timeline for sharing evaluation insights? How can we build in continuous learning? What are the most appropriate ways to share our insights?
DEFINE what is to be evaluated
Module 5: Define. What are we trying to do, and how are we trying to do it?
FRAME the boundaries for an evaluation
Module 6: Ask Questions. Based on our main goals, what are our key evaluation questions?
DESCRIBE activities, outcomes, impacts and context
Module 7: Planning Information and Data Collection
Which data collection methods will give us the best information to answer our questions? Which methods are realistic?
UNDERSTAND CAUSES of outcomes and impacts
Module 8: Making Sense of Data. How do we manage our data? How can we make sense of and analyse different kinds of data?
SYNTHESISE data from one or more evaluations
(Included in the Facilitator’s Guide; meta-analysis of groups of projects)
Contents of the LEAD4innovation / C4D IDEAS Toolkit against the Rainbow Framework
Conventional Baselines | Living Baselines
Originates from experimental designs | Challenges the conventional notion of ‘baselines’
Assumes a controlled environment | Acknowledges complexity (feedback loops, interrelationships)
Simple model of cause and effect | Social change occurs through interaction between multiple factors and agents
Anticipates linear trajectories of change | Anticipates multiple, emergent changes which are unknowable in advance
For measuring | For generating insights and learning
Research is undertaken twice (before and after), or at most three times (if there is a midline) | Research is undertaken periodically
Fixed, contained | Adaptive, expanding
Positions development knowledge in the domain of experts | Privileges local knowledge
“Living Baseline”
Publications
Noske-Turner, J. (2015). 10 years of evaluation practice in media assistance: Who, when, why and how? Nordicom Review, 36(Special Issue), 41-56.
Noske-Turner, J. (2014). Evaluating the impacts of media assistance: Problems and principles. Global Media Journal: German Edition, 4(2), 1-21.