Networked Learning Conference 2011
DESCRIPTION
xDelia presentation on the evaluation framework devised to guide the interdisciplinary design and development work within the project.

TRANSCRIPT
Using Participatory Evaluation to Support Collaboration in an Interdisciplinary Context
Gill Clough, Gráinne Conole & Eileen Scanlon
Agenda
• Context for the Design and Evaluation Framework
– Developed as part of the evaluation of the xDelia European project
• Theoretical approach
– Participatory, Iterative, Useful
• The D&E framework in action
– How it supports project collaborations
– Applied to the Games Design Workshop
• Supporting interdisciplinary collaboration
xDelia – 3-year EU Project
• Its goal:
– To use bio-sensors and serious games to identify and address the effects of emotional bias on financial decision making in 3 fields:
• Traders
• Investors
• Individuals aged 16 to 29
Challenges and Opportunities
• Interdisciplinary consortium
– CIMNE, Spain (Project Management)
– OU Business School, UK (Traders and Investors)
– Bristol University, UK (Individual Financial Capability)
– Blekinge Tekniska Högskola, Sweden (Serious Games)
– Erasmus University, The Netherlands (Cognitive Psychology)
– Forschungszentrum Informatik, Germany (Bio-sensors)
– OU Institute of Educational Technology, UK (Evaluation)
• Challenge:
– To collaborate effectively to achieve a common goal
• Opportunity:
– To learn from each other and conduct top-quality interdisciplinary research
xDelia D&E Framework
• Framework for the design and evaluation of effective project interventions
• Benefits
– Provides a lens for reflection
– Captures the project experience at key moments
– Ensures a common understanding of the underpinning approach to design and evaluation
– Guides the development of the research questions and methods
• Supports interdisciplinary collaboration in a distributed project
Theoretical Approach
• Approach
– Participatory
– Iterative
– Useful
• Theoretical perspectives:
– Practical participatory evaluation (Cousins and Whitmore, 1998)
– Collaborative inquiry (Oliver et al., 2007; Poth and Shulha, 2008)
– Participatory design (Namioka and Schuler, 1993; Poth and Shulha, 2008)
– Also draws on:
• Utilization-focused evaluation (Patton, 2008)
• Learning design (Conole, 2009)
[Diagram: dimensions of participation — from researcher controlled to practitioner controlled; from consultation to deep participation; from primary users to all groups]
Evaluations
• Workshop interventions:
– Type i) Prototype development workshops
– Type ii) Substantive, subject-orientated workshops
– Type iii) Design and evaluation workshops
• Study interventions:
– Research activities that aim to provide data for the research
Design and Evaluation Framework
The framework in action
• The D&E framework works at two levels
– Macro level: overall conceptual framework and approach
– Micro level: specific support from design question through to analysis
• Ongoing meta-evaluation analysis
– Articulates key moments in the project
– Captures different partner stakeholder perspectives
– Identifies the success factors for, and challenges of, this kind of interdisciplinary research
Questions
• Game Design Questions
– What games do we want to develop further?
– What concepts do we want to develop further?
– What are the key questions for games to improve financial capability?
• Evaluation Questions
– What aspects of the workshop worked and what did not work?
– What are partners’ initial perceptions of the project?
Interventions
• Design Layer
– Briefing on financial capability
– Mixed-partner groupings
– Brainstorm game models
– Select one game and prototype it
– Group evaluation
• Evaluation Layer
– Audio- and video-record the workshop
– Interview partners
– Post-workshop survey
Analysis
• Design
– Identify key questions for games
– Build a better collective understanding of the game design process and of evaluation criteria for games
– Document an initial set of prototypes
• Evaluation
– Abstract examples of good practice
• Interdisciplinary learning evaluated through a post-event survey
– Feed back a critical evaluation of the event
• Workshop template and guidelines page on the wiki
– Identify and address issues of interdisciplinarity
• Analyse baseline interview data to identify themes
• Provide support and guidance for collaborative working founded on interdisciplinary learning
Interdisciplinarity Themes
Collaboration tools
• Wiki and email
• WP6 participation in collaborations
– video-conferences
– studies
• Cloud-based file repositories
Implications for practice
• Evaluation team adopts a reflective participatory role
– Two-way participation in both workshops and study interventions
– Identification of xDelia mediating technology artefacts and analysis of their impact on collaboration and learning: FlashMeeting, Adobe Connect, wiki, Google Docs, Google Wave, email, Dropbox, Diigo
• Support local interventions via an online Evaluation Toolbox
– Toolbox consists of an evolving ‘pick and mix’ of evaluation methods
– Available on the Cloudworks social media site
– Each evaluation method is a ‘Cloud’ on Cloudworks with a discussion space
– Anyone can improve the Clouds by adding more links and references
Into the Future
• Evolving project glossary supporting shared understandings
• Participatory evaluation of study interventions
• Refine the evaluation toolbox for micro-level application of the D&E framework
• Apply the D&E framework to the study interventions
• In-depth analysis of data on interdisciplinary collaboration
• Additional partner interviews
• Analysis of mediating technologies used in the project
Gill Clough – [email protected]
Gráinne Conole – [email protected]
Institute of Educational Technology
The Open University
Walton Hall
Milton Keynes
MK7 6AA
www.open.ac.uk
References
– Clough, G., Conole, G. C. and Scanlon, E. (2009). Behavioural Finance and Immersive Games: A Pan-European Framework for Design and Evaluation. In Same places, different spaces. Proceedings ascilite Auckland 2009.
– Conole, G. (2009) Capturing and Representing Practice. In Tait, A., Vidal, M., Bernath, U. & Szucs, A. (Eds.) Distance and E-learning in Transition: Learning Innovation, Technology and Social Challenges. London, John Wiley and Sons.
– Conole, G. and Culver, J. (2009). The design of Cloudworks: Applying social networking practice to foster the exchange of learning and teaching ideas and designs. In Computers and Education - Learning in Digital Worlds – Selected contributions from the CAL09 conference. http://oro.open.ac.uk/18384/ [viewed 19 Nov 2009].
– Cousins, J.B. and Whitmore, E. (1998). Framing participatory evaluation. New Directions for Evaluation, 80, 5-23.
– Oliver, M., Harvey, J., Conole, G. and Jones, A. (2007). Evaluation. In G. Conole and M. Oliver (Eds.), Contemporary Perspectives in E-Learning Research (pp. 203–216). London: Routledge.
– Patton, M. Q. (2008) Utilization-Focused Evaluation, Sage Publications, London.
– Poth, C. A. and Shulha, L. (2008). Encouraging stakeholder engagement: A case study of evaluator behaviour. Studies in Educational Evaluation (Special issue: The Process of Evaluation: Focus on Stakeholders), 34, 218–233.