KTH Seminar, Stockholm, Sweden, 9 October 2009
Approaches to quality in e-learning through benchmarking programmes
Professor Paul Bacsich
Matic Media Ltd
Topics
1. Introduction, disclaimers and acknowledgements
2. The four phases of the UK HE Benchmarking Programme
3. Relationship to Quality of e-Learning
4. Benchmarking in practice – and the Distance Learning Benchmarking Club
Disclaimer: This talk is not on behalf of any institution, agency or ministry – it is a personal expert view
Thanks to the HE Academy, JISC, the EU Lifelong Learning Programme, Manchester Business School and the University of Leicester for support – apologies to others omitted
2. The four phases of the UK HE Benchmarking Programme
an overview
Benchmarking e-learning
At national level, started in the UK and New Zealand
– Soon spread to Australia
– Not closely linked initially to the quality agenda
At European level, developments include E-xcellence and UNIQUe
– Some earlier work from OBHE, ESMU etc – but not in “public criterion” mode
– Later, developments in other projects
– Increasingly, links made to the quality agenda
Benchmarking e-learning (UK)
Foreseen in the HEFCE e-learning strategy of 2005
The Higher Education Academy (HEA) oversaw it
Four phases – 82 institutions – 5 methodologies
Two consultant teams – BELA and OBHE
Justified entry to the HEA Pathfinder and Enhancement national initiatives – and useful for JISC initiatives also (Curriculum Design etc)
Can be leveraged into an update of the learning and teaching strategy (e.g. University of Leicester)
Documentation – very good
HE Academy reports on benchmarking
Evaluator reports on each phase
Consultant team reports on each phase
Conference papers (EADTU/ICDE each year – and ALT-C etc)
Definitive book chapter (to appear)
HE Academy blog and wiki (Web 2.0)
Specific HEI blogs and some public reports
http://elearning.heacademy.ac.uk/wiki/index.php/Bibliography_of_benchmarking
UK: benchmarking e-learning
“Possibly more important is for us [HEFCE] to help individual institutions understand their own positions on e-learning, to set their aspirations and goals for embedding e-learning – and then to benchmark themselves and their progress against institutions with similar goals, and across the sector”
Methodologies in UK HE
There were five methodologies used in the UK, but only two now have public criteria, are routinely updated and are available for single institutions (to use outside consortia):
Pick&Mix
– Used under HEA auspices in 24 UK institutions
– Including 4 diverse institutions in Wales
– Now being used in a further UK HEI and one in Australia
– About to be used by the 7-institution Distance Learning Benchmarking Club (UK, Sweden, Australia, Canada, New Zealand)
eMM – as used in New Zealand and Australia
Pick&Mix overview
Focussed on e-learning, not general pedagogy
Draws on several sources and methodologies – UK and international (including US) and from the college sector
Not linked to any particular style of e-learning (e.g. distance, on-campus or blended)
Oriented to institutions with notable activity in e-learning
Suitable for desk research as well as “in-depth” studies
Suitable for single- and multi-institution studies
Pick&Mix history
Initial version developed in early 2005 in response to a request from Manchester Business School for an international competitor study
Since then, refined by literature search, discussion, feedback, presentations, workshops, concordance studies and four phases of use – the fifth and sixth phases are now under way
Forms the basis of the current wording of the Critical Success Factors scheme for the EU Re.ViCa project
Criteria
Criteria are “statements of practice” which are scored into a number of performance levels from bad/nil to excellent
It is wisest if these statements are in the public domain – to allow analysis and refinement
The number of criteria is crucial
Pick&Mix currently has a core of 20 – based on analysis from the literature (ABC, BS etc) and experience in many senior management scoring meetings
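As an aside (not part of the published methodology), a criterion and its scored levels are easy to hold as plain data. A minimal Python sketch, with hypothetical class and field names, using the P01 “Adoption” level statements shown later in this talk:

```python
# Illustrative sketch only: one way to represent a Pick&Mix-style criterion
# and its scored performance levels as plain data. The class and field names
# are hypothetical, not part of the published methodology.
from dataclasses import dataclass


@dataclass
class Criterion:
    code: str                 # e.g. "P01"
    name: str                 # e.g. "Adoption"
    levels: dict[int, str]    # score (1-6) -> statement of practice

    def statement(self, score: int) -> str:
        """Return the statement of practice for a given score."""
        return self.levels[score]


adoption = Criterion(
    code="P01",
    name="Adoption",
    levels={
        1: "Innovators only",
        2: "Early adopters taking it up",
        3: "Early adopters adopted; early majority taking it up",
        4: "Early majority adopted; late majority taking it up",
        5: "All taken up except laggards, who are now taking it up",
        6: "First wave embedded, second wave under way",
    },
)
print(adoption.statement(3))
```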
Pick&Mix: 20 core criteria
Removed any not specific to e-learning
– Including those in general quality schemes (QAA in the UK)
Careful about any which are not provably success factors
Left out of the core were some criteria where there was not yet UK consensus
Institutions will wish to add some criteria to monitor their KPIs and objectives – recommended no more than 6
– Pick&Mix now has over 70 supplementary criteria to choose from
– More can be constructed or taken from other schemes
These 20 have stood the test of four phases of benchmarking with only minor changes of wording
– Originally 18; two were split to make 20
Pick&Mix Scoring
Use a 6-point scale (1–6)
– 5 levels (cf. Likert, MIT90s levels) plus 1 more for “excellence”
Contextualised by a “scoring commentary”
There are always issues of judging progress, especially “best practice”
The 6 levels are mapped to 4 colours in a “traffic lights” system – red, amber, olive, green
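To make the scoring concrete, here is a minimal sketch of the level-to-colour mapping. The slide names the four colours but not the exact boundaries, so the split below (1 = red, 2–3 = amber, 4–5 = olive, 6 = green) is an assumption for illustration only:

```python
# Minimal sketch of the 6-level -> 4-colour "traffic lights" mapping.
# The slide names the colours (red, amber, olive, green) but not the exact
# boundaries; the split below is an assumption for illustration.
def traffic_light(score: int) -> str:
    if not 1 <= score <= 6:
        raise ValueError("Pick&Mix scores run from 1 to 6")
    if score == 1:
        return "red"
    if score <= 3:
        return "amber"
    if score <= 5:
        return "olive"
    return "green"  # 6 = "excellence"


assert traffic_light(6) == "green"
```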
Pick&Mix System: summary
Has taken account of “best of breed” schemes
Output- and student-oriented aspects
Methodology-agnostic but uses underlying approaches where useful (e.g. Chickering & Gamson, Quality on the Line, MIT90s)
Requires no long training course to understand
Institutional competences
The University of Leicester used Pick&Mix in the very first phase of the HEA programme
– And in two phases of re-benchmarking
Other universities with strong competence (with approved HEA Consultants) are University of Derby and University of Chester
Several other universities have done excellent work and produced public papers and reports (e.g. Northumbria, Worcester)
P01 “Adoption” (Rogers)
1. Innovators only
2. Early adopters taking it up
3. Early adopters adopted; early majority taking it up
4. Early majority adopted; late majority taking it up
5. All taken up except laggards, who are now taking it up (or retiring or leaving)
6. First wave embedded, second wave under way (e.g. m-learning after e-learning)
P10 “Training”
1. No systematic training for e-learning
2. Some systematic training, e.g. in some projects and departments
3. Uni-wide training programme but little monitoring of attendance or encouragement to go
4. Uni-wide training programme, monitored and incentivised
5. All staff trained in VLE use, training appropriate to job type – and retrained when needed
6. Staff increasingly keep themselves up to date in a “just in time, just for me” fashion except in situations of discontinuous change
P05 “Accessibility”
1. VLE and e-learning material are not accessible
2. VLE and much e-learning material conform to minimum standards of accessibility
3. VLE and almost all e-learning material conform to minimum standards of accessibility
4. VLE and all e-learning material conform to at least minimum standards of accessibility, much to higher standards
5. VLE and e-learning material are accessible, and key components validated by external agencies
6. Strong evidence of conformance with the letter & spirit of accessibility in all countries where students study
Other methodologies
Members of the BELA team have run three other methodologies:
– MIT90s, eMM and ELTI, for the HE Academy
And analysed most others:
– Most US and European methodologies were analysed: QoL, E-xcellence, BENVIC, OBHE
Insights from other methodologies are fed into Pick&Mix to improve it
National indicators
Pick&Mix is mapped to the HEFCE Measures of Success (England)
Similar mappings were done for the Welsh Indicators of Success (draft and final) and for the Becta Balanced Scorecard (for colleges)
Comparative work
A databank of scores from 10 HEIs is public in anonymised form
Because each criterion is stable in concept, longitudinal comparisons (across time) are also possible
– Old criteria are withdrawn if no longer relevant and new criteria introduced (e.g. for Web 2.0 and work-based learning)
– Several HEIs have done re-benchmarking
Benchmarking frameworks
It is implausible that there will be a global scheme, or even continent-wide schemes, for benchmarking
But common vocabulary and principles can be enunciated – e.g. for public criterion systems:
– Criteria should be public, understandable, concise and relatively stable – and not politicised or fudged
– Criteria choice should be justified from field experience and the literature
– Core and supplementary criteria should be differentiated for each jurisdiction
– Core criteria should be under 40 in number
– The number of scoring levels should be 4, 5 or 6
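As an illustration only, these principles are concrete enough to be machine-checked. The Scheme structure and field names below are hypothetical; the thresholds come from the list above, and Pick&Mix’s values (20 public core criteria, 6 levels) come from earlier slides:

```python
# Hypothetical sketch: encoding the public-criterion principles above as
# checks on a benchmarking scheme description. Thresholds come from the
# slide (core criteria under 40; 4, 5 or 6 scoring levels).
from dataclasses import dataclass


@dataclass
class Scheme:
    name: str
    criteria_public: bool
    core_criteria: list[str]
    scoring_levels: int


def principle_violations(scheme: Scheme) -> list[str]:
    issues = []
    if not scheme.criteria_public:
        issues.append("criteria are not public")
    if len(scheme.core_criteria) >= 40:
        issues.append("core criteria should be under 40 in number")
    if scheme.scoring_levels not in (4, 5, 6):
        issues.append("scoring levels should be 4, 5 or 6")
    return issues


pickmix = Scheme("Pick&Mix", True, [f"P{i:02d}" for i in range(1, 21)], 6)
assert principle_violations(pickmix) == []
```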
Concordances
Mappings between systems are hard and rarely useful (Bacsich and Marshall, passim)
Concordances of systems are easier and helpful – e.g. to reduce the burden of benchmarking with a new methodology
– Such approaches will be used in the Distance Learning Benchmarking Club, for E-xcellence+/ESMU and ACODE
Experience on methodologies
Methodologies do not survive without regular updating by a design authority
– This is difficult in a leaderless group context
Forking of methodologies needs to be dealt with by folding updates back into the core system
– Otherwise survival is affected
Complex methodologies do not survive well
A public criterion system allows confidence, transparency, and grounding in institutions
3. Relationship to Quality of e-Learning
My thoughts
Too many concepts
Benchmarking
Standards?
Quality
Accreditation/approval/kitemarking
Critical Success Factors
E-learning is only a small part of the quality process – how can agencies and assessors handle five variants of the concept across many separate methodologies?
My view - the pyramid
[Pyramid diagram, top to bottom: Critical Success Factors, Benchmarking, Quality, Detailed pedagogic guidelines. Criteria are placed at different layers in the pyramid depending on their “level” – from the leadership level at the top down through senior managers.]
4. Benchmarking in practice – and the Distance Learning Benchmarking Club
Carpets
[Carpet table: per-criterion Pick&Mix scores for nine institutions (A–I) with averages; only the criterion names and the average (Av) column are recoverable from this transcript.]
Criterion name – Av
Adoption – 3.6
VLE stage – 5.1
Tools – 2.8
Usability – 2.5
Accessibility – 2.0
e-Learning Strategy – 3.9
Decisions/Projects – 3.4
Pedagogy – 2.9
Learning Material – 2.0
Training – 3.1
Academic Workload – 1.6
Costing – 1.4
Planning Annually – 2.7
Evaluation (e-learning) – 3.4
Organisation – 2.9
Tech Support to Staff – 3.3
Quality Assurance – 2.8
Staff Recognition – 2.1
Decisions/Programmes – 2.7
Quality Enhancement – 3.5
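For illustration, a sketch of how such a carpet row could be assembled from per-institution scores. The scores below are invented placeholders (only the criterion names come from the slide), and the colour split repeats the assumption made in the scoring sketch earlier:

```python
# Sketch: assembling "carpet" rows from per-institution scores.
# Scores are invented placeholders; only the criterion names are real.
COLOURS = {1: "red", 2: "amber", 3: "amber",
           4: "olive", 5: "olive", 6: "green"}  # assumed split

rows = {
    "Adoption":  [4, 3, 4, 3, 4, 4, 3, 4, 3],   # institutions A..I
    "VLE stage": [5, 5, 6, 5, 5, 5, 5, 5, 5],
    "Costing":   [1, 2, 1, 1, 2, 1, 1, 2, 1],
}

for criterion, cells in rows.items():
    avg = sum(cells) / len(cells)
    colours = [COLOURS[s] for s in cells]
    print(f"{criterion:10s} avg={avg:.1f} {colours}")
```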
Supplementary criteria - examples
IT reliability
Market research, competitor research
IPR
Research outputs from e-learning
Help Desk
Management of student expectations
Student satisfaction
Web 2.0 pedagogy
Local criteria
Institutions can track their own “local criteria”
But this is rarely done
– It is actually very hard to craft good criterion statements
Slices (departments etc)
As well as benchmarking the whole institution, it is wise to look at a few “slices”:
Schools, Faculties, Programmes…
Useful to give a context to scores
Do not do too many
Slices need not be organisational
– Distance learning…
– Thematic or dimensional slices like HR, costs…
Most other systems also now use this approach
Evidence and Process
Iterative Self-Review
for public criterion systems
The Iterative Self-Review Process
For all the methodologies we deployed, we use an Iterative Self-Review Process
The methodologies do NOT require it – it was what our UK institutions desired, for all the public criterion systems – there was strong resistance to documentary review
It encourages a more senior level of participation from the institution: the result is theirs, not the assessors’
It allows them to get comfortable with the criteria as they apply to their institution
And to move directly to implementation of change
But it selects against complex methodologies
And it requires more effort from assessors
Iterative Self-Review details
Introductory meeting
Initial collection of evidence
Selection of supplementary criteria
Mid-process meeting
Further collection of evidence
Scoring rehearsal meeting
Final tweaks on, and chasing of, evidence
Scoring meeting
Reflection meeting – to move to change
How to handle evidence
Have a “file” for each criterion
Institutions normally group criteria according to their own L&T strategy or in terms of “owning” departments
– We also supply some standard groupings, e.g. based on MIT90s, but few use these
Peer review
Peer review exists in the Iterative Self-Review model:
– Specialist assessors (normally two nowadays) have experience in the sector
– Often, the benchmarking is done in a benchmarking cohort and the leaders of each HEI in the cohort form a peer group
Distance Learning Benchmarking Club
A work package in the JISC Curriculum Delivery project DUCKLING at the University of Leicester
Seven institutions in the UK and beyond will be benchmarked this year
– And again next year (Sept–Oct 2010)
– The aim is to baseline and then measure incremental progress in e-learning
Members
University of Leicester (UK)
University of Liverpool (UK)
University of Southern Queensland (Australia)
Massey University (NZ)
Thompson Rivers University (Canada)
Lund University (Sweden)
KTH (Sweden)
Process
Institutions will work in a virtual cohort using teleconferencing
Pick&Mix will be used – with an adjusted set of Core Criteria to take account of:
– Updated analysis of earlier benchmarking phases
– Critical Success Factors for large dual-mode institutions
– The need for expeditious working
References
A key paper on the international aspects is
“Benchmarking e-learning in UK universities: lessons from and for the international context”, in Proceedings of the ICDE Conference M-2009, at http://www.ou.nl/Docs/Campagnes/ICDE2009/Papers/Final_Paper_338Bacsich.pdf
A specific chapter on the UK HE benchmarking programme methodologies is:
“Benchmarking e-learning in UK universities – the methodologies”, in Mayes, J. T., Morrison, D., Bullen, P., Mellar, H. and Oliver, M. (Eds.), Transformation in Higher Education through Technology-Enhanced Learning, York: Higher Education Academy, 2009 (expected late 2009)