
Evaluating the New Technologies

Ann Sefton

Faculties of Medicine and Dentistry

University of Sydney

Outline

• what is our educational purpose?
• levels of decision-making
• evaluating local strategies
• evaluating others’ “solutions”
• designing evaluations
• using IT for program evaluation
• teaching students to evaluate
• conclusions

Levels of decision-making have different implications
• institution
• faculty
• department
• unit of study
• individual staff member
– whole program, unit of study, “lesson”?
– undergrad, postgrad, continuing education?

Evaluation of local strategies

• against goals and expectations
• measuring learning processes
• comparative
• program improvement
• consider quality, costs, accessibility

Meeting goals: example values - degree program:
• student-centred, independent learning
• reflection and self-evaluation
• cooperation in groups
• evidence-based decision making
• effective skills
– clinical
– IT

Meeting goals: example goals - degree program
• basic and clinical science
– critical reasoning for medical practice
• patient and doctor
– effective communication, clinical skills
• community and doctor
– community concerns and population issues
• personal & professional development
– ethics/law, humanities, reflective practice

Meeting goals: learning package examples
• understand new concept(s)
• solve problem(s)
• learn a skill
• simulate an experiment
• access new information incl. images
• formative self-assessment (sketch below)
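As a sketch of the last item, a formative self-assessment component can be as simple as a question bank that returns immediate explanatory feedback rather than a mark. The question, options, and feedback text below are invented placeholders, not content from any real package:

```python
# Minimal sketch of a formative self-assessment item: the learner answers
# and receives immediate explanatory feedback rather than a grade.
QUESTIONS = [
    {
        "prompt": "Which nerve provides the motor supply to the diaphragm?",
        "options": {"a": "vagus", "b": "phrenic", "c": "intercostal"},
        "answer": "b",
        "feedback": "The phrenic nerve (C3-C5) is the diaphragm's motor supply.",
    },
]

def run_quiz(questions):
    for q in questions:
        print(q["prompt"])
        for key, text in q["options"].items():
            print(f"  {key}) {text}")
        choice = input("Your answer: ").strip().lower()
        prefix = "Correct." if choice == q["answer"] else "Not quite."
        print(prefix, q["feedback"])

if __name__ == "__main__":
    run_quiz(QUESTIONS)
```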


Evaluation of local strategies
• against goals and expectations
• understanding learning processes
– evaluation vs research
• comparative studies (sketch below)
– evidence-based education, cohort studies
– uncontrolled variables, ethical dilemmas
• quality improvement
– structures in place for revision
• consider quality, costs, accessibility
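A comparative cohort study often reduces to comparing outcome measures between groups. The sketch below, with invented scores, shows the bare statistical core of such a comparison using scipy; as the slide notes, uncontrolled variables mean a significant result still falls short of proof:

```python
# Illustrative cohort comparison: exam scores for a cohort using the new
# IT package vs. a previous, conventionally taught cohort. All scores
# here are invented for illustration.
from scipy import stats

it_cohort = [72, 68, 81, 77, 74, 69, 83, 76]
conventional = [70, 65, 74, 71, 68, 66, 75, 72]

t_stat, p_value = stats.ttest_ind(it_cohort, conventional)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
# Even a small p-value cannot rule out confounds: different years,
# different students, different assessors (the uncontrolled variables
# and ethical limits on randomisation noted above).
```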

Evaluation of imported “solutions”
• to compare
• to adopt “as is”
• to adapt or modify
• consider quality, accessibility, costs

Goals and expectations - learning package:
• learn a new concept
• apply knowledge already learned
• offer access to information, databases
• solve problems
• rehearse a skill
• provide feedback on learning
• access/simulate/replace an experiment

Match: IT and educational aims
• encourages active learning?
• stimulates problem-solving?
• triggers “what if” speculation?
• stimulates student-student discussion?
• supports further exploration?
• offers quizzes and/or feedback?

Is the content appropriate?
• material relevant? important?
• specific skills essential?
• level of knowledge/skill right?
• enhances useful generic skills?
• well-matched to assessment?
• outcomes consistent with program goals?

Design considerations
• well constructed and structured?
• are screens clear/acceptable to users?
• easily navigated?
• instructions clear/intuitive/consistent?
• can the user exit easily?
• too slow/fast?
• does it include feedback?

Quality and relevance

• is the issue/task/learning important?
• is the information accurate?
• is the approach up-to-date?
• are examples appropriate/engaging?
• are illustrations clear and relevant?
• if a simulation, how "real" is it?

Students’ needs

• what are their expectations?
• what are their specific learning needs?
• what are their generic learning needs?
• is the IT interactive and time-effective?
• is the IT consistent with assessments?
• does it offer helpful feedback?

Student and staff reports

• Student views:
– exciting, interesting? useful? boring, useless?
– easy to use? impenetrable?
– what aspects are valued for learning?
• Objective measures:
– does it stimulate discussion?
– do they seek it out frequently? (sketch below)
– can improved learning be measured?
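One objective measure, how often students seek the package out, can be read straight from access logs. The log file name and its one-session-per-line "student_id,timestamp" format below are assumptions for illustration:

```python
# Sketch: count sessions per student from a hypothetical access log in
# which each line records one session as "student_id,timestamp".
from collections import Counter

def sessions_per_student(log_path="access_log.csv"):
    counts = Counter()
    with open(log_path) as log:
        for line in log:
            student_id, _timestamp = line.strip().split(",", 1)
            counts[student_id] += 1
    return counts

counts = sessions_per_student()
if counts:
    avg = sum(counts.values()) / len(counts)
    print(f"{len(counts)} students, {avg:.1f} sessions each on average")
```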

Is the program effective and cost-effective?

• is the students’ learning significantly increased?
• do they enjoy the new learning?
• do they rate it as high quality?
• are they motivated to learn more?
• are there cheaper but effective alternatives? (sketch below)
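Weighing cheaper alternatives comes down to simple arithmetic: cost per student per unit of measured learning gain. All figures below are invented for illustration:

```python
# Back-of-envelope cost-effectiveness: cost per student per point of
# measured learning gain, comparing the IT package with a hypothetical
# cheaper alternative. All numbers are invented.
def cost_per_gain(total_cost, n_students, mean_gain):
    """Cost per student per point of learning gain."""
    return total_cost / (n_students * mean_gain)

it_package = cost_per_gain(total_cost=40_000, n_students=200, mean_gain=5.0)
tutorials = cost_per_gain(total_cost=15_000, n_students=200, mean_gain=4.0)

print(f"IT package: ${it_package:.2f} per student per point gained")
print(f"Tutorials:  ${tutorials:.2f} per student per point gained")
# A cheaper alternative can win on cost per unit of gain even when its
# absolute gain is smaller.
```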

Using computers for evaluation of programs and units

• feedback can be automated, interactive (sketch below)
• feedback can be frequent (but avoid overload)
• response times are shortened
• BUT appropriate mechanisms are needed for implementation
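As a sketch of automated, frequent feedback, unit-by-unit ratings can be summarised the moment they arrive, shortening response times. The units and ratings below are invented, and no real survey system is assumed:

```python
# Sketch: summarise short 1-5 ratings collected after each unit of study,
# flagging units that may need revision. Data here are invented.
from statistics import mean

responses = {
    "Unit 1: cardiology cases": [4, 5, 3, 4, 4],
    "Unit 2: renal physiology": [2, 3, 2, 3, 3],
}

for unit, ratings in responses.items():
    avg = mean(ratings)
    flag = "  <-- review" if avg < 3.0 else ""
    print(f"{unit}: mean {avg:.1f} (n={len(ratings)}){flag}")
# Automation makes frequent collection cheap, but the slide's caveat
# stands: mechanisms must exist for someone to act on the results.
```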

Teaching students skills of evaluation

• enhances collegiality
• stimulates evidence-based practice
• engaging students can give them insights into program expectations
• encouraging participation is important, BUT they must see results!
• “learning for life”: information literacy

Effective computer-based learning:
• meets a clear educational need
• is consistent with program goals
• is cost-effective and timely
• avoids errors, misconceptions
• is set at an appropriate level
• interests and motivates students
• engages students actively in learning
• is well-designed, easy to use
• encourages collaboration

Good evaluation

• is essential to meeting educational need
• must justify costs (time, money)
• is difficult, expensive: many variables
• depends on explicit program goals
• is time-consuming and may not yield conclusive results
