
Information evening

National e-Learning Laboratory

Introduction

• Agenda
  – Why we want to study usability at NELL
  – Introduction to usability research
  – Demonstration of systems
  – Questions and answers

What is usability?

Usability is the "effectiveness, efficiency and satisfaction with which a specified set of users can achieve a specified set of tasks in a particular environment." (ISO)

Why study usability?

• From a user's perspective
  – The interface is the system
• From a business perspective
  – Critical processes are located at the human-computer interface
• From a design perspective
  – Early insights save €€€ over late fixes

Some Usability Criteria

• Learning: How quickly can a user learn to use a new system in order to perform tasks? (At the level of organisation, structure, interface, and navigation.)
• Efficiency/Efficacy: What effort is necessary for users to perform tasks?
• Reliability/Robustness: How does the system react to errors provoked by the user, and what consequences do user errors have?
• Recall: How quickly can a user remember how to use a rarely used system?
• Satisfaction/Completion: How satisfactory is the system to use?

Usability Testing Timeline

• Planning: framing the question
• Planning: target group recruitment
• Session recording: n users by x sessions
• Analysis
• Report

Framing the question

• Spend time on getting the right question
• Pick your target group
• Describe the process and behavioural indicators
• Set a time frame
• A precise question will get a precise answer
• Decide on how you want to report results and for whom

What are your users doing?

• How are they using your product?
• What is their experience?
• How could it be improved?

Server-side Logs

• Web-server logging: navigation paths, performance, task completion.
• Each mouse-click is recorded: time stamp, location, application.
• Each key-press is recorded, too.
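As a rough sketch of what such a log could hold (the field names and CSV layout below are assumptions for illustration, not NELL's actual logging schema), each click or key press can be appended as a time-stamped record:

```python
import csv
import os
import time
from dataclasses import dataclass, asdict

@dataclass
class InteractionEvent:
    """One logged interaction; field names are illustrative only."""
    timestamp: float    # seconds since the Unix epoch
    event_type: str     # "mouse_click" or "key_press"
    location: str       # e.g. screen coordinates or the key pressed
    application: str    # application or page that had focus

def append_events(path, events):
    """Append events to a CSV log, writing a header only for a new file."""
    fields = ["timestamp", "event_type", "location", "application"]
    is_new = not os.path.exists(path) or os.path.getsize(path) == 0
    with open(path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fields)
        if is_new:
            writer.writeheader()
        writer.writerows(asdict(e) for e in events)

append_events("session_log.csv", [
    InteractionEvent(time.time(), "mouse_click", "x=412,y=300", "browser"),
    InteractionEvent(time.time(), "key_press", "key=Enter", "browser"),
])
```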

Example of Screen Recording

Screen behaviour: dynamic web-sites & applications, mouse clicks, keyboard use.


Example of User Video from different Perspectives

The remote eyetracker is hidden under the screen.

It uses infrared light to track the eyes.

The eyetracker records the gaze position and the duration of each fixation.

User behaviour: gesture and posture, facial expression, off-screen activities, gaze position.
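The remote eyetracker contributes a stream of fixations, each with a gaze position and a duration. A minimal, hypothetical sketch of how such fixation records might look once exported for analysis (field names are assumptions, not the eyetracker's export format):

```python
from dataclasses import dataclass

@dataclass
class Fixation:
    """One fixation as exported from an eyetracker; fields are illustrative."""
    x: float            # horizontal gaze position in screen pixels
    y: float            # vertical gaze position in screen pixels
    duration_ms: float  # how long the gaze rested at this position

# A short, invented recording: three fixations while scanning a page.
fixations = [
    Fixation(x=220, y=140, duration_ms=310),
    Fixation(x=225, y=210, duration_ms=480),
    Fixation(x=610, y=215, duration_ms=150),
]
print(sum(f.duration_ms for f in fixations), "ms total fixation time")
```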


User feedback:
• quantitative feedback, e.g., usability questionnaire
• qualitative feedback, e.g., open-ended questions or interview
• qualitative feedback, e.g., narrative over screen-recording

Data sources:
• Server-side Logs
• Screen Behaviour
• User Behaviour and Reaction
• User Feedback

What is NELL?

• Equipment to observe and analyse learner behaviour
  – 4 workstations
  – 4 observation desks

• Analysis software

• Participant panel


Observation Equipment

• Desk camera

• Dome camera

• Screen recorder

• Keyboard log

• Desk and ceiling microphones

• Remote Eye-tracker

Test and Observation Rooms

[Figure: user workstations 1 & 2, user workstations 3 & 4, control desk]

[Figures: observation desk, screen-recorder machines, switchboard]

Observation

• Dome camera control

• Switchboard

• Video and screen-recorder machines

• Data aggregation

• Analysis Software

Analysis Software

• Qualitative and quantitative analysis of sequential data (Observer XT)

• Emotional expression recognition (FaceReader)

• Eye-tracking analysis (BeGaze)

Analysis

• Qualitative & quantitative Research


• Deep Analysis of small sample

• Formative Evaluation during development lifecycle

• Analysis supported by software

Eyetracking

We developed a Peer Finder for on-line learning.

• Design A is straightforward: a list of peers.
• In design B, users can indicate their preferences.
• In design C, users can search for suitable peers.

Which works best: A, B, or C? Let's explore how learners are using the different designs.


Areas of Interest

We can define Areas of Interest to track what learners look at, e.g., which information about a user is considered. For example, in this area learners can see whether a peer is available or not.

In fact, they spent most of the time reading the names rather than assessing their peers' knowledge or availability.
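To make the dwell-time idea concrete, here is a minimal sketch of how fixations could be assigned to rectangular Areas of Interest and summed per AOI. In NELL this analysis is done with BeGaze; the code, AOI coordinates and fixations below are purely illustrative:

```python
from dataclasses import dataclass

@dataclass
class Fixation:
    x: float            # gaze position in screen pixels
    y: float
    duration_ms: float  # fixation duration

@dataclass
class AreaOfInterest:
    name: str
    left: float
    top: float
    right: float
    bottom: float

    def contains(self, fix: Fixation) -> bool:
        return self.left <= fix.x <= self.right and self.top <= fix.y <= self.bottom

def dwell_time_per_aoi(fixations, aois):
    """Sum fixation durations per AOI; fixations outside every AOI are ignored."""
    totals = {aoi.name: 0.0 for aoi in aois}
    for fix in fixations:
        for aoi in aois:
            if aoi.contains(fix):
                totals[aoi.name] += fix.duration_ms
                break  # count each fixation towards at most one AOI
    return totals

# Invented AOIs loosely matching the Peer Finder layout: names, availability, knowledge.
aois = [
    AreaOfInterest("names", 0, 100, 300, 600),
    AreaOfInterest("available", 300, 100, 450, 600),
    AreaOfInterest("knowledge", 450, 100, 700, 600),
]
fixations = [Fixation(120, 200, 310), Fixation(140, 320, 480), Fixation(480, 210, 150)]
print(dwell_time_per_aoi(fixations, aois))  # {'names': 790.0, 'available': 0.0, 'knowledge': 150.0}
```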

Usability Questionnaire Results

[Chart: SUS score (0-100) for Designs A, B and C; error bars show 95% CI]
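The score plotted above is the System Usability Scale (SUS): ten items answered on a 1-to-5 scale, where odd-numbered items contribute the response minus 1, even-numbered items contribute 5 minus the response, and the sum is multiplied by 2.5 to give a 0-100 score. A minimal sketch of that scoring plus a normal-approximation 95% confidence interval per design (the responses below are invented, not the study's data):

```python
import statistics

def sus_score(responses):
    """Score one 10-item SUS questionnaire (answers 1-5) onto the 0-100 scale."""
    assert len(responses) == 10
    total = 0
    for i, r in enumerate(responses, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)  # odd vs. even numbered items
    return total * 2.5

def mean_and_95ci(scores):
    """Mean and half-width of an approximate 95% CI (normal approximation)."""
    mean = statistics.mean(scores)
    sem = statistics.stdev(scores) / len(scores) ** 0.5
    return mean, 1.96 * sem

# Invented per-participant responses for the three designs (not the study's data).
responses_by_design = {
    "A": [[4, 2, 4, 1, 5, 2, 4, 2, 4, 2], [5, 1, 4, 2, 4, 2, 5, 1, 4, 2]],
    "B": [[4, 2, 5, 1, 4, 1, 4, 2, 5, 1], [4, 1, 4, 2, 5, 2, 4, 1, 4, 2]],
    "C": [[5, 1, 5, 1, 5, 1, 4, 2, 5, 1], [4, 2, 4, 1, 4, 2, 5, 1, 5, 2]],
}
for design, responses in responses_by_design.items():
    scores = [sus_score(r) for r in responses]
    mean, ci = mean_and_95ci(scores)
    print(f"Design {design}: SUS = {mean:.1f} ± {ci:.1f} (95% CI)")
```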

While everybody said all three designs are easy to use ... the eyetracker data revealed that ...

[Chart: average dwell time (ms) per Area of Interest (users, available, knowledge, header, criteria, introduction) for designs A, B and C]

… many users ignored the instructions and the peers’ knowledge completely.

Further Information

Abi Reynolds
Stephan Weibelzahl
Leo Casey

Centre for Research and Innovation in Learning and Teaching
www.ncirl.ie

nell@ncirl.ie

+353 1 4498600
