
Page 1: 15 quality assurance (nov 2014)

Data quality assurance

Richard Baker

Professor of Clinical Gait Analysis

Blog: wwRichard.net


Page 2: 15 quality assurance (nov 2014)


Gait analysis is based on measurement … if we can’t make good measurements there is no point in us being here.

Page 3: 15 quality assurance (nov 2014)


14 chapters on how to make measurements.

1 chapter on what to do with them.

Page 4: 15 quality assurance (nov 2014)


Measuring walking

• Both a science and an art

We need to:
• understand the science
• practice the art

We need training in both, and there is very little available (www.CMAster.eu).

Page 5: 15 quality assurance (nov 2014)

Quality assurance

• Staff training and education
• Vigilance for errors in data

Before, during and after gait analysis

Page 6: 15 quality assurance (nov 2014)


Staff training

Before the analysis

Page 7: 15 quality assurance (nov 2014)


Normative datasets

For too long we have used normative datasets as an excuse for doing things differently.

Normative data should be compared between centres to show we are doing the same things

Page 8: 15 quality assurance (nov 2014)


Normative datasets

Differences in average traces suggest systematic differences in how markers are applied

Differences in standard deviations suggest one lab has more repeatable practices than the other.
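As a rough sketch of how such a between-centre comparison might be run (my own illustration, not from the slides; the array shapes, function names and example values are assumptions), mean and standard-deviation curves can be computed for each centre and inspected for persistent offsets:

```python
import numpy as np

def summarise(traces):
    """traces: (n_subjects, 101) array, one kinematic variable over 0-100% gait cycle.
    Returns the mean curve and the between-subject SD curve."""
    traces = np.asarray(traces, dtype=float)
    return traces.mean(axis=0), traces.std(axis=0, ddof=1)

def compare_centres(traces_a, traces_b):
    """Compare normative data for one kinematic variable from two centres."""
    mean_a, sd_a = summarise(traces_a)
    mean_b, sd_b = summarise(traces_b)
    # A persistent offset between the mean curves points to systematic
    # differences in marker placement or protocol.
    max_mean_offset = np.max(np.abs(mean_a - mean_b))
    # A persistent difference in the SD curves suggests one centre has
    # more repeatable practices than the other.
    max_sd_ratio = np.max(np.maximum(sd_a, sd_b) / np.minimum(sd_a, sd_b))
    return max_mean_offset, max_sd_ratio

# Illustrative random data standing in for real normative pelvic tilt traces (deg).
rng = np.random.default_rng(0)
centre_a = rng.normal(10.0, 4.0, size=(30, 101))
centre_b = rng.normal(13.0, 6.0, size=(25, 101))
offset, ratio = compare_centres(centre_a, centre_b)
print(f"max mean offset: {offset:.1f} deg, max SD ratio: {ratio:.2f}")
```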

Page 9: 15 quality assurance (nov 2014)


Repeatability studies

Measurement science can be quite simple.

All we need to know is the standard error of measurement (SEM: the standard deviation of repeat measurements made on the same subject).

Two measurements need to differ by 3×SEM for there to be evidence of a difference.
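A minimal sketch of that calculation, assuming repeat measurements of a single variable on one subject (the function names and example values below are my own, not from the slides; the 3×SEM factor is the slide's rule of thumb):

```python
import numpy as np

def standard_error_of_measurement(repeats):
    """SEM estimated as the standard deviation of repeat measurements
    made on the same subject."""
    return float(np.std(np.asarray(repeats, dtype=float), ddof=1))

def is_evidence_of_difference(x1, x2, sem, factor=3.0):
    """Rule of thumb from the slide: two measurements differing by less
    than ~3 x SEM could simply reflect measurement variability."""
    return abs(x1 - x2) > factor * sem

# Illustrative repeat measurements of peak knee flexion (deg) on one subject.
repeats = [58.2, 60.1, 59.0, 61.3, 59.6]
sem = standard_error_of_measurement(repeats)
print(f"SEM = {sem:.1f} deg")
print("Evidence of difference?", is_evidence_of_difference(59.0, 66.5, sem))
```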

Page 10: 15 quality assurance (nov 2014)


Other repeatability measures

• Never use a repeatability measure you don’t understand.

• Never use a repeatability measure that is not expressed in the original units of measurement.

• Never trust someone else’s definition of “acceptable repeatability” (particularly a psychologist’s)

• “For many clinical measurements ICC should exceed 0.9 to ensure reasonable validity” (Portney and Watkins, 2009)

Page 11: 15 quality assurance (nov 2014)


Repeatability studies

McGinley, J. L., Baker, R., Wolfe, R., & Morris, M. E. (2009). The reliability of three-dimensional kinematic gait measurements: a systematic review. Gait and Posture, 29(3), 360-369.

SEM < 2°: “acceptable” – measurement variability need not be considered explicitly in interpretation.

2° < SEM < 5°: “reasonable” – measurement variability needs to be considered in interpretation.

SEM > 5°: “concerning” – measurement variability may mislead interpretation.
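A small helper along these lines (my own wording of the bands quoted above from McGinley et al.; the example SEM values are purely illustrative) might be used when reviewing a repeatability study:

```python
def classify_sem(sem_deg):
    """Band an SEM value (in degrees) using the thresholds quoted above
    from McGinley et al. (2009)."""
    if sem_deg < 2.0:
        return "acceptable - variability need not be considered explicitly"
    if sem_deg <= 5.0:
        return "reasonable - consider measurement variability in interpretation"
    return "concerning - variability may mislead interpretation"

# Illustrative values only, not from any real study.
for name, sem in [("knee flexion", 1.4), ("pelvic tilt", 2.8), ("hip rotation", 6.1)]:
    print(f"{name}: SEM {sem} deg -> {classify_sem(sem)}")
```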

Page 12: 15 quality assurance (nov 2014)

Physical examination

McDowell et al., Gait & Posture, 2000; Fosang et al., Dev Med Child Neurol, 2003

Page 13: 15 quality assurance (nov 2014)


Repeatability studies

Gait analysis measures can be more repeatable than physical exam measures …

… but may not be in your laboratory

Page 14: 15 quality assurance (nov 2014)


Repeatability studies

Require one or more analysts to make repeat measurements on the same person.

If testing the repeatability of a single analyst, space the measurements out in time.

If comparing multiple analysts, have them make their measurements close together in time.

Page 15: 15 quality assurance (nov 2014)


Informal repeatability study

Measurements from three therapists (different colours) each measuring the same person on two different days

Page 16: 15 quality assurance (nov 2014)


Formal repeatability study

Page 17: 15 quality assurance (nov 2014)

Formal repeatability study

• Considerable undertaking
• Extremely difficult on children with cerebral palsy
• Considerable uncertainty in SEM estimates (see the sketch below)
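To illustrate that last point (my own sketch, not from the slides): if the SEM is treated as a sample standard deviation, a chi-squared interval shows how wide its confidence limits are when only a handful of repeat sessions are available.

```python
import numpy as np
from scipy.stats import chi2

def sem_confidence_interval(sem, n_repeats, confidence=0.95):
    """Approximate CI for an SEM estimated from n_repeats measurements,
    treating it as a sample SD with n_repeats - 1 degrees of freedom."""
    df = n_repeats - 1
    alpha = 1.0 - confidence
    lower = sem * np.sqrt(df / chi2.ppf(1.0 - alpha / 2.0, df))
    upper = sem * np.sqrt(df / chi2.ppf(alpha / 2.0, df))
    return lower, upper

# With only 6 repeat sessions, a nominal SEM of 3 deg is known only roughly:
# the 95% interval comes out at roughly 1.9 to 7.4 deg.
print(sem_confidence_interval(sem=3.0, n_repeats=6))
```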

Page 18: 15 quality assurance (nov 2014)


Quality assurance

• Protocols written by the team making the measurements
  – Process more important than result

• Regular review

• Repeatability studies

• Critical self-appraisal
  – by individuals
  – within teams
  – within the community (peer review)

• Open and honest culture

Page 19: 15 quality assurance (nov 2014)


Vigilance for errors

During and after the analysis

Page 20: 15 quality assurance (nov 2014)


Vigilance for errors

• Check data before the patient leaves
• Requires processed data to be available before then (preferably before markers are removed)

• Keep assessments short and focussed so that both patient and analyst are prepared to repeat tests if necessary.

Page 21: 15 quality assurance (nov 2014)

Is the data likely to be representative of the patient?

• General health
• Pain
• Fatigue
• Behaviour

• No way of telling this from the data

Page 22: 15 quality assurance (nov 2014)

Agreement with data from other sources – clinical exam

[Figure: barefoot gait kinematics – pelvic tilt, obliquity and rotation; hip flexion, adduction and rotation; knee flexion, adduction and rotation; ankle dorsiflexion and rotation; foot progression (deg)]

Bilateral hip flexion contracture

Page 23: 15 quality assurance (nov 2014)

Agreement with data from other sources – video

[Figure: barefoot gait kinematics panels, as on the previous slide, shown alongside the video]

Gait data may help explain the video data but it should not contradict it.

Page 24: 15 quality assurance (nov 2014)

Agreement with data from other sources – video

Page 25: 15 quality assurance (nov 2014)

Smooth data

Be very suspicious of jerky data

If one kinetic graph is wrong, you should be highly suspicious of all of them, even if the artefact is less obvious.

Page 26: 15 quality assurance (nov 2014)

Smooth data

Gait data is almost always smooth (it has been filtered to be so)
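For context, the smoothness typically comes from a zero-lag low-pass filter applied to the marker trajectories or joint angles. The sketch below is my own illustration of that kind of filtering; the 100 Hz sample rate, 6 Hz cut-off and 4th-order Butterworth are assumed values, not taken from the slides.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def lowpass(signal, fs_hz=100.0, cutoff_hz=6.0, order=4):
    """Zero-lag (forward-backward) low-pass Butterworth filter of the kind
    commonly applied to gait kinematics. Parameters are illustrative."""
    b, a = butter(order, cutoff_hz / (0.5 * fs_hz), btype="low")
    return filtfilt(b, a, signal)

# Illustrative noisy knee-flexion-like signal sampled over one second.
t = np.linspace(0.0, 1.0, 100)
raw = 30.0 + 25.0 * np.sin(2.0 * np.pi * t) + np.random.default_rng(1).normal(0.0, 2.0, t.size)
smooth = lowpass(raw)
```

If data still look jerky after processing of this kind, something upstream (marker dropout, mislabelled trajectories, force-plate artefact) is worth checking.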

Page 27: 15 quality assurance (nov 2014)

Consistent data

• I can’t see all the detail
• Should you be interpreting detail you can’t see?

Page 28: 15 quality assurance (nov 2014)

Consistent data

• Be particularly careful if traces fall into groups.

• If this occurs in the kinetics but not in the kinematics, then check the force plates

Picture from J Stebbins with permission

Page 29: 15 quality assurance (nov 2014)

Swing phase ankle moments

Page 30: 15 quality assurance (nov 2014)

Learn the consequences of marker placement error

Page 31: 15 quality assurance (nov 2014)

Hip rotation offsets

5° offsets of the KAD (knee alignment device)

Page 32: 15 quality assurance (nov 2014)

Consequences of marker placement error

• Play!
• Place markers erroneously on a colleague and predict the changes in the gait graphs.
• If you can’t, then you shouldn’t be placing markers on patients at all.

Page 33: 15 quality assurance (nov 2014)

Professional competencies

• Excellent data quality can only be provided by excellent gait analysts

• Requires a combination of biomechanical and clinical competencies

• In many centres these are provided by different people

Page 34: 15 quality assurance (nov 2014)

Professional competencies

• Gait analysis requires:
  – Patient (and parent) management skills
  – Physical examination skills
  – Biomechanical measurement skills
  – Biomechanical analysis skills

• Recruit staff with some of these skills
• Train them in the others
• Longer-term training
• Assessed competencies

Page 35: 15 quality assurance (nov 2014)

Thanks for listening

Richard Baker

Professor of Clinical Gait Analysis

Blog: wwRichard.net
