
Survey on Activity Recognition from Acceleration Data

2

Outline

• Research goal of the project

• Contexts of mobile devices
• Activity recognition from acceleration data [1,2]
  – Overview
  – Related works
  – Data collection
  – Feature extraction
  – Experiment results
  – Analysis
• Summary
• Future works

3

Research Goal

• Context-aware computing and interaction for mobile devices
  – Improving the usability of mobile devices
  – Providing adequate services to users based on context
  – By recognizing the context/situation of users and devices autonomously
  – Applications
    • Reducing interruptions from mobile devices [5]
    • Health-care systems [6]
    • …

4

Context

• Definition of context
  – An instantaneous, detectable, and relevant property of the environment, the system, or the users
  – Examples: location, time, light intensity, noise level, power, user schedule, …

5

Contexts of Mobile Phones[7]

– Where the user is
  • indoors
  • outdoors
  • in a meeting
  • at the desk
  • …
– Where the device is in relation to the user
  • in hand
  • in a pocket
  • on the table
  • …
– What the user is doing (with the device)
  • walking
  • talking
  • sitting
  • running
  • …

Physical activity recognition

6

Activity Recognition from Acceleration Data[1,2]

• Recognizing physical activities
  – 20 activities, including common household tasks
  – Using data from 5 biaxial accelerometers
• Under semi-naturalistic conditions
  – No wires
  – Sensors weighed less than 120 g
  – No observation by researchers
  – No restriction on movement and no fear of damaging electronics
  – Minimizing subject awareness of data collection
  – Subjects annotated start and stop times themselves

7

Activity Labels

Walking            | Walking carrying items
Sitting & relaxing | Working on computers
Standing still     | Eating or drinking
Watching TV        | Reading
Running            | Bicycling
Stretching         | Strength-training
Scrubbing          | Vacuuming
Folding laundry    | Lying down & relaxing
Brushing teeth     | Climbing stairs
Riding elevator    | Riding escalator

8

Related Works

Ref. | Recognition Rate | Activities Recognized                           | No. of Subj. | Data Type | No. of Sensors | Sensor Placement
[8]  | 92.85%–95.91%    | Ambulation                                      | 8            | L         | 2              | 2 thigh
[9]  | 83%–90%          | Ambulation, posture                             | 6            | L         | 6              | 3 left hip, 3 right hip
[10] | 95.8%            | Ambulation, posture, typing, talking, bicycling | 24           | L         | 4              | chest, thigh, wrist, forearm
[11] | 89.30%           | Ambulation, posture                             | 5            | L         | 2              | chest, thigh
[12] | 96.67%           | 3 Kung Fu arm movements                         | 1            | L         | 2              | 2 wrist
[4]  | 42%–96%          | Ambulation, posture, bicycling                  | 1            | L         | 2              | 2 lower back
[13] | 85%–90%          | Ambulation, posture                             | 10           | L         | 2              | 2 knee
[10] | 66.7%            | Ambulation, posture, typing, talking, bicycling | 24           | N         | 4              | chest, thigh, wrist, forearm
[14] | 86%–93%          | Ambulation, posture, play                       | 1            | N         | 3              | 2 wrist, 1 thigh

Data type: L = laboratory setting, N = naturalistic setting

9

Data Collection

• 5 biaxial accelerometers
  – Placement: left thigh, right ankle, left arm, right wrist, right hip
  – Sampling frequency: 76.25 Hz

10

Feature Extraction

• 512-sample windows (6.7 seconds)
• Sliding windows with 50% overlap
• Features
  – DC feature
    • Mean over the window
  – Energy feature
    • Sum of squared DFT component magnitudes
  – Frequency-domain entropy
    • Normalized information entropy of the DFT component magnitudes
    • Supports discrimination of activities with similar energy values
  – Correlation
    • Between two axes and between all pairwise combinations of axes on different boards
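The four features above can be sketched in plain Python. This is a minimal illustration, not the authors' implementation: a naive O(n²) DFT stands in for the FFT they would have used, and all function names are invented for the sketch.

```python
import cmath
import math

def dft_magnitudes(window):
    # Magnitudes of DFT components 1..N/2 (DC component excluded).
    n = len(window)
    return [abs(sum(x * cmath.exp(-2j * math.pi * k * i / n)
                    for i, x in enumerate(window)))
            for k in range(1, n // 2 + 1)]

def window_features(window):
    n = len(window)
    mean = sum(window) / n                         # DC feature
    mags = dft_magnitudes(window)
    energy = sum(m * m for m in mags) / n          # energy feature
    total = sum(mags)
    p = [m / total for m in mags if m > 0]
    entropy = -sum(pi * math.log(pi) for pi in p)  # frequency-domain entropy
    return mean, energy, entropy

def sliding_windows(signal, size=512, overlap=0.5):
    # 512-sample windows with 50% overlap, as on this slide.
    step = int(size * (1 - overlap))
    for start in range(0, len(signal) - size + 1, step):
        yield signal[start:start + size]

def correlation(x, y):
    # Pearson correlation between two acceleration axes.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = math.sqrt(sum((a - mx) ** 2 for a in x) *
                    sum((b - my) ** 2 for b in y))
    return num / den if den else 0.0
```

Each feature is computed per window and per axis; the feature vector for a window concatenates these values across all sensor axes.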

11

Example of Frequency-domain Entropy

• Example: bicycling vs. running
  – Similar amounts of energy in the hip acceleration
  – Bicycling
    • Uniform circular movement of the legs
    • The DFT of vertical hip acceleration shows a single dominant frequency component at 1 Hz
    • Low frequency-domain entropy
  – Running
    • Complex hip acceleration with many major DFT frequency components between 0.5 Hz and 2 Hz
    • Higher frequency-domain entropy
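This contrast can be reproduced with synthetic signals: a single-tone "bicycling-like" signal gives near-zero spectral entropy, while a multi-component "running-like" signal gives higher entropy. The sampling rate and component frequencies below are chosen for illustration only, not taken from the recordings.

```python
import cmath
import math

def spectral_entropy(x):
    # Information entropy of the normalized DFT magnitude spectrum (DC excluded).
    n = len(x)
    mags = [abs(sum(v * cmath.exp(-2j * math.pi * k * i / n)
                    for i, v in enumerate(x)))
            for k in range(1, n // 2 + 1)]
    total = sum(mags)
    return -sum((m / total) * math.log(m / total) for m in mags if m > 0)

fs, n = 64, 512                      # 64 Hz, 8-second window (illustrative)
t = [i / fs for i in range(n)]
# Bicycling-like: one dominant component at 1 Hz.
cycling = [math.sin(2 * math.pi * 1.0 * ti) for ti in t]
# Running-like: several major components between 0.5 Hz and 2 Hz.
running = [sum(math.sin(2 * math.pi * f * ti) for f in (0.5, 1.0, 1.5, 2.0))
           for ti in t]
print(spectral_entropy(cycling) < spectral_entropy(running))  # prints True
```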

12

Experiment Results

• Naïve Bayes
  – Unable to adequately model the rules that discriminate activities
    • Due to the assumptions of conditional independence and Gaussian feature distributions
    • Insufficient data

Classifier       | User-specific Training | Leave-one-subject-out Training
Nearest Neighbor | 69.21                  | 82.70
Decision Tree    | 71.58                  | 84.26
Naïve Bayes      | 34.94                  | 52.35

(recognition accuracy, %)
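Leave-one-subject-out evaluation holds out all windows from one subject per fold and trains on the remaining subjects. A minimal splitter (names and data layout are illustrative):

```python
def leave_one_subject_out(samples):
    # samples: list of (subject_id, feature_vector, label) tuples.
    # Yields one (held_out_subject, train, test) split per subject.
    subjects = sorted({sid for sid, _, _ in samples})
    for held_out in subjects:
        train = [s for s in samples if s[0] != held_out]
        test = [s for s in samples if s[0] == held_out]
        yield held_out, train, test

# Toy usage: three subjects with two windows each.
data = [(sid, [0.0], "walking") for sid in ("s1", "s2", "s3") for _ in range(2)]
for held_out, train, test in leave_one_subject_out(data):
    assert all(s[0] != held_out for s in train)
    assert all(s[0] == held_out for s in test)
```

Because the held-out subject contributes no training windows, this estimates how well the classifier generalizes to unseen users.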

13

Experiment Results

• Decision tree
  – Captures conjunctions in feature values well
  – Examples
    • Sitting: ~1 G downward acceleration, low energy at hip and arm
    • Bicycling: moderate energy and low entropy at hip, low energy at arm
    • Window scrubbing vs. brushing teeth: both show high energy at the arm, but window scrubbing shows more energy at the hip
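Such conjunctions map directly onto hand-written decision rules. The sketch below is a toy illustration of that idea only: the thresholds and units are invented, not the tree the paper learned.

```python
def classify(hip_mean_g, hip_energy, hip_entropy, arm_energy):
    # Illustrative conjunction rules; all thresholds are made up for the sketch.
    if abs(hip_mean_g - 1.0) < 0.1 and hip_energy < 0.2 and arm_energy < 0.2:
        return "sitting"      # ~1 G downward, low energy at hip and arm
    if hip_energy > 0.5 and hip_entropy < 0.5 and arm_energy < 0.2:
        return "bicycling"    # moderate hip energy, low entropy, quiet arm
    return "other"

print(classify(1.0, 0.05, 0.3, 0.05))  # prints sitting
print(classify(1.0, 0.8, 0.2, 0.1))    # prints bicycling
```

A learned tree discovers comparable threshold conjunctions automatically from the training windows.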

14

Experiment Results

Activity           | Accuracy (%) | Activity               | Accuracy (%)
Walking            | 89.71        | Walking carrying items | 82.10
Sitting & relaxing | 94.78        | Working on computers   | 97.49
Standing still     | 95.67        | Eating or drinking     | 88.67
Watching TV        | 77.29        | Reading                | 91.79
Running            | 87.68        | Bicycling              | 96.29
Stretching         | 41.42        | Strength-training      | 82.51
Scrubbing          | 81.09        | Vacuuming              | 96.41
Folding laundry    | 95.14        | Lying down & relaxing  | 94.96
Brushing teeth     | 85.27        | Climbing stairs        | 85.61
Riding elevator    | 43.58        | Riding escalator       | 70.56

Using the decision tree classifier and leave-one-subject-out validation.

15

Experiment Results

• Riding elevator
  – Often misclassified as "riding escalator"
  – Both involve the subject standing still with similar vertical acceleration
• Watching TV
  – Often misclassified as "sitting & relaxing" and "reading"
  – All three activities involve sitting
• Stretching
  – Often misclassified as "folding laundry"
  – Both involve moderate arm movement

16

Experiment Results

• Comparing leave-one-subject-out and user-specific training
  – With equal amounts of training data
  – Result: user-specific training yields more accurate results

Classifier    | User-specific Training | Leave-one-subject-out Training
Decision Tree | 77.31%                 | 72.99%

17

Discrimination Power of Each Accelerometer

• The thigh is the most powerful single location; the hip is the second best
• One accelerometer attached to a cell phone may enable recognition of certain activities
• Two accelerometers may enable effective recognition

Accelerometer(s) Left In | Difference in Recognition Accuracy
Hip                      | -34.12
Wrist                    | -51.99
Arm                      | -63.65
Ankle                    | -37.08
Thigh                    | -29.47
Thigh and Wrist          | -3.27
Hip and Wrist            | -4.78

18

Analysis

• Postures such as sitting and standing still
  – Recognized by mean acceleration
• Ambulatory activities and bicycling
  – Recognized by hip acceleration energy
• Bicycling vs. running
  – Show similar levels of hip acceleration mean and energy
  – Distinguished by entropy and by the correlation between arm and hip acceleration

19

Analysis

• Low recognition rates for stretching, scrubbing, and riding an elevator or escalator
  – Higher-level analysis is required
  – e.g., duration, time, and day of activities
• Use of other sensor data may improve activity recognition

20

Summary

• Prior works showed low recognition rates when using naturalistic data
• 84.26% recognition rate for 20 everyday activities
  – Using 5 biaxial accelerometers
  – DFT-based features
  – Decision tree classifier
• The hip is the second best sensor location
  – One accelerometer may recognize some activities that are not associated with the upper part of the body

21

Future Works

• Data collection
  – Building a wireless board with accelerometers (and other sensors)
  – Under naturalistic conditions
• Activity recognition algorithm for mobile phones
  – Using only one accelerometer
  – Considering the location and posture of the mobile phone
    • In hand, in a pocket, in a bag, …
    • Different orientations of the device

22

References

[1] L. Bao, S. S. Intille, "Activity recognition from user-annotated acceleration data", Proc. of Pervasive 2004.
[2] L. Bao, "Physical activity recognition from acceleration data under semi-naturalistic conditions", M.Eng. Thesis, MIT, 2003.
[3] R. W. DeVaul, S. Dunn, "Real-time motion classification for wearable computing applications", Technical report, MIT Media Lab, 2001.
[4] K. V. Laerhoven, O. Cakmakci, "What shall we teach our pants?", In the 4th International Symposium on Wearable Computers, 2000.
[5] J. Ho, S. S. Intille, "Using context-aware computing to reduce the perceived burden of interruptions from mobile devices", CHI 2005.
[6] S. S. Intille, "A new research challenge: persuasive technology to motivate healthy aging", IEEE Transactions on Information Technology in Biomedicine, 2004.
[7] E. Tuulari, "Methods and technologies for experimenting with ubiquitous computing", VTT Publications 560, Espoo, 2005.
[8] S.-W. Lee, K. Mase, "Activity and location recognition using wearable sensors", IEEE Pervasive Computing, 2002.
[9] J. Mantyjarvi, et al., "Recognizing human motion with multiple acceleration sensors", In Proc. of the IEEE International Conf. on Systems, Man, and Cybernetics, 2001.
[10] F. Foerster, et al., "Detection of posture and motion by accelerometry: a validation in ambulatory monitoring", Computers in Human Behavior, 1999.
[11] K. Aminian, et al., "Physical activity monitoring based on accelerometry: validation and comparison with video observation", Medical & Biological Engineering & Computing, 1999.
[12] G. S. Chambers, et al., "Hierarchical recognition of intentional human gestures for sports video annotation", In Proc. of the 16th International Conf. on Pattern Recognition, 2002.
[13] C. Randell, H. Muller, "Context awareness by analysing accelerometer data", The 4th International Symposium on Wearable Computers, 2000.
[14] M. Uiterwaal, et al., "Ambulatory monitoring of physical activity in working situations, a validation study", Journal of Medical Engineering & Technology, 1998.

23

Appendix

Confusion matrix