J.-Y. Yang, J.-S. Wang and Y.-P. Chen,
"Using acceleration measurements for activity recognition: An effective learning algorithm for constructing neural classifiers," Pattern Recognition Letters, vol. 29, no. 16, pp. 2213-2220, 2008.
Spring Semester, 2010
Dynamic Time Warping and Neural Network
Outline
Background
Activity Recognition Strategy
Experiments
Summary
Background
Accelerometers can be used as human motion detectors and monitoring devices
– Biomedical engineering, medical nursing, interactive entertainment, …
– Exercise intensity / distance, sleep cycle, and calorie consumption
Proposed Method Overview
One 3-D accelerometer on the dominant wrist
Neural networks (NNs)
– A pre-classifier followed by a static classifier or a dynamic classifier
Eight domestic activities
– Standing, sitting, walking, running, vacuuming, scrubbing, brushing teeth, and working at a computer
Neural Classifier
Neurons in the Brain
– A neuron receives input from other neurons (generally thousands) through its synapses
– Inputs are approximately summed
– When the input exceeds a threshold, the neuron sends an electrical spike that travels from the body, down the axon, to the next neuron(s)
Neurons in the Brain (cont.)
The amount of signal passing through a neuron depends on:
– The intensity of the signal from feeding neurons
– Their synaptic strengths
– The threshold of the receiving neuron
Hebb rule (plays a key part in learning)
– A synapse that repeatedly triggers the activation of a postsynaptic neuron will grow in strength; others will gradually weaken
– Learning happens by adjusting the magnitudes of the synapses' strengths
Artificial Neurons
[Figure: an artificial neuron. Inputs x1, x2, x3 are multiplied by weights w1, w2, w3 and summed, A = Σ w·x; an activation function f(A) produces the output y. Two activation functions are shown: a step function (output jumps between -1 and +1 at A = 0) and a sigmoid function (smooth transition between -1 and +1).]
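The diagram reduces to one formula: the neuron forms the weighted sum A = Σ wᵢxᵢ and passes it through an activation function f. A minimal sketch (function and variable names are illustrative, not from the slides):

```python
import math

def step(a):
    """Step activation: +1 if the summed input exceeds 0, else -1."""
    return 1.0 if a > 0 else -1.0

def sigmoid(a):
    """Smooth sigmoid activation with outputs in (-1, +1)."""
    return 2.0 / (1.0 + math.exp(-a)) - 1.0

def neuron(x, w, f):
    """Artificial neuron: weighted sum A = sum(w_i * x_i), output y = f(A)."""
    a = sum(wi * xi for wi, xi in zip(w, x))
    return f(a)

y = neuron([1.0, -0.5, 0.2], [0.4, 0.3, -0.9], step)
```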
Neural Classifier (Perceptron)
Structure
Learning
– Weights are changed in proportion to the difference (error) between the target output and the perceptron's output for each example
– Back-propagation algorithm
• Gradient descent; slow convergence and local minima
– Resilient back-propagation (RPROP)
• Ignores the magnitude of the gradient, using only its sign
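The first learning rule can be sketched directly: each weight moves in proportion to the error (target minus output) times its input. This is the classic perceptron rule, not the RPROP variant used in the paper; the data set and learning rate below are illustrative:

```python
def step(a):
    return 1.0 if a > 0 else -1.0

def train_perceptron(samples, n_inputs, lr=0.1, epochs=20):
    """Perceptron learning: weights change in proportion to the
    error (target - output) for each training example."""
    w = [0.0] * n_inputs
    b = 0.0
    for _ in range(epochs):
        for x, target in samples:
            y = step(sum(wi * xi for wi, xi in zip(w, x)) + b)
            err = target - y            # 0 when correct, +-2 when wrong
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

# AND-like linearly separable data with targets in {-1, +1}
data = [([0, 0], -1), ([0, 1], -1), ([1, 0], -1), ([1, 1], 1)]
w, b = train_perceptron(data, 2)
```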
Activity Recognition Strategy
Pre-Classifier
Static/Dynamic Classifier
Pre-Classifier (1/2)
Two components of the acceleration data
– Gravitational acceleration (GA)
– Body acceleration (BA): high-pass filtering removes GA
Segmentation with overlapping windows
– 512 samples per window
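A common way to realize the GA/BA split is to low-pass filter the raw signal to estimate gravity and subtract it, which is equivalent to high-pass filtering. A sketch in pure Python; the smoothing factor `alpha` is an assumption for illustration, not a value from the paper:

```python
def split_acceleration(samples, alpha=0.9):
    """Separate raw accelerometer samples (one axis) into gravitational
    acceleration (GA) and body acceleration (BA). GA is estimated with a
    first-order low-pass filter; subtracting it high-pass filters the
    signal. alpha is an illustrative smoothing factor."""
    ga, ba = [], []
    g = samples[0]                        # seed gravity estimate
    for s in samples:
        g = alpha * g + (1 - alpha) * s   # low-pass: slow gravity component
        ga.append(g)
        ba.append(s - g)                  # high-pass: fast body motion
    return ga, ba
```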
Pre-Classifier (2/2)
SMA (Signal Magnitude Area)
– The sum of acceleration magnitudes over the three axes
AE (Average Energy)
– Average of the energy over the three axes
– Energy: the sum of the squared discrete FFT component magnitudes of the signal in a window
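Both window statistics take a few lines of NumPy. By Parseval's theorem the FFT-based energy equals the sum of squared samples; the 1/N normalization here is an assumption about the exact definition used in the paper:

```python
import numpy as np

def sma(ax, ay, az):
    """Signal Magnitude Area: mean of |x| + |y| + |z| over the window."""
    return np.mean(np.abs(ax) + np.abs(ay) + np.abs(az))

def average_energy(ax, ay, az):
    """Average of the per-axis energies, where energy is the sum of
    squared FFT component magnitudes, normalized by window length."""
    def energy(sig):
        mags = np.abs(np.fft.fft(sig))
        return np.sum(mags ** 2) / len(sig)
    return (energy(ax) + energy(ay) + energy(az)) / 3.0
```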
Feature Extraction
8 attributes × 3 axes = 24 features
– Mean, correlation between axes, energy, interquartile range (IQR), mean absolute deviation, root mean square, standard deviation, variance
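The 24-dimensional feature vector can be sketched as seven per-axis statistics plus three inter-axis correlations (7 × 3 + 3 = 24); the paper's exact grouping into "8 attributes × 3 axes" may differ, so take this layout as illustrative:

```python
import numpy as np

def axis_features(sig):
    """Statistics for one axis of a window: mean, IQR, mean absolute
    deviation, RMS, standard deviation, variance, spectral energy."""
    q75, q25 = np.percentile(sig, [75, 25])
    return {
        "mean": np.mean(sig),
        "iqr": q75 - q25,
        "mad": np.mean(np.abs(sig - np.mean(sig))),
        "rms": np.sqrt(np.mean(sig ** 2)),
        "std": np.std(sig),
        "var": np.var(sig),
        "energy": np.sum(np.abs(np.fft.fft(sig)) ** 2) / len(sig),
    }

def window_features(ax, ay, az):
    """24 features = 7 statistics x 3 axes + 3 inter-axis correlations."""
    feats = []
    for sig in (ax, ay, az):
        feats.extend(axis_features(sig).values())
    for a, b in ((ax, ay), (ay, az), (ax, az)):   # x-y, y-z, x-z
        feats.append(np.corrcoef(a, b)[0, 1])
    return feats
```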
Feature Selection (1/2)
Common principal component analysis (CPCA)
If features are highly correlated, the corresponding vectors are similar; clustering is used to group similar loadings
Feature Selection (2/2)
– Apply the PCA
– Select the first p PCs (cumulative sum > 90%)
– Estimate the CPCs
– Support vector clustering
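The PCA step (keep the first p PCs whose cumulative explained variance exceeds 90%) can be sketched as follows; CPC estimation and support vector clustering are beyond a short example:

```python
import numpy as np

def select_pcs(X, threshold=0.90):
    """Return the number of leading principal components whose
    cumulative explained variance first exceeds the threshold
    (90% in the slides). X has one window per row, one feature
    per column."""
    Xc = X - X.mean(axis=0)                     # center the features
    cov = np.cov(Xc, rowvar=False)
    eigvals = np.linalg.eigvalsh(cov)[::-1]     # descending eigenvalues
    ratios = np.cumsum(eigvals) / np.sum(eigvals)
    return int(np.searchsorted(ratios, threshold) + 1)
```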
Verification
Experiments: Environment (1/2)
MMA7260Q tri-axial accelerometer
– Sensing range: -4.0 g ~ +4.0 g, sampled at 100 Hz
– Mounted on the dominant wrist
Eight activities from seven subjects
– Standing, sitting, walking, running, vacuuming, scrubbing, brushing teeth, and working at a computer
– 2 min per activity
Environment (2/2)
Window size = 512 (with 256-sample overlap)
– 22 windows in one minute, 45 windows in two minutes
Leave-one-subject-out cross-validation
– Training: 1 min per activity = 22 windows × 8 activities × 6 subjects
– Test: 2 min per activity = 45 windows × 8 activities
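The window counts follow from simple arithmetic: at 100 Hz with 512-sample windows overlapping by 256 samples, the step between windows is 256 samples. A sketch reproducing the counts in the slide:

```python
def num_windows(n_samples, window=512, overlap=256):
    """Number of overlapping windows that fit in a recording."""
    step = window - overlap
    return (n_samples - window) // step + 1

fs = 100                                 # 100 Hz sampling rate
assert num_windows(1 * 60 * fs) == 22    # 1 min of data -> 22 windows
assert num_windows(2 * 60 * fs) == 45    # 2 min of data -> 45 windows
```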
FSS Evaluation
Six selected features are used for the static classifier
Recognition Result
NN
– Hidden nodes
• Pre-classifier: 3
• Static classifier: 5
• Dynamic classifier: 7
– Epochs: 500
Computational load of FSS
– Training without FSS = 7.457 s, training with FSS = 8.46 s
Summary
The proposed method yielded 95% accuracy
– Pre-classifier followed by static / dynamic classifiers
Authors' other publication
– Y.-P. Chen, J.-Y. Yang, S.-N. Liou, G.-Y. Lee and J.-S. Wang, "Online classifier construction algorithm for human activity detection using a tri-axial accelerometer," Applied Mathematics and Computation, vol. 205, no. 2, pp. 849-860, 2008.