Pit Pattern Classification with Support Vector Machines and Neural Networks
Christian Kastinger
February 1, 2007
Introduction
Feature Extraction and Selection
Neural Network
Support Vector Machine
Results
Conclusion
Objectives
Investigation of support vector machines and neural networks for the classification of pit patterns (256 x 256 RGB images), with prior feature extraction (co-occurrence histogram) and feature selection (principal component analysis), for a 2-class and a 6-class problem.
Classification Process
Co-occurrence histogram
I Considers dependencies between adjacent pixels.
I Co-occurrence distance (between two samples) can be varied.
I Orientation (vertical, horizontal, ...) is not fixed.
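A minimal sketch of the computation in Python, as an illustration only: it assumes a single-channel image quantized to a few levels, whereas the experiments use co-occurrence histograms of 256 x 256 RGB images, and the function and parameter names are made up for the example:

    import numpy as np

    def cooccurrence_histogram(img, distance=1, offset=(0, 1), levels=32):
        """Count how often two quantized values occur at the given pixel offset."""
        q = (img.astype(np.float64) / 256.0 * levels).astype(int)  # quantize to 'levels' bins
        dy, dx = offset[0] * distance, offset[1] * distance        # offset scaled by the co-occurrence distance
        h, w = q.shape
        hist = np.zeros((levels, levels), dtype=np.int64)
        # non-negative offsets assumed; other orientations work analogously
        for y in range(h - dy):
            for x in range(w - dx):
                hist[q[y, x], q[y + dy, x + dx]] += 1              # pair (current pixel, neighbour)
        return hist

The flattened histogram then serves as the feature vector that is passed on to feature selection.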
Principal Component Analysis (PCA)
I PCA aims to provide a better representation with lower dimension.
I Compaction of information.
I Process (a short sketch follows below):
  I Create a new mean-adjusted data matrix X.
  I Calculate an m × m covariance matrix Σ from the mean-adjusted data X.
  I Compute the n most significant eigenvectors W from the covariance matrix Σ.
  I Perform dimensionality reduction: Y = W^T X.
I n is a tradeoff between "compression" and quality.
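A compact numpy sketch of this process, assuming the samples are stored as columns of X and n is chosen by the caller:

    import numpy as np

    def pca_reduce(X, n):
        """Project m-dimensional samples (columns of X) onto the n most significant principal components."""
        X_adj = X - X.mean(axis=1, keepdims=True)   # mean-adjusted data matrix
        Sigma = np.cov(X_adj)                       # m x m covariance matrix
        eigvals, eigvecs = np.linalg.eigh(Sigma)    # eigenvalues in ascending order
        W = eigvecs[:, ::-1][:, :n]                 # n most significant eigenvectors
        Y = W.T @ X_adj                             # dimensionality reduction: Y = W^T X
        return Y, W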
PCA Example
I Data points: x = (1, 3, 3, 5); y = (1, 3.5, 0.5, 3)
I Covariance matrix Σ = (2.667 1.333; 1.333 2.1667) =⇒ eigenvector matrix W = (0.639 −0.77; −0.77 −0.639)
I Transformed coordinates (rows of W^T X): x′ = (1.409, 4.546, 2.63, 5.767); y′ = (0.131, 0.778, −1.532, −0.885)
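A few lines of numpy reproduce these numbers (the row ordering and signs of the eigenvectors, and therefore of the projected coordinates, depend on the eigensolver's conventions):

    import numpy as np

    X = np.array([[1.0, 3.0, 3.0, 5.0],      # x values, one sample per column
                  [1.0, 3.5, 0.5, 3.0]])     # y values
    Sigma = np.cov(X)                         # [[2.667, 1.333], [1.333, 2.167]]
    eigvals, W = np.linalg.eigh(Sigma)        # eigenvector entries: +/-0.639, +/-0.77
    print(W.T @ X)                            # rows give the projected coordinates x', y'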
Basic Concept (Neural Network)
I Inputs: x1, ..., xj ∈ [0, 1]
I Synapses: w1, ..., wj ∈ R
I Neuron: net = ∑_j w_j x_j
I Output: y = f (net − θ)
I Bias value: θ
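In Python, with a sigmoid assumed for the activation function f (the slides do not fix a particular f):

    import numpy as np

    def neuron_output(x, w, theta):
        """Single neuron: y = f(net - theta) with net = sum_j w_j * x_j."""
        net = np.dot(w, x)
        return 1.0 / (1.0 + np.exp(-(net - theta)))   # sigmoid activation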
Multi Layer Network
I Layer weights are adjusted during learning based on some input/output patterns.
I Learning typically starts at the output layer and moves toward the inputlayer (back-propagation).
Mathematical Model (Neural Network)
I Activation of the hidden layer: net_j = ∑_i w_ji x_i
I Output of the hidden layer: y_j = f(net_j) = f(∑_i w_ji x_i)
I Activation of the output layer: net_k = ∑_j w_kj f(net_j)
I Net output: y_k = f(net_k) = f(∑_j w_kj f(∑_i w_ji x_i)) = f(∑_j w_kj y_j)
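The same model as a forward-pass sketch, with a sigmoid assumed for f and bias terms omitted for brevity:

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def mlp_forward(x, W_hidden, W_out):
        """Two-layer forward pass: hidden activations, then the net output."""
        y_hidden = sigmoid(W_hidden @ x)    # y_j = f(sum_i w_ji * x_i)
        return sigmoid(W_out @ y_hidden)    # y_k = f(sum_j w_kj * y_j)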
Basic Concept (Support Vector Machine)
I A support vector machine is a learning algorithm which attempts to separate patterns by a hyperplane defined by:
  I normal vector w
  I offset parameter b
What is an Optimal Hyperplane?
Separation with maximal Margin
I Support vectors are the points lying on the margin, i.e. the training points closest to the separating hyperplane.
Kernel Trick
I Nonlinear and complex separation in the 2-dimensional input space.
I Easier and often linear separation in higher dimensional feature spaces.
Kernel Examples
I Linear kernel: k(x, x′) = x^T x′ = 〈x, x′〉
I Polynomial kernel of degree d: k(x, x′) = (γ〈x, x′〉 + coef0)^d
I Radial basis function (RBF) kernel: k(x, x′) = exp(−γ‖x − x′‖²)
I MLP or sigmoid kernel: k(x, x′) = tanh(γ〈x, x′〉 + coef0)
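Written out directly in Python (the parameter defaults here are placeholders, not the settings used in the experiments):

    import numpy as np

    def linear_kernel(x, xp):
        return np.dot(x, xp)                                    # <x, x'>

    def polynomial_kernel(x, xp, gamma=1.0, coef0=0.0, d=3):
        return (gamma * np.dot(x, xp) + coef0) ** d             # polynomial of degree d

    def rbf_kernel(x, xp, gamma=1.0):
        return np.exp(-gamma * np.sum((x - xp) ** 2))           # radial basis function

    def sigmoid_kernel(x, xp, gamma=1.0, coef0=0.0):
        return np.tanh(gamma * np.dot(x, xp) + coef0)           # MLP / sigmoid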
Classification Principle
Mathematical Model (Support Vector Machine)
I Dual optimization problem:

  max_{α ∈ R^m}  W(α) = ∑_{i=1}^{m} α_i − (1/2) ∑_{i,j=1}^{m} α_i y_i α_j y_j k(x_i, x_j)

  subject to α_i ≥ 0 for all i = 1, ..., m, and ∑_{i=1}^{m} α_i y_i = 0

I Decision function:

  f(x) = sgn(∑_{i=1}^{m} α_i y_i 〈φ(x), φ(x_i)〉 + b) = sgn(∑_{i=1}^{m} α_i y_i k(x, x_i) + b)
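Once the coefficients α_i, the labels y_i, the support vectors x_i and the offset b are known (e.g. from LIBSVM), the decision function is a direct sum; a sketch with illustrative names:

    import numpy as np

    def svm_decision(x, support_vectors, alphas, labels, b, kernel):
        """f(x) = sgn(sum_i alpha_i * y_i * k(x, x_i) + b)."""
        s = sum(a * y * kernel(x, xi)
                for a, y, xi in zip(alphas, labels, support_vectors))
        return np.sign(s + b)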
Multi-Class Approaches
I Decomposition of the multi-class problem into several binary classification tasks (one possible scheme is sketched below).
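One common decomposition is one-against-rest (the slides do not state which scheme was used); it relabels the data once per class and trains one binary SVM per task:

    import numpy as np

    def one_vs_rest_labels(y, classes):
        """One binary (+1/-1) label vector per class, i.e. one binary SVM training task per class."""
        return {c: np.where(y == c, 1, -1) for c in classes}

    # Example: 6 pit-pattern classes -> 6 binary classification tasks.
    # binary_tasks = one_vs_rest_labels(y, classes=range(6))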
SVM Optimization
PCA Dependency
Co-occurrence Distance Dependency
PCA Dependency for 6 Classes
Additional Investigations
I Color histogram (3-dimensional):
  I data dimension is too high;
  I low classification results due to the high data compression required.
I Vertical co-occurrence histogram:
  I 3-5% lower classification results compared with the horizontal histogram.
I Combination of horizontal and vertical co-occurrence histograms:
  I lower classification results than with the horizontal histogram alone.
Problems
I High data dimension.
I Data scaling.
I Time intensive parameter optimization.
I SVM accepts invalid input.
I Too few training samples.
Summary
I SVM provides 10% better results than the NN.
I SVM parameters have to be optimized carefully.
I Better classification with PCA due to higher compaction of information.
I The co-occurrence distance has little impact on classification accuracy.
Outlook
I Evaluation of the SVM with a larger number of pit pattern images.
I Consideration of other feature extraction and selection strategies.
I Investigation of other neural network topologies.
Bibliography
R.O. Duda, P.E. Hart and D.G. Stork. Pattern Classification, 2nd ed. New York: John Wiley & Sons, 2001.
C.C. Chang and C.J. Lin. LIBSVM: a library for support vector machines. URL: http://www.csie.ntu.edu.tw/~cjlin/libsvm, [Feb. 24, 2006].
N. Cristianini and J. Shawe-Taylor. An Introduction to Support Vector Machines and Other Kernel-Based Learning Methods. Cambridge: Cambridge University Press, 2000.
B. Schölkopf and A.J. Smola. Learning with Kernels. Cambridge, MA: MIT Press, 2002.
J.E. Jackson. A User's Guide to Principal Components. John Wiley & Sons Inc, 2003.
P. Chang and J. Krumm. Object recognition with color co-occurrence histograms. In Proceedings of CVPR '99, 1999.
Questions?
Thank you for your attention.