Classification.NET: Efficient and Accurate Classification in C#
Jamie Shotton
Toshiba Corporate Research & Development Center, Japan
http://jamie.shotton.org/work/code/Classification.NET.zip
Introduction

This tutorial gives:
- a brief introduction to classification theory
- ideas for the practical design and implementation of classifiers
- examples of classifiers in Vision

Main technical reference: [Bishop, 2006]
Programming references: [Murach, 2005], [Liberty, 2005]
Structure
1) Introduction to classification
2) Library Design
3) Implementing classifiers
4) Real vision research
Classification

Infer a discrete class label y from a set of measurements x: a mapping from a data point to a class label.

This tutorial considers:
- a D-dimensional feature space
- binary labels (y = +1 or −1)
- supervised learning: a labeled training set of N examples

Example: a vending machine
- measurements x: material, diameter, weight
- class label y: coin value

[Bishop, 2006]
Probabilistic Interpretation

Discriminative models: model the conditional probability p(y|x) directly.

Generative models are an alternative: use Bayes' theorem to infer the conditional probability.

Decision theory is then used to choose a single y, e.g. maximum a-posteriori (MAP).
[Bishop, 2006]
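As a tiny illustration of the MAP rule described above (a Python sketch; the posterior values here are invented for the example):

```python
def map_decision(posteriors):
    """MAP decision: choose the label with the highest posterior p(y|x)."""
    return max(posteriors, key=posteriors.get)

# With p(y=+1|x) = 0.7 and p(y=-1|x) = 0.3, MAP chooses +1.
print(map_decision({+1: 0.7, -1: 0.3}))
```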
Example Classifiers

- Nearest neighbour [Shakhnarovich, 2006]
- Linear discriminants: decision stumps, Fisher's linear discriminant, perceptrons
- Decision trees
[Bishop, 2006]
Example Classifiers

- Boosting (http://www.boosting.org/)
- Support Vector Machines (SVMs), a 'kernel' method (http://www.kernel-machines.org/)
- And many more!

DEMO TIME
[Bishop, 2006]
Structure
1) Introduction to classification
2) Library Design
3) Implementing classifiers
4) Real vision research
Classification.NET

A framework for classification in C#: general purpose, extensible, and including a few example classifiers.

Download the library, demo, and slides from http://jamie.shotton.org/work/code/Classification.NET.zip (many thanks to Matthew Johnson for the demo).
Why C#?

A modern, object-oriented language that combines the best of C++ and Java: pointers, interpreted or compiled, garbage collected.

The .NET libraries enable rapid development and re-use of code.

Freely available IDEs:
- http://msdn.microsoft.com/vstudio/express/visualcsharp/
- http://www.mono-project.com/

[Scientific C#]
Representing Data

How should a data point be represented?
- double: accurate
- float: fast, memory efficient
- generics <T>: flexible, no performance hit
- T[] (double[] or float[]): fast, easy, but inflexible
- IDataPoint<T>: flexible, very low performance hit
Representing Data

Class labels as int: fast, easy, and extensible to multi-class.
Representing Data Sets

[Figure: a data set as a matrix, with one row vector per example i and one column per dimension d.]

So just use T[,] or T[][] arrays? Not flexible, e.g. for on-the-fly computation.
Representing Data Sets

Custom class DataSet<T>: no interface changes needed for on-the-fly computation, sparse arrays, or sparse data points.

    void Increment(DataSet<double> dataSet)
    {
        for (int i = 0; i < dataSet.Count; i++)
            for (int d = 0; d < dataSet.Dimensionality; d++)
                dataSet.Data[i][d]++;
    }
Representing Data – Summary

- data points: IDataPoint<T>
- data sets: DataSet<T>
- labeled data sets: LabeledDataSet<T>
- labels: int
Classifier<T> – Classifier Base Class

    public abstract class Classifier<T>
    {
        // Train the classifier
        public abstract void Train(LabeledTrainingSet<T> trainingSet);

        // Return the classification for the data point
        public abstract int Classify(IDataPoint<T> dataPoint);
        …
    }
Structure
1) Introduction to classification
2) Library Design
3) Implementing classifiers
4) Real vision research
Nearest-Neighbour Classification (NN)

Find the nearest point in the training set under a distance metric (e.g. Euclidean), and classify the query with that nearest neighbour's label.
[Shakhnarovich, 2006]
Nearest-Neighbour Classification (NN)

[Figure: labeled training points in the (x1, x2) plane with the induced 'decision boundary'.]

[Shakhnarovich, 2006]
Nearest-Neighbour Classification (NN)

[Figure: the 'Voronoi' diagram of the training points in the (x1, x2) plane.]

[Shakhnarovich, 2006]
Implementing NN Classification

So let's implement NN in Classification.NET. The naïve implementation is very memory hungry: training is instantaneous, but testing is very slow.
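The naïve approach fits in a few lines; as a Python illustration (not the library's C# implementation), store the whole training set and scan it linearly at test time:

```python
import math

def nn_classify(train_x, train_y, query):
    """Label the query with the label of its nearest training point
    under the Euclidean distance (linear scan: O(N*D) per query)."""
    best = min(range(len(train_x)), key=lambda i: math.dist(train_x[i], query))
    return train_y[best]
```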
Improvements to NN Classification

- Distance computation 'trick' (exact): Distances.Euclidean(IDataPoint<double> a, IDataPoint<double> b, double minDistance)
- kd-trees [Beis, 1997] (exact or approximate): class NearestNeighbourFast { … }
- Parameter-sensitive hashing [Shakhnarovich, 2006] (approximate)
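The distance 'trick' is early termination: stop accumulating squared differences once the partial sum exceeds the best distance found so far. A Python sketch of the same idea (the C# signature above is the library's; this only mirrors it):

```python
def euclidean_sq(a, b, min_dist_sq=float("inf")):
    """Squared Euclidean distance with early termination: once the
    partial sum exceeds min_dist_sq, this point cannot be the nearest,
    so the remaining dimensions need not be examined."""
    total = 0.0
    for ai, bi in zip(a, b):
        diff = ai - bi
        total += diff * diff
        if total > min_dist_sq:
            break  # already worse than the best candidate so far
    return total
```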
Decision Stumps (DS)

Divide the space into two halves: the division is axis-aligned, and each half is classified differently.

[Figure: examples in 2D and 3D.]
Decision Stumps (DS)

The classifier compares the value x_d with a threshold µ, and returns +1 or −1 based on a sign s.

[Figure: axis-aligned thresholds µ along the x1, x2, x3 axes.]
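The decision rule is a single comparison; as a Python sketch (the library's C# version appears later in the deck):

```python
def stump_classify(x, d, threshold, sign):
    """Decision stump: +sign if dimension d of x exceeds the threshold,
    otherwise -sign."""
    return sign if x[d] > threshold else -sign
```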
Training Decision Stumps (DS)

[Figure: two labeled point sets in the (x1, x2) plane, separable by either an x1-value threshold or an x2-value threshold.]
Training DS

But it is not always this easy! Data are not usually linearly separable, and there are D dimensions to consider.

Search for the best decision stump over dimensions d, thresholds µ, and signs s, minimising the training set error.
Training DS Efficiently

Project onto each dimension successively.

[Figure: points in the (x1, x2) plane projected onto the x1 axis and onto the x2 axis.]
Which Thresholds To Try?

- Fixed discrete set: perhaps wasteful, and does not adapt to the data.
- Adaptive discrete set: calculate mid-points between pairs of adjacent points.

An efficient algorithm then calculates the training set error ε.
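The adaptive set can be computed by sorting the values in one dimension and taking midpoints between consecutive distinct values; a Python sketch:

```python
def candidate_thresholds(values):
    """Adaptive threshold set for one dimension: midpoints between
    consecutive distinct sorted values."""
    v = sorted(set(values))
    return [(lo + hi) / 2.0 for lo, hi in zip(v, v[1:])]
```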
Efficient computation of error ε

Recall the decision stump: h(x) = s if x_d > µ, and −s otherwise.

Consider a decision stump with sign s = +1. The training set error is ε = Σ_i [h(x_i) ≠ y_i]; trivially, flipping the sign gives the complementary error ε(−s) = N − ε(s).
Efficient computation of error ε

Linear search over µ with an incremental update: as the threshold passes each sorted point, the error changes by ±1.

[Figure: animation of the sweep; the running error counts read 4 5 6 5 4 3 2 3 along one projection and 4 4 5 6 7 6 5 3 along the other.]
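The sweep the slides animate can be sketched as follows (Python, for sign s = +1 with thresholds placed on the data values; an illustration rather than the library's code): sort the points along one dimension, start with the error of the lowest threshold, and update it in O(1) as the threshold passes each point.

```python
def best_stump_1d(xs, ys):
    """Best threshold along one dimension for sign s = +1
    (predict +1 iff x > threshold), found in one linear sweep."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    # Threshold below every point: everything is predicted +1,
    # so the initial error counts the examples labeled -1.
    err = sum(1 for y in ys if y == -1)
    best_err, best_thr = err, float("-inf")
    for i in order:
        # Threshold moves just above xs[i]: its prediction flips to -1.
        err += 1 if ys[i] == 1 else -1
        if err < best_err:
            best_err, best_thr = err, xs[i]
    return best_err, best_thr
```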
DS Implementation

    public class DecisionStump : Classifier<double>
    {
        private int _d;             // The data dimension
        private double _threshold;  // The threshold
        private int _sign;          // The sign (+1 or -1)

        // Train the classifier
        public override void Train(LabeledTrainingSet<double> trainingSet) { … }

        // Return the classification for the data point
        public override int Classify(IDataPoint<double> dataPoint)
        {
            return dataPoint[_d] > _threshold ? _sign : -_sign;
        }
        …
    }

DEMO TIME
DS Summary

Complexity: reasonable training time, very low memory, instantaneous classification.

Classification accuracy: individually not very powerful, but much more powerful in combination…
Boosting

Many variants, e.g.:
- AdaBoost [Freund, 1999]
- LogitBoost & GentleBoost [Friedman, 1998]
- Cascade [Viola, 2001]: super-fast
- JointBoost [Torralba, 2007]: multi-class with shared features

Core ideas:
1. combine many simple classifiers
2. weight the training data points
Core Idea 1 – Classifier

Combine many simple classifiers ('weak' or 'base' learners). The combined classifier:
1. computes the classification score of each weak learner
2. multiplies it by a learned confidence value
3. sums over T rounds
4. compares the sum to zero, giving a discrete classification value, +1 or −1
Core Idea 2 – Training

Weight the training data points with a normalised distribution that emphasises poorly classified examples.

Learning is a greedy iteration. At round (iteration) t:
- choose the optimal weak learner under the current distribution
- calculate a confidence value to reflect its accuracy
- update the distribution
Weak Learners

Almost any type of classifier can be used, but it must adapt to the weights distribution and must give some classification advantage.

A simple change allows DS to learn with weights: minimise the weighted training set error rather than the raw misclassification count.
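That change amounts to summing weights instead of counting mistakes; a Python sketch (names are illustrative):

```python
def weighted_error(predictions, labels, weights):
    """Weighted training-set error: the total weight of the
    misclassified examples (weights form a distribution)."""
    return sum(w for p, y, w in zip(predictions, labels, weights) if p != y)
```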
AdaBoost Learning Algorithm

- Initialise weights D_1(i) = 1/N
- For t = 1 … T:
  - train weak learner h_t using distribution D_t
  - compute the training set error ε_t = Σ_i D_t(i) [h_t(x_i) ≠ y_i]
  - calculate the confidence α_t = ½ ln((1 − ε_t) / ε_t)
  - update weights D_{t+1}(i) ∝ D_t(i) exp(−α_t y_i h_t(x_i))

[Freund, 1999]
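The whole loop fits in a short sketch (Python rather than the library's C#; the stump search here is deliberately brute-force for clarity):

```python
import math

def train_stump(X, y, w):
    """Brute-force weighted decision stump: best (error, dim, threshold, sign)."""
    best = None
    for d in range(len(X[0])):
        for thr in sorted({x[d] for x in X}):
            for s in (1, -1):
                err = sum(wi for x, yi, wi in zip(X, y, w)
                          if (s if x[d] > thr else -s) != yi)
                if best is None or err < best[0]:
                    best = (err, d, thr, s)
    return best

def adaboost(X, y, T):
    """AdaBoost: reweight the training points each round and
    combine T confidence-weighted stumps."""
    n = len(X)
    w = [1.0 / n] * n
    ensemble = []
    for _ in range(T):
        err, d, thr, s = train_stump(X, y, w)
        err = min(max(err, 1e-10), 1 - 1e-10)    # keep the log finite
        alpha = 0.5 * math.log((1 - err) / err)  # confidence
        ensemble.append((alpha, d, thr, s))
        # Emphasise the examples this stump got wrong.
        w = [wi * math.exp(-alpha * yi * (s if x[d] > thr else -s))
             for x, yi, wi in zip(X, y, w)]
        z = sum(w)
        w = [wi / z for wi in w]                 # renormalise
    return ensemble

def boosted_classify(ensemble, x):
    """Sign of the confidence-weighted sum of weak-learner votes."""
    score = sum(a * (s if x[d] > thr else -s) for a, d, thr, s in ensemble)
    return 1 if score > 0 else -1
```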
AdaBoost with DS Example

[Figure: the learned decision boundary after 1, 2, 3, 4, 5, and 50 rounds.]
AdaBoost Implementation

    public class AdaBoost<WeakLearner> : Classifier<double>
        where WeakLearner : Classifier<double>, IWeightedLearner
    {
        private List<WeakLearner> _h = new List<WeakLearner>(); // The learned weak learners
        private List<double> _alpha = new List<double>();       // The learned alpha values

        // Return the classification for the data point
        public override int Classify(IDataPoint<double> dataPoint)
        {
            double classification = 0.0;

            // Call the weak learner Classify() method and combine results
            for (int t = 0; t < _h.Count; t++)
                classification += _alpha[t] * _h[t].Classify(dataPoint);

            // Return the thresholded classification
            return classification > 0.0 ? +1 : -1;
        }
        …
    }

DEMO TIME
AdaBoost Summary

Complexity: T × the complexity of the weak learners.

Weak learners: stumps, trees, even AdaBoost classifiers themselves, e.g. AdaBoost<AdaBoost<DecisionStump>>.

Classification accuracy: very flexible decision boundary, good generalization.
Support Vector Machines (SVMs)

Maximize the margin for good generalization: prefer the larger margin over the smaller one.

Kernels allow complex decision boundaries: linear, Gaussian, etc.

The Classification.NET class SVM is a wrapper for the [SVM.NET] library.

[Bishop, 2006], [Burges, 1998]

DEMO TIME
Structure
1) Introduction to classification
2) Library Design
3) Implementing classifiers
4) Real vision research
Contour Fragments for Object Detection

We can recognise objects based on fragments of contour. Can a computer?
[Shotton, 2007a]
Contour Fragments for Object Detection

- Clustering learns contour fragments (e.g. ears, head, belly, hind legs, rear torso)
- Labeled training data: object bounding boxes
- A boosted classifier learns: is the object centre here?
- Features are computed at sparse image locations

[Figure: a data set matrix with one row per example i and one column per dimension d, built from contour fragments.]

[Shotton, 2007a]

DEMO TIME
TextonBoost

Goal: semantically segment an image using texture (via 'textons'), layout, and context, e.g. labeling regions as bicycle, road, or building.
TextonBoost

- Patterns of textons: 'texture-layout filters'
- Labeled training data: hand-drawn segmentations (e.g. sheep, grass)
- A multi-class boosted classifier [Torralba, 2007] answers: what class is this pixel?
- Features are computed densely at image pixels

[Figure: a data set matrix with one row per example i and one column per dimension d, built from patterns of textons.]

[Shotton, 2007b]
TextonBoost

[Figure: example segmentations: cow and grass; tree, grass, and body; road, building, and sky; airplane, sky, grass, and building.]

Object classes: building, grass, tree, cow, sheep, sky, airplane, water, face, car, bicycle, flower, sign, bird, book, chair, road, cat, dog, body, boat.

[Shotton, 2007b]
Summary

Classifiers are very powerful for Vision research, with many different options available.

Classification.NET gives you AdaBoost, SVMs, Decision Stumps, and Nearest Neighbour.

Careful high-level library design allows rapid classifier implementation and quick experimentation.
References

[Beis, 1997] J.S. Beis and D.G. Lowe. Shape Indexing Using Approximate Nearest-Neighbour Search in High-Dimensional Spaces. CVPR.
[Bishop, 2006] C.M. Bishop. Pattern Recognition and Machine Learning.
[Burges, 1998] C.J.C. Burges. A Tutorial on Support Vector Machines for Pattern Recognition. Data Mining and Knowledge Discovery.
[Freund, 1999] Y. Freund and R.E. Schapire. A Short Introduction to Boosting. Journal of Japanese Society for Artificial Intelligence.
[Friedman, 1998] J. Friedman, T. Hastie, and R. Tibshirani. Additive Logistic Regression: A Statistical View of Boosting. The Annals of Statistics.
[SVM.NET] M. Johnson. http://mi.eng.cam.ac.uk/~mj293/software_svm.html
[Liberty, 2005] J. Liberty. Programming C#.
[Murach, 2005] J. Murach. Murach's C# 2005.
[Scientific C#] F. Gilani. http://msdn.microsoft.com/msdnmag/issues/04/03/ScientificC/default.aspx
[Shakhnarovich, 2006] G. Shakhnarovich, T. Darrell, and P. Indyk [Eds.]. Nearest-Neighbor Methods in Learning and Vision.
[Shotton, 2007a] J. Shotton, A. Blake, and R. Cipolla. Multi-Scale Categorical Object Recognition Using Contour Fragments. PAMI, to appear (available on request).
[Shotton, 2007b] J. Shotton, J. Winn, C. Rother, and A. Criminisi. TextonBoost for Image Understanding: Multi-Class Object Recognition and Segmentation by Jointly Modeling Texture, Layout, and Context. IJCV, to appear (available on request).
[Torralba, 2007] A. Torralba, K.P. Murphy, and W.T. Freeman. Sharing Visual Features for Multiclass and Multiview Object Detection. PAMI.
[Viola, 2001] P. Viola and M. Jones. Robust Real-time Object Detection. ICCV.
Thank you!

[email protected]
code and slides available at
http://jamie.shotton.org/work/code/Classification.NET.zip