
Page 1: CSSE463: Image Recognition, Day 18

Upcoming schedule:
Lightning talks shortly
Midterm exam Monday
Sunset detector due Wednesday

Page 2: Multilayer feedforward neural nets

Many perceptrons, organized into layers:
Input (sensory) layer
Hidden layer(s): 2 proven sufficient to model any arbitrary function
Output (classification) layer

Powerful! Calculates functions of the input and maps them to the output layer.

Example

[Figure: a three-layer net with sensory inputs x1, x2, x3 (HSV), a hidden layer (functions), and classification outputs y1, y2, y3 (apple/orange/banana).]

Q4
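To make the example concrete, here is a minimal sketch of one feedforward pass through such a net in Matlab. The layer sizes and random weights are placeholders of my choosing, not a trained network:

    % One feedforward pass: 3 sensory inputs (HSV) -> hidden layer -> 3 class outputs.
    % Weights are random placeholders, not trained values.
    x = [0.8; 0.3; 0.5];                  % one HSV feature vector
    W1 = randn(4, 3);  b1 = randn(4, 1);  % hidden layer (4 neurons, arbitrary)
    W2 = randn(3, 4);  b2 = randn(3, 1);  % output layer (3 classes)
    sigmoid = @(z) 1 ./ (1 + exp(-z));    % logistic activation
    h = sigmoid(W1 * x + b1);             % hidden activations
    y = sigmoid(W2 * h + b2);             % apple/orange/banana scores
    [~, label] = max(y)                   % predicted class index

Each hidden neuron is a perceptron computing a function of the inputs; the output layer maps those functions to class scores.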

Page 3: XOR example

2 inputs
1 hidden layer of 5 neurons
1 output
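A sketch of this XOR net in Matlab, assuming the Neural Network Toolbox (feedforwardnet/train); matlabNeuralNetDemo.m from class covers similar ground:

    X = [0 0 1 1;                  % each column is one input pair
         0 1 0 1];
    T = [0 1 1 0];                 % XOR targets
    net = feedforwardnet(5);       % 1 hidden layer of 5 neurons, 1 output
    net.divideFcn = 'dividetrain'; % only 4 examples, so train on all of them
    net = train(net, X, T);        % backpropagation training (next slide)
    Y = net(X)                     % should be close to [0 1 1 0]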

Page 4: Backpropagation algorithm

Initialize all weights randomly.
For each labeled example:
Calculate the output using the current network
Update weights across the network, from output to input, using Hebbian learning
Iterate until convergence; epsilon decreases at every iteration.

Matlab does this for you: matlabNeuralNetDemo.m. (A sketch follows the figure below.)

[Figure: the same net (inputs x1, x2, x3; outputs y1, y2, y3). a. Calculate output (feedforward). b. Update weights (feedback). Repeat.]

Q5
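A minimal sketch of the loop above, under assumptions the slide doesn't fix: a toy 3-4-3 sigmoid net, squared error, and the standard gradient-descent form of the backpropagation update:

    sigmoid = @(z) 1 ./ (1 + exp(-z));
    x = [0.8; 0.3; 0.5];  t = [1; 0; 0];  % one labeled example
    W1 = randn(4,3); b1 = randn(4,1);     % initialize all weights randomly
    W2 = randn(3,4); b2 = randn(3,1);
    epsilon = 0.5;                        % learning rate
    for iter = 1:1000                     % iterate until convergence
        h = sigmoid(W1*x + b1);           % a. calculate output (feedforward)
        y = sigmoid(W2*h + b2);
        dy = (y - t) .* y .* (1 - y);     % b. update weights (feedback),
        dh = (W2'*dy) .* h .* (1 - h);    %    from output back to input
        W2 = W2 - epsilon*dy*h';  b2 = b2 - epsilon*dy;
        W1 = W1 - epsilon*dh*x';  b1 = b1 - epsilon*dh;
        epsilon = 0.995 * epsilon;        % epsilon decreases every iteration
    end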

Page 5: Parameters

Most networks are reasonably robust with respect to the learning rate and how the weights are initialized.

However, figuring out how to
normalize your input
determine the architecture of your net
is a black art. You might need to experiment. One hint: re-run the network with different initial weights and different architectures, and test performance each time on a validation set. Pick the best (see the sketch below).
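A sketch of that hint, again assuming the Neural Network Toolbox, with hypothetical data Xtrain/Ttrain and a held-out validation set Xval/Tval:

    bestAcc = -Inf;
    for hidden = [2 5 10 20]                       % candidate architectures
        for restart = 1:5                          % different random initial weights
            net = feedforwardnet(hidden);
            net = train(net, Xtrain, Ttrain);
            acc = mean(round(net(Xval)) == Tval);  % validation accuracy
            if acc > bestAcc
                bestAcc = acc;  bestNet = net;     % pick the best
            end
        end
    end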

Page 6: References

This is just the tip of the iceberg! See:

Sonka, pp. 404-407.

Laurene Fausett. Fundamentals of Neural Networks. Prentice Hall, 1994. Approachable for beginners.

C.M. Bishop. Neural Networks for Pattern Recognition. Oxford University Press, 1995. A technical reference focused on the art of constructing networks (learning rate, number of hidden layers, etc.).

Matlab neural net help.

Page 7: SVMs vs. Neural Nets

SVM: Training can be expensive.

Training can take a long time with large data sets. Consider that you'll want to experiment with parameters...

But the classification runtime and space are O(sd), where s is the number of support vectors and d is the dimensionality of the feature vectors (see the worked numbers after this list).

In the worst case, s = size of the whole training set (like nearest neighbor).

But no worse than implementing a neural net with s perceptrons in the hidden layer.

Empirically shown to have good generalizability even with relatively small training sets and no domain knowledge.

Neural networks: can tune architecture.
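For scale, with hypothetical numbers: at d = 294 features (the dimension used on the next slide) and s = 100 support vectors, one classification costs on the order of 100 x 294 ≈ 29,400 multiply-accumulates plus 100 kernel evaluations, regardless of how large the original training set was.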

Q3

Page 8: How does svmfwd compute y1?

y1 is just the weighted sum of contributions of individual support vectors:

d = data dimension (e.g., 294); σ = kernel width.

numSupVecs, svcoeff (alpha) and bias are learned during training. Note: looking at which of your training examples are support vectors can be revealing! (Keep in mind for sunset detector and term project)

y_1 = \sum_{i=1}^{\text{numSupVecs}} \text{svcoeff}_i \,\exp\!\left(-\frac{\lVert x - sv_i \rVert^2}{2\sigma^2 d}\right) + \text{bias}

Much easier computation than training. Could implement it on a device without MATLAB (e.g., a smartphone) easily; a sketch follows.
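A minimal sketch of that computation (not the toolbox's actual svmfwd source), using the reconstructed kernel above with the 1/(2σ²d) normalization:

    % Decision value for one example x (row vector) from learned quantities:
    % sv (one support vector per row), svcoeff (alphas), bias, sigma.
    function y1 = svmfwd_sketch(x, sv, svcoeff, bias, sigma)
        d = numel(x);                      % data dimension, e.g., 294
        y1 = bias;
        for i = 1:size(sv, 1)              % one term per support vector
            dist2 = sum((x - sv(i,:)).^2); % squared distance to sv_i
            y1 = y1 + svcoeff(i) * exp(-dist2 / (2*sigma^2*d));
        end
    end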