
Multi Scale Recognition with DAG-CNNs

by S. Yang & D. Ramanan

March 22, 2016

Niladri Basu Bal (@ gmail.com)

Motivation for this paper?

1. Contemporary approaches: the input to the classifier is the feature extracted from the last layer (i.e. only high-level features).

2. Through experiments it was found that:

a) Coarse classification (e.g. person vs. dog) works better with mid-level feature information.

b) Fine-grained classification (e.g. the model of a car) works better with high-level feature information.

Outline

1. Overview of paper

2. Convolutional Neural Network comparison

3. Comparison of feature maps in different levels

4. DAG-CNNs

5. Experiments

6. Conclusion

Outline

1. Overview of paper

2. Convolutional Neural Network comparison

3. Comparison of feature maps in different levels

4. DAG-CNNs

5. Experiments

6. Conclusion

Overview of paper

1. Analyzes existing pre-trained CNN models such as Caffe and Deep19.

2. Tries to find which layers produce feature maps that are useful for recognition.

3. Designs a multi-scale architecture (DAG-CNN) that takes advantage of high-, mid-, and low-level features.

4. DAG-CNNs lower the error rate on the MIT67, Scene15, and SUN397 benchmark datasets.

Outline

1. Overview of paper

2. Convolutional Neural Network comparison

3. Comparison of feature maps in different levels

4. DAG-CNNs

5. Experiments

6. Conclusion

Convolutional Neural Network comparison

Source: Multi Scale Recognition with DAG-CNNs by Yang & Ramanan

Convolutional Neural Network comparison

1. Single Scale CNN

a) Uses information from only the last layer.

b) It is a very simple structure.

2. Multi Scale CNN

a) Uses information from all previous layers.

b) Learning is harder and over-fitting is more likely; this can be mitigated by pooling (sum, average, or max). A minimal pooling sketch follows below.

Source: www.slideshare.net ; Multi Scale Recognition with DAG-CNNs by Daiki Yamamoto
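To make the single-scale vs. multi-scale contrast concrete, here is a minimal NumPy sketch of the pooling idea: each layer's feature map is spatially average-pooled and normalized, and the multi-scale descriptor concatenates the pooled vectors from several layers instead of keeping only the last one. The array shapes and helper names are illustrative, not taken from the paper.

```python
import numpy as np

def pool_layer(feature_map):
    """Average-pool an H x W x C feature map over its spatial dimensions
    to a C-dim vector, then L2-normalize it."""
    v = feature_map.mean(axis=(0, 1))           # marginalize out the spatial dims
    return v / (np.linalg.norm(v) + 1e-12)      # scale normalization

def multi_scale_descriptor(feature_maps):
    """Concatenate pooled vectors from all tapped layers (multi-scale),
    instead of keeping only the last layer (single-scale)."""
    return np.concatenate([pool_layer(f) for f in feature_maps])

# Toy example: three 'layers' with growing channel depth.
maps = [np.random.rand(56, 56, 64),
        np.random.rand(28, 28, 256),
        np.random.rand(7, 7, 512)]
single_scale = pool_layer(maps[-1])             # 512-dim, last layer only
multi_scale = multi_scale_descriptor(maps)      # 64 + 256 + 512 = 832-dim
print(single_scale.shape, multi_scale.shape)
```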

Outline

1. Overview of paper

2. Convolutional Neural Network comparison

3. Comparison of feature maps in different levels

4. DAG-CNNs

5. Experiments

6. Conclusion

Comparison of feature maps in different levels

1. The next figure shows an image-retrieval result.

2. The Caffe model and the MIT67 dataset are used for the analysis.

3. The output of the chosen layer is classified with a Support Vector Machine (SVM).

4. Image-retrieval procedure: the K = 7 (top-7) images with the lowest Euclidean distance from the query are retrieved; see the sketch after this list.
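A minimal sketch of that retrieval step, assuming the per-layer features have already been pooled into fixed-length vectors; the 256-dimensional "layer 11" feature size and the random gallery are placeholders, not values from the paper.

```python
import numpy as np

def retrieve_top_k(query_feat, gallery_feats, k=7):
    """Return indices of the k gallery images whose features have the lowest
    Euclidean distance to the query feature (taken from one chosen layer)."""
    dists = np.linalg.norm(gallery_feats - query_feat, axis=1)
    return np.argsort(dists)[:k]

# Toy example: 100 gallery images described by hypothetical layer-11 features.
rng = np.random.default_rng(0)
gallery_layer11 = rng.normal(size=(100, 256))
query_layer11 = rng.normal(size=(256,))
print(retrieve_top_k(query_layer11, gallery_layer11, k=7))
```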

Results of image retrieval

[Figure: top-7 retrieved images for the queries, comparing layer-11 features against layer-20 features]

Source: Multi Scale Recognition with DAG-CNNs by Yang & Ramanan

Classification accuracy using information from each layer

1. The next figure is an accuracy bar graph.

2. The Caffe model and the MIT67 dataset are used for the analysis.

3. The features of each layer are sent to the SVM classifier separately, and their accuracies are presented in the graph; a sketch of this per-layer evaluation follows this list.

4. Notice the jump in accuracy at each ReLU layer.
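A minimal sketch of that per-layer evaluation, assuming the pooled features for each layer are already available: one linear SVM is trained per layer and scored on held-out data. scikit-learn's LinearSVC stands in for the SVM used in the paper, and the toy feature sizes and random data are purely illustrative.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import LinearSVC

def accuracy_per_layer(layer_features, labels):
    """Train one linear SVM per layer on that layer's pooled features and
    report held-out accuracy, mirroring the per-layer bar graph."""
    accuracies = {}
    for name, X in layer_features.items():
        Xtr, Xte, ytr, yte = train_test_split(X, labels, test_size=0.3, random_state=0)
        clf = LinearSVC(C=1.0).fit(Xtr, ytr)
        accuracies[name] = clf.score(Xte, yte)
    return accuracies

# Toy data standing in for pooled MIT67 features from two layers.
rng = np.random.default_rng(0)
labels = rng.integers(0, 5, size=300)
features = {"layer11": rng.normal(size=(300, 256)),
            "layer20": rng.normal(size=(300, 512))}
print(accuracy_per_layer(features, labels))
```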

Classification accuracy using information from each layer

Source: Multi Scale Recognition with DAG-CNNs by Yang & Ramanan

Map of the accuracy rate of the respective layers and classes

1. The next figure is a heat map of accuracy for each layer.

2. The classification was carried out for all MIT67 classes.

3. Classes are grouped together to show which layer's information works best for them.

Map of the accuracy rate of the respective layers and classes

[Figure annotation: "The most useful" marks the layer giving the highest accuracy for each class group]

Source: Multi Scale Recognition with DAG-CNNs by Yang & Ramanan

Outline

1. Overview of paper

2. Convolutional Neural Network comparison

3. Comparison of feature maps in different levels

4. DAG-CNNs

5. Experiments

6. Conclusion

DAG-CNNs (Directed acyclic graph)

[Figure: DAG-CNN architecture; each tapped layer is pooled into a 1×1×F feature vector]

F = number of classes

Source: Multi Scale Recognition with DAG-CNNs by Yang & Ramanan
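A toy PyTorch sketch of that architecture, not the authors' implementation: two convolutional blocks are tapped, each tap is average-pooled, normalized, and mapped to F class scores by its own fully connected layer, and the per-branch scores are summed before the softmax loss. The layer sizes and the names `DAGHead` and `TinyDAGCNN` are made up for illustration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as functional

class DAGHead(nn.Module):
    """One multi-scale branch: average-pool a feature map to 1x1, normalize it,
    and map it to F class scores with a fully connected layer."""
    def __init__(self, channels, num_classes):
        super().__init__()
        self.fc = nn.Linear(channels, num_classes)

    def forward(self, x):                        # x: (N, C, H, W)
        v = x.mean(dim=(2, 3))                   # spatial average pooling -> (N, C)
        v = functional.normalize(v, dim=1)       # scale normalization
        return self.fc(v)                        # (N, F) per-branch class scores

class TinyDAGCNN(nn.Module):
    """A toy single-scale chain turned into a DAG by tapping two layers and
    summing their class scores before the softmax loss."""
    def __init__(self, num_classes=67):
        super().__init__()
        self.block1 = nn.Sequential(nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2))
        self.block2 = nn.Sequential(nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2))
        self.head1 = DAGHead(32, num_classes)    # mid-level branch
        self.head2 = DAGHead(64, num_classes)    # high-level branch

    def forward(self, x):
        f1 = self.block1(x)
        f2 = self.block2(f1)
        return self.head1(f1) + self.head2(f2)   # summed scores feed one softmax

logits = TinyDAGCNN()(torch.randn(2, 3, 64, 64))
print(logits.shape)                              # torch.Size([2, 67])
```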

Learning of DAG-CNNs

• The following error function is minimized at the classifier (all tapped layers feed a single softmax loss):

min_w  Σ_n ℓ( Σ_{k=1}^{K} w_k · x_n^{(k)}, y_n )

where x_n = input (learning data) and x_n^{(k)} is its pooled feature from layer k, y_n = label indicator (learning label), ℓ = error function, w_k = weight of the convolution/classifier attached to layer k, and K = layer number (the number of tapped layers).

Source: www.slideshare.net; Multi Scale Recognition with DAG-CNNs by Daiki Yamamoto
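Continuing the toy `TinyDAGCNN` sketch from the architecture slide (an assumption, not the authors' code), training minimizes one softmax cross-entropy error over the summed branch scores; because the branches share the early convolutions, back-propagation automatically accumulates each shared layer's gradient over every path in the DAG.

```python
import torch
import torch.nn as nn

# Assumes the TinyDAGCNN toy model from the earlier architecture sketch.
model = TinyDAGCNN(num_classes=67)
criterion = nn.CrossEntropyLoss()              # softmax + negative log-likelihood (the error function)
x = torch.randn(4, 3, 64, 64)                  # learning data x
y = torch.randint(0, 67, (4,))                 # learning labels y
loss = criterion(model(x), y)                  # one loss over the summed per-layer scores
loss.backward()                                # gradients flow along every DAG path
# The shared first convolution receives the summed gradient from both branches.
print(model.block1[0].weight.grad.norm())
```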

Outline

1. Overview of paper

2. Convolutional Neural Network comparison

3. Comparison of feature maps in different levels

4. DAG-CNNs

5. Experiments

6. Conclusion

Experiments

1. Evaluation was carried out by classification accuracy on the following datasets:

a) SUN397

• ~100K images across 397 scene categories

b) MIT67

• 15K images across 67 indoor-scene categories

c) Scene15

• 2,985 indoor and outdoor scene images

Evaluation

Source: Multi Scale Recognition with DAG-CNNs by Yang & Ramanan

Classification Result of MIT67


Source: Multi Scale Recognition with DAG-CNNs by Yang & Ramanan

Outline

1. Overview of paper

2. Convolutional Neural Network comparison

3. Comparison of feature maps in different levels

4. DAG-CNNs

5. Experiments

6. Conclusion

Conclusion

1. Higher-level features are not always the best for classification.

2. The proposed DAG-CNN architecture is applicable to any existing single-scale model (not just Caffe and Deep19).

3. Improved accuracy was recorded in all tests when the DAG-CNN is used.

Thank you for listening
