Tutorial Deep Learning - CosmoStat (dlm.cosmostat.org/wp-content/uploads/2017/09/DL_Part2...) · 2017. 9. 4.
TRANSCRIPT
Tutorial Deep Learning :Unsupervised Feature Learning
Joana Frontera-Pons
4th September 2017 - Workshop Dictionary Learning on Manifolds
OUTLINE
§ Introduction
§ Representation Learning
§ TensorFlow Examples
Tutorial Unsupervised Feature Learning - DLM, Nice, 2017 - J. Frontera-Pons
DEEP LEARNING
§ Deep Learning: unsupervised learning methods that can learn invariant feature hierarchies,
§ Non-linear representations obtained with deep layer structures bring out complex relationships and disentangle the factors of variation in the inputs,
§ How? AutoEncoders, ConvNets, Deep Belief Networks, …
§ What kind of representations can the model extract?
DEEP LEARNING

[Figure: deep network with hidden layers 1, 2 and 3]
MOTIVATION
§ Linear representations are frequently used to model images:

x = Dφ

where x is the original signal or image, D is a transformation matrix (e.g. Discrete Cosine Transform, wavelet transform, Principal Component Analysis, or a K-SVD dictionary), and φ is the feature vector.
§ In this context, linear representations are the most widespread approaches for denoising, compression, inverse problem solving, …
§ They fail to capture some common variations in the data, such as translation, rotation and zoom.
→ New adaptive non-linear representations
J. Frontera-Pons - Visit to the CMM - Research and Integration Project
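The linear model x = Dφ can be sketched with a PCA dictionary. This is a minimal NumPy illustration; the data, dimensions and variable names are assumptions for the example, not taken from the slides.

```python
# Sketch of a linear representation x = D phi, with D built by PCA.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 16))   # 100 signals of dimension 16 (toy data)
X -= X.mean(axis=0)              # center the data before PCA

# PCA dictionary: columns of D are the top-k principal directions
U, S, Vt = np.linalg.svd(X, full_matrices=False)
k = 4
D = Vt[:k].T                     # 16 x 4 dictionary

phi = X @ D                      # feature vectors (projections onto D)
X_hat = phi @ D.T                # linear reconstruction x_hat = D phi

# relative reconstruction error; below 1 whenever k < 16
err = np.linalg.norm(X - X_hat) / np.linalg.norm(X)
print(phi.shape, err)
```

A fixed orthonormal D like this cannot adapt to translations or rotations of the content, which is the limitation motivating the non-linear representations below.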
AUTOENCODERS
Autoencoders are artificial neural networks capable of learning efficient representations of the input data without supervision.
They are powerful feature extractors,
They work by learning to copy their inputs to their outputs,
We have to constrain the network to prevent the model from learning the identity: limit the size of the representation, add noise, …
They find their purpose in:
§ dimensionality reduction,
§ feature extraction,
§ unsupervised pretraining,
§ or as generative models.
AUTOENCODERS
Input: x ∈ R^d
Encoder: f_θ(x) = σ(W_f^T x + b_f)
Hidden representation: h ∈ R^{n_hid}
Decoder: g_θ(h) = σ(W_g^T h + b_g)
Reconstruction: x̂ ∈ R^d
Reconstruction error: L(x, x̂) - Mean Squared Error or Binary Cross-Entropy
§ Basic Autoencoders : Parametric encoding function from inputs to their representations, and a decoding function that maps back to input space,
§ Train the model to reconstruct the input as accurately as possible.
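The encoder/decoder pair above can be written out directly in NumPy. The dimensions, initialisation scale and random data below are illustrative assumptions; only the functional form σ(Wᵀx + b) and the MSE loss come from the slides.

```python
# Minimal forward pass of a basic autoencoder:
#   h = sigmoid(W_f^T x + b_f),  x_hat = sigmoid(W_g^T h + b_g)
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

d, n_hid = 8, 3                          # toy input and code dimensions
rng = np.random.default_rng(1)
W_f = rng.normal(scale=0.1, size=(d, n_hid)); b_f = np.zeros(n_hid)
W_g = rng.normal(scale=0.1, size=(n_hid, d)); b_g = np.zeros(d)

def encode(x):
    return sigmoid(W_f.T @ x + b_f)      # h in R^{n_hid}

def decode(h):
    return sigmoid(W_g.T @ h + b_g)      # x_hat in R^d

def mse(x, x_hat):                       # reconstruction error L(x, x_hat)
    return np.mean((x - x_hat) ** 2)

x = rng.uniform(size=d)
h = encode(x)
x_hat = decode(h)
print(h.shape, x_hat.shape, mse(x, x_hat))
```

Because n_hid < d, the code h is an undercomplete representation: the bottleneck is one of the constraints mentioned above that prevents the network from learning the identity.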
WEIGHT FILTERS
[Figure: weight filters learned by PCA vs. a denoising autoencoder (DAE)]
STACKED AUTOENCODERS
§ Greedy layer-wise initialization:
Ø Pre-train all the layers using unsupervised feature learning,
Ø Add one level at a time,
Ø Build a hierarchy of representations.
§ Stacking layers of autoencoders:
Ø Pairs of encoder/decoder combined to form a global encoder and a global decoder,
Ø Deep autoencoders jointly trained and optimized for an overall reconstruction error.
[Figure: stacked autoencoder with inputs, hidden layers 1-3, and outputs]
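The "pairs of encoder/decoder combined" idea can be sketched as a composition: each level's encoder is applied in order, and the decoders are unwound in reverse. The weights below are random placeholders standing in for layer-wise pre-trained parameters; training itself is omitted.

```python
# Combining two autoencoder levels into a global encoder/decoder.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(2)
sizes = [16, 8, 4]                     # input -> hidden 1 -> hidden 2

# one (encoder, decoder) weight pair per level (placeholders, not trained)
enc = [rng.normal(scale=0.1, size=(sizes[i], sizes[i + 1])) for i in range(2)]
dec = [rng.normal(scale=0.1, size=(sizes[i + 1], sizes[i])) for i in range(2)]

def global_encode(x):                  # f2(f1(x)): apply encoders in order
    for W in enc:
        x = sigmoid(x @ W)
    return x

def global_decode(h):                  # g1(g2(h)): unwind decoders in reverse
    for W in reversed(dec):
        h = sigmoid(h @ W)
    return h

x = rng.uniform(size=16)
h = global_encode(x)                   # deepest representation, dimension 4
x_hat = global_decode(h)               # reconstruction, back in dimension 16
print(h.shape, x_hat.shape)
```

After greedy pre-training of each pair on the previous level's codes, the whole stack would be fine-tuned jointly on the overall reconstruction error.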
DEDALE TUB MEETING 19/04/2016 - J. FRONTERA-PONS - DEEP LEARNING
REGULARIZED AUTOENCODERS
§ Basic Autoencoders may learn the identity function to minimize the reconstruction error,
§ Impose some constraint on the code to force the representation to be insensitive to local variations → Regularization,
§ Examples: Contractive AE, Sparse AE, DAE, etc.
DENOISING AUTOENCODERS [Vincent 2008]
§ Modify the training objective to retrieve a clean input from an artificially corrupted version of it,
§ Make the transformation robust to small random perturbations in the input,
§ Corruption noise: Gaussian, salt and pepper, masking, or an adaptive corruption process.
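Two of the corruption processes listed above are easy to sketch in NumPy. The noise levels (sigma, masking fraction p) are illustrative choices, not values from the slides.

```python
# Corruption processes for a denoising autoencoder:
# additive Gaussian noise and masking noise.
import numpy as np

rng = np.random.default_rng(3)

def gaussian_corrupt(x, sigma=0.1):
    # add zero-mean Gaussian noise to every entry
    return x + rng.normal(scale=sigma, size=x.shape)

def masking_corrupt(x, p=0.3):
    # zero out a random fraction p of the entries
    mask = rng.uniform(size=x.shape) >= p
    return x * mask

x = rng.uniform(size=(5, 8))           # toy batch of clean inputs
x_tilde_g = gaussian_corrupt(x)
x_tilde_m = masking_corrupt(x)
print(x_tilde_g.shape, x_tilde_m.shape)
```

The DAE is then trained to map a corrupted x̃ back to the clean x, so the reconstruction loss is still measured against the uncorrupted input.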
DENOISING AUTOENCODERS
[Figure: denoising autoencoder pipeline - clean input x, corrupted input x̃, reconstruction x̂]
BASICS TENSORFLOW
§ TensorFlow is an open source software library for numerical computation developed by the Google Brain team,
§ The user defines a graph of computations in Python, and TF then takes that graph and runs it efficiently using optimised C++.
[Figure: computation graph with variables x and y, constant 2, and + / × operations]

f(x, y) = x²y + y + 2
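The define-then-run idea can be mimicked in a few lines of plain Python: build deferred nodes first, then evaluate them with concrete values. This toy sketch only illustrates the concept; it is not the TensorFlow API.

```python
# Toy "construction phase / execution phase" split in plain Python.
class Node:
    def __init__(self, fn):
        self.fn = fn                     # deferred computation: env -> value

    def __add__(self, other):
        return Node(lambda env: self.fn(env) + as_node(other).fn(env))

    def __mul__(self, other):
        return Node(lambda env: self.fn(env) * as_node(other).fn(env))

def as_node(v):
    # wrap plain constants so they can join the graph
    return v if isinstance(v, Node) else Node(lambda env: v)

def variable(name):
    # a variable looks up its value in the environment at execution time
    return Node(lambda env: env[name])

# Construction phase: build the graph for f(x, y) = x^2 * y + y + 2
x, y = variable("x"), variable("y")
f = x * x * y + y + 2

# Execution phase: run the graph with concrete values
print(f.fn({"x": 3, "y": 4}))  # 3*3*4 + 4 + 2 = 42
```

In real TensorFlow the same split appears as building the graph of ops in Python and then evaluating it in a session/runtime.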
BASICS TENSORFLOW
TensorFlow programs are typically split into two parts:
§ Construction phase: builds a computation graph representing the machine learning model and the computations required to train it.
§ Execution phase: runs a loop that evaluates a training step repeatedly.
TensorBoard example
EXPERIMENTAL RESULTS
§ The Modified National Institute of Standards and Technology database (MNIST) is a large database of handwritten digits,
§ Training set containing 60,000 samples and test set containing 10,000 samples,
BUILD FEATURES THAT CAPTURE THE VARIATIONS ALONG THE DIFFERENT DIGITS
[Figure: encode/decode pipeline on MNIST - input x, code h, reconstruction x̂]
EXPERIMENTAL RESULTS
Thank you for your attention!