TRANSCRIPT
Geometric Deep Learning on Graphs and Manifolds
Michael Bronstein, Joan Bruna, Arthur Szlam, Xavier Bresson, Yann LeCun
https://qdata.github.io/deep2Read
Presenter: Arshdeep Sekhon
Geometric Deep Learning on Graphs and Manifolds · https://qdata.github.io/deep2Read · 1 / 41
1 Motivation
2 Basics of Euclidean CNNs
3 Basics: Graph Theory, Riemannian Manifolds, Dirichlet Energy
4 Spectral Domain CNNs
5 GNNs: Spatial View
6 Spatial Domain Geometric Deep Learning
7 Applications
Motivation
What kind of geometric structure in images/text/etc. is exploited by CNNs?
How can this universal property be used on non-Euclidean domains?
Examples of non-Euclidean domains
Some Distinctions?
Domain Structure/Data on a Domain
Fixed Graph vs Varying Graph
Known Graph vs Unknown Graph
Basics of Euclidean CNNs
Translational invariance
Compositionality and deformation stability: localization in space1
Constant number of features O(1) and O(n) computation time
1 "each feature extraction in our network is followed by an additional layer which performs a local averaging and a sub-sampling, reducing the resolution of the feature map. This layer introduces a certain level of invariance to distortions and translations."
Euclidean CNNs
Defined on Euclidean domains or on discrete grids
Grids have the above-mentioned properties
Inductive bias for images
Main Idea
Extend pooling and convolution to non-Euclidean domains (graphs/manifolds)
Assume stationarity and compositionality (find appropriate operators for filtering and pooling)
How to make them fast?
Types of Non-Euclidean CNNs
Two types of non-Euclidean CNNs:
Spectral Domain
Spatial Domain
Graph Theory
Weighted undirected graph G with vertices V = {1, ..., n}, edges E ⊂ V × V
Edge weights wij ≥ 0 for (i, j) ∈ E
Functions over the vertices: L2(V) = {f : V → R}; vectors in a Hilbert space, f = (f1, ..., fn), encoding the value of the function at every node
Hilbert space with inner product ⟨f, g⟩L2(V) = Σi∈V fi gi = f^T g
Graph Laplacian
To find the geometry of a structure, measure the smoothness of a function
The Laplacian measures what you could call the curvature or stress of the field
Unnormalized Laplacian: (∆f)i = Σj wij (fi − fj)
Difference between f and its local average: fi Σj wij − Σj wij fj
Represented as a positive semi-definite n × n matrix
∆ = D − W, where
W = (wij) and D = diag(Σj≠i wij)
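The construction ∆ = D − W above can be sketched in a few lines of NumPy; the 4-node weighted graph is a hypothetical example, not from the slides.

```python
import numpy as np

# Unnormalized graph Laplacian Delta = D - W for a small hypothetical
# weighted undirected graph (a 4-node path with weights 1, 2, 1).
W = np.array([
    [0.0, 1.0, 0.0, 0.0],
    [1.0, 0.0, 2.0, 0.0],
    [0.0, 2.0, 0.0, 1.0],
    [0.0, 0.0, 1.0, 0.0],
])
D = np.diag(W.sum(axis=1))   # degree matrix, D_ii = sum_j w_ij
L = D - W                    # unnormalized Laplacian

# Rows sum to zero (Delta annihilates constant functions) and all
# eigenvalues are non-negative (positive semi-definiteness).
assert np.allclose(L.sum(axis=1), 0.0)
assert np.all(np.linalg.eigvalsh(L) >= -1e-10)
```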
Smoothness of function
Dirichlet Energy: a measure of how much the function f changes over M ⊂ R^N
||f||²G = (1/2) Σij wij (fi − fj)² = f^T ∆ f (1)
Measures the smoothness of f (how fast it changes locally)
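The two forms of the Dirichlet energy in Eq. (1) can be checked numerically; the 3-node weighted graph and the function f are hypothetical examples.

```python
import numpy as np

# Check that the quadratic form f^T Delta f equals the pairwise sum
# (1/2) sum_ij w_ij (f_i - f_j)^2 on a small hypothetical graph.
W = np.array([
    [0.0, 2.0, 0.0],
    [2.0, 0.0, 1.0],
    [0.0, 1.0, 0.0],
])
L = np.diag(W.sum(axis=1)) - W
f = np.array([0.0, 1.0, 3.0])

quadratic = f @ L @ f
pairwise = 0.5 * sum(
    W[i, j] * (f[i] - f[j]) ** 2 for i in range(3) for j in range(3)
)
assert np.isclose(quadratic, pairwise)

# Smoother functions have lower energy; constants have zero energy.
assert np.isclose(np.ones(3) @ L @ np.ones(3), 0.0)
```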
Riemannian manifolds
Manifold Laplacian
Orthogonal bases on graphs
Find a class of smooth functions
Find the smoothest orthogonal basis:
minφ1 Edir(φ1) s.t. ||φ1|| = 1 (2)
Similarly, find subsequent eigenvectors orthogonal to the previous ones, in order of smoothness
Can be reformulated as:
minΦ∈R^{n×n} trace(Φ^T ∆ Φ) s.t. Φ^T Φ = I (3)
The Laplacian eigenvectors are the solutions to this problem
Laplacian Eigen Vectors
∆ = ΦΛΦ^T (4)
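The eigendecomposition in Eq. (4) and its link to smoothness can be verified directly; the 4-node path graph is a hypothetical example.

```python
import numpy as np

# Laplacian eigendecomposition Delta = Phi Lambda Phi^T. The columns
# of Phi form the smoothest orthogonal basis, ordered by increasing
# Dirichlet energy (np.linalg.eigh returns ascending eigenvalues).
W = np.array([
    [0.0, 1.0, 0.0, 0.0],
    [1.0, 0.0, 1.0, 0.0],
    [0.0, 1.0, 0.0, 1.0],
    [0.0, 0.0, 1.0, 0.0],
])
L = np.diag(W.sum(axis=1)) - W
lam, Phi = np.linalg.eigh(L)

# Orthonormal basis that reconstructs the Laplacian.
assert np.allclose(Phi.T @ Phi, np.eye(4))
assert np.allclose(Phi @ np.diag(lam) @ Phi.T, L)

# The first eigenvector is the constant function (zero energy); each
# eigenvalue equals its eigenvector's Dirichlet energy.
assert np.isclose(lam[0], 0.0)
for k in range(4):
    assert np.isclose(Phi[:, k] @ L @ Phi[:, k], lam[k])
```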
Laplacian Eigen Vectors for Graphs and Manifolds
Fourier Analysis on Euclidean Spaces
Related to the solution of the Dirichlet energy minimization
Fourier Analysis on graphs
Euclidean Conv Basics
Convolution theorem in graphs
Spectral Convolution
Defined by analogy with the Euclidean convolution theorem
Issues with Spectral Graph CNN
Not shift-invariant! (G has no circulant structure)
Filter coefficients depend on basis φ1, ..., φn
Spectral CNN
Convolution expressed in the spectral domain: g = Φ W Φ^T f
W is an n × n diagonal matrix of learnable spectral filter coefficients
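A minimal sketch of the layer g = Φ W Φ^T f, with the learnable diagonal multipliers replaced by a hypothetical fixed low-pass filter and a hypothetical 4-node graph:

```python
import numpy as np

# Spectral convolution: transform f to the Laplacian eigenbasis,
# multiply by diagonal filter coefficients, transform back.
A = np.array([
    [0.0, 1.0, 1.0, 0.0],
    [1.0, 0.0, 1.0, 0.0],
    [1.0, 1.0, 0.0, 1.0],
    [0.0, 0.0, 1.0, 0.0],
])
L = np.diag(A.sum(axis=1)) - A
lam, Phi = np.linalg.eigh(L)

f = np.array([1.0, -2.0, 3.0, 0.5])  # signal on the vertices
w = np.exp(-lam)                      # hypothetical low-pass multipliers in (0, 1]

g = Phi @ np.diag(w) @ Phi.T @ f      # filtered signal

# A low-pass filter cannot increase the Dirichlet energy of the signal.
assert g @ L @ g <= f @ L @ f + 1e-10
```

Note the costs visible even in this sketch: the full eigendecomposition is O(n³) to set up, and each layer applies dense n × n transforms, which is where the O(n²) Fourier cost mentioned on the next slide comes from.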
Issues
Filters are basis-dependent: they do not generalize across graphs
O(n) parameters per layer
O(n²) computation for the forward and inverse Fourier transforms
No guarantee of spatial localization of filters: the spectral multipliers are free to be chosen arbitrarily
Localization and Smoothness
Examples
Graph Pooling
Produce a sequence of coarsened graphs
Max or average pooling of collapsed vertices
Binary tree arrangement of node indices
Limitations
Poor generalization across non-isometric domains unless kernels are localized
Spectral kernels are isotropic due to rotation invariance of the Laplacian
Only undirected graphs, as symmetry of the Laplacian matrix is assumed
Spatial GNNs
Spatial and Spectral link
The higher the power r of the Laplacian in the filter, the richer the filter class
But there is a tradeoff between test-time cost and the power of the filters
Edge decoration
Vertex decoration
Interaction Nets
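The spatial–spectral link can be illustrated with polynomial filters g(∆) = Σr θr ∆^r: since ∆^r only mixes vertices at most r hops apart, a degree-r polynomial is r-localized without any eigendecomposition. The 5-node path graph and the coefficients θr are hypothetical examples.

```python
import numpy as np

# Hypothetical 5-node path graph 0-1-2-3-4.
n = 5
W = np.zeros((n, n))
for i in range(n - 1):
    W[i, i + 1] = W[i + 1, i] = 1.0
L = np.diag(W.sum(axis=1)) - W

# Vertices 0 and 3 are 3 hops apart: Delta^2 cannot connect them,
# while Delta^3 can -- the filter's support grows with r.
assert np.isclose(np.linalg.matrix_power(L, 2)[0, 3], 0.0)
assert not np.isclose(np.linalg.matrix_power(L, 3)[0, 3], 0.0)

# A degree-2 polynomial filter with hypothetical coefficients:
theta = [0.5, -0.3, 0.1]
g = sum(t * np.linalg.matrix_power(L, r) for r, t in enumerate(theta))
f = np.arange(n, dtype=float)
assert (g @ f).shape == (n,)
```

This is the tradeoff named above: larger r gives a richer filter class but costs more matrix-vector products at test time.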
What does a GNN look like on a Euclidean grid?
The graph is a regular lattice
Gives isotropic filters
Less expressive than a conventional ConvNet
No notion of up and down; ConvNets have an implicit ordering of neighbors, which implies edge knowledge
For example, local correlation among pixels / translation makes it easy to reorder shuffled patches of images
Geodesic Polar Coordinates
Aim: develop a convolution-like operator for manifolds
But it is not translation invariant: patches look very different at different points of the manifold
Use some local coordinates
Convolution on Manifolds
Correspondence I: Local Feature Learning
Correspondence II: Labelling
Ground-truth correspondence π : X → Y from query shape X to some reference shape Y (discretized with n vertices)
Correspondence = label each query vertex x as a reference vertex y
Net output at x after a softmax layer = probability distribution on Y
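The labelling formulation above can be sketched with a row-wise softmax; the shapes, sizes, and random scores are stand-ins for the actual network output, not part of the original method.

```python
import numpy as np

# Correspondence as dense labelling: for each of the query vertices,
# the net emits a score over the n reference vertices; a softmax turns
# each row of scores into a probability distribution on Y.
rng = np.random.default_rng(0)
n_query, n_ref = 6, 10                        # hypothetical sizes
scores = rng.normal(size=(n_query, n_ref))    # stand-in for net output

# Numerically stable row-wise softmax.
e = np.exp(scores - scores.max(axis=1, keepdims=True))
probs = e / e.sum(axis=1, keepdims=True)

assert np.allclose(probs.sum(axis=1), 1.0)    # valid distributions
pred = probs.argmax(axis=1)                   # predicted reference vertex per query vertex
assert pred.shape == (n_query,)
```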
Correspondence Results
Matrix Completion
Summary
Spectral vs. Spatial Convolution on Non-Euclidean Domains: Graphs and Manifolds
Spectral is better if the graph is assumed to be similar across samples
Leveraging low-dimensional structure at tangent planes of manifolds for spectral convolution
Applications