
An introduction to optimization on manifolds

min_{x ∈ ℳ} f(x)

Nicolas Boumal, July 2020
Mathematics @ EPFL

Optimization on smooth manifolds

min_x f(x) subject to x ∈ ℳ

• Linear spaces: unconstrained problems; linear equality constraints
• Low rank (matrices, tensors): recommender systems, large-scale Lyapunov equations, …
• Orthonormality (Grassmann, Stiefel, rotations): dictionary learning, SfM, SLAM, PCA, ICA, SBM, electronic structure computation, …
• Positivity (positive definiteness, positive orthant): metric learning, Gaussian mixtures, diffusion tensor imaging, …
• Symmetry (quotient manifolds): invariance under group actions

A Riemannian structure gives us gradients and Hessians

The essential tools of smooth optimization are defined generally on Riemannian manifolds.

Unified theory, broadly applicable algorithms.

First ideas from the ’70s. First practical in the ’90s.

manopt.org
With Bamdev Mishra, P.-A. Absil & R. Sepulchre

Pymanopt: led by J. Townsend, N. Koep & S. Weichwald
Manopt.jl: led by Ronny Bergmann


nicolasboumal.net/book

What is a smooth manifold?

[Figure: three example sets, labeled “Yes”, “Yes”, and “No” according to whether each is a smooth manifold.]

This overview: restricted to embedded submanifolds of linear spaces.


ℳ ⊆ ℰ is an embedded submanifold of codimension k if there exists a smooth function h: ℰ → R^k such that:

ℳ = {x : h(x) = 0} and Dh(x): ℰ → R^k has rank k for all x ∈ ℳ.

E.g., sphere: h(x) = x^T x − 1 and Dh(x)[v] = 2 x^T v.

These properties allow us to linearize the set:

h(x + tv) = h(x) + t Dh(x)[v] + O(t²)

This is O(t²) iff v ∈ ker Dh(x).
→ Tangent spaces: T_xℳ = ker Dh(x) (a subspace of ℰ).
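As a sanity check of these definitions, here is a small numeric sketch for the sphere (the function and variable names are ours, not from the slides):

```python
import numpy as np

# Defining function for the sphere S^{n-1} in R^n: h(x) = x^T x - 1.
def h(x):
    return x @ x - 1.0

# Its differential: Dh(x)[v] = 2 x^T v.
def Dh(x, v):
    return 2.0 * (x @ v)

x = np.array([1.0, 0.0, 0.0])    # a point on the sphere: h(x) = 0
v = np.array([0.0, 2.0, -1.0])   # x^T v = 0, so v is in ker Dh(x) = T_x M
print(h(x) == 0.0, Dh(x, v) == 0.0)  # True True
```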

What is a smooth manifold? (2)

ℳ ⊆ ℰ is an embedded submanifold of codimension k if:

For each x ∈ ℳ there is a neighborhood U of x in ℰ and a smooth function h: U → R^k such that:

a) Dh(x): ℰ → R^k has rank k, and

b) ℳ ∩ U = h⁻¹(0) = {y ∈ U : h(y) = 0}

(Necessary for fixed-rank matrices.)

What is a smooth manifold? (3)


Bootstrapping our set of tools

1. Smooth maps
2. Differentials of smooth maps
3. Vector fields and tangent bundles
4. Retractions
5. Riemannian metrics
6. Riemannian gradients
7. Riemannian connections
8. Riemannian Hessians
9. Riemannian covariant derivatives along curves

Smooth maps between manifolds

With ℳ, ℳ′ embedded in ℰ, ℰ′:

Define: a map F: ℳ → ℳ′ is smooth if there exists a neighborhood U of ℳ in ℰ and a smooth map F̄: U → ℰ′ such that F = F̄|_ℳ.

We call F̄ a smooth extension of F.

Differential of a smooth map

For F̄: ℰ → ℰ′, we have D F̄(x)[v] = lim_{t→0} (F̄(x + tv) − F̄(x)) / t.

For F: ℳ → ℳ′ smooth, F(x + tv) may not be defined. But picking a smooth extension F̄, for v ∈ T_xℳ we say:

Define: D F(x)[v] = lim_{t→0} (F̄(x + tv) − F̄(x)) / t = D F̄(x)[v].

Claim: this does not depend on the choice of F̄.
Claim: we retain linearity, the product rule & the chain rule.


Differential of a smooth map (2)

We defined D F(x) = D F̄(x)|_{T_xℳ}. Equivalently:

Claim: for each v ∈ T_xℳ, there exists a smooth curve c: R → ℳ such that c(0) = x and c′(0) = v.

Define: D F(x)[v] = lim_{t→0} (F(c(t)) − F(x)) / t = (F ∘ c)′(0).

Vector fields and the tangent bundle

A vector field V ties to each x a vector V(x) ∈ T_xℳ.

What does it mean for V to be smooth?

Define: the tangent bundle of ℳ is the disjoint union of all tangent spaces:

Tℳ = {(x, v) : x ∈ ℳ and v ∈ T_xℳ}

Claim: Tℳ is a manifold (embedded in ℰ × ℰ).

V is smooth if it is smooth as a map from ℳ to Tℳ.

Aside: new manifolds from old ones

Given a manifold ℳ, we can create a new manifold by considering its tangent bundle.

Here are other ways to recycle:

• Products of manifolds: ℳ × ℳ′, ℳ^n
• Open subsets of manifolds*
• (Quotienting some equivalence relations.)

*For embedded submanifolds: with the subset topology.

Retractions: moving around

Given (x, v) ∈ Tℳ, we can move away from x along v using any c: R → ℳ with c(0) = x and c′(0) = v.

Retractions are a smooth choice of curves over Tℳ.

Define: a retraction is a smooth map R: Tℳ → ℳ such that if c(t) = R(x, tv) = R_x(tv), then c(0) = x and c′(0) = v.

Retractions: moving around (2)

Define: a retraction is a smooth map R: Tℳ → ℳ such that if c(t) = R(x, tv) = R_x(tv), then c(0) = x and c′(0) = v.

Equivalently: R_x(0) = x and D R_x(0) = Id.

Typical choice: R_x(v) = projection of x + v to ℳ:
• ℳ = R^n: R_x(v) = x + v
• ℳ = S^{n−1}: R_x(v) = (x + v) / ‖x + v‖
• ℳ = R_r^{m×n} (matrices of rank r): R_X(V) = SVD_r(X + V)
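These metric-projection retractions are short to implement. A minimal sketch (helper names are ours, not from any library):

```python
import numpy as np

# Retraction on the sphere: R_x(v) = (x + v) / ||x + v||.
def retract_sphere(x, v):
    y = x + v
    return y / np.linalg.norm(y)

# Retraction on rank-r matrices: R_X(V) = SVD_r(X + V),
# the best rank-r approximation of X + V via truncated SVD.
def retract_fixed_rank(X, V, r):
    U, s, Vt = np.linalg.svd(X + V, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r]

x = np.array([0.0, 0.0, 1.0])
v = np.array([3.0, 4.0, 0.0])                     # tangent: x^T v = 0
y = retract_sphere(x, v)
print(np.isclose(np.linalg.norm(y), 1.0))         # True: R_x(v) lies on the sphere
print(np.allclose(retract_sphere(x, 0 * v), x))   # True: R_x(0) = x
```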

Towards gradients: reminders from R^n

The gradient of a smooth f: R^n → R at x is defined by:

D f(x)[v] = ⟨grad f(x), v⟩ for all v ∈ R^n.

In particular, with the inner product ⟨u, v⟩ = u^T v,

(grad f(x))_i = ⟨grad f(x), e_i⟩ = D f(x)[e_i] = ∂f/∂x_i (x).

Note: grad f is a smooth vector field on R^n.

Riemannian metrics

T_xℳ is a linear space (a subspace of ℰ).

Pick an inner product ⟨·, ·⟩_x for each T_xℳ.

Define: ⟨·, ·⟩_x defines a Riemannian metric on ℳ if for any two smooth vector fields V, W the function x ↦ ⟨V(x), W(x)⟩_x is smooth.

A Riemannian manifold is a manifold with a Riemannian metric.

Riemannian submanifolds

Let ⟨·, ·⟩ be the inner product on ℰ.

Since T_xℳ is a linear subspace of ℰ, one choice is:

⟨u, v⟩_x = ⟨u, v⟩

Claim: this defines a Riemannian metric on ℳ.

With this metric, ℳ is a Riemannian submanifold of ℰ.

Riemannian gradient

Let f: ℳ → R be smooth on a Riemannian manifold.

Define: the Riemannian gradient of f at x is the unique tangent vector grad f(x) at x such that:

D f(x)[v] = ⟨grad f(x), v⟩_x for all v ∈ T_xℳ.

Claim: grad f is a smooth vector field.
Claim: if x is a local optimum of f, then grad f(x) = 0.

Gradients on Riemannian submanifolds

Let f̄ be a smooth extension of f. For all v ∈ T_xℳ:

⟨grad f(x), v⟩_x = D f(x)[v] = D f̄(x)[v] = ⟨grad f̄(x), v⟩

Assume ℳ is a Riemannian submanifold of ℰ. Since ⟨·, ·⟩_x = ⟨·, ·⟩, by uniqueness we conclude:

grad f(x) = Proj_x(grad f̄(x)),

where Proj_x is the orthogonal projector from ℰ to T_xℳ.
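For a Riemannian submanifold this projection recipe is one line of code. A sketch on the sphere with f(x) = ½ x^T A x, so that grad f̄(x) = Ax (names are ours):

```python
import numpy as np

# Orthogonal projector onto T_x S^{n-1}: Proj_x(u) = u - (x^T u) x.
def proj(x, u):
    return u - (x @ u) * x

# Riemannian gradient of f(x) = 0.5 x^T A x on the sphere:
# project the Euclidean gradient A x onto the tangent space.
def rgrad(A, x):
    return proj(x, A @ x)

A = np.diag([3.0, 1.0, 0.5])
x = np.array([1.0, 0.0, 0.0])            # a unit-norm eigenvector of A
print(np.allclose(rgrad(A, x), 0.0))     # True: eigenvectors are critical points

y = np.array([0.6, 0.8, 0.0])            # another point on the sphere
print(np.isclose(y @ rgrad(A, y), 0.0))  # True: the gradient is tangent at y
```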

A first algorithm: gradient descent

For f: R^n → R: x_{k+1} = x_k − α_k grad f(x_k)

For f: ℳ → R: x_{k+1} = R_{x_k}(−α_k grad f(x_k))

For the analysis, we need to understand f(x_{k+1}):

The composition f ∘ R_x : T_xℳ → R is defined on a linear space, hence we may Taylor expand it.

A first algorithm: gradient descent (2)

x_{k+1} = R_{x_k}(−α_k grad f(x_k))

The composition f ∘ R_x : T_xℳ → R is defined on a linear space, hence we may Taylor expand it:

f(R_x(v)) = f(R_x(0)) + ⟨grad(f ∘ R_x)(0), v⟩_x + O(‖v‖_x²)
          = f(x) + ⟨grad f(x), v⟩_x + O(‖v‖_x²)

Indeed: D(f ∘ R_x)(0)[v] = D f(R_x(0))[D R_x(0)[v]] = D f(x)[v].

Gradient descent on ℳ

A1: f(x) ≥ f_low for all x ∈ ℳ.
A2: f(R_x(v)) ≤ f(x) + ⟨v, grad f(x)⟩ + (L/2) ‖v‖² for all (x, v) ∈ Tℳ.

Algorithm: x_{k+1} = R_{x_k}(−(1/L) grad f(x_k))

Complexity: ‖grad f(x_k)‖ ≤ ε for some k ≤ 2L (f(x_0) − f_low) / ε².

Indeed, A2 with v = −(1/L) grad f(x_k) gives:

f(x_{k+1}) ≤ f(x_k) − (1/L) ‖grad f(x_k)‖² + (1/(2L)) ‖grad f(x_k)‖²
⇒ f(x_k) − f(x_{k+1}) ≥ (1/(2L)) ‖grad f(x_k)‖²

If ‖grad f(x_k)‖ > ε for all k < K (for contradiction), A1 gives:

f(x_0) − f_low ≥ Σ_{k=0}^{K−1} (f(x_k) − f(x_{k+1})) > K ε² / (2L).
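The scheme x_{k+1} = R_{x_k}(−α_k grad f(x_k)) is easy to run on the sphere for f(x) = ½ x^T A x, whose minimizers are the bottom eigenvectors of A. A sketch with a fixed stepsize (names are ours):

```python
import numpy as np

def rgd_sphere(A, x0, alpha=0.1, iters=500):
    """Riemannian gradient descent for f(x) = 0.5 x^T A x on the sphere."""
    x = x0 / np.linalg.norm(x0)
    for _ in range(iters):
        g = A @ x - (x @ (A @ x)) * x     # grad f(x) = Proj_x(A x)
        y = x - alpha * g                  # step along -grad f(x) in T_x M
        x = y / np.linalg.norm(y)          # retraction: normalize back to the sphere
    return x

A = np.diag([5.0, 2.0, 1.0])
x = rgd_sphere(A, np.array([1.0, 1.0, 1.0]))
print(np.isclose(abs(x[2]), 1.0))    # True: converged to the bottom eigenvector
```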

Tips and tricks to get the gradient

Use the definition as a starting point: D f(x)[v] = ⋯

Cheap gradient principle: computing the gradient costs only a small multiple of the cost of evaluating f.

Numerical check of the gradient: Taylor expand t ↦ f(R_x(tv)).
In Manopt: checkgradient(problem)

Automatic differentiation: Python, Julia. Not Matlab :/
This being said, for theory, one often needs to manipulate the gradient “on paper” anyway.
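The Taylor-based check behind checkgradient can be sketched by hand: the residual f(R_x(tv)) − f(x) − t⟨grad f(x), v⟩ should shrink like t², so halving t should divide it by roughly 4. A minimal sketch on the sphere (all names are ours):

```python
import numpy as np

A = np.diag([3.0, 1.0, 0.5])
x = np.array([0.6, 0.8, 0.0])                # a point on the sphere
v = np.array([-0.8, 0.6, 1.0])               # tangent: x^T v = 0

f = lambda y: 0.5 * y @ A @ y
g = A @ x - (x @ (A @ x)) * x                # grad f(x) = Proj_x(A x)

def taylor_residual(t):
    y = (x + t * v) / np.linalg.norm(x + t * v)   # R_x(t v)
    return abs(f(y) - f(x) - t * (g @ v))

ratio = taylor_residual(1e-3) / taylor_residual(5e-4)
print(round(ratio))    # 4: second-order decay, so the gradient is correct
```

If the gradient were wrong, the residual would only decay like t and the ratio would hover near 2 instead.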

On to second-order methods

Consider f: R^n → R smooth. Taylor says:

f(x + v) ≈ f(x) + ⟨grad f(x), v⟩ + ½ ⟨v, Hess f(x)[v]⟩

If Hess f(x) ≻ 0, the quadratic model is minimized for v such that:

Hess f(x)[v] = −grad f(x)

From there, we can construct Newton’s method, etc.

Towards Hessians: reminders from R^n

Consider f: R^n → R smooth.

The Hessian of f at x is a linear operator which tells us how the gradient vector field of f varies:

Hess f(x)[v] = D(grad f)(x)[v]

With ⟨u, v⟩ = u^T v, this yields: (Hess f(x))_{ij} = ∂²f / ∂x_i ∂x_j (x).

Notice that Hess f(x)[v] is a vector in R^n.

A difficulty on manifolds

A smooth vector field V on ℳ is a smooth map: we already have a notion of how to differentiate it.

Example: with f(x) = ½ x^T A x on the sphere,

V(x) = grad f(x) = Proj_x(Ax) = Ax − (x^T A x) x

D V(x)[u] = ⋯ = Proj_x(Au) − (x^T A x) u − (x^T A u) x

Issue: D V(x)[u] may not be a tangent vector at x!
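A quick finite-difference experiment makes the issue concrete: differentiating V(x) = grad f(x) in the ambient sense produces a vector with a nonzero component along x (a sketch; names are ours):

```python
import numpy as np

A = np.diag([3.0, 1.0, 0.5])
V = lambda x: A @ x - (x @ (A @ x)) * x     # V(x) = grad f(x) on the sphere

x = np.array([0.6, 0.8, 0.0])
u = np.array([-0.8, 0.6, 0.0])              # tangent: x^T u = 0
t = 1e-6
DVu = (V(x + t * u) - V(x)) / t             # ambient directional derivative D V(x)[u]

print(abs(x @ DVu) > 0.5)   # True: D V(x)[u] has a sizable normal component
```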

Connections: a tool to differentiate vector fields

Let 𝒳(ℳ) be the set of smooth vector fields on ℳ.
Given U ∈ 𝒳(ℳ) and smooth f, define (Uf)(x) = D f(x)[U(x)].

A map ∇: 𝒳(ℳ) × 𝒳(ℳ) → 𝒳(ℳ), (U, V) ↦ ∇_U V, is a connection if:
1. ∇_{fU + gW} V = f ∇_U V + g ∇_W V
2. ∇_U (aV + bW) = a ∇_U V + b ∇_U W
3. ∇_U (fV) = (Uf) V + f ∇_U V

Example: for ℳ = ℰ, (∇_U V)(x) = D V(x)[U(x)].
Example: for ℳ ⊆ ℰ, (∇_U V)(x) = Proj_x(D V(x)[U(x)]).

Riemannian connections: a unique and favorable choice

Let ℳ be a Riemannian manifold.
Claim: there exists a unique connection ∇ on ℳ such that:
4. (∇_U V − ∇_V U) f = U(Vf) − V(Uf)
5. U ⟨V, W⟩ = ⟨∇_U V, W⟩ + ⟨V, ∇_U W⟩

It is called the Riemannian connection (Levi-Civita).

Claim: if ℳ is a Riemannian submanifold of ℰ, then

(∇_U V)(x) = Proj_x(D V(x)[U(x)])

is the Riemannian connection on ℳ.

Riemannian Hessians

Claim: (∇_U V)(x) depends on U only through U(x).
This justifies the notation ∇_u V for u ∈ T_xℳ; e.g.: ∇_u V = Proj_x(D V(x)[u]).

Define: the Riemannian Hessian of f: ℳ → R at x is the linear operator from T_xℳ to T_xℳ defined by:

Hess f(x)[u] = ∇_u grad f,

where ∇ is the Riemannian connection.

Claim: Hess f(x) is self-adjoint.
Claim: if x is a local minimum, then Hess f(x) ⪰ 0.

Hessians on Riemannian submanifolds

Hess f(x)[u] = ∇_u grad f

On a Riemannian submanifold of a linear space,

∇_u V = Proj_x(D V(x)[u])

Combining:

Hess f(x)[u] = Proj_x(D(grad f)(x)[u])

Hessians on Riemannian submanifolds (2)

Hess f(x)[u] = Proj_x(D(grad f)(x)[u])

Example: f(x) = ½ x^T A x on the sphere in R^n.

V(x) = grad f(x) = Proj_x(Ax) = Ax − (x^T A x) x

D V(x)[u] = Proj_x(Au) − (x^T A x) u − (x^T A u) x

Hess f(x)[u] = Proj_x(Au − (x^T A x) u)

Remarkably, grad f(x) = 0 and Hess f(x) ⪰ 0 iff x is optimal.
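This formula gives a cheap Hessian-vector product; at an eigenvector of A it reproduces the eigenvalue gaps. A sketch (names are ours):

```python
import numpy as np

def proj(x, u):
    return u - (x @ u) * x

# Hess f(x)[u] = Proj_x(A u - (x^T A x) u) for f(x) = 0.5 x^T A x on the sphere.
def hess(A, x, u):
    return proj(x, A @ u - (x @ (A @ x)) * u)

A = np.diag([5.0, 2.0, 1.0])
x = np.array([0.0, 0.0, 1.0])      # bottom eigenvector: a critical point
u = np.array([1.0, 0.0, 0.0])      # a tangent direction
print(hess(A, x, u))               # [4. 0. 0.] = (5 - 1) u: positive curvature

# Self-adjointness on the tangent space: <Hess u, w> = <u, Hess w>.
w = np.array([0.0, 1.0, 0.0])
print(np.isclose(hess(A, x, u) @ w, u @ hess(A, x, w)))   # True
```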

Newton, Taylor and Riemann

Now that we have a Hessian, we might guess:

f(R_x(v)) ≈ m_x(v) = f(x) + ⟨grad f(x), v⟩_x + ½ ⟨v, Hess f(x)[v]⟩_x

If Hess f(x) is invertible, m_x: T_xℳ → R has one critical point, the solution of this linear system:

Hess f(x)[v] = −grad f(x) for v ∈ T_xℳ.

Newton’s method on ℳ: x_next = R_x(v).

Claim: if Hess f(x*) ≻ 0, quadratic local convergence.
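One Newton step on the sphere for f(x) = ½ x^T A x can be sketched by solving the tangent linear system through a bordered system that enforces v ⊥ x (this particular solve and all names are ours, not from the slides):

```python
import numpy as np

def newton_step(A, x):
    """One Riemannian Newton step for f(x) = 0.5 x^T A x on the sphere."""
    n = len(x)
    lam = x @ A @ x
    g = A @ x - lam * x                       # grad f(x)
    # Hess f(x)[v] = Proj_x((A - lam I) v) for tangent v. Solve
    # Hess f(x)[v] = -g together with x^T v = 0 as one bordered linear system.
    K = np.block([[A - lam * np.eye(n), x[:, None]],
                  [x[None, :], np.zeros((1, 1))]])
    v = np.linalg.solve(K, np.concatenate([-g, [0.0]]))[:n]
    return (x + v) / np.linalg.norm(x + v)    # retract back to the sphere

A = np.diag([5.0, 2.0, 1.0])
x = np.array([0.3, -0.2, 1.0]); x /= np.linalg.norm(x)
for _ in range(4):
    x = newton_step(A, x)
print(np.isclose(abs(x[2]), 1.0))   # True: fast local convergence to an eigenvector
```

Here each iterate is an approximate eigenvector of A, so the scheme behaves much like Rayleigh quotient iteration.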

We need one more tool…

The truncated expansion

f(R_x(v)) ≈ f(x) + ⟨grad f(x), v⟩_x + ½ ⟨v, Hess f(x)[v]⟩_x

is not quite correct (well, not always…).

To see why, let’s expand f along some smooth curve.

We need one more tool… (2)

Let’s expand f along some smooth curve.

With c: R → ℳ such that c(0) = x and c′(0) = v, the composition g = f ∘ c maps R → R, so Taylor holds:

g(t) ≈ g(0) + t g′(0) + (t²/2) g′′(0)

g(0) = f(x)
g′(t) = D f(c(t))[c′(t)] = ⟨grad f(c(t)), c′(t)⟩_{c(t)}
g′(0) = ⟨grad f(x), v⟩_x
g′′(0) = ???

Differentiating vector fields along curves

A map Z: R → Tℳ is a vector field along c: R → ℳ if Z(t) is a tangent vector at c(t). Let 𝒳(c) denote the set of smooth such fields.

Claim: there exists a unique operator D/dt : 𝒳(c) → 𝒳(c) such that:
1. D/dt (aY + bZ) = a D/dt Y + b D/dt Z
2. D/dt (gZ) = g′ Z + g D/dt Z
3. D/dt (U(c(t))) = ∇_{c′(t)} U
4. d/dt ⟨Y(t), Z(t)⟩_{c(t)} = ⟨D/dt Y(t), Z(t)⟩_{c(t)} + ⟨Y(t), D/dt Z(t)⟩_{c(t)}

where ∇ is the Riemannian connection.

Differentiating fields along curves (2)

Claim: if ℳ is a Riemannian submanifold of ℰ, then

D/dt Z(t) = Proj_{c(t)}(d/dt Z(t)),

where d/dt is the usual derivative in the embedding space ℰ.

With g(t) = f(c(t)) and g′(t) = ⟨grad f(c(t)), c′(t)⟩_{c(t)}:

g′′(t) = ⟨D/dt grad f(c(t)), c′(t)⟩_{c(t)} + ⟨grad f(c(t)), D/dt c′(t)⟩_{c(t)}
       = ⟨∇_{c′(t)} grad f, c′(t)⟩_{c(t)} + ⟨grad f(c(t)), c′′(t)⟩_{c(t)}

g′′(0) = ⟨Hess f(x)[v], v⟩_x + ⟨grad f(x), c′′(0)⟩_x

Plugging g′′(0) into the Taylor expansion of g = f ∘ c:

f(c(t)) = f(x) + t ⟨grad f(x), v⟩_x + (t²/2) ⟨Hess f(x)[v], v⟩_x + (t²/2) ⟨grad f(x), c′′(0)⟩_x + O(t³)

The annoying term vanishes at critical points and for special curves (special retractions). Mostly fine for optimization.

Trust-region method: Newton’s with a safeguard

With the same tools, we can design a Riemannian trust-region method: RTR (Absil, Baker & Gallivan ’07).

It approximately minimizes a quadratic model of f ∘ R_{x_k} restricted to a ball in T_{x_k}ℳ, with a dynamic radius.

Its complexity is known, and it performs excellently, also with an approximate Hessian (e.g., finite differences of grad f).

In Manopt, call trustregions(problem).

In the lecture notes:

• Proofs for all claims in these slides
• References to the (growing) literature
• Short descriptions of applications
• Fully worked out manifolds
  E.g.: Stiefel, fixed-rank matrices, general {x ∈ ℰ : h(x) = 0}
• Details about computation, pitfalls and tricks
  E.g.: how to compute gradients, checkgradient, checkhessian
• Theory for general manifolds
• Theory for quotient manifolds
• Discussion of more advanced geometric tools
  E.g.: distance, exp, log, transports, Lipschitz, finite differences
• Basics of geodesic convexity

nicolasboumal.net/book
