
Polynomial Regression on Riemannian Manifolds

Jacob Hinkle, Tom Fletcher, Sarang Joshi

May 11, 2012

arXiv:1201.2395

Nonparametric Regression

Number of parameters tied to amount of data present

Example: kernel regression on images using diffeomorphisms (Davis 2007)


Parametric Regression

Small number of parameters can be estimated more efficiently

Fletcher 2011

Geodesic regression (Niethammer 2011, Fletcher 2011) has recently received attention.


Polynomial Regression

[Figure: example of polynomial regression of a scalar dependent variable against an independent variable]

Polynomials provide a more flexible framework for parametric regression on Riemannian manifolds


Riemannian Polynomials

At least three ways to define a polynomial in R^d:

Algebraic: γ(t) = c_0 + (1/1!) c_1 t + (1/2!) c_2 t^2 + ⋯ + (1/k!) c_k t^k

Variational: γ = argmin_φ ∫_0^T |(d/dt)^{(k+1)/2} φ(t)|^2 dt, subject to boundary/initial conditions

Differential: (d/dt)^{k+1} γ(t) = 0, subject to initial conditions (d/dt)^i γ(0) = c_i

Covariant derivative: replace d/dt of vectors with ∇_γ̇



Geodesic (k = 1) has both forms:

γ = argmin_φ ∫_0^T |φ̇(t)|^2 dt

∇_γ̇ γ̇ = 0, subject to initial conditions γ(0), γ̇(0)

Well studied (Fletcher, Younes, Trouvé, …)


Cubic splines satisfy (Noakes 1989; Leite; Machado; …):

γ = argmin_φ ∫_0^T |∇_φ̇ φ̇(t)|^2 dt

Euler–Lagrange equation: (∇_γ̇)^3 γ̇ = R(γ̇, ∇_γ̇ γ̇) γ̇

Shape splines (Trouvé–Vialard)


A k-th order polynomial satisfies

(∇_γ̇)^k γ̇ = 0

subject to initial conditions γ(0), (∇_γ̇)^i γ̇(0), i = 0, …, k − 1.

Introduced via rolling maps by Jupp & Kent (1987); studied by Leite (2008) in the rolling-map setting.


Rolling maps

Leite 2008

Unroll a curve α on the manifold to a curve α_dev in R^d without twisting or slipping. Then

(∇_α̇)^k α̇ = 0  ⟺  (d/dt)^k α̇_dev = 0

Unknown whether this satisfies a variational principle


Riemannian Polynomials

Generate via forward evolution of a linearized system of first-order covariant ODEs.

Forward Polynomial Evolution:

repeat
  w ← v_1
  for i = 1, …, k − 1 do
    v_i ← ParallelTransport_γ(Δt w, v_i + Δt v_{i+1})
  end for
  v_k ← ParallelTransport_γ(Δt w, v_k)
  γ ← Exp_γ(Δt w)
  t ← t + Δt
until t = T

Parametrized by initial conditions: γ(0) position, v_1(0) velocity, v_2(0) acceleration, v_3(0) jerk.
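To make this evolution concrete, here is a minimal numerical sketch on the unit sphere S^2, where the exponential map and parallel transport have closed forms. The helper names (sphere_exp, sphere_partrans, forward_polynomial), the explicit Euler step, and the example initial conditions are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def sphere_exp(p, v):
    """Exponential map on the unit sphere: follow the geodesic from p with initial velocity v."""
    theta = np.linalg.norm(v)
    if theta < 1e-12:
        return p
    return np.cos(theta) * p + np.sin(theta) * v / theta

def sphere_partrans(p, v, u):
    """Parallel transport of the tangent vector u from p to Exp_p(v) along that geodesic."""
    theta = np.linalg.norm(v)
    if theta < 1e-12:
        return u
    vdir = v / theta
    c = np.dot(u, vdir)
    # the component along the geodesic rotates in the (p, vdir) plane; the orthogonal part is unchanged
    return u + c * ((np.cos(theta) - 1.0) * vdir - np.sin(theta) * p)

def forward_polynomial(p0, vs0, T=1.0, n_steps=1000):
    """Forward evolution of a k-th order Riemannian polynomial on S^2.
    p0: initial point; vs0: list [v1(0), ..., vk(0)] of initial tangent vectors at p0."""
    dt = T / n_steps
    gamma, vs = p0.copy(), [v.copy() for v in vs0]
    k = len(vs)
    curve = [gamma.copy()]
    for _ in range(n_steps):
        step = dt * vs[0]                       # w <- v1, then take the step dt*w
        new_vs = [sphere_partrans(gamma, step, vs[i] + dt * vs[i + 1]) for i in range(k - 1)]
        new_vs.append(sphere_partrans(gamma, step, vs[k - 1]))
        gamma = sphere_exp(gamma, step)
        vs = new_vs
        curve.append(gamma.copy())
    return np.array(curve)

# Example: a quadratic (k = 2) from the north pole with non-collinear velocity and acceleration
p0 = np.array([0.0, 0.0, 1.0])
curve = forward_polynomial(p0, [np.array([1.0, 0.0, 0.0]), np.array([0.0, 0.5, 0.0])], T=2.0)
print(curve[-1])
```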


Polynomial Regression

(∇_γ̇)^k γ̇ = 0 becomes the linearized system

γ̇ = v_1
∇_γ̇ v_i = v_{i+1},  i = 1, …, k − 1
∇_γ̇ v_k = 0.

We want to find initial conditions for this ODE that minimize

E(γ) = ∑_{i=1}^N g_i(γ(t_i))
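With the typical choice g_i(γ) = d(γ, y_i)^2 (the squared geodesic distance, used later for the gradient), the objective can be evaluated on a discretely sampled curve. A small sketch on S^2; the nearest-sample lookup and the function names are assumptions.

```python
import numpy as np

def sphere_dist(p, q):
    """Geodesic distance on the unit sphere."""
    return np.arccos(np.clip(np.dot(p, q), -1.0, 1.0))

def regression_energy(curve, T, ts, ys):
    """E(gamma) = sum_i d(gamma(t_i), y_i)^2 for a curve sampled at evenly spaced times in [0, T]."""
    n = len(curve) - 1
    total = 0.0
    for ti, yi in zip(ts, ys):
        idx = int(round(ti / T * n))        # nearest stored sample to the data time t_i
        total += sphere_dist(curve[idx], yi) ** 2
    return total

# Tiny example: a two-sample "curve" and two observations lying exactly on it
ts = [0.0, 2.0]
ys = [np.array([0.0, 0.0, 1.0]), np.array([1.0, 0.0, 0.0])]
curve = np.array([[0.0, 0.0, 1.0], [1.0, 0.0, 0.0]])
print(regression_energy(curve, T=2.0, ts=ts, ys=ys))   # 0.0
```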


Introduce Lagrange multiplier (adjoint) vector fields λ_i along γ:

E*(γ, {v_i}, {λ_i}) = ∑_{i=1}^N g_i(γ(t_i)) + ∫_0^T ⟨λ_0, γ̇ − v_1⟩ dt + ∑_{i=1}^{k−1} ∫_0^T ⟨λ_i, ∇_γ̇ v_i − v_{i+1}⟩ dt + ∫_0^T ⟨λ_k, ∇_γ̇ v_k⟩ dt

The Euler–Lagrange equations for {λ_i} give back the forward system. Vector field integration by parts:

∫_0^T ⟨λ_i, ∇_γ̇ v_i⟩ dt = [⟨λ_i, v_i⟩]_0^T − ∫_0^T ⟨∇_γ̇ λ_i, v_i⟩ dt


Rewrite using integration by parts

E*(γ, {v_i}, {λ_i}) = ∑_{i=1}^N g_i(γ(t_i)) + ∫_0^T ⟨λ_0, γ̇ − v_1⟩ dt
  + ∑_{i=1}^{k−1} [⟨λ_i, v_i⟩]_0^T − ∑_{i=1}^{k−1} ∫_0^T ⟨∇_γ̇ λ_i, v_i⟩ dt − ∑_{i=1}^{k−1} ∫_0^T ⟨λ_i, v_{i+1}⟩ dt
  + [⟨λ_k, v_k⟩]_0^T − ∫_0^T ⟨∇_γ̇ λ_k, v_k⟩ dt

So variation with respect to {v_i} gives

δ_{v_i} E* = 0 = −∇_γ̇ λ_i − λ_{i−1}
δ_{v_i(T)} E* = 0 = λ_i(T)
δ_{v_i(0)} E* = −λ_i(0)


Variation with respect to the curve γ: let {γ_s : s ∈ (−ε, ε)} be a smooth family of curves, with

γ_0 = γ
W(t) := (d/ds) γ_s(t)|_{s=0}

Extend v_i, λ_i away from the curve via parallel transport:

∇_W v_i = 0
∇_W λ_i = 0

Then

∫_0^T ⟨δ_γ E*(γ, {v_i}, {λ_i}), W⟩ dt = (d/ds) E*(γ_s, {v_i}, {λ_i})|_{s=0}


For any smooth family of curves γ_s(t), we have

[(d/ds) γ_s(t), (d/dt) γ_s(t)] = [W, γ̇_s] = 0,

so

∇_W γ̇ = ∇_γ̇ W.

We also need the Leibniz rule

(d/ds) ⟨X, Y⟩|_{s=0} = ⟨∇_W X, Y⟩ + ⟨X, ∇_W Y⟩,

and the Riemann curvature tensor

R(X, Y) Z = ∇_X ∇_Y Z − ∇_Y ∇_X Z − ∇_{[X,Y]} Z
∇_W ∇_γ̇ Z = ∇_γ̇ ∇_W Z + R(W, γ̇) Z


For the first term, T_1 = ∫_0^T ⟨λ_0, γ̇_s⟩ dt:

(d/ds) T_1(γ_s)|_{s=0} = (d/ds) ∫_0^T ⟨λ_0, γ̇_s⟩ dt |_{s=0}
  = ∫_0^T ⟨∇_W λ_0, γ̇_s⟩ + ⟨λ_0, ∇_W γ̇_s⟩ dt |_{s=0}
  = ∫_0^T ⟨0, γ̇_s⟩ + ⟨λ_0, ∇_γ̇ W⟩ dt |_{s=0}
  = [⟨λ_0, W⟩]_0^T − ∫_0^T ⟨∇_γ̇ λ_0, W⟩ dt

The variation of this term with respect to γ is

δ_{γ(t)} T_1 = −∇_γ̇ λ_0
δ_{γ(T)} T_1 = 0 = λ_0(T)
δ_{γ(0)} T_1 = −λ_0(0)


Now do the same with another term, T_i = ∫_0^T ⟨λ_i, ∇_γ̇ v_i⟩ dt:

(d/ds) T_i(γ_s)|_{s=0} = (d/ds) ∫_0^T ⟨λ_i, ∇_γ̇ v_i⟩ dt
  = ∫_0^T ⟨∇_W λ_i, ∇_γ̇ v_i⟩ + ⟨λ_i, ∇_W ∇_γ̇ v_i⟩ dt
  = 0 + ∫_0^T ⟨λ_i, ∇_γ̇ ∇_W v_i + R(W, γ̇) v_i⟩ dt
  = 0 + ∫_0^T ⟨R(λ_i, v_i) γ̇, W⟩ dt,

where we used the Bianchi identities to rearrange the curvature term. So

δ_{γ(t)} T_i = R(λ_i, v_i) γ̇


Combine all terms to get the adjoint equations

∇_γ̇ λ_0 = ∑_{i=1}^N δ(t − t_i) grad g_i(γ(t)) + ∑_{i=1}^k R(λ_i, v_i) v_1
∇_γ̇ λ_i = −λ_{i−1}

The initialization for λ_i at t = T is

λ_i(T) = 0.

The parameter gradients are

δ_{γ(0)} E = −λ_0(0)
δ_{v_i(0)} E = −λ_i(0)

Typically, g_i(γ) = d(γ, y_i)^2, so that

grad g_i(γ) = −Log_γ y_i


Polynomial Regression

Algorithm:

repeat
  Integrate γ, {v_i} forward to t = T
  Initialize λ_i(T) = 0, i = 0, …, k
  Integrate {λ_i} via the adjoint equations back to t = 0
  Gradient descent step:
    γ(0)^{n+1} = Exp_{γ(0)^n}(ε λ_0(0))
    v_i(0)^{n+1} = ParTrans_{γ(0)^n}(ε λ_0(0), v_i(0)^n + ε λ_i(0))
until convergence
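For the geodesic case (k = 1) on the unit sphere S^2, the whole loop fits in a short sketch. The discretization (explicit Euler sweeps, jump handling at the data times, a fixed step size ε), the initialization, and all function names are illustrative assumptions rather than the authors' implementation; the curvature term uses the sign convention R(X, Y)Z = ∇_X ∇_Y Z − ∇_Y ∇_X Z − ∇_{[X,Y]} Z from the derivation above.

```python
import numpy as np

def sphere_exp(p, v):
    theta = np.linalg.norm(v)
    return p if theta < 1e-12 else np.cos(theta) * p + np.sin(theta) * v / theta

def sphere_partrans(p, v, u):
    theta = np.linalg.norm(v)
    if theta < 1e-12:
        return u
    vdir = v / theta
    c = np.dot(u, vdir)
    return u + c * ((np.cos(theta) - 1.0) * vdir - np.sin(theta) * p)

def sphere_log(p, q):
    c = np.clip(np.dot(p, q), -1.0, 1.0)
    theta = np.arccos(c)
    if theta < 1e-12:
        return np.zeros(3)
    u = q - c * p
    return theta * u / np.linalg.norm(u)

def curvature(x, y, z):
    # R(X, Y)Z on the unit sphere (curvature +1) with the convention above
    return np.dot(y, z) * x - np.dot(x, z) * y

def geodesic_regression(ts, ys, T=1.0, n_steps=200, n_iter=500, eps=0.05):
    p = ys[0].copy()                            # crude initialization at the first observation
    v = sphere_log(ys[0], ys[-1]) / T           # aim toward the last observation
    dt = T / n_steps
    for _ in range(n_iter):
        # forward sweep: store gamma and v1 at every step
        gam, vel = [p], [v]
        for _ in range(n_steps):
            step = dt * vel[-1]
            vel.append(sphere_partrans(gam[-1], step, vel[-1]))
            gam.append(sphere_exp(gam[-1], step))
        # backward adjoint sweep with lam0(T) = lam1(T) = 0
        lam0, lam1 = np.zeros(3), np.zeros(3)
        for n in range(n_steps, 0, -1):
            t = n * dt
            for ti, yi in zip(ts, ys):          # delta terms: grad g_i = -Log_gamma(t_i) y_i
                if t - dt / 2 < ti <= t + dt / 2:
                    lam0 = lam0 + sphere_log(gam[n], yi)
            new_lam0 = lam0 - dt * curvature(lam1, vel[n], vel[n])
            new_lam1 = lam1 + dt * lam0
            back = sphere_log(gam[n], gam[n - 1])   # transport adjoints back one step along gamma
            lam0 = sphere_partrans(gam[n], back, new_lam0)
            lam1 = sphere_partrans(gam[n], back, new_lam1)
        for ti, yi in zip(ts, ys):              # data term at t = 0, if any
            if ti <= dt / 2:
                lam0 = lam0 + sphere_log(gam[0], yi)
        # gradient descent step on the initial conditions
        new_p = sphere_exp(p, eps * lam0)
        v = sphere_partrans(p, eps * lam0, v + eps * lam1)
        p = new_p
    return p, v

ts = [0.0, 0.3, 0.6, 1.0]
ys = [np.array([0.0, 0.1, 1.0]), np.array([0.4, 0.0, 0.9]),
      np.array([0.7, -0.1, 0.7]), np.array([1.0, 0.0, 0.2])]
ys = [y / np.linalg.norm(y) for y in ys]
print(geodesic_regression(ts, ys))
```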


Special Case: Geodesic (k = 1)

The adjoint system is

∇_γ̇ λ_0 = ∑_{i=1}^N δ(t − t_i) grad g_i(γ(t)) + R(λ_1, v_1) v_1
∇_γ̇ λ_1 = −λ_0

Between data points this is

(∇_γ̇)^2 λ_1 = −R(λ_1, γ̇) γ̇

This is the Jacobi equation; λ_1 is a Jacobi field.


Kendall Shape Space

Space of N landmarks in d dimensions, R^{Nd}, modulo translation, scale, and rotation

Prevents skewed statistics due to similarity-transformed data

For d = 2 this is complex projective space CP^{N−2}


Kendall Shape Space Geometry (d = 2)

Center the point set and scale so that ∑_{i=1}^N |x_i|^2 = 1 (the resulting object is called a preshape)

Preshapes lie on the sphere S^{2N−1}, represented as vectors in (R^2)^N = C^N

Riemannian submersion from preshape space to shape space: the vertical direction holds rotations of R^2

Exponential and log maps are available in closed form (for d = 2)
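A minimal sketch of the preshape construction for d = 2, identifying R^2 with C; the function name and the example triangle are illustrative.

```python
import numpy as np

def to_preshape(points):
    """Map an (N, 2) landmark configuration to its preshape: center, scale to unit norm,
    and return a complex N-vector, i.e. a point on the sphere S^{2N-1}."""
    z = points[:, 0] + 1j * points[:, 1]    # identify R^2 with C
    z = z - z.mean()                        # remove translation
    return z / np.linalg.norm(z)            # remove scale: sum_i |x_i|^2 = 1

# Two triangles differing only by a similarity transform map to preshapes that differ
# only by a unit complex factor (the rotation fiber), i.e. the same Kendall shape.
tri = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 2.0]])
sim = 3.0 * tri @ np.array([[0.0, -1.0], [1.0, 0.0]]) + np.array([5.0, -2.0])
z1, z2 = to_preshape(tri), to_preshape(sim)
print(np.abs(np.vdot(z1, z2)))   # = 1 up to rounding
```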


Covariant derivative in shape space in terms of the preshape (O'Neill 1966):

∇*_{X*} Y* = H ∇_X Y

The vertical direction is spanned by JN, where N is the outward unit normal at the preshape and J is the almost complex structure on C^N.

So: parallel transport in small steps in the upstairs (preshape) space, then apply the horizontal projection.


The curvature of the preshape sphere S^{2N−1} is

R(X, Y) Z = ⟨X, Z⟩ Y − ⟨Y, Z⟩ X

For the shape-space curvature we need the O'Neill tensor A. For horizontal vector fields X, Y,

A_X Y = (1/2) V[X, Y]

The curvature downstairs is

⟨R*(X*, Y*) Z*, H⟩ = ⟨R(X, Y) Z, H⟩ + 2 ⟨A_X Y, A_Z H⟩ − ⟨A_Y Z, A_X H⟩ − ⟨A_Z X, A_Y H⟩


The O'Neill tensor here is

A_X Y = ⟨X, JY⟩ JN

Adjoint of A_Z:

⟨A_X Y, A_Z H⟩ = ⟨−J ⟨X, JY⟩ Z, H⟩

The curvature then is

R*(X*, Y*) Z* = R(X, Y) Z − 2 J ⟨X, JY⟩ Z + J ⟨Y, JZ⟩ X + J ⟨Z, JX⟩ Y
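This curvature formula can be transcribed into code directly once J is taken to be multiplication by i on C^N. A small sketch; ip (the real inner product on C^N ≅ R^{2N}) and the function names are assumptions.

```python
import numpy as np

def ip(x, y):
    """Real inner product on C^N viewed as R^{2N}."""
    return np.real(np.vdot(x, y))

def J(x):
    """Almost complex structure on C^N: multiplication by i."""
    return 1j * x

def R_preshape(x, y, z):
    """Curvature of the preshape sphere, as above: R(X, Y)Z = <X, Z> Y - <Y, Z> X."""
    return ip(x, z) * y - ip(y, z) * x

def R_shape(x, y, z):
    """Shape-space curvature applied to horizontal vectors, term by term from the formula above."""
    return (R_preshape(x, y, z)
            - 2.0 * ip(x, J(y)) * J(z)
            + ip(y, J(z)) * J(x)
            + ip(z, J(x)) * J(y))
```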


Bookstein Rat Calvarium Growth

8 landmark points, 18 subjects, 8 ages

[Figure: polynomial fits to the rat calvarium landmark trajectories, with close-ups of landmarks A–D]

k | R^2
1 | 0.79
2 | 0.85
3 | 0.87


Corpus Callosum Aging (www.oasis-brains.org)

Fletcher 2011

N = 32 patients, age range 18–90, 64 landmarks using ShapeWorks (sci.utah.edu)

k | R^2
1 | 0.12
2 | 0.13
3 | 0.21

[Figure: geodesic, quadratic, and cubic regression curves for the corpus callosum shapes]


Corpus Callosum Aging

[Figure: estimated initial conditions γ̇(0), ∇_γ̇ γ̇(0), and (∇_γ̇)^2 γ̇(0) for the corpus callosum fit]

Initial conditions are collinear, implying time reparametrization


Landmark Space

The space L of N points in R^d. Geodesic equations:

(d/dt) x_i = ∑_{j=1}^N γ(|x_i − x_j|^2) α_j
(d/dt) α_i = −2 ∑_{j=1}^N (x_i − x_j) γ′(|x_i − x_j|^2) α_i^T α_j

Usually we use a Gaussian kernel

γ(r) = e^{−r/(2σ^2)}

Here x ∈ L and α ∈ T*_x L is a covector (momentum).
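A minimal sketch that integrates these geodesic equations with explicit Euler steps and the Gaussian kernel above; the function name, step size, and example configuration are illustrative assumptions.

```python
import numpy as np

def landmark_geodesic(x0, alpha0, sigma=1.0, T=1.0, n_steps=500):
    """Euler integration of the landmark geodesic equations.
    x0: (N, d) initial positions, alpha0: (N, d) initial momenta."""
    dt = T / n_steps
    x, a = x0.copy(), alpha0.copy()
    traj = [x.copy()]
    for _ in range(n_steps):
        diff = x[:, None, :] - x[None, :, :]            # pairwise differences x_i - x_j, shape (N, N, d)
        r2 = np.sum(diff ** 2, axis=-1)                  # squared distances |x_i - x_j|^2
        K = np.exp(-r2 / (2.0 * sigma ** 2))             # gamma(r) = exp(-r / (2 sigma^2)) at r = r2
        Kp = -K / (2.0 * sigma ** 2)                     # gamma'(r)
        aa = a @ a.T                                     # alpha_i^T alpha_j
        dx = K @ a                                       # dx_i/dt = sum_j gamma(.) alpha_j
        da = -2.0 * np.sum(diff * (Kp * aa)[:, :, None], axis=1)   # dalpha_i/dt
        x, a = x + dt * dx, a + dt * da
        traj.append(x.copy())
    return np.array(traj)

# Example: two landmarks in the plane pushed apart by opposing momenta
x0 = np.array([[0.0, 0.0], [1.0, 0.0]])
alpha0 = np.array([[0.0, 1.0], [0.0, -1.0]])
print(landmark_geodesic(x0, alpha0, sigma=0.5)[-1])
```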


We have a simple formula for the cometric g^{ij} (the kernel). Parallel transport in terms of covectors and the cometric:

(d/dt) β_ℓ = (1/2) g_{iℓ} g^{in}_{,j} g^{jm} (α_m β_n − α_n β_m) − (1/2) g^{mn}_{,ℓ} α_m β_n

Curvature is more complicated (Mario's formula):

2 R_{ursv} = −g_{ur,sv} − g_{rv,us} + g_{rs,uv} + g_{uv,rs} + 2 Γ_{rvρ} Γ_{usσ} g^{ρσ} − 2 Γ_{rsρ} Γ_{uvσ} g^{ρσ}
  + g_{rλ,u} g^{λµ} g_{µv,s} − g_{rλ,u} g^{λµ} g_{µs,v} + g_{uλ,r} g^{λµ} g_{µs,v} − g_{uλ,r} g^{λµ} g_{µv,s}
  + g_{rλ,s} g^{λµ} g_{µv,u} + g_{uλ,v} g^{λµ} g_{µs,r} − g_{rλ,v} g^{λµ} g_{µs,u} − g_{uλ,s} g^{λµ} g_{µv,r}


Landmark parallel transport in terms of momenta (Younes 2008):

(d/dt) β_i = K^{−1} ( ∑_{j=1}^N (x_i − x_j)^T ((Kβ)_i − (Kβ)_j) γ′(|x_i − x_j|^2) α_j
  − ∑_{j=1}^N (x_i − x_j)^T ((Kα)_i − (Kα)_j) γ′(|x_i − x_j|^2) β_j )
  − ∑_{j=1}^N (x_i − x_j) γ′(|x_i − x_j|^2) (α_j^T β_i + α_i^T β_j)

This is enough to integrate polynomials.


For curvature, we need the Christoffel symbols and their derivatives:

(Γ(u, v))_i = −∑_{j=1}^N (x_i − x_j)^T (v_i − v_j) γ′(|x_i − x_j|^2) (K^{−1}u)_j
  − ∑_{j=1}^N (x_i − x_j)^T (u_i − u_j) γ′(|x_i − x_j|^2) (K^{−1}v)_j
  + ∑_{j=1}^N γ(|x_i − x_j|^2) ∑_{k=1}^N (x_j − x_k) γ′(|x_j − x_k|^2) ((K^{−1}u)_k^T (K^{−1}v)_j + (K^{−1}u)_j^T (K^{−1}v)_k)

Take the derivative with respect to x, and combine using

R^ℓ_{ijk} = Γ^ℓ_{ki,j} − Γ^ℓ_{ji,k} + Γ^ℓ_{jm} Γ^m_{ki} − Γ^ℓ_{km} Γ^m_{ji}


((DΓ(u, v)) w)_i = ∑_{j=1}^N (w_i − w_j)^T (u_i − u_j) γ′(|x_i − x_j|^2) (K^{−1}v)_j
  + 2 ∑_{j=1}^N (x_i − x_j)^T (u_i − u_j) (x_i − x_j)^T (w_i − w_j) γ″(|x_i − x_j|^2) (K^{−1}v)_j
  + ∑_{j=1}^N (x_i − x_j)^T (u_i − u_j) γ′(|x_i − x_j|^2) ((d/dε K^{−1}) v)_j
  + ∑_{j=1}^N (w_i − w_j)^T (v_i − v_j) γ′(|x_i − x_j|^2) (K^{−1}u)_j
  + 2 ∑_{j=1}^N (x_i − x_j)^T (v_i − v_j) (x_i − x_j)^T (w_i − w_j) γ″(|x_i − x_j|^2) (K^{−1}u)_j
  + ∑_{j=1}^N (x_i − x_j)^T (v_i − v_j) γ′(|x_i − x_j|^2) ((d/dε K^{−1}) u)_j
  − 2 ∑_{j=1}^N (x_i − x_j)^T (w_i − w_j) γ′(|x_i − x_j|^2) ∑_{k=1}^N (x_j − x_k) γ′(|x_j − x_k|^2) ((K^{−1}u)_k^T (K^{−1}v)_j + (K^{−1}u)_j^T (K^{−1}v)_k)
  − ∑_{j=1}^N γ(|x_i − x_j|^2) ∑_{k=1}^N (w_j − w_k) γ′(|x_j − x_k|^2) ((K^{−1}u)_k^T (K^{−1}v)_j + (K^{−1}u)_j^T (K^{−1}v)_k)
  − 2 ∑_{j=1}^N γ(|x_i − x_j|^2) ∑_{k=1}^N (x_j − x_k) (x_j − x_k)^T (w_j − w_k) γ″(|x_j − x_k|^2) ((K^{−1}u)_k^T (K^{−1}v)_j + (K^{−1}u)_j^T (K^{−1}v)_k)
  − ∑_{j=1}^N γ(|x_i − x_j|^2) ∑_{k=1}^N (x_j − x_k) γ′(|x_j − x_k|^2)
    × (((d/dε K^{−1}) u)_k^T (K^{−1}v)_j + (K^{−1}u)_k^T ((d/dε K^{−1}) v)_j + ((d/dε K^{−1}) u)_j^T (K^{−1}v)_k + (K^{−1}u)_j^T ((d/dε K^{−1}) v)_k)

where

((d/dε K^{−1}) v)_i = −(K^{−1} (d/dε K) K^{−1} v)_i = −2 (K^{−1} [ ∑_{j=1}^N (x_k − x_j)^T (w_k − w_j) γ′(|x_k − x_j|^2) (K^{−1}v)_j ]_k)_i


Landmark Regression Results

Same Bookstein rat data. Procrustes alignment, no scaling.

[Figure: landmark-space polynomial regression of the rat calvarium data, with close-ups]

R^2 = 0.92 for the geodesic fit, 0.94 for the quadratic fit.


Thank You!
