
Gelfand Widths in Compressive Sensing
Optimal Number of Measurements

Björn Bringmann
Technische Universität München
15.01.2015

Outline

1 Introduction to m-Widths
    Kolmogorov and Gelfand Widths
    Compressive m-widths
2 Gelfand Widths of l1-balls
    Upper bound
    Lower bound
3 Applications
    Optimal Number of Measurements
    Kashin's Decomposition Theorem


Introduction to m-Widths / Kolmogorov and Gelfand Widths

Motivation

Let X = {f ∈ C(ℝ) : f is 2π-periodic} and define

    S_n(f)(x) := ∑_{k=−n}^{n} f̂(k) e^{ikx} .

How fast does ‖f − S_n(f)‖_2 converge to 0?

For f ∈ C([a,b]), approximate

    ∫_a^b f(x) dx ≈ ∑_{k=0}^{n} w_k f(x_k) .

What is a reasonable error estimate?

Conclusion: Approximation by linear subspaces is a good idea!

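As a concrete illustration of the first question (my own numerical sketch, not part of the talk), the snippet below estimates ‖f − S_n(f)‖_{L^2} for the smooth 2π-periodic function f(x) = exp(cos x); the choice of f, the grid size and the truncation levels are illustrative. The tail is summed via Parseval's identity.

    import numpy as np

    # Smooth 2*pi-periodic test function (illustrative choice).
    f = lambda x: np.exp(np.cos(x))

    M = 4096                               # fine sampling grid
    x = 2 * np.pi * np.arange(M) / M
    fhat = np.fft.fft(f(x)) / M            # approximate Fourier coefficients f^(k)
    freqs = np.fft.fftfreq(M, d=1.0 / M)   # the corresponding integer frequencies k

    for n in (1, 2, 4, 8, 16):
        # Parseval: ||f - S_n(f)||_{L^2}^2 = 2*pi * sum_{|k| > n} |f^(k)|^2
        tail = np.sum(np.abs(fhat[np.abs(freqs) > n]) ** 2)
        print(n, np.sqrt(2 * np.pi * tail))

The error decays very quickly here because f is smooth; for merely continuous f the decay can be arbitrarily slow, which is one reason to study the worst-case error over a whole class of functions.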

Introduction to m-Widths / Kolmogorov and Gelfand Widths

Kolmogorov m-Width

Definition
Let X be a real (or complex) Banach space. For any subset K ⊂ X and any m-dimensional linear subspace X_m ⊂ X define

    d(K, X_m; X) := sup_{x ∈ K} inf_{z ∈ X_m} ‖x − z‖_X .

Now define the Kolmogorov m-width as

    d_m(K; X) := inf{ d(K, X_m; X) : X_m ⊂ X m-dimensional linear subspace } .

[Figure: a set K in the plane together with a one-dimensional subspace X_1 approximating it.]

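A small worked example (my own, with illustrative choices): take K = B_1^2 in ℓ_2^2 and m = 1. The distance to a fixed subspace is a convex function of x, so its supremum over K is attained at one of the extreme points ±e_1, ±e_2, and a scan over all one-dimensional subspaces locates the optimum at the diagonal, matching the value d_1(B_1^2; ℓ_2^2) = 1/√2 that one obtains by hand.

    import numpy as np

    corners = np.array([[1, 0], [-1, 0], [0, 1], [0, -1]], dtype=float)  # extreme points of B_1^2

    best = np.inf
    for theta in np.linspace(0, np.pi, 10_000, endpoint=False):
        u = np.array([np.cos(theta), np.sin(theta)])             # unit vector spanning X_1
        # distance of each corner to span{u}: norm of the component orthogonal to u
        dists = np.linalg.norm(corners - np.outer(corners @ u, u), axis=1)
        best = min(best, dists.max())                             # sup over K, then inf over X_1

    print(best, 1 / np.sqrt(2))    # both approximately 0.7071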

Introduction to m-Widths / Kolmogorov and Gelfand Widths

Properties of the Kolmogorov m-width

Lemma
Let X be a Banach space, K ⊂ X and m ∈ ℕ.
1 d_m(K; X) = d_m(K̄; X), where K̄ is the closure of K.
2 For every scalar α there holds d_m(αK; X) = |α| d_m(K; X).
3 d_m(co(K); X) = d_m(K; X), where co(K) is the convex hull of K.
4 d_m(K; X) ≥ d_{m+1}(K; X).

Introduction to m-Widths / Kolmogorov and Gelfand Widths

Gelfand m-Width

Definition
Let X be a real or complex Banach space and K a subset of X. The Gelfand m-width is defined as

    d^m(K; X) := inf{ sup_{x ∈ K ∩ L_m} ‖x‖_X : L_m ⊂ X linear subspace with codim(L_m) ≤ m } .

We say that a closed linear subspace L_m ⊂ X has codim(L_m) = m if dim(X/L_m) = m.

Lemma
A linear subspace L_m ⊂ X has codim(L_m) ≤ m if and only if there exist f_1, ..., f_m ∈ X* such that

    L_m = { x ∈ X : f_i(x) = 0 for all i = 1, ..., m } .

Introduction to m-Widths / Kolmogorov and Gelfand Widths

Duality of Kolmogorov and Gelfand m-widths

Theorem
For 1 ≤ p, q ≤ ∞ let p*, q* be such that 1/p + 1/p* = 1 and 1/q + 1/q* = 1. Then

    d^m(B_p^N; ℓ_q^N) = d_m(B_{q*}^N; ℓ_{p*}^N) .

Lemma
Let Y be a finite-dimensional subspace of a Banach space X. Given x ∈ X \ Y and y* ∈ Y, the following properties are equivalent:
1 y* is a best approximation to x from Y.
2 For some λ ∈ X* with ‖λ‖_{X*} ≤ 1 and λ|_Y ≡ 0, there holds ‖x − y*‖ = λ(x).

Introduction to m-Widths / Kolmogorov and Gelfand Widths

Connection with linear operators

Note that

    d^m(K; X) = inf{ sup_{x ∈ K ∩ Ker(A)} ‖x‖ : A : X → 𝕂^m linear and continuous } ,

where 𝕂 denotes ℝ or ℂ.

If L_m is a subspace with codim(L_m) ≤ m, then choose f_1, ..., f_m ∈ X* as in the previous lemma and define

    A : X → 𝕂^m ,  x ↦ [f_1(x), ..., f_m(x)]^T .

Conversely, if A : X → 𝕂^m is given, define the corresponding linear subspace L_m := Ker(A).

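A quick numerical check of this correspondence in X = ℝ^N (an illustration of mine): the rows of a matrix A play the role of the functionals f_1, ..., f_m, and Ker(A) is a subspace of codimension rank(A) ≤ m.

    import numpy as np

    rng = np.random.default_rng(0)
    N, m = 8, 3
    A = rng.standard_normal((m, N))            # rows correspond to f_1, ..., f_m

    # Kernel of A via the SVD: right singular vectors with (numerically) zero singular value.
    _, s, Vt = np.linalg.svd(A)
    rank = int(np.sum(s > 1e-12))
    kernel_basis = Vt[rank:]                   # basis of L_m = Ker(A)

    print(N - kernel_basis.shape[0])           # codim(L_m) = rank(A) = 3 <= m
    print(np.allclose(A @ kernel_basis.T, 0))  # every basis vector is annihilated by all f_i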

Introduction to m-Widths / Compressive m-widths

Compressive m-width

Definition
The compressive m-width of a subset K of a (real) Banach space X is defined as

    E_m(K; X) := inf{ sup_{x ∈ K} ‖x − Δ(Ax)‖_X : A ∈ L(X, ℝ^m), Δ : ℝ^m → X } .

Here A is the measurement map and Δ : ℝ^m → X is an arbitrary reconstruction map.
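The prototypical instance of this definition in compressive sensing pairs a random measurement map A with ℓ_1-minimization as the reconstruction map Δ. The following is a minimal sketch of that pairing, with illustrative dimensions of my choosing and SciPy's linear-programming solver standing in for a dedicated ℓ_1 solver.

    import numpy as np
    from scipy.optimize import linprog

    def l1_reconstruction(A, y):
        """Reconstruction map Delta: solve min ||z||_1 subject to A z = y as a linear program."""
        m, N = A.shape
        # Split z = u - v with u, v >= 0; then ||z||_1 = sum(u) + sum(v) at the optimum.
        res = linprog(np.ones(2 * N), A_eq=np.hstack([A, -A]), b_eq=y,
                      bounds=[(0, None)] * (2 * N), method="highs")
        u, v = res.x[:N], res.x[N:]
        return u - v

    rng = np.random.default_rng(1)
    N, m, s = 64, 24, 3
    A = rng.standard_normal((m, N)) / np.sqrt(m)                   # Gaussian measurement map
    x = np.zeros(N)
    x[rng.choice(N, s, replace=False)] = rng.standard_normal(s)    # s-sparse vector

    x_hat = l1_reconstruction(A, A @ x)
    print(np.linalg.norm(x - x_hat))     # typically tiny: the sparse vector is recovered exactly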

Introduction to m-Widths / Compressive m-widths

Adaptive compressive m-width

Definition
An adaptive map F : X → ℝ^m has the form

    F(x) := [ λ_1(x), λ_{2; λ_1(x)}(x), ..., λ_{m; λ_1(x), ..., λ_{m−1}(x)}(x) ]^T

with λ_1(·), λ_{2; λ_1(x)}(·), ..., λ_{m; λ_1(x), ..., λ_{m−1}(x)}(·) ∈ X*, i.e. each measurement functional may depend on the values observed before it. The adaptive compressive m-width of a subset K of a Banach space X is defined as

    E_m^ada(K; X) := inf{ sup_{x ∈ K} ‖x − Δ(F(x))‖ : F : X → ℝ^m adaptive, Δ : ℝ^m → X } .
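To see the mechanism of adaptivity in a simple setting (an illustration of mine, not a statement about widths), the sketch below recovers a 1-sparse vector in ℝ^N from about log_2(N) + 1 adaptively chosen functionals: each indicator functional is selected based on the answers to the previous ones, exactly the structure of the adaptive map F.

    import numpy as np

    def adaptive_recover_1sparse(apply_functional, N):
        """Recover a 1-sparse x in R^N from ~log2(N) + 1 adaptive linear measurements."""
        lo, hi = 0, N                       # active index range [lo, hi)
        while hi - lo > 1:
            mid = (lo + hi) // 2
            w = np.zeros(N)
            w[lo:mid] = 1.0                 # functional chosen based on all previous answers
            lo, hi = (lo, mid) if apply_functional(w) != 0 else (mid, hi)
        e = np.zeros(N)
        e[lo] = 1.0
        x = np.zeros(N)
        x[lo] = apply_functional(e)         # one last measurement reads off the coefficient
        return x

    N = 256
    x_true = np.zeros(N)
    x_true[137] = 3.5
    x_rec = adaptive_recover_1sparse(lambda w: w @ x_true, N)
    print(np.array_equal(x_rec, x_true))    # True, using about log2(N) + 1 = 9 measurements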

Introduction to m-Widths / Compressive m-widths

Connection with the Gelfand m-width

Theorem
If K is a subset of a Banach space X, then

    E_m^ada(K; X) ≤ E_m(K; X) .

If the subset K satisfies −K = K, then

    d^m(K; X) ≤ E_m^ada(K; X) .

If the set K further satisfies K + K ⊂ aK for some positive constant a, then

    E_m(K; X) ≤ a d^m(K; X) .

Therefore, under these assumptions,

    (1/a) E_m(K; X) ≤ d^m(K; X) ≤ E_m^ada(K; X) ≤ E_m(K; X) .

For instance, K = B_1^N satisfies both conditions with a = 2, so all three quantities are comparable in that case.


Gelfand Widths of l1-balls

Main result

Theorem
For 1 < p ≤ 2 and m < N, there exist constants c_1, c_2 > 0 depending only on p such that

    c_1 min{ 1, ln(eN/m)/m }^{1 − 1/p} ≤ d^m(B_1^N; ℓ_p^N) ≤ c_2 min{ 1, ln(eN/m)/m }^{1 − 1/p} .
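To get a feeling for this rate (the numbers below are my own illustrative choices): for p = 2 the width behaves like the square root of ln(eN/m)/m, so it is only logarithmically sensitive to the ambient dimension N but improves essentially like m^{−1/2} in the number of measurements m.

    import numpy as np

    def width_rate(N, m, p):
        """Evaluate min{1, ln(eN/m)/m}^(1 - 1/p), the two-sided rate for d^m(B_1^N; l_p^N)."""
        return min(1.0, np.log(np.e * N / m) / m) ** (1.0 - 1.0 / p)

    for N, m in [(1_000, 10), (1_000, 100), (1_000_000, 100)]:
        print(N, m, width_rate(N, m, p=2))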

Gelfand Widths of l1-balls / Upper bound

Upper bound

Theorem
There is a constant C > 0 such that, for 1 < p ≤ 2 and m < N,

    d^m(B_1^N; ℓ_p^N) ≤ C min{ 1, ln(eN/m)/m }^{1 − 1/p} .

Gelfand Widths of l1-balls / Lower bound

Lower bound

Theorem
There is a constant c > 0 such that, for 1 < p ≤ ∞ and m < N,

    d^m(B_1^N; ℓ_p^N) ≥ c min{ 1, ln(eN/m)/m }^{1 − 1/p} .

Gelfand Widths of l1-balls / Lower bound

Recovery of 2s-sparse vectors

Theorem
Given a matrix A ∈ ℝ^{m×N}, if every 2s-sparse vector x ∈ ℝ^N is a minimizer of ‖z‖_1 subject to Az = Ax, then

    m ≥ c_1 s ln( N / (c_2 s) ) ,

where c_1 = 1/ln(9) and c_2 = 4.
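Plugging concrete sizes into this bound (illustrative values of my choosing) shows its scale: for N = 10^6 and s = 100 it already forces m ≳ 356 measurements.

    import numpy as np

    c1, c2 = 1 / np.log(9), 4
    for N, s in [(10_000, 10), (1_000_000, 100)]:
        print(N, s, c1 * s * np.log(N / (c2 * s)))   # lower bound on the number of measurements m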

Gelfand Widths of l1-balls / Lower bound

Preparation

Lemma
Given integers s < N, there exist

    n ≥ ( N / (4s) )^{s/2}

subsets S_1, ..., S_n of [N] such that each S_j has cardinality s and

    card(S_i ∩ S_j) < s/2  whenever i ≠ j .
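As an empirical illustration (the greedy random construction below is mine, it is not the argument used in the proof), even a naive rejection strategy produces far more than (N/(4s))^{s/2} such subsets for small parameters.

    import numpy as np

    rng = np.random.default_rng(4)
    N, s = 40, 6
    target = (N / (4 * s)) ** (s / 2)      # the lemma guarantees at least this many subsets

    # Greedily keep a random s-subset if it meets every kept subset in fewer than s/2 elements.
    kept = []
    for _ in range(2_000):
        S = set(rng.choice(N, s, replace=False).tolist())
        if all(len(S & T) < s / 2 for T in kept):
            kept.append(S)

    print(len(kept), ">=", target)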

Gelfand Widths of l1-balls / Lower bound

Preparation

Definition
Let X be a Banach space. For a subset T ⊂ X define the packing number P(T, ‖·‖_X, t) for t > 0 as the maximum number P of points x_k ∈ T, k ∈ [P], which are t-separated, i.e. ‖x_k − x_l‖ > t for all k, l ∈ [P], k ≠ l.

Lemma
For any norm ‖·‖ on ℝ^m there holds

    P(B_1(0), ‖·‖, t) ≤ ( 1 + 2/t )^m .

(This follows from a volume comparison: the balls x_k + (t/2)B_1(0) are pairwise disjoint and all contained in (1 + t/2)B_1(0).)


Gelfand Widths of l1-balls / Lower bound

Kolmogorov widths revisited

Corollary
For 2 ≤ p < ∞ and m < N, there exist constants c_1, c_2 > 0 depending only on p such that the Kolmogorov widths satisfy

    c_1 min{ 1, ln(eN/m)/m }^{1/p} ≤ d_m(B_p^N; ℓ_∞^N) ≤ c_2 min{ 1, ln(eN/m)/m }^{1/p} .

This follows from the main result via the duality theorem: d_m(B_p^N; ℓ_∞^N) = d^m(B_1^N; ℓ_{p*}^N), and the exponent 1 − 1/p* equals 1/p.


Applications / Optimal Number of Measurements

Estimates for the Compressive m-widths

Corollary
For 1 < p ≤ 2 and m < N, the adaptive and nonadaptive compressive m-widths satisfy

    E_m^ada(B_1^N; ℓ_p^N) ≍ E_m(B_1^N; ℓ_p^N) ≍ min{ 1, ln(eN/m)/m }^{1 − 1/p} .

Applications / Optimal Number of Measurements

Optimal Number of Measurements

Theorem
Let 1 < p ≤ 2. Suppose that the matrix A ∈ ℝ^{m×N} and the map Δ : ℝ^m → ℝ^N satisfy

    ‖x − Δ(Ax)‖_p ≤ (C / s^{1−1/p}) σ_s(x)_1   for all x ∈ ℝ^N,

where σ_s(x)_1 denotes the ℓ_1-error of the best s-term approximation to x. Then, for some constant c > 0 depending only on C, there holds

    m ≥ c s ln(eN/s) .   (1)

In particular, if A ∈ ℝ^{m×N} has restricted isometry constant δ_{2s}(A) < 0.6246, then necessarily (1) holds with c = c(δ_{2s}).

Applications / Optimal Number of Measurements

Donoho-Tanner Phase Transition

Figure: Success of ℓ_1-minimization from random partial Fourier measurements.
x-axis: undersampling fraction δ = m/N. y-axis: sparsity fraction ρ = s/m.
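The qualitative picture can be reproduced with a small experiment. The sketch below is my own and uses Gaussian measurement matrices instead of partial Fourier ones, together with small illustrative sizes; it estimates the empirical success probability of ℓ_1-minimization on a coarse (δ, ρ) grid. Recovery succeeds reliably for small ρ and breaks down as ρ grows, which is the phase-transition behaviour shown in the figure.

    import numpy as np
    from scipy.optimize import linprog

    def l1_min(A, y):
        # min ||z||_1 subject to A z = y, via the LP split z = u - v with u, v >= 0
        m, N = A.shape
        res = linprog(np.ones(2 * N), A_eq=np.hstack([A, -A]), b_eq=y,
                      bounds=[(0, None)] * (2 * N), method="highs")
        return res.x[:N] - res.x[N:]

    rng = np.random.default_rng(2)
    N, trials = 64, 20
    for m in (16, 32, 48):                               # undersampling fraction delta = m/N
        for s in (max(m // 8, 1), m // 4, m // 2):       # sparsity fraction rho = s/m
            hits = 0
            for _ in range(trials):
                A = rng.standard_normal((m, N))
                x = np.zeros(N)
                x[rng.choice(N, s, replace=False)] = rng.standard_normal(s)
                hits += np.linalg.norm(l1_min(A, A @ x) - x) < 1e-6
            print(f"delta={m/N:.2f}  rho={s/m:.2f}  success={hits/trials:.2f}")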

Applications / Kashin's Decomposition Theorem

Kashin's Decomposition Theorem

Theorem
There exist universal constants α, β > 0 such that, for any m ≥ 1, the space ℝ^{2m} can be split into an orthogonal sum of two m-dimensional subspaces E and E^⊥ such that

    α √m ‖x‖_2 ≤ ‖x‖_1 ≤ β √m ‖x‖_2

for all x ∈ E and for all x ∈ E^⊥.
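Kashin's proof shows that a uniformly random m-dimensional subspace has this property with high probability. The sketch below (my own illustration) samples such a subspace and evaluates the ratio ‖x‖_1 / (√m ‖x‖_2) on random vectors from E and E^⊥; it probes typical vectors only, not the worst case, so it illustrates rather than verifies the uniform bounds.

    import numpy as np

    rng = np.random.default_rng(3)
    m = 200

    # Haar-random orthogonal splitting of R^(2m): E = span(Q1), E^perp = span(Q2).
    Q, _ = np.linalg.qr(rng.standard_normal((2 * m, 2 * m)))
    Q1, Q2 = Q[:, :m], Q[:, m:]

    def norm_ratios(basis, trials=2_000):
        X = basis @ rng.standard_normal((basis.shape[1], trials))  # random points in the subspace
        return np.linalg.norm(X, 1, axis=0) / (np.sqrt(m) * np.linalg.norm(X, 2, axis=0))

    for B in (Q1, Q2):
        r = norm_ratios(B)
        print(round(r.min(), 3), round(r.max(), 3))   # observed ratios stay well inside (0, sqrt(2))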

References

Simon Foucart and Holger Rauhut, A Mathematical Introduction to Compressive Sensing, Birkhäuser, 2013.

Allan Pinkus, n-Widths in Approximation Theory, Springer, 1985.

David L. Donoho and Jared Tanner, Observed Universality of Phase Transitions in High-Dimensional Geometry, with Implications for Modern Data Analysis and Signal Processing, Philosophical Transactions of the Royal Society A, 2009.