
Page 1

Surface Reconstruction with MLS

Tobias Martin

CS7960, Spring 2006, Feb 23

Page 2

Literature

• An Adaptive MLS Surface for Reconstruction with Guarantees, T. K. Dey and J. Sun

• A Sampling Theorem for MLS Surfaces, Peer-Timo Bremer and John C. Hart

Page 3

An Adaptive MLS Surface for Reconstruction with Guarantees

Tamal K. Dey and Jian Sun

Page 4

Implicit MLS Surfaces

• Zero-level-set of a function I(x) defines S.

• Normals are important, because points are projected along the normals.

Page 5

Motivation

• Original smooth, closed surface S.

• Given conditions:
  – Sampling density
  – Normal estimates
  – Noise

• Design an implicit function whose zero set recovers S.

Page 6

Motivation

• So far, only a uniform sampling condition.

• Restriction: the red arc requires 104 samples because of the small feature.

Page 7

Motivation

• Establish an adaptive sampling condition, similar to the one of Amenta and Bern.

• Incorporate the local feature size in the sampling condition.

• The red arc then only requires 6 samples.

Page 8

Sampling Condition

• Recent surface reconstruction algorithms are based on noise-free samples. The notion of an ε-sample has to be modified.

• P is a noisy (ε,α)-sample if
  – the distance from every z ∈ S to its closest point in P is < ε lfs(z),
  – the distance from every p ∈ P to z = proj(p) is < ε² lfs(z),
  – every p ∈ P has a normal whose angle with the normal at its projected surface point is < ε,
  – the number of sample points inside B(x, ε lfs(proj(x))) is less than a small number α.

• Note that it is difficult to check whether a sample set P is a noisy (ε,α)-sample (a toy check is sketched below).
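Conditions (a) and (b) can be checked numerically on a toy shape where lfs is known. Below is a minimal sketch, assuming a unit sphere (so lfs ≡ 1 everywhere), synthetic noisy samples, and illustrative values for ε and the sample counts; none of these values come from the paper.

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(1)
eps = 0.2

# Dense stand-in for the surface S: points on the unit sphere (lfs == 1 everywhere).
S = rng.normal(size=(20000, 3))
S /= np.linalg.norm(S, axis=1, keepdims=True)

# Noisy sample set P: sphere points offset radially by less than eps^2 * lfs.
P = rng.normal(size=(5000, 3))
P /= np.linalg.norm(P, axis=1, keepdims=True)
P *= 1.0 + rng.uniform(-0.5 * eps**2, 0.5 * eps**2, size=(5000, 1))

# (a) every z in S has a sample within eps * lfs(z)
cond_a = cKDTree(P).query(S)[0].max() < eps
# (b) every p in P is within eps^2 * lfs(proj(p)) of its projection onto S
cond_b = np.abs(np.linalg.norm(P, axis=1) - 1.0).max() < eps**2
print(cond_a, cond_b)   # typically both True for this density and noise level
```

For a general surface such a check is much harder, since it requires the medial axis (and hence lfs) of S.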

Page 9

Effect of nearby samples

• Points within a small neighborhood are predictably distributed within a small slab:

Page 10

Adaptive MLS

• The value at point x should be decided primarily by nearby sample points.

• Choose the weights such that sample points outside a neighborhood have less effect.

• Use Gaussian functions; their width controls the influence of the samples.

• Make the width dependent on lfs: define the width as a fraction of lfs.

Page 11

Adaptive MLS

• Choice of weighting function:

θ_p(x) =

• Many sample points at p contribute to point x.

Page 12

Adaptive MLS

• Choice of weighting function:

θ_p(x) =

• Sample point p has a constant weight. The influence of p does not decrease with distance.

Page 13

Adaptive MLS

• Compromise: take a fraction of lfs as the width of the Gaussian.

• The weighting function decreases as p moves far away from x.

• The weighting is small for small features, i.e. small features do not require more samples.
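As a concrete illustration of the compromise, the sketch below uses a Gaussian whose width is a fraction ρ of lfs(p). The slide's exact width (and whether it also involves lfs(proj(x))) is not shown here, so ρ and the use of lfs(p) alone are assumptions.

```python
import numpy as np

def adaptive_weight(x, p, lfs_p, rho=0.4):
    """Gaussian weight of sample p at evaluation point x, with width rho * lfs(p).
    rho and the use of lfs(p) alone are illustrative assumptions."""
    h = rho * lfs_p
    return np.exp(-np.dot(x - p, x - p) / (h * h))

# Near a small feature (small lfs) the weight falls off quickly, so distant samples
# contribute little; in a flat region (large lfs) the same sample keeps its influence.
x = np.array([0.0, 0.0, 0.0])
p = np.array([0.05, 0.0, 0.0])
print(adaptive_weight(x, p, lfs_p=0.02), adaptive_weight(x, p, lfs_p=0.5))
```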

Page 14

Contribution of distant samples

• Show that the effect of distant sample points can be bounded; rely on nearby features.

• How to show that?

Page 15

Contribution of distant samples


Page 16

Contribution of distant samples

• Once there is a bound on the number of samples in B_{ρ/2}, Lemma 3 is used, which gives an upper bound on their influence, i.e.

Page 17

Contribution of distant samples

• Using Lemma 3, they prove the main theorem:

Page 18

Contribution of distant samples

• The space outside B(x, r lfs(proj(x))) can be decomposed into an infinite number of shells, i.e.

• The influence of all points outside is the sum of the influences of the points in the shells, each of which is bounded by Lemma 3.

• Therefore, the contribution of the points outside B(x, r lfs(proj(x))) can be bounded.
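A quick numerical sketch of the shell argument: fill dyadic shells around x with a fixed number of samples each and sum their Gaussian weights. The width, inner radius, and per-shell sample counts are illustrative assumptions; the point is only that the total weight collected outside the inner ball is negligible next to the weight (≈ 1) of a single nearby sample.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.zeros(3)
h = 0.2        # Gaussian width (illustrative)
r = 1.0        # inner radius, standing in for r * lfs(proj(x)) (illustrative)

total = 0.0
for k in range(10):
    inner, outer = 2**k * r, 2**(k + 1) * r        # shell B(x, 2^(k+1) r) \ B(x, 2^k r)
    dirs = rng.normal(size=(1000, 3))
    dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
    pts = x + rng.uniform(inner, outer, size=(1000, 1)) * dirs
    total += np.exp(-np.sum((pts - x)**2, axis=1) / (h * h)).sum()

print(total)   # vanishingly small next to the weight ~1 of a sample right at x
```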

Page 19

Algorithm

Page 20

Normal Estimation

• Delaunay based, i.e. calculate the DT of the input points.

• Big Delaunay ball: radius greater than a certain multiple of the nearest-neighbor distance.

• The vectors from the incident sample points to the center of B(c, r) approximate the normals.
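A rough sketch of this normal-estimation step follows. The "big ball" threshold (a fixed multiple of the nearest-neighbor distance), the circumcenter computation, and the use of the single largest incident ball are illustrative choices, and normal orientation is not made consistent.

```python
import numpy as np
from scipy.spatial import Delaunay, cKDTree

def estimate_normals_big_balls(points, size_factor=3.0):
    """Unoriented normal per sample from the largest incident 'big' Delaunay ball.
    size_factor and the use of the single largest ball are illustrative assumptions."""
    nn = cKDTree(points).query(points, k=2)[0][:, 1]    # nearest-neighbor distance
    tet = Delaunay(points)                              # 3D Delaunay tetrahedralization

    best_r = np.zeros(len(points))
    normals = np.zeros_like(points)
    for simplex in tet.simplices:
        a, b, c, d = points[simplex]
        # Circumcenter: solve 2(v - a) . x = |v|^2 - |a|^2 for v in {b, c, d}.
        A = 2.0 * np.vstack([b - a, c - a, d - a])
        rhs = np.array([b @ b - a @ a, c @ c - a @ a, d @ d - a @ a])
        try:
            center = np.linalg.solve(A, rhs)
        except np.linalg.LinAlgError:
            continue                                    # degenerate (flat) tetrahedron
        radius = np.linalg.norm(center - a)
        for idx in simplex:
            # "Big" ball: radius much larger than the local nearest-neighbor distance.
            if radius > size_factor * nn[idx] and radius > best_r[idx]:
                best_r[idx] = radius
                v = center - points[idx]                # vector from sample to ball center
                normals[idx] = v / np.linalg.norm(v)
    return normals, best_r > 0
```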

Page 21

Normal Estimation

Page 22

Feature Detection

• In the noiseless case: take the poles in the Voronoi cells; the shortest distance approximates lfs(p).

• This does not work in the case of noise.

• Every medial axis point is covered by a big Delaunay ball (observation by Dey and Goswami).

• Take the biggest Delaunay ball in both (normal) directions of sample point p; the centers act as poles (L).

• lfs(proj(x)) is approximated by d(p, L), where p is the closest point to x in P.

Page 23

Projection

• Newton iteration: move p to p' (sketched below), i.e.:

• Iterate until d(p, p') becomes smaller than a certain threshold.

• To evaluate the function and its gradient, only use the points in a ball with radius 5× the width of the Gaussian weighting function. Points outside have little effect on the function.

• The Newton projection has a large domain of convergence.
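A sketch of such a projection step is given below. It assumes an AMLS-style implicit I(x) = Σ_p θ_p(x) n_p·(x − p) / Σ_p θ_p(x), a fixed Gaussian width, and a finite-difference gradient; the slide's exact implicit function and the 5×-width neighborhood truncation are not reproduced.

```python
import numpy as np

def newton_project(x, points, normals, h=0.15, tol=1e-6, max_iter=50):
    """Newton-style projection onto the zero set of an assumed AMLS-like implicit
    I(x) = sum_p w_p(x) n_p.(x - p) / sum_p w_p(x). The form of I, the fixed
    Gaussian width h, and the numerical gradient are illustrative assumptions."""
    def I(y):
        w = np.exp(-np.sum((points - y)**2, axis=1) / (h * h))
        return float((w * np.einsum('ij,ij->i', y - points, normals)).sum() / w.sum())

    def grad(y, d=1e-5):
        g = np.zeros(3)
        for k in range(3):
            e = np.zeros(3); e[k] = d
            g[k] = (I(y + e) - I(y - e)) / (2 * d)      # central finite differences
        return g

    for _ in range(max_iter):
        g = grad(x)
        step = I(x) * g / (g @ g)                       # move toward the zero level set
        x = x - step
        if np.linalg.norm(step) < tol:                  # stop once d(p, p') is below threshold
            break
    return x
```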

Page 24

Reconstruction

• Finally, use a reconstruction algorithm, e.g. Cocone, to calculate a mesh.

Page 25

AMLS vs PMLS

• Background on "Projection MLS":
  – The surface is the set of stationary points of a function f.
  – An energy function e measures the quality of the fit of a plane to the point set at r.
  – The local minimum of e nearest to r is f(r).

Page 26

AMLS vs PMLS

• The zero set of the energy function defines the surface.

• But there are two other layers of zero-level sets where the energy function reaches its local maximum.

Page 27

AMLS vs PMLS

• When the distance between the layers gets small, computations on the PMLS surface become difficult:
  – small "marching step" in ray tracing,
  – small cube size in the polygonizer.

Page 28

AMLS vs PMLS

• Furthermore, the projection procedure in PMLS is not trivial: it is a non-linear optimization, and finding a starting value is difficult if the two maximum layers are close.

• If there is noise, the maxima layers can interfere, leading to holes and disconnectedness.

Page 29

AMLS vs PMLS

Page 30

A Sampling Theorem for MLS Surfaces

Peer-Timo Bremer and John C. Hart

Page 31

Motivation

• We saw a lot of work about PSS (point set surfaces).

• But not much is known about the resulting mathematical properties.

• It is assumed that the surface construction is well defined within a neighborhood.

Page 32

Motivation

• MLS filters samples and projects them onto a local tangent plane.

• MLS is robust but difficult to analyze.

• Current algorithms may actually miss the surface:
  – MLS is defined as the stationary points of a (dynamic) projection.
  – This projection is "dynamic" and can be undefined.
  – The robustness of MLS is determined by the projection.

• Need a sampling condition which guarantees that the projection is defined everywhere.

Page 33

Motivation

• What is necessary to have a faithful projection? Correct normals.

• We need a condition which shows that, given a surface and a sampling, the normals are well defined.

Page 34

Sampling Condition - Outline

• Given a sample, show:
  – The normal vector needed by MLS does not vanish.
  – Given conditions on the surface and sampling, the normal is well defined.

Page 35

PCA - Review

• We are given a set of points in R^3 which may or may not lie on a surface.

• The 3×3 covariance matrix at point q is

• cov(q) can be factored into U^T Λ U

• …where

Page 36

PCA - Review

• v_max, v_mid, v_min define a local coordinate frame:

Page 37

MLS Surface

• e.g. Gaussian

• Weighted average

• 3x3 covariance matrix

Page 38

MLS surface

• From the covariance matrix we know: the eigenvector of the unique smallest eigenvalue defines the normal.

• Using that, the zero set of

defines the MLS surface.

• The normal has to be well defined!
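The sketch below puts the PCA pieces together for a single query point q. It assumes a fixed-width Gaussian weight, a covariance centered at q, and an Adamson/Alexa-style implicit I(q) = n(q)·(q − a(q)) with a(q) the weighted average; the exact formulas from the slides are not shown above, so these forms are assumptions.

```python
import numpy as np

def mls_implicit(q, points, h=0.1):
    """Implicit value and (unoriented) normal at q from weighted PCA.
    Gaussian width h, centering at q, and the form n.(q - a) are assumptions."""
    w = np.exp(-np.sum((points - q)**2, axis=1) / (h * h))    # Gaussian weights
    a = (w[:, None] * points).sum(axis=0) / w.sum()           # weighted average a(q)
    d = points - q
    cov = np.einsum('i,ij,ik->jk', w, d, d)                   # 3x3 covariance at q
    eigval, eigvec = np.linalg.eigh(cov)                      # eigenvalues ascending
    n = eigvec[:, 0]                                          # smallest eigenvalue -> normal
    return n @ (q - a), n

# A surface point satisfies mls_implicit(q, P)[0] == 0 (up to tolerance); the normal
# is only reliable where the smallest eigenvalue is unique.
```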

Page 39

New Sampling Condition

• Previous sampling condition: "S is well sampled by P if normal directions are defined inside a neighborhood of ." (Adamson and Alexa)

• This is not good, because S ≠ : S could be "well sampled" and still yield an undefined normal direction.

• New sampling condition: "S is well sampled by P if normal directions are defined inside a neighborhood of S."

Page 40

Uniqueness of smallest Eigenvalue

• You can prove a sampling to be well defined by ensuring that the covariance matrix has a unique smallest eigenvalue over a neighborhood of S.

• It is proved in terms of the weighted variance of the samples P.

• Directional variance:

  Variance in a specific direction.
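The directional variance can be sketched as below, assuming the same fixed-width Gaussian weights and centering at q as in the PCA sketch; the slide's exact formula is not reproduced. Its minimum over all unit directions is the smallest eigenvalue of the weighted covariance, so a positive gap between the two smallest eigenvalues indicates that the normal direction is unique.

```python
import numpy as np

def directional_variance(q, x, points, h=0.1):
    """Weighted variance of the samples at q along the unit direction x.
    The Gaussian weight and the centering at q are illustrative assumptions."""
    w = np.exp(-np.sum((points - q)**2, axis=1) / (h * h))
    t = (points - q) @ x                        # signed offsets along x
    return float((w * t**2).sum())

# lambda_min of the weighted covariance is the minimum of this quantity over all unit x,
# so lambda_mid - lambda_min > 0 means the smallest eigenvalue (the normal) is unique at q.
```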

Page 41

Uniqueness of smallest Eigenvalue

• We combine that with the covariance matrix and get

• …and decompose it into eigenvalues and eigenvectors:

Page 42

Uniqueness of smallest Eigenvalue

• Using that, it can be shown that if λ_min is not unique, var_n(q) isn't unique either (Lemma 2).

• This leads to:

Page 43

Sampling Theorem

• To prove the sampling condition, show that for every point q there is a normal direction whose weighted variance is less than that of any perpendicular direction:
  – Derive an upper bound of the weighted variance in the normal direction n.
  – Derive a lower bound in an arbitrary tangent direction x.
  – Determine sampling conditions: max(var_n(q)) < min(var_x(q)).

Page 44

Sampling Theorem

• To show max(var_n(q)) < min(var_x(q)), partition the data points into ones that increase var_n more versus ones that increase var_x more.

• Construct two planes through q

…which separate the points such that

Page 45

Sampling Theorem

• Points below those planes increase var_n(q) more than var_x(q).

• The region below the planes defines an hourglass shape.

Page 46

Sampling Theorem

• Furthermore, q is at most w = τ lfs(proj(q)) above the surface.

Page 47

Sampling Theorem

• Project the hourglass onto the lower medial ball (B−).

• Find the limit of the maximum possible effect of these "risky" samples.

• Use the risky area (red) to overestimate the number of samples and their contribution to var_n.

Page 48

Sampling Theorem

• Define an upper bound on the samples in the risky area and use it to compute an upper bound of var_n.

• Define a lower bound on the samples outside the risky area to compute a lower bound of var_x.

• This leads to the Main Theorem: