
Page 1: The good sides of Bayes

The good sides of Bayes

Jeannot Trampert

Utrecht University

Page 2: The good sides of Bayes

Bayes gives us an answer!
Example of inner core anisotropy

Page 3: The good sides of Bayes

Normal mode splitting functions are linearly related to seismic anisotropy in the inner core

The kernels Kα, Kβ and Kγ differ in magnitude, hence regularization affects the corresponding models differently
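As a sketch (assuming the standard splitting-function formulation rather than the exact expression on the slide), each splitting coefficient c_st is a radial integral of the kernels times the three anisotropic parameters over the inner core:

c_st = ∫ [ Kα(r) α(r) + Kβ(r) β(r) + Kγ(r) γ(r) ] dr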

Page 4: The good sides of Bayes

Regularized inversion
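As a minimal sketch of such an inversion (assuming Tikhonov-style damping with a hypothetical damping parameter λ, not necessarily the scheme used on the slide):

m_est = (G^T G + λ^2 I)^(-1) G^T d

The damping trades data fit against model size, which is why kernels of different magnitude lead to effectively different regularization of α, β and γ.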

Page 5: The good sides of Bayes

Full model space search (NA, Sambridge 1999)

Page 6: The good sides of Bayes

Resolves a 20-year disagreement between body wave and normal mode data (Beghein and Trampert, 2003)

Page 7: The good sides of Bayes

Bayes or not to Bayes?

We need proper uncertainty analysis to interpret seismic tomography: probability density functions for all model parameters

Page 8: The good sides of Bayes

Do models agree?
No knowledge of uncertainty implies subjective comparisons.

Page 9: The good sides of Bayes
Page 10: The good sides of Bayes

Partial knowledge of uncertainty allows hypothesis testing

Page 11: The good sides of Bayes

Deschamps and Tackley, 2009

Page 12: The good sides of Bayes

Mean density model separated into its chemical and temperature contributions (full pdf obtained with NA)

(Trampert et al., 2004)

Page 13: The good sides of Bayes

Deschamps and Tackley, 2009

Page 14: The good sides of Bayes

Full knowledge of uncertainty allows us to evaluate the probability of overlap or consistency between models
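As an illustration (not from the talk), a minimal Python sketch that estimates the probability that the same parameter from two independent posterior ensembles agrees within a tolerance; the sample arrays and the tolerance are hypothetical.

import numpy as np

def overlap_probability(samples_a, samples_b, tol, n=100000):
    # Monte Carlo estimate of P(|a - b| < tol) for two independent
    # posterior ensembles of the same model parameter
    a = np.random.choice(samples_a, size=n)
    b = np.random.choice(samples_b, size=n)
    return np.mean(np.abs(a - b) < tol)

# hypothetical posterior samples of one parameter from two studies
samples_a = np.random.normal(0.20, 0.05, 5000)
samples_b = np.random.normal(0.15, 0.08, 5000)
print(overlap_probability(samples_a, samples_b, tol=0.05))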

Page 15: The good sides of Bayes

What is uncertainty?

Consider a linear problem where d are the data, m the model, G the matrix of partial derivatives and e the data uncertainty.

The estimated solution is obtained by applying a linear inverse operator L to the data, starting from a starting model m0.
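A standard reconstruction of the slide's expressions, assuming conventional linear inverse theory notation:

d = G m + e

m_est = m0 + L (d − G m0)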

Page 16: The good sides of Bayes

What is uncertainty?

This can be rewritten using the resolution operator R, where (I − R) is the null-space operator.

The result is a formal statistical uncertainty expressed with covariance operators.
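A standard reconstruction (assuming the usual definitions, with R = L G the resolution operator, not necessarily the slide's exact expressions):

m_est = R m + (I − R) m0 + L e

C(m_est) ≈ (I − R) C_m (I − R)^T + L C_d L^T

where C_m describes the prior model uncertainty and C_d the data covariance.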

Page 17: The good sides of Bayes

What is uncertainty?

Page 18: The good sides of Bayes

How can we estimate uncertainty?

① Ignore it: should not be an option but is the common approach

② Try and estimate m:
• Regularized extremal bound analysis (Meju, 2009)
• Null-space shuttle (Deal and Nolet, 1996)

③ Probabilistic tomography:
• Neighbourhood algorithm (Sambridge, 1999)
• Metropolis (Mosegaard and Tarantola, 1995)
• Neural networks (Meier et al., 2007)

Page 19: The good sides of Bayes

The most general solution of an inverse problem (Bayes)

posterior(d, m) = k · prior(d, m) × likelihood(d, m)

where the normalization constant k is one over the evidence

Tarantola, 2005

Page 20: The good sides of Bayes

A full model space search should estimate σ(m) = k ρ(m) L(m)

• Exhaustive search
• Brute-force Monte Carlo (Shapiro and Ritzwoller, 2002)
• Simulated annealing (global optimisation with convergence proof)
• Genetic algorithms (global optimisation with no convergence proof)
• Neighbourhood algorithm (Sambridge, 1999)
• Sample the prior ρ(m) and apply the Metropolis rule on L(m). This results in importance sampling of the posterior σ(m) (Mosegaard and Tarantola, 1995)
• Neural networks (Meier et al., 2007)

Page 21: The good sides of Bayes

The neighbourhood algorithm (NA): Sambridge 1999

Stage 1: Guided sampling of the model space. Samples concentrate in areas (neighbourhoods) of better fit.

Page 22: The good sides of Bayes

The neighbourhood algorithm (NA):

Stage 2: Importance sampling. Resampling so that the sampling density reflects the posterior.

(Figure: 2D and 1D marginals)

Page 23: The good sides of Bayes

Advantages of NA

• Interpolation in model space with Voronoi cells

• Relative ranking in both stages (less dependent on data uncertainty)

• Marginals calculated by Monte Carlo integration, which provides a convergence check (see the sketch after this list)

• Marginals are a compact representation of the seismic data and prior rather than a model
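As a sketch (not NA-specific) of how a 1D marginal and a crude convergence check can be obtained from a weighted ensemble; the samples and weights below are hypothetical.

import numpy as np

def marginal_1d(samples, weights, bins=30):
    # weighted histogram approximating a 1D posterior marginal
    hist, edges = np.histogram(samples, bins=bins, weights=weights, density=True)
    return hist, edges

def convergence_check(samples, weights, bins=30):
    # compare the marginal from all samples with the one from every second sample
    full, edges = marginal_1d(samples, weights, bins)
    half, _ = np.histogram(samples[::2], bins=edges, weights=weights[::2], density=True)
    return np.max(np.abs(full - half))

# hypothetical ensemble for one model parameter
samples = np.random.normal(0.0, 1.0, 10000)
weights = np.ones_like(samples)
print(convergence_check(samples, weights))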

Page 24: The good sides of Bayes

Example: A global mantle model

• Using body wave arrival times, surface wave dispersion measurements and normal mode splitting functions

• Same mathematical formulation

Page 25: The good sides of Bayes

Mosca et al., 2011

Page 26: The good sides of Bayes

Mosca et al., 2011

Page 27: The good sides of Bayes

What does it all mean?

Mineral physics will tell us!

Thermo-chemical parameterization:
• Temperature
• Fraction of Pv (pPv)
• Fraction of total Fe

Page 28: The good sides of Bayes

Example: Importance sampling using the Metropolis rule (Mosegaard and Tarantola, 1995)
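The worked example lives in the figures; as a minimal Python sketch of the Mosegaard and Tarantola (1995) rule (draw candidates from the prior, accept them on the likelihood ratio), with a hypothetical two-parameter forward model and Gaussian data errors:

import numpy as np

def forward(m):
    # hypothetical forward model d = g(m), placeholder only
    return np.array([m[0] + m[1], m[0] - m[1]])

def log_likelihood(m, d_obs, sigma):
    r = forward(m) - d_obs
    return -0.5 * np.sum((r / sigma) ** 2)

def sample_prior():
    # hypothetical uniform prior on two parameters
    return np.random.uniform(-1.0, 1.0, size=2)

def metropolis(d_obs, sigma, n_steps=10000):
    # accept a prior draw with probability min(1, L(new)/L(current));
    # the accepted chain then importance-samples the posterior
    m_cur = sample_prior()
    ll_cur = log_likelihood(m_cur, d_obs, sigma)
    chain = []
    for _ in range(n_steps):
        m_new = sample_prior()
        ll_new = log_likelihood(m_new, d_obs, sigma)
        if np.log(np.random.rand()) < ll_new - ll_cur:
            m_cur, ll_cur = m_new, ll_new
        chain.append(m_cur)
    return np.array(chain)

posterior_samples = metropolis(np.array([0.3, 0.1]), sigma=0.1)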

Page 29: The good sides of Bayes
Page 30: The good sides of Bayes

Disadvantages of NA and Metropolis

They work only on small linear and non-linear problems (fewer than ~50 parameters)

Page 31: The good sides of Bayes

The neural network (NN) approach: Bishop 1995, MacKay 2003

• A neural network can be seen as a non-linear filter between any input and output
• The NN is an approximation to a non-linear function g where d = g(m)
• Works on the forward or the inverse function
• A training set (which contains the physics) is used to calculate the coefficients of the NN by non-linear optimisation
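A minimal sketch of the idea (not the network of Meier et al., 2007): generate a training set with the forward model, then fit a network to the inverse mapping. The forward model, network size and training settings below are hypothetical.

import numpy as np
from sklearn.neural_network import MLPRegressor

def forward(m):
    # hypothetical non-linear forward model d = g(m)
    return np.column_stack([m[:, 0] + m[:, 1] ** 3, m[:, 0] - m[:, 1]])

# the training set contains the physics: random models and their predicted data
models = np.random.uniform(-1.0, 1.0, size=(20000, 2))
data = forward(models) + np.random.normal(0.0, 0.01, size=(20000, 2))

# train the network to approximate the inverse function m = g^(-1)(d)
net = MLPRegressor(hidden_layer_sizes=(50, 50), max_iter=500)
net.fit(data, models)

# apply the trained network to 'observed' data
m_est = net.predict(np.array([[0.5, 0.2]]))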

Page 32: The good sides of Bayes

Properties of NN

1. Dimensionality is not a problem because NN approximates a function and not a data prediction!

2. Flexible: invert for any combination of parameters

3. Provides 1D or 2D marginals only

Page 33: The good sides of Bayes

Mantle transition zone discontinuities

Page 34: The good sides of Bayes

Probabilistic tomography using Bayes’ theorem is possible but challenges remain

• Control the prior and data uncertainty

• Full pdfs in high dimensions

• Interpret and visualize the information contained in the marginals