Bayesian LSS Inference - mpe.mpg.de
TRANSCRIPT
J. Jasche, Bayesian LSS Inference
Jens Jasche
Garching, 11 September 2012
Large Scale Bayesian Inference in Cosmology
Introduction: Cosmography
• 3D density and velocity fields
• Power-spectra, bi-spectra
• Dark Energy, Dark Matter, Gravity
• Cosmological parameters

Large Scale Bayesian inference
• High dimensional ( ~10^7 parameters )
• State-of-the-art technology
• On the verge of numerical feasibility
Introduction
Why do we need Bayesian inference?
• Inference of signals = ill-posed problem
• Noise
• Incomplete observations
• Systematics
No unique recovery possible!
Introduction
Object of interest: the signal posterior distribution
“What are the possible signals compatible with observations?”
We can do science!
• Model comparison
• Parameter studies
• Report statistical summaries
• Non-linear, non-Gaussian error propagation
Markov Chain Monte Carlo
Problems:
• High dimensional (~10^7 parameters)
• A large number of correlated parameters
• No reduction of problem size possible
• Complex posterior distributions
Numerical approximation:
• Dim > 4 → MCMC
• Metropolis-Hastings
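In more than a few dimensions the posterior must be explored numerically. As a minimal illustration of the Metropolis-Hastings idea (a toy sketch, not the production code behind this talk), a random-walk sampler with a Gaussian proposal can be written as:

```python
import numpy as np

def metropolis_hastings(log_post, x0, n_samples, step=1.0, seed=0):
    """Random-walk Metropolis-Hastings with an isotropic Gaussian proposal.

    log_post : callable returning the log of the (unnormalized) posterior.
    """
    rng = np.random.default_rng(seed)
    x = np.atleast_1d(np.asarray(x0, dtype=float))
    lp = log_post(x)
    samples = np.empty((n_samples, x.size))
    n_accept = 0
    for i in range(n_samples):
        proposal = x + step * rng.standard_normal(x.size)
        lp_prop = log_post(proposal)
        # accept with probability min(1, P(proposal) / P(x))
        if np.log(rng.uniform()) < lp_prop - lp:
            x, lp = proposal, lp_prop
            n_accept += 1
        samples[i] = x
    return samples, n_accept / n_samples

# usage: sample a 1D standard-normal posterior
samples, acc = metropolis_hastings(lambda x: -0.5 * np.sum(x**2), [3.0], 20000)
```

In 10^7 dimensions a naive random walk like this mixes far too slowly, which is exactly what motivates the Hamiltonian methods below.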
Hamiltonian sampling
Parameter space exploration via Hamiltonian sampling:
• interpret the log-posterior as a potential
• introduce a Gaussian auxiliary “momentum” variable
• the resultant joint posterior distribution of the signal and the momenta is separable in the two
• marginalization over the momenta yields the target posterior again
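In symbols (generic notation, assumed here since the slide's equations are embedded in images: s the signal, d the data, p the momenta, M the mass matrix):

```latex
\psi(s) \equiv -\ln P(s\,|\,d), \qquad
H(s, p) = \psi(s) + \tfrac{1}{2}\, p^\top M^{-1} p,
\\[4pt]
P(s, p) \propto e^{-H(s,p)} = P(s\,|\,d)\,\mathcal{N}(p;\,0,\,M),
\qquad
\int \mathrm{d}p\; P(s, p) = P(s\,|\,d).
```

The joint distribution factorizes, so discarding the momenta of each joint sample leaves exact draws from the target posterior.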
Hamiltonian sampling
IDEA: Use Hamiltonian dynamics to explore the parameter space
• solve the Hamiltonian system to obtain a new sample
• Hamiltonian dynamics conserve the Hamiltonian
• the Metropolis acceptance probability is therefore unity
→ All samples are accepted
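A compact sketch of one such update, with a unit mass matrix (illustrative only; in practice the leapfrog integrator only approximately conserves H, so a Metropolis step corrects the discretization error and the acceptance rate is close to, not exactly, unity):

```python
import numpy as np

def hmc_sample(log_post, grad_log_post, x0, n_samples, eps=0.1, n_leap=20, seed=0):
    """Hamiltonian Monte Carlo with M = I (toy sketch)."""
    rng = np.random.default_rng(seed)
    x = np.atleast_1d(np.asarray(x0, dtype=float))
    samples = np.empty((n_samples, x.size))
    n_accept = 0
    for i in range(n_samples):
        p = rng.standard_normal(x.size)              # Gaussian auxiliary momentum
        h_old = -log_post(x) + 0.5 * p @ p           # H = psi(x) + p^2 / 2
        xn = x.copy()
        pn = p + 0.5 * eps * grad_log_post(xn)       # leapfrog: half kick
        for j in range(n_leap):
            xn = xn + eps * pn                       # drift
            if j < n_leap - 1:
                pn = pn + eps * grad_log_post(xn)    # full kick
        pn = pn + 0.5 * eps * grad_log_post(xn)      # final half kick
        h_new = -log_post(xn) + 0.5 * pn @ pn
        if np.log(rng.uniform()) < h_old - h_new:    # Metropolis on Delta-H
            x = xn
            n_accept += 1
        samples[i] = x
    return samples, n_accept / n_samples
```

Because the proposal follows the dynamics of the target, distant moves are accepted with near-unit probability, unlike the random walk above.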
Hamiltonian sampling
Example: Wiener posterior = multivariate normal distribution
• Gaussian prior
• Gaussian likelihood
EOM: a coupled harmonic oscillator
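Written out in standard Wiener-filter notation (assumed here, since the slide's symbols are embedded in images: S the signal covariance, N the noise covariance, R the response), the posterior and the resulting equations of motion read:

```latex
P(s\,|\,d) \propto
\exp\!\Big[-\tfrac{1}{2}\, s^\top S^{-1} s
           \;-\; \tfrac{1}{2}\, (d - R s)^\top N^{-1} (d - R s)\Big],
\\[4pt]
M \,\frac{\mathrm{d}^2 s}{\mathrm{d}t^2}
   \;=\; -\big(S^{-1} + R^\top N^{-1} R\big)\, s \;+\; R^\top N^{-1} d,
```

i.e. a set of coupled harmonic oscillators driven by the data term.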
Hamiltonian sampling
How to set the mass matrix?
• Large number of tunable parameters
• Determines the efficiency of the sampler
• The mass matrix aims at decoupling the system
• In practice: use a diagonal approximation
• The quality of the approximation determines the sampler efficiency
• Non-Gaussian case: Taylor expand the potential to find the mass matrix
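For the Wiener example above, a natural choice (standard HMC practice, stated here as an assumption since the slide equations are images) is the Hessian of the potential,

```latex
M_{ij} \;=\; \frac{\partial^2 \psi(s)}{\partial s_i\, \partial s_j}
       \;=\; \big(S^{-1} + R^\top N^{-1} R\big)_{ij},
```

which turns the equations of motion into decoupled unit-frequency oscillators. In the non-Gaussian case the same recipe is applied to a second-order Taylor expansion of the potential around a fiducial point, usually keeping only the diagonal so that the momenta remain cheap to draw.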
HMC in action
Inference of non-linear density fields in cosmology
• Non-linear density field
• Log-normal prior ( see e.g. Coles & Jones (1991), Kayo et al. (2001) )
• Galaxy distribution
• Poisson likelihood
• Signal-dependent noise
Problem: Non-Gaussian sampling in high dimensions
HADES (HAmiltonian Density Estimation and Sampling), Jasche, Kitaura (2010)
Image credit: M. Blanton and the Sloan Digital Sky Survey
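A much-simplified sketch of the kind of potential such a sampler explores (one amplitude per cell, diagonal prior covariance; hypothetical variable names, not the actual HADES interface): with expected galaxy counts lambda_i = R_i * exp(s_i), where R_i is a response/selection factor, the log-normal prior and Poisson likelihood give, up to constants,

```python
import numpy as np

def potential(s, sigma2, R, N):
    """Negative log-posterior for a toy log-normal prior + Poisson
    likelihood model with expected counts lambda_i = R_i * exp(s_i)."""
    lam = R * np.exp(s)
    return 0.5 * np.sum(s**2 / sigma2) + np.sum(lam - N * s)

def grad_potential(s, sigma2, R, N):
    """Gradient of the potential, as needed by the HMC leapfrog kicks."""
    return s / sigma2 + R * np.exp(s) - N
```

The exponential in the counts makes the noise signal-dependent, and it is this analytic gradient that lets HMC handle the non-Gaussian posterior in ~10^7 dimensions.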
LSS inference with the SDSS
Application of HADES to SDSS DR7:
• cubic, equidistant box with side length 750 Mpc
• ~3 Mpc grid resolution
• ~10^7 volume elements / parameters
Goal: provide a representation of the SDSS density posterior
• to provide 3D cosmographic descriptions
• to quantify uncertainties of the density distribution
Jasche, Kitaura, Li, Enßlin (2010)
Multiple Block Sampling
What if HMC is not an option?
• Problem: design of “good” proposal distributions
• High rejection rates
Multiple block sampling ( see e.g. Hastings (1997) ):
• Break the problem down into subproblems
• simplifies the design of conditional proposal distributions
• the average acceptance rate is higher
• Requires serial processing
Serial processing only!
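The block idea in its simplest form, for a target whose conditionals are tractable (a toy bivariate Gaussian here, purely illustrative): each block is updated from its conditional distribution in turn, every conditional draw is accepted by construction, but the blocks must be visited one after the other.

```python
import numpy as np

def block_gibbs(rho, n_samples, seed=0):
    """Two-block Gibbs sampler for a bivariate standard normal with
    correlation rho: alternate exact draws from P(x1|x2) and P(x2|x1)."""
    rng = np.random.default_rng(seed)
    x1 = x2 = 0.0
    cond_sd = np.sqrt(1.0 - rho**2)         # conditional std dev of each block
    out = np.empty((n_samples, 2))
    for i in range(n_samples):
        x1 = rng.normal(rho * x2, cond_sd)  # block 1 | block 2
        x2 = rng.normal(rho * x1, cond_sd)  # block 2 | block 1 (serial!)
        out[i] = x1, x2
    return out
```

Each conditional is low-dimensional and easy to propose from, which is the appeal; the cost is the strictly sequential sweep over blocks.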
Multiple Block Sampling
Can we “boost” block sampling?
• Sometimes it is easier to explore the full joint PDF
• Block sampler:
• Permits efficient sampling for numerically expensive posteriors
Process in parallel!
Photometric redshift sampling
Photometric surveys:
• millions of galaxies ( ~10^7 - 10^8 )
• low redshift accuracy ( ~100 Mpc along the LOS )
• Infer accurate redshifts: rather, sample from the joint distribution
• Block sampler:
Process in parallel!
HMC sampler!
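Schematically, with s the density field and z_j the uncertain radial position of galaxy j (generic symbols, assumed here), the block scheme alternates

```latex
s \;\curvearrowleft\; P\big(s \,\big|\, \{z_j\},\, d\big)
\quad \text{(one HMC block)},
\qquad
z_j \;\curvearrowleft\; P\big(z_j \,\big|\, s,\, d_j\big)
\quad \text{for all } j \text{ in parallel},
```

the z_j being conditionally independent of one another once the density field is given, which is what permits the parallel galaxy updates.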
Photometric redshift sampling
Application to artificial photometric data:
• Noise, systematics, position uncertainty (~100 Mpc)
• ~10^7 density amplitudes / parameters
• ~2x10^7 radial galaxy positions / parameters
• ~3x10^7 parameters in total
Photometric redshift sampling
Jasche, Wandelt (2012)
Deviation from the truth (figure: before / after), Jasche, Wandelt (2012)
Deviation from the truth (figure: raw data / density estimate), Jasche, Wandelt (2012)
4D physical inference
Physical motivation:
• Complex final state
• Simple initial state
Initial state → Gravity → Final state
4D physical inference
The ideal scenario:
• We need a very, very, very large computer!
Not practical, even with approximations!
4D physical inference
BORG (Bayesian Origin Reconstruction from Galaxies)
• HMC
• Second-order Lagrangian perturbation theory (2LPT)
Jasche, Wandelt (2012)
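BORG's forward model is second-order Lagrangian perturbation theory; as an illustration of the idea only, the first-order (Zel'dovich) displacement field can be computed from a periodic density grid with FFTs (a sketch under that simplification, not BORG's implementation; `zeldovich_displacement` is a hypothetical helper name):

```python
import numpy as np

def zeldovich_displacement(delta, boxsize):
    """First-order (Zel'dovich) displacement field psi for a periodic
    density-contrast grid delta, via psi(k) = i k delta(k) / k^2, so
    that delta = -div(psi) holds to first order."""
    n = delta.shape[0]
    k1d = 2.0 * np.pi * np.fft.fftfreq(n, d=boxsize / n)
    kx, ky, kz = np.meshgrid(k1d, k1d, k1d, indexing="ij")
    k2 = kx**2 + ky**2 + kz**2
    k2[0, 0, 0] = 1.0                # avoid 0/0; the zero mode is removed below
    delta_k = np.fft.fftn(delta)
    psi = []
    for ki in (kx, ky, kz):
        psi_k = 1j * ki * delta_k / k2
        psi_k[0, 0, 0] = 0.0         # the mean density produces no displacement
        psi.append(np.real(np.fft.ifftn(psi_k)))
    return psi                       # particles move as x = q + D(t) * psi(q)
```

Mapping simple initial conditions through such a model, and sampling the initial field with HMC, is what turns density inference into 4D physical inference.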
Summary & Conclusion
Large scale Bayesian inference:
• Inference in high dimensions from incomplete observations (noise, systematic effects, survey geometry, selection effects, biases)
• Need to quantify uncertainties → explore the posterior distribution with Markov Chain Monte Carlo methods
• Hamiltonian sampling (exploit symmetries, decouple the system)
• Multiple block sampling (break down into subproblems)
Three high-dimensional examples (>10^7 parameters):
• Nonlinear density inference
• Photometric redshift and density inference
• 4D physical inference
The End …
Thank you