
Modeling of physical systems underspecified by data

Daniel M. Tartakovsky

Department of Mechanical & Aerospace Engineering

University of California, San Diego

Outline

1. Physical systems & Stochastic PDEs

2. Data-Driven Domain Decompositions (D4) for SPDEs

3. Implementation of D4

(a) Data analysis & image segmentation

(b) Closure approximations for SPDEs

(c) PDEs on random domains

4. Effective parameters for heterogeneous composites

5. Conclusions

1.1 Background

• "Wisdom begins with the acknowledgment of uncertainty—of the limits of what we know." David Hume (1748), An Enquiry Concerning Human Understanding

• Most physical systems are fundamentally stochastic (Wiener, 1938; Frisch, 1968; Papanicolaou, 1973; Van Kampen, 1976)

• Model & Parameter uncertainty

– Heterogeneity
– Lack of sufficient data
– Measurement noise
  ∗ Experimental errors
  ∗ Interpretive errors


• Randomness as a measure of ignorance

1.2 Probabilistic Framework

• Parameter,

p(x) : D ⊂ Rᵈ → R

• Random field,

p(x, ω) : D × Ω → R

• Ergodicity,

〈p〉 ≈ (1/‖D‖) ∫_D p dx
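A minimal numerical illustration of ergodicity (not from the talk; all names and parameter values are illustrative): for a statistically homogeneous Gaussian field with a correlation length much smaller than the domain, the spatial average of a single realization approaches the ensemble mean.

```python
# Minimal sketch: ergodicity check for a homogeneous Gaussian field with
# exponential covariance, sampled via Cholesky factorization.
import numpy as np

rng = np.random.default_rng(0)
n, mean_p, var_p, corr_len = 2000, 1.0, 0.5, 0.02
x = np.linspace(0.0, 1.0, n)

# Exponential covariance C(r) = var_p * exp(-|r| / corr_len)
C = var_p * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)
L = np.linalg.cholesky(C + 1e-10 * np.eye(n))   # small jitter for stability

p_single = mean_p + L @ rng.standard_normal(n)          # one realization
ensemble = mean_p + L @ rng.standard_normal((n, 500))   # 500 realizations

print("spatial average of one realization:", p_single.mean())
print("ensemble average at x = 0.5:       ", ensemble[n // 2].mean())
# Both estimates approach mean_p = 1.0 because corr_len << |D|.
```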

[Figure: a realization of the random parameter field p(x1, x2).]

• Governing equations become stochastic:

L(x; u) = f(x), x ∈ D
B(x; u) = g(x), x ∈ ∂D

=⇒

L(x, ω; u) = f(x, ω), x ∈ D
B(x, ω; u) = g(x, ω), x ∈ ∂D

1.3 Stochastic PDEs & UQ

• Consider a physical process

L(x; u) = f(x), x ∈ D
B(x; u) = g(x), x ∈ ∂D

– System parameters pi = p(xi), i = 1, …, N

– System states uj = u(xj), j = 1, …, M

• Modeling steps

1. Use data to construct a probabilistic description of p(x, ω)
2. Solve SPDEs to obtain a probabilistic description of u(x, ω)

3. Assimilate uj = u(xj) to refine prior distributions

1.4a Statistical Methods for UQ

• (Brute-force) Monte Carlo methods

– Convergence rate (CR): 1/√N

– CR is independent of the number of random variables (see the sketch after this list)

• Monte Carlo based methods

– Quasi MC (QMC)
– Markov chain MC (MCMC)
– Importance sampling (Fishman, 1996)

• Variance reduction techniques

– Problematic when the number of RVs is large

• Response Surface Methods (RSM)

– Interpolation in the state space reduces the number of realizations
– Problematic when the number of RVs is large
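A minimal sketch of the brute-force approach above (the integrand and parameters are illustrative, not from the talk): the root-mean-square error of a plain Monte Carlo estimate decays like 1/√N, independent of how many random variables the integrand depends on.

```python
# Minimal sketch: 1/sqrt(N) convergence of brute-force Monte Carlo for
# <u> with u = exp(Y), Y ~ N(0, 1); the exact mean is exp(1/2).
import numpy as np

rng = np.random.default_rng(1)
exact = np.exp(0.5)

for N in (10**2, 10**3, 10**4, 10**5):
    # 100 repeated experiments to estimate the RMS error at this N
    estimates = np.exp(rng.standard_normal((100, N))).mean(axis=1)
    rmse = np.sqrt(np.mean((estimates - exact) ** 2))
    print(f"N = {N:>6d}  RMSE = {rmse:.4f}  RMSE*sqrt(N) = {rmse * np.sqrt(N):.3f}")
# The last column is roughly constant: the 1/sqrt(N) convergence rate.
```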

1.4b Stochastic Methods for UQ

• “Indirect” methods

– Fokker-Planck equations

– Moments equations

• “Direct” methods

– Interval analysis: Maximum output bounds

– Operator-based methods

∗ Weighted integral method (Takada, 1990; Deodatis, 1991)

∗ Neumann expansion (Shinozuka, 1988)

• Polynomial chaos expansions

– Grounded in rigorous mathematical theory of Wiener (1938)

– Arbitrary inputs: Generalized polynomial chaos

1.5 Modeling Dichotomy

• Real systems are characterized by

– Non-stationarity (statistical inhomogeneity)
– Multi-modality
– Large variances
– Complex correlation structures

• Standard SPDE techniques require

– Stationarity (statistical homogeneity)
– Small variances
– Simple correlation structures (Gaussian, exponential)
– Uni-modality

• Our goal is

– to incorporate realistic statistical parameterizations
– to enhance predictive power
– to improve computational efficiency

1.6 Reasons for Modeling Dichotomy

• Perturbation expansions

– Small variance σ²_p

• Spectral methods / polynomial chaos expansions

– Large correlation length l_p
– Uni-modality

• Mapping closures

– "Nice" parameter distributions, e.g., Gaussian

• Stochastic upscaling (homogenization)

– Regularity requirements, e.g., spatial periodicity
– Small σ²_p

• Monte Carlo simulations

– N increases with variance σ²_p
– Resolution depends on σ²_p and l_p

Outline

1. Physical systems & Stochastic PDEs

2. Data-Driven Domain Decompositions (D4) for SPDEs

3. Implementation of D4

4. Effective parameters for heterogeneous composites

5. Conclusions

2. Data-Driven Domain Decomposition

[Diagram: random domain decomposition (R.D.D.) splits the domain Ω along a random boundary γ, replacing the parameter distribution f(Π) with the joint distribution f(Π, Γ).]

Multi-modal distributions → Uni-modal distributions
High variances → Low variances
Complex correlation structures → Simple correlation structures

σ²_p = Q1 σ²_p1 + Q2 σ²_p2 + Q1 Q2 [〈p1〉 − 〈p2〉]²
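The decomposition above is the law of total variance for a medium in which a point belongs to material i with probability (volume fraction) Qi. A minimal sampling check, with illustrative parameters:

```python
# Minimal sketch: verify the two-material variance decomposition
# var(p) = Q1*s1^2 + Q2*s2^2 + Q1*Q2*(mu1 - mu2)^2 by sampling.
import numpy as np

rng = np.random.default_rng(2)
Q1, mu1, s1 = 0.3, 5.0, 0.5   # volume fraction, mean, std of material 1
Q2, mu2, s2 = 0.7, 1.0, 0.3   # material 2

n = 10**6
phase1 = rng.random(n) < Q1
p = np.where(phase1, mu1 + s1 * rng.standard_normal(n),
                     mu2 + s2 * rng.standard_normal(n))

predicted = Q1 * s1**2 + Q2 * s2**2 + Q1 * Q2 * (mu1 - mu2)**2
print("sampled variance:  ", p.var())
print("predicted variance:", predicted)
# The term Q1*Q2*(mu1 - mu2)^2 is the between-material contribution that
# D4 removes by conditioning on the geometry.
```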

2.1 Strategy for Domain Decompositions

• Step 1: Decomposition of the parameter space (image processing

techniques; probability maps)

• Step 2: Conditional statistics (noise propagation; closures)

∫_{LΠ} u f(Π|γ) dΠ → 〈u|γ〉

• Step 3: Averaging over random geometries

〈u〉 = ∫ 〈u|Γ〉 f(Γ) dΓ

2.2 Implementation of Domain Decompositions

1. Probabilistic decomposition of the parameter space

• Geostatistical reconstruction of internal geometries
• (Nonstationary) Bayesian spatial statistics
• Statistical Learning Theory (Support Vector Machines)
• Risk-based parameterization

2. Conditional statistics from SPDEs

• Perturbation expansions
• Polynomial chaos expansions
• Collocation methods
• Nonlinear Gaussian mappings

3. PDEs on random domains

• Monte Carlo simulations
• Stochastic mappings
• Perturbation expansions

Outline

1. Physical systems & Stochastic PDEs

2. Data-Driven Domain Decompositions (D4) for SPDEs

3. Implementation of D4

3.1 Decomposition of the parameter space

• Spatial statistics (geostatistics)
• MCMC with Metropolis sampling
• Support Vector Machines

3.2 Conditional Statistics

3.3 Averaging over random geometries

4. Effective parameters for heterogeneous composites

5. Conclusions

Probabilistic Reconstruction of Facies

Mean boundary (a) and the boundary with 95% probability (b)

Outline

1. Physical systems & Stochastic PDEs

2. Data-Driven Domain Decompositions (D4) for SPDEs

3. Implementation of D4

3.1 Decomposition of the parameter space

3.2 Conditional Statistics

• Perturbation solutions

• Polynomial chaos expansions

• Stochastic collocation on sparse grids

3.3 Averaging over random geometries

4. Effective parameters for heterogeneous composites

5. Conclusions

Conditional Statistics

L(x, ω; u) = f, x ∈ D
B(x, ω; u) = g, x ∈ ∂D

with p = p(x, ω), ω ∈ Ω

[Diagram: R.D.D. splits Ω along the random boundary γ]

=⇒

L(x, ω; u) = f, x ∈ D
B(x, ω; u) = g, x ∈ ∂D
continuity conditions, x ∈ γ

with p = p1(x, ω) for x ∈ D1, ω ∈ Ω1, and p = p2(x, ω) for x ∈ D2, ω ∈ Ω2

3.2a Perturbation Solutions

• ∇ · [k(x)∇u] = Q δ(x − x0)

– Yi = ln ki are Gaussian, with Ȳ1 = 3Ȳ2
– σ²_Y1 = σ²_Y2 = 1, σ²_Y ≈ 30
– Exponential correlation functions
– The boundary b is log-normal

• Perturbation in σ²_Yi

• Monte Carlo simulations for b

– 〈u(x)〉: error Eu ≈ 2%
– σ²_u(x): error Eσ ≈ 5%
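A toy analogue of a perturbation expansion in σ²_Y (not the talk's two-material problem; the setup is illustrative): for u = Q/k with k = exp(Y) and Gaussian Y, a second-order expansion gives 〈u〉 ≈ (Q/k_G)(1 + σ²_Y/2), where k_G = exp(Ȳ) is the geometric mean, and the approximation is accurate only for small variances.

```python
# Minimal toy sketch: perturbation expansion in var_Y for u = Q / k with
# k = exp(Y), compared with Monte Carlo and the exact lognormal moment.
import numpy as np

rng = np.random.default_rng(3)
Q, mean_Y = 1.0, 0.0

for var_Y in (0.1, 0.5, 1.0):
    Y = mean_Y + np.sqrt(var_Y) * rng.standard_normal(10**6)
    mc = np.mean(Q / np.exp(Y))                       # Monte Carlo reference
    pert = (Q / np.exp(mean_Y)) * (1.0 + var_Y / 2)   # second-order expansion
    exact = Q * np.exp(-mean_Y + var_Y / 2)           # exact lognormal moment
    print(f"var_Y = {var_Y:.1f}: MC = {mc:.4f}  perturbation = {pert:.4f}  exact = {exact:.4f}")
# The expansion degrades as var_Y grows, which is why D4 first lowers the
# variances before applying perturbation closures.
```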

[Figure: a two-material composite with conductivities K1 and K2 separated by a random boundary b; axes are the longitudinal coordinate x1 and the transverse coordinate x2.]

3.2a Perturbation Solutions (cntd)

Assign weights Wk to each boundary Γk (sketched in code below), so that

〈u〉 = ∫ 〈u|Γ〉 f(Γ) dΓ ≈ Σ_k Wk 〈u|Γk〉

Mean (a) and variance (b) of u
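A minimal sketch of this weighted averaging over boundary realizations; cond_mean below is a hypothetical stand-in for any conditional solver (perturbation, PCE, ...), and all parameters are illustrative.

```python
# Minimal sketch: Step 3 as a weighted sum over sampled boundaries Gamma_k.
import numpy as np

rng = np.random.default_rng(4)

def cond_mean(gamma):
    """Hypothetical conditional mean <u | Gamma = gamma> at a fixed point."""
    return 1.0 / (1.0 + gamma)   # stand-in for an SPDE solve

# Sample boundary positions gamma_k from their distribution f(Gamma)
gammas = rng.normal(loc=0.5, scale=0.1, size=1000)
weights = np.full(gammas.size, 1.0 / gammas.size)   # W_k for plain MC sampling

u_mean = np.sum(weights * np.array([cond_mean(g) for g in gammas]))
print("<u> =", u_mean)
# With a quadrature rule over Gamma, the W_k would instead be the
# quadrature weights of that rule.
```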

3.2b Polynomial Chaos Expansions

• Stochastic PDE

L(x, ω; u) = f(x, ω), x ∈ D
B(x, ω; u) = g(x, ω), x ∈ ∂D

• Generalized polynomial chaos expansions

p(x, t, ω) = Σ_{i=1}^{∞} ai(x, t) Ψi(ω)

– An approximation:

p(x, t, ω) ≈ Σ_{i=1}^{N} ai(x, t) Ψi(ω)

Correspondence between the type of Wiener-Askey polynomial chaos and the underlying random variables:

Distribution          Polynomials
Gaussian              Hermite
Gamma                 Laguerre
Beta                  Jacobi
Uniform               Legendre
Poisson               Charlier
Binomial              Krawtchouk
Negative binomial     Meixner
Hypergeometric        Hahn

(after G. Em Karniadakis and co-workers)

3.2b Polynomial Chaos Expansions (cntd)

• Parameter representation

p(x, t, ω) = Σ_{i=1}^{N} ai(x, t) Ψi(ω)

– Choose an orthogonal polynomial basis
– Reduce N

• Advantages:

– Nonperturbative

– Large fluctuations

• Drawbacks:

– Finite correlations

– Unimodal distributions

[Figure: exact CDF F_X(x) of a highly non-Gaussian unimodal field, compared with a 5th-order Hermite expansion.]

[Figure: exact CDF F_X(x) of a bimodal field, compared with 3rd-, 5th-, and 10th-order Hermite expansions.]
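A minimal sketch of the CDF comparison above, assuming a lognormal (non-Gaussian, unimodal) target X = exp(ξ) with ξ ~ N(0, 1); the gPC coefficients ai = E[X Hei(ξ)]/i! are computed by Gauss-Hermite quadrature, and the truncated series is then sampled to estimate its CDF.

```python
# Minimal sketch: 5th-order probabilists' Hermite (gPC) expansion of a
# lognormal random variable and a pointwise CDF comparison.
import numpy as np
from numpy.polynomial.hermite_e import hermegauss, hermeval
from math import factorial

order = 5
nodes, wts = hermegauss(40)          # probabilists' Hermite quadrature
wts = wts / np.sqrt(2 * np.pi)       # normalize to the N(0,1) density

target = np.exp(nodes)               # X evaluated at the quadrature nodes
coeffs = np.array([np.sum(wts * target * hermeval(nodes, np.eye(order + 1)[i]))
                   / factorial(i) for i in range(order + 1)])

# Sample the truncated expansion and compare CDFs at a few points
rng = np.random.default_rng(5)
xi = rng.standard_normal(10**5)
x_pce = hermeval(xi, coeffs)
x_exact = np.exp(xi)
for q in (0.5, 1.0, 2.0, 4.0):
    print(f"F({q}): exact = {(x_exact <= q).mean():.3f}  "
          f"{order}th-order PCE = {(x_pce <= q).mean():.3f}")
# Repeating this with a bimodal target shows the slow convergence noted in
# the second figure: one global polynomial in xi cannot easily represent
# two modes.
```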

3.2b Polynomial Chaos Expansions (cntd)

Problem: ∇ · k∇u = 0

Parameter measurements ki = k(xi):

[Figure: point measurements of K(x) along x, and the corresponding parameter distribution p(K).]

Solution statistics: ensemble mean (a) and standard deviation (b)

[Figure: mean head and head standard deviation along x, comparing RDD, PC, and MCS solutions.]

3.2c Stochastic Collocation Methods

• Choice of sampling points

– Tensor products of 1-D collocation point sets
– Sparse grids (nested or non-nested)

• Choice of weight functions

ϑ(ξ) = δ(ξ − ξk)

• Advantage:

– Nonintrusive

• Disadvantage

– Can be less accurate than PCE
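A minimal nonintrusive collocation sketch in one random dimension (solve_deterministic is a hypothetical stand-in for a PDE solver; parameters are illustrative): each collocation point costs one deterministic solve, and moments come from the quadrature weights.

```python
# Minimal sketch: stochastic collocation with a 1-D Gauss-Hermite point set.
import numpy as np
from numpy.polynomial.hermite_e import hermegauss

def solve_deterministic(k):
    """Hypothetical deterministic solver, here simply u = 1 / k."""
    return 1.0 / k

sigma = 0.5
nodes, wts = hermegauss(9)                # collocation points xi_k and weights
wts = wts / np.sqrt(2 * np.pi)            # normalize to the N(0,1) density

u = np.array([solve_deterministic(np.exp(sigma * x)) for x in nodes])
mean = np.sum(wts * u)
var = np.sum(wts * (u - mean) ** 2)
print("mean =", mean, " exact =", np.exp(sigma**2 / 2))
print("var  =", var,  " exact =", np.exp(sigma**2) * (np.exp(sigma**2) - 1))
# In d > 1 random dimensions the tensor product of such 1-D sets grows
# exponentially, which is what sparse (Smolyak) grids are designed to avoid.
```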

[Figure: mean and variance of the solution h over the domain (x1, x2, x3).]

Outline

1. Physical systems & Stochastic PDEs

2. Data-Driven Domain Decompositions (D4) for SPDEs

3. Implementation of D4

3.1 Decomposition of the parameter space

3.2 Conditional Statistics

3.3 Averaging over random geometries

4. Effective parameters for heterogeneous composites

5. Conclusions

Computational Approach

• A deterministic equation in a random domain

L(x; u) = f(x), x ∈ D(ω)
B(x; u) = g(x), x ∈ ∂D(ω)

• Step 1: Stochastic mapping

ξ = ξ(x; ω), x = x(ξ; ω), x ∈ D(ω), ξ ∈ E

• Step 2: Solving a stochastic equation in a deterministic domain

L(ξ, ω; u) = f(ξ, ω), ξ ∈ E
B(ξ, ω; u) = g(ξ, ω), ξ ∈ ∂E
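A minimal 1-D illustration of the two steps (the problem and numbers are illustrative, not from the talk): mapping −u″ = 1 on a random interval (0, L(ω)) with u(0) = u(L) = 0 onto the fixed domain (0, 1) via ξ = x/L(ω) turns domain randomness into coefficient randomness, −u_ξξ = L(ω)².

```python
# Minimal sketch: deterministic equation on a random domain, solved as a
# stochastic equation on the fixed computational domain E = (0, 1).
import numpy as np

rng = np.random.default_rng(6)
xi = np.linspace(0.0, 1.0, 101)           # fixed computational domain E

def solve_mapped(Lw):
    # Exact solution of -u'' = Lw^2 on (0, 1) with zero boundary values
    return 0.5 * Lw**2 * xi * (1.0 - xi)

L_samples = rng.normal(loc=1.0, scale=0.1, size=5000)  # random domain sizes
sols = np.array([solve_mapped(L) for L in L_samples])

print("mean of u at the midpoint:", sols[:, 50].mean())
print("std  of u at the midpoint:", sols[:, 50].std())
# All randomness now lives in the coefficient L(w)^2 on a fixed domain,
# so any SPDE method (MC, PCE, collocation) applies unchanged.
```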

Random Mapping

• Surface parameterization

– Karhunen-Loève representations
– Fourier-type expansions
– Etc.

• Numerical mappings, e.g.,

∇²_ξ xi = 0, xi|∂E = xi|∂D(ω)

• Analytical mappings

[Figure: the mappings ξ = ξ(x) and x = x(ξ) between the random domain D(ω), with boundary points Q1-Q4, and the deterministic domain E, with boundary points P1-P4.]

Random mapping, D(ω) → E

Computational examples

• Steady-state diffusion

∇²u(x; ω) = 0, x ∈ D(ω)

– Random bottom of a channel

– Random exclusion

– Numerical mapping

• Transport by Stokes flow in a pipe

∂u/∂t + v · ∇u = a ∇²u, x ∈ D(ω)

– Lubrication approximation
– Analytical mapping

[Figures: example geometries with boundary conditions u = 1 on one boundary segment and u = 0 elsewhere: a square domain with a random exclusion and a channel with a random bottom.]

Steady Diffusion

[Figure: a realization of the random channel in physical coordinates (x1, x2) and its image in the mapped coordinates (ξ1, ξ2).]

• Random surface: exponential correlation function, λ = 1

• 10-term Karhunen-Loève representation

• 10-dimensional (N = 10) random space
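A minimal sketch of such a truncated Karhunen-Loève representation (discretization and parameters are illustrative), built from the eigendecomposition of the discretized covariance matrix:

```python
# Minimal sketch: 10-term Karhunen-Loeve representation of a Gaussian
# surface with exponential correlation (lambda = 1).
import numpy as np

rng = np.random.default_rng(7)
n, lam, sigma = 200, 1.0, 0.1
x = np.linspace(0.0, 5.0, n)

C = sigma**2 * np.exp(-np.abs(x[:, None] - x[None, :]) / lam)
vals, vecs = np.linalg.eigh(C)            # eigenvalues in ascending order
vals, vecs = vals[::-1], vecs[:, ::-1]    # sort descending

N = 10                                    # truncation: N-dimensional random space
Z = rng.standard_normal(N)                # one point in the random space
surface = vecs[:, :N] @ (np.sqrt(vals[:N]) * Z)   # one surface realization

print("fraction of variance kept by 10 terms:", vals[:N].sum() / vals.sum())
# Each realization of the random boundary is now a smooth function of the
# N = 10 Gaussian variables Z, i.e., the problem lives in a 10-D random space.
```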

Steady Diffusion: Results

[Figure: contours of the solution mean and standard deviation in mapped coordinates (ξ1, ξ2), and the standard deviation along x for 1st-, 2nd-, and 3rd-order sparse-grid (SG) solutions.]

Mean & standard deviation PCE vs. MCS

4. Effective Parameters

• Step 1: Conditional effective conductivity

– K_i^eff(x|γ) = 〈Ki〉 I + ki(x), where ki = [I − Bi]⁻¹ Ai

– Closure by perturbation

• Step 2: Averaging over random geometries γ
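A minimal 1-D sketch of Step 2 (not the talk's perturbation closure; parameters are illustrative): for a two-layer medium with a random interface position b, the conditional effective conductivity is the harmonic mean, and the apparent conductivity is its average over sampled geometries.

```python
# Minimal sketch: apparent conductivity of a two-layer medium obtained by
# averaging the conditional (harmonic-mean) conductivity over geometries.
import numpy as np

rng = np.random.default_rng(8)
K1, K2 = 10.0, 1.0   # layer conductivities (illustrative)

# Random interface position b; clip keeps it inside the unit interval
b = np.clip(rng.normal(0.5, 0.1, size=10**5), 0.01, 0.99)

# Conditional effective (series / harmonic-mean) conductivity given b
K_cond = 1.0 / (b / K1 + (1.0 - b) / K2)

print("apparent conductivity <K_eff>:", K_cond.mean())
print("homogeneous-model value      :", 1.0 / (0.5 / K1 + 0.5 / K2))
# The gap between the two grows with the geometric uncertainty, echoing
# the dependence on sigma_beta in the figure below.
```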

[Figure: apparent conductivity Kapp versus the coordinate x for σβ = 0.1, 0.25, and 1.0, compared with the homogeneous model.]

Conclusions

• Stochastic PDEs provide a natural framework for uncertainty

quantification

• In real applications, SPDEs require a complex statistical

parameterization

• D4 allows for incorporation of expert knowledge and diverse data

• New concept of effective properties

• Ability to make predictions and quantify uncertainty in realistic settings
