Efficient, Accurate, and Non-Gaussian Statistical Error Propagation Through Nonlinear System Models
Travis V. Anderson
July 26, 2011
Graduate Committee: Christopher A. Mattson, David T. Fullwood, Kenneth W. Chase


TRANSCRIPT

Page 1:

Efficient, Accurate, and Non-Gaussian Statistical Error Propagation Through Nonlinear System Models

Travis V. Anderson July 26, 2011

Graduate Committee: Christopher A. Mattson, David T. Fullwood, Kenneth W. Chase

Page 2:

Presentation Outline

Section 1: Introduction & Motivation
Section 2: Uncertainty Analysis Methods
Section 3: Propagation of Variance
Section 4: Propagation of Skewness & Kurtosis
Section 5: Conclusion & Future Work

Page 3:

Section 1: Introduction & Motivation

Page 4:

Engineering Disasters

Tacoma Narrows Bridge

Hindenburg

Space Shuttle Challenger

Chernobyl

Page 5:

F-35 Joint Strike Fighter

Page 6:

Research Motivation

• Allow the system designer to quantify system model accuracy more quickly and accurately

• Allow the system designer to verify design decisions at the time they are made

• Prevent unnecessary design iterations and system failures by creating better system designs


Page 7:

Section 2: Uncertainty Analysis Methods

Page 8:

Uncertainty Analysis Methods

• Error Propagation via Taylor Series Expansion
• Brute Force Non-Deterministic Analysis (Monte Carlo, Latin Hypercube, etc.)
• Deterministic Model Composition
• Error Budgets
• Univariate Dimension Reduction
• Interval Analysis
• Bayesian Inference
• Response Surface Methodologies
• Anti-Optimizations

Page 9:

Brute Force Non-Deterministic Analysis


• Fully-described, non-Gaussian output distribution can be obtained

• Simulation must be executed again each time any input changes

• Computationally expensive
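To make the brute-force approach concrete, here is a minimal Monte Carlo sketch (not taken from the slides): the toy model y = 1000 sin(x) and the input distribution are assumptions for illustration only. The full, non-Gaussian output distribution comes at the cost of re-running the model for every sample, and again whenever any input changes.

```python
import numpy as np

# Brute-force (Monte Carlo) propagation: sample the inputs, run the model
# once per sample, and inspect the output distribution directly.
# The model and the input statistics below are assumed for illustration only.
rng = np.random.default_rng(0)

def model(x):
    return 1000.0 * np.sin(x)  # placeholder nonlinear system model

x = rng.normal(loc=np.pi / 2, scale=np.sqrt(0.2), size=100_000)
y = model(x)  # one full model evaluation per sample

print("mean     =", y.mean())
print("variance =", y.var())
print("skewness =", ((y - y.mean()) ** 3).mean() / y.std() ** 3)
```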

Page 10:

Deterministic Model Composition

• A compositional system model is created
• Each component's error is included in an error-augmented system model
• Component error values are varied as the model is executed repeatedly to determine max/min error bounds

Page 11:

Error Budgets

• Error in one component is perturbed at a time
• Each perturbation's effect on model output is observed
• Either errors must be independent or a separate model of error interactions is required

Page 12:

Univariate Dimension Reduction

• Data is transformed from a high-dimensional space to a lower-dimensional space

• In some situations, analysis in reduced space may be more accurate than in the original space

Page 13:

Interval Analysis

• Measurement and rounding errors are bounded
• Arithmetic can be performed using intervals instead of a single nominal value
• Many software languages, libraries, compilers, data types, and extensions support interval arithmetic (XSC, Profil/BIAS, Boost, Gaol, Frink, MATLAB (Intlab))
• IEEE Interval Standard (P1788)
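As a sketch of the underlying idea only (not tied to any of the libraries listed above), interval arithmetic can be implemented directly; the Interval class and the example bounds below are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class Interval:
    lo: float
    hi: float

    def __add__(self, other):
        # Sum of two intervals: add the corresponding bounds.
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        # Product: the true range is bracketed by the extreme corner products.
        corners = (self.lo * other.lo, self.lo * other.hi,
                   self.hi * other.lo, self.hi * other.hi)
        return Interval(min(corners), max(corners))

# A measured value of 2.0 +/- 0.1 times a factor known to lie in [2.9, 3.1]:
print(Interval(1.9, 2.1) * Interval(2.9, 3.1))  # Interval(lo=5.51, hi=6.51)
```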

Page 14:

Bayesian Inference

• Combines common-sense knowledge with observational evidence

• Meaningful relationships are declared, all others are ignored

• Attempts to eliminate needless model complexity

Page 15:

Response Surface Methodologies

• Typically uses experimental data and design of experiments techniques

• An n-dimensional response surface shows the output relationship among n input variables

Page 16:

Anti-Optimizations

• Two-tiered optimization problem
• Uncertainty is anti-optimized on a lower level to find the worst-case scenario
• The overall design is then optimized on a higher level to find the best design

Page 17:

Section 3: Propagation of Variance

Page 18:

Central Moments

• 0th Central Moment is 1
• 1st Central Moment is 0
• 2nd Central Moment is variance
• 3rd Central Moment is used to calculate skewness
• 4th Central Moment is used to calculate kurtosis
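In symbols, for a random variable X with mean μ and standard deviation σ, the k-th central moment and the statistics built from it are:

```latex
\[
\mu_k = \mathrm{E}\!\left[(X-\mu)^k\right], \qquad
\mu_0 = 1, \quad \mu_1 = 0, \quad \mu_2 = \sigma^2, \quad
\text{skewness} = \frac{\mu_3}{\sigma^3}, \quad
\text{kurtosis} = \frac{\mu_4}{\sigma^4}.
\]
```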

Page 19:

First Order Taylor Series
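The equation on this slide is not reproduced in the transcript; the standard first-order expansion of a model y = f(x_1, ..., x_n) about the input means is:

```latex
\[
y \;\approx\; f(\boldsymbol{\mu}) \;+\; \sum_{i=1}^{n} \left.\frac{\partial f}{\partial x_i}\right|_{\boldsymbol{\mu}} (x_i - \mu_i)
\]
```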

Page 20:

First-Order Formula Derivation

Square and take the expectation of both sides.

Assumption: inputs are independent, so the covariance term vanishes.
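Carrying out those two steps on the first-order expansion gives the familiar result below; when the inputs are independent the covariance term is zero:

```latex
\[
\sigma_y^2 \;\approx\; \sum_{i=1}^{n}\left(\frac{\partial f}{\partial x_i}\right)^{2}\sigma_{x_i}^{2}
\;+\; \underbrace{2\sum_{i<j}\frac{\partial f}{\partial x_i}\frac{\partial f}{\partial x_j}\,\mathrm{Cov}(x_i,x_j)}_{\text{covariance term}}
\]
```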

Page 21:

First-Order Error Propagation

• The error-propagation formula most often cited in the literature

• Frequently used “blindly” without an appreciation of its underlying assumptions and limitations

Page 22:

Assumptions and Limitations

1. The approximation is generally more accurate for linear models (addressed in this section)
2. Only variance is propagated; higher-order statistics are neglected (addressed in Section 4)
3. All inputs are assumed to be Gaussian (addressed in Section 4)
4. System outputs and output derivatives can be obtained
5. Taking the Taylor series expansion about a single point causes the approximation to be of local validity only
6. The input means and standard deviations must be known
7. All inputs are assumed to be independent

Page 23:

First-Order Accuracy

Function: y = 1000 sin(x)
Input variance: 0.2

100% error: unacceptable!
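A quick numerical check of this slide (the input mean is not stated; x ~ N(π/2, 0.2) is assumed here because the first derivative vanishes there, which is the worst case for the first-order formula):

```python
import numpy as np

# First-order propagated variance vs. a Monte Carlo reference for y = 1000*sin(x).
# The input mean is an assumption for illustration; the variance comes from the slide.
mu, var_x = np.pi / 2, 0.2

first_order = (1000 * np.cos(mu)) ** 2 * var_x   # (dy/dx)^2 * sigma_x^2  ->  ~0

rng = np.random.default_rng(0)
y = 1000 * np.sin(rng.normal(mu, np.sqrt(var_x), 1_000_000))

print(first_order, y.var())   # ~0 versus roughly 1.6e4, i.e. ~100% error
```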

Page 24:

Second-Order Error Propagation

Just as before:
1. Subtract the expectation of a second-order Taylor series from a second-order Taylor series
2. Square both sides and take the expectation

Odd moments are zero.

Assumption: inputs are Gaussian
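For a single Gaussian input, the procedure above gives the standard second-order result (a single-variable sketch; the multivariable form sums analogous terms over independent inputs):

```latex
\[
\sigma_y^2 \;\approx\; \left(\frac{df}{dx}\right)^{2}\sigma_x^{2} \;+\; \frac{1}{2}\left(\frac{d^{2}f}{dx^{2}}\right)^{2}\sigma_x^{4}
\]
```

The cross term drops out because the third central moment of a Gaussian input, an odd moment, is zero.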

Page 25:

Second-Order Error Propagation

• The second-order formula for error propagation most often cited in the literature

• Like the first-order approximation, the second-order approximation is also frequently used “blindly” without an appreciation of its underlying assumptions and limitations

Page 26:

Second-Order Accuracy

Function: y = 1000 sin(x)
Input variance: 0.2
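As a worked check (the slide does not state the input mean; x = π/2 is assumed here, matching the worst case used for the first-order formula), the second-order estimate recovers most of the variance but still leaves roughly 20% truncation error:

```latex
\[
\sigma_y^2 \;\approx\; \left(1000\cos\tfrac{\pi}{2}\right)^{2}(0.2) \;+\; \tfrac{1}{2}\left(1000\sin\tfrac{\pi}{2}\right)^{2}(0.2)^{2}
\;=\; 2.0\times10^{4},
\qquad \text{vs. an exact value of} \;\approx\; 1.64\times10^{4}.
\]
```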

Page 27:

Higher-Order Accuracy

Function: y = 1000 sin(x)
Input variance: 0.2

Page 28:

Computational Cost


Page 29:

Predicting Truncation Error


• How can we achieve higher-order accuracy with lower-order cost?

Page 30:

Predicting Truncation Error


• Can Truncation Error Be Predicted?

Page 31:

Adding A Correction Factor


Trigonometric (2nd Order): y = sin(x) or y = cos(x)
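The correction factor itself is developed in the thesis and is not reproduced in this transcript; as a reference point, the exact variance that a corrected trigonometric estimate must approach is available in closed form for a Gaussian input x ~ N(μ, σ²):

```latex
\[
\mathrm{Var}[\sin x] \;=\; \tfrac{1}{2}\left(1 - e^{-2\sigma^{2}}\cos 2\mu\right) \;-\; e^{-\sigma^{2}}\sin^{2}\mu
\]
```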

Page 32:

Trigonometric Correction Factor


Page 33:

Correction Factors

Exponential (1st Order): y = exp(x)

Natural Log (1st Order): y = ln(x)
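For the exponential case the exact target is also known in closed form (the log-normal variance), so the gap a first-order correction factor must close is explicit; for a Gaussian input x ~ N(μ, σ²):

```latex
\[
\mathrm{Var}\!\left[e^{x}\right] \;=\; e^{2\mu+\sigma^{2}}\left(e^{\sigma^{2}}-1\right),
\qquad \text{vs. the first-order estimate } e^{2\mu}\sigma^{2}.
\]
```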


Page 34:

Correction Factors

Exponential (1st Order): y = b^x

Page 35:

So What Does All This Mean?

• We can achieve higher-order accuracy with lower-order computational cost

(Chart: average error vs. computational cost)

Page 36:

Kinematic Motion of Flapping Wing


Page 37:

Accuracy of Variance Propagation

Order   RMS Rel. Err.
2nd     40.97%
3rd     11.18%
4th      1.32%
CF       1.96%

Page 38:

Computational Cost

Execution time was reduced from ~70 minutes to ~4 minutes, a computational cost reduction of roughly a factor of 17.5

Fourth-order accuracy was obtained with only second-order computational cost

Page 39:

Section 4: Propagation of Skewness & Kurtosis

Page 40:

Non-Gaussian Error Propagation

(Plots: predicted Gaussian output vs. actual system output, and predicted non-Gaussian output vs. actual system output)

Page 41:

Skewness

• Measure of a distribution's asymmetry
• A symmetric distribution has zero skewness
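In symbols:

```latex
\[
\text{skewness} \;=\; \gamma_1 \;=\; \frac{\mathrm{E}\!\left[(y-\mu_y)^{3}\right]}{\sigma_y^{3}}
\]
```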

Page 42:

Propagation of Skewness


• Based on a second-order Taylor series
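The propagated expression itself is given in the thesis; as a sketch of where it comes from, a second-order expansion of a single Gaussian input yields a third central moment of

```latex
\[
\mathrm{E}\!\left[(y-\mu_y)^{3}\right] \;\approx\; 3\left(\frac{df}{dx}\right)^{2}\frac{d^{2}f}{dx^{2}}\,\sigma_x^{4} \;+\; \left(\frac{d^{2}f}{dx^{2}}\right)^{3}\sigma_x^{6},
\]
```

which is divided by σ_y³ to give the skewness.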

Page 43:

Kurtosis & Excess Kurtosis


• Measure of a distribution’s “peakedness” or thickness of its tails

Kurtosis

Excess Kurtosis
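In symbols:

```latex
\[
\text{kurtosis} \;=\; \frac{\mathrm{E}\!\left[(y-\mu_y)^{4}\right]}{\sigma_y^{4}},
\qquad
\text{excess kurtosis} \;=\; \frac{\mathrm{E}\!\left[(y-\mu_y)^{4}\right]}{\sigma_y^{4}} \;-\; 3
\]
```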

Page 44:

Propagation of Kurtosis


• Based on a second-order Taylor series
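Again as a single-Gaussian-input sketch of the second-order result (the thesis gives the general multivariable form), the fourth central moment works out to

```latex
\[
\mathrm{E}\!\left[(y-\mu_y)^{4}\right] \;\approx\; 3\left(\frac{df}{dx}\right)^{4}\sigma_x^{4}
\;+\; 15\left(\frac{df}{dx}\right)^{2}\left(\frac{d^{2}f}{dx^{2}}\right)^{2}\sigma_x^{6}
\;+\; \frac{15}{4}\left(\frac{d^{2}f}{dx^{2}}\right)^{4}\sigma_x^{8},
\]
```

which is divided by σ_y⁴ to give the kurtosis.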

Page 45:

Flat Rolling Metalworking Process

Inputs: coefficient of friction, roller radius
Output: maximum change in material thickness achieved in a single pass
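The governing equation on the slide is not reproduced in the transcript; the standard flat-rolling relation for the maximum draft, which appears to be the model used here, is

```latex
\[
\Delta H_{\max} \;=\; \mu^{2} R,
\]
```

where μ is the coefficient of friction and R is the roller radius.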

Page 46:

Input Distribution


Page 47:

Gaussian Error Propagation

• Probability Overlap: 53%

(Plot: predicted Gaussian output vs. actual system output)

Page 48:

Non-Gaussian Error Propagation

• Probability Overlap: 93%

(Plot: predicted non-Gaussian output vs. actual system output)

Page 49:

Benefits of Higher-Order Statistics

                               Gaussian   Non-Gaussian
Accuracy:                       53%        93%
Max ΔH (99.5% success rate):    3.0 cm     7.9 cm

That is roughly a 2.6x reduction in the number of passes!

Page 50:

Section 5: Conclusion & Future Work

Page 51:

Conclusion

• Fourth-order accuracy in variance propagation can be achieved with only first- or second-order computational cost

• Designers do not need to assume Gaussian output. A fully-described output distribution can be obtained without significant additional cost


Page 52:

Future Work

• Develop predictable correction factors for other types of nonlinear functions and models (differential equations, state-space models, etc.)
• Apply correction factors to open-form models
• Can correction factors be obtained for skewness and kurtosis propagation?


Page 53:

Questions?


Page 54:

Page 55:

Variance Example: Whirlybird


Page 56:

Variance Example: Whirlybird


Compositional Model

System Model (Pitch)

Page 57:

Higher-Order Stats Example: Thrust


Thrust Output

Page 58:

Higher-Order Stats Example: Thrust


Input Distribution

Gaussian output vs. actual output overlap: 65%; non-Gaussian output vs. actual output overlap: 79%

Page 59:

Non-Gaussian Proof


Propagation of Skewness

Even Gaussian Inputs Produce Skewed Outputs If 2nd Derivatives Are Non-Zero

(Nonlinear Systems)
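In terms of the single-input sketch from Section 4 this is immediate: the propagated third central moment

```latex
\[
\mathrm{E}\!\left[(y-\mu_y)^{3}\right] \;\approx\; \frac{d^{2}f}{dx^{2}}\,\sigma_x^{4}\left(3\left(\frac{df}{dx}\right)^{2} + \left(\frac{d^{2}f}{dx^{2}}\right)^{2}\sigma_x^{2}\right)
\]
```

is non-zero whenever the second derivative is non-zero, so a purely Gaussian input still produces a skewed output.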