
Page 1: ASQ Talk v4

Robots in Uncertain & Noisy World

Ghulam Mustafa 5/13/2015

Page 2: ASQ Talk v4

Deterministic Mechanics: Rigid Body Kinematics | Newton’s Law | Control

Robots in Uncertain & Noisy World: Stochastic Dynamics | Estimation | Filtering / Prediction / Inference

dx/dt = f(x,u,t) + n(t)

Forward–Backward Algorithm: Featherstone | Kalman

Page 3: ASQ Talk v4

Tonight's Recurring Themes

Translation / Rotation [Algorithms for computing Newtonian/Eulerian kinematics]

Forward / Backward [Algorithms for computing kinematics/dynamics and state prediction/estimation]

F = ma / Ax = b [Algorithms for predicting dynamics and estimating states from measurements]

Page 4: ASQ Talk v4

Part 1 Deterministic Dynamics & Control

(Articulated Rigid Bodies)

Page 5: ASQ Talk v4

Content of this Talk: Part 1 – Articulated Rigid Bodies

Some Definitions
Spatial Motion of Rigid Bodies
Denavit–Hartenberg Representation
Sheth–Uicker Representation
Computations and Algorithms
Kinematics of Some Wheeled Robots
Time Derivatives in ICC
The Wheel Jacobian
Computations and Algorithms
Velocity–Torque Duality
Newton–Euler Recursive Algorithm
Robot Control

Page 6: ASQ Talk v4

“In theory, there is no difference between theory and practice. In practice there is.”

Page 7: ASQ Talk v4

Part 2 Stochastic Dynamics (Filtering, Estimation and Prediction)

Page 8: ASQ Talk v4

Content of this Talk: Part 2 – Filtering, Estimation and Prediction

In the Beginning – Geometry of Ax=b
Minimizing Error – (Vanilla LS)
Weighted – (Vanilla w/ Cream LS)
Recursive – (Vanilla w/ Cream & Nuts on top LS)
… and Beyond (The Kalman Filter)
Rules of Probability (Bayes’ POV)
Graphical Models – Deconstructing Bayes
Noisy Measurements and Estimation
Repeated Noisy Measurements – Recursive Bayes
Noisy Measurements and Estimation
Hidden Markov Model (HMM)
Forward–Backward Algorithm

Page 9: ASQ Talk v4

It's tough to make predictions, especially about the future.

Yogi Berra

Page 10: ASQ Talk v4

Gauss-Markov-Kalman [and sometimes Bayes]

Carl Friedrich Gauss

1777-1855

Andrey Markov 1856-1922

Rudolf Kalman 1930

John Flaig


Thomas Bayes 1701-1761

Page 11: ASQ Talk v4

Motivation: Deterministic models provide an adequate description of dynamics and control – why complicate things with stochasticity?

Incomplete Deterministic Models: Models are based on assumptions and hence are approximations – ignoring higher modes does not make them go away.

Extraneous Disturbance: Systems are driven not just by deterministic control inputs but also by uncontrollable environmental factors – wind gusts, treacherous terrain.

Incomplete/Noisy Measurements: Not everything is amenable to measurement (it is easier to measure position than velocity), and measurement errors are unavoidable.

Estimate the state x_k from noisy measurements z_k

Page 12: ASQ Talk v4

How to …

Model Development: Develop models that account for uncertainties and are practical to implement.

Optimal Estimation: How to estimate model behavior based on incomplete and noisy sensor data – fuse data from multiple sources, recursively in real-time.

Optimal Control: Given an uncertain system description and incomplete, noisy, corrupted data, how to optimally control the system for the desired performance.

Estimate the state x_k from noisy measurements z_k

Performance Evaluation: How to evaluate performance capabilities of estimation and control both before and after they are built.

Page 13: ASQ Talk v4

Problem Formulation

System Dynamics :

Robot EOM:  M(q)q̈ + V(q, q̇) + G(q) = τ

dx/dt = f(x(t), u(t), w(t), t)

Observation:  z(t) = h(x(t), u(t), v(t), t)

Linear Model:

dx/dt = A x(t) + B u(t) + w(t)
z(t) = H x(t) + v(t)

Discrete Time:

x_k = A x_{k-1} + B u_{k-1} + w_{k-1}   (state x at t_k)
z_k = H x_k + v_k   (observation z at t_k)

Estimate the state x_k from noisy measurements z_k
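To make the discrete-time model concrete, here is a minimal Python/numpy sketch that simulates x_k = A x_{k-1} + B u_{k-1} + w_{k-1} and z_k = H x_k + v_k for a hypothetical 1-D robot; the specific A, B, H, Q, R values are illustrative assumptions, not taken from the talk.

import numpy as np

# Hypothetical 1-D robot: state x = [position, velocity], fixed time step dt.
dt = 0.1
A = np.array([[1.0, dt], [0.0, 1.0]])   # state transition
B = np.array([[0.5 * dt**2], [dt]])     # control (acceleration) input
H = np.array([[1.0, 0.0]])              # we only measure position
Q = 0.01 * np.eye(2)                    # system noise covariance (assumed)
R = np.array([[0.25]])                  # measurement noise covariance (assumed)

rng = np.random.default_rng(0)
x = np.array([0.0, 0.0])                # true initial state
u = np.array([1.0])                     # constant commanded acceleration

for k in range(5):
    w = rng.multivariate_normal(np.zeros(2), Q)   # process noise w_{k-1}
    x = A @ x + B @ u + w                         # x_k = A x_{k-1} + B u_{k-1} + w_{k-1}
    v = rng.normal(0.0, np.sqrt(R[0, 0]))         # measurement noise v_k
    z = H @ x + v                                 # z_k = H x_k + v_k
    print(f"k={k+1}  true x={x}  noisy z={z}")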

Page 14: ASQ Talk v4

Intro to Least Squares – Geometry of Ax=b

Ax = b:

[ a11  a12 ] [ x1 ]   [ b1 ]
[ a21  a22 ] [ x2 ] = [ b2 ]

x1 [ a11 ]  +  x2 [ a12 ]  =  [ b1 ]
   [ a21 ]        [ a22 ]     [ b2 ]

Linear Combination of Columns of A

What if b is NOT in S(A)?

Solution exists if b lies in S(A) [space spanned by columns of A]

Column space of A:  A x̂ = Pb (projection of b onto S(A)).

Project b on S(A) and call it the best estimate of x.

Minimize the error e = b − A x̂.

Page 15: ASQ Talk v4

Minimizing the Error – Vanilla LS

Consider Ax = b (assume b to be the measurements and x the state). The best solution is the one that minimizes the norm square of the error:

Minimize ‖e‖² = (b − A x̂)^T (b − A x̂)

d‖e‖²/dx̂ = −2 A^T b + 2 A^T A x̂ = 0

A^T A x̂ = A^T b   ⇒   x̂ = (A^T A)^-1 A^T b

Projection P (on S(A)):  P = A (A^T A)^-1 A^T

Recall, for a non-square A:  A^T A x = A^T b,  x = (A^T A)^-1 A^T b
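A minimal numpy sketch of the normal-equation solution x̂ = (A^T A)^-1 A^T b above; the data are invented for illustration, and in practice np.linalg.lstsq is the numerically safer route.

import numpy as np

# Overdetermined system: more measurements (rows) than unknowns.
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0],
              [1.0, 4.0]])
b = np.array([1.1, 1.9, 3.2, 3.9])      # noisy measurements

# Normal equations: A^T A x_hat = A^T b
x_hat = np.linalg.solve(A.T @ A, A.T @ b)

# Projection onto the column space S(A): P = A (A^T A)^{-1} A^T
P = A @ np.linalg.inv(A.T @ A) @ A.T
e = b - A @ x_hat                        # residual error

print(x_hat)
print(np.allclose(P @ b, A @ x_hat),     # Pb is exactly A x_hat
      np.allclose(A.T @ e, 0.0))         # error is orthogonal to S(A)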

Page 16: ASQ Talk v4

Weighted Residual – Vanilla w/ Cream LS

If the measurements are not equally reliable:  W A x = W b   (W is a diagonal matrix of weights w_i)

‖We‖² = w_1² e_1² + w_2² e_2² + w_3² e_3² + …

(WA)^T (WA) x̂ = (WA)^T (Wb)   ⇒   x̂ = ((WA)^T WA)^-1 (WA)^T Wb

Recall, when the measurements are equally reliable (W = I):  A^T A x = A^T b,  x = (A^T A)^-1 A^T b

Mean error:  E[e] = ∫ e p(e) de
Variance:  E[e²] = σ² = ∫ e² p(e) de
Co-variance:  cv_ij = E[e_i e_j] = ∫∫ e_i e_j p(e_i, e_j) de_i de_j

Weighting by the inverse of the measurement error covariance V:  x̂ = (A^T V^-1 A)^-1 A^T V^-1 b,  P = E[e e^T]
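A small sketch of the weighted solution x̂ = (A^T V^-1 A)^-1 A^T V^-1 b, assuming a diagonal measurement covariance V with made-up noise levels.

import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.1, 2.9])

# Suppose the third measurement is noisier (larger variance), so it gets less weight.
V = np.diag([0.1, 0.1, 0.4])             # measurement error covariance (assumed)
Vinv = np.linalg.inv(V)

# Weighted least squares: x_hat = (A^T V^-1 A)^-1 A^T V^-1 b
x_hat = np.linalg.solve(A.T @ Vinv @ A, A.T @ Vinv @ b)

# Covariance of the estimate: P = (A^T V^-1 A)^-1
P = np.linalg.inv(A.T @ Vinv @ A)
print(x_hat, P)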

Page 17: ASQ Talk v4

Recursive LS – Vanilla w/ Cream & Nuts on-the-top LS

Now imagine the data coming in a stream: A_0 x = b_0, then A_1 x = b_1, …

If more data arrives, can the best estimate for the combined data be computed from x̂_0 and b_1 without restarting the calculation from b_0?

Digression – Running Average

Let’s compute the average of n numbers:  A(n) = (x_1 + x_2 + … + x_n)/n

One additional data point arrives:  A(n+1) = (x_1 + x_2 + … + x_n + x_{n+1})/(n+1)

Re-arrange and simplify:

(n+1) A(n+1) = x_1 + … + x_n + x_{n+1} = n A(n) + x_{n+1}

A(n+1) = A(n) + (1/(n+1)) (x_{n+1} − A(n)) = A(n) + K (x_{n+1} − A(n))   (running average)
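The running-average recursion A(n+1) = A(n) + K (x_{n+1} − A(n)) in a few lines of Python; the data values are illustrative.

def running_average(stream):
    """Update the average incrementally: A(n+1) = A(n) + K*(x_{n+1} - A(n)), K = 1/(n+1)."""
    avg, n = 0.0, 0
    for x in stream:
        n += 1
        avg += (x - avg) / n     # gain K = 1/n after seeing n samples
        yield avg

data = [2.0, 4.0, 6.0, 8.0]
print(list(running_average(data)))   # [2.0, 3.0, 4.0, 5.0]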

Page 18: ASQ Talk v4

Recursive LS – Vanilla w/ Cream & Nuts on-the-top LS …

The data keep streaming in: A_0 x = b_0, then A_1 x = b_1, then A_2 x = b_2, …

Original data:  (A_0^T A_0) x̂_0 = A_0^T b_0

Appended data:  [A_0; A_1] x = [b_0; b_1]

x̂_1 = ([A_0; A_1]^T [A_0; A_1])^-1 [A_0; A_1]^T [b_0; b_1] = P_1 (A_0^T b_0 + A_1^T b_1)

Re-arrange and simplify:

x̂_1 = x̂_0 + P_1 A_1^T (b_1 − A_1 x̂_0) = x̂_0 + K_1 (b_1 − A_1 x̂_0)   (Recursive LS)
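A hedged sketch of the recursive update x̂_1 = x̂_0 + P_1 A_1^T (b_1 − A_1 x̂_0); the rls_update helper and the data are invented for illustration, and the final check confirms it matches the batch solution on the stacked system.

import numpy as np

def rls_update(x_hat, P, A_new, b_new):
    """Fold a new block of data (A_new x = b_new) into an existing LS estimate.

    P is the current (A^T A)^{-1}; returns the updated estimate and P.
    """
    P_new = np.linalg.inv(np.linalg.inv(P) + A_new.T @ A_new)
    x_new = x_hat + P_new @ A_new.T @ (b_new - A_new @ x_hat)
    return x_new, P_new

# Initial batch
A0 = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
b0 = np.array([1.0, 2.0, 3.1])
P0 = np.linalg.inv(A0.T @ A0)
x0 = P0 @ A0.T @ b0

# New data arrives; update without touching b0 again.
A1 = np.array([[1.0, 4.0]])
b1 = np.array([4.2])
x1, P1 = rls_update(x0, P0, A1, b1)

# Check against the batch solution on the stacked system.
A, b = np.vstack([A0, A1]), np.concatenate([b0, b1])
print(np.allclose(x1, np.linalg.solve(A.T @ A, A.T @ b)))   # True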

Page 19: ASQ Talk v4

Recursive LS – What’s Goin’ On?

Projecting b on S(A) gives the best estimate of x:  A x̂ = Pb, with error e = b − A x̂.

Appending data expands the column space S(A) and brings b closer to S(A), which reduces the error. In the limit, b collapses onto S(A) and we have the exact Ax = b, with error b − A x̂ → 0.

Page 20: ASQ Talk v4

Recursive LS – and Beyond (Take One)

System Dynamics:  x_{i+1} = F_i x_i
Measurement:  b_i = A_i x_i

Estimate the current state, predict the future state:

x̂_{i+1} = x̂_i + K_i (b_i − A_i x̂_i)   (Kalman)

Over time the chain runs x_0 → x_1 → x_2 → x_3 → x_4 through F_0, F_1, F_2, F_3, with measurements b_0 = A_0 x_0, b_1 = A_1 x_1, b_2 = A_2 x_2 and the corresponding P_0, P_1, P_2, P_3. Stacking the measurement rows A_i and the dynamics rows [−F_i  I] gives one large Ax = b system in the unknowns x_0, x_1, x_2 with a right-hand side built from b_0, b_1, b_2.

Page 21: ASQ Talk v4

Recursive LS – and Beyond (Take Two)

x̂_k = x̂_k^- + K_k (z_k − H x̂_k^-)   (Kalman)

System dynamics with noise:  x_k = A x_{k-1} + w_{k-1}
Measurement with error:  z_k = H x_k + v_k

System & measurement noise:
p(w) ~ N(0, Q),  Q: system noise covariance
p(v) ~ N(0, R),  R: measurement noise covariance

A priori (predicted) error:  e_k^- = x_k − x̂_k^-,  P_k^- = E[e_k^- (e_k^-)^T]
A posteriori (estimated) error:  e_k = x_k − x̂_k,  P_k = E[e_k e_k^T]
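Putting the predict and correct steps together, a minimal Kalman-filter sketch for x_k = A x_{k-1} + w_{k-1}, z_k = H x_k + v_k; the constant-velocity model and all noise covariances below are illustrative assumptions.

import numpy as np

def kalman_step(x_hat, P, z, A, H, Q, R):
    """One Kalman cycle: predict with the dynamics, then correct with the measurement z."""
    # Predict (a priori): x_hat^-, P^-
    x_pred = A @ x_hat
    P_pred = A @ P @ A.T + Q
    # Correct: K_k = P^- H^T (H P^- H^T + R)^{-1}
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    x_new = x_pred + K @ (z - H @ x_pred)        # x_hat_k = x_hat_k^- + K_k (z_k - H x_hat_k^-)
    P_new = (np.eye(len(x_hat)) - K @ H) @ P_pred
    return x_new, P_new

# Constant-velocity model, position-only measurements (assumed values).
dt = 1.0
A = np.array([[1.0, dt], [0.0, 1.0]])
H = np.array([[1.0, 0.0]])
Q = 0.01 * np.eye(2)        # system noise covariance
R = np.array([[1.0]])       # measurement noise covariance

x_hat = np.array([0.0, 0.0])
P = np.eye(2)
for z in [1.1, 2.0, 2.9, 4.2]:           # noisy position readings
    x_hat, P = kalman_step(x_hat, P, np.array([z]), A, H, Q, R)
    print(x_hat)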

Page 22: ASQ Talk v4

Rules of Probability – The Bayesian POV

Joint probability:  p(x, y)

Sum rule:  p(x) = Σ_y p(x, y);  p(y) = Σ_x p(x, y)

Product rule:  p(x, y) = p(y|x) p(x) = p(x|y) p(y)

Conditional probability:  p(y|x), p(x|y)

Bayes’ Rule:  p(y|x) = p(x|y) p(y) / p(x)
(Posterior) = (Likelihood) × (Prior) / (Marginal)

(Prior) – belief before making an observation or collecting data.
(Posterior) – belief after making an observation or collecting data. It forms the prior for the next iteration.
(Likelihood) – a function of y, not a probability distribution of y.
(Marginal) – data from the observation.
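A tiny numeric check of the sum, product, and Bayes’ rules on an invented discrete joint distribution p(x, y):

import numpy as np

# Invented joint distribution over x in {0,1} (rows) and y in {0,1} (columns).
p_xy = np.array([[0.1, 0.3],
                 [0.4, 0.2]])

p_x = p_xy.sum(axis=1)                 # sum rule: p(x) = sum_y p(x, y)
p_y = p_xy.sum(axis=0)                 # sum rule: p(y) = sum_x p(x, y)
p_y_given_x = p_xy / p_x[:, None]      # product rule: p(x, y) = p(y|x) p(x)
p_x_given_y = p_xy / p_y[None, :]      # product rule: p(x, y) = p(x|y) p(y)

# Bayes' rule: p(y|x) = p(x|y) p(y) / p(x)
bayes = (p_x_given_y * p_y[None, :]) / p_x[:, None]
print(np.allclose(bayes, p_y_given_x))   # True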

Page 23: ASQ Talk v4

Rules of Probability – The Bayesian POV

Bayes’ Rule:  p(μ|x) = p(x|μ) p(μ) / p(x)
(Posterior) = (Likelihood) × (Prior) / (Marginal)

Likelihood:  p(x|μ) = N(μ, σ²)   (μ unknown, σ² known)

Prior (known):  p(μ) = N(μ_0, σ_0²)

Posterior:  p(μ|x) ∝ p(x|μ) p(μ) = N(μ, σ²) · N(μ_0, σ_0²)

Gaussian prior → Gaussian posterior [conjugate pair]
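For the Gaussian conjugate pair above, the posterior mean and variance have a closed form (precisions add, means are precision-weighted); a small sketch under the slide’s assumptions of known σ² and prior N(μ_0, σ_0²), with made-up numbers:

def gaussian_posterior(x, sigma2, mu0, sigma0_2):
    """Posterior over the mean mu after one observation x ~ N(mu, sigma2),
    with Gaussian prior mu ~ N(mu0, sigma0_2). Returns (mu_post, var_post)."""
    var_post = 1.0 / (1.0 / sigma0_2 + 1.0 / sigma2)          # precisions add
    mu_post = var_post * (mu0 / sigma0_2 + x / sigma2)        # precision-weighted mean
    return mu_post, var_post

# Example: broad prior, one measurement at x = 2.0 with sensor variance 0.5 (assumed numbers).
print(gaussian_posterior(x=2.0, sigma2=0.5, mu0=0.0, sigma0_2=4.0))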

Page 24: ASQ Talk v4

Conditional Probability - De-Constructing Bayes’

p(x_1, …, x_7) = p(x_1) p(x_2) p(x_3) p(x_4|x_1,x_2,x_3) p(x_5|x_1,x_2) p(x_6|x_4) p(x_7|x_4,x_5)

Fully connected (c, b, a):  p(a, b, c) = p(a|b,c) p(b|c) p(c)

Chain (a → b → c):  p(a, b, c) = p(a) p(b|a) p(c|b)

Independent (a, b, c disconnected):  p(a, b, c) = p(a) p(b) p(c)

Page 25: ASQ Talk v4

Factor Graphs – De-Constructing Bayes’

p(x_1, x_2, x_3) = f_1(x_1, x_2) · f_2(x_1, x_2) · f_3(x_2, x_3) · f_4(x_3)

In general:  p(x_1, …, x_N) = ∏_i f_i

With f_1 = p(x_1), f_2 = p(x_2), f_3 = p(x_3 | x_1, x_2):  p(x_1, x_2, x_3) = f_1 · f_2 · f_3

Page 26: ASQ Talk v4

Making Noisy Measurements on a Stationary Robot

Image Recognition/ Data Display / Storage

Robot

Target

Camera

Problem Statement: We want to estimate the mean position μ of the robot from the noisy image measurements x.

Measurement Model: Assume μ to be a normally distributed random variable with probability p(μ). x is the noisy measurement; p(x|μ) is the sensor noise, centered around μ.

Choose (μ).

Page 27: ASQ Talk v4

[Plots: pre-aligner Extension and Theta data; principal-component scatter (Prin 1 vs Prin 2); mean-of-r chart vs. sample showing wafer placement – the “potential” of the system.]

Patterns in data indicate the presence of assignable cause(s). (Gaussian = purely random.)

Pattern vs Noise – One More Digression …

Page 28: ASQ Talk v4

Making Repeated Noisy Measurements

[Figure: repeated noisy measurements x of the same stationary robot, each with the prior p(μ) and sensor model p(x|μ), over and over …]

Page 29: ASQ Talk v4

Successive Noisy Measurements –The Recursive Bayes’

Sample x from p(x); then p(μ|x) ∝ p(x|μ) · p(μ), and the posterior p(μ|x) becomes the prior for the next sample.
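The same conjugate Gaussian update applied recursively, with each posterior serving as the prior for the next measurement; the true mean, sensor variance, and initial prior below are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(1)
true_mu, sigma2 = 3.0, 0.5          # hidden mean and known sensor variance (assumed)
mu, var = 0.0, 10.0                 # broad initial prior p(mu)

for k in range(5):
    x = rng.normal(true_mu, np.sqrt(sigma2))     # new noisy measurement
    # Posterior becomes the prior for the next measurement.
    var_new = 1.0 / (1.0 / var + 1.0 / sigma2)
    mu = var_new * (mu / var + x / sigma2)
    var = var_new
    print(f"after {k+1} measurements: mu_hat={mu:.3f}, var={var:.3f}")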

Page 30: ASQ Talk v4

Making Noisy Measurements on a Moving Robot (The Double Whammy)

A WMR’s distance from an obstacle is measured by an ultrasonic sensor. Estimate the position of the WMR at any time t.

Measurement: @ t=0, initial position has a Gaussian distribution based on sensor accuracy. Initial position is estimated.

Prediction: @ t=1, position is predicted from the estimate @ t=0.

Measurement: @ t=1, position is estimated from the measurement @ t=1.

Correction: @ t=1, position is corrected from the measurement @ t=1. Also prediction is made for t=2.

Page 31: ASQ Talk v4

Making Noisy Measurements on a Moving Robot

System Noise – Low, Measurement Noise – Low
System Noise – Low, Measurement Noise – High
System Noise – High, Measurement Noise – Low
System Noise – High, Measurement Noise – High

Page 32: ASQ Talk v4

Making Noisy Measurements on a Moving Robot (Take Three)

Kalman:

x_{i+1} = F_i x_i   (system dynamics)
b_i = A_i x_i + e_i   (measurement with error)

[Figure: at times t_1, t_2, t_3, t_4, the prior p(μ), the noisy measurement x with sensor model p(x|μ), the measurement noise, and the prediction error.]

Page 33: ASQ Talk v4

The HMM – Single Page Review

Hidden Markov Model

Hidden states:  z_k ∈ {1, …, n};  observations:  x_k ∈ X

[Figure: chain of hidden states z_1 → z_2 → z_3 → … → z_k → z_{k+1} over times t_1 … t_{k+1}, each emitting an observation x_1, x_2, x_3, …, x_k, x_{k+1}.]

p(x, z) = p(z_1) p(x_1|z_1) ∏_k p(z_k|z_{k-1}) p(x_k|z_k)

Transition:  T(i, j) = p(z_k = j | z_{k-1} = i),  i, j ∈ {1…n}
Emission:  ε_i(x) = p(x_k = x | z_k = i),  x ∈ X
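A two-state HMM of this kind can be written down and sampled in a few lines; the initial, transition, and emission values below are invented for illustration (they are not the numbers from the next slide).

import numpy as np

rng = np.random.default_rng(0)

pi = np.array([0.5, 0.5])                  # initial distribution p(z_1)
T = np.array([[0.9, 0.1],                  # T(i, j) = p(z_k = j | z_{k-1} = i)
              [0.2, 0.8]])
E = np.array([[0.7, 0.2, 0.1],             # E(i, x) = p(x_k = x | z_k = i), 3 possible observations
              [0.1, 0.3, 0.6]])

def sample_hmm(n):
    """Draw a length-n sequence of hidden states z and observations x."""
    z = rng.choice(2, p=pi)
    states, obs = [], []
    for _ in range(n):
        obs.append(rng.choice(3, p=E[z]))
        states.append(z)
        z = rng.choice(2, p=T[z])
    return states, obs

print(sample_hmm(10))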

Page 34: ASQ Talk v4

The HMM – The Forward-Backward Algo

[Figure: the chain z_1, z_2, z_3, …, z_k, …, z_n with observations x_1 … x_n over times t_1 … t_n; the forward pass runs up to t_k, the backward pass runs from t_n back to t_k.]

Given: emission p(x_k | z_k), transition p(z_k | z_{k-1}), initial p(z_1)

Goal: compute p(z_k | x_{1:n})

Forward:  p(z_k, x_{1:k}),  k = 1…n
Backward:  p(x_{k+1:n} | z_k),  k = 1…n

p(z_k | x_{1:n}) ∝ p(z_k, x_{1:n}) = p(z_k, x_{1:k}) · p(x_{k+1:n} | z_k)
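A compact sketch of the forward-backward recursion for a discrete HMM, returning p(z_k | x_{1:n}) as the normalized product of the forward and backward quantities; the parameters reuse the illustrative two-state model from the sampling sketch above. No numerical scaling is done, so it is only suitable for short sequences.

import numpy as np

def forward_backward(obs, pi, T, E):
    """Return p(z_k | x_{1:n}) for every k, for a discrete HMM.

    obs: observation indices x_1..x_n; pi: p(z_1); T: transition; E: emission.
    """
    n, S = len(obs), len(pi)
    alpha = np.zeros((n, S))          # alpha_k(i) = p(z_k = i, x_{1:k})
    beta = np.ones((n, S))            # beta_k(i)  = p(x_{k+1:n} | z_k = i)

    alpha[0] = pi * E[:, obs[0]]
    for k in range(1, n):             # forward pass (in time)
        alpha[k] = (alpha[k - 1] @ T) * E[:, obs[k]]
    for k in range(n - 2, -1, -1):    # backward pass
        beta[k] = T @ (E[:, obs[k + 1]] * beta[k + 1])

    gamma = alpha * beta              # p(z_k, x_{1:n}) up to the normalizer p(x_{1:n})
    return gamma / gamma.sum(axis=1, keepdims=True)

pi = np.array([0.5, 0.5])
T = np.array([[0.9, 0.1], [0.2, 0.8]])
E = np.array([[0.7, 0.2, 0.1], [0.1, 0.3, 0.6]])
print(forward_backward([0, 0, 2, 2, 1], pi, T, E))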

Page 35: ASQ Talk v4

The HMM – Example – 2-State Model:  z_k ∈ {1, 2};  x_k ∈ X = {1, …, 10} (uniformly distributed)

[Figure: two-state transition diagram with its transition and emission probabilities.]

Page 36: ASQ Talk v4

Forward-Backward – Two Ways

Robot Dynamics & Control (Featherstone): recursion over space, along the joints j_0, j_1, j_2, j_3, …, j_m of the robot – velocities and accelerations propagate forward, forces and torques (f_k = m_k a_k) propagate backward.

State Prediction & Estimation (Kalman): recursion over time – prediction and estimation on the forward pass, smoothing on the backward pass, working with p(x), p(z|x) and p(x|z) for state x and measurement z.

Page 37: ASQ Talk v4

The Definition - Reliability of a Robot

It is the probability (R) that the robot will successfully complete the assigned task (T) under the specified conditions (C).

Specified Conditions (C): Martian terrain / contact with a human body / assembly line.

Assigned Task (T): move from A to B / perform surgery / spot weld.

Probability (R): on Mars, move from A to B 50 times without failure; on a human, perform surgery without failure; on an assembly line, do 1 million spot welds before failure.

The basic problem is to quantify R during design.

Page 38: ASQ Talk v4

Re-Visiting Kalman

And what we don’t know yet believe to be true is religion

Page 39: ASQ Talk v4

References

1. Craig, J. J., Introduction to Robotics: Mechanics and Control, Prentice-Hall, 2003.

2. Muir, P. F. and Neuman, C. P. (1987), Kinematic Modeling of Wheeled Mobile Robots, J. Robotic Systems, 4(2).

3. Bishop, C., Pattern Recognition and Machine Learning, Springer, 2006.

4. Ghahramani, Z. (2001), An Introduction to Hidden Markov Models and Bayesian Networks, Int. J. Pattern Recognition and Artificial Intelligence.

5. Kalman, R. E. and Bucy, R. S. (1961), New Results in Linear Filtering and Prediction Theory, J. Basic Engineering.

Page 40: ASQ Talk v4

Painting Title: Many a Moon Ago, G. Mustafa © 2011

Fin