
Page 1:

11. Markov Chains

Courtesy of J. Akinpelu, Anis Koubâa, Y. Wexler, & D. Geiger

Page 2: Random Processes

A stochastic process is a collection of random variables {X(t), t ∈ I}.

– The index t is often interpreted as time.
– X(t) is called the state of the process at time t.
  • X(t) may be discrete-valued or continuous-valued.
– The set I is called the index set of the process.
  • If I is countable, the stochastic process is said to be a discrete-time process.
  • If I is an interval of the real line, the stochastic process is said to be a continuous-time process.
– The state space E is the set of all possible values that the random variables X(t) can assume.

Page 3: Discrete-Time Random Process

If I is countable, X(t) is often denoted by X_n, n = 0, 1, 2, 3, …

[Timeline figure: time axis marked 0, 1, 2, 3, 4. Events occur at specific points in time.]

Page 4: Discrete-Time Random Process

Day:     Day 1  Day 2  Day 3  Day 4  Day 5  Day 6  Day 7
         THU    FRI    SAT    SUN    MON    TUE    WED

X(day_i): status of the weather observed each DAY

State space = {SUNNY, RAINY}

Observed values:
X(day 1) = "S", X(day 2) = "S", X(day 3) = "R", X(day 4) = "S", X(day 5) = "R", X(day 6) = "S", X(day 7) = "S"

X(day_i) = "S" or "R": a RANDOM VARIABLE that varies with the DAY.

{X(day_i)} IS A STOCHASTIC PROCESS.

Page 5: Markov Processes

A stochastic process {X_n, n = 0, 1, 2, …} is called a Markov process if

P{X_{n+1} = j | X_n = i_n, X_{n-1} = i_{n-1}, …, X_1 = i_1, X_0 = i_0} = P{X_{n+1} = j | X_n = i_n}

for all states i_0, i_1, …, i_n, j and all n ≥ 0.

If the X_n are integer-valued, X_n is called a Markov chain.

Page 6: What is the "Markov Property"?

Day:     Day 1  Day 2  Day 3  Day 4  Day 5  Day 6  Day 7
         THU    FRI    SAT    SUN    MON    TUE    WED

X(day 1) = "S", X(day 2) = "S", X(day 3) = "R", X(day 4) = "S", X(day 5) = "R"

PAST EVENTS (days 1-4) | NOW (day 5) | FUTURE EVENTS (days 6-7)

Markov Property: the probability that it will be SUNNY on DAY 6 (FUTURE), given that it is RAINY on DAY 5 (NOW), is independent of PAST EVENTS.

What is the probability of "S" (or "R") on DAY 6 given all previous states? By the Markov property,

Pr{X(DAY 6) = "S" | X(DAY 5) = "R", X(DAY 4) = "S", …, X(DAY 1) = "S"} = Pr{X(DAY 6) = "S" | X(DAY 5) = "R"}
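To make the property concrete, here is a small simulation sketch (plain Python; the helper names are ours, and the rain/sunny transition probabilities are the ones used in the example on the later slides). The estimated chance of a sunny day 6 given a rainy day 5 comes out the same whether day 4 was sunny or rainy:

```python
import random

# Two-state weather chain: "S" (sunny) and "R" (rainy).
# Transition probabilities from the example slides:
# rainy today -> 40% rainy tomorrow; sunny today -> 20% rainy tomorrow.
P = {"R": {"R": 0.4, "S": 0.6},
     "S": {"R": 0.2, "S": 0.8}}

def step(state):
    """Sample tomorrow's weather given today's state."""
    return "R" if random.random() < P[state]["R"] else "S"

def simulate(n_days, start="S"):
    days = [start]
    for _ in range(n_days - 1):
        days.append(step(days[-1]))
    return days

# Estimate P(day 6 = "S" | day 5 = "R") under two different histories:
# the estimate should not depend on what happened before day 5.
random.seed(0)
hits = {"day4_sunny": [0, 0], "day4_rainy": [0, 0]}
for _ in range(200_000):
    w = simulate(6)
    if w[4] == "R":                      # day 5 is rainy (index 4)
        key = "day4_sunny" if w[3] == "S" else "day4_rainy"
        hits[key][0] += w[5] == "S"      # day 6 sunny?
        hits[key][1] += 1

for key, (sunny, total) in hits.items():
    print(key, sunny / total)            # both approach 0.6
```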

Page 7: Markov Chains

We restrict ourselves to Markov chains such that the conditional probabilities

p_ij = P{X_{n+1} = j | X_n = i}

are independent of n, and for which

i, j ∈ E, where E = {0, 1, 2, …}

(which is equivalent to saying that the state space E is finite or countable). Such a Markov chain is called homogeneous.

Page 8: Markov Chains

Since
– probabilities are non-negative, and
– the process must make a transition into some state at each time in I,

then

p_ij ≥ 0 for i, j ∈ E;    Σ_{j ∈ E} p_ij = 1 for i ∈ E.

We can arrange the probabilities p_ij into a square matrix P = {p_ij}, called the transition matrix.
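As a quick illustration, here is a minimal sketch of these two conditions as a validity check on a candidate transition matrix (NumPy assumed; the function name is ours):

```python
import numpy as np

def is_transition_matrix(P, tol=1e-9):
    """Check that P is row-stochastic: every entry is
    non-negative and every row sums to 1."""
    P = np.asarray(P, dtype=float)
    return bool((P >= 0).all() and np.allclose(P.sum(axis=1), 1.0, atol=tol))

print(is_transition_matrix([[0.4, 0.6], [0.2, 0.8]]))   # True
print(is_transition_matrix([[0.5, 0.6], [0.2, 0.8]]))   # False: first row sums to 1.1
```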

Page 9: Markov Chain: A Simple Example

Weather:

• raining today: 40% rain tomorrow, 60% no rain tomorrow

• not raining today: 20% rain tomorrow, 80% no rain tomorrow

[State transition diagram: two states, "rain" and "no rain"; self-loops rain→rain 0.4 and no rain→no rain 0.8; arcs rain→no rain 0.6 and no rain→rain 0.2.]

Page 10:

Weather:

• raining today: 40% rain tomorrow, 60% no rain tomorrow

• not raining today: 20% rain tomorrow, 80% no rain tomorrow

Rain (state 0), no rain (state 1).

The transition (probability) matrix P:

    P = | 0.4  0.6 |
        | 0.2  0.8 |

For a given current state:
– transitions to any state are possible
– each row sums up to 1
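A short sketch of this matrix in code (NumPy assumed), showing that each row is the next-day distribution for the corresponding current state:

```python
import numpy as np

# Rain = state 0, no rain = state 1 (as on this slide).
P = np.array([[0.4, 0.6],
              [0.2, 0.8]])

# Each row is the conditional distribution of tomorrow given today:
print(P[0])           # raining today     -> [0.4 0.6]
print(P[1])           # not raining today -> [0.2 0.8]
print(P.sum(axis=1))  # every row sums to 1 -> [1. 1.]

# If today's weather is described by a distribution (row vector) pi,
# then tomorrow's distribution is pi @ P.
pi = np.array([1.0, 0.0])      # it is raining today for sure
print(pi @ P)                  # [0.4 0.6]
```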

Page 11: Examples in Textbook

Example 11.6

Example 11.7

Figures 11.2 and 11.3

Page 12: Transition Probability

Note that each entry in P is a one-step transition probability, i.e., the probability of going from the current state to the next state.

Then, how about multiple steps? Let's start with 2 steps first.

Page 13: 2-Step Transition Probabilities of a 2-State System: States 0 and 1

Let p_ij(2) be the probability of going from i to j in 2 steps.

Suppose i = 0, j = 0. Then

p_00(2) = P(X_2 = 0 | X_0 = 0)
        = P(X_1 = 1 | X_0 = 0) P(X_2 = 0 | X_1 = 1) + P(X_1 = 0 | X_0 = 0) P(X_2 = 0 | X_1 = 0)
        = p_01 p_10 + p_00 p_00

Similarly,

p_01(2) = p_01 p_11 + p_00 p_01
p_10(2) = p_10 p_00 + p_11 p_10
p_11(2) = p_10 p_01 + p_11 p_11

In matrix form, P(2) = P(1) P(1) = P².
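The same computation in code (NumPy assumed), using the weather matrix from the previous slides: the entrywise formula and the matrix product P @ P agree.

```python
import numpy as np

# Weather chain again: rain = 0, no rain = 1.
P = np.array([[0.4, 0.6],
              [0.2, 0.8]])

# Entrywise, using the slide's formula p00(2) = p01*p10 + p00*p00:
p00_2 = P[0, 1] * P[1, 0] + P[0, 0] * P[0, 0]
print(p00_2)        # 0.28 (up to float rounding)

# In matrix form, P(2) = P @ P = P^2:
P2 = P @ P
print(P2)
# [[0.28 0.72]
#  [0.24 0.76]]
```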

Page 14:

In general, the 2-step transition probability is expressed as

p_ij(2) = P{X_{n+2} = j | X_n = i}

Now note that

P{X_{n+2} = j | X_n = i}
  = Σ_k P{X_{n+2} = j | X_{n+1} = k, X_n = i} P{X_{n+1} = k | X_n = i}
  = Σ_k P{X_{n+2} = j | X_{n+1} = k} P{X_{n+1} = k | X_n = i}    (Markov property)
  = Σ_k p_ik p_kj
  = P²_ij

Page 15: Two-Step Transition Probabilities

State space E = {0, 1, 2}

    P = | 1/2  1/4  1/4 |
        | 1/2   0   1/2 |
        | 1/4  1/4  1/2 |

Hence,

    P(2) = P·P = P² = | 7/16  3/16  3/8  |
                      | 3/8   1/4   3/8  |
                      | 3/8   3/16  7/16 |

For example, P{X_{n+2} = 1 | X_n = 2} = P²_{2,1} = 3/16.
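A sketch verifying this example with exact fraction arithmetic (standard-library fractions; matmul is our own helper):

```python
from fractions import Fraction as F

# The 3-state matrix from this slide, kept as exact fractions.
P = [[F(1, 2), F(1, 4), F(1, 4)],
     [F(1, 2), F(0),    F(1, 2)],
     [F(1, 4), F(1, 4), F(1, 2)]]

def matmul(A, B):
    """Plain matrix product; exact because Fractions are exact."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

P2 = matmul(P, P)
for row in P2:
    print([str(x) for x in row])
# ['7/16', '3/16', '3/8']
# ['3/8', '1/4', '3/8']
# ['3/8', '3/16', '7/16']

print(P2[2][1])   # 3/16 = P{X_{n+2} = 1 | X_n = 2}
```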

Page 16: Chapman-Kolmogorov Equations

In general, for all m, n ≥ 0 and i, j ∈ E,

P{X_{n+m} = j | X_n = i} = p_ij(m) = P^m_ij.

This leads to the Chapman-Kolmogorov equations:

p_ij(m + n) = Σ_{k ∈ E} p_ik(m) p_kj(n)    for all m, n ≥ 0 and all i, j ∈ E.
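A numerical spot check of the Chapman-Kolmogorov equations (NumPy assumed), using the two-state weather matrix as an example: P^(m+n) equals P^m P^n for every tested m and n.

```python
import numpy as np
from numpy.linalg import matrix_power

# Spot check of Chapman-Kolmogorov: p(m + n) = p(m) p(n),
# i.e. P^(m+n) = P^m @ P^n, on the weather matrix.
P = np.array([[0.4, 0.6],
              [0.2, 0.8]])

for m in range(4):
    for n in range(4):
        lhs = matrix_power(P, m + n)
        rhs = matrix_power(P, m) @ matrix_power(P, n)
        assert np.allclose(lhs, rhs)

print("Chapman-Kolmogorov holds for all tested m, n")
```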