Source: sharifce.sharif.edu/courses/93-94/1/ce634-1
TRANSCRIPT
Markov Chains (2)
Outline
Discrete Time Markov Chain (DTMC)
Continuous Time Markov Chain (CTMC)
Discrete Time Markov Chain (DTMC)…
p_j(n) = P(X_n = j) denotes the pmf of the random variable X_n.
We will only be concerned with homogeneous Markov chains. For such chains, we use the following notation to denote the n-step transition probabilities:
p_jk(n) = P(X_{m+n} = k | X_m = j)
The one-step transition probabilities p_jk(1) are simply written as p_jk; thus:
p_jk = p_jk(1) = P(X_{n+1} = k | X_n = j)
Discrete Time Markov Chain (DTMC)…
The pmf of the random variable X_0, often called the initial probability vector, is specified as
p(0) = [p_0(0), p_1(0), ...]
The one-step transition probabilities are compactly specified in the form of a transition probability matrix
P = [p_ij] =
[ p_00  p_01  p_02  ... ]
[ p_10  p_11  p_12  ... ]
[  .     .     .        ]
[  .     .     .        ]
The entries of the matrix P satisfy the following two properties:
0 ≤ p_ij ≤ 1 for all i, j ∈ I, and Σ_{j∈I} p_ij = 1 for all i ∈ I.
Discrete Time Markov Chain (DTMC)…
An equivalent description of the one-step transition probabilities can be given by a directed graph called the state transition diagram (state diagram for short) of the Markov chain.
A node labeled i of the state diagram represents state i of the Markov chain, and a branch labeled p_ij from node i to node j implies that the conditional probability is
P[X_{n+1} = j | X_n = i] = p_ij
Discrete Time Markov Chain (DTMC)…
Example: Two states
Suppose a person can be in one of two states, "healthy" or "sick". Let X(n), n = 0, 1, 2, …, refer to the state at time n.
Discrete Time Markov Chain (DTMC)…
Its corresponding DTMC can be shown by
a state diagram
a transition probability matrix
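The two-state example can be made concrete in code. The following is a minimal Python sketch; the numeric transition probabilities (0.3 for healthy to sick, 0.6 for sick to healthy) are assumed for illustration, since the slides give the actual values only in the figure:

```python
# Two-state DTMC for the healthy/sick example.
# The probabilities 0.3 and 0.6 are assumed for illustration;
# the slides do not specify the numeric values in the text.
STATES = ["healthy", "sick"]

P = [
    [0.7, 0.3],  # from "healthy": stay healthy, become sick
    [0.6, 0.4],  # from "sick": recover, stay sick
]

# Each row of a transition probability matrix must sum to 1.
for i, row in enumerate(P):
    assert abs(sum(row) - 1.0) < 1e-12, f"row {i} does not sum to 1"

print(P[0][1])  # one-step probability healthy -> sick: 0.3
```

The nested-list layout mirrors the transition probability matrix directly; each row is the outgoing distribution of one state.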
Discrete Time Markov Chain (DTMC)…
We are interested in obtaining an expression for evaluating the n-step transition probabilities from the one-step probabilities.
If we let P(n) be the matrix whose (i, j) entry is p_ij(n), that is, let P(n) be the matrix of n-step transition probabilities, then we can write
P(n) = P · P(n-1) = P^n
Thus the matrix of n-step transition probabilities is obtained by multiplying the matrix of one-step transition probabilities by itself n-1 times.
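The recursion P(n) = P · P(n-1) = P^n translates directly into code. A short Python sketch, using a hypothetical two-state matrix that is not taken from the slides:

```python
def mat_mul(A, B):
    """Multiply two square matrices given as lists of lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def n_step(P, n):
    """Compute the n-step transition matrix P(n) = P^n by
    multiplying P by itself n-1 times, as in the recursion above."""
    result = P
    for _ in range(n - 1):
        result = mat_mul(result, P)
    return result

# Hypothetical one-step matrix (illustration only).
P = [[0.7, 0.3],
     [0.6, 0.4]]
P2 = n_step(P, 2)
# p_00(2) = 0.7*0.7 + 0.3*0.6 = 0.67
print(round(P2[0][0], 4))  # 0.67
```

Note that every P(n) is itself a stochastic matrix: its rows still sum to 1.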
Discrete Time Markov Chain (DTMC)…
We can obtain the pmf of the random variable X_n from the n-step transition probabilities and the initial probability vector as follows:
p(n) = p(0) P(n) = p(0) P^n
This implies that the step-dependent probability vectors of a homogeneous Markov chain are completely determined by the one-step transition probability matrix P and the initial probability vector p(0).
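The relation p(n) = p(0) P^n can likewise be sketched in a few lines; the chain and the initial vector below are illustrative assumptions, not values from the slides:

```python
def vec_mat(p, P):
    """Row vector times matrix: (p P)_j = sum_i p_i * p_ij."""
    n = len(P)
    return [sum(p[i] * P[i][j] for i in range(n)) for j in range(n)]

def state_pmf(p0, P, n):
    """p(n) = p(0) P^n, applied one step at a time."""
    p = list(p0)
    for _ in range(n):
        p = vec_mat(p, P)
    return p

# Hypothetical two-state chain and initial vector (illustration only).
P = [[0.7, 0.3],
     [0.6, 0.4]]
p0 = [1.0, 0.0]           # start in state 0 with probability 1
p1 = state_pmf(p0, P, 1)  # equals the first row of P
print(p1)  # [0.7, 0.3]
```

Applying one step at a time avoids forming P^n explicitly, which is the usual choice when only p(n) is needed.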
Discrete Time Markov Chain (DTMC)…
Example: Stock Exchange
Discrete Time Markov Chain (DTMC)…
A state i is said to be transient iff there is a positive
probability that the process will not return to this state.
A state i is said to be recurrent iff starting from i, the
process eventually returns to state i with probability one.
For a recurrent state i, define the period of
state i denoted by , as the greatest common divisor
(gcd) of the set of positive integers n such that .
( ) 0, 1iip n n
iid
( ) 0iip n
Discrete Time Markov Chain (DTMC)…
A state i has period k if any return to state i must occur in multiples of k time steps. Formally, the period of a state is defined as
k = gcd{n > 0 : P(X_n = i | X_0 = i) > 0}
A recurrent state i is said to be aperiodic if its period d_i = 1, and periodic if d_i > 1.
A state i is said to be an absorbing state iff p_ii = 1.
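The period d_i = gcd{n > 0 : p_ii(n) > 0} can be approximated numerically by scanning the powers of P up to some cutoff. A Python sketch, where the cutoff max_n is a practical truncation rather than part of the definition, and the example matrix is hypothetical:

```python
from math import gcd
from functools import reduce

def period(P, i, max_n=50):
    """Period of state i: gcd of all n <= max_n with p_ii(n) > 0.
    P is a one-step transition matrix as a list of lists.
    Truncating at max_n is an approximation for illustration."""
    def mat_mul(A, B):
        n = len(A)
        return [[sum(A[r][k] * B[k][c] for k in range(n)) for c in range(n)]
                for r in range(n)]
    returns = []
    Pn = P  # Pn holds P^n as n increases
    for n in range(1, max_n + 1):
        if Pn[i][i] > 0:
            returns.append(n)
        Pn = mat_mul(Pn, P)
    return reduce(gcd, returns) if returns else 0

# Deterministic 2-cycle: 0 -> 1 -> 0 -> ..., so state 0 has period 2.
P = [[0.0, 1.0],
     [1.0, 0.0]]
print(period(P, 0))  # 2
```

For a state with a self-loop (p_ii > 0), n = 1 is in the set, so the gcd is 1 and the state is aperiodic.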
Discrete Time Markov Chain (DTMC)…
Two states i and j communicate if directed paths from i to j and vice versa exist.
A Markov chain is said to be irreducible if every recurrent state can be reached from every other state in a finite number of steps. In other words, for all i, j ∈ I, there is an integer n ≥ 1 such that p_ij(n) > 0.
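Irreducibility of a finite chain can be checked from the state diagram alone: every state must reach every other state along edges with p_ij > 0. A Python sketch using breadth-first search; the example matrices are hypothetical:

```python
from collections import deque

def reachable(P, start):
    """States reachable from `start` along edges with p_ij > 0."""
    n = len(P)
    seen = {start}
    queue = deque([start])
    while queue:
        i = queue.popleft()
        for j in range(n):
            if P[i][j] > 0 and j not in seen:
                seen.add(j)
                queue.append(j)
    return seen

def is_irreducible(P):
    """Irreducible iff every state reaches every other state."""
    n = len(P)
    return all(len(reachable(P, i)) == n for i in range(n))

# Hypothetical examples (not from the slides).
cycle = [[0, 1, 0], [0, 0, 1], [1, 0, 0]]  # 0 -> 1 -> 2 -> 0
absorbing = [[1, 0], [0.5, 0.5]]           # state 0 absorbs
print(is_irreducible(cycle), is_irreducible(absorbing))  # True False
```

This is a purely graph-theoretic test; the actual probability values do not matter, only whether they are positive.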
Continuous Time Markov Chain (CTMC)
As in DTMCs, we confine our attention to discrete-state processes. This implies that, although the parameter t has a continuous range of values, the set of state values is discrete.
Recall the definition of a discrete-state continuous-time stochastic process stated in class, which satisfies the Markov property
P[X(t_n) = x_n | X(t_{n-1}) = x_{n-1}, ..., X(t_0) = x_0] = P[X(t_n) = x_n | X(t_{n-1}) = x_{n-1}]
The behavior of the process is characterized by (1) the initial state probabilities, given by the pmf of X(t_0): P(X(t_0) = k), k = 0, 1, 2, ..., and (2) the transition probabilities
p_ij(v, t) = P(X(t) = j | X(v) = i)
Continuous Time Markov Chain (CTMC)…
Let π_j(t) denote the pmf of X(t) (i.e., the state probabilities at time t):
π_j(t) = P(X(t) = j), j = 0, 1, 2, ...; t ≥ 0
It is clear that
Σ_{j∈I} π_j(t) = 1
for any t ≥ 0, since at any given time the process must be in some state.
Continuous Time Markov Chain (CTMC)…
Using the theorem of total probability, for given t > v, we can express the pmf of X(t) in terms of the transition probabilities p_ij(v, t) and the pmf of X(v):
π_j(t) = P(X(t) = j) = Σ_{i∈I} P(X(t) = j | X(v) = i) P(X(v) = i) = Σ_{i∈I} p_ij(v, t) π_i(v)
If we let v = 0, then
π_j(t) = Σ_{i∈I} p_ij(0, t) π_i(0)
Continuous Time Markov Chain (CTMC)…
If we let π(t) = [π_0(t), π_1(t), ...], then in matrix form we have
dπ(t)/dt = π(t) Q
where Q is the infinitesimal generator matrix of the given CTMC, containing the transition rates q_ij from any state i to any other state j (i ≠ j).
The elements on the main diagonal of Q are defined by
q_ii = -Σ_{j, j≠i} q_ij
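The diagonal rule q_ii = -Σ_{j≠i} q_ij makes every row of Q sum to zero, which a short Python sketch can verify; the off-diagonal rates below are assumed values for illustration:

```python
def make_generator(rates):
    """Fill in the diagonal of a CTMC generator matrix:
    q_ii = -sum of the off-diagonal rates in row i.
    `rates` holds the off-diagonal transition rates q_ij (i != j);
    the numeric values used below are assumptions for illustration."""
    n = len(rates)
    Q = [row[:] for row in rates]
    for i in range(n):
        Q[i][i] = -sum(Q[i][j] for j in range(n) if j != i)
    return Q

# Hypothetical two-state CTMC: rate 2.0 from 0 to 1, rate 3.0 back.
Q = make_generator([[0.0, 2.0],
                    [3.0, 0.0]])
# Every row of a generator matrix sums to 0.
for row in Q:
    assert abs(sum(row)) < 1e-12
print(Q[0][0])  # -2.0
```

The zero row sums are the continuous-time counterpart of the DTMC property that each row of P sums to 1.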
Continuous Time Markov Chain (CTMC)…
If, for a given CTMC, the steady-state probabilities are independent of time, we immediately get
lim_{t→∞} dπ(t)/dt = 0
Determining the unconditional state probabilities then resolves to a much simpler system of linear equations:
Σ_{i∈S} π_i q_ij = 0, j ∈ S
In matrix form, we accordingly get
π Q = 0
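Solving π Q = 0 alone is underdetermined (the rows of Q are linearly dependent), so in practice one balance equation is replaced by the normalization Σ_i π_i = 1. A Python sketch under that standard substitution, with assumed rates:

```python
def steady_state(Q):
    """Solve pi Q = 0 with sum(pi) = 1 by Gaussian elimination.
    Replaces the last balance equation with the normalization
    condition (a standard trick; a sketch, not production code)."""
    n = len(Q)
    # Build A x = b where A is Q^T with its last row replaced by ones.
    A = [[Q[j][i] for j in range(n)] for i in range(n)]
    A[n - 1] = [1.0] * n
    b = [0.0] * (n - 1) + [1.0]
    # Gaussian elimination with partial pivoting.
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (b[r] - sum(A[r][c] * x[c] for c in range(r + 1, n))) / A[r][r]
    return x

# Hypothetical two-state CTMC (rates assumed): 0 -> 1 at 2.0, 1 -> 0 at 3.0.
Q = [[-2.0, 2.0],
     [3.0, -3.0]]
pi = steady_state(Q)
print([round(p, 3) for p in pi])  # [0.6, 0.4]
```

For this two-state chain the balance equation -2 π_0 + 3 π_1 = 0 together with π_0 + π_1 = 1 gives π = (0.6, 0.4), matching the computed result.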
Continuous Time Markov Chain (CTMC)…
Example:
Discussion of the steady-state solution of a three-state CTMC (states 1, 2, 3, with π_1 + π_2 + π_3 = 1) in class…