Transcript of lecture slides: Chapter 4. Markov Chains (source: nchu.edu.t, ch4 part I1.pdf)
Chapter 4. Markov Chains
Example. (Weather Forecasting 2)
Other Examples
Queuing Systems
─ number of customers waiting in line at a bank at the beginning of each hour
─ number of machines waiting for repair at the end of each day
─ number of patients waiting for a lung transplant at the beginning of each week

Inventory Management
─ number of units in stock at the beginning of each week
─ number of backordered units at the end of each day

Airline Overbooking
─ number of coach seats reserved at the beginning of each day

Finance
─ price of a given security when the market closes each day
Other Examples (con’t)
Epidemiology
─ number of foot-and-mouth infected cows at the beginning of each day

Population Growth
─ size of the US population at the end of each year

Genetics
─ genetic makeup of your descendants in each subsequent generation

Workforce Planning
─ number of employees in a firm with each level of experience at the end of each year
Introduction

Markov property: the transition probabilities are independent of n (time-homogeneous) and independent of the past states, given the present state.
Transforming a Process into a Markov Chain

Is the process of (Weather Forecasting 2), where tomorrow's weather depends on the weather of the last two days, a Markov chain in the state "today's weather"? No! But it can be transformed into a Markov chain by enlarging the state space.
Transforming a Process into a Markov Chain (con’t)
The transition matrix for this Markov chain has its rows indexed by the state for (Yesterday & Today) and its columns by the state for (Today & Tomorrow). [Matrix shown on slide.]

[The slide also illustrates a sample path … rain sun rain sun sun rain sun sun sun rain …, labeling the two-day windows at day n and day n+1 with the corresponding states (e.g. state 2 and state 3).]
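The state-augmentation idea can be sketched in code. The rain probabilities below are hypothetical placeholders (the slide's actual values are in the matrix image); the point is only the structure of the enlarged chain: each state records (yesterday, today), and a transition to (today', tomorrow) is feasible only when the "today" components agree.

```python
# Hypothetical P(rain tomorrow | yesterday, today); placeholder values,
# not the ones on the slide.
p_rain = {("R", "R"): 0.7, ("S", "R"): 0.5, ("R", "S"): 0.4, ("S", "S"): 0.2}

states = [("R", "R"), ("S", "R"), ("R", "S"), ("S", "S")]  # (yesterday, today)

def transition(s, t):
    """Probability of moving from state s = (y, today) to t = (today', tomorrow)."""
    if t[0] != s[1]:            # the new "yesterday" must equal the old "today"
        return 0.0
    p = p_rain[s]
    return p if t[1] == "R" else 1.0 - p

P = [[transition(s, t) for t in states] for s in states]
for row in P:
    assert abs(sum(row) - 1.0) < 1e-12   # each row is a probability distribution
```

Half of each row is structurally zero, which is exactly what the block pattern of the slide's 4×4 matrix encodes.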
Example 1. The M/G/1 Queue
[Figure: a sample path of X(t), jumping up at each arrival and down at each departure.]
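For the M/G/1 queue, the standard embedded chain observes the queue length at departure epochs, where X(n+1) = max(X(n) − 1, 0) + A(n+1) and A(n+1) is the number of Poisson arrivals during the (n+1)st service time. A simulation sketch; service times are drawn exponential here purely for concreteness, since the "G" permits any service distribution:

```python
import random

def mg1_embedded(n_steps, lam=0.8, mu=1.0, seed=1):
    """Embedded queue-length chain of an M/G/1 queue at departure epochs.

    lam is the Poisson arrival rate.  Service times are Exp(mu) here only
    for concreteness; any service distribution works (the 'G' in M/G/1),
    and only that one line would change.
    """
    rng = random.Random(seed)
    x, path = 0, [0]
    for _ in range(n_steps):
        service = rng.expovariate(mu)
        # Count the Poisson(lam) arrivals during this service time.
        arrivals, t = 0, rng.expovariate(lam)
        while t < service:
            arrivals += 1
            t += rng.expovariate(lam)
        # Defining recursion of the embedded chain:
        x = max(x - 1, 0) + arrivals
        path.append(x)
    return path

path = mg1_embedded(1000)
```

The chain can drop by at most one per step (one departure), which is the skip-free-to-the-left structure of the M/G/1 transition matrix.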
Example 2. The G/M/1 Queue
Example 3. The General Random Walk
Example 4. The Simple Random Walk
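A sketch of the simple random walk: starting from 0, each step goes up by 1 with probability p and down by 1 with probability 1 − p, independently of the past.

```python
import random

def simple_random_walk(n_steps, p=0.5, seed=42):
    """Sample path of the simple random walk started at 0."""
    rng = random.Random(seed)
    x, path = 0, [0]
    for _ in range(n_steps):
        x += 1 if rng.random() < p else -1
        path.append(x)
    return path

path = simple_random_walk(100)
```

Note that after n steps the walk's position has the same parity as n, which is why the walk has period 2.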
Chapman-Kolmogorov Equations
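The Chapman-Kolmogorov equations say that P(i,j over m+n steps) = Σ over k of P(i,k over m steps) · P(k,j over n steps), i.e. in matrix form P^(m+n) = P^(m) P^(n). A quick numerical check; the 3-state matrix below is made up purely for illustration:

```python
def matmul(A, B):
    """Plain matrix product, sufficient for small transition matrices."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def matpow(P, n):
    """n-step transition matrix P^(n)."""
    R = [[float(i == j) for j in range(len(P))] for i in range(len(P))]
    for _ in range(n):
        R = matmul(R, P)
    return R

P = [[0.5, 0.3, 0.2],   # illustrative 3-state transition matrix
     [0.1, 0.6, 0.3],
     [0.2, 0.2, 0.6]]

lhs = matpow(P, 5)                        # P^(2+3)
rhs = matmul(matpow(P, 2), matpow(P, 3))  # P^(2) P^(3)
assert all(abs(lhs[i][j] - rhs[i][j]) < 1e-12
           for i in range(3) for j in range(3))
```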
Example. (Weather Forecasting 1)

P{rain → rain → rain} + P{rain → no rain → rain} = 0.7 × 0.7 + 0.3 × 0.4 = 0.61
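This computation is exactly the (rain, rain) entry of the two-step matrix P^(2). With state 0 = rain, state 1 = no rain, the one-step probabilities above give the full matrix:

```python
P = [[0.7, 0.3],   # state 0 = rain, state 1 = no rain
     [0.4, 0.6]]

# Two-step matrix: P2[i][j] = sum over k of P[i][k] * P[k][j]
P2 = [[sum(P[i][k] * P[k][j] for k in range(2)) for j in range(2)]
      for i in range(2)]

# P{rain two days from now | rain today} = 0.7*0.7 + 0.3*0.4 = 0.61
assert abs(P2[0][0] - 0.61) < 1e-12
```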
Example. (Weather Forecasting 2)
Example. (Weather Forecasting 2) (con’t)
Mon. Tue. Wed. Thu.: rain, rain, ?, rain

Given rain on both Monday and Tuesday, the chain starts in state 0 (Yesterday & Today both rain). Applying the transition matrix P twice takes the state from (Mon, Tue) to (Tue, Wed) to (Wed, Thu). The desired probability is given by summing the two-step transition probabilities from state 0 into the states in which "Today" is rain, so that it rains on Thursday.
Classification of States

(reflexive)
(symmetric)
(transitive)
Classification of States (con’t)

[Transition diagram with states 0, 1, 2, 3] => Three classes: {0, 1}, {2}, {3}
Classification of States (con’t)

[Transition diagram with states 0, 1, 2.]
• Transient – A state is transient if it is possible to leave the state and never return.
• Periodic – A state is periodic if it is not transient, and if that state is returned to
only on multiples of some positive integer greater than 1. This integer is known as the period of the state.
• Ergodic – A state is ergodic if it is neither transient nor periodic.
• Absorbing – A state is absorbing if it transitions to itself with probability 1; once entered, it is never left.
Classification of States (con’t)
Tip 1: Always find the communicating classes before classifying the states.
Tip 2: Classify states in the order of the definitions: transient states first, then periodic states, then ergodic states.
Tip 3: A state that has a self-loop cannot be periodic.
Tip 4: If a communicating class is periodic, each of its states has the same period.
Classification of States (con’t)

g.c.d.: greatest common divisor
Number of communicating classes: 5.

{ a, b, c }: transient
{ d }: transient
{ e }: absorbing, ergodic
{ f, g, h, i, j }: periodic with d = 3
  1. f, g, h, i, j form a communicating class.
  2. Once you enter, you never leave -> not transient.
  3. Look at state h: once we leave it, we always return in exactly 3 transitions -> period = 3 -> f, g, h, i, j all have period 3 -> f, g, h, i, j are periodic.
{ k, l, m }: ergodic
[Two transition diagrams: a graph of period two, and a non-periodic graph.]
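The period of state i is d(i) = gcd{ n ≥ 1 : P(i,i over n steps) > 0 }. A small brute-force sketch over return-path lengths; the two adjacency graphs below are hypothetical stand-ins for the diagrams above, a 2-cycle (period two) and the same graph with a self-loop added (non-periodic):

```python
from math import gcd

def period(adj, i, max_len=20):
    """gcd of the lengths n <= max_len of all return paths i -> i."""
    reach = {i}          # states reachable in exactly n steps
    d = 0
    for n in range(1, max_len + 1):
        reach = {k for j in reach for k in adj[j]}
        if i in reach:
            d = gcd(d, n)
    return d

two_cycle = {0: {1}, 1: {0}}      # 0 <-> 1: returns only in 2, 4, 6, ... steps
with_loop = {0: {0, 1}, 1: {0}}   # self-loop at 0 -> aperiodic (Tip 3)

assert period(two_cycle, 0) == 2
assert period(with_loop, 0) == 1
```

The self-loop example also confirms Tip 3: a state with a loop can return in 1 step, so its gcd of return times is 1.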
Examples of Periodicity
Example: Gambler’s Ruin
Example: Weather Prediction
[Transition diagram with states 0, 1, 2, 3.]
Classification of States (con’t)

(Periodicity is a class property)
Example:
The probability of eventually returning to state 0 is [computed on the slide to equal 1]. Thus, state 0 is recurrent.

Is state 0 recurrent nonnull or recurrent null? Since the expected return time is infinite, the state must be recurrent null.
Classification of States (con’t)
Proof (con’t)
Classification of States (con’t)

=> A finite-state Markov chain must have at least one recurrent state.
Examples of Recurrent and Transient States

Example:
1. The chain is finite.
2. All states communicate.
=> At least one recurrent state. => All states are recurrent!

Example:
Communicating classes: {0, 1}, {2, 3}, {4}
[Transition diagram: the classes {0, 1} and {2, 3} are closed (transition probabilities 1/2 within each pair), while state 4 stays with probability 1/2 and leaves to states 0 and 1 with probability 1/4 each, never to be re-entered.]
=> States 0 and 1 are recurrent! Similarly, states 2 and 3 are recurrent!
=> State 4 is transient!
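This reasoning can be automated: compute reachability, group mutually reachable states into communicating classes, and mark a class transient when some transition leaves it. The matrix below is a reconstruction of the slide's diagram (states 0–1 and 2–3 exchange with probability 1/2; state 4 stays with probability 1/2 or jumps to 0 or 1 with probability 1/4 each); treat the exact numbers as an assumption, since only the class structure matters:

```python
def classify(P):
    """Return (communicating classes, transient classes) of a finite chain."""
    n = len(P)
    # reach[i] = set of states reachable from i (including i itself)
    reach = [{i} for i in range(n)]
    changed = True
    while changed:
        changed = False
        for i in range(n):
            new = {k for j in reach[i] for k in range(n) if P[j][k] > 0}
            if not new <= reach[i]:
                reach[i] |= new
                changed = True
    # i and j communicate iff each reaches the other
    classes = []
    for i in range(n):
        cls = frozenset(j for j in range(n)
                        if i in reach[j] and j in reach[i])
        if cls not in classes:
            classes.append(cls)
    # a class is transient iff it is not closed (probability can leak out)
    transient = [c for c in classes
                 if any(P[i][j] > 0 for i in c for j in range(n) if j not in c)]
    return classes, transient

P = [[0.5,  0.5,  0,   0,   0],
     [0.5,  0.5,  0,   0,   0],
     [0,    0,    0.5, 0.5, 0],
     [0,    0,    0.5, 0.5, 0],
     [0.25, 0.25, 0,   0,   0.5]]

classes, transient = classify(P)
assert set(classes) == {frozenset({0, 1}), frozenset({2, 3}), frozenset({4})}
assert transient == [frozenset({4})]
```

For a finite chain, "recurrent class" and "closed class" coincide, which is exactly the criterion the code applies.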
Classification of States (con’t)

(Recurrence is a class property)
Example. The Simple Random Walk

[Transition diagram on the integers … −2, −1, 0, 1, 2, …: from each state, move right with probability p and left with probability 1 − p.]
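For the simple random walk, the return probabilities are P(0,0 over 2n steps) = C(2n, n) p^n (1 − p)^n, and state 0 is recurrent iff their sum diverges, which happens exactly when p = 1/2 (then 4p(1 − p) = 1 and the terms decay only like 1/√(πn)). A numerical sketch; the terms are updated via the ratio t(n)/t(n−1) = 2(2n − 1)/n · p(1 − p) so the huge binomial coefficients never overflow a float:

```python
def return_prob_sum(p, n_terms):
    """Partial sum of P00^(2n) = C(2n, n) * (p * (1 - p))**n, n = 1..n_terms."""
    total, term = 0.0, 1.0
    for n in range(1, n_terms + 1):
        # ratio of consecutive terms: C(2n, n) / C(2n-2, n-1) = 2 * (2n - 1) / n
        term *= 2 * (2 * n - 1) / n * p * (1 - p)
        total += term
    return total

# p = 1/2: terms ~ 1/sqrt(pi * n), partial sums keep growing -> divergence
# <=> state 0 is recurrent.
assert return_prob_sum(0.5, 20000) > return_prob_sum(0.5, 2000) + 2

# p != 1/2: 4p(1 - p) < 1, terms decay geometrically -> convergence
# <=> state 0 is transient.
assert return_prob_sum(0.3, 2000) - return_prob_sum(0.3, 1000) < 1e-12
```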
Classification of States (con’t)
Limit Theorems
Strong Law of Renewal Processes
Elementary Renewal Theorem
Blackwell's Theorem
stationary probabilities, limiting probabilities, equilibrium probabilities, or balance probabilities
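For the two-state weather chain from the earlier example, P = [[0.7, 0.3], [0.4, 0.6]], the stationary distribution solves the balance equations π = πP with π0 + π1 = 1, giving π = (4/7, 3/7). A sketch via power iteration (repeatedly applying P), which converges here because the chain is ergodic:

```python
P = [[0.7, 0.3],   # state 0 = rain, state 1 = no rain
     [0.4, 0.6]]

pi = [1.0, 0.0]                    # any starting distribution works
for _ in range(200):               # pi <- pi P until it stops changing
    pi = [sum(pi[i] * P[i][j] for i in range(2)) for j in range(2)]

# balance equations: pi = pi P, with the components summing to 1
assert abs(pi[0] - 4 / 7) < 1e-9 and abs(pi[1] - 3 / 7) < 1e-9
```

In the long run it rains on a fraction 4/7 of the days, regardless of the initial state, which is the sense in which the stationary and limiting probabilities agree.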
Note that for a finite Markov chain, there must be at least one recurrent state.
If there is only one class, the limiting probabilities and the equilibrium probabilities coincide, and there is only one stationary distribution.
=> A chain in which every state is transient must have infinitely many transient states!
Proof.
Proof. (con’t)
Contradiction!!
balance equations