Introduction to Markov chains - Computer Science
TRANSCRIPT of slides ib/Classes/CS801_Spring11-12/Slides/Intro.pdf (Author: ib, created 3/22/2012)
Examples of Markov chains:
- Random walk on a line
- Random walk on a graph
- Random walk on a hypercube
- Markov chain on graph colorings
- Markov chain on matchings of a graph
Definition of Markov chains
- State space Ω
- Transition matrix P, of size |Ω| × |Ω|
- P(x,y) is the probability of moving from state x to state y in one step
- For every x ∈ Ω: ∑_{y∈Ω} P(x,y) = 1 (each row of P is a probability distribution)

This course: the state space is finite.
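As a concrete sketch of these definitions (the 3-state chain and its probabilities below are hypothetical, chosen only for illustration):

```python
# Hypothetical 3-state chain; P[x][y] is the probability of moving
# from state x to state y in one step.
P = [
    [0.5, 0.5, 0.0],
    [0.2, 0.3, 0.5],
    [0.0, 0.4, 0.6],
]

# Each row of P is a probability distribution: sum over y of P(x, y) is 1.
for row in P:
    assert abs(sum(row) - 1.0) < 1e-12

# One step of the chain from distribution mu: (mu P)(y) = sum_x mu(x) P(x, y).
mu = [1.0, 0.0, 0.0]   # start deterministically in state 0
after_one_step = [sum(mu[x] * P[x][y] for x in range(3)) for y in range(3)]
print(after_one_step)  # [0.5, 0.5, 0.0] -- the first row of P
```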
Example: frog on lily pads
Stationary distribution π (on states in Ω): a distribution satisfying πP = π, i.e. ∑_{x∈Ω} π(x) P(x,y) = π(y) for every y.

Example: frog with unfair coins
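The two-pad version can be checked by hand. Assuming (hypothetically — the slides' actual coin biases are not in this transcript) that the frog jumps from pad 0 to pad 1 with probability p and from pad 1 to pad 0 with probability q, solving πP = π gives π = (q/(p+q), p/(p+q)):

```python
# Hypothetical unfair-coin probabilities for the two-pad frog chain.
p, q = 0.3, 0.6
P = [[1 - p, p],
     [q, 1 - q]]

# Candidate stationary distribution for a general two-state chain.
pi = [q / (p + q), p / (p + q)]

# Check stationarity: (pi P)(y) = sum_x pi(x) P(x, y) must equal pi(y).
for y in range(2):
    assert abs(sum(pi[x] * P[x][y] for x in range(2)) - pi[y]) < 1e-9

print(pi)  # [2/3, 1/3] for p = 0.3, q = 0.6
```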
- aperiodic: the chain does not cycle with a fixed period; formally, gcd{ t ≥ 1 : P^t(x,x) > 0 } = 1 for every state x
- irreducible: every state is reachable from every other state in some number of steps
- ergodic: a MC that is both aperiodic and irreducible

Thm (3.6 in Jerrum): An ergodic MC has a unique stationary distribution π. Moreover, from any starting distribution, the chain converges to π as time goes to ∞.
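A quick numerical illustration of the convergence claim, reusing a hypothetical two-state chain (aperiodic and irreducible, hence ergodic): two different starting distributions are driven to the same π.

```python
# Hypothetical ergodic two-state chain (p, q chosen for illustration).
p, q = 0.3, 0.6
P = [[1 - p, p],
     [q, 1 - q]]

def step(mu):
    """One step of the chain: mu -> mu P."""
    return [sum(mu[x] * P[x][y] for x in range(2)) for y in range(2)]

mu_a, mu_b = [1.0, 0.0], [0.0, 1.0]   # two different starting distributions
for _ in range(60):
    mu_a, mu_b = step(mu_a), step(mu_b)

# Both converge to the unique stationary distribution (q/(p+q), p/(p+q)) = (2/3, 1/3).
print(mu_a, mu_b)
```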
Gambler’s Ruin

A gambler has k dollars initially and bets on (fair) coin flips. If she guesses the coin flip, she gets 1 dollar; if not, she loses 1 dollar. She stops when she reaches either 0 or n dollars.

Is it a Markov chain?

How many bets until she stops playing? What are the probabilities of ending at 0 vs. n dollars?
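Yes, it is a Markov chain: the next fortune depends only on the current fortune. A minimal sketch of its transition matrix on states {0, 1, …, n} (n = 5 is an arbitrary illustrative choice):

```python
# Gambler's-ruin chain for a fair coin: interior states move up or down
# with probability 1/2; the boundary states 0 and n are absorbing.
n = 5
P = [[0.0] * (n + 1) for _ in range(n + 1)]
P[0][0] = P[n][n] = 1.0          # absorbing barriers
for i in range(1, n):
    P[i][i - 1] = P[i][i + 1] = 0.5

# Sanity check: every row is a probability distribution.
for row in P:
    assert abs(sum(row) - 1.0) < 1e-12

print(P[2])  # [0.0, 0.5, 0.0, 0.5, 0.0, 0.0]
```

Note that this chain is not irreducible (nothing leaves 0 or n), so the ergodic theorem above does not apply: any mixture of the point masses at 0 and at n is stationary, and the stationary distribution is not unique.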
Stationary distribution?
Starting with k dollars, what is the probability of reaching n?
What is the probability of reaching 0?
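The classical answers (standard facts about the fair walk, not quoted from the slides): the hitting probability f(k) = Pr(reach n before 0 | start at k) satisfies f(k) = ½f(k−1) + ½f(k+1) with f(0) = 0 and f(n) = 1, so f is linear and f(k) = k/n; the probability of ruin is (n−k)/n. A Monte Carlo sketch to check this:

```python
import random

def reaches_n(k, n, rng):
    """Run the fair gambler's-ruin walk from k; True if it hits n before 0."""
    x = k
    while 0 < x < n:
        x += 1 if rng.random() < 0.5 else -1
    return x == n

rng = random.Random(42)
k, n, trials = 3, 10, 40000
hits = sum(reaches_n(k, n, rng) for _ in range(trials))
print(hits / trials)  # should be close to k/n = 0.3
```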
How many bets until reaching 0 or n?
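The standard answer (again a well-known fact, not quoted from the slides): the expected duration E(k) satisfies E(k) = 1 + ½E(k−1) + ½E(k+1) with E(0) = E(n) = 0, which is solved by E(k) = k(n−k). A simulation sketch:

```python
import random

def bets_until_stop(k, n, rng):
    """Number of fair bets until the fortune hits 0 or n, starting from k."""
    x, t = k, 0
    while 0 < x < n:
        x += 1 if rng.random() < 0.5 else -1
        t += 1
    return t

rng = random.Random(7)
k, n, trials = 3, 10, 20000
avg = sum(bets_until_stop(k, n, rng) for _ in range(trials)) / trials
print(avg)  # should be close to k * (n - k) = 21
```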
Coupon Collecting

- n coupon types
- each step, get a coupon chosen uniformly at random from 1, …, n (i.e., you will likely get some coupons multiple times)
- how long to get all n coupons?

Is this a Markov chain?
How long to get all n coupons?
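The standard computation: when i coupons have been collected, each draw yields a new one with probability (n−i)/n, a geometric wait with mean n/(n−i); summing over i gives E[τ] = n(1/1 + 1/2 + … + 1/n) = n·H_n ≈ n ln n. A simulation sketch of this:

```python
import random

def coupon_time(n, rng):
    """Number of uniform draws from {0, ..., n-1} until all n values are seen."""
    seen, draws = set(), 0
    while len(seen) < n:
        seen.add(rng.randrange(n))
        draws += 1
    return draws

rng = random.Random(1)
n, trials = 20, 5000
avg = sum(coupon_time(n, rng) for _ in range(trials)) / trials
expected = n * sum(1 / i for i in range(1, n + 1))  # n * H_n
print(avg, expected)  # both roughly 72 for n = 20
```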
Proposition: Let τ be the random variable for the time needed to collect all n coupons. Then

Pr(τ > ⌈n log n + cn⌉) ≤ e^{-c}.
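The proof is a union bound: a fixed coupon is still missing after m draws with probability (1 − 1/n)^m ≤ e^{−m/n}; taking m = ⌈n log n + cn⌉ makes this at most e^{−c}/n, and summing over the n coupons gives e^{−c}. An empirical check of the bound (the parameters n = 25, c = 1 are arbitrary):

```python
import math
import random

def coupon_time(n, rng):
    """Draws until all n coupon types have been seen."""
    seen, draws = set(), 0
    while len(seen) < n:
        seen.add(rng.randrange(n))
        draws += 1
    return draws

rng = random.Random(3)
n, c, trials = 25, 1.0, 4000
threshold = math.ceil(n * math.log(n) + c * n)   # ceil(n log n + cn), natural log
exceed = sum(coupon_time(n, rng) > threshold for _ in range(trials)) / trials
print(exceed, math.exp(-c))  # empirical tail probability vs. the bound e^{-c}
```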
Random Walk on a Hypercube

How many expected steps until we get to a random state? (I.e., what is the mixing time?)
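One standard way to see the answer, sketched here with the lazy version of the walk (each step picks a uniform coordinate and sets it to a fair random bit; this may differ from the exact walk on the slides): once every coordinate has been refreshed at least once, the state is exactly uniform, so the refresh time — which is exactly a coupon-collector time — bounds the mixing time at roughly n log n steps for dimension n.

```python
import random

def refresh_time(n, rng):
    """Lazy random walk on {0,1}^n: each step, pick a uniform coordinate
    and set it to a fair random bit.  Returns the number of steps until
    every coordinate has been refreshed at least once, at which point
    the state is exactly uniform on the hypercube."""
    state = [0] * n
    refreshed, steps = set(), 0
    while len(refreshed) < n:
        i = rng.randrange(n)
        state[i] = rng.randrange(2)  # fresh fair bit for coordinate i
        refreshed.add(i)
        steps += 1
    return steps

rng = random.Random(5)
n, trials = 16, 4000
avg = sum(refresh_time(n, rng) for _ in range(trials)) / trials
expected = n * sum(1 / i for i in range(1, n + 1))  # n * H_n ~ n ln n
print(avg, expected)  # both close to 54 for n = 16
```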