
Page 1

Group exercise

For 0 ≤ t1 < … < tn real and 0 ≤ r1 ≤ … ≤ rn integers, define X(t) by X(0) = 0 and

$$P(X(t_1)=r_1,\ldots,X(t_n)=r_n) \;=\; \frac{\lambda^{r_n} e^{-\lambda t_n}\, t_1^{r_1}\,(t_2-t_1)^{r_2-r_1}\cdots(t_n-t_{n-1})^{r_n-r_{n-1}}}{r_1!\,(r_2-r_1)!\cdots(r_n-r_{n-1})!}.$$

(a) Find P(X(t)=0)

(b) Determine P(X(t)=k)

In particular, the marginal pmf is

$$P(X(t_1)=r_1) \;=\; \frac{\lambda^{r_1} e^{-\lambda t_1}\, t_1^{r_1}}{r_1!} \;=\; \frac{(\lambda t_1)^{r_1} e^{-\lambda t_1}}{r_1!},$$

i.e. X(t1) ~ Po(λt1).
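As a quick sanity check on part (b), here is a minimal Python sketch (not part of the exercise; it assumes NumPy and arbitrary example values λ = 2, t = 3) that simulates the process via exponential inter-arrival times and compares the empirical distribution of X(t) with the Po(λt) pmf suggested by the marginal above.

```python
import numpy as np
from math import exp, factorial

# Minimal sketch (not from the slides): simulate a rate-lambda Poisson process
# via Exp(lambda) inter-arrival times and compare the empirical distribution
# of X(t) with the Po(lambda * t) pmf suggested by the marginal above.
rng = np.random.default_rng(0)
lam, t, n_sims = 2.0, 3.0, 100_000

counts = np.empty(n_sims, dtype=int)
for i in range(n_sims):
    arrival, k = 0.0, 0
    while True:
        arrival += rng.exponential(1.0 / lam)   # next inter-arrival time
        if arrival > t:
            break
        k += 1
    counts[i] = k

for k in range(6):
    empirical = np.mean(counts == k)
    theoretical = exp(-lam * t) * (lam * t) ** k / factorial(k)
    print(f"P(X(t)={k}): empirical {empirical:.4f}, Po(lambda*t) {theoretical:.4f}")
```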

Page 2

(c) Show that X(t1) and X(t2)-X(t1) are independent

(d) Show that Kolmogorov’s consistency condition is satisfied (try this at home, xc)

$$P(X(t_1)=r_1,\ldots,X(t_n)=r_n) \;=\; \frac{\lambda^{r_n} e^{-\lambda t_n}\, t_1^{r_1}\,(t_2-t_1)^{r_2-r_1}\cdots(t_n-t_{n-1})^{r_n-r_{n-1}}}{r_1!\,(r_2-r_1)!\cdots(r_n-r_{n-1})!}$$

$$P(X(t_1)=r_1,\,X(t_2)=r_2) \;=\; P(X(t_1)=r_1,\,X(t_2)-X(t_1)=r_2-r_1) \;=\; \frac{\lambda^{r_2} e^{-\lambda t_2}\, t_1^{r_1}\,(t_2-t_1)^{r_2-r_1}}{r_1!\,(r_2-r_1)!}$$

$$=\; P(X(t_1)=r_1)\,\frac{[\lambda(t_2-t_1)]^{r_2-r_1}\, e^{-\lambda(t_2-t_1)}}{(r_2-r_1)!}$$
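The factorization above can also be checked numerically. A small sketch (the values of λ, t1, t2 are arbitrary examples) evaluating the two-point joint formula and the product of the Po(λt1) marginal with the Po(λ(t2−t1)) increment pmf:

```python
from math import exp, factorial

# Sketch check of the factorization above: the two-point joint pmf equals the
# Po(lambda*t1) marginal times the Po(lambda*(t2 - t1)) pmf of the increment.
def joint(r1, r2, t1, t2, lam):
    return (lam**r2 * exp(-lam * t2) * t1**r1 * (t2 - t1)**(r2 - r1)
            / (factorial(r1) * factorial(r2 - r1)))

def po(k, mean):
    return mean**k * exp(-mean) / factorial(k)

lam, t1, t2 = 1.5, 2.0, 5.0                      # arbitrary example values
for r1 in range(4):
    for r2 in range(r1, r1 + 5):
        lhs = joint(r1, r2, t1, t2, lam)
        rhs = po(r1, lam * t1) * po(r2 - r1, lam * (t2 - t1))
        assert abs(lhs - rhs) < 1e-12
print("factorization holds for all values checked")
```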

Page 3

Markov chains

Chapters 5 and 6

Page 4

Page 5

Precipitation data

January data from Snoqualmie Falls, Washington, 1948-1983

325 dry and 791 wet days

Rt = 1(rain on day t) ~ Bern(p), independently

E(# WW day pairs) = N × 30 × p² (N = 36 Januaries, 30 consecutive-day pairs each)

36 × 30 × (791/1116)² ≈ 543

Observed counts, with expected counts under the independence model in parentheses:

                 Today wet    Today dry    Total
Yesterday wet    643 (543)    128 (223)     771
Yesterday dry    123 (223)    186  (91)     309
Total                766          314      1080
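The expected counts in parentheses can be reproduced from the marginal wet-day frequency. A short sketch (assuming the counts above):

```python
# Sketch: expected cell counts for the yesterday/today table under the
# independent Bern(p) model, with p estimated from 791 wet days out of 1116.
n_pairs = 36 * 30              # 36 Januaries x 30 consecutive-day pairs each
p = 791 / 1116                 # estimated probability that a given day is wet

expected = {
    ("wet", "wet"): n_pairs * p * p,
    ("wet", "dry"): n_pairs * p * (1 - p),
    ("dry", "wet"): n_pairs * (1 - p) * p,
    ("dry", "dry"): n_pairs * (1 - p) * (1 - p),
}
for (yesterday, today), value in expected.items():
    print(yesterday, today, round(value, 1))   # ~542.6, 222.9, 222.9, 91.6
```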

Page 6

A conditional model

P(Rt = 1 | Rt-1 = 1) = p11

P(Rt = 1 | Rt-1 = 0) = p01

Special case of

$$P(R_t = 1 \mid R_{t-1} = i_1, \ldots, R_0 = i_t), \qquad i_k = 0 \text{ or } 1.$$

Transition matrix, with $p_d$ = P(rain following a dry day) and $p_w$ = P(rain following a wet day):

$$P = \begin{pmatrix} p_{00} & p_{01} \\ p_{10} & p_{11} \end{pmatrix} = \begin{pmatrix} 1-p_d & p_d \\ 1-p_w & p_w \end{pmatrix}$$
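Estimating $p_d$ and $p_w$ from the Page 5 counts and forming P gives the matrix used later on Page 12. A sketch, assuming NumPy:

```python
import numpy as np

# Sketch (using the observed counts from Page 5): estimate p_w and p_d by
# conditioning on yesterday's state, then form the transition matrix P.
p_w = 643 / 771        # P(rain today | rain yesterday)
p_d = 123 / 309        # P(rain today | dry yesterday)

P = np.array([[1 - p_d, p_d],      # row 0: transitions out of a dry day
              [1 - p_w, p_w]])     # row 1: transitions out of a wet day
print(P.round(3))                  # ~[[0.602, 0.398], [0.166, 0.834]]

# Rows of a stochastic matrix sum to one (anticipating the theorem on Page 8).
assert np.allclose(P.sum(axis=1), 1.0)
```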

Page 7

More generally

Consider a stochastic process (Xn, n≥0) taking values in a discrete state space S. It is a Markov chain if

$$P(X_n = k \mid X_{n-1} = k_1, \ldots, X_1 = k_{n-1}, X_0 = k_n) = P(X_n = k \mid X_{n-1} = k_1) \equiv p_{k_1,k}(n).$$

The quantities $p_{ij}(n)$ are called the transition probabilities. The chain is homogeneous if the transition probabilities do not depend on n. We will usually assume this.

Page 8

The transition matrix

The matrix P=(pij) is called the transition matrix.

Theorem: P has nonnegative entries and all row sums are one.

Such matrices are called stochastic.

Proof:

$$\sum_{j \in S} p_{ij} = \sum_{j \in S} P(X_n = j \mid X_{n-1} = i) = P(X_n \in S \mid X_{n-1} = i) = 1.$$

Page 9

n-step transitions

The quantities

$$p_{ij}(n) = P(X_{m+n} = j \mid X_m = i)$$

are called n-step transition probabilities. The matrix of them is denoted P(n).

Theorem (Chapman-Kolmogorov):

P(n+m) = P(n)P(m)

Proof:

$$P(X_{m+n} = j \mid X_0 = i) = \sum_{k \in S} P(X_{m+n} = j,\, X_m = k \mid X_0 = i)$$
$$= \sum_{k \in S} P(X_{m+n} = j \mid X_m = k, X_0 = i)\, P(X_m = k \mid X_0 = i)$$
$$= \sum_{k \in S} P(X_{m+n} = j \mid X_m = k)\, P(X_m = k \mid X_0 = i),$$

using the Markov property.
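A quick numerical illustration of the theorem, using an arbitrary randomly generated stochastic matrix (purely an example, assuming NumPy):

```python
import numpy as np

# Sketch: numerical illustration of Chapman-Kolmogorov, P(n+m) = P(n) P(m),
# for an arbitrary randomly generated 3x3 stochastic matrix.
rng = np.random.default_rng(0)
P = rng.random((3, 3))
P /= P.sum(axis=1, keepdims=True)          # normalize rows so P is stochastic

P2 = np.linalg.matrix_power(P, 2)          # P(2)
P3 = np.linalg.matrix_power(P, 3)          # P(3)
P5 = np.linalg.matrix_power(P, 5)          # P(5) = P^5 (Consequence 1 below)
assert np.allclose(P2 @ P3, P5)            # P(2) P(3) = P(5)
print("Chapman-Kolmogorov holds numerically")
```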

Page 10

Consequences

1. $P(n) = P^n$

2. Let $\mu_k(n) = P(X_n = k)$ and $\mu(n) = (\mu_k(n))$. Then $\mu(m+n) = \mu(n)P^m$.

3. In particular, $\mu(n) = \mu(0)P^n$.

Page 11

Back to precipitation

Let $\mu_1(0) = p_1$. Then

$$\mu_1(n) = P(X_n = 1) = P(X_n = 1, X_{n-1} = 0) + P(X_n = 1, X_{n-1} = 1)$$
$$= \mu_0(n-1)\,p_{01} + \mu_1(n-1)\,p_{11} = \mu_1(n-1)(p_{11} - p_{01}) + p_{01},$$

since $\mu_0(n-1) = 1 - \mu_1(n-1)$. Iterating,

$$\mu_1(1) = p_1(p_{11} - p_{01}) + p_{01}$$
$$\mu_1(2) = \mu_1(1)(p_{11} - p_{01}) + p_{01} = p_1(p_{11} - p_{01})^2 + p_{01}\bigl(1 + (p_{11} - p_{01})\bigr)$$
$$\mu_1(n) = p_1(p_{11} - p_{01})^n + p_{01}\sum_{j=0}^{n-1}(p_{11} - p_{01})^j.$$
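The closed form can be checked against direct iteration of the recursion. A sketch using the precipitation values $p_{01} = 0.398$, $p_{11} = 0.834$ and an arbitrary starting probability $p_1$:

```python
# Sketch: compare the closed form for mu_1(n) with direct iteration of the
# recursion, using the precipitation values p01 = 0.398, p11 = 0.834 and an
# arbitrary starting probability p1 = mu_1(0).
p01, p11, p1 = 0.398, 0.834, 0.5

def mu1_recursive(n):
    mu = p1
    for _ in range(n):
        mu = mu * (p11 - p01) + p01
    return mu

def mu1_closed_form(n):
    d = p11 - p01
    return p1 * d**n + p01 * sum(d**j for j in range(n))

for n in range(10):
    assert abs(mu1_recursive(n) - mu1_closed_form(n)) < 1e-12
print("recursion and closed form agree")
```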

Page 12

$p_{00} = p_{11} = 1$: $P(X_n = 1) = p_1$

$p_{01} \neq p_{11}$:

$$\mu_1(n) = p_1(p_{11} - p_{01})^n + p_{01}\sum_{j=0}^{n-1}(p_{11} - p_{01})^j
= \frac{p_{01}}{1 - (p_{11} - p_{01})} + \left(p_1 - \frac{p_{01}}{1 - (p_{11} - p_{01})}\right)(p_{11} - p_{01})^n$$

If it rains on Jan 1, what is the chance that it rains on Jan 6?

$$P = \begin{pmatrix} 0.602 & 0.398 \\ 0.166 & 0.834 \end{pmatrix}, \qquad
P^5 = \begin{pmatrix} 0.305 & 0.695 \\ 0.290 & 0.710 \end{pmatrix}$$
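A sketch of the computation behind the question (assuming NumPy, and taking "rain on Jan 1" to mean the chain starts in the wet state, so Jan 6 is five transitions later):

```python
import numpy as np

# Sketch: if Jan 1 is wet, the chance of rain on Jan 6 is the (wet, wet) entry
# of P^5, since Jan 6 is five transitions after Jan 1.
P = np.array([[0.602, 0.398],      # states ordered (dry, wet)
              [0.166, 0.834]])
P5 = np.linalg.matrix_power(P, 5)
print(P5.round(3))                                           # ~[[0.305, 0.695], [0.290, 0.710]]
print("P(rain Jan 6 | rain Jan 1) ~", round(P5[1, 1], 3))    # ~0.710
```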

Page 13

Generating functions

a = (a0, a1, a2, ...) a sequence of real numbers;

$$G_a(s) = \sum_{i=0}^{\infty} a_i s^i$$

is its generating function.

Special case: the probability generating function (pgf), where the $a_i = p_i$ are the probabilities of a nonnegative integer-valued random variable X:

$$G(s) = \sum_{i=0}^{\infty} p_i s^i = E\,s^X$$

$$\left.\frac{d^k}{ds^k} G(s)\right|_{s=0} = k!\,p_k, \qquad
\left.\frac{d^k}{ds^k} G(s)\right|_{s \uparrow 1} = E\bigl(X(X-1)\cdots(X-k+1)\bigr)$$
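Both derivative identities are easy to confirm symbolically for a concrete pgf. A sketch using SymPy and the Poisson(λ) pgf $e^{\lambda(s-1)}$ from the next page:

```python
import sympy as sp

# Sketch: verify both derivative identities for the Poisson(lam) pgf
# G(s) = exp(lam*(s - 1)), whose coefficients are p_k = lam**k * exp(-lam) / k!.
s, lam = sp.symbols('s lam', positive=True)
G = sp.exp(lam * (s - 1))

for k in range(5):
    dk = sp.diff(G, s, k)
    p_k = lam**k * sp.exp(-lam) / sp.factorial(k)
    assert sp.simplify(dk.subs(s, 0) - sp.factorial(k) * p_k) == 0   # k! * p_k at s = 0
    assert sp.simplify(dk.subs(s, 1) - lam**k) == 0   # factorial moment at s = 1 is lam**k
```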

Page 14

Examples

Bernoulli:

$$G_X(s) = s^0(1-p) + s\,p = 1 + p(s-1)$$

Geometric:

$$G_X(s) = \sum_{k=0}^{\infty} s^k\, p(1-p)^k = \frac{p}{1 - (1-p)s}$$

Poisson:

$$G_X(s) = \sum_{k=0}^{\infty} s^k\, \frac{\lambda^k}{k!} e^{-\lambda} = e^{-\lambda} e^{s\lambda} = e^{\lambda(s-1)}$$

Page 15

Convolutions

The convolution of sequences a and b is c=a*b where ci=a0bi+a1bi-1+...+aib0

Theorem: Gc(s) = Ga(s)Gb(s)

Proof:

$$\sum_{i=0}^{\infty} c_i s^i = \sum_{i=0}^{\infty}\left(\sum_{k=0}^{i} a_k b_{i-k}\right) s^i
= \sum_{k=0}^{\infty} a_k s^k \sum_{i=k}^{\infty} b_{i-k} s^{i-k} = G_a(s)\,G_b(s).$$
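A small numerical illustration (assuming NumPy): convolving two Bern(p) pmfs gives the Bin(2, p) pmf, and the generating functions multiply.

```python
import numpy as np

# Sketch: convolving two Bern(p) pmfs gives the Bin(2, p) pmf, and the
# generating functions multiply: G_c(s) = G_a(s) * G_b(s).
p, s = 0.3, 0.7                      # arbitrary example values
a = np.array([1 - p, p])             # pmf of a Bern(p) variable on {0, 1}
c = np.convolve(a, a)                # pmf of the sum of two independent Bern(p)
print(c)                             # [0.49, 0.42, 0.09] = Bin(2, 0.3) pmf

G = lambda coeffs, x: sum(ck * x**k for k, ck in enumerate(coeffs))
assert np.isclose(G(c, s), G(a, s) * G(a, s))
```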

Page 16

Sum of iid random variables

Xi nonnegative integer-valued, iid; Sn = X1 + ... + Xn. Markov chain?

$$G_{S_n}(s) = G_X(s)^n$$

$$E(S_n) = n\,G_X'(1-)\,\bigl[G_X(1)\bigr]^{n-1} = n\,E(X)$$

$$P(S_n = 1) = n\,G_X'(0)\,\bigl[G_X(0)\bigr]^{n-1} = n\,p_1\,p_0^{\,n-1}$$

Bernoulli case: $G_{S_n}(s) = (1 + p(s-1))^n$

Random walk case: Xi = 1 with probability p, −1 with probability 1−p
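As a check on the Bernoulli case, a SymPy sketch confirming that the pgf-derivative formula for P(Sn = 1) reduces to n p (1−p)^(n−1):

```python
import sympy as sp

# Sketch: in the Bernoulli case, the pgf-derivative formula for P(S_n = 1),
# n * G_X'(0) * G_X(0)**(n - 1), reduces to n * p * (1 - p)**(n - 1).
s, p, n = sp.symbols('s p n', positive=True)
G_X = 1 + p * (s - 1)                                  # Bernoulli pgf

via_pgf = n * sp.diff(G_X, s).subs(s, 0) * G_X.subs(s, 0) ** (n - 1)
assert sp.simplify(via_pgf - n * p * (1 - p) ** (n - 1)) == 0
print(via_pgf)      # n*p*(1 - p)**(n - 1)
```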