Lecture 12 and 13


12 Independence, the weak law of large numbers and the central limit theorem

The following definition is central to probability.

Definition 12.1 Let $(\Omega, \mathcal{F}, \mathbb{P})$ be a probability space and $\{\mathcal{G}_i : i = 1, 2, \ldots\}$ a collection of sub-$\sigma$-fields of $\mathcal{F}$. We say $\{\mathcal{G}_i : i = 1, 2, \ldots\}$ are independent if for every sequence of sets $(G_i)_{i=1}^{\infty}$ such that $G_i \in \mathcal{G}_i$ for every $i$, and every finite collection of distinct indices $i_1, \ldots, i_n \in \mathbb{N}$, the following equality holds:
$$\mathbb{P}(G_{i_1} \cap \cdots \cap G_{i_n}) = \prod_{k=1}^{n} \mathbb{P}(G_{i_k}).$$
We then say a collection of random variables $(X_i)_{i=1}^{\infty}$ on $(\Omega, \mathcal{F}, \mathbb{P})$ are independent if the $\sigma$-fields $(\sigma(X_i))_{i=1}^{\infty}$ are independent. Similarly, a collection of events $(A_i)_{i=1}^{\infty}$ in $\mathcal{F}$ are independent if the $\sigma$-fields $(\sigma(A_i))_{i=1}^{\infty} = (\{\emptyset, A_i, A_i^c, \Omega\})_{i=1}^{\infty}$ are independent.

Exercise 12.2 If $(X_i)_{i=1}^{n}$ are independent random variables on $(\Omega, \mathcal{F}, \mathbb{P})$ and $f_i : \mathbb{R} \to \mathbb{R}$, $i = 1, \ldots, n$, are bounded measurable functions, prove that
$$\mathbb{E}\left[\prod_{i=1}^{n} f_i(X_i)\right] = \prod_{i=1}^{n} \mathbb{E}[f_i(X_i)].$$
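A quick Monte Carlo sanity check of the identity in Exercise 12.2 (an illustration, not a proof): for a concrete choice of independent random variables and bounded functions, the two sides should agree up to sampling error. The particular distributions and functions below are assumptions made only for this demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples = 1_000_000

# Illustrative choice: X1 ~ Uniform(0, 1) and X2 ~ N(0, 1), sampled independently.
x1 = rng.uniform(0.0, 1.0, n_samples)
x2 = rng.normal(0.0, 1.0, n_samples)

# Bounded measurable functions f1, f2 (chosen purely for illustration).
f1 = np.cos(x1)
f2 = np.sin(x2) ** 2

lhs = np.mean(f1 * f2)             # estimate of E[f1(X1) f2(X2)]
rhs = np.mean(f1) * np.mean(f2)    # estimate of E[f1(X1)] E[f2(X2)]
print(lhs, rhs)                    # the two estimates agree up to Monte Carlo error
```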

12.1 The weak law of large numbers (WLLN) and central limit theorem (CLT)

We recall some basic facts. First, if $X$ is a random variable on $(\Omega, \mathcal{F}, \mathbb{P})$ with characteristic function $\phi(t) = \mathbb{E}[e^{itX}]$, then if $\mathbb{E}[|X|^k] < \infty$ for $k = 1, \ldots, n$ we have the Taylor expansion
$$\phi(t) = \sum_{k=0}^{n} \frac{(it)^k \, \mathbb{E}[X^k]}{k!} + R_n(t),$$
where $R_n(t)$ is $o(t^n)$ as $t \to 0$, i.e. $t^{-n} R_n(t) \to 0$ as $t \to 0$. Second, we recall the classical result that for any sequence of complex numbers $z_n \to z \in \mathbb{C}$ as $n \to \infty$ we have
$$\lim_{n \to \infty} \left(1 + \frac{z_n}{n}\right)^n = e^z.$$
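A minimal numerical illustration of the second fact (the limit $z$ and the sequence $z_n$ are arbitrary choices for the demo): for a convergent complex sequence $z_n \to z$, the quantity $(1 + z_n/n)^n$ approaches $e^z$.

```python
import cmath

z = 0.5 + 1.0j                      # an arbitrary limit, chosen for the demo
for n in [10, 100, 1_000, 10_000]:
    z_n = z + 1.0 / n               # a sequence with z_n -> z
    print(n, (1 + z_n / n) ** n, cmath.exp(z))  # the first value approaches exp(z)
```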

Lemma 12.3 Suppose $(X_n)_{n=1}^{\infty}$ is a sequence of random variables on $(\Omega, \mathcal{F}, \mathbb{P})$ such that $X_n \xrightarrow{D} X$ as $n \to \infty$. If $X$ is constant a.s. then $X_n \xrightarrow{P} X$ as $n \to \infty$.

Proof. Suppose $X = c$ a.s. By considering $X_n - c$ we may assume that $c = 0$. Then, for $\varepsilon > 0$ we let $f_\varepsilon \in C_b(\mathbb{R})$ be the bounded continuous function defined by
$$f_\varepsilon(x) = \frac{|x|}{\varepsilon} \, 1_{[-\varepsilon, \varepsilon]}(x) + 1_{[-\varepsilon, \varepsilon]^c}(x).$$


Since $X_n \xrightarrow{D} X = 0$ as $n \to \infty$ we have
$$\lim_{n \to \infty} \mathbb{E}[f_\varepsilon(X_n)] = \mathbb{E}[f_\varepsilon(X)] = f_\varepsilon(0) = 0.$$
On the other hand, $f_\varepsilon(X_n) \geq 1_{[-\varepsilon, \varepsilon]^c}(X_n)$ and hence
$$0 \leq \mathbb{P}(|X_n - X| \geq \varepsilon) = \mathbb{P}(|X_n| \geq \varepsilon) = \mathbb{E}\big[1_{[-\varepsilon, \varepsilon]^c}(X_n)\big] \leq \mathbb{E}[f_\varepsilon(X_n)] \to 0$$
as $n \to \infty$.

Definition 12.4 We say a random variable $X$ has finite mean if $\mu = \mathbb{E}[X] < \infty$ and, in this case, $X$ has finite variance if $\sigma^2 = \mathbb{E}\big[(X - \mu)^2\big] < \infty$.

Theorem 12.5 (WLLN) Suppose $(X_n)_{n=1}^{\infty}$ is a sequence of independent and identically distributed (i.i.d.) random variables on $(\Omega, \mathcal{F}, \mathbb{P})$ with finite mean $\mu$, then
$$\frac{1}{n} \sum_{i=1}^{n} X_i \xrightarrow{P} \mu \quad \text{as } n \to \infty.$$

Proof. Let $S_n := \sum_{i=1}^{n} X_i$. We compute the characteristic function $\phi_{S_n/n}(t)$ using the i.i.d. assumption:
$$\begin{aligned}
\phi_{S_n/n}(t) &= \mathbb{E}\big[e^{it S_n/n}\big] \\
&= \mathbb{E}\Big[\prod_{j=1}^{n} e^{it X_j/n}\Big] \\
&= \prod_{j=1}^{n} \mathbb{E}\big[e^{it X_j/n}\big] \quad \text{(independence)} \\
&= \prod_{j=1}^{n} \phi_{X_1}\Big(\frac{t}{n}\Big) \quad \text{(identically distributed)} \\
&= \phi_{X_1}\Big(\frac{t}{n}\Big)^n \\
&= \Big(1 + \frac{it\mu}{n} + o\Big(\frac{1}{n}\Big)\Big)^n \\
&\to e^{it\mu} \quad \text{as } n \to \infty.
\end{aligned}$$
$\phi(t) = e^{it\mu}$ is the characteristic function of the constant random variable $X = \mu$. Using Lévy's continuity theorem we deduce that $\frac{1}{n} S_n \xrightarrow{D} \mu$ as $n \to \infty$ which, by the previous lemma, gives that $\frac{1}{n} S_n \xrightarrow{P} \mu$ as $n \to \infty$.
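A small simulation consistent with the WLLN (the exponential distribution is an illustrative assumption, not part of the theorem): the sample mean of i.i.d. draws concentrates around $\mu$ as $n$ grows.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative choice: X_i ~ Exponential(1), so mu = E[X_1] = 1.
for n in [10, 100, 10_000, 1_000_000]:
    sample_mean = rng.exponential(scale=1.0, size=n).mean()
    print(n, sample_mean)  # sample means concentrate near mu = 1 as n grows
```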

Theorem 12.6 (CLT) Suppose $(X_n)_{n=1}^{\infty}$ is a sequence of independent and identically distributed (i.i.d.) random variables on $(\Omega, \mathcal{F}, \mathbb{P})$ with finite mean $\mu$ and finite variance $\sigma^2$. Let $S_n := \sum_{i=1}^{n} X_i$; then
$$\frac{S_n - n\mu}{\sigma \sqrt{n}} \xrightarrow{D} Z,$$
where $Z \sim N(0, 1)$ has the distribution of a standard normal random variable.


Proof. By replacing $X_i$ by $\sigma^{-1}(X_i - \mu)$ we may assume $\mu = 0$ and $\sigma = 1$. We compute the characteristic function $\phi_{S_n/\sqrt{n}}(t)$ to give
$$\begin{aligned}
\phi_{S_n/\sqrt{n}}(t) &= \mathbb{E}\Big[e^{\frac{it}{\sqrt{n}}(X_1 + X_2 + \cdots + X_n)}\Big] = \prod_{j=1}^{n} \mathbb{E}\Big[e^{\frac{it}{\sqrt{n}} X_j}\Big] \\
&= \phi_{X_1}\Big(\frac{t}{\sqrt{n}}\Big)^n \\
&= \Big(1 - \frac{t^2}{2n} + o\Big(\frac{1}{n}\Big)\Big)^n \\
&\to e^{-t^2/2} = \phi(t).
\end{aligned}$$
$\phi$ is the characteristic function of the standard normal distribution. Since $\phi$ is continuous we may use Lévy's continuity theorem to deduce that $\frac{S_n}{\sqrt{n}} \xrightarrow{D} Z$ as $n \to \infty$.
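A simulation consistent with the CLT (the uniform distribution and the sample sizes are illustrative assumptions): the standardized sums $(S_n - n\mu)/(\sigma\sqrt{n})$ behave approximately like a standard normal variable.

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative choice: X_i ~ Uniform(0, 1), so mu = 1/2 and sigma^2 = 1/12.
mu, sigma = 0.5, np.sqrt(1.0 / 12.0)
n, trials = 500, 20_000

x = rng.uniform(0.0, 1.0, size=(trials, n))
z = (x.sum(axis=1) - n * mu) / (sigma * np.sqrt(n))  # (S_n - n*mu) / (sigma*sqrt(n))

print(z.mean(), z.std())   # close to 0 and 1
print(np.mean(z <= 1.96))  # close to the standard normal value Phi(1.96) ~ 0.975
```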


13 Further consequences of independence

We recall that for a sequence of events $(A_n)_{n=1}^{\infty}$ we define
$$\limsup A_n = \bigcap_{n=1}^{\infty} \bigcup_{k=n}^{\infty} A_k = \{A_n \text{ infinitely often}\}.$$

Theorem 13.1 (Borel-Cantelli lemmas) Let $(A_n)_{n=1}^{\infty}$ be a sequence of events on a probability space $(\Omega, \mathcal{F}, \mathbb{P})$.

1. If $\sum_{n=1}^{\infty} \mathbb{P}(A_n) < \infty$ then $\mathbb{P}(\limsup A_n) = 0$.

2. If $\sum_{n=1}^{\infty} \mathbb{P}(A_n) = \infty$ and $(A_n)_{n=1}^{\infty}$ are independent then $\mathbb{P}(\limsup A_n) = 1$.

Proof. For 1, let $B_n = \bigcup_{k=n}^{\infty} A_k$; then $B_n \supseteq B_{n+1}$ for all $n$ and by the continuity properties of the probability measure we have
$$\mathbb{P}(B_n) \to \mathbb{P}\Big(\bigcap_{n=1}^{\infty} B_n\Big) = \mathbb{P}(\limsup A_n)$$
as $n \to \infty$. On the other hand,
$$\mathbb{P}(B_n) = \mathbb{P}\Big(\bigcup_{k=n}^{\infty} A_k\Big) \leq \sum_{k=n}^{\infty} \mathbb{P}(A_k) \to 0$$
as $n \to \infty$ because $\sum_{n=1}^{\infty} \mathbb{P}(A_n) < \infty$. It follows that $\mathbb{P}(\limsup A_n) = 0$.

For part 2 it suffices to show $\mathbb{P}\big((\limsup A_n)^c\big) = \mathbb{P}\big(\bigcup_{n=1}^{\infty} \bigcap_{k=n}^{\infty} A_k^c\big) = 0$. Let $E_n := \bigcap_{k=n}^{\infty} A_k^c$ and notice by continuity that
$$\begin{aligned}
\mathbb{P}(E_n) &= \lim_{N \to \infty} \mathbb{P}\Big(\bigcap_{k=n}^{N} A_k^c\Big) \\
&= \lim_{N \to \infty} \prod_{k=n}^{N} \mathbb{P}(A_k^c) \quad \text{(independence)} \\
&= \lim_{N \to \infty} \prod_{k=n}^{N} \big(1 - \mathbb{P}(A_k)\big) \\
&\leq \lim_{N \to \infty} \prod_{k=n}^{N} \exp\big(-\mathbb{P}(A_k)\big) \quad \text{(using } 1 - x \leq e^{-x} \text{ for } x \in [0, 1]\text{)} \\
&= \lim_{N \to \infty} \exp\Big(-\sum_{k=n}^{N} \mathbb{P}(A_k)\Big) = 0
\end{aligned}$$
using $\sum_{n=1}^{\infty} \mathbb{P}(A_n) = \infty$. It follows that $\mathbb{P}\big(\bigcup_{n=1}^{\infty} E_n\big) = \mathbb{P}\big((\limsup A_n)^c\big) = 0$ and the result is proved.
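A small simulation in the spirit of the two lemmas (the specific probabilities $\mathbb{P}(A_n) = 1/n$ and $\mathbb{P}(A_n) = 1/n^2$ are illustrative assumptions): with independent events, a divergent sum of probabilities produces occurrences arbitrarily late in the sequence, while a convergent sum produces only finitely many occurrences with probability one.

```python
import numpy as np

rng = np.random.default_rng(3)
N = 100_000
n = np.arange(1, N + 1)

# Independent events A_n with P(A_n) = 1/n (divergent sum) and P(A_n) = 1/n^2 (convergent sum).
occurs_divergent = rng.random(N) < 1.0 / n
occurs_convergent = rng.random(N) < 1.0 / n ** 2

# Index of the last occurrence seen among the first N events.
print(np.max(np.nonzero(occurs_divergent)[0]) + 1)   # tends to grow if N is increased (occurs i.o.)
print(np.max(np.nonzero(occurs_convergent)[0]) + 1)  # typically a small index (finitely many occurrences)
```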

We illustrate the use of the Borel-Cantelli lemmas with two examples.


Example 13.2 Suppose $X_n \xrightarrow{P} X$ as $n \to \infty$; then there exists a subsequence $(X_{n_k})_{k=1}^{\infty}$ such that $X_{n_k} \xrightarrow{\text{a.s.}} X$ as $k \to \infty$. To see this, for each $k = 1, 2, 3, \ldots$ we choose $n_k \in \mathbb{N}$ (with $n_1 < n_2 < \cdots$) such that for all $n \geq n_k$
$$\mathbb{P}\Big(|X_n - X| \geq \frac{1}{k}\Big) \leq \frac{1}{k^2}.$$
This is possible since $X_n \xrightarrow{P} X$. We then have
$$\sum_{k=1}^{\infty} \mathbb{P}\Big(|X_{n_k} - X| \geq \frac{1}{k}\Big) \leq \sum_{k=1}^{\infty} \frac{1}{k^2} < \infty.$$
By the Borel-Cantelli lemmas this implies $\mathbb{P}\big(|X_{n_k} - X| \geq \frac{1}{k} \text{ infinitely often}\big) = 0$, which just says that $X_{n_k} \xrightarrow{\text{a.s.}} X$ as $k \to \infty$.

Example 13.3 As another example, we record that the strong law of large numbers (SLLN) states that if $(X_n)_{n=1}^{\infty}$ are i.i.d. and if $\mathbb{E}[|X_1|] < \infty$ then
$$\frac{1}{n} \sum_{i=1}^{n} X_i \xrightarrow{\text{a.s.}} \mu = \mathbb{E}[X_1] \quad \text{as } n \to \infty.$$
(Note: above we have proved the weak law, which shows convergence in probability.) We cannot relax the assumption $\mathbb{E}[|X_1|] < \infty$. To see this, let $S_n := \sum_{i=1}^{n} X_i$ and assume that $\frac{S_n}{n}$ converges a.s. as $n \to \infty$; then
$$\frac{X_n}{n} = \frac{S_n}{n} - \frac{n-1}{n} \cdot \frac{S_{n-1}}{n-1} \xrightarrow{\text{a.s.}} 0.$$
This means that if $A_n := \{|X_n| \geq n\}$ then $\mathbb{P}(A_n \text{ infinitely often}) = 0$. However, since $(X_n)_{n=1}^{\infty}$ are independent it follows that the events $(A_n)_{n=1}^{\infty}$ are independent and from the second part of the Borel-Cantelli lemmas we must have
$$\sum_{n=1}^{\infty} \mathbb{P}(A_n) < \infty$$
(otherwise $\mathbb{P}(A_n \text{ infinitely often}) = 1$, which we know to be false).


Finally, we use the fact that $(X_n)_{n=1}^{\infty}$ are identically distributed to give
$$\begin{aligned}
\mathbb{E}[|X_1|] &\leq \sum_{k=0}^{\infty} (k+1)\, \mathbb{P}(k \leq |X_1| < k+1) \\
&\leq 1 + \sum_{k=1}^{\infty} k\, \mathbb{P}(k \leq |X_1| < k+1) \\
&= 1 + \sum_{k=1}^{\infty} \sum_{j=1}^{k} \mathbb{P}(k \leq |X_j| < k+1) \\
&= 1 + \sum_{j=1}^{\infty} \sum_{k=j}^{\infty} \mathbb{P}(k \leq |X_j| < k+1) \\
&= 1 + \sum_{j=1}^{\infty} \mathbb{P}(|X_j| \geq j) \\
&= 1 + \sum_{j=1}^{\infty} \mathbb{P}(A_j) < \infty.
\end{aligned}$$
Hence a.s. convergence of $\frac{S_n}{n}$ forces $\mathbb{E}[|X_1|] < \infty$, so the finite-mean assumption in the SLLN cannot be dropped.
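A simulation contrasting the two regimes (the standard Cauchy distribution is an illustrative assumption, chosen because $\mathbb{E}[|X_1|] = \infty$): the running means $S_n/n$ of i.i.d. Cauchy variables do not settle down, in line with the argument above.

```python
import numpy as np

rng = np.random.default_rng(4)

# Illustrative choice: standard Cauchy samples, for which E|X_1| = infinity.
x = rng.standard_cauchy(1_000_000)
running_mean = np.cumsum(x) / np.arange(1, x.size + 1)  # S_n / n for n = 1, ..., 10^6

for n in [100, 10_000, 1_000_000]:
    print(n, running_mean[n - 1])  # the values keep jumping around rather than converging
```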
