Gene Golub Summer School
TRANSCRIPT
Theoretical Tutorial Session 1
Xiaoming Song
Department of Mathematics, Drexel University
July 25, 2016
Basic properties of Brownian motion

1. Self-similarity: For any a > 0, the process a^{-1/2} B_{at}, t ≥ 0, is also a Brownian motion.

Proof: Let Y_t = a^{-1/2} B_{at}. It is obvious that Y_t, t ≥ 0, is a Gaussian process with zero mean and continuous trajectories. It therefore suffices to show that for any 0 ≤ s ≤ t < +∞ the covariance function satisfies E(Y_t Y_s) = s, which is proved as follows:

E(Y_t Y_s) = E(a^{-1/2} B_{at} · a^{-1/2} B_{as}) = a^{-1} E(B_{at} B_{as}) = a^{-1}(as) = s.
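The scaling property is easy to check numerically. Below is a minimal Python sketch (assuming NumPy is available; the grid sizes and tolerance are ad hoc choices, not from the lecture) that simulates Brownian paths and verifies that Y_t = a^{-1/2} B_{at} has variance t.

```python
import numpy as np

rng = np.random.default_rng(0)

def brownian_paths(n_paths, n_steps, T, rng):
    """Simulate standard Brownian paths on [0, T] from iid Gaussian increments."""
    dt = T / n_steps
    dB = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
    return np.hstack([np.zeros((n_paths, 1)), np.cumsum(dB, axis=1)])

a, t = 4.0, 1.0
n_paths, n_steps = 50_000, 200
B = brownian_paths(n_paths, n_steps, T=a * t, rng=rng)  # grid reaches time a*t
Y_t = a ** -0.5 * B[:, -1]       # a^{-1/2} B_{at}, sampled at the last grid point
var_hat = Y_t.var()              # should be close to Var(Y_t) = t = 1
```

The sample variance of Y_t over many paths should sit within Monte Carlo error of t.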
2. For any h > 0, the process B_{t+h} − B_h, t ≥ 0, is a Brownian motion.

Proof: Let Y_t = B_{t+h} − B_h. We know that Y_t, t ≥ 0, is a Gaussian process with zero mean and continuous trajectories. Next, we calculate its covariance function: for any 0 ≤ s ≤ t < +∞,

E(Y_t Y_s) = E((B_{t+h} − B_h)(B_{s+h} − B_h))
           = E(B_{t+h} B_{s+h}) − E(B_h B_{s+h}) − E(B_{t+h} B_h) + E(B_h B_h)
           = (s + h) − h − h + h = s.

Therefore, the process B_{t+h} − B_h, t ≥ 0, is a Brownian motion.
3. The process −B_t, t ≥ 0, is a Brownian motion.

Proof: This process is a Gaussian process with zero mean and continuous trajectories, and for any 0 ≤ s ≤ t < +∞,

E((−B_t)(−B_s)) = E(B_t B_s) = s.

Thus, the process −B_t, t ≥ 0, is a Brownian motion.
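Properties 2 and 3 can be checked the same way as the scaling property. A sketch for the shifted process (NumPy assumed; grid sizes are ad hoc), verifying that Y_t = B_{t+h} − B_h has covariance E(Y_s Y_t) = min(s, t):

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate Brownian paths on [0, T] on a uniform grid.
n_paths, n_steps, T = 50_000, 120, 3.0
dt = T / n_steps
dB = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
B = np.hstack([np.zeros((n_paths, 1)), np.cumsum(dB, axis=1)])

h, s, t = 1.0, 0.5, 2.0
idx = lambda u: int(round(u / dt))        # grid index of time u
Y_s = B[:, idx(s + h)] - B[:, idx(h)]     # B_{s+h} - B_h
Y_t = B[:, idx(t + h)] - B[:, idx(h)]     # B_{t+h} - B_h
cov = np.mean(Y_s * Y_t)                  # empirical E(Y_s Y_t); should be near min(s, t)
```

The empirical covariance should be close to min(s, t) = 0.5 here.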
Chebyshev’s inequality
If ξ is a random variable with E(|ξ|^p) < ∞, then for any λ > 0,

P(|ξ| > λ) ≤ E(|ξ|^p) / λ^p.
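As a quick numerical illustration (a sketch; the choice of a standard normal ξ and p = 2 is arbitrary, not from the lecture), note that the bound even holds exactly for empirical averages, since 1{|x| > λ} ≤ |x|^p / λ^p pointwise:

```python
import numpy as np

rng = np.random.default_rng(2)

xi = rng.normal(0.0, 1.0, size=500_000)
p, lam = 2, 2.0
lhs = np.mean(np.abs(xi) > lam)            # empirical P(|xi| > lam)
rhs = np.mean(np.abs(xi) ** p) / lam ** p  # empirical E(|xi|^p) / lam^p
```

For a standard normal, lhs is roughly 0.046 while the Chebyshev bound rhs is roughly 0.25; the inequality is far from tight here, as is typical.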
Borel-Cantelli lemma
Lemma. Let A_1, A_2, A_3, . . . be a sequence of events in a probability space. If ∑_{n=1}^∞ P(A_n) < ∞, then the probability that infinitely many of them occur is 0, that is,

P(lim sup_{n→∞} A_n) = 0.

Note that

lim sup_{n→∞} A_n = ⋂_{n=1}^∞ ⋃_{k=n}^∞ A_k.
Strong law of large numbers
Theorem. If ξ_1, ξ_2, ξ_3, . . . is a sequence of iid random variables and μ = E(ξ_1) < ∞, then

(1/n) ∑_{i=1}^n ξ_i = (ξ_1 + ξ_2 + · · · + ξ_n)/n → μ

almost surely as n → ∞.
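The strong law is easy to see in a simulation. A minimal sketch (NumPy assumed; the Exp(1) distribution, with μ = 1, is an arbitrary illustrative choice):

```python
import numpy as np

rng = np.random.default_rng(3)

# Running sample means of iid Exp(1) variables drifting toward mu = 1.
xi = rng.exponential(1.0, size=1_000_000)
running_mean = np.cumsum(xi) / np.arange(1, xi.size + 1)
errors = {n: abs(running_mean[n - 1] - 1.0) for n in (100, 10_000, 1_000_000)}
```

The error |mean − μ| at n = 10^6 is typically on the order of 10^{-3}, far smaller than at n = 100.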
4. Almost surely lim_{t→∞} B_t/t = 0, and the process

X_t = t B_{1/t} for t > 0, and X_0 = 0,

is a Brownian motion.

Proof: We first show that lim_{n→∞} B_n/n = 0, which follows from the strong law of large numbers since

B_n/n = (1/n) ∑_{i=1}^n (B_i − B_{i−1})

and B_1 − B_0, B_2 − B_1, . . . , B_n − B_{n−1}, . . . are iid with law N(0, 1).
Then for any t > 0 with t ∈ [n, n + 1) for some n, we have

|B_t/t − B_n/n| = |B_t/t − B_n/t + B_n/t − B_n/n|
               ≤ sup_{s∈[n,n+1]} |B_s − B_n| / t + |B_n| (1/n − 1/t)
               ≤ sup_{s∈[n,n+1]} |B_s − B_n| / n + |B_n|/n.   (1)

Note that lim_{n→∞} B_n/n = 0 implies

lim_{n→∞} |B_n|/n = 0.   (2)
Since the random variables

sup_{s∈[n−1,n]} |B_s − B_n|, n ≥ 1,

are iid with the same law as the integrable random variable sup_{s∈[0,1]} |B_s|, using the strong law of large numbers again we can show that

(1/n) ∑_{i=1}^n sup_{s∈[i−1,i]} |B_s − B_i| → E(sup_{s∈[0,1]} |B_s|) < ∞

almost surely, as n → ∞.
Then the above results yield

sup_{s∈[n,n+1]} |B_s − B_n| / n
= (1/n) ∑_{i=1}^n sup_{s∈[i−1,i]} |B_s − B_i| − (1/n) ∑_{i=1}^{n−1} sup_{s∈[i−1,i]} |B_s − B_i|
= (1/n) ∑_{i=1}^n sup_{s∈[i−1,i]} |B_s − B_i| − ((1/(n−1)) ∑_{i=1}^{n−1} sup_{s∈[i−1,i]} |B_s − B_i|) · (n − 1)/n
→ E(sup_{s∈[0,1]} |B_s|) − E(sup_{s∈[0,1]} |B_s|) · 1 = 0,   (3)

as n → ∞.

The fact that |B_t|/t ≤ |B_t/t − B_n/n| + |B_n|/n, together with (1), (2) and (3), implies

lim_{t→∞} B_t/t = 0.
For the process X, using lim_{t→∞} B_t/t = 0 almost surely we can prove that the trajectories of X are continuous almost surely (this limit is exactly continuity at t = 0 after the time inversion t ↦ 1/t).

The process X is a Gaussian process with zero mean. For any 0 ≤ s ≤ t < ∞, we have

E(X_t X_s) = E((t B_{1/t})(s B_{1/s})) = t s E(B_{1/t} B_{1/s}) = t s (1/t) = s.

Therefore, X is also a Brownian motion.
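The first step of the proof, B_n/n → 0, can be illustrated directly: B_n is a sum of n iid N(0, 1) increments, so B_n/n is a sample mean. A sketch (NumPy assumed; sizes are ad hoc):

```python
import numpy as np

rng = np.random.default_rng(4)

# B_1, B_2, ..., B_n evaluated at integer times via cumulative sums of
# iid N(0, 1) increments; |B_n| / n should shrink as n grows (SLLN).
increments = rng.normal(0.0, 1.0, size=1_000_000)
B_int = np.cumsum(increments)
ratios = {n: abs(B_int[n - 1]) / n for n in (100, 10_000, 1_000_000)}
```

At n = 10^6 the ratio is typically of order 10^{-3}, consistent with B_n/n → 0.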
Fatou’s lemma
Lemma. Let ξ_1, ξ_2, ξ_3, . . . be a sequence of non-negative random variables on a probability space (Ω, F, P). Then

E(lim inf_{n→∞} ξ_n) ≤ lim inf_{n→∞} E(ξ_n).

If, moreover, there exists a non-negative integrable random variable η such that ξ_n ≤ η for all n, then

lim sup_{n→∞} E(ξ_n) ≤ E(lim sup_{n→∞} ξ_n).
Hewitt-Savage 0-1 law
Theorem. Let ξ_1, ξ_2, ξ_3, . . . be a sequence of independent and identically-distributed (iid) random variables taking values in a set E. Then any event whose occurrence or non-occurrence is determined by the values of these random variables, and whose occurrence or non-occurrence is unchanged by finite permutations of the indices, has probability either 0 or 1.
5. lim sup_{t→∞} B_t/√t = ∞ and lim inf_{t→∞} B_t/√t = −∞.

Proof: It suffices to show that lim sup_{n→∞} B_n/√n = ∞ and lim inf_{n→∞} B_n/√n = −∞. For any positive constant C, we have

P(lim sup_{n→∞} {B_n/√n > C}) ≥ lim sup_{n→∞} P(B_n/√n > C)   (Fatou's lemma)
                              = lim sup_{n→∞} P(B_1 > C)      (scaling property)
                              > 0.
Let ξ_n = B_n − B_{n−1}, n ≥ 1. Then ξ_1, ξ_2, ξ_3, . . . is a sequence of iid random variables. Note that the event

lim sup_{n→∞} {B_n/√n > C} = lim sup_{n→∞} {(∑_{k=1}^n ξ_k)/√n > C}

is unchanged by finite permutations of the indices. Hence, by the Hewitt–Savage 0–1 law together with the last inequality on the previous page, we have

P(lim sup_{n→∞} {B_n/√n > C}) = 1,

that is, B_n/√n > C infinitely often, almost surely, and hence lim sup_{n→∞} B_n/√n ≥ C almost surely. Since C is arbitrary, we conclude that lim sup_{n→∞} B_n/√n = ∞.

Since −B_t, t ≥ 0, is also a Brownian motion, the same argument gives

lim inf_{n→∞} B_n/√n = − lim sup_{n→∞} (−B_n)/√n = −∞.
Remark: From Exercise 5, we obtain the following results:

P(lim sup_{t→∞} B_t = +∞) = 1,   (4)

and

P(lim inf_{t→∞} B_t = −∞) = 1.   (5)

Furthermore,

P(lim sup_{t→∞} B_t = +∞, lim inf_{t→∞} B_t = −∞) = 1.   (6)
6. Show that for each fixed t ≥ 0,

P(ω ∈ Ω : lim sup_{h→0+} (B_{t+h} − B_t)/h = ∞) = 1.

Proof: Given this Brownian motion B, we construct another Brownian motion X as in Exercise 4:

X_t = t B_{1/t} for t > 0, and X_0 = 0.

Then for any t ≥ 0, we obtain

lim sup_{n→∞} (B_{t+1/n} − B_t)/(1/n)
= lim sup_{n→∞} (B_{1/n} − B_0)/(1/n)   (Exercise 2, equality in law)
≥ lim sup_{n→∞} √n B_{1/n}   (since n B_{1/n} ≥ √n B_{1/n} whenever B_{1/n} ≥ 0)
= lim sup_{n→∞} X_n/√n = ∞   (Exercise 5).
7. Almost surely the paths of B are not differentiable at any point t ≥ 0. More precisely, the set

{ω ∈ Ω : for each t ∈ [0, ∞), either lim sup_{h→0+} (B_{t+h} − B_t)/h = ∞ or lim inf_{h→0+} (B_{t+h} − B_t)/h = −∞}

contains an event A ∈ F with P(A) = 1.

Proof: Denote

D^+ B_t = lim sup_{h→0+} (B_{t+h} − B_t)/h

and

D_+ B_t = lim inf_{h→0+} (B_{t+h} − B_t)/h.
It suffices to show the result on the interval [0, 1]. For any fixed integers j ≥ 1 and k ≥ 1, define

A_{jk} = ⋃_{t∈[0,1]} ⋂_{h∈[0,1/k]} {ω ∈ Ω : |B_{t+h}(ω) − B_t(ω)| ≤ jh}.

Note that A_{jk} need not be an event, that is, possibly A_{jk} ∉ F. Note also that

{ω ∈ Ω : −∞ < D_+ B_t(ω) ≤ D^+ B_t(ω) < ∞, for some t ∈ [0, 1]} = ⋃_{j=1}^∞ ⋃_{k=1}^∞ A_{jk}.

Then the proof of the result will be complete if we can find, for each j and k, an event C ∈ F with P(C) = 0 such that A_{jk} ⊆ C.
For any fixed j and k, consider a sample path ω ∈ A_{jk}; that is, there exists t ∈ [0, 1] with

|B_{t+h}(ω) − B_t(ω)| ≤ jh

for all h ∈ [0, 1/k].

Let n be an integer with n ≥ 4k. Then there exists some i, 1 ≤ i ≤ n, such that (i − 1)/n ≤ t < i/n. It is then easy to verify that, for l = 1, 2, 3,

(i + l)/n − t ≤ (l + 1)/n ≤ 1/k.

Thus,

|B_{(i+1)/n}(ω) − B_{i/n}(ω)| ≤ |B_{(i+1)/n}(ω) − B_t(ω)| + |B_{i/n}(ω) − B_t(ω)| ≤ 2j/n + j/n = 3j/n,
|B_{(i+2)/n}(ω) − B_{(i+1)/n}(ω)| ≤ |B_{(i+2)/n}(ω) − B_t(ω)| + |B_{(i+1)/n}(ω) − B_t(ω)| ≤ 3j/n + 2j/n = 5j/n,

and

|B_{(i+3)/n}(ω) − B_{(i+2)/n}(ω)| ≤ |B_{(i+3)/n}(ω) − B_t(ω)| + |B_{(i+2)/n}(ω) − B_t(ω)| ≤ 4j/n + 3j/n = 7j/n.
Now denote

C_i^{(n)} = ⋂_{l=1}^3 {ω ∈ Ω : |B_{(i+l)/n}(ω) − B_{(i+l−1)/n}(ω)| ≤ (2l + 1)j/n}.

Then C_i^{(n)} is an event in F, and A_{jk} ⊆ ⋃_{i=1}^n C_i^{(n)} for all n ≥ 4k.
Note that the random variables

√n (B_{(i+l)/n} − B_{(i+l−1)/n}), l = 1, 2, 3,

are iid with law N(0, 1). Note also that if a random variable ξ has law N(0, 1), then

P(|ξ| ≤ x) ≤ x, for any x ≥ 0.

Therefore, we have

P(C_i^{(n)}) ≤ (3j/√n)(5j/√n)(7j/√n) = 105 j^3 / n^{3/2}, i = 1, 2, . . . , n.

Then

P(⋃_{i=1}^n C_i^{(n)}) ≤ 105 j^3 / √n.
Take

C = ⋂_{n=4k}^∞ ⋃_{i=1}^n C_i^{(n)} ∈ F.

Then A_{jk} ⊆ C, and it also holds that

P(C) = P(⋂_{n=4k}^∞ ⋃_{i=1}^n C_i^{(n)}) ≤ inf_{n≥4k} P(⋃_{i=1}^n C_i^{(n)}) = 0,

which completes the proof.
Properties of the conditional expectation
8. Linearity:
E(aX + bY | G) = a E(X | G) + b E(Y | G).

Proof: a E(X | G) + b E(Y | G) is G-measurable. For any A ∈ G,

∫_A E(aX + bY | G) dP = ∫_A (aX + bY) dP
                      = a ∫_A X dP + b ∫_A Y dP
                      = a ∫_A E(X | G) dP + b ∫_A E(Y | G) dP
                      = ∫_A (a E(X | G) + b E(Y | G)) dP,

which implies

E(aX + bY | G) = a E(X | G) + b E(Y | G).
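For a finite sample space these properties can be verified exactly. The following toy setup (my own illustration, not from the lecture; NumPy assumed) takes G to be generated by a finite partition, in which case E(X | G) replaces X on each cell by the cell average, and linearity holds exactly:

```python
import numpy as np

rng = np.random.default_rng(5)

def cond_exp(values, labels):
    """E(values | sigma(labels)): the cell average, repeated on each cell."""
    out = np.empty(values.shape, dtype=float)
    for c in np.unique(labels):
        mask = labels == c
        out[mask] = values[mask].mean()
    return out

n = 10_000
labels = rng.integers(0, 4, size=n)      # a partition of the sample space into 4 cells
X = rng.normal(size=n)
Y = rng.normal(size=n)
a, b = 2.0, -3.0

lhs = cond_exp(a * X + b * Y, labels)    # E(aX + bY | G)
rhs = a * cond_exp(X, labels) + b * cond_exp(Y, labels)
tower = cond_exp(X, labels).mean()       # E(E(X|G)); should equal E(X) = X.mean()
```

Here lhs and rhs agree up to floating-point error, and averaging the cell means recovers the overall mean, which is Exercise 9 in this discrete setting.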
9. E(E(X | G)) = E(X).

Proof: In the definition of conditional expectation, take A = Ω; then

E(E(X | G)) = ∫_Ω E(X | G) dP = ∫_Ω X dP = E(X).
10. If X and G are independent, then E(X | G) = E(X).

Proof: It is obvious that E(X) is G-measurable. For any A ∈ G,

∫_A E(X | G) dP = ∫_A X dP = E(1_A X)
               = E(1_A) E(X)   (X and 1_A are independent)
               = P(A) E(X)
               = ∫_A E(X) dP.

Thus, E(X | G) = E(X).
11. If X is G-measurable, then E(X | G) = X.

Proof: The claim follows from the fact that X is G-measurable and the definition of conditional expectation: for any A ∈ G,

∫_A E(X | G) dP = ∫_A X dP.
Dominated convergence theorem
Theorem. Let ξ_1, ξ_2, ξ_3, . . . be a sequence of random variables. If ξ_n converges almost surely to a random variable ξ and there exists a positive, integrable random variable η such that |ξ_n| ≤ η for all n, then

lim_{n→∞} E(|ξ_n − ξ|) = 0,

which also implies

lim_{n→∞} E(ξ_n) = E(ξ).
12. If Y is bounded and G-measurable, then

E(XY | G) = Y E(X | G).

Proof: If Y = 1_B is an indicator function, where B ∈ G, then for any A ∈ G,

∫_A E(YX | G) dP = ∫_A 1_B X dP = ∫_{A∩B} X dP
                = ∫_{A∩B} E(X | G) dP
                = ∫_A 1_B E(X | G) dP,

which implies E(XY | G) = Y E(X | G).

By the linearity of conditional expectation, the result is also true if Y is a simple function (a linear combination of indicator functions).
If Y is a bounded G-measurable function, then there exists a sequence of simple functions Y_n such that |Y_n| ≤ |Y| and lim_{n→∞} Y_n = Y almost surely.

Then, by the dominated convergence theorem, for any A ∈ G,

∫_A E(YX | G) dP = ∫_A YX dP = lim_{n→∞} ∫_A Y_n X dP
                = lim_{n→∞} ∫_A Y_n E(X | G) dP
                = ∫_A Y E(X | G) dP,

which proves E(XY | G) = Y E(X | G).
13. Given two σ-fields B ⊂ G, then

E(E(X | B) | G) = E(E(X | G) | B) = E(X | B).

Proof: Since B ⊂ G, we know that E(X | B) is G-measurable. Then by Exercise 11, we obtain

E(E(X | B) | G) = E(X | B).

For any A ∈ B ⊂ G, we have

∫_A E(E(X | G) | B) dP = ∫_A E(X | G) dP = ∫_A X dP = ∫_A E(X | B) dP,

which implies

E(E(X | G) | B) = E(X | B).
14. Let X and Z be such that
(i) Z is G-measurable;
(ii) X is independent of G.

Suppose that E(|h(X, Z)|) < ∞. Then

E(h(X, Z) | G) = E(h(X, z))|_{z=Z}.

Proof: Denote g(z) = E(h(X, z)) and μ_X(E) = P(X ∈ E) for E ∈ B(R). For any A ∈ G, we set

Y = 1_A and μ_{Y,Z}(F) = P((Y, Z) ∈ F), F ∈ B(R^2).
Then,

∫_A E(h(X, Z) | G) dP = ∫_A h(X, Z) dP
                      = E(Y h(X, Z))
                      = ∫_{R^3} y h(x, z) μ_X(dx) μ_{Y,Z}(dy, dz)
                      = ∫_{R^2} y g(z) μ_{Y,Z}(dy, dz)
                      = E(Y g(Z))
                      = E(1_A g(Z))
                      = ∫_A g(Z) dP
                      = ∫_A E(h(X, z))|_{z=Z} dP.
Properties of stopping times
15. S ∨ T and S ∧ T are stopping times.

Proof: For all t ≥ 0, we know that {S ≤ t} ∈ F_t and {T ≤ t} ∈ F_t. Hence,

{S ∨ T ≤ t} = {S ≤ t} ∩ {T ≤ t} ∈ F_t,

and

{S ∧ T ≤ t} = {S ≤ t} ∪ {T ≤ t} ∈ F_t.
16. Given a stopping time T,

F_T = {A : A ∩ {T ≤ t} ∈ F_t, for all t ≥ 0}

is a σ-field.

Proof: It is obvious that Ω is in F_T and that F_T is closed under countable unions. We only need to show that F_T is closed under complement. If A ∈ F_T, then A ∩ {T ≤ t} ∈ F_t, and hence

A^c ∩ {T ≤ t} = (A ∪ {T > t})^c = ((A ∩ {T ≤ t}) ∪ {T > t})^c ∈ F_t.
17. S ≤ T ⇒ F_S ⊂ F_T.

Proof: If A ∈ F_S, then for any t ≥ 0 we have A ∩ {S ≤ t} ∈ F_t. Since S ≤ T, we also have {T ≤ t} ⊂ {S ≤ t}. Thus

A ∩ {T ≤ t} = A ∩ {S ≤ t} ∩ {T ≤ t} ∈ F_t,

which implies A ∈ F_T.
18. Let X_t, t ≥ 0, be a continuous and adapted process. The hitting time of a set A ⊂ R is defined by

T_A = inf{t ≥ 0 : X_t ∈ A}.

Then, if A is open or closed, T_A is a stopping time.

Proof: If A is an open set, then for any t > 0 we have

{T_A < t} = ⋃_{r∈Q+, r<t} {X_r ∈ A} ∈ F_t.

By Exercise 1 in Professor Nualart's lecture notes, it follows that T_A is a stopping time.

If A is a closed set, then for any t ≥ 0 we have

{T_A ≥ t} = ⋂_{r∈Q+, r<t} {X_r ∉ A} ∈ F_t.

Hence {T_A < t} ∈ F_t, and again by Exercise 1 in Professor Nualart's lecture notes, T_A is a stopping time.
19. Let X_t be an adapted stochastic process with right-continuous paths and let T < ∞ be a stopping time. Then the random variable

X_T(ω) = X_{T(ω)}(ω)

is F_T-measurable.

Proof: For any n ∈ N, define

T_n = ∑_{i=0}^∞ ((i + 1)/2^n) 1_{{i/2^n < T ≤ (i+1)/2^n}}.

For any t ≥ 0,

{T_n ≤ t} = ⋃_{i : (i+1)/2^n ≤ t} {T ≤ (i + 1)/2^n} ∈ F_t.

Then T_n is a stopping time for each n, and moreover, T_n ↓ T.

Then it suffices to show that for any open set A ⊂ R,

{X_{T_n} ∈ A} ∩ {T_n ≤ t} ∈ F_t, t ≥ 0.

In fact,

{X_{T_n} ∈ A} ∩ {T_n ≤ t} = ⋃_{k : k/2^n ≤ t} ({X_{k/2^n} ∈ A} ∩ {(k − 1)/2^n < T ≤ k/2^n}) ∈ F_t.

Since T_n ↓ T and the process X is right-continuous, we can let n go to ∞ and obtain

{X_T ∈ A} ∩ {T ≤ t} = lim_{n→∞} ({X_{T_n} ∈ A} ∩ {T_n ≤ t}) ∈ F_t.
Application to Brownian hitting times
Let B_t be a Brownian motion. Fix a ∈ R and consider the hitting time

τ_a = inf{t ≥ 0 : B_t = a}.

Proposition. If a < 0 < b, then

P(τ_a < τ_b) = b/(b − a).

Proof: From (4) and (5), we get that τ_a < ∞ and τ_b < ∞ almost surely.

Exercise 18 implies that τ_a and τ_b are stopping times, and hence Exercise 15 implies that τ_a ∧ τ_b is also a stopping time.
For any t ≥ 0, τ_a ∧ τ_b ∧ t is a bounded stopping time. Then by the optional stopping theorem, we have

E(B_{τ_a∧τ_b∧t}) = E(B_0) = 0.

Since

a ≤ B_{τ_a∧τ_b∧t} ≤ b, for all t ≥ 0,

and

lim_{t→∞} B_{τ_a∧τ_b∧t} = B_{τ_a∧τ_b}

almost surely, by the dominated convergence theorem we have

E(B_{τ_a∧τ_b}) = E(lim_{t→∞} B_{τ_a∧τ_b∧t}) = lim_{t→∞} E(B_{τ_a∧τ_b∧t}) = 0.

The random variable B_{τ_a∧τ_b} takes only the two values a and b, and its distribution is given by

B_{τ_a∧τ_b} = a with probability P(τ_a < τ_b), and b with probability P(τ_a > τ_b).
Hence,

E(B_{τ_a∧τ_b}) = a P(τ_a < τ_b) + b P(τ_a > τ_b)
              = a P(τ_a < τ_b) + b (1 − P(τ_a < τ_b))
              = 0.

Solving this equation for P(τ_a < τ_b), we obtain

P(τ_a < τ_b) = b/(b − a).
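This hitting probability is easy to check by Monte Carlo. A sketch (NumPy assumed; note that discretizing the path with step dt means barrier crossings between grid points are missed, so the estimate carries a small bias):

```python
import numpy as np

rng = np.random.default_rng(6)

a, b = -1.0, 2.0
dt = 0.01
n_paths, max_steps = 40_000, 50_000

pos = np.zeros(n_paths)                  # current position of each path
a_first = np.zeros(n_paths, dtype=bool)  # True once a path exits at a
alive = np.ones(n_paths, dtype=bool)     # paths still inside (a, b)
for _ in range(max_steps):
    if not alive.any():
        break                            # every path has left (a, b)
    pos[alive] += rng.normal(0.0, np.sqrt(dt), alive.sum())
    exit_a = alive & (pos <= a)
    exit_b = alive & (pos >= b)
    a_first |= exit_a
    alive &= ~(exit_a | exit_b)

p_hat = a_first.mean()   # estimate of P(tau_a < tau_b); theory: b/(b-a) = 2/3
```

With a = −1 and b = 2 the estimate should land close to 2/3, in line with the proposition.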
Proposition. Let T = inf{t ≥ 0 : B_t ∉ (a, b)}, where a < 0 < b. Then

E(T) = −ab.

Proof: In fact T = τ_a ∧ τ_b < ∞ almost surely.

Using that B_t^2 − t is a martingale, we get, by the optional stopping theorem,

E(B_{T∧t}^2 − (T ∧ t)) = 0,

that is,

E(B_{T∧t}^2) = E(T ∧ t).   (7)
Since T ∧ t ↑ T as t ↑ ∞, by the monotone convergence theorem we get

E(T) = E(lim_{t→∞} (T ∧ t)) = lim_{t→∞} E(T ∧ t).   (8)

Since B_{T∧t}^2 is bounded for all t ≥ 0 and B_{T∧t}^2 → B_T^2 almost surely as t → ∞, using the dominated convergence theorem together with (7) and (8), we have

E(B_T^2) = E(lim_{t→∞} B_{T∧t}^2) = lim_{t→∞} E(B_{T∧t}^2) = E(T).

From the previous Proposition, we get

E(B_T^2) = a^2 · b/(b − a) + b^2 · (1 − b/(b − a)) = −ab.

Therefore, E(T) = −ab.
Additional exercises
1. Let Z be a Gaussian random variable with law N(0, σ^2). From the expression

E(e^{λZ}) = e^{λ^2 σ^2 / 2},

deduce the following formulas for the moments of Z:

E(Z^{2k}) = ((2k)! / (2^k k!)) σ^{2k}, k = 1, 2, . . . ,

E(Z^{2k−1}) = 0, k = 1, 2, . . . .
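The moment formulas can be sanity-checked by simulation. A sketch (NumPy assumed; σ = 1.5 and the sample size are arbitrary choices):

```python
import numpy as np
from math import factorial

rng = np.random.default_rng(8)

# Compare sample moments of Z ~ N(0, sigma^2) with the closed form
# E(Z^{2k}) = (2k)! / (2^k k!) * sigma^{2k}; odd moments should be near 0.
sigma = 1.5
Z = rng.normal(0.0, sigma, size=2_000_000)
results = {}
for k in (1, 2, 3):
    theory = factorial(2 * k) / (2 ** k * factorial(k)) * sigma ** (2 * k)
    sample = float(np.mean(Z ** (2 * k)))
    results[k] = (sample, theory)
odd_moment = float(np.mean(Z ** 3))   # should be near 0
```

For k = 1, 2, 3 the sample moments should agree with the formula to within a fraction of a percent at this sample size.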
2. Let B_t, t ∈ [0, 1], be a standard Brownian motion. Show that

ξ = ∫_0^1 B_t dt

is a well-defined random variable. Calculate its first and second moments.
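As a numerical companion to exercise 2 (a sketch with an ad hoc grid; NumPy assumed), the integral can be approximated by a Riemann sum over a discretized path, and the two moments compared with what the exercise asks for: E(ξ) = 0 and, by Fubini, E(ξ^2) = ∫_0^1 ∫_0^1 min(s, t) ds dt = 1/3.

```python
import numpy as np

rng = np.random.default_rng(9)

# Discretize B on [0, 1] and approximate xi = int_0^1 B_t dt by a
# right-endpoint Riemann sum on each simulated path.
n_paths, n_steps = 20_000, 250
dt = 1.0 / n_steps
B = np.cumsum(rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps)), axis=1)
xi = B.sum(axis=1) * dt

m1 = float(xi.mean())             # should be near E(xi) = 0
m2 = float(np.mean(xi ** 2))      # should be near E(xi^2) = 1/3
```

Both estimates should match the exact values up to Monte Carlo and discretization error.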