Lecture 1: Brownian motion, martingales and Markov processes
David Nualart
Department of Mathematics, Kansas University
Gene Golub SIAM Summer School 2016, Drexel University
David Nualart (Kansas University) July 2016 1 / 54
Outline
1. Stochastic processes. Brownian motion. Markov processes.
2. Stopping times. Martingales.
3. Stochastic integrals.
4. Itô's formula and applications.
5. Stochastic differential equations.
6. Introduction to Malliavin calculus.
Multivariate normal distribution
A random vector $X = (X_1, \dots, X_n)$ has the multivariate normal distribution $N(\mu, \Sigma)$ if its characteristic function is
$$E\left(e^{i\langle u, X\rangle}\right) = \exp\left(i\langle u, \mu\rangle - \tfrac{1}{2} u^T \Sigma u\right), \quad u \in \mathbb{R}^n,$$
where $\mu \in \mathbb{R}^n$ and $\Sigma$ is an $n \times n$ symmetric and nonnegative definite matrix.
$\mu = (E(X_1), \dots, E(X_n))$
$\Sigma_{ij} = \operatorname{Cov}(X_i, X_j)$
If $X$ has the $N(\mu, \Sigma)$ distribution, then $Y = AX + b$, where $A$ is an $m \times n$ matrix and $b \in \mathbb{R}^m$, has the $N(A\mu + b, A\Sigma A^T)$ distribution.
If $\Sigma$ is nonsingular, then $X$ has a density given by
$$f(x) = (2\pi)^{-n/2} (\det \Sigma)^{-1/2} \exp\left(-\tfrac{1}{2}(x-\mu)^T \Sigma^{-1} (x-\mu)\right).$$
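The affine property above also gives the standard recipe for sampling from $N(\mu, \Sigma)$: if $Z \sim N(0, I_n)$ and $A$ is a Cholesky factor with $AA^T = \Sigma$, then $\mu + AZ$ has the $N(\mu, \Sigma)$ distribution. A minimal NumPy sketch (the particular $\mu$ and $\Sigma$ are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
mu = np.array([1.0, -2.0])
Sigma = np.array([[2.0, 0.6],
                  [0.6, 1.0]])  # symmetric, positive definite

# Y = mu + A Z with A A^T = Sigma has the N(mu, Sigma) distribution
A = np.linalg.cholesky(Sigma)
Z = rng.standard_normal((2, 100_000))
Y = mu[:, None] + A @ Z

# Empirical mean and covariance should approximate mu and Sigma
emp_mean = Y.mean(axis=1)
emp_cov = np.cov(Y)
```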
Stochastic processes
A stochastic process $X = \{X_t, t \ge 0\}$ is a family of random variables
$$X_t : \Omega \to \mathbb{R}$$
defined on a probability space $(\Omega, \mathcal{F}, P)$.
The probabilities on $\mathbb{R}^n$, $n \ge 1$,
$$P_{t_1, \dots, t_n} = P \circ (X_{t_1}, \dots, X_{t_n})^{-1},$$
where $0 \le t_1 < \cdots < t_n$, are called the finite-dimensional marginal distributions of the process $X$.
For every $\omega \in \Omega$, the mapping
$$t \mapsto X_t(\omega)$$
is called a trajectory of the process $X$.
Theorem (Kolmogorov's extension theorem)
Consider a family of probability measures
$$\{P_{t_1, \dots, t_n},\ 0 \le t_1 < \cdots < t_n,\ n \ge 1\}$$
such that:
(i) $P_{t_1, \dots, t_n}$ is a probability on $\mathbb{R}^n$.
(ii) (Consistency condition): If $\{t_{k_1} < \cdots < t_{k_m}\} \subset \{t_1 < \cdots < t_n\}$, then $P_{t_{k_1}, \dots, t_{k_m}}$ is the marginal of $P_{t_1, \dots, t_n}$ corresponding to the indexes $k_1, \dots, k_m$.
Then there exists a stochastic process $\{X_t, t \ge 0\}$, defined on some probability space $(\Omega, \mathcal{F}, P)$, which has the family $\{P_{t_1, \dots, t_n}\}$ as finite-dimensional marginal distributions.

Proof idea: take $\Omega$ as the set of all functions $\omega : [0, \infty) \to \mathbb{R}$, $\mathcal{F}$ the $\sigma$-algebra generated by cylindrical sets, extend the probability from cylindrical sets to $\mathcal{F}$, and set $X_t(\omega) = \omega(t)$.
Gaussian processes
$X = \{X_t, t \ge 0\}$ is called Gaussian if all its finite-dimensional marginal distributions are multivariate normal.
The law of a Gaussian process is determined by the mean function $E(X_t)$ and the covariance function
$$\operatorname{Cov}(X_t, X_s) = E\big((X_t - E(X_t))(X_s - E(X_s))\big).$$
Suppose $\mu : \mathbb{R}_+ \to \mathbb{R}$, and $\Gamma : \mathbb{R}_+ \times \mathbb{R}_+ \to \mathbb{R}$ is symmetric and nonnegative definite:
$$\sum_{i,j=1}^n \Gamma(t_i, t_j)\, a_i a_j \ge 0, \quad \forall\, t_i \ge 0,\ a_i \in \mathbb{R}.$$
Then there exists a Gaussian process with mean $\mu$ and covariance function $\Gamma$.
Equivalent processes
Two processes $X$, $Y$ are equivalent (or $X$ is a version of $Y$) if for all $t \ge 0$,
$$P\{X_t = Y_t\} = 1.$$
Two equivalent processes may have quite different trajectories. For example, the processes $X_t = 0$ for all $t \ge 0$ and
$$Y_t = \begin{cases} 0 & \text{if } \xi \ne t \\ 1 & \text{if } \xi = t \end{cases}$$
where $\xi \ge 0$ is a continuous random variable, are equivalent, because $P(\xi = t) = 0$, but their trajectories are different.
Two processes $X$ and $Y$ are said to be indistinguishable if
$$X_t(\omega) = Y_t(\omega)$$
for all $t \ge 0$ and for all $\omega \in \Omega^*$, with $P(\Omega^*) = 1$.
Exercise: Two equivalent processes with right-continuous trajectories are indistinguishable.
Regularity of trajectories
Theorem (Kolmogorov's continuity theorem)
Suppose that $X = \{X_t, t \in [0,T]\}$ satisfies
$$E(|X_t - X_s|^\beta) \le K |t-s|^{1+\alpha},$$
for all $s, t \in [0,T]$ and for some constants $\beta, \alpha > 0$. Then there exists a version $\widetilde X$ of $X$ such that, if $\gamma < \alpha/\beta$,
$$|\widetilde X_t - \widetilde X_s| \le G_\gamma |t-s|^\gamma$$
for all $s, t \in [0,T]$, where $G_\gamma$ is a random variable.
The trajectories of $\widetilde X$ are Hölder continuous of order $\gamma$ for any $\gamma < \alpha/\beta$.
Sketch of the proof:
(i) Suppose $T = 1$. Take $\gamma < \alpha/\beta$ and set $D_n = \{k 2^{-n}, 0 \le k \le 2^n\}$ and $D = \cup_{n \ge 1} D_n$. From Chebyshev's inequality,
$$P\Big(\max_{1 \le k \le 2^n} |X_{k2^{-n}} - X_{(k-1)2^{-n}}| \ge 2^{-\gamma n}\Big) \le \sum_{k=1}^{2^n} P\big(|X_{k2^{-n}} - X_{(k-1)2^{-n}}| \ge 2^{-\gamma n}\big) \le \sum_{k=1}^{2^n} 2^{\gamma\beta n} E\big[|X_{k2^{-n}} - X_{(k-1)2^{-n}}|^\beta\big] \le K 2^{-n(\alpha - \gamma\beta)}.$$
Because this series of probabilities is convergent, by the Borel-Cantelli lemma there is a set $\Omega^* \in \mathcal{F}$ with $P(\Omega^*) = 1$ such that for all $\omega \in \Omega^*$ there exists $N(\omega)$ with
$$|X_{k2^{-n}}(\omega) - X_{(k-1)2^{-n}}(\omega)| < 2^{-\gamma n}, \quad \forall n \ge N(\omega),\ \forall\, 1 \le k \le 2^n.$$
(ii) Suppose that $s, t \in D$ are such that
$$|s - t| \le 2^{-n}, \quad n \ge N.$$
Then there exist two increasing sequences $s_k \in D_k$ and $t_k \in D_k$, $k \ge n$, converging to $s$ and $t$ respectively, such that $|s_{k+1} - s_k| \le 2^{-(k+1)}$, $|t_{k+1} - t_k| \le 2^{-(k+1)}$ and $|s_n - t_n| \le 2^{-n}$. Then, from the decomposition
$$X_s - X_t = \sum_{i=n}^\infty (X_{s_{i+1}} - X_{s_i}) + (X_{s_n} - X_{t_n}) + \sum_{i=n}^\infty (X_{t_i} - X_{t_{i+1}})$$
we obtain
$$|X_t - X_s| \le \frac{2}{1 - 2^{-\gamma}}\, 2^{-\gamma n}.$$
This implies that the paths $t \mapsto X_t(\omega)$ are $\gamma$-Hölder on $D$ for all $\omega \in \Omega^*$, which allows us to conclude the proof. $\square$
Brownian motion
A stochastic process $B = \{B_t, t \ge 0\}$ is called a Brownian motion if:
i) $B_0 = 0$ almost surely.
ii) Independent increments: for all $0 \le t_1 < \cdots < t_n$, the increments $B_{t_n} - B_{t_{n-1}}, \dots, B_{t_2} - B_{t_1}$ are independent random variables.
iii) If $0 \le s < t$, the increment $B_t - B_s$ has the normal distribution $N(0, t-s)$.
iv) With probability one, $t \mapsto B_t(\omega)$ is continuous.
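Conditions i)–iii) translate directly into the usual grid simulation of a Brownian path: cumulative sums of independent $N(0, \Delta t)$ increments. A minimal NumPy sketch (grid size and seed are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
T, n = 1.0, 1000
dt = T / n

# One path: B_0 = 0 and the increments are independent N(0, dt)
increments = rng.normal(0.0, np.sqrt(dt), size=n)
B = np.concatenate([[0.0], np.cumsum(increments)])

# Over many independent paths the sample variance of B_T approaches T
paths = np.cumsum(rng.normal(0.0, np.sqrt(dt), size=(10_000, n)), axis=1)
var_BT = paths[:, -1].var()
```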
Proposition
Properties i), ii), iii) are equivalent to:
($\star$) $B$ is a Gaussian process with mean zero and covariance
$$\Gamma(s,t) = \min(s,t).$$
Proof:
a) Suppose i), ii) and iii). The distribution of $(B_{t_1}, \dots, B_{t_n})$, for $0 < t_1 < \cdots < t_n$, is normal, because this vector is a linear transformation of $(B_{t_1}, B_{t_2} - B_{t_1}, \dots, B_{t_n} - B_{t_{n-1}})$, which has independent and normal components.
The mean is zero, and for $s < t$ the covariance is
$$E(B_s B_t) = E(B_s(B_t - B_s + B_s)) = E(B_s(B_t - B_s)) + E(B_s^2) = s.$$
b) The converse is also easy to show. $\square$
First construction of the Brownian motion
1. The function $\Gamma(s,t) = \min(s,t)$ is symmetric and nonnegative definite because it can be written as
$$\min(s,t) = \int_0^\infty \mathbf{1}_{[0,s]}(r)\, \mathbf{1}_{[0,t]}(r)\, dr,$$
so
$$\sum_{i,j=1}^n a_i a_j \min(t_i, t_j) = \sum_{i,j=1}^n a_i a_j \int_0^\infty \mathbf{1}_{[0,t_i]}(r)\, \mathbf{1}_{[0,t_j]}(r)\, dr = \int_0^\infty \Big[\sum_{i=1}^n a_i \mathbf{1}_{[0,t_i]}(r)\Big]^2 dr \ge 0.$$
Therefore, by Kolmogorov's extension theorem there exists a Gaussian process $B$ with zero mean and covariance function $\min(s,t)$.
2. The process $B$ satisfies
$$E\big[(B_t - B_s)^{2k}\big] = \frac{(2k)!}{2^k k!}\,(t-s)^k, \quad s \le t,$$
for any $k \ge 1$, because the distribution of $B_t - B_s$ is $N(0, t-s)$.
3. Therefore, by Kolmogorov's continuity theorem, there exists a version $\widetilde B$ of $B$ such that $\widetilde B$ has Hölder continuous trajectories of order $\gamma$ for any $\gamma < \frac{k-1}{2k}$ on any interval $[0,T]$. This implies that the paths are $\gamma$-Hölder on $[0,T]$ for any $\gamma < \frac{1}{2}$ and for any $T > 0$.
Second construction of Brownian motion
Fix $T > 0$.
(i) $\{e_n, n \ge 0\}$ is an orthonormal basis of $L^2([0,T])$.
(ii) $\{Z_n, n \ge 0\}$ are independent $N(0,1)$ random variables.
Then, as $N \to \infty$,
$$\sup_{0 \le t \le T} \left| \sum_{n=0}^N Z_n \int_0^t e_n(s)\, ds - B_t \right| \xrightarrow{\ \text{a.s.},\ L^2\ } 0.$$
Notice that
$$E\left[\left(\sum_{n=0}^N Z_n \int_0^t e_n(r)\, dr\right)\left(\sum_{n=0}^N Z_n \int_0^s e_n(r)\, dr\right)\right] = \sum_{n=0}^N \left(\int_0^t e_n(r)\, dr\right)\left(\int_0^s e_n(r)\, dr\right) = \sum_{n=0}^N \big\langle \mathbf{1}_{[0,t]}, e_n \big\rangle_{L^2([0,T])} \big\langle \mathbf{1}_{[0,s]}, e_n \big\rangle_{L^2([0,T])} \xrightarrow{N \to \infty} \big\langle \mathbf{1}_{[0,t]}, \mathbf{1}_{[0,s]} \big\rangle_{L^2([0,T])} = s \wedge t.$$
In particular, if $T = 2\pi$, $e_0(t) = \frac{1}{\sqrt{2\pi}}$ and $e_n(t) = \frac{1}{\sqrt{\pi}} \cos(nt/2)$ for $n \ge 1$, we obtain the Paley-Wiener representation of Brownian motion:
$$B_t = \frac{Z_0\, t}{\sqrt{2\pi}} + \frac{2}{\sqrt{\pi}} \sum_{n=1}^\infty Z_n \frac{\sin(nt/2)}{n}, \quad t \in [0, 2\pi].$$
In order to use this formula to get a simulation of Brownian motion, we have to choose some number $M$ of trigonometric functions and a number $N$ of discretization points.
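A sketch of that simulation, truncating the Paley-Wiener series after $M$ terms on a grid of $N+1$ points (both values illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
M, N = 500, 1000                       # trigonometric terms, grid points
t = np.linspace(0.0, 2 * np.pi, N + 1)

# Truncated series: B_t ~ Z_0 t / sqrt(2 pi) + (2 / sqrt(pi)) sum_n Z_n sin(n t / 2) / n
Z = rng.standard_normal(M + 1)
n = np.arange(1, M + 1)
B = Z[0] * t / np.sqrt(2 * np.pi) \
    + (2 / np.sqrt(np.pi)) * (np.sin(np.outer(t, n) / 2) / n) @ Z[1:]

# Sanity check: the truncated variance at t = pi,
# pi/2 + (4/pi) * sum_n sin(n pi / 2)^2 / n^2, approaches Var(B_pi) = pi
var_pi = np.pi / 2 + (4 / np.pi) * np.sum(np.sin(n * np.pi / 2) ** 2 / n ** 2)
```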
Third construction of Brownian motion
Let $\{\xi_k, 1 \le k \le n\}$ be independent and identically distributed random variables with zero mean and variance one.
Define $S_n(0) = 0$,
$$S_n\!\left(\frac{kT}{n}\right) = \sqrt{T}\, \frac{\xi_1 + \cdots + \xi_k}{\sqrt{n}}, \quad k = 1, \dots, n,$$
and extend $S_n(t)$ to $t \in [0,T]$ by linear interpolation.
Donsker invariance principle: the law of the random walk $S_n$ on $C([0,T])$ converges to the Wiener measure, which is the law of the Brownian motion. That is, for any continuous and bounded function $\varphi : C([0,T]) \to \mathbb{R}$,
$$E(\varphi(S_n)) \xrightarrow{n \to \infty} E(\varphi(B)).$$
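A sketch of the rescaled walk, here with Rademacher steps $\xi_k = \pm 1$ (any centered unit-variance distribution would do); by Donsker's principle the endpoint $S_n(T)$ is approximately $N(0, T)$:

```python
import numpy as np

rng = np.random.default_rng(3)
T, n = 1.0, 1000

# One rescaled random-walk path evaluated at the grid points kT/n
xi = rng.choice([-1.0, 1.0], size=n)
S = np.concatenate([[0.0], np.sqrt(T) * np.cumsum(xi) / np.sqrt(n)])

# Endpoint distribution over many independent walks: mean ~ 0, variance ~ T
steps = rng.choice([-1.0, 1.0], size=(20_000, n))
endpoints = np.sqrt(T) * steps.sum(axis=1) / np.sqrt(n)
```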
Simulations of Brownian motion
Basic properties
1. Self-similarity: for any $a > 0$, the process $\{a^{-1/2} B_{at}, t \ge 0\}$ is also a Brownian motion.
2. For any $h > 0$, the process $\{B_{t+h} - B_h, t \ge 0\}$ is a Brownian motion.
3. The process $\{-B_t, t \ge 0\}$ is a Brownian motion.
4. Almost surely $\lim_{t \to \infty} \frac{B_t}{t} = 0$, and the process
$$X_t = \begin{cases} t B_{1/t}, & t > 0 \\ 0, & t = 0 \end{cases}$$
is a Brownian motion.
5. $P\left(\sup_{s,t \in [0,1]} \frac{|B_t - B_s|}{\sqrt{|t-s|}} = +\infty\right) = 1$.
6. $P\left(\sup_{t \ge 0} B_t = +\infty,\ \inf_{t \ge 0} B_t = -\infty\right) = 1$.
7. Almost surely, the paths of $B$ are not differentiable at any point $t \ge 0$.
Quadratic variation
Fix a time interval $[0,t]$ and consider a partition
$$\pi = \{0 = t_0 < t_1 < \cdots < t_n = t\}.$$
Define $\Delta t_k = t_k - t_{k-1}$, $\Delta B_k = B_{t_k} - B_{t_{k-1}}$ and $|\pi| = \max_{1 \le k \le n} \Delta t_k$.
Proposition
The following convergence holds in $L^2$:
$$\lim_{|\pi| \to 0} \sum_{k=1}^n (\Delta B_k)^2 = t.$$
We can say that $(\Delta B_t)^2 \sim \Delta t$.
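The proposition is easy to see numerically: on a fine uniform partition of $[0,t]$ the sum of squared increments of a simulated path concentrates near $t$, while the sum of absolute increments grows without bound. A NumPy sketch:

```python
import numpy as np

rng = np.random.default_rng(4)
t, n = 1.0, 100_000
dt = t / n

# Increments of one Brownian path on a uniform partition
dB = rng.normal(0.0, np.sqrt(dt), size=n)
quad_var = np.sum(dB ** 2)        # should be close to t

# The absolute increments sum to roughly n * sqrt(2 dt / pi) -> infinity
total_var = np.sum(np.abs(dB))
```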
Proof: Set $\xi_k = (\Delta B_k)^2 - \Delta t_k$. The random variables $\xi_k$ are independent and centered. Thus,
$$E\left[\left(\sum_{k=1}^n (\Delta B_k)^2 - t\right)^2\right] = E\left[\left(\sum_{k=1}^n \xi_k\right)^2\right] = \sum_{k=1}^n E[\xi_k^2] = \sum_{k=1}^n \big[3(\Delta t_k)^2 - 2(\Delta t_k)^2 + (\Delta t_k)^2\big] = 2\sum_{k=1}^n (\Delta t_k)^2 \le 2t|\pi| \xrightarrow{|\pi| \to 0} 0. \quad \square$$
Exercise: Using the Borel-Cantelli lemma, show that if $\{\pi_n\}$ is a sequence of partitions of $[0,t]$ such that $\sum_n |\pi_n| < \infty$, then $\sum_{k=1}^n (\Delta B_k)^2$ converges almost surely to $t$.
Infinite total variation
Define
$$V_t = \sup_\pi \sum_{k=1}^n |\Delta B_k|.$$
Then
$$P(V_t = \infty) = 1.$$
In fact, using the continuity of the trajectories of the Brownian motion, we have, on the set $\{V_t < \infty\}$,
$$\sum_{k=1}^n (\Delta B_k)^2 \le \sup_k |\Delta B_k| \left(\sum_{k=1}^n |\Delta B_k|\right) \le V_t \sup_k |\Delta B_k| \xrightarrow{|\pi| \to 0} 0.$$
Then $V_t < \infty$ contradicts the fact that $\sum_{k=1}^n (\Delta B_k)^2$ converges in $L^2$ to $t$ as $|\pi| \to 0$.
Fine properties of the trajectories
Lévy's modulus of continuity:
$$\limsup_{\delta \downarrow 0}\ \sup_{s,t \in [0,1],\, |t-s| < \delta} \frac{|B_t - B_s|}{\sqrt{2|t-s| \log(1/|t-s|)}} = 1, \quad \text{a.s.}$$
In contrast, the behavior at a single point is given by the law of the iterated logarithm:
$$\limsup_{t \downarrow s} \frac{|B_t - B_s|}{\sqrt{2|t-s| \log\log(1/|t-s|)}} = 1, \quad \text{a.s.}$$
for any $s \ge 0$.
Conditional expectation
Let $X$ be an integrable random variable on a probability space $(\Omega, \mathcal{F}, P)$ and $\mathcal{G} \subset \mathcal{F}$ a $\sigma$-algebra.
Definition
The conditional expectation $E(X|\mathcal{G})$ is a random variable $Y$ satisfying:
(i) $Y$ is $\mathcal{G}$-measurable.
(ii) For all $A \in \mathcal{G}$,
$$\int_A X\, dP = \int_A Y\, dP.$$
If $X \ge 0$, $E(X|\mathcal{G})$ is the density of the measure $\mu(A) = \int_A X\, dP$, restricted to $\mathcal{G}$, with respect to $P$.
By the Radon-Nikodym theorem, $E(X|\mathcal{G})$ exists and is unique almost surely.
Properties of the conditional expectation
1. Linearity: $E(aX + bY|\mathcal{G}) = aE(X|\mathcal{G}) + bE(Y|\mathcal{G})$.
2. $E(E(X|\mathcal{G})) = E(X)$.
3. If $X$ and $\mathcal{G}$ are independent, then $E(X|\mathcal{G}) = E(X)$.
4. If $X$ is $\mathcal{G}$-measurable, then $E(X|\mathcal{G}) = X$.
5. If $Y$ is bounded and $\mathcal{G}$-measurable, then $E(YX|\mathcal{G}) = Y E(X|\mathcal{G})$.
6. Given two $\sigma$-fields $\mathcal{B} \subset \mathcal{G}$, then $E(E(X|\mathcal{B})|\mathcal{G}) = E(E(X|\mathcal{G})|\mathcal{B}) = E(X|\mathcal{B})$.
7. Let $X$ and $Z$ be such that:
(i) $Z$ is $\mathcal{G}$-measurable.
(ii) $X$ is independent of $\mathcal{G}$.
Suppose that $E(|h(X,Z)|) < \infty$. Then
$$E(h(X,Z)|\mathcal{G}) = E(h(X,z))\big|_{z=Z}.$$
Markov processes
A filtration $\{\mathcal{F}_t \subset \mathcal{F}, t \ge 0\}$ is an increasing family of $\sigma$-fields.
A process $\{X_t, t \ge 0\}$ is $\mathcal{F}_t$-adapted if $X_t$ is $\mathcal{F}_t$-measurable for all $t \ge 0$.
Definition
An adapted process $X_t$ is a Markov process with respect to $\mathcal{F}_t$ if for any $s \ge 0$, $t > 0$ and any $f \in C_b(\mathbb{R})$,
$$E[f(X_{s+t})|\mathcal{F}_s] = E[f(X_{s+t})|X_s], \quad \text{a.s.}$$
This implies that $X_t$ is also an $\mathcal{F}_t^X$-Markov process, where $\mathcal{F}_t^X = \sigma\{X_u, 0 \le u \le t\}$.
The finite-dimensional marginal distributions of a Markov process are characterized by the transition probabilities
$$p(s, x, s+t, B) = P(X_{s+t} \in B \,|\, X_s = x).$$
Markov property of Brownian motion
Theorem
The Brownian motion $B_t$ is an $\mathcal{F}_t^B$-Markov process such that, for any $f \in C_b(\mathbb{R})$, $s \ge 0$ and $t > 0$,
$$E[f(B_{s+t})|\mathcal{F}_s^B] = (P_t f)(B_s),$$
where $(P_t f)(x) = \int_{\mathbb{R}} f(y)\, \frac{1}{\sqrt{2\pi t}}\, e^{-\frac{|x-y|^2}{2t}}\, dy$.
$\{P_t, t \ge 0\}$ is the semigroup of operators associated with the Brownian motion:
$$P_t \circ P_s = P_{t+s}, \qquad P_0 = \mathrm{Id}.$$
Proof:
We have
$$E[f(B_{s+t})|\mathcal{F}_s^B] = E[f(B_{s+t} - B_s + B_s)|\mathcal{F}_s^B].$$
Since $B_{s+t} - B_s$ is independent of $\mathcal{F}_s^B$, we obtain
$$E[f(B_{s+t})|\mathcal{F}_s^B] = E[f(B_{s+t} - B_s + x)]\big|_{x=B_s} = \int_{\mathbb{R}} f(y + B_s)\, \frac{1}{\sqrt{2\pi t}}\, e^{-\frac{|y|^2}{2t}}\, dy = \int_{\mathbb{R}} f(y)\, \frac{1}{\sqrt{2\pi t}}\, e^{-\frac{|B_s - y|^2}{2t}}\, dy = (P_t f)(B_s). \quad \square$$
Multidimensional Brownian motion
$B_t = (B_t^1, \dots, B_t^d)$ is called a $d$-dimensional Brownian motion if its components are independent Brownian motions.
It is a Markov process with semigroup
$$(P_t f)(x) = \int_{\mathbb{R}^d} f(y)\, (2\pi t)^{-d/2} \exp\left(-\frac{|x-y|^2}{2t}\right) dy.$$
The transition density $p_t(x,y) = (2\pi t)^{-d/2} \exp\left(-\frac{|x-y|^2}{2t}\right)$ satisfies the heat equation
$$\frac{\partial p}{\partial t} = \frac{1}{2} \Delta p, \quad t > 0,$$
with initial condition $p_0(x,y) = \delta_x(y)$.
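The heat-equation claim can be verified symbolically; for $d = 1$, $\Delta p = \partial^2 p/\partial x^2$, and a short SymPy check gives a zero residual:

```python
import sympy as sp

t = sp.symbols('t', positive=True)
x, y = sp.symbols('x y', real=True)

# One-dimensional transition density p_t(x, y)
p = (2 * sp.pi * t) ** sp.Rational(-1, 2) * sp.exp(-(x - y) ** 2 / (2 * t))

# dp/dt - (1/2) d^2 p / dx^2 should simplify to zero
residual = sp.simplify(sp.diff(p, t) - sp.Rational(1, 2) * sp.diff(p, x, 2))
```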
Stopping times
Consider a filtration $\{\mathcal{F}_t, t \ge 0\}$ in a probability space $(\Omega, \mathcal{F}, P)$ that satisfies the following conditions:
(i) If $A \in \mathcal{F}$ is such that $P(A) = 0$, then $A \in \mathcal{F}_0$.
(ii) The filtration is right-continuous, that is, for every $t \ge 0$,
$$\mathcal{F}_t = \cap_{n \ge 1} \mathcal{F}_{t + \frac{1}{n}}.$$
Definition
A random variable $T : \Omega \to [0, \infty]$ is a stopping time with respect to a filtration $\{\mathcal{F}_t, t \ge 0\}$ if
$$\{T \le t\} \in \mathcal{F}_t, \quad \forall t \ge 0.$$
Properties of stopping times
1. $T$ is a stopping time if and only if $\{T < t\} \in \mathcal{F}_t$ for all $t \ge 0$.
Proof:
$$\{T < t\} = \cup_n \{T \le t - \tfrac{1}{n}\} \in \mathcal{F}_t.$$
Conversely,
$$\{T \le t\} = \cap_n \{T < t + \tfrac{1}{n}\} \in \cap_n \mathcal{F}_{t + \frac{1}{n}} = \mathcal{F}_t. \quad \square$$
2. $S \vee T$ and $S \wedge T$ are stopping times.
3. Given a stopping time $T$,
$$\mathcal{F}_T = \{A : A \cap \{T \le t\} \in \mathcal{F}_t,\ \text{for all } t \ge 0\}$$
is a $\sigma$-field.
4. $S \le T \Rightarrow \mathcal{F}_S \subset \mathcal{F}_T$.
5. Let $\{X_t, t \ge 0\}$ be a continuous and adapted process. The hitting time of a set $A \subset \mathbb{R}$ is defined by
$$T_A = \inf\{t \ge 0 : X_t \in A\}.$$
Then, if $A$ is open or closed, $T_A$ is a stopping time.
6. Let $X_t$ be an adapted stochastic process with right-continuous paths and $T < \infty$ a stopping time. Then the random variable
$$X_T(\omega) = X_{T(\omega)}(\omega)$$
is $\mathcal{F}_T$-measurable.
Martingales
We assume that $\{\mathcal{F}_t, t \ge 0\}$ is a filtration.
Definition
An adapted process $M = \{M_t, t \ge 0\}$ is called a martingale with respect to $\mathcal{F}_t$ if
(i) For all $t \ge 0$, $E(|M_t|) < \infty$.
(ii) For each $s \le t$, $E(M_t|\mathcal{F}_s) = M_s$.
Property (ii) can also be written as
$$E(M_t - M_s|\mathcal{F}_s) = 0.$$
$M_t$ is a supermartingale (or submartingale) if property (ii) is replaced by $E(M_t|\mathcal{F}_s) \le M_s$ (or $E(M_t|\mathcal{F}_s) \ge M_s$).
Basic properties
1. For any integrable random variable $X$, $\{E(X|\mathcal{F}_t), t \ge 0\}$ is a martingale.
2. If $M_t$ is a submartingale, then $t \mapsto E[M_t]$ is nondecreasing.
3. If $M_t$ is a martingale and $\varphi$ is a convex function such that $E(|\varphi(M_t)|) < \infty$ for all $t \ge 0$, then $\varphi(M_t)$ is a submartingale.
Proof: By Jensen's inequality, if $s \le t$,
$$E(\varphi(M_t)|\mathcal{F}_s) \ge \varphi(E(M_t|\mathcal{F}_s)) = \varphi(M_s). \quad \square$$
In particular, if $M_t$ is a martingale such that $E(|M_t|^p) < \infty$ for all $t \ge 0$ and for some $p \ge 1$, then $|M_t|^p$ is a submartingale.
Examples:
Let $B_t$ be a Brownian motion and $\mathcal{F}_t$ the filtration generated by $B$:
$$\mathcal{F}_t = \sigma\{B_s, 0 \le s \le t\}.$$
Then the processes
$$M_t^{(1)} = B_t, \qquad M_t^{(2)} = B_t^2 - t, \qquad M_t^{(3)} = \exp\Big(aB_t - \frac{a^2 t}{2}\Big),$$
where $a \in \mathbb{R}$, are martingales.
1. $B_t$ is a martingale because
$$E(B_t - B_s|\mathcal{F}_s) = E(B_t - B_s) = 0.$$
2. For $B_t^2 - t$, we can write, using the properties of the conditional expectation, for $s < t$,
$$E(B_t^2|\mathcal{F}_s) = E((B_t - B_s + B_s)^2|\mathcal{F}_s) = E((B_t - B_s)^2|\mathcal{F}_s) + 2E((B_t - B_s)B_s|\mathcal{F}_s) + E(B_s^2|\mathcal{F}_s) = E(B_t - B_s)^2 + 2B_s E((B_t - B_s)|\mathcal{F}_s) + B_s^2 = t - s + B_s^2.$$
3. Finally, for $\exp(aB_t - \frac{a^2 t}{2})$ we have
$$E\big(e^{aB_t - \frac{a^2 t}{2}}\big|\mathcal{F}_s\big) = e^{aB_s}\, E\big(e^{a(B_t - B_s) - \frac{a^2 t}{2}}\big|\mathcal{F}_s\big) = e^{aB_s}\, E\big(e^{a(B_t - B_s) - \frac{a^2 t}{2}}\big) = e^{aB_s}\, e^{\frac{a^2 (t-s)}{2} - \frac{a^2 t}{2}} = e^{aB_s - \frac{a^2 s}{2}}.$$
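A martingale satisfies $E(M_t) = E(M_0)$ for every $t$, which gives a quick Monte Carlo sanity check of the three examples (the sample size and the value of $a$ are illustrative):

```python
import numpy as np

rng = np.random.default_rng(5)
n_paths, T, a = 100_000, 1.0, 0.5

# B_T ~ N(0, T); each sample mean below should match the value of the
# corresponding martingale at t = 0
BT = rng.normal(0.0, np.sqrt(T), size=n_paths)
m1 = BT.mean()                                # E(B_T) = 0
m2 = (BT ** 2 - T).mean()                     # E(B_T^2 - T) = 0
m3 = np.exp(a * BT - a ** 2 * T / 2).mean()   # E(exp(a B_T - a^2 T / 2)) = 1
```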
Optional Stopping Theorem
Theorem (Optional Stopping Theorem)
Suppose that $M_t$ is a continuous martingale and let $S \le T \le K$ be two bounded stopping times ($K$ a constant). Then
$$E(M_T|\mathcal{F}_S) = M_S.$$
This theorem implies that $E(M_T) = E(M_S)$.
In the submartingale case we have $E(M_T|\mathcal{F}_S) \ge M_S$.
As a consequence, if $T$ is a bounded stopping time,
$$M_t \text{ (sub)martingale} \ \Rightarrow\ M_{t \wedge T} \text{ (sub)martingale}.$$
Proof:
We will show that $E(M_T) = E(M_0)$.
Assume first that $T$ takes values in a finite set:
$$0 \le t_1 \le \cdots \le t_n \le K.$$
Then, by the martingale property,
$$E(M_T) = \sum_{i=1}^n E(M_T \mathbf{1}_{\{T = t_i\}}) = \sum_{i=1}^n E(M_{t_i} \mathbf{1}_{\{T = t_i\}}) = \sum_{i=1}^n E(M_{t_n} \mathbf{1}_{\{T = t_i\}}) = E(M_{t_n}) = E(M_0).$$
In the general case we approximate $T$ by the following nonincreasing sequence of stopping times:
$$\tau_n = \sum_{k=1}^{2^n} \frac{kK}{2^n}\, \mathbf{1}_{\{\frac{(k-1)K}{2^n} \le T < \frac{kK}{2^n}\}}.$$
By continuity, $M_{\tau_n} \xrightarrow{\text{a.s.}} M_T$.
To show that $E(M_0) = E(M_{\tau_n}) \to E(M_T)$, it suffices to check that the sequence $M_{\tau_n}$ is uniformly integrable. This follows from
$$E\big(|M_{\tau_n}| \mathbf{1}_{\{|M_{\tau_n}| \ge A\}}\big) = \sum_{k=1}^{2^n} E\big(|M_{\frac{kK}{2^n}}| \mathbf{1}_{\{|M_{\frac{kK}{2^n}}| \ge A,\ \tau_n = \frac{kK}{2^n}\}}\big) \le \sum_{k=1}^{2^n} E\big(|M_K| \mathbf{1}_{\{|M_{\frac{kK}{2^n}}| \ge A,\ \tau_n = \frac{kK}{2^n}\}}\big) = E\big(|M_K| \mathbf{1}_{\{|M_{\tau_n}| \ge A\}}\big) \le E\big(|M_K| \mathbf{1}_{\{\sup_{0 \le s \le K} |M_s| \ge A\}}\big),$$
which converges to zero as $A \uparrow \infty$, uniformly in $n$. $\square$
Doob’s maximal inequalities
Theorem
Let $\{M_t, t \in [0,T]\}$ be a continuous martingale such that $E(|M_T|^p) < \infty$ for some $p \ge 1$. Then, for all $\lambda > 0$, we have
$$P\left(\sup_{0 \le t \le T} |M_t| > \lambda\right) \le \frac{1}{\lambda^p}\, E(|M_T|^p). \tag{1}$$
If $p > 1$, then
$$E\left(\sup_{0 \le t \le T} |M_t|^p\right) \le \left(\frac{p}{p-1}\right)^p E(|M_T|^p). \tag{2}$$
Proof of (1):
Set
$$\tau = \inf\{s \ge 0 : |M_s| \ge \lambda\} \wedge T.$$
Because $\tau$ is a bounded stopping time and $|M_t|^p$ is a submartingale,
$$E(|M_\tau|^p) \le E(|M_T|^p).$$
From the definition of $\tau$,
$$|M_\tau|^p \ge \mathbf{1}_{\{\sup_{0 \le t \le T} |M_t| \ge \lambda\}}\, \lambda^p + \mathbf{1}_{\{\sup_{0 \le t \le T} |M_t| < \lambda\}}\, |M_T|^p.$$
Therefore,
$$P\left(\sup_{0 \le t \le T} |M_t| > \lambda\right) \le \frac{1}{\lambda^p}\, E(|M_\tau|^p) \le \frac{1}{\lambda^p}\, E(|M_T|^p).$$
Application to Brownian hitting times
Let $B_t$ be a Brownian motion. Fix $a \in \mathbb{R}$ and consider the hitting time
$$\tau_a = \inf\{t \ge 0 : B_t = a\}.$$
Proposition
If $a < 0 < b$, then
$$P(\tau_a < \tau_b) = \frac{b}{b-a}.$$
Proof: By the optional stopping theorem,
$$E(B_{t \wedge \tau_a \wedge \tau_b}) = E(B_0) = 0.$$
Letting $t \to \infty$ and using the dominated convergence theorem, it follows that
$$0 = aP(\tau_a < \tau_b) + b(1 - P(\tau_a < \tau_b)). \quad \square$$
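The formula $P(\tau_a < \tau_b) = b/(b-a)$ also holds for the simple symmetric random walk (the discrete analogue, by the same optional-stopping argument), which makes it easy to check by simulation. A sketch with illustrative integer barriers:

```python
import numpy as np

rng = np.random.default_rng(6)
a, b = -5, 10                       # barriers with a < 0 < b; exact answer 10/15
n_paths, max_steps = 20_000, 100_000

pos = np.zeros(n_paths, dtype=np.int64)
hit_a_first = np.zeros(n_paths)
active = np.ones(n_paths, dtype=bool)
for _ in range(max_steps):
    if not active.any():
        break
    # Advance only the walks that have not been absorbed yet
    pos[active] += rng.choice([-1, 1], size=active.sum())
    hit_a = active & (pos <= a)
    hit_b = active & (pos >= b)
    hit_a_first[hit_a] = 1.0
    active &= ~(hit_a | hit_b)

p_hat = hit_a_first.mean()          # should be close to b / (b - a) = 2/3
```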
Proposition
Let $T = \inf\{t \ge 0 : B_t \notin (a,b)\}$, where $a < 0 < b$. Then
$$E(T) = -ab.$$
Proof: Using that $B_t^2 - t$ is a martingale, we get
$$E(B_{T \wedge t}^2) = E(T \wedge t).$$
Therefore,
$$E(T) = \lim_{t \to \infty} E(B_{T \wedge t}^2) = E(B_T^2) = -ab. \quad \square$$
Proposition
Fix $a > 0$. The hitting time
$$\tau_a = \inf\{t \ge 0 : B_t = a\}$$
satisfies
$$E[\exp(-\alpha \tau_a)] = e^{-\sqrt{2\alpha}\, a}, \quad \alpha > 0. \tag{3}$$
Proof:
For any $\lambda > 0$, the process $M_t = e^{\lambda B_t - \frac{\lambda^2 t}{2}}$ is a martingale such that
$$E(M_t) = E(M_0) = 1.$$
By the optional stopping theorem we obtain, for all $N \ge 1$,
$$E(M_{\tau_a \wedge N}) = 1.$$
Notice that $M_{\tau_a \wedge N} = \exp\left(\lambda B_{\tau_a \wedge N} - \frac{\lambda^2 (\tau_a \wedge N)}{2}\right) \le e^{\lambda a}$. So, by the dominated convergence theorem we obtain
$$E(M_{\tau_a}) = 1,$$
that is,
$$E\left(\exp\left(-\frac{\lambda^2 \tau_a}{2}\right)\right) = e^{-\lambda a}.$$
With the change of variables $\frac{\lambda^2}{2} = \alpha$, we get
$$E(\exp(-\alpha \tau_a)) = e^{-\sqrt{2\alpha}\, a}. \quad \square \tag{4}$$
The expectation of $\tau_a$ can be obtained by differentiating (4) with respect to the variable $\alpha$:
$$E(\tau_a \exp(-\alpha \tau_a)) = \frac{a\, e^{-\sqrt{2\alpha}\, a}}{\sqrt{2\alpha}},$$
and letting $\alpha \downarrow 0$ we obtain $E(\tau_a) = +\infty$.
We can compute the density function of $\tau_a$:
$$f_{\tau_a}(s) = \frac{a}{\sqrt{2\pi}}\, s^{-3/2}\, e^{-a^2/(2s)}, \quad s \ge 0.$$
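Consistency between this density and the Laplace transform (3) can be checked numerically: $f_{\tau_a}$ should integrate to one, and $\int_0^\infty e^{-\alpha s} f_{\tau_a}(s)\, ds$ should equal $e^{-\sqrt{2\alpha}\, a}$. A sketch with SciPy's `quad` (the values of $a$ and $\alpha$ are illustrative):

```python
import math
from scipy.integrate import quad

a, alpha = 1.0, 0.7

def f_tau(s):
    # Density of tau_a; tends to 0 as s -> 0+ because the exponential wins
    if s <= 0:
        return 0.0
    return a / math.sqrt(2 * math.pi) * s ** (-1.5) * math.exp(-a ** 2 / (2 * s))

total_mass, _ = quad(f_tau, 0, math.inf)
laplace, _ = quad(lambda s: math.exp(-alpha * s) * f_tau(s), 0, math.inf)
expected = math.exp(-math.sqrt(2 * alpha) * a)
```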
Strong Markov property
Theorem
Let $B$ be a Brownian motion and let $T$ be a finite stopping time with respect to the filtration $\mathcal{F}_t^B$ generated by $B$. Then the process
$$\{B_{T+t} - B_T, t \ge 0\}$$
is a Brownian motion independent of $\mathcal{F}_T^B$.
As a consequence, for any $f \in C_b(\mathbb{R})$ and any finite stopping time $T$ for the filtration $\mathcal{F}_t^B$, we have
$$E[f(B_{T+t})|\mathcal{F}_T^B] = (P_t f)(B_T),$$
where $P_t$ is the semigroup associated with the Brownian motion $B$.
Proof:
Consider the process $\widetilde B_t = B_{T+t} - B_T$ and suppose first that $T$ is bounded. Let $\lambda \in \mathbb{R}$ and $0 \le s \le t$. Applying the optional stopping theorem to the martingale
$$\exp\left(i\lambda B_t + \frac{\lambda^2 t}{2}\right)$$
yields
$$E\left[e^{i\lambda B_{T+t} + \frac{\lambda^2}{2}(T+t)} \,\Big|\, \mathcal{F}_{T+s}\right] = e^{i\lambda B_{T+s} + \frac{\lambda^2}{2}(T+s)}.$$
Therefore,
$$E\left[e^{i\lambda (B_{T+t} - B_{T+s})} \,\Big|\, \mathcal{F}_{T+s}\right] = e^{-\frac{\lambda^2}{2}(t-s)}.$$
This implies that the increments of $\widetilde B$ are independent, stationary and normally distributed.
If $T$ is not bounded, then we can consider the stopping time $T \wedge N$ and let $N \to \infty$. $\square$
Reflection principle
Theorem
Let $M_t = \sup_{0 \le s \le t} B_s$. Then
$$P(M_t \ge a) = 2P(B_t > a) = \frac{2}{\sqrt{2\pi t}} \int_a^\infty e^{-\frac{x^2}{2t}}\, dx.$$
Proof:
We have
$$P(B_t \ge a) = P(B_t \ge a, M_t \ge a) = P(B_t \ge a \,|\, M_t \ge a)\, P(M_t \ge a) = P(B_t \ge a \,|\, \tau_a \le t)\, P(M_t \ge a).$$
We know that $\{B_{\tau_a + s} - a, s \ge 0\}$ is a Brownian motion independent of $\mathcal{F}_{\tau_a}$. Therefore,
$$P(B_t \ge a \,|\, \tau_a \le t) = E\big[P(B_{\tau_a + (t - \tau_a)} - a \ge 0 \,|\, \mathcal{F}_{\tau_a}) \,\big|\, \tau_a \le t\big] = \frac{1}{2}. \quad \square$$
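The reflection identity is easy to verify by simulation: compare the empirical $P(M_t \ge a)$, computed from the running maximum of discretized paths, with $2P(B_t > a)$ (discretization slightly underestimates the true maximum, so only rough agreement is expected):

```python
import math
import numpy as np

rng = np.random.default_rng(7)
t, a = 1.0, 0.5
n_paths, n_steps = 10_000, 2_000
dt = t / n_steps

# Simulate Brownian paths and compare the two sides of the identity
B = np.cumsum(rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps)), axis=1)
p_max = (B.max(axis=1) >= a).mean()     # empirical P(M_t >= a)
p_tail = 2 * (B[:, -1] > a).mean()      # empirical 2 P(B_t > a)

# Exact value: 2 (1 - Phi(a / sqrt(t)))
p_exact = 2 * (1 - 0.5 * (1 + math.erf(a / math.sqrt(2 * t))))
```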
Brownian filtration
Define
$$\mathcal{F}_t^B = \sigma\{B_s, 0 \le s \le t\}.$$
Denote by $\mathcal{N}$ the family of sets in $\mathcal{F}$ of probability zero (null sets).
Proposition
The filtration
$$\mathcal{F}_t = \sigma\big(\mathcal{F}_t^B, \mathcal{N}\big)$$
is right-continuous. Therefore, it satisfies conditions (i) and (ii).