
Lecture 13: Martingales

1. Definition of a Martingale

1.1 Filtrations
1.2 Definition of a martingale and its basic properties
1.3 Sums of independent random variables and related models
1.4 Products of independent random variables and related models
1.5 An exponential martingale
1.6 Likelihood ratios

2. Square Integrable Martingales

2.1 Doob's martingale
2.2 Doob decomposition
2.3 Doob decomposition for square integrable martingales
2.4 Doob-Kolmogorov inequality

3. Convergence of Martingales

3.1 A.s. convergence of martingales
3.2 Law of large numbers for martingales
3.3 Central limit theorem for martingales

4. Stopping times

4.1 Definition and basic properties of stopping times
4.2 Stopped martingales
4.3 Optional stopping theorems
4.4 Wald equation
4.5 A fair game example

1. Definition of a Martingale

1.1 Filtrations

Let < Ω, F, P > be a probability space;

~S = S0, S1, . . . is a finite or infinite sequence of random variables defined on this probability space;

F = F0, F1, . . . is a finite or infinite sequence of σ-algebras such that F0, F1, . . . ⊆ F.

(1) If FN = F0, F1, . . . , FN is a finite sequence of σ-algebras, then the infinite sequence of σ-algebras F = F0, F1, . . . , FN, FN, . . . = Fn∧N, n = 0, 1, . . . is called a natural continuation of the initial finite sequence of σ-algebras.

(2) If ~SN = S0, S1, . . . , SN is a finite sequence of random variables, then the infinite sequence of random variables ~S = S0, S1, . . . , SN, SN, . . . = Sn∧N, n = 0, 1, . . . is called a natural continuation of the initial finite sequence of random variables.

The continuation construction described above lets us restrict consideration to infinite sequences of random variables and σ-algebras.

Definition 13.1. A sequence of σ-algebras F = F0, F1, . . . is a filtration if it is a nondecreasing sequence, i.e., F0 ⊆ F1 ⊆ · · ·.

Examples

(1) F0 = F1 = F2 = · · ·. Two extreme cases are where F0 = {∅, Ω} or F0 = F.

(2) ~S = S0, S1, . . . is a sequence of random variables, and Fn = σ(Sk, k = 0, . . . , n), n = 0, 1, . . .. Then F = F0, F1, . . . is a filtration. It is called the natural filtration generated by the sequence ~S. Note that in this case the random variable Sn is Fn-measurable for every n = 0, 1, . . ..

(3) ~S = S0, S1, . . . and ~X = ~X0, ~X1, . . . are, respectively, a sequence of random variables and a sequence of random vectors, and Fn = σ(Sk, ~Xk, k = 0, . . . , n), n = 0, 1, . . .. Then F = F0, F1, . . . is a filtration. Note that in this case the random variable Sn is also Fn-measurable for every n = 0, 1, . . ..

(4) If F0 ⊆ F1 ⊆ · · · ⊆ FN is a finite nondecreasing sequence of σ-algebras, then its natural continuation F = Fn∧N, n = 0, 1, . . . is also a nondecreasing sequence of σ-algebras, i.e., it is a filtration.

(5) If Fn = σ(Sk, k = 0, . . . , n), n = 0, 1, . . . , N is the finite sequence of σ-algebras generated by a finite sequence of random variables S0, . . . , SN, then F = Fn∧N, n = 0, 1, . . . is a natural filtration for the infinite sequence of random variables ~S = Sn∧N, n = 0, 1, . . ..


1.2 Definition of a martingale and its basic properties

Let < Ω, F, P > be a probability space, ~S = S0, S1, . . . a sequence of random variables defined on this probability space, and F = F0, F1, . . . a filtration on this probability space.

Definition 13.2. A sequence of random variables ~S = S0, S1, . . . is F-adapted to a filtration F = F0, F1, . . . if the random variable Sn is Fn-measurable (i.e., the event {Sn ∈ B} ∈ Fn, B ∈ B1) for every n = 0, 1, . . ..

Definition 13.3. A F-adapted sequence of random variables ~S = S0, S1, . . . is a F-martingale (a martingale with respect to a filtration F) if it satisfies the following conditions:

(a) E|Sn| < ∞, n = 0, 1, . . .;

(b) E(Sn+1/Fn) = Sn, n = 0, 1, . . ..

(1) Condition (b) can be written in the equivalent form

E(Sn+1 − Sn/Fn) = 0, n = 0, 1, . . . .

(2) A F-adapted sequence of random variables ~S = S0, S1, . . . is a F-submartingale or a F-supermartingale if (a) E|Sn| < ∞, n = 0, 1, . . . and, respectively, (b') E(Sn+1/Fn) ≥ Sn, n = 0, 1, . . . or (b'') E(Sn+1/Fn) ≤ Sn, n = 0, 1, . . ..

(3) If ~S = S0, S1, . . . is a martingale with respect to a filtration F = Fn = σ(X0, . . . , Xn), n = 0, 1, . . . generated by the sequence of random variables ~X = X0, X1, . . ., then the notation E(Sn+1/X0, . . . , Xn) = E(Sn+1/Fn) is used, and one can speak about ~S as a martingale with respect to the sequence of random variables ~X.

(4) If ~S = S0, S1, . . . is a martingale with respect to a natural filtration F = Fn = σ(S0, . . . , Sn), n = 0, 1, . . ., then the notation EnSn+m = E(Sn+m/Fn) may be used and one can refer to the sequence ~S as a martingale without a specification of the corresponding filtration.

Martingales possess the following basic properties:

1. If ~S = S0, S1, . . . is a F-martingale, then

E(Sn+m/Fn) = Sn, 0 ≤ n < n + m < ∞.

——————————-
(a) E(Sn+m/Fn) = E(E(Sn+m/Fn+m−1)/Fn) = E(Sn+m−1/Fn);

(b) Iterate this relation to get the above formula.
——————————-

2. If ~S = S0, S1, . . . is a F-martingale, then

ESn = ES0, n = 0, 1, . . . .

——————————-
ESn = E(E(Sn/F0)) = ES0, n = 0, 1, . . ..
——————————-

3. If ~S′ = S′0, S′1, . . . and ~S′′ = S′′0, S′′1, . . . are two F-martingales and a, b ∈ R1, then the sequence ~S = S0 = aS′0 + bS′′0, S1 = aS′1 + bS′′1, . . . is also a F-martingale.

——————————-
E(Sn+1/Fn) = E(aS′n+1 + bS′′n+1/Fn) = aE(S′n+1/Fn) + bE(S′′n+1/Fn) = aS′n + bS′′n = Sn, n = 0, 1, . . ..
——————————-

4. If ~S = S0, S1, . . . is a F-martingale and ESn² < ∞, n = 0, 1, . . ., then the sequence of second moments is non-decreasing, i.e.,

ESn² ≤ ESn+1², n = 0, 1, . . . .

——————————-
(a) ESn(Sn+1 − Sn) = E(E(Sn(Sn+1 − Sn)/Fn)) = E(SnE(Sn+1 − Sn/Fn)) = E(Sn · 0) = 0;

(b) 0 ≤ E(Sn+1 − Sn)² = ESn+1² − 2ESn+1Sn + ESn² = ESn+1² − ESn² − 2ESn(Sn+1 − Sn) = ESn+1² − ESn².
——————————-

1.3 Sums of independent random variables and related models

(1) Let X1, X2, . . . be a sequence of independent random variables and ~S = Sn, n = 0, 1, . . ., where

Sn = S0 + X1 + · · · + Xn, n = 0, 1, . . . , S0 = const.

Let also F = Fn, n = 0, 1, . . ., where

Fn = σ(X1, . . . , Xn) = σ(S0, S1, . . . , Sn), n = 0, 1, . . . , F0 = {∅, Ω}.

In this case, the sequence ~S is a F-martingale if and only if

EXn = 0, n = 1, 2, . . . .

——————————-
(a) In this case, the sequence ~S is F-adapted;

(b) E(Sn+1 − Sn/Fn) = E(Xn+1/Fn) = EXn+1, n = 0, 1, . . ..
——————————-

(2) Let Xn, n = 1, 2, . . . be a sequence of independent random variables taking values +1 and −1 with probabilities pn and qn = 1 − pn, respectively. In this case, EXn = pn − qn and, therefore, the following "compensated" sequence is a F-martingale,

Sn = S0 + ∑_{k=1}^n Xk − An, n = 0, 1, . . . ,

where

An = ∑_{k=1}^n (pk − qk), n = 0, 1, . . . .
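A minimal simulation sketch of this compensated construction (the probabilities pk below are arbitrary illustrative choices, and the check is informal rather than part of the argument): it verifies numerically that the mean ESn stays equal to ES0.

import numpy as np

# Informal check with illustrative parameters: the compensated sequence
# S_n = S_0 + sum_{k<=n} X_k - A_n, with A_n = sum_{k<=n} (p_k - q_k),
# keeps a constant mean ES_n = ES_0.
rng = np.random.default_rng(0)
n_steps, n_paths, S0 = 50, 100_000, 0.0
p = rng.uniform(0.2, 0.8, size=n_steps)                       # p_k, k = 1..n
X = np.where(rng.random((n_paths, n_steps)) < p, 1.0, -1.0)   # +/-1 with prob p_k, q_k
A = np.cumsum(p - (1 - p))                                    # compensator A_n
S = S0 + np.cumsum(X, axis=1) - A                             # S_n along each path
print(S.mean(axis=0)[[0, 9, 49]])                             # each close to ES_0 = 0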

1.4 Products of independent random variables and related models

(1) Let X1, X2, . . . be a sequence of independent random variables and ~S = Sn, n = 0, 1, . . ., where

Sn = S0 ∏_{k=1}^n Xk, n = 0, 1, . . . , S0 = const.

Let also F = Fn, n = 0, 1, . . ., where

Fn = σ(X1, . . . , Xn) = σ(S0, S1, . . . , Sn), n = 0, 1, . . . , F0 = {∅, Ω}.

In this case, the sequence ~S is a F-martingale if

EXn = 1, n = 1, 2, . . . .

——————————-
(a) In this case, the sequence ~S is F-adapted;

(b) Sn+1 − Sn = Sn(Xn+1 − 1), n = 0, 1, . . .;

(c) E(Sn+1 − Sn/Fn) = E(Sn(Xn+1 − 1)/Fn) = Sn · E(Xn+1 − 1/Fn) = Sn · (EXn+1 − 1), n = 0, 1, . . .;

(d) If the random variables Xn are a.s. positive, for example if Xn = e^{Yn}, n = 1, 2, . . ., and S0 > 0, then a.s. Sn > 0, n = 0, 1, . . .. In this case the condition EXn = 1, n = 1, 2, . . . is also a necessary condition for the sequence ~S to be a F-martingale.
——————————-

(2) Sn = S0 exp{∑_{k=1}^n Yk}, n = 0, 1, . . ., where S0 = const and Yk ∼ N(µk, σk²), k = 1, 2, . . . are independent normal random variables. In this case, E exp{Yn} = e^{µn + σn²/2}, n = 1, 2, . . . and, therefore, the following condition is a necessary and sufficient condition under which the sequence ~S = Sn, n = 0, 1, . . . is a F-martingale,

µn + σn²/2 = 0, n = 1, 2, . . . .

(3) Sn = S0 (q/p)^{∑_{k=1}^n Yk}, n = 0, 1, . . ., where S0 = const and Y1, Y2, . . . is a sequence of Bernoulli random variables taking values 1 and −1 with probabilities, respectively, p and q = 1 − p, where 0 < p < 1. In this case, E(q/p)^{Yn} = (q/p)p + (q/p)^{−1}q = 1, n = 1, 2, . . . and, therefore, the sequence ~S = Sn, n = 0, 1, . . . is a F-martingale.
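The identity E(q/p)^{Yn} = 1 in example (3) can be checked numerically; the sketch below is an informal illustration with the arbitrary choices p = 0.45 and S0 = 1, and it also confirms that the product martingale keeps a constant mean.

import numpy as np

# Informal check with illustrative parameters: for Bernoulli Y_k = +/-1 with
# P(Y_k = 1) = p, E[(q/p)^{Y_k}] = (q/p)p + (p/q)q = 1, so
# S_n = S_0 (q/p)^{Y_1 + ... + Y_n} has constant mean S_0.
rng = np.random.default_rng(9)
p, q, S0, n, paths = 0.45, 0.55, 1.0, 30, 200_000
Y = np.where(rng.random((paths, n)) < p, 1, -1)
S = S0 * (q / p) ** np.cumsum(Y, axis=1)
print(S.mean(axis=0)[[0, 9, 29]])        # each close to S_0 = 1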

1.5 An exponential martingale

Let X1, X2, . . . be i.i.d. random variables such that ψ(t) = Ee^{tX1} < ∞ for some t > 0. Let also Yn = X1 + · · · + Xn, n = 1, 2, . . . and

Sn = e^{tYn}/ψ(t)^n = ∏_{k=1}^n e^{tXk}/ψ(t), n = 1, 2, . . . , S0 = 1.

Let also F = Fn, n = 0, 1, . . ., where

Fn = σ(X1, . . . , Xn) = σ(S0, S1, . . . , Sn), n = 0, 1, . . . , F0 = {∅, Ω}.

In this case, ~S = S0, S1, . . . is a F-martingale (known as an exponential martingale).
——————————-
(a) ~S = S0, S1, . . . is a F-adapted sequence;

(b) Sn+1 = Sn e^{tXn+1}/ψ(t), n = 0, 1, . . .;

(c) E(Sn+1/Fn) = E(Sn e^{tXn+1}/ψ(t) /Fn) = Sn E(e^{tXn+1}/ψ(t) /Fn) = Sn · 1 = Sn.
——————————-

1.6 Likelihood ratios

Let X1, X2, . . . be i.i.d. random variables and let f0(x) and f1(x) be two different probability density functions. For simplicity assume that f0(x) > 0, x ∈ R1. Let us define the so-called likelihood ratios,

Sn = [f1(X1)f1(X2) · · · f1(Xn)] / [f0(X1)f0(X2) · · · f0(Xn)], n = 1, 2, . . . , S0 = 1.

Let also F = F0, F1, . . . be the filtration where Fn = σ(X1, . . . , Xn), n = 0, 1, . . . , F0 = {∅, Ω}.

(1) ~S = S0, S1, . . . is a F-adapted sequence of random variables since Sn is a non-random Borel function of the random variables X1, . . . , Xn.


(2) Due to independence of the random variables Xn, we get

E(Sn+1/Fn) = E(Sn f1(Xn+1)/f0(Xn+1) /Fn) = Sn E[f1(Xn+1)/f0(Xn+1)], n = 0, 1, . . . .

(3) Thus, the sequence ~S = S0, S1, . . . is a F-martingale under the hypothesis that the common probability density function of the random variables Xn is f0(x). Indeed,

E[f1(Xn+1)/f0(Xn+1)] = ∫_{−∞}^{∞} (f1(x)/f0(x)) f0(x) dx = ∫_{−∞}^{∞} f1(x) dx = 1, n = 0, 1, . . ..
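The following sketch illustrates the likelihood ratio martingale under H0 with the illustrative (not prescribed above) choice f0 = N(0, 1) and f1 = N(0.5, 1), for which f1(x)/f0(x) = exp(0.5x − 0.125); the simulated mean ESn stays close to 1.

import numpy as np

# Informal check with illustrative densities: under H0 the data X_k ~ f0 = N(0,1),
# f1 = N(0.5,1), and the likelihood-ratio process S_n satisfies ES_n = 1.
rng = np.random.default_rng(1)
n, paths = 20, 200_000
X = rng.standard_normal((paths, n))               # samples from f0
S = np.cumprod(np.exp(0.5 * X - 0.125), axis=1)   # S_n = prod_k f1(X_k)/f0(X_k)
print(S.mean(axis=0)[[0, 9, 19]])                 # each close to 1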

2. Square Integrable Martingales

2.1 Doob’s martingale

Definition 13.4. Let S be a random variable such that E|S| < ∞ and F = F0, F1, . . . be a filtration. In this case the following sequence is a F-martingale (Doob's martingale):

Sn = E(S/Fn), n = 0, 1, . . . .

——————————-
E(Sn+1/Fn) = E(E(S/Fn+1)/Fn) = E(S/Fn) = Sn, n = 0, 1, . . ..
——————————-

2.2 Doob decomposition

Definition 13.5. A sequence of random variables ~A = An, n = 0, 1, . . . is predictable with respect to a filtration F = F0, F1, . . . (a F-predictable sequence) if A0 = 0 and the random variable An is Fn−1-measurable for every n = 1, 2, . . ..

Theorem 13.1 (Doob decomposition). Let ~S = S0, S1, . . . be a sequence of random variables adapted to a filtration F = F0, F1, . . . and such that E|Sn| < ∞, n = 0, 1, . . .. Then ~S can be uniquely decomposed into the sum of two sequences, ~M = M0, M1, . . ., which is a F-martingale, and ~A = A0, A1, . . ., which is a F-predictable sequence,

Sn = Mn + An, n = 0, 1, . . . .

——————————-
(a) Define M0 = S0, A0 = 0 and

Mn = M0 + ∑_{k=1}^n (Sk − E(Sk/Fk−1)), An = Sn − Mn, n = 1, 2, . . . .

(b) Mn is Fn-measurable since Sk, k = 0, 1, . . . , n and E(Sk/Fk−1), k = 1, . . . , n are Fn-measurable;

(c) E(Mn+1/Fn) = M0 + ∑_{k=1}^{n+1} E((Sk − E(Sk/Fk−1))/Fn) = M0 + ∑_{k=1}^n E((Sk − E(Sk/Fk−1))/Fn) + E((Sn+1 − E(Sn+1/Fn))/Fn) = Mn + 0 = Mn;

(d) An = Sn − S0 − ∑_{k=1}^n (Sk − E(Sk/Fk−1)) = ∑_{k=1}^n E(Sk/Fk−1) − ∑_{k=0}^{n−1} Sk is a Fn−1-measurable random variable;

(e) Note also that An+1 − An = E(Sn+1/Fn) − Sn, n = 0, 1, . . ..

(f) Suppose that Sn = M′n + A′n, n = 0, 1, . . . is another decomposition into the sum of a F-martingale and a F-predictable sequence. Then

A′n+1 − A′n = E(A′n+1 − A′n/Fn) = E((Sn+1 − Sn) − (M′n+1 − M′n)/Fn) = E(Sn+1/Fn) − Sn − (M′n − M′n) = E(Sn+1/Fn) − Sn = An+1 − An, n = 0, 1, . . ..

(g) Since A0 = A′0 = 0, we get, using (f),

An = A0 + ∑_{k=0}^{n−1} (Ak+1 − Ak) = A′0 + ∑_{k=0}^{n−1} (A′k+1 − A′k) = A′n, n = 0, 1, . . ..

(h) M′n = Sn − A′n = Sn − An = Mn, n = 0, 1, . . ..

——————————-

(1) If ~S = S0, S1, . . . is a submartingale, then ~A = A0, A1, . . . is an a.s. nondecreasing sequence, i.e.,

P(0 = A0 ≤ A1 ≤ A2 ≤ · · ·) = 1.

Indeed, according to (e), An+1 − An = E(Sn+1/Fn) − Sn ≥ 0, n = 0, 1, . . ..

2.3 Doob decomposition for square integrable martingales

Let < Ω, F, P > be a probability space;

F = F0, F1, . . . is a finite or infinite sequence of σ-algebras such that F0, F1, . . . ⊆ F;

~S = S0, S1, . . . is a F-adapted sequence of random variables defined on this probability space.

Lemma 13.1. Let a sequence ~S be a F-martingale such that E|Sn|² < ∞, n = 0, 1, . . .. Then the sequence ~S² = S0², S1², . . . is a submartingale.

——————————-
E(Sn+1²/Fn) = E((Sn + (Sn+1 − Sn))²/Fn)
= E(Sn² + 2Sn(Sn+1 − Sn) + (Sn+1 − Sn)²/Fn)
= E(Sn²/Fn) + 2SnE(Sn+1 − Sn/Fn) + E((Sn+1 − Sn)²/Fn)
= Sn² + E((Sn+1 − Sn)²/Fn) ≥ Sn², n = 0, 1, . . . .

——————————-

(1) According to the Doob decomposition theorem, the sequence ~S² can be uniquely decomposed into the sum of a F-martingale ~M = M0, M1, . . . and a F-predictable sequence ~A = A0, A1, . . .,

Sn² = Mn + An, n = 0, 1, . . . ,

where

Mn = S0² + ∑_{k=1}^n (Sk² − E(Sk²/Fk−1)), An = Sn² − Mn.

(2) An = ∑_{k=1}^n E((Sk − Sk−1)²/Fk−1), n = 0, 1, . . ..

——————————-
An = Sn² − S0² − ∑_{k=1}^n (Sk² − E(Sk²/Fk−1))
= ∑_{k=1}^n (E(Sk²/Fk−1) − Sk−1²)
= ∑_{k=1}^n (E((Sk−1² + 2Sk−1(Sk − Sk−1) + (Sk − Sk−1)²)/Fk−1) − Sk−1²)
= ∑_{k=1}^n E((Sk − Sk−1)²/Fk−1).

——————————-

(3) The F-predictable sequence ~A is a.s. nonnegative and nondecreasing in this case, i.e., P(0 = A0 ≤ A1 ≤ · · ·) = 1.

(4) ESn² = EMn + EAn = ES0² + EAn, n = 0, 1, . . ..

(5) The sequence ES0², ES1², . . . is nondecreasing, i.e., ESn² ≤ ESn+1², n = 0, 1, . . . .

——————————-
(a) EMn = EM0 = ES0², n = 0, 1, . . .;

(b) 0 = EA0 ≤ EA1 ≤ EA2 ≤ · · ·;

(c) ESn² = EMn + EAn = ES0² + EAn ≤ ES0² + EAn+1 = ESn+1², n = 0, 1, . . ..

——————————-

Definition 13.6. In this case, the notation 〈~S〉 = 〈S〉0, 〈S〉1, . . . is used for the F-predictable sequence ~A, and it is called the quadratic characteristic of the martingale ~S, i.e.,

〈S〉n = ∑_{k=1}^n E((Sk − Sk−1)²/Fk−1), n = 0, 1, . . . .

(6) Since 〈~S〉 is a nondecreasing sequence, there exists with probability 1 a finite or infinite limit,

〈S〉∞ = lim_{n→∞} 〈S〉n.

Example

If the martingale ~S is the sum of independent random variables X1, X2, . . ., i.e., Sn = X1 + · · · + Xn, n = 0, 1, . . ., such that E|Xk|² < ∞, EXk = 0, k = 1, 2, . . ., then the quadratic characteristic

〈S〉n = EX1² + · · · + EXn² = VarX1 + · · · + VarXn, n = 0, 1, . . . .

The quadratic characteristic is a non-random sequence.

2.4 Doob-Kolmogorov inequality

Theorem 13.2 (Doob-Kolmogorov inequality). If ~S = S0, S1, . . . is a square integrable F-martingale, then

P(max_{0≤k≤n} |Sk| ≥ ε) ≤ (1/ε²) ESn², ε > 0, n = 0, 1, . . . .

——————————-
(a) Ai = {|Sj| < ε, j = 0, . . . , i − 1, |Si| ≥ ε}, i = 0, 1, . . . , n;

(b) ∪_{i=0}^n Ai = {max_{0≤k≤n} |Sk| ≥ ε};

(c) A = Ω \ (∪_{i=0}^n Ai);

(d) ESn² = ∑_{i=0}^n ESn²I(Ai) + ESn²I(A) ≥ ∑_{i=0}^n ESn²I(Ai);

(e) E(Sn − Si)SiI(Ai) = E(E((Sn − Si)SiI(Ai)/Fi)) = E(SiI(Ai)E(Sn − Si/Fi)) = 0;

(f) ESn²I(Ai) = E((Sn − Si + Si)²I(Ai)) = E((Sn − Si)²I(Ai)) + 2E((Sn − Si)SiI(Ai)) + ESi²I(Ai) ≥ ESi²I(Ai) ≥ ε²EI(Ai) = ε²P(Ai);

(g) ESn² ≥ ∑_{i=0}^n ESn²I(Ai) ≥ ∑_{i=0}^n ε²P(Ai) = ε²P(max_{0≤k≤n} |Sk| ≥ ε).
——————————-
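As an informal numerical illustration (the walk, the horizon n and the level ε below are arbitrary choices, not part of the theorem), one can compare the empirical left-hand side of the Doob-Kolmogorov inequality with the bound ESn²/ε² for a symmetric ±1 random walk.

import numpy as np

# Informal check with illustrative parameters:
# P(max_{k<=n} |S_k| >= eps) <= E S_n^2 / eps^2 for the symmetric +/-1 random walk.
rng = np.random.default_rng(2)
n, paths, eps = 100, 100_000, 25.0
S = np.cumsum(rng.choice([-1.0, 1.0], size=(paths, n)), axis=1)
lhs = (np.abs(S).max(axis=1) >= eps).mean()   # empirical P(max |S_k| >= eps)
rhs = (S[:, -1] ** 2).mean() / eps**2         # empirical E S_n^2 / eps^2
print(lhs, "<=", rhs)                         # the bound holds with room to spare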

3. Convergence of Martingales

3.1 A.s. convergence of martingales

Theorem 13.3. Let a sequence ~S be a F-martingale such that ESn² < M, n = 0, 1, . . ., where M = const < ∞. Then there exists a random variable S such that

Sn → S a.s. as n → ∞.

——————————-
(a) Since ESn² is bounded and non-decreasing in n, we can choose M = lim_{n→∞} ESn²;

(b) S(m)n = Sm+n − Sm, n = 0, 1, . . ., for m = 0, 1, . . .;

(c) F(m)n = σ(S(m)k = Sm+k − Sm, k = 0, . . . , n), n = 0, 1, . . .;

(d) F(m) = F(m)0, F(m)1, . . .;

(e) F(m)n ⊆ Fm+n, n = 0, 1, . . .;

(f) E(S(m)n+1/F(m)n) = E(E(Sm+n+1 − Sm/Fm+n)/F(m)n) = E(Sm+n − Sm/F(m)n) = E(S(m)n/F(m)n) = S(m)n, n = 0, 1, . . .;

(g) S(m)n, n = 0, 1, . . . is a F(m)-martingale;

(h) E(Sm+n − Sm)² = ESm+n² − 2ESm+nSm + ESm² = ESm+n² − ESm² − 2E(Sm+n − Sm)Sm = ESm+n² − ESm²;

(i) P(max_{m≤i≤m+n} |Si − Sm| ≥ ε) ≤ (1/ε²)(ESm+n² − ESm²), by the Doob-Kolmogorov inequality;

(j) P(max_{m≤i<∞} |Si − Sm| ≥ ε) ≤ (1/ε²)(M − ESm²);

(k) P(∩_{m=0}^∞ ∪_{i=m}^∞ {|Si − Sm| ≥ ε}) = lim_{m→∞} P(max_{m≤i<∞} |Si − Sm| ≥ ε) ≤ lim_{m→∞} (1/ε²)(M − ESm²) = 0;

(l) P(|Si − Sm| ≥ ε for infinitely many i, m) = 0, for any ε > 0;

(m) P(ω : lim_{m→∞} max_{i≥m} |Si(ω) − Sm(ω)| = 0) = 1;

(n) A non-random sequence an converges to a limit a as n → ∞ if and only if max_{i≥m} |ai − am| → 0 as m → ∞;

(o) P(ω : lim_{n→∞} Sn(ω) = S(ω)) = 1.

——————————-

Theorem 13.4**. Let a sequence ~S be a F-martingale such that E|Sn| < M, n = 0, 1, . . ., where M = const < ∞. Then there exists a random variable S such that

Sn → S a.s. as n → ∞.

Example

As is known, the series 1 + 1/2 + 1/3 + · · · diverges, while the alternating series 1 − 1/2 + 1/3 − · · · converges. Let X1, X2, . . . be i.i.d. random variables taking values +1 and −1 with equal probabilities. Let also Sn = ∑_{i=1}^n Xi/i, n = 1, 2, . . .. This sequence is a martingale (with respect to the natural filtration generated by this sequence). Now, ESn² = VarSn = ∑_{i=1}^n 1/i² ≤ ∑_{i=1}^∞ 1/i² < ∞. Thus, there exists a random variable S such that Sn → S a.s. as n → ∞, i.e., the random harmonic series ∑_{i=1}^∞ Xi/i converges with probability 1.
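The a.s. convergence of the random harmonic series can also be seen numerically; the sketch below is an informal illustration that prints two distant partial sums along a few sample paths, which nearly coincide.

import numpy as np

# Informal illustration: partial sums of the random harmonic series
# sum_{i<=n} X_i / i with X_i = +/-1 stabilize along each sample path.
rng = np.random.default_rng(3)
n = 100_000
for _ in range(3):
    X = rng.choice([-1.0, 1.0], size=n)
    S = np.cumsum(X / np.arange(1, n + 1))
    print(S[999], S[99_999])    # the two partial sums are already very close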

3.2 Law of large numbers for martingales

Theorem 13.5. Let a sequence ~S be a F-martingale such that ESn² < ∞, n = 0, 1, . . . and 〈S〉n → ∞ a.s. Then, for any non-decreasing function f(x) such that f(x) ≥ 1 and ∫_0^∞ f(x)^{−2} dx < ∞,

Sn/f(〈S〉n) → 0 a.s. as n → ∞.

——————————-
(a) Yn = ∑_{i=1}^n (Si − Si−1)/f(〈S〉i), n = 1, 2, . . . , Y0 = 0;

(b) E(Yn+1 − Yn/Fn) = E((Sn+1 − Sn)/f(〈S〉n+1) /Fn) = (1/f(〈S〉n+1))E(Sn+1/Fn) − E(Sn/f(〈S〉n+1) /Fn) = Sn/f(〈S〉n+1) − Sn/f(〈S〉n+1) = 0, n = 0, 1, . . .;

(c) EYn = ∑_{i=1}^n E[(Si − Si−1)/f(〈S〉i)] = ∑_{i=1}^n E(E((Si − Si−1)/f(〈S〉i) /Fi−1)) = 0, n ≥ 1;

(d) 〈Y〉n+1 − 〈Y〉n = E((Yn+1 − Yn)²/Fn) = E((Sn+1 − Sn)²/f(〈S〉n+1)² /Fn) = (1/f(〈S〉n+1)²)E((Sn+1 − Sn)²/Fn) = (〈S〉n+1 − 〈S〉n)/f(〈S〉n+1)²;

(e) 〈Y〉n = ∑_{k=0}^{n−1} (〈S〉k+1 − 〈S〉k)/f(〈S〉k+1)² ≤ ∑_{k=0}^{n−1} ∫_{〈S〉k}^{〈S〉k+1} f(x)^{−2} dx ≤ ∫_0^∞ f(x)^{−2} dx = M a.s., for n = 1, 2, . . .;

(f) EYn² = EY0² + E〈Y〉n ≤ M, n = 0, 1, . . .;

(g) By Theorem 13.3, Yn = ∑_{i=1}^n (Si − Si−1)/f(〈S〉i) → Y a.s. as n → ∞, for some random variable Y;

(h) Lemma (Kronecker). If an → ∞ as n → ∞ and ∑_{k=1}^n xk/ak → c as n → ∞, where |c| < ∞, then (1/an) ∑_{k=1}^n xk → 0 as n → ∞;

(i) (1/f(〈S〉n)) ∑_{i=1}^n (Si − Si−1) = (Sn − S0)/f(〈S〉n) → 0 a.s. as n → ∞, by the Kronecker lemma applied with an = f(〈S〉n) and xi = Si − Si−1;

(j) Sn/f(〈S〉n) → 0 a.s. as n → ∞.
——————————-

Example

Let X1, X2, . . . be independent random variables such that σn² = VarXn < ∞, EXn = µn, n = 1, 2, . . .. Denote bn = ∑_{k=1}^n σk², an = ∑_{k=1}^n µk. Assume that bn → ∞ as n → ∞. Let also

Sn = ∑_{k=1}^n (Xk − µk), n = 1, 2, . . . , S0 = 0.

This sequence is a martingale (with respect to the natural filtration generated by this sequence) and

ESn = 0, 〈S〉n = bn, n = 0, 1, . . . .

Choose f(x) = max(x, 1). This function is non-decreasing and ∫_0^∞ f(x)^{−2} dx = 1 + ∫_1^∞ x^{−2} dx < ∞.

(1) By Theorem 13.5,

Sn/bn = (f(bn)/bn) · (Sn/f(bn)) → 1 · 0 = 0 a.s. as n → ∞.

(2) If an/bn → c, then, also,

(1/bn) ∑_{k=1}^n Xk → c a.s. as n → ∞.

(3) If σn² = σ², EXn = µ, n = 1, 2, . . ., then bn = nσ², an = µn, n = 1, 2, . . ., c = µ/σ² and

(1/(nσ²)) ∑_{k=1}^n Xk → c a.s. as n → ∞.
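A small simulation sketch of conclusion (1) (the means µk and variances σk² below are arbitrary illustrative choices): the ratio Sn/bn visibly decays towards 0 along a sample path.

import numpy as np

# Informal check with illustrative mu_k, sigma_k: S_n / b_n -> 0 a.s. for sums
# of independent, non-identically distributed variables, where b_n = <S>_n.
rng = np.random.default_rng(4)
n = 200_000
k = np.arange(1, n + 1)
mu, sigma = np.sin(k), 1.0 + (k % 3)          # EX_k and standard deviation of X_k
X = mu + sigma * rng.standard_normal(n)
b = np.cumsum(sigma ** 2)                     # b_n = <S>_n
S = np.cumsum(X - mu)                         # the martingale S_n
print((S / b)[[999, 19_999, 199_999]])        # decreasing towards 0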

3.3 Central limit theorem for martingales

Theorem 13.6**. Let a sequence ~S be a F-martingale such that ESn² < ∞, n = 0, 1, . . . and the following conditions hold:

(1) 〈S〉n/n → σ² > 0 in probability as n → ∞;

(2) (1/n) ∑_{i=1}^n E((Si − Si−1)² I(|Si − Si−1| ≥ ε√n)) → 0 as n → ∞, for every ε > 0.

Then,

Sn/(σ√n) → S in distribution as n → ∞,

where S is a standard normal random variable with mean 0 and variance 1.
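For the symmetric ±1 random walk, 〈S〉n = n, so σ² = 1 and condition (2) holds, and Theorem 13.6 reduces to the classical CLT; the sketch below is an informal check with arbitrary sample sizes that Sn/√n is approximately standard normal.

import numpy as np

# Informal check: for the +/-1 symmetric random walk, S_n / sqrt(n) is
# approximately N(0, 1) for large n.
rng = np.random.default_rng(5)
n, paths = 10_000, 1_000_000
S_n = 2.0 * rng.binomial(n, 0.5, size=paths) - n   # sum of n steps of +/-1
Z = S_n / np.sqrt(n)
print(Z.mean(), Z.var())       # close to 0 and 1
print((Z <= 1.96).mean())      # close to Phi(1.96) ~ 0.975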

4. Stopping times


4.1 Definition and basic properties of stopping times

Let < Ω, F, P > be a probability space;

F = F0, F1, . . . is a finite or infinite sequence of σ-algebras such that F0, F1, . . . ⊆ F;

T = T(ω) is a random variable defined on the probability space < Ω, F, P > and taking values in the set {0, 1, . . . , +∞};

~S = S0, S1, . . . is a F-adapted sequence of random variables defined on the probability space < Ω, F, P >.

Definition 13.7. The random variable T is called a stopping time for the filtration F if

{T = n} ∈ Fn, n = 0, 1, . . . .

(1) An equivalent condition defining a stopping time T is to require that {T ≤ n} ∈ Fn, n = 0, 1, . . ., or that {T > n} ∈ Fn, n = 0, 1, . . ..

——————————-
(a) Events {T = k} ∈ Fk ⊆ Fn, k = 0, 1, . . . , n; thus, the event {T ≤ n} = ∪_{k=0}^n {T = k} ∈ Fn and, consequently, {T > n} ∈ Fn;

(b) Events {T > n − 1} ∈ Fn−1 ⊆ Fn; thus, {T = n} = {T > n − 1} \ {T > n} ∈ Fn.
——————————-

(2) Let H0, H1, . . . be a sequence of Borel subsets of the real line. The hitting time T = min(n ≥ 0 : Sn ∈ Hn) is an example of a stopping time.

——————————-
{T = n} = {S0 ∉ H0, . . . , Sn−1 ∉ Hn−1, Sn ∈ Hn} ∈ Fn, n = 0, 1, . . ..
——————————-

(3) If T′ and T′′ are stopping times for a filtration F, then T′ + T′′, max(T′, T′′), min(T′, T′′) are stopping times.

——————————-
(a) {T = T′ + T′′ = n} = ∪_{k=0}^n {T′ = k, T′′ = n − k} ∈ Fn;

(b) {T = max(T′, T′′) ≤ n} = {T′ ≤ n, T′′ ≤ n} ∈ Fn;

(c) {T = min(T′, T′′) > n} = {T′ > n, T′′ > n} ∈ Fn.

——————————-


4.2 Stopped martingales

Theorem 13.7. If ~S = Sn, n = 0, 1, . . . is a F-martingale and T is a stopping time for the filtration F, then the sequence ~S′ = S′n = ST∧n, n = 0, 1, . . . is also a F-martingale.

——————————-
(a) ST∧n = ∑_{k=0}^{n−1} SkI(T = k) + SnI(T > n − 1);

(b) E(S′n+1/Fn) = E(ST∧(n+1)/Fn) = ∑_{k=0}^n E(SkI(T = k)/Fn) + E(Sn+1I(T > n)/Fn) = ∑_{k=0}^n I(T = k)Sk + I(T > n)E(Sn+1/Fn) = ∑_{k=0}^{n−1} I(T = k)Sk + I(T = n)Sn + I(T > n)Sn = ∑_{k=0}^{n−1} I(T = k)Sk + SnI(T > n − 1) = ST∧n = S′n.
——————————-

(1) ES′n = ES0, n = 0, 1, . . ..

——————————-
ES′n = ES′0 = EST∧0 = ES0.
——————————-

Theorem 13.8. If ~S = Sn, n = 0, 1, . . . is a F-martingale and T is a stopping time for the filtration F such that P(T ≤ N) = 1 for some integer constant N ≥ 0, then

EST = ES0.

——————————-
(a) ST = ST∧N = S′N a.s.;

(b) EST = ES′N = ES0.
——————————-

4.3 Optional stopping theorems

Theorem 13.9. If ~S = Sn, n = 0, 1, . . . is a F-martingale and T is a stopping time for the filtration F such that

(1) P(T < ∞) = 1;

(2) E|ST| < ∞;

(3) E(Sn/T > n)P(T > n) = ESnI(T > n) → 0 as n → ∞,

then

EST = ES0.

——————————-
(a) EST = EST I(T ≤ n) + EST I(T > n);

(b) E|ST| = ∑_{k=0}^∞ E(|ST|/T = k)P(T = k) < ∞;

(c) |EST I(T > n)| ≤ ∑_{k=n+1}^∞ E(|ST|/T = k)P(T = k) → 0 as n → ∞;

(d) EST I(T ≤ n) = EST I(T ≤ n) + ESnI(T > n) − ESnI(T > n) = EST∧n − ESnI(T > n);

(e) ESnI(T > n) → 0 as n → ∞;

(f) EST = lim_{n→∞} EST∧n = ES0.
——————————-

Example

Let us consider the so-called symmetric random walk, that is, Sn = ∑_{i=1}^n Xi, n = 0, 1, . . ., where X1, X2, . . . are i.i.d. Bernoulli random variables taking values 1 and −1 with equal probabilities. In this case, the sequence ~S = Sn, n = 0, 1, . . . is a martingale with respect to the natural filtration F generated by the random variables X1, X2, . . ..

Let a, b be strictly positive integers and T = min(n ≥ 1 : Sn = −a or Sn = b). It is a hitting time and, therefore, a stopping time for the filtration F.

(1) P(ST = −a) = b/(a + b).

(2) ET = ab.

——————————-
(a) ST can take only two values, −a or b, with probabilities Pa = P(ST = −a) and 1 − Pa respectively;

(b) Thus, E|ST| < ∞, i.e., condition (2) of Theorem 13.9 holds;

(c) A = {Xk = 1, k = 1, . . . , a + b}, Bc = {−a < c + Sk < b, k = 1, . . . , a + b}, −a < c < b;

(d) P(A) = (1/2)^{a+b} = p > 0;

(e) Bc ⊆ Ω \ A, −a < c < b;

(f) max_{−a<c<b} P(Bc) ≤ 1 − P(A) = 1 − p < 1;

(g) P(T > a + b) = P(B0) ≤ 1 − p;

(h) P(T > 2(a + b)) = ∑_{−a<c<b} P(T > a + b, Sa+b = c)P(Bc) ≤ (1 − p) ∑_{−a<c<b} P(T > a + b, Sa+b = c) = (1 − p)P(T > a + b) ≤ (1 − p)²;

(i) P(T > k(a + b)) ≤ (1 − p)^k, k = 1, 2, . . .;

(j) P(T > n) → 0 as n → ∞, i.e., P(T < ∞) = 1;

(k) Thus, condition (1) of Theorem 13.9 holds;

(l) |ESnI(T > n)| ≤ (a + b)EI(T > n) → 0 as n → ∞;

(m) Thus, condition (3) of Theorem 13.9 also holds;

(n) EST = −aPa + b(1 − Pa) = ES0 = 0;

(o) Pa = b/(a + b).

(p) Consider the random sequence Vn = Sn² − n, n = 0, 1, . . .. The non-random sequence n is the quadratic characteristic of the submartingale Sn², and Vn is a martingale with respect to the natural filtration F generated by the random variables X1, X2, . . .;

(q) ET < ∞, which follows from (i);

(r) |VT| ≤ |ST|² + T ≤ max(a, b)² + T;

(s) E|VT| ≤ max(a, b)² + ET < ∞;

(t) |EVT I(T > n)| ≤ (max(a, b)² + n)P(T > n) → 0 as n → ∞;

(u) By Theorem 13.9, EVT = a²Pa + b²(1 − Pa) − ET = EV0 = 0;

(v) ET = a²Pa + b²(1 − Pa) = ab.

——————————-
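A Monte Carlo sketch of conclusions (1) and (2) (the values a = 3, b = 5 and the number of simulated paths are arbitrary illustrative choices):

import numpy as np

# Informal check with illustrative a, b: P(S_T = -a) = b/(a+b) and ET = ab
# for the symmetric random walk stopped at -a or b.
rng = np.random.default_rng(6)
a, b, paths = 3, 5, 50_000
hit_minus_a, total_T = 0, 0
for _ in range(paths):
    S, T = 0, 0
    while -a < S < b:
        S += 1 if rng.random() < 0.5 else -1
        T += 1
    hit_minus_a += (S == -a)
    total_T += T
print(hit_minus_a / paths, b / (a + b))   # both close to 0.625
print(total_T / paths, a * b)             # both close to 15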

Theorem 13.10. If ~S = Sn, n = 0, 1, . . . is a F-martingale and T is a stopping time for the filtration F such that

(1) ET < ∞;

(2) E(|Sn+1 − Sn|/Fn) ≤ K < ∞, n = 0, 1, . . .,

then

EST = ES0.

——————————-
(a) Z0 = |S0|, Zn = |Sn − Sn−1|, n = 1, 2, . . .;

(b) W = Z0 + · · · + ZT;

(c) |ST| ≤ W;

(d) EW = ∑_{n=0}^∞ ∑_{k=0}^n EZkI(T = n) = ∑_{k=0}^∞ ∑_{n=k}^∞ EZkI(T = n) = ∑_{k=0}^∞ EZkI(T ≥ k) = E|S0| + ∑_{k=1}^∞ EZkI(T ≥ k);

(e) I(T ≥ k) is Fk−1-measurable, for k = 1, 2, . . .;

(f) EZkI(T ≥ k) = E(E(ZkI(T ≥ k)/Fk−1)) = E(I(T ≥ k)E(Zk/Fk−1)) ≤ KP(T ≥ k);

(g) EW ≤ E|S0| + ∑_{k=1}^∞ KP(T ≥ k) = E|S0| + KET < ∞;

(h) |EST∧n − EST| ≤ E(|ST∧n − ST|I(T > n)) ≤ 2E(WI(T > n));

(i) ∑_{k=0}^∞ EWI(T = k) = EW < ∞;

(j) EWI(T > n) = ∑_{k>n} EWI(T = k) → 0 as n → ∞;

(k) EST∧n = ES0;

(l) It follows from (h) - (k) that EST = ES0.
——————————-

4.4. Wald equation

Theorem 13.11 (Wald Equation). Let X1, X2, . . . be i.i.d. random variables such that E|X1| < ∞, EX1 = µ. Let also Yn = ∑_{k=1}^n Xk, n = 1, 2, . . .. Then the centered sequence Sn = Yn − µn, n = 1, 2, . . . , S0 = 0 is a martingale with respect to the filtration F generated by the random variables X1, X2, . . .. Let also T be a stopping time with respect to the filtration F such that ET < ∞. Then,

EYT = µET.

——————————-
(a) E(|Sn+1 − Sn|/Fn) = E|Xn+1 − µ| = E|X1 − µ| ≤ E|X1| + |µ| < ∞, n = 0, 1, . . ., so condition (2) of Theorem 13.10 holds;

(b) ST = YT − µT;

(c) Theorem 13.10 implies that EST = ES0 = 0;

(d) EYT − µET = EST = 0.
——————————-
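A short simulation sketch of the Wald equation (the exponential step distribution with µ = 2 and the level u = 10 are arbitrary illustrative choices; T is the first-passage time of Yn over the level):

import numpy as np

# Informal check with an illustrative distribution and level: E Y_T = mu * E T
# for T = min(n : Y_n >= level) and exponential steps with mean mu.
rng = np.random.default_rng(7)
mu, level, paths = 2.0, 10.0, 50_000
Y_T, T = np.zeros(paths), np.zeros(paths)
for j in range(paths):
    y, n = 0.0, 0
    while y < level:
        y += rng.exponential(mu)
        n += 1
    Y_T[j], T[j] = y, n
print(Y_T.mean(), mu * T.mean())    # the two values nearly coincide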

4.5. A fair game example

A gambler flips a fair coin and wins his bet if it comes up heads and loses his bet if it comes up tails. The so-called "martingale betting strategy" can be viewed as a way to win in this fair game: keep doubling the bet until the gambler eventually wins.

(1) Let Xn, n = 1, 2, . . . be independent random variables taking values 2^{n−1} and −2^{n−1} with equal probabilities. These random variables represent the outcomes of the successive flips of the coin under this strategy, and Sn = X1 + · · · + Xn, n = 1, 2, . . . , S0 = 0 is the gain of the gambler after n flips. Since EXn = 0, n = 1, 2, . . ., the sequence ~S is a martingale with respect to the natural filtration generated by the random variables X1, X2, . . ..

(2) Let T be the number of the flip on which the gambler wins for the first time. By definition, P(T = n) = (1/2)^n, n = 1, 2, . . .. Then, according to the definition of T and the description of the "martingale betting strategy",

ST = 2^{T−1} − (1 + 2 + · · · + 2^{T−2}) = 1.

(3) This seems to contradict the optional stopping theorem, since T is a stopping time with ET < ∞, so the equality EST = ES0 = 0 might be expected, while according to (2), EST = 1.

(4) The explanation of this phenomenon is that the conditions of the optional stopping Theorems 13.8 – 13.10 do not hold.

——————————-
(a) Theorem 13.8 cannot be applied, since T is not a bounded random variable;

(b) Theorem 13.9 cannot be applied, since E(SnI(T > n)) = −(1 + 2 + · · · + 2^{n−1}) · (1/2)^n = −(2^n − 1)/2^n → −1 ≠ 0 as n → ∞;

(c) Theorem 13.10 cannot be applied, since E(|Sn+1 − Sn|/Fn) = E|Xn+1| = 2^n → ∞ as n → ∞, so it is not bounded by a constant K.

——————————-
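The following informal simulation of the doubling strategy (an illustrative sketch with arbitrary sample sizes) shows both sides of the paradox: the final gain ST equals 1 on every path, while the loss carried before the winning flip has infinite expectation, which is exactly why the optional stopping theorems fail here.

import numpy as np

# Informal illustration: the doubling strategy always ends with net gain 1,
# but the loss accumulated before the final win has infinite expectation.
rng = np.random.default_rng(8)
gains, pre_win_losses = [], []
for _ in range(100_000):
    n, loss = 0, 0
    while True:
        n += 1
        bet = 2 ** (n - 1)
        if rng.random() < 0.5:              # heads: the gambler wins this bet
            gains.append(bet - loss)        # net gain S_T
            pre_win_losses.append(loss)     # money lost before the winning flip
            break
        loss += bet                         # tails: lose the bet and double it
print(set(gains))                           # always {1}
print(np.mean(pre_win_losses))              # sample mean keeps growing with more paths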

LN Problems

1. Please characterize a sequence of random variables ~S = S0, S1, . . . which is a martingale with respect to the filtration F = F0, F1, . . . for the case where F0 = F1 = F2 = · · ·, in particular, if F0 = {∅, Ω}.

2. Let X1, X2, . . . be i.i.d. random variables, EX1 = 0, VarX1 = σ² < ∞. Let also S0 = 0 and Sn = (∑_{k=1}^n Xk)² − nσ². Prove that the sequence ~S = S0, S1, . . . is a martingale with respect to the filtration F = F0, F1, . . ., where Fn = σ(X1, . . . , Xn), n = 0, 1, . . . , F0 = {∅, Ω}.

3. Suppose that a sequence ~S = S0, S1, . . . is a martingale and also a predictable sequence with respect to a filtration F = F0, F1, . . .. Show that, in this case, P(Sn = S0) = 1, n = 0, 1, . . ..

4. Suppose that a sequence ~S = S0, S1, . . . is a square integrable martingale with respect to a filtration F = F0, F1, . . .. Show that the sequence ~S′ = S′0 = S0², S′1 = S1², . . . is a submartingale with respect to the filtration F.

5. Suppose that sequences ~S′ = S′0, S′1, . . . and ~S′′ = S′′0, S′′1, . . . are submartingales with respect to a filtration F = F0, F1, . . .. Show that the sequence ~S = S0 = S′0 ∨ S′′0, S1 = S′1 ∨ S′′1, . . . is also a submartingale with respect to the filtration F.

6. Let F = F0, F1, . . . be a filtration and let F be the minimal σ-algebra which contains all the σ-algebras Fn, n = 0, 1, . . .. Let also S be a random variable such that P(S ≥ 0) = 1 and ES < ∞. Define Sn = E(S/Fn), n = 0, 1, . . .. Prove that

Sn → S a.s. as n → ∞.

7. Let F = F0, F1, . . . be a filtration and let F be the minimal σ-algebra which contains all the σ-algebras Fn, n = 0, 1, . . .. Let also S be a random variable such that E|S| < ∞. Prove that

E(S/Fn) → E(S/F) a.s. as n → ∞.

8. Let X1, X2, . . . be a sequence of independent random variables such that EXn² = bn < ∞, EXn = an ≠ 0, n = 1, 2, . . .. Define the random variables Sn = ∏_{k=1}^n Xk/ak, n = 1, 2, . . . , S0 = 1. Prove that the sequence ~S = S0, S1, . . . is a martingale with respect to the filtration generated by the random variables X1, X2, . . ., and prove that the condition ∏_{k=1}^∞ bk/ak² < ∞ implies that there exists a random variable S such that

Sn → S a.s. as n → ∞.

9. Let X1, X2, . . . be a sequence of independent random variables such that EXn² = bn < ∞, EXn = 0, n = 1, 2, . . .. Define Sn = ∑_{k=1}^n Xk, n = 1, 2, . . . , S0 = 0. Prove that the sequence ~S = S0, S1, . . . is a martingale with respect to the filtration generated by the random variables X1, X2, . . ., and that the condition ∑_{k=1}^∞ bk < ∞ implies that there exists a random variable S such that

Sn → S a.s. as n → ∞.

10. Please re-formulate Theorem 13.10 for the case where the random variables Sn = X1 + · · · + Xn, n = 1, 2, . . . are sums of independent random variables and F is the natural filtration generated by the random variables X1, X2, . . ..

11. Let X1, X2, . . . be non-negative i.i.d. random variables such that EX1 = µ > 0. Let also Yn = ∑_{k=1}^n Xk, n = 1, 2, . . . and Tu = min(n ≥ 1 : Yn ≥ u), u > 0. Prove that Tu is a stopping time such that

ETu < ∞, u > 0.

12. Let X1, X2, . . . be non-negative i.i.d. random variables such that EX1 = µ > 0. Let also Yn = ∑_{k=1}^n Xk, n = 1, 2, . . . and Tu = min(n ≥ 1 : Yn ≥ u), u > 0. Prove that

EYTu = µETu, u > 0.
