18.600 F2019 Lecture 26: Moment generating functions · 2020-07-09
TRANSCRIPT
18.600: Lecture 26
Moment generating functions and characteristic functions
Scott Sheffield
MIT
Outline
Moment generating functions
Characteristic functions
Continuity theorems and perspective
Moment generating functions

- Let X be a random variable.
- The moment generating function of X is defined by M(t) = M_X(t) := E[e^{tX}].
- When X is discrete, we can write M(t) = ∑_x e^{tx} p_X(x). So M(t) is a weighted average of countably many exponential functions.
- When X is continuous, we can write M(t) = ∫_{−∞}^{∞} e^{tx} f(x) dx. So M(t) is a weighted average of a continuum of exponential functions.
- We always have M(0) = 1.
- If b > 0 and t > 0, then E[e^{tX}] ≥ E[e^{t·min{X,b}}] ≥ P{X ≥ b}·e^{tb}.
- If X takes both positive and negative values with positive probability, then M(t) grows at least exponentially fast in |t| as |t| → ∞.
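The weighted-sum formula can be sanity-checked numerically. This is a minimal sketch, not from the lecture; the fair-die distribution is an illustrative choice. It confirms M(0) = 1 and that M(t) is just an average of exponentials.

```python
import math

# Hypothetical helper: evaluate M(t) = E[exp(tX)] for a discrete X
# as the weighted sum of exponentials described above.
def mgf_discrete(t, support, pmf):
    return sum(math.exp(t * x) * pmf[x] for x in support)

# Illustrative example: X uniform on a fair six-sided die.
die = [1, 2, 3, 4, 5, 6]
pmf = {x: 1 / 6 for x in die}

print(mgf_discrete(0.0, die, pmf))  # M(0) is always 1 (up to rounding)
```

The tail bound on the slide can be spot-checked the same way: with b = 5 and t = 1, E[e^X] should dominate P{X ≥ 5}·e^5.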
Moment generating functions actually generate moments

- Let X be a random variable and M(t) = E[e^{tX}].
- Then M′(t) = (d/dt) E[e^{tX}] = E[(d/dt) e^{tX}] = E[X e^{tX}].
- In particular, M′(0) = E[X].
- Also M″(t) = (d/dt) M′(t) = (d/dt) E[X e^{tX}] = E[X² e^{tX}].
- So M″(0) = E[X²]. The same argument gives that the nth derivative of M at zero is E[X^n].
- Interesting: knowing all of the derivatives of M at a single point tells you the moments E[X^k] for all integers k ≥ 0.
- Another way to think of this: write e^{tX} = 1 + tX + t²X²/2! + t³X³/3! + ….
- Taking expectations gives E[e^{tX}] = 1 + t·m_1 + t²·m_2/2! + t³·m_3/3! + …, where m_k is the kth moment. The kth derivative at zero is m_k.
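The derivative facts can be checked with finite differences. A sketch under assumed choices (fair-die distribution, step size h = 1e-4, neither from the slides): M′(0) should come out near E[X] = 3.5 and M″(0) near E[X²] = 91/6 ≈ 15.1667.

```python
import math

# Assumed setup: M(t) for X uniform on {1, ..., 6}.
def mgf(t):
    return sum(math.exp(t * x) for x in range(1, 7)) / 6

h = 1e-4
m1 = (mgf(h) - mgf(-h)) / (2 * h)              # central difference ~ M'(0) = E[X]
m2 = (mgf(h) - 2 * mgf(0.0) + mgf(-h)) / h**2  # second difference ~ M''(0) = E[X^2]

print(m1, m2)
```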
Moment generating functions for independent sums

- Let X and Y be independent random variables and Z = X + Y.
- Write the moment generating functions as M_X(t) = E[e^{tX}], M_Y(t) = E[e^{tY}], and M_Z(t) = E[e^{tZ}].
- If you knew M_X and M_Y, could you compute M_Z?
- By independence, M_Z(t) = E[e^{t(X+Y)}] = E[e^{tX} e^{tY}] = E[e^{tX}]·E[e^{tY}] = M_X(t)·M_Y(t) for all t.
- In other words, adding independent random variables corresponds to multiplying moment generating functions.
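The product rule can be verified exactly for small discrete variables. A sketch with assumed toy distributions (two independent Bernoullis, parameters chosen for illustration): build the pmf of Z = X + Y by convolution and compare M_Z with M_X·M_Y.

```python
import math

# Assumed toy distributions: X ~ Bernoulli(0.3), Y ~ Bernoulli(0.8), independent.
pX = {0: 0.7, 1: 0.3}
pY = {0: 0.2, 1: 0.8}

# pmf of Z = X + Y by convolution; independence lets us multiply probabilities.
pZ = {}
for x, px in pX.items():
    for y, py in pY.items():
        pZ[x + y] = pZ.get(x + y, 0.0) + px * py

mgf = lambda pmf, t: sum(math.exp(t * v) * p for v, p in pmf.items())

# M_Z(t) should equal M_X(t) * M_Y(t) at every t.
for t in (-1.0, 0.0, 0.5, 2.0):
    assert abs(mgf(pZ, t) - mgf(pX, t) * mgf(pY, t)) < 1e-12
```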
Moment generating functions for sums of i.i.d. random variables

- We showed that if Z = X + Y with X and Y independent, then M_Z(t) = M_X(t)·M_Y(t).
- If X_1, …, X_n are i.i.d. copies of X and Z = X_1 + … + X_n, then what is M_Z?
- Answer: M_X^n. This follows by repeatedly applying the formula above.
- This is a big reason for studying moment generating functions: it helps us understand what happens when we sum up many independent copies of the same random variable.
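One concrete check of the M_X^n rule, with illustrative parameters: a Binomial(n, p) variable is a sum of n i.i.d. Bernoulli(p) variables, so its MGF should be the Bernoulli MGF raised to the nth power.

```python
import math

n, p = 10, 0.3  # illustrative choices

# Bernoulli(p) MGF and the Binomial(n, p) MGF computed directly from its pmf.
mgf_bernoulli = lambda t: p * math.exp(t) + (1 - p)
mgf_binomial = lambda t: sum(
    math.comb(n, k) * p**k * (1 - p) ** (n - k) * math.exp(t * k)
    for k in range(n + 1)
)

# The direct sum should match the nth power of the Bernoulli MGF.
for t in (-1.0, 0.0, 0.7):
    assert abs(mgf_binomial(t) - mgf_bernoulli(t) ** n) < 1e-9
```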
Other observations

- If Z = aX, can I use M_X to determine M_Z?
- Answer: yes. M_Z(t) = E[e^{tZ}] = E[e^{taX}] = M_X(at).
- If Z = X + b, can I use M_X to determine M_Z?
- Answer: yes. M_Z(t) = E[e^{tZ}] = E[e^{tX+bt}] = e^{bt}·M_X(t).
- The latter answer is the special case of M_Z(t) = M_X(t)·M_Y(t) where Y is the constant random variable b.
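Both rules can be checked numerically. A sketch with assumed values of a, b, and t, using X uniform on {1, …, 6} (an illustrative choice):

```python
import math

# MGF of a uniform distribution over a finite list of values.
mgf = lambda values, t: sum(math.exp(t * v) for v in values) / len(values)

X = [1, 2, 3, 4, 5, 6]
a, b, t = 2.0, 3.0, 0.4  # illustrative choices

# Scaling rule: M_{aX}(t) = M_X(at).
assert abs(mgf([a * x for x in X], t) - mgf(X, a * t)) < 1e-9
# Shift rule: M_{X+b}(t) = e^{bt} * M_X(t).
assert abs(mgf([x + b for x in X], t) - math.exp(b * t) * mgf(X, t)) < 1e-9
```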
Examples

- Let's try some examples. What is M_X(t) = E[e^{tX}] when X is binomial with parameters (p, n)? Hint: try the n = 1 case first.
- Answer: if n = 1, then M_X(t) = E[e^{tX}] = p·e^t + (1 − p)·e^0. In general, M_X(t) = (p·e^t + 1 − p)^n.
- What if X is Poisson with parameter λ > 0?
- Answer: M_X(t) = E[e^{tX}] = ∑_{n=0}^{∞} e^{tn}·e^{−λ}·λ^n/n! = e^{−λ} ∑_{n=0}^{∞} (λe^t)^n/n! = e^{−λ}·e^{λe^t} = exp[λ(e^t − 1)].
- We know that if you add independent Poisson random variables with parameters λ_1 and λ_2, you get a Poisson random variable of parameter λ_1 + λ_2. How is this fact manifested in the moment generating function?
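The Poisson computation can be confirmed by truncating the series; the values of λ, t, and the truncation point below are illustrative choices, not from the lecture.

```python
import math

lam, t = 2.5, 0.6  # illustrative parameter and evaluation point

# Truncated version of sum_{n>=0} e^{tn} e^{-lam} lam^n / n!; the terms
# decay factorially, so 60 terms is far more than enough here.
series = sum(
    math.exp(t * n) * math.exp(-lam) * lam**n / math.factorial(n)
    for n in range(60)
)

# Closed form derived above: exp(lam * (e^t - 1)).
closed_form = math.exp(lam * (math.exp(t) - 1))
assert abs(series - closed_form) < 1e-9
```

Note that exp[λ₁(e^t − 1)]·exp[λ₂(e^t − 1)] = exp[(λ₁ + λ₂)(e^t − 1)], which is exactly how the additivity of independent Poissons shows up in the MGF.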
More examples: normal random variables

- What if X is normal with mean zero and variance one?
- M_X(t) = (1/√(2π)) ∫_{−∞}^{∞} e^{tx} e^{−x²/2} dx = (1/√(2π)) ∫_{−∞}^{∞} exp{−(x − t)²/2 + t²/2} dx = e^{t²/2}.
- What does that tell us about sums of i.i.d. copies of X?
- If Z is a sum of n i.i.d. copies of X, then M_Z(t) = e^{nt²/2}.
- What is M_Z if Z is normal with mean µ and variance σ²?
- Answer: Z has the same law as σX + µ, so M_Z(t) = M_X(σt)·e^{µt} = exp{σ²t²/2 + µt}.
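The Gaussian integral above can be approximated directly. A sketch with an assumed midpoint-rule grid; the truncation to [−10, 10] is safe because the integrand decays like e^{−x²/2}.

```python
import math

# Midpoint-rule approximation of (1/sqrt(2*pi)) * integral of e^{tx - x^2/2}.
def mgf_std_normal(t, lo=-10.0, hi=10.0, steps=50_000):
    dx = (hi - lo) / steps
    total = 0.0
    for i in range(steps):
        x = lo + (i + 0.5) * dx  # midpoint of the ith subinterval
        total += math.exp(t * x - x * x / 2)
    return total * dx / math.sqrt(2 * math.pi)

# Should agree with the closed form e^{t^2/2} at each test point.
for t in (0.0, 0.5, 1.0):
    assert abs(mgf_std_normal(t) - math.exp(t * t / 2)) < 1e-6
```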
I Answer: Z has same law as σX + µ, soMZ (t) = M(σt)eµt = exp{σ2t2
2 + µt}.
More examples: normal random variables
I What if X is normal with mean zero, variance one? R ∞ √1 tx −x I MX (t) = e e 2/2dx = −∞ R ∞ (x−t)2
2π
√1 exp{− + t2 }dx = et
2/2 . −∞ 2 2 2π
I What does that tell us about sums of i.i.d. copies of X ? nt2/2 I If Z is sum of n i.i.d. copies of X then MZ (t) = e .
I What is MZ if Z is normal with mean µ and variance σ2?
43
![Page 44: 18.600 F2019 Lecture 26: Moment generating functions · 2020-07-09 · Moment generating functions. I Let X be a random variable. I The moment generating function of X is defined](https://reader033.vdocuments.us/reader033/viewer/2022042811/5fa7ffa678f9cd3b2f48f3b4/html5/thumbnails/44.jpg)
More examples: normal random variables

I What if $X$ is normal with mean zero, variance one?
I $M_X(t) = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{tx} e^{-x^2/2}\,dx = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} \exp\{-\frac{(x-t)^2}{2} + \frac{t^2}{2}\}\,dx = e^{t^2/2}$.
I What does that tell us about sums of i.i.d. copies of $X$?
I If $Z$ is a sum of $n$ i.i.d. copies of $X$ then $M_Z(t) = e^{nt^2/2}$.
I What is $M_Z$ if $Z$ is normal with mean $\mu$ and variance $\sigma^2$?
I Answer: $Z$ has the same law as $\sigma X + \mu$, so $M_Z(t) = M(\sigma t)e^{\mu t} = \exp\{\frac{\sigma^2 t^2}{2} + \mu t\}$.
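The Gaussian integral above can be sanity checked by numerical integration; a quick sketch using the midpoint rule on a truncated range (safe here because the integrand decays superexponentially, and $t = 0.7$, $n = 5$ are arbitrary illustrative choices):

```python
import math

def normal_mgf_numeric(t, lo=-20.0, hi=20.0, steps=200000):
    # Midpoint-rule value of (1/sqrt(2 pi)) * integral of e^{tx} e^{-x^2/2} dx
    # for X ~ N(0, 1); the tails beyond |x| = 20 contribute negligibly.
    dx = (hi - lo) / steps
    c = 1.0 / math.sqrt(2 * math.pi)
    total = 0.0
    for i in range(steps):
        x = lo + (i + 0.5) * dx
        total += c * math.exp(t * x - x * x / 2) * dx
    return total

t = 0.7
print(abs(normal_mgf_numeric(t) - math.exp(t * t / 2)) < 1e-6)

# Sum of n i.i.d. copies: M_Z(t) = M_X(t)^n = e^{n t^2 / 2}.
n = 5
print(abs(normal_mgf_numeric(t) ** n - math.exp(n * t * t / 2)) < 1e-5)
```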
More examples: exponential random variables

I What if $X$ is exponential with parameter $\lambda > 0$?
I $M_X(t) = \int_0^{\infty} e^{tx} \lambda e^{-\lambda x}\,dx = \lambda \int_0^{\infty} e^{-(\lambda - t)x}\,dx = \frac{\lambda}{\lambda - t}$.
I What if $Z$ has a $\Gamma$ distribution with parameters $\lambda > 0$ and $n > 0$?
I Then $Z$ has the law of a sum of $n$ independent copies of $X$. So $M_Z(t) = M_X(t)^n = \left(\frac{\lambda}{\lambda - t}\right)^n$.
I The exponential calculation above works for $t < \lambda$. What happens when $t > \lambda$? Or as $t$ approaches $\lambda$ from below?
I $M_X(t) = \int_0^{\infty} e^{tx} \lambda e^{-\lambda x}\,dx = \lambda \int_0^{\infty} e^{-(\lambda - t)x}\,dx = \infty$ if $t \ge \lambda$.
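A Monte Carlo sketch of the Gamma result: simulate $Z$ as a sum of $n$ independent exponentials and compare the sample average of $e^{tZ}$ to $(\lambda/(\lambda - t))^n$. The values $\lambda = 2$, $t = 0.5$, $n = 3$ are arbitrary, chosen so that $t < \lambda$:

```python
import math
import random

random.seed(1)
lam, t, n = 2.0, 0.5, 3          # t < lam, so the MGF is finite

mgf_exp = lam / (lam - t)        # M_X(t) for X ~ Exp(lam)
mgf_gamma = mgf_exp ** n         # M_Z(t) for Z a sum of n i.i.d. copies of X

# Estimate E[e^{tZ}] by simulating Z directly.
N = 200000
est = sum(math.exp(t * sum(random.expovariate(lam) for _ in range(n)))
          for _ in range(N)) / N
print(abs(est - mgf_gamma) < 0.05)
```

For $t \ge \lambda$ the same sample averages would not stabilize at all, mirroring the divergence of the integral.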
More examples: existence issues

I It seems that unless $f_X(x)$ decays superexponentially as $x$ tends to infinity, we won't have $M_X(t)$ defined for all $t$.
I What is $M_X$ if $X$ is standard Cauchy, so that $f_X(x) = \frac{1}{\pi(1 + x^2)}$?
I Answer: $M_X(0) = 1$ (as is true for any $X$) but otherwise $M_X(t)$ is infinite for all $t \ne 0$.
I Informal statement: moment generating functions are not defined for distributions with fat tails.
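One way to see the Cauchy blow-up numerically: the truncated integrals $\int_{-A}^{A} e^{tx} f_X(x)\,dx$ grow without bound as the cutoff $A$ increases, rather than settling toward a limit. A sketch ($t = 1$ and the cutoffs are arbitrary illustrative choices):

```python
import math

def truncated_cauchy_mgf(t, A, steps=100000):
    # Midpoint-rule value of the integral of e^{tx} / (pi (1 + x^2))
    # over [-A, A]: a truncation of E[e^{tX}] for standard Cauchy X.
    dx = 2.0 * A / steps
    total = 0.0
    for i in range(steps):
        x = -A + (i + 0.5) * dx
        total += math.exp(t * x) / (math.pi * (1 + x * x)) * dx
    return total

# For t = 1 the truncated values explode as the cutoff grows: the
# exponential beats the polynomial (fat) tail of the density.
vals = [truncated_cauchy_mgf(1.0, A) for A in (10, 20, 30)]
print(vals[0] < vals[1] < vals[2])
print(vals[2] > 1e8)
```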
Outline
Moment generating functions
Characteristic functions
Continuity theorems and perspective
Characteristic functions

I Let $X$ be a random variable.
I The characteristic function of $X$ is defined by $\phi(t) = \phi_X(t) := E[e^{itX}]$. Like $M(t)$ except with $i$ thrown in.
I Recall that by definition $e^{it} = \cos(t) + i\sin(t)$.
I Characteristic functions are similar to moment generating functions in some ways.
I For example, $\phi_{X+Y} = \phi_X \phi_Y$ when $X$ and $Y$ are independent, just as $M_{X+Y} = M_X M_Y$.
I And $\phi_{aX}(t) = \phi_X(at)$, just as $M_{aX}(t) = M_X(at)$.
I And if $X$ has an $m$th moment then $E[X^m] = i^{-m}\phi_X^{(m)}(0)$.
I But characteristic functions have a distinct advantage: they are always well defined for all $t$ even if $f_X$ decays slowly.
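The "always defined" point can be illustrated by Monte Carlo: $|e^{itX}| = 1$, so the expectation exists for any $X$, including the heavy-tailed Cauchy. A sketch, using the standard closed forms $e^{-t^2/2}$ (standard normal) and $e^{-|t|}$ (standard Cauchy), which we state here without derivation:

```python
import cmath
import math
import random

random.seed(0)

def char_fn_mc(sample, t, N=100000):
    # Monte Carlo estimate of phi_X(t) = E[e^{itX}];
    # always finite since |e^{itX}| = 1 for every outcome.
    return sum(cmath.exp(1j * t * sample()) for _ in range(N)) / N

t = 1.3
phi_normal = char_fn_mc(lambda: random.gauss(0.0, 1.0), t)
print(abs(phi_normal - math.exp(-t * t / 2)) < 0.02)

# Standard Cauchy via inverse CDF: X = tan(pi (U - 1/2)). Its MGF is
# infinite for t != 0, but phi_X(t) = e^{-|t|} is perfectly well defined.
phi_cauchy = char_fn_mc(lambda: math.tan(math.pi * (random.random() - 0.5)), t)
print(abs(phi_cauchy - math.exp(-abs(t))) < 0.02)
```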
Outline
Moment generating functions
Characteristic functions
Continuity theorems and perspective
Perspective
I In later lectures, we will see that one can use moment generating functions and/or characteristic functions to prove the so-called weak law of large numbers and central limit theorem.
I Proofs using characteristic functions apply in more generality, but they require you to remember how to exponentiate imaginary numbers.
I Moment generating functions are central to so-called large deviation theory and play a fundamental role in statistical physics, among other things.
I Characteristic functions are Fourier transforms of the corresponding distribution density functions and encode "periodicity" patterns. For example, if $X$ is integer valued, $\phi_X(t) = E[e^{itX}]$ will be $1$ whenever $t$ is a multiple of $2\pi$.
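The periodicity remark can be checked exactly for a Poisson variable, whose characteristic function follows from the MGF computed earlier by replacing $t$ with $it$ ($\lambda = 2$ is an arbitrary choice):

```python
import cmath
import math

lam = 2.0

def phi_poisson(t):
    # phi_X(t) = E[e^{itX}] = exp(lam (e^{it} - 1)) for X ~ Poisson(lam):
    # the Poisson MGF formula with t replaced by it.
    return cmath.exp(lam * (cmath.exp(1j * t) - 1))

# X is integer valued, so phi_X equals 1 at every multiple of 2 pi...
print(abs(phi_poisson(2 * math.pi) - 1) < 1e-12)
print(abs(phi_poisson(6 * math.pi) - 1) < 1e-12)
# ...and has modulus strictly less than 1 in between.
print(abs(phi_poisson(1.0)) < 1.0)
```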
Continuity theorems
I Let $X$ be a random variable and $X_n$ a sequence of random variables.
I We say that $X_n$ converge in distribution or converge in law to $X$ if $\lim_{n\to\infty} F_{X_n}(x) = F_X(x)$ at all $x \in \mathbb{R}$ at which $F_X$ is continuous.
I Lévy's continuity theorem (see Wikipedia): if $\lim_{n\to\infty} \phi_{X_n}(t) = \phi_X(t)$ for all $t$, then $X_n$ converge in law to $X$.
I Moment generating analog: if moment generating functions $M_{X_n}(t)$ are defined for all $t$ and $n$ and $\lim_{n\to\infty} M_{X_n}(t) = M_X(t)$ for all $t$, then $X_n$ converge in law to $X$.
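A concrete instance of the moment generating analog: with $p = \lambda/n$, the Binomial$(n, p)$ MGFs from earlier in the lecture converge pointwise to the Poisson$(\lambda)$ MGF, which is the MGF-level statement of the classical Poisson limit theorem. A sketch ($\lambda = 3$, $t = 0.4$ are arbitrary choices):

```python
import math

lam, t = 3.0, 0.4

def mgf_binom(t, n, p):
    # Binomial MGF from earlier in the lecture: (p e^t + 1 - p)^n.
    return (p * math.exp(t) + 1 - p) ** n

target = math.exp(lam * (math.exp(t) - 1))   # Poisson(lam) MGF

# Pointwise convergence as n grows with p = lam / n: the error shrinks,
# so by the continuity theorem Binomial(n, lam/n) converges in law to
# Poisson(lam).
errs = [abs(mgf_binom(t, n, lam / n) - target) for n in (10, 100, 1000)]
print(errs[0] > errs[1] > errs[2])
print(errs[2] < 1e-2)
```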
MIT OpenCourseWare
https://ocw.mit.edu
18.600 Probability and Random Variables Fall 2019
For information about citing these materials or our Terms of Use, visit: https://ocw.mit.edu/terms.