Posted on 04-Jun-2018
8/13/2019 Lecture10 Asymptotics Fixed
LECTURE 10. Asymptotics (fixed)
10.1. Convergence in Probability
Reminder from basic calculus: sequences $X_n$ and limits. $\lim_{n\to+\infty} X_n = X$ if for any $\epsilon > 0$ there is $N(\epsilon)$ with $|X_n - X| < \epsilon$ for all $n > N(\epsilon)$.
Extending the notion of limits and convergence to random variables is not straightforward. There are multiple concepts of convergence (almost sure convergence, convergence in probability, mean square convergence, convergence in distribution). We will focus on two of them.
Definition 10.1. Let $\{X_n\}$ be a sequence of random variables and let $X$ be a random variable. We say that $X_n$ converges in probability to $X$ if for all $\epsilon > 0$
$$\lim_{n\to+\infty} P[|X_n - X| \geq \epsilon] = 0.$$
We write $X_n \xrightarrow{p} X$ or $\operatorname{plim} X_n = X$.

Properties
(a) If $X_n \xrightarrow{p} X$ and $Y_n \xrightarrow{p} Y$, then $X_n + Y_n \xrightarrow{p} X + Y$.
(b) If $X_n \xrightarrow{p} X$ and $a$ is a constant, then $aX_n \xrightarrow{p} aX$.
(c) Suppose $X_n \xrightarrow{p} a$ and the real function $g$ is continuous at $a$. Then $g(X_n) \xrightarrow{p} g(a)$.
(d) If $X_n \xrightarrow{p} X$ and $Y_n \xrightarrow{p} Y$, then $X_n Y_n \xrightarrow{p} XY$.

Theorem 10.2. [Reminder] (Chebyshev's inequality)
$$P(|X - \mu| \geq \epsilon) \leq \frac{\sigma^2}{\epsilon^2}$$
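Chebyshev's bound can be checked numerically. Below is a minimal Monte Carlo sketch; the Exponential(1) distribution (mean 1, variance 1), the seed, and the sample size are illustrative choices, not from the lecture.

```python
import math
import random

# Monte Carlo check of Chebyshev's inequality: the empirical tail
# probability P(|X - mu| >= eps) should sit below sigma^2 / eps^2.
random.seed(0)
n = 100_000
xs = [random.expovariate(1.0) for _ in range(n)]  # Exponential(1): mu = 1, sigma^2 = 1

mu, var, eps = 1.0, 1.0, 2.0
tail_prob = sum(abs(x - mu) >= eps for x in xs) / n
bound = var / eps ** 2  # Chebyshev bound: sigma^2 / eps^2 = 0.25

# The exact tail here is P(X >= 3) = exp(-3), about 0.05, well under the bound.
print(tail_prob, bound)
```

The bound is typically far from tight, as this example shows: the true tail probability is roughly five times smaller than the bound.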
Theorem 10.3. (Weak law of large numbers) Let $\{X_n\}$ be a sequence of iid random variables having common mean $\mu$ and variance $\sigma^2 < \infty$. Let $\bar{X}_n = n^{-1} \sum_{i=1}^n X_i$. Then
$$\bar{X}_n \xrightarrow{p} \mu$$
Proof. To be done on the board.
NB There are various versions of the law of large numbers.
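The statement can be illustrated by simulation. Here is a minimal sketch; the Uniform(0,1) distribution ($\mu = 0.5$), the tolerance $\epsilon$, and the sample sizes are illustrative assumptions, not from the lecture.

```python
import random

# WLLN illustration: P(|Xbar_n - mu| >= eps) shrinks as n grows.
random.seed(1)
mu = 0.5   # mean of Uniform(0,1)
eps = 0.02

def prob_outside(n, reps=2000):
    """Estimate P(|Xbar_n - mu| >= eps) by repeated sampling."""
    count = 0
    for _ in range(reps):
        xbar = sum(random.random() for _ in range(n)) / n
        if abs(xbar - mu) >= eps:
            count += 1
    return count / reps

p_small, p_large = prob_outside(10), prob_outside(1000)
print(p_small, p_large)  # the probability at n = 1000 is far smaller
```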
Example 1. Let $X_1, \dots, X_n$ denote a random sample from a distribution with mean $\mu$ and variance $\sigma^2$. Show that the sample variance $S_n^2 = \frac{1}{n-1} \sum_{i=1}^n (X_i - \bar{X}_n)^2$ is a consistent estimator of the variance $\sigma^2$.
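A simulation is consistent with the claim of Example 1: for a large sample, $S_n^2$ lands close to $\sigma^2$. The N(5, 4) distribution, the seed, and the sample size below are illustrative assumptions.

```python
import random

# Consistency check for the sample variance: with n large,
# S_n^2 = (1/(n-1)) * sum (X_i - Xbar_n)^2 should be near sigma^2.
random.seed(2)
sigma2 = 4.0  # X_i ~ N(5, 4), so sigma = 2

def sample_var(n):
    xs = [random.gauss(5.0, 2.0) for _ in range(n)]
    xbar = sum(xs) / n
    return sum((x - xbar) ** 2 for x in xs) / (n - 1)

s2 = sample_var(200_000)
print(s2)  # close to sigma^2 = 4
```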
10.2. Convergence in Distribution
Definition 10.4. Let $\{X_n\}$ be a sequence of random variables and let $X$ be a random variable. Let $F_{X_n}$ and $F_X$ be, respectively, the cdfs of $X_n$ and $X$. We say that $X_n$ converges in distribution to $X$ if
$$\lim_{n\to+\infty} F_{X_n}(x) = F_X(x)$$
at every point $x$ at which $F_X$ is continuous, and we write $X_n \xrightarrow{d} X$. We sometimes say that $X$ is the limiting distribution of $X_n$.
Properties
(a) If $X_n$ converges to $X$ in probability, then $X_n$ converges to $X$ in distribution.
(b) (Continuous mapping theorem) Suppose $X_n$ converges to $X$ in distribution and $g$ is a continuous function on the support of $X$. Then $g(X_n)$ converges to $g(X)$ in distribution.
(c) (Slutsky's theorem) Let $X_n, X, A_n, B_n$ be random variables and let $a$ and $b$ be constants. If $X_n \xrightarrow{d} X$, $A_n \xrightarrow{p} a$, $B_n \xrightarrow{p} b$, then
$$A_n + B_n X_n \xrightarrow{d} a + bX$$

Theorem 10.5. (Lindeberg-Lévy Central Limit Theorem) Let $X_1, X_2, \dots, X_n$ denote a sequence of independently and identically distributed (iid) random variables with $E(X_i) = \mu$ and $\operatorname{var}(X_i) = \sigma^2$. Let $\bar{X}_n = n^{-1} \sum_{i=1}^n X_i$. Then
$$n^{1/2}(\bar{X}_n - \mu) \xrightarrow{d} N(0, \sigma^2)$$

NB1 There are various versions of the Central Limit Theorem.
NB2 Note that the theorem does NOT make any distributional assumption on the $X_i$.
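This distribution-free aspect is easy to see by simulation: even for a very non-normal distribution, the standardized sample mean is close to N(0, 1). The Bernoulli(0.25) choice, seed, and sample sizes below are illustrative assumptions.

```python
import math
import random

# CLT illustration with Bernoulli(0.25) draws: the standardized
# statistic sqrt(n) * (Xbar_n - mu) / sigma behaves like N(0, 1).
random.seed(3)
p = 0.25
mu, sigma = p, math.sqrt(p * (1 - p))
n, reps = 400, 5000

zs = []
for _ in range(reps):
    xbar = sum(random.random() < p for _ in range(n)) / n
    zs.append(math.sqrt(n) * (xbar - mu) / sigma)

mean_z = sum(zs) / reps
var_z = sum(z * z for z in zs) / reps - mean_z ** 2
frac_below_0 = sum(z <= 0 for z in zs) / reps
print(mean_z, var_z, frac_below_0)  # roughly 0, 1, and 0.5
```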
Example 2. Suppose that $X_i$ are iid Bernoulli random variables where the probability of success $p = 0.25$. Use a normal approximation to find $P(\bar{X} \leq 0.2)$ if $n = 100$.

Theorem 10.6. (Delta method) Let $\{X_n\}$ be a sequence of random variables such that
$$n^{1/2}(X_n - \mu) \xrightarrow{d} N(0, \sigma^2).$$
Suppose the function $g(x)$ is differentiable at $\mu$ and $g'(\mu) \neq 0$. Then
$$n^{1/2}(g(X_n) - g(\mu)) \xrightarrow{d} N(0, \sigma^2 (g'(\mu))^2)$$
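The delta method can also be checked by simulation. The sketch below uses $g(x) = x^2$ with $X_i \sim N(2, 1)$, so the predicted limiting variance is $\sigma^2 (g'(\mu))^2 = 1 \cdot (2 \cdot 2)^2 = 16$; the distribution, seed, and sample sizes are illustrative assumptions.

```python
import math
import random

# Delta method check with g(x) = x^2: the statistic
# sqrt(n) * (Xbar_n^2 - mu^2) should have variance near
# sigma^2 * (g'(mu))^2 = sigma^2 * (2*mu)^2.
random.seed(4)
mu, sigma = 2.0, 1.0
n, reps = 1000, 4000

vals = []
for _ in range(reps):
    xbar = sum(random.gauss(mu, sigma) for _ in range(n)) / n
    vals.append(math.sqrt(n) * (xbar ** 2 - mu ** 2))

mean_v = sum(vals) / reps
var_v = sum(v * v for v in vals) / reps - mean_v ** 2
predicted = sigma ** 2 * (2 * mu) ** 2  # = 16
print(mean_v, var_v, predicted)  # mean near 0, variance near 16
```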
10.3. Multivariate version
Theorem 10.7. Let $X_n$ be a sequence of $p$-dimensional vectors and let $X$ be a random vector. Then $X_n \xrightarrow{p} X$ if and only if $X_{nj} \xrightarrow{p} X_j$ for all $j = 1, \dots, p$.
Theorem 10.8. Let $X_n$ be a sequence of iid random vectors with common mean vector $\mu$ and variance-covariance matrix $\Sigma$ which is positive definite. Assume the common moment generating function exists in an open neighborhood of $0$. Then
$$Y_n = \frac{1}{\sqrt{n}} \sum_{i=1}^n (X_i - \mu) \xrightarrow{d} N_p(0, \Sigma)$$
10.4. Examples
Example 3. Suppose $X_i$ is distributed NIID$(2, 1)$ for $i = 1, \dots, n$. Let $Y = \bar{X}^2$. Suppose $n = 100$. Find $P(Y \leq 3.7)$.
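A worked sketch of Example 3, assuming $Y = \bar{X}^2$ as the statement suggests: by the delta method with $g(x) = x^2$, $\sqrt{n}(\bar{X}^2 - 4) \xrightarrow{d} N(0, 1 \cdot (2 \cdot 2)^2) = N(0, 16)$, so $Y$ is approximately $N(4, 16/100)$.

```python
import math

# Delta-method approximation for Example 3: Y = Xbar^2 is roughly
# N(g(mu), sigma^2 * g'(mu)^2 / n) = N(4, 0.16), so standardize 3.7.
def phi(z):
    """Standard normal cdf via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

mu, sigma2, n = 2.0, 1.0, 100
g_mu = mu ** 2                                 # g(mu) = 4
g_prime = 2 * mu                               # g'(mu) = 4
se = math.sqrt(sigma2 * g_prime ** 2 / n)      # = 0.4

approx = phi((3.7 - g_mu) / se)
print(round(approx, 4))  # Phi(-0.75), about 0.2266
```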
Example 4. Let $Y$ denote the sum of the observations of a random sample of size 12 from a distribution having pmf $p(x) = \frac{1}{6}$, $x = 1, 2, 3, 4, 5, 6$, zero elsewhere. Using a normal approximation, compute an approximate value of $P(36 \leq Y \leq 48)$.

Example 5. Suppose $X_t = \epsilon_t + \epsilon_{t-1}$ where the $\epsilon_t$ are iid with mean 0 and variance 1. (a) Show that $\bar{X} \xrightarrow{p} 0$ and (b) find $V$ in
$$n^{1/2} \bar{X} \xrightarrow{d} N(0, V)$$
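A worked sketch of Example 4: here $E[Y] = 12 \cdot 3.5 = 42$ and $\operatorname{var}(Y) = 12 \cdot \frac{35}{12} = 35$, so $Y$ is approximately $N(42, 35)$. The use of a continuity correction below is a standard choice for an integer-valued $Y$, not prescribed by the lecture.

```python
import math

# Normal approximation for Example 4, with continuity correction:
# P(36 <= Y <= 48) is approximated by P(35.5 <= W <= 48.5)
# for W ~ N(42, 35).
def phi(z):
    """Standard normal cdf via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

mean_y, var_y = 42.0, 35.0
sd_y = math.sqrt(var_y)

approx = phi((48.5 - mean_y) / sd_y) - phi((35.5 - mean_y) / sd_y)
print(round(approx, 3))  # about 0.728
```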