Math Statistics


Page 1: Math Statistics

AS305 Statistical Methods For Insurance

REVIEW OF MATHEMATICAL STATISTICS

Estimation – the idea

The entire purpose of estimation theory is to arrive at an estimator, preferably an easily implementable one. The estimator takes the measured data as input and produces an estimate of the parameter.

The difference between an estimate and an estimator: the former refers to the specific value obtained when applying an estimation procedure to a set of numbers; the latter refers to the rule or formula that produces the estimate.

Population parameters

estimator

The estimator is derived based on a certain optimality criterion and is desired to have certain good properties.

A simple beginning – Method of Moments

The method of moments is a method of estimating population parameters such as the mean, variance, median, etc. (which need not be moments), by substituting unobservable population moments with sample moments and then solving those equations for the quantities to be estimated.

Example: Suppose X1, ..., Xn are independent, identically distributed random variables with a gamma distribution (shape α, scale β) with probability density function

f(x; α, β) = x^(α−1) e^(−x/β) / (Γ(α) β^α)

for x > 0, and 0 otherwise.

The first moment, i.e., the expected value, of a random variable with this probability distribution is

E(X) = αβ,

and the second moment, i.e., the expected value of its square, is

E(X²) = αβ²(α + 1).

These are the "population moments". The first and second "sample moments" m1 and m2 are respectively

m1 = (1/n) Σ Xi and m2 = (1/n) Σ Xi².

Page 2: Math Statistics

Method of moments estimates for α and β

Equating the population moments with the sample moments, we get

αβ = m1 and αβ²(α + 1) = m2.

Solving these two equations for α and β, we get

α̂ = m1² / (m2 − m1²) and β̂ = (m2 − m1²) / m1.

We then use these two quantities as estimates, based on the sample, of the two unobservable population parameters α and β.
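As a concrete illustration of the two formulas above, here is a minimal Python sketch; the function name and the simulated data are illustrative, the shape–scale parameterization above is assumed, and numpy is assumed available.

```python
import numpy as np

def gamma_method_of_moments(x):
    """Method-of-moments estimates (alpha_hat, beta_hat) for a gamma sample,
    using the shape-scale parameterization assumed above."""
    x = np.asarray(x, dtype=float)
    m1 = x.mean()             # first sample moment
    m2 = np.mean(x ** 2)      # second sample moment
    alpha_hat = m1 ** 2 / (m2 - m1 ** 2)
    beta_hat = (m2 - m1 ** 2) / m1
    return alpha_hat, beta_hat

# Illustrative check on simulated data with true shape 2 and scale 3
rng = np.random.default_rng(0)
sample = rng.gamma(shape=2.0, scale=3.0, size=10_000)
print(gamma_method_of_moments(sample))  # estimates should be close to (2, 3)
```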

Least Squares

Linear Regression estimates

Model 2: Y modeled as a function of x (regression equation)
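A minimal least-squares sketch. It assumes, purely for illustration, the simple linear model Y = β0 + β1·x + error; the data below are synthetic and numpy is assumed available.

```python
import numpy as np

def ols_fit(x, y):
    """Least-squares estimates (b0, b1) for the model y = b0 + b1 * x + error."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
    b0 = y.mean() - b1 * x.mean()
    return b0, b1

# Illustrative data generated from y = 1 + 2x plus noise
rng = np.random.default_rng(1)
x = np.linspace(0.0, 10.0, 50)
y = 1.0 + 2.0 * x + rng.normal(scale=0.5, size=x.size)
print(ols_fit(x, y))  # estimates should be close to (1, 2)
```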

Method of maximum likelihood – the idea

Suppose there is a sample x1, x2, ..., xn of n independent and identically distributed observations, coming from a distribution with an unknown probability density function f0(·).

f0 belongs to a certain family of distributions { f(·| θ), θ ∈ Θ }, called the parametric model.

It is desirable to find an estimator which would be as close to the true value θ0 as possible.

The maximum likelihood estimator selects the parameter value which gives the observed data the largest possible probability (or probability density, in the continuous case).

Page 3: Math Statistics

The situation

For an independent and identically distributed sample, this joint density function is

f(x1, x2, …, xn | θ) = f(x1 | θ) · f(x2 | θ) ⋯ f(xn | θ).

From a different perspective, consider the observed values x1, x2, ..., xn to be fixed "parameters" of this function, whereas θ is the function's variable, allowed to vary freely; this function is called the likelihood:

L(θ | x1, …, xn) = ∏ f(xi | θ),

and its logarithm is the log-likelihood:

ℓ(θ | x1, …, xn) = ln( L(θ | x1, …, xn) ).

Method of maximum likelihood

The method of maximum likelihood estimates θ0 by finding the value of θ that maximizes ℓ(θ | x1, …, xn).

This method of estimation defines a maximum-likelihood estimator (MLE) of θ0 if any maximum exists.

An MLE estimate is the same regardless of whether we maximize the likelihood or the log-likelihood function, since log is a monotonically increasing function.
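To make the maximization concrete, here is a minimal numerical sketch. It assumes, for illustration only, an exponential model with mean θ, whose log-likelihood has the closed-form maximizer x̄, so the numerical optimum can be checked against the sample mean; numpy and scipy are assumed available.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def exp_log_likelihood(theta, x):
    """Log-likelihood of an iid exponential sample with mean theta."""
    return -len(x) * np.log(theta) - np.sum(x) / theta

# Illustrative data from an exponential distribution with mean 2
rng = np.random.default_rng(2)
x = rng.exponential(scale=2.0, size=1_000)

# Maximize the log-likelihood by minimizing its negative over a bounded interval
result = minimize_scalar(lambda t: -exp_log_likelihood(t, x),
                         bounds=(1e-6, 100.0), method="bounded")
print(result.x, x.mean())  # the numerical MLE should match the sample mean
```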

Example: Suppose one wishes to determine just how biased an unfair coin is. Call the probability of tossing a HEAD p. The goal then becomes to determine p.

Suppose the coin is tossed 80 times and the outcome is 49 HEADS and 31 TAILS. For 0 ≤ p ≤ 1, the likelihood function to be maximised is

L(p) = C(80, 49) p⁴⁹ (1 − p)³¹.

Taking the derivative with respect to p and setting it to zero,

0 = d/dp [ C(80, 49) p⁴⁹ (1 − p)³¹ ] ∝ 49 p⁴⁸ (1 − p)³¹ − 31 p⁴⁹ (1 − p)³⁰,

which has solutions p = 0, p = 1, and p = 49/80. The solution which maximizes the likelihood is clearly p = 49/80 (since p = 0 and p = 1 result in a likelihood of zero).

Thus the maximum likelihood estimator for p is 49/80.
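A quick numerical check of the coin example; the grid search below is only an illustration of the closed-form answer 49/80 = 0.6125, with numpy assumed available.

```python
import numpy as np

def coin_log_likelihood(p):
    """Log-likelihood for 49 heads and 31 tails in 80 tosses (constant term omitted)."""
    return 49 * np.log(p) + 31 * np.log(1 - p)

p_grid = np.linspace(0.001, 0.999, 9_999)
p_hat = p_grid[np.argmax(coin_log_likelihood(p_grid))]
print(p_hat, 49 / 80)  # both approximately 0.6125
```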

Properties of estimators

Unbiasedness: An estimator, θ̂, is unbiased if E(θ̂ | θ) = θ for all θ. The bias is bias(θ̂) = E(θ̂ | θ) − θ.

Asymptotic unbiasedness.

Consistency: θ̂n converges to θ in probability (weak consistency); as the sample size goes to infinity, the probability that the estimator is in error by more than a small amount goes to zero.

Mean Square Error and Variance

UMVUE, uniformly minimum variance unbiased estimator

Asymptotic Normality

Page 4: Math Statistics

Example: A population has the exponential distribution with a mean of θ. We want to estimate the population mean by taking an independent sample of size 3.

Mean

Median

Unbiasedness

Mean

Median
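A worked check of the comparison above, assuming "median" means the middle order statistic X₍₂₎ of the three observations:

```latex
% Sample mean: unbiased.
E(\bar{X}) = \tfrac{1}{3}\,E(X_1 + X_2 + X_3) = \theta .

% Sample median of n = 3 observations, i.e. the middle order statistic X_{(2)}.
% For iid exponential order statistics, E(X_{(k)}) = \theta \sum_{i=n-k+1}^{n} 1/i, so
E(X_{(2)}) = \theta\left(\tfrac{1}{3} + \tfrac{1}{2}\right) = \tfrac{5}{6}\,\theta ,
% which is biased, with bias -\theta/6.
```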

Comparing Variance (efficiency)

Suppose a random variable has the uniform distribution on the interval (0, θ). Consider the estimator θ̂n = max(X1, …, Xn). Show that this estimator is asymptotically unbiased. Let Yn be the maximum from a sample of size n. Then

E(Yn) = nθ / (n + 1).

As n → ∞, the limit is θ, making this estimator asymptotically unbiased.
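The expectation quoted above follows from the distribution of the sample maximum; a short supporting derivation:

```latex
% CDF and density of the maximum of n iid Uniform(0, \theta) observations:
P(Y_n \le y) = \left(\frac{y}{\theta}\right)^{n}, \qquad
f_{Y_n}(y) = \frac{n\,y^{n-1}}{\theta^{n}}, \quad 0 < y < \theta .

% Hence
E(Y_n) = \int_0^{\theta} y \cdot \frac{n\,y^{n-1}}{\theta^{n}}\,dy
       = \frac{n}{n+1}\,\theta \;\longrightarrow\; \theta \quad (n \to \infty).
```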

Page 5: Math Statistics

For the uniform distribution on the interval (0, θ), compare the MSE of the estimators 2X̄ and ((n + 1)/n) · max(X1, …, Xn). Also evaluate the MSE of these estimators. Both are unbiased, so each MSE equals the corresponding variance (see the sketch after the conclusion below).

Except for the case n = 1 (and then the two estimators are identical), the one based on the maximum has the smaller MSE.
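A worked version of the comparison, a standard calculation supporting the conclusion above:

```latex
% Estimator 1: 2\bar{X}.  Since E(X_i) = \theta/2 and Var(X_i) = \theta^2/12,
E(2\bar{X}) = \theta, \qquad
\mathrm{MSE}(2\bar{X}) = \mathrm{Var}(2\bar{X}) = \frac{4}{n}\cdot\frac{\theta^{2}}{12}
                       = \frac{\theta^{2}}{3n} .

% Estimator 2: \tfrac{n+1}{n}\,Y_n with Y_n = \max(X_1,\dots,X_n), where
% E(Y_n) = \tfrac{n}{n+1}\theta and Var(Y_n) = \tfrac{n\,\theta^{2}}{(n+1)^{2}(n+2)} .
E\!\left(\tfrac{n+1}{n}\,Y_n\right) = \theta, \qquad
\mathrm{MSE}\!\left(\tfrac{n+1}{n}\,Y_n\right)
  = \left(\tfrac{n+1}{n}\right)^{2}\mathrm{Var}(Y_n)
  = \frac{\theta^{2}}{n(n+2)} .

% For n > 1, \theta^{2}/(n(n+2)) < \theta^{2}/(3n); at n = 1 both estimators equal 2X_1.
```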

Asymptotic Properties of MLE

Regularity conditions

Consistency

Asymptotic Normality and Variance

Cramér–Rao (CR) lower bound and UMVUE, the uniformly minimum variance unbiased estimator
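For reference, the usual textbook statements behind these bullets, valid under the standard regularity conditions:

```latex
% Fisher information of one observation:
I(\theta) = E\!\left[\left(\frac{\partial}{\partial\theta}\,\ln f(X \mid \theta)\right)^{2}\right].

% Cramér–Rao lower bound: for any unbiased estimator \hat{\theta} based on n iid observations,
\mathrm{Var}(\hat{\theta}) \;\ge\; \frac{1}{n\,I(\theta)} .

% Asymptotic normality of the MLE:
\sqrt{n}\,\bigl(\hat{\theta}_{\mathrm{MLE}} - \theta_{0}\bigr)
  \;\xrightarrow{\;d\;}\; N\!\bigl(0,\; I(\theta_{0})^{-1}\bigr).
```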

Chebyshev's inequality and consistency of estimators

Proof
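A sketch of the standard argument, assuming the estimator's bias and variance both tend to zero:

```latex
% Applying Chebyshev's (Markov's) inequality to (\hat{\theta}_n - \theta)^2:
P\bigl(|\hat{\theta}_n - \theta| > \varepsilon\bigr)
  \;\le\; \frac{E\bigl[(\hat{\theta}_n - \theta)^{2}\bigr]}{\varepsilon^{2}}
  \;=\; \frac{\mathrm{Var}(\hat{\theta}_n) + \mathrm{bias}(\hat{\theta}_n)^{2}}{\varepsilon^{2}} .

% If bias and variance both tend to 0 as n \to \infty, the bound tends to 0 for every
% \varepsilon > 0, so \hat{\theta}_n \to \theta in probability: the estimator is consistent.
```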