
Estimation: the idea

AS305 Statistical Methods For Insurance

The entire purpose of estimation theory is to arrive at an estimator,
preferably an easily implementable one. The estimator takes the measured
data as input and produces an estimate of the parameter.
The difference between an estimate and an estimator:
The former refers to the specific value obtained when applying an estimation
procedure to a set of numbers.
The latter refers to the rule or formula that produces the estimate.
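As a small illustration of this distinction, the sketch below (with made-up observation values) uses the sample mean: the function is the estimator, i.e. the rule, and the number it returns for one particular data set is the estimate.

# Sketch: the estimator is the rule; the estimate is the value the rule
# produces for one specific data set.
def sample_mean(data):
    # Estimator: the rule "average the observations".
    return sum(data) / len(data)

observations = [2.3, 1.7, 3.1, 2.9, 2.0]   # hypothetical measurements
estimate = sample_mean(observations)        # estimate: one specific number
print(estimate)                             # prints 2.4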

REVIEW OF MATHEMATICAL STATISTICS


Population parameters and their estimators

An estimator is derived based on a certain optimality criterion and is
desired to have certain good properties.

A simple beginning: the Method of Moments

The method of moments estimates population parameters such as the mean,
variance, median, etc. (which need not be moments) by equating unobservable
population moments with sample moments and then solving those equations for
the quantities to be estimated.

Example: Suppose X1, ..., Xn are independent, identically distributed random
variables following a gamma distribution with probability density function
f(x; α, θ) for x > 0, and 0 for x < 0.
The first moment, i.e., the expected value, of a random variable with this
probability distribution is E[X], and the second moment, i.e., the expected
value of its square, is E[X²]. These are the population moments.
The first and second "sample moments" m1 and m2 are, respectively,
m1 = (1/n) Σ Xi and m2 = (1/n) Σ Xi².

Method of moments estimates for α and θ

Equating the population moments with the sample moments, we get two equations
in α and θ. Solving these two equations for α and θ, we get the
method-of-moments estimates. We then use these two quantities as estimates,
based on the sample, of the two unobservable population parameters α and θ.
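The explicit formulas depend on which parameterisation of the gamma density the slides use; assuming the shape-scale form with shape α and scale θ, a sketch of the calculation is

    f(x;\alpha,\theta) = \frac{x^{\alpha-1} e^{-x/\theta}}{\Gamma(\alpha)\,\theta^{\alpha}}, \qquad
    E[X] = \alpha\theta, \qquad E[X^2] = \alpha(\alpha+1)\theta^{2},

so equating \alpha\theta = m_1 and \alpha(\alpha+1)\theta^{2} = m_2 and solving gives

    \hat{\alpha} = \frac{m_1^{2}}{m_2 - m_1^{2}}, \qquad
    \hat{\theta} = \frac{m_2 - m_1^{2}}{m_1}.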

Least Squares

Linear Regression estimates

Model: Y = βx + ε

Method of maximum likelihood: the idea

Suppose there is a sample x1, x2, ..., xn of n independent and identically
distributed observations, coming from a distribution with an unknown
probability density function f0(·).
f0 belongs to a certain family of distributions { f(· | θ), θ ∈ Θ }, called
the parametric model.
It is desirable to find an estimator which would be as close to the true
value θ0 as possible.

The maximum likelihood estimator selects the parameter value which gives the
observed data the largest possible probability (or probability density, in
the continuous case).

The situation

For an independent and identically distributed sample, the joint density
function is

    f(x1, x2, ..., xn | θ) = f(x1 | θ) · f(x2 | θ) · ... · f(xn | θ).

From a different perspective, consider the observed values x1, x2, ..., xn to
be fixed "parameters" of this function, whereas θ is the function's variable,
allowed to vary freely; this function is called the likelihood:

    L(θ | x1, ..., xn) = f(x1, x2, ..., xn | θ).

Method of maximum likelihood

The method of maximum likelihood estimates θ0 by finding the value of θ that
maximizes L(θ | x1, ..., xn).
This method of estimation defines a maximum-likelihood estimator (MLE) of θ0,
if any maximum exists.
The MLE estimate is the same regardless of whether we maximize the likelihood
or the log-likelihood function, since log is a monotonically increasing
function:

    ℓ(θ | x1, ..., xn) = ln L(θ | x1, ..., xn).
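Because the sample is i.i.d., taking logs turns the product form of the likelihood into a sum, which is what makes the log-likelihood convenient to maximise; a brief sketch:

    \ell(\theta \mid x_1,\dots,x_n) = \sum_{i=1}^{n} \ln f(x_i \mid \theta),

and, when ℓ is differentiable in θ, the MLE is found among the solutions of ∂ℓ/∂θ = 0 together with any boundary points of the parameter space.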

Example: Suppose one wishes to determine just how biased an unfair coin is.
Call the probability of tossing a HEAD p. The goal then becomes to determine p.
Suppose the coin is tossed 80 times and the outcome is 49 HEADS and 31 TAILS.
For 0 ≤ p ≤ 1, the likelihood function to be maximised is

    L(p) = C(80, 49) p^49 (1 − p)^31.

Setting the derivative of L(p) with respect to p equal to zero gives the
solutions p = 0, p = 1, and p = 49/80. The solution which maximizes the
likelihood is clearly p = 49/80 (since p = 0 and p = 1 result in a likelihood
of zero). Thus the maximum likelihood estimate for p is 49/80.
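As a quick numerical check of this example (a sketch: scipy's bounded optimiser and the small interior bounds are choices made here, not part of the slides), maximising the log-likelihood recovers the same value 49/80:

from scipy.optimize import minimize_scalar
from scipy.stats import binom

heads, tosses = 49, 80   # observed data from the example

def neg_log_likelihood(p):
    # Negative log-likelihood of p for 49 heads in 80 independent tosses.
    return -binom.logpmf(heads, tosses, p)

# Maximise the likelihood by minimising its negative over the interior of [0, 1].
result = minimize_scalar(neg_log_likelihood, bounds=(1e-6, 1 - 1e-6), method="bounded")
print(result.x)         # approximately 0.6125
print(heads / tosses)   # the closed-form answer, 49/80 = 0.6125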

Properties of estimators

Unbiasedness

An estimator θ̂ of θ is unbiased if E(θ̂ | θ) = θ for all θ.
The bias is bias(θ̂) = E(θ̂ | θ) − θ.
An estimator is asymptotically unbiased if its bias goes to zero as the
sample size goes to infinity.

Consistency

θ̂ converges to θ in probability (weak consistency): as the sample size goes
to infinity, the probability that the estimator is in error by more than a
small amount goes to zero.

Mean Square Error and Variance

UMVUE, uniformly minimum variance unbiased estimator

Asymptotic Normality
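Mean square error, variance and bias are connected by the standard decomposition, stated here as a brief reminder:

    \mathrm{MSE}(\hat{\theta}) = E\bigl[(\hat{\theta}-\theta)^2\bigr]
                               = \mathrm{Var}(\hat{\theta}) + \bigl[\mathrm{bias}(\hat{\theta})\bigr]^{2},

so for an unbiased estimator the MSE is just the variance, and the UMVUE is the unbiased estimator with the smallest variance for every value of θ.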

Example: A population has the exponential distribution with a mean of θ.
We want to estimate the population mean by taking an independent sample of
size 3.

Candidate estimators: the sample mean and the sample median

Unbiasedness

Comparing variance (efficiency)
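A small simulation sketch of this comparison (the value θ = 10, the random seed and the number of replications are arbitrary choices for illustration; the candidate estimators are taken to be the raw sample mean and the raw sample median of the three observations):

import numpy as np

rng = np.random.default_rng(305)
theta, n, reps = 10.0, 3, 200_000     # true mean, sample size from the example, replications

samples = rng.exponential(scale=theta, size=(reps, n))
means = samples.mean(axis=1)          # sample mean of each replication
medians = np.median(samples, axis=1)  # sample median of each replication

# The sample mean is unbiased for theta; the median of three exponential
# observations has expectation 5*theta/6, so it is biased downward.
print(means.mean(), medians.mean())   # approx 10.0 and approx 8.33
print(means.var(), medians.var())     # Monte Carlo variances of the two estimators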

Suppose a random variable has the uniform distribution on the interval (0, θ).
Consider the estimator Yn = max(X1, ..., Xn), the maximum from a sample of
size n. Show that this estimator is asymptotically unbiased.

For the maximum of a sample of size n,

    E[Yn] = nθ / (n + 1).

As n → ∞, the limit is θ, making this estimator asymptotically unbiased.
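The expected value used above follows from the distribution of the sample maximum; a brief sketch of the derivation:

    P(Y_n \le y) = \left(\frac{y}{\theta}\right)^{n}, \quad 0 \le y \le \theta,
    \qquad\text{so}\qquad
    E[Y_n] = \int_0^{\theta} y \cdot \frac{n\,y^{n-1}}{\theta^{n}}\,dy = \frac{n}{n+1}\,\theta .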

For the uniform distribution on the interval (0, θ), compare the MSE of the
estimators 2·x̄ and ((n + 1)/n)·max(x1, ..., xn), and evaluate the variance
of these estimators.
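A sketch of the computation: both estimators are unbiased for θ (E[2x̄] = θ and E[((n + 1)/n)·Yn] = θ, with Yn = max(x1, ..., xn)), so each MSE equals its variance, and

    \mathrm{MSE}(2\bar{X}) = \mathrm{Var}(2\bar{X}) = \frac{4}{n}\cdot\frac{\theta^{2}}{12} = \frac{\theta^{2}}{3n},
    \qquad
    \mathrm{MSE}\!\left(\frac{n+1}{n}\,Y_n\right) = \left(\frac{n+1}{n}\right)^{2}\frac{n\,\theta^{2}}{(n+1)^{2}(n+2)} = \frac{\theta^{2}}{n(n+2)}.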

Except for the case n = 1 (and then the two estimators are identical), the
one based on the maximum has the smaller MSE.

Asymptotic properties of the MLE

Regularity conditions

Consistency

Asymptotic Normality and Variance
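Under the usual regularity conditions these last two properties take the following standard form for a one-dimensional parameter (stated here as a reminder, not as the slides' exact wording):

    \hat{\theta}_n \xrightarrow{p} \theta_0,
    \qquad
    \sqrt{n}\,(\hat{\theta}_n - \theta_0) \xrightarrow{d} N\!\bigl(0,\; I(\theta_0)^{-1}\bigr),

where I(θ0) = E[(∂ ln f(X | θ)/∂θ)²] evaluated at θ0 is the Fisher information of a single observation, so the approximate variance of the MLE is 1/(n·I(θ0)).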

Chebyshev's inequality and consistency of estimators

Proof
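A sketch of one standard argument connecting Chebyshev's inequality to weak consistency: applying the inequality to the squared estimation error gives, for any ε > 0,

    P\bigl(|\hat{\theta}_n - \theta| > \varepsilon\bigr)
      \;\le\; \frac{E\bigl[(\hat{\theta}_n - \theta)^{2}\bigr]}{\varepsilon^{2}}
      \;=\; \frac{\mathrm{Var}(\hat{\theta}_n) + \mathrm{bias}^{2}(\hat{\theta}_n)}{\varepsilon^{2}},

so if the estimator is asymptotically unbiased and its variance tends to zero, the right-hand side tends to zero and the estimator is consistent.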

Cramér-Rao (CR) lower bound and UMVUE, the uniformly minimum variance
unbiased estimator
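For an unbiased estimator based on n i.i.d. observations, and under the same regularity conditions, the Cramér-Rao lower bound takes the familiar one-parameter form (a reminder, with I(θ) the Fisher information of a single observation):

    \mathrm{Var}(\hat{\theta}) \;\ge\; \frac{1}{n\,I(\theta)},

and an unbiased estimator whose variance attains this bound for every θ is necessarily the UMVUE.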
