
Bias of Maximum Likelihood Estimate of Mean and Variance of a Distribution

Vikram Kamath

MLE Estimates:

We know that the maximum likelihood estimate of the mean of a distribution is:

\[ \hat{\mu} = \frac{1}{n}\sum_{i=1}^{n} x_i \qquad (1) \]

and this is nothing but the sample mean. The maximum likelihood estimate of the variance is:

\[ \hat{\sigma}^2 = \frac{1}{n}\sum_{i=1}^{n} (x_i - \hat{\mu})^2 \qquad (2) \]

which is nothing but the sample variance. We also know that the first moment and the second central moment of a random variable $x$ are given by:

\[ E[x] = \mu \quad \text{and} \quad E[(x - \mu)^2] = \sigma^2 \qquad (3) \]

respectively.
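As a concrete illustration (a small Python sketch of my own, not part of the original note; the function names are arbitrary), the two estimators in Eq. (1) and Eq. (2) can be computed directly from a sample:

```python
# Compute the MLE (sample) mean and MLE variance of a data set.
# Note: this is the biased 1/n variance of Eq. (2), not the 1/(n-1) version.

def mle_mean(xs):
    # Eq. (1): the sample mean
    return sum(xs) / len(xs)

def mle_variance(xs):
    # Eq. (2): average squared deviation from the sample mean
    mu_hat = mle_mean(xs)
    return sum((x - mu_hat) ** 2 for x in xs) / len(xs)

data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
print(mle_mean(data))      # 5.0
print(mle_variance(data))  # 4.0
```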

Bias of MLE Mean


We can find the bias of the MLE mean by calculating the value of $E[\hat{\mu}]$, which is given by:

\[ E[\hat{\mu}] = E\!\left[\frac{1}{n}\sum_{i=1}^{n} x_i\right] \qquad \text{(from (1))} \]

\[ = \frac{1}{n}\sum_{i=1}^{n} E[x_i] = \frac{1}{n}(n\mu) \qquad \text{(from (3))} \]

\[ \Rightarrow E[\hat{\mu}] = \mu \]

This shows that the MLE estimate of the mean of a distribution is unbiased.
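This can also be seen numerically (a Monte Carlo sketch of my own, not part of the original note): averaging the sample mean over many independent samples should land close to the true mean $\mu$.

```python
import random

# Illustrative simulation: average many sample means and check they
# cluster around the true mean mu, i.e. E[mu_hat] = mu.
random.seed(0)
mu, sigma, n, trials = 3.0, 2.0, 10, 20000

total = 0.0
for _ in range(trials):
    sample = [random.gauss(mu, sigma) for _ in range(n)]
    total += sum(sample) / n  # mu_hat for this sample

avg_mu_hat = total / trials
print(avg_mu_hat)  # close to mu = 3.0
```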

Bias of MLE Variance


We can find the bias of the MLE variance by calculating the value of $E[\hat{\sigma}^2]$, which is given by:

\[ E[\hat{\sigma}^2] = E\!\left[\frac{1}{n}\sum_{i=1}^{n}(x_i - \hat{\mu})^2\right] \qquad \text{(from (2))} \]

Expanding the square:

\[ = \frac{1}{n}\, E\!\left[\sum_{i=1}^{n}\left(x_i^2 + \hat{\mu}^2 - 2\hat{\mu}x_i\right)\right] = \frac{1}{n}\, E\!\left[\sum_{i=1}^{n} x_i^2 + \sum_{i=1}^{n}\hat{\mu}^2 - 2\hat{\mu}\sum_{i=1}^{n} x_i\right] \]

Since, from (1), $\sum_{i=1}^{n} x_i = n\hat{\mu}$, this becomes:

\[ = \frac{1}{n}\, E\!\left[\sum_{i=1}^{n} x_i^2 + n\hat{\mu}^2 - 2\hat{\mu}(n\hat{\mu})\right] = \frac{1}{n}\, E\!\left[\sum_{i=1}^{n} x_i^2 - n\hat{\mu}^2\right] \]

\[ = \frac{1}{n}\left(\sum_{i=1}^{n} E[x_i^2] - n\,E[\hat{\mu}^2]\right) = \frac{1}{n}\sum_{i=1}^{n} E[x_i^2] - E[\hat{\mu}^2] \]

Since $x_1, x_2, x_3, \ldots, x_n$ are identically distributed copies of the single random variable $x$, $E[x_i^2]$ can be replaced by $E[x^2]$ above. The above hence becomes:

\[ = \frac{1}{n}\sum_{i=1}^{n} E[x^2] - E[\hat{\mu}^2] = \frac{1}{n}\left(n\,E[x^2]\right) - E[\hat{\mu}^2] \]

\[ = E[x^2] - E[\hat{\mu}^2] \qquad (4) \]

We also know that (derive it yourself or look it up):

\[ E[x^2] = E[(x - \mu)^2] + (E[x])^2 \;\Rightarrow\; E[x^2] = \sigma^2 + \mu^2 \qquad (5) \]

Similarly:

\[ E[\hat{\mu}^2] = E[(\hat{\mu} - \mu)^2] + (E[\hat{\mu}])^2 \;\Rightarrow\; E[\hat{\mu}^2] = \sigma_{\hat{\mu}}^2 + \mu^2 \qquad (6) \]
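The identity behind Eq. (5), $E[x^2] = \sigma^2 + \mu^2$, is easy to sanity-check numerically (a quick sketch of my own, not part of the note):

```python
import random

# Illustrative check of E[x^2] = sigma^2 + mu^2 for x ~ Normal(mu, sigma).
random.seed(3)
mu, sigma, trials = 3.0, 2.0, 200000

avg_x2 = sum(random.gauss(mu, sigma) ** 2 for _ in range(trials)) / trials
print(avg_x2)  # close to sigma^2 + mu^2 = 13.0
```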

Plugging the results of Eq. (5) and Eq. (6) into Eq. (4), we get:

\[ E[\hat{\sigma}^2] = (\sigma^2 + \mu^2) - (\sigma_{\hat{\mu}}^2 + \mu^2) \qquad (7) \]

We will now use the following theorem (proof left as an exercise):

Theorem: If $x_1, x_2, x_3, \ldots, x_n$ are $n$ independent and identically distributed (i.i.d.) random variables, each with variance $\sigma^2$, then the variance of their sum (denoted by $\sigma_{x_1 + \cdots + x_n}^2$) is equal to $n$ times the variance of any one of them. That is:

\[ \operatorname{Var}(x_1 + x_2 + x_3 + \cdots + x_n) = \sigma_{x_1 + \cdots + x_n}^2 = n\sigma^2 \qquad (8) \]

We will now use another theorem (proof left as an exercise):

Theorem: If $x$ is a random variable, then the variance of a constant multiple $cx$ (denoted by $\sigma_{cx}^2$) is equal to $c^2$ times the variance of $x$. That is:

\[ \sigma_{cx}^2 = c^2 \sigma_x^2 \qquad (9) \]
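Both theorems (8) and (9) can be checked with a small Monte Carlo experiment (my own illustrative sketch, not part of the note):

```python
import random

# Monte Carlo check of theorems (8) and (9):
#   Var(x1 + ... + xn) = n * sigma^2   and   Var(c*x) = c^2 * Var(x).
random.seed(1)
sigma, n, c, trials = 2.0, 5, 3.0, 50000

def variance(vals):
    # Plain 1/n sample variance, good enough for a large-sample check.
    m = sum(vals) / len(vals)
    return sum((v - m) ** 2 for v in vals) / len(vals)

sums = []
scaled = []
for _ in range(trials):
    xs = [random.gauss(0.0, sigma) for _ in range(n)]
    sums.append(sum(xs))      # one draw of x1 + ... + xn, for theorem (8)
    scaled.append(c * xs[0])  # one draw of c*x, for theorem (9)

print(variance(sums))    # close to n * sigma^2 = 20
print(variance(scaled))  # close to c^2 * sigma^2 = 36
```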

Because $\hat{\mu}$ is the sample mean, i.e.

\[ \hat{\mu} = \frac{1}{n}\sum_{i=1}^{n} x_i \]

we can say that:

\[ \sigma_{\hat{\mu}}^2 = \operatorname{Var}(\hat{\mu}) = \operatorname{Var}\!\left(\frac{x_1 + x_2 + x_3 + \cdots + x_n}{n}\right) \]

Using Eq. (8) and Eq. (9) and solving the above, we get:

\[ \sigma_{\hat{\mu}}^2 = \operatorname{Var}\!\left(\frac{x_1 + x_2 + x_3 + \cdots + x_n}{n}\right) = \frac{n\sigma^2}{n^2} \]

\[ \Rightarrow \sigma_{\hat{\mu}}^2 = \frac{\sigma^2}{n} \qquad (10) \]

We know that the MLE mean is unbiased. Using this knowledge and Eq. (10), and substituting in Eq. (7), we get:

\[ E[\hat{\sigma}^2] = (\sigma^2 + \mu^2) - \left(\frac{\sigma^2}{n} + \mu^2\right) \]

\[ \Rightarrow E[\hat{\sigma}^2] = \sigma^2 - \frac{\sigma^2}{n} \]

\[ \Rightarrow E[\hat{\sigma}^2] = \frac{(n-1)\,\sigma^2}{n} \qquad (11) \]

This shows that the MLE variance IS biased.
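The bias factor $(n-1)/n$ shows up clearly in simulation, and dividing by $n-1$ instead of $n$ (Bessel's correction, a standard fix not discussed above) removes it. A Monte Carlo sketch of my own:

```python
import random

# Illustrative simulation: the MLE variance underestimates sigma^2 by a
# factor of (n-1)/n, matching Eq. (11); rescaling by n/(n-1) removes the bias.
random.seed(2)
mu, sigma, n, trials = 0.0, 2.0, 5, 100000

total_mle_var = 0.0
for _ in range(trials):
    xs = [random.gauss(mu, sigma) for _ in range(n)]
    m = sum(xs) / n
    total_mle_var += sum((x - m) ** 2 for x in xs) / n  # Eq. (2)

avg_mle_var = total_mle_var / trials
print(avg_mle_var)                # close to (n-1)/n * sigma^2 = 3.2
print(avg_mle_var * n / (n - 1))  # close to sigma^2 = 4.0
```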
