
STATS 200 (Stanford University, Summer 2015)

Solutions to Homework 2
DeGroot & Schervish X.Y.Z means Exercise Z at the end of Section X.Y in our text,
Probability and Statistics (Fourth Edition) by Morris H. DeGroot and Mark J. Schervish.
1. DeGroot & Schervish 8.2.10.
Solution: Let $U_1 = X_1 + X_2 + X_3$ and $U_2 = X_4 + X_5 + X_6$. Since $X_1, \ldots, X_6$ are iid $N(0, 1)$, we have $U_1, U_2$ iid $N(0, 3)$. Then $U_1/\sqrt{3}$ and $U_2/\sqrt{3}$ are iid $N(0, 1)$, so
$$\frac{Y}{3} = \left(\frac{U_1}{\sqrt{3}}\right)^2 + \left(\frac{U_2}{\sqrt{3}}\right)^2 \sim \chi^2_2.$$
Thus, c = 1/3.
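As an optional sanity check (an addition, not part of the original solution), the following Python sketch simulates $Y = U_1^2 + U_2^2$ and tests $Y/3$ against the $\chi^2_2$ distribution; the sample size and seed are arbitrary.

```python
# Simulation sketch: check that Y/3 ~ chi^2_2, i.e., that c = 1/3.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
x = rng.standard_normal((100_000, 6))                      # each row: X_1, ..., X_6 iid N(0, 1)
y = x[:, :3].sum(axis=1) ** 2 + x[:, 3:].sum(axis=1) ** 2  # Y = U_1^2 + U_2^2

# Kolmogorov-Smirnov test of Y/3 against chi^2_2; a large p-value is
# consistent with the claimed distribution.
print(stats.kstest(y / 3, stats.chi2(2).cdf))
```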

2. Let $U_n \sim \chi^2_n$. Show that $n^{-1/2}(U_n - n)$ converges in distribution as $n \to \infty$, and find the limiting distribution.
Note: Recall that the $\chi^2_n$ distribution has expectation $n$ and variance $2n$. You may use these facts without proof.
Solution: Let $\{Z_n : n \ge 1\}$ be a sequence of independent random variables such that $Z_n \sim N(0, 1)$ for every $n \ge 1$. Then for every $n \ge 1$, the random variable $\sum_{i=1}^n Z_i^2$ has the same distribution as $U_n$ (specifically, $\chi^2_n$). Now observe that $Z_1^2 \sim \chi^2_1$, so $E(Z_1^2) = 1$ and $\operatorname{Var}(Z_1^2) = 2 < \infty$. Then by the central limit theorem,
$$\frac{1}{\sqrt{n}}\left(\sum_{i=1}^n Z_i^2 - n\right) = \sqrt{n}\left(\frac{1}{n}\sum_{i=1}^n Z_i^2 - 1\right) \to_D N(0, 2).$$
Since $U_n$ and $\sum_{i=1}^n Z_i^2$ have the same distribution for every $n \ge 1$, it follows immediately that $n^{-1/2}(U_n - n) \to_D N(0, 2)$ as well.
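Again as an optional check (an addition, not from the original solution), a brief simulation: for a large fixed $n$, the standardized values $n^{-1/2}(U_n - n)$ should have mean close to 0 and variance close to 2.

```python
# Simulation sketch: n^{-1/2}(U_n - n) should be approximately N(0, 2) for large n.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
u = rng.chisquare(df=n, size=100_000)   # draws of U_n ~ chi^2_n
w = (u - n) / np.sqrt(n)
print(w.mean(), w.var())                # approximately 0 and 2
```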

3. DeGroot & Schervish 7.5.8.


Solution to (a): Let $m = \min_{1 \le i \le n} x_i$. The likelihood is
$$L_x(\theta) = \prod_{i=1}^n \exp(\theta - x_i)\, I_{(-\infty,\, x_i)}(\theta) = \exp\left(-\sum_{i=1}^n x_i\right) \exp(n\theta)\, I_{(-\infty,\, m)}(\theta).$$
Observe that the likelihood $L_x(\theta)$ is strictly positive and strictly increasing for $\theta < m$, while $L_x(\theta) = 0$ for all $\theta \ge m$. Now note that $\lim_{\theta \uparrow m} L_x(\theta) = \exp(nm - \sum_{i=1}^n x_i) > 0$, where the notation $\lim_{\theta \uparrow m}$ denotes the limit as $\theta$ increases toward $m$ (i.e., the limit from the left). However, evaluating the likelihood at $m$ itself yields $L_x(m) = 0$, so this supremum is never attained. Thus, there is no value of $\theta$ that maximizes $L_x(\theta)$, and so the maximum likelihood estimator does not exist.

Solution to (b): We could instead simply take the pdf to be
$$f(x \mid \theta) = \begin{cases} \exp(\theta - x) & \text{if } x \ge \theta, \\ 0 & \text{if } x < \theta. \end{cases}$$
Then $L_x(m) = \exp(nm - \sum_{i=1}^n x_i) \ge L_x(\theta)$ for all $\theta$, so the likelihood attains its maximum at $\theta = m$. Thus, the maximum likelihood estimator of $\theta$ is $\hat{\theta} = m = \min_{1 \le i \le n} X_i$.
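To make the distinction between (a) and (b) concrete, here is a small numeric illustration (an addition, not part of the original solutions); the sample, true parameter, and seed are arbitrary.

```python
# Numeric illustration: the log-likelihood n*theta - sum(x_i) increases as theta
# approaches m from the left, but under the part-(a) pdf (support x > theta) it
# drops to -infinity exactly at theta = m, while under the part-(b) pdf
# (support x >= theta) it is finite and maximal there.
import numpy as np

rng = np.random.default_rng(0)
x = 2.0 + rng.exponential(size=10)      # sample with true theta = 2, so n = 10
m = x.min()

def loglik(theta, closed_support=False):
    ok = (x >= theta).all() if closed_support else (x > theta).all()
    return x.size * theta - x.sum() if ok else -np.inf

for theta in [m - 0.1, m - 0.01, m - 0.001, m]:
    print(theta, loglik(theta), loglik(theta, closed_support=True))
```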


4. Let $X_1, \ldots, X_n$ be iid $N(0, \sigma^2)$, where $\sigma^2 > 0$ is unknown. Find the maximum likelihood estimator of $\sigma^2$.
Solution: The likelihood and log-likelihood are
$$L_x(\sigma^2) = \prod_{i=1}^n \frac{1}{\sqrt{2\pi\sigma^2}} \exp\left(-\frac{x_i^2}{2\sigma^2}\right) = (2\pi\sigma^2)^{-n/2} \exp\left(-\frac{1}{2\sigma^2} \sum_{i=1}^n x_i^2\right),$$
$$\ell_x(\sigma^2) = \log L_x(\sigma^2) = -\frac{n}{2}\log(2\pi) - \frac{n}{2}\log\sigma^2 - \frac{1}{2\sigma^2}\sum_{i=1}^n x_i^2.$$

Then
$$\frac{\partial}{\partial(\sigma^2)}\,\ell_x(\sigma^2) = -\frac{n}{2\sigma^2} + \frac{1}{2(\sigma^2)^2}\sum_{i=1}^n x_i^2 = 0 \iff \sigma^2 = \frac{1}{n}\sum_{i=1}^n x_i^2,$$
and it can be seen that this critical point is indeed the maximizer, i.e.,
$$\ell_x\!\left(\frac{1}{n}\sum_{i=1}^n x_i^2\right) = \max_{\sigma^2 > 0} \ell_x(\sigma^2)$$
for all $x \in \mathbb{R}^n$.

Thus, the maximum likelihood estimator of $\sigma^2$ is $\hat{\sigma}^2 = n^{-1}\sum_{i=1}^n X_i^2$.
Note: If $x_i = 0$ for every $i \in \{1, \ldots, n\}$, then $n^{-1}\sum_{i=1}^n x_i^2 = 0$, which is outside the parameter space and hence cannot be the maximum likelihood estimate. However, since $X_1, \ldots, X_n$ are continuous random variables, the probability that they will all equal zero (or indeed that any of them will equal zero) is zero. Thus, we can safely forget about this issue.
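As an optional numeric check (not part of the original solution), the closed-form MLE can be compared against a direct numerical maximization of the log-likelihood; the data, bounds, and seed below are arbitrary.

```python
# Numeric check: closed-form MLE (1/n) * sum(x_i^2) vs. numerical maximization.
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)
x = rng.normal(scale=2.0, size=500)     # true sigma^2 = 4

def neg_loglik(s2):
    n = x.size
    return 0.5 * n * np.log(2 * np.pi * s2) + (x @ x) / (2 * s2)

closed_form = np.mean(x ** 2)
numerical = minimize_scalar(neg_loglik, bounds=(1e-6, 100.0), method="bounded").x
print(closed_form, numerical)           # the two should agree closely
```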


5. Let $Y_n \sim \text{Bin}(n, \theta)$, where $\theta$ is unknown and $0 \le \theta \le 1$.


(a) Show that the maximum likelihood estimator of $\theta$ is $\hat{\theta}_n = Y_n/n$.
Solution: The likelihood is $L_x(\theta) = \binom{n}{x}\theta^x(1 - \theta)^{n-x}$, and hence the log-likelihood is $\ell_x(\theta) = \log\binom{n}{x} + x\log\theta + (n - x)\log(1 - \theta)$. Then
$$\frac{\partial}{\partial\theta}\,\ell_x(\theta) = \frac{x}{\theta} - \frac{n - x}{1 - \theta} = 0 \iff \theta = \frac{x}{n},$$
and it can be seen that this critical point is indeed the maximizer, i.e.,
$$\ell_x\!\left(\frac{x}{n}\right) = \max_{0 \le \theta \le 1} \ell_x(\theta)$$
for all $x \in \{0, \ldots, n\}$.
Thus, the maximum likelihood estimator of $\theta$ is $\hat{\theta}_n = Y_n/n$.
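A quick grid-search check (an addition, not from the original solution) for illustrative values of $n$ and $x$:

```python
# Grid-search check: the binomial log-likelihood is maximized at theta = x/n.
import numpy as np
from scipy import stats

n, x = 50, 17
theta = np.linspace(1e-6, 1 - 1e-6, 100_001)
loglik = stats.binom.logpmf(x, n, theta)
print(theta[np.argmax(loglik)], x / n)  # both approximately 0.34
```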

(b) Let $\psi = \arcsin(\sqrt{\theta})$. Find the maximum likelihood estimator $\hat{\psi}_n$ of $\psi$.
Note: There is no need to worry about any issues here with the parameter space $\Psi$ for the parameter $\psi$, i.e., you may assume that $\Psi$ simply follows from the parameter space $\Theta = [0, 1]$ for $\theta$. More formally, you may assume that $\Psi$ is the image of $\Theta$ under the function $g(t) = \arcsin(\sqrt{t})$, which is $\Psi = [0, \pi/2]$, i.e., $0 \le \psi \le \pi/2$.
Solution: The maximum likelihood estimator of $\psi = \arcsin(\sqrt{\theta})$ is simply
$$\hat{\psi}_n = \arcsin\left(\sqrt{\hat{\theta}_n}\right) = \arcsin\left(\sqrt{Y_n/n}\right)$$
by Theorem 4.2.4 of the lecture notes (invariance of the maximum likelihood estimator under transformations of the parameter).

(c) Show that $\sqrt{n}(\hat{\psi}_n - \psi)$ converges in distribution as $n \to \infty$, and find the limiting distribution. What do you notice about the variance of the limiting distribution? (Specifically, how does it depend on $\theta$, or equivalently, on $\psi$?)
Note 1: You may use the result of Example 2.2.4 of the lecture notes without proof.
Note 2: The derivative of the arcsin function is $\frac{d}{dt}\arcsin t = (1 - t^2)^{-1/2}$.

Solution: Let $g(t) = \arcsin(\sqrt{t})$, which has derivative
$$g'(t) = \frac{1}{\sqrt{1 - (\sqrt{t})^2}}\left(\frac{1}{2\sqrt{t}}\right) = \frac{1}{2\sqrt{t(1 - t)}}.$$
Now note that $[g'(\theta)]^2 = 1/[4\theta(1 - \theta)]$. Since $\sqrt{n}(\hat{\theta}_n - \theta) \to_D N(0, \theta(1 - \theta))$ (Example 2.2.4 of the lecture notes), the delta method gives
$$\sqrt{n}(\hat{\psi}_n - \psi) = \sqrt{n}\left[g(\hat{\theta}_n) - g(\theta)\right] \to_D N\!\left(0,\ [g'(\theta)]^2\,\theta(1 - \theta)\right) = N(0, 1/4).$$
The variance of the limiting distribution does not depend on $\theta$ (or, equivalently, on $\psi$) at all.
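This variance-stabilizing property is easy to see in simulation (an added sketch, not from the original solutions); the values of $n$, $\theta$, and the seed below are arbitrary.

```python
# Simulation sketch: n * Var[arcsin(sqrt(Y_n / n))] is roughly 1/4 for every theta.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
for theta in [0.1, 0.3, 0.5, 0.7, 0.9]:
    y = rng.binomial(n, theta, size=50_000)
    psi_hat = np.arcsin(np.sqrt(y / n))
    print(theta, n * psi_hat.var())     # all close to 0.25
```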


6. Let $Y$ be a single observation of a Geometric($\theta$) random variable with pmf $f(y) = \theta(1 - \theta)^y$ for all integers $y \ge 0$ (and zero otherwise), where $\theta$ is unknown and $0 < \theta \le 1$.
Note: Under this setup, $Y$ counts the number of failures before the first success occurs in a sequence of iid trials.
(a) Find the maximum likelihood estimator of .
Solution: The likelihood is $L_y(\theta) = \theta(1 - \theta)^y$, and hence the log-likelihood is $\ell_y(\theta) = \log L_y(\theta) = y\log(1 - \theta) + \log\theta$. Then
$$\frac{\partial}{\partial\theta}\,\ell_y(\theta) = \frac{1}{\theta} - \frac{y}{1 - \theta} = 0 \iff \theta = \frac{1}{y + 1},$$
and it can be seen that this critical point is indeed the maximizer, i.e.,
$$\ell_y\!\left(\frac{1}{y + 1}\right) = \max_{0 < \theta \le 1} \ell_y(\theta)$$
for all integers $y \ge 0$.
Thus, the maximum likelihood estimator of $\theta$ is $\hat{\theta} = 1/(Y + 1)$.
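A short grid-search check (an addition, not from the original solution) for a few illustrative values of $y$:

```python
# Grid-search check: the geometric log-likelihood is maximized at theta = 1/(y + 1).
import numpy as np

theta = np.linspace(1e-6, 1 - 1e-9, 100_000)
for y in [0, 1, 5, 20]:
    loglik = y * np.log1p(-theta) + np.log(theta)
    print(y, theta[np.argmax(loglik)], 1 / (y + 1))  # grid argmax vs. 1/(y + 1)
```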

(b) Explain the connection between the estimator in part (a) of this problem and the
maximum likelihood estimator in part (a) of problem 5 above.
Hint: Recall the note about successes and failures.
Solution: In part (a) of problem 5, the maximum likelihood estimator was the
number of successes divided by the total number of trials. Now consider the geometric random variable Y in this problem. Recall that Y counts the number of failures
before the first success occurs. Then once this first success occurs, the number of
successes that have occurred is 1, while the total number of trials that have occurred
is Y + 1. Thus, the maximum likelihood estimator can again be interpreted as the
number of successes divided by the total number of trials.

7. Suppose that a large box contains tickets numbered from 1 to $\theta$, where $\theta$ is an unknown positive integer. A ticket is randomly drawn from the box, and we assume that each ticket is equally likely to be drawn (i.e., the number on the ticket has a discrete uniform distribution on the set $\{1, \ldots, \theta\}$). Suppose ticket #715 is drawn. What is the maximum likelihood estimate of $\theta$ based on this observation?
Solution: The likelihood based on drawing ticket #715 is
$$L_{715}(\theta) = \frac{1}{\theta}\,\mathbf{1}_{\{715, 716, \ldots\}}(\theta) = \begin{cases} 1/\theta & \text{if } \theta \in \{715, 716, \ldots\}, \\ 0 & \text{if } \theta \notin \{715, 716, \ldots\}. \end{cases}$$
Now simply note that the function $1/\theta$ is decreasing in $\theta$, so its maximum on the set $\{715, 716, \ldots\}$ occurs at $\theta = 715$. Thus, the maximum likelihood estimate is $\hat{\theta}_{\text{MLE}} = 715$.
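A trivial numeric confirmation (an addition, not from the original solution), truncating the infinite parameter set at an arbitrary upper bound:

```python
# Numeric confirmation: 1/theta is maximized at the smallest admissible theta.
import numpy as np

theta = np.arange(715, 10_000)          # admissible values, truncated for illustration
print(theta[np.argmax(1.0 / theta)])    # prints 715
```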
Note: Many people find this result counterintuitive. Drawing ticket #715 tells us with certainty that the number of tickets in the box is at least 715. It may be more than 715, but it cannot be less than 715. Thus, some people might feel that 715 itself is not a good estimate of the number of tickets in the box. (Of course, whether or not 715 is a good estimate of $\theta$ depends on precisely what we mean by "good.")
