\[
f_\theta(x) =
\begin{cases}
\dfrac{(x+1)\,\theta^x}{(1+\theta)^{x+2}} & \text{if } x \in \{0, 1, 2, \dots\}, \\[4pt]
0 & \text{otherwise.}
\end{cases}
\]
Setting the derivative of the log-likelihood to zero,
\[
\ell_x'(\theta) = \sum_{i=1}^n \frac{x_i}{\theta} - \sum_{i=1}^n \frac{x_i + 2}{1+\theta} = 0
\;\Longleftrightarrow\;
(1+\theta)\sum_{i=1}^n x_i = \theta\Bigl(2n + \sum_{i=1}^n x_i\Bigr)
\;\Longleftrightarrow\;
\theta = \frac{1}{2n}\sum_{i=1}^n x_i .
\]
(It can be verified by any of several approaches that this critical point is indeed the maximum.) Then
\[
\hat\theta_n = \frac{1}{2n}\sum_{i=1}^n X_i
\]
is the maximum likelihood estimator of $\theta$.
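As a sanity check, the estimator can be tested by simulation. The sketch below assumes the pmf above is that of a Negative Binomial distribution with $r = 2$ successes and success probability $1/(1+\theta)$ (consistent with mean $2\theta$); that identification is an assumption of this sketch, not stated in the solution.

```python
import numpy as np

# Sketch (assumption): the pmf above is Negative Binomial with r = 2 and
# success probability p = 1/(1 + theta), so E(X1) = 2*theta and the MLE
# should be theta_hat = (sample mean) / 2.
rng = np.random.default_rng(0)
theta = 1.5
p = 1.0 / (1.0 + theta)
x = rng.negative_binomial(2, p, size=200_000)   # counts failures before 2 successes
theta_hat = x.mean() / 2
print(theta_hat)   # close to 1.5
```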
(b) Let $\hat\theta_n$ denote the maximum likelihood estimator of $\theta$ from part (a). Show that $\sqrt{n}\,(\hat\theta_n - \theta)$ converges in distribution as $n \to \infty$, and find the limiting distribution.
Solution: By the central limit theorem,
\[
\sqrt{n}\Bigl(\frac{1}{n}\sum_{i=1}^n X_i - 2\theta\Bigr) \to_D N[0,\; 2\theta(1+\theta)],
\]
since $E_\theta(X_1) = 2\theta$ and $\mathrm{Var}_\theta(X_1) = 2\theta(1+\theta)$. Then
\[
\sqrt{n}\,(\hat\theta_n - \theta) \to_D N\Bigl[0,\; \frac{\theta(1+\theta)}{2}\Bigr],
\]
noting that $\theta(1+\theta)/2 = 2\theta(1+\theta)/2^2$.
(c) Let $\sigma^2 = \mathrm{Var}_\theta(X_1) = 2\theta(1+\theta)$. Find the maximum likelihood estimator of $\sigma^2$.
Note: Assume the parameter space for $\theta$ is $[0, \infty)$, i.e., the parameter space for $\sigma^2$ is exactly what it logically should be.
Solution: By the invariance property of maximum likelihood estimation, the maximum likelihood estimator of $\sigma^2$ is
\[
\hat\sigma_n^2 = 2\hat\theta_n(1 + \hat\theta_n).
\]
(d) Let $\hat\sigma_n^2$ denote the maximum likelihood estimator of $\sigma^2$ from part (c). Show that $\sqrt{n}\,(\hat\sigma_n^2 - \sigma^2)$ converges in distribution as $n \to \infty$, and find the limiting distribution.
Solution: Let $g(t) = 2t(1+t)$, so that $\sigma^2 = g(\theta)$. Then $g'(t) = 2(1+2t)$, and thus
\[
\sqrt{n}\,(\hat\sigma_n^2 - \sigma^2) \to_D N\bigl[0,\; 2\theta(1+\theta)(1+2\theta)^2\bigr]
\]
by the delta method, noting that $2\theta(1+\theta)(1+2\theta)^2 = [\theta(1+\theta)/2]\,[2(1+2\theta)]^2$.
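The limiting variance can be checked by Monte Carlo under the same Negative Binomial identification as in part (a) ($r = 2$, $p = 1/(1+\theta)$), which is an assumption of this sketch:

```python
import numpy as np

# Sketch: compare the empirical variance of sqrt(n)*(sigma2_hat - sigma2)
# against the delta-method value 2*theta*(1+theta)*(1+2*theta)^2.
# Assumes the pmf is Negative Binomial with r = 2, p = 1/(1+theta).
rng = np.random.default_rng(1)
theta, n, reps = 1.0, 400, 20_000
x = rng.negative_binomial(2, 1.0 / (1.0 + theta), size=(reps, n))
theta_hat = x.mean(axis=1) / 2
sigma2_hat = 2 * theta_hat * (1 + theta_hat)
empirical = np.var(np.sqrt(n) * (sigma2_hat - 2 * theta * (1 + theta)))
print(empirical, 2 * theta * (1 + theta) * (1 + 2 * theta) ** 2)  # both near 36
```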
\[
\widehat{\lambda^2} = \Bigl(\frac{1}{n}\sum_{i=1}^n X_i\Bigr)^2 .
\]
Then
\[
\mathrm{Bias}_\lambda(\widehat{\lambda^2})
= E_\lambda(\widehat{\lambda^2}) - \lambda^2
= E_\lambda\Bigl[\Bigl(\frac{1}{n}\sum_{i=1}^n X_i\Bigr)^2\Bigr] - \lambda^2
= \frac{1}{n^2}\Bigl[E_\lambda\Bigl(\sum_{i=1}^n X_i\Bigr)\Bigr]^2
+ \frac{1}{n^2}\,\mathrm{Var}_\lambda\Bigl(\sum_{i=1}^n X_i\Bigr) - \lambda^2
= \frac{1}{n^2}(n\lambda)^2 + \frac{1}{n^2}(n\lambda) - \lambda^2
= \frac{\lambda}{n},
\]
since $\sum_{i=1}^n X_i \sim \mathrm{Poisson}(n\lambda)$.
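A quick Monte Carlo check of the bias formula:

```python
import numpy as np

# Sketch: check that the bias of (X_bar)^2 as an estimator of lambda^2
# is lambda/n for Poisson(lambda) data.
rng = np.random.default_rng(2)
lam, n, reps = 3.0, 10, 400_000
x = rng.poisson(lam, size=(reps, n))
bias_mc = (x.mean(axis=1) ** 2).mean() - lam ** 2
print(bias_mc, lam / n)   # both near 0.3
```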
\[
f_\theta(x) =
\begin{cases}
\dfrac{\theta k^\theta}{x^{\theta+1}} & \text{if } x \ge k, \\[4pt]
0 & \text{if } x < k,
\end{cases}
\]
and its mean is $\theta k/(\theta - 1)$ if $\theta > 1$ (and $\infty$ if $\theta \le 1$). The Gamma$(a, b)$ distribution has pdf
\[
f(x) =
\begin{cases}
\dfrac{b^a}{\Gamma(a)}\, x^{a-1} \exp(-bx) & \text{if } x > 0, \\[4pt]
0 & \text{if } x \le 0.
\end{cases}
\]
Ignoring terms that do not depend on $\theta$, the posterior is
\[
\pi(\theta \mid x)
\propto \Bigl[\prod_{i=1}^n \frac{\theta k^\theta}{x_i^{\theta+1}}\Bigr]\, \theta^{a-1} \exp(-b\theta)
\propto \theta^{n+a-1} \exp\Bigl[-\Bigl(b - n\log k + \sum_{i=1}^n \log x_i\Bigr)\theta\Bigr],
\]
which we recognize as an unnormalized Gamma$\bigl(n+a,\; b - n\log k + \sum_{i=1}^n \log x_i\bigr)$ distribution. (Note that $\sum_{i=1}^n \log x_i - n\log k \ge 0$ since $x_i \ge k$ for each $i \in \{1, \dots, n\}$.) Thus, the posterior mean is
\[
E(\theta \mid x) = \frac{n+a}{b - n\log k + \sum_{i=1}^n \log x_i}.
\]
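For concreteness, the posterior parameters and mean can be computed directly; $k$, $a$, $b$, and the data below are made-up values for this sketch.

```python
import numpy as np

# Sketch with hypothetical values: k = 1.0, a = 2.0, b = 1.0, and four
# observations (all >= k, as the Pareto support requires).
k, a, b = 1.0, 2.0, 1.0
x = np.array([1.2, 1.5, 2.0, 3.1])
n = len(x)
shape = n + a
rate = b - n * np.log(k) + np.log(x).sum()   # always >= b > 0
print(shape, rate, shape / rate)             # posterior shape, rate, and mean
```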
\[
E(Y) = \sum_{i=1}^n a_i\, E(X_i) = 0,
\qquad
\mathrm{Var}(Y) = \sum_{i=1}^n a_i^2\, \mathrm{Var}(X_i) = \sum_{i=1}^n a_i^2 .
\]
Then $\sqrt{b}\, Y \sim N(0, 1)$, so $bY^2 \sim \chi_1^2$.
6. Let $X$ be a discrete random variable with pmf $f_\theta(x)$, where $\theta \in \mathbb{R}$ is unknown. Let $\mathcal{X} = \{x \in \mathbb{R} : f_\theta(x) > 0\}$ denote the support of the pmf $f_\theta(x)$, and suppose that $\mathcal{X}$ does not depend on $\theta$. Now suppose that we have a prior $\pi(\theta)$ such that the prior mean exists and is finite, i.e.,
\[
-\infty < \int \theta\, \pi(\theta)\, d\theta < \infty .
\]
Show that the posterior mean $E(\theta \mid X = x)$ exists and is finite for all data values $x \in \mathcal{X}$.
Hints: For a sum to be finite, it is necessary (though not sufficient) for every term in the sum to be finite. Also, since $\mathcal{X}$ does not depend on $\theta$, the marginal distribution of $X$ is strictly positive for all $x \in \mathcal{X}$, i.e., $m(x) > 0$ for all $x \in \mathcal{X}$.
Solution (Method I): The prior $\pi(\theta)$ is simply the marginal distribution of $\theta$, so the prior mean is $E(\theta)$. By the law of total expectation,
\[
E(\theta) = E[E(\theta \mid X)]
= \sum_{x \in \mathcal{X}} E(\theta \mid X = x)\, P(X = x)
= \sum_{x \in \mathcal{X}} E(\theta \mid X = x)\, m(x).
\]
Since $E(\theta)$ is finite, every term in the sum must be finite. Then $E(\theta \mid X = x)$ must be finite for all $x \in \mathcal{X}$, noting that $m(x) > 0$ for all $x \in \mathcal{X}$.
Solution (Method II): First, note that $f_\theta(x) \le 1$ for all $x \in \mathcal{X}$ and all $\theta \in \mathbb{R}$ since it is a pmf. Then
\[
\bigl|E(\theta \mid X = x)\bigr|
= \Bigl|\int \theta\, \pi(\theta \mid X = x)\, d\theta\Bigr|
\le \int |\theta|\, \pi(\theta \mid X = x)\, d\theta
= \int |\theta|\, \frac{f_\theta(x)\, \pi(\theta)}{m(x)}\, d\theta
\le \frac{1}{m(x)} \int |\theta|\, \pi(\theta)\, d\theta < \infty,
\]
noting that $m(x) > 0$ for all $x \in \mathcal{X}$ and that $\int |\theta|\, \pi(\theta)\, d\theta$ is finite since the prior mean of $\theta$ exists and is finite.
8. Let $X_1, \dots, X_n$ be iid random variables such that $E_{\mu,\sigma^2}(X_1) = \mu$ and $\mathrm{Var}_{\mu,\sigma^2}(X_1) = \sigma^2$ are both finite. However, suppose that $X_1, \dots, X_n$ are not normally distributed. Define
\[
\bar X_n = \frac{1}{n}\sum_{i=1}^n X_i,
\qquad
S_n^2 = \frac{1}{n-1}\sum_{i=1}^n (X_i - \bar X_n)^2
= \frac{1}{n-1}\Bigl[\sum_{i=1}^n X_i^2 - n(\bar X_n)^2\Bigr].
\]
Then
\[
E_{\mu,\sigma^2}(S_n^2) = \sigma^2 .
\]
(d) Do we know for certain that $S_n^2$ is the maximum likelihood estimator of $\sigma^2$?
Note: A simple answer of Yes or No is good enough.
Solution: No, since we don't even know what the likelihood function is. Also note that even if $X_1, \dots, X_n$ were normally distributed, $S_n^2$ wouldn't be the MLE of $\sigma^2$ anyway. (An explanation is not required.)
\[
f_\theta(x) =
\begin{cases}
\theta(1-x)^{\theta-1} & \text{if } 0 < x < 1, \\
0 & \text{otherwise.}
\end{cases}
\]
Also,
\[
E_\theta(X_1) = \frac{1}{1+\theta},
\qquad
\mathrm{Var}_\theta(X_1) = \frac{\theta}{(1+\theta)^2(2+\theta)}.
\]
The log-likelihood is
\[
\ell_{X^n}(\theta) = \sum_{i=1}^n \bigl[\log\theta + (\theta-1)\log(1-X_i)\bigr]
= n\log\theta + (\theta-1)\sum_{i=1}^n \log(1-X_i),
\]
and setting its derivative to zero gives
\[
\ell_{X^n}'(\theta) = \frac{n}{\theta} + \sum_{i=1}^n \log(1-X_i) = 0
\;\Longleftrightarrow\;
\theta = -\frac{n}{\sum_{i=1}^n \log(1-X_i)}.
\]
This point is the only critical point, and it can be seen from the form of the log-likelihood that $\ell_{X^n}(\theta) \to -\infty$ both as $\theta \to 0$ and as $\theta \to \infty$. Then the critical point is indeed the maximum, and
\[
\hat\theta_n^{\mathrm{MLE}} = -\frac{n}{\sum_{i=1}^n \log(1-X_i)}.
\]
(The justification for why the critical point is indeed the maximum is not required for full credit since this fact is fairly obvious by inspection of the log-likelihood.)
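The estimator can be checked by simulating from the density via the inversion method:

```python
import numpy as np

# Sketch: check theta_hat = -n / sum(log(1 - X_i)) by simulation. By
# inversion, X = 1 - U**(1/theta) with U ~ Unif(0,1) has cdf
# 1 - (1 - x)**theta, i.e., exactly the density theta*(1-x)**(theta-1).
rng = np.random.default_rng(3)
theta, n = 2.5, 200_000
x = 1.0 - rng.uniform(size=n) ** (1.0 / theta)
theta_hat = -n / np.sum(np.log1p(-x))   # log1p(-x) = log(1 - x)
print(theta_hat)   # close to 2.5
```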
(b) Do we know for certain that $\hat\theta_n^{\mathrm{MLE}}$ is an unbiased estimator of $\theta$?
Note: A simple answer of Yes or No is good enough.
Solution: No. There is no reason why it necessarily must be, and indeed it isn't. (An explanation is not required.)
Let $\bar X_n = n^{-1}\sum_{i=1}^n X_i$. By the central limit theorem,
\[
\sqrt{n}\Bigl(\bar X_n - \frac{1}{1+\theta}\Bigr)
\to_D N\Bigl[0,\; \frac{\theta}{(1+\theta)^2(2+\theta)}\Bigr].
\]
Let $g(t) = 1/t - 1$, so that $g\bigl(1/(1+\theta)\bigr) = \theta$. Then $g'(t) = -1/t^2$, and
\[
\Bigl[g'\Bigl(\frac{1}{1+\theta}\Bigr)\Bigr]^2 \cdot \frac{\theta}{(1+\theta)^2(2+\theta)}
= (1+\theta)^4 \cdot \frac{\theta}{(1+\theta)^2(2+\theta)}
= \frac{\theta(1+\theta)^2}{2+\theta}.
\]
Then
\[
\sqrt{n}\,(\hat\theta_n - \theta) \to_D N\Bigl[0,\; \frac{\theta(1+\theta)^2}{2+\theta}\Bigr]
\]
by the delta method.
10. Let $X_1, X_2, \dots$ be a sequence of iid Unif$(0, 1)$ random variables. For each $n \ge 1$, let $Y_n$ have a Bin$(m, x_n)$ distribution conditional on $X_n = x_n$, where $m \ge 1$ is an integer.
(a) Find $E(Y_1)$ and $\mathrm{Var}(Y_1)$ (not conditional on $X_1$).
Note: The Unif$(0, 1)$ distribution has mean $1/2$ and variance $1/12$, and the Bin$(m, \theta)$ distribution has mean $m\theta$ and variance $m\theta(1-\theta)$. You may use any of these facts without proof.
Solution: By the laws of total expectation and variance,
\[
E(Y_1) = E[E(Y_1 \mid X_1)] = E(mX_1) = \frac{m}{2},
\]
\[
\mathrm{Var}(Y_1) = E[\mathrm{Var}(Y_1 \mid X_1)] + \mathrm{Var}[E(Y_1 \mid X_1)]
= E[mX_1(1 - X_1)] + \mathrm{Var}[mX_1]
= m\,E(X_1) - m\,E(X_1^2) + m^2\,\mathrm{Var}(X_1)
= m\Bigl[\frac{1}{2} - \Bigl(\frac{1}{12} + \Bigl(\frac{1}{2}\Bigr)^2\Bigr)\Bigr] + \frac{m^2}{12}
= \frac{m(m+2)}{12},
\]
where we have used the fact that $E(X_1^2) = \mathrm{Var}(X_1) + [E(X_1)]^2$.
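A Monte Carlo check of both moments ($m = 6$ is an arbitrary choice for this sketch):

```python
import numpy as np

# Sketch: check E(Y1) = m/2 and Var(Y1) = m(m+2)/12 for
# Y1 | X1 ~ Bin(m, X1) with X1 ~ Unif(0, 1).
rng = np.random.default_rng(4)
m, reps = 6, 1_000_000
x = rng.uniform(size=reps)
y = rng.binomial(m, x)
print(y.mean(), m / 2)              # both near 3.0
print(y.var(), m * (m + 2) / 12)    # both near 4.0
```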
(b) For each $n \ge 1$, let $Z_n = \sum_{i=1}^n Y_i$. Find sequences of constants $b_n$ and $c_n$ such that $b_n(Z_n - c_n) \to_D N(0, 1)$.
Solution: By the central limit theorem,
\[
\sqrt{n}\Bigl(\frac{Z_n}{n} - \frac{m}{2}\Bigr) \to_D N\Bigl[0,\; \frac{m(m+2)}{12}\Bigr],
\]
or equivalently,
\[
\sqrt{\frac{12}{m(m+2)\,n}}\,\Bigl(Z_n - \frac{mn}{2}\Bigr) \to_D N(0, 1).
\]
Thus, $b_n = \sqrt{12/[m(m+2)\,n]}$ and $c_n = mn/2$.
\[
f^{(X_n)}(x) =
\begin{cases}
n\exp(-nx) & \text{if } x \ge 0, \\
0 & \text{if } x < 0.
\end{cases}
\]
Prove that $X_n \to_P 0$.
Solution: The cdf of each $X_n$ is
\[
F^{(X_n)}(x) =
\begin{cases}
1 - \exp(-nx) & \text{if } x \ge 0, \\
0 & \text{if } x < 0.
\end{cases}
\]
Then for any $\varepsilon > 0$, $P(|X_n - 0| > \varepsilon) = P(X_n > \varepsilon) = 1 - F^{(X_n)}(\varepsilon) = \exp(-n\varepsilon) \to 0$.
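Since the tail probability is exactly $\exp(-n\varepsilon)$, a short computation illustrates the convergence:

```python
import math

# Sketch: P(|X_n - 0| > eps) = exp(-n * eps); tabulate it for eps = 0.1
# to see the probability vanish as n grows.
eps = 0.1
for n in (1, 10, 100, 1000):
    p = math.exp(-n * eps)
    print(n, p)
```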
12. Let $X_1, X_2, \dots$ be iid $N(\mu, \sigma^2)$ random variables, and let $\bar X_n$ and $S_n^2$ be the usual sample mean and sample variance (respectively) of the first $n$ observations, i.e.,
\[
\bar X_n = \frac{1}{n}\sum_{i=1}^n X_i,
\qquad
S_n^2 = \frac{1}{n-1}\sum_{i=1}^n (X_i - \bar X_n)^2
= \frac{n}{n-1}\Bigl[\frac{1}{n}\sum_{i=1}^n X_i^2 - (\bar X_n)^2\Bigr].
\]
Hints:
\[
\frac{1}{n}\sum_{i=1}^n X_i \to_P \mu,
\qquad
\frac{1}{n}\sum_{i=1}^n X_i^2 \to_P \sigma^2 + \mu^2 .
\]
Solution: Note that
\[
S_n^2 = \frac{n}{n-1}\Bigl[\frac{1}{n}\sum_{i=1}^n X_i^2 - (\bar X_n)^2\Bigr].
\]
The result now follows from the hint and the fact that $n/(n-1) \to 1$.
(b) Now suppose that $X_1, X_2, \dots$ are iid with mean $\mu$ and variance $\sigma^2$ (both finite), but their distribution is not normal. What additional conditions (if any) are needed on this distribution for the result of part (a) to hold?
Solution: No additional conditions are needed. The proof in part (a) needs only that $\mu$ and $\sigma^2$ are finite.
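The part (a) result can be illustrated numerically with deliberately non-normal data; Exponential(1), for which $\mu = \sigma^2 = 1$, is an arbitrary choice for this sketch.

```python
import numpy as np

# Sketch: S_n^2 ->_P sigma^2 with non-normal data. Exponential(1) has
# mu = 1 and sigma^2 = 1, so the sample variance should settle near 1.
rng = np.random.default_rng(5)
for n in (100, 10_000, 1_000_000):
    x = rng.exponential(1.0, size=n)
    s2 = x.var(ddof=1)   # sample variance with the n-1 divisor
    print(n, s2)
```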
\[
f_\theta(x) =
\begin{cases}
2\theta x \exp(-\theta x^2) & \text{if } x \ge 0, \\
0 & \text{if } x < 0,
\end{cases}
\]
where $\theta > 0$ is unknown. Suppose we assign a Gamma$(a, b)$ prior to $\theta$, where $a > 0$ and $b > 0$ are known.
Note: The Gamma$(a, b)$ distribution has pdf
\[
f(x) =
\begin{cases}
\dfrac{b^a}{\Gamma(a)}\, x^{a-1} \exp(-bx) & \text{if } x > 0, \\[4pt]
0 & \text{if } x \le 0,
\end{cases}
\]
and its mean is $a/b$. You may use these facts without proof.
(a) Find the posterior distribution of $\theta$.
Solution: Ignoring terms that do not depend on $\theta$, the posterior is
\[
\pi(\theta \mid x)
\propto \Bigl[\prod_{i=1}^n 2\theta x_i \exp(-\theta x_i^2)\Bigr]\, \theta^{a-1} \exp(-b\theta)
\propto \theta^{a+n-1} \exp\Bigl[-\Bigl(b + \sum_{i=1}^n x_i^2\Bigr)\theta\Bigr],
\]
which we recognize as the unnormalized pdf of a Gamma$\bigl(a+n,\; b + \sum_{i=1}^n x_i^2\bigr)$ distribution. Thus, $\theta \mid x \sim$ Gamma$\bigl(a+n,\; b + \sum_{i=1}^n x_i^2\bigr)$.
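For concreteness, the posterior parameters can be computed directly; $a = b = 1$ and the data below are made-up values for this sketch.

```python
import numpy as np

# Sketch with hypothetical a = b = 1 and three made-up observations:
# the posterior is Gamma(a + n, b + sum(x_i^2)).
a, b = 1.0, 1.0
x = np.array([0.5, 1.0, 1.5])
shape = a + len(x)
rate = b + np.sum(x ** 2)
print(shape, rate, shape / rate)   # posterior mean = shape/rate
```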
with $f^{(X,Y)}(x, y) = 0$ for all other values of $x$ and $y$. Find $E(Y \mid X = 0)$.
Solution: First, the conditional pmf of $Y$ given $X = 0$ is
\[
f^{(Y \mid X)}(0 \mid 0)
= \frac{f^{(X,Y)}(0, 0)}{f^{(X)}(0)}
= \frac{f^{(X,Y)}(0, 0)}{f^{(X,Y)}(0, 0) + f^{(X,Y)}(0, 1)}
= \frac{0.1}{0.1 + 0.4} = \frac{0.1}{0.5} = 0.2,
\]
\[
f^{(Y \mid X)}(1 \mid 0) = 1 - f^{(Y \mid X)}(0 \mid 0) = 1 - 0.2 = 0.8.
\]
Then $E(Y \mid X = 0) = 0(0.2) + 1(0.8) = 0.8$.
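The arithmetic above can be written out directly:

```python
# Sketch: the conditional-pmf computation, written out as plain arithmetic.
f_00, f_01 = 0.1, 0.4             # joint pmf at (x, y) = (0, 0) and (0, 1)
p0 = f_00 / (f_00 + f_01)         # P(Y = 0 | X = 0)
p1 = 1 - p0                       # P(Y = 1 | X = 0)
e_y = 0 * p0 + 1 * p1             # E(Y | X = 0)
print(p0, p1, e_y)
```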
\[
f_\theta(x) =
\begin{cases}
\sqrt{\dfrac{\theta}{2\pi x^3}}\, \exp\Bigl[-\dfrac{\theta(x-1)^2}{2x}\Bigr] & \text{if } x > 0, \\[6pt]
0 & \text{if } x \le 0.
\end{cases}
\]
The log-likelihood is
\[
\ell_x(\theta) = \frac{n}{2}\log\theta - \frac{n}{2}\log(2\pi) - \frac{3}{2}\sum_{i=1}^n \log x_i - \frac{\theta}{2}\sum_{i=1}^n \frac{(x_i - 1)^2}{x_i}.
\]
Differentiating yields
\[
\ell_x'(\theta) = \frac{n}{2\theta} - \frac{1}{2}\sum_{i=1}^n \frac{(x_i - 1)^2}{x_i} = 0
\;\Longleftrightarrow\;
\theta = \Bigl[\frac{1}{n}\sum_{i=1}^n \frac{(x_i - 1)^2}{x_i}\Bigr]^{-1}.
\]
Since there is only one critical point and $\ell_x(\theta) \to -\infty$ as $\theta \to 0$ and as $\theta \to \infty$, it is clear that this point is indeed the maximum. Hence,
\[
\hat\theta = \Bigl[\frac{1}{n}\sum_{i=1}^n \frac{(X_i - 1)^2}{X_i}\Bigr]^{-1}
\]
is the maximum likelihood estimator of $\theta$.
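A simulation check is possible if one recognizes the density as inverse Gaussian with mean 1 and shape $\theta$ (numpy's `wald(mean, scale)` parameterization); that identification is an assumption of this sketch.

```python
import numpy as np

# Sketch (assumption): the density above is inverse Gaussian with mean 1
# and shape theta, i.e., numpy's wald(1.0, theta). Check the MLE formula.
rng = np.random.default_rng(6)
theta, n = 2.0, 200_000
x = rng.wald(1.0, theta, size=n)
theta_hat = 1.0 / np.mean((x - 1.0) ** 2 / x)
print(theta_hat)   # close to 2.0
```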
\[
f_\theta(x) =
\begin{cases}
1 & \text{if } \theta < x < \theta + 1, \\
0 & \text{otherwise,}
\end{cases}
\]
so the likelihood is
\[
L_x(\theta) = \prod_{i=1}^n \mathbf{1}(\theta < x_i < \theta + 1)
= \mathbf{1}\Bigl(\max_{1 \le i \le n} x_i - 1 < \theta < \min_{1 \le i \le n} x_i\Bigr).
\]
Then for all possible observed values $x$, the maximum likelihood estimate can be taken as any value $\theta$ such that $\max_{1 \le i \le n} x_i - 1 < \theta < \min_{1 \le i \le n} x_i$. There are infinitely many such values since $\max_{1 \le i \le n} x_i - 1 < \min_{1 \le i \le n} x_i$. Thus, a maximum likelihood estimator exists but is not unique.
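A small sketch with a made-up sample shows the interval of maximizers:

```python
import numpy as np

# Sketch: for a made-up sample, every theta in (max(x) - 1, min(x))
# maximizes the likelihood, so the MLE is not unique.
x = np.array([0.3, 0.7, 0.9])
lo, hi = x.max() - 1.0, x.min()
theta_hat = (lo + hi) / 2   # the midpoint is one arbitrary valid choice
print(lo, hi, lo < theta_hat < hi)
```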
17. Let $X_1, \dots, X_n$ be iid Poisson$(\lambda)$ conditional on $\lambda$, and let the prior on $\lambda$ be Gamma$(a, b)$.
Note: The Poisson$(\lambda)$ distribution has pmf
\[
f_\lambda(x) = \frac{\lambda^x \exp(-\lambda)}{x!} \quad \text{for } x \in \{0, 1, 2, \dots\} \quad \text{(zero for all other $x$)},
\]
with mean $\lambda$ and variance $\lambda$. The Gamma$(a, b)$ distribution has pdf
\[
f(x) = \frac{b^a}{\Gamma(a)}\, x^{a-1} \exp(-bx) \quad \text{for } x > 0 .
\]
\[
f_\theta(x) =
\begin{cases}
2\theta x \exp(-\theta x^2) & \text{if } x \ge 0, \\
0 & \text{if } x < 0,
\end{cases}
\]
where $\theta > 0$ is unknown.
Setting the derivative of the log-likelihood to zero,
\[
\ell_x'(\theta) = \frac{n}{\theta} - \sum_{i=1}^n x_i^2 = 0
\;\Longleftrightarrow\;
\theta = \frac{n}{\sum_{i=1}^n x_i^2}.
\]
This is the only critical point, and $\ell_x(\theta) \to -\infty$ as $\theta \to 0$ and as $\theta \to \infty$. Thus,
\[
\hat\theta = \frac{n}{\sum_{i=1}^n X_i^2}
\]
is the maximum likelihood estimator of $\theta$.
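The estimator can be checked by simulation, using a change of variables: if $X$ has the density above, then $X^2 \sim$ Exponential with rate $\theta$.

```python
import numpy as np

# Sketch: check theta_hat = n / sum(X_i^2). If X has density
# 2*theta*x*exp(-theta*x^2) on x >= 0, then X^2 is Exponential with rate
# theta, so X can be simulated as the square root of an exponential draw.
rng = np.random.default_rng(7)
theta, n = 1.5, 200_000
x = np.sqrt(rng.exponential(1.0 / theta, size=n))   # numpy uses scale = 1/rate
theta_hat = n / np.sum(x ** 2)
print(theta_hat)   # close to 1.5
```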
(b) Now suppose that instead of $\theta > 0$, we take the parameter space to be $\{1, 2\}$, i.e., it is known with certainty that either $\theta = 1$ or $\theta = 2$. Find the maximum likelihood estimator of $\theta$ under this new restriction.
Solution: At $\theta = 1$ and $\theta = 2$, the log-likelihood is
\[
\ell_x(1) = n\log 2 + \sum_{i=1}^n \log x_i - \sum_{i=1}^n x_i^2,
\qquad
\ell_x(2) = n\log 4 + \sum_{i=1}^n \log x_i - 2\sum_{i=1}^n x_i^2 .
\]
Observe that
\[
\ell_x(1) \ge \ell_x(2)
\;\Longleftrightarrow\;
\frac{1}{n}\sum_{i=1}^n x_i^2 \ge \log 2 .
\]
Thus,
\[
\hat\theta =
\begin{cases}
1 & \text{if } \frac{1}{n}\sum_{i=1}^n X_i^2 \ge \log 2, \\[4pt]
2 & \text{if } \frac{1}{n}\sum_{i=1}^n X_i^2 < \log 2 .
\end{cases}
\]
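The decision rule can be written as a one-line function; the two samples below are made-up illustrations.

```python
import numpy as np

# Sketch: the restricted MLE over {1, 2} as a decision rule on the
# mean of the squared observations.
def mle_restricted(x):
    return 1.0 if np.mean(np.square(x)) >= np.log(2) else 2.0

print(mle_restricted(np.array([1.0, 1.2])))   # mean of squares 1.22 >= log 2
print(mle_restricted(np.array([0.1, 0.2])))   # mean of squares 0.025 < log 2
```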
19. An incorrect result and its incorrect proof are shown below.
(Incorrect) Result: Student's t distribution with one degree of freedom is a discrete distribution that takes values $+1$ and $-1$ with probability $1/2$ each.
(Incorrect) Proof: Let $Z \sim N(0, 1)$. Then $Z^2$ has a chi-squared distribution with one degree of freedom, and hence $T = Z/\sqrt{Z^2}$ has a Student's t distribution with one degree of freedom. However, $T = Z/\sqrt{Z^2} = Z/|Z|$, which is either $+1$ or $-1$ according to whether $Z > 0$ or $Z < 0$, each of which occurs with probability $1/2$.
State (in one or two sentences) why this proof of this result is incorrect.
\[
f_\theta(1) = 1/2, \qquad f_\theta(2) = \theta/2,
\]
\[
\delta(X) =
\begin{cases}
\dots & \text{if } X = 0, \\
\dots & \text{if } X > 0.
\end{cases}
\]
(b) Let $\Theta = [0, 1] = \{\theta \in \mathbb{R} : 0 \le \theta \le 1\}$ denote the parameter space. Show that for every $\theta$ in $\Theta$, …, so either $\delta(0)$ … or $\delta(2)$ ….
21. Let $X_1, \dots, X_n$ be iid $N(0, \sigma^2)$, where $\sigma^2 > 0$ is unknown. Suppose our prior pdf for $\sigma^2$ is
\[
\pi(\sigma^2) =
\begin{cases}
\dfrac{b^a}{\Gamma(a)} \Bigl(\dfrac{1}{\sigma^2}\Bigr)^{a+1} \exp\Bigl(-\dfrac{b}{\sigma^2}\Bigr) & \text{if } \sigma^2 > 0, \\[6pt]
0 & \text{if } \sigma^2 \le 0.
\end{cases}
\]
Ignoring terms that do not depend on $\sigma^2$, the posterior is
\[
\pi(\sigma^2 \mid x) \propto \Bigl(\frac{1}{\sigma^2}\Bigr)^{a+(n/2)+1} \exp\Bigl[-\frac{1}{\sigma^2}\Bigl(b + \frac{1}{2}\sum_{i=1}^n x_i^2\Bigr)\Bigr].
\]
(b) Find the posterior mean of $\sigma^2$. (Be sure that your answer is correct for all possible values of $a > 0$, $b > 0$, and $n \ge 1$.)
Solution: If $a + \frac{n}{2} > 1$ (i.e., if $n \ge 2$ or $a > \frac{1}{2}$), then the posterior mean is
\[
E(\sigma^2 \mid x) = \frac{b + \frac{1}{2}\sum_{i=1}^n x_i^2}{a + \frac{n}{2} - 1}.
\]
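As a numerical sketch with made-up values $a = 2$, $b = 1$, and three observations (so $a + n/2 > 1$ holds):

```python
import numpy as np

# Sketch with made-up values: a = 2, b = 1, and three observations,
# so a + n/2 = 3.5 > 1 and the posterior mean exists.
a, b = 2.0, 1.0
x = np.array([0.5, -1.0, 2.0])
n = len(x)
post_mean = (b + 0.5 * np.sum(x ** 2)) / (a + n / 2 - 1)
print(post_mean)   # (1 + 2.625) / 2.5 = 1.45
```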