We have $E(\bar X) = \mu$, and $\operatorname{Var}(\bar X) = \sigma^2/n$. Then
$$
E(\bar X^2) = \operatorname{Var}(\bar X) + [E(\bar X)]^2 = \mu^2 + \sigma^2/n,
$$
which is more than $\mu^2$, so that $\bar X^2$ is not an unbiased estimate of $\mu^2$; the bias is $\sigma^2/n$, which is $0$ only if $\sigma = 0$. Since $E(s^2) = \sigma^2$, subtract $s^2/n$ so that this estimate, $\bar X^2 - s^2/n$, is unbiased for any $\mu$ and $\sigma$.
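A quick simulation sketch checks this correction; the normal population and the particular $\mu$, $\sigma$, and $n$ below are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma, n, reps = 2.0, 3.0, 10, 200_000  # illustrative choices

x = rng.normal(mu, sigma, size=(reps, n))
xbar = x.mean(axis=1)
s2 = x.var(axis=1, ddof=1)  # unbiased sample variance

print("target mu^2           :", mu**2)                      # 4.0
print("mean of xbar^2        :", (xbar**2).mean())           # approx mu^2 + sigma^2/n = 4.9
print("mean of xbar^2 - s2/n :", (xbar**2 - s2 / n).mean())  # approx mu^2
```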
$\operatorname{Var}(\alpha X + (1-\alpha)Y) = \alpha^2\operatorname{Var}(X) + (1-\alpha)^2\operatorname{Var}(Y)$. The derivative of this with respect to $\alpha$ is $2\alpha\operatorname{Var}(X) - 2(1-\alpha)\operatorname{Var}(Y)$, which is 0 when
$$
\alpha = \frac{\operatorname{Var}(Y)}{\operatorname{Var}(X) + \operatorname{Var}(Y)}, \qquad 1-\alpha = \frac{\operatorname{Var}(X)}{\operatorname{Var}(X) + \operatorname{Var}(Y)}.
$$
The minimized variance is
$$
\frac{\operatorname{Var}(X)\operatorname{Var}(Y)}{\operatorname{Var}(X) + \operatorname{Var}(Y)}.
$$
This is equal to $\operatorname{Var}(X)/2$ when $\operatorname{Var}(X) = \operatorname{Var}(Y)$, or equivalently when the two estimates are equally precise. In this case $\alpha = 1/2$ and we get the simple average $(X+Y)/2$.
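A similar sketch compares the variance of $\alpha X + (1-\alpha)Y$ over a grid of weights; the values $\operatorname{Var}(X)=1$ and $\operatorname{Var}(Y)=4$ are assumed purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
var_x, var_y, reps = 1.0, 4.0, 500_000  # illustrative variances

x = rng.normal(0.0, np.sqrt(var_x), size=reps)
y = rng.normal(0.0, np.sqrt(var_y), size=reps)

alphas = np.linspace(0.0, 1.0, 101)
variances = [np.var(a * x + (1 - a) * y) for a in alphas]

print("empirical optimal alpha:", alphas[int(np.argmin(variances))])  # approx 0.8
print("theoretical optimum    :", var_y / (var_x + var_y))            # 0.8
```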
The likelihood is the product of the individual densities, and the log likelihood is
$$
\ell(\theta) = \sum_{i=1}^n \log f(X_i;\theta).
$$
The derivative of this with respect to $\theta$ is the score $\ell'(\theta)$, which is 0 at the maximum likelihood estimate, which gives the estimate ?. (I haven't worked it out yet.)
The event $\{Y \le y\}$, where $Y = \max(X_1,\ldots,X_n)$, is the same as the intersection of the events $\{X_1 \le y\}, \ldots, \{X_n \le y\}$. Since these events are independent we have, for $0 \le y \le \theta$,
$$
P(Y \le y) = \prod_{i=1}^n P(X_i \le y) = (y/\theta)^n.
$$
The derivative of this with respect to $y$ is the density of $Y$, so the density is
$$
f_Y(y) = \frac{n y^{n-1}}{\theta^n}, \qquad 0 \le y \le \theta,
$$
which is part a).
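As a check on part a), a small simulation (with arbitrary illustrative $\theta$ and $n$) can compare the empirical distribution of $Y$ with the CDF $(y/\theta)^n$ just derived.

```python
import numpy as np

rng = np.random.default_rng(2)
theta, n, reps = 5.0, 4, 200_000  # arbitrary illustrative values

y = rng.uniform(0.0, theta, size=(reps, n)).max(axis=1)

for q in (1.0, 2.5, 4.0):
    print(f"P(Y <= {q}): empirical {np.mean(y <= q):.4f}, theory {(q / theta) ** n:.4f}")
```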

Then
$$
E(Y) = \int_0^\theta y \cdot \frac{n y^{n-1}}{\theta^n}\,dy = \frac{n\theta}{n+1},
$$
which is not $\theta$. Thus $Y$ is biased, but $E\big((n+1)Y/n\big) = \theta$, so that $(n+1)Y/n$ is unbiased.
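Continuing the same simulation idea, the bias of $Y$ and the unbiasedness of $(n+1)Y/n$ show up directly:

```python
import numpy as np

rng = np.random.default_rng(3)
theta, n, reps = 5.0, 4, 500_000  # same illustrative values as above

y = rng.uniform(0.0, theta, size=(reps, n)).max(axis=1)

print("E(Y) approx        :", y.mean())                  # approx n*theta/(n+1) = 4.0
print("E((n+1)Y/n) approx :", ((n + 1) * y / n).mean())  # approx theta = 5.0
```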
$\sum_{i=1}^n (X_i - \bar X)^2/\sigma^2$ has a chi-squared distribution with $n-1$ degrees of freedom, which is a Gamma distribution with shape $(n-1)/2$ and scale 2. This permits us to prove the hint, since $E(\chi^2_{n-1}) = n-1$ and $\operatorname{Var}(\chi^2_{n-1}) = 2(n-1)$. It follows that $\hat\sigma^2_K = \sum_{i=1}^n (X_i - \bar X)^2/K$ has mean $(n-1)\sigma^2/K$, bias $\big[(n-1)/K - 1\big]\sigma^2$ and variance $2(n-1)\sigma^4/K^2$. Thus the mean squared error of $\hat\sigma^2_K$ is
$$
\operatorname{MSE}(K) = \sigma^4\left[\frac{2(n-1)}{K^2} + \left(\frac{n-1}{K} - 1\right)^2\right],
$$
which is a minimum when $2(n-1)/K^2 + \big((n-1)/K - 1\big)^2$ is minimized. Take the derivative with respect to $K$ and set it equal to 0 to get
$$
-\frac{4(n-1)}{K^3} - \frac{2(n-1)}{K^2}\left(\frac{n-1}{K} - 1\right) = 0;
$$
multiplying by $-K^3/[2(n-1)]$ gives $2 + (n-1) - K = 0$, whose solution is $K = n+1$.
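The solution $K = n+1$ can be confirmed numerically by evaluating the mean squared error formula above on a grid of $K$; the sample size $n$ is an arbitrary choice.

```python
import numpy as np

n = 10  # illustrative sample size; the minimizer should be n + 1 = 11
K = np.linspace(2.0, 30.0, 2801)

# MSE(K) / sigma^4 = 2(n-1)/K^2 + ((n-1)/K - 1)^2, from the formula above
mse = 2 * (n - 1) / K**2 + ((n - 1) / K - 1) ** 2

print("minimizing K:", K[np.argmin(mse)])  # 11.0
```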

The log likelihood for the pairs $(X_i, Y_i)$, with $X_i$ and $Y_i$ independent $N(\mu_i, \sigma^2)$, is
$$
\ell = -n\log(2\pi\sigma^2) - \frac{1}{2\sigma^2}\sum_{i=1}^n\big[(X_i - \mu_i)^2 + (Y_i - \mu_i)^2\big].
$$
The derivative with respect to $\mu_i$ is simply $\big[(X_i - \mu_i) + (Y_i - \mu_i)\big]/\sigma^2$, which is 0 when $\hat\mu_i = (X_i + Y_i)/2$. Put this in for each $\mu_i$ and note that
$$
(X_i - \hat\mu_i)^2 + (Y_i - \hat\mu_i)^2 = (X_i - Y_i)^2/2.
$$
Now take the derivative with respect to $\sigma^2$ to get
$$
-\frac{n}{\sigma^2} + \frac{1}{4\sigma^4}\sum_{i=1}^n (X_i - Y_i)^2,
$$
which is 0 when
$$
\hat\sigma^2 = \frac{1}{4n}\sum_{i=1}^n (X_i - Y_i)^2.
$$
Now $E\big[(X_i - Y_i)^2\big]$ is just $\operatorname{Var}(X_i - Y_i) = 2\sigma^2$, because $X_i$ and $Y_i$ have the same mean. But then $E(\hat\sigma^2) = 2n\sigma^2/(4n)$, so that the expected value of the mle is $\sigma^2/2$. An unbiased estimate is obtained by multiplying by 2 to get
$$
\tilde\sigma^2 = \frac{1}{2n}\sum_{i=1}^n (X_i - Y_i)^2.
$$
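Finally, a sketch of the paired model (with illustrative $\mu_i$, $\sigma$, and $n$) shows the mle's expectation near $\sigma^2/2$ and the doubled estimate near $\sigma^2$.

```python
import numpy as np

rng = np.random.default_rng(4)
n, sigma, reps = 50, 2.0, 20_000     # illustrative values
mu = rng.uniform(-5.0, 5.0, size=n)  # arbitrary pair means mu_i

x = rng.normal(mu, sigma, size=(reps, n))  # X_i ~ N(mu_i, sigma^2)
y = rng.normal(mu, sigma, size=(reps, n))  # Y_i ~ N(mu_i, sigma^2)

mle = ((x - y) ** 2).sum(axis=1) / (4 * n)  # the mle of sigma^2 derived above

print("sigma^2         :", sigma**2)        # 4.0
print("mean of mle     :", mle.mean())      # approx sigma^2 / 2 = 2.0
print("mean of 2 * mle :", 2 * mle.mean())  # approx sigma^2 = 4.0
```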