STAT 450 Lecture 16
Reading for Today's Lecture:
Goals of Today's Lecture:
- Do MLE examples.
- Discuss parametrization invariance of MLE.
- Introduce large sample theory for MLE.
Today's notes
Maximum Likelihood Estimation
To find an MLE we maximize L. This is a typical function
maximization problem which we approach by setting the gradient
of L equal to 0 and then checking to see that the root is
a maximum, not a minimum or saddle point.
The Binomial Distribution
If X has a Binomial$(n,\theta)$ distribution then

    L(\theta) = \binom{n}{X} \theta^X (1-\theta)^{n-X} .

The function L is 0 at $\theta = 0$ and at $\theta = 1$ unless X=0 or X=n, so for $0 < X < n$ the MLE must be found by setting U=0 and getting

    \hat\theta = X/n .

For X=n the log-likelihood has derivative

    U(\theta) = \frac{n}{\theta} > 0

for all $\theta$, so that the likelihood is an increasing function of $\theta$, which is maximized at $\hat\theta = 1$. Similarly, when X=0 the maximum is at $\hat\theta = 0$.
If $X_1, \ldots, X_n$ are independent then the log likelihood is of the form

    \ell(\theta) = \sum_{i=1}^n \log f(X_i, \theta) .

The score function is

    U(\theta) = \frac{\partial \ell}{\partial \theta} .

The mle $\hat\theta$ maximizes $\ell(\theta)$. If the maximum occurs in the interior of the parameter space and the log likelihood is continuously differentiable, then $\hat\theta$ solves the likelihood equations

    U(\theta) = 0 .
Examples
$N(\mu, \sigma^2)$
There is a unique root of the likelihood equations. It is a global maximum.
[Remark: Suppose we had called $\phi = \sigma^2$ the parameter. The score function would still have two components, with the first component being the same as before, but now the second component is

    \frac{\partial \ell}{\partial \phi} = -\frac{n}{2\phi} + \frac{\sum (X_i - \mu)^2}{2\phi^2} .

Setting the new likelihood equations equal to 0 still gives

    \hat\phi = \hat\sigma^2 = \frac{\sum (X_i - \bar X)^2}{n} .

This is a general invariance (or equivariance) principle. If $\phi = g(\theta)$ is some reparametrization of a model (a one to one relabelling of the parameter values) then $\hat\phi = g(\hat\theta)$. We will see that this does not apply to other estimators.]
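The equivariance can also be seen numerically. Here is a small sketch (an illustration, not from the notes): maximizing the normal log-likelihood over $\sigma$, or over $\phi = \sigma^2$, gives estimates linked by $\hat\phi = \hat\sigma^2$, and both match the closed form.

```python
# Illustration of MLE equivariance for N(mu, sigma^2): maximizing over
# sigma or over phi = sigma^2 gives phi_hat = sigma_hat**2.
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=5.0, scale=2.0, size=200)
mu_hat = x.mean()          # MLE of mu, the same in either parametrization
ss = np.sum((x - mu_hat) ** 2)
n = len(x)

def loglik_sigma(s):       # profile log likelihood in sigma
    return -n * np.log(s) - ss / (2 * s ** 2)

def loglik_phi(v):         # same likelihood, parametrized by phi = sigma^2
    return -n / 2 * np.log(v) - ss / (2 * v)

s_grid = np.linspace(0.5, 5.0, 200001)
sigma_hat = s_grid[np.argmax(loglik_sigma(s_grid))]
v_grid = np.linspace(0.25, 25.0, 200001)
phi_hat = v_grid[np.argmax(loglik_phi(v_grid))]
# sigma_hat**2 and phi_hat both match the closed form ss / n
```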
Cauchy: location $\theta$
There is at least 1 root of the likelihood equations but
often several more. One of the roots is a global maximum,
others, if they exist may be local minima or maxima.
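To see the multiple roots concretely, here is a sketch (illustrative, not from the notes) that counts sign changes of the Cauchy location score over a grid. With one observation far from the others the likelihood is bimodal, so the score crosses zero three times.

```python
# Illustration: the Cauchy location score
# U(theta) = sum 2(x_i - theta)/(1 + (x_i - theta)^2) can have several
# roots: two local maxima separated by a local minimum.
import numpy as np

x = np.array([0.0, 1.0, 10.0])   # artificial sample with one outlying point

def score(theta):
    d = x[:, None] - theta       # broadcast over the grid of theta values
    return np.sum(2 * d / (1 + d ** 2), axis=0)

grid = np.linspace(-5.0, 15.0, 2001)
u = score(grid)
sign_changes = int(np.sum(u[:-1] * u[1:] < 0))
# sign_changes is 3: the likelihood equation has three roots
```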
Binomial$(n, \theta)$
If X=0 or X=n there is no root of the likelihood equations;
in this case the likelihood is monotone. For other values of
X there is a unique root, a global maximum. The global
maximum is at $\hat\theta = X/n$ even if X=0 or n.
The 2 parameter exponential
The density is

    f(x; \alpha, \beta) = \frac{1}{\beta} e^{-(x-\alpha)/\beta} 1(x > \alpha) .

The resulting log-likelihood is

    \ell(\alpha, \beta) = -n \log\beta - \sum (X_i - \alpha)/\beta

for $\alpha \le \min\{X_1, \ldots, X_n\}$ and is $-\infty$ otherwise. As a function of $\alpha$ this is increasing until $\alpha$ reaches

    \hat\alpha = X_{(1)} = \min\{X_1, \ldots, X_n\} ,

which gives the mle of $\alpha$. Now plug in this value for $\alpha$ and get the so-called profile likelihood for $\beta$:

    \ell(\hat\alpha, \beta) = -n \log\beta - \sum (X_i - X_{(1)})/\beta .

Take the $\beta$ derivative and set it equal to 0 to get

    \hat\beta = \frac{\sum (X_i - X_{(1)})}{n} = \bar X - X_{(1)} .

Notice that the mle $\hat\alpha$ does not solve the likelihood equations; we had to look at the edge of the possible parameter space. The parameter $\alpha$ is called a support or truncation parameter. ML methods behave oddly in problems with such parameters.
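As a quick numerical sketch (an illustration, not from the notes), the closed forms $\hat\alpha = X_{(1)}$ and $\hat\beta = \bar X - X_{(1)}$ can be computed directly from a simulated sample:

```python
# Illustration: MLEs for the two-parameter exponential with density
# exp(-(x - alpha)/beta)/beta on x > alpha.
import numpy as np

rng = np.random.default_rng(1)
alpha, beta = 3.0, 2.0
x = alpha + rng.exponential(scale=beta, size=500)

alpha_hat = x.min()            # edge of the support, not a root of the score
beta_hat = x.mean() - x.min()  # from the profile likelihood in beta
# alpha_hat sits slightly above alpha; beta_hat is close to beta
```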
Three parameter Weibull
The density in question is

    f(x; \alpha, \beta, \gamma) = \frac{\gamma}{\beta} \left(\frac{x-\alpha}{\beta}\right)^{\gamma-1} \exp\left\{ -\left(\frac{x-\alpha}{\beta}\right)^\gamma \right\} 1(x > \alpha) .

There are 3 derivatives to take to solve the likelihood equations. Setting the $\beta$ derivative equal to 0 gives the equation

    \hat\beta(\alpha, \gamma) = \left[ \frac{\sum (X_i - \alpha)^\gamma}{n} \right]^{1/\gamma} ,

where we use the notation $\hat\beta(\alpha, \gamma)$ to indicate that the mle of $\beta$ could be found by finding the mles of the other two parameters and then plugging in to the formula above. It is not possible to find explicitly the remaining two parameters; numerical methods are needed. However, you can see that putting $\gamma < 1$ and letting $\alpha \to X_{(1)} = \min\{X_i\}$ will make the log likelihood go to $\infty$. The mle is not uniquely defined, then, since any $\gamma < 1$ and any $\beta$ will do.
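The blow-up can be seen numerically. The sketch below (an illustration, not from the notes, using the Weibull parametrization written above) evaluates the log likelihood at $\gamma = 1/2$ for a sequence of $\alpha$ values approaching $X_{(1)}$:

```python
# Illustration: with shape gamma < 1 fixed, the three-parameter Weibull
# log likelihood increases without bound as the location alpha
# approaches the smallest observation.
import numpy as np

def loglik(x, alpha, beta, gamma):
    z = (x - alpha) / beta
    return np.sum(np.log(gamma / beta) + (gamma - 1) * np.log(z) - z ** gamma)

rng = np.random.default_rng(2)
x = 1.0 + rng.weibull(0.5, size=50)   # true location 1, shape 0.5 < 1

gaps = [1e-2, 1e-4, 1e-6, 1e-8]
lls = [loglik(x, x.min() - g, 1.0, 0.5) for g in gaps]
# lls is strictly increasing: the likelihood has no finite maximum
```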
If the true value of $\gamma$ is more than 1 then the probability that there is a root of the likelihood equations is high; in this case there must be two more roots: a local maximum and a saddle point! For a true value of $\gamma > 1$ the theory we detail below applies to the local maximum and not to the global maximum of the likelihood.
Large Sample Theory
In the next few lectures we will be working toward explaining
and ``proving'' the following theorem:
Theorem: Under ``suitable regularity conditions''
- 1. The MLE is consistent.
- 2. The MLE is asymptotically normal.
The meaning of the first assertion is that $\hat\theta \to \theta$ in a precise mathematical sense as the sample size n goes to infinity. In this course we will simply provide some heuristics which help indicate why this theorem ought to be true.
The second assertion means that

    \sqrt{n}\,(\hat\theta - \theta) \Rightarrow N(0, \sigma^2)

for a certain $\sigma^2$; we will later show how to compute $\sigma^2$, how to estimate it, and how to use this to get confidence intervals and hypothesis tests.
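As a preview, here is a small simulation sketch (an illustration, not from the notes) using the Exponential model, where the MLE of the mean $\theta$ is $\bar X$ and the limiting variance is $\sigma^2 = \theta^2$:

```python
# Illustration: for Exponential samples with mean theta, the MLE is
# X-bar, and sqrt(n)(X-bar - theta)/theta is approximately N(0, 1)
# for large n.
import numpy as np

rng = np.random.default_rng(3)
theta, n, reps = 2.0, 400, 5000
samples = rng.exponential(scale=theta, size=(reps, n))
z = np.sqrt(n) * (samples.mean(axis=1) - theta) / theta
# z has mean near 0 and standard deviation near 1
```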
Richard Lockhart
1999-10-18