Today's notes
Reading for Today's Lecture:
Goals of Today's Lecture: define the score and Fisher information, derive the Bartlett identities, compute the information in several examples, and introduce confidence intervals based on the normal approximation.
So far: we have shown that for $X_1, \ldots, X_n$ an iid sample from a density $f(x;\theta)$ we have
$$\ell(\theta) = \sum_{i=1}^n \log f(X_i;\theta),$$
and that the mle $\hat\theta$ solves the likelihood equations $U(\hat\theta) = 0$, where $U(\theta) = \ell'(\theta)$ is the score. Next: compute $E_\theta(U(\theta))$ and $\operatorname{Var}_\theta(U(\theta))$.

First
$$U(\theta) = \sum_{i=1}^n \frac{\partial \log f}{\partial \theta}(X_i;\theta)$$
is a sum of iid terms. Take expected values to get
$$E_\theta(U(\theta)) = n\, E_\theta\!\left(\frac{\partial \log f}{\partial \theta}(X_1;\theta)\right),$$
which we show below is 0.
Definition: The Fisher Information is
$$I(\theta) = \operatorname{Var}_\theta(U(\theta)).$$
In general if $f(x;\theta)$ denotes the joint density of all the data then
$$\int f(x;\theta)\, dx = 1.$$
This is an identity in $\theta$, so we can differentiate both sides with respect to $\theta$ to get
$$0 = \int \frac{\partial f(x;\theta)}{\partial \theta}\, dx = \int \frac{\partial \log f(x;\theta)}{\partial \theta}\, f(x;\theta)\, dx = E_\theta(U(\theta)).$$
The calculation above involves changing the order of differentiation and integration and is not always valid. In the irregular examples on your homework this doesn't work. It does work in the normal example I did; this is the usual outcome.
Differentiate the same identity again and get
$$0 = \int \frac{\partial^2 \log f(x;\theta)}{\partial \theta^2}\, f(x;\theta)\, dx + \int \left(\frac{\partial \log f(x;\theta)}{\partial \theta}\right)^{\!2} f(x;\theta)\, dx.$$
This gives a so-called Bartlett identity:
$$I(\theta) = \operatorname{Var}_\theta(U(\theta)) = E_\theta\!\left(U(\theta)^2\right) = -E_\theta\!\left(U'(\theta)\right) = -E_\theta\!\left(\ell''(\theta)\right).$$
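As a quick numerical sanity check, here is a minimal Monte Carlo sketch in Python of both identities, using the exponential rate model of Example A below, where $U(\lambda) = n/\lambda - \sum X_i$ and $U'(\lambda) = -n/\lambda^2$ (the parameter values are illustrative):

    import numpy as np

    rng = np.random.default_rng(0)
    n, lam = 50, 2.0             # sample size and true rate
    reps = 100000                # Monte Carlo replications

    X = rng.exponential(scale=1.0 / lam, size=(reps, n))
    U = n / lam - X.sum(axis=1)      # score evaluated at the true rate
    U_prime = -n / lam**2            # U'(lambda); nonrandom in this model

    print(U.mean())    # approx 0:     E(U) = 0
    print(U.var())     # approx 12.5:  Var(U) = n / lam^2
    print(-U_prime)    # exactly 12.5: -E(U') = I(lambda)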
In the normal example we had an iid sampling model. The Fisher information when we have $n$ observations was
$$I(\mu) = \frac{n}{\sigma^2},$$
that is, $n$ times the information in a single observation.
This leads to the following theorem.

Theorem: In iid sampling the Fisher information based on $X_1, \ldots, X_n$ is
$$I(\theta) = n I_1(\theta),$$
where $I_1(\theta)$ is the Fisher information in a single observation. (The score is a sum of $n$ iid terms, so its variance is $n$ times that of one term.)
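For instance, in the normal example (assuming it is $N(\mu, \sigma^2)$ with $\sigma$ known) the theorem can be checked directly:
$$U(\mu) = \sum_{i=1}^n \frac{X_i - \mu}{\sigma^2}, \qquad I_1(\mu) = \operatorname{Var}_\mu\!\left(\frac{X_1 - \mu}{\sigma^2}\right) = \frac{1}{\sigma^2}, \qquad I(\mu) = n I_1(\mu) = \frac{n}{\sigma^2}.$$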
A: Exponential rate $\lambda$. We have $X_1, \ldots, X_n$ iid with density
$$f(x;\lambda) = \lambda e^{-\lambda x}, \qquad x > 0.$$
We find
$$U(\lambda) = \frac{n}{\lambda} - \sum_{i=1}^n X_i, \qquad \hat\lambda = \frac{1}{\bar X}, \qquad I(\lambda) = \frac{n}{\lambda^2}.$$
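The computation behind these formulas:
$$\log f(x;\lambda) = \log \lambda - \lambda x, \qquad \frac{\partial \log f}{\partial \lambda} = \frac{1}{\lambda} - x, \qquad \frac{\partial^2 \log f}{\partial \lambda^2} = -\frac{1}{\lambda^2},$$
so $U(\lambda) = 0$ at $\hat\lambda = n / \sum X_i = 1/\bar X$, and $I(\lambda) = -E_\lambda(\ell''(\lambda)) = n/\lambda^2$.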
B: Exponential mean $\mu$. We have $X_1, \ldots, X_n$ iid with density
$$f(x;\mu) = \frac{1}{\mu}\, e^{-x/\mu}, \qquad x > 0.$$
We find
$$U(\mu) = -\frac{n}{\mu} + \frac{\sum_{i=1}^n X_i}{\mu^2}, \qquad \hat\mu = \bar X, \qquad I(\mu) = \frac{n}{\mu^2}.$$
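The same routine computation gives the information here: since $E_\mu(X_1) = \mu$,
$$\frac{\partial^2 \log f}{\partial \mu^2} = \frac{1}{\mu^2} - \frac{2x}{\mu^3}, \qquad I(\mu) = -n\, E_\mu\!\left(\frac{1}{\mu^2} - \frac{2X_1}{\mu^3}\right) = \frac{n}{\mu^2}.$$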
C: Cauchy with location parameter $\theta$. We have $X_1, \ldots, X_n$ iid with density
$$f(x;\theta) = \frac{1}{\pi\left(1 + (x - \theta)^2\right)}.$$
We find
$$U(\theta) = \sum_{i=1}^n \frac{2(X_i - \theta)}{1 + (X_i - \theta)^2}, \qquad I(\theta) = \frac{n}{2};$$
the likelihood equations $U(\hat\theta) = 0$ have no closed-form solution and must be solved numerically.
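A minimal Python sketch of the numerical solution, using Newton's method on the score started from the sample median (the starting point, sample size, and true value below are illustrative choices):

    import numpy as np

    def cauchy_mle(x, tol=1e-10, max_iter=100):
        # Newton iteration for the Cauchy location mle; the sample
        # median is a consistent and convenient starting point.
        theta = np.median(x)
        for _ in range(max_iter):
            r = x - theta
            d = 1.0 + r**2
            U = np.sum(2.0 * r / d)                      # score U(theta)
            U_prime = np.sum(2.0 * (r**2 - 1.0) / d**2)  # U'(theta)
            step = U / U_prime
            theta -= step
            if abs(step) < tol:
                break
        return theta

    rng = np.random.default_rng(1)
    x = 3.0 + rng.standard_cauchy(200)   # n = 200, true theta = 3
    print(cauchy_mle(x))                 # close to 3.0

By the normal approximation of the next section, $\hat\theta$ then has approximate variance $1/I(\theta) = 2/n$.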
D: Uniform$[0, \theta]$. We have $X_1, \ldots, X_n$ iid with density
$$f(x;\theta) = \frac{1}{\theta}, \qquad 0 < x < \theta.$$
We find that the likelihood is $\theta^{-n}$ for $\theta \ge X_{(n)} = \max_i X_i$ and 0 otherwise, so it is maximized at
$$\hat\theta = X_{(n)}.$$
This family has the feature that the support of the density, namely $(0, \theta)$, depends on $\theta$.
In such families
it is common for the standard mle theory to fail.
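A small simulation sketch makes the failure visible here: the error of $\hat\theta = X_{(n)}$ is of order $1/n$ rather than $1/\sqrt{n}$, and the rescaled error $n(\theta - \hat\theta)/\theta$ has a limiting standard exponential, not normal, distribution, since $P\!\left(n(\theta - \hat\theta)/\theta > z\right) = (1 - z/n)^n \to e^{-z}$. (The parameter values below are illustrative.)

    import numpy as np

    rng = np.random.default_rng(2)
    n, theta = 100, 5.0
    reps = 50000

    X = rng.uniform(0.0, theta, size=(reps, n))
    theta_hat = X.max(axis=1)             # mle: the sample maximum
    Z = n * (theta - theta_hat) / theta   # error rescaled by n, not sqrt(n)

    # compare the simulated tail P(Z > z) with the exponential limit
    for z in (0.5, 1.0, 2.0):
        print((Z > z).mean(), np.exp(-z))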
Confidence Intervals:
We can base confidence intervals on one of several forms. For this section I will assume that $\theta$ is a scalar (one dimensional) parameter and use $'$ to denote a derivative with respect to the parameter. There are 3 standard versions of the normal approximation:
$$\sqrt{I(\theta)}\,(\hat\theta - \theta) \approx N(0,1), \qquad \sqrt{I(\hat\theta)}\,(\hat\theta - \theta) \approx N(0,1), \qquad \sqrt{V(\hat\theta)}\,(\hat\theta - \theta) \approx N(0,1),$$
where $V(\theta) = -\ell''(\theta)$ is the observed information.
Each of these quantities may be used to derive confidence intervals for $\theta$ by finding the collection of all $\theta$ for which the quantity is smaller in absolute value than some critical point.
The second and third quantities are of the form
$$\frac{\hat\theta - \theta}{\hat\sigma},$$
where $\hat\sigma$ is an estimate of the standard deviation of $\hat\theta$, so they lead to intervals of the familiar form $\hat\theta \pm z\,\hat\sigma$.
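As an illustration, here is a minimal sketch computing the resulting 95% intervals in the exponential rate model of Example A; there $I(\lambda) = V(\lambda) = n/\lambda^2$, so the second and third intervals coincide, while the first must be solved for $\lambda$ and comes out asymmetric:

    import numpy as np

    rng = np.random.default_rng(3)
    n, lam = 100, 2.0
    x = rng.exponential(scale=1.0 / lam, size=n)

    lam_hat = 1.0 / x.mean()       # mle
    z = 1.96                       # 95% critical point
    se = lam_hat / np.sqrt(n)      # 1/sqrt(I(lam_hat)) = 1/sqrt(V(lam_hat))

    # versions 2 and 3: the Wald interval lam_hat +/- z * se
    print(lam_hat - z * se, lam_hat + z * se)

    # version 1: { lam : (sqrt(n)/lam) * |lam_hat - lam| <= z }
    print(lam_hat / (1 + z / np.sqrt(n)), lam_hat / (1 - z / np.sqrt(n)))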