Reading for Today's Lecture: ?
Goals of Today's Lecture:
Last time: We used the change of variables formula to compute the density of a transformed random variable.
Suppose that $Z_1, \ldots, Z_n$ are independent $N(0,1)$. We define the $\chi^2_n$ distribution to be that of $S = Z_1^2 + \cdots + Z_n^2$.
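As a quick numerical check of this definition (a sketch using NumPy and SciPy, which are not part of the notes), the empirical distribution of a sum of $n$ squared independent standard normals should match the $\chi^2_n$ CDF:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 5            # degrees of freedom
N = 100_000      # Monte Carlo sample size

# Sum of squares of n independent N(0,1) variables
Z = rng.standard_normal((N, n))
S = (Z ** 2).sum(axis=1)

# Compare the empirical CDF with the chi-squared(n) CDF at a few points
ts = np.array([1.0, 4.0, 9.0])
emp = np.array([(S <= t).mean() for t in ts])
chi2 = stats.chi2.cdf(ts, df=n)
max_err = np.abs(emp - chi2).max()
```

The empirical and theoretical CDFs agree to within Monte Carlo error, and the sample mean of $S$ is close to $n$, the mean of a $\chi^2_n$ variable.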
Thus our third assertion is that $(n-1)s^2/\sigma^2$ can be rewritten as a sum of $n-1$ squares of independent $N(0,1)$ variables, so that $(n-1)s^2/\sigma^2 \sim \chi^2_{n-1}$.
Finally the fourth part of the theorem is a consequence of the first 3 parts of the theorem and the definition of the $t_\nu$ distribution, namely, that $T \sim t_\nu$ if it has the same distribution as $Z/\sqrt{S/\nu}$, where $Z \sim N(0,1)$, $S \sim \chi^2_\nu$, and $Z$ and $S$ are independent.
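This definition can also be checked by simulation (a sketch, not part of the notes; it assumes SciPy's `kstest`): samples of $Z/\sqrt{S/\nu}$ should be indistinguishable from the $t_\nu$ distribution.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
nu = 4
N = 200_000

# T = Z / sqrt(S / nu) with Z ~ N(0,1) and S ~ chi-squared(nu), independent
Z = rng.standard_normal(N)
S = rng.chisquare(nu, N)
T = Z / np.sqrt(S / nu)

# Kolmogorov-Smirnov distance between the sample and the t(nu) CDF
ks = stats.kstest(T, stats.t(df=nu).cdf).statistic
```

For a sample of this size the KS distance is tiny, consistent with $T \sim t_\nu$.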
However, I now derive the density of $T$ in this definition. Since $Z$ and $S$ are independent,
$$P(T \le t) = P\!\left(Z \le t\sqrt{S/\nu}\right) = \int_0^\infty \left[\int_{-\infty}^{t\sqrt{s/\nu}} \phi(z)\,dz\right] f_S(s)\,ds.$$
I can differentiate this with respect to $t$ by simply differentiating the inner integral:
$$f_T(t) = \int_0^\infty \sqrt{s/\nu}\;\phi\!\left(t\sqrt{s/\nu}\right) f_S(s)\,ds.$$
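The integral form of $f_T$ above can be evaluated numerically and compared with the known $t_\nu$ density (a sketch using SciPy's `quad`, not part of the notes):

```python
import numpy as np
from scipy import stats
from scipy.integrate import quad

nu = 3

def t_density(t, nu):
    # f_T(t) = integral over s of sqrt(s/nu) * phi(t*sqrt(s/nu)) * f_{chi2_nu}(s)
    integrand = lambda s: (np.sqrt(s / nu)
                           * stats.norm.pdf(t * np.sqrt(s / nu))
                           * stats.chi2.pdf(s, df=nu))
    val, _ = quad(integrand, 0, np.inf)
    return val

# Compare with scipy's t density at a few points
err = max(abs(t_density(t, nu) - stats.t.pdf(t, df=nu)) for t in [-2.0, 0.0, 1.5])
```

The two agree to numerical-integration accuracy, confirming the derivation.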
We give two definitions of expected values:
Def'n: If $X$ has density $f$ then $E(X) = \int_{-\infty}^{\infty} x f(x)\,dx$.
Def'n: If $X$ has discrete density $f$ then $E(X) = \sum_x x f(x)$.
Now if $Y = g(X)$ for smooth $g$ then $E(Y) = E(g(X)) = \int g(x) f(x)\,dx$.
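As a small numerical illustration of computing $E(g(X))$ directly from the density (a sketch using SciPy, not part of the notes): for $X \sim N(0,1)$ and $g(x) = x^2$ the integral $\int g(x) f(x)\,dx$ should give $E(X^2) = 1$.

```python
import numpy as np
from scipy import stats
from scipy.integrate import quad

# E(g(X)) for X ~ N(0,1) and g(x) = x^2, computed as the integral of g * f
g = lambda x: x ** 2
val, _ = quad(lambda x: g(x) * stats.norm.pdf(x), -np.inf, np.inf)
# val should be E(Z^2) = 1
```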
In general, there are random variables which are neither absolutely continuous nor discrete. Look at my STAT 801 web pages to see how $E(X)$ is defined in general.
Facts: $E$ is a linear, monotone, positive operator: $E(aX + bY) = aE(X) + bE(Y)$; if $X \le Y$ then $E(X) \le E(Y)$; and if $X \ge 0$ then $E(X) \ge 0$.
Major technical theorems:
Monotone Convergence: If $0 \le X_1 \le X_2 \le \cdots$ and $X_n \uparrow X$ (which has to exist) then $E(X_n) \uparrow E(X)$.
Dominated Convergence: If there is a random variable $X$ such that $X_n \to X$ (technical details of this convergence later in the course) and a random variable $Y$ such that $|X_n| \le Y$ for all $n$ with $E(Y) < \infty$, then $E(X_n) \to E(X)$.
Fatou's Lemma: If $X_n \ge 0$ then $E(\liminf X_n) \le \liminf E(X_n)$.
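The hypotheses in these theorems matter. A standard example where $X_n \to 0$ pointwise but $E(X_n) = 1$ for every $n$ (so Fatou's inequality is strict and no integrable dominating $Y$ exists) is $X_n = n\,\mathbf{1}[U \le 1/n]$ for $U$ uniform on $(0,1)$. A numerical sketch, not part of the notes:

```python
import numpy as np

rng = np.random.default_rng(2)
U = rng.uniform(size=200_000)

# X_n = n * 1[U <= 1/n]: for each U > 0, X_n = 0 once 1/n < U, so X_n -> 0,
# yet E(X_n) = n * P(U <= 1/n) = 1 for all n; no integrable dominating Y exists.
means = {n: (n * (U <= 1 / n)).mean() for n in (10, 100, 1000)}
frac_nonzero = (U <= 1 / 1000).mean()   # P(X_1000 != 0), tending to 0
```

The empirical means stay near 1 even as the event $\{X_n \ne 0\}$ becomes rare.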
Theorem: With this definition of $E$, if $X$ has density $f(x)$ (even in $R^p$, say) and $Y = g(X)$ then $E(Y) = \int g(x) f(x)\,dx$.
This works, for instance, even if $X$ has a density but $Y$ doesn't.
Def'n: The $r$th moment (about the origin) of a real random variable $X$ is $\mu_r' = E(X^r)$ (provided it exists).
We generally use $\mu$ for $E(X)$. The $r$th central moment is $\mu_r = E\left[(X - \mu)^r\right]$.
Def'n: For an $R^p$ valued random vector $X$ we define $\mu_X = E(X)$ to be the vector whose $i$th entry is $E(X_i)$ (provided all entries exist).
Def'n: The ($p \times p$) variance covariance matrix of $X$ is $\mathrm{Var}(X) = E\left[(X - \mu)(X - \mu)^t\right]$, whose $(i,j)$ entry is $\mathrm{Cov}(X_i, X_j)$.
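The matrix $E\left[(X-\mu)(X-\mu)^t\right]$ can be estimated by averaging outer products of centered samples; this agrees with NumPy's built-in covariance routine (a sketch, not part of the notes):

```python
import numpy as np

rng = np.random.default_rng(3)
N, p = 100_000, 3
A = np.array([[1.0, 0.0, 0.0],
              [0.5, 1.0, 0.0],
              [0.2, 0.3, 1.0]])
# X = A Z + b, so Var(X) = A A^t
X = rng.standard_normal((N, p)) @ A.T + np.array([1.0, -2.0, 0.5])

# Var(X) = E[(X - mu)(X - mu)^t], estimated by averaging outer products
centered = X - X.mean(axis=0)
V = centered.T @ centered / N
```

`V` matches `np.cov(X.T, bias=True)` exactly and is close to the true value $AA^t$.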
Moments and probabilities of rare events are closely connected, as will be seen in a number of important probability theorems. Here is one version of Markov's inequality (one case is Chebyshev's inequality):
$$P(|X| \ge t) \le \frac{E(|X|^r)}{t^r}.$$
Applying this to $X - \mu$ with $r = 2$ gives Chebyshev's inequality: $P(|X - \mu| \ge t) \le \mathrm{Var}(X)/t^2$.
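Chebyshev's bound can be checked numerically (a sketch, not part of the notes): for an exponential variable with mean 1 and variance 1, the tail probabilities must sit below $\mathrm{Var}(X)/t^2$.

```python
import numpy as np

rng = np.random.default_rng(4)
X = rng.exponential(scale=1.0, size=500_000)   # mean 1, variance 1
mu, var = 1.0, 1.0

# Empirical P(|X - mu| >= t) versus the Chebyshev bound var / t^2
tails = {t: (np.abs(X - mu) >= t).mean() for t in (2.0, 3.0)}
```

Here the actual tails ($e^{-3} \approx 0.05$ and $e^{-4} \approx 0.02$) are well below the bounds $1/4$ and $1/9$: Chebyshev is valid but often loose.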
The intuition is that if moments are small then large deviations from
average are unlikely.
Example moments: If $Z$ is standard normal then $E(Z) = \int z\,\phi(z)\,dz = 0$ and (integrating by parts) $E(Z^2) = \int z^2 \phi(z)\,dz = 1$, so that $\mathrm{Var}(Z) = 1$.
If now $X = \mu + \sigma Z$, that is, $X \sim N(\mu, \sigma^2)$, then $E(X) = \mu + \sigma E(Z) = \mu$ and $\mathrm{Var}(X) = \sigma^2 \mathrm{Var}(Z) = \sigma^2$.
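These moments are easy to verify by simulation (a sketch using NumPy, not part of the notes): samples of $X = \mu + \sigma Z$ should have mean near $\mu$ and variance near $\sigma^2$.

```python
import numpy as np

rng = np.random.default_rng(5)
mu, sigma = 2.0, 3.0
Z = rng.standard_normal(1_000_000)
X = mu + sigma * Z      # X ~ N(mu, sigma^2)

m = X.mean()            # should be close to mu
v = X.var()             # should be close to sigma^2
```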
Last time we defined expectation and stated the Monotone Convergence Theorem, the Dominated Convergence Theorem and Fatou's Lemma, and reviewed elementary definitions of expected value and basic properties of $E$.