Reading for Today's Lecture: ?
Goals of Today's Lecture:
We give two definitions of expected value:

Def'n: If X has density f then

    E(g(X)) = \int g(x) f(x) \, dx.

Def'n: If X has discrete density f then

    E(g(X)) = \sum_x g(x) f(x).
Now if Y = g(X) for smooth g and X has density f_X then Y has a density f_Y, and

    E(Y) = \int y f_Y(y) \, dy = \int g(x) f_X(x) \, dx

by the change of variables formula for integration. This is good
because otherwise we might have two different values for E(e^X).
In general, there are random variables which are neither absolutely
continuous nor discrete. Look at my STAT 801 web pages to see how E(X)
is defined in general.
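The consistency point above is easy to check numerically. The following is a minimal sketch (my own illustration, not from the notes), assuming only the Python standard library: take X standard normal, so Y = e^X is lognormal with density f_Y(y) = \phi(\log y)/y, and compare the two routes to E(e^X) = e^{1/2}.

```python
import math

def phi(z):
    """Standard normal density."""
    return math.exp(-z * z / 2) / math.sqrt(2 * math.pi)

def riemann(f, a, b, n=200_000):
    """Midpoint Riemann sum of f over [a, b]."""
    h = (b - a) / n
    return sum(f(a + (k + 0.5) * h) for k in range(n)) * h

# Route 1: E(e^X) directly from the density of X ~ N(0,1).
lhs = riemann(lambda x: math.exp(x) * phi(x), -12.0, 12.0)

# Route 2: E(Y) from the density of Y = e^X (lognormal),
# f_Y(y) = phi(log y) / y by the change of variables formula.
rhs = riemann(lambda y: y * phi(math.log(y)) / y, 1e-9, 2000.0)

# Both integrals come out near exp(1/2) ~ 1.6487.
```

The two integrals agree because they are the same integral after the substitution y = e^x, which is exactly the point of the change of variables argument.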
Facts: E is a linear, monotone, positive operator:

1. Linear: E(aX + bY) = a E(X) + b E(Y), provided all the expectations exist.
2. Positive: if X \ge 0 then E(X) \ge 0.
3. Monotone: if X \le Y then E(X) \le E(Y).
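These facts are mirrored exactly by sample means, which makes them easy to see in simulation. A quick sketch (my own illustration with arbitrary distributions, Python standard library only):

```python
import random

random.seed(1)
xs = [random.gauss(0, 1) for _ in range(10_000)]
ys = [random.expovariate(1.0) for _ in range(10_000)]  # nonnegative draws

mean = lambda v: sum(v) / len(v)

# Linearity: the sample analogue of E(aX + bY) = aE(X) + bE(Y) holds exactly.
a, b = 2.0, -3.0
lin_lhs = mean([a * x + b * y for x, y in zip(xs, ys)])
lin_rhs = a * mean(xs) + b * mean(ys)

# Positivity: Y >= 0 everywhere, so its (sample) expectation is >= 0.
pos = mean(ys)

# Monotonicity: |X| <= |X| + Y pointwise, so the means are ordered.
mono_small = mean([abs(x) for x in xs])
mono_big = mean([abs(x) + y for x, y in zip(xs, ys)])
```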
Major technical theorems:
Monotone Convergence: If 0 \le X_1 \le X_2 \le \cdots and X_n \nearrow X
(which has to exist) then

    E(X_n) \nearrow E(X).
Dominated Convergence: If X_n \to X (technical details of this
convergence later in the course) and there is a random variable Y such
that |X_n| \le Y with E(Y) < \infty then

    E(X_n) \to E(X).
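A concrete instance of dominated convergence, sketched in simulation (my own illustration, standard library only): take X exponential with mean 1, X_n = X 1{X \le n}, and dominating variable Y = X itself, which is integrable. The truncated means climb toward E(X) as the theorem predicts.

```python
import random

random.seed(2)
xs = [random.expovariate(1.0) for _ in range(50_000)]  # E(X) = 1
mean = lambda v: sum(v) / len(v)

def truncate(x, n):
    """X_n = X 1{X <= n}: X_n -> X pointwise and |X_n| <= X, with E(X) finite."""
    return x if x <= n else 0.0

# Dominated convergence predicts E(X_n) -> E(X); sample means behave the same way.
limits = [mean([truncate(x, n) for x in xs]) for n in (1, 3, 6, 10)]
target = mean(xs)
```

The exact values E(X_n) = 1 - (n+1)e^{-n} make the convergence explicit: roughly 0.26, 0.80, 0.98, 0.9995 for n = 1, 3, 6, 10.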
Fatou's Lemma: If X_n \ge 0 then

    E(\liminf X_n) \le \liminf E(X_n).
Theorem: With this definition of E, if X has density f(x) (even in R^p,
say) and Y = g(X) then

    E(Y) = \int g(x) f(x) \, dx.

This works, for instance, even if X has a density but Y doesn't.
Def'n: The r-th moment (about the origin) of a real random variable X is

    \mu_r' = E(X^r)

(provided it exists). We generally use \mu for E(X). The r-th central
moment is

    \mu_r = E[(X - \mu)^r].
Def'n: For an R^p valued random vector X we define \mu_X = E(X) to be
the vector whose i-th entry is E(X_i) (provided all entries exist).
Def'n: The (p \times p) variance covariance matrix of X is

    Var(X) = E[(X - \mu)(X - \mu)^T],

whose ij entry is Cov(X_i, X_j) = E[(X_i - \mu_i)(X_j - \mu_j)].
Moments and probabilities of rare events are closely connected, as will
be seen in a number of important probability theorems. Here is one
version of Markov's inequality (the case r = 2 is Chebyshev's
inequality):

    P(|X - \mu| \ge t) \le E[|X - \mu|^r] / t^r.

The intuition is that if moments are small then large deviations from
average are unlikely.
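The Chebyshev case of the bound can be compared with empirical tail frequencies. A sketch of my own (simulated N(5, 4) data, an arbitrary choice, Python standard library only):

```python
import random

random.seed(3)
xs = [random.gauss(5, 2) for _ in range(100_000)]
n = len(xs)
mu = sum(xs) / n
var = sum((x - mu) ** 2 for x in xs) / n

def tail(t):
    """Empirical P(|X - mu| >= t)."""
    return sum(abs(x - mu) >= t for x in xs) / n

# Chebyshev: P(|X - mu| >= t) <= Var(X) / t^2 for every t > 0.
checks = [(tail(t), var / t ** 2) for t in (2.0, 4.0, 6.0)]
```

For normal data the bound is loose: at t = 2 standard deviations the true tail is about 0.046 while Chebyshev only guarantees 0.25, which is the sense in which it is a crude but completely general inequality.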
Example moments: If Z is standard normal then

    E(Z) = \int_{-\infty}^{\infty} z \frac{e^{-z^2/2}}{\sqrt{2\pi}} \, dz = 0

(by symmetry) and (integrating by parts, for r \ge 2)

    E(Z^r) = \int z^r \frac{e^{-z^2/2}}{\sqrt{2\pi}} \, dz = (r - 1) E(Z^{r-2})

so that

    E(Z^2) = 1,  E(Z^3) = 0,  E(Z^4) = 3.

If now X = \mu + \sigma Z, that is, X \sim N(\mu, \sigma^2), then

    E(X) = \mu + \sigma E(Z) = \mu

and

    Var(X) = E[(X - \mu)^2] = \sigma^2 E(Z^2) = \sigma^2.
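The integration-by-parts recursion is easy to verify numerically. A sketch of my own (standard library only), with a midpoint Riemann sum standing in for the exact integral:

```python
import math

def phi(z):
    """Standard normal density."""
    return math.exp(-z * z / 2) / math.sqrt(2 * math.pi)

def moment_by_parts(r):
    """E(Z^r) via the recursion E(Z^r) = (r - 1) E(Z^(r-2))."""
    if r == 0:
        return 1.0
    if r == 1:
        return 0.0
    return (r - 1) * moment_by_parts(r - 2)

def moment_numeric(r, a=-12.0, b=12.0, n=200_000):
    """E(Z^r) by a midpoint Riemann sum of z^r phi(z) over [a, b]."""
    h = (b - a) / n
    return sum(((a + (k + 0.5) * h) ** r) * phi(a + (k + 0.5) * h)
               for k in range(n)) * h

# The recursion gives E(Z^2) = 1, E(Z^3) = 0, E(Z^4) = 3, E(Z^6) = 15,
# and the numerical integrals agree.
```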
If X \sim MVN_p(\mu, \Sigma) then we have X = \mu + A Z with A A^T = \Sigma
and Z standard multivariate normal. Hence

    E(X) = \mu + A E(Z) = \mu.

Moreover

    Var(X) = E[(X - \mu)(X - \mu)^T] = A E[Z Z^T] A^T.

To compute E(Z Z^T), look at entry ij in Z Z^T, which is Z_i Z_j. Then

    E(Z_i Z_j) = 1 if i = j and 0 if i \ne j.

SO:

    E(Z Z^T) = I

and

    Var(X) = A I A^T = A A^T = \Sigma.
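The calculation can be checked by simulating X = \mu + A Z directly. A sketch of my own (the 2x2 matrix A below is an arbitrary choice; anything with A A^T = \Sigma works, Python standard library only):

```python
import random

random.seed(4)

# Hypothetical 2-dimensional example: mu and A chosen arbitrarily.
mu = [1.0, -2.0]
A = [[2.0, 0.0],
     [1.0, 1.0]]
# Sigma = A A^T, here [[4, 2], [2, 2]].
sigma = [[sum(A[i][k] * A[j][k] for k in range(2)) for j in range(2)]
         for i in range(2)]

n = 100_000
samples = []
for _ in range(n):
    z = [random.gauss(0, 1), random.gauss(0, 1)]          # standard MVN
    samples.append([mu[i] + sum(A[i][k] * z[k] for k in range(2))
                    for i in range(2)])

# Sample mean vector and sample variance-covariance matrix of X.
means = [sum(x[i] for x in samples) / n for i in range(2)]
cov = [[sum((x[i] - means[i]) * (x[j] - means[j]) for x in samples) / n
        for j in range(2)] for i in range(2)]
```

With 100,000 draws the sample mean is close to \mu and the sample covariance matrix is close to \Sigma = A A^T, as the theorem says.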
Theorem: If X_1, \ldots, X_n are independent and each X_i is
integrable then X = X_1 \cdots X_n is integrable and

    E(X_1 \cdots X_n) = E(X_1) \cdots E(X_n).
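A Monte Carlo sketch of the product rule (my own illustration with arbitrary independent choices X \sim Exponential(2), so E(X) = 1/2, and Y \sim Uniform(0, 3), so E(Y) = 3/2, Python standard library only):

```python
import random

random.seed(5)
n = 200_000
xs = [random.expovariate(2.0) for _ in range(n)]  # E(X) = 1/2
ys = [random.uniform(0, 3) for _ in range(n)]     # E(Y) = 3/2

# X and Y drawn independently, so E(XY) should equal E(X) E(Y) = 3/4.
prod_mean = sum(x * y for x, y in zip(xs, ys)) / n
```

If instead Y were a function of X (dependent on it), the sample mean of XY would in general not come out near E(X)E(Y); independence is what makes the expectation factor.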