Monte Carlo computation of expected value:

To compute $\mu = E(X)$: do the experiment $n$ times to generate $n$ independent observations $X_1, \ldots, X_n$, all with the same distribution as $X$. Get the sample mean
$$\bar{X} = \frac{1}{n} \sum_{i=1}^n X_i.$$
Use $\bar{X}$ to estimate $\mu$. To estimate $E\{g(X)\}$: use the same $X_1, \ldots, X_n$ and compute
$$\frac{1}{n} \sum_{i=1}^n g(X_i).$$
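This sample-mean estimator can be sketched in a few lines of Python (the helper name `mc_expectation` and the Uniform(0,1) illustration are mine, not from the notes):

```python
import random

def mc_expectation(experiment, g, n):
    """Estimate E{g(X)} by averaging g over n independent draws of X.

    experiment: a function returning one draw of X.
    g: the function whose expectation we want (identity gives E(X)).
    """
    return sum(g(experiment()) for _ in range(n)) / n

# Illustration: estimate E(X^2) for X ~ Uniform(0,1); the true value is 1/3.
random.seed(1)
est = mc_expectation(random.random, lambda x: x * x, n=100_000)
```

With $n = 100{,}000$ draws the estimate is typically within a few thousandths of $1/3$.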
In practice: truly random quantities are not used. Instead we use pseudo-random numbers, which ``behave like'' iid Uniform(0,1) variables.

Many, many generators exist. One standard kind: linear congruential generators. Start with a seed $x_0$, an integer in the range $0 \le x_0 < m$. Compute
$$x_{n+1} = (a x_n + c) \bmod m.$$
The integers $a$, $c$ and $m$ are chosen so that the sequence has a very long period and good statistical behaviour. Use
$$U_n = x_n / m.$$
We now pretend we can generate $U_1, U_2, \ldots$ which are iid Uniform(0,1).
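A linear congruential generator is only a few lines of code; a sketch, using the classic glibc-style constants for illustration (the notes do not fix particular values of $a$, $c$, $m$):

```python
def lcg(x0, a=1103515245, c=12345, m=2**31):
    """Linear congruential generator: x_{n+1} = (a * x_n + c) mod m.

    Yields the pseudo-uniform values U_n = x_n / m in [0, 1).
    The default constants are the classic glibc-style choices,
    used here purely as an example.
    """
    x = x0
    while True:
        x = (a * x + c) % m
        yield x / m

gen = lcg(x0=20570)
u = [next(gen) for _ in range(5)]  # five pseudo-uniform values in [0, 1)
```

With these constants ($c$ odd, $a \equiv 1 \bmod 4$, $m = 2^{31}$) the generator has full period $m$.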
Monte Carlo methods for generating a sample:

Fact: a continuous CDF $F$ and $U \sim$ Uniform(0,1) satisfy
$$P(F^{-1}(U) \le x) = F(x);$$
that is, $X = F^{-1}(U)$ has CDF $F$.

Proof: For simplicity, assume $F$ is strictly increasing on an interval $(a, b)$ with $F(a) = 0$ and $F(b) = 1$. If $U \sim$ Uniform(0,1) then
$$P(F^{-1}(U) \le x) = P(U \le F(x)) = F(x).$$
Conversely: if $0 < u < 1$ and $F$ is continuous and strictly increasing on $(a, b)$, then there is a unique $x \in (a, b)$ such that $F(x) = u$.

Application: generate $U \sim$ Uniform(0,1) and solve
$$F(X) = U$$
for $X$ to get $X = F^{-1}(U)$, which has cdf $F$.
Example: For the Exponential(1) distribution, $F(x) = 1 - e^{-x}$ for $x > 0$, so solving $U = 1 - e^{-X}$ gives
$$X = -\log(1 - U).$$
Observation: $1 - U$ is also Uniform(0,1), so
$$X = -\log U$$
also has an Exponential(1) distribution.
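For the exponential case the inversion recipe is one line; a sketch in Python (the function name is mine):

```python
import math
import random

def exp_inverse_transform(rng=random.random):
    """Generate an Exponential(1) variable by inversion: X = -log(1 - U).

    Using 1 - U (also Uniform(0,1)) keeps the argument of log in (0, 1],
    avoiding log(0) since rng() returns values in [0, 1).
    """
    return -math.log(1.0 - rng())

# Sanity check: the sample mean should be close to E(X) = 1.
random.seed(2)
sample = [exp_inverse_transform() for _ in range(200_000)]
mean = sum(sample) / len(sample)
```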
Example: for $\Phi$, the standard normal cdf, solving $\Phi(X) = U$ must be done numerically, since neither $\Phi$ nor $\Phi^{-1}$ has a closed form.
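One way to carry out the numerical solve of $\Phi(X) = U$ is bisection; a sketch assuming only the identity $\Phi(x) = \frac{1}{2}\{1 + \operatorname{erf}(x/\sqrt{2})\}$ (function names are mine):

```python
import math

def phi(x):
    """Standard normal cdf, written via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def phi_inverse(u, lo=-10.0, hi=10.0, tol=1e-10):
    """Solve phi(x) = u for x by bisection, for 0 < u < 1."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if phi(mid) < u:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

x = phi_inverse(0.975)  # close to the familiar quantile 1.96
```

Applying `phi_inverse` to Uniform(0,1) draws generates $N(0,1)$ variables, though the special purpose transformations below are cheaper.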
Special purpose transformations:

Example: If $U_1$ and $U_2$ are iid Uniform(0,1), define $X$ and $Y$ by
$$X = \sqrt{-2 \log U_1} \cos(2 \pi U_2), \qquad Y = \sqrt{-2 \log U_1} \sin(2 \pi U_2).$$
(NOTE: the book writes this formula in a slightly different form.) Then $X$ and $Y$ are iid $N(0,1)$; this is the Box-Muller transformation.

To see this, compute the joint cdf of $(X, Y)$ at $(x, y)$. Define the polar coordinates
$$R = \sqrt{X^2 + Y^2}, \qquad \Theta = \text{the angle with } X = R \cos\Theta,\ Y = R \sin\Theta.$$
So $R^2 = -2 \log U_1$ has cdf
$$P(R^2 \le t) = P(U_1 \ge e^{-t/2}) = 1 - e^{-t/2},$$
an Exponential distribution with mean 2. Moreover $\Theta = 2 \pi U_2$ is Uniform$(0, 2\pi)$, independent of $R$. Changing to polar coordinates shows this is exactly the joint law of a pair of independent standard normals.

So: generate $U_1, U_2$ iid Uniform(0,1). Define $X$ and $Y$ as above. You have generated two independent $N(0,1)$ variables.
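The two-uniforms-in, two-normals-out recipe is easy to code; a sketch (the function name is mine):

```python
import math
import random

def box_muller(rng=random.random):
    """Return two independent N(0,1) variables built from two uniforms."""
    u1 = 1.0 - rng()                   # in (0, 1], avoids log(0)
    u2 = rng()
    r = math.sqrt(-2.0 * math.log(u1))  # R: sqrt of an Exponential with mean 2
    theta = 2.0 * math.pi * u2          # Theta: Uniform(0, 2*pi)
    return r * math.cos(theta), r * math.sin(theta)

# Sanity check: pooled sample mean near 0 and variance near 1.
random.seed(3)
zs = [z for _ in range(100_000) for z in box_muller()]
mean = sum(zs) / len(zs)
var = sum(z * z for z in zs) / len(zs)
```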
If $F^{-1}$ is difficult to solve (e.g., $F$ itself is hard to compute) we sometimes use acceptance rejection.

Goal: generate $X$ with density $f$.

Tactic: find a density $g$ (easy to sample from) and a constant $c$ such that
$$f(x) \le c g(x) \quad \text{for all } x.$$

Algorithm: generate $Y$ with density $g$ and $U \sim$ Uniform(0,1), independently. If
$$U \le \frac{f(Y)}{c g(Y)}$$
accept: put $X = Y$. Otherwise discard $Y$ and $U$ and try again.

Facts: each trial accepts with probability
$$P\left(U \le \frac{f(Y)}{c g(Y)}\right) = \int \frac{f(y)}{c g(y)}\, g(y)\, dy = \frac{1}{c},$$
so the number of trials until acceptance is Geometric with mean $c$.

Most important fact: the accepted value $X$ has density $f$.

Proof: this is like the old craps example: we compute $P(X \le x)$. Condition on $Y$ to get
$$P(Y \le x,\ \text{accept}) = \int_{-\infty}^x \frac{f(y)}{c g(y)}\, g(y)\, dy = \frac{F(x)}{c}.$$
Since $P(\text{accept}) = 1/c$,
$$P(X \le x) = P(Y \le x \mid \text{accept}) = \frac{F(x)/c}{1/c} = F(x).$$
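The acceptance-rejection loop is short to code; a generic sketch, with a Beta(2,2) illustration of my own choosing (all names are mine):

```python
import random

def accept_reject(f, g_density, g_sampler, c, rng=random.random):
    """Sample once from density f using the envelope c * g_density.

    Returns (X, number_of_trials_used).
    """
    trials = 0
    while True:
        trials += 1
        y = g_sampler()
        if rng() <= f(y) / (c * g_density(y)):
            return y, trials

# Illustration: f(x) = 6x(1-x) on (0,1) (a Beta(2,2) density), with
# envelope g = Uniform(0,1) density and c = 1.5, since max f = 1.5.
random.seed(4)
draws = [accept_reject(lambda x: 6.0 * x * (1.0 - x),
                       lambda x: 1.0, random.random, 1.5)
         for _ in range(50_000)]
mean_trials = sum(t for _, t in draws) / len(draws)  # near c = 1.5
```

The average trial count coming out near $c = 1.5$ matches the fact that the number of trials is Geometric with mean $c$.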
Example: Half normal distribution: $X = |Z|$ with $Z \sim N(0,1)$ has density
$$f(x) = \sqrt{2/\pi}\, e^{-x^2/2}, \quad x > 0.$$
Use $g(x) = \lambda e^{-\lambda x}$, an Exponential($\lambda$) density. To find $c$, maximize
$$\frac{f(x)}{g(x)} = \frac{\sqrt{2/\pi}}{\lambda}\, e^{\lambda x - x^2/2}$$
over $x$; the maximum is at $x = \lambda$, so $c = \sqrt{2/\pi}\, \lambda^{-1} e^{\lambda^2/2}$. Choose $\lambda$ to minimize $c$ (or $\log c$); get $\lambda = 1$, so
$$c = \sqrt{2e/\pi} \approx 1.3155.$$
The algorithm is then: generate $U_1$ and $U_2$, compute $Y = -\log U_1$ (an Exponential(1) variable), and accept $X = Y$ if
$$U_2 \le \frac{f(Y)}{c g(Y)} = e^{-(Y-1)^2/2}.$$
To generate $Z \sim N(0,1)$: use a third uniform $U_3$ to pick a sign at random: negative if $U_3 \le 1/2$, otherwise positive.
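Putting the pieces together gives a complete $N(0,1)$ generator; a sketch (the function name is mine):

```python
import math
import random

def normal_via_rejection(rng=random.random):
    """N(0,1) via acceptance-rejection with an Exponential(1) envelope.

    The constant c = sqrt(2e/pi) never appears explicitly because the
    acceptance ratio f(Y)/(c g(Y)) simplifies to exp(-(Y-1)^2 / 2).
    """
    while True:
        y = -math.log(1.0 - rng())      # Y ~ Exponential(1); 1-U avoids log(0)
        if rng() <= math.exp(-((y - 1.0) ** 2) / 2.0):
            # y now has the half-normal density; attach a random sign.
            return -y if rng() <= 0.5 else y

# Sanity check: sample mean near 0, variance near 1.
random.seed(5)
zs = [normal_via_rejection() for _ in range(100_000)]
mean = sum(zs) / len(zs)
var = sum(z * z for z in zs) / len(zs)
```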
The inverse transformation method also works for discrete distributions. Suppose $X$ has possible values $x_1 < x_2 < \cdots$ with probabilities $p_1, p_2, \ldots$. Compute the cumulative probabilities
$$F_j = p_1 + \cdots + p_j \quad (\text{with } F_0 = 0).$$
Generate $U \sim$ Uniform(0,1). Find $j$ such that
$$F_{j-1} < U \le F_j.$$
Put $X = x_j$.
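The search for $j$ is a direct scan over the cumulative probabilities; a sketch (names are mine):

```python
import random
from itertools import accumulate

def discrete_inverse_transform(values, probs, rng=random.random):
    """Sample X with P(X = values[j]) = probs[j] by inverting the cdf."""
    u = rng()
    for x, cum in zip(values, accumulate(probs)):
        if u <= cum:
            return x
    return values[-1]  # guard against floating-point rounding in sum(probs)

# Illustration: P(X=1) = 0.2, P(X=2) = 0.5, P(X=3) = 0.3.
random.seed(6)
draws = [discrete_inverse_transform([1, 2, 3], [0.2, 0.5, 0.3])
         for _ in range(100_000)]
freq2 = draws.count(2) / len(draws)  # near 0.5
```

For long lists of values, replacing the scan with a bisection search over the cumulative probabilities is faster.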