STAT 804: 97-3

Assignment 1

  1. Let $\epsilon_t$ be a Gaussian white noise process. Define

    Compute and plot the autocovariance function of X.

    Solution:

  2. Suppose that $\epsilon_t$ are uncorrelated and have mean 0 with finite variance. Verify that $X$ is stationary and that it is wide sense white noise, assuming that the sequence is iid.

    Solution: I should have simply asked the question with ``suppose that $\epsilon_t$ are iid with mean 0 and variance $\sigma^2$'' instead of the first sentence. The autocovariance function of X is

    Thus $X$ is second order white noise. In fact, from question 4, this sequence is strongly stationary; it is second order white noise but it is not strict sense white noise.

  3. Suppose that

    where $\epsilon_t$ is an iid mean 0 sequence with variance $\sigma^2$. Compute the autocovariance function and plot the results for and . I have shown in class that the roots of a certain polynomial must have modulus more than 1 for there to be a stationary solution X of this difference equation. Translate the conditions on the roots into conditions on the coefficients and plot, in the plane, the region for which this process can be rewritten as a causal filter applied to the noise process $\epsilon$.

    Solution: This is my rephrasing of the question. To compute the autocovariance function you have two possibilities. First you can factor

    with $z_1$ and $z_2$ the roots of that polynomial, and then write, as in class,

    where

    The autocovariance function is then

    This would be rather tedious to compute; you would have to decide how many terms to take in the infinite sums.

    The second possibility is the recursive method:

    To get started you need values for $C(0)$ and $C(1)$. The simplest thing to do, since the value of $\sigma$ is free to choose when you plot, is to just assume $C(0)=1$, so that you just compute the autocorrelation function. To get $C(1)$ put h=1 in the recursion above and get

    so that $C(1)$ is determined. Divide the recursion by $C(0)$ to see that the recursion for the autocorrelation $\rho$ is then

    You can use this for $h \ge 2$. Note that my choice of symbol for the coefficients in the recursion was silly.
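    Since the original symbols did not survive here, the recursive method can be sketched numerically as below; the coefficient names phi1 and phi2 and the parameterization of the AR(2) equation are my assumptions, not notation from the course:

```python
# Hypothetical sketch of the recursive method for an AR(2) process
# X_t = phi1*X_{t-1} + phi2*X_{t-2} + eps_t (coefficient names assumed).
# The Yule-Walker equations give rho(1) = phi1/(1 - phi2) and, for h >= 2,
# rho(h) = phi1*rho(h-1) + phi2*rho(h-2), started from rho(0) = 1.
def ar2_autocorrelation(phi1, phi2, max_lag):
    rho = [1.0, phi1 / (1.0 - phi2)]
    for h in range(2, max_lag + 1):
        rho.append(phi1 * rho[h - 1] + phi2 * rho[h - 2])
    return rho[:max_lag + 1]
```

    With phi2 = 0 the process is an AR(1) and the recursion reduces to rho(h) = phi1**h, which makes a convenient check.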

    Now the roots are of the form

    The stationarity conditions are that both of these roots must be larger than 1 in modulus.

    If the discriminant is non-negative then the two roots are real. Set them equal to 1 and then to -1 to get the boundary of the region of interest:

    gives or, for we get . Similarly, setting the root equal to -1 gives

    It is now not hard to check that the inequalities

    and

    guarantee, for that the roots have absolute value more than 1.

    When the discriminant is negative the two roots are complex conjugates

    and have modulus squared

    which will be more than 1 provided .

    Finally, when the second coefficient is 0, the process is simply an AR(1), which will be stationary for . Putting together all these limits gives a triangle in the plane bounded by the lines , and .
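    As a sketch, the root condition and the triangle can be checked against each other numerically; the parameterization X_t = phi1*X_{t-1} + phi2*X_{t-2} + eps_t and the polynomial 1 - phi1*z - phi2*z**2 are my assumptions:

```python
import cmath

# Both roots of 1 - phi1*z - phi2*z**2 must have modulus more than 1.
# This is equivalent to the triangle of coefficients with boundary lines
# phi1 + phi2 = 1, phi2 - phi1 = 1, and phi2 = -1.
def roots_outside_unit_circle(phi1, phi2):
    if phi2 == 0.0:  # degenerate AR(1) case: the single root is 1/phi1
        return abs(phi1) < 1.0
    disc = cmath.sqrt(phi1 ** 2 + 4.0 * phi2)  # handles a negative discriminant too
    r1 = (-phi1 + disc) / (2.0 * phi2)
    r2 = (-phi1 - disc) / (2.0 * phi2)
    return abs(r1) > 1.0 and abs(r2) > 1.0

def in_stationarity_triangle(phi1, phi2):
    return phi1 + phi2 < 1.0 and phi2 - phi1 < 1.0 and phi2 > -1.0
```

    Sampling a grid of (phi1, phi2) pairs and shading where in_stationarity_triangle is true draws the region asked for; the two functions should agree everywhere off the boundary.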

  4. Suppose that $X$ is strictly stationary. If $g$ is some function from to R show that

    is strictly stationary. What property must $g$ have to guarantee the analogous result with strictly stationary replaced by second order stationary?

    Solution: You must prove the following assertion: for any k and any we have

    (for the mathematically inclined you need this for ``Borel sets $A$''.) Define by

    so that

    and

    Then

    where

    is the inverse image of A under the map . In fact the probability on the right is the definition of the probability on the left!

    (REMARK: A number of students worried about whether or not you could take this ; I suspect they were worried about the existence of a so-called functional inverse of $g$. The latter exists only if $g$ is a bijection: one-to-one and onto. But the inverse image $B$ of $A$ exists for any $g$; it is defined as $B = \{x : g(x) \in A\}$. As a simple example, if then there is no functional inverse of but, for instance,

    so that the inverse image of is perfectly well defined.)

    For the special case t=0 we also get

    But since X is stationary

    from which we get the desired result.
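    The chain of equalities can be written out as follows (a sketch; the assumption that $Y_t = g(X_t,\dots,X_{t+p})$ for some fixed $p$ is mine, since the original formulas did not survive):

```latex
\begin{align*}
P(Y_{t+1+s}\in A_1,\dots,Y_{t+k+s}\in A_k)
  &= P\bigl((X_{t+1+s},\dots,X_{t+k+p+s})\in B\bigr)\\
  &= P\bigl((X_{t+1},\dots,X_{t+k+p})\in B\bigr)
     && \text{(strict stationarity of $X$)}\\
  &= P(Y_{t+1}\in A_1,\dots,Y_{t+k}\in A_k)
\end{align*}
% B is the inverse image of A_1 x ... x A_k under the map
% (x_1,\dots,x_{k+p}) -> (g(x_1,\dots,x_{1+p}),\dots,g(x_k,\dots,x_{k+p})).
```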

    For the second part, if $g$ is affine, that is $g(x) = A^T x + b$ for some vector $A$ and a constant $b$, then Y will have stationary mean and covariance if X does. In fact I think the condition is necessary but I do not know a complete proof.

  5. Suppose that $\epsilon_t$ is an iid mean 0, variance $\sigma^2$ sequence and that $a_0, a_1, \ldots$ are constants. Define

    1. Derive the autocovariance of the process X.

      Solution:

      simplifies to
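      The autocovariance of such a (finite) moving average can also be checked numerically. A hedged sketch, assuming the definition $X_t = \sum_j a_j \epsilon_{t-j}$ with the coefficients stored in a list (names are mine):

```python
# Sketch: for X_t = sum_j a[j]*eps_{t-j} with iid noise of variance sigma2,
# the autocovariance at lag h is C_X(h) = sigma2 * sum_j a[j]*a[j+h];
# only terms with both indices in range contribute for a finite list a.
def linear_process_autocov(a, sigma2, h):
    return sigma2 * sum(a[j] * a[j + h] for j in range(len(a) - h))
```

      For example the two-term average a = [1.0, 0.5] with sigma2 = 2.0 gives C(0) = 2.5, C(1) = 1.0 and C(h) = 0 for h > 1.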

    2. Show that $\sum_j a_j^2 < \infty$ implies

      This condition shows that the infinite sum defining X converges ``in the sense of mean square''. It is possible to prove that this means that X can be defined properly. [Note: I don't expect much rigour in this calculation.]

      Solution: I had in mind the simple calculation

      which has mean 0 and variance

      The latter quantity converges to 0 since

      More rigour requires the following ideas. I had no intention for students to discover or use these ideas but some, at least, were interested to know.

      Let $L^2$ be the set of all random variables $X$ such that $E(X^2) < \infty$, where we agree to regard two random variables $X$ and $X'$ as being the same if $P(X = X') = 1$. (Literally we define them to be equivalent in this case and then let $L^2$ be the set of equivalence classes.) It is a mathematical fact about $L^2$ that it is a Banach space, or a complete normed vector space, with the norm defined by $\|X\| = \sqrt{E(X^2)}$. The important point is that any Cauchy sequence in $L^2$ converges to some limit.

      Define the partial sums $X_n$ and note that for $m < n$ we have

      which shows that $(X_n)$ is Cauchy because the sum $\sum_j a_j^2$ converges. Thus there is an $X$ in $L^2$ such that $X_n \to X$ in $L^2$, which means

      This is precisely our definition of the infinite sum defining $X$.
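      The Cauchy argument can be illustrated numerically; the coefficients $a_j = 2^{-j}$ below are purely my illustrative choice:

```python
# Variance of the difference X_n - X_m of partial sums (m < n), with the
# illustrative coefficients a_j = 2**(-j): it equals
# sigma2 * sum_{j=m+1}^{n} a_j**2, a tail of a convergent series,
# and so tends to 0 as m and n grow.
def increment_variance(m, n, sigma2=1.0):
    return sigma2 * sum((2.0 ** -j) ** 2 for j in range(m + 1, n + 1))
```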

  6. Given a stationary mean 0 series $X$ with autocorrelation $\rho$, and a fixed lag D, find the value of A which minimizes the mean squared error $E\bigl[(X_{t+D} - AX_t)^2\bigr]$

    and, for the minimizing A, evaluate the mean squared error in terms of the autocorrelation and the variance of $X$.

    Solution: I added the mean 0 later because you need it and I had forgotten it. You get $E\bigl[(X_{t+D} - AX_t)^2\bigr] = C(0) - 2AC(D) + A^2C(0)$; this quadratic is minimized when its derivative is 0, which is when $A = C(D)/C(0) = \rho(D)$. Put in this value for A to get a mean squared error of $C(0) - C(D)^2/C(0)$, or just $C(0)\bigl(1 - \rho^2(D)\bigr)$.
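    As a numerical check of this solution (function names are mine; c0 is the variance $C(0)$ and cD the autocovariance at lag D):

```python
# Mean squared error of predicting X_{t+D} by A*X_t for a mean 0 stationary
# series: E[(X_{t+D} - A*X_t)^2] = c0 - 2*A*cD + A**2*c0.
def mse(A, c0, cD):
    return c0 - 2.0 * A * cD + A ** 2 * c0

# The quadratic is minimized at A = cD/c0, the autocorrelation rho(D),
# where the error equals c0*(1 - (cD/c0)**2).
def best_A(c0, cD):
    return cD / c0
```

    Perturbing A away from best_A(c0, cD) in either direction increases mse, as the quadratic shape predicts.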

  7. Suppose $X$ is a stationary Gaussian series with mean $\mu$ and autocovariance $C$. Show that $Y_t = e^{X_t}$ is stationary and find its mean and autocovariance.

    Solution: The stationarity comes from question 4. To compute the mean and covariance of Y we use the fact that the moment generating function of a $N(\mu,\sigma^2)$ random variable is $e^{s\mu + s^2\sigma^2/2}$. Since $E(Y_t) = E(e^{X_t})$ is just the mgf of $X_t$ at s=1, we see that the mean of Y is just $e^{\mu + C(0)/2}$. To compute the covariance we need $E(e^{X_t} e^{X_{t+h}}) = E(e^{X_t + X_{t+h}})$, which is just the mgf of $X_t + X_{t+h}$ at 1. Since $X_t + X_{t+h}$ is $N(2\mu,\,2C(0) + 2C(h))$, we see that the autocovariance of Y is $e^{2\mu + C(0) + C(h)} - e^{2\mu + C(0)}$, or $e^{2\mu + C(0)}\bigl(e^{C(h)} - 1\bigr)$.
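    These formulas are easy to evaluate numerically (a sketch; function names are mine, with c0 = C(0) and ch = C(h)):

```python
import math

# Mean and autocovariance of Y_t = exp(X_t) for Gaussian X with mean mu
# and autocovariance C, via the normal mgf:
#   E(Y_t) = exp(mu + C(0)/2)
#   Cov(Y_t, Y_{t+h}) = exp(2*mu + C(0)) * (exp(C(h)) - 1)
def exp_gaussian_mean(mu, c0):
    return math.exp(mu + c0 / 2.0)

def exp_gaussian_autocov(mu, c0, ch):
    return math.exp(2.0 * mu + c0) * (math.exp(ch) - 1.0)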

  8. The semivariogram of a stationary process X is

    (Without the 1/2 it's called the variogram.) Evaluate in terms of the autocovariance of X.

    Solution:



Richard Lockhart
Fri Nov 14 16:35:29 PST 1997