STAT 350



Assignment 2

1.
In this problem you will prove that

\begin{displaymath}\phi(x) = \frac{1}{\sqrt{2\pi}} e^{-x^2/2} \qquad -\infty < x < \infty
\end{displaymath}

is a density.
(a)
Let $I = \int_{-\infty}^\infty \phi(x) dx$. Show that

\begin{displaymath}I^2 = \int_{-\infty}^\infty \int_{-\infty}^\infty \phi(x)\phi(y)\, dx \, dy
.
\end{displaymath}

HINT: What is $\int_{-\infty}^\infty \phi(y)\,dy$ in terms of $I$?
(b)
Now if

\begin{displaymath}J = \int_{-\infty}^\infty \int_{-\infty}^\infty \frac{1}{\sqrt{2\pi}}
e^{-x^2/2} \frac{1}{\sqrt{2\pi}} e^{-y^2/2} \, dx \, dy
\end{displaymath}

evaluate the double integral $J$ in polar co-ordinates ($x=r\cos\theta$, $y=r\sin\theta$) to show that $J=1$.
(c)
Deduce that $\phi$ is a density.
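A quick numerical sanity check (not a substitute for the requested proof) is to integrate $\phi$ on a wide grid; the mass beyond $|x|=10$ is negligible.

```python
import numpy as np

# Numerical check only: a Riemann sum of phi over [-10, 10].
# The tail mass beyond |x| = 10 is on the order of 1e-23.
x = np.linspace(-10.0, 10.0, 200001)
dx = x[1] - x[0]
phi = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)
I = np.sum(phi) * dx
print(I)  # very close to 1
```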

2.
Suppose $X_1,X_2,X_3$ are independent $N(\mu,\sigma^2)$ random variables, so that $X_i=\mu+\sigma Z_i$ with $Z_1,Z_2,Z_3$ independent standard normals.
(a)
If $X^T = (X_1,X_2,X_3)$ and $Z^T=(Z_1,Z_2,Z_3)$, express $X$ in the form $AZ+b$ for a suitable matrix $A$ and vector $b$.
(b)
Show that $X$ is $MVN_3(\mu_X,\Sigma_X)$ and identify $\mu_X$ and $\Sigma_X$.
(c)
Let $Y_i = X_i-\bar{X}$ for $i=1,2,3$ and $Y_4=\bar{X}$. Show that $Y\sim MVN_4(\mu_Y,\Sigma_Y)$ and find $\mu_Y$ and $\Sigma_Y$.
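As a numerical illustration (not the requested derivation): writing $Y = BX$ for the $4\times 3$ matrix $B$ below gives $\mu_Y = B\mu_X$ and $\Sigma_Y = B\Sigma_X B^T$. The values $\mu=2$, $\sigma=1.5$ are arbitrary choices for the check.

```python
import numpy as np

# Y = (X1 - Xbar, X2 - Xbar, X3 - Xbar, Xbar) = B X.
# mu = 2.0 and sigma = 1.5 are arbitrary illustrative values.
mu, sigma = 2.0, 1.5
B = np.array([[ 2, -1, -1],
              [-1,  2, -1],
              [-1, -1,  2],
              [ 1,  1,  1]]) / 3.0
mu_X = mu * np.ones(3)
Sigma_X = sigma**2 * np.eye(3)
mu_Y = B @ mu_X
Sigma_Y = B @ Sigma_X @ B.T
# The centered components have mean 0 and are uncorrelated with X-bar.
print(np.allclose(mu_Y, [0, 0, 0, mu]))
print(np.allclose(Sigma_Y[:3, 3], 0))
```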

3.
Working with partitioned matrices. Suppose that the design matrix $X$ is partitioned as $X=[{\bf 1}\vert X_1\vert X_2]$ where $X_i$ has $p_i$ columns.
(a)
Write $X^TX$ as a partitioned (3 rows, 3 columns) matrix.
(b)
A matrix

\begin{displaymath}A = \left[\begin{array}{ccc}
A_1 & 0 & 0 \\
0 & A_2 & 0 \\
0 & 0 & A_3
\end{array}\right]
\end{displaymath}

is called block diagonal. Show that $A^{-1}$ exists if and only if each $A_i^{-1}$ exists, and that then $A^{-1}$ is block diagonal.
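This can be illustrated numerically (assuming invertible blocks): inverting $A$ blockwise agrees with inverting $A$ as a whole. The blocks $MM^T + I$ below are positive definite, hence invertible.

```python
import numpy as np

# Illustration: the inverse of a block diagonal matrix is the
# block diagonal of the blockwise inverses.
rng = np.random.default_rng(0)
blocks = []
for n in (1, 2, 3):
    M = rng.standard_normal((n, n))
    blocks.append(M @ M.T + np.eye(n))  # positive definite block

A = np.zeros((6, 6))
Ainv_blockwise = np.zeros((6, 6))
start = 0
for B in blocks:
    stop = start + B.shape[0]
    A[start:stop, start:stop] = B
    Ainv_blockwise[start:stop, start:stop] = np.linalg.inv(B)
    start = stop

print(np.allclose(np.linalg.inv(A), Ainv_blockwise))
```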
(c)
Suppose that ${\bf 1}^T X_i = 0$ for $i=1,2$ and $X_1^TX_2=0$. Show that $X^TX$ is block diagonal and give a formula for $(X^TX)^{-1}$.
(d)
Suppose $\beta^T = [ \beta_0\vert \beta_1^T\vert\beta_2^T]$ is partitioned to conform with the partitioning of X (that is, $\beta_0$ is a scalar and $\beta_i$ is a column vector of length $p_i$ for $i=1,2$). Let $\tilde\beta_0$ be obtained by fitting

\begin{displaymath}Y={\bf 1}\beta_0+\epsilon
\end{displaymath}

by least squares, $\tilde\beta_1$ be obtained by fitting

\begin{displaymath}Y=X_1\beta_1+\epsilon
\end{displaymath}

and similarly for $\tilde\beta_2$. Let $\hat\beta$ be the usual least squares estimate for

\begin{displaymath}Y=X\beta+\epsilon \, .
\end{displaymath}

Show that $\hat\beta^T = [ \tilde\beta_0\vert \tilde\beta_1^T\vert\tilde\beta_2^T]$.
(e)
Let $\hat\mu_i$ be the vectors of fitted values corresponding to the estimates $\tilde\beta_i$ for $i=0,1,2$. Show that for $i\ne j$ we have $\hat\mu_i \perp \hat\mu_j$.
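The claims in (d) and (e) can be checked on a toy example (not the requested proof), with hypothetical columns chosen so that ${\bf 1}^T X_1 = {\bf 1}^T X_2 = 0$ and $X_1^T X_2 = 0$.

```python
import numpy as np

# Hypothetical orthogonal design: columns sum to zero and X1'X2 = 0.
ones = np.ones((6, 1))
X1 = np.array([[1.0, 1, 1, -1, -1, -1]]).T
X2 = np.array([[1.0, -1, 0, 1, -1, 0]]).T
X = np.hstack([ones, X1, X2])
rng = np.random.default_rng(1)
y = rng.standard_normal(6)

# Full least-squares fit versus the three separate fits.
beta_full = np.linalg.lstsq(X, y, rcond=None)[0]
parts = [np.linalg.lstsq(M, y, rcond=None)[0] for M in (ones, X1, X2)]
beta_sep = np.concatenate(parts)
fitted = [M @ b for M, b in zip((ones, X1, X2), parts)]

print(np.allclose(beta_full, beta_sep))       # (d): stacked fits agree
print(abs(fitted[0] @ fitted[1]) < 1e-10)     # (e): fitted values orthogonal
print(abs(fitted[0] @ fitted[2]) < 1e-10)
print(abs(fitted[1] @ fitted[2]) < 1e-10)
```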
(f)
For the design matrix $X_b$ of the first assignment, identify $X_1$ and $X_2$ and verify the orthogonality condition of this problem.
4.
Page 321: Problem 7.33, parts a, b, e, and f; Problem 7.34; and Problem 7.35, part a.
DUE: Wednesday, 3 February

Richard Lockhart
1999-02-02