
STAT 802

Assignment 1

  1. Suppose $X^T=(X_1^T,X_2^T)$ has an $MVN_{p+q}$ distribution with mean $\mu$ partitioned as $\mu^T=(\mu_1^T, \mu_2^T)$ and variance-covariance matrix

    \begin{displaymath}
\Sigma = \left[\begin{array}{rr} \Sigma_{11} & \Sigma_{12} \\
\Sigma_{21} & \Sigma_{22}\end{array}\right]
\end{displaymath}

    1. Show that ${\rm Cov}(X_1,X_2) = 0$ if and only if $X_1$ and $X_2$ are independent. Use my definition of MVN; you are not allowed to assume that $X$ has a density. A mathematically careful argument may rely on the fact that if $Y_1,\ldots,Y_r$ are independent and $\phi_i$ are (measurable) functions then $\phi_1(Y_1),\ldots,\phi_r(Y_r)$ are independent.

    2. Show that whether or not $\Sigma_{11}$ is singular, each column of $\Sigma_{12}$ is in the column space of $\Sigma_{11}$.

      Hint: one way to do this is to use the following outline.

      1. Define $P$ and $\Lambda$ to be the matrix of orthonormal eigenvectors and the diagonal matrix of corresponding eigenvalues of $\Sigma_{11}$, as I showed in class. The columns of $P$ are a basis of $p$-dimensional space if $X_1$ has $p$ components, so you can write any vector $v$ as a linear combination of the columns of $P$, that is, in the form $Px=v$.

      2. Suppose $v$ is a column of $\Sigma_{12}$ and write $v=\sum b_i P_i$, where the $P_i$ are the columns of $P$. You are supposed to find an $x$ such that $\Sigma_{11}x = P\Lambda P^T x = v$. If $x = \sum a_i P_i$, then use the orthogonality of the $P_i$ to derive a relationship between $a_i$, $\lambda_i$ and $b_i$.

      3. If $\lambda_i \neq 0$ you can solve this for $a_i$, while if $\lambda_i=0$ the relationship in the previous part holds anyway. This gives you a formula for $x$ as a linear combination of the $P_i$.
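As an aside, the outline above can be checked numerically. The sketch below uses a matrix and vector of my own choosing (not from the assignment): it builds a singular $\Sigma_{11}$, expands a vector $v$ from its column space in the eigenbasis, and recovers an $x$ with $\Sigma_{11}x = v$ exactly as in the three steps.

```python
import numpy as np

# Hypothetical example (not from the assignment): a singular,
# positive semi-definite Sigma_11 of rank 2 in R^3.
Sigma11 = np.array([[2.0, 1.0, 1.0],
                    [1.0, 1.0, 0.0],
                    [1.0, 0.0, 1.0]])

lam, P = np.linalg.eigh(Sigma11)      # Sigma11 = P diag(lam) P^T

# Take v in the column space of Sigma11, e.g. its first column.
v = Sigma11[:, 0]

# Step 2: expand v = sum_i b_i P_i in the eigenbasis.
b = P.T @ v

# Step 3: a_i = b_i / lambda_i when lambda_i != 0, a_i = 0 otherwise
# (for v in the column space, b_i = 0 whenever lambda_i = 0).
nz = np.abs(lam) > 1e-12
a = np.zeros_like(b)
a[nz] = b[nz] / lam[nz]
x = P @ a

print(np.allclose(Sigma11 @ x, v))    # True: Sigma11 x = v
```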

    3. Show that there is a matrix $A$ which is $p\times q$ such that $\Sigma_{12} = \Sigma_{11}A$.

    4. Let $h$ be a column of $\Sigma_{12}$ and $v_1$ and $v_2$ be any vectors such that $\Sigma_{11}v_i = h$ for $i=1,2$. Show that $\Sigma_{21} v_1 = \Sigma_{21} v_2$.

    5. Show that if $\Sigma_{11} A_i = \Sigma_{12}$ for $i=1,2$ then $\Sigma_{21}A_1=\Sigma_{21} A_2$.

    6. Show that $a^T \Sigma_{11} a = 0$ implies $P(a^T X_1 = a^T \mu_1) = 1$ for any $a\in R^p$.

    7. Suppose $x$ is such that $a^T \Sigma_{11} a = 0$ implies $a^T x = a^T \mu_1$. Show that the conditional distribution of $X_2$ given $X_1=x$ is well-defined and is multivariate normal with mean

      \begin{displaymath}
\mu_2+\Sigma_{21} a
\end{displaymath}

      and variance-covariance matrix

      \begin{displaymath}
\Sigma_{22} - \Sigma_{21} A
\end{displaymath}

      where $a$ is any solution of $\Sigma_{11}a = (x-\mu_1)$ and $A$ is any solution of $\Sigma_{11}A = \Sigma_{12}$.
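The formulas in part 7 can be sanity-checked on a small singular example. In the sketch below (my own construction, not part of the assignment) $Z_1, Z_2$ are iid $N(0,1)$, $X_1 = (Z_1, Z_1)$ and $X_2 = Z_1 + Z_2$, so given $X_1 = (x_1, x_1)$ we know directly that $X_2 \sim N(x_1, 1)$; the stated formulas reproduce this even though $\Sigma_{11}$ is singular.

```python
import numpy as np

# Singular example (my own, not from the assignment):
# X1 = (Z1, Z1), X2 = Z1 + Z2 with Z1, Z2 iid N(0,1), mu = 0.
Sigma11 = np.array([[1.0, 1.0], [1.0, 1.0]])   # singular
Sigma12 = np.array([[1.0], [1.0]])              # in col(Sigma11)
Sigma21 = Sigma12.T
Sigma22 = np.array([[2.0]])

mu1 = np.zeros(2)
x = np.array([0.7, 0.7])    # consistent value, since P(X_{11} = X_{12}) = 1

# Any solutions of Sigma11 a = x - mu1 and Sigma11 A = Sigma12;
# lstsq returns the minimum-norm ones.
a = np.linalg.lstsq(Sigma11, x - mu1, rcond=None)[0]
A = np.linalg.lstsq(Sigma11, Sigma12, rcond=None)[0]

cond_mean = Sigma21 @ a                  # mu2 + Sigma21 a with mu2 = 0
cond_var = Sigma22 - Sigma21 @ A

# Directly: given Z1 = 0.7, X2 = 0.7 + Z2 ~ N(0.7, 1).
print(np.allclose(cond_mean, 0.7))   # True
print(np.allclose(cond_var, 1.0))    # True
```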

    NOTE: Most facts about $MVN(\mu,\Sigma)$ variates $X$ can be demonstrated by writing $X=AZ+\mu$ for well chosen $A$.
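To illustrate the NOTE: the sketch below (with an illustrative $\Sigma$ and $\mu$ of my own choosing) takes $A$ to be the symmetric square root of $\Sigma$, sets $X = AZ + \mu$ for standard multivariate normal $Z$, and confirms empirically that $X$ has variance-covariance matrix $AA^T = \Sigma$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative mu and Sigma (my own choice, not from the assignment).
Sigma = np.array([[2.0, 1.0],
                  [1.0, 1.0]])
mu = np.array([1.0, -1.0])

# One well-chosen A: the symmetric square root, so A A^T = Sigma
# and hence X = A Z + mu has variance-covariance Sigma.
lam, P = np.linalg.eigh(Sigma)
A = P @ np.diag(np.sqrt(lam)) @ P.T

Z = rng.standard_normal((2, 100_000))   # columns are iid MVN(0, I) draws
X = A @ Z + mu[:, None]

print(np.allclose(A @ A.T, Sigma))      # True: A is a valid choice
print(np.cov(X))                        # empirically close to Sigma
```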

  2. Fix $a\in R^p$. Minimize and maximize $f(x)= (a^T x)^2$ subject to $x^T x=1$.

  3. Fix $a\in R^p$. Minimize and maximize $f(x)=a^Tx$ subject to $x^TQx = 1$ for a given symmetric positive definite matrix $Q$.

  4. Write out the spectral decomposition of the matrix

    \begin{displaymath}
A=\left[\begin{array}{rr} 1 & -1 \\ -1 & 1 \end{array}\right]
\end{displaymath}

    and then find a symmetric square root.

  5. Suppose $X=(X_1,X_2,X_3)$ has an MVN distribution with $\mu=0$ and

    \begin{displaymath}
\Sigma = \left[\begin{array}{rrr} 2 & 1 & -1 \\ 1 & 1 & -1 \\ -1 & -1 & 1 \end{array}\right]
\end{displaymath}

    Find the conditional distribution of $X_1$ given $X_2=x_2$ and $X_3=x_3$. For which values of $x_2$ and $x_3$ does this make sense?



Richard Lockhart
2002-08-28