STAT 801



Problems: Assignment 5

1.
Suppose $X_1,\ldots,X_m$ are iid $N( \mu, \sigma^2 )$ and $Y_1,\ldots,Y_n$ are iid $N( \chi, \tau^2)$. Assume the Xs are independent of the Ys.

(a)
Find complete and sufficient statistics.

(b)
Find UMVUE's of $ \mu - \chi$ and $ \sigma^2/\tau^2$.

(c)
Now suppose you know that $ \sigma = \tau$. Find UMVUE's of $ \chi-\mu$ and of $( \chi-\mu )/ \sigma$. (You have already found the UMVUE for $ \sigma^2$.)

(d)
Now suppose $\sigma$ and $\tau$ are unknown but that you know that $\mu = \chi$. Prove there is no UMVUE for $\mu$. (Hint: Find the UMVUE if you knew $\sigma/\tau = a$ with $a$ known. Use the fact that the solution depends on $a$ to finish the proof.)

(e)
Why doesn't the Lehmann-Scheffé theorem apply?

2.
Suppose $X_1,\ldots,X_n$ are iid Poisson($\lambda$). Find the UMVUE for $\lambda$ and for $1-\exp(-\lambda) = P(X_1 \ne 0)$.

3.
Suppose $X_1,\ldots,X_n$ are iid with

\begin{displaymath}P(X_1=k) = {\rm Prob}({\rm Poisson}(\lambda)=k \vert {\rm Poisson}(\lambda)>0)\end{displaymath}

for $k=1,2,3,\ldots$. For $n=1$ and $n=2$ find the UMVUE of $1-\exp(-\lambda)$. (Hint: The expected value of any function of $X$ is a power series in $\lambda$ divided by $e^\lambda-1$. Set this equal to $1-\exp(-\lambda)$ and deduce that two power series are equal. Since this implies their coefficients are the same, you can see what the estimate must be.)
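To make the hint concrete for $n=1$: for any statistic $g(X_1)$,

\begin{displaymath}{\rm E}_\lambda\left(g(X_1)\right)=\frac{\sum_{k=1}^\infty g(k)\lambda^k/k!}{e^\lambda-1},\end{displaymath}

so unbiasedness for $1-\exp(-\lambda)$ amounts to $\sum_{k=1}^\infty g(k)\lambda^k/k! = (1-e^{-\lambda})(e^\lambda-1)$; expand the right hand side in powers of $\lambda$ and match coefficients. The $n=2$ case works the same way with a function $g(X_1,X_2)$ and a double sum.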

4.
 Suppose $\{X_{ij}; j=1, \ldots ,n_i ; i=1, \ldots ,p\}$ are independent $N( \mu_i , \sigma^2)$ random variables. (This is the usual set-up for the one-way layout.)

(a)
Find the MLE's for $\mu_i$ and $\sigma$.

(b)
Find the expectations and variances of these estimators.

5.
Let $T_i$ be the error sum of squares in the $i$th cell in the previous question.

(a)
Find the joint density of the $T_i$.

(b)
Find the best estimate of $\sigma^2$ of the form $\sum_1^p a_i T_i$ in the sense of mean squared error.

(c)
Do the same under the condition that the estimator must be unbiased.

(d)
If only $T_1,\ldots,T_p$ are observed, what is the MLE of $\sigma$?

(e)
Find the UMVUE of $ \sigma^2$ for the usual one-way layout model, that is, the model of the last two questions.

6.
Exponential families: Suppose $ X_1, \ldots ,X_n $ are iid with density

\begin{displaymath}f(x;\theta_1,\ldots,\theta_p) = c(\theta)\exp\left(\sum_1^p T_i(x)\theta_i\right)h(x).\end{displaymath}

(a)
Find minimal sufficient statistics.

(b)
If $S_1,\ldots,S_p$ are the minimal sufficient statistics, show that setting $S_i = {\rm E}_\theta(S_i)$ and solving gives the likelihood equations. (Note the connection to the method of moments.)

7.
In question 4 take $n_i=2$ for all $i$ and let $p\to\infty$. What happens to the MLE of $\sigma$?
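If you want to check your answer empirically, here is a minimal simulation sketch in Python; the parameter values are arbitrary, and it assumes the form of the MLE of $\sigma^2$ you derived in question 4.

\begin{verbatim}
import numpy as np

# Sketch: behaviour of the MLE of sigma^2 in the one-way layout with
# n_i = 2 as the number of cells p grows.  Illustrative values only.
rng = np.random.default_rng(801)
sigma = 1.0
for p in [10, 100, 1000, 10000]:
    mu = rng.normal(size=p)                         # arbitrary cell means
    x = mu[:, None] + sigma * rng.normal(size=(p, 2))
    xbar = x.mean(axis=1, keepdims=True)            # within-cell means
    sigma2_hat = ((x - xbar) ** 2).sum() / (2 * p)  # SSE / total n
    print(p, sigma2_hat)                            # compare with sigma**2
\end{verbatim}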

8.
Suppose that $Y_1,\ldots,Y_n$ are independent random variables and that $x_1,\ldots,x_n$ are the corresponding values of some covariate. Suppose that the density of $Y_i$ is

\begin{displaymath}f(y_i)=\exp\left(-y_i\exp(-\alpha-\beta x_i)-\alpha-\beta x_i\right)1(y_i > 0)\end{displaymath}

where $\alpha$ and $\beta$ are unknown parameters.

(a)
Find the log-likelihood, the score function and the Fisher information.

(b)
For the data set in $\sim$/teaching/courses/801/data1, fit the model and produce a contour plot of the log-likelihood surface, the profile likelihood for $\beta$, and an approximate 95% confidence interval for $\beta$.
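One possible computational route, sketched in Python with scipy; the two-column layout of data1 ($x$ then $y$) is an assumption, so adjust the loading step to the actual file format.

\begin{verbatim}
import numpy as np
from scipy.optimize import minimize

# ASSUMED file layout: two whitespace-separated columns, x then y.
x, y = np.loadtxt("data1").T

def negloglik(theta):
    # minus the log-likelihood for the model of question 8
    a, b = theta
    eta = a + b * x
    return np.sum(y * np.exp(-eta) + eta)

fit = minimize(negloglik, x0=[np.log(y.mean()), 0.0], method="BFGS")
alpha_hat, beta_hat = fit.x

# Profile log-likelihood for beta: for fixed beta the maximizing alpha
# solves the alpha-score equation in closed form.
def profile(beta):
    a = np.log(np.mean(y * np.exp(-beta * x)))
    return -negloglik([a, beta])

betas = beta_hat + np.linspace(-2, 2, 401)
prof = np.array([profile(b) for b in betas])

# Approximate 95% interval: beta values whose profile log-likelihood is
# within chi-square(1, 0.95)/2 = 1.92 of its maximum.
keep = prof >= prof.max() - 1.92
print("MLEs:", alpha_hat, beta_hat)
print("approx 95% CI for beta:", betas[keep].min(), betas[keep].max())
# The contour plot is matplotlib's contour() of negloglik over a grid.
\end{verbatim}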

9.
Consider the random effects one-way layout. You have data $X_{ij}; i=1,\ldots,p; j=1,\ldots,n$ and a model $X_{ij} = \mu + \alpha_i + \epsilon_{ij}$ where the $\alpha$'s are iid $N(0,\tau^2)$ and the $\epsilon$'s are iid $N(0,\sigma^2)$. The $\alpha$'s are independent of the $\epsilon$'s.

(a)
Compute the mean vector and variance-covariance matrix of the vector you get by writing out all the $X_{ij}$ as a single vector.

(b)
Suppose that $M$ is a matrix of the form $aI+b{\bf 1}{\bf 1}^t$ where $I$ is a $p\times p$ identity matrix and ${\bf 1}$ denotes a column vector of $p$ ones. Show that $M^{-1}$ is of the form $cI+d{\bf 1}{\bf 1}^t$ and find $c$ and $d$. In what follows you may use the fact that the determinant of $M$ is $a^{p-1}(a+pb)$.
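A numerical sanity check of your algebra is easy; here is a sketch with arbitrary illustrative values of $a$, $b$ and $p$.

\begin{verbatim}
import numpy as np

a, b, p = 2.0, 0.5, 4                      # arbitrary illustrative values
one = np.ones((p, 1))
M = a * np.eye(p) + b * (one @ one.T)
Minv = np.linalg.inv(M)

# If M^{-1} = cI + d 11^t then every off-diagonal entry is d and every
# diagonal entry is c + d, so c and d can be read off the computed inverse.
d = Minv[0, 1]
c = Minv[0, 0] - d
print("c =", c, " d =", d)
print(np.allclose(Minv, c * np.eye(p) + d * (one @ one.T)))
print(np.isclose(np.linalg.det(M), a ** (p - 1) * (a + p * b)))
\end{verbatim}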

(c)
Write down the likelihood.

(d)
Find minimal sufficient statistics.

(e)
Are they complete?

(f)
Data sets like this are usually analyzed based on the fixed effects ANOVA table. Use the formulas for expected mean squares in this table to develop ``method of moments'' estimates of the three parameters. (Because the data are not iid this is not going to be exactly the same technique as the examples in class.)

(g)
Can you find the MLE's?

10.
For each of the doses $d_1,\ldots,d_p$ a number of animals $n_1,\ldots,n_p$ are treated with the corresponding dose of some drug. The number dying at dose $d_i$ is Binomial with parameters $n_i$ and $h(d_i)$. A common model for $h(d)$ is $\log(h/(1-h)) = \alpha + \beta d$.

(a)
Find the likelihood equations for estimating $\alpha$ and $\beta $.

(b)
Find the Fisher information matrix.

(c)
Define the parameter LD50 as the value of $d$ for which $h(d)=1/2$; express LD50 as a function of $\alpha$ and $\beta$.

(d)
Use a Taylor expansion to find large sample confidence limits for LD50.

(e)
At each of the doses $-3.204$, $-2.903$, $-2.602$, $-2.301$ and $-2.000$ a sample of 40 mice was exposed to antipneumococcus serum. The numbers surviving were 7, 18, 32, 35, and 38 respectively. Get numerical values for the theory above. You can use glm or get preliminary estimates based on a linear regression of the logits of the estimated $h(d_i)$ against dose.
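A possible numerical sketch in Python, using scipy directly rather than glm; the LD50 lines assume you have already worked parts (c) and (d).

\begin{verbatim}
import numpy as np
from scipy.optimize import minimize

d = np.array([-3.204, -2.903, -2.602, -2.301, -2.000])
n = np.full(5, 40.0)
deaths = n - np.array([7, 18, 32, 35, 38])      # survivors are given

def negloglik(theta):
    # minus the binomial log-likelihood with logit(h) = alpha + beta*d
    a, b = theta
    eta = a + b * d
    return -np.sum(deaths * eta - n * np.log1p(np.exp(eta)))

# Preliminary estimates: regress the empirical logits on dose.
phat = deaths / n
slope, intercept = np.polyfit(d, np.log(phat / (1 - phat)), 1)
fit = minimize(negloglik, x0=[intercept, slope], method="BFGS")
a_hat, b_hat = fit.x

# Fisher information at the MLE and the resulting covariance matrix.
h = 1 / (1 + np.exp(-(a_hat + b_hat * d)))
w = n * h * (1 - h)
info = np.array([[w.sum(), (w * d).sum()],
                 [(w * d).sum(), (w * d ** 2).sum()]])
cov = np.linalg.inv(info)

# LD50 = -alpha/beta (part (c)) with a delta-method SE (part (d)).
ld50 = -a_hat / b_hat
grad = np.array([-1 / b_hat, a_hat / b_hat ** 2])
se = np.sqrt(grad @ cov @ grad)
print("alpha, beta:", a_hat, b_hat)
print("LD50:", ld50, " 95% CI:", ld50 - 1.96 * se, ld50 + 1.96 * se)
\end{verbatim}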

11.
Suppose $ X_1, \ldots ,X_n $ are a sample of size n from the density

\begin{displaymath}f_{\alpha,\beta}(x)=\frac{1}{\beta\Gamma(\alpha)}\left(\frac{x}{\beta}\right)^{\alpha-1}\exp(-x/\beta)\,1(x>0).\end{displaymath}

In the following question the digamma function $\psi$ is defined by $\psi(\alpha)=\frac{d}{d\alpha}\log(\Gamma(\alpha))$ and the trigamma function $\psi^\prime$ is the derivative of the digamma function. From the identity $\Gamma(\alpha+1)=\alpha\Gamma(\alpha)$ you can deduce recurrence relations for the digamma and trigamma functions.
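For reference, taking logarithms in $\Gamma(\alpha+1)=\alpha\Gamma(\alpha)$ and differentiating once and then twice gives the recurrences

\begin{displaymath}\psi(\alpha+1)=\psi(\alpha)+\frac{1}{\alpha}\qquad\mbox{and}\qquad \psi^\prime(\alpha+1)=\psi^\prime(\alpha)-\frac{1}{\alpha^2}.\end{displaymath}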

(a)
For $\alpha=\alpha_o$ known, find the MLE of $\beta$.

(b)
When both $\alpha$ and $\beta$ are unknown, what equation must be solved to find $\hat\alpha$, the MLE of $\alpha$?

(c)
Evaluate the Fisher information matrix.

(d)
A sample of size 20 is in the file $\sim$/teaching/801/gamma. Use these data in the following questions. First take $\alpha=1$ and find the MLE of $\beta$ subject to this restriction.

(e)
Now use ${\rm E}(X)=\alpha\beta$ and ${\rm Var}(X)=\alpha\beta^2$ to get method of moments estimates $\tilde\alpha$ and $\tilde\beta$ for the parameters. (This was done in class so I just mean get numbers.)

(f)
Do two steps of Newton-Raphson to get the MLEs.

(g)
Use Fisher's scoring idea, which is to replace the second derivative in Newton-Raphson with the Fisher information (and then not change it as you run the iteration), to redo the previous question.
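Here is one way parts (e), (f) and (g) might be carried out in Python; the one-observation-per-line layout of the data file is an assumption, and the score and second-derivative matrix used are the ones you derive in parts (b) and (c).

\begin{verbatim}
import numpy as np
from scipy.special import digamma, polygamma

# ASSUMED layout: one observation per line in the file "gamma".
x = np.loadtxt("gamma")
n, xbar, lbar = len(x), x.mean(), np.log(x).mean()

# (e) Method of moments from E(X) = alpha*beta, Var(X) = alpha*beta^2.
s2 = x.var()
alpha0, beta0 = xbar ** 2 / s2, s2 / xbar
print("moments:", alpha0, beta0)

def score(a, b):
    return np.array([n * (lbar - np.log(b) - digamma(a)),
                     n * (xbar / b ** 2 - a / b)])

def hessian(a, b):          # second derivatives of the log-likelihood
    return np.array([[-n * polygamma(1, a), -n / b],
                     [-n / b, n * (a / b ** 2 - 2 * xbar / b ** 3)]])

def fisher(a, b):           # Fisher information for the whole sample
    return n * np.array([[polygamma(1, a), 1 / b],
                         [1 / b, a / b ** 2]])

# (f) Two Newton-Raphson steps from the moment estimates.
a, b = alpha0, beta0
for _ in range(2):
    a, b = np.array([a, b]) - np.linalg.solve(hessian(a, b), score(a, b))
    print("Newton-Raphson:", a, b)

# (g) Fisher scoring: fix the information at the starting values.
a, b = alpha0, beta0
I = fisher(alpha0, beta0)
for _ in range(2):
    a, b = np.array([a, b]) + np.linalg.solve(I, score(a, b))
    print("scoring:", a, b)

# Standard errors from the inverse information, for use in part (h).
print("SEs:", np.sqrt(np.diag(np.linalg.inv(fisher(a, b)))))
\end{verbatim}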

(h)
Compute standard errors for the MLEs and compare the difference between the estimates in the previous two questions to the SEs.

(i)
Do a likelihood ratio test of $H_o: \alpha=1$.
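A sketch of this last part, continuing the previous code block (it reuses x, xbar and the final estimates a, b from there); the restricted MLE of $\beta$ is obtained numerically so that part (d) is not given away.

\begin{verbatim}
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.special import gammaln

def loglik(a, b):
    return np.sum((a - 1) * np.log(x) - x / b - a * np.log(b) - gammaln(a))

# Restricted MLE of beta under H_o: alpha = 1 (compare with part (d)).
b0 = minimize_scalar(lambda b: -loglik(1.0, b),
                     bounds=(1e-6, 10 * xbar), method="bounded").x

lrt = 2 * (loglik(a, b) - loglik(1.0, b0))
print("LRT statistic:", lrt)    # compare with chi-square(1): 3.84 at 5%
\end{verbatim}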



Richard Lockhart
1998-11-18