
STAT 801: Mathematical Statistics

Hypothesis Testing and Decision Theory

Decision analysis of hypothesis testing takes $ D=\{0,1\}$ and

$\displaystyle L(d,\theta)=1(\mbox{make an error})
$

or more generally $ L(0,\theta) = \ell_1 1(\theta\in\Theta_1)$ and $ L(1,\theta) = \ell_0 1(\theta\in\Theta_0)$ for two positive constants $ \ell_0$ and $ \ell_1$. We make the decision space convex by allowing a decision to be a probability measure on $ D$. Any such measure can be specified by $ \delta=P(\mbox{reject})$, so $ {\cal D} =[0,1]$. The loss function of $ \delta\in[0,1]$ is

$\displaystyle L(\delta,\theta) = (1-\delta)\ell_1 1(\theta\in\Theta_1) + \delta \ell_0
1(\theta\in\Theta_0) \, .
$
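As a quick check, the endpoints $ \delta=0$ and $ \delta=1$ recover the non-randomized losses:

$\displaystyle L(0,\theta) = \ell_1 1(\theta\in\Theta_1) \qquad\mbox{and}\qquad
L(1,\theta) = \ell_0 1(\theta\in\Theta_0) \, ,
$

and $ L(\delta,\theta)$ is linear in $ \delta$, which is what makes the enlarged decision space convex.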

Simple hypotheses: $ \Theta_0=\{\theta_0\}$ and $ \Theta_1=\{\theta_1\}$. The prior puts mass $ \pi_0>0$ on $ \theta_0$ and $ \pi_1>0$ on $ \theta_1$, with $ \pi_0+\pi_1=1 $.

Procedure: a map $ \phi$ from the sample space to $ \cal D$, called a test function.

The risk function of the procedure $ \phi$ is a pair of numbers:

$\displaystyle R_\phi(\theta_0) = E_0(L(\phi(X),\theta_0))
$

and

$\displaystyle R_\phi(\theta_1) = E_1(L(\phi(X),\theta_1))
$

Since $ \theta_0\in\Theta_0$ and $ \theta_1\in\Theta_1$, only one term of the loss survives in each case. Writing $ \alpha=E_0(\phi(X))$ for the probability of a Type I error and $ \beta=E_1(1-\phi(X))$ for the probability of a Type II error, we find

$\displaystyle R_\phi(\theta_0) = \ell_0 E_0(\phi(X)) =\ell_0\alpha
$

and

$\displaystyle R_\phi(\theta_1) = \ell_1 E_1(1-\phi(X)) = \ell_1 \beta
$

The Bayes risk of $ \phi$ is

$\displaystyle \pi_0\ell_0 \alpha+ \pi_1\ell_1\beta
$
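To see why, assume $ X$ has density $ f_0$ under $ \theta_0$ and $ f_1$ under $ \theta_1$; the Bayes risk then rearranges to

$\displaystyle \pi_0\ell_0\alpha + \pi_1\ell_1\beta
= \pi_1\ell_1 + \int \phi(x)\left[\pi_0\ell_0 f_0(x) - \pi_1\ell_1 f_1(x)\right] dx \, ,
$

which is minimized by taking $ \phi(x)=1$ exactly where the integrand is negative.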

As we saw in the hypothesis testing section, the Bayes risk is minimized by

$\displaystyle \phi(X) = 1(f_1(X)/f_0(X) > \pi_0\ell_0/(\pi_1\ell_1))
$

which is a likelihood ratio test. These tests are Bayes and admissible. The risk function is constant when $ \ell_0\alpha = \ell_1\beta$; since a Bayes procedure with constant risk is minimax, this condition can be used to find the minimax test in this context.
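For example, suppose $ X\sim N(\theta,1)$ with $ \theta_0=0$, $ \theta_1=1$ and $ \ell_0=\ell_1=1$. The likelihood ratio is $ f_1(X)/f_0(X)=\exp(X-1/2)$, so every likelihood ratio test rejects for $ X>c$, giving

$\displaystyle \alpha = P_0(X>c) = 1-\Phi(c) \qquad\mbox{and}\qquad
\beta = P_1(X\le c) = \Phi(c-1) \, .
$

Setting $ \alpha=\beta$ gives $ \Phi(-c)=\Phi(c-1)$, so $ c=1/2$: the test rejecting when $ X>1/2$ has constant risk and is Bayes for the prior $ \pi_0=\pi_1=1/2$, hence minimax.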




Richard Lockhart
2001-01-03