The ARIMA Procedure

The ESACF Method

The Extended Sample Autocorrelation Function (ESACF) method can tentatively identify the orders of a stationary or nonstationary ARMA process based on iterated least squares estimates of the autoregressive parameters. Tsay and Tiao (1984) proposed the technique, and Choi (1990) provides useful descriptions of the algorithm.

Given a stationary or nonstationary time series $\{ z_t : 1 \le t \le n \}$ with mean-corrected form $\tilde{z}_t = z_t - \mu_z$, with a true autoregressive order of $p+d$, and with a true moving-average order of $q$, you can use the ESACF method to estimate the unknown orders $p+d$ and $q$ by analyzing the autocorrelation functions associated with filtered series of the form

\[
w_t^{(m,j)} = \hat{\Phi}_{(m,j)}(B)\,\tilde{z}_t
= \tilde{z}_t - \sum_{i=1}^{m} \hat{\phi}_i^{(m,j)} \tilde{z}_{t-i}
\]

where $B$ represents the backshift operator, where $m = p_{\min}, \ldots, p_{\max}$ are the autoregressive test orders, where $j = q_{\min}+1, \ldots, q_{\max}+1$ are the moving-average test orders, and where $\hat{\phi}_i^{(m,j)}$ are the autoregressive parameter estimates under the assumption that the series is an ARMA($m$,$j$) process.
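As a concrete illustration, the filtering step can be sketched in Python. This is a hypothetical helper, not part of PROC ARIMA: given the raw series and a vector of autoregressive estimates $\hat{\phi}_1, \ldots, \hat{\phi}_m$, it mean-corrects the series and returns the filtered series $w_t^{(m,j)}$.

```python
import numpy as np

def filtered_series(z, phi):
    """Apply the AR filter to the mean-corrected series:
    w_t = ztilde_t - sum_{i=1}^m phi_i * ztilde_{t-i}.

    `phi` holds (hypothetical) estimates phi_1, ..., phi_m for an
    ARMA(m, j) fit; the first m observations are lost to the lags.
    """
    zt = np.asarray(z, dtype=float) - np.mean(z)  # mean-corrected series
    m = len(phi)
    w = zt[m:].copy()
    for i, p in enumerate(phi, start=1):
        # subtract phi_i * ztilde_{t-i}, aligned so w[k] covers t = m + k
        w -= p * zt[m - i : len(zt) - i]
    return w
```

With `phi = [1.0]` the filter reduces to first differencing of the mean-corrected series, which is a quick sanity check on the alignment.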

For purely autoregressive models ($j = 0$), ordinary least squares (OLS) is used to consistently estimate $\hat{\phi}_i^{(m,0)}$. For ARMA models, consistent estimates are obtained by the iterated least squares recursion formula, which is initiated by the pure autoregressive estimates:

\[
\hat{\phi}_i^{(m,j)}
= \hat{\phi}_i^{(m+1,j-1)}
- \hat{\phi}_{i-1}^{(m,j-1)}
\frac{\hat{\phi}_{m+1}^{(m+1,j-1)}}{\hat{\phi}_m^{(m,j-1)}}
\]
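A single step of this recursion can be sketched as follows. This is an illustrative helper, not the PROC ARIMA implementation; in particular, the boundary value $\hat{\phi}_0^{(m,j-1)}$ needed when $i = 1$ is taken here as 1, which is an assumed convention rather than something the text fixes.

```python
import numpy as np

def esacf_recursion_step(phi_hi, phi_lo):
    """One step of the iterated least squares recursion:
    phi_i^(m,j) = phi_i^(m+1,j-1) - phi_{i-1}^(m,j-1) * ratio,
    where ratio = phi_{m+1}^(m+1,j-1) / phi_m^(m,j-1).

    phi_hi: estimates phi^(m+1, j-1), length m+1
    phi_lo: estimates phi^(m, j-1),   length m
    Returns phi^(m, j), length m.
    """
    m = len(phi_lo)
    ratio = phi_hi[m] / phi_lo[m - 1]  # phi_{m+1}^(m+1,j-1) / phi_m^(m,j-1)
    phi = np.empty(m)
    for i in range(1, m + 1):
        # phi_0^(m,j-1) is taken as 1.0 -- an assumed boundary convention
        phi_prev = phi_lo[i - 2] if i > 1 else 1.0
        phi[i - 1] = phi_hi[i - 1] - phi_prev * ratio
    return phi
```

Starting from the OLS estimates at $j = 0$, applying this step repeatedly builds up the $\hat{\phi}_i^{(m,j)}$ for each moving-average test order.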

The $j$th lag of the sample autocorrelation function of the filtered series $w_t^{(m,j)}$ is the extended sample autocorrelation function, and it is denoted as $r_{j(m)} = r_j(w^{(m,j)})$.

The standard errors of $r_{j(m)}$ are computed in the usual way using Bartlett's approximation of the variance of the sample autocorrelation function, $\mathrm{var}(r_{j(m)}) \approx (1 + \sum_{t=1}^{j-1} r_t^2(w^{(m,j)}))$.

If the true model is an ARMA($p+d$, $q$) process, the filtered series $w_t^{(m,j)}$ follows an MA($q$) model for $j \geq q$, so that

\[
r_{j(p+d)} \approx 0, \quad j > q
\]
\[
r_{j(p+d)} \neq 0, \quad j = q
\]

Additionally, Tsay and Tiao (1984) show that the extended sample autocorrelation function satisfies

\[
r_{j(m)} \approx 0, \quad j - q > m - p - d \geq 0
\]
\[
r_{j(m)} \approx c(m-p-d,\, j-q) \neq 0, \quad 0 \leq j - q \leq m - p - d
\]

where $c(m-p-d,\, j-q)$ is a nonzero constant or a continuous random variable bounded by $-1$ and $1$.

An ESACF table is then constructed using the $r_{j(m)}$ for $m = p_{\min}, \ldots, p_{\max}$ and $j = q_{\min}+1, \ldots, q_{\max}+1$ to identify the ARMA orders (see Table 7.3). The orders are tentatively identified by finding a right (maximal) triangular pattern with vertices located at $(p+d, q)$ and $(p+d, q_{\max})$ and in which all elements are insignificant (based on asymptotic normality of the autocorrelation function). The vertex $(p+d, q)$ identifies the order. Table 7.4 depicts the theoretical pattern associated with an ARMA(1,2) series.

Table 7.3: ESACF Table

            MA
  AR     0        1        2        3      ...
   0   r1(0)    r2(0)    r3(0)    r4(0)    ...
   1   r1(1)    r2(1)    r3(1)    r4(1)    ...
   2   r1(2)    r2(2)    r3(2)    r4(2)    ...
   3   r1(3)    r2(3)    r3(3)    r4(3)    ...
   .     .        .        .        .
   .     .        .        .        .

Table 7.4: Theoretical ESACF Table for an ARMA(1,2) Series

            MA
  AR   0  1  2  3  4  5  6  7
   0   *  X  X  X  X  X  X  X
   1   *  X  0  0  0  0  0  0
   2   *  X  X  0  0  0  0  0
   3   *  X  X  X  0  0  0  0
   4   *  X  X  X  X  0  0  0
 X = significant terms
 0 = insignificant terms
 * = no pattern
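The triangle search described above can be sketched programmatically. Given a significance table (rows indexed by AR test order $m$, columns by MA test order $j$, entries marking whether $r_{j(m)}$ is significant), scan for the first vertex whose right triangle of entries is entirely insignificant. This is an illustrative sketch of the pattern-matching idea, not the PROC ARIMA implementation.

```python
def esacf_vertex(sig):
    """Locate the vertex (p+d, q) of the maximal right-triangular pattern
    of insignificant entries in an ESACF significance table.

    sig: list of rows; sig[m][j] is True when the (AR=m, MA=j) entry is
    significant.  A triangle with vertex (p, q) requires sig[m][j] to be
    False for every m >= p and every j >= q + (m - p), matching the
    zero pattern of Table 7.4.  Returns (p+d, q) or None.
    """
    n_ar, n_ma = len(sig), len(sig[0])
    for p in range(n_ar):
        for q in range(n_ma):
            triangle_clear = all(
                not sig[m][j]
                for m in range(p, n_ar)
                for j in range(q + (m - p), n_ma)
            )
            if triangle_clear:
                return p, q
    return None

# Table 7.4 for an ARMA(1,2) series: '*' and 'X' significant, '0' not.
rows = ["*XXXXXXX",
        "*X000000",
        "*XX00000",
        "*XXX0000",
        "*XXXX000"]
sig = [[c != "0" for c in row] for row in rows]
print(esacf_vertex(sig))  # -> (1, 2): vertex at p+d = 1, q = 2
```

Applied to the theoretical pattern of Table 7.4, the scan recovers the vertex $(p+d, q) = (1, 2)$, tentatively identifying the ARMA(1,2) orders.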


Copyright © 1999 by SAS Institute Inc., Cary, NC, USA. All rights reserved.