Abstract
In time series modeling, consistent criteria like the Bayesian Information Criterion (BIC) outperform loss-efficient criteria like the Akaike Information Criterion (AIC) in terms of prediction loss when the data are generated by a finite-order autoregressive process, and the reverse is true when the data are generated by an infinite-order autoregressive process. Since in practice we do not know the data-generating process, it is useful to have an adaptive criterion that behaves either as a consistent or as a loss-efficient criterion, whichever performs better. Here we derive such a criterion. Moreover, our criterion adapts to the effective sample size and is not sensitive to a priori determined maximum order limits.
Notes
Since in this article we assumed that the true model structure is unknown and that autoregressive processes can be used to approximate the model structures, least squares procedures were used to estimate the regression coefficients. As pointed out by Hamilton (1994), least squares procedures are preferable to maximum likelihood estimation (MLE) procedures for autoregressive processes. Moreover, in our simulations (in an R environment) we also experienced difficulties using MLE, particularly when the sample sizes were small relative to the order of the AR approximation of the model structure.
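The least squares estimation referred to above can be sketched as an ordinary regression of each observation on its own lagged values. The following is a minimal illustration, not the paper's own code; the function name and the AR(1) coefficient 0.6 are our choices.

```python
import numpy as np

def fit_ar_ols(x, p):
    """Estimate AR(p) coefficients by ordinary least squares.

    Regresses x[t] on its p lagged values (conditional least squares),
    which avoids the numerical difficulties exact MLE can run into when
    p is large relative to the sample size.
    """
    x = np.asarray(x, dtype=float)
    T = len(x)
    # Design matrix: row i holds the p most recent lags of x[p + i]
    X = np.column_stack([x[p - k - 1:T - k - 1] for k in range(p)])
    y = x[p:]
    coefs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coefs

# Example: recover the coefficient of a simulated AR(1) process
rng = np.random.default_rng(0)
e = rng.standard_normal(5000)
x = np.zeros(5000)
for t in range(1, 5000):
    x[t] = 0.6 * x[t - 1] + e[t]
print(fit_ar_ols(x, 1))  # close to [0.6]
```

With 5,000 observations the OLS estimate is within sampling error of the true coefficient, which is the sense in which least squares suffices for AR approximations here.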
For the simulated MA(1) and MA(2) structures, PMSE is estimated by averaging the squared one-step-ahead prediction errors generated by each of the 5,000 realizations, and p̂ is the order of the approximating model AR(p̂) minimizing the estimated PMSE. For the AR(1) and AR(2) structures, p̂ is the true order. The smallest PMSE is estimated by averaging the corresponding squared prediction errors.
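The Monte Carlo estimate of PMSE described above can be sketched as follows. This is an illustrative reconstruction, not the paper's code: the number of realizations, the sample size, the MA(1) coefficient 0.5, and the candidate-order cap p_max are all our own choices, and far fewer realizations are used here than the paper's 5,000.

```python
import numpy as np

def estimated_pmse(realizations, p_max):
    """Estimate PMSE(p) for p = 1..p_max by averaging squared
    one-step-ahead prediction errors over independent realizations.

    Each realization holds T + 1 observations: the first T are used to
    fit AR(p) by least squares, the last is the prediction target.
    Returns the PMSE estimates and the minimizing order p_hat.
    """
    errors = {p: [] for p in range(1, p_max + 1)}
    for x in realizations:
        x = np.asarray(x, dtype=float)
        train, target = x[:-1], x[-1]
        for p in range(1, p_max + 1):
            X = np.column_stack(
                [train[p - k - 1:-k - 1] for k in range(p)])
            y = train[p:]
            coefs, *_ = np.linalg.lstsq(X, y, rcond=None)
            pred = coefs @ train[-1:-p - 1:-1]  # p most recent values
            errors[p].append((target - pred) ** 2)
    pmse = {p: float(np.mean(e)) for p, e in errors.items()}
    p_hat = min(pmse, key=pmse.get)
    return pmse, p_hat

# Example: MA(1) realizations approximated by AR(p) models
rng = np.random.default_rng(1)
reals = []
for _ in range(200):
    e = rng.standard_normal(102)
    reals.append(e[1:] + 0.5 * e[:-1])  # MA(1) with theta = 0.5
pmse, p_hat = estimated_pmse(reals, p_max=6)
```

Since an invertible MA(1) process has an infinite-order AR representation, the minimizing order p̂ here reflects a bias-variance trade-off rather than a true finite order, which is exactly the setting in which loss-efficient criteria are favored.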
We chose a range of values for p* that would be commensurate with the sample size: for T = 250, since 10 log10(T) ≈ 23, we let p* range from 23 to T/6 ≈ 41; for T = 50, since 10 log10(T) ≈ 16, we let p* range from 16 to 0.4T = 20.
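The arithmetic behind these ranges can be checked directly. The helper below is ours, not the paper's; in particular, switching the upper-bound rule from T/6 to 0.4T below T = 100 is our reading of the two cases given in the text.

```python
import math

def p_star_range(T):
    """Range of maximum candidate orders p* used in the text:
    lower end 10*log10(T) (truncated), upper end T/6 for the larger
    sample and 0.4*T for the smaller one.  Illustrative helper only;
    the T >= 100 cutoff is an assumption, not from the paper.
    """
    lo = int(10 * math.log10(T))
    hi = T // 6 if T >= 100 else int(0.4 * T)
    return lo, hi

print(p_star_range(250))  # (23, 41)
print(p_star_range(50))   # (16, 20)
```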