Abstract
In this paper, we investigate the model-selection performance of a bootstrapped version of the Akaike information criterion for nonlinear self-exciting threshold autoregressive-type data-generating processes. Empirical results are obtained via Monte Carlo simulations. The quality of our method is assessed by comparison with its non-bootstrap counterpart and with a novel procedure based on artificial neural networks.
Notes
1. For an overview of the bootstrap for time-dependent data, see [12,46,47]. In particular, Härdle et al. [47] deal with the higher-order performance of the different schemes.
2. The expected block size is a function of three parameters, where R(ξ) denotes the autocovariance function estimated at the lags ξ = 1, 2, …, Ξ. The estimation of these parameters involves the choice of a spectral bandwidth, say Ψ. Politis and Romano [48] suggest taking the smallest integer, say ψˆ, after which the correlogram shows correlations not significantly different from 0, that is, R(ξ) ≈ 0 for ξ > ψˆ. Having determined ψˆ, the recommendation is to take Ψ = 2ψˆ.
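The correlogram-based rule above can be sketched in code. The snippet below is a minimal illustration, not the authors' implementation: it estimates sample autocorrelations, finds the smallest lag ψˆ beyond which all estimates fall inside the approximate ±2/√n confidence band (our choice of significance criterion; the note does not specify one), and returns Ψ = 2ψˆ. All function names are ours.

```python
import numpy as np

def smallest_negligible_lag(x, max_lag=None):
    """Sketch of the Politis-Romano rule: the smallest lag psi_hat after
    which sample autocorrelations are not significantly different from 0,
    judged against the approximate +/- 2/sqrt(n) band (an assumption)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    if max_lag is None:
        max_lag = int(np.sqrt(n))  # arbitrary default horizon Xi
    xc = x - x.mean()
    denom = np.dot(xc, xc)
    # sample autocorrelations R(1), ..., R(max_lag)
    acf = np.array([np.dot(xc[: n - k], xc[k:]) / denom
                    for k in range(1, max_lag + 1)])
    band = 2.0 / np.sqrt(n)
    for psi in range(0, max_lag):
        # acf[psi:] covers the lags xi > psi (acf is 0-indexed, lags 1-indexed)
        if np.all(np.abs(acf[psi:]) < band):
            return psi
    return max_lag

def spectral_bandwidth(x):
    """Recommended bandwidth: Psi = 2 * psi_hat."""
    return 2 * smallest_negligible_lag(x)
```

For a strongly autocorrelated series (e.g. an AR(1) with coefficient 0.9), ψˆ comes out well above 0, while for near-white noise it is small.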
3. All the simulations have been implemented in the R software (version 8.1) and run on hardware resources of the University of California, San Diego. In particular, we made use of the EULER computer (maintained by the Mathematics Department) and the IBM TERAGRID supercomputer.
4. The number of bootstrap replications has been set to 125, as empirical evidence showed this to be the best compromise between the performance of the method and computational time.
5. The Matlab code is available from the author's website: http://fmg.lse.ac.uk/patton/code.html.
6. The percentage gain is computed as 100·[f*(p1, p2) − f(p1, p2)]/f(p1, p2), where f*(p1, p2) denotes the frequency with which the correct model is selected in the bootstrap world.
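The percentage-gain formula in note 6 is a relative improvement of the bootstrap selection frequency over its non-bootstrap counterpart; a one-line sketch (function name and sample frequencies are illustrative, not from the paper):

```python
def percentage_gain(f_boot, f_plain):
    """Percentage gain of the bootstrap selection frequency f*(p1, p2)
    over its non-bootstrap counterpart f(p1, p2), as in note 6."""
    return 100.0 * (f_boot - f_plain) / f_plain

# Hypothetical example: the bootstrap criterion selects the true model
# 72% of the time versus 60% for the plain criterion, a 20% gain.
gain = percentage_gain(0.72, 0.60)
```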