Abstract
We investigate nonparametric additive regression estimation under random design and long-memory errors, and construct adaptive thresholding estimators based on wavelet series. The proposed approach achieves asymptotically near-optimal convergence rates when the unknown function and its univariate additive components belong to a Besov space. We consider the problem under two noise structures: (1) homoskedastic Gaussian long-memory errors and (2) heteroskedastic Gaussian long-memory errors. In the homoskedastic long-memory case, the estimator is fully adaptive with respect to the long-memory parameter. In the heteroskedastic long-memory case, the estimator may not be adaptive with respect to the long-memory parameter unless the heteroskedasticity is of polynomial form. In either case, the convergence rates depend on the long-memory parameter only when the long memory is sufficiently strong; otherwise, the rates are identical to those under i.i.d. errors. In addition, the convergence rates are free from the curse of dimensionality.
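To illustrate the core mechanism behind wavelet thresholding estimators of this kind, the following is a minimal sketch, not the paper's actual estimator: it applies a Haar wavelet decomposition and hard-thresholds the detail coefficients at the universal threshold sigma*sqrt(2 log n). The function names (`haar_dwt`, `haar_idwt`, `hard_threshold_estimate`) and the use of the Haar basis with a known noise level `sigma` are illustrative assumptions; the paper's adaptive estimators handle additive components, random design, and long-memory noise, which this toy version does not.

```python
import numpy as np

def haar_dwt(x):
    # Full orthonormal Haar decomposition of a length-2^J signal.
    # Returns [finest details, ..., coarsest details, approximation].
    coeffs = []
    approx = np.asarray(x, dtype=float)
    while len(approx) > 1:
        even, odd = approx[0::2], approx[1::2]
        coeffs.append((even - odd) / np.sqrt(2))   # detail coefficients
        approx = (even + odd) / np.sqrt(2)          # approximation coefficients
    coeffs.append(approx)
    return coeffs

def haar_idwt(coeffs):
    # Invert haar_dwt: rebuild from coarsest level back to the signal.
    approx = coeffs[-1]
    for detail in reversed(coeffs[:-1]):
        even = (approx + detail) / np.sqrt(2)
        odd = (approx - detail) / np.sqrt(2)
        out = np.empty(2 * len(approx))
        out[0::2], out[1::2] = even, odd
        approx = out
    return approx

def hard_threshold_estimate(y, sigma):
    # Toy wavelet denoiser: hard-threshold detail coefficients at the
    # universal threshold sigma * sqrt(2 log n) (illustrative choice only;
    # the adaptive thresholds in the paper depend on the long-memory parameter).
    n = len(y)
    lam = sigma * np.sqrt(2 * np.log(n))
    coeffs = haar_dwt(y)
    kept = [np.where(np.abs(d) > lam, d, 0.0) for d in coeffs[:-1]]
    kept.append(coeffs[-1])  # always keep the coarse approximation
    return haar_idwt(kept)
```

Under long-memory errors, the wavelet coefficients of the noise remain only weakly correlated, which is why thresholding rules of this shape extend to that setting; the threshold level, however, must be inflated when the long memory is strong.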
Acknowledgments
We thank the Editor, the Associate Editor, and an anonymous referee for their valuable comments and suggestions, which have led to an improved version of the paper. We also thank Dr. German A. Schnaidt Grez and Dr. Brani Vidakovic for sharing their MATLAB codes, on which we built our estimation algorithm.
Disclosure statement
No potential conflict of interest was reported by the author(s).