Original Articles

LASSO order selection for sparse autoregression: a bootstrap approach

Pages 2668-2688 | Received 03 May 2016, Accepted 09 Jun 2017, Published online: 02 Jul 2017
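The full text is not reproduced on this page, but the title summarizes the approach: fit a long autoregression under a lasso penalty so that only a few lags keep nonzero coefficients, and read the model order off the largest surviving lag. As a purely illustrative sketch of that idea (not the authors' implementation; the function name, penalty level, and simulated process are all assumptions), a lasso-penalized AR fit by coordinate descent might look like:

```python
import random

def lasso_ar_order(y, max_lag=8, lam=0.05, iters=50):
    """Illustrative sketch: fit a lasso-penalized AR(max_lag) model by
    coordinate descent, then report the largest lag with a nonzero
    coefficient as the selected order."""
    n = len(y) - max_lag
    # Column j of the lagged design matrix holds y_{t-(j+1)}.
    X = [[y[t - j - 1] for j in range(max_lag)] for t in range(max_lag, len(y))]
    target = y[max_lag:]
    beta = [0.0] * max_lag
    col_sq = [sum(X[i][j] ** 2 for i in range(n)) for j in range(max_lag)]
    for _ in range(iters):
        for j in range(max_lag):
            # Partial residuals with coordinate j excluded.
            r = [target[i] - sum(beta[k] * X[i][k]
                                 for k in range(max_lag) if k != j)
                 for i in range(n)]
            rho = sum(X[i][j] * r[i] for i in range(n))
            # Soft-thresholding update (lasso penalty scaled by sample size).
            if rho > lam * n:
                beta[j] = (rho - lam * n) / col_sq[j]
            elif rho < -lam * n:
                beta[j] = (rho + lam * n) / col_sq[j]
            else:
                beta[j] = 0.0
    nonzero = [j + 1 for j, b in enumerate(beta) if abs(b) > 1e-8]
    return (max(nonzero) if nonzero else 0), beta

# Simulate a sparse AR series with active lags 1 and 4 (coefficients 0.5, 0.3).
random.seed(0)
y = [random.gauss(0.0, 1.0) for _ in range(4)]
for _ in range(400):
    y.append(0.5 * y[-1] + 0.3 * y[-4] + random.gauss(0.0, 0.5))

order, beta = lasso_ar_order(y, max_lag=8, lam=0.05)
print("selected order:", order)
```

The bootstrap ingredient of the paper would then repeat this selection over resampled series to stabilize the chosen order; that refinement is omitted from this sketch.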

References

  • Sun Y. Modeling long-memory time series with sparse autoregressive processes. J Uncertain Syst. 2012;6(4):289–298.
  • Sklarz MA, Miller NG, Gersch W. Forecasting using long-order autoregressive processes: an example using housing starts. Real Estate Econ. 1987;15(4):374–388. doi: 10.1111/1540-6229.00438
  • Feng X, Seto M, Katsuyama K. Neural dynamic modelling on earthquake magnitude series. Geophys J Int. 1997;128(3):547–556.
  • Erer I, Sarikaya K, Bozkurt H. Enhanced radar imaging via sparsity regularized 2D linear prediction. In: Signal Processing Conference (EUSIPCO), 2014 Proceedings of the 22nd European. IEEE; 2014. p. 1751–1755.
  • Silverman BW, Johnstone I. EbayesThresh: R programs for empirical Bayes thresholding. J Stat Softw. 2005;12(8).
  • Malioutov D, Çetin M, Willsky AS. A sparse signal reconstruction perspective for source localization with sensor arrays. IEEE Trans Signal Process. 2005;53(8):3010–3022. doi: 10.1109/TSP.2005.850882
  • Niedźwiecki M, Ciołek M. Elimination of clicks from archive speech signals using sparse autoregressive modeling. In: 2012 Proceedings of the 20th European signal processing conference (EUSIPCO), Trivandrum, India.
  • Deng H. Adaptive algorithms for sparse impulse response identification. Washington, DC: George Washington University; 2005.
  • Cotter SF, Rao BD. Sparse channel estimation via matching pursuit with application to equalization. IEEE Trans Commun. 2002;50(3):374–377. doi: 10.1109/26.990897
  • Forster MR. Key concepts in model selection: performance and generalizability. J Math Psychol. 2000;44(1):205–231. doi: 10.1006/jmps.1999.1284
  • Akaike H. A new look at the statistical model identification. IEEE Trans Autom Control. 1974;19(6):716–723. doi: 10.1109/TAC.1974.1100705
  • Joyce JM. Kullback–Leibler divergence. In: International Encyclopedia of statistical science. Berlin: Springer; 2011. p. 720–722.
  • Hsiao C. Autoregressive modeling and causal ordering of economic variables. Journal of Economic Dynamics and Control. 1982;4:243–259. doi: 10.1016/0165-1889(82)90015-X
  • Andersen TG. Stochastic autoregressive volatility: a framework for volatility modeling. Math Financ. 1994;4(2):75–102.
  • Toulemonde G, Guillou A, Naveau P, et al. Autoregressive models for maxima and their applications to CH4 and N2O. Environmetrics. 2010;21(2):189–207.
  • Besse PC, Cardot H, Stephenson DB. Autoregressive forecasting of some functional climatic variations. Scand J Stat. 2000;27(4):673–687.
  • Kishida K. Autoregressive model analysis and decay ratio. Ann Nucl Energy. 1990;17(3):157–160. doi: 10.1016/0306-4549(90)90094-T
  • Haimov S, Michalev M, Savchenko A, et al. Classification of radar signatures by autoregressive model fitting and cluster analysis. IEEE Trans Geosci Remote Sens. 1989;27(5):606–610.
  • Steketee JA. Some geophysical applications of the elasticity theory of dislocations. Can J Phys. 1958;36(9):1168–1198. doi: 10.1139/p58-123
  • Rabiner LR, Schafer RW. Digital speech processing. The Froehlich/Kent Encycl Telecommun. 2011;6:237–258.
  • Vaz F, Guedes de Oliveira P, Principe J. A study on the best order for autoregressive EEG modelling. Int J BioMed Comput. 1987;20(1–2):41–50.
  • Pinna GD, Maestri R, Di Cesare A. On the use of the Akaike information criterion in AR spectral analysis of cardiovascular variability signals: a case report study. In: Computers in cardiology, Proceedings. IEEE; 1993. p. 471–474.
  • Franses PH, Van Dijk D. The forecasting performance of various models for seasonality and nonlinearity for quarterly industrial production. Int J Forecasting. 2005;21(1):87–102.
  • Box GEP, Jenkins GM, Reinsel GC, et al. Time series analysis: forecasting and control. Wiley; 2015.
  • Burnham KP, Anderson DR. Model selection and multimodel inference: a practical information-theoretic approach. 2nd ed. New York: Springer-Verlag; 2002.
  • Mallows CL. Some comments on Cp. Technometrics. 1973;15(4):661–675.
  • Kass RE, Raftery AE. Bayes factors. J Am Stat Assoc. 1995;90(430):773–795. doi: 10.1080/01621459.1995.10476572
  • Hannan EJ, Quinn BG. The determination of the order of an autoregression. J R Statist Soc Ser B (Methodol). 1979;41(2):190–195.
  • Akaike H. Fitting autoregressive models for prediction. Ann Inst Stat Math. 1969;21(1):243–247. doi: 10.1007/BF02532251
  • Broersen PMT. Automatic spectral analysis with time series models. IEEE Trans Instrum Meas. 2002;51(2):211–216. doi: 10.1109/19.997814
  • Billings SA, Wei HL. An adaptive orthogonal search algorithm for model subset selection and non-linear system identification. Int J Control. 2008;81(5):714–724.
  • Fenga L, Politis DN. Bootstrap-based ARMA order selection. J Stat Comput Simul. 2011;81(7):799–814. doi: 10.1080/00949650903484166
  • Acosta-González E, Fernández-Rodríguez F. Model selection via genetic algorithms illustrated with cross-country growth data. Empir Econ. 2007;33(2):313–337. doi: 10.1007/s00181-006-0104-3
  • Bozdogan H. Akaike's information criterion and recent developments in information complexity. J Math Psychol. 2000;44(1):62–91. doi: 10.1006/jmps.1999.1277
  • Li R, Cui H. Variable selection via regularization. Hoboken, NJ: Wiley Online Library; 1985.
  • Korenberg MJ, Paarmann LD. Orthogonal approaches to time-series analysis and system identification. IEEE Signal Process Mag. 1991;8(3):29–43. doi: 10.1109/79.127999
  • Korenberg MJ, Paarmann LD. An orthogonal ARMA identifier with automatic order estimation for biological modeling. Ann Biomed Eng. 1989;17(6):571–592. doi: 10.1007/BF02367464
  • Pötscher BM. Model selection under nonstationarity: autoregressive models and stochastic linear regression models. Ann Stat. 1989;17(3):1257–1274. doi: 10.1214/aos/1176347267
  • Hurvich CM, Tsai C-L. Regression and time series model selection in small samples. Biometrika. 1989;76(2):297–307. doi: 10.1093/biomet/76.2.297
  • Bhansali RJ, Downham DY. Some properties of the order of an autoregressive model selected by a generalization of Akaike's FPE criterion. Biometrika. 1977;64(3):547–551.
  • Shibata R. Selection of the number of regression variables; a minimax choice of generalized fpe. Ann Inst Stat Math. 1986;38(1):459–474. doi: 10.1007/BF02482533
  • Venter JH, Steel SJ. Some contributions to selection and estimation in the normal linear model. Ann Inst Stat Math. 1992;44(2):281–297. doi: 10.1007/BF00058641
  • Claeskens G, Hjort NL. Model selection and model averaging, Vol. 330. Cambridge: Cambridge University Press; 2008.
  • Konishi S, Kitagawa G. Generalised information criteria in model selection. Biometrika. 1996;83(4):875–890. doi: 10.1093/biomet/83.4.875
  • Nishii R. Maximum likelihood principle and model selection when the true model is unspecified. J Multivar Anal. 1988;27(2):392–403. doi: 10.1016/0047-259X(88)90137-6
  • Woodroofe M. On model selection and the arc sine laws. Ann Stat. 1982;10(4):1182–1194. doi: 10.1214/aos/1176345983
  • Bozdogan H. Model selection and Akaike's information criterion (AIC): the general theory and its analytical extensions. Psychometrika. 1987;52(3):345–370.
  • McLeod AI, Zhang Y. Improved subset autoregression: with R package. J Stat Softw. 2008;28(2):1–28.
  • Nardi Y, Rinaldo A. Autoregressive process modeling via the lasso procedure. J Multivar Anal. 2011;102(3):528–549. doi: 10.1016/j.jmva.2010.10.012
  • Schmidt DF, Makalic E. Estimation of stationary autoregressive models with the Bayesian lasso. J Time Ser Anal. 2013;34(5):517–531. doi: 10.1111/jtsa.12027
  • Danforth DG. An empirical investigation of sparse distributed memory using discrete speech recognition. Research Institute for Advanced Computer Science. NASA Ames Research Center; 1990.
  • Fan K-C, Wang Y-K. A genetic sparse distributed memory approach to the application of handwritten character recognition. Pattern Recogn. 1997;30(12):2015–2022. doi: 10.1016/S0031-3203(97)00017-4
  • Huang J, Ma S, Zhang C-H. Adaptive lasso for sparse high-dimensional regression models. Stat Sin. 2008;18(4):1603.
  • Akaike H. Prediction and entropy. Springer; 1998.
  • Ozaki T. On the order determination of ARIMA models. Appl Stat. 1977;26(3):290–301. doi: 10.2307/2346970
  • Kreiss J-P. Bootstrap procedures for AR(∞) processes. Springer; 1992.
  • Breiman L. The little bootstrap and other methods for dimensionality selection in regression: X-fixed prediction error. J Am Stat Assoc. 1992;87(419):738–754. doi: 10.1080/01621459.1992.10475276
  • Zhang P. Inference after variable selection in linear regression models. Biometrika. 1992;79(4):741–746. doi: 10.1093/biomet/79.4.741
  • Efron B, Tibshirani R. Bootstrap methods for standard errors, confidence intervals, and other measures of statistical accuracy. Stat Sci. 1986;1(1):54–75. doi: 10.1214/ss/1177013815
  • James G, Witten D, Hastie T, et al. An introduction to statistical learning. Springer; 2013.
  • Tibshirani R. Regression shrinkage and selection via the lasso. J R Statist Soc Ser B (Methodol). 1996;58(1):267–288.
  • Efron B, Hastie T, Johnstone I, et al. Least angle regression. Ann Stat. 2004;32(2):407–499. doi: 10.1214/009053604000000067
  • Hesterberg T, Choi NH, Meier L, et al. Least angle and l1 penalized regression: a review. Stat Surv. 2008;2:61–93. doi: 10.1214/08-SS035
