Original Articles

Outlier-resistant high-dimensional regression modelling based on distribution-free outlier detection and tuning parameter selection

Pages 1799-1812 | Received 01 Aug 2016, Accepted 23 Jan 2017, Published online: 13 Feb 2017
