
A Confidence Region Approach to Tuning for Variable Selection

Pages 295-314 | Received 01 Oct 2010, Published online: 14 Jun 2012

References

  • Akaike, H. (1973), “Information Theory and an Extension of the Maximum Likelihood Principle,” in Second International Symposium on Information Theory, eds. B. Petrov and F. Csaki, Budapest: Akademiai Kiado.
  • Bondell, H.D., and Reich, B.J. (2008), “Simultaneous Regression Shrinkage, Variable Selection and Clustering of Predictors With OSCAR,” Biometrics, 64, 115–123.
  • Breiman, L., and Spector, P. (1992), “Submodel Selection and Evaluation in Regression. The X-Random Case,” International Statistical Review, 60, 291–319.
  • Candes, E., and Tao, T. (2007), “The Dantzig Selector: Statistical Estimation When p is Much Larger Than n,” The Annals of Statistics, 35, 2313–2351.
  • Fan, J., and Li, R. (2001), “Variable Selection via Nonconcave Penalized Likelihood and Its Oracle Properties,” Journal of the American Statistical Association, 96, 1348–1360.
  • Fan, J., and Lv, J. (2008), “Sure Independence Screening for Ultra-High Dimensional Feature Space” (with Discussion), Journal of the Royal Statistical Society, Series B, 70, 849–911.
  • Goeman, J. J. (2007), “Penalized: L1 (lasso) and L2 (ridge) Penalized Estimation in GLMs and in the Cox Model.” Available at http://cran.r-project.org/web/packages/penalized/index.html.
  • Hastie, T., Tibshirani, R., and Friedman, J. (2001), The Elements of Statistical Learning: Data Mining, Inference and Prediction, New York: Springer.
  • James, G., Radchenko, P., and Lv, J. (2009), “DASSO: Connections Between the Dantzig Selector and Lasso,” Journal of the Royal Statistical Society, Series B, 71, 127–142.
  • Konishi, S., and Kitagawa, G. (1996), “Generalised Information Criteria in Model Selection,” Biometrika, 83, 875–890.
  • Leng, C., Li, Y., and Wahba, G. (2006), “A Note on the Lasso and Related Procedures in Model Selection,” Statistica Sinica, 16, 1273–1284.
  • Mallows, C.L. (1973), “Some Comments on Cp,” Technometrics, 15, 661–675.
  • Meinshausen, N. (2007), “Relaxed Lasso,” Computational Statistics and Data Analysis, 52, 374–393.
  • Meinshausen, N., and Buhlmann, P. (2006), “High-Dimensional Graphs and Variable Selection With the Lasso,” The Annals of Statistics, 34, 1436–1462.
  • Nishii, R. (1984), “Asymptotic Properties of Criteria for Selection of Variables in Multiple Regression,” The Annals of Statistics, 12, 758–765.
  • Park, M. Y., and Hastie, T. (2007), “L1 Regularization Path for Generalized Linear Models and Cox Proportional Hazards Model.” Available at http://cran.r-project.org/web/packages/glmpath/glmpath.pdf.
  • Park, T., and Casella, G. (2008), “The Bayesian Lasso,” Journal of the American Statistical Association, 103, 681–686.
  • Schwarz, G. (1978), “Estimating the Dimension of a Model,” The Annals of Statistics, 6, 461–464.
  • Shao, J. (1997), “An Asymptotic Theory for Linear Model Selection,” Statistica Sinica, 7, 221–264.
  • Shibata, R. (1981), “An Optimal Selection of Regression Variables,” Biometrika, 68, 45–54.
  • Stone, M. (1977), “An Asymptotic Equivalence of Choice of Model by Cross-Validation and Akaike’s Criterion,” Journal of the Royal Statistical Society, Series B, 39, 44–47.
  • Tibshirani, R. (1996), “Regression Shrinkage and Selection via the Lasso,” Journal of the Royal Statistical Society, Series B, 58, 267–288.
  • Wainwright, M.J. (2009), “Information-Theoretic Limitations on Sparsity Recovery in the High-Dimensional and Noisy Setting,” IEEE Transactions on Information Theory, 55, 5728–5741.
  • Wang, H. (2009), “Forward Regression for Ultra-High Dimensional Variable Screening,” Journal of the American Statistical Association, 104, 1512–1524.
  • Wang, H., and Leng, C. (2007), “Unified LASSO Estimation by Least Squares Approximation,” Journal of the American Statistical Association, 102, 1039–1048.
  • Wang, H., Li, R., and Tsai, C.-L. (2007), “Tuning Parameter Selectors for the Smoothly Clipped Absolute Deviation Method,” Biometrika, 94, 553–568.
  • Zhao, P., and Yu, B. (2006), “On Model Selection Consistency of Lasso,” Journal of Machine Learning Research, 7, 2541–2563.
  • Zou, H. (2006), “The Adaptive Lasso and Its Oracle Properties,” Journal of the American Statistical Association, 101, 1418–1429.
  • Zou, H., and Hastie, T. (2005), “Regularization and Variable Selection via the Elastic Net,” Journal of the Royal Statistical Society, Series B, 67, 301–320.
