Research Article

A new type of generalized information criterion for regularization parameter selection in penalized regression with application to treatment process data

Pages 488-512 | Received 12 Jul 2022, Accepted 13 Jun 2023, Published online: 17 Jul 2023

