References
- Fan J, Fan Y. High dimensional classification using features annealed independence rules. Ann Stat. 2008;36(6):2605–2637. doi: 10.1214/07-AOS504
- Vapnik V. The nature of statistical learning theory. New York (NY): Springer; 1995.
- Friedman J, Hastie T, Tibshirani R. The elements of statistical learning. 2nd ed. New York (NY): Springer; 2009.
- Bradley PS, Mangasarian OL. Feature selection via concave minimization and support vector machines. In: Proceedings of the Fifteenth International Conference on Machine Learning; 1998. p. 82–90.
- Zhu J, Rosset S, Hastie T, et al. 1-norm support vector machines. Adv Neural Inf Process Syst. 2004;16:49–56.
- Wegkamp M, Yuan M. Support vector machines with a reject option. Bernoulli. 2011;17:1368–1385. doi: 10.3150/10-BEJ320
- Zou H, Yuan M. The F∞-norm support vector machine. Stat Sin. 2008;18:379–398.
- Wang L, Zhu J, Zou H. The doubly regularized support vector machine. Stat Sin. 2006;16:589–615.
- Zou H. An improved 1-norm SVM for simultaneous classification and variable selection. J Mach Learn Res. 2007;2:675–681.
- Zhang H, Ahn J, Lin X, et al. Gene selection using support vector machines with non-convex penalty. Bioinformatics. 2006;22:88–95. doi: 10.1093/bioinformatics/bti736
- Becker N, Toedt G, Lichter P, et al. Elastic SCAD as a novel penalization method for SVM classification tasks in high-dimensional data. BMC Bioinform. 2011;12(1):1–13. doi: 10.1186/1471-2105-12-138
- Park C, Kim K-R, Myung R, et al. Oracle properties of SCAD-penalized support vector machines. J Stat Plan Inference. 2012;142:2257–2270. doi: 10.1016/j.jspi.2012.03.002
- Zhang X, Wu Y, Wang L, et al. Variable selection for support vector machines in moderately high-dimensions. J R Stat Soc B: Stat Methodol. 2016;78:53–76. doi: 10.1111/rssb.12100
- Liu X, Zhao B, He W. Simultaneous feature selection and classification for data-adaptive kernel-penalized SVM. Mathematics. 2020;8(10):1846. doi: 10.3390/math8101846
- Fan J, Lv J. Sure independence screening for ultrahigh-dimensional feature space. J R Stat Soc B: Stat Methodol. 2008;70(5):849–911. doi: 10.1111/j.1467-9868.2008.00674.x
- Fan J, Song R. Sure independence screening in generalized linear models with NP-dimensionality. Ann Stat. 2010;38(6):3567–3604.
- Fan J, Samworth R, Wu Y. Ultrahigh dimensional feature selection: beyond the linear model. J Mach Learn Res. 2009;10:2013–2038.
- Fan J, Feng Y, Song R. Nonparametric independence screening in sparse ultrahigh-dimensional additive models. J Am Stat Assoc. 2011;106:544–557. doi: 10.1198/jasa.2011.tm09779
- Kazemi M, Shahsavani D, Arashi M. A sure independence screening procedure for ultra-high dimensional partially linear additive models. J Appl Stat. 2019;46(8):1385–1403. doi: 10.1080/02664763.2018.1548583
- Kazemi M, Shahsavani D, Arashi M. Variable selection and structure identification for ultrahigh-dimensional partially linear additive models with application to cardiomyopathy microarray data. Stat Optim Inf Comput. 2018;6(3):373–382.
- Fan J, Ma YB, Dai W. Nonparametric independence screening in sparse ultra-high-dimensional varying coefficient models. J Am Stat Assoc. 2014;109:1270–1284. doi: 10.1080/01621459.2013.879828
- Tibshirani R. Regression shrinkage and selection via the lasso. J R Stat Soc B: Stat Methodol. 1996;58:267–288.
- Fan J, Li R. Variable selection via nonconcave penalized likelihood and its oracle properties. J Am Stat Assoc. 2001;96:1348–1360. doi: 10.1198/016214501753382273
- Zou H, Hastie T. Regularization and variable selection via the elastic net. J R Stat Soc B: Stat Methodol. 2005;67(2):301–320. doi: 10.1111/j.1467-9868.2005.00503.x
- Zou H. The adaptive lasso and its oracle properties. J Am Stat Assoc. 2006;101:1418–1429. doi: 10.1198/016214506000000735
- Froehlich H, Zell A. Efficient parameter selection for support vector machines in classification and regression via model-based global optimization. Proc Int Joint Conf Neural Networks. 2005;3:1431–1438.
- Zhu LP, Li L, Li R, et al. Model-free feature screening for ultrahigh-dimensional data. J Am Stat Assoc. 2011;106(496):1464–1475. doi: 10.1198/jasa.2011.tm10563
- Li R, Zhong W, Zhu L. Feature screening via distance correlation learning. J Am Stat Assoc. 2012;107(499):1129–1139. doi: 10.1080/01621459.2012.695654
- Liu J, Li R, Wu R. Feature selection for varying coefficient models with ultrahigh-dimensional covariates. J Am Stat Assoc. 2014;109(505):266–274. doi: 10.1080/01621459.2013.850086
- Mai Q, Zou H. The Kolmogorov filter for variable screening in high-dimensional binary classification. Biometrika. 2013;100(1):229–234. doi: 10.1093/biomet/ass062
- Cui H, Li R, Zhong W. Model-free feature screening for ultrahigh-dimensional discriminant analysis. J Am Stat Assoc. 2015;110(510):630–641. doi: 10.1080/01621459.2014.920256
- Cheng X, Wang H. A generic model-free feature screening procedure for ultra-high dimensional data with categorical response. Comput Methods Programs Biomed. 2022;229:107269.
- Pan R, Wang H, Li R. Ultrahigh-dimensional multiclass linear discriminant analysis by pairwise sure independence screening. J Am Stat Assoc. 2016;111(513):169–179.
- Cheng X, Wang H, Zhu L, Zhong W, Zhou H. MFSIS: model-free sure independent screening procedures. R package version 0.2.0; 2022. Available from: https://CRAN.R-project.org/package=MFSIS
- Golub TR, Slonim DK, Tamayo P, et al. Molecular classification of cancer: class discovery and class prediction by gene expression monitoring. Science. 1999;286:531–537. doi: 10.1126/science.286.5439.531