Research Article

Nonparametric discriminant analysis with network structures in predictor

Pages 3836-3861 | Received 28 Jun 2021, Accepted 28 May 2022, Published online: 16 Jun 2022
