
Choosing the optimal hybrid covariance estimators in adaptive elastic net regression models using information complexity

Pages 2983-2996 | Received 22 Apr 2019, Accepted 20 Jul 2019, Published online: 28 Jul 2019

References

  • Dettling M, Bühlmann P. Boosting for tumor classification with gene expression data. Bioinformatics. 2003;19(9):1061–1069. doi: 10.1093/bioinformatics/btf867
  • Lu Y, Han J. Cancer classification using gene expression data. Inf Syst. 2003;28:243–268. doi: 10.1016/S0306-4379(02)00072-8
  • Chiaromonte F, Martinelli J. Dimension reduction strategies for analyzing global gene expression data with a response. Math Biosci. 2002;176:123–144. doi: 10.1016/S0025-5564(01)00106-7
  • Chu F, Wang L. Application of support vector machines to cancer classification with microarray data. Int J Neural Syst. 2005;15(6):475–484. doi: 10.1142/S0129065705000396
  • Chen Y. Robust shrinkage estimation of high dimensional covariance matrices. IEEE Workshop on Sensor Array and Multichannel Signal Processing (SAM); 2010; Jerusalem, Israel.
  • Hall P, Marron JS, Neeman A. Geometric representation of high dimension, low sample size data. J R Stat Soc Ser B. 2005;67(3):427–444. doi: 10.1111/j.1467-9868.2005.00510.x
  • Liu W, Yuan K, Ye D. Reducing microarray data via non-negative matrix factorization for visualization and clustering analysis. J Biomed Inform. 2008;41:602–606. doi: 10.1016/j.jbi.2007.12.003
  • Nyamundanda G, Brennan L, Gormley IC. Probabilistic principal component analysis for metabolomic data. BMC Bioinform. 2010;11:571. doi: 10.1186/1471-2105-11-571
  • Rezghi M, Obulkasim A. Noise free principal component analysis: an efficient dimension reduction technique for high dimensional molecular data. Expert Syst Appl. 2014;41:7797–7804. doi: 10.1016/j.eswa.2014.06.024
  • Shi J, Luo Z. Non-linear dimensionality reduction of gene expression data for visualization and clustering analysis of cancer tissue samples. Comput Biol Med. 2010;40:723–732. doi: 10.1016/j.compbiomed.2010.06.007
  • Carreira-Perpiñán MÁ. Continuous latent variable models for dimensionality reduction and sequential data reconstruction [dissertation]. Sheffield: University of Sheffield; 2001.
  • Yuzbasi B, Arashi M. Double shrunken selection operator. Commun Stat Simul Comput. 2017:1–9. doi: 10.1080/03610918.2017.1395040
  • Ahmed S, Yuzbasi B. Big and complex data analysis: methodologies and applications. New York: Springer; 2017.
  • Zou H, Hastie T. Regularization and variable selection via the elastic net. J R Stat Soc Ser B. 2005;67(2):301–320. doi: 10.1111/j.1467-9868.2005.00503.x
  • Mohebbi S, Pamukcu E, Bozdogan H. A new data adaptive elastic net predictive model using hybridized smoothed covariance estimators with information complexity. J Stat Comput Simul. 2019;89(6):1060–1089. doi: 10.1080/00949655.2019.1576683
  • Shurygin A. The linear combination of the simplest discriminator and Fisher’s one. In: Applied statistics. Moscow: Nauka; 1983. p. 144–158.
  • Press SJ. Estimation of a normal covariance matrix. Santa Monica (CA): RAND Corporation; 1975. https://www.rand.org/pubs/papers/P5436.html
  • Chen MCF. Estimation of covariance matrices under a quadratic loss function. Research Report S-46, Department of Mathematics, SUNY at Albany (Island of Capri, Italy). 1976. p. 1–33.
  • Bozdogan H. Shrinkage covariance estimators; 2010. Unpublished Lecture Notes.
  • Chen Y, Wiesel A, Eldar YC, Hero AO. Shrinkage algorithms for MMSE covariance estimation. IEEE Trans Signal Process. 2010;58(10):5016–5029. doi: 10.1109/TSP.2010.2053029
  • Ledoit O, Wolf M. A well-conditioned estimator for large dimensional covariance matrices. J Multivar Anal. 2004;88(2):365–411. doi: 10.1016/S0047-259X(03)00096-4
  • Thomaz C. Maximum entropy covariance estimate for statistical pattern recognition [dissertation]. Imperial College, University of London, London; 2004.
  • Pamukcu E, Bozdogan H, Calik S. A novel hybrid dimension reduction technique for undersized high dimensional gene expression data sets using information complexity criterion for cancer classification. Comput Math Methods Med. 2015;2015:370640. doi: 10.1155/2015/370640
  • Bozdogan H, Pamukcu E. Novel dimension reduction techniques for high dimensional data using information complexity. In: Gupta A, Capponi A, editors. Optimization challenges in complex, networked, and risky systems. Maryland, USA: INFORMS; 2016. p. 140–170.
  • Tibshirani R. Regression shrinkage and selection via the lasso. J R Stat Soc Ser B. 1996;58:267–288.
  • Hoerl AE, Kennard RW. Ridge regression: biased estimation for nonorthogonal problems. Technometrics. 1970;12(1):55–67. doi: 10.1080/00401706.1970.10488634
  • Akaike H. Information theory and an extension of the maximum likelihood principle. In: Petrov BN, Csaki F, editors. Second International Symposium on Information Theory. Budapest: Akademiai Kiado; 1973. p. 267–281.
  • Schwarz G. Estimating the dimension of a model. Ann Stat. 1978;6:461–464.
  • Bozdogan H. Model selection and Akaike’s information criterion (AIC): the general theory and its analytical extensions. Psychometrika. 1987;52(3):345–370. doi: 10.1007/BF02294361
  • Van Emden MH. An analysis of complexity [Mathematical Centre Tracts 35]. Amsterdam: Mathematisch Centrum; 1971.
  • Bozdogan H, Haughton DM. Information complexity criteria for regression models. Comput Stat Data Anal. 1998;28:51–76. doi: 10.1016/S0167-9473(98)00025-5
