References
- Carvalho CM, Chang J, Lucas JE, et al. High-dimensional sparse factor modeling: applications in gene expression genomics. J Amer Statist Assoc. 2008;103(484):1438–1456. doi: 10.1198/016214508000000869
- Hoerl AE, Kennard RW. Ridge regression: biased estimation for nonorthogonal problems. Technometrics. 1970;12(1):55–67. doi: 10.1080/00401706.1970.10488634
- Tibshirani R. Regression shrinkage and selection via the lasso. J R Stat Soc Ser B (Methodol). 1996;58(1):267–288.
- Efron B, Hastie T, Johnstone I, et al. Least angle regression. Ann Stat. 2004;32(2):407–499. doi: 10.1214/009053604000000067
- Candes E, Tao T. The Dantzig selector: statistical estimation when p is much larger than n. Ann Stat. 2007;35(6):2313–2351. doi: 10.1214/009053606000001523
- Zou H, Hastie T. Regularization and variable selection via the elastic net. J R Stat Soc Ser B (Stat Methodol). 2005;67(2):301–320. doi: 10.1111/j.1467-9868.2005.00503.x
- Zou H. The adaptive lasso and its oracle properties. J Amer Statist Assoc. 2006;101(476):1418–1429. doi: 10.1198/016214506000000735
- Aldahmani S, Dai H, Zhang Q-Z. Hybrid graphical least square estimation and its application in portfolio selection. Stat Interface. 2019;12:631–645. doi: 10.4310/SII.2019.v12.n4.a11
- Fan J, Li R. Variable selection via nonconcave penalized likelihood and its oracle properties. J Amer Statist Assoc. 2001;96(456):1348–1360. doi: 10.1198/016214501753382273
- Picard RR, Cook RD. Cross-validation of regression models. J Amer Statist Assoc. 1984;79(387):575–583. doi: 10.1080/01621459.1984.10478083
- Hoerl AE, Kannard RW, Baldwin KF. Ridge regression: some simulations. Commun Statist Theory Methods. 1975;4(2):105–123.
- Haq M, Kibria B. A shrinkage estimator for the restricted linear regression model: ridge regression approach. J Appl Stat Sci. 1996;3(4):301–316.
- Meijer RJ, Goeman JJ. Efficient approximate k-fold and leave-one-out cross-validation for ridge regression. Biom J. 2013;55(2):141–155. doi: 10.1002/bimj.201200088
- Lauritzen SL. Graphical models. Oxford: Oxford University Press; 1996.
- Golumbic MC. Algorithmic graph theory and perfect graphs. Vol. 57. Amsterdam: Elsevier; 2004.
- Højsgaard S, Edwards D, Lauritzen S. Graphical models with R. Boston (MA): Springer Science & Business Media; 2012.
- Friedman J, Hastie T, Tibshirani R. Sparse inverse covariance estimation with the graphical lasso. Biostatistics. 2008;9(3):432–441. doi: 10.1093/biostatistics/kxm045
- Rothman AJ, Bickel PJ, Levina E, et al. Sparse permutation invariant covariance estimation. Electron J Stat. 2008;2:494–515. doi: 10.1214/08-EJS176
- Scheinberg K, Ma S, Goldfarb D. Sparse inverse covariance selection via alternating linearization methods. In: Advances in neural information processing systems; 2010. p. 2101–2109.
- Yuan M. Efficient computation of l1 regularized estimates in Gaussian graphical models. J Comput Graph Stat. 2008;17(4):809–826. doi: 10.1198/106186008X382692
- Baba K, Shibata R, Sibuya M. Partial correlation and conditional correlation as measures of conditional independence. Aust N Z J Stat. 2004;46(4):657–664. doi: 10.1111/j.1467-842X.2004.00360.x
- Scheetz TE, Kim K-YA, Swiderski RE, et al. Regulation of gene expression in the mammalian eye and its relevance to eye disease. Proc Natl Acad Sci. 2006;103(39):14429–14434. doi: 10.1073/pnas.0602562103
- Bolstad BM, Irizarry RA, Åstrand M, et al. A comparison of normalization methods for high density oligonucleotide array data based on variance and bias. Bioinformatics. 2003;19(2):185–193. doi: 10.1093/bioinformatics/19.2.185
- Huang J, Ma S, Zhang C-H. Adaptive lasso for sparse high-dimensional regression models. Statist Sinica. 2008;18:1603–1618.