ARTICLES: Sparsity

Positive Semidefinite Rank-Based Correlation Matrix Estimation With Application to Semiparametric Graph Estimation

Pages 895–922 | Received 01 Jan 2013, Published online: 20 Oct 2014

References

  • Agarwal, A., Negahban, S., and Wainwright, M.J. (2012), “Fast Global Convergence of Gradient Methods for High-Dimensional Statistical Recovery,” The Annals of Statistics, 40, 2452–2482.
  • Banerjee, O., Ghaoui, L.E., and d’Aspremont, A. (2008), “Model Selection Through Sparse Maximum Likelihood Estimation for Multivariate Gaussian or Binary Data,” Journal of Machine Learning Research, 9, 485–516.
  • Beck, A., and Teboulle, M. (2009), “A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems,” SIAM Journal on Imaging Sciences, 2, 183–202.
  • Blei, D., and Lafferty, J. (2007), “A Correlated Topic Model of Science,” Annals of Applied Statistics, 1, 17–35.
  • Cai, T., Liu, W., and Luo, X. (2011), “A Constrained ℓ1 Minimization Approach to Sparse Precision Matrix Estimation,” Journal of the American Statistical Association, 106, 594–607.
  • Chen, X., Lin, Q., Kim, S., Carbonell, J., and Xing, E. (2012), “A Smoothing Proximal Gradient Method for General Structured Sparse Regression,” Annals of Applied Statistics, 6, 719–752.
  • Dempster, A. (1972), “Covariance Selection,” Biometrics, 28, 157–175.
  • Friedman, J., Höfling, H., and Tibshirani, R. (2007), “Pathwise Coordinate Optimization,” Annals of Applied Statistics, 1, 302–332.
  • Guo, Y., Hastie, T., and Tibshirani, R. (2007), “Regularized Linear Discriminant Analysis and its Application in Microarrays,” Biostatistics, 8, 86–100.
  • Han, F., Zhao, T., and Liu, H. (2013), “Coda: High Dimensional Copula Discriminant Analysis,” Journal of Machine Learning Research, 14, 629–671.
  • Hoerl, A.E., and Kennard, R.W. (1970), “Ridge Regression: Biased Estimation for Nonorthogonal Problems,” Technometrics, 12, 55–67.
  • Honorio, J., Ortiz, L., Samaras, D., Paragios, N., and Goldstein, R. (2009), “Sparse and Locally Constant Gaussian Graphical Models,” Proceedings of the Advances in Neural Information Processing Systems Conference, pp. 745–753.
  • Jalali, A., Johnson, C., and Ravikumar, P. (2012), “High-Dimensional Sparse Inverse Covariance Estimation Using Greedy Methods,” Proceedings of the 15th International Conference on Artificial Intelligence and Statistics, pp. 574–582.
  • Ji, S., and Ye, J. (2009), “An Accelerated Gradient Method for Trace Norm Minimization,” Proceedings of the 26th Annual International Conference on Machine Learning, pp. 457–464.
  • Klaassen, C., and Wellner, J. (1997), “Efficient Estimation in the Bivariate Normal Copula Model: Normal Margins are Least-Favorable,” Bernoulli, 3, 55–77.
  • Lam, C., and Fan, J. (2009), “Sparsistency and Rates of Convergence in Large Covariance Matrix Estimation,” The Annals of Statistics, 37, 4254–4278.
  • Lauritzen, S. (1996), Graphical Models (Vol. 17), New York: Oxford University Press.
  • Li, H., and Gui, J. (2006), “Gradient Directed Regularization for Sparse Gaussian Concentration Graphs, With Applications to Inference of Genetic Networks,” Biostatistics, 7, 302–317.
  • Liu, H., Han, F., Yuan, M., Lafferty, J., and Wasserman, L. (2012), “High Dimensional Semiparametric Gaussian Copula Graphical Models,” The Annals of Statistics, 40, 2293–2326.
  • Liu, H., Lafferty, J., and Wasserman, L. (2009), “The Nonparanormal: Semiparametric Estimation of High Dimensional Undirected Graphs,” Journal of Machine Learning Research, 10, 2295–2328.
  • Liu, H., Roeder, K., and Wasserman, L. (2010), “Stability Approach to Regularization Selection for High Dimensional Graphical Models,” Proceedings of the Advances in Neural Information Processing Systems Conference, pp. 1432–1440.
  • Mai, Q., Zou, H., and Yuan, M. (2012), “A Direct Approach to Sparse Discriminant Analysis in Ultra-High Dimensions,” Biometrika, 99, 29–42.
  • Meinshausen, N., and Bühlmann, P. (2006), “High Dimensional Graphs and Variable Selection With the Lasso,” The Annals of Statistics, 34, 1436–1462.
  • ——— (2010), “Stability Selection,” Journal of the Royal Statistical Society, Series B, 72, 417–473.
  • Nesterov, Y. (1988), “On an Approach to the Construction of Optimal Methods of Minimization of Smooth Convex Functions,” Ekonomika i Matematicheskie Metody, 24, 509–517.
  • ——— (2005), “Smooth Minimization of Non-Smooth Functions,” Mathematical Programming, 103, 127–152.
  • Peng, J., Wang, P., Zhou, N., and Zhu, J. (2009), “Partial Correlation Estimation by Joint Sparse Regression Models,” Journal of the American Statistical Association, 104, 735–746.
  • Ravikumar, P., Wainwright, M., Raskutti, G., and Yu, B. (2011), “High-Dimensional Covariance Estimation by Minimizing ℓ1-Penalized Log-Determinant Divergence,” Electronic Journal of Statistics, 5, 935–980.
  • Rousseeuw, P., and Molenberghs, G. (1993), “Transformation of Nonpositive Semidefinite Correlation Matrices,” Communications in Statistics—Theory and Methods, 22, 965–984.
  • Shojaie, A., and Michailidis, G. (2010), “Penalized Likelihood Methods for Estimation of Sparse High-Dimensional Directed Acyclic Graphs,” Biometrika, 97, 519–538.
  • Sun, H., and Li, H. (2012), “Robust Gaussian Graphical Modeling Via ℓ1 Penalization,” Biometrics, 68, 1197–1206.
  • Sun, T., and Zhang, C.-H. (2012), “Sparse Matrix Inversion With Scaled Lasso,” Technical Report, Department of Statistics, Rutgers University.
  • Tsukahara, H. (2005), “Semiparametric Estimation in Copula Models,” Canadian Journal of Statistics, 33, 357–375.
  • Wainwright, M. (2009), “Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using ℓ1-Constrained Quadratic Programming (Lasso),” IEEE Transactions on Information Theory, 55, 2183–2201.
  • Wille, A., Zimmermann, P., Vranova, E., Fürholz, A., Laule, O., Bleuler, S., Hennig, L., Prelic, A., von Rohr, P., Thiele, L., Zitzler, E., Gruissem, W., and Bühlmann, P. (2004), “Sparse Graphical Gaussian Modeling of the Isoprenoid Gene Network in Arabidopsis thaliana,” Genome Biology, 5, R92.
  • Yin, J., and Li, H. (2011), “A Sparse Conditional Gaussian Graphical Model for Analysis of Genetical Genomics Data,” Annals of Applied Statistics, 5, 2630–2650.
  • Yuan, M. (2010), “High Dimensional Inverse Covariance Matrix Estimation Via Linear Programming,” Journal of Machine Learning Research, 11, 2261–2286.
  • Yuan, M., and Lin, Y. (2007), “Model Selection and Estimation in the Gaussian Graphical Model,” Biometrika, 94, 19–35.
  • Zhao, P., and Yu, B. (2006), “On Model Selection Consistency of Lasso,” Journal of Machine Learning Research, 7, 2541–2563.
  • Zhao, T., and Liu, H. (2012), “Sparse Additive Machine,” Proceedings of the 15th International Conference on Artificial Intelligence and Statistics, pp. 1435–1443.
  • ——— (2013), “Semiparametric Sparse Column Inverse Operator,” Technical Report, Department of Computer Science, Johns Hopkins University.
  • Zhao, T., Liu, H., Roeder, K., Lafferty, J., and Wasserman, L. (2012a), “The Huge Package for High-Dimensional Undirected Graph Estimation in R,” Journal of Machine Learning Research, 13, 1059–1062.
  • Zhao, T., Roeder, K., and Liu, H. (2012b), “Smooth-Projected Neighborhood Pursuit for High-Dimensional Nonparanormal Graph Estimation,” Proceedings of the Advances in Neural Information Processing Systems Conference, pp. 162–170.
  • Zou, H. (2006), “The Adaptive Lasso and its Oracle Properties,” Journal of the American Statistical Association, 101, 1418–1429.
