Theory and Methods

Sparse Minimum Discrepancy Approach to Sufficient Dimension Reduction with Simultaneous Variable Selection in Ultrahigh Dimension

Pages 1277–1290 | Received 01 Jan 2017, Published online: 26 Oct 2018

References

  • Bickel, P. J., and Levina, E. (2008a), “Covariance Regularization by Thresholding,” The Annals of Statistics, 36, 2577–2604.
  • ——— (2008b), “Regularized Estimation of Large Covariance Matrices,” The Annals of Statistics, 36, 199–227.
  • Bickel, P. J., Ritov, Y., and Tsybakov, A. B. (2009), “Simultaneous Analysis of Lasso and Dantzig Selector,” The Annals of Statistics, 37, 1705–1732.
  • Breiman, L., Friedman, J., Stone, C. J., and Olshen, R. A. (1984), Classification and Regression Trees, Boca Raton, FL: CRC Press.
  • Bühlmann, P., and Van De Geer, S. (2011), Statistics for High-Dimensional Data: Methods, Theory and Applications, New York: Springer Science & Business Media.
  • Cai, T. T., and Zhou, H. H. (2012), “Minimax Estimation of Large Covariance Matrices Under l1-Norm,” Statistica Sinica, 22, 1319–1349.
  • Chen, L., and Huang, J. Z. (2012), “Sparse Reduced-Rank Regression for Simultaneous Dimension Reduction and Variable Selection,” Journal of the American Statistical Association, 107, 1533–1545.
  • Chen, X., Zou, C., and Cook, R. D. (2010), “Coordinate-Independent Sparse Sufficient Dimension Reduction and Variable Selection,” The Annals of Statistics, 38, 3696–3723.
  • Cook, R. D. (1994), “On the Interpretation of Regression Plots,” Journal of the American Statistical Association, 89, 177–189.
  • ——— (1998), Regression Graphics: Ideas for Studying Regressions Through Graphics, New York: Wiley.
  • ——— (2004), “Testing Predictor Contributions in Sufficient Dimension Reduction,” The Annals of Statistics, 32, 1062–1092.
  • ——— (2007), “Fisher Lecture: Dimension Reduction in Regression,” Statistical Science, 22, 1–26.
  • Cook, R. D., and Forzani, L. (2008), “Principal Fitted Components for Dimension Reduction in Regression,” Statistical Science, 23, 485–501.
  • Cook, R. D., Forzani, L., and Rothman, A. J. (2012), “Estimating Sufficient Reductions of the Predictors in Abundant High-Dimensional Regressions,” The Annals of Statistics, 40, 353–384.
  • Cook, R. D., and Ni, L. (2005), “Sufficient Dimension Reduction via Inverse Regression,” Journal of the American Statistical Association, 100, 410–428.
  • Cook, R. D., and Weisberg, S. (1991), “Discussion of ‘Sliced Inverse Regression for Dimension Reduction’,” Journal of the American Statistical Association, 86, 328–332.
  • Ding, S., and Cook, R. D. (2014), “Dimension Folding PCA and PFC for Matrix-Valued Predictors,” Statistica Sinica, 24, 463–492.
  • ——— (2015a), “Higher-Order Sliced Inverse Regressions,” Wiley Interdisciplinary Reviews: Computational Statistics, 7, 249–257.
  • ——— (2015b), “Tensor Sliced Inverse Regression,” Journal of Multivariate Analysis, 133, 216–231.
  • Donoho, D. L., Johnstone, I. M., Kerkyacharian, G., and Picard, D. (1995), “Wavelet Shrinkage: Asymptopia?” Journal of the Royal Statistical Society, Series B, 57, 301–369.
  • Fan, J., and Li, R. (2001), “Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties,” Journal of the American Statistical Association, 96, 1348–1360.
  • Fan, J., and Lv, J. (2008), “Sure Independence Screening for Ultrahigh Dimensional Feature Space,” Journal of the Royal Statistical Society, Series B, 70, 849–911.
  • Hall, P., and Li, K.-C. (1993), “On Almost Linearity of Low Dimensional Projections from High Dimensional Data,” The Annals of Statistics, 21, 867–889.
  • Hilafu, H., and Yin, X. (2017), “Sufficient Dimension Reduction and Variable Selection for Large-p-small-n Data with Highly Correlated Predictors,” Journal of Computational and Graphical Statistics, 26, 26–34.
  • Huang, J., Ma, S., and Zhang, C.-H. (2008), “Adaptive Lasso for Sparse High-Dimensional Regression Models,” Statistica Sinica, 18, 1603–1618.
  • Karoui, N. E. (2008), “Operator Norm Consistent Estimation of Large-Dimensional Sparse Covariance Matrices,” The Annals of Statistics, 36, 2717–2756.
  • Khan, J., Wei, J. S., Ringner, M., Saal, L. H., Ladanyi, M., Westermann, F., Berthold, F., Schwab, M., Antonescu, C. R., Peterson, C., and Meltzer, P. S. (2001), “Classification and Diagnostic Prediction of Cancers Using Gene Expression Profiling and Artificial Neural Networks,” Nature Medicine, 7, 673–679.
  • Lam, C., and Fan, J. (2009), “Sparsistency and Rates of Convergence in Large Covariance Matrix Estimation,” The Annals of Statistics, 37, 4254–4278.
  • Li, B., Kim, M. K., and Altman, N. (2010), “On Dimension Folding of Matrix- or Array-Valued Statistical Objects,” The Annals of Statistics, 38, 1094–1121.
  • Li, B., and Wang, S. (2007), “On Directional Regression for Dimension Reduction,” Journal of the American Statistical Association, 102, 997–1008.
  • Li, K.-C. (1991), “Sliced Inverse Regression for Dimension Reduction,” Journal of the American Statistical Association, 86, 316–327.
  • Li, L. (2007), “Sparse Sufficient Dimension Reduction,” Biometrika, 94, 603–613.
  • Li, L., and Yin, X. (2008), “Sliced Inverse Regression with Regularizations,” Biometrics, 64, 124–131.
  • Li, R., Zhong, W., and Zhu, L. (2012), “Feature Screening via Distance Correlation Learning,” Journal of the American Statistical Association, 107, 1129–1139.
  • Lin, Q., Zhao, Z., and Liu, J. S. (2018), “On Consistency and Sparsity for Sliced Inverse Regression in High Dimensions,” The Annals of Statistics, 46, 580–610.
  • Ma, Y., and Zhu, L. (2012), “A Semiparametric Approach to Dimension Reduction,” Journal of the American Statistical Association, 107, 168–179.
  • Qian, W., Li, W., Sogawa, Y., Fujimaki, R., Yang, X., and Liu, J. (2018), “An Interactive Greedy Approach to Group Sparsity in High Dimension,” preprint arXiv:1707.02963.
  • Qian, W., Yang, Y., and Zou, H. (2016), “Tweedie’s Compound Poisson Model with Grouped Elastic Net,” Journal of Computational and Graphical Statistics, 25, 606–625.
  • Rothman, A. J. (2012), “Positive Definite Estimators of Large Covariance Matrices,” Biometrika, 99, 733–740.
  • Rothman, A. J., Levina, E., and Zhu, J. (2009), “Generalized Thresholding of Large Covariance Matrices,” Journal of the American Statistical Association, 104, 177–186.
  • Shao, J., Wang, Y., Deng, X., and Wang, S. (2011), “Sparse Linear Discriminant Analysis by Thresholding for High Dimensional Data,” The Annals of Statistics, 39, 1241–1265.
  • Tan, K. M., Wang, Z., Liu, H., and Zhang, T. (2018), “Sparse Generalized Eigenvalue Problem: Optimal Statistical Rates via Truncated Rayleigh Flow,” Journal of the Royal Statistical Society, Series B, accepted.
  • Tibshirani, R. (1996), “Regression Shrinkage and Selection via the Lasso,” Journal of the Royal Statistical Society, Series B, 58, 267–288.
  • Tseng, P., and Yun, S. (2009), “A Coordinate Gradient Descent Method for Nonsmooth Separable Minimization,” Mathematical Programming, 117, 387–423.
  • Wang, H., and Xia, Y. (2008), “Sliced Regression for Dimension Reduction,” Journal of the American Statistical Association, 103, 811–821.
  • Wang, T., Zhao, H., Chen, M., and Zhu, L. (2018), “Estimating a Sparse Reduction for General Regression in High Dimensions,” Statistics and Computing, 28, 33–46.
  • Wei, F., and Huang, J. (2010), “Consistent Group Selection in High-Dimensional Linear Regression,” Bernoulli, 16, 1369–1384.
  • Wu, T. T., and Lange, K. (2010), “The MM Alternative to EM,” Statistical Science, 25, 492–505.
  • Wu, Y., and Li, L. (2011), “Asymptotic Properties of Sufficient Dimension Reduction with a Diverging Number of Predictors,” Statistica Sinica, 21, 707–730.
  • Xue, L., Ma, S., and Zou, H. (2012), “Positive-Definite l1-Penalized Estimation of Large Covariance Matrices,” Journal of the American Statistical Association, 107, 1480–1491.
  • Yin, X., and Hilafu, H. (2015), “Sequential Sufficient Dimension Reduction for Large p, Small n Problems,” Journal of the Royal Statistical Society, Series B, 77, 879–892.
  • Yin, X., and Li, B. (2011), “Sufficient Dimension Reduction Based on an Ensemble of Minimum Average Variance Estimators,” The Annals of Statistics, 39, 3392–3416.
  • Yu, Z., Dong, Y., and Shao, J. (2016), “On Marginal Sliced Inverse Regression for Ultrahigh Dimensional Model-Free Feature Selection,” The Annals of Statistics, 44, 2594–2623.
  • Yu, Z., Dong, Y., and Zhu, L.-X. (2016), “Trace Pursuit: A General Framework for Model-Free Variable Selection,” Journal of the American Statistical Association, 111, 813–821.
  • Yu, Z., Zhu, L., Peng, H., and Zhu, L. (2013), “Dimension Reduction and Predictor Selection in Semiparametric Models,” Biometrika, 100, 641–654.
  • Yuan, M., and Lin, Y. (2006), “Model Selection and Estimation in Regression with Grouped Variables,” Journal of the Royal Statistical Society, Series B, 68, 49–67.
  • Zhang, C.-H. (2010), “Nearly Unbiased Variable Selection Under Minimax Concave Penalty,” The Annals of Statistics, 38, 894–942.
  • Zhou, S., van de Geer, S., and Bühlmann, P. (2009), “Adaptive Lasso for High-Dimensional Regression and Gaussian Graphical Modeling,” preprint arXiv:0903.2515.
  • Zhu, L., Miao, B., and Peng, H. (2006), “On Sliced Inverse Regression with High-Dimensional Covariates,” Journal of the American Statistical Association, 101, 630–643.
  • Zhu, L.-P., Li, L., Li, R., and Zhu, L.-X. (2011), “Model-Free Feature Screening for Ultrahigh-Dimensional Data,” Journal of the American Statistical Association, 106, 1464–1475.
  • Zou, H. (2006), “The Adaptive Lasso and its Oracle Properties,” Journal of the American Statistical Association, 101, 1418–1429.
  • Zou, H., Hastie, T., and Tibshirani, R. (2006), “Sparse Principal Component Analysis,” Journal of Computational and Graphical Statistics, 15, 265–286.
