References
- Fukumizu K, Bach FR, Jordan MI. Dimensionality reduction for supervised learning with reproducing kernel Hilbert spaces. J Mach Learn Res. 2004;5(Jan):73–99.
- Cook RD, Ni L. Sufficient dimension reduction via inverse regression: a minimum discrepancy approach. J Amer Statist Assoc. 2005;100(470):410–428.
- Li K-C. Sliced inverse regression for dimension reduction. J Amer Statist Assoc. 1991;86(414):316–327.
- Cook RD, Weisberg S. Sliced inverse regression for dimension reduction: comment. J Amer Statist Assoc. 1991;86(414):328–332.
- Cook RD. SAVE: a method for dimension reduction and graphics in regression. Commun Statist Theor Methods. 2000;29(9–10):2109–2121.
- Li B, Wang S. On directional regression for dimension reduction. J Amer Statist Assoc. 2007;102(479):997–1008.
- Zhu L-P, Zhu L-X. On kernel method for sliced average variance estimation. J Multivar Anal. 2007;98(5):970–991.
- Li L, Yin X. Sliced inverse regression with regularizations. Biometrics. 2008;64(1):124–131.
- Xia Y, Tong H, Li WK, et al. An adaptive estimation of dimension reduction space. J R Stat Soc Ser B (Statist Methodol). 2002;64(3):363–410.
- Li B, Dong Y. Dimension reduction for nonelliptically distributed predictors. Ann Stat. 2009;37(3):1272–1298.
- Ma Y, Zhu L. A semiparametric approach to dimension reduction. J Amer Statist Assoc. 2012;107(497):168–179.
- Tan KM, Wang Z, Zhang T, et al. A convex formulation for high-dimensional sparse sliced inverse regression. Biometrika. 2018;105(4):769–782.
- Lin Q, Zhao Z, Liu JS. Sparse sliced inverse regression via Lasso. J Amer Statist Assoc. 2019;114(528):1726–1739.
- Fukumizu K, Leng C. Gradient-based kernel method for feature extraction and variable selection. Advances in Neural Information Processing Systems; 2012. p. 2114–2122.
- Wu Q, Mukherjee S, Liang F. Localized sliced inverse regression. Advances in Neural Information Processing Systems; 2009. p. 1785–1792.
- Kim M, Pavlovic V. Dimensionality reduction using covariance operator inverse regression. 2008 IEEE Conference on Computer Vision and Pattern Recognition; IEEE; 2008. p. 1–8.
- Ma Y, Zhu L. A review on dimension reduction. Int Stat Rev. 2013;81(1):134–150.
- Li B. Sufficient dimension reduction: methods and applications with R. Boca Raton, FL: CRC Press; 2018.
- Cook RD, Forzani LM, Tomassi DR. ldr: a package for likelihood-based sufficient dimension reduction. J Stat Softw. 2011;39(3):1–20.
- Cook RD, Forzani L. Likelihood-based sufficient dimension reduction. J Amer Statist Assoc. 2009;104(485):197–208.
- Mao K, Liang F, Mukherjee S. Supervised dimension reduction using Bayesian mixture modeling. Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics; 2010. p. 501–508.
- Reich BJ, Bondell HD, Li L. Sufficient dimension reduction via Bayesian mixture modeling. Biometrics. 2011;67(3):886–895.
- Yin X, Cook RD. Estimating central subspaces via inverse third moments. Biometrika. 2003;90(1):113–125.
- Rasmussen CE, Williams CKI. Gaussian processes for machine learning. Cambridge, MA: MIT Press; 2006.
- Tokdar ST, Zhu YM, Ghosh JK. Bayesian density regression with logistic Gaussian process and subspace projection. Bayesian Anal. 2010;5(2):319–344.
- McLachlan G, Peel D. Finite mixture models. Hoboken, NJ: John Wiley & Sons; 2004.
- Ferré L. Determining the dimension in sliced inverse regression and related methods. J Amer Statist Assoc. 1998;93(441):132–140.
- McDonald GC, Schwing RC. Instabilities of regression estimates relating air pollution to mortality. Technometrics. 1973;15(3):463–481.
- Chatterjee S, Hadi AS. Regression analysis by example. Hoboken, NJ: John Wiley & Sons; 2015.
- Dua D, Graff C. UCI machine learning repository; 2017.
- Cui T, Martin J, Marzouk YM, et al. Likelihood-informed dimension reduction for nonlinear inverse problems. Inverse Probl. 2014;30(11):114015.
- Solonen A, Cui T, Hakkarainen J, et al. On dimension reduction in Gaussian filters. Inverse Probl. 2016;32(4):045003.
- Zahm O, Cui T, Law K, et al. Certified dimension reduction in nonlinear Bayesian inverse problems. arXiv preprint arXiv:1807.03712; 2018.
- Constantine PG. Active subspaces: emerging ideas for dimension reduction in parameter studies. Vol. 2. Philadelphia, PA: SIAM; 2015.
- Lam R, Zahm O, Marzouk Y, et al. Multifidelity dimension reduction via active subspaces. arXiv preprint arXiv:1809.05567; 2018.
- Morris MD, Mitchell TJ, Ylvisaker D. Bayesian design and analysis of computer experiments: use of derivatives in surface prediction. Technometrics. 1993;35(3):243–255.