ABSTRACT
In this article, we propose a new method for sufficient dimension reduction when both the response and the predictor are vectors. Based on distance covariance, the new method retains the model-free advantage and can fully recover the central subspace even when many predictors are discrete. We then extend the method to the dual central subspace, which includes canonical correlation analysis as a special case. We illustrate the estimators through extensive simulations and real data sets, and compare them with some existing methods, showing that our estimators are competitive and robust.
Acknowledgments
The authors would like to thank the Editor, an Associate Editor and two referees for their valuable comments and suggestions, which led to a greatly improved paper.
Disclosure statement
No potential conflict of interest was reported by the authors.