ABSTRACT
Sufficient dimension reduction (SDR) methods are popular model-free tools for preprocessing and data visualization in regression problems with a large number of variables. Unfortunately, reduce-and-classify approaches in discriminant analysis usually cannot guarantee improved classification accuracy, mainly because the two stages are of a different nature. Envelope methods, on the other hand, construct targeted dimension reduction subspaces that achieve dimension reduction and improve the efficiency of parameter estimation at the same time. However, little is known about how to construct envelopes in discriminant analysis models. In this article, we introduce the notion of the envelope discriminant subspace (ENDS) as a natural object of inference and estimation in discriminant analysis that incorporates these considerations. We develop ENDS estimators that simultaneously achieve sufficient dimension reduction and classification. Consistency and asymptotic normality of the ENDS estimators are established, and we carefully examine the asymptotic efficiency gains under the classical linear and quadratic discriminant analysis models. Simulations and real data examples demonstrate the superior performance of the proposed method. Supplementary materials for this article are available online.
Supplementary Materials
Proofs, Technical Details, and Additional Numerical Results: Detailed proofs, technical details, and additional simulation studies are provided in the supplement to this article. (PDF file)
Computer Code: MATLAB code and numerical examples to reproduce the simulation results are provided in the online supplement to this article. (ZIP file)
Acknowledgments
The authors thank the editor, the associate editor, and two referees for their constructive comments. The authors acknowledge support for this project from the National Science Foundation (grants DMS-1613154 and CCF-1617691).