ABSTRACT
We propose a penalized likelihood method to fit the linear discriminant analysis model when the predictor is matrix valued. We simultaneously estimate the means and the precision matrix, which we assume has a Kronecker product decomposition. Our penalties encourage pairs of response category mean matrix estimators to have equal entries and also encourage zeros in the precision matrix estimator. To compute our estimators, we use a blockwise coordinate descent algorithm. To update the optimization variables corresponding to response category mean matrices, we use an alternating minimization algorithm that takes advantage of the Kronecker structure of the precision matrix. We show that our method can outperform relevant competitors in classification, even when our modeling assumptions are violated. We analyze three real datasets to demonstrate our method’s applicability. Supplementary materials, including an R package implementing our method, are available online.
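To make the Kronecker structure mentioned above concrete, the following is a minimal illustrative sketch (not the authors' implementation) of the LDA classification rule under a matrix-normal model whose precision matrix decomposes as Delta ⊗ Phi. All object names (`mu_list`, `Delta`, `Phi`, `priors`) are hypothetical placeholders; the estimators themselves would come from the penalized likelihood fit described in the paper.

```r
# Sketch of matrix-variate LDA classification assuming the precision matrix
# of vec(X) factors as Delta %x% Phi (Kronecker product of a column-precision
# factor Delta and a row-precision factor Phi). Hypothetical example only.
matlda_classify <- function(X, mu_list, Delta, Phi, priors) {
  scores <- vapply(seq_along(mu_list), function(j) {
    R <- X - mu_list[[j]]
    # The Kronecker structure lets the vectorized quadratic form
    # t(vec(R)) %*% (Delta %x% Phi) %*% vec(R) be computed as the much
    # cheaper trace tr(Phi %*% R %*% Delta %*% t(R)).
    log(priors[j]) - 0.5 * sum(diag(Phi %*% R %*% Delta %*% t(R)))
  }, numeric(1))
  which.max(scores)  # index of the response category with largest score
}

# Toy usage: two 2x2 mean matrices, identity precision factors.
mu_list <- list(matrix(0, 2, 2), matrix(3, 2, 2))
pred <- matlda_classify(matrix(0.1, 2, 2), mu_list,
                        diag(2), diag(2), c(0.5, 0.5))
```

The trace identity used in the comment is what makes the alternating minimization in the paper efficient: no r*c-dimensional vectorized precision matrix ever needs to be formed.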
Supplementary Materials
Appendix: Includes simulations comparing our method to the methods proposed by Zhong and Suslick (2015) and to vector-valued sparse linear discriminant methods; simulations illustrating the efficiency gained by joint estimation of the μ*j's, Δ*, and Φ* using (3); and simulations investigating the sensitivity of (3) to the choice of weights.
Code: Includes R scripts that construct the real datasets analyzed in Section 5 and reproduce the simulation results.
MatrixLDA: An R package implementing our method, along with auxiliary functions for prediction and tuning parameter selection.
Acknowledgments
The authors thank the associate editor and referees for helpful comments.