Abstract
We investigate a robust penalized logistic regression algorithm based on a minimum distance criterion. Influential outliers are often associated with the explosion of parameter vector estimates, but in the context of standard logistic regression, the bias due to outliers always causes the parameter vector to implode, that is, shrink toward the zero vector. Thus, using LASSO-like penalties to perform variable selection in the presence of outliers can result in missed detections of relevant covariates. We show that by choosing a minimum distance criterion together with an elastic net penalty, we can simultaneously find a parsimonious model and avoid estimation implosion even in the presence of many outliers in the important small n large p situation. Minimizing the penalized minimum distance criterion is a challenging problem due to its nonconvexity. To meet the challenge, we develop a simple and efficient MM (majorization–minimization) algorithm that can be adapted gracefully to the small n large p context. Performance of our algorithm is evaluated on simulated and real datasets. This article has supplementary materials available online.
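For orientation, the penalized criterion referred to above can be sketched in a generic form; the notation below (ℓ_MD for the minimum distance criterion, λ and α for the penalty parameters) is illustrative and not necessarily the paper's own:

\[
\min_{\beta_0,\,\beta}\;\; \ell_{\mathrm{MD}}(\beta_0, \beta)\;+\;\lambda\left(\alpha \,\|\beta\|_1 \;+\; \tfrac{1-\alpha}{2}\,\|\beta\|_2^2\right),
\]

where the bracketed term is the standard elastic net penalty. The ℓ₁ component encourages a parsimonious model by setting irrelevant coefficients to zero, while the minimum distance criterion takes the place of the usual logistic negative log-likelihood to reduce the influence of outliers.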
ACKNOWLEDGMENTS
The authors thank Christopher Amos for generously allowing them to work with the lung cancer dataset. All plots were made using the open source R package ggplot2 (Wickham 2009). Eric Chi was supported by grant DE-FG02-97ER25308 from the Department of Energy. David W. Scott was supported in part by grant DMS-09-07491 from the National Science Foundation.