Abstract
We propose an algorithm, semismooth Newton coordinate descent (SNCD), for elastic-net penalized Huber loss regression and quantile regression in high-dimensional settings. Unlike existing coordinate descent-type algorithms, SNCD updates a regression coefficient and its corresponding subgradient simultaneously in each iteration. It combines the strengths of coordinate descent and the semismooth Newton algorithm, and effectively solves the computational challenges posed by dimensionality and nonsmoothness. We establish the convergence properties of the algorithm. In addition, we present an adaptive version of the “strong rule” for screening predictors to gain extra efficiency. Through numerical experiments, we demonstrate that the proposed algorithm is very efficient and scalable to ultrahigh dimensions. We illustrate the application via a real-data example. Supplementary materials for this article are available online.
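For concreteness, one common parameterization of the Huber loss is L_gamma(t) = t^2/(2*gamma) for |t| <= gamma and |t| - gamma/2 otherwise, and the elastic-net penalized Huber objective augments the average of this loss over the residuals with lasso and ridge penalties. The following minimal R sketch evaluates that objective; it is an illustration under these assumptions rather than the authors' implementation, and the names huber_loss and huber_obj, as well as the default values of gamma and alpha, are hypothetical.

## Huber loss under one common parameterization: quadratic near zero,
## linear in the tails; gamma controls the transition point.
huber_loss <- function(t, gamma) {
  ifelse(abs(t) <= gamma, t^2 / (2 * gamma), abs(t) - gamma / 2)
}

## Elastic-net penalized Huber objective: average Huber loss of the
## residuals plus lambda times a mix of lasso (alpha) and ridge (1 - alpha).
huber_obj <- function(beta0, beta, X, y, lambda, alpha = 1, gamma = 1.345) {
  r <- y - beta0 - as.vector(X %*% beta)  # residuals
  mean(huber_loss(r, gamma)) +
    lambda * (alpha * sum(abs(beta)) + (1 - alpha) / 2 * sum(beta^2))
}

## Example: evaluate the objective at the zero solution on simulated data.
set.seed(1)
X <- matrix(rnorm(50 * 10), 50, 10)
y <- X[, 1] - 2 * X[, 2] + rnorm(50)
huber_obj(beta0 = 0, beta = rep(0, 10), X = X, y = y, lambda = 0.1)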
Supplementary Materials
Appendices: Background on convex analysis and properties of the Newton derivative, the derivation of the semismooth Newton algorithm (SNA) for penalized Huber loss regression, and proofs of the theoretical results.
R Code: R code for the timing experiments in Section 5.2, except for the part involving the SNA. We would be happy to share the code for the SNA upon request.
Acknowledgments
The authors thank the editor, the associate editor, and an anonymous reviewer for their helpful and constructive comments, which led to considerable improvements of the article. The authors are also grateful to Professor Roger Koenker for providing detailed information about the quantreg package.