Abstract
Building on the previous research of Chi and Chi, this article revisits estimation in robust structured regression under the L2E criterion. We adopt the majorization-minimization (MM) principle to design a new algorithm for updating the vector of regression coefficients. Our sharp majorization achieves faster convergence than the previous alternating proximal gradient descent algorithm of Chi and Chi. In addition, we reparameterize the model by substituting precision for scale and estimate precision via a modified Newton's method. This simplifies and accelerates overall estimation. We also introduce distance-to-set penalties to enable constrained estimation under nonconvex constraint sets. This tactic also improves performance in coefficient estimation and structure recovery. Finally, we demonstrate the merits of our improved tactics through a rich set of simulation examples and a real data application.
Supplementary Materials
Supplementary materials and code for this article are available online. The supplement.pdf file contains the two simulation examples of convex regression and trend filtering under the L2E criterion. The L2E-code.zip file includes code for implementing the isotonic regression and reproducing the corresponding results in the paper. To implement other
regression methods in the article, we refer readers to the eponymous L2E R package on CRAN.
Disclosure Statement
The authors report there are no competing interests to declare.
Acknowledgments
The authors are grateful to the editor, the associate editor, and the two referees for their helpful comments and suggestions. The authors thank Lisa Lin for her help with the R package.