
High-dimensional inference robust to outliers with ℓ1-norm penalization

Pages 5866-5876 | Received 05 Jun 2021, Accepted 16 Dec 2021, Published online: 30 Dec 2021
 

Abstract

This article studies inference in the high-dimensional linear regression model with outliers. Sparsity constraints are imposed on the vector of coefficients of the covariates. The number of outliers can grow with the sample size while their proportion goes to 0. We propose a two-step procedure for inference on the coefficients of a fixed subset of regressors. The first step is based on several square-root lasso ℓ1-norm penalized estimators, while the second step is the ordinary least squares estimator applied to a well-chosen regression. We establish asymptotic normality of the two-step estimator. The proposed procedure is efficient in the sense that it attains the semiparametric efficiency bound when applied to the model without outliers under homoscedasticity. This approach is also computationally advantageous, as it amounts to solving a finite number of convex optimization programs.
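The sketch below is not the authors' code; it only illustrates, under simplifying assumptions, the two-step idea outlined in the abstract: a square-root lasso ℓ1-penalized fit (a convex program) followed by an ordinary least squares refit. The penalty level, the outlier-generating design, and the second-step regression (here a plain post-selection OLS refit rather than the paper's "well-chosen regression") are all assumptions made for illustration.

```python
# Minimal sketch of a two-step square-root-lasso + OLS procedure on simulated data.
# All tuning choices below are assumptions, not the paper's construction.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
n, p, s = 200, 500, 5                       # sample size, dimension, sparsity
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:s] = 1.0                              # sparse coefficient vector
eps = rng.standard_normal(n)
eps[:5] += 10.0                             # a few gross outliers
y = X @ beta + eps

# Step 1: square-root lasso  min_b ||y - Xb||_2 / sqrt(n) + lam * ||b||_1
lam = 1.1 * np.sqrt(2.0 * np.log(p) / n)    # a common pilot penalty level (assumed)
b = cp.Variable(p)
objective = cp.norm(y - X @ b, 2) / np.sqrt(n) + lam * cp.norm1(b)
cp.Problem(cp.Minimize(objective)).solve()
support = np.flatnonzero(np.abs(b.value) > 1e-4)

# Step 2: ordinary least squares refit on the selected low-dimensional regression
beta_ols, *_ = np.linalg.lstsq(X[:, support], y, rcond=None)
print("selected coordinates:", support)
print("refitted coefficients:", np.round(beta_ols, 2))
```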

Acknowledgments

The author wishes to thank the reviewer for his valuable suggestions and comments.
