
High-dimensional inference robust to outliers with ℓ1-norm penalization

Pages 5866-5876 | Received 05 Jun 2021, Accepted 16 Dec 2021, Published online: 30 Dec 2021

References

  • Alfons, A., C. Croux, and S. Gelper. 2013. Sparse least trimmed squares regression for analyzing high-dimensional large data sets. The Annals of Applied Statistics 7 (1):226–48. doi:10.1214/12-AOAS575.
  • Belloni, A., V. Chernozhukov, and L. Wang. 2011a. Square-root lasso: Pivotal recovery of sparse signals via conic programming. Biometrika 98 (4):791–806. doi:10.1093/biomet/asr043.
  • Belloni, A., V. Chernozhukov, and L. Wang. 2011b. ℓ1-penalized quantile regression in high-dimensional sparse models. The Annals of Statistics 39 (1):82–130.
  • Belloni, A., D. Chen, V. Chernozhukov, and C. Hansen. 2012. Sparse models and methods for optimal instruments with an application to eminent domain. Econometrica 80 (6):2369–429.
  • Belloni, A., V. Chernozhukov, and C. Hansen. 2014a. High-dimensional methods and inference on structural and treatment effects. Journal of Economic Perspectives 28 (2):29–50. doi:10.1257/jep.28.2.29.
  • Belloni, A., V. Chernozhukov, and C. Hansen. 2014b. Inference on treatment effects after selection among high-dimensional controls. The Review of Economic Studies 81 (2):608–50. doi:10.1093/restud/rdt044.
  • Belloni, A., V. Chernozhukov, C. Hansen, and D. Kozbur. 2016a. Inference in high-dimensional panel models with an application to gun control. Journal of Business & Economic Statistics 34 (4):590–605. doi:10.1080/07350015.2015.1102733.
  • Belloni, A., V. Chernozhukov, and Y. Wei. 2016b. Post-selection inference for generalized linear models with many controls. Journal of Business & Economic Statistics 34 (4):606–19. doi:10.1080/07350015.2016.1166116.
  • Belloni, A., V. Chernozhukov, I. Fernández-Val, and C. Hansen. 2017. Program evaluation and causal inference with high-dimensional data. Econometrica 85 (1):233–98. doi:10.3982/ECTA12723.
  • Belloni, A., V. Chernozhukov, and K. Kato. 2019. Valid post-selection inference in high-dimensional approximately sparse quantile regression models. Journal of the American Statistical Association 114 (526):749–58. doi:10.1080/01621459.2018.1442339.
  • Beyhum, J. 2020. Inference robust to outliers with ℓ1-norm penalization. ESAIM: Probability and Statistics 24:688–702.
  • Bickel, P. J., Y. Ritov, and A. B. Tsybakov. 2009. Simultaneous analysis of Lasso and Dantzig selector. The Annals of Statistics 37 (4):1705–32. doi:10.1214/08-AOS620.
  • Collier, O., and A. S. Dalalyan. 2017. Rate-optimal estimation of p-dimensional linear functionals in a sparse Gaussian model. arXiv preprint arXiv:1712.05495.
  • Dalalyan, A., and P. Thompson. 2019. Outlier-robust estimation of a sparse linear model using ℓ1-penalized Huber's M-estimator. In Advances in neural information processing systems, 13188–98. Red Hook, United States: Curran Associates, Inc.
  • Dalalyan, A. S. 2012. SOCP based variance free Dantzig selector with application to robust estimation. Comptes Rendus Mathematique 350 (15-16):785–8. doi:10.1016/j.crma.2012.09.016.
  • Gannaz, I. 2007. Robust estimation and wavelet thresholding in partially linear models. Statistics and Computing 17 (4):293–310. doi:10.1007/s11222-007-9019-x.
  • Gao, X., and Y. Fang. 2016. Penalized weighted least squares for outlier detection and robust regression. arXiv preprint arXiv:1603.07427.
  • Giraud, C. 2014. Introduction to high-dimensional statistics. London, United Kingdom: Chapman and Hall/CRC.
  • Hampel, F. R., E. M. Ronchetti, P. J. Rousseeuw, and W. A. Stahel. 2011. Robust statistics: The approach based on influence functions, vol. 196. Hoboken, United States: John Wiley & Sons.
  • Huber, P. J. 2004. Robust statistics, vol. 523. Hoboken, United States: John Wiley & Sons.
  • Javanmard, A., and A. Montanari. 2014. Confidence intervals and hypothesis testing for high-dimensional regression. The Journal of Machine Learning Research 15 (1):2869–909.
  • Lambert-Lacroix, S., and L. Zwald. 2011. Robust regression through the Huber's criterion and adaptive lasso penalty. Electronic Journal of Statistics 5:1015–53. doi:10.1214/11-EJS635.
  • Lee, Y., S. N. MacEachern, and Y. Jung. 2012. Regularization of case-specific parameters for robustness and efficiency. Statistical Science 27 (3):350–72. doi:10.1214/11-STS377.
  • Li, W. 2012. Simultaneous variable selection and outlier detection using LASSO with applications to aircraft landing data analysis. PhD diss., Rutgers University–Graduate School–New Brunswick.
  • Liu, J., P. C. Cosman, and B. D. Rao. 2017. Robust linear regression via ℓ0 regularization. IEEE Transactions on Signal Processing 66 (3):698–713.
  • Liu, L., Y. Shen, T. Li, and C. Caramanis. 2020. High dimensional robust sparse regression. In International Conference on Artificial Intelligence and Statistics, PMLR, 411–21.
  • Maronna, R. A., R. D. Martin, V. J. Yohai, and M. Salibián-Barrera. 2018. Robust statistics: Theory and methods (with R). Hoboken, United States: Wiley.
  • Nguyen, N. H., and T. D. Tran. 2013. Robust lasso with missing and grossly corrupted observations. IEEE Transactions on Information Theory 59 (4):2036–58. doi:10.1109/TIT.2012.2232347.
  • Owen, A. B. 2007. A robust hybrid of lasso and ridge regression. Contemporary Mathematics 443 (7):59–72.
  • Rousseeuw, P. J., and A. M. Leroy. 2005. Robust regression and outlier detection, vol. 589. Hoboken, United States: John Wiley & Sons.
  • She, Y., and A. B. Owen. 2011. Outlier detection using nonconvex penalized regression. Journal of the American Statistical Association 106 (494):626–39. doi:10.1198/jasa.2011.tm10390.
  • Van de Geer, S., P. Bühlmann, Y. Ritov, and R. Dezeure. 2014. On asymptotically optimal confidence regions and tests for high-dimensional models. The Annals of Statistics 42 (3):1166–202. doi:10.1214/14-AOS1221.
  • Virouleau, A., A. Guilloux, S. Gaïffas, and M. Bogdan. 2017. High-dimensional robust regression and outliers detection with SLOPE. arXiv preprint arXiv:1712.02640.
  • Yang, E., A. C. Lozano, and A. Aravkin. 2018. A general family of trimmed estimators for robust high-dimensional data analysis. Electronic Journal of Statistics 12 (2):3519–53. doi:10.1214/18-EJS1470.
  • Zhang, C.-H., and S. S. Zhang. 2014. Confidence intervals for low dimensional parameters in high dimensional linear models. Journal of the Royal Statistical Society: Series B (Statistical Methodology) 76 (1):217–42. doi:10.1111/rssb.12026.
