Original Articles

Global non-smooth optimization in robust multivariate regression

Pages 124-138 | Received 10 Sep 2010, Accepted 11 Aug 2011, Published online: 10 Oct 2011
 

Abstract

Robust regression in statistics leads to challenging optimization problems. Here, we study one such problem, in which the objective is non-smooth, non-convex and expensive to evaluate. We study the numerical performance of several derivative-free optimization algorithms with the aim of computing robust multivariate estimators. Our experience shows that the existing algorithms often fail to deliver optimal solutions. We introduce three new methods that use Powell's derivative-free algorithm. The proposed methods are reliable and can be used when processing very large data sets containing outliers.
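As a rough illustration of the kind of problem the abstract describes (and not the authors' three methods, which are given in the article itself), the sketch below minimises a least-median-of-squares-type robust regression criterion, which is non-smooth and non-convex, using a derivative-free search. It assumes SciPy's Powell direction-set method as the optimizer, which is not necessarily the same Powell algorithm employed in the article, and uses simulated data with artificial outliers.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical illustration only: simulate a regression data set and
# contaminate 10% of the responses with outliers.
rng = np.random.default_rng(0)
n, p = 200, 3
X = rng.normal(size=(n, p))
beta_true = np.array([1.0, -2.0, 0.5])
y = X @ beta_true + 0.1 * rng.normal(size=n)
y[:20] += 10.0

# Order statistic giving the highest breakdown point (see the Notes below).
h = (n + p + 1) // 2

def lms_objective(beta):
    # h-th smallest squared residual: non-smooth and non-convex in beta.
    r2 = np.sort((y - X @ beta) ** 2)
    return r2[h - 1]

# Derivative-free local search started from the ordinary least-squares fit.
beta0 = np.linalg.lstsq(X, y, rcond=None)[0]
res = minimize(lms_objective, beta0, method="Powell")
print(res.x)
```

Because the objective depends on an order statistic of the residuals, it has many flat regions and local minima; this is why, as the abstract notes, derivative-free methods can fail to reach a global solution and why multiple starting points or specialised strategies are needed in practice.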

Acknowledgements

The authors are grateful to two referees for their comments and corrections, which have helped to improve the text.

Notes

Strictly speaking, in the LMS method, the h-th order statistic is used instead of the median to achieve the highest breakdown point.

Technically, just over 50% of the data, namely [(n+p+1)/2] observations, are considered clean.
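As a worked example of these two notes (writing n for the number of observations and p for the number of regression coefficients, which is the standard LMS notation assumed here), the criterion minimises the h-th smallest squared residual rather than the median:

\[
\hat{\beta}_{\mathrm{LMS}} = \arg\min_{\beta}\, r_{(h)}^{2}(\beta),
\qquad h = \left\lfloor \frac{n+p+1}{2} \right\rfloor,
\]

where \(r_{(1)}^{2} \le \dots \le r_{(n)}^{2}\) are the ordered squared residuals. For instance, with n = 100 and p = 3 one gets h = 52, so just over half of the observations are treated as clean.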
