Original Articles

Learning sparse classifiers with difference of convex functions algorithms

Pages 830-854 | Received 12 Mar 2011, Accepted 19 Dec 2011, Published online: 27 Feb 2012
 

Abstract

Sparsity of a classifier is desirable for high-dimensional data and large sample sizes. This paper investigates two complementary notions of sparsity for binary classification: sparsity in the number of features and sparsity in the number of examples. Several losses and regularizers are considered: the hinge and ramp losses, and ℓ2, ℓ1, approximate ℓ0, and capped ℓ1 regularization. We propose three new objective functions that further promote sparsity: capped ℓ1 regularization with the hinge loss, and ramp-loss versions of approximate ℓ0 and capped ℓ1 regularization. We derive difference of convex functions algorithms (DCA) for solving these novel non-convex objective functions, and show that the proposed algorithms converge in a finite number of iterations to a local minimum. Using simulated data and several data sets from the University of California Irvine (UCI) machine learning repository, we empirically investigate the fraction of features and examples required by the different classifiers.
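To make the abstract's two central ideas concrete, the sketch below illustrates (a) the standard DC split of the ramp loss as a difference of two convex hinges, and (b) a generic DCA iteration on a toy one-dimensional DC objective. This is a minimal illustration assuming the textbook decomposition from the DC-programming literature; the paper's exact decompositions, losses, and subproblem solvers may differ.

```python
import numpy as np

# (a) DC split of the ramp loss (assumed standard form, not necessarily
# the paper's exact parameterization).

def hinge(t, m=1.0):
    """Convex hinge with margin m: H_m(t) = max(0, m - t)."""
    return np.maximum(0.0, m - t)

def ramp(t, s=0.0):
    """Ramp loss, a clipped hinge: min(1 - s, max(0, 1 - t)), with s < 1."""
    return np.minimum(1.0 - s, np.maximum(0.0, 1.0 - t))

t = np.linspace(-3.0, 3.0, 601)
s = -0.5
# ramp = H_1 - H_s: a difference of two convex functions.
assert np.allclose(ramp(t, s), hinge(t, 1.0) - hinge(t, s))

# (b) Generic DCA iteration on a toy DC objective f = g - h with
# g(x) = x**2 (convex) and h(x) = 2|x| (convex), so f(x) = x**2 - 2|x|.
# DCA: take a subgradient y_k of h at x_k, then minimize the convex
# surrogate g(x) - y_k * x, which here has the closed form x = y_k / 2.

def dca(x0, iters=20):
    x = x0
    for _ in range(iters):
        y = 2.0 * np.sign(x)   # subgradient of h(x) = 2|x| (x != 0)
        x = y / 2.0            # argmin_x  x**2 - y * x
    return x

# f has local minima at x = +1 and x = -1; DCA reaches one from a
# nonzero start, illustrating finite convergence to a local minimum.
print(dca(0.3))   # 1.0
print(dca(-5.0))  # -1.0
```

The surrogate minimized at each step majorizes f (since h lies above its linearization), so the objective decreases monotonically; in the paper's classifiers the same pattern reduces each non-convex problem to a sequence of convex hinge-loss subproblems.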

Notes

This paper was presented at the International Conference on Optimization: Techniques, Applications (ICOTA 8), Shanghai, 10–13 December 2010.
