Special Issue: Advances in Continuous Optimization

Dual formulation of the sparsity constrained optimization problem: application to classification

Pages 84-101 | Received 07 Mar 2022, Accepted 24 Oct 2023, Published online: 21 Nov 2023

