Special Issue: Advances in Continuous Optimization

Techniques for accelerating branch-and-bound algorithms dedicated to sparse optimization

Pages 4-41 | Received 14 Apr 2022, Accepted 23 Jul 2023, Published online: 31 Aug 2023

References

  • S. Alliney and S.A. Ruzinsky, An algorithm for the minimization of mixed ℓ1 and ℓ2 norms with application to Bayesian estimation, IEEE Trans. Signal Process. 42 (1994), pp. 618–627.
  • F. Bach, R. Jenatton, J. Mairal, and G. Obozinski, Optimization with Sparsity-Inducing Penalties, Now Publishers Inc., 2011.
  • H. Bauschke and P. Combettes, Convex Analysis and Monotone Operator Theory in Hilbert Spaces, Springer, 2011.
  • R. Ben Mhenni, S. Bourguignon, M. Mongeau, J. Ninin, and H. Carfantan, Sparse branch and bound for exact optimization of ℓ0-norm penalized least squares, in ICASSP 2020, IEEE International Conference on Acoustics, Speech and Signal Processing, IEEE, Barcelona, Spain, 2020.
  • R. Ben Mhenni, S. Bourguignon, and J. Ninin, Global optimization for sparse solution of least squares problems, Optim. Methods Softw. 37 (2021), pp. 1740–1769.
  • D. Bertsimas, A. King, and R. Mazumder, Best subset selection via a modern optimization lens, Ann. Stat. 44(2) (2016), pp. 813–852.
  • D. Bertsimas, J. Pauphilet, and B. Van Parys, Sparse regression: Scalable algorithms and empirical performance, Stat. Sci. 35(4) (2020), pp. 555–578.
  • D. Bertsimas and R. Shioda, Algorithm for cardinality-constrained quadratic optimization, Comput. Optim. Appl. 43 (2009), pp. 1–22.
  • D. Bienstock, Computational study of a family of mixed-integer quadratic programming problems, Math. Program. 74(2) (1996), pp. 121–140.
  • A. Bonnefoy, V. Emiya, L. Ralaivola, and R. Gribonval, A dynamic screening principle for the lasso, in European Signal Processing Conference EUSIPCO, Lisbon, Portugal, 2014.
  • S. Bourguignon, J. Ninin, H. Carfantan, and M. Mongeau, Exact sparse approximation problems via mixed-integer programming: Formulations and computational performance, IEEE Trans. Signal Process. 64(6) (2016), pp. 1405–1419.
  • A. Chambolle and T. Pock, A first-order primal-dual algorithm for convex problems with applications to imaging, J. Math. Imaging Vis. 40(1) (2011), pp. 120–145.
  • S.S. Chen, D.L. Donoho, and M.A. Saunders, Atomic decomposition by basis pursuit, SIAM Rev. 43(1) (2001), pp. 129–159.
  • X.T. Cui, X.J. Zheng, S.S. Zhu, and X.L. Sun, Convex relaxations and MIQCQP reformulations for a class of cardinality-constrained portfolio selection problems, J. Glob. Optim. 56 (2013), pp. 1409–1423.
  • I. Daubechies, M. Defrise, and C. De Mol, An iterative thresholding algorithm for linear inverse problems with a sparsity constraint, Commun. Pure Appl. Math. 57 (2004), pp. 1413–1457.
  • A. Dedieu, H. Hazimeh, and R. Mazumder, Learning sparse classifiers: Continuous and mixed integer optimization perspectives, J. Mach. Learn. Res. 22(135) (2021), pp. 1–47.
  • E. Dolan and J.J. Moré, Benchmarking optimization software with performance profiles, Math. Program. 91 (2002), pp. 201–213.
  • J. Eckstein, Splitting methods for monotone operators with applications to parallel optimization, Ph.D. thesis, Massachusetts Institute of Technology, 1989.
  • B. Efron, T.J. Hastie, I.M. Johnstone, and R. Tibshirani, Least angle regression, Ann. Stat. 32 (2004), pp. 407–499.
  • L. El Ghaoui, V. Viallon, and T. Rabbani, Safe feature elimination for the LASSO and sparse supervised learning problems, Tech. Rep. EECS Department, University of California, Berkeley, 2010.
  • O. Fercoq, A. Gramfort, and J. Salmon, Mind the duality gap: Safer rules for the LASSO, in Proceedings of the 32nd International Conference on Machine Learning, F. Bach and D. Blei, eds., PMLR, Lille, France, 2015.
  • J.H. Friedman, T. Hastie, and R. Tibshirani, Regularization paths for generalized linear models via coordinate descent, J. Stat. Softw. 33(1) (2010), pp. 1–22.
  • T. Golub, D.K. Slonim, P. Tamayo, C. Huard, M. Gaasenbeek, J.P. Mesirov, H. Coller, M.L. Loh, J.R. Downing, M.A. Caligiuri, C.D. Bloomfield, and E.S. Lander, Molecular classification of cancer: Class discovery and class prediction by gene expression monitoring, web supplement to the article, 1999. Dataset available at https://www.openml.org/search?type=data&sort=runs&id=1104&status=active.
  • T. Guyard, C. Herzet, and C. Elvira, Node-screening tests for the ℓ0-penalized least-squares problem, in ICASSP 2022, IEEE International Conference on Acoustics, Speech and Signal Processing, Singapore, 2022, pp. 5448–5452.
  • T. Hastie, R. Tibshirani, and J. Friedman, The Elements of Statistical Learning, Springer New York Inc., New York, NY, USA, 2001.
  • H. Hazimeh, R. Mazumder, and A. Saab, Sparse regression at scale: Branch-and-bound rooted in first-order optimization, Math. Program. 196 (2021), pp. 347–388.
  • J.J. Kormylo and J.M. Mendel, Maximum likelihood detection and estimation of Bernoulli–Gaussian processes, IEEE Trans. Inf. Theory 28 (1982), pp. 482–488.
  • H. Lee, A. Battle, R. Raina, and A. Ng, Efficient sparse coding algorithms, in Advances in Neural Information Processing Systems, B. Schölkopf, J. Platt, and T. Hoffman, eds., MIT Press, Vancouver, BC Canada, 2006.
  • M. Massias, From safe screening rules to working sets for faster lasso-type solvers, in 10th NIPS Workshop on Optimization for Machine Learning, Long Beach, CA, 2017.
  • J.J. Moreau, Fonctions convexes duales et points proximaux dans un espace hilbertien, C. R. Acad. Sci. Paris 255 (1962), pp. 2897–2899.
  • T. Moreau, M. Massias, A. Gramfort, P. Ablin, P.-A. Bannier, B. Charlier, M. Dagréou, T. Dupré la Tour, G. Durif, C.F. Dantas, Q. Klopfenstein, J. Larsson, E. Lai, T. Lefort, B. Malézieux, B. Moufad, B.T. Nguyen, A. Rakotomamonjy, Z. Ramzi, J. Salmon, and S. Vaiter, Benchopt: Reproducible, efficient and collaborative optimization benchmarks, in NeurIPS, Curran Associates, Inc., New Orleans, Louisiana, 2022.
  • P. Moulin and J. Liu, Analysis of multiresolution image denoising schemes using generalized Gaussian and complexity priors, IEEE Trans. Inf. Theory 45 (1999), pp. 909–919.
  • B.K. Natarajan, Sparse approximate solutions to linear systems, SIAM J. Comput. 24 (1995), pp. 227–234.
  • E. Ndiaye, O. Fercoq, A. Gramfort, and J. Salmon, Gap safe screening rules for sparsity enforcing penalties, J. Mach. Learn. Res. 18 (2017), pp. 1–33.
  • G. Nemhauser and L. Wolsey, General algorithms, in Integer and Combinatorial Optimization, Chapter II.4, John Wiley & Sons, Hoboken, NJ, 1988.
  • J. Nocedal and S.J. Wright, Numerical Optimization, 2nd ed., Springer, New York, 2006.
  • M. Osborne, B. Presnell, and B. Turlach, A new approach to variable selection in least squares problems, IMA J. Numer. Anal. 20(3) (2000), pp. 389–403.
  • A. Raj, J. Olbrich, B. Gärtner, B. Schölkopf, and M. Jaggi, Screening rules for convex problems, in 9th NIPS Workshop on Optimization for Machine Learning, Curran Associates, Inc., Barcelona, Spain, 2016.
  • R.T. Rockafellar, Convex Analysis, Princeton University Press, Princeton, 1970.
  • D.X. Shaw, S. Liu, and L. Kopman, Lagrangian relaxation procedure for cardinality-constrained portfolio optimization, Optim. Methods Softw. 23(3) (2008), pp. 411–420.
  • S. Solntsev, J. Nocedal, and R. Byrd, An algorithm for quadratic ℓ1-regularized optimization with a flexible active-set strategy, Optim. Methods Softw. 30 (2015), pp. 1213–1237.
  • E. Soubies, L. Blanc-Féraud, and G. Aubert, A unified view of exact continuous penalties for ℓ2-ℓ0 minimization, SIAM J. Optim. 27 (2017), pp. 2034–2060.
  • C. Soussen, J. Idier, D. Brie, and J. Duan, From Bernoulli–Gaussian deconvolution to sparse signal restoration, IEEE Trans. Signal Process. 59(10) (2011), pp. 4572–4584.
  • R. Tibshirani, J. Bien, J. Friedman, T. Hastie, N. Simon, J. Taylor, and R. Tibshirani, Strong rules for discarding predictors in lasso-type problems, J. R. Stat. Soc. Ser. B: Stat. Methodol. 74(2) (2012), pp. 245–266.
  • A.M. Tillmann, D. Bienstock, A. Lodi, and A. Schwartz, Cardinality minimization, constraints, and regularization: A survey, preprint, 2021. Available at https://arxiv.org/abs/2106.09606.
  • J.A. Tropp, Greed is good: Algorithmic results for sparse approximation, IEEE Trans. Inf. Theory 50 (2004), pp. 2231–2242.
  • J.A. Tropp and S.J. Wright, Computational methods for sparse solution of linear inverse problems, Proc. IEEE 98(6) (2010), pp. 948–958.
  • P. Tseng, Convergence of a block coordinate descent method for nondifferentiable minimization, J. Optim. Theory Appl. 109 (2001), pp. 475–494.
  • M.J. Wainwright, Sharp thresholds for high-dimensional and noisy sparsity recovery using ℓ1-constrained quadratic programming (lasso), IEEE Trans. Inf. Theory 55(5) (2009), pp. 2183–2202.
  • Z. Xiang, H. Xu, and P.J. Ramadge, Learning sparse representations of high dimensional data on large scale dictionaries, in Advances in Neural Information Processing Systems, J. Shawe-Taylor, R. Zemel, P. Bartlett, et al., eds., Curran Associates, Inc., Granada, Spain, 2011.
