Optimization
A Journal of Mathematical Programming and Operations Research
Volume 73, 2024 - Issue 7
Research Article

Mini-batch stochastic subgradient for functional constrained optimization

N.K. Singh, I. Necoara & V. Kungurtsev
Pages 2159–2185 | Received 19 Apr 2022, Accepted 28 Feb 2023, Published online: 15 Mar 2023

Abstract

In this paper, we consider finite-sum composite optimization problems with many functional constraints. The objective function is expressed as a finite sum of two terms, one admitting easy computation of (sub)gradients while the other is amenable to proximal evaluations. We assume a generalized bounded gradient condition on the objective, which allows us to tackle smooth and nonsmooth problems simultaneously, and we consider settings both with and without a quadratic functional growth property. Further, we assume that each constraint set is given as the level set of a convex, but not necessarily differentiable, function. We reformulate the constrained finite-sum problem as a stochastic optimization problem, for which the stochastic subgradient projection method of Necoara and Singh [Stochastic subgradient projection methods for composite optimization with functional constraints. Journal of Machine Learning Research. 2022;23:1–35] specializes into a collection of mini-batch variants, with different mini-batch sizes for the objective function and the functional constraints, respectively. More specifically, at each iteration our algorithm takes a mini-batch stochastic proximal subgradient step aimed at minimizing the objective function, followed by a mini-batch subgradient projection step aimed at minimizing the feasibility violation. By specializing to different mini-batching strategies, we derive exact expressions for the stepsizes as a function of the mini-batch size, and in some cases we also derive insightful stepsize-switching rules that describe when one should switch from a constant to a decreasing stepsize regime. We also prove sublinear convergence rates for the mini-batch subgradient projection algorithm that depend explicitly on the mini-batch sizes and on the properties of the objective function. Numerical results show that our mini-batch scheme outperforms its single-batch counterpart.
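To make the two-step iteration described above concrete, the following NumPy sketch runs one mini-batch variant on a hypothetical instance: a least-absolute-deviations objective with an l1 regularizer (handled by its prox) and linear inequality constraints. The problem data, the 1/sqrt(k+1) stepsize, the mini-batch sizes N1 and N2, and the Polyak-type feasibility step are illustrative assumptions, not the exact algorithm or tuning analysed in the paper.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical problem instance (illustrative, not from the paper):
#   min_x  (1/m) * sum_i |a_i^T x - b_i|  +  lam * ||x||_1
#   s.t.   c_j^T x <= d_j,  j = 1, ..., p.
m, p, n = 500, 200, 20
A, b = rng.normal(size=(m, n)), rng.normal(size=m)
C, d = rng.normal(size=(p, n)), rng.uniform(1.0, 2.0, size=p)
lam = 0.01

def prox_l1(z, t):
    # Prox of t * lam * ||.||_1, i.e. soft thresholding.
    return np.sign(z) * np.maximum(np.abs(z) - t * lam, 0.0)

def obj_subgrad(x, batch):
    # Average subgradient of the sampled terms |a_i^T x - b_i|.
    r = np.sign(A[batch] @ x - b[batch])
    return A[batch].T @ r / len(batch)

x = np.zeros(n)
N1, N2 = 32, 16                      # mini-batch sizes: objective / constraints
for k in range(2000):
    alpha = 1.0 / np.sqrt(k + 1)     # decreasing-stepsize regime

    # Step 1: mini-batch stochastic proximal subgradient step on the objective.
    I = rng.choice(m, size=N1, replace=False)
    v = prox_l1(x - alpha * obj_subgrad(x, I), alpha)

    # Step 2: mini-batch subgradient projection step decreasing the average
    # squared feasibility violation phi(v) = (1/(2*N2)) * sum_j (g_j(v))_+^2
    # over the sampled constraints (a Polyak-type step, since phi* = 0).
    J = rng.choice(p, size=N2, replace=False)
    viol = np.maximum(C[J] @ v - d[J], 0.0)      # (g_j(v))_+ for j in J
    if viol.any():
        grad = C[J].T @ viol / N2                # gradient of phi at v
        phi = 0.5 * np.mean(viol ** 2)
        x = v - (phi / (grad @ grad + 1e-12)) * grad
    else:
        x = v                                    # sampled batch already feasible

print("objective    :", np.mean(np.abs(A @ x - b)) + lam * np.abs(x).sum())
print("max violation:", np.maximum(C @ x - d, 0.0).max())

Here the feasibility step drives the averaged squared constraint violation over the sampled mini-batch toward zero, mirroring the role of the subgradient projection step in the abstract; larger N1 and N2 reduce the variance of both steps at a higher per-iteration cost.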

Mathematics Subject Classifications:

Disclosure statement

No potential conflict of interest was reported by the author(s).

Additional information

Funding

The research leading to these results has received funding from the NO Grants 2014–2021 [RO-NO-2019-0184, project ELO-Hyp, contract no. 24/2020] and from Unitatea Executiva pentru Finantarea Invatamantului Superior, a Cercetarii, Dezvoltarii si Inovarii (UEFISCDI) [PN-III-P4-PCE-2021-0720, project L2O-MOC, nr. 70/2022] for N.K. Singh and I. Necoara, and from the OP VVV project CZ.02.1.01/0.0/0.0/16_019/0000765 Research Center for Informatics for V. Kungurtsev.

