Optimization
A Journal of Mathematical Programming and Operations Research
Volume 73, 2024 - Issue 7
Research Article

Mini-batch stochastic subgradient for functional constrained optimization

Pages 2159-2185 | Received 19 Apr 2022, Accepted 28 Feb 2023, Published online: 15 Mar 2023
 

Abstract

In this paper, we consider finite-sum composite optimization problems with many functional constraints. The objective function is expressed as a sum of two terms, one of which admits easy computation of (sub)gradients while the other is amenable to proximal evaluations. We assume a generalized bounded gradient condition on the objective, which allows us to simultaneously tackle both smooth and nonsmooth problems. We also consider cases both with and without a quadratic functional growth property. Further, we assume that each constraint set is given as the level set of a convex, but not necessarily differentiable, function. We reformulate the constrained finite-sum problem as a stochastic optimization problem for which the stochastic subgradient projection method of Necoara and Singh [Stochastic subgradient projection methods for composite optimization with functional constraints. Journal of Machine Learning Research. 2022;23:1–35] specializes into a collection of mini-batch variants, with different mini-batch sizes for the objective function and the functional constraints, respectively. More specifically, at each iteration our algorithm takes a mini-batch stochastic proximal subgradient step aimed at minimizing the objective function, followed by a mini-batch subgradient projection step minimizing the feasibility violation. By specializing to different mini-batching strategies, we derive exact expressions for the stepsizes as a function of the mini-batch size, and in some cases we also derive insightful stepsize-switching rules that describe when one should switch from a constant to a decreasing stepsize regime. We also prove sublinear convergence rates for the mini-batch subgradient projection algorithm that depend explicitly on the mini-batch sizes and on the properties of the objective function. Numerical results also show better performance of our mini-batch scheme over its single-batch counterpart.
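To make the two-step structure described above concrete, the following is a minimal illustrative sketch (not the authors' reference implementation) of such a scheme on a hypothetical toy instance: a least-squares finite sum plus an l1 term, with linear inequality constraints. The problem data, the 1/sqrt(k) stepsize, the mini-batch sizes, and the averaged Polyak-type feasibility step are all illustrative assumptions, not values prescribed by the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy instance: min_x (1/N) * sum_i 0.5*(a_i^T x - b_i)^2 + lam*||x||_1
#               s.t. g_j(x) = c_j^T x - d_j <= 0,  j = 1,...,m
n, N, m = 20, 200, 50
A = rng.standard_normal((N, n))
b = rng.standard_normal(N)
C = rng.standard_normal((m, n))
d = rng.random(m) + 0.1        # x = 0 is strictly feasible
lam = 0.1

def prox_l1(v, t):
    """Proximal operator of t*lam*||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t * lam, 0.0)

x = np.zeros(n)
B_obj, B_con = 10, 5           # mini-batch sizes for objective terms and constraints

for k in range(1, 2001):
    alpha = 1.0 / np.sqrt(k)                     # decreasing stepsize (illustrative choice)

    # Step 1: mini-batch stochastic proximal (sub)gradient step on the objective.
    idx = rng.choice(N, size=B_obj, replace=False)
    grad = A[idx].T @ (A[idx] @ x - b[idx]) / B_obj
    v = prox_l1(x - alpha * grad, alpha)

    # Step 2: mini-batch subgradient projection step reducing the feasibility
    # violation (an averaged Polyak-type step on (g_j(v))_+, illustrative form).
    jdx = rng.choice(m, size=B_con, replace=False)
    move = np.zeros(n)
    for j in jdx:
        viol = max(C[j] @ v - d[j], 0.0)         # (g_j(v))_+
        if viol > 0.0:
            z = C[j]                             # a subgradient of g_j at v
            move += (viol / (z @ z)) * z
    x = v - move / B_con

obj = 0.5 * np.mean((A @ x - b) ** 2) + lam * np.abs(x).sum()
max_viol = np.max(np.maximum(C @ x - d, 0.0))
print(f"objective = {obj:.4f}, max constraint violation = {max_viol:.3e}")
```

The split into two mini-batch sizes (B_obj for the objective, B_con for the constraints) mirrors the abstract's point that the two steps can be batched independently; the stepsize rule here is only a placeholder for the exact expressions and switching rules derived in the paper.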

Mathematics Subject Classifications:

Disclosure statement

No potential conflict of interest was reported by the author(s).

Additional information

Funding

The research leading to these results has received funding from the NO Grants 2014–2021 [RO-NO-2019-0184, project ELO-Hyp, contract no. 24/2020] and from Unitatea Executiva pentru Finantarea Invatamantului Superior, a Cercetarii, Dezvoltarii si Inovarii (UEFISCDI) [PN-III-P4-PCE-2021-0720, project L2O-MOC, nr. 70/2022] for N.K. Singh and I. Necoara, and from the OP VVV project CZ.02.1.01/0.0/0.0/16_019/0000765 Research Center for Informatics for V. Kungurtsev.
