Original Articles

Inexact proximal stochastic second-order methods for nonconvex composite optimization

Pages 808-835 | Received 10 Jun 2019, Accepted 04 Jan 2020, Published online: 15 Jan 2020
 

ABSTRACT

In this paper, we propose a framework of Inexact Proximal Stochastic Second-order (IPSS) methods for solving nonconvex optimization problems whose objective function consists of an average of finitely many, possibly only weakly smooth, component functions plus a convex but possibly nonsmooth function. At each iteration, IPSS inexactly solves a proximal subproblem constructed from a positive definite matrix that can capture second-order information of the original problem. Proper tolerances are given for the subproblem solution in order to maintain global convergence and the desired overall complexity of the algorithm. Under mild conditions, we analyse the computational complexity in terms of evaluations of the component gradients of the smooth function. We also investigate the number of subgradient evaluations required when an iterative subgradient method is used to solve the subproblem. In addition, based on IPSS, we propose a linearly convergent algorithm under the proximal Polyak–Łojasiewicz condition. Finally, we extend the analysis to problems with weakly smooth functions and obtain the corresponding computational complexity.
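The iteration described in the abstract — inexactly minimizing a quadratic-plus-nonsmooth model via a subgradient method — can be sketched as follows for a toy composite problem min_x ½‖Ax − b‖² + λ‖x‖₁. All names, step sizes, iteration counts, and the curvature matrix B here are illustrative assumptions, not the paper's actual IPSS algorithm or its carefully chosen inexactness tolerances.

```python
import numpy as np

def model(y, x, g, B, lam):
    """Value of the proximal subproblem's objective at y:
       g.(y - x) + 0.5*(y - x)^T B (y - x) + lam*||y||_1."""
    d = y - x
    return g @ d + 0.5 * d @ B @ d + lam * np.abs(y).sum()

def subgradient_solve(x, g, B, lam, iters=200):
    """Inexactly solve the proximal subproblem with a plain
    normalized-subgradient method (step sizes are illustrative)."""
    y = x.copy()
    best, best_val = x.copy(), model(x, x, g, B, lam)  # never worse than staying at x
    for t in range(1, iters + 1):
        sub = g + B @ (y - x) + lam * np.sign(y)       # a subgradient of the model
        y = y - sub / (np.linalg.norm(sub) * np.sqrt(t) + 1e-12)
        val = model(y, x, g, B, lam)
        if val < best_val:
            best, best_val = y.copy(), val
    return best

def ipss(A, b, lam, x0, outer=20):
    """Sketch of the outer loop for min_x 0.5*||Ax - b||^2 + lam*||x||_1,
    with B a fixed positive definite matrix carrying curvature information."""
    x = x0.copy()
    B = A.T @ A + 1e-3 * np.eye(A.shape[1])            # positive definite model Hessian
    for _ in range(outer):
        g = A.T @ (A @ x - b)                          # gradient of the smooth part
        x = subgradient_solve(x, g, B, lam)
    return x
```

Because the subproblem solver returns a point whose model value never exceeds the value at the current iterate, and B dominates the Hessian of the smooth part here, the composite objective is non-increasing across outer iterations even though each subproblem is solved only inexactly.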

Mathematics Subject Classifications 2010:

Acknowledgments

We would like to thank two anonymous referees for their valuable comments and suggestions.

Disclosure statement

No potential conflict of interest was reported by the authors.

Notes

1 f ∈ C¹(ℝᵈ) means that f : ℝᵈ → ℝ is continuously differentiable.

Additional information

Funding

This research was partially supported by the National Natural Science Foundation of China [grant numbers 11871453, 11731013], the 2018–2020 Young Elite Scientists Sponsorship Program by China Association for Science and Technology, the USA National Science Foundation [grant numbers 1522654, 1819161] and University Research Facility in Big Data Analytics of the Hong Kong Polytechnic University.

Notes on contributors

Xiao Wang

Xiao Wang is an associate professor in the School of Mathematical Sciences at the University of Chinese Academy of Sciences, China. Her research focuses on optimization theory, algorithms, and applications. Her latest work is published in Optimization Methods and Software, Mathematics of Computation, and SIAM Journal on Optimization.

Hongchao Zhang

Hongchao Zhang is an associate professor in the Department of Mathematics at Louisiana State University, USA. His research focuses on nonlinear optimization theory and its applications. His latest work is published in Computational Optimization and Applications, Journal of Scientific Computing, and SIAM Journal on Optimization.

