Tensor methods for finding approximate stationary points of convex functions

Pages 605-638 | Received 22 May 2020, Accepted 29 Aug 2020, Published online: 23 Sep 2020

References

  • M. Baes, Estimate sequence methods: Extensions and approximations (2009). Available at http://www.optimization-online.org/DB_FILE/2009/08/2372.pdf.
  • E.G. Birgin, J.L. Gardenghi, J.M. Martínez, S.A. Santos, and Ph.L. Toint, Worst-case evaluation complexity for unconstrained nonlinear optimization using high-order regularized models, Math. Program. 163 (2017), pp. 359–368. doi: 10.1007/s10107-016-1065-8
  • J. Bolte, S. Sabach, M. Teboulle, and Y. Vaisbourd, First order methods beyond convexity and Lipschitz gradient continuity with applications to quadratic inverse problems, SIAM. J. Optim. 28 (2018), pp. 2131–2151. doi: 10.1137/17M1138558
  • A. Bouaricha, Tensor methods for large, sparse unconstrained optimization, SIAM. J. Optim. 7 (1997), pp. 732–756. doi: 10.1137/S1052623494267723
  • S. Bubeck, Q. Jiang, Y.T. Lee, Y. Li, and A. Sidford, Near-optimal method for highly smooth convex optimization (2019). Available at arXiv:1812.08026v2 [math.OC].
  • Y. Carmon, J.C. Duchi, O. Hinder, and A. Sidford, Lower bounds for finding stationary points II: first-order methods (2017). Available at arXiv:1711.00841 [math.OC].
  • C. Cartis, N.I.M. Gould, and Ph.L. Toint, Second-order optimality and beyond: characterization and evaluation complexity in convexly constrained nonlinear optimization, Found. Comput. Math. 18 (2018), pp. 1073–1107. doi: 10.1007/s10208-017-9363-y
  • C. Cartis, N.I.M. Gould, and Ph.L. Toint, Strong evaluation complexity bounds for arbitrary-order optimization of nonconvex nonsmooth composite functions (2020). Available at arXiv:2001.10802 [math.OC].
  • C. Cartis, N.I.M. Gould, and Ph.L. Toint, Universal regularized methods – varying the power, the smoothness, and the accuracy, SIAM. J. Optim. 29 (2019), pp. 595–615. doi: 10.1137/16M1106316
  • X. Chen, Ph.L. Toint, and H. Wang, Complexity of partially separable convexly constrained optimization with non-Lipschitz singularities, SIAM. J. Optim. 29 (2019), pp. 874–903. doi: 10.1137/18M1166511
  • X. Chen and Ph.L. Toint, High-order evaluation complexity for convexly-constrained optimization with non-Lipschitzian group sparsity terms, Math. Program. (2020). doi:10.1007/s10107-020-01470-9.
  • N. Doikov and Yu. Nesterov, Minimizing uniformly convex functions by cubic regularization of Newton method (2019). Available at arXiv:1905.02671 [math.OC].
  • N. Doikov and Yu. Nesterov, Contracting proximal methods for smooth convex optimization. CORE Discussion Paper 2019/27.
  • N. Doikov and Yu. Nesterov, Inexact tensor methods with dynamic accuracies (2020). Available at arXiv:2002.09403 [math.OC].
  • A. Gasnikov, P. Dvurechensky, E. Gorbunov, E. Vorontsova, D. Selikhanovych, and C.A. Uribe, The global rate of convergence for optimal tensor methods in smooth convex optimization (2019). Available at arXiv:1809.00389v11 [math.OC].
  • G.N. Grapiglia and Yu. Nesterov, Regularized Newton methods for minimizing functions with Hölder continuous Hessians, SIAM. J. Optim. 27 (2017), pp. 478–506. doi: 10.1137/16M1087801
  • G.N. Grapiglia and Yu. Nesterov, Accelerated regularized Newton methods for minimizing composite convex functions, SIAM. J. Optim. 29 (2019), pp. 77–99. doi: 10.1137/17M1142077
  • G.N. Grapiglia and Yu. Nesterov, Tensor methods for minimizing convex functions with Hölder continuous higher-order derivatives. To appear in SIAM J. Optim.
  • G.N. Grapiglia and Yu. Nesterov, On inexact solution of auxiliary problems in tensor methods for convex optimization, Optim. Methods Softw. (2020). doi:10.1080/10556788.2020.1731749.
  • B. Jiang, T. Lin, and S. Zhang, A unified adaptive tensor approximation scheme to accelerated composite convex optimization (2020). Available at arXiv:1811.02427v2 [math.OC].
  • B. Jiang, H. Wang, and S. Zhang, An optimal high-order tensor method for convex optimization (2020). Available at arXiv:1812.06557v3 [math.OC].
  • H. Lu, R.M. Freund, and Yu. Nesterov, Relatively smooth convex optimization by first-order methods, and applications, SIAM. J. Optim. 28 (2018), pp. 333–354. doi: 10.1137/16M1099546
  • J.M. Martínez, On high-order model regularization for constrained optimization, SIAM. J. Optim. 27 (2017), pp. 2447–2458. doi: 10.1137/17M1115472
  • R.D.C. Monteiro and B.F. Svaiter, An accelerated hybrid proximal extragradient method for convex optimization and its implications to second-order methods, SIAM. J. Optim. 23 (2013), pp. 1092–1125. doi: 10.1137/110833786
  • Yu. Nesterov, Accelerating the cubic regularization of Newton's method on convex problems, Math. Program. 112 (2008), pp. 159–181. doi: 10.1007/s10107-006-0089-x
  • Yu. Nesterov, How to make gradients small, Optima 88 (2012), pp. 10–11.
  • Yu. Nesterov, Implementable tensor methods in unconstrained convex optimization, Math. Program. (2019). doi:10.1007/s10107-019-01449-1.
  • Yu. Nesterov, Inexact accelerated high-order proximal-point methods. CORE Discussion Paper 2020/08.
  • Yu. Nesterov, Inexact high-order proximal-point methods with auxiliary search procedure. CORE Discussion Paper 2020/10.
  • Yu. Nesterov and A. Nemirovskii, Interior-Point Polynomial Algorithms in Convex Programming, SIAM, Philadelphia, 1994.
  • Yu. Nesterov and B.T. Polyak, Cubic regularization of Newton method and its global performance, Math. Program. 108 (2006), pp. 177–205. doi: 10.1007/s10107-006-0706-8
  • A. Rodomanov and Yu. Nesterov, Smoothness parameter of power of Euclidean norm, J. Optim. Theory. Appl. 185 (2020), pp. 303–326. doi: 10.1007/s10957-020-01653-6
  • R.B. Schnabel and T. Chow, Tensor methods for unconstrained optimization using second derivatives, SIAM. J. Optim. 1 (1991), pp. 293–315. doi: 10.1137/0801020