References
- Lemaire B. An asymptotical variational principle associated with the steepest descent method for a convex function. J Convex Anal. 1996;3(1):63–70.
- Rockafellar R. Monotone operators and the proximal point algorithm. SIAM J Control Optim. 1976;14(5):877–898.
- Balti M, May R. Asymptotic for the perturbed heavy ball system with vanishing damping term. Evol Equ Control Theory. 2017;6(2):177–186.
- Haraux A, Jendoubi M. On a second order dissipative ODE in Hilbert space with an integrable source term. Acta Math Sci. 2012;32(1):155–163.
- Polyak B. Some methods of speeding up the convergence of iteration methods. USSR Comput Math Math Phys. 1964;4(5):1–17.
- Su W, Boyd S, Candès E. A differential equation for modeling Nesterov's accelerated gradient method: theory and insights. J Mach Learn Res. 2016;17:1–43.
- Attouch H, Chbani Z, Riahi H. Rate of convergence of the Nesterov accelerated gradient method in the subcritical case α⩽3. ESAIM Control Optim Calc Var. 2019;25:2. Available from: https://doi.org/10.1051/cocv/2017083.
- Wibisono A, Wilson A, Jordan M. A variational perspective on accelerated methods in optimization. Proc Natl Acad Sci. 2016;113(47):E7351–E7358.
- Nesterov Y. A method of solving a convex programming problem with convergence rate O(1/k²). Soviet Math Doklady. 1983;27(2):372–376.
- Beck A, Teboulle M. A fast iterative shrinkage-thresholding algorithm for linear inverse problems. SIAM J Imaging Sci. 2009;2(1):183–202.
- Attouch H, Chbani Z, Fadili J, et al. First-order optimization algorithms via inertial systems with Hessian driven damping. Math Program. 2020; hal-02193846. Available from: https://doi.org/10.1007/s10107-020-01591-1.
- Chen L, Luo H. First order optimization methods based on Hessian-driven Nesterov accelerated gradient flow. 2019. arXiv:1912.09276.
- Luo H, Chen L. From differential equation solvers to accelerated first-order methods for convex optimization. Math Program. 2021; Available from: https://doi.org/10.1007/s10107-021-01713-3.
- Siegel J. Accelerated first-order methods: Differential equations and Lyapunov functions. 2019. arXiv:1903.05671.
- Wilson A, Recht B, Jordan M. A Lyapunov analysis of momentum methods in optimization. 2016. arXiv:1611.02635.
- Nesterov Y. Introductory lectures on convex optimization: A basic course. Springer Science & Business Media; 2013. (Applied Optimization; 87).
- Jendoubi M, May R. Asymptotics for a second-order differential equation with non-autonomous damping and an integrable source term. Appl Anal. 2015;94(2):435–443.
- Attouch H, Chbani Z, Peypouquet J, et al. Fast convergence of inertial dynamics and algorithms with asymptotic vanishing viscosity. Math Program Ser B. 2018;168(1–2):123–175.
- Aujol J, Dossal C. Optimal rate of convergence of an ODE associated to the fast gradient descent schemes for b>0. HAL preprint hal-01547251v2. 2017.
- Sebbouh O, Dossal C, Rondepierre A. Nesterov's acceleration and Polyak's heavy ball method in continuous time: convergence rate analysis under geometric conditions and perturbations. 2019. arXiv:1907.02710.
- Attouch H, Buttazzo G, Michaille G. Variational analysis in Sobolev and BV spaces. Philadelphia: Society for Industrial and Applied Mathematics; 2014. (MOS–SIAM Series on Optimization).
- Paoli L. An existence result for vibrations with unilateral constraints: case of a nonsmooth set of constraints. Math Models Methods Appl Sci. 2000;10(6):815–831.
- Schatzman M. A class of nonlinear differential equations of second order in time. Nonlinear Anal. 1978;2(3):355–373.
- Apidopoulos V, Aujol J, Dossal C. The differential inclusion modeling FISTA algorithm and optimality of convergence rate in the case b⩽3. SIAM J Optim. 2018;28(1):551–574.
- Güler O. New proximal point algorithms for convex minimization. SIAM J Optim. 1992;2(4):649–664.
- Salzo S, Villa S. Inexact and accelerated proximal point algorithms. J Convex Anal. 2012;19(4):1167–1192.
- Villa S, Salzo S, Baldassarre L, et al. Accelerated and inexact forward-backward algorithms. SIAM J Optim. 2013;23(3):1607–1633.
- Aujol J, Dossal C. Stability of over-relaxations for the forward-backward algorithm, application to FISTA. SIAM J Optim. 2015;25(4):2408–2433.
- Schmidt M, Roux N, Bach F. Convergence rates of inexact proximal-gradient methods for convex optimization. Adv Neural Inf Process Syst. 2011;24:1458–1466.
- Rockafellar R. Convex analysis. Princeton (NJ): Princeton University Press; 1970.
- Bauschke H, Combettes P. Convex analysis and monotone operator theory in Hilbert spaces. New York: Springer Science+Business Media; 2011. (CMS Books in Mathematics).
- Brézis H. Functional analysis, Sobolev spaces and partial differential equations. New York: Springer; 2011. (Universitext).
- Dugundji J. Topology. Boston (MA): Allyn and Bacon, Inc.; 1966. (Allyn and Bacon Series in Advanced Mathematics).
- Alber Y, Burachik R, Iusem A. A proximal point method for nonsmooth convex optimization problems in Banach spaces. Abstr Appl Anal. 1997;2(1–2):97–120.
- Auslender A. Numerical methods for nondifferentiable convex optimization. In: Cornet B, Nguyen V, Vial J, editors. Nonlinear analysis and optimization. Berlin: Springer Berlin Heidelberg; 1987. Mathematical Programming Studies; p. 102–126.
- He B, Yuan X. An accelerated inexact proximal point algorithm for convex minimization. J Optim Theory Appl. 2012;154(2):536–548.
- Monteiro R, Svaiter B. Convergence rate of inexact proximal point methods with relative error criteria for convex optimization. Working paper. 2010.
- Lin Z, Li H, Fang C. Accelerated optimization for machine learning. Singapore: Springer Nature; 2020.
- d'Aspremont A. Smooth optimization with approximate gradient. SIAM J Optim. 2008;19(3):1171–1183.
- Attouch H, Cabot A, Chbani Z, et al. Inertial forward-backward algorithms with perturbations: application to Tikhonov regularization. J Optim Theory Appl. 2018;179(1):1–36.
- Nesterov Y. Gradient methods for minimizing composite functions. Math Program Ser B. 2013;140(1):125–161.
- Barbu V. Differential equations. Cham: Springer; 2016. (Springer Undergraduate Mathematics Series).