References
- Chernyaev YuA. An extension of the gradient projection method and Newton's method to extremum problems constrained by a smooth surface. Comput Math Math Phys. 2015;55(9):1451–1460. doi: https://doi.org/10.1134/S0965542515090079
- Vial J-P. Strong and weak convexity of sets and functions. Math Oper Res. 1983;8(2):231–259. doi: https://doi.org/10.1287/moor.8.2.231
- Clarke FH, Stern RJ, Wolenski PR. Proximal smoothness and the lower-C2 property. J Convex Anal. 1995;2(1–2):117–144.
- Karimi H, Nutini J, Schmidt M. Linear convergence of gradient and proximal-gradient methods under the Polyak-Lojasiewicz condition. In: Frasconi P, Landwehr N, Manco G, Vreeken J, editors. Machine learning and knowledge discovery in databases. ECML PKDD 2016. Lecture Notes in Computer Science, Vol. 9851. Cham: Springer; 2016. p. 795–811.
- Drusvyatskiy D, Lewis AS. Error bounds, quadratic growth, and linear convergence of proximal methods. Math Oper Res. 2018;43(3):919–948. arXiv:1602.06661. doi: https://doi.org/10.1287/moor.2017.0889
- Balashov M, Polyak B, Tremba A. Gradient projection and conditional gradient methods for constrained nonconvex minimization. Numer Funct Anal Optim. 2020;41(7):822–849. doi: https://doi.org/10.1080/01630563.2019.1704780
- Liu H, Wu W, So AM-C. Quadratic optimization with orthogonality constraints: explicit Lojasiewicz exponent and linear convergence of line-search methods. In: Proceedings of the 33rd International Conference on Machine Learning, PMLR. Vol. 48; 2016. p. 1158–1167. arXiv: 1510.01025.
- Luo Z-Q. New error bounds and their applications to convergence analysis of iterative algorithms. Math Program Ser B. 2000;88:341–355. doi: https://doi.org/10.1007/s101070050020
- Luenberger DG. The gradient projection method along geodesics. Manage Sci. 1972;18(11):620–631. doi: https://doi.org/10.1287/mnsc.18.11.620
- Hager WW. Minimizing a quadratic over a sphere. SIAM J Optim. 2001;12(1):188–208. doi: https://doi.org/10.1137/S1052623499356071
- Absil P-A, Mahony R, Sepulchre R. Optimization algorithms on matrix manifolds. Princeton (NJ): Princeton University Press; 2008.
- da Cruz Neto JX, de Lima LL, Oliveira PR. Geodesic algorithms in Riemannian geometry. Balkan J Geom Appl. 1998;3(2):89–100.
- Udrişte C. Convex functions and optimization methods on Riemannian manifolds. Mathematics and Its Applications. Vol. 297. Dordrecht: Springer; 1994.
- Edelman A, Arias TA, Smith ST. The geometry of algorithms with orthogonality constraints. SIAM J Matrix Anal Appl. 1998;20(2):303–353. doi: https://doi.org/10.1137/S0895479895290954
- Schneider R, Uschmajew A. Convergence results for projected line-search methods on varieties of low-rank matrices via Lojasiewicz inequality. SIAM J Optim. 2015;25(1):622–646. doi: https://doi.org/10.1137/140957822
- Gao B, Liu X, Chen X, et al. On the Lojasiewicz exponent of the quadratic sphere constrained optimization problem. arXiv: 1611.08781v2.
- Barber RF, Ha W. Gradient descent with nonconvex constraints: local concavity determines convergence; 2017. arXiv: 1703.07755v3.
- Jain P, Kar P. Non-convex optimization for machine learning. Found Trends Mach Learn. 2017;10(3–4):142–336. Boston (MA): Now Publishers.
- Balashov MV. About the gradient projection algorithm for a strongly convex function and a proximally smooth set. J Convex Anal. 2017;24(2):493–500.
- Balashov MV. The gradient projection algorithm for a proximally smooth set and a function with Lipschitz continuous gradient. Sbornik: Math. 2020;211(4):481–504. doi: https://doi.org/10.1070/SM9214
- Cohen AI. Stepsize analysis for descent methods. J Optim Theory Appl. 1981;33(2):187–205. doi: https://doi.org/10.1007/BF00935546
- Birgin EG, Martinez JM, Raydan M. Nonmonotone spectral projected gradient methods on convex sets. SIAM J Optim. 2000;10(4):1196–1211. doi: https://doi.org/10.1137/S1052623497330963
- Birgin EG, Martinez JM, Raydan M. Spectral projected gradient methods: review and perspectives. J Stat Softw. 2014;60(3):1–21. doi: https://doi.org/10.18637/jss.v060.i03
- Absil P-A, Malick J. Projection-like retractions on matrix manifolds. SIAM J Optim. 2012;22(1):135–158. doi: https://doi.org/10.1137/100802529
- Conn AR, Gould NIM, Toint PhL. A globally convergent augmented Lagrangian algorithm for optimization with general constraints and simple bounds. SIAM J Numer Anal. 1991;28(2):545–572. doi: https://doi.org/10.1137/0728030
- Cartis C, Gould NIM, Toint PhL. On the evaluation complexity of cubic regularization methods for potentially rank-deficient nonlinear least-squares problems and its relevance to constrained nonlinear optimization. SIAM J Optim. 2013;23(3):1553–1574. doi: https://doi.org/10.1137/120869687
- Birgin EG, Gardenghi JL, Martinez JM, et al. Evaluation complexity for nonlinear constrained optimization using unscaled KKT conditions and high-order models. SIAM J Optim. 2016;26(2):951–967. doi: https://doi.org/10.1137/15M1031631
- Aubin J-P, Ekeland I. Applied nonlinear analysis. New York (NY): John Wiley & Sons; 1984.
- Bertsekas DP. Nonlinear programming. 2nd ed. Belmont (MA): Athena Scientific; 1999. p. 777.
- Polyak BT. Introduction to optimization. Translations Series in Mathematics and Engineering. New York (NY): Optimization Software; 1987.
- Polyak B, Tremba A. New versions of Newton method: step-size choice, convergence domain and under-determined equations. Optim Methods Softw. 2019. doi: https://doi.org/10.1080/10556788.2019.1669154
- Cruz Neto JX, Oliveira PR, Soares Jr PA, et al. Learning how to play Nash, potential games and alternating minimization method for structured nonconvex problems on Riemannian manifolds. J Convex Anal. 2013;20(2):395–438.
- Bolte J, Daniilidis A, Lewis A. The Lojasiewicz inequality for nonsmooth subanalytic functions with applications to subgradient dynamical systems. SIAM J Optim. 2007;17(4):1205–1223. doi: https://doi.org/10.1137/050644641
- Bolte J, Daniilidis A, Ley O, et al. Characterizations of Lojasiewicz inequalities: subgradient flows, talweg, convexity. Trans AMS. 2010;362(6):3319–3363. doi: https://doi.org/10.1090/S0002-9947-09-05048-X
- Poliquin RA, Rockafellar RT, Thibault L. Local differentiability of distance functions. Trans Am Math Soc. 2000;352(11):5231–5249. doi: https://doi.org/10.1090/S0002-9947-00-02550-2
- Bounkhel M, Thibault L. On various notions of regularity of sets in nonsmooth analysis. Nonlinear Anal. 2002;48(2):223–246. doi: https://doi.org/10.1016/S0362-546X(00)00183-8
- Balashov MV. Nonconvex optimization. In: Novikov DA, editor. Control theory (additional chapters): tutorial. Moscow: Lenand; 2019. 552 p. In Russian.
- Kolmogorov AN, Fomin SV. Elements of the theory of functions and functional analysis. 7th ed. Moscow: Fizmatlit; 2004. In Russian.
- Matsumoto Yu. An introduction to Morse theory. Translations of Mathematical Monographs. Vol. 208. Providence (RI): AMS; 1997. p. 221.
- Nesterov Yu. Lectures on convex optimization. Cham: Springer; 2018.
- Huang S-Zh. Gradient inequalities: with applications to asymptotic behavior and stability of gradient-like systems. Mathematical Surveys and Monographs. Vol. 126. Providence (RI): AMS; 2006. p. 183.
- Wei K, Cai J-F, Chan TF, et al. Guarantees of Riemannian optimization for low rank matrix recovery. arXiv: 1511.01562v8; 2016.