Abstract
We propose a new stepsize for the gradient method. We show that, when Dai and Yang's asymptotically optimal gradient method (Computational Optimization and Applications, 2006, 33(1): 73–88) is applied to minimize quadratic objective functions, this new stepsize converges to the reciprocal of the largest eigenvalue of the Hessian. Based on this spectral property, we develop a monotone gradient method that takes a certain number of steps with the asymptotically optimal stepsize of Dai and Yang and then follows with some short steps associated with the new stepsize. By employing a one-step retard of the asymptotically optimal stepsize, we also propose a nonmonotone variant of this method. Under mild conditions, R-linear convergence of the proposed methods is established for minimizing quadratic functions. In addition, by combining gradient projection techniques with an adaptive nonmonotone line search, we extend these methods to general bound constrained optimization. Two variants of gradient projection methods combined with the Barzilai-Borwein stepsizes are also proposed. Our numerical experiments on both quadratic and bound constrained optimization indicate that the newly proposed strategies and methods are very effective.
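To illustrate the kind of spectral stepsize the abstract refers to, the sketch below runs a gradient method with the classical Barzilai-Borwein (BB1) stepsize on a small quadratic. This is not the paper's new stepsize or the Dai-Yang method; it is a minimal, assumed example showing the well-known fact that for a quadratic with Hessian A, the BB1 stepsize s^T s / s^T y always lies between the reciprocals of the largest and smallest eigenvalues of A. The diagonal test matrix and starting point are illustrative choices.

```python
# Sketch of a gradient method with the Barzilai-Borwein (BB1) stepsize on
# f(x) = 0.5 * x^T A x with A = diag(2, 10). For quadratics, y = A s, so the
# BB1 stepsize s^T s / s^T y lies in [1/lambda_max, 1/lambda_min] = [0.1, 0.5].

def bb_gradient(diag, x0, tol=1e-10, max_iter=500):
    """Minimize f(x) = 0.5 * sum(diag[i] * x[i]**2) using BB1 stepsizes."""
    x = list(x0)
    g = [d * xi for d, xi in zip(diag, x)]   # gradient of the quadratic: A x
    alpha = 1.0 / max(diag)                  # safe initial stepsize 1/lambda_max
    for _ in range(max_iter):
        if sum(gi * gi for gi in g) ** 0.5 < tol:
            break
        x_new = [xi - alpha * gi for xi, gi in zip(x, g)]
        g_new = [d * xi for d, xi in zip(diag, x_new)]
        s = [a - b for a, b in zip(x_new, x)]   # s_k = x_{k+1} - x_k
        y = [a - b for a, b in zip(g_new, g)]   # y_k = g_{k+1} - g_k = A s_k
        sty = sum(si * yi for si, yi in zip(s, y))
        if sty > 0:                              # BB1 stepsize: s^T s / s^T y
            alpha = sum(si * si for si in s) / sty
        x, g = x_new, g_new
    return x, alpha

x, alpha = bb_gradient([2.0, 10.0], [1.0, 1.0])
# x converges to the minimizer (0, 0); alpha stays within [0.1, 0.5]
```

The paper's methods refine this idea: instead of BB-type quotients alone, they alternate the asymptotically optimal stepsize with short steps built from the new stepsize, which converges to 1/lambda_max.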
Disclosure statement
No potential conflict of interest was reported by the author(s).
Notes
1 Codes available at https://www.ime.usp.br/~egbirgin/tango/codes.php
Additional information
Funding
Notes on contributors
Yakui Huang
Yakui Huang received the PhD degree in applied mathematics from Xidian University, Xi'an, China, in 2015. After two years of postdoctoral training at the Academy of Mathematics and Systems Science, Chinese Academy of Sciences, he joined Hebei University of Technology, Tianjin, China, in 2017, and is currently an Associate Professor with the Institute of Mathematics, Hebei University of Technology. His research interests include nonlinear optimization theory, numerical algorithms and applications in machine learning.
Yu-Hong Dai
Yu-Hong Dai received his PhD from the Institute of Computational Mathematics and Scientific/Engineering Computing, Chinese Academy of Sciences, Beijing, China, in 1997. He then worked at the Academy of Mathematics and Systems Science (AMSS), Chinese Academy of Sciences, Beijing, China, and became a Full Professor in 2006. He is currently the Director of the Center for Optimization and Applications of AMSS and also Assistant President of AMSS. His research interests include nonlinear optimization, integer programming and various optimization applications.
Xin-Wei Liu
Xin-Wei Liu received his PhD degree from the Institute of Computational Mathematics and Scientific/Engineering Computing, Chinese Academy of Sciences, Beijing, China, in 1998. He is the Deputy Director of and a Professor at the Institute of Mathematics, Hebei University of Technology, Tianjin, China. He currently serves as an Associate Editor of Mathematical Methods of Operations Research and is a member of the editorial board of Journal of Numerical Methods and Computer Applications. His research interests are mainly in algorithms and applications of nonlinear optimization.
Hongchao Zhang
Hongchao Zhang received his PhD in applied mathematics from the University of Florida in 2006. He then held postdoctoral positions at the Institute for Mathematics and Its Applications (IMA) and the IBM T.J. Watson Research Center. He joined Louisiana State University (LSU) as an Assistant Professor in 2008 and is now an Associate Professor in the Department of Mathematics and the Center for Computation & Technology (CCT) at LSU. His research interests are nonlinear optimization theory, algorithms and applications.