Original Articles

Gradient methods exploiting spectral properties

Yakui Huang, Yu-Hong Dai, Xin-Wei Liu & Hongchao Zhang
Pages 681-705 | Received 09 Nov 2018, Accepted 01 Feb 2020, Published online: 17 Feb 2020
 

Abstract

We propose a new stepsize for the gradient method. It is shown that this new stepsize converges to the reciprocal of the largest eigenvalue of the Hessian when Dai and Yang's asymptotically optimal gradient method (Computational Optimization and Applications, 2006, 33(1): 73–88) is applied to minimize quadratic objective functions. Based on this spectral property, we develop a monotone gradient method that takes a certain number of steps using the asymptotically optimal stepsize of Dai and Yang, followed by some short steps associated with the new stepsize. By employing a one-step retard of the asymptotically optimal stepsize, a nonmonotone variant of this method is also proposed. Under mild conditions, R-linear convergence of the proposed methods is established for minimizing quadratic functions. In addition, by combining gradient projection techniques with an adaptive nonmonotone line search, we further extend these methods to general bound-constrained optimization. Two variants of gradient projection methods combined with the Barzilai-Borwein stepsizes are also proposed. Our numerical experiments on both quadratic and bound-constrained optimization indicate that the newly proposed strategies and methods are very effective.
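For orientation, here is a minimal sketch of a Barzilai-Borwein (BB1) gradient method on a convex quadratic, the family of spectral-stepsize methods the abstract refers to. It does not reproduce the paper's new stepsize or the Dai-Yang asymptotically optimal stepsize, which are not given on this page; the function name `bb_gradient` and the test problem are illustrative assumptions.

```python
# Sketch: BB1 gradient method for f(x) = 0.5 x^T A x - b^T x with A symmetric
# positive definite. Illustrative only; not the method proposed in the paper.
import numpy as np

def bb_gradient(A, b, x0, tol=1e-8, max_iter=1000):
    x = x0.copy()
    g = A @ x - b                                # gradient of the quadratic
    alpha = 1.0 / max(np.linalg.norm(g), 1e-12)  # conservative first stepsize
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - alpha * g
        g_new = A @ x_new - b
        s, y = x_new - x, g_new - g
        alpha = (s @ s) / (s @ y)                # BB1 stepsize: s^T s / s^T y
        x, g = x_new, g_new
    return x

# Usage on a random SPD quadratic (hypothetical test problem):
rng = np.random.default_rng(0)
M = rng.standard_normal((50, 50))
A = M @ M.T + 50 * np.eye(50)                    # symmetric positive definite
b = rng.standard_normal(50)
x_star = bb_gradient(A, b, np.zeros(50))
print(np.linalg.norm(A @ x_star - b))            # residual near zero
```

The BB1 stepsize can be read as a Rayleigh-quotient estimate of an inverse eigenvalue of the Hessian, which is why such stepsizes are said to exploit spectral properties.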


Disclosure statement

No potential conflict of interest was reported by the author(s).


Additional information

Funding

This work was supported by the National Natural Science Foundation of China (11701137, 11631013, 11671116), by the National 973 Program of China (2015CB856002), by the China Scholarship Council (No. 201806705007), and by the US National Science Foundation (1522654, 1819161).

Notes on contributors

Yakui Huang

Yakui Huang received his PhD degree in applied mathematics from Xidian University, Xi'an, China, in 2015. After two years of postdoctoral training at the Academy of Mathematics and Systems Science, Chinese Academy of Sciences, he joined Hebei University of Technology, Tianjin, China, in 2017, where he is currently an Associate Professor with the Institute of Mathematics. His research interests include nonlinear optimization theory, numerical algorithms, and applications in machine learning.

Yu-Hong Dai

Yu-Hong Dai received his PhD from the Institute of Computational Mathematics and Scientific/Engineering Computing, Chinese Academy of Sciences, Beijing, China, in 1997. He then worked at the Academy of Mathematics and Systems Science (AMSS), Chinese Academy of Sciences, Beijing, China, and became a Full Professor in 2006. He is currently the Director of the Center for Optimization and Applications of AMSS and also Assistant President of AMSS. His research interests include nonlinear optimization, integer programming, and various optimization applications.

Xin-Wei Liu

Xin-Wei Liu received his PhD degree from the Institute of Computational Mathematics and Scientific/Engineering Computing, Chinese Academy of Sciences, Beijing, China, in 1998. He is the Deputy Director and a Professor with the Institute of Mathematics, Hebei University of Technology, Tianjin, China. He currently serves as an Associate Editor of Mathematical Methods of Operations Research and as a member of the editorial board of the Journal of Numerical Methods and Computer Applications. His research interests are mainly in algorithms and applications of nonlinear optimization.

Hongchao Zhang

Hongchao Zhang received his PhD in applied mathematics from the University of Florida in 2006. He then held postdoctoral positions at the Institute for Mathematics and Its Applications (IMA) and the IBM T.J. Watson Research Center. He joined Louisiana State University (LSU) as an Assistant Professor in 2008 and is now an Associate Professor in the Department of Mathematics and the Center for Computation & Technology (CCT) at LSU. His research interests are nonlinear optimization theory, algorithms, and applications.
