Original Articles

New conjugate gradient-like methods for unconstrained optimization

Pages 1302-1316 | Received 20 Dec 2011, Accepted 24 Feb 2014, Published online: 07 Apr 2014
