References
- M. Al-Baali, Descent property and global convergence of the Fletcher–Reeves method with inexact line search, IMA J. Numer. Anal. 5(1) (1985), pp. 121–124.
- M. Al-Baali, Numerical experience with a class of self-scaling quasi-Newton algorithms, J. Optim. Theory Appl. 96 (1998), pp. 533–553.
- K. Amini, P. Faramarzi, and N. Pirfalah, A modified Hestenes–Stiefel conjugate gradient method with an optimal property, Optim. Methods Softw. 34(4) (2019), pp. 770–782.
- N. Andrei, An unconstrained optimization test functions collection, Adv. Model. Optim. 10(1) (2008), pp. 147–161.
- N. Andrei, Another hybrid conjugate gradient algorithm for unconstrained optimization, Numer. Algor. 47(2) (2008), pp. 143–156.
- I. Bongartz, A.R. Conn, N.I.M. Gould, and P.L. Toint, CUTE: Constrained and unconstrained testing environments, ACM Trans. Math. Softw. 21(1) (1995), pp. 123–160.
- Y.H. Dai and C.X. Kou, A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search, SIAM J. Optim. 23(1) (2013), pp. 296–320.
- Y.H. Dai and L.Z. Liao, New conjugacy conditions and related nonlinear conjugate gradient methods, Appl. Math. Optim. 43(1) (2001), pp. 87–101.
- Y.H. Dai and Y. Yuan, A nonlinear conjugate gradient method with a strong global convergence property, SIAM J. Optim. 10(1) (1999), pp. 177–182.
- Y.H. Dai and Y.X. Yuan, An efficient hybrid conjugate gradient method for unconstrained optimization, Ann. Oper. Res. 103 (2001), pp. 33–47.
- S.S. Djordjević, New hybrid conjugate gradient method as a convex combination of LS and FR methods, Acta Math. Sci. 39(1) (2019), pp. 214–228.
- E.D. Dolan and J.J. Moré, Benchmarking optimization software with performance profiles, Math. Program. 91(2) (2002), pp. 201–213.
- R. Fletcher and C.M. Reeves, Function minimization by conjugate gradients, Comput. J. 7(2) (1964), pp. 149–154.
- R. Fletcher, Practical Methods of Optimization, John Wiley & Sons, New York, 1987.
- J.C. Gilbert and J. Nocedal, Global convergence properties of conjugate gradient methods for optimization, SIAM J. Optim. 2(1) (1992), pp. 21–42.
- I. Hafaidia, H. Guebbai, M. Al-Baali, and M. Ghiat, A new hybrid conjugate gradient algorithm for unconstrained optimization, Vestnik Udmurtskogo Universiteta. Matematika. Mekhanika. Komp'yuternye Nauki 33(2) (2023), pp. 348–364.
- W.W. Hager and H.C. Zhang, A new conjugate gradient method with guaranteed descent and an efficient line search, SIAM J. Optim. 16(1) (2005), pp. 170–192.
- W.W. Hager and H.C. Zhang, A survey of nonlinear conjugate gradient methods, Pacific J. Optim. 2(1) (2006), pp. 35–58.
- M.R. Hestenes and E. Stiefel, Methods of conjugate gradients for solving linear systems, J. Res. Natl. Bur. Stand. 49(6) (1952), pp. 409–436.
- A.H. Ibrahim, P. Kumam, A. Kamandi, and A.B. Abubakar, An efficient hybrid conjugate gradient method for unconstrained optimization, Optim. Methods Softw. 37(4) (2022), pp. 1370–1383.
- X.Z. Jiang, W. Liao, J.H. Yin, and J.B. Jian, A new family of hybrid three-term conjugate gradient methods with applications in image restoration, Numer. Algor. 91 (2022), pp. 161–191.
- Z. Khoshgam and A. Ashrafi, A new modified scaled conjugate gradient method for large-scale unconstrained optimization with non-convex objective function, Optim. Methods Softw. 34(4) (2019), pp. 783–796.
- C.X. Kou and Y.H. Dai, A modified self-scaling memoryless Broyden–Fletcher–Goldfarb–Shanno method for unconstrained optimization, J. Optim. Theory Appl. 165(1) (2015), pp. 209–224.
- E. Polak and G. Ribière, Note sur la convergence de méthodes de directions conjuguées, Rev. Fr. Inform. Rech. Oper. 3(16) (1969), pp. 35–43.
- J.K. Liu and S.J. Li, New hybrid conjugate gradient method for unconstrained optimization, Appl. Math. Comput. 245 (2014), pp. 36–43.
- Y. Liu and C. Storey, Efficient generalized conjugate gradient algorithms, Part 1: Theory, J. Optim. Theory Appl. 69(1) (1991), pp. 129–137.
- I.E. Livieris, V. Tampakas, and P. Pintelas, A descent hybrid conjugate gradient method based on the memoryless BFGS update, Numer. Algor. 79(4) (2018), pp. 1169–1185.
- M. Lotfi and S.M. Hosseini, An efficient hybrid conjugate gradient method with sufficient descent property for unconstrained optimization, Optim. Methods Softw. 37(5) (2022), pp. 1725–1739.
- M. Momeni and M.R. Peyghami, A new conjugate gradient algorithm with cubic Barzilai–Borwein stepsize for unconstrained optimization, Optim. Methods Softw. 34(3) (2019), pp. 650–664.
- J.J. Moré, B.S. Garbow, and K.E. Hillstrom, Testing unconstrained optimization software, ACM Trans. Math. Softw. 7 (1981), pp. 17–41.
- J. Nocedal and Y. Yuan, Analysis of a self-scaling quasi-Newton method, Math. Program. 61 (1993), pp. 19–37.
- S.S. Oren and D.G. Luenberger, Self-scaling variable metric (SSVM) algorithms: Part I: Criteria and sufficient conditions for scaling a class of algorithms, Manage. Sci. 20(5) (1974), pp. 733–899.
- S.S. Oren and E. Spedicato, Optimal conditioning of self-scaling variable metric algorithms, Math. Program. 10(1) (1976), pp. 70–90.
- H. Shao, H. Guo, X.Y. Wu, and P.J. Liu, Two families of self-adjusting spectral hybrid DL conjugate gradient methods and applications in image denoising, Appl. Math. Model. 118 (2023), pp. 393–411.
- D. Touati-Ahmed and C. Storey, Efficient hybrid conjugate gradient techniques, J. Optim. Theory Appl. 64 (1990), pp. 379–397.
- Y.X. Yuan and J. Stoer, A subspace study on conjugate gradient algorithms, Z. Angew. Math. Mech. 75(1) (1995), pp. 69–77.
- G. Zoutendijk, Nonlinear programming, computational methods, in Integer and Nonlinear Programming, J. Abadie, ed., North-Holland, Amsterdam, 1970, pp. 37–86.