Original Articles

A modified conjugate gradient algorithm with backtracking line search technique for large-scale nonlinear equations

Pages 382-395 | Received 27 Oct 2015, Accepted 14 Oct 2016, Published online: 19 Feb 2017
