Abstract
We present two quasi-Newton algorithms for solving unconstrained optimization problems, based on two modified secant relations that yield reliable approximations of the Hessian matrix of the objective function. The proposed methods make use of both gradient and function values, and exploit information from the two most recent steps, in contrast to the usual secant relation, which uses only the latest step. We show that the modified BFGS methods based on the new secant relations are globally convergent and have a local superlinear rate of convergence. Computational experiments are carried out on problems from the CUTEst library. Comparative numerical results show the competitiveness of the proposed methods in the sense of the Dolan–Moré performance profiles.
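For context, the standard secant relation referred to above can be written as follows; the displayed modified form is a generic illustration of how function values enter such relations in the literature, not necessarily the authors' exact formulas.

```latex
% Standard secant relation: the updated Hessian approximation
% B_{k+1} maps the latest step onto the latest gradient change,
%   s_k = x_{k+1} - x_k,   y_k = \nabla f(x_{k+1}) - \nabla f(x_k),
\[
  B_{k+1} s_k = y_k .
\]
% Modified secant relations (illustrative form) replace y_k by a
% corrected vector that also uses function values, e.g.
\[
  B_{k+1} s_k = y_k + \frac{\theta_k}{s_k^{\top} u_k}\, u_k ,
  \qquad
  \theta_k = 2\bigl(f(x_k) - f(x_{k+1})\bigr) + \bigl(\nabla f(x_{k+1}) + \nabla f(x_k)\bigr)^{\top} s_k ,
\]
% where u_k is a suitable vector with s_k^T u_k \neq 0; the methods in
% the paper additionally draw on the step before s_k, i.e. the two most
% recent steps rather than the latest one alone.
```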
Disclosure statement
No potential conflict of interest was reported by the author(s).