Abstract
Conjugate gradient algorithms come in a wide range of flavors, and they differ primarily in the choice of the conjugate coefficient. We introduce a novel conjugate gradient method that computes this parameter using Newton updates. We prove that the new method is globally convergent and satisfies the descent property. On the test problems specified in [1], performance profiles show that the new conjugate gradient method is effective and efficient.
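For orientation, a generic nonlinear conjugate gradient iteration can be sketched as follows. This is a minimal illustrative sketch, not the paper's method: the conjugate coefficient below is the standard Polak-Ribière-plus choice, used only as a placeholder since the Newton-update-based parameter is not specified in the abstract, and the quadratic test function is an assumed example.

```python
import numpy as np

def nonlinear_cg(f, grad, x0, tol=1e-6, max_iter=500):
    # Generic nonlinear conjugate gradient iteration with a
    # Polak-Ribiere-plus coefficient and Armijo backtracking.
    # (Illustrative only: the paper's Newton-update-based
    # coefficient is not reproduced here.)
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # first search direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0:  # safeguard: restart if d is not a descent direction
            d = -g
        # Armijo backtracking line search
        alpha, fx = 1.0, f(x)
        while f(x + alpha * d) > fx + 1e-4 * alpha * (g @ d):
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Polak-Ribiere-plus conjugate coefficient (clipped at zero)
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Minimize f(x, y) = (x - 3)^2 + 2 * (y + 1)^2, minimizer (3, -1)
f = lambda v: (v[0] - 3) ** 2 + 2 * (v[1] + 1) ** 2
grad = lambda v: np.array([2 * (v[0] - 3), 4 * (v[1] + 1)])
sol = nonlinear_cg(f, grad, np.zeros(2))
```

Variants of this scheme differ only in how the coefficient beta is computed; the method summarized above replaces the standard formula with one derived from Newton updates.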
Subject Classification: