A descent hybrid conjugate gradient method based on the memoryless BFGS update

I.E. Livieris, V. Tampakas and P. Pintelas. A descent hybrid conjugate gradient method based on the memoryless BFGS update. Numerical Algorithms, 2018.

Abstract - In this work, we present a new hybrid conjugate gradient method based on a convex hybridization of the DY and HS+ conjugate gradient update parameters, adapted to a quasi-Newton philosophy. The hybridization parameter is computed by minimizing the distance between the hybrid conjugate gradient direction and the self-scaling memoryless BFGS direction. Furthermore, a significant property of the proposed method is that it ensures sufficient descent independently of the accuracy of the line search. The global convergence of the proposed method is established provided that the line search satisfies the Wolfe conditions. Our numerical experiments on a set of unconstrained optimization test problems from the CUTEr collection indicate that the proposed method is preferable and in general superior to classical conjugate gradient methods in terms of efficiency and robustness.
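
For orientation, the convex hybridization described in the abstract typically takes the following form, written in standard conjugate gradient notation (g_{k+1} is the gradient at the new iterate, d_k the current search direction, and y_k = g_{k+1} - g_k). This is a sketch of the general construction, not the paper's exact formulas; in particular, the precise form of the self-scaling memoryless BFGS direction used in the minimization may differ.

% Hybrid direction: a convex combination of the DY and HS+ update parameters.
\[
  d_{k+1} = -g_{k+1} + \beta_k^{hyb} d_k, \qquad
  \beta_k^{hyb} = \theta_k \beta_k^{DY} + (1 - \theta_k)\,\beta_k^{HS+},
\]
% Standard definitions of the two update parameters being hybridized.
\[
  \beta_k^{DY} = \frac{\|g_{k+1}\|^2}{d_k^{T} y_k}, \qquad
  \beta_k^{HS+} = \max\!\left\{ 0,\; \frac{g_{k+1}^{T} y_k}{d_k^{T} y_k} \right\},
\]
% The hybridization parameter \theta_k \in [0, 1] is chosen, per the abstract,
% to minimize the distance \| d_{k+1} - d_{k+1}^{MBFGS} \| between the hybrid
% direction and the self-scaling memoryless BFGS direction d_{k+1}^{MBFGS}.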