A descent hybrid conjugate gradient method based on the memoryless BFGS update

I.E. Livieris, V. Tampakas and P. Pintelas. A descent hybrid conjugate gradient method based on the memoryless BFGS update. Numerical Algorithms, 2018.

Abstract - In this work, we present a new hybrid conjugate gradient method based on the convex hybridization of the DY and HS+ conjugate gradient update parameters, adapting a quasi-Newton philosophy. The hybridization parameter is computed by minimizing the distance between the hybrid conjugate gradient direction and the self-scaling memoryless BFGS direction. Furthermore, a significant property of the proposed method is that it ensures sufficient descent independently of the accuracy of the line search. The global convergence of the proposed method is established provided that the line search satisfies the Wolfe conditions. Our numerical experiments on a set of unconstrained optimization test problems from the CUTEr collection indicate that the proposed method is preferable to, and in general superior to, classical conjugate gradient methods in terms of efficiency and robustness.
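For orientation, the following is a brief sketch of the quantities mentioned in the abstract, written in the standard conjugate gradient notation (g_k is the gradient at x_k, d_k the search direction, y_k = g_{k+1} - g_k); the DY and HS+ formulas below are the classical definitions, while the closed-form expression for the hybridization parameter is derived in the paper itself and is not reproduced here.

\[
d_0 = -g_0, \qquad d_{k+1} = -g_{k+1} + \beta_k d_k,
\]
\[
\beta_k^{DY} = \frac{\|g_{k+1}\|^2}{d_k^{\top} y_k}, \qquad
\beta_k^{HS+} = \max\!\left\{ \frac{g_{k+1}^{\top} y_k}{d_k^{\top} y_k},\; 0 \right\},
\]
\[
\beta_k(\theta_k) = (1-\theta_k)\,\beta_k^{HS+} + \theta_k\,\beta_k^{DY}, \qquad \theta_k \in [0,1],
\]
\[
\theta_k \in \arg\min_{\theta \in [0,1]} \left\| \bigl(-g_{k+1} + \beta_k(\theta)\, d_k\bigr) - d_{k+1}^{\mathrm{SSML\text{-}BFGS}} \right\|,
\]

where d_{k+1}^{SSML-BFGS} denotes the direction produced by the self-scaling memoryless BFGS update.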

 
