A new conjugate gradient algorithm for training neural networks based on a modified secant equation
I.E. Livieris and P. Pintelas, A new conjugate gradient algorithm for training neural networks based on a modified secant equation. Applied Mathematics and Computation, Volume 221, pp. 491-502, 2013.

Abstract - Conjugate gradient methods have been established as excellent neural network training methods, due to the simplicity of their iterations, their numerical efficiency, and their low memory requirements. In this work, we propose a conjugate gradient neural network training algorithm which guarantees sufficient descent using any line search, thereby avoiding the usually inefficient restarts. Moreover, it approximates the second-order curvature information of the error surface with high-order accuracy by utilizing a new modified secant condition. Under mild conditions, we establish the global convergence of our proposed method. Experimental results provide evidence that our proposed method is in general superior to the classical conjugate gradient training methods and has the potential to significantly enhance the computational efficiency and robustness of the training process.
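
The sketch below illustrates the general structure of such a nonlinear conjugate gradient training loop, under stated assumptions: the function names (cg_train, f, grad), the backtracking (Armijo) line search, the Hestenes-Stiefel choice of beta, and the quadratic test problem are all illustrative stand-ins. The paper's actual beta is derived from its modified secant condition and is not reproduced here; likewise, the explicit restart below is exactly the safeguard the paper's update is designed to avoid.

```python
# Minimal sketch of nonlinear conjugate gradient minimization of a
# training error function. Assumed/illustrative: cg_train, the Armijo
# line search, and the Hestenes-Stiefel beta (the paper derives beta
# from a modified secant condition instead).
import numpy as np

def cg_train(f, grad, w0, max_iters=200, tol=1e-6):
    """Minimize the error function f starting from weights w0.

    f    : callable returning the scalar training error
    grad : callable returning the gradient of f
    """
    w = w0.copy()
    g = grad(w)
    d = -g  # initial direction: steepest descent
    for _ in range(max_iters):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking (Armijo) line search -- a simple stand-in for
        # the line searches analyzed in the paper.
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(w + alpha * d) > f(w) + c * alpha * g.dot(d):
            alpha *= rho
        w_new = w + alpha * d
        g_new = grad(w_new)
        y = g_new - g  # gradient difference; the paper replaces this
                       # with a modified secant quantity for higher-
                       # order curvature accuracy
        beta = g_new.dot(y) / max(d.dot(y), 1e-12)
        d_new = -g_new + beta * d
        # Restart with steepest descent if sufficient descent fails;
        # the proposed method guarantees descent by construction and
        # thus avoids this restart.
        if g_new.dot(d_new) > -1e-10 * g_new.dot(g_new):
            d_new = -g_new
        w, g, d = w_new, g_new, d_new
    return w

# Usage: minimize a convex quadratic as a toy error surface.
A = np.array([[3.0, 0.5], [0.5, 1.0]])
f = lambda w: 0.5 * w @ A @ w
grad = lambda w: A @ w
print(cg_train(f, grad, np.array([1.0, -2.0])))  # converges to ~[0, 0]
```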