I.E. Livieris and P. Pintelas. Performance evaluation of descent CG methods for neural network training. In Proceedings of the 9th Hellenic European Research on Computer Mathematics & its Applications Conference (HERCMA'09), Athens, 2009. Also included in HERMIS: An International Journal of Computer Mathematics and its Applications, Volume 11, pp. 40-46, 2009.
Abstract: Conjugate gradient methods constitute an excellent choice for efficiently training large neural networks, since they require neither the evaluation of the Hessian matrix nor the impractical storage of an approximation of it. Despite the theoretical and practical advantages of these methods, their main drawback is the use of restarting procedures to guarantee convergence, which abandons second-order derivative information. In this work, we propose a neural network training algorithm which preserves the advantages of classical conjugate gradient methods and simultaneously avoids the inefficient restarts. Encouraging numerical experiments verify that the presented algorithm provides fast, stable and reliable convergence.
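To make the restarting drawback concrete, the following is a minimal sketch (not the paper's algorithm) of a classical nonlinear conjugate gradient loop with the Polak-Ribiere formula and the standard PR+ restart: whenever the conjugacy parameter beta goes negative, the search direction is reset to steepest descent, discarding the accumulated curvature information. The quadratic objective and all parameter values are illustrative assumptions, standing in for a network's error surface.

```python
import numpy as np

def armijo_line_search(loss, w, d, g, a=1.0, rho=0.5, c=1e-4):
    """Backtracking line search satisfying the Armijo sufficient-decrease condition."""
    while loss(w + a * d) > loss(w) + c * a * (g @ d):
        a *= rho
    return a

def pr_conjugate_gradient(loss, grad, w, max_iters=100, tol=1e-8):
    """Nonlinear CG with the Polak-Ribiere formula and a PR+ restart.

    The restart (clamping beta at zero) is the kind of procedure the
    abstract refers to: it guarantees a descent direction but throws
    away the second-order information carried by the previous direction.
    """
    g = grad(w)
    d = -g                                    # initial direction: steepest descent
    for _ in range(max_iters):
        a = armijo_line_search(loss, w, d, g)
        w = w + a * d
        g_new = grad(w)
        if np.linalg.norm(g_new) < tol:       # converged
            break
        # Polak-Ribiere conjugacy parameter
        beta = g_new @ (g_new - g) / (g @ g)
        beta = max(beta, 0.0)                 # restart: beta < 0 => steepest descent
        d = -g_new + beta * d
        g = g_new
    return w

# Toy convex quadratic standing in for a training loss (illustrative only)
A = np.array([[3.0, 0.5], [0.5, 1.0]])
b = np.array([1.0, -2.0])
loss = lambda w: 0.5 * w @ A @ w - b @ w
grad = lambda w: A @ w - b
w_star = pr_conjugate_gradient(loss, grad, np.zeros(2))
```

On this quadratic the iterates approach the minimizer of the loss; on a genuine network loss the gradient would come from backpropagation, but the update structure is the same.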