
An advanced conjugate gradient training algorithm based on a modified secant equation

I.E. Livieris and P. Pintelas, An Advanced Conjugate Gradient Training Algorithm Based on a Modified Secant Equation, ISRN Artificial Intelligence, 2012.
Abstract - Conjugate gradient methods constitute excellent neural network training methods, characterized by their simplicity, numerical efficiency, and very low memory requirements. In this paper, we propose a conjugate gradient neural network training algorithm which guarantees sufficient descent with any line search, thereby avoiding the usually inefficient restarts. Moreover, it achieves high-order accuracy in approximating the second-order curvature information of the error surface by utilizing the modified secant condition proposed by Li et al. (J. Comput. Appl. Math. 202(2):523--539, 2007). Under mild conditions, we establish that the proposed method is globally convergent for general functions under the strong Wolfe conditions. Experimental results provide evidence that our proposed method is preferable and in general superior to classical conjugate gradient methods, and has the potential to significantly enhance the computational efficiency and robustness of the training process.
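The ingredients named in the abstract (a conjugate gradient direction update fed by a modified secant vector, with a restart only when descent fails) can be sketched roughly as follows. This is an illustrative sketch, not the paper's actual algorithm: the `theta` correction follows the Li et al. (2007) modified secant condition with u = s, while the Hestenes-Stiefel-type beta and the simple backtracking Armijo line search (a stand-in for a strong Wolfe search) are assumptions, as are all function and variable names.

```python
import numpy as np

def modified_y(f_k, f_next, g_k, g_next, s):
    # Li et al. (2007) modified secant vector y* = y + (theta / s^T s) s,
    # taking u = s; theta folds in function-value information.
    y = g_next - g_k
    theta = 2.0 * (f_k - f_next) + (g_k + g_next) @ s
    return y + (theta / (s @ s)) * s

def cg_train(f, grad, w, iters=200):
    """Minimize f by a conjugate-gradient-type iteration (illustrative only)."""
    g = grad(w)
    d = -g
    for _ in range(iters):
        if np.linalg.norm(g) < 1e-10:   # gradient vanished: done
            break
        # Backtracking Armijo line search (the paper assumes strong Wolfe).
        alpha, c1 = 1.0, 1e-4
        fw = f(w)
        while f(w + alpha * d) > fw + c1 * alpha * (g @ d):
            alpha *= 0.5
            if alpha < 1e-12:
                break
        s = alpha * d
        w_next = w + s
        g_next = grad(w_next)
        ystar = modified_y(fw, f(w_next), g, g_next, s)
        # Hestenes-Stiefel-type beta built from the modified secant vector.
        denom = d @ ystar
        beta = max(0.0, (g_next @ ystar) / denom) if abs(denom) > 1e-12 else 0.0
        d = -g_next + beta * d
        if g_next @ d >= 0:             # restart only if descent is lost
            d = -g_next
        w, g = w_next, g_next
    return w

# Demo: minimize an ill-conditioned quadratic f(w) = 0.5*(w0^2 + 10*w1^2).
f = lambda w: 0.5 * (w[0] ** 2 + 10.0 * w[1] ** 2)
grad = lambda w: np.array([w[0], 10.0 * w[1]])
w_min = cg_train(f, grad, np.array([3.0, 2.0]))   # w_min is close to the origin
```

The restart check mirrors the abstract's point: the direction is reset to steepest descent only when the CG update fails to produce a descent direction, rather than on a fixed schedule.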

Designed by Ioannis E. Livieris.