A new class of quasi-Newtonian methods for optimal learning in MLP-networks

A. Bortoletti, C. Di Fiore, S. Fanelli, P. Zellini
2003 IEEE Transactions on Neural Networks  
In this paper, we present a new class of quasi-Newton methods for the effective learning in large multilayer perceptron (MLP)-networks. The algorithms introduced in this work, named LQN, utilize an iterative scheme of a generalized BFGS-type method, involving a suitable family of matrix algebras L. The main advantages of these innovative methods are that they have an O(n log n) complexity per step and that they require O(n) memory allocations. Numerical experiments, performed on a set of standard benchmarks of MLP-networks, show the competitiveness of the LQN methods, especially for large values of n.
Index Terms: Fast discrete transforms, neural networks, quasi-Newton methods.
doi:10.1109/TNN.2003.809425 pmid:18238010
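
The abstract attributes the O(n log n) per-step cost and O(n) storage to constraining the BFGS-type update to a family of matrix algebras diagonalized by fast discrete transforms. The sketch below is only a hedged illustration of that structural idea, not the paper's LQN update: it uses the circulant algebra and NumPy's FFT as a stand-in, and the function names, the per-frequency secant fit, and the backtracking line search are assumptions introduced here for the example.

```python
# Illustrative sketch only -- NOT the authors' LQN/BFGS-type scheme.
# It demonstrates the general idea of restricting a quasi-Newton Hessian
# approximation to a matrix algebra diagonalized by a fast transform (here
# the circulant algebra via the FFT), so each step costs O(n log n) and only
# the n eigenvalues are stored (O(n) memory).
import numpy as np


def secant_fit_eigenvalues(d, s, y, eps=1e-8):
    """Refit the stored eigenvalues d of B = F^{-1} diag(d) F so that the
    secant condition B s ~ y holds componentwise in the transform domain."""
    s_hat = np.fft.fft(s)
    y_hat = np.fft.fft(y)
    # Per-frequency least-squares fit, clipped to keep B positive definite.
    d_new = np.real(y_hat * np.conj(s_hat)) / np.maximum(np.abs(s_hat) ** 2, eps)
    return np.maximum(d_new, eps)


def structured_qn_minimize(f, grad, x0, iters=200, tol=1e-6):
    """Quasi-Newton iteration with B kept in the circulant algebra:
    O(n) storage for its eigenvalues, two FFTs (O(n log n)) per solve."""
    x = np.asarray(x0, dtype=float).copy()
    d = np.ones_like(x)                     # eigenvalues of the initial B = I
    g = grad(x)
    for _ in range(iters):
        if np.linalg.norm(g) < tol:
            break
        # Search direction p = -B^{-1} g computed entirely with FFTs.
        p = -np.real(np.fft.ifft(np.fft.fft(g) / d))
        # Simple Armijo backtracking to keep the toy example robust.
        t, fx, slope = 1.0, f(x), float(g @ p)
        while f(x + t * p) > fx + 1e-4 * t * slope and t > 1e-12:
            t *= 0.5
        x_new = x + t * p
        g_new = grad(x_new)
        d = secant_fit_eigenvalues(d, x_new - x, g_new - g)
        x, g = x_new, g_new
    return x


if __name__ == "__main__":
    # Toy strictly convex quadratic in place of an MLP loss.
    n = 1024
    a = np.linspace(1.0, 10.0, n)
    f = lambda x: 0.5 * float(np.sum(a * x ** 2))
    grad = lambda x: a * x
    x_star = structured_qn_minimize(f, grad, np.random.randn(n))
    print("final gradient norm:", np.linalg.norm(grad(x_star)))
```

The design point the example tries to convey is that the Hessian surrogate never exists as an n-by-n array: only its n transform-domain eigenvalues are kept, and every solve reduces to fast transforms, which is what makes the memory O(n) and the per-step work O(n log n) for large n.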