Gradient-based neural network training algorithms may suffer from local optima, poor generalization, and slow convergence. In this study, a novel Memetic Algorithm based hybrid method integrating "extremal optimization" (EO) and "Levenberg-Marquardt" (LM) is proposed to train multilayer perceptron (MLP) networks. Inheriting the advantages of both approaches, the proposed "EO-LM" method can avoid local minima and improve the generalization capability of MLP network learning.

doi:10.1080/18756891.2010.9727728
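The abstract does not give the authors' algorithm in detail, but the general idea of a memetic EO-LM hybrid can be sketched as alternating a global, extremal-optimization-style perturbation with local Levenberg-Marquardt refinement of the MLP weights. The sketch below is a minimal illustration under that assumption, not the paper's method: the network architecture (1-5-1), the toy task (fitting sin), the choice of "worst component" (largest gradient magnitude), and all hyperparameters are invented for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression task: fit y = sin(x) with a tiny 1-5-1 MLP (16 weights).
X = np.linspace(-np.pi, np.pi, 40)
Y = np.sin(X)

def unpack(w):
    # Split the flat weight vector: hidden weights/biases, output weights/bias.
    return w[:5], w[5:10], w[10:15], w[15]

def forward(w, x):
    w1, b1, w2, b2 = unpack(w)
    h = np.tanh(np.outer(x, w1) + b1)   # hidden activations, shape (n, 5)
    return h @ w2 + b2

def residuals(w):
    return forward(w, X) - Y

def jacobian(w, eps=1e-6):
    # Finite-difference Jacobian of the residuals w.r.t. the weights.
    r0 = residuals(w)
    J = np.empty((len(r0), len(w)))
    for i in range(len(w)):
        wp = w.copy()
        wp[i] += eps
        J[:, i] = (residuals(wp) - r0) / eps
    return J

def lm_step(w, lam=1e-2):
    # One Levenberg-Marquardt update: solve (J^T J + lam I) dw = -J^T r.
    J, r = jacobian(w), residuals(w)
    A = J.T @ J + lam * np.eye(len(w))
    return w + np.linalg.solve(A, -J.T @ r)

def eo_step(w):
    # EO-style move: randomly perturb the single "worst" weight, here
    # (an assumption) the one with the largest gradient magnitude.
    g = jacobian(w).T @ residuals(w)
    worst = np.argmax(np.abs(g))
    w = w.copy()
    w[worst] += rng.normal(scale=0.5)
    return w

w = rng.normal(scale=0.5, size=16)
best, best_err = w, np.mean(residuals(w) ** 2)
for _ in range(200):
    # Alternate a global EO perturbation with local LM refinement,
    # keeping the candidate only if it improves the training MSE.
    cand = eo_step(best)
    for _ in range(5):
        cand = lm_step(cand)
    err = np.mean(residuals(cand) ** 2)
    if err < best_err:
        best, best_err = cand, err

print(f"final MSE: {best_err:.4f}")
```

The accept-if-better loop is what makes the hybrid memetic: EO supplies diversification to escape local minima, while LM exploits second-order curvature information for fast local convergence, matching the complementary roles the abstract attributes to the two components.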