A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2004; you can also visit the original URL.
The file type is application/pdf.
On sequential construction of binary neural networks
1995
IEEE Transactions on Neural Networks
A new technique, called Sequential Window Learning (SWL), for the construction of two-layer perceptrons with binary inputs is presented. It generates the number of hidden neurons together with the correct values for the weights, starting from any binary training set. The introduction of a new type of neuron, having a window-shaped activation function, considerably increases the convergence speed and the compactness of resulting networks. Furthermore, a preprocessing technique, called Hamming …
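The record preserves only the abstract, so the paper's exact neuron definition is not available here. A common reading of a "window-shaped activation function" is a unit that fires when the weighted sum of its binary inputs falls inside an interval, rather than above a single threshold. The sketch below is illustrative under that assumption; the function name and parameters (`low`, `high`) are not from the paper.

```python
def window_neuron(x, w, low, high):
    """Hypothetical window-shaped activation: output 1 iff the
    weighted sum of binary inputs lies in the interval [low, high].
    (Illustrative reconstruction; not the paper's exact formulation.)"""
    s = sum(wi * xi for wi, xi in zip(w, x))
    return 1 if low <= s <= high else 0


# A single window neuron can realize XOR, which a single
# threshold neuron cannot: with weights (1, 1) and window [1, 1],
# only inputs whose sum is exactly 1 produce output 1.
print(window_neuron((0, 1), (1, 1), 1, 1))  # fires
print(window_neuron((1, 1), (1, 1), 1, 1))  # does not fire
```

This interval response is one plausible reason the abstract reports more compact networks: a window unit separates patterns that would otherwise require two threshold units.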
doi:10.1109/72.377973
pmid:18263353
fatcat:atno7arlubasvc7stiphunlese