High capacity associative memories and connection constraints

Neil Davey, Rod Adams
2004, Connection Science
High capacity associative neural networks can be built from networks of perceptrons trained using simple perceptron learning. Such networks perform much better than those trained using the standard Hopfield one-shot Hebbian learning. An experimental investigation into how such networks perform when the connection weights are not free to take any value is reported. The three restrictions investigated are: a symmetry constraint, a sign constraint and a dilution constraint. The selection of these constraints is motivated by both engineering and biological considerations. The next section describes the architecture of the neural networks that underlie this investigation. Section 3 gives the learning rules used to train the networks and Section 4 discusses the types of constraint on the weights that are investigated. Results and conclusions follow in the last two sections.

Models Examined

We consider networks of N units which we train with a set of N-ary, bipolar (+1/-1) training vectors, { p
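The abstract contrasts one-shot Hebbian (Hopfield) learning with iterative perceptron training of each unit's incoming weights, and names three weight constraints: symmetry, sign and dilution. The sketch below is not the authors' code; it is a minimal Python/NumPy illustration of what those ingredients can look like. All function names are illustrative, and enforcing each constraint by projecting the weight matrix back onto the constraint set after every training sweep is an assumption made here, not necessarily the paper's exact procedure.

```python
import numpy as np

def hebbian_weights(patterns):
    """One-shot Hopfield/Hebbian rule: W = (1/N) * sum_p xi^p (xi^p)^T, zero diagonal."""
    N = patterns.shape[1]
    W = patterns.T @ patterns / N
    np.fill_diagonal(W, 0.0)
    return W

def perceptron_weights(patterns, epochs=100, lr=0.1, margin=1.0, constrain=None):
    """Train each unit's incoming weights with simple perceptron updates so that every
    stored pattern becomes a fixed point (each unit's local field agrees in sign with its
    target bit, by at least `margin`). `constrain`, if given, projects W back onto a
    constraint set after each sweep (an illustrative choice)."""
    P, N = patterns.shape
    W = np.zeros((N, N))
    for _ in range(epochs):
        stable = True
        for xi in patterns:
            h = W @ xi                        # local fields h_i = sum_j w_ij * xi_j
            wrong = xi * h <= margin          # units whose field is wrong or too weak
            W[wrong] += lr * np.outer(xi[wrong], xi)
            np.fill_diagonal(W, 0.0)          # no self-connections
            stable = stable and not wrong.any()
        if constrain is not None:
            W = constrain(W)
        if stable:
            break
    return W

# The three kinds of constraint, written as simple projections (illustrative).

def symmetry(W):
    """Symmetry constraint: force w_ij = w_ji."""
    return 0.5 * (W + W.T)

def sign_constraint(W, signs):
    """Sign constraint: all weights leaving unit j carry the prescribed sign signs[j]
    (an analogue of Dale's law); violating weights are zeroed."""
    return np.where(W * signs[np.newaxis, :] >= 0, W, 0.0)

def dilution(W, mask):
    """Dilution constraint: only connections allowed by the fixed binary mask survive."""
    return W * mask

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    N, P = 100, 20
    patterns = rng.choice([-1.0, 1.0], size=(P, N))    # bipolar training vectors

    W_hebb = hebbian_weights(patterns)
    mask = (rng.random((N, N)) < 0.5).astype(float)    # roughly 50% dilution
    np.fill_diagonal(mask, 0.0)
    W_perc = perceptron_weights(patterns, constrain=lambda W: dilution(W, mask))

    # Fraction of pattern bits stable under one synchronous update of the network.
    def stability(W):
        return np.mean(np.sign(patterns @ W.T) == patterns)

    print("Hebbian stability:   ", stability(W_hebb))
    print("Perceptron stability:", stability(W_perc))
```

Even under heavy dilution, the iteratively trained weights typically keep far more pattern bits stable than the one-shot Hebbian matrix, which is the kind of comparison the paper's experiments quantify.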
doi:10.1080/09540090310001659981