Generating network trajectories using gradient descent in state space

R.H.R. Hahnloser
1998 IEEE International Joint Conference on Neural Networks Proceedings. IEEE World Congress on Computational Intelligence (Cat. No.98CH36227)  
A simple, local learning algorithm is introduced that gradually minimizes an error function over the neural states of a general network. Unlike standard backpropagation algorithms, it is based on linearizing the neurodynamics, which are interpreted as constraints on the different network variables. From the resulting equations, a minimal-norm weight update is derived that produces state changes directed precisely toward the target values. As an application, it is shown how to generate desired neural state-space curves on recurrent Hopfield-type networks.
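The core idea — a minimal-norm weight update that moves the network state exactly toward a target — can be sketched as follows. This is an illustrative reconstruction, not the paper's exact derivation: it assumes rate dynamics of the form x' = -x + W f(x) and solves the rank-one constraint dW·f(x) = dx for the update dW with minimal Frobenius norm; the function names and the step size `eta` are assumptions.

```python
import numpy as np

def minimal_norm_update(x, target, f, eta=0.1):
    """Rank-one weight update dW of minimal Frobenius norm such that
    dW @ f(x) equals the desired state change dx = eta * (target - x).

    The minimal-norm solution of dW @ r = dx (for a single rate vector r)
    is the outer product dW = dx r^T / (r . r), i.e. the pseudoinverse
    solution. Assumed form, for illustration only.
    """
    r = f(x)                          # current firing rates
    dx = eta * (target - x)          # desired small state change toward target
    dW = np.outer(dx, r) / (r @ r)   # minimal-norm solution of dW @ r = dx
    return dW

# Toy usage on a small recurrent network with tanh rates.
rng = np.random.default_rng(0)
n = 4
x = rng.standard_normal(n)           # current network state
target = np.zeros(n)                 # desired state on the trajectory
dW = minimal_norm_update(x, target, np.tanh)

# The update drives the state change exactly along (target - x):
assert np.allclose(dW @ np.tanh(x), 0.1 * (target - x))
```

Repeating such updates along a sequence of targets would, under these assumptions, trace out a desired curve in state space, consistent with the abstract's description.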
doi:10.1109/ijcnn.1998.687233