Initialization and self‐organized optimization of recurrent neural network connectivity

Joschka Boedecker, Oliver Obst, N. Michael Mayer, Minoru Asada
2009 HFSP Journal  
Reservoir computing (RC) is a recent paradigm in the field of recurrent neural networks. Networks in RC have a sparsely and randomly connected fixed hidden layer, and only output connections are trained. RC networks have recently received increased attention as a mathematical model for generic neural microcircuits, to investigate and explain computations in neocortical columns. Applied to specific tasks, however, their fixed random connectivity leads to significant variation in performance.
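To make the RC setup concrete, here is a minimal echo state network sketch (not the authors' code; all names and parameter values are assumed): a sparse random reservoir is left fixed, and only a linear readout is trained, e.g. by ridge regression.

```python
import numpy as np

rng = np.random.default_rng(0)
N, n_in = 100, 1                       # reservoir size, input dimension (assumed values)

# Sparsely and randomly connected fixed hidden layer, rescaled so the
# spectral radius is below 1 (a common sufficient condition for the echo state property).
W = rng.uniform(-1, 1, (N, N)) * (rng.random((N, N)) < 0.1)
W *= 0.95 / max(abs(np.linalg.eigvals(W)))
W_in = rng.uniform(-0.5, 0.5, (N, n_in))

def run_reservoir(u):
    """Drive the fixed reservoir with input sequence u and collect states."""
    x = np.zeros(N)
    states = []
    for u_t in u:
        x = np.tanh(W @ x + W_in @ np.atleast_1d(u_t))
        states.append(x.copy())
    return np.array(states)

# Only the output connections are trained, here via ridge regression
# on a toy task (recall the input from 3 steps back).
u = rng.uniform(-0.5, 0.5, 500)
target = np.roll(u, 3)
X = run_reservoir(u)
W_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(N), X.T @ target)
```

The fixed random draw of `W` is exactly the source of the run-to-run performance variation the abstract points out.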
Few problem-specific optimization procedures are known, which would be important for engineering applications, but also in order to understand how networks in biology are shaped to be optimally adapted to the requirements of their environment. We study a general network initialization method using permutation matrices and derive a new unsupervised learning rule based on intrinsic plasticity (IP). The IP-based learning uses only local information, and its aim is to improve network performance in a self-organized way. Using three different benchmarks, we show that networks with permutation matrices for the reservoir connectivity have much more persistent memory than the other methods, but are also able to perform highly non-linear mappings. We also show that IP based on sigmoid transfer functions is limited with respect to the output distributions that can be achieved.

† N. Michael Mayer is now with the
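A minimal sketch of the permutation-matrix initialization (an illustration under stated assumptions, not the authors' implementation): the reservoir weight matrix is a random permutation matrix scaled to a target spectral radius, so each unit has exactly one incoming recurrent connection and the reservoir decomposes into long cycles, which supports persistent memory.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 100                                # reservoir size (assumed value)

# Random permutation matrix: exactly one 1 per row and per column.
P = np.eye(N)[rng.permutation(N)]

# All eigenvalues of a permutation matrix lie on the unit circle, so
# scaling by 0.95 gives a spectral radius of exactly 0.95.
W = 0.95 * P

rho = max(abs(np.linalg.eigvals(W)))   # spectral radius of the reservoir
```

Because the permutation structure preserves the norm of the state up to the uniform scaling factor, signals decay slowly and uniformly along the cycles, unlike in a generic random matrix whose eigenvalue moduli are spread out.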
doi:10.2976/1.3240502 pmid:20357891 pmcid:PMC2801534