Brain-Inspired Learning on Neuromorphic Substrates

Friedemann Zenke, Emre O. Neftci
2021 Proceedings of the IEEE  
Neuromorphic hardware strives to emulate brain-like neural networks and thus holds the promise for scalable, low-power information processing on temporal data streams. Yet, to solve real-world problems, these networks need to be trained. However, training on neuromorphic substrates creates significant challenges due to the offline character and the required nonlocal computations of gradient-based learning algorithms. This article provides a mathematical framework for the design of practical online learning algorithms for neuromorphic substrates. Specifically, we show a direct connection between real-time recurrent learning (RTRL), an online algorithm for computing gradients in conventional recurrent neural networks (RNNs), and biologically plausible learning rules for training spiking neural networks (SNNs). Furthermore, we motivate a sparse approximation based on block-diagonal Jacobians, which reduces the algorithm's computational complexity, diminishes the nonlocal information requirements, and empirically leads to good learning performance, thereby improving its applicability to neuromorphic substrates. In summary, our framework bridges the gap between synaptic plasticity and gradient-based approaches from deep learning and lays the foundations for powerful information processing on future neuromorphic hardware systems.

KEYWORDS | Artificial neural networks; biological neural networks; learning systems; machine learning; neural network hardware; neuromorphic engineering; recurrent neural networks (RNNs).
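To illustrate the kind of update rule the abstract describes, the sketch below shows an online, eligibility-trace-style weight update for a layer of leaky integrate-and-fire neurons, in which each neuron's Jacobian block is propagated locally (the block-diagonal approximation to full RTRL). All sizes, constants, and the toy error signal are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes and constants (illustrative, not from the paper)
n_in, n_rec = 5, 4
dt, tau_mem = 1e-3, 20e-3
alpha = np.exp(-dt / tau_mem)   # per-step membrane decay
theta = 1.0                     # firing threshold
beta = 10.0                     # surrogate-gradient steepness
eta = 1e-2                      # learning rate

W_in = rng.normal(0.0, 0.5, (n_rec, n_in))

def surrogate_grad(v):
    # Derivative of a fast-sigmoid surrogate for the spike nonlinearity
    return 1.0 / (beta * np.abs(v - theta) + 1.0) ** 2

v = np.zeros(n_rec)                 # membrane potentials
e_trace = np.zeros((n_rec, n_in))   # dv_i/dW_ij under the block-diagonal
                                    # (per-neuron) Jacobian approximation

for t in range(100):
    x = (rng.random(n_in) < 0.1).astype(float)  # Poisson-like input spikes
    v = alpha * v + W_in @ x
    s = (v >= theta).astype(float)
    v -= s * theta                               # reset by subtraction
    # Propagate eligibility traces forward in time: each neuron's trace
    # depends only on its own membrane dynamics, so no nonlocal
    # information is required (the Jacobian is block diagonal).
    e_trace = alpha * e_trace + np.outer(np.ones(n_rec), x)
    # Combine traces with a surrogate spike derivative and a toy
    # per-neuron error signal (here: deviation from a target rate).
    err = s - 0.05
    W_in -= eta * (err * surrogate_grad(v))[:, None] * e_trace
```

Because the trace update and the weight update at each time step use only quantities local to each neuron and synapse, this style of rule is the kind of algorithm the framework targets for on-chip learning.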
doi:10.1109/jproc.2020.3045625