Unsupervised learning to overcome catastrophic forgetting in neural networks

Irene Muñoz-Martín, Stefano Bianchi, Giacomo Pedretti, Octavian Melnic, Stefano Ambrogio, Daniele Ielmini
2019 IEEE Journal on Exploratory Solid-State Computational Devices and Circuits  
Continual learning is the ability to acquire a new task or new knowledge without losing previously acquired information. Achieving continual learning in artificial intelligence (AI) is currently prevented by catastrophic forgetting, where training on a new task erases previously learned tasks. Here, we present a new concept of a neural network capable of combining supervised convolutional learning with bio-inspired unsupervised learning. Brain-inspired concepts such as spike-timing-dependent plasticity (STDP) and neural redundancy are shown to enable continual learning and prevent catastrophic forgetting without compromising the accuracy achievable with state-of-the-art neural networks. Unsupervised learning by STDP is demonstrated by hardware experiments with a one-layer perceptron adopting phase-change memory (PCM) synapses. Finally, we demonstrate full test classification of the Modified National Institute of Standards and Technology (MNIST) database with an accuracy of 98% and continual learning of up to 30% non-trained classes with 83% average accuracy.

INDEX TERMS: Catastrophic forgetting, continual learning, convolutional neural network (CNN), neuromorphic engineering, phase-change memory (PCM), spike-timing-dependent plasticity (STDP), supervised learning, unsupervised learning.
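The pair-based STDP rule referenced in the abstract can be illustrated with a minimal software sketch. The snippet below is not the paper's method, which realizes the update through PCM device physics in hardware; the amplitudes and time constants here are assumed for illustration only. A presynaptic spike that precedes a postsynaptic spike potentiates the synapse, while the reverse order depresses it, with an exponential dependence on the spike-time difference:

import numpy as np

# Assumed illustrative parameters; in the paper the update is produced
# by PCM conductance changes, not by these software constants.
A_PLUS, A_MINUS = 0.01, 0.012     # potentiation/depression amplitudes
TAU_PLUS, TAU_MINUS = 20.0, 20.0  # decay time constants (ms)

def stdp_delta_w(t_pre, t_post):
    """Weight change for one pre/post spike pair (spike times in ms)."""
    dt = t_post - t_pre
    if dt > 0:  # pre before post: potentiation
        return A_PLUS * np.exp(-dt / TAU_PLUS)
    return -A_MINUS * np.exp(dt / TAU_MINUS)  # post before pre: depression

# Example: a pre-spike 5 ms before a post-spike strengthens the synapse.
print(stdp_delta_w(10.0, 15.0))  # positive change
print(stdp_delta_w(15.0, 10.0))  # negative change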
doi:10.1109/jxcdc.2019.2911135