A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2020; you can also visit the original URL.
The file type is application/pdf.
Unsupervised learning to overcome catastrophic forgetting in neural networks
2019
IEEE Journal on Exploratory Solid-State Computational Devices and Circuits
Continual learning is the ability to acquire a new task or new knowledge without losing previously acquired information. Achieving continual learning in artificial intelligence (AI) is currently prevented by catastrophic forgetting, in which training on a new task erases previously learned tasks. Here, we present a new concept of a neural network capable of combining supervised convolutional learning with bio-inspired unsupervised learning. Brain-inspired concepts such as …
doi:10.1109/jxcdc.2019.2911135
fatcat:jmog3mmwqvg7dp7ilolrvdkrwm