A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2020; you can also visit the original URL.
The file type is application/pdf.
Reducing Catastrophic Forgetting in Modular Neural Networks by Dynamic Information Balancing
[article]
2019
arXiv
pre-print
Lifelong learning is an important step toward realizing robust autonomous artificial agents. Neural networks are the main engine of deep learning, the current state-of-the-art technique for building adaptive artificially intelligent systems. However, neural networks suffer from catastrophic forgetting when faced with the challenge of continual learning. We investigate how to exploit modular topology in neural networks in order to dynamically balance the information load between modules.
arXiv:1912.04508v1
fatcat:pftl4lqgfrardfipsc5zl4jsly
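The abstract only sketches the idea of balancing information load across modules; the paper's actual algorithm is not given here. Purely as an illustration of the general concept, the following hypothetical sketch (all names and the load metric are assumptions, not the authors' method) gates two sub-network modules by their activation energy so that no single module absorbs a disproportionate share of the signal:

```python
import numpy as np

rng = np.random.default_rng(0)


def relu(x):
    return np.maximum(0.0, x)


class ModularLayer:
    """Hypothetical modular layer: independent sub-networks whose
    outputs are rescaled to balance their 'information load'."""

    def __init__(self, in_dim, hidden, n_modules=2):
        self.weights = [rng.normal(0.0, 0.5, (in_dim, hidden))
                        for _ in range(n_modules)]

    def forward(self, x):
        outs = [relu(x @ w) for w in self.weights]
        # Load proxy (an assumption): mean activation energy per module.
        loads = np.array([float(np.mean(o ** 2)) for o in outs])
        # Balance: down-weight overloaded modules, up-weight idle ones.
        inv = 1.0 / (loads + 1e-8)
        gates = inv / inv.sum()
        balanced = [g * o for g, o in zip(gates, outs)]
        return np.concatenate(balanced, axis=1), gates


layer = ModularLayer(in_dim=4, hidden=3)
x = rng.normal(size=(8, 4))
y, gates = layer.forward(x)
print(y.shape)  # (8, 6): two modules of width 3, concatenated
```

The normalized gates sum to one, so rebalancing redistributes, rather than amplifies, the total signal across modules.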