Overcoming Catastrophic Forgetting by Generative Regularization [article]

Patrick H. Chen, Wei Wei, Cho-jui Hsieh, Bo Dai
2021, arXiv preprint
In this paper, we propose a new method to overcome catastrophic forgetting by adding a generative regularization term to the Bayesian inference framework. Bayesian methods provide a general framework for continual learning. For any given classification model, we construct a generative regularization term by leveraging energy-based models and Langevin dynamics sampling to enrich the features learned in each task. By combining the discriminative and generative losses, we empirically show that the proposed method outperforms state-of-the-art methods on a variety of tasks, avoiding catastrophic forgetting in continual learning. In particular, the proposed method outperforms baseline methods by over 15% on the Fashion-MNIST dataset and by 10% on the CUB dataset.
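The two ingredients named in the abstract, Langevin dynamics sampling from an energy-based model and a combined discriminative-plus-generative objective, can be illustrated with a minimal sketch. This is not the paper's implementation: the quadratic energy `E(x) = x²/2`, the sampler hyperparameters, and the `combined_loss` helper are all hypothetical stand-ins chosen for illustration.

```python
import math
import random

# Hypothetical toy energy E(x) = 0.5 * x^2, so grad E(x) = x.
# In the paper's setting the energy would come from the classifier itself;
# this quadratic is only a stand-in with a known stationary distribution N(0, 1).
def energy_grad(x):
    return x

def langevin_sample(x0, steps=500, step_size=0.05, rng=None):
    """Unadjusted Langevin dynamics:
    x_{t+1} = x_t - (step_size / 2) * grad E(x_t) + sqrt(step_size) * N(0, 1)."""
    rng = rng or random.Random(0)
    x = x0
    for _ in range(steps):
        x = (x
             - 0.5 * step_size * energy_grad(x)
             + math.sqrt(step_size) * rng.gauss(0.0, 1.0))
    return x

# Sketch of the combined objective: a discriminative loss (e.g. cross-entropy)
# plus a weighted generative (energy-based) regularization term.
def combined_loss(disc_loss, gen_loss, lam=0.1):
    return disc_loss + lam * gen_loss
```

In a real continual-learning setup, `gen_loss` would be evaluated on Langevin samples drawn from the model's energy at each task, so the regularizer pushes the model to retain the feature distribution of earlier tasks.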
arXiv:1912.01238v3