Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)
Current natural language processing models work well on a single task, yet they often fail to learn new tasks continuously without forgetting previous ones as they are retrained throughout their lifetime, a challenge known as lifelong learning. State-of-the-art lifelong language learning methods store past examples in episodic memory and replay them at both training and inference time. However, as we show later in our experiments, there are three significant impediments: (1) needing
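The episodic-memory replay described above can be sketched minimally: a bounded buffer keeps a sample of past examples (here via reservoir sampling, a common choice for such buffers, though the paper does not specify this policy), and at each training step some stored examples are replayed alongside the current batch. All names below are illustrative assumptions, not the authors' implementation.

```python
import random

class EpisodicMemory:
    """Illustrative sketch of an episodic replay buffer, not the paper's code.

    Stores a bounded, approximately uniform sample of past training
    examples via reservoir sampling; sample() draws examples to replay.
    """

    def __init__(self, capacity=1000):
        self.capacity = capacity
        self.buffer = []
        self.n_seen = 0  # total examples ever written

    def write(self, example):
        # Reservoir sampling: each seen example ends up in the buffer
        # with probability capacity / n_seen.
        self.n_seen += 1
        if len(self.buffer) < self.capacity:
            self.buffer.append(example)
        else:
            idx = random.randrange(self.n_seen)
            if idx < self.capacity:
                self.buffer[idx] = example

    def sample(self, k):
        # Replay: draw past examples to mix into the current batch
        # (or to locally adapt the model at inference time).
        return random.sample(self.buffer, min(k, len(self.buffer)))
```

A training loop would call `write()` on each incoming example and mix `sample(k)` into every batch; the memory footprint this implies is one of the costs the abstract goes on to discuss.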