A copy of this work was available on the public web and has been preserved in the Wayback Machine; the capture dates from 2022.
The file type is application/pdf.
Learn Continually, Generalize Rapidly: Lifelong Knowledge Accumulation for Few-shot Learning
[article]
2022
arXiv
pre-print
The ability to continually expand knowledge over time and utilize it to rapidly generalize to new tasks is a key feature of human linguistic intelligence. Existing models that pursue rapid generalization to new tasks (e.g., few-shot learning methods), however, are mostly trained in a single shot on fixed datasets and so cannot dynamically expand their knowledge, while continual learning algorithms are not specifically designed for rapid generalization. We present a new learning setup, Continual
arXiv:2104.08808v4