A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2022; you can also visit the original URL.
The file type is application/pdf.
AdapterFusion: Non-Destructive Task Composition for Transfer Learning
2021
Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Main Volume
published
Sequential fine-tuning and multi-task learning are methods aiming to incorporate knowledge from multiple tasks; however, they suffer from catastrophic forgetting and difficulties in dataset balancing. To address these shortcomings, we propose AdapterFusion, a new two-stage learning algorithm that leverages knowledge from multiple tasks. First, in the knowledge extraction stage, we learn task-specific parameters called adapters that encapsulate the task-specific information. We then combine the adapters in a separate knowledge composition step.
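The composition step described in the abstract can be pictured as a learned attention over the outputs of the frozen, pre-trained task adapters at each transformer layer. Below is a minimal PyTorch sketch of that idea; the class and tensor names are hypothetical, and the authors' actual fusion layer (released via AdapterHub) includes further details, such as value-matrix initialization and residual connections, that are omitted here.

import torch
import torch.nn as nn

class AdapterFusionSketch(nn.Module):
    """Hypothetical sketch of the knowledge-composition step: a learned
    attention over the outputs of N frozen, pre-trained task adapters."""

    def __init__(self, hidden_dim: int):
        super().__init__()
        # The query is derived from the transformer layer output; keys and
        # values are derived from each adapter's output.
        self.query = nn.Linear(hidden_dim, hidden_dim)
        self.key = nn.Linear(hidden_dim, hidden_dim)
        self.value = nn.Linear(hidden_dim, hidden_dim)

    def forward(self, layer_output: torch.Tensor,
                adapter_outputs: torch.Tensor) -> torch.Tensor:
        # layer_output:    (batch, seq_len, hidden_dim)
        # adapter_outputs: (batch, seq_len, n_adapters, hidden_dim)
        q = self.query(layer_output).unsqueeze(2)    # (b, s, 1, h)
        k = self.key(adapter_outputs)                # (b, s, n, h)
        v = self.value(adapter_outputs)              # (b, s, n, h)
        scores = (q * k).sum(-1)                     # dot product per adapter -> (b, s, n)
        weights = torch.softmax(scores, dim=-1)      # attention over adapters
        fused = (weights.unsqueeze(-1) * v).sum(2)   # weighted mix -> (b, s, h)
        return fused

In this reading, the adapters themselves stay frozen and only the query/key/value projections are trained in the second stage, which is what makes the composition non-destructive with respect to the task-specific knowledge.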
doi:10.18653/v1/2021.eacl-main.39
fatcat:ceclmygggnehlps66t6iicwmtq