Adapting to the Long Tail: A Meta-Analysis of Transfer Learning Research for Language Understanding Tasks
arXiv pre-print, 2022
Natural language understanding (NLU) has made substantial progress driven by large benchmarks, but benchmarks often leave a long tail of infrequent phenomena underrepresented. We reflect on the question: have transfer learning methods sufficiently addressed the poor performance of benchmark-trained models on the long tail? We conceptualize the long tail using macro-level dimensions (e.g., underrepresented genres, topics, etc.), and perform a qualitative meta-analysis of 100 representative papers on
arXiv:2111.01340v2