A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2022; you can also visit the original URL.
The file type is application/pdf.
Task-adaptive Pre-training and Self-training are Complementary for Natural Language Understanding
2021
Findings of the Association for Computational Linguistics: EMNLP 2021
unpublished
Task-adaptive pre-training (TAPT) and Self-training (ST) have emerged as the major semi-supervised approaches to improve natural language understanding (NLU) tasks with massive amounts of unlabeled data. However, it is unclear whether they learn similar representations or whether they can be effectively combined. In this paper, we show that TAPT and ST can be complementary with a simple protocol that follows the TAPT → Fine-tuning → Self-training (TFS) process. Experimental results show that the TFS protocol can
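The abstract describes TFS as a three-stage pipeline: continued pre-training on task-domain unlabeled text, supervised fine-tuning, then self-training on the same unlabeled pool. The following is a minimal sketch of that ordering, assuming a standard confidence-thresholded pseudo-labeling loop for the self-training stage; the helper callables (`tapt_step`, `finetune_step`, `predict`), the threshold, and the number of rounds are hypothetical placeholders, not the authors' implementation.

```python
"""Sketch of the TFS protocol: TAPT -> Fine-tuning -> Self-training."""

from typing import Callable, List, Tuple


def tfs(
    model,
    unlabeled_texts: List[str],
    labeled_data: List[Tuple[str, int]],
    tapt_step: Callable,      # hypothetical: continues MLM pre-training on task-domain text
    finetune_step: Callable,  # hypothetical: supervised training on (text, label) pairs
    predict: Callable,        # hypothetical: returns (label, confidence) for one text
    confidence_threshold: float = 0.9,  # assumed filtering rule, not from the paper
    self_training_rounds: int = 1,
):
    # Stage 1: task-adaptive pre-training (TAPT) on the unlabeled task corpus.
    model = tapt_step(model, unlabeled_texts)

    # Stage 2: fine-tune the adapted model on the labeled task data.
    model = finetune_step(model, labeled_data)

    # Stage 3: self-training -- pseudo-label the unlabeled pool with the
    # current model and re-train on labeled plus confident pseudo-labeled data.
    for _ in range(self_training_rounds):
        pseudo = []
        for text in unlabeled_texts:
            label, confidence = predict(model, text)
            if confidence >= confidence_threshold:
                pseudo.append((text, label))
        model = finetune_step(model, labeled_data + pseudo)

    return model
```

The point of the ordering is that the same unlabeled pool is used twice under different training signals: first as raw text for TAPT, then as a source of pseudo-labels once a fine-tuned model exists to produce them.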
doi:10.18653/v1/2021.findings-emnlp.86
fatcat:iv3y3x24kvaozc5in2lyyahr64