To Transfer or Not to Transfer: Misclassification Attacks Against Transfer Learned Text Classifiers
arXiv pre-print [article], 2020
Transfer learning --- transferring learned knowledge --- has brought a paradigm shift in the way models are trained. Its benefits of improved accuracy and reduced training time have shown promise for training models with constrained computational resources and fewer training samples. In particular, publicly available text-based models such as GloVe and BERT, which are trained on large text corpora, have seen ubiquitous adoption in practice. In this paper, we ask, "can transfer
arXiv:2001.02438v1