3 Hits in 5.5 sec

What to Pre-Train on? Efficient Intermediate Task Selection [article]

Clifton Poth, Jonas Pfeiffer, Andreas Rücklé, Iryna Gurevych
2021 arXiv   pre-print
With an abundance of candidate datasets as well as pre-trained language models, it has become infeasible to run the cross-product of all combinations to find the best transfer setting.  ...  We experiment with a diverse set of 42 intermediate and 11 target English classification, multiple choice, question answering, and sequence tagging tasks.  ...  We thank Leonardo Ribeiro and the anonymous reviewers for insightful feedback and suggestions on a draft of this paper.  ... 
arXiv:2104.08247v2 fatcat:4ljcfshev5f3tmgugrrrkh3s4m

FastIF: Scalable Influence Functions for Efficient Model Interpretation and Debugging

Han Guo, Nazneen Rajani, Peter Hase, Mohit Bansal, Caiming Xiong
2021 Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing   unpublished
In Proceedings of SustaiNLP: Workshop on Simple and Efficient Natural Language Processing.  ...  Tan. 2020. Explaining machine learning classifiers through diverse counterfactual explanations.  ...  Conference on Natural Language Processing (EMNLP-IJCNLP), pages 11–20.  ... 
doi:10.18653/v1/2021.emnlp-main.808 fatcat:oqdurqbbivgbhltg53obpfhl7i

MAD-G: Multilingual Adapter Generation for Efficient Cross-Lingual Transfer

Alan Ansell, Edoardo Maria Ponti, Jonas Pfeiffer, Sebastian Ruder, Goran Glavaš, Ivan Vulić, Anna Korhonen
2021 Findings of the Association for Computational Linguistics: EMNLP 2021   unpublished
Proceedings of SustaiNLP: Workshop on Simple and Efficient Natural Language Processing.  ...  Bruna Morrone, Quentin de Laroussilhe, Andrea Gesmundo, Mona Attariyan, and Sylvain Gelly.  ...  In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing.  ...  Natural Language Processing, EMNLP 2021, Online, November 2021.  ... 
doi:10.18653/v1/2021.findings-emnlp.410 fatcat:utufp4lsjbgjhpvx4qlxfd25hi