A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2020.
The file type is application/pdf.
Learn to Cross-lingual Transfer with Meta Graph Learning Across Heterogeneous Languages
Year: 2020
Venue: Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)
Release stage: unpublished
The recent emergence of the multilingual pretrained language model (mPLM) has enabled breakthroughs on various downstream cross-lingual transfer (CLT) tasks. However, mPLM-based methods usually involve two problems: (1) simply fine-tuning may not adapt general-purpose multilingual representations to be task-aware on low-resource languages; (2) they ignore how cross-lingual adaptation happens for downstream tasks. To address these issues, we propose a meta graph learning (MGL) method. Unlike prior works that …
doi:10.18653/v1/2020.emnlp-main.179
fatcat:at5hl33n6zdr3ihuh4fw7gueje
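The abstract describes "learning to cross-lingual transfer" by extracting meta-knowledge from previous transfer experiences (tasks). As a rough illustration of the episodic meta-learning idea behind such methods, the sketch below runs a MAML-style inner/outer loop over simulated language tasks. This is not the paper's MGL algorithm (its meta graphs over heterogeneous languages are not reproduced here), and the encoder stand-in, episode sampler, and hyperparameters are all assumptions for illustration.

```python
# Minimal MAML-style sketch of episodic "learn to transfer" training.
# Hypothetical names throughout; the real MGL method additionally builds
# meta graphs over heterogeneous languages, which this sketch omits.
import torch
import torch.nn.functional as F

torch.manual_seed(0)

DIM, CLASSES = 16, 3
encoder = torch.nn.Identity()  # stand-in for a frozen mPLM sentence encoder
# Task-head parameters whose initialization is meta-learned.
w = torch.randn(CLASSES, DIM, requires_grad=True)
meta_opt = torch.optim.Adam([w], lr=1e-2)
inner_lr = 0.1

def sample_episode():
    """Fake (support, query) episode standing in for one source language."""
    x = torch.randn(8, DIM)
    y = torch.randint(CLASSES, (8,))
    return (x[:4], y[:4]), (x[4:], y[4:])

for step in range(100):
    meta_loss = 0.0
    for _ in range(4):  # a batch of language "tasks" per meta-step
        (sx, sy), (qx, qy) = sample_episode()
        # Inner loop: one gradient step on the support set,
        # keeping the graph so the outer update can differentiate through it.
        s_loss = F.cross_entropy(encoder(sx) @ w.t(), sy)
        (g,) = torch.autograd.grad(s_loss, w, create_graph=True)
        w_fast = w - inner_lr * g
        # Outer objective: query-set loss after adaptation.
        meta_loss = meta_loss + F.cross_entropy(encoder(qx) @ w_fast.t(), qy)
    meta_opt.zero_grad()
    meta_loss.backward()
    meta_opt.step()
```

The outer update optimizes the initialization so that a single adaptation step on a new (e.g., low-resource) task already performs well on held-out data; the paper's contribution is to learn the adaptation structure itself via graphs rather than plain gradient steps.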