Learn to Cross-lingual Transfer with Meta Graph Learning Across Heterogeneous Languages

Zheng Li, Mukul Kumar, William Headden, Bing Yin, Ying Wei, Yu Zhang, Qiang Yang
Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), 2020
The recent emergence of multilingual pretrained language models (mPLMs) has enabled breakthroughs on various downstream cross-lingual transfer (CLT) tasks. However, mPLM-based methods usually suffer from two problems: (1) simple fine-tuning may not adapt general-purpose multilingual representations to be task-aware on low-resource languages; (2) they ignore how cross-lingual adaptation happens for downstream tasks. To address these issues, we propose a meta graph learning (MGL) method. Unlike prior works that transfer from scratch, MGL can learn to cross-lingual transfer by extracting meta-knowledge from historical CLT experiences (tasks), making the mPLM insensitive to low-resource languages. Besides, for each CLT task, MGL formulates its transfer process as information propagation over a dynamic graph, where the geometric structure can automatically capture intrinsic language relationships to guide cross-lingual transfer explicitly. Empirically, extensive experiments on both public and real-world datasets demonstrate the effectiveness of the MGL method.
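To make the "information propagation over a dynamic graph" idea concrete, the following is a minimal illustrative sketch, not the paper's actual implementation: edge weights between language nodes are inferred on the fly from the current node representations (so the graph adapts per task), and a single GCN-style propagation step mixes information across languages. All function and variable names here are hypothetical.

```python
import numpy as np

def dynamic_graph_propagation(H, W):
    """One propagation step over a dynamically constructed graph.

    H: (n_nodes, d) node features, e.g. per-language task representations.
    W: (d, d) trainable projection matrix.
    Returns updated node features of shape (n_nodes, d).
    """
    # Dynamic adjacency: scaled pairwise similarity of node features,
    # row-softmaxed, so graph structure is induced from the data
    # rather than fixed in advance.
    scores = H @ H.T / np.sqrt(H.shape[1])
    A = np.exp(scores - scores.max(axis=1, keepdims=True))
    A = A / A.sum(axis=1, keepdims=True)
    # Propagate: aggregate neighbor information, project, apply ReLU.
    return np.maximum(A @ H @ W, 0.0)

rng = np.random.default_rng(0)
H = rng.normal(size=(5, 8))   # 5 language nodes, 8-dim features
W = rng.normal(size=(8, 8))
H_new = dynamic_graph_propagation(H, W)
print(H_new.shape)  # (5, 8)
```

In a meta-learning setup, such a propagation step would sit inside an inner loop over sampled CLT tasks, with the shared parameters (here `W`) updated across tasks so that the learned propagation generalizes to new low-resource languages.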
doi:10.18653/v1/2020.emnlp-main.179