Cross-lingual Adaption Model-Agnostic Meta-Learning for Natural Language Understanding

Qianying Liu, Fei Cheng, Sadao Kurohashi
2021, arXiv preprint
Meta-learning with auxiliary languages has demonstrated promising improvements for cross-lingual natural language processing. However, previous studies sample the meta-training and meta-testing data from the same language, which limits the model's ability to transfer across languages. In this paper, we propose XLA-MAML, which performs direct cross-lingual adaptation in the meta-learning stage. We conduct zero-shot and few-shot experiments on Natural Language Inference and Question Answering. The experimental results demonstrate the effectiveness of our method across different languages, tasks, and pretrained models. We also analyze various cross-lingual-specific settings for meta-learning, including sampling strategy and parallelism.
arXiv:2111.05805v1