Distributionally Robust Multilingual Machine Translation
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Multilingual neural machine translation (MNMT) learns to translate multiple language pairs with a single model, potentially improving both the accuracy and the memory-efficiency of deployed models. However, the heavy data imbalance between languages hinders the model from performing uniformly across language pairs. In this paper, we propose a new learning objective for MNMT based on distributionally robust optimization, which minimizes the worst-case expected loss over the set of language pairs.
doi:10.18653/v1/2021.emnlp-main.458
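As a rough illustration of the objective described above, the sketch below computes a worst-case (min-max) loss over a set of language pairs. The pair names and loss values are hypothetical, and this does not reproduce the paper's actual training procedure; it only shows what "minimizing the worst-case expected loss" optimizes, in contrast to the usual average over pairs.

```python
# Hypothetical sketch: worst-case vs. average loss over language pairs.
# The per-pair loss values here are made up for illustration only.

def worst_case_loss(per_pair_losses):
    """Return the largest expected loss among the language pairs (the DRO target)."""
    return max(per_pair_losses.values())

def average_loss(per_pair_losses):
    """Return the mean loss, i.e., the standard (non-robust) objective."""
    return sum(per_pair_losses.values()) / len(per_pair_losses)

# Imbalanced setting: a low-resource pair (here "en-ne") has a much higher loss.
losses = {"en-de": 2.1, "en-fr": 1.8, "en-ne": 4.7}

print(worst_case_loss(losses))  # 4.7 -- dominated by the weakest pair
print(average_loss(losses))
```

Minimizing `worst_case_loss` rather than `average_loss` pushes the model to improve its weakest language pair, which is how a distributionally robust objective counteracts data imbalance.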