Rapid Domain Adaptation for Machine Translation with Monolingual Data

Mahdis Mahdieh, Mia Xu Chen, Yuan Cao, Orhan Firat
2020, arXiv pre-print
One challenge of machine translation is how to quickly adapt to unseen domains in the face of surging events like COVID-19, where timely and accurate translation of in-domain information into multiple languages is critical but little parallel data is yet available. In this paper, we propose an approach that enables rapid domain adaptation from the perspective of unsupervised translation. Our proposed approach requires only in-domain monolingual data and can be quickly applied to an existing translation system trained on the general domain, reaching significant gains in in-domain translation quality with little or no drop on the general domain. We also propose an effective procedure for simultaneous adaptation to multiple domains and languages. To the best of our knowledge, this is the first attempt to address unsupervised multilingual domain adaptation.
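The abstract does not spell out the adaptation procedure itself, so the sketch below only illustrates the general setting it describes: turning in-domain monolingual data into synthetic parallel data via back-translation and fine-tuning a general-domain model on it. This is an assumed, minimal illustration, not the paper's specific method, and every class and function name here is a hypothetical stub.

```python
# Hedged sketch: generic back-translation-based domain adaptation.
# NOT the paper's exact procedure; it only mirrors the setting in the
# abstract (adapting a general-domain MT system with in-domain
# monolingual data). All classes/functions are hypothetical stubs.

from typing import List, Tuple


class GeneralDomainMT:
    """Stand-in for a pretrained general-domain translation model."""

    def translate(self, sentences: List[str], reverse: bool = False) -> List[str]:
        # Placeholder: a real system would run beam search here.
        # reverse=True denotes the target->source direction used for back-translation.
        return [f"<bt:{s}>" if reverse else f"<mt:{s}>" for s in sentences]

    def fine_tune(self, pairs: List[Tuple[str, str]], steps: int = 1000) -> None:
        # Placeholder: a real system would run gradient updates on `pairs`,
        # ideally mixed with general-domain data to limit catastrophic forgetting.
        print(f"fine-tuning on {len(pairs)} synthetic pairs for {steps} steps")


def adapt_with_monolingual(model: GeneralDomainMT,
                           in_domain_target_mono: List[str]) -> GeneralDomainMT:
    """Back-translate in-domain target-side monolingual text into synthetic
    source sentences, then fine-tune on the resulting synthetic pairs."""
    synthetic_sources = model.translate(in_domain_target_mono, reverse=True)
    synthetic_pairs = list(zip(synthetic_sources, in_domain_target_mono))
    model.fine_tune(synthetic_pairs)
    return model


if __name__ == "__main__":
    covid_mono = ["Wash your hands frequently.",
                  "Symptoms include fever and cough."]
    adapted = adapt_with_monolingual(GeneralDomainMT(), covid_mono)
    print(adapted.translate(["New variants are being monitored."]))
```

Note that the paper additionally targets simultaneous adaptation across multiple domains and languages; this sketch covers only a single-language, single-domain case.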
arXiv:2010.12652v1