NICT's Supervised Neural Machine Translation Systems for the WMT19 Translation Robustness Task

Raj Dabre, Eiichiro Sumita
2019 Proceedings of the Fourth Conference on Machine Translation (Volume 2: Shared Task Papers, Day 1)  
In this paper, we describe our neural machine translation (NMT) systems for Japanese↔English translation, which we submitted to the translation robustness task. We focused on leveraging transfer learning via fine-tuning to improve translation quality. We used a fairly well-established domain adaptation technique called Mixed Fine Tuning (MFT) (Chu et al., 2017) to improve translation quality for Japanese↔English. We also trained bi-directional NMT models instead of uni-directional ones, as the former are known to be quite robust, especially in low-resource scenarios. However, given the noisy nature of the in-domain training data, the improvements we obtained are rather modest.
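The two techniques named in the abstract can be sketched as data-preparation steps. This is a minimal illustration, not the authors' implementation: it assumes the standard MFT recipe of Chu et al. (2017) (fine-tune on out-of-domain data mixed with oversampled in-domain data) and a common target-language-token convention for bi-directional training; the function names and the `<2en>`/`<2ja>` tags are hypothetical.

```python
import random

def build_mft_corpus(out_domain, in_domain, seed=0):
    """Mixed Fine Tuning, sketched: continue training on a mix of
    out-of-domain and in-domain parallel data, oversampling the (usually
    much smaller) in-domain set so both contribute comparably."""
    reps = max(1, len(out_domain) // max(1, len(in_domain)))
    mixed = list(out_domain) + list(in_domain) * reps
    random.Random(seed).shuffle(mixed)
    return mixed

def tag_bidirectional(ja_en_pairs):
    """Bi-directional training data, sketched: each (ja, en) pair yields
    two training examples, one per direction, with a target-language token
    prepended to the source so a single model handles both directions."""
    examples = []
    for ja, en in ja_en_pairs:
        examples.append(("<2en> " + ja, en))  # Japanese -> English
        examples.append(("<2ja> " + en, ja))  # English -> Japanese
    return examples
```

A single bi-directional model trained on the tagged corpus effectively doubles the supervision per parameter, which is one reason such models tend to be more robust in low-resource settings.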
doi:10.18653/v1/w19-5362 dblp:conf/wmt/DabreS19