Code-Switching for Enhancing NMT with Pre-Specified Translation

Kai Song, Yue Zhang, Heng Yu, Weihua Luo, Kun Wang, Min Zhang
Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (NAACL-HLT 2019)
Leveraging user-provided translation to constrain NMT has practical significance. Existing methods can be classified into two main categories, namely the use of placeholder tags for lexicon words and the use of hard constraints during decoding. Both methods can hurt translation fidelity for various reasons. We investigate a data augmentation method, making code-switched training data by replacing source phrases with their target translations. Our method does not change the NMT model or decoding algorithm, allowing the model to learn lexicon translations by copying source-side target words. Extensive experiments show that our method achieves consistent improvements over existing approaches, improving translation of constrained words without hurting unconstrained words.
doi:10.18653/v1/N19-1044 dblp:conf/naacl/SongZYLWZ19 fatcat:rdutx2aixnc5njppe6lwsudytu