A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2020; you can also visit the original URL.
The file type is application/pdf.
Code-Switching for Enhancing NMT with Pre-Specified Translation
2019
Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
Leveraging user-provided translations to constrain NMT has practical significance. Existing methods fall into two main categories: the use of placeholder tags for lexicon words, and the use of hard constraints during decoding. Both can hurt translation fidelity for various reasons. We investigate a data augmentation method that constructs code-switched training data by replacing source phrases with their target translations. Our method does not change the NMT model or decoding algorithm.
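The data augmentation the abstract describes can be sketched as a simple preprocessing pass: wherever a source phrase appears in a user-provided lexicon, substitute its target translation, yielding code-switched source sentences. The function and data below are illustrative assumptions, not the paper's actual implementation.

```python
def code_switch(source_tokens, lexicon, max_span=3):
    """Replace source phrases found in `lexicon` with their target
    translations, producing a code-switched source sentence."""
    out, i = [], 0
    while i < len(source_tokens):
        matched = False
        # greedily try the longest source phrase first
        for span in range(min(max_span, len(source_tokens) - i), 0, -1):
            phrase = " ".join(source_tokens[i:i + span])
            if phrase in lexicon:
                out.extend(lexicon[phrase].split())
                i += span
                matched = True
                break
        if not matched:
            out.append(source_tokens[i])
            i += 1
    return out

# Hypothetical German-English lexicon entry
lexicon = {"maschinelle übersetzung": "machine translation"}
src = "wir untersuchen maschinelle übersetzung heute".split()
print(code_switch(src, lexicon))
# -> ['wir', 'untersuchen', 'machine', 'translation', 'heute']
```

Training on such mixed-language pairs lets the model learn to copy the inserted target words through to the output, which is why no change to the decoder is needed.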
doi:10.18653/v1/n19-1044
dblp:conf/naacl/SongZYLWZ19
fatcat:rdutx2aixnc5njppe6lwsudytu