Hard but Robust, Easy but Sensitive: How Encoder and Decoder Perform in Neural Machine Translation
[article] · 2019 · arXiv pre-print
Neural machine translation (NMT) typically adopts the encoder-decoder framework. A good understanding of the characteristics and functionalities of the encoder and the decoder can help explain the pros and cons of the framework and inform the design of better NMT models. In this work, we conduct an empirical study of the encoder and the decoder in NMT, taking the Transformer as an example. We find that 1) the decoder handles an easier task than the encoder in NMT, 2) the decoder is more sensitive to the […]
arXiv:1908.06259v1
fatcat:6bvoj2conff5dft2gckrab2xrm