Learning Phrase Representations using RNN Encoder–Decoder for Statistical Machine Translation
2014
Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP)
In this paper, we propose a novel neural network model called RNN Encoder–Decoder that consists of two recurrent neural networks (RNNs). One RNN encodes a sequence of symbols into a fixed-length vector representation, and the other decodes the representation into another sequence of symbols. The encoder and decoder of the proposed model are jointly trained to maximize the conditional probability of a target sequence given a source sequence. The performance of a statistical machine translation system is empirically found to improve by using the conditional probabilities of phrase pairs computed by the RNN Encoder–Decoder as an additional feature in the existing log-linear model. Qualitatively, we show that the proposed model learns a semantically and syntactically meaningful representation of linguistic phrases.
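The encode-then-decode idea in the abstract can be sketched in a few lines. The following is a minimal illustrative version using a plain tanh recurrence (the paper itself uses a gated hidden unit similar to a GRU); all function names, dimensions, and toy weights here are assumptions made for the example, not the authors' implementation:

```python
import math

def matvec(W, v):
    # Multiply a matrix (list of rows) by a vector.
    return [sum(w * x for w, x in zip(row, v)) for row in W]

def vadd(a, b):
    return [x + y for x, y in zip(a, b)]

def tanh_vec(v):
    return [math.tanh(x) for x in v]

def encode(xs, W_h, W_x, h0):
    # Encoder: h_t = tanh(W_h h_{t-1} + W_x x_t).
    # The final hidden state is the fixed-length summary c of the
    # whole input sequence, regardless of its length.
    h = h0
    for x in xs:
        h = tanh_vec(vadd(matvec(W_h, h), matvec(W_x, x)))
    return h

def decode(c, steps, W_h, W_c):
    # Decoder: each step conditions on the previous hidden state
    # and on the summary vector c produced by the encoder.
    h, outs = c, []
    for _ in range(steps):
        h = tanh_vec(vadd(matvec(W_h, h), matvec(W_c, c)))
        outs.append(h)
    return outs

# Toy 2-dimensional weights, chosen arbitrarily for illustration.
W_h = [[0.5, 0.0], [0.0, 0.5]]
W_x = [[1.0, 0.0], [0.0, 1.0]]
W_c = [[0.2, 0.1], [0.1, 0.2]]

source = [[1.0, 0.0], [0.0, 1.0]]      # a length-2 "sentence"
c = encode(source, W_h, W_x, [0.0, 0.0])
outputs = decode(c, steps=3, W_h=W_h, W_c=W_c)
```

In training, the decoder's hidden states would feed a softmax over target symbols, and both networks would be updated jointly to maximize the log-probability of the target sequence given the source; this sketch only shows the forward pass that turns a variable-length input into a fixed-length vector and back into a sequence.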
doi:10.3115/v1/d14-1179
dblp:conf/emnlp/ChoMGBBSB14
fatcat:uiy743kyojcknh7pjgs4x33osa