A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2020; you can also visit the original URL.
The file type is application/pdf.
Neural Associative Memory for Dual-Sequence Modeling
2016
Proceedings of the 1st Workshop on Representation Learning for NLP
Many important NLP problems can be posed as dual-sequence or sequence-to-sequence modeling tasks. Recent advances in building end-to-end neural architectures have been highly successful in solving such tasks. In this work we propose a new architecture for dual-sequence modeling that is based on associative memory. We derive AM-RNNs, a recurrent associative memory (AM) which augments generic recurrent neural networks (RNN). This architecture is extended to the Dual AM-RNN which operates on two …
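As a rough illustration of the associative-memory idea the abstract refers to, the sketch below shows key–value binding via circular convolution (holographic reduced representations). This is a generic minimal example, not the paper's implementation; the key construction and retrieval scheme here are assumptions for illustration only.

```python
import numpy as np

def circular_conv(a, b):
    # bind a key to a value: circular convolution via FFT
    return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

def circular_corr(a, b):
    # approximate unbinding: correlate the key with the memory trace
    return np.real(np.fft.ifft(np.conj(np.fft.fft(a)) * np.fft.fft(b)))

def random_key(rng, n):
    # real key whose spectrum has unit modulus, so unbinding is exact
    spec = np.fft.fft(rng.standard_normal(n))
    return np.real(np.fft.ifft(spec / np.abs(spec)))

rng = np.random.default_rng(0)
n = 1024
k1, k2 = random_key(rng, n), random_key(rng, n)
v1, v2 = rng.standard_normal(n), rng.standard_normal(n)

# superimpose two bound pairs into a single memory vector
memory = circular_conv(k1, v1) + circular_conv(k2, v2)

# querying with k1 recovers v1 plus crosstalk noise from the other pair
retrieved = circular_corr(k1, memory)

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine(retrieved, v1))  # high: the stored value is recovered
print(cosine(retrieved, v2))  # near zero: the other value is not
```

Each stored pair adds crosstalk noise to retrieval, which is why the paper's recurrent setting wraps such a memory inside an RNN that learns what to write and read at each step.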
doi:10.18653/v1/w16-1630
dblp:conf/rep4nlp/Weissenborn16
fatcat:7ii2on72bfh6pp7owiojf4xvhi