Copy this Sentence

Vasileios Lioutas, Andriy Drozdyuk
2019 arXiv preprint
Attention is an operation that selects the largest element from a set, where the notion of "largest" is defined elsewhere. Applying this operation to sequence-to-sequence mapping results in significant improvements to the task at hand. In this paper we provide the mathematical definition of attention and examine its application to sequence-to-sequence models. We highlight the exact correspondences between machine learning implementations of attention and our mathematical definition. We present clear evidence of the effectiveness of attention mechanisms by evaluating models with varying degrees of attention on a very simple task: copying a sentence. We find that models that make greater use of attention perform much better on sequence-to-sequence mapping tasks, converge faster, and are more stable.
arXiv:1905.09856v1
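
The abstract's framing of attention as a soft selection of the largest element of a set can be made concrete in a few lines. The sketch below is an illustration, not the authors' implementation: it assumes dot-product scoring between a query and a set of keys, with softmax acting as a differentiable argmax, and all function and variable names are hypothetical.

```python
import numpy as np

def softmax(x):
    # Subtract the max before exponentiating for numerical stability.
    e = np.exp(x - np.max(x))
    return e / e.sum()

def attention(query, keys, values):
    """Soft selection of the 'largest' element of a set.

    The notion of 'largest' is defined externally by the scoring
    function (here, assumed to be the dot product between the query
    and each key). Softmax turns the scores into weights concentrated
    on the highest-scoring element, so the output approximates
    picking that element while remaining differentiable.
    """
    scores = keys @ query      # one relevance score per set element
    weights = softmax(scores)  # soft argmax over the scores
    return weights @ values    # weighted combination of the set

# Toy usage: the query aligns with the second key, so the output is
# dominated by the second value.
keys = np.array([[1.0, 0.0], [0.0, 1.0], [0.5, 0.5]])
values = np.array([[10.0], [20.0], [30.0]])
query = np.array([0.0, 5.0])
print(attention(query, keys, values))  # close to [20.]
```

Because the selection is soft, the output is a convex combination of all values rather than a hard pick; sharpening the scores (e.g., scaling the query) moves the result closer to a true argmax.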