Deep Copycat Networks for Text-to-Text Generation
2019
Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP)
Most text-to-text generation tasks, for example text summarisation and text simplification, require copying words from the input to the output. We introduce copycat, a transformer-based pointer network for such tasks which obtains competitive results in abstractive text summarisation and generates more abstractive summaries. We propose a further extension of this architecture for automatic post-editing, where generation is conditioned on two inputs (source language and machine translation).
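The copy mechanism the abstract describes can be illustrated with a minimal pointer-network sketch: the model mixes a standard softmax over the vocabulary with a "copy" distribution that scatters attention mass onto the vocabulary ids of the source tokens. The function and variable names below are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def copy_generate_distribution(vocab_logits, attention, src_token_ids,
                               p_gen, vocab_size):
    """Mix a generation distribution with a copy distribution over source tokens.

    vocab_logits  : (V,) logits over the output vocabulary
    attention     : (S,) attention weights over source positions (sums to 1)
    src_token_ids : (S,) vocabulary id of each source token
    p_gen         : scalar in [0, 1], probability of generating vs. copying

    Returns a (V,) probability distribution over the output vocabulary.
    """
    # Softmax over vocabulary logits (numerically stabilised).
    exp = np.exp(vocab_logits - vocab_logits.max())
    p_vocab = exp / exp.sum()

    # Scatter-add attention mass onto the vocab ids of the source tokens,
    # so repeated source tokens accumulate their attention weights.
    p_copy = np.zeros(vocab_size)
    np.add.at(p_copy, src_token_ids, attention)

    # Convex combination of generating from the vocabulary and copying.
    return p_gen * p_vocab + (1.0 - p_gen) * p_copy
```

With `p_gen` close to 0 the model reproduces source tokens (useful when names or rare words must be copied verbatim); with `p_gen` close to 1 it behaves like an ordinary abstractive decoder.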
doi:10.18653/v1/d19-1318
dblp:conf/emnlp/IveMS19
fatcat:c7yvvpe7wbfbhpiaz3kx55ggku