Speeding up Context-based Sentence Representation Learning with Non-autoregressive Convolutional Decoding
[article] arXiv pre-print, 2018
Context plays an important role in human language understanding, so it may also help machines learn vector representations of language. In this paper, we explore an asymmetric encoder-decoder structure for unsupervised context-based sentence representation learning. We carefully designed experiments to show that neither an autoregressive decoder nor an RNN decoder is required. After that, we designed a model that still keeps an RNN as the encoder, while using a non-autoregressive convolutional decoder.
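The key contrast in the abstract is between an autoregressive decoder, where each output step feeds on the previous one, and a non-autoregressive convolutional decoder, which predicts every target position in parallel from the sentence vector. The sketch below is a hypothetical illustration of that idea, not the authors' code: all dimensions, weight names, and the simple Elman-style RNN encoder are assumptions chosen for brevity.

```python
import numpy as np

# Hypothetical sketch of an asymmetric encoder-decoder (not the paper's
# implementation). Encoder: a plain RNN whose last hidden state is the
# sentence representation. Decoder: non-autoregressive -- a 1-D
# convolution over position slots predicts all target words at once,
# so no step waits on previously generated tokens.
rng = np.random.default_rng(0)
V, E, H, T = 50, 8, 16, 5   # vocab size, embed dim, hidden dim, target length

emb = rng.normal(size=(V, E)) * 0.1
W_xh = rng.normal(size=(E, H)) * 0.1
W_hh = rng.normal(size=(H, H)) * 0.1

def encode(token_ids):
    """RNN encoder: fold tokens into a single sentence vector."""
    h = np.zeros(H)
    for t in token_ids:
        h = np.tanh(emb[t] @ W_xh + h @ W_hh)
    return h

# Decoder parameters: learned position embeddings, a width-K conv
# kernel over the time axis, and an output projection to the vocab.
pos = rng.normal(size=(T, H)) * 0.1
K = 3
W_conv = rng.normal(size=(K, H, H)) * 0.1
W_out = rng.normal(size=(H, V)) * 0.1

def decode(sentence_vec):
    """Non-autoregressive decoding: all T positions in parallel."""
    x = sentence_vec[None, :] + pos              # (T, H): same vector per slot
    xp = np.pad(x, ((K // 2, K // 2), (0, 0)))   # 'same' padding in time
    conv = np.stack([sum(xp[i + k] @ W_conv[k] for k in range(K))
                     for i in range(T)])         # (T, H)
    return np.maximum(conv, 0.0) @ W_out         # (T, V) logits, no recurrence

h = encode([3, 17, 42])
logits = decode(h)
print(logits.shape)
```

Because `decode` has no dependence between output positions, all T word distributions can be computed in one pass, which is the source of the decoding speedup the title refers to.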
arXiv:1710.10380v3
fatcat:wgckw2cz7bdb7atat74gg4fcxa