Speeding up Context-based Sentence Representation Learning with Non-autoregressive Convolutional Decoding

Shuai Tang, Hailin Jin, Chen Fang, Zhaowen Wang, Virginia R. de Sa
2018 arXiv pre-print
Context plays an important role in human language understanding, so it may also be useful for machines learning vector representations of language. In this paper, we explore an asymmetric encoder-decoder structure for unsupervised context-based sentence representation learning. We carefully designed experiments to show that neither an autoregressive decoder nor an RNN decoder is required. We then designed a model that keeps an RNN as the encoder while using a non-autoregressive convolutional decoder, and combined a suite of effective design choices to significantly improve model efficiency while also achieving better performance. Our model is trained on two different large unlabelled corpora, and in both cases its transferability is evaluated on a set of downstream NLP tasks. We show empirically that our model is simple and fast while producing rich sentence representations that excel in downstream tasks.
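To make the asymmetric design concrete, here is a minimal PyTorch sketch of an RNN encoder paired with a non-autoregressive convolutional decoder. All names, layer counts, and dimensions (the GRU encoder, the two 1-D convolution layers, `hidden_dim`, `max_len`) are illustrative assumptions, not the authors' exact configuration; the point is only that the decoder predicts all target positions in parallel from the sentence vector, with no recurrence and no feeding back of previously generated tokens.

```python
import torch
import torch.nn as nn

class AsymmetricEncoderDecoder(nn.Module):
    """Sketch: RNN encoder + non-autoregressive CNN decoder.

    Dimensions and layer choices are illustrative assumptions,
    not the paper's exact architecture.
    """

    def __init__(self, vocab_size, embed_dim=300, hidden_dim=600, max_len=30):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Encoder: a bidirectional GRU; the sentence representation is
        # the concatenation of the two final hidden states.
        self.encoder = nn.GRU(embed_dim, hidden_dim,
                              bidirectional=True, batch_first=True)
        self.max_len = max_len
        # Decoder: 1-D convolutions applied to all target positions at
        # once -- no recurrence, no autoregressive feedback.
        self.decoder = nn.Sequential(
            nn.Conv1d(2 * hidden_dim, hidden_dim, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv1d(hidden_dim, hidden_dim, kernel_size=3, padding=1),
            nn.ReLU(),
        )
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, tokens):
        # tokens: (batch, seq_len) word indices of the input sentence
        _, h = self.encoder(self.embed(tokens))      # h: (2, batch, hidden)
        sent = torch.cat([h[0], h[1]], dim=-1)       # (batch, 2*hidden)
        # Broadcast the sentence vector to every target position, then
        # decode all positions in a single parallel pass.
        dec_in = sent.unsqueeze(-1).expand(-1, -1, self.max_len)
        feats = self.decoder(dec_in)                 # (batch, hidden, max_len)
        return self.out(feats.transpose(1, 2))       # (batch, max_len, vocab)
```

Because the convolutional decoder has no sequential dependency between output positions, decoding during training costs a single forward pass rather than one step per target token, which is the source of the speed-up the title refers to.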
arXiv:1710.10380v3