A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2019; you can also visit the original URL.
The file type is application/pdf.
Complementary Learning of Word Embeddings
2018
Proceedings of the Twenty-Seventh International Joint Conference on Artificial Intelligence
Continuous bag-of-words (CB) and skip-gram (SG) models are popular approaches to training word embeddings. Conventionally, they are two stand-alone techniques used individually. However, since both aim to build embeddings by leveraging surrounding words, they are in fact a pair of complementary tasks in which the output of one model can be used as the input of the other, and vice versa. In this paper, we propose complementary learning of word embeddings based on the CB and SG models.
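The complementarity described in the abstract can be seen concretely in how the two models' training pairs mirror each other. The following sketch is illustrative only, not the authors' implementation: it enumerates CBOW-style pairs (context words predict the center word) and SG-style pairs (the center word predicts each context word) over a toy corpus, making visible that each task's output is the other's input.

```python
# Illustrative sketch only (not the paper's method): CBOW maps context
# words -> center word, while SG maps center word -> context words, so
# the (input, output) pair of one task is the mirrored pair of the other.

corpus = ["the", "quick", "brown", "fox", "jumps", "over", "the", "lazy", "dog"]
window = 2  # context size on each side of the center word

def cbow_pairs(tokens, window):
    """Yield (context words -> center word) training pairs, CBOW-style."""
    for i, center in enumerate(tokens):
        lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
        context = tuple(tokens[j] for j in range(lo, hi) if j != i)
        yield context, center

def sg_pairs(tokens, window):
    """Yield (center word -> context word) training pairs, skip-gram-style."""
    for i, center in enumerate(tokens):
        lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                yield center, tokens[j]

# For every CBOW pair (context, center), skip-gram produces the mirrored
# pairs (center, c) for each c in context: the two tasks invert each other.
for context, center in cbow_pairs(corpus, window):
    print(f"CBOW: {context} -> {center}")
for center, ctx_word in sg_pairs(corpus, window):
    print(f"SG:   {center} -> {ctx_word}")
```

Running the sketch shows, for example, that the CBOW pair `('the', 'brown') -> quick` corresponds exactly to the SG pairs `quick -> the` and `quick -> brown`, which is the structural sense in which the two objectives are complementary.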
doi:10.24963/ijcai.2018/607
dblp:conf/ijcai/SongS18
fatcat:iujvttbqtfaxrnbnurbzpc75im