A copy of this work was preserved in the Wayback Machine (captured 2018). File type: application/pdf.
Dict2vec: Learning Word Embeddings using Lexical Dictionaries
2017
Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing
Learning word embeddings on a large unlabeled corpus has been shown to improve many natural language tasks. The most efficient and popular approaches learn or retrofit such representations using additional external data. The resulting embeddings are generally better than their corpus-only counterparts, although such resources cover only a fraction of the words in the vocabulary. In this paper, we propose a new approach, Dict2vec, based on one of the largest yet refined data sources for describing words: natural language dictionaries.
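The abstract only states the idea at a high level. As a minimal illustrative sketch (not the paper's actual objective, code, or hyperparameters), the snippet below builds (headword, definition-word) pairs from a toy dictionary and applies a naive attractive update that pulls each pair's vectors closer; the entries, stopword list, learning rate, and update rule are all assumptions for illustration.

    # Illustrative sketch: dictionary definitions as a source of word pairs
    # used to pull related embeddings closer. Everything below is a toy
    # assumption, not the Dict2vec training objective.
    import numpy as np

    definitions = {
        "car": "a road vehicle with an engine and four wheels",
        "engine": "a machine that converts energy into motion",
    }

    # Build (headword, definition-word) pairs, skipping a few stopwords.
    stopwords = {"a", "an", "the", "and", "with", "that", "into", "of"}
    pairs = [
        (head, tok)
        for head, text in definitions.items()
        for tok in text.split()
        if tok not in stopwords
    ]

    # Random initial embeddings for every word seen in the pairs.
    rng = np.random.default_rng(0)
    vocab = {w for pair in pairs for w in pair}
    emb = {w: rng.normal(scale=0.1, size=50) for w in vocab}

    # One pass of a simple attractive update: move paired words closer.
    lr = 0.05
    for head, tok in pairs:
        diff = emb[head] - emb[tok]
        emb[head] -= lr * diff
        emb[tok] += lr * diff

    print(f"{len(pairs)} dictionary pairs used to adjust {len(vocab)} vectors")

In the actual approach, such dictionary-derived pairs supplement corpus co-occurrence statistics rather than replace them; the sketch only conveys why definition words are a natural source of semantic supervision.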
doi:10.18653/v1/d17-1024
dblp:conf/emnlp/TissierGH17
fatcat:stk4a2k4fjcmta4d3o2b4u46jm