Incorporating Extra Knowledge to Enhance Word Embedding

Arpita Roy, Shimei Pan
Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence (IJCAI 2020)
Word embedding, the process of automatically learning mathematical representations of words from unlabeled text corpora, has gained a lot of attention recently. Since words are the basic units of a natural language, the more precisely we can represent the morphological, syntactic and semantic properties of words, the better we can support downstream Natural Language Processing (NLP) tasks. Since traditional word embeddings are mainly designed to capture the semantic relatedness between co-occurred words in a predefined context, they may not be effective in encoding other information that is important for different NLP applications. In this survey, we summarize the recent advances in incorporating extra knowledge to enhance word embedding. We also identify the limitations of existing work and point out a few promising future directions.
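To make the notion of "learning mathematical representations of words from unlabeled text" concrete, here is a minimal sketch of one classic count-based approach: build a word co-occurrence matrix from a toy corpus and factorize it with truncated SVD to obtain dense word vectors (LSA-style). This is an illustrative assumption on our part, not the specific method surveyed in the paper; the corpus, window size, and embedding dimension are arbitrary choices.

```python
import numpy as np

# Toy unlabeled corpus; real embeddings are trained on far larger corpora.
corpus = [
    "the cat sat on the mat",
    "the dog sat on the rug",
    "the cat chased the dog",
]

# Build the vocabulary and a word-to-index map.
tokens = [sentence.split() for sentence in corpus]
vocab = sorted({w for sent in tokens for w in sent})
idx = {w: i for i, w in enumerate(vocab)}

# Count co-occurrences within a symmetric context window of size 2
# (this window is the "predefined context" mentioned in the abstract).
window = 2
M = np.zeros((len(vocab), len(vocab)))
for sent in tokens:
    for i, w in enumerate(sent):
        lo, hi = max(0, i - window), min(len(sent), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                M[idx[w], idx[sent[j]]] += 1.0

# Factorize the co-occurrence matrix; rows of the truncated factor
# serve as dense word vectors.
U, S, _ = np.linalg.svd(M, full_matrices=False)
dim = 4
embeddings = U[:, :dim] * S[:dim]

def similarity(a, b):
    """Cosine similarity between the learned vectors of two words."""
    va, vb = embeddings[idx[a]], embeddings[idx[b]]
    return float(va @ vb / (np.linalg.norm(va) * np.linalg.norm(vb) + 1e-9))

print(similarity("cat", "dog"))
```

Such purely distributional vectors capture only co-occurrence-based relatedness, which is exactly the limitation the survey addresses by injecting extra knowledge (e.g., lexical or domain resources) into the embedding process.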
doi:10.24963/ijcai.2020/676 dblp:conf/ijcai/GaoCR0020 fatcat:n3hj4lad2vcphpmzdnwgflp7x4