Incorporating Extra Knowledge to Enhance Word Embedding
2020
Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence
Word embedding, the process of automatically learning mathematical representations of words from unlabeled text corpora, has recently attracted considerable attention. Since words are the basic units of a natural language, the more precisely we can represent the morphological, syntactic, and semantic properties of words, the better we can support downstream Natural Language Processing (NLP) tasks. Since traditional word embeddings are mainly designed to capture the semantic relatedness between
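As a rough illustration of what "learning word representations from unlabeled text" means in the distributional sense, the sketch below builds simple co-occurrence vectors from a toy corpus and compares words by cosine similarity. This is a minimal, hypothetical example for intuition only; the paper's actual method, corpus, and model are not shown here.

```python
from collections import Counter
import math

# Toy unlabeled corpus (an assumption for illustration, not from the paper).
corpus = [
    "the cat sat on the mat".split(),
    "the dog sat on the rug".split(),
    "a cat and a dog played".split(),
]

def cooccurrence(sentences, window=2):
    """Represent each word by the counts of its neighbors
    within a symmetric context window."""
    vecs = {w: Counter() for sent in sentences for w in sent}
    for sent in sentences:
        for i, w in enumerate(sent):
            lo, hi = max(0, i - window), min(len(sent), i + window + 1)
            for j in range(lo, hi):
                if j != i:
                    vecs[w][sent[j]] += 1
    return vecs

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[k] * b[k] for k in set(a) | set(b))
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

vecs = cooccurrence(corpus)
# "cat" and "dog" occur in similar contexts ("the _ sat on"),
# so their vectors are closer than e.g. "cat" and "mat".
print(cosine(vecs["cat"], vecs["dog"]))
print(cosine(vecs["cat"], vecs["mat"]))
```

Dense embeddings such as word2vec or GloVe learn low-dimensional vectors with the same goal, replacing raw counts with trained parameters; the paper's contribution concerns enriching such representations with extra knowledge beyond co-occurrence statistics.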
doi:10.24963/ijcai.2020/676
dblp:conf/ijcai/GaoCR0020
fatcat:n3hj4lad2vcphpmzdnwgflp7x4