Enhancing Semantic Word Representations by Embedding Deeper Word Relationships
[article] · 2019 · arXiv pre-print
Word representations are created using analogy, context-based statistics, and lexical relations on words. Word representations serve as inputs to learning models in Natural Language Understanding (NLU) tasks. However, knowing only the context is not sufficient to understand language; reading between the lines is a key component of NLU. Embedding deeper word relationships that are not represented in the context enhances the word representation. This paper presents a word embedding which combines …
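The abstract is truncated above, so the paper's exact construction is not shown here. As a rough, hedged sketch of the general idea it describes (enriching context-based vectors with lexical relations not visible in the context), one common approach in the literature is a retrofitting-style update that pulls each word's vector toward its lexical neighbors. The function name, parameters, and toy data below are hypothetical illustration, not the paper's published method.

    import numpy as np

    def retrofit(vectors, relations, iterations=10, alpha=1.0, beta=1.0):
        """Nudge context-based word vectors toward their lexical neighbors.

        Illustrative retrofitting-style update, not the paper's method.
        vectors:   dict mapping word -> np.ndarray (context-based embedding)
        relations: dict mapping word -> list of lexically related words
        """
        new_vecs = {w: v.copy() for w, v in vectors.items()}
        for _ in range(iterations):
            for word, neighbors in relations.items():
                nbrs = [n for n in neighbors if n in new_vecs]
                if word not in vectors or not nbrs:
                    continue
                # Weighted average of the original context vector (alpha)
                # and the current vectors of lexical neighbors (beta each).
                total = alpha * vectors[word] + beta * sum(new_vecs[n] for n in nbrs)
                new_vecs[word] = total / (alpha + beta * len(nbrs))
        return new_vecs

    # Toy usage with hypothetical 3-d vectors and a synonym relation.
    vectors = {"happy": np.array([1.0, 0.0, 0.0]),
               "glad":  np.array([0.0, 1.0, 0.0])}
    relations = {"happy": ["glad"], "glad": ["happy"]}
    enriched = retrofit(vectors, relations)

The effect is that words linked by a lexical relation end up closer in the vector space even when their corpus contexts differ, which is the kind of "deeper word relationship" the abstract argues the context alone cannot supply.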
arXiv:1901.07176v1
fatcat:l2vcvimvu5ctrasqtxncfd3nre