A copy of this work was available on the public web and has been preserved in the Wayback Machine; the capture dates from 2022. The file type is application/pdf.
Contextual Embedding for Distributed Representations of Entities in a Text Corpus
2016
Knowledge Discovery and Data Mining
Distributed representations of textual elements in low-dimensional vector space that capture context have gained great attention recently. Current state-of-the-art word embedding techniques compute distributed representations using co-occurrences of words within a contextual window, discounting the flexibility to incorporate other contextual phenomena such as temporal, geographical, and topical contexts. In this paper, we present a flexible framework that has the ability to leverage temporal,
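For readers unfamiliar with the window-based co-occurrence statistic the abstract refers to, the sketch below illustrates how such counts are typically gathered from a token stream. It is a generic illustration (the `window_cooccurrences` helper is hypothetical), not the contextual-embedding framework proposed in the paper.

```python
from collections import Counter

def window_cooccurrences(tokens, window=2):
    """Count co-occurrences of tokens within a symmetric context window.

    Generic illustration of the statistic that standard word-embedding
    methods (e.g. skip-gram, GloVe) build on; `window` is the number of
    neighbours considered on each side of the target token.
    """
    counts = Counter()
    for i, target in enumerate(tokens):
        lo = max(0, i - window)
        hi = min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                counts[(target, tokens[j])] += 1
    return counts

if __name__ == "__main__":
    corpus = "distributed representations capture context in low dimensional space".split()
    for pair, n in window_cooccurrences(corpus, window=2).most_common(5):
        print(pair, n)
```

Methods built on this statistic see context only through positional proximity, which is the limitation the abstract points out: temporal, geographical, or topical context has no place in a purely window-based count.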
dblp:conf/kdd/KaderBNH16
fatcat:tkq4xtop2bduvkrx6avsj3hov4