Incorporating Both Distributional and Relational Semantics in Word Representations

Daniel Fried, Kevin Duh
2015, arXiv pre-print
We investigate the hypothesis that word representations ought to incorporate both distributional and relational semantics. To this end, we employ the Alternating Direction Method of Multipliers (ADMM), which flexibly optimizes a distributional objective on raw text and a relational objective on WordNet. Preliminary results on knowledge base completion, analogy tests, and parsing show that word representations trained on both objectives can give improvements in some cases.
arXiv:1412.5836v3
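The consensus-style ADMM setup the abstract describes can be sketched with toy objectives. In the sketch below, two quadratic losses stand in for the distributional objective (raw text) and the relational objective (WordNet); the targets `t_dist` and `t_rel`, the penalty `rho`, and the iteration count are all illustrative assumptions, not the paper's actual losses or hyperparameters. The key structure is the ADMM pattern itself: two embedding copies coupled by a consensus constraint and a dual variable.

```python
import numpy as np

# Illustrative ADMM sketch (not the paper's implementation):
# toy quadratic objectives 0.5||x - t||^2 stand in for the real
# distributional and relational losses; w == v is the consensus
# constraint coupling the two embedding copies.

rng = np.random.default_rng(0)
d = 5
t_dist = rng.normal(size=d)  # hypothetical optimum of the distributional loss
t_rel = rng.normal(size=d)   # hypothetical optimum of the relational loss

w = np.zeros(d)  # copy updated against the distributional objective
v = np.zeros(d)  # copy updated against the relational objective
y = np.zeros(d)  # scaled dual variable enforcing w == v
rho = 1.0        # ADMM penalty parameter (arbitrary here)

for _ in range(100):
    # w-update: argmin_w 0.5||w - t_dist||^2 + (rho/2)||w - v + y||^2
    w = (t_dist + rho * (v - y)) / (1.0 + rho)
    # v-update: argmin_v 0.5||v - t_rel||^2 + (rho/2)||w - v + y||^2
    v = (t_rel + rho * (w + y)) / (1.0 + rho)
    # dual update: accumulate the consensus residual
    y = y + (w - v)

# For these symmetric quadratics, w and v converge to the average of
# the two targets, i.e. a compromise between the two objectives.
assert np.allclose(w, v, atol=1e-6)
assert np.allclose(w, (t_dist + t_rel) / 2, atol=1e-6)
```

With real embedding losses the per-block minimizations would be replaced by (stochastic) gradient steps on each objective plus the quadratic penalty term, but the alternation and the dual update follow the same pattern.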