Generalized Tuning of Distributional Word Vectors for Monolingual and Cross-Lingual Lexical Entailment
2019
Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics
Lexical entailment (LE; also known as the hyponymy-hypernymy or is-a relation) is a core asymmetric lexical relation that supports tasks like taxonomy induction and text generation. In this work, we propose a simple and effective method for fine-tuning distributional word vectors for LE. Our Generalized Lexical ENtailment model (GLEN) is decoupled from the word embedding model and applicable to any distributional vector space. Yet, unlike existing retrofitting models, it captures a general
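The abstract's core idea (a tuning step that is decoupled from the underlying embedding model and sensitive to the asymmetry of the is-a relation) can be illustrated with a toy sketch. Everything below is an illustrative assumption, not the paper's actual GLEN formulation: the vocabulary, the shared linear map `W`, and the asymmetric `le_score` are hypothetical stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)

# Pretrained distributional vectors for a toy vocabulary. Any embedding
# model could supply these; the tuning step below is decoupled from it.
vocab = ["animal", "dog", "poodle", "car"]
emb = {w: rng.standard_normal(4) for w in vocab}

def specialize(vec, W):
    # One shared linear map applied to every vector. Because the map is
    # global (not per-word, unlike typical retrofitting), it can also be
    # applied to words unseen during tuning.
    return W @ vec

def le_score(hypo, hyper, W):
    # Hypothetical asymmetric LE score: cosine similarity minus a penalty
    # when the hyponym's specialized vector has a larger norm than the
    # hypernym's, so score(dog, animal) != score(animal, dog) in general.
    u, v = specialize(emb[hypo], W), specialize(emb[hyper], W)
    cos = (u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return cos - max(0.0, np.linalg.norm(u) - np.linalg.norm(v))

# Identity map = untuned baseline; training would adjust W on LE pairs.
W = np.eye(4)
print(le_score("dog", "animal", W))
print(le_score("animal", "dog", W))
```

Note how the score is asymmetric by construction even before tuning: only one direction of each word pair incurs the norm penalty, which is one simple way to encode the directionality of is-a.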
doi:10.18653/v1/p19-1476
dblp:conf/acl/GlavasV19
fatcat:4rfdb6vbhvep7bz3nnr3ydulbi