Composition of Sentence Embeddings: Lessons from Statistical Relational Learning

Damien Sileo, Tim Van De Cruys, Camille Pradel, Philippe Muller
2019 Proceedings of the Eighth Joint Conference on Lexical and Computational Semantics (*SEM 2019)
Various NLP problems, such as the prediction of sentence similarity, entailment, and discourse relations, are all instances of the same general task: the modeling of semantic relations between a pair of textual elements. A popular approach to such problems is to embed sentences into fixed-size vectors and use composition functions (e.g. concatenation or sum) of those vectors as features for the prediction. At the same time, composition of embeddings has been a main focus within the field of Statistical Relational Learning (SRL), whose goal is to predict relations between entities (typically from knowledge base triples). In this article, we show that previous work on relation prediction between texts implicitly uses compositions from baseline SRL models. We show that such compositions are not expressive enough for several tasks (e.g. natural language inference). We build on recent SRL models to address textual relational problems, showing that they are more expressive and can alleviate issues from simpler compositions. The resulting models significantly improve the state of the art in both transferable sentence representation learning and relation prediction.
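As a rough illustration of the baseline compositions the abstract refers to (concatenation, sum, difference, element-wise product of two sentence embeddings used as classifier features), here is a minimal sketch; the function name, feature set, and dimensions are illustrative assumptions, not the paper's code.

```python
import numpy as np

def compose_pair(u: np.ndarray, v: np.ndarray) -> np.ndarray:
    """Baseline composition of two sentence embeddings u and v:
    concatenate [u, v, |u - v|, u * v], a common feature vector
    for predicting the relation holding between a sentence pair."""
    return np.concatenate([u, v, np.abs(u - v), u * v])

# Toy example: two 4-dimensional "sentence embeddings".
u = np.array([0.1, 0.4, -0.2, 0.7])
v = np.array([0.0, 0.5, -0.1, 0.6])
features = compose_pair(u, v)  # shape (16,), fed to a downstream classifier
```

Such compositions treat the pair symmetrically up to the ordering of the concatenated blocks; the article argues that richer, SRL-inspired compositions are needed for asymmetric relations such as entailment.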
doi:10.18653/v1/s19-1004 dblp:conf/starsem/SileoCPM19