Bridging Knowledge Gaps in Neural Entailment via Symbolic Models
2018
Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
Most textual entailment models focus on lexical gaps between the premise text and the hypothesis, but rarely on knowledge gaps. We focus on filling these knowledge gaps in the Science Entailment task by leveraging an external structured knowledge base (KB) of science facts. Our new architecture combines standard neural entailment models with a knowledge lookup module. To facilitate this lookup, we propose a fact-level decomposition of the hypothesis, verifying the resulting sub-facts against both the textual premise and the structured KB.
doi:10.18653/v1/d18-1535
dblp:conf/emnlp/KangKSC18
fatcat:bfc3of2epberxltpykqzw2sjme