A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2020; you can also visit the original URL.
The file type is application/pdf.
Neural Tree Indexers for Text Understanding
2017
Proceedings of the Annual Meeting of the Association for Computational Linguistics (ACL)
Recurrent neural networks (RNNs) process input text sequentially and model the conditional transitions between word tokens. Recursive networks, in contrast, explicitly model the compositionality and the recursive structure of natural language. However, current recursive architectures are limited by their dependence on syntactic trees. In this paper, we introduce a robust, syntactic-parsing-independent tree-structured model, Neural Tree Indexers (NTI), that …
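The contrast the abstract draws can be illustrated with a toy sketch (not the paper's NTI model; the function names and the tuple-pairing stand-in for a learned composition function are illustrative assumptions): a sequential RNN folds tokens left to right into a left-branching chain, while a tree-structured model composes child representations bottom-up over a binary tree that need not come from a syntactic parser.

```python
# Hypothetical minimal sketch (not the paper's NTI model). Tuple pairing
# stands in for a learned composition function; names are illustrative.

def rnn_compose(tokens):
    """Fold tokens left to right, as a sequential RNN folds its hidden state:
    h_t = f(h_{t-1}, x_t), with tuple pairing standing in for f."""
    state = ()
    for tok in tokens:
        state = (state, tok)
    return state

def tree_compose(tokens):
    """Compose adjacent pairs bottom-up over a balanced binary tree, as
    recursive models compose child representations into a parent node.
    No syntactic parse is needed to build this tree."""
    nodes = list(tokens)
    while len(nodes) > 1:
        nxt = []
        for i in range(0, len(nodes) - 1, 2):
            nxt.append((nodes[i], nodes[i + 1]))  # parent = g(left, right)
        if len(nodes) % 2:  # carry an unpaired node up unchanged
            nxt.append(nodes[-1])
        nodes = nxt
    return nodes[0]

print(rnn_compose("abcd"))   # left-branching chain: (((((), 'a'), 'b'), 'c'), 'd')
print(tree_compose("abcd"))  # balanced binary tree: (('a', 'b'), ('c', 'd'))
```

The sequential fold makes each token depend on everything before it, while the tree fold lets distant tokens meet after only O(log n) composition steps, which is the structural advantage the abstract attributes to recursive models.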
pmid:29081577
pmcid:PMC5657441
fatcat:lbfx6koqhjbxnd4qp7zmleachy