A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2016; you can also visit the original URL.
File type: application/pdf
When Are Tree Structures Necessary for Deep Learning of Representations?
2015
Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing
Recursive neural models, which use syntactic parse trees to recursively generate representations bottom-up, are a popular architecture. However, there have not been rigorous evaluations showing exactly which tasks this syntax-based method is appropriate for. In this paper, we benchmark recursive neural models against sequential recurrent neural models, enforcing apples-to-apples comparison as much as possible. We investigate 4 tasks: (1) sentiment classification at the sentence level and phrase
doi:10.18653/v1/d15-1278
dblp:conf/emnlp/LiLJH15
fatcat:bcvvcfcvenbubl6qdaz66bua2q
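The abstract above contrasts recursive (tree-structured, bottom-up) composition with sequential recurrent composition. A minimal sketch of that distinction, assuming a toy scalar-averaging composition function in place of a learned neural layer (purely illustrative; this is not the paper's implementation):

```python
def compose(a, b):
    # Toy composition; a real model would use something like tanh(W @ [a; b] + bias).
    return (a + b) / 2.0

def recurrent(seq):
    """Sequential left-to-right composition, as in a simple recurrent model."""
    h = seq[0]
    for x in seq[1:]:
        h = compose(h, x)
    return h

def recursive(tree):
    """Bottom-up composition over a binary parse tree.

    A tree is either a leaf value or a (left, right) pair of subtrees.
    """
    if isinstance(tree, tuple):
        left, right = tree
        return compose(recursive(left), recursive(right))
    return tree

words = [1.0, 2.0, 3.0, 4.0]
# A hypothetical parse ((1 2) (3 4)) groups the inputs differently
# than strict left-to-right order, so the two models compute
# different representations from the same inputs.
parse = ((1.0, 2.0), (3.0, 4.0))
print(recurrent(words))   # → 3.125
print(recursive(parse))   # → 2.5
```

The point of the sketch: both models reduce a sequence to a single representation, but the recursive model's order of composition is dictated by the syntactic parse rather than by linear position, which is precisely the design difference the paper benchmarks.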