Modeling Local Dependence in Natural Language with Multi-channel Recurrent Neural Networks
arXiv pre-print [article], 2018
Recurrent Neural Networks (RNNs) have been widely used in natural language processing tasks and have achieved great success. Traditional RNNs usually treat each token in a sentence uniformly and equally. However, this can miss the rich semantic structure of a sentence, which is useful for understanding natural language. Since semantic structures such as word dependence patterns are not parameterized, capturing and leveraging this structure information is a challenge. In this paper, we
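The abstract is cut off before the method is described, so the following is only a minimal, hypothetical sketch of what a "multi-channel" recurrent encoder could look like in the loose sense suggested by the title: several independent recurrent channels read the same token sequence and a per-token gate mixes their hidden states. It is not taken from the paper; the class name, channel count, and gating scheme are all assumptions for illustration.

```python
# Hypothetical sketch (not the paper's architecture): multiple GRU "channels"
# over the same token sequence, combined per token by a learned softmax gate.
import torch
import torch.nn as nn

class MultiChannelRNN(nn.Module):
    def __init__(self, vocab_size, embed_dim, hidden_dim, num_channels=3):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Each channel is an independent recurrent encoder over the same input.
        self.channels = nn.ModuleList(
            [nn.GRU(embed_dim, hidden_dim, batch_first=True) for _ in range(num_channels)]
        )
        # Per-token gate producing a softmax weight for each channel.
        self.gate = nn.Linear(embed_dim, num_channels)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len)
        x = self.embed(token_ids)                        # (batch, seq_len, embed_dim)
        outs = [rnn(x)[0] for rnn in self.channels]      # each (batch, seq_len, hidden_dim)
        stacked = torch.stack(outs, dim=-1)              # (batch, seq_len, hidden_dim, C)
        weights = torch.softmax(self.gate(x), dim=-1)    # (batch, seq_len, C)
        # Token-dependent weighted sum over channels.
        mixed = (stacked * weights.unsqueeze(2)).sum(-1) # (batch, seq_len, hidden_dim)
        return mixed

# Usage on a toy batch of token ids.
model = MultiChannelRNN(vocab_size=1000, embed_dim=32, hidden_dim=64)
hidden = model(torch.randint(0, 1000, (2, 7)))
print(hidden.shape)  # torch.Size([2, 7, 64])
```

The per-token gate is one plausible way to let the model weight different channels for different positions, which loosely echoes the stated motivation of not treating every token uniformly; the paper itself may combine channels differently.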
arXiv:1811.05121v1
fatcat:zdxqssw36rejrbc6zlpm5wwt2a