Recurrent Neural CRF for Aspect Term Extraction with Dependency Transmission
[chapter]
2018
Lecture Notes in Computer Science
This paper presents a novel neural architecture for aspect term extraction in the area of fine-grained sentiment computing. In addition to amalgamating sequential features (character embeddings, word embeddings, and POS tagging information), we train end-to-end Recurrent Neural Networks (RNNs) with meticulously designed dependency transmission between recurrent units, thereby making it possible to learn structural syntactic phenomena. The experimental results show that incorporating these shallow…
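The architecture described above ends in a CRF layer that decodes a BIO tag sequence (B = begin aspect term, I = inside, O = outside) from per-token scores. As a rough illustration of that decoding step only, here is a minimal Viterbi decoder over emission and transition scores; the function name and all score values are illustrative assumptions, not the paper's trained model or parameters:

```python
import numpy as np

def viterbi_decode(emissions, transitions):
    """Find the highest-scoring tag sequence under a linear-chain CRF.

    emissions: (T, K) per-token tag scores (e.g. produced by an RNN)
    transitions: (K, K) score of moving from tag i to tag j
    Returns the best tag index sequence of length T.
    """
    T, K = emissions.shape
    score = emissions[0].copy()            # best score ending in each tag
    backptr = np.zeros((T, K), dtype=int)  # where each best score came from
    for t in range(1, T):
        # total[i, j] = score[i] + transitions[i, j] + emissions[t, j]
        total = score[:, None] + transitions + emissions[t][None, :]
        backptr[t] = total.argmax(axis=0)
        score = total.max(axis=0)
    # Trace the best path backwards from the highest final score.
    best = [int(score.argmax())]
    for t in range(T - 1, 0, -1):
        best.append(int(backptr[t, best[-1]]))
    return best[::-1]

# Toy example: 3 tokens, tags O=0, B=1, I=2 (BIO aspect tagging).
emissions = np.array([[2.0, 0.5, 0.0],
                      [0.0, 3.0, 0.5],
                      [0.0, 0.5, 2.5]])
# Transition scores; O -> I is heavily penalised (invalid in BIO).
transitions = np.array([[0.5, 0.5, -10.0],
                        [0.0, 0.0, 1.0],
                        [0.0, 0.0, 1.0]])
print(viterbi_decode(emissions, transitions))  # [0, 1, 2] -> "O B I"
```

The transition matrix is what lets the CRF layer rule out invalid tag sequences such as an I tag directly after O, which per-token softmax classification cannot do.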
doi:10.1007/978-3-319-99495-6_32