A copy of this work was available on the public web and has been preserved in the Wayback Machine; the capture dates from 2019.
The file type is application/pdf.
Context-Sensitive and Role-Dependent Spoken Language Understanding Using Bidirectional and Attention LSTMs
Interspeech 2016
To understand speaker intentions accurately in a dialog, it is important to consider the context of the surrounding sequence of dialog turns. Furthermore, each speaker may play a different role in the conversation, such as agent versus client, and thus features related to these roles may be important to the context. In previous work, we proposed context-sensitive spoken language understanding (SLU) using role-dependent long short-term memory (LSTM) recurrent neural networks (RNNs), and showed
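The core idea in the abstract — separate LSTM parameters per speaker role, with hidden state carried across dialog turns so each turn is interpreted in context — can be sketched as follows. This is a minimal, untrained numpy illustration, not the paper's implementation: the sizes, role names, and random weights are made up for demonstration, and it is unidirectional without attention (the paper additionally uses bidirectional and attention LSTMs).

```python
import numpy as np

rng = np.random.default_rng(0)

H, X = 8, 6  # hidden and input sizes (illustrative, not from the paper)

def lstm_params():
    # One set of LSTM weights; the four gates (input, forget, cell, output)
    # are stacked along the first axis.
    return (rng.standard_normal((4 * H, X + H)) * 0.1,
            np.zeros(4 * H))

# Role-dependent parameters: a separate LSTM per speaker role,
# mirroring the agent-versus-client distinction described above.
roles = {"agent": lstm_params(), "client": lstm_params()}

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(params, x, h, c):
    W, b = params
    z = W @ np.concatenate([x, h]) + b
    i, f, g, o = np.split(z, 4)
    i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
    c = f * c + i * np.tanh(g)   # new cell state
    h = o * np.tanh(c)           # new hidden state
    return h, c

def run_dialog(turns):
    """Process a sequence of (role, word_vectors) turns.

    The hidden and cell states are carried across turn boundaries, so each
    turn's representation reflects the whole dialog history, while the
    weights used at each step are selected by the speaker's role."""
    h, c = np.zeros(H), np.zeros(H)
    states = []
    for role, xs in turns:
        for x in xs:                       # word vectors within the turn
            h, c = lstm_step(roles[role], x, h, c)
        states.append(h.copy())            # context-sensitive turn state
    return states

# Toy two-turn dialog with random word vectors.
dialog = [("client", rng.standard_normal((3, X))),
          ("agent", rng.standard_normal((2, X)))]
reps = run_dialog(dialog)
```

In a real system these per-turn states would feed a classifier over intent or concept labels and the weights would be trained end to end; the sketch only shows the role-switched, context-carrying recurrence.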
doi:10.21437/interspeech.2016-1171
dblp:conf/interspeech/HoriHWH16
fatcat:nfaw4qqb3rhwrolupb7gehw27a