A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2019; you can also visit the original URL.
The file type is application/pdf.
NN-Grams: Unifying Neural Network and n-Gram Language Models for Speech Recognition
2016
Interspeech 2016
We present NN-grams, a novel hybrid language model integrating n-grams and neural networks (NN) for speech recognition. The model takes as input both word histories and n-gram counts. Thus, it combines the memorization capacity and scalability of an n-gram model with the generalization ability of neural networks. We report experiments where the model is trained on 26B words. NN-grams are efficient at runtime since they do not include an output softmax layer. The model is trained using
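The abstract describes a model that scores a candidate word from both a neural representation of the word history and n-gram count features, with no output softmax. As a rough illustration only (not the paper's architecture; the feature choice, function names, and linear scoring layer here are assumptions), one way such a hybrid scorer can be sketched:

```python
import math

def ngram_log_count_feature(counts, history, candidate):
    """Hypothetical n-gram feature: log(1 + count) of the candidate
    continuing the given history, looked up in a count table."""
    return math.log1p(counts.get(history + (candidate,), 0))

def score(history_vec, ngram_feat, w_h, w_c, b):
    """Unnormalized score combining a neural history vector with the
    n-gram count feature via a simple linear layer; because the score
    is unnormalized, no softmax over the vocabulary is computed."""
    neural_part = sum(h * w for h, w in zip(history_vec, w_h))
    return neural_part + w_c * ngram_feat + b
```

Because the score is left unnormalized, evaluating one candidate does not require summing over the whole vocabulary, which is the efficiency property the abstract points to.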
doi:10.21437/interspeech.2016-1295
dblp:conf/interspeech/DamavandiKSB16
fatcat:zwyyyu66vbfynnl4ihqdyur2qy