NN-Grams: Unifying Neural Network and n-Gram Language Models for Speech Recognition

Babak Damavandi, Shankar Kumar, Noam Shazeer, Antoine Bruguier
Interspeech 2016
We present NN-grams, a novel, hybrid language model that integrates n-grams and neural networks (NN) for speech recognition. The model takes as input both word histories and n-gram counts, combining the memorization capacity and scalability of an n-gram model with the generalization ability of neural networks. We report experiments in which the model is trained on 26B words. NN-grams are efficient at runtime since they do not include an output soft-max layer. The model is trained using noise contrastive estimation (NCE), an approach that transforms the estimation problem of neural networks into one of binary classification between data samples and noise samples. We present results with noise samples derived from either an n-gram distribution or from speech recognition lattices. NN-grams outperform an n-gram model on an Italian speech recognition dictation task.
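The NCE objective mentioned in the abstract can be illustrated with a minimal sketch. This is not the paper's implementation; the function names, argument layout, and use of a fixed noise ratio `k` are illustrative assumptions. Each sample is scored by the model, and a logistic classifier decides whether it came from the data or from the noise distribution:

```python
import math

def nce_logistic(score, noise_logprob, k):
    # Posterior probability that a sample came from the data rather than
    # from the noise distribution, under the standard NCE formulation:
    # sigma(s(w|h) - log(k * p_noise(w))).
    # `score` is the (unnormalized) model log-score s(w|h); `noise_logprob`
    # is log p_noise(w); `k` is the number of noise samples per data sample.
    return 1.0 / (1.0 + math.exp(-(score - math.log(k) - noise_logprob)))

def nce_loss(data_scores, data_noise_lp, noise_scores, noise_noise_lp, k):
    # Binary cross-entropy over the two sample sets: data samples are
    # labeled 1 (came from data), noise samples are labeled 0.
    loss = 0.0
    for s, lp in zip(data_scores, data_noise_lp):
        loss -= math.log(nce_logistic(s, lp, k))
    for s, lp in zip(noise_scores, noise_noise_lp):
        loss -= math.log(1.0 - nce_logistic(s, lp, k))
    return loss
```

Because the classifier works directly on unnormalized scores, no soft-max over the full vocabulary is needed at training or inference time, which is the runtime-efficiency point the abstract makes.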
doi:10.21437/interspeech.2016-1295 dblp:conf/interspeech/DamavandiKSB16