Efficient training of large neural networks for language modeling

H. Schwenk
2004 IEEE International Joint Conference on Neural Networks (IEEE Cat. No.04CH37541)  
Recently there has been increasing interest in using neural networks for language modeling. In contrast to the well-known backoff n-gram language models, the neural network approach tries to limit the data sparseness problem by performing the estimation in a continuous space, which allows smooth interpolation between word sequences. Training such a model and computing a single n-gram probability are, however, several orders of magnitude more expensive than with backoff models, making the new approach difficult to use in real applications. In this paper several techniques are presented that allow the use of a neural network language model in a large-vocabulary speech recognition system, in particular very fast lattice rescoring and efficient training of large neural networks on training corpora of over 10 million words. The described approach achieves significant word error reductions with respect to a carefully tuned 4-gram backoff language model in a state-of-the-art conversational speech recognizer for the DARPA Rich Transcription evaluations.
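To make the continuous-space idea concrete, below is a minimal sketch of a feedforward neural network language model of the kind this work builds on. It is not the paper's architecture or configuration: all dimensions, parameter names, and the initialization are illustrative assumptions. Each context word is mapped to a learned continuous vector, the concatenated vectors pass through a hidden layer, and a softmax over the vocabulary yields the n-gram probability.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes only -- not the paper's configuration.
V, d, h, n = 10_000, 50, 200, 4   # vocabulary, embedding dim, hidden units, n-gram order

# Parameters: shared word embeddings, one hidden layer, softmax output.
C = rng.normal(0, 0.1, (V, d))             # continuous word representations
W1 = rng.normal(0, 0.1, ((n - 1) * d, h))  # projection layer -> hidden layer
b1 = np.zeros(h)
W2 = rng.normal(0, 0.1, (h, V))            # hidden layer -> output scores
b2 = np.zeros(V)

def ngram_prob(context_ids, next_id):
    """P(next_id | context_ids), estimated in continuous space."""
    x = C[context_ids].reshape(-1)          # concatenate the n-1 context embeddings
    hidden = np.tanh(x @ W1 + b1)
    scores = hidden @ W2 + b2
    scores -= scores.max()                  # numerical stability for the softmax
    p = np.exp(scores)
    p /= p.sum()                            # softmax over the full vocabulary
    return p[next_id]

# A single evaluation requires the softmax over all V output words, which is
# why one n-gram probability costs orders of magnitude more than a backoff
# table lookup -- the cost the paper's efficiency techniques are aimed at.
print(ngram_prob(np.array([12, 7, 431]), 99))
```

The dominant term in the sketch is the hidden-to-output product over all V words; efficiency techniques in this line of work attack exactly that cost, for example by restricting the output layer to the most frequent words, though the abstract does not spell out which specific optimizations the paper uses.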
doi:10.1109/ijcnn.2004.1381158