Connectionist language modeling for large vocabulary continuous speech recognition

Holger Schwenk, Jean-Luc Gauvain
2002 IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP)
This paper describes ongoing work on a new approach to language modeling for large vocabulary continuous speech recognition. Almost all state-of-the-art systems use statistical n-gram language models estimated on text corpora. One principal problem with such language models is that many of the n-grams are never observed even in very large training corpora, so it is common to back off to a lower-order model. In this paper we propose to address this problem by carrying out the estimation task in a continuous space, enabling a smooth interpolation of the probabilities. A neural network is used to learn the projection of the words onto a continuous space and to estimate the n-gram probabilities. The connectionist language model is being evaluated on the DARPA HUB5 conversational telephone speech recognition task, and preliminary results show consistent improvements in both perplexity and word error rate.
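The architecture the abstract describes, a learned continuous projection of context words followed by a network that outputs n-gram probabilities, can be sketched as a small feed-forward model. This is a minimal illustration, not the paper's implementation: all dimensions, weight initializations, and names below are assumed for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (assumptions, not from the paper): vocabulary V,
# projection dimension d, context length n-1, hidden units h.
V, d, n_ctx, h = 1000, 50, 3, 100

# Projection matrix: each word id maps to a continuous d-dimensional vector.
C = rng.normal(scale=0.1, size=(V, d))
# Feed-forward layers that map the projected context to output probabilities.
W_h = rng.normal(scale=0.1, size=(n_ctx * d, h))
b_h = np.zeros(h)
W_o = rng.normal(scale=0.1, size=(h, V))
b_o = np.zeros(V)

def ngram_probs(context_ids):
    """Return P(w | context) over the whole vocabulary for n-1 context ids."""
    x = C[context_ids].reshape(-1)      # concatenate the word projections
    hidden = np.tanh(x @ W_h + b_h)     # non-linear hidden layer
    logits = hidden @ W_o + b_o
    e = np.exp(logits - logits.max())   # numerically stable softmax
    return e / e.sum()

p = ngram_probs([12, 7, 42])            # a 4-gram context of three word ids
```

Because the softmax normalizes over the full vocabulary, every context yields a proper distribution, which is what allows smooth interpolation between n-grams that were never seen together in training.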
doi:10.1109/icassp.2002.5743830 dblp:conf/icassp/SchwenkG02