Recurrent neural network language models for keyword search

X. Chen, A. Ragni, J. Vasilakes, X. Liu, K. Knill, M.J.F. Gales
2017 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
Recurrent neural network language models (RNNLMs) have become increasingly popular in many applications such as automatic speech recognition (ASR). Significant performance improvements in both perplexity and word error rate over standard n-gram LMs have been widely reported on ASR tasks. In contrast, published research on using RNNLMs for keyword search systems has been relatively limited. In this paper the application of RNNLMs to the IARPA Babel keyword search task is investigated. In order to supplement the limited acoustic transcription data, large amounts of web text are also used in vocabulary design and LM training. Various training criteria were then explored to improve the efficiency of RNNLMs in both training and evaluation. Significant and consistent improvements on both keyword search and ASR tasks were obtained across all languages. Index Terms: speech recognition, keyword search, language model, recurrent neural network
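The core component discussed in the abstract is the RNNLM itself: a model that predicts each word from a recurrent hidden state summarising the full history, and whose quality is commonly measured by perplexity. The sketch below is a minimal, self-contained illustration of that idea with toy dimensions and random weights; it is not the authors' Babel system, and the sizes and parameter names are assumptions for demonstration only.

```python
import numpy as np

# Minimal sketch of an RNN language model forward pass.
# V = vocabulary size, H = hidden layer size (toy values, assumed).
rng = np.random.default_rng(0)
V, H = 5, 8

E = rng.normal(scale=0.1, size=(V, H))   # input word embeddings
W = rng.normal(scale=0.1, size=(H, H))   # recurrent (hidden-to-hidden) weights
U = rng.normal(scale=0.1, size=(H, V))   # hidden-to-output weights

def softmax(z):
    """Numerically stable softmax over the vocabulary."""
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def rnnlm_perplexity(word_ids):
    """Score a word-id sequence: P(w_t | w_1..w_{t-1}) via the hidden state."""
    h = np.zeros(H)
    log_prob = 0.0
    for prev, cur in zip(word_ids[:-1], word_ids[1:]):
        h = np.tanh(E[prev] + W @ h)     # fold previous word into the history
        p = softmax(U.T @ h)             # distribution over the next word
        log_prob += np.log(p[cur])
    n_predicted = len(word_ids) - 1
    return float(np.exp(-log_prob / n_predicted))

ppl = rnnlm_perplexity([0, 1, 2, 3, 4])
print(round(ppl, 2))
```

With untrained near-zero weights the next-word distribution is close to uniform, so the perplexity comes out near the vocabulary size; training would drive it well below the corresponding n-gram baseline, which is the improvement the abstract reports.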
doi:10.1109/icassp.2017.7953263  dblp:conf/icassp/ChenRVLKG17