A New Kullback–Leibler VAD for Speech Recognition in Noise

J. Ramirez, J.C. Segura, C. Benitez, A. de la Torre, A.J. Rubio
2004 IEEE Signal Processing Letters  
This letter presents a new voice activity detector (VAD) based on the Kullback-Leibler (KL) divergence measure. The algorithm is evaluated in the context of the recently approved ETSI standard for distributed speech recognition (DSR). The VAD uses long-term information from the noisy speech signal to define a more robust decision rule, yielding high accuracy. The Mel-scaled filter bank log-energies (FBE) are modeled by Gaussian distributions, and a symmetric KL divergence is used to estimate the distance between the speech and noise distributions. The decision rule is formulated in terms of the average subband KL divergence, which is compared to a noise-adaptable threshold. An exhaustive analysis using the AURORA databases is conducted to assess the performance of the proposed method and to compare it to existing standard VAD methods.
Index Terms: Kullback-Leibler (KL) divergence, noise reduction, robust speech recognition, voice activity detection (VAD).
doi:10.1109/lsp.2003.821762
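
The abstract's core idea (per-band log-energies modeled as Gaussians, a symmetric KL divergence averaged over subbands, and a comparison against a noise-adaptable threshold) can be illustrated with a short sketch. The code below is not the authors' algorithm: the window length, the initialization from leading frames assumed to be noise-only, the threshold rule, and all function and parameter names (symmetric_kl_gauss, kl_vad, n_init, win, alpha) are assumptions made for illustration only.

import numpy as np

def symmetric_kl_gauss(mu1, var1, mu2, var2):
    # Symmetric (Jeffreys) KL divergence between two univariate Gaussians,
    # applied element-wise to per-band means and variances:
    # J = 0.5*(s1/s2 + s2/s1) + 0.5*(m1-m2)^2*(1/s1 + 1/s2) - 1
    return (0.5 * (var1 / var2 + var2 / var1)
            + 0.5 * (mu1 - mu2) ** 2 * (1.0 / var1 + 1.0 / var2) - 1.0)

def kl_vad(fbe, n_init=10, win=8, alpha=2.0, eps=1e-6):
    # fbe: array of Mel filter-bank log-energies, shape (n_frames, n_bands).
    # Illustrative sketch only: the noise Gaussian is estimated from the
    # first n_init frames (assumed non-speech), the current-signal Gaussian
    # from a sliding window of win frames, and the threshold is alpha times
    # the average divergence observed over the initialization frames.
    n_frames, _ = fbe.shape

    # Per-band noise model from the assumed-silent leading frames.
    mu_n = fbe[:n_init].mean(axis=0)
    var_n = fbe[:n_init].var(axis=0) + eps

    divergences = np.zeros(n_frames)
    for t in range(n_frames):
        lo = max(0, t - win + 1)
        seg = fbe[lo:t + 1]               # long-term window ending at frame t
        mu_s = seg.mean(axis=0)
        var_s = seg.var(axis=0) + eps
        d = symmetric_kl_gauss(mu_s, var_s, mu_n, var_n)
        divergences[t] = d.mean()         # average subband divergence

    # Noise-adaptable threshold (assumption: scaled initialization average).
    thr = alpha * divergences[:n_init].mean()
    decisions = divergences > thr         # True = speech, False = non-speech
    return decisions, divergences

A typical call would pass a (n_frames x n_bands) array of log Mel filter-bank energies computed by any front end; frames whose average subband divergence from the noise model exceeds the threshold are flagged as speech.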