A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2019.
The file type is application/pdf.
Efficient Language Model Adaptation with Noise Contrastive Estimation and Kullback-Leibler Regularization
2018
Interspeech 2018
Many language modeling (LM) tasks have limited in-domain training data. Exploiting out-of-domain data while retaining the relevant in-domain statistics is a desired property in these scenarios. Kullback-Leibler divergence (KLD) regularization is a popular method for acoustic model (AM) adaptation. KLD regularization assumes that the last layer is a softmax that fully activates the targets of both the in-domain and out-of-domain models. Unfortunately, this softmax activation is computationally expensive for large vocabularies.
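For background, the form of KLD-regularized adaptation commonly used for acoustic models (following Yu et al.'s formulation, not necessarily the notation of this paper) replaces the one-hot training target with an interpolation between the in-domain label and the out-of-domain model's softmax output. A minimal sketch, where the interpolation weight \(\rho\) and the symbols \(p_{\mathrm{in}}\), \(p_{\mathrm{out}}\) are assumed names rather than the paper's:

\[
\hat{p}(y \mid x_t) \;=\; (1-\rho)\,\mathbb{1}[y = y_t] \;+\; \rho\, p_{\mathrm{out}}(y \mid x_t)
\]
\[
\mathcal{L}_{\mathrm{KLD}} \;=\; -\frac{1}{T}\sum_{t=1}^{T} \sum_{y \in \mathcal{V}} \hat{p}(y \mid x_t)\, \log p_{\mathrm{in}}(y \mid x_t)
\]

Evaluating \(p_{\mathrm{in}}(y \mid x_t)\) and \(p_{\mathrm{out}}(y \mid x_t)\) for every word \(y\) in the vocabulary \(\mathcal{V}\) is the full softmax activation the abstract refers to, which is what makes combining this regularizer with noise contrastive estimation non-trivial.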
doi:10.21437/interspeech.2018-1345
dblp:conf/interspeech/Andres-FerrerBV18
fatcat:c2nkk2vumrcmfibjwexqdqjuey