Direct product based deep belief networks for automatic speech recognition

Petr Fousek, Steven Rennie, Pierre Dognin, Vaibhava Goel
2013 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
In this paper, we present new methods for parameterizing the connections of neural networks using sums of direct products. We show that low-rank parameterizations of weight matrices are a subset of this set, and explore the theoretical and practical benefits of representing weight matrices using sums of Kronecker products. ASR results on a 50-hour subset of the English Broadcast News corpus indicate that the approach is promising. In particular, we show that a factorial network with more than 150 times fewer parameters in its bottom layer than its standard unconstrained counterpart suffers minimal WER degradation, and that by using sums of Kronecker products, we can close the gap in WER performance while maintaining very significant parameter savings. In addition, direct product DBNs consistently outperform standard DBNs with the same number of parameters. These results have important implications for research on deep belief networks (DBNs). They imply that we should be able to train neural networks with thousands of neurons and minimal restrictions much more rapidly than is currently possible, and that by using sums of direct products, it will be possible to train neural networks with literally millions of neurons tractably, an exciting prospect.
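The idea of a sum-of-Kronecker-products weight matrix can be sketched in a few lines of NumPy. The code below is an illustrative sketch, not the authors' implementation: the sizes, the number of terms `K`, and the factor names are all hypothetical. It verifies the standard identity that a Kronecker-structured matrix-vector product can be computed via small matrix multiplies without ever forming the full weight matrix, and compares the parameter counts of the two representations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: the full weight matrix W is (m*p) x (n*q).
m, n, p, q = 4, 5, 3, 2
K = 2  # number of Kronecker terms in the sum (hypothetical)

# Factors A_k (m x n) and B_k (p x q); W = sum_k kron(A_k, B_k).
As = [rng.standard_normal((m, n)) for _ in range(K)]
Bs = [rng.standard_normal((p, q)) for _ in range(K)]

W = sum(np.kron(A, B) for A, B in zip(As, Bs))  # shape (m*p, n*q)

x = rng.standard_normal(n * q)  # input activation vector

# Naive matvec with the explicitly formed weight matrix:
y_full = W @ x

# Structured matvec without forming W, using the identity
# (A kron B) vec(X) = vec(A X B^T) under row-major flattening,
# where X is x reshaped to (n, q):
X = x.reshape(n, q)
y_fast = sum((A @ X @ B.T).ravel() for A, B in zip(As, Bs))

assert np.allclose(y_full, y_fast)

# Parameter comparison: the full matrix stores m*p*n*q entries,
# while the K-term Kronecker sum stores only K*(m*n + p*q).
print("full:", m * p * n * q, "kronecker sum:", K * (m * n + p * q))
```

At these toy sizes the savings are modest (120 vs. 52 parameters), but the gap grows multiplicatively with layer width, which is what makes the very large factorial-layer savings reported in the paper plausible.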
doi:10.1109/icassp.2013.6638238 dblp:conf/icassp/FousekRDG13