Exploration of the Impact of Maximum Entropy in Recurrent Neural Network Language Models for Code-Switching Speech

Ngoc Thang Vu, Tanja Schultz
2014 Proceedings of the First Workshop on Computational Approaches to Code Switching  
This paper presents our latest investigations of jointly trained maximum entropy and recurrent neural network language models for Code-Switching speech. First, we extensively explore the integration of part-of-speech tags and language identifier information into recurrent neural network language models for Code-Switching. Second, we demonstrate the importance of the maximum entropy model with a variety of experimental results. Finally, we propose to adapt the recurrent neural network language model to different Code-Switching behaviors and to use the adapted models to generate artificial Code-Switching text data.
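The abstract describes a recurrent neural network language model that takes factored inputs (word, part-of-speech tag, language identifier) and is trained jointly with a maximum entropy model, i.e. a direct log-linear connection from the input features to the output layer. The following is a minimal sketch of such an architecture, not the authors' implementation; it assumes PyTorch, and all class names, variable names, and dimensions are illustrative assumptions.

```python
import torch
import torch.nn as nn

class FactoredRNNME(nn.Module):
    """Sketch of a factored RNN LM with a maximum-entropy-style direct connection.

    Hypothetical illustration of the kind of model the abstract describes;
    sizes and names are assumptions, not taken from the paper.
    """

    def __init__(self, vocab_size, pos_size, lang_size,
                 word_dim=100, pos_dim=20, lang_dim=5, hidden_dim=200):
        super().__init__()
        # Factored input: word, part-of-speech tag, and language identifier embeddings
        self.word_emb = nn.Embedding(vocab_size, word_dim)
        self.pos_emb = nn.Embedding(pos_size, pos_dim)
        self.lang_emb = nn.Embedding(lang_size, lang_dim)
        input_dim = word_dim + pos_dim + lang_dim
        # Recurrent part of the language model
        self.rnn = nn.RNN(input_dim, hidden_dim, batch_first=True)
        self.hidden_to_vocab = nn.Linear(hidden_dim, vocab_size)
        # Maximum-entropy-style part: a direct (log-linear) connection from the
        # current input features to the output layer, trained jointly with the RNN
        self.direct = nn.Linear(input_dim, vocab_size, bias=False)

    def forward(self, words, pos_tags, lang_ids, hidden=None):
        # Concatenate the factored input features
        x = torch.cat([self.word_emb(words),
                       self.pos_emb(pos_tags),
                       self.lang_emb(lang_ids)], dim=-1)
        out, hidden = self.rnn(x, hidden)
        # Sum the recurrent and the direct (maxent) contributions before softmax
        logits = self.hidden_to_vocab(out) + self.direct(x)
        return logits, hidden
```

In this sketch, dropping `self.direct` would reduce the model to a plain factored RNN language model, which is the kind of contrast the paper's maximum entropy experiments examine.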
doi:10.3115/v1/w14-3904 dblp:conf/acl-codeswitch/VuS14