Exploring Cross-Lingual Transfer of Morphological Knowledge In Sequence-to-Sequence Models

Huiming Jin, Katharina Kann
Proceedings of the First Workshop on Subword and Character Level Models in NLP, 2017
Multi-task training is an effective method to mitigate the data sparsity problem. It has recently been applied to cross-lingual transfer learning for paradigm completion, the task of producing inflected forms of lemmata, with sequence-to-sequence networks. However, it remains unclear how the model transfers knowledge across languages, as well as whether and which information is shared. To investigate this, we propose a set of data-dependent experiments using an existing encoder-decoder recurrent neural network for the task. Our results show that the performance gains indeed surpass a pure regularization effect and that knowledge about language and morphology can be transferred.
doi:10.18653/v1/w17-4110 dblp:conf/emnlp/JinK17
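
To make the setup concrete, the sketch below (not from the paper; the language tags, feature labels, and example word forms are illustrative) shows one common way cross-lingual paradigm-completion data is encoded for a character-level encoder-decoder: morphological feature tags plus the lemma's characters form the source sequence, the inflected form's characters form the target, and a per-example language tag lets a single shared model be trained on the pooled data of a high-resource and a low-resource language.

```python
# Minimal sketch of multi-task data encoding for paradigm completion.
# Not the authors' code: tag inventory and examples are hypothetical.

from typing import List, Tuple


def encode_example(lang: str, lemma: str, features: List[str],
                   target: str) -> Tuple[List[str], List[str]]:
    """Turn one inflection instance into (source, target) token sequences.

    The language tag is prepended so one shared encoder-decoder can be
    trained on examples from several languages at once.
    """
    source = [f"<{lang}>"] + features + list(lemma)  # e.g. <deu> N ACC PL H a u s
    return source, list(target)


# A pooled multi-task training set: a high-resource language (German)
# mixed with a related low-resource one (Luxembourgish); the model is
# trained on the union of both.
examples = [
    ("deu", "Haus", ["N", "ACC", "PL"], "Häuser"),
    ("deu", "sehen", ["V", "3", "SG", "PRS"], "sieht"),
    ("ltz", "Haus", ["N", "ACC", "PL"], "Haiser"),
]

training_data = [encode_example(*ex) for ex in examples]
for src, tgt in training_data:
    print(" ".join(src), "->", " ".join(tgt))
```

Under this encoding, any gain on the low-resource language beyond what extra regularization alone would give suggests that the shared parameters carry transferable morphological knowledge, which is the effect the paper's data-dependent experiments probe.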