Multi-task training is an effective method to mitigate the data sparsity problem. It has recently been applied to cross-lingual transfer learning for paradigm completion, the task of producing inflected forms of lemmata, with sequence-to-sequence networks. However, it is still unclear how the model transfers knowledge across languages, and whether and which information is shared. To investigate this, we propose a set of data-dependent experiments using an existing encoder-decoder recurrent neural network.

doi:10.18653/v1/w17-4110 dblp:conf/emnlp/JinK17
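To make the task concrete, the following is a minimal sketch of how paradigm completion is typically cast as a character-level sequence-to-sequence problem, with an optional language token for multi-task (cross-lingual) training. The function name, tag format, and language-token convention here are illustrative assumptions (SIGMORPHON-style), not the paper's exact setup.

```python
# Hedged sketch: paradigm completion as a character-level sequence task.
# Source side: lemma characters plus morphological tags; target side: the
# characters of the inflected form. All names/formats here are illustrative.

def make_example(lemma, tags, lang=None):
    """Build the source token sequence for one training example.

    lemma -- the citation form, split into characters
    tags  -- morphological feature tokens (e.g. ["V", "PTCP", "PST"])
    lang  -- optional language-ID token; prepending it lets a single
             shared encoder-decoder be trained on several languages
             at once (the multi-task setting the abstract describes)
    """
    src = list(lemma) + list(tags)
    if lang is not None:
        src = [lang] + src  # hypothetical language-ID token, e.g. "<de>"
    return src

# Monolingual example: German 'sehen' + past-participle tags
# (target sequence would be the characters of 'gesehen')
src = make_example("sehen", ["V", "PTCP", "PST"])

# Cross-lingual/multi-task example: same model, language token prepended
src_multi = make_example("sehen", ["V", "PTCP", "PST"], lang="<de>")
```

In this framing, "transfer" means the shared encoder-decoder parameters are updated on examples from all languages, so the probing question the abstract raises is which of that shared knowledge actually carries over.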