How Language-Neutral is Multilingual BERT?

Jindřich Libovický and Rudolf Rosa and Alexander Fraser
2019, arXiv pre-print
Multilingual BERT (mBERT) provides sentence representations for 104 languages, which are useful for many multi-lingual tasks. Previous work probed the cross-linguality of mBERT using zero-shot transfer learning on morphological and syntactic tasks. We instead focus on the semantic properties of mBERT. We show that mBERT representations can be split into a language-specific component and a language-neutral component, and that the language-neutral component is sufficiently general in terms of modeling semantics to allow high-accuracy word alignment and sentence retrieval, but is not yet good enough for the more difficult task of MT quality estimation. Our work presents interesting challenges which must be solved to build better language-neutral representations, particularly for tasks requiring linguistic transfer of semantics.
arXiv:1911.03310v1
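
One way to realize the split described in the abstract, consistent with the centering approach discussed in the paper, is to treat the per-language mean of the representations as the language-specific component and subtract it, keeping the residual as the language-neutral part used for retrieval. The following is a minimal sketch of that idea; the function names and the random toy data are illustrative placeholders, not the paper's code or released API.

```python
import numpy as np

def center_by_language(embeddings_by_lang):
    """Split representations into language-specific and language-neutral parts.

    `embeddings_by_lang` maps a language code to an array of shape
    (n_sentences, dim), e.g. mean-pooled mBERT states (hypothetical input).
    The per-language centroid plays the role of the language-specific
    component; subtracting it leaves the language-neutral component.
    """
    neutral = {}
    for lang, emb in embeddings_by_lang.items():
        centroid = emb.mean(axis=0, keepdims=True)  # language-specific component
        neutral[lang] = emb - centroid              # language-neutral component
    return neutral

def retrieve(query_vecs, target_vecs):
    """Nearest-neighbour sentence retrieval by cosine similarity."""
    q = query_vecs / np.linalg.norm(query_vecs, axis=1, keepdims=True)
    t = target_vecs / np.linalg.norm(target_vecs, axis=1, keepdims=True)
    return (q @ t.T).argmax(axis=1)  # index of the best target match per query

# Toy usage: random vectors stand in for real mBERT sentence representations.
rng = np.random.default_rng(0)
emb = {"en": rng.normal(size=(5, 768)), "de": rng.normal(size=(5, 768))}
neutral = center_by_language(emb)
print(retrieve(neutral["en"], neutral["de"]))
```

With real mBERT representations, the centered vectors would be the ones compared across languages for the word-alignment and sentence-retrieval experiments mentioned in the abstract.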