A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2021; you can also visit the original URL.
The file type is application/pdf.
MetaXL: Meta Representation Transformation for Low-resource Cross-lingual Learning
[article] 2021, arXiv pre-print
The combination of multilingual pre-trained representations and cross-lingual transfer learning is one of the most effective methods for building functional NLP systems for low-resource languages. However, for extremely low-resource languages without large-scale monolingual corpora for pre-training or sufficient annotated data for fine-tuning, transfer learning remains an under-studied and challenging task. Moreover, recent work shows that multilingual representations are surprisingly disjoint
arXiv:2104.07908v1
fatcat:xqeubatcxzhqjhqs2shdjoh5py