
Meta-Learning for Fast Cross-Lingual Adaptation in Dependency Parsing [article]

Anna Langedijk, Verna Dankers, Phillip Lippe, Sander Bos, Bryan Cardenas Guevara, Helen Yannakoudakis, Ekaterina Shutova
2022 arXiv   pre-print
We apply model-agnostic meta-learning (MAML) to the task of cross-lingual dependency parsing.  ...  Meta-learning, or learning to learn, is a technique that can help to overcome resource scarcity in cross-lingual NLP problems, by enabling fast adaptation to new tasks.  ...  Conclusion: In this paper, we present a meta-learning approach for the task of cross-lingual dependency parsing.  ... 
arXiv:2104.04736v3 fatcat:yptu2e7lkzhu3evfn4yzgvc67y
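
The abstract above relies on the standard MAML recipe: an inner loop adapts a copy of the parser on a small support set from one language, and an outer loop updates the shared initialization from the loss on a held-out query set. The sketch below is a minimal first-order illustration under assumed placeholders (model, loss_fn, and per-language support/query batches); it is not the authors' released implementation.

# Minimal first-order MAML sketch (illustrative; model, loss_fn, and the
# per-language (support, query) batches are assumptions, not the paper's code).
import copy
import torch

def maml_outer_step(model, meta_optimizer, loss_fn, language_tasks,
                    inner_lr=1e-3, inner_steps=5):
    """One meta-update over a batch of language tasks (first-order MAML)."""
    meta_optimizer.zero_grad()
    for support_batch, query_batch in language_tasks:
        # Inner loop: adapt a copy of the model on one language's support set.
        learner = copy.deepcopy(model)
        inner_opt = torch.optim.SGD(learner.parameters(), lr=inner_lr)
        for _ in range(inner_steps):
            inner_opt.zero_grad()
            loss_fn(learner, support_batch).backward()
            inner_opt.step()
        # Outer loop: gradients of the query loss w.r.t. the adapted weights
        # are accumulated onto the shared initialization (first-order shortcut).
        query_loss = loss_fn(learner, query_batch)
        grads = torch.autograd.grad(query_loss, learner.parameters())
        for param, grad in zip(model.parameters(), grads):
            param.grad = grad if param.grad is None else param.grad + grad
    meta_optimizer.step()

In the full second-order variant the query gradient is also propagated through the inner-loop updates; the first-order shortcut shown here is a common approximation.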

Meta-Learning for Fast Cross-Lingual Adaptation in Dependency Parsing

Anna Langedijk, Verna Dankers, Phillip Lippe, Sander Bos, Bryan Cardenas Guevara, Helen Yannakoudakis, Ekaterina Shutova
2022 Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)   unpublished
We apply model-agnostic meta-learning (MAML) to the task of cross-lingual dependency parsing.  ...  Meta-learning, or learning to learn, is a technique that can help to overcome resource scarcity in cross-lingual NLP problems, by enabling fast adaptation to new tasks.  ...  Conclusion: In this paper, we present a meta-learning approach for the task of cross-lingual dependency parsing.  ... 
doi:10.18653/v1/2022.acl-long.582 fatcat:yqisykmfdnbwlittpfxczk7tai

X-METRA-ADA: Cross-lingual Meta-Transfer Learning Adaptation to Natural Language Understanding and Question Answering [article]

Meryem M'hamdi, Doo Soon Kim, Franck Dernoncourt, Trung Bui, Xiang Ren, Jonathan May
2021 arXiv   pre-print
In this work, we propose X-METRA-ADA, a cross-lingual MEta-TRAnsfer learning ADAptation approach for NLU.  ...  Recently, meta-learning has garnered attention as a promising technique for enhancing transfer learning under low-resource scenarios: particularly for cross-lingual transfer in Natural Language Understanding  ...  On difficulties of cross-lingual transfer with order differences: A case study on dependency parsing.  ... 
arXiv:2104.09696v2 fatcat:yt6um3pmbrf5zh7bdclsu5duoi

Zero-Shot Cross-Lingual Transfer with Meta Learning [article]

Farhad Nooralahzadeh, Giannis Bekoulis, Johannes Bjerva, Isabelle Augenstein
2020 arXiv   pre-print
We experiment using standard supervised, zero-shot cross-lingual, as well as few-shot cross-lingual settings for different natural language understanding tasks (natural language inference, question answering  ...  Our extensive experimental setup demonstrates the consistent effectiveness of meta-learning for a total of 15 languages.  ...  We are grateful to the Nordic Language Processing Laboratory (NLPL) for providing access to its supercluster infrastructure.  ... 
arXiv:2003.02739v4 fatcat:zry6xjzgznabnah3xyx4btdjim

How Do Multilingual Encoders Learn Cross-lingual Representation? [article]

Shijie Wu
2022 arXiv   pre-print
Surprisingly, without any explicit cross-lingual signal, multilingual BERT learns cross-lingual representations in addition to representations for individual languages.  ...  As BERT revolutionized representation learning and NLP, it also revolutionized cross-lingual representations and cross-lingual transfer.  ...  For dependency parsing, following Tiedemann, Agić, and Nivre (2014), we adapt the disambiguation of many-to-one mappings by choosing as the head the node that is highest up in the dependency tree.  ... 
arXiv:2207.05737v1 fatcat:j6vfurgdhvhm5evwaqjhf4b3lu

Learning to Learn Morphological Inflection for Resource-Poor Languages

Katharina Kann, Samuel R. Bowman, Kyunghyun Cho
2020 Proceedings of the Thirty-Fourth AAAI Conference on Artificial Intelligence (AAAI-20)  
We propose to cast the task of morphological inflection—mapping a lemma to an indicated inflected form—for resource-poor languages as a meta-learning problem.  ...  In particular, it obtains a 31.7% higher absolute accuracy than a previously proposed cross-lingual transfer model and outperforms the previous state of the art by 1.7% absolute accuracy on average over  ...  Acknowledgments This work has received support from Samsung Advanced Institute of Technology (Next Generation Deep Learning: from Pattern Recognition to AI) and Samsung Electronics (Improving Deep Learning  ... 
doi:10.1609/aaai.v34i05.6316 fatcat:oo6xxesfhvcsrlcxoonjq3hkeu
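
In the meta-learning framing described in the abstract above, every high-resource language contributes one "task" of (lemma, tag, inflected form) triples, and training repeatedly samples small support/query episodes from those tasks. The toy sampler below only illustrates that framing; the data layout, field names, and sizes are hypothetical, not the paper's actual pipeline.

# Toy episode sampler for inflection-as-meta-learning (data layout and field
# names are hypothetical illustrations, not the paper's actual interface).
import random
from typing import Dict, List, Tuple

Triple = Tuple[str, str, str]  # (lemma, morphological tag, inflected form)

def sample_episode(data_by_language: Dict[str, List[Triple]],
                   support_size: int = 4, query_size: int = 4):
    """Pick one language and split a few of its triples into support/query sets."""
    language = random.choice(list(data_by_language))
    batch = random.sample(data_by_language[language], support_size + query_size)
    return batch[:support_size], batch[support_size:]

# Example with made-up triples:
toy_data = {
    "deu": [("gehen", "V;PST;3;SG", "ging")] * 16,
    "spa": [("hablar", "V;PST;3;SG", "habló")] * 16,
}
support, query = sample_episode(toy_data)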

Learning to Learn Morphological Inflection for Resource-Poor Languages [article]

Katharina Kann, Samuel R. Bowman, Kyunghyun Cho
2020 arXiv   pre-print
We propose to cast the task of morphological inflection - mapping a lemma to an indicated inflected form - for resource-poor languages as a meta-learning problem.  ...  In particular, it obtains a 31.7% higher absolute accuracy than a previously proposed cross-lingual transfer model and outperforms the previous state of the art by 1.7% absolute accuracy on average over  ...  Acknowledgments This work has received support from Samsung Advanced Institute of Technology (Next Generation Deep Learning: from Pattern Recognition to AI) and Samsung Electronics (Improving Deep Learning  ... 
arXiv:2004.13304v1 fatcat:a65rnt375jb7zke6bop4hdlfem

Everything Is All It Takes: A Multipronged Strategy for Zero-Shot Cross-Lingual Information Extraction [article]

Mahsa Yarmohammadi, Shijie Wu, Marc Marone, Haoran Xu, Seth Ebner, Guanghui Qin, Yunmo Chen, Jialiang Guo, Craig Harman, Kenton Murray, Aaron Steven White, Mark Dredze (+1 others)
2021 arXiv   pre-print
We use English-to-Arabic IE as our initial example, demonstrating strong performance in this setting for event extraction, named entity recognition, part-of-speech tagging, and dependency parsing.  ...  Zero-shot cross-lingual information extraction (IE) describes the construction of an IE model for some target language, given existing annotations exclusively in some other language, typically English.  ...  For dependency parsing, following Tiedemann et al. (2014) , we adapt the disambiguation of many-to-one mappings by choosing as the head the node that is highest up in the dependency tree.  ... 
arXiv:2109.06798v1 fatcat:daskud27bzeahcqyepsjdzbqli
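
Both this entry and the thesis above quote the same annotation-projection detail from Tiedemann et al. (2014): when several aligned tokens compete for a projected dependency, keep the one highest up in the tree. A small sketch of that heuristic follows; the token and tree representations are simplified assumptions rather than the cited system's data structures.

# Hedged sketch of the head-selection heuristic: among many aligned tokens,
# keep the one highest in the dependency tree (fewest hops to the root).
# The integer-id tree encoding below is a simplification for illustration.
from typing import Dict, List

def depth(token: int, heads: Dict[int, int]) -> int:
    """Number of hops from token to the root (the root's head is 0)."""
    d = 0
    while heads[token] != 0:
        token = heads[token]
        d += 1
    return d

def choose_head(candidates: List[int], heads: Dict[int, int]) -> int:
    """Among many-to-one aligned tokens, choose the one highest up in the tree."""
    return min(candidates, key=lambda tok: depth(tok, heads))

# Toy tree: 1 is the root; 2 and 3 depend on 1; 4 depends on 3.
heads = {1: 0, 2: 1, 3: 1, 4: 3}
print(choose_head([2, 4], heads))  # -> 2 (depth 1 beats depth 2)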

Low-Resource Adaptation of Neural NLP Models [article]

Farhad Nooralahzadeh
2020 arXiv   pre-print
To this end, we study distant supervision and sequential transfer learning in various low-resource settings.  ...  The objective of this thesis is to investigate methods for dealing with such low-resource scenarios in information extraction and natural language understanding.  ...  "MLQA: Evaluating Cross-lingual Extractive Question Answering". In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics. Association for Computational Linguistics.  ... 
arXiv:2011.04372v1 fatcat:626mbe5ba5bkdflv755o35u5pq

A Survey on Recent Approaches for Natural Language Processing in Low-Resource Scenarios [article]

Michael A. Hedderich, Lukas Lange, Heike Adel, Jannik Strötgen, Dietrich Klakow
2021 arXiv   pre-print
A goal of our survey is to explain how these methods differ in their requirements as understanding them is essential for choosing a technique suited for a specific low-resource setting.  ...  As they are known for requiring large amounts of training data, there is a growing body of work to improve the performance in low-resource settings.  ...  Model-agnostic meta-learning for fast adaptation of deep networks.  ...  In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing.  ... 
arXiv:2010.12309v3 fatcat:26dwmlkmn5auha2ob2qdlrvla4

Language-Agnostic Meta-Learning for Low-Resource Text-to-Speech with Articulatory Features [article]

Florian Lux, Ngoc Thang Vu
2022 arXiv   pre-print
In conjunction with language agnostic meta learning, this enables us to fine-tune a high-quality text-to-speech model on just 30 minutes of data in a previously unseen language spoken by a previously unseen  ...  In this work, we use embeddings derived from articulatory vectors rather than embeddings derived from phoneme identities to learn phoneme representations that hold across languages.  ...  Acknowledgements We would like to thank the anonymous reviewers for their insightful feedback and suggestions. This work was funded by the Carl Zeiss Foundation.  ... 
arXiv:2203.03191v1 fatcat:7j2nkqzuz5aepaqexoxda7u4vq
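
The abstract above contrasts phoneme-identity embeddings with embeddings computed from articulatory features that are shared across languages. The toy module below only illustrates that idea: the three-feature table is hand-made and hypothetical, and the projection layer is not the authors' model.

# Toy articulatory-feature embedding (illustrative only; the feature table is
# invented and the module is not the paper's architecture).
import torch
import torch.nn as nn

# (voiced, nasal, open) -- hypothetical binary articulatory features per phoneme.
ARTICULATORY = {
    "p": [0.0, 0.0, 0.0],
    "b": [1.0, 0.0, 0.0],
    "m": [1.0, 1.0, 0.0],
    "a": [1.0, 0.0, 1.0],
}

class ArticulatoryEmbedding(nn.Module):
    """Projects shared articulatory feature vectors into the embedding space,
    so new languages reuse the same feature dimensions instead of new IDs."""
    def __init__(self, num_features: int = 3, dim: int = 16):
        super().__init__()
        self.proj = nn.Linear(num_features, dim)

    def forward(self, phonemes):
        feats = torch.tensor([ARTICULATORY[p] for p in phonemes])
        return self.proj(feats)

embed = ArticulatoryEmbedding()
print(embed(["m", "a"]).shape)  # torch.Size([2, 16])

Because the feature dimensions mean the same thing in every language, a phoneme unseen during pre-training still receives a meaningful embedding, which is what makes fine-tuning on 30 minutes of a new language plausible.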

Message from the general chair

Benjamin C. Lee
2015 2015 IEEE International Symposium on Performance Analysis of Systems and Software (ISPASS)  
To inject knowledge, we use a state-of-the-art system which cross-links (or "grounds") expressions in free text to Wikipedia.  ...  Learning-based Multi-Sieve Co-reference Resolution with Knowledge, Lev Ratinov and Dan Roth, Saturday 11:00am-11:30am, 202 A (ICC): We explore the interplay of knowledge and structure in co-reference resolution  ...  Cross-lingual Parse Disambiguation based on Semantic Correspondence, Lea Frermann and Francis Bond, Wednesday 4:20pm-4:40pm, Samda (ICC): We present a system for cross-lingual parse disambiguation, exploiting  ... 
doi:10.1109/ispass.2015.7095776 dblp:conf/ispass/Lee15 fatcat:ehbed6nl6barfgs6pzwcvwxria

Language Technology 2020: The Meta-Net Priority Research Themes [chapter]

Georg Rehm, Hans Uszkoreit
2013 META-NET Strategic Research Agenda for Multilingual Europe 2020  
near-real-time).  ...  Cross-lingual technology to increase the social reach and approach cross-culture understanding.  ...  The technical solutions needed include: a repository of generic monolingual and cross-lingual language technologies, packaging state-of-the-art techniques in robust, scalable, interoperable, and adaptable  ... 
doi:10.1007/978-3-642-36349-8_6 fatcat:jezmk52phre4lguizqtyh7gule

XPersona: Evaluating Multilingual Personalized Chatbot [article]

Zhaojiang Lin, Zihan Liu, Genta Indra Winata, Samuel Cahyawijaya, Andrea Madotto, Yejin Bang, Etsuko Ishii, Pascale Fung
2020 arXiv   pre-print
On the other hand, the state-of-the-art cross-lingual trained models achieve inferior performance to the other models, showing that cross-lingual conversation modeling is a challenging task.  ...  Our dataset includes persona conversations in six different languages other than English for building and evaluating multilingual personalized agents.  ...  On difficulties of cross-lingual transfer with order differences: A case study on dependency parsing.  ... 
arXiv:2003.07568v2 fatcat:cwmq55vsgjdqlpub7d4a7wc6hq

Table of Contents

2021 IEEE/ACM Transactions on Audio Speech and Language Processing  
Learning Cross-Lingual Mappings in Imperfectly Isomorphic Embedding Spaces, Y. Li, K. Yu, and Y. Zhang  ...  Discourse and Dialog: Domain Adaptive Meta-Learning for Dialogue State Tracking, J. Zeng, Y. Yin, Y. Liu, Y. Ge, and J.  ... 
doi:10.1109/taslp.2021.3137066 fatcat:ocit27xwlbagtjdyc652yws4xa
Showing results 1 — 15 out of 525 results