724 Hits in 8.1 sec

Joint Representation Learning of Cross-lingual Words and Entities via Attentive Distant Supervision [article]

Yixin Cao and Lei Hou and Juanzi Li and Zhiyuan Liu and Chengjiang Li and Xu Chen and Tiansi Dong
2018 arXiv   pre-print
In this paper, we propose a novel method for joint representation learning of cross-lingual words and entities.  ...  Joint representation learning of words and entities benefits many NLP tasks, but has not been well explored in cross-lingual settings.  ...
arXiv:1811.10776v1 fatcat:rkyiznw2hvdjhdy4fylkxw4fay

Joint Representation Learning of Cross-lingual Words and Entities via Attentive Distant Supervision

Yixin Cao, Lei Hou, Juanzi Li, Zhiyuan Liu, Chengjiang Li, Xu Chen, Tiansi Dong
2018 Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing  
In this paper, we propose a novel method for joint representation learning of cross-lingual words and entities.  ...  Joint representation learning of words and entities benefits many NLP tasks, but has not been well explored in cross-lingual settings.  ...  Acknowledgments The work is supported by NSFC key project (No. 61533018,U1736204,61661146007), Ministry of Education and China Mobile Research Fund (No. 20181770250), and THUNUS NExT++ Co-Lab.  ... 
doi:10.18653/v1/d18-1021 dblp:conf/emnlp/0002HLLLCD18 fatcat:bmbfxf7bb5clljirnliytkagou

Neural Relation Extraction with Multi-lingual Attention

Yankai Lin, Zhiyuan Liu, Maosong Sun
2017 Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)  
attention to consider the information consistency and complementarity among cross-lingual texts.  ...  To address this issue, we introduce a multi-lingual neural relation extraction framework, which employs monolingual attention to utilize the information within mono-lingual texts and further proposes cross-lingual  ...  Both the works focus on multi-lingual transfer learning and learn a predictive model on a new language for existing KBs, by leveraging unified representation learning for cross-lingual entities.  ... 
doi:10.18653/v1/p17-1004 dblp:conf/acl/LinLS17 fatcat:qnpzfmi3g5hbvd7cn7p3veprny

Cross-lingual Structure Transfer for Relation and Event Extraction

Ananya Subburathinam, Di Lu, Heng Ji, Jonathan May, Shih-Fu Chang, Avirup Sil, Clare Voss
2019 Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP)  
Extensive experiments on cross-lingual relation and event transfer among English, Chinese, and Arabic demonstrate that our approach achieves performance comparable to state-of-the-art supervised models  ...  We thus find that language-universal symbolic and distributional representations are complementary for cross-lingual structure transfer.  ...  We apply GCN to construct multi-lingual structural representations for cross-lingual transfer learning.  ... 
doi:10.18653/v1/d19-1030 dblp:conf/emnlp/SubburathinamLJ19 fatcat:3vwt72efyrfn7ojd6pkn64q7w4

Cross-lingual Joint Entity and Word Embedding to Improve Entity Linking and Parallel Sentence Mining

Xiaoman Pan, Thamme Gowda, Heng Ji, Jonathan May, Scott Miller
2019 Proceedings of the 2nd Workshop on Deep Learning Approaches for Low-Resource NLP (DeepLo 2019)  
A cross-lingual joint entity and word embedding learned from this kind of data not only can disambiguate linkable entities but can also effectively represent unlinkable entities.  ...  We propose a novel method, CLEW, to generate cross-lingual data that is a mix of entities and contextual words based on Wikipedia.  ...  DARPA LORELEI Program HR0011-15-C-0115, the Office of the Director of National Intelligence (ODNI), Intelligence Advanced Research Projects Activity (IARPA), via contract FA8650-17-C-9116, and ARL NS-CTA  ... 
doi:10.18653/v1/d19-6107 dblp:conf/acl-deeplo/PanGJMM19 fatcat:cxhj3j27jbatbbfsihrp3duafm

A Survey on Recent Approaches for Natural Language Processing in Low-Resource Scenarios [article]

Michael A. Hedderich, Lukas Lange, Heike Adel, Jannik Strötgen, Dietrich Klakow
2021 arXiv   pre-print
This includes mechanisms to create additional labeled data like data augmentation and distant supervision as well as transfer learning settings that reduce the need for target supervision.  ...  After a discussion about the different dimensions of data availability, we give a structured overview of methods that enable learning when training data is sparse.  ...  of distant supervision and learn how to combine them. They are numerical representations of words or sentences.  ...
arXiv:2010.12309v3 fatcat:26dwmlkmn5auha2ob2qdlrvla4

Knowledge Extraction in Low-Resource Scenarios: Survey and Perspective [article]

Shumin Deng, Ningyu Zhang, Feiyu Xiong, Jeff Z. Pan, Huajun Chen
2022 arXiv   pre-print
stronger models, and (3) exploiting data and models together.  ...  We hope that our survey can help both the academic and industrial community to better understand this field, inspire more ideas and boost broader applications.  ...  the information consistency and complementarity among cross-lingual texts. have studied the multi-lingual EE task and exploited the consistent information in multi-lingual data via cross-lingual attention  ... 
arXiv:2202.08063v2 fatcat:csqfjb3jozacjbutvrer63labu

Neural Entity Linking: A Survey of Models Based on Deep Learning [article]

Ozge Sevgili, Artem Shelmanov, Mikhail Arkhipov, Alexander Panchenko, Chris Biemann
2021 arXiv   pre-print
techniques including zero-shot and distant supervision methods, and cross-lingual approaches.  ...  The vast variety of modifications of this general neural entity linking architecture are grouped by several common themes: joint entity recognition and linking, models for global linking, domain-independent  ...  The work of Artem Shelmanov in the current study (preparation of sections related to application of entity linking to neural language models, entity ranking, context-mention encoding, and overall harmonization  ...
arXiv:2006.00575v3 fatcat:ra3kwc4tmbfhlmgtlevkcshcqq

Neural entity linking: A survey of models based on deep learning

Özge Sevgili, Artem Shelmanov, Mikhail Arkhipov, Alexander Panchenko, Chris Biemann, Mehwish Alam, Davide Buscaldi, Michael Cochez, Francesco Osborne, Diego Reforgiato Recupero, Harald Sack
2022 Semantic Web Journal  
including zero-shot and distant supervision methods, and cross-lingual approaches.  ...  The vast variety of modifications of this general architecture are grouped by several common themes: joint entity mention detection and disambiguation, models for global linking, domain-independent techniques  ...  The work of Artem Shelmanov in the current study (preparation of sections related to application of entity linking to neural language models, entity ranking, context-mention encoding, and overall harmonization  ... 
doi:10.3233/sw-222986 fatcat:6gwmbtev7ngbliovf6cpf5hyde

Neural relation extraction: a review

2020 Turkish Journal of Electrical Engineering and Computer Sciences  
For creating distant supervision datasets such as NYT, entity pairs in a triple are aligned with the sentences that contain head and tail entities in the natural text.  ...  Socher et al. [46] proposed a recurrent deep neural network model which admits a compositional vector representation of words and phrases on a parse tree.  ...  The pattern-instance pairs are subject to human annotation to be used in fusing the different labeling methods such as distant supervision and relational patterns. mono-lingual and cross-lingual  ...
doi:10.3906/elk-2005-119 fatcat:o36duadbunhmbesuyayc5jfmxe

Table of Contents [EDICS]

2020 IEEE/ACM Transactions on Audio Speech and Language Processing  
Zhang 2743 Language Acquisition and Learning Cross-Lingual Transfer Learning of Non-Native Acoustic Modeling for Pronunciation Error Detection and Diagnosis ... -goo Lee 105 Joint Learning of Token Context and Span Feature for Span-Based Nested NER ... L. Sun, Y. Sun, F. Ji, and C.  ...
doi:10.1109/taslp.2020.3046150 fatcat:easrxuwl6zdppejsrf4bskxfw4

Neural relation extraction: a survey [article]

Mehmet Aydar and Ozge Bozal and Furkan Ozbay
2020 arXiv   pre-print
Neural relation extraction discovers semantic relations between entities from unstructured text using deep learning methods.  ...  We discuss advantageous and incompetent sides of existing studies and investigate additional research directions and improvement ideas in this field.  ...  [32] combined mono-lingual and cross-lingual attention to take advantage of both language-specific features and the patterns that bear resemblance across languages.  ... 
arXiv:2007.04247v1 fatcat:xxrcy2ef75dk5aeijqlf6tjgke

Learning Transferable Representation for Bilingual Relation Extraction via Convolutional Neural Networks

Bonan Min, Zhuolin Jiang, Marjorie Freedman, Ralph M. Weischedel
2017 International Joint Conference on Natural Language Processing  
We propose a deep neural network to learn transferable, discriminative bilingual representation.  ...  The learnt representation is discriminative and transferable between languages.  ...  The views, opinions, and/or findings contained in this article are those of the author and should not be interpreted as representing the official views or policies, either expressed or implied, of the  ... 
dblp:conf/ijcnlp/MinJFW17 fatcat:fhnfr7p5szbyrarqjlkpinnvoi

Extracting events and their relations from texts: A survey on recent research progress and challenges

Kang Liu, Yubo Chen, Jian Liu, Xinyu Zuo, Jun Zhao
2020 AI Open  
Specifically, in the event extraction task, we mainly focus on three recent important research problems: 1) how to learn the textual semantic representations for events in sentence-level event extraction  ...  , and challenges.  ...  Acknowledgement This work is supported by the National Key R&D Program of China (No.2018YFB1005100), the National Natural Science Foundation of China (No. 61922085, No.U1936207, No.61806201) and the Strategic  ...
doi:10.1016/j.aiopen.2021.02.004 fatcat:qxbcmk55vzcb5nznhgfgwrbe4u

On Difficulties of Cross-Lingual Transfer with Order Differences: A Case Study on Dependency Parsing [article]

Wasi Uddin Ahmad, Zhisong Zhang, Xuezhe Ma, Eduard Hovy, Kai-Wei Chang, Nanyun Peng
2019 arXiv   pre-print
In this paper, we investigate cross-lingual transfer and posit that an order-agnostic model will perform better when transferring to distant foreign languages.  ...  Rigorous experiments and detailed analysis show that RNN-based architectures transfer well to languages that are close to English, while self-attentive models have better overall cross-lingual transferability  ...  We thank Robert Östling for reaching out when he saw the earlier arXiv version of the paper and providing insightful comments about word order and related citations.  ...
arXiv:1811.00570v3 fatcat:n47ecgxpxbcitaply4ykgpns4m
Showing results 1 — 15 out of 724 results