14,175 Hits in 4.3 sec

Cross-Lingual Relation Extraction with Transformers [article]

Jian Ni and Taesun Moon and Parul Awasthy and Radu Florian
2020 arXiv   pre-print
Relation extraction (RE) is one of the most important tasks in information extraction, as it provides essential information for many NLP applications.  ...  Building upon unsupervised cross-lingual representation learning frameworks, we develop several deep Transformer-based RE models with a novel encoding scheme that can effectively encode both entity location  ...  Figure 1: Deep Transformer-based neural network architecture for relation extraction.  ... 
arXiv:2010.08652v1 fatcat:xvbomv66gjavhgs3shbwsgnxv4
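
The snippet above leaves the encoding scheme unspecified. A common way to expose entity locations to a multilingual transformer for RE is to wrap the entity spans in marker tokens before encoding; the sketch below illustrates that general pattern only. The marker strings, the choice of xlm-roberta-base, and the pooling comment are assumptions for illustration, not the paper's exact method.

```python
# Hedged sketch: entity-marker encoding for transformer-based relation
# extraction. Markers, model, and pooling are illustrative assumptions.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
encoder = AutoModel.from_pretrained("xlm-roberta-base")
tokenizer.add_special_tokens(
    {"additional_special_tokens": ["<e1>", "</e1>", "<e2>", "</e2>"]})
encoder.resize_token_embeddings(len(tokenizer))

def mark_entities(tokens, head, tail):
    """Wrap the head/tail entity spans (inclusive indices) in markers."""
    out = []
    for i, tok in enumerate(tokens):
        if i == head[0]: out.append("<e1>")
        if i == tail[0]: out.append("<e2>")
        out.append(tok)
        if i == head[1]: out.append("</e1>")
        if i == tail[1]: out.append("</e2>")
    return out

marked = mark_entities(["Alice", "works", "for", "Acme", "Corp", "."],
                       head=(0, 0), tail=(3, 4))
enc = tokenizer(" ".join(marked), return_tensors="pt")
with torch.no_grad():
    hidden = encoder(**enc).last_hidden_state  # (1, seq_len, dim)
# A relation classifier would pool the marker (or [CLS]) positions and
# feed the result to a softmax layer over relation types.
```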

GATE: Graph Attention Transformer Encoder for Cross-lingual Relation and Event Extraction [article]

Wasi Uddin Ahmad and Nanyun Peng and Kai-Wei Chang
2021 arXiv   pre-print
We introduce GATE, a Graph Attention Transformer Encoder, and test its cross-lingual transferability on relation and event extraction tasks.  ...  Recent progress in cross-lingual relation and event extraction uses graph convolutional networks (GCNs) with universal dependency parses to learn language-agnostic sentence representations such that models  ...  Extensive experiments on three languages demonstrate the effectiveness of GATE in cross-lingual relation and event extraction.  ... 
arXiv:2010.03009v2 fatcat:h6xomdspjzh6xhfmvsmii7n2mu
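
As a rough illustration of the idea in this snippet — attention guided by the dependency structure instead of a fixed GCN neighbourhood — the sketch below masks self-attention scores by pairwise distance in the dependency tree. The single head, the hard distance cutoff, and the max_dist value are simplifying assumptions, not GATE's actual formulation.

```python
# Hedged sketch: dependency-distance-masked self-attention. The hard
# cutoff and single head are assumptions, not GATE's exact mechanism.
from collections import deque

import torch
import torch.nn.functional as F

def tree_distances(heads):
    """All-pairs hop distances in a dependency tree (-1 marks the root)."""
    n = len(heads)
    adj = [[] for _ in range(n)]
    for child, head in enumerate(heads):
        if head >= 0:
            adj[child].append(head)
            adj[head].append(child)
    dist = torch.full((n, n), float("inf"))
    for s in range(n):
        dist[s, s] = 0
        queue = deque([s])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if dist[s, v] == float("inf"):
                    dist[s, v] = dist[s, u] + 1
                    queue.append(v)
    return dist

def syntax_masked_attention(x, heads, max_dist=2):
    """Single-head attention restricted to tokens within max_dist hops."""
    scores = x @ x.transpose(0, 1) / x.size(-1) ** 0.5
    scores = scores.masked_fill(tree_distances(heads) > max_dist, -1e9)
    return F.softmax(scores, dim=-1) @ x

x = torch.randn(5, 16)        # toy token representations
heads = [1, -1, 1, 4, 1]      # toy dependency tree
out = syntax_masked_attention(x, heads)
```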

Neural Relation Extraction with Multi-lingual Attention

Yankai Lin, Zhiyuan Liu, Maosong Sun
2017 Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)  
To address this issue, we introduce a multi-lingual neural relation extraction framework, which employs mono-lingual attention to utilize the information within mono-lingual texts and further proposes cross-lingual  ...  Experimental results on real-world datasets show that our model can take advantage of multi-lingual texts and consistently achieve significant improvements on relation extraction as compared with baselines  ...  …, m}} from the neural networks with multi-lingual attention. Those vectors with j = k are mono-lingual attention vectors, and those with j ≠ k are cross-lingual attention vectors.  ... 
doi:10.18653/v1/p17-1004 dblp:conf/acl/LinLS17 fatcat:qnpzfmi3g5hbvd7cn7p3veprny
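
To make the j/k notation in the snippet concrete: for every pair of languages (j, k), a query derived from language j attends over the bag of sentences in language k, giving mono-lingual attention vectors when j = k and cross-lingual ones when j ≠ k. The toy sketch below shows only that pairing structure; the shapes, random inputs, and dot-product scoring are assumptions, not the paper's exact model.

```python
# Hedged sketch: mono- vs cross-lingual attention over sentence bags.
# Random inputs and dot-product scoring are illustrative assumptions.
import torch
import torch.nn.functional as F

def attention_vector(query, sentence_reps):
    """Bag representation weighted by softmax(sentence . query)."""
    weights = F.softmax(sentence_reps @ query, dim=0)  # (num_sentences,)
    return weights @ sentence_reps                     # (dim,)

dim = 32
bags = {"en": torch.randn(4, dim), "zh": torch.randn(3, dim)}
queries = {"en": torch.randn(dim), "zh": torch.randn(dim)}

vectors = {}
for j in queries:       # language providing the attention query
    for k in bags:      # language providing the sentences
        vectors[(j, k)] = attention_vector(queries[j], bags[k])
# vectors[(j, j)] are the mono-lingual attention vectors; vectors[(j, k)]
# with j != k are the cross-lingual ones.
```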

On the Relation between Syntactic Divergence and Zero-Shot Performance [article]

Ofir Arviv, Dmitry Nikolaev, Taelin Karidi, Omri Abend
2021 arXiv   pre-print
In order to compare parsing performance across different schemes, we perform extrinsic evaluation on the downstream task of cross-lingual relation extraction (RE) using a subset of a popular English RE  ...  In another, we apply three linguistically motivated transformations to UD, creating more cross-lingually stable versions of it, and assess their zero-shot parsability.  ...  This finding may suggest that cross-lingual stability is correlated with the ability of the parser to generalize within a language as well, a direction which we defer to future work.  ... 
arXiv:2110.04644v1 fatcat:fq6afkt5mzboxfvmgl6nuomubi

Bilingual Terminology Extraction from Comparable E-Commerce Corpora [article]

Hao Jia, Shuqin Gu, Yuqi Zhang, Xiangyu Duan
2022 arXiv   pre-print
Benefiting from the cross-lingual pre-training in e-commerce, our framework can make full use of the deep semantic relationship between source-side terminology and target-side sentences to extract corresponding  ...  In this paper, we propose a novel framework for extracting bilingual e-commerce terminology from comparable data.  ...  of cross-lingual pretraining in e-commerce for the extraction models.  ... 
arXiv:2104.07398v2 fatcat:ieydoa72arfz7cazt53bzw5zoe

Cross-lingual Structure Transfer for Zero-resource Event Extraction

Di Lu, Ananya Subburathinam, Heng Ji, Jonathan May, Shih-Fu Chang, Avirup Sil, Clare R. Voss
2020 International Conference on Language Resources and Evaluation  
Most current cross-lingual transfer learning methods for Information Extraction (IE) have been applied to local sequence labeling tasks.  ...  based on universal dependency parses and fully-connected graphs, respectively. (2) Represent each node in these graph structures with a cross-lingual word embedding so that all sentences, regardless of  ...  In this paper, we propose a novel cross-lingual structure transfer framework for zero-resource event extraction.  ... 
dblp:conf/lrec/LuSJMCSV20 fatcat:nebmjihr2zhkxk7ullz4c33woq
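
The two numbered steps quoted in the snippet — build graph structures from dependency parses, then represent nodes with cross-lingual word embeddings — can be sketched in a few lines. The one-layer mean-aggregating graph convolution and the random stand-in embeddings below are simplifying assumptions, not the paper's architecture.

```python
# Hedged sketch: one graph-convolution layer over a dependency parse whose
# nodes start from cross-lingual word embeddings (random stand-ins here).
import torch
import torch.nn as nn

class GraphConv(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.linear = nn.Linear(dim, dim)

    def forward(self, x, adj):
        # Mean-aggregate neighbour features, then transform.
        degree = adj.sum(dim=1, keepdim=True).clamp(min=1)
        return torch.relu(self.linear(adj @ x / degree))

n, dim = 6, 64
x = torch.randn(n, dim)          # stand-in for cross-lingual embeddings
adj = torch.eye(n)               # adjacency with self-loops
for child, head in enumerate([1, -1, 1, 4, 1, 4]):  # toy UD parse
    if head >= 0:
        adj[child, head] = adj[head, child] = 1.0

node_states = GraphConv(dim)(x, adj)  # language-agnostic node features
```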

Multilingual Knowledge Graph Embeddings for Cross-lingual Knowledge Alignment

Muhao Chen, Yingtao Tian, Mohan Yang, Carlo Zaniolo
2017 Proceedings of the Twenty-Sixth International Joint Conference on Artificial Intelligence  
Our models can be trained on partially aligned graphs, where just a small portion of triples are aligned with their cross-lingual counterparts.  ...  We deploy three different techniques to represent cross-lingual transitions, namely axis calibration, translation vectors, and linear transformations, and derive five variants for MTransE using different  ...  Meanwhile, translation-based models cooperate well with other models. For example, variants of TransE are combined with word embeddings to help relation extraction from text [Zhong et al., 2015].  ... 
doi:10.24963/ijcai.2017/209 dblp:conf/ijcai/ChenTYZ17 fatcat:kyiu7lywwjcpxfoykxyj6n4mg4
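
Of the three techniques named in the snippet, the linear-transformation variant is the simplest to sketch: learn a matrix that maps entity embeddings of one language's knowledge graph into the other's space, fitted only on the small aligned portion. The closed-form least-squares fit and synthetic data below are an illustrative stand-in for MTransE's joint training objective, not its actual optimisation.

```python
# Hedged sketch: cross-lingual alignment via a learned linear map, fitted
# on aligned pairs only. Synthetic data; a stand-in for MTransE training.
import numpy as np

rng = np.random.default_rng(0)
dim, n_aligned = 50, 200
E_i = rng.normal(size=(n_aligned, dim))   # entity embeddings, language i
M_true = rng.normal(size=(dim, dim))
E_j = E_i @ M_true + 0.01 * rng.normal(size=(n_aligned, dim))

# Fit M minimising ||E_i M - E_j||_F over the aligned subset.
M, *_ = np.linalg.lstsq(E_i, E_j, rcond=None)

# Align a held-out entity by nearest neighbour after mapping.
query = E_i[0] @ M
nearest = int(np.argmin(np.linalg.norm(E_j - query, axis=1)))
```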

Cross-Lingual Cross-Media Content Linking: Annotations and Joint Representations (Dagstuhl Seminar 15201)

Alexander G. Hauptmann, James Hodson, Juanzi Li, Nicu Sebe, Achim Rettinger, Marc Herbstritt
2015 Dagstuhl Reports  
Dagstuhl Seminar 15201 was conducted on "Cross-Lingual Cross-Media Content Linking: Annotations and Joint Representations".  ...  cross-lingual knowledge extraction.  ... 
doi:10.4230/dagrep.5.5.43 dblp:journals/dagstuhl-reports/HauptmannHLSR15 fatcat:sjqptft2m5cufcpzslggt7la5i

Multilingual Knowledge Graph Embeddings for Cross-lingual Knowledge Alignment [article]

Muhao Chen, Yingtao Tian, Mohan Yang, Carlo Zaniolo
2017 arXiv   pre-print
Our models can be trained on partially aligned graphs, where just a small portion of triples are aligned with their cross-lingual counterparts.  ...  We deploy three different techniques to represent cross-lingual transitions, namely axis calibration, translation vectors, and linear transformations, and derive five variants for MTransE using different  ...  Meanwhile, translation-based models cooperate well with other models. For example, variants of TransE are combined with word embeddings to help relation extraction from text [Zhong et al., 2015].  ... 
arXiv:1611.03954v3 fatcat:yrnlhpfhofhrfm3tlbbyxokjg4

LLOD-Driven Bilingual Word Embeddings Rivaling Cross-Lingual Transformers in Quality of Life Concept Detection from French Online Health Communities [chapter]

Katharina Allgaier, Susana Veríssimo, Sherry Tan, Matthias Orlikowski, Matthias Hartung
2021 Applications and Practices in Ontology Design, Extraction, and Reasoning  
In a combined configuration, our models rival the performance of state-of-the-art cross-lingual transformers, despite being of considerably lower model complexity.  ...  extraction algorithms.  ...  We find that our models, when combined with a baseline approach that integrates machine translation and rule-based extraction algorithms, are strong contenders against cross-lingual transformers.  ... 
doi:10.3233/ssw210037 fatcat:nvmixudmg5gy5naedlaezqlkbi

Context-Aware Cross-Lingual Mapping

Hanan Aldarmaki, Mona Diab
2019 Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (NAACL-HLT)  
We also implement cross-lingual mapping of deep contextualized word embeddings using parallel sentences with word alignments.  ...  In this paper, we propose an alternative to word-level mapping that better reflects sentence-level cross-lingual similarity.  ...  For cross-lingual alignment, we follow the popular approach of fitting a linear transformation matrix between word vector spaces that are independently trained for each language.  ... 
doi:10.18653/v1/n19-1391 dblp:conf/naacl/AldarmakiD19 fatcat:wzw7ssx3over7h3uk6heh6siwu
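
The word-level baseline described in the last quoted sentence — fitting a linear transformation between independently trained embedding spaces — is commonly solved with the orthogonal Procrustes solution from a bilingual seed lexicon. The sketch below shows that standard baseline on synthetic data; it is not the sentence-level method the paper itself proposes.

```python
# Hedged sketch: word-level cross-lingual mapping via orthogonal
# Procrustes on a seed lexicon. Synthetic data for illustration.
import numpy as np

def procrustes(X, Y):
    """Orthogonal W minimising ||X W - Y||_F for paired rows of X and Y."""
    U, _, Vt = np.linalg.svd(X.T @ Y)
    return U @ Vt

rng = np.random.default_rng(1)
src = rng.normal(size=(500, 300))            # source word vectors
W_true, _ = np.linalg.qr(rng.normal(size=(300, 300)))
tgt = src @ W_true                           # toy target space

W = procrustes(src[:100], tgt[:100])         # fit on a 100-pair lexicon
relative_error = np.linalg.norm(src @ W - tgt) / np.linalg.norm(tgt)
```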

The RELX Dataset and Matching the Multilingual Blanks for Cross-Lingual Relation Classification [article]

Abdullatif Köksal, Arzucan Özgür
2020 arXiv   pre-print
For evaluation, we introduce a new public benchmark dataset for cross-lingual relation classification in English, French, German, Spanish, and Turkish, called RELX.  ...  To overcome this issue, we propose two cross-lingual relation classification models: a baseline model based on Multilingual BERT and a new multilingual pretraining setup, which significantly improves the  ...  To the best of our knowledge, we present the first transformer based approach for the task of cross-lingual relation classification.  ... 
arXiv:2010.09381v1 fatcat:3tbw5kmvhvbpbaaweq6sdcyjv4
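
The "matching the blanks" family of pretraining objectives, which the title adapts to the multilingual setting, starts from a simple preprocessing step: entity mentions are replaced by a blank token with some probability, so the encoder must model the relational context rather than the entity surface forms. The sketch below shows only that masking step; the token name and probability are assumptions, not the paper's exact setup.

```python
# Hedged sketch: blank-masking of entity mentions for relation-oriented
# pretraining. Token name and probability are illustrative assumptions.
import random

BLANK = "[BLANK]"

def mask_entities(tokens, spans, p=0.7, seed=None):
    """Replace each entity span (inclusive indices) with BLANK w.p. p."""
    rng = random.Random(seed)
    chosen = {s: e for s, e in spans if rng.random() < p}
    out, skip = [], set()
    for i, tok in enumerate(tokens):
        if i in skip:
            continue
        if i in chosen:
            out.append(BLANK)
            skip.update(range(i, chosen[i] + 1))
        else:
            out.append(tok)
    return out

print(mask_entities(["Alice", "works", "for", "Acme", "Corp"],
                    spans=[(0, 0), (3, 4)], p=1.0))
# -> ['[BLANK]', 'works', 'for', '[BLANK]']
```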

Unsupervised cross-lingual model transfer for named entity recognition with contextualized word representations

Huijiong Yan, Tao Qian, Liang Xie, Shanguang Chen, Weinan Zhang
2021 PLoS ONE  
In this work, we investigate unsupervised cross-lingual NER with model transfer based on contextualized word representations, which greatly advances cross-lingual NER performance.  ...  We study several model transfer settings of the unsupervised cross-lingual NER, including (1) different types of the pretrained transformer-based language models as input, (2) the exploration strategies  ...  Here we introduce the related work from six aspects: NER, cross-lingual NER, model transfer, multi-source cross-lingual transfer, adapters, and PGN.  ... 
doi:10.1371/journal.pone.0257230 pmid:34547014 pmcid:PMC8454935 fatcat:idspy336izgd3bhnv6xvqd7hfm

Zero-Shot Cross-Lingual Phonetic Recognition with External Language Embedding

Heting Gao, Junrui Ni, Yang Zhang, Kaizhi Qian, Shiyu Chang, Mark Hasegawa-Johnson
2021 Conference of the International Speech Communication Association  
Even with no transcribed speech, it is possible to train a language embedding using only data from language typologies (phylogenetic node and phoneme inventory) that reduces ASR error rates.  ...  Because of the high error rates of zero-shot cross-lingual ASR, most researchers studying cross-lingual ASR have chosen pragmatically to define that term to mean few-shot rather than zero-shot recognition  ...  In particular, we show both phylogenetic and phonetic knowledge are necessary for good cross-lingual accuracy and that a linear transformation network can flexibly leverage both types of information to  ... 
doi:10.21437/interspeech.2021-1843 dblp:conf/interspeech/GaoNZQCH21 fatcat:h36lrbx54bbjtkplt67cgrkyeq
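
To illustrate what a typology-only language embedding could look like in code: a binary feature vector built from a language's phoneme inventory and phylogenetic family, projected to a dense vector that conditions the acoustic model. The feature sets, the linear projection, and all names below are assumptions for illustration, not the paper's actual features or architecture.

```python
# Hedged sketch: a language embedding from phoneme-inventory and
# phylogenetic features. Feature sets here are toy assumptions.
import torch
import torch.nn as nn

PHONEMES = ["p", "t", "k", "b", "d", "g", "m", "n", "s", "z"]
FAMILIES = ["Indo-European", "Sino-Tibetan", "Afro-Asiatic"]

def typology_features(inventory, family):
    phon = torch.tensor([1.0 if p in inventory else 0.0 for p in PHONEMES])
    phylo = torch.tensor([1.0 if f == family else 0.0 for f in FAMILIES])
    return torch.cat([phon, phylo])

project = nn.Linear(len(PHONEMES) + len(FAMILIES), 32)
feats = typology_features({"p", "t", "k", "m", "n", "s"}, "Indo-European")
lang_emb = project(feats)  # would condition a zero-shot ASR model
```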

TransWiC at SemEval-2021 Task 2: Transformer-based Multilingual and Cross-lingual Word-in-Context Disambiguation [article]

Hansi Hettiarachchi, Tharindu Ranasinghe
2021 arXiv   pre-print
Our approach also achieves satisfactory results in other monolingual and cross-lingual language pairs.  ...  Considering this limitation, our approach to SemEval-2021 Task 2 is based only on pretrained transformer models and does not use any language-specific processing or resources.  ...  Multilingual and cross-lingual transformer models like multilingual BERT and XLM-R show strong cross-lingual transfer learning performance.  ... 
arXiv:2104.04632v1 fatcat:tqqabmhhsfbvfoucrtmucqs6oi
Showing results 1 — 15 out of 14,175 results