
A Triple Relation Network for Joint Entity and Relation Extraction

Zixiang Wang, Liqun Yang, Jian Yang, Tongliang Li, Longtao He, Zhoujun Li
2022 Electronics  
Specifically, we design an attention-based entity pair encoding module to identify all normal entity pairs directly.  ...  Since relevant triples give a clue for establishing implicit connections among entities, we propose a Triple Relation Network (Trn) to jointly extract triples, with a particular focus on implicit triples  ...  Our model extracts more of the triples that overlap with others in the sentence, and we use an attention-based entity pair encoding module to extract entity pairs directly.  ... 
doi:10.3390/electronics11101535 fatcat:tzsnq62k3ndphcrep63emd4npi
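For orientation only, a minimal sketch of what an attention-based entity pair encoding/scoring module could look like; the class name, dimensions, and the single pooled head/tail representation per sentence are assumptions for illustration, not the Trn authors' implementation.

```python
# Hypothetical sketch of an attention-based entity-pair scorer (not the Trn authors' code).
import torch
import torch.nn as nn

class EntityPairScorer(nn.Module):
    def __init__(self, hidden: int = 256):
        super().__init__()
        self.head_attn = nn.Linear(hidden, 1)            # attention over tokens for the head entity
        self.tail_attn = nn.Linear(hidden, 1)            # attention over tokens for the tail entity
        self.bilinear = nn.Bilinear(hidden, hidden, 1)   # scores a (head, tail) pair

    def forward(self, token_states: torch.Tensor) -> torch.Tensor:
        # token_states: (batch, seq_len, hidden) from a sentence encoder
        head_w = torch.softmax(self.head_attn(token_states), dim=1)  # (batch, seq_len, 1)
        tail_w = torch.softmax(self.tail_attn(token_states), dim=1)
        head_repr = (head_w * token_states).sum(dim=1)   # (batch, hidden)
        tail_repr = (tail_w * token_states).sum(dim=1)
        return self.bilinear(head_repr, tail_repr)       # (batch, 1) pair score

scores = EntityPairScorer()(torch.randn(2, 16, 256))
print(scores.shape)  # torch.Size([2, 1])
```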

Joint Entity and Relation Extraction Network with Enhanced Explicit and Implicit Semantic Information

Huiyan Wu, Jun Huang
2022 Applied Sciences  
In this paper, we propose the Joint Entity and Relation Extraction Network with Enhanced Explicit and Implicit Semantic Information (EINET).  ...  Compared with the baseline model on CoNLL04, EINET obtains improvements of 2.37% F1 for named entity recognition and 3.43% F1 for relation extraction.  ...  implicit semantics is effective for joint entity and relation extraction.  ... 
doi:10.3390/app12126231 fatcat:676fju57xvbj3oefmxlijrdbni

Construction and Application of Text Entity Relation Joint Extraction Model Based on Multi-Head Attention Neural Network

Yafei Xue, Jing Zhu, Jing Lyu, Gengxin Sun
2022 Computational Intelligence and Neuroscience  
Based on the BERT model architecture, this paper addresses the joint extraction of textual entities and relations.  ...  After experimental analysis, comparing the traditional text entity relationship extraction model and the multi-head attention neural network joint extraction model, the model entity relationship extraction  ...  It is an entity-relationship joint extraction model based on feature enhancement, with a three-layer linear architecture consisting of a text feature extraction layer, a main entity recognition module, and an object  ... 
doi:10.1155/2022/1530295 pmid:35655501 pmcid:PMC9155959 fatcat:ch24cpgqsvarnbyd24nqsvenjy
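As a rough illustration of the kind of architecture this snippet describes (multi-head attention on top of encoder states, feeding entity and relation heads), here is a hedged sketch; all names, tag counts, and the token-pair scoring scheme are assumptions, not the paper's model.

```python
# Illustrative only: multi-head self-attention over encoder states (e.g. BERT outputs),
# with separate entity and relation classification heads.
import torch
import torch.nn as nn

class JointExtractionHead(nn.Module):
    def __init__(self, hidden: int = 768, heads: int = 8,
                 num_entity_tags: int = 9, num_relations: int = 5):
        super().__init__()
        self.self_attn = nn.MultiheadAttention(hidden, heads, batch_first=True)
        self.entity_clf = nn.Linear(hidden, num_entity_tags)       # per-token entity tags
        self.relation_clf = nn.Linear(2 * hidden, num_relations)   # per token-pair relations

    def forward(self, encoder_states: torch.Tensor):
        # encoder_states: (batch, seq_len, hidden), e.g. the last hidden states of BERT
        attended, _ = self.self_attn(encoder_states, encoder_states, encoder_states)
        entity_logits = self.entity_clf(attended)
        # score every token pair by concatenating their attended states
        b, n, h = attended.shape
        pairs = torch.cat([attended.unsqueeze(2).expand(b, n, n, h),
                           attended.unsqueeze(1).expand(b, n, n, h)], dim=-1)
        relation_logits = self.relation_clf(pairs)
        return entity_logits, relation_logits

ent, rel = JointExtractionHead()(torch.randn(2, 8, 768))
print(ent.shape, rel.shape)  # (2, 8, 9) and (2, 8, 8, 5)
```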

End-to-End Neural Relation Extraction Using Deep Biaffine Attention [chapter]

Dat Quoc Nguyen, Karin Verspoor
2019 Lecture Notes in Computer Science  
We propose a neural network model for joint extraction of named entities and relations between them, without any hand-crafted features.  ...  The key contribution of our model is to extend a BiLSTM-CRF-based entity recognition model with a deep biaffine attention layer to model second-order interactions between latent features for relation classification  ...  Acknowledgments: This work was supported by the ARC projects DP150101550 and LP160101469.  ... 
doi:10.1007/978-3-030-15712-8_47 fatcat:ksjkna7qpba3labw5sefqmgtbu
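A deep biaffine attention layer for relation classification, as referenced in this snippet, can be sketched as follows; the MLP sizes, relation count, and initialization are assumptions, and this is not the authors' code.

```python
# Minimal biaffine relation scorer over encoder states (e.g. BiLSTM outputs).
import torch
import torch.nn as nn

class BiaffineRelationScorer(nn.Module):
    def __init__(self, hidden: int = 256, rel_dim: int = 128, num_relations: int = 6):
        super().__init__()
        self.head_mlp = nn.Sequential(nn.Linear(hidden, rel_dim), nn.ReLU())
        self.dep_mlp = nn.Sequential(nn.Linear(hidden, rel_dim), nn.ReLU())
        # U: (num_relations, rel_dim + 1, rel_dim + 1); the +1 adds bias terms
        self.U = nn.Parameter(torch.zeros(num_relations, rel_dim + 1, rel_dim + 1))
        nn.init.xavier_uniform_(self.U)

    def forward(self, states: torch.Tensor) -> torch.Tensor:
        # states: (batch, seq_len, hidden)
        head = self.head_mlp(states)
        dep = self.dep_mlp(states)
        ones = states.new_ones(*states.shape[:2], 1)
        head = torch.cat([head, ones], dim=-1)   # (batch, seq, rel_dim + 1)
        dep = torch.cat([dep, ones], dim=-1)
        # biaffine: scores[b, r, i, j] = head[b, i] @ U[r] @ dep[b, j]
        return torch.einsum("bxi,rij,byj->brxy", head, self.U, dep)

scores = BiaffineRelationScorer()(torch.randn(2, 12, 256))
print(scores.shape)  # torch.Size([2, 6, 12, 12]): one score per relation and token pair
```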

A Survey on Extraction of Causal Relations from Natural Language Text [article]

Jie Yang, Soyeon Caren Han, Josiah Poon
2021 arXiv   pre-print
Next, we list benchmark datasets and modeling assessment methods for causal relation extraction. Then, we present a structured overview of the three techniques with their representative systems.  ...  As an essential component of human cognition, cause-effect relations appear frequently in text, and curating cause-effect relations from text helps in building causal networks for predictive tasks.  ...  [100] propose joint models for entity and relation extraction from the ADE corpus. Li et al.  ... 
arXiv:2101.06426v2 fatcat:hd3ikb7mejcndlq6wsgojv4uoa

Modeling Task Interactions in Document-Level Joint Entity and Relation Extraction [article]

Liyan Xu, Jinho D. Choi
2022 arXiv   pre-print
We target document-level relation extraction in an end-to-end setting, where the model needs to jointly perform mention extraction, coreference resolution (COREF) and relation extraction (RE) at once, and gets evaluated in an entity-centric way.  ...  Conclusion: We address the task interactions in end-to-end document-level relation extraction, and compare five model settings featuring different interactions, including both implicit and our proposed  ... 
arXiv:2205.01909v1 fatcat:tpzob4ddxfgv5oklwhhohzam24

Neural Relation Extraction for Knowledge Base Enrichment

Bayu Distiawan Trisedya, Gerhard Weikum, Jianzhong Qi, Rui Zhang
2019 Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics  
To address this problem, we propose an end-to-end relation extraction model for KB enrichment based on a neural encoder-decoder model.  ...  We study relation extraction for knowledge base (KB) enrichment.  ...  The overall objective function of the joint learning of word and entity embeddings is J = J_E + J_W (Eq. 7). N-gram Based Attention Model: Our proposed relation extraction model integrates the extraction  ... 
doi:10.18653/v1/p19-1023 dblp:conf/acl/TrisedyaWQZ19 fatcat:dkyftbaskjgmdb3yy7okjdijyu
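The snippet's joint objective simply sums the entity-embedding and word-embedding losses (J = J_E + J_W). The n-gram based attention mentioned alongside it could be sketched roughly as below; the mean-pooled n-gram windows and dot-product scoring are assumptions for illustration, not the paper's exact formulation.

```python
# Loose sketch: attention over n-gram representations built from word embeddings.
import torch
import torch.nn.functional as F

def ngram_attention(word_emb: torch.Tensor, query: torch.Tensor, n: int = 2) -> torch.Tensor:
    # word_emb: (seq_len, dim) embeddings of the source sentence
    # query:    (dim,)         e.g. a decoder state asking "which n-gram matters?"
    ngrams = word_emb.unfold(0, n, 1).mean(dim=-1)   # (seq_len - n + 1, dim) pooled windows
    scores = ngrams @ query                           # dot-product attention scores
    weights = F.softmax(scores, dim=0)
    return weights @ ngrams                           # context vector, shape (dim,)

ctx = ngram_attention(torch.randn(10, 64), torch.randn(64), n=3)
print(ctx.shape)  # torch.Size([64])
```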

Effective Deep Memory Networks for Distant Supervised Relation Extraction

Xiaocheng Feng, Jiang Guo, Bing Qin, Ting Liu, Yongjie Liu
2017 Proceedings of the Twenty-Sixth International Joint Conference on Artificial Intelligence  
the feature-based logistic regression model and compositional neural models such as CNN, our approach includes two major attention-based memory components, which are capable of explicitly capturing the  ...  with multiple computational layers, each of which is a neural attention model over an external memory.  ... 
doi:10.24963/ijcai.2017/559 dblp:conf/ijcai/FengGQLL17 fatcat:d5ueqv6ktrbopo4jq4jamft5sq
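A bare-bones sketch of multi-hop attention over an external memory, the general mechanism referenced above; the hop count, residual update rule, and dimensions are assumed, not taken from the paper.

```python
# Multi-hop attention over an external memory, in the general memory-network style.
import torch
import torch.nn.functional as F

def memory_hops(query: torch.Tensor, memory: torch.Tensor, hops: int = 3) -> torch.Tensor:
    # query:  (dim,)            initial query vector (e.g. an entity-pair representation)
    # memory: (num_slots, dim)  external memory, e.g. encoded context words
    for _ in range(hops):
        weights = F.softmax(memory @ query, dim=0)   # attention over memory slots
        read = weights @ memory                      # weighted read vector
        query = query + read                         # simple residual update per layer
    return query

out = memory_hops(torch.randn(128), torch.randn(20, 128))
print(out.shape)  # torch.Size([128])
```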

On the Complementary Nature of Knowledge Graph Embedding, Fine Grain Entity Types, and Language Modeling [article]

Rajat Patel, Francis Ferraro
2020 arXiv   pre-print
We demonstrate the complementary natures of neural knowledge graph embedding, fine-grain entity type prediction, and neural language modeling.  ...  We show that a language model-inspired knowledge graph embedding approach yields both improved knowledge graph embeddings and fine-grain entity type representations.  ...  It also improves relation extraction, knowledge base completion and entity resolution (Nickel et al., 2011).  ... 
arXiv:2010.05732v1 fatcat:at2mbe2pafdvdouco4ozyp5uv4

A Method about Building Deep Knowledge Graph for the Plant Insect Pest and Disease (DKG-PIPD)

Yingying Liu
2021 IEEE Access  
Furthermore, DKG-PIPD performed joint extraction of entities and relations from unstructured knowledge using a corpus tagging method suitable for domain data.  ...  Moreover, the related work in this paper first introduced the general architecture required for building a knowledge graph, and then summarized its key points, that is, named entity recognition, entity  ...  We implement a novel corpus tagging model based on the field ontology to achieve joint extraction of the entity and the relation, simultaneous tagging of the entity and the relation, direct modeling of  ... 
doi:10.1109/access.2021.3116467 fatcat:vyoluv7ln5ebnlrm3ogut32eoa
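Purely as an illustration of how a single corpus tag can mark entity boundary, entity type, and relation role at once, here is a generic scheme; the tag layout and example labels are assumptions, not necessarily DKG-PIPD's tagging design.

```python
# Generic joint entity/relation tagging scheme: one BIO tag per token that also
# carries entity type, relation type, and the token's role in the relation.
from typing import List

def joint_tags(tokens: List[str], span: range, entity_type: str,
               relation: str, role: str) -> List[str]:
    # role is "S" (subject) or "O" (object) of the relation
    tags = ["O"] * len(tokens)
    for i, idx in enumerate(span):
        prefix = "B" if i == 0 else "I"
        tags[idx] = f"{prefix}-{entity_type}-{relation}-{role}"
    return tags

print(joint_tags(["aphids", "damage", "wheat", "leaves"], range(0, 1),
                 "Pest", "Harms", "S"))
# ['B-Pest-Harms-S', 'O', 'O', 'O']
```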

Collaborative Knowledge Base Embedding for Recommender Systems

Fuzheng Zhang, Nicholas Jing Yuan, Defu Lian, Xing Xie, Wei-Ying Ma
2016 Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining - KDD '16  
We apply stacked denoising auto-encoders and stacked convolutional auto-encoders, which are two types of deep learning based embedding techniques, to extract items' textual representations and visual representations  ...  Due to the rapid collection of information on the web, the knowledge base provides heterogeneous information including both structured and unstructured data with different semantics, which can be consumed  ...  Our model mainly consists of two steps: 1) knowledge base embedding and 2) collaborative joint learning.  ... 
doi:10.1145/2939672.2939673 dblp:conf/kdd/ZhangYLXM16 fatcat:tnf43r4ohrhzhfrz2qbg7vd4ae
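A minimal sketch of one denoising auto-encoder layer; stacking several such layers, each trained on the previous layer's codes, gives the SDAE idea mentioned in the snippet. Layer sizes, the masking noise, and the MSE reconstruction loss are assumptions for illustration.

```python
# One denoising auto-encoder layer: corrupt the input, encode, then reconstruct the clean input.
import torch
import torch.nn as nn

class DenoisingAutoencoderLayer(nn.Module):
    def __init__(self, in_dim: int, code_dim: int, noise: float = 0.3):
        super().__init__()
        self.noise = noise
        self.encoder = nn.Sequential(nn.Linear(in_dim, code_dim), nn.ReLU())
        self.decoder = nn.Linear(code_dim, in_dim)

    def forward(self, x: torch.Tensor):
        corrupted = x * (torch.rand_like(x) > self.noise).float()  # masking noise
        code = self.encoder(corrupted)
        recon = self.decoder(code)
        return code, recon

layer = DenoisingAutoencoderLayer(in_dim=1000, code_dim=128)
x = torch.rand(4, 1000)                     # e.g. bag-of-words vectors of item text
code, recon = layer(x)
loss = nn.functional.mse_loss(recon, x)     # reconstruct the *clean* input
```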

Learning to Update Knowledge Graphs by Reading News

Jizhi Tang, Yansong Feng, Dongyan Zhao
2019 Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP)  
GUpdater is built upon graph neural networks (GNNs) with a text-based attention mechanism to guide the updating message passing through the KG structures.  ...  Most current text-based KG updating methods rely on elaborately designed information extraction systems and carefully crafted rules, which are often domain-specific and hard to maintain or generalize.  ...  Acknowledgements This work is supported in part by the National Hi-Tech R&D Program of China (No. 2018YFC0831900) and the NSFC Grants (No. 61672057, 61672058).  ... 
doi:10.18653/v1/d19-1265 dblp:conf/emnlp/TangFZ19 fatcat:6pkvp5sa7jbq5bgh6wi4zmlpri
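A rough sketch of attention-gated message passing on a KG, where the gate compares edge representations with an encoded news text; the function names, sigmoid gate, and residual update are assumptions, and this is not the GUpdater implementation.

```python
# Text-guided message passing: a text-based gate scales each edge's message.
import torch
import torch.nn.functional as F

def text_guided_propagation(node_states, edges, edge_emb, text_vec):
    # node_states: (num_nodes, dim); edges: (num_edges, 2) int tensor of (src, dst)
    # edge_emb:    (num_edges, dim)  relation/edge representations
    # text_vec:    (dim,)            encoding of the news text that guides updating
    gate = torch.sigmoid(edge_emb @ text_vec)               # (num_edges,) text-based attention
    messages = gate.unsqueeze(-1) * node_states[edges[:, 0]]
    out = torch.zeros_like(node_states)
    out.index_add_(0, edges[:, 1], messages)                # aggregate messages at destinations
    return F.relu(node_states + out)                        # residual node update

nodes = torch.randn(5, 32)
edges = torch.tensor([[0, 1], [2, 1], [3, 4]])
updated = text_guided_propagation(nodes, edges, torch.randn(3, 32), torch.randn(32))
```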

Hierarchical Attention Networks for Knowledge Base Completion via Joint Adversarial Training [article]

Chen Li, Xutan Peng, Shanghang Zhang, Jianxin Li, Lihong Wang
2018 arXiv   pre-print
The AT mechanism encourages our model to extract features that are both discriminative for missing relation prediction and shareable between single relations and multi-hop paths.  ...  By jointly training the entire model with Adversarial Training (AT), our method minimizes the classification error of missing relations while ensuring that the source of the shared features is difficult to discriminate  ...  SFE-PR is a PR-based method that uses breadth-first search with Subgraph Feature Extraction (SFE) for extracting path features (Gardner and Mitchell 2015).  ... 
arXiv:1810.06033v1 fatcat:2wkdz624lvg3dowrssyc4bz2ku
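The adversarial ingredient described here can be sketched with a gradient-reversal layer in front of a source discriminator, so that minimizing the discriminator's loss pushes the shared features to become source-indistinguishable; the module sizes and names are assumptions, not the paper's architecture.

```python
# Gradient-reversal layer plus a discriminator over the shared features.
import torch
import torch.nn as nn

class GradReverse(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x, lam: float = 1.0):
        ctx.lam = lam
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lam * grad_output, None   # reverse the gradient for the extractor

shared = nn.Linear(64, 64)        # shared feature extractor (hypothetical)
discriminator = nn.Linear(64, 2)  # predicts the feature source (single relation vs. path)

features = shared(torch.randn(8, 64))
source_logits = discriminator(GradReverse.apply(features))
# Minimizing the discriminator loss now *maximizes* it w.r.t. the shared extractor,
# pushing the shared features to be source-indistinguishable.
```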

Utilizing Textual Information in Knowledge Graph Embedding: A Survey of Methods and Applications

Fengyuan Lu, Peijin Cong, Xinli Huang
2020 IEEE Access  
Firstly, we introduce the techniques for encoding textual information to represent entities and relations, from the perspectives of encoding models and scoring functions, respectively.  ...  Finally, applications of KG embedding with textual information in specific tasks such as KG completion in the zero-shot scenario, multilingual entity alignment, relation extraction and recommender systems  ...  The reason is that the number of relations is too small for Joint(A-LSTM) to take the relation as attention.  ... 
doi:10.1109/access.2020.2995074 fatcat:tlayxrhr5fbjbfqvnuczq34bae

Attention Weight is Indispensable in Joint Entity and Relation Extraction

Jianquan Ouyang, Jing Zhang, Tianming Liu
2022 Intelligent Automation and Soft Computing  
Joint entity and relation extraction (JERE) is an important foundation for unstructured knowledge extraction in natural language processing (NLP).  ...  In this paper, we propose a novel model called Attention and Span-based Entity and Relation Transformer (ASpERT) for JERE.  ...  Acknowledgement: We thank the open-source authors of the dataset. We also thank all members of Xiangtan University 504 Lab for their strong support for this research.
doi:10.32604/iasc.2022.028352 fatcat:kwpla24d2jb2plqhzqinpotwta
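As a hedged sketch of the span-plus-attention ingredient such a model relies on, attention pooling over the tokens of a candidate span might look like this; the class name and dimensions are illustrative assumptions, not the ASpERT code.

```python
# Attention pooling over a candidate span to build its representation.
import torch
import torch.nn as nn

class AttentiveSpanPooler(nn.Module):
    def __init__(self, hidden: int = 768):
        super().__init__()
        self.score = nn.Linear(hidden, 1)   # per-token attention score

    def forward(self, token_states: torch.Tensor, start: int, end: int) -> torch.Tensor:
        # token_states: (seq_len, hidden); [start, end) is the candidate span
        span = token_states[start:end]                     # (span_len, hidden)
        weights = torch.softmax(self.score(span), dim=0)   # (span_len, 1)
        return (weights * span).sum(dim=0)                 # (hidden,) span embedding

pooler = AttentiveSpanPooler(hidden=16)
span_emb = pooler(torch.randn(10, 16), start=2, end=5)
```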
Showing results 1 — 15 out of 23,259 results