Combining Long Short Term Memory and Convolutional Neural Network for Cross-Sentence n-ary Relation Extraction
[article]
2018
arXiv
pre-print
We propose in this paper a combined model of Long Short Term Memory and Convolutional Neural Networks (LSTM-CNN) that exploits word embeddings and positional embeddings for cross-sentence n-ary relation ...
n-ary relation extraction. ...
To address these issues, we propose a combined model consisting of a Long Short-Term Memory unit and a Convolutional Neural Network (LSTM-CNN) that exploits both word embedding and positional embedding features ...
arXiv:1811.00845v1
fatcat:nw43kqahsnbabehvt7habqqp7y
Incorporating representation learning and multihead attention to improve biomedical cross-sentence n-ary relation extraction
2020
BMC Bioinformatics
We explored a novel method for cross-sentence n-ary relation extraction. ...
Experiments on n-ary relation extraction show that combining context and knowledge representations can significantly improve the n-ary relation extraction performance. ...
Acknowledgments The authors would like to thank the editor and all anonymous reviewers for valuable suggestions and constructive comments. ...
doi:10.1186/s12859-020-03629-9
pmid:32677883
fatcat:embkd22td5a6lnarajesvyj4wy
BERT-GT: Cross-sentence n-ary relation extraction with BERT and graph transformer
2021
Bioinformatics
relations among n entities across multiple sentences, and use either a graph neural network (GNN) with long short-term memory (LSTM) or an attention mechanism. ...
To automatically extract information from biomedical literature, existing biomedical text-mining approaches typically formulate the problem as a cross-sentence n-ary relation-extraction task that detects ...
Zhijiang Guo for helping us to reproduce the results of AGGCN on the n-ary dataset. We thank Dr. Chih-Hsuan Wei for his assistance on revising the manuscript. ...
doi:10.1093/bioinformatics/btaa1087
pmid:33416851
pmcid:PMC8023679
fatcat:c3yptvgvi5ghhiatyoa6rjdlku
Significance of Learning and Memories in Computational Methods
2020
Helix
In this paper a study is presented on the influences of convolutional neural networks and long short-term memory networks that impose new ideas to develop intensive learning frameworks. ...
Neural Networks play a catalytic role in artificial intelligence and learning processes. ...
obtaining pin-point comprehension in various high dimensions of vectored and non-vectored data, orchestration of DNN, RNN, CNN and LSTM has to take place, no matter how specific they are in their applications ...
doi:10.29042/2020-10-2-219-225
fatcat:pey5lljb35fxxlfyvv6mouwxiu
BERT-GT: Cross-sentence n-ary relation extraction with BERT and Graph Transformer
[article]
2021
arXiv
pre-print
relations among n entities across multiple sentences, and use either a graph neural network (GNN) with long short-term memory (LSTM) or an attention mechanism. ...
To automatically extract information from biomedical literature, existing biomedical text-mining approaches typically formulate the problem as a cross-sentence n-ary relation-extraction task that detects ...
Zhijiang Guo for helping us to reproduce the results of AGGCN on the n-ary dataset. We thank Dr. Chih-Hsuan Wei for his assistance on revising the manuscript. ...
arXiv:2101.04158v1
fatcat:atyb4ymse5fxdlrszglb6dsogm
Cross-Sentence N-ary Relation Extraction with Graph LSTMs
2017
Transactions of the Association for Computational Linguistics
In this paper, we explore a general relation extraction framework based on graph long short-term memory networks (graph LSTMs) that can be easily extended to cross-sentence n-ary relation extraction. ...
Recent NLP inroads in high-value domains have sparked interest in the more general setting of extracting n-ary relations that span multiple sentences. ...
Acknowledgements We thank Daniel Fried and Ming-Wei Chang for useful discussions, as well as the anonymous reviewers and editor-in-chief Mark Johnson for their helpful comments. ...
doi:10.1162/tacl_a_00049
fatcat:ey7qk2gwqrebjkzpifjmqjvfq4
Cross-Sentence N-ary Relation Extraction with Graph LSTMs
[article]
2017
arXiv
pre-print
In this paper, we explore a general relation extraction framework based on graph long short-term memory networks (graph LSTMs) that can be easily extended to cross-sentence n-ary relation extraction. ...
Recent NLP inroads in high-value domains have sparked interest in the more general setting of extracting n-ary relations that span multiple sentences. ...
Acknowledgements We thank Daniel Fried and Ming-Wei Chang for useful discussions, as well as the anonymous reviewers and editor-in-chief Mark Johnson for their helpful comments. ...
arXiv:1708.03743v1
fatcat:2cmd6mtg5neq3gexu5fet5pkya
Multi-Scale Feature and Metric Learning for Relation Extraction
[article]
2021
arXiv
pre-print
To address the above limitations, we propose a multi-scale feature and metric learning framework for relation extraction. ...
Specifically, we first develop a multi-scale convolutional neural network to aggregate the non-successive mainstays in the lexical sequence. ...
Sequence-based models [15] , [16] , [17] , [18] , [19] , [20] extract local and global lexical features from the word sequence using convolutional neural network (CNN) [21] and long short-term ...
arXiv:2107.13425v1
fatcat:fav425fzzrdy3dujlntcznh6he
Protein-Protein Interaction Extraction using Attention-based Tree-Structured Neural Network Models
2022
Proceedings of the ... International Florida Artificial Intelligence Research Society Conference
This paper investigates tree-structured neural network models for the PPI task and the results show their supremacy over sequential models and their effectiveness for this task. ...
Unlike sequential models, tree-structured neural network models have the ability to consider syntactic and semantic dependencies between different portions of the text and can provide structural information ...
Acknowledgments We thank the reviewers for their constructive comments. ...
doi:10.32473/flairs.v35i.130660
fatcat:3jc22eraxzbaxb5qofaqus5w7m
Multi-Stream Semantics-Guided Dynamic Aggregation Graph Convolution Networks to Extract Overlapping Relations
2021
IEEE Access
graph convolution network (SG-DAGCN) to realize the extraction of overlapping relations. ...
INDEX TERMS Overlapping relation extraction, multiscale structural information, dynamic aggregation, long distance dependencies, refined graph, relevant substructure. ...
ACKNOWLEDGMENT The authors would like to thank the anonymous reviewers for their comments and helpful suggestions. ...
doi:10.1109/access.2021.3062231
fatcat:zu4rrzkalfcsxdol7x63lzhadu
Inter-sentence Relation Extraction with Document-level Graph Convolutional Neural Network
2019
Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics
We present a novel inter-sentence relation extraction model that builds a labelled edge graph convolutional neural network model on a document-level graph. ...
Our analysis shows that all the types in the graph are effective for inter-sentence relation extraction. ...
Results were obtained from a project commissioned by the New Energy and Industrial Technology Development Organization (NEDO). ...
doi:10.18653/v1/p19-1423
dblp:conf/acl/SahuCMA19
fatcat:354lhl2zprflbdqvr2ctiy4w5e
Unity in Diversity: Learning Distributed Heterogeneous Sentence Representation for Extractive Summarization
[article]
2019
arXiv
pre-print
This semantic and compositional feature vector is then concatenated with the document-dependent features for sentence ranking. ...
The network learns sentence representation in a way that salient sentences are closer in the vector space than non-salient sentences. ...
Singh, Gupta, and Varma (2017) proposed a combination of memory network and convolutional BLSTM (Bidirectional Long Short-Term Memory) network to learn a better unified document representation which jointly ...
arXiv:1912.11688v1
fatcat:2337brxyovaqxcge26ry5fsxui
Part of speech tagging: a systematic review of deep learning and machine learning approaches
2022
Journal of Big Data
Furthermore, the presence of ambiguity when tagging terms with different contextual meanings inside a sentence cannot be overlooked. ...
Then, recent trends and advancements of DL and ML-based part-of-speech-taggers are presented in terms of the proposed approaches deployed and their performance evaluation metrics. ...
The experiments are conducted using Gated Recurrent Unit (GRU), Long Short-Term Memory (LSTM), Recurrent Neural Networks (RNN), and Bi-directional Long Short-Term Memory (BLSTM) for implementing POS ...
doi:10.1186/s40537-022-00561-y
fatcat:fpmxdnm76benxms6wckwiqykpy
Neural Metric Learning for Fast End-to-End Relation Extraction
[article]
2019
arXiv
pre-print
Relation extraction (RE) is an indispensable information extraction task in several disciplines. ...
We introduce a novel neural architecture utilizing the table structure, based on repeated applications of 2D convolutions for pooling local dependency and metric-based features, that improves on the state-of-the-art ...
Currently, the architecture is designed for extracting relations involving two entities and occurring within sentence bounds; handling n-ary relations and exploring document-level extraction involving cross-sentence ...
arXiv:1905.07458v4
fatcat:ozzii4u4lrasxfojqeesasxktu
Deep Neural Networks for Relation Extraction
[article]
2021
arXiv
pre-print
Finally, we propose a hierarchical entity graph convolutional network for relation extraction across documents. ...
Next, we propose two joint entity and relation extraction frameworks based on encoder-decoder architecture. ...
We use a bi-directional long short-term memory (Bi-LSTM) (Hochreiter and Schmidhuber, 1997) layer to capture the interaction among words in a sentence S = {w_1, w_2, ..., w_n}, where n is the sentence ...
arXiv:2104.01799v1
fatcat:vmatz7gxazd4xnm2oprncd5mm4
Showing results 1–15 of 294