2,444 Hits in 5.7 sec

Tag-Enhanced Tree-Structured Neural Networks for Implicit Discourse Relation Classification [article]

Yizhong Wang, Sujian Li, Jingfeng Yang, Xu Sun, Houfeng Wang
2018 arXiv   pre-print
Identifying implicit discourse relations between text spans is a challenging task because it requires understanding the meaning of the text. ... Specifically, we employ the Tree-LSTM model and the Tree-GRU model, which are based on the tree structure, to encode the arguments of a relation. ...
arXiv:1803.01165v1 fatcat:ei5qa4dxu5exvcqtwkjcuuf7zu
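The Tree-LSTM encoder this snippet refers to composes an argument's representation bottom-up over its tree. The sketch below is a minimal pure-Python illustration of a Child-Sum Tree-LSTM cell (in the style of Tai et al., 2015), not the authors' implementation; the toy dimension `D` and the random weights are assumptions for demonstration only.

```python
import math, random

random.seed(0)
D = 4  # toy hidden/input size (assumption, not from the paper)

def vec(): return [random.uniform(-0.1, 0.1) for _ in range(D)]
def mat(): return [vec() for _ in range(D)]

# One shared set of Child-Sum Tree-LSTM parameters: W* act on the input x,
# U* act on (sums of) child hidden states, b* are biases.
W = {g: mat() for g in "ifou"}
U = {g: mat() for g in "ifou"}
b = {g: [0.0] * D for g in "ifou"}

def mv(M, v): return [sum(M[i][j] * v[j] for j in range(D)) for i in range(D)]
def add(*vs): return [sum(xs) for xs in zip(*vs)]
def sigmoid(v): return [1 / (1 + math.exp(-x)) for x in v]
def tanh(v): return [math.tanh(x) for x in v]
def had(a, c): return [x * y for x, y in zip(a, c)]

def tree_lstm(x, children):
    """Child-Sum Tree-LSTM cell; children is a list of (h, c) pairs."""
    h_sum = add(*[h for h, _ in children]) if children else [0.0] * D
    i = sigmoid(add(mv(W["i"], x), mv(U["i"], h_sum), b["i"]))
    o = sigmoid(add(mv(W["o"], x), mv(U["o"], h_sum), b["o"]))
    u = tanh(add(mv(W["u"], x), mv(U["u"], h_sum), b["u"]))
    c = had(i, u)
    # One forget gate per child, conditioned on that child's hidden state.
    for h_k, c_k in children:
        f_k = sigmoid(add(mv(W["f"], x), mv(U["f"], h_k), b["f"]))
        c = add(c, had(f_k, c_k))
    h = had(o, tanh(c))
    return h, c

# Encode a tiny two-leaf argument tree bottom-up.
leaf1 = tree_lstm(vec(), [])
leaf2 = tree_lstm(vec(), [])
root_h, root_c = tree_lstm(vec(), [leaf1, leaf2])
print(len(root_h))  # D-dimensional representation of the argument
```

The root's hidden state serves as the argument representation; a relation classifier would then combine the two arguments' root vectors.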

Tree framework with BERT word embedding for the recognition of Chinese implicit discourse relations

Dan Jiang, Jin He
2020 IEEE Access  
For Chinese implicit DRR, in this study the relation labels are first organized into relation trees according to the level of the relations, and a model based on BERT, BiLSTM, and a tree structure ... Rutherford et al. proposed the use of neural network models based on feedforward and LSTM architectures and systematically studied the effects of varying structures [12]. ...
doi:10.1109/access.2020.3019500 fatcat:kbknwhehgrhkrh6jf543wcqvr4

Entity-Augmented Distributional Semantics for Discourse Relations [article]

Yangfeng Ji, Jacob Eisenstein
2015 arXiv   pre-print
The resulting system obtains substantial improvements over the previous state of the art in predicting implicit discourse relations in the Penn Discourse Treebank. ... A more subtle challenge is that it is not enough to represent the meaning of each sentence of a discourse relation, because the relation may depend on links between lower-level elements, such as entity ... We evaluate our approach on implicit discourse relation identification in the Penn Discourse Treebank (PDTB). ...
arXiv:1412.5673v3 fatcat:5cddmgeqcjhitjrepod6tmgg5a

Do We Really Need All Those Rich Linguistic Features? A Neural Network-Based Approach to Implicit Sense Labeling

Niko Schenk, Christian Chiarcos, Kathrin Donandt, Samuel Rönnqvist, Evgeny Stepanov, Giuseppe Riccardi
2016 Proceedings of the CoNLL-16 shared task  
We describe our contribution to the CoNLL 2016 Shared Task on shallow discourse parsing. Our system extends the two best parsers from the previous year's competition by integrating a novel implicit sense ... It is grounded on a highly generic, language-independent feedforward neural network architecture incorporating weighted word embeddings for argument spans, which obviates the need for (traditional) hand-crafted ... We construct a neural network-based module for the classification of senses for both implicit and entity (EntRel) relations. As a very general ...
doi:10.18653/v1/k16-2005 dblp:conf/conll/SchenkCDRSR16 fatcat:hzcf2f2rjzgvnkcmjidulq6mhe
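The architecture this entry describes (weighted word embeddings for each argument span fed to a feedforward classifier) can be sketched as follows. The embedding table, the linear position weighting, and the layer sizes are illustrative assumptions, not the shared-task system's actual configuration.

```python
import math, random

random.seed(1)
D, H, SENSES = 8, 6, 4  # toy embedding/hidden/output sizes (assumptions)

# Toy embedding table; a real system would load pretrained vectors.
VOCAB = {w: [random.uniform(-1, 1) for _ in range(D)]
         for w in "the market fell investors sold shares".split()}

# Fixed random parameters for a two-layer feedforward classifier.
W1 = [[random.uniform(-0.5, 0.5) for _ in range(2 * D)] for _ in range(H)]
W2 = [[random.uniform(-0.5, 0.5) for _ in range(H)] for _ in range(SENSES)]

def weighted_avg(tokens):
    """Position-weighted average of a span's word embeddings.
    The linear weighting here is one simple choice; the shared-task
    system's exact weighting scheme may differ."""
    weights = [i + 1 for i in range(len(tokens))]
    total = sum(weights)
    emb = [0.0] * D
    for w_tok, tok in zip(weights, tokens):
        for j, x in enumerate(VOCAB[tok]):
            emb[j] += (w_tok / total) * x
    return emb

def classify(arg1, arg2):
    x = weighted_avg(arg1) + weighted_avg(arg2)  # concatenated span vectors
    hidden = [math.tanh(sum(W1[i][j] * x[j] for j in range(2 * D)))
              for i in range(H)]
    logits = [sum(W2[k][i] * hidden[i] for i in range(H))
              for k in range(SENSES)]
    return max(range(SENSES), key=lambda k: logits[k])

sense = classify("the market fell".split(), "investors sold shares".split())
print(sense)  # predicted sense index in [0, SENSES)
```

The appeal of this design, as the abstract notes, is that the span representation requires no hand-crafted features: everything downstream of the embeddings is learned.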

Event-Related Features in Feedforward Neural Networks Contribute to Identifying Causal Relations in Discourse

Edoardo Maria Ponti, Anna Korhonen
2017 Proceedings of the 2nd Workshop on Linking Models of Lexical, Sentential and Discourse-level Semantics  
This makes their identification challenging. We aim to improve their identification by implementing a feedforward neural network with a novel set of features for this task. ... In particular, these are based on the position of event mentions and the semantics of events and participants. ... Few approaches based on deep learning have been proposed for discourse relation classification so far; ... focused on implicit relations. ...
doi:10.18653/v1/w17-0903 dblp:conf/eacl/PontiK17 fatcat:ipomckbwbvcs3jwr2emqe6shgi

Adapting Event Embedding for Implicit Discourse Relation Recognition

Maria Leonor Pacheco, I-Ta Lee, Xiao Zhang, Abdullah Khan Zehady, Pranjal Daga, Di Jin, Ayush Parolia, Dan Goldwasser
2016 Proceedings of the CoNLL-16 shared task  
We model discourse arguments as a combination of word and event vectors. Event information is aggregated with word vectors, and a multi-layer neural network is used to classify discourse senses. ... Predicting the sense of a discourse relation is particularly challenging when connective markers are missing. ... In all these experiments we used a neural network architecture, with a simple lexical classifier based on word pairs as a baseline. ...
doi:10.18653/v1/k16-2019 dblp:conf/conll/PachecoLZZDJPG16 fatcat:6hopiufz3zfnxoyqpptt6rjmvy

The DCU Discourse Parser for Connective, Argument Identification and Explicit Sense Classification

Longyue Wang, Chris Hokamp, Tsuyoshi Okita, Xiaojun Zhang, Qun Liu
2015 Proceedings of the Nineteenth Conference on Computational Natural Language Learning - Shared Task  
This paper describes our submission to the CoNLL-2015 shared task on discourse parsing. We factor the pipeline into subcomponents which are then used to form the final sequential architecture. ... Focusing on achieving good performance when inferring explicit discourse relations, we apply maximum entropy and recurrent neural networks to different sub-tasks such as connective identification, argument ...
doi:10.18653/v1/k15-2014 dblp:conf/conll/WangH0ZL15 fatcat:qcr3udwkwveodpsbfakcu2e7hi

Exploring Joint Neural Model for Sentence Level Discourse Parsing and Sentiment Analysis

Bita Nejat, Giuseppe Carenini, Raymond Ng
2017 Proceedings of the 18th Annual SIGdial Meeting on Discourse and Dialogue  
Next, we apply three different Recursive Neural Net models: one for discourse structure prediction, one for discourse relation prediction, and one for sentiment analysis. ... Specifically for discourse parsing, we see improvements in prediction for the set of contrastive relations. ...
doi:10.18653/v1/w17-5535 dblp:conf/sigdial/NejatCN17 fatcat:khab45b5mzfu7grcxej2lpoq54

Learning Representations for Text-level Discourse Parsing

Gregor Weiss
2015 Proceedings of the ACL-IJCNLP 2015 Student Research Workshop  
Instead of depending on mostly hand-engineered sparse features and independent components for each subtask, we propose a unified approach completely based on deep learning architectures.  ...  To train more expressive representations that capture communicative functions and semantic roles of discourse units and relations between them, we will jointly learn all discourse parsing subtasks at different  ...  Adjacent nodes are joined depending on their discourse relations to form a tree.  ... 
doi:10.3115/v1/p15-3003 dblp:conf/acl/Weiss15 fatcat:lgoso2wbpvaujopgrguwzrsedi

CoNLL 2016 Shared Task on Multilingual Shallow Discourse Parsing

Nianwen Xue, Hwee Tou Ng, Sameer Pradhan, Attapol Rutherford, Bonnie Webber, Chuan Wang, Hongmin Wang
2016 Proceedings of the CoNLL-16 shared task  
Progression: relation in which one argument represents a progression from the other, in extent, intensity, scale, etc. Expansion: relation in which one argument is an elaboration or restatement of another. Purpose: relation between an action and the intention behind it. Temporal: relation that is temporal in nature, expressing ... We would like to thank the Penn Discourse TreeBank team and the Chinese Discourse TreeBank team for allowing us to use the PDTB corpus and the CDTB corpus for the shared task. ...
doi:10.18653/v1/k16-2001 dblp:conf/conll/XueNPRWWW16 fatcat:fqyb7bkixngzrnir76osuwc4mu

Recognizing Implicit Discourse Relations via Repeated Reading: Neural Networks with Multi-Level Attention

Yang Liu, Sujian Li
2016 Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing  
Recognizing implicit discourse relations is a challenging but important task in the field of Natural Language Processing. ... To mimic the repeated reading strategy, we propose neural networks with multi-level attention (NNMA), combining the attention mechanism and external memories to gradually fix the attention on some ...
doi:10.18653/v1/d16-1130 dblp:conf/emnlp/LiuL16 fatcat:mpm75yerendp5lqtj7zddhjbza
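The multi-level attention with external memory that NNMA uses can be illustrated with a small sketch: at each level, attention over the word vectors is conditioned on a memory state, and the attended summary updates that memory for the next "reading". This is a pure-Python illustration under simplified assumptions (dot-product attention, additive memory update); NNMA's actual gated update rules differ.

```python
import math, random

random.seed(2)
D, LEVELS = 5, 3  # toy embedding size and number of attention levels (assumptions)
words = [[random.uniform(-1, 1) for _ in range(D)] for _ in range(7)]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def softmax(scores):
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    return [e / z for e in exps]

# External memory: starts as the mean of the word vectors and is refined
# once per attention level ("repeated reading").
memory = [sum(w[j] for w in words) / len(words) for j in range(D)]

for level in range(LEVELS):
    # Attention scores conditioned on the current memory state.
    alphas = softmax([dot(w, memory) for w in words])
    summary = [sum(a * w[j] for a, w in zip(alphas, words)) for j in range(D)]
    # Simple additive memory update; NNMA uses a gated, GRU-style update.
    memory = [(m + s) / 2 for m, s in zip(memory, summary)]

print(len(memory))  # final memory vector used for relation classification
```

Each pass sharpens the attention toward words most compatible with what has been "read" so far, which is the intuition behind the repeated-reading design.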

Recognizing Implicit Discourse Relations via Repeated Reading: Neural Networks with Multi-Level Attention [article]

Yang Liu, Sujian Li
2016 arXiv   pre-print
Recognizing implicit discourse relations is a challenging but important task in the field of Natural Language Processing. ... To mimic the repeated reading strategy, we propose neural networks with multi-level attention (NNMA), combining the attention mechanism and external memories to gradually fix the attention on some ...
arXiv:1609.06380v1 fatcat:b6d432fhbjbyvjvf5vbinimc5m

Improve Discourse Dependency Parsing with Contextualized Representations [article]

Yifei Zhou, Yansong Feng
2022 arXiv   pre-print
... of structural information from the context of extracted discourse trees, and substantially outperforms traditional direct-classification methods. ... Motivated by the observation of writing patterns commonly shared across articles, we propose a novel method that treats discourse relation identification as a sequence labelling task, which takes advantage ... To test the effectiveness of our model for implicit discourse relation identification, we delete some freely omissible connectives identified by Ma et al. (2019) to automatically generate implicit discourse ...
arXiv:2205.02090v1 fatcat:2rncuc3w3raa7fplqftgwho674
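Casting discourse dependency identification as sequence labelling means assigning each EDU a tag from which the tree can be reconstructed. The head-offset scheme below is a hypothetical illustration of that reduction; the paper's actual label set and decoding procedure may differ.

```python
# Toy head-offset labelling scheme for discourse dependency parsing:
# each EDU is tagged with the relative position of its head
# ("ROOT" marks the tree's root). This is one plausible encoding,
# not necessarily the one used by Zhou & Feng.

def encode(heads):
    """heads[i] = index of EDU i's head, or -1 for the root."""
    return ["ROOT" if h == -1 else str(h - i) for i, h in enumerate(heads)]

def decode(labels):
    """Recover head indices from the offset labels."""
    return [-1 if lab == "ROOT" else i + int(lab) for i, lab in enumerate(labels)]

# A 4-EDU document: EDU 1 is the root; EDUs 0 and 2 attach to 1; EDU 3 to 2.
heads = [1, -1, 1, 2]
labels = encode(heads)
print(labels)  # ['1', 'ROOT', '-1', '-1']
assert decode(labels) == heads  # lossless round trip
```

With such an encoding, any off-the-shelf sequence tagger (e.g. a BiLSTM or Transformer tagger over EDU representations) can predict the tree, which is what makes the sequence-labelling framing attractive.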

Improved Document Modelling with a Neural Discourse Parser [article]

Fajri Koto, Jey Han Lau, Timothy Baldwin
2019 arXiv   pre-print
Despite the success of attention-based neural models for natural language generation and classification tasks, they are unable to capture the discourse structure of larger documents. ... For abstractive summarization, for instance, conventional neural models simply match source documents and the summary in a latent space without explicit representation of text structure or relations. ... The formulation of these shallow features (nuclearity and relation scores) is inspired by Ono et al. (1994), who propose a number of ways to score an EDU based on the RST tree structure. ...
arXiv:1911.06919v1 fatcat:zhd6e2rk4bf7tialzow5kcu754

Recursive Deep Models for Discourse Parsing

Jiwei Li, Rumeng Li, Eduard Hovy
2014 Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP)  
In this paper, we propose a recursive model for discourse parsing that jointly models distributed representations for clauses, sentences, and entire discourses. ... Text-level discourse parsing remains a challenge: most approaches employ features that fail to capture the intentional, semantic, and syntactic aspects that govern discourse coherence. ... We describe an RST-style text-level discourse parser based on a neural network model. ...
doi:10.3115/v1/d14-1220 dblp:conf/emnlp/LiLH14 fatcat:fghdznpb6zay5b3ex7eeivbyqq
Showing results 1–15 of 2,444