Improving Implicit Discourse Relation Classification by Modeling Inter-dependencies of Discourse Units in a Paragraph
2018
Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long Papers)
With the goal of improving implicit discourse relation classification, we introduce a paragraph-level neural network that models inter-dependencies between discourse units as well as discourse relation continuity and patterns, and predicts a sequence of discourse relations in a paragraph. ...
Acknowledgments We acknowledge the support of NVIDIA Corporation for their donation of one GeForce GTX TITAN X GPU used for this research. ...
doi:10.18653/v1/n18-1013
dblp:conf/naacl/DaiH18
fatcat:bgltndzrfzdjjcz2glxepp47pi
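As a rough illustration of the paragraph-level model described in the snippet above (not the authors' released code), the sketch below encodes each discourse unit, runs a BiLSTM over the unit sequence so every unit sees its paragraph context, and classifies each adjacent unit pair into a relation sense. The class name, layer sizes, and the mean-pooling unit encoder are assumptions.

```python
import torch
import torch.nn as nn

class ParagraphRelationTagger(nn.Module):
    """Sketch: encode discourse units, contextualize them across the
    paragraph, and predict one relation label per adjacent unit pair."""

    def __init__(self, vocab_size, emb_dim=100, hidden=128, n_relations=4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        # Unit-level encoder: mean-pooled word embeddings (assumption).
        self.unit_proj = nn.Linear(emb_dim, hidden)
        # Paragraph-level BiLSTM over the sequence of discourse units.
        self.para_lstm = nn.LSTM(hidden, hidden, batch_first=True,
                                 bidirectional=True)
        # Classifier over concatenated representations of units i and i+1.
        self.classifier = nn.Linear(4 * hidden, n_relations)

    def forward(self, unit_token_ids):
        # unit_token_ids: (batch, n_units, max_tokens) of word ids.
        emb = self.embed(unit_token_ids)                         # (B, U, T, E)
        unit_vec = torch.relu(self.unit_proj(emb.mean(dim=2)))   # (B, U, H)
        ctx, _ = self.para_lstm(unit_vec)                        # (B, U, 2H)
        pairs = torch.cat([ctx[:, :-1], ctx[:, 1:]], dim=-1)     # (B, U-1, 4H)
        return self.classifier(pairs)          # relation logits per pair

# Toy usage: a paragraph of 5 discourse units, 12 tokens each.
model = ParagraphRelationTagger(vocab_size=1000)
tokens = torch.randint(1, 1000, (1, 5, 12))
print(model(tokens).shape)  # torch.Size([1, 4, 4]) -> 4 pairs, 4 senses
```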
Improving Implicit Discourse Relation Classification by Modeling Inter-dependencies of Discourse Units in a Paragraph
[article]
2018
arXiv
pre-print
With the goal of improving implicit discourse relation classification, we introduce a paragraph-level neural network that models inter-dependencies between discourse units as well as discourse relation continuity and patterns, and predicts a sequence of discourse relations in a paragraph. ...
Acknowledgments We acknowledge the support of NVIDIA Corporation for their donation of one GeForce GTX TITAN X GPU used for this research. ...
arXiv:1804.05918v1
fatcat:dssi7ffg5re5nktcoay3fbq3wa
A Survey of Implicit Discourse Relation Recognition
[article]
2022
arXiv
pre-print
The task of implicit discourse relation recognition (IDRR) is to detect the implicit relation between two text segments without a connective and to classify its sense. ...
Finally, we discuss future research directions for discourse relation analysis. ...
Dai and Huang [18] proposed a paragraph-level RNN model that takes a sequence of discourse units as input and predicts a sequence of implicit discourse relations in a paragraph. ...
arXiv:2203.02982v1
fatcat:ubublxw2fnfdpexgw4jslj76tm
A Regularization Approach for Incorporating Event Knowledge and Coreference Relations into Neural Discourse Parsing
2019
Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP)
We argue that external commonsense knowledge and linguistic constraints need to be incorporated into neural network models for mitigating data sparsity issues and further improving the performance of discourse ...
Experiments show that our knowledge regularization approach outperforms all previous systems on the benchmark dataset PDTB for discourse parsing. ...
Base Model: The base model processes one paragraph, containing a sequence of discourse units, at a time, and predicts a sequence of discourse relations (both implicit and explicit relations) with one relation ...
doi:10.18653/v1/d19-1295
dblp:conf/emnlp/DaiH19
fatcat:t5sla34azbh7rhxsv6mu7avhre
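The snippet above only states that external knowledge and linguistic constraints are incorporated through regularization; one generic way to express that idea (an assumption on my part, not the paper's exact formulation) is to add a penalty that pulls the predicted relation distribution toward a knowledge-derived prior.

```python
import torch
import torch.nn.functional as F

def regularized_loss(logits, gold, knowledge_prior, lam=0.1):
    """Cross-entropy plus a KL penalty toward a knowledge-derived prior.

    logits:          (N, n_relations) model scores
    gold:            (N,) gold relation indices
    knowledge_prior: (N, n_relations) soft labels suggested by external
                     knowledge (e.g. event pairs, coreference); assumption.
    lam:             weight of the regularizer (assumption).
    """
    ce = F.cross_entropy(logits, gold)
    log_probs = F.log_softmax(logits, dim=-1)
    # KL(prior || model) encourages predictions consistent with the knowledge.
    kl = F.kl_div(log_probs, knowledge_prior, reduction="batchmean")
    return ce + lam * kl

# Toy usage.
logits = torch.randn(8, 4, requires_grad=True)
gold = torch.randint(0, 4, (8,))
prior = torch.softmax(torch.randn(8, 4), dim=-1)
loss = regularized_loss(logits, gold, prior)
loss.backward()
print(float(loss))
```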
DRTS Parsing with Structure-Aware Encoding and Decoding
[article]
2020
arXiv
pre-print
State-of-the-art performance can be achieved by a neural sequence-to-sequence model, treating the tree construction as an incremental sequence generation problem. ...
In this work, we propose a structure-aware model at both the encoder and decoder phases to integrate the structural information, where a graph attention network (GAT) is exploited for effectively modeling ...
Acknowledgments We thank all reviewers for the valuable comments, which greatly help to improve the paper. ...
arXiv:2005.06901v1
fatcat:a2pdcab4mff5ngwjw56g44p7oa
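To make the GAT reference above concrete, here is a minimal single-head graph attention layer in the standard GAT style; it is a sketch of the general mechanism, not the paper's encoder, and the LeakyReLU slope, the tanh output nonlinearity, and the toy graph are assumptions.

```python
import numpy as np

def gat_layer(h, adj, W, a, leaky=0.2):
    """Single-head graph attention layer (sketch of the GAT idea).

    h:   (N, F_in)     node features
    adj: (N, N)        adjacency matrix with self-loops (1 = edge)
    W:   (F_in, F_out) shared linear transform
    a:   (2 * F_out,)  attention vector
    """
    z = h @ W                                        # (N, F_out)
    N = z.shape[0]
    # Pairwise attention logits e_ij = LeakyReLU(a^T [z_i ; z_j]).
    e = np.empty((N, N))
    for i in range(N):
        for j in range(N):
            e[i, j] = a @ np.concatenate([z[i], z[j]])
    e = np.where(e > 0, e, leaky * e)
    # Mask non-edges, then softmax over each node's neighbourhood.
    e = np.where(adj > 0, e, -1e9)
    alpha = np.exp(e - e.max(axis=1, keepdims=True))
    alpha /= alpha.sum(axis=1, keepdims=True)
    return np.tanh(alpha @ z)                        # (N, F_out)

# Toy usage: 4 nodes in a small chain-shaped graph with self-loops.
rng = np.random.default_rng(0)
h = rng.normal(size=(4, 8))
adj = np.eye(4) + np.diag(np.ones(3), 1) + np.diag(np.ones(3), -1)
out = gat_layer(h, adj, rng.normal(size=(8, 6)), rng.normal(size=12))
print(out.shape)  # (4, 6)
```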
Implicit Discourse Relation Classification via Multi-Task Neural Networks
[article]
2016
arXiv
pre-print
Without discourse connectives, classifying implicit discourse relations is a challenging task and a bottleneck for building a practical discourse parser. ...
The experimental results on the PDTB implicit discourse relation classification task demonstrate that our model achieves significant gains over baseline systems. ...
Acknowledgments We thank all the anonymous reviewers for their insightful comments on this paper. ...
arXiv:1603.02776v1
fatcat:pa3zrmscm5atvahspahatalpkq
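A minimal sketch of the hard-parameter-sharing setup that multi-task papers of this kind typically use: one shared argument-pair encoder with separate heads for the main relation-sense task and an auxiliary task. The choice of auxiliary task (connective prediction), the sizes, and the loss weight are illustrative assumptions, not the paper's exact design.

```python
import torch
import torch.nn as nn

class SharedEncoderMTL(nn.Module):
    """Sketch of hard parameter sharing: one shared argument-pair encoder,
    separate output heads per task (tasks and sizes are assumptions)."""

    def __init__(self, in_dim=300, hidden=128, n_senses=4, n_connectives=100):
        super().__init__()
        self.shared = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU())
        self.sense_head = nn.Linear(hidden, n_senses)             # main task
        self.connective_head = nn.Linear(hidden, n_connectives)   # auxiliary

    def forward(self, pair_features):
        z = self.shared(pair_features)
        return self.sense_head(z), self.connective_head(z)

# Toy training step: weighted sum of the two task losses.
model = SharedEncoderMTL()
x = torch.randn(16, 300)
sense_gold = torch.randint(0, 4, (16,))
conn_gold = torch.randint(0, 100, (16,))
sense_logits, conn_logits = model(x)
loss = nn.functional.cross_entropy(sense_logits, sense_gold) \
     + 0.5 * nn.functional.cross_entropy(conn_logits, conn_gold)
loss.backward()
print(float(loss))
```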
On the Importance of Word and Sentence Representation Learning in Implicit Discourse Relation Classification
[article]
2020
arXiv
pre-print
We argue that a powerful contextualized representation module, a bilateral multi-perspective matching module, and a global information fusion module are all important to implicit discourse analysis. ...
Implicit discourse relation classification is one of the most difficult parts of shallow discourse parsing, as relation prediction without explicit connectives requires language understanding at ...
Lan, 2015], implicit discourse relation recognition still remains a challenge. ...
arXiv:2004.12617v2
fatcat:msw4i3es5rcjfo5m5jqbtj36me
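The snippet above names a bilateral multi-perspective matching module; the core operation in that family of models is a multi-perspective cosine match, sketched below. The dimensionality, number of perspectives, and function name are assumptions.

```python
import numpy as np

def multi_perspective_match(v1, v2, W, eps=1e-8):
    """Multi-perspective cosine matching of two vectors (sketch).

    v1, v2: (d,)   vectors, e.g. contextualized states of the two arguments
    W:      (k, d) one learnable weight vector per perspective
    Returns a (k,) vector: one cosine similarity per perspective.
    """
    a = W * v1          # (k, d): element-wise reweighting per perspective
    b = W * v2
    num = (a * b).sum(axis=1)
    den = np.linalg.norm(a, axis=1) * np.linalg.norm(b, axis=1) + eps
    return num / den

# Toy usage with d=8 dimensions and k=5 perspectives.
rng = np.random.default_rng(0)
m = multi_perspective_match(rng.normal(size=8), rng.normal(size=8),
                            rng.normal(size=(5, 8)))
print(m.shape)  # (5,)
```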
Task-Level Curriculum Learning for Non-Autoregressive Neural Machine Translation
2020
Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence
We call our method task-level curriculum learning for NAT (TCL-NAT). ...
Since AT and NAT can share model structure and AT is an easier task than NAT due to the explicit dependency on previous target-side tokens, a natural idea is to gradually shift the model training from ...
doi:10.24963/ijcai.2020/530
dblp:conf/ijcai/LiuOSJ20
fatcat:srxploensnb5jkftojqjpgdfjq
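The snippet describes gradually shifting training from the easier AT objective to NAT. A minimal schedule of that kind might look like the following; the warm-up length, shift length, and linear shape are assumptions, not the paper's reported settings.

```python
import random

def nat_fraction(step, warmup_steps=10_000, shift_steps=40_000):
    """Fraction of batches trained with the NAT objective at a given step.
    Pure-AT warm-up, then a linear shift toward pure NAT (shape assumed)."""
    if step < warmup_steps:
        return 0.0
    return min(1.0, (step - warmup_steps) / shift_steps)

def pick_objective(step):
    """Sample AT or NAT for the current batch according to the schedule."""
    return "NAT" if random.random() < nat_fraction(step) else "AT"

# Toy usage: show how the mix changes over training.
for step in (0, 10_000, 30_000, 50_000, 80_000):
    print(step, round(nat_fraction(step), 2), pick_objective(step))
```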
Table of Contents
2021
IEEE/ACM Transactions on Audio Speech and Language Processing
Zhu. Learning Context-Aware Convolutional Filters for Implicit Discourse Relation Classification ...
Zhang. Corpus-Aware Graph Aggregation Network for Sequence Labeling ... H. Chen, Q. Ma, L. Yu, Z. Lin, and J. ...
doi:10.1109/taslp.2021.3137066
fatcat:ocit27xwlbagtjdyc652yws4xa
Dialogue Act Recognition via CRF-Attentive Structured Network
[article]
2017
arXiv
pre-print
Dialogue Act Recognition (DAR) is a challenging problem in dialogue interpretation, which aims to attach semantic labels to utterances and characterize the speaker's intention. ...
dependencies. ...
It is a natural choice to assign a label to each element in the sequence via a linear-chain CRF, which enables us to model dependencies among labels. ...
arXiv:1711.05568v1
fatcat:kifitwaz5bbetgdue2cm33u2bq
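The snippet above proposes labeling each utterance via a linear-chain CRF to model dependencies among labels. Decoding such a model reduces to Viterbi search over emission and transition scores, sketched below with toy sizes; in the actual model the scores would come from the neural network.

```python
import numpy as np

def viterbi_decode(emissions, transitions):
    """Most likely label sequence under a linear-chain CRF (sketch).

    emissions:   (T, L) per-position label scores from the network
    transitions: (L, L) score of moving from label i to label j
    """
    T, L = emissions.shape
    score = emissions[0].copy()          # best score ending in each label
    backptr = np.zeros((T, L), dtype=int)
    for t in range(1, T):
        # candidate[i, j] = best path ending in i, then transitioning to j.
        candidate = score[:, None] + transitions + emissions[t][None, :]
        backptr[t] = candidate.argmax(axis=0)
        score = candidate.max(axis=0)
    # Follow back-pointers from the best final label.
    best = [int(score.argmax())]
    for t in range(T - 1, 0, -1):
        best.append(int(backptr[t, best[-1]]))
    return best[::-1]

# Toy usage: 4 utterances, 3 dialogue-act labels.
rng = np.random.default_rng(0)
print(viterbi_decode(rng.normal(size=(4, 3)), rng.normal(size=(3, 3))))
```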
Implicit Discourse Relation Identification for Open-domain Dialogues
2019
Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics
We firstly propose a method to automatically extract the implicit discourse relation argument pairs and labels from a dataset of dialogic turns, resulting in a novel corpus of discourse relation pairs; ...
Discourse relation identification has been an active area of research for many years, and the challenge of identifying implicit relations remains largely unsolved, especially in the context of ...
To our knowledge there is no English dialogue-based corpus with implicit discourse relation labels; as such, research specifically targeting a discourse relation identification model for social open-domain ...
doi:10.18653/v1/p19-1065
dblp:conf/acl/MaBWCW19
fatcat:exvas72csvgpvkroek2kmy53l4
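The snippet says argument pairs and labels are extracted automatically from dialogic turns. A common distant-supervision recipe for this kind of data (an assumption about the details, not necessarily the authors' pipeline) is to match an explicit connective between two clauses, map it to a relation sense, and drop the connective to form an "implicit" pair.

```python
import re

# Small illustrative connective-to-sense map (assumption, not PDTB-complete).
CONNECTIVE_SENSE = {
    "because": "Contingency",
    "but": "Comparison",
    "then": "Temporal",
    "so": "Contingency",
}

def extract_implicit_pair(turn):
    """If a turn matches '<arg1>, <connective> <arg2>', return
    (arg1, arg2, sense) with the connective removed, else None."""
    pattern = r"^(.+?),\s*(%s)\s+(.+)$" % "|".join(CONNECTIVE_SENSE)
    m = re.match(pattern, turn, flags=re.IGNORECASE)
    if not m:
        return None
    arg1, connective, arg2 = m.group(1), m.group(2).lower(), m.group(3)
    return arg1.strip(), arg2.strip(), CONNECTIVE_SENSE[connective]

# Toy usage.
print(extract_implicit_pair("I stayed home, because it was raining hard."))
# ('I stayed home', 'it was raining hard.', 'Contingency')
```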
Working Memory-Driven Neural Networks with a Novel Knowledge Enhancement Paradigm for Implicit Discourse Relation Recognition
2020
PROCEEDINGS OF THE THIRTIETH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE AND THE TWENTY-EIGHTH INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE
Recognizing implicit discourse relation is a challenging task in discourse analysis, which aims to understand and infer the latent relations between two discourse arguments, such as temporal, comparison ...
Most of the present models largely focus on learning-based methods that utilize only intra-sentence textual information to identify discourse relations, ignoring the wider contexts beyond the discourse ...
Acknowledgments We thank the anonymous reviewers for their valuable feedback. Our work is supported by the National Natural Science ...
doi:10.1609/aaai.v34i05.6287
fatcat:smjklclp2ncibbmqashawugnfe
DCR-Net: A Deep Co-Interactive Relation Network for Joint Dialog Act Recognition and Sentiment Classification
2020
PROCEEDINGS OF THE THIRTIETH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE AND THE TWENTY-EIGHTH INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE
To address this problem, we propose a Deep Co-Interactive Relation Network (DCR-Net) to explicitly consider the cross-impact and model the interaction between the two tasks by introducing a co-interactive ...
Comprehensive analysis empirically verifies the effectiveness of explicitly modeling the relation between the two tasks and the multi-step interaction mechanism. ...
Acknowledgments We thank the anonymous reviewers for their helpful comments and suggestions. ...
doi:10.1609/aaai.v34i05.6391
fatcat:q5gsvl4tyzaoze45xqkcthf5am
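As a rough sketch of the co-interactive idea in the entry above, the layer below lets the dialog-act and sentiment representations of a dialogue attend over each other and fuses the result back into each stream. The attention form, fusion, and sizes are assumptions rather than the paper's exact layer.

```python
import torch
import torch.nn as nn

class CoInteractiveLayer(nn.Module):
    """Sketch of one co-interaction step between dialog-act and sentiment
    utterance representations (attention form and fusion are assumptions)."""

    def __init__(self, dim=128):
        super().__init__()
        self.act_fuse = nn.Linear(2 * dim, dim)
        self.sent_fuse = nn.Linear(2 * dim, dim)

    @staticmethod
    def attend(query, keys):
        # Scaled dot-product attention of each query position over keys.
        scores = query @ keys.transpose(-1, -2) / keys.size(-1) ** 0.5
        return torch.softmax(scores, dim=-1) @ keys

    def forward(self, act_repr, sent_repr):
        # act_repr, sent_repr: (batch, n_utterances, dim)
        act_ctx = self.attend(act_repr, sent_repr)
        sent_ctx = self.attend(sent_repr, act_repr)
        new_act = torch.relu(self.act_fuse(torch.cat([act_repr, act_ctx], -1)))
        new_sent = torch.relu(self.sent_fuse(torch.cat([sent_repr, sent_ctx], -1)))
        return new_act, new_sent

# Toy usage: a dialogue of 6 utterances.
layer = CoInteractiveLayer()
a, s = layer(torch.randn(1, 6, 128), torch.randn(1, 6, 128))
print(a.shape, s.shape)  # torch.Size([1, 6, 128]) for each stream
```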
DCR-Net: A Deep Co-Interactive Relation Network for Joint Dialog Act Recognition and Sentiment Classification
[article]
2020
arXiv
pre-print
To address this problem, we propose a Deep Co-Interactive Relation Network (DCR-Net) to explicitly consider the cross-impact and model the interaction between the two tasks by introducing a co-interactive ...
Comprehensive analysis empirically verifies the effectiveness of explicitly modeling the relation between the two tasks and the multi-step interaction mechanism. ...
Acknowledgments We thank the anonymous reviewers for their helpful comments and suggestions. ...
arXiv:2008.06914v1
fatcat:wyhdzljs3zd3lj7dv4karuwjxu
Multi-Task Learning in Natural Language Processing: An Overview
[article]
2021
arXiv
pre-print
Then we present optimization techniques on loss construction, data sampling, and task scheduling to properly train a multi-task model. ...
In recent years, Multi-Task Learning (MTL), which leverages useful information from related tasks to achieve simultaneous performance improvements on multiple related tasks, has been used to handle these ...
and such connectives are used as implicit relation labels ...
arXiv:2109.09138v1
fatcat:hlgzjykuvzczzmsgnl32w5qo5q
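The survey snippet mentions data sampling and task scheduling for multi-task training. One common scheme is temperature-scaled proportional sampling over tasks, sketched below; the task names, dataset sizes, and temperature are illustrative assumptions.

```python
import random

def task_sampling_probs(dataset_sizes, temperature=0.5):
    """Temperature-scaled proportional sampling over tasks (sketch).
    temperature < 1 up-samples small tasks; the value 0.5 is an assumption."""
    scaled = {t: n ** temperature for t, n in dataset_sizes.items()}
    total = sum(scaled.values())
    return {t: s / total for t, s in scaled.items()}

def sample_task(probs):
    """Draw the task whose batch is trained next."""
    return random.choices(list(probs), weights=list(probs.values()), k=1)[0]

# Toy usage: a large explicit-relation task and a small implicit one.
sizes = {"explicit": 100_000, "implicit": 12_000, "connective": 40_000}
probs = task_sampling_probs(sizes)
print({t: round(p, 3) for t, p in probs.items()})
print(sample_task(probs))
```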
Showing results 1 — 15 out of 11,060 results