
Neural Models for Reasoning over Multiple Mentions using Coreference [article]

Bhuwan Dhingra, Qiao Jin, Zhilin Yang, William W. Cohen, Ruslan Salakhutdinov
2018 arXiv   pre-print
The layer uses coreference annotations extracted from an external system to connect entity mentions belonging to the same cluster.  ...  Many problems in NLP require aggregating information from multiple mentions of the same entity which may be far apart in the text.  ...  One important form of reasoning for Question Answering (QA) models is the ability to aggregate information from multiple mentions of entities.  ... 
arXiv:1804.05922v1 fatcat:qftnsqqopjemxmveu4j5tbkxvy

Neural Models for Reasoning over Multiple Mentions Using Coreference

Bhuwan Dhingra, Qiao Jin, Zhilin Yang, William Cohen, Ruslan Salakhutdinov
2018 Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 2 (Short Papers)  
The layer uses coreference annotations extracted from an external system to connect entity mentions belonging to the same cluster.  ...  Many problems in NLP require aggregating information from multiple mentions of the same entity which may be far apart in the text.  ...  One important form of reasoning for Question Answering (QA) models is the ability to aggregate information from multiple mentions of entities.  ... 
doi:10.18653/v1/n18-2007 dblp:conf/naacl/DhingraJYCS18 fatcat:jvgd6dnsvjdu3n5fbmuvai7gie
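The Coref-GRU layer summarized above connects each mention to earlier mentions of the same entity using clusters from an external coreference system. A minimal sketch of that aggregation pattern follows; the class, layer sizes, and tensor layout are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

class CorefAggregationCell(nn.Module):
    """Toy sketch of the Coref-GRU idea: at each token, combine the sequential
    hidden state with the hidden state of the most recent coreferent mention
    (given by an external coreference system). Not the paper's code."""
    def __init__(self, input_dim, hidden_dim):
        super().__init__()
        self.cell = nn.GRUCell(input_dim, hidden_dim)
        self.mix = nn.Linear(2 * hidden_dim, hidden_dim)

    def forward(self, tokens, coref_prev):
        # tokens: (seq_len, input_dim); coref_prev[t] is the position of the
        # previous mention in the same coreference cluster, or -1 if none.
        h = torch.zeros(self.cell.hidden_size)
        states = []
        for t in range(tokens.size(0)):
            if coref_prev[t] >= 0:
                # merge the sequential state with the antecedent mention's state
                h_in = torch.tanh(self.mix(torch.cat([h, states[coref_prev[t]]])))
            else:
                h_in = h
            h = self.cell(tokens[t], h_in)
            states.append(h)
        return torch.stack(states)
```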

Coreference-Aware Dialogue Summarization [article]

Zhengyuan Liu, Ke Shi, Nancy F. Chen
2021 arXiv   pre-print
Therefore, in this work, we investigate different approaches to explicitly incorporate coreference information in neural abstractive dialogue summarization models to tackle the aforementioned challenges  ...  interlocutors and person mentions.  ...  Acknowledgments This research was supported by funding from the Institute for Infocomm Research (I2R) under A*STAR ARES, Singapore. We thank Ai Ti Aw and Minh Nguyen for insightful discussions.  ... 
arXiv:2106.08556v2 fatcat:mnzeos4bdjgareihflrcl76vce

End-to-end Neural Coreference Resolution [article]

Kenton Lee, Luheng He, Mike Lewis, Luke Zettlemoyer
2017 arXiv   pre-print
The key idea is to directly consider all spans in a document as potential mentions and learn distributions over possible antecedents for each.  ...  We introduce the first end-to-end coreference resolution model and show that it significantly outperforms all previous work without using a syntactic parser or hand-engineered mention detector.  ...  We also thank the UW NLP group for helpful conversations and comments on the work.  ... 
arXiv:1707.07045v2 fatcat:oddmnxlhubby5pgjbbem4qtmvy
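The abstract above describes the span-ranking recipe: treat every span as a candidate mention and learn a distribution over its possible antecedents (including "no antecedent"). The sketch below illustrates that scoring scheme under simplified assumptions (dense pairwise scoring, illustrative layer sizes); it is not the paper's model.

```python
import torch
import torch.nn as nn

class SpanRankingCoref(nn.Module):
    """Illustrative span-ranking scorer: s(i, j) = s_m(i) + s_m(j) + s_a(i, j)."""
    def __init__(self, span_dim, hidden_dim=150):
        super().__init__()
        self.mention_score = nn.Sequential(
            nn.Linear(span_dim, hidden_dim), nn.ReLU(), nn.Linear(hidden_dim, 1))
        self.antecedent_score = nn.Sequential(
            nn.Linear(2 * span_dim, hidden_dim), nn.ReLU(), nn.Linear(hidden_dim, 1))

    def forward(self, span_reprs):
        # span_reprs: (num_spans, span_dim), spans ordered by position in the text
        n = span_reprs.size(0)
        s_m = self.mention_score(span_reprs).squeeze(-1)   # mention scores, shape (n,)
        scores = torch.full((n, n + 1), float('-inf'))
        scores[:, 0] = 0.0                                 # dummy "no antecedent" option
        for i in range(n):
            for j in range(i):                             # antecedents precede the span
                pair = torch.cat([span_reprs[i], span_reprs[j]])
                s_a = self.antecedent_score(pair).squeeze(-1)
                scores[i, j + 1] = s_m[i] + s_m[j] + s_a
        # distribution over possible antecedents for each span
        return torch.softmax(scores, dim=-1)
```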

Question Answering by Reasoning Across Documents with Graph Convolutional Networks [article]

Nicola De Cao, Wilker Aziz, Ivan Titov
2019 arXiv   pre-print
We introduce a neural model which integrates and reasons relying on information spread within documents and across multiple documents. We frame it as an inference problem on a graph.  ...  Mentions of entities are nodes of this graph while edges encode relations between different mentions (e.g., within- and cross-document co-reference).  ...  Acknowledgments We would like to thank Johannes Welbl for helping to test our system on WIKIHOP.  ... 
arXiv:1808.09920v3 fatcat:pdd4mc5tkjgezeduqouky2yrba

Question Answering by Reasoning Across Documents with Graph Convolutional Networks

Nicola De Cao, Wilker Aziz, Ivan Titov
2019 Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers)  
We introduce a neural model which integrates and reasons relying on information spread within documents and across multiple documents. We frame it as an inference problem on a graph.  ...  Mentions of entities are nodes of this graph while edges encode relations between different mentions (e.g., within- and cross-document coreference).  ...  We also report baselines using GloVe instead of ELMo, with and without R-GCN. For the full model we report mean ±1 std over 5 runs. ... predicted by the coreference system (COREF).  ... 
doi:10.18653/v1/n19-1240 dblp:conf/naacl/CaoAT19 fatcat:7jdqxkfhgnetdcc5mjtlmji4xy
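Both versions of the paper above describe a graph whose nodes are entity mentions and whose typed edges include within- and cross-document coreference, processed with a relational GCN. A hedged sketch of one such relational message-passing layer follows; the edge types, shapes, and mean aggregation are simplifying assumptions rather than the published formulation.

```python
import torch
import torch.nn as nn

class RelationalGCNLayer(nn.Module):
    """Simplified R-GCN-style update over a mention graph. Each edge type
    (e.g. within-document coref, cross-document coref, same-document) gets
    its own transformation. Sketch only, not the paper's model."""
    def __init__(self, dim, num_relations):
        super().__init__()
        self.self_loop = nn.Linear(dim, dim)
        self.rel = nn.ModuleList(nn.Linear(dim, dim) for _ in range(num_relations))

    def forward(self, node_states, edges):
        # node_states: (num_mentions, dim)
        # edges: list of (src, dst, relation_id) tuples connecting mention nodes
        msgs = torch.zeros_like(node_states)
        counts = torch.zeros(node_states.size(0), 1)
        for src, dst, rel in edges:
            msgs[dst] = msgs[dst] + self.rel[rel](node_states[src])
            counts[dst] += 1
        msgs = msgs / counts.clamp(min=1)          # mean over incoming messages
        return torch.relu(self.self_loop(node_states) + msgs)
```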

Tracing Origins: Coreference-aware Machine Reading Comprehension [article]

Baorong Huang, Zhuosheng Zhang, Hai Zhao
2022 arXiv   pre-print
We use two strategies to fine-tune a pre-trained language model, namely, placing an additional encoder layer after a pre-trained language model to focus on the coreference mentions or constructing a relational  ...  language model, in order to highlight the coreference mentions of the entities that must be identified for coreference-intensive question answering in QUOREF, a relatively new dataset that is specifically  ...  Following the practice of prior work in handling multiple answers for the same question, we use the cross entropy to calculate the losses for each answer if the question has multiple answers: L_n = FC(E_prLM,  ... 
arXiv:2110.07961v2 fatcat:chszl5plc5cufivo4vojdbp74i
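The truncated loss in the snippet above sums a cross-entropy term per gold answer when a QUOREF question has several answers. Below is a minimal sketch of that pattern with illustrative function and variable names; the paper's exact heads and symbols are not reproduced here.

```python
import torch
import torch.nn.functional as F

def multi_answer_loss(start_logits, end_logits, answer_spans):
    """Average cross-entropy over every gold answer span of one question.
    start_logits / end_logits: (seq_len,) scores over token positions;
    answer_spans: list of (start_idx, end_idx) gold spans. Illustrative only."""
    losses = []
    for start, end in answer_spans:
        losses.append(
            F.cross_entropy(start_logits.unsqueeze(0), torch.tensor([start])) +
            F.cross_entropy(end_logits.unsqueeze(0), torch.tensor([end])))
    return torch.stack(losses).mean()
```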

Segmentation Approach for Coreference Resolution Task [article]

Aref Jafari, Ali Ghodsi
2020 arXiv   pre-print
In the proposed method, the BERT model has been used for encoding the documents, and a head network has been designed to capture the relations between the embedded tokens.  ...  The presented paper reports on an ongoing study of an idea that proposes a new approach to coreference resolution which can resolve all coreference mentions to a given mention in the document in  ...  Introduction Over the past decades, several models have been proposed for the coreference resolution task [1].  ... 
arXiv:2007.04301v1 fatcat:fejthwgs25gktaatihftzebnoy
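The snippet above describes encoding the document with BERT and adding a head network that relates the embedded tokens, so that all mentions coreferent with a given mention can be recovered in one pass. The sketch below is one plausible reading of that setup, phrased as per-token segmentation; the architecture details are assumptions, not the paper's design.

```python
import torch
import torch.nn as nn
from transformers import BertModel

class SegmentationCorefHead(nn.Module):
    """Sketch: BERT encodes the document; a head scores, per token, whether it
    belongs to a mention coreferent with a query mention. Illustrative only."""
    def __init__(self, model_name="bert-base-uncased"):
        super().__init__()
        self.encoder = BertModel.from_pretrained(model_name)
        hidden = self.encoder.config.hidden_size
        self.head = nn.Sequential(
            nn.Linear(2 * hidden, hidden), nn.ReLU(), nn.Linear(hidden, 1))

    def forward(self, input_ids, attention_mask, query_start, query_end):
        token_states = self.encoder(
            input_ids=input_ids, attention_mask=attention_mask).last_hidden_state
        # represent the query mention by mean-pooling its token states
        query = token_states[0, query_start:query_end + 1].mean(dim=0)
        query = query.expand(token_states.size(1), -1)
        # per-token logit: does this token lie inside a coreferent mention?
        return self.head(torch.cat([token_states[0], query], dim=-1)).squeeze(-1)
```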

Improving Coreference Resolution by Leveraging Entity-Centric Features with Graph Neural Networks and Second-order Inference [article]

Lu Liu, Zhenqiao Song, Xiaoqing Zheng
2020 arXiv   pre-print
One of the major challenges in coreference resolution is how to make use of entity-level features defined over clusters of mentions rather than mention pairs.  ...  We propose a graph neural network-based coreference resolution method that can capture the entity-centric information by encouraging the sharing of features across all mentions that probably refer to the  ...  The three most popular metrics for coreference resolution were used to evaluate our model: MUC, B³, and CEAF_φ4.  ... 
arXiv:2009.04639v1 fatcat:vg2qnjyhafgrlm3or4eh6spv4y
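One way to read the entity-centric idea above is to pool features over mentions that probably belong to the same cluster and feed the pooled entity feature back into each mention representation. The following sketch shows that pattern with soft cluster weights; it is an illustration, not the paper's GNN or its second-order inference.

```python
import torch
import torch.nn as nn

class EntityCentricRefiner(nn.Module):
    """Sketch: use pairwise coreference scores to pool an entity-level feature
    for each mention, then refine the mention representation with it."""
    def __init__(self, dim):
        super().__init__()
        self.update = nn.Linear(2 * dim, dim)

    def forward(self, mention_reprs, pair_logits):
        # mention_reprs: (n, dim); pair_logits: (n, n) coreference scores
        attn = torch.softmax(pair_logits, dim=-1)        # soft cluster membership
        entity_feats = attn @ mention_reprs              # (n, dim) pooled features
        return torch.tanh(self.update(torch.cat([mention_reprs, entity_feats], dim=-1)))
```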

Exploring Graph-structured Passage Representation for Multi-hop Reading Comprehension with Graph Neural Networks [article]

Linfeng Song, Zhiguo Wang, Mo Yu, Yue Zhang, Radu Florian, Daniel Gildea
2018 arXiv   pre-print
However, coreference is limited in providing information for rich inference. We introduce a new method for better connecting global evidence, which forms more complex graphs compared to DAGs.  ...  Previous work approximates global evidence with local coreference information, encoding coreference chains with DAG-styled GRU layers within a gated-attention reader.  ...  W_3 and b_3 are model parameters. A gated recurrent neural network is used to model the state transition process.  ... 
arXiv:1809.02040v1 fatcat:d2izcf5dvbgxjafoaxn6yjzzgu
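The fragment above mentions a gated recurrent network that models the state transition over the graph-structured passage (with parameters such as W_3 and b_3). A hedged sketch of one gated transition step over a dense adjacency matrix is shown below; the aggregation and shapes are assumptions for illustration.

```python
import torch
import torch.nn as nn

class GraphGRUStep(nn.Module):
    """Sketch: one gated state-transition step over a passage graph.
    adjacency is a dense (n, n) 0/1 matrix of edges between nodes."""
    def __init__(self, dim):
        super().__init__()
        self.cell = nn.GRUCell(dim, dim)

    def forward(self, node_states, adjacency):
        degree = adjacency.sum(dim=-1, keepdim=True).clamp(min=1)
        neighbour_msg = (adjacency @ node_states) / degree   # mean over neighbours
        # the GRU's gates decide how much of the neighbour message updates each node
        return self.cell(neighbour_msg, node_states)
```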

End-to-end Neural Coreference Resolution

Kenton Lee, Luheng He, Mike Lewis, Luke Zettlemoyer
2017 Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing  
The key idea is to directly consider all spans in a document as potential mentions and learn distributions over possible antecedents for each.  ...  We introduce the first end-to-end coreference resolution model and show that it significantly outperforms all previous work without using a syntactic parser or hand-engineered mention detector.  ...  We also thank the UW NLP group for helpful conversations and comments on the work.  ... 
doi:10.18653/v1/d17-1018 dblp:conf/emnlp/LeeHLZ17 fatcat:6sovxp7qg5fkzf56ejpxd6oomy

Coreference Resolution System Not Only for Czech

Michal Novák
2017 Conference on Theory and Practice of Information Technologies  
The system implements a sequence of mention ranking models specialized at particular types of coreferential expressions (relative, reflexive, personal pronouns etc.).  ...  The paper introduces Treex CR, a coreference resolution (CR) system not only for Czech. As its name suggests, it has been implemented as an integral part of the Treex NLP framework.  ...  This work has also been supported by, and has been using, language resources developed and/or stored and/or distributed by the LINDAT/CLARIN project No.  ... 
dblp:conf/itat/Novak17 fatcat:gje3kuoyozeszclzga5mrpmlve
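Treex CR chains mention-ranking models specialized per expression type. Purely as an illustration of what a single mention-ranking step does (score candidate antecedents plus a "no antecedent" option and pick the best), here is a small sketch; it is unrelated to Treex's actual implementation and feature set.

```python
import torch
import torch.nn as nn

class MentionRanker(nn.Module):
    """Sketch of a mention-ranking step: rank candidate antecedents of an anaphor
    (plus a 'no antecedent' option) and return the best-scoring one."""
    def __init__(self, feat_dim, hidden=64):
        super().__init__()
        self.scorer = nn.Sequential(
            nn.Linear(feat_dim, hidden), nn.ReLU(), nn.Linear(hidden, 1))
        self.no_antecedent = nn.Parameter(torch.zeros(1))

    def forward(self, pair_features):
        # pair_features: (num_candidates, feat_dim), one row per
        # anaphor-candidate pair (features depend on the expression type)
        scores = self.scorer(pair_features).squeeze(-1)
        scores = torch.cat([self.no_antecedent, scores])
        best = torch.argmax(scores).item()
        return None if best == 0 else best - 1   # index of the chosen antecedent
```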

Reasoning Over History: Context Aware Visual Dialog [article]

Muhammad A. Shah, Shikib Mehri, Tejas Srinivasan
2020 arXiv   pre-print
More specifically, it struggles with tasks that require reasoning over the dialog history, particularly coreference resolution.  ...  We extend the MAC network architecture with Context-aware Attention and Memory (CAM), which attends over control states in past dialog turns to determine the necessary reasoning operations for the current  ...  Visual coreference resolution requires both an ability to reason over coreferences in the dialog, as well as ground the entities from the language modality in the visual one.  ... 
arXiv:2011.00669v1 fatcat:ixchak7hafbofi4xvdo3g672pm
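The CAM extension described above attends over control states from earlier dialog turns to decide the reasoning operation for the current turn. The sketch below shows that attention-over-history pattern in isolation; the projection layers and merge step are assumptions, not the MAC/CAM equations.

```python
import torch
import torch.nn as nn

class ContextAwareControl(nn.Module):
    """Sketch: attend over control states saved from previous dialog turns
    to contextualize the current turn's control vector (CAM-style idea)."""
    def __init__(self, dim):
        super().__init__()
        self.query = nn.Linear(dim, dim)
        self.merge = nn.Linear(2 * dim, dim)

    def forward(self, current_control, past_controls):
        # current_control: (dim,); past_controls: (num_past_states, dim)
        scores = past_controls @ self.query(current_control)   # (num_past_states,)
        weights = torch.softmax(scores, dim=-1)
        history = weights @ past_controls                       # (dim,)
        return torch.tanh(self.merge(torch.cat([current_control, history])))
```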

Coreference Aware Representation Learning for Neural Named Entity Recognition

Zeyu Dai, Hongliang Fei, Ping Li
2019 Proceedings of the Twenty-Eighth International Joint Conference on Artificial Intelligence  
Recent neural network models have achieved state-of-the-art performance on the task of named entity recognition (NER).  ...  In this paper, we propose a novel approach to learn coreference-aware word representations for the NER task at the document level.  ...  the averaged F1-score and standard deviation over multiple trials.  ... 
doi:10.24963/ijcai.2019/687 dblp:conf/ijcai/DaiF019 fatcat:hy5vmi2uhbd6nnmkltwhjx5aeq

Triad-based Neural Network for Coreference Resolution [article]

Yuanliang Meng, Anna Rumshisky
2018 arXiv   pre-print
We propose a triad-based neural network system that generates affinity scores between entity mentions for coreference resolution.  ...  To our knowledge, this is the first neural network system to model mutual dependency of more than two members at mention level.  ...  Generally speaking, three types of models have been used for coreference resolution: pairwise models, mention ranking models, and entity-mention models.  ... 
arXiv:1809.06491v1 fatcat:soegd27n6rc2fge5kz4gvgwzri
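The triad-based system above scores groups of three mentions jointly, so that decisions about one pair can depend on a third mention. A simplified sketch of turning triad scores into pairwise affinities follows; the network shape and the averaging reduction are illustrative assumptions, not the paper's architecture.

```python
import torch
import torch.nn as nn
from itertools import combinations

class TriadScorer(nn.Module):
    """Sketch: score triads of mentions jointly, then reduce the triad scores
    to pairwise affinities by averaging over shared triads. Illustrative only."""
    def __init__(self, dim, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(3 * dim, hidden), nn.ReLU(), nn.Linear(hidden, 3))

    def forward(self, mentions):
        # mentions: (n, dim). Returns an (n, n) matrix of pairwise affinities.
        n = mentions.size(0)
        affinity = torch.zeros(n, n)
        counts = torch.zeros(n, n)
        for i, j, k in combinations(range(n), 3):
            triad = torch.cat([mentions[i], mentions[j], mentions[k]])
            s_ij, s_jk, s_ik = self.net(triad)   # one score per pair in the triad
            for (a, b), s in [((i, j), s_ij), ((j, k), s_jk), ((i, k), s_ik)]:
                affinity[a, b] += s
                counts[a, b] += 1
        return affinity / counts.clamp(min=1)
```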
Showing results 1 — 15 out of 2,086 results