18 Hits in 1.9 sec

Making Neural QA as Simple as Possible but not Simpler

Dirk Weissenborn, Georg Wiese, Laura Seiffe
2017 Proceedings of the 21st Conference on Computational Natural Language Learning (CoNLL 2017)  
Recent development of large-scale question answering (QA) datasets triggered a substantial amount of research into end-to-end neural architectures for QA.  ...  We find that there are two ingredients necessary for building a high-performing neural QA system: first, the awareness of question words while processing the context and second, a composition function  ...  Acknowledgments We thank Sebastian Riedel, Philippe Thomas, Leonhard Hennig and Omer Levy for comments on an early draft of this work as well as the anonymous reviewers for their insightful comments.  ... 
doi:10.18653/v1/k17-1028 dblp:conf/conll/WeissenbornWS17 fatcat:wg6y55lkfvcm7j5rxthcwss24u

Making Neural QA as Simple as Possible but not Simpler [article]

Dirk Weissenborn and Georg Wiese and Laura Seiffe
2017 arXiv   pre-print
Recent development of large-scale question answering (QA) datasets triggered a substantial amount of research into end-to-end neural architectures for QA.  ...  We find that there are two ingredients necessary for building a high-performing neural QA system: first, the awareness of question words while processing the context and second, a composition function  ...  Acknowledgments We thank Sebastian Riedel, Philippe Thomas, Leonhard Hennig and Omer Levy for comments on an early draft of this work as well as the anonymous reviewers for their insightful comments.  ... 
arXiv:1703.04816v3 fatcat:d46qtwxovjf2znm7kri553m63e

Question Answering by Reasoning Across Documents with Graph Convolutional Networks [article]

Nicola De Cao, Wilker Aziz, Ivan Titov
2019 arXiv   pre-print
Our Entity-GCN method is scalable and compact, and it achieves state-of-the-art results on a multi-document question answering dataset, WikiHop (Welbl et al., 2018).  ...  We introduce a neural model which integrates and reasons relying on information spread within documents and across multiple documents. We frame it as an inference problem on a graph.  ...  This project is supported by SAP Innovation Center Network, ERC Starting Grant BroadSem (678254) and the Dutch Organization for Scientific Research (NWO) VIDI 639.022.518.  ... 
arXiv:1808.09920v3 fatcat:pdd4mc5tkjgezeduqouky2yrba

Question Answering by Reasoning Across Documents with Graph Convolutional Networks

Nicola De Cao, Wilker Aziz, Ivan Titov
2019 Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics (NAACL)  
Our Entity-GCN method is scalable and compact, and it achieves state-of-the-art results on a multi-document question answering dataset, WIKIHOP (Welbl et al., 2018).  ...  We introduce a neural model which integrates and reasons relying on information spread within documents and across multiple documents. We frame it as an inference problem on a graph.  ...  These systems, given a text and a question, need to answer the query relying on the given document.  ... 
doi:10.18653/v1/n19-1240 dblp:conf/naacl/CaoAT19 fatcat:7jdqxkfhgnetdcc5mjtlmji4xy

Jack the Reader – A Machine Reading Framework

Dirk Weissenborn, Pasquale Minervini, Isabelle Augenstein, Johannes Welbl, Tim Rocktäschel, Matko Bošnjak, Jeff Mitchell, Thomas Demeester, Tim Dettmers, Pontus Stenetorp, Sebastian Riedel
2018 Proceedings of ACL 2018, System Demonstrations  
For example, in Question Answering, the supporting text can be newswire or Wikipedia articles; in Natural Language Inference, premises can be seen as the supporting text and hypotheses as questions.  ...  Providing a set of useful primitives operating in a single framework of related tasks would allow for expressive modelling, and easier model comparison and replication.  ...  The architecture is built by a sequence of modular neural building blocks, in short modules.  ... 
doi:10.18653/v1/p18-4005 dblp:conf/acl/WeissenbornMAWR18 fatcat:or5rhfs3grfsvkulm3kkatddq4

Jack the Reader - A Machine Reading Framework [article]

Dirk Weissenborn, Pasquale Minervini, Tim Dettmers, Isabelle Augenstein, Johannes Welbl, Tim Rocktäschel, Matko Bošnjak, Jeff Mitchell, Thomas Demeester, Pontus Stenetorp, Sebastian Riedel
2018 arXiv   pre-print
For example, in Question Answering, the supporting text can be newswire or Wikipedia articles; in Natural Language Inference, premises can be seen as the supporting text and hypotheses as questions.  ...  Providing a set of useful primitives operating in a single framework of related tasks would allow for expressive modelling, and easier model comparison and replication.  ...  The architecture is built by a sequence of modular neural building blocks, in short modules.  ... 
arXiv:1806.08727v1 fatcat:p4r52au3zzewvaop66sy4lbyxu

End-To-End Neural Network for Paraphrased Question Answering Architecture with Single Supporting Line in Bangla Language

Md. Mohsin Uddin, Nazmus Sakib Patwary, Md. Mohaiminul Hasan, Tanvir Rahman, Mir Tanveer Islam (East West University, Dhaka, Bangladesh)
2020 International Journal of Future Computer and Communication  
To predict the appropriate answer, the model is trained with question-answer pairs and a supporting line.  ...  miniature subtasks to accomplish a whole AI system capable of answering and reasoning over complicated and long questions through understanding a paragraph.  ...  Another paper [4] where researchers proposed an efficient neural model denoted as FastQA for question answering which outperforms existing models over very popular recent datasets named SQuAD, [18]  ... 
doi:10.18178/ijfcc.2020.9.3.565 fatcat:gp726p3afnczboj6ihgmpzhone

Densely Connected Attention Propagation for Reading Comprehension [article]

Yi Tay, Luu Anh Tuan, Siu Cheung Hui, Jian Su
2019 arXiv   pre-print
We propose DecaProp (Densely Connected Attention Propagation), a new densely connected neural architecture for reading comprehension (RC). There are two distinct characteristics of our model.  ...  To this end, we propose novel Bidirectional Attention Connectors (BAC) for efficiently forging connections throughout the network. We conduct extensive experiments on four challenging RC benchmarks.  ...  The authors would like to thank the anonymous reviewers of NeurIPS 2018 for their valuable time and feedback!  ... 
arXiv:1811.04210v2 fatcat:mysf6uphfrc67fftflcpp2v62u

A3Net:Adversarial-and-Attention Network for Machine Reading Comprehension [chapter]

Jiuniu Wang, Xingyu Fu, Guangluan Xu, Yirong Wu, Ziyan Chen, Yang Wei, Li Jin
2018 Lecture Notes in Computer Science  
Second, we propose a multi-layer attention network utilizing three kinds of high-efficiency attention mechanisms.  ...  Multi-layer attention conducts interaction between question and passage within each layer, which contributes to reasonable representation and understanding of the model.  ...  For example, by randomly dropping units, dropout is widely used as a simple way to prevent neural networks from overfitting.  ... 
doi:10.1007/978-3-319-99495-6_6 fatcat:gw6btg53e5dchijyjtoww6gwfe

A Study of the Tasks and Models in Machine Reading Comprehension [article]

Chao Wang
2020 arXiv   pre-print
and complex-reasoning MRC tasks; 2) the architecture designs, attention mechanisms, and performance-boosting approaches for developing neural-network-based MRC models; 3) some recently proposed transfer  ...  To provide a survey on the existing tasks and models in Machine Reading Comprehension (MRC), this report reviews: 1) the dataset collection and performance evaluation of some representative simple-reasoning  ...  Architecture Designs for MRC Models This section introduces some representative architecture designs for MRC models, which cover both simple-reasoning and complex-reasoning MRC models.  ... 
arXiv:2001.08635v1 fatcat:yuc3fx4jjvhkxnbv4o6kerunve

Efficient and Robust Question Answering from Minimal Context over Documents

Sewon Min, Victor Zhong, Richard Socher, Caiming Xiong
2018 Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)  
Neural models for question answering (QA) over documents have achieved significant performance improvements.  ...  In this paper, we study the minimal context required to answer the question, and find that most questions in existing datasets can be answered with a small set of sentences.  ...  First, we studied the minimal context required to answer the question in existing datasets and found that most questions can be answered using a small set of sentences.  ... 
doi:10.18653/v1/p18-1160 dblp:conf/acl/SocherZXM18 fatcat:7fme4rqlmjgpngy5pdhxoxpg6u

New Vietnamese Corpus for Machine Reading Comprehension of Health News Articles

Kiet Van Nguyen, Tin Van Huynh, Duc-Vu Nguyen, Anh Gia-Tuan Nguyen, Ngan Luu-Thuy Nguyen
2022 ACM Transactions on Asian and Low-Resource Language Information Processing  
We introduce a process for creating a high-quality corpus for the Vietnamese machine reading comprehension task. Linguistically, our corpus accommodates diversity in question and answer types.  ...  Machine reading comprehension is a natural language understanding task where the computing system is required to read a text and then find the answer to a specific question posed by a human.  ...  Also, we thank Sang Thanh Tran and the annotator team for creating and revising the question-answer data.  ... 
doi:10.1145/3527631 fatcat:mgnypmkaavce7k6hyp4j7c5ce4

SubjQA: A Dataset for Subjectivity and Review Comprehension [article]

Johannes Bjerva, Nikita Bhutani, Behzad Golshan, Wang-Chiew Tan, Isabelle Augenstein
2020 arXiv   pre-print
For instance, a subjective question may or may not be associated with a subjective answer.  ...  We release an English QA dataset (SubjQA) based on customer reviews, containing subjectivity annotations for questions and answer spans across 6 distinct domains.  ...  Acknowledgements We are grateful to the Nordic Language Processing Laboratory (NLPL) for providing access to its supercluster infrastructure, and the anonymous reviewers for their helpful feedback.  ... 
arXiv:2004.14283v3 fatcat:szkah5oscbcd7mrlmk3pougadm

Efficient and Robust Question Answering from Minimal Context over Documents [article]

Sewon Min, Victor Zhong, Richard Socher, Caiming Xiong
2018 arXiv   pre-print
Neural models for question answering (QA) over documents have achieved significant performance improvements.  ...  In this paper, we study the minimal context required to answer the question, and find that most questions in existing datasets can be answered with a small set of sentences.  ...  Acknowledgments We thank the anonymous reviewers and the Salesforce Research team members for their thoughtful comments and discussions.  ... 
arXiv:1805.08092v1 fatcat:cgswcf2slzfifgkrtazhxegw3y

SRQA: Synthetic Reader for Factoid Question Answering

Jiuniu Wang, Wenjia Xu, Xingyu Fu, Yang Wei, Li Jin, Ziyan Chen, Guangluan Xu, Yirong Wu
2019 Knowledge-Based Systems  
We introduce a new model called SRQA, which means Synthetic Reader for Factoid Question Answering.  ...  The question answering system can answer questions from various fields and forms with deep neural networks, but it still lacks effective ways when facing multiple evidences.  ...  For example, by randomly dropping units, dropout [37] is widely used as a simple way to prevent neural networks from overfitting.  ... 
doi:10.1016/j.knosys.2019.105415 fatcat:agstiyiqgbeobdocsclhdtcqtu
Showing results 1 — 15 out of 18 results