1,844 Hits in 6.8 sec

Context Transformer with Stacked Pointer Networks for Conversational Question Answering over Knowledge Graphs [article]

Joan Plepi, Endri Kacupaj, Kuldeep Singh, Harsh Thakkar, Jens Lehmann
2021 arXiv   pre-print
Our framework consists of a stack of pointer networks as an extension of a context transformer model for parsing the input question and the dialog history.  ...  Neural semantic parsing approaches have been widely used for Question Answering (QA) systems over knowledge graphs.  ...  Moreover, given the knowledge graph embeddings, our stacked pointer networks select the relevant items from the knowledge graph depending on the conversational context.  ... 
arXiv:2103.07766v2 fatcat:m4dobg5ygfet3bqnaymezypaxm
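The "stacked pointer networks select the relevant items from the knowledge graph" step described in this abstract can be pictured as a dot-product attention over candidate item embeddings. The sketch below is a minimal, hypothetical illustration; the function name, dimensions, and scoring rule are assumptions for exposition, not the paper's actual implementation:

```python
import numpy as np

def pointer_select(context, item_embeddings):
    """One pointer-attention step: score every candidate item embedding
    against the context vector and normalize into a distribution."""
    scores = item_embeddings @ context      # dot-product score per item
    scores = scores - scores.max()          # subtract max for numerical stability
    weights = np.exp(scores)
    return weights / weights.sum()          # softmax over candidate items

rng = np.random.default_rng(0)
context = rng.normal(size=16)               # encodes question + dialog history
items = rng.normal(size=(5, 16))            # 5 candidate knowledge-graph items
probs = pointer_select(context, items)
selected = int(probs.argmax())              # index of the item pointed at
```

In a real decoder this step would be repeated per output slot (hence "stacked"), with the context vector updated between steps.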

ConfNet2Seq: Full Length Answer Generation from Spoken Questions [article]

Vaishali Pal, Manish Shrivastava, Laurent Besacier
2020 arXiv   pre-print
This is, to the best of our knowledge, the first attempt at generating full-length natural answers from a graph input (confusion network).  ...  These desired responses are in the form of full-length natural answers generated over facts retrieved from a knowledge source.  ...  We have used a pointer network over ASR graphs (confusion networks) and show that it gives results comparable to the model trained on the best hypothesis.  ... 
arXiv:2006.05163v2 fatcat:jtrs337uv5dk7pucsarueupzla

Conversational Question Answering over Knowledge Graphs with Transformer and Graph Attention Networks [article]

Endri Kacupaj, Joan Plepi, Kuldeep Singh, Harsh Thakkar, Jens Lehmann, Maria Maleshkova
2021 arXiv   pre-print
This paper addresses the task of (complex) conversational question answering over a knowledge graph.  ...  For this task, we propose LASAGNE (muLti-task semAntic parSing with trAnsformer and Graph atteNtion nEtworks).  ...  Conclusions In this article, we focus on complex question answering over a large-scale knowledge graph containing conversational context.  ... 
arXiv:2104.01569v2 fatcat:nksqgjhp45cpbck56i6ld6bgrm

Empathetic response generation through Graph-based Multi-hop Reasoning on Emotional Causality

Jiashuo Wang, Wenjie Li, Peiqin Lin, Feiteng Mu
2021 Knowledge-Based Systems  
Then, we propose a novel graph-based model with multi-hop reasoning to model the emotional causality of the empathetic conversation.  ...  Finally, we demonstrate the effectiveness of our model on EMPATHETICDIALOGUES in comparison with several competitive models.  ...  Specifically, KagNet [31], a model based on graph convolutional networks and LSTMs with a hierarchical path-based attention mechanism, is proposed for the task of commonsense question answering.  ... 
doi:10.1016/j.knosys.2021.107547 fatcat:lrwfsmfajjdydgdz4kbqfjyrsy

Recent Advances in Deep Learning Based Dialogue Systems: A Systematic Survey [article]

Jinjie Ni, Tom Young, Vlad Pandelea, Fuzhao Xue, Vinay Adiga, Erik Cambria
2021 arXiv   pre-print
Keywords: Dialogue Systems, Chatbots, Conversational AI, Task-oriented, Open Domain, Chit-chat, Question Answering, Artificial Intelligence, Natural Language Processing, Information Retrieval, Deep Learning, Neural Networks, CNN, RNN, Hierarchical Recurrent Encoder-Decoder, Memory Networks, Attention, Transformer, Pointer Net, CopyNet, Reinforcement Learning, GANs, Knowledge Graph, Survey, Review  ...  Attention Networks, Transformer, Pointer Net and CopyNet, Deep Reinforcement Learning models, Generative Adversarial Networks (GANs), Knowledge Graph Augmented Neural Networks.  ... 
arXiv:2105.04387v4 fatcat:stperoq73rgyja5b7zcfysjh5q

A Survey of Knowledge-Enhanced Text Generation [article]

Wenhao Yu, Chenguang Zhu, Zaitang Li, Zhiting Hu, Qingyun Wang, Heng Ji, Meng Jiang
2022 arXiv   pre-print
In this survey, we present a comprehensive review of the research on knowledge-enhanced text generation over the past five years.  ...  The main content includes two parts: (i) general methods and architectures for integrating knowledge into text generation; (ii) specific techniques and applications according to different forms of knowledge  ...  reasoning over knowledge graphs via path-finding strategies; and (M4) improve the graph embeddings with graph neural networks.  ... 
arXiv:2010.04389v3 fatcat:vzdtlz4j65g2va7gwkbmzyxkhq

Graph Neural Networks Meet Neural-Symbolic Computing: A Survey and Perspective [article]

Luis C. Lamb, Artur Garcez, Marco Gori, Marcelo Prates, Pedro Avelar, Moshe Vardi
2021 arXiv   pre-print
Graph Neural Networks (GNN) have been widely used in relational and symbolic domains, with widespread application of GNNs in combinatorial optimization, constraint satisfaction, relational reasoning and  ...  The need for improved explainability, interpretability and trust of AI systems in general demands principled methodologies, as suggested by neural-symbolic computing.  ...  In fact, attention mechanisms can be used to solve graph problems, for example with pointer networks [Vinyals et al., 2015] .  ... 
arXiv:2003.00330v7 fatcat:jiacrkwiuvbofnp5lmiemhn4ua

Pure-Past Linear Temporal and Dynamic Logic on Finite Traces

Giuseppe De Giacomo, Antonio Di Stasio, Francesco Fuggitti, Sasha Rubin
2020 Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence  
Because of this, we can exploit a foundational result on reverse languages to get an exponential improvement over LTLf/LDLf for computing the corresponding DFA.  ...  Interestingly, PLTLf (resp., PLDLf) has the same expressive power as LTLf (resp., LDLf), but transforming a PLTLf (resp., PLDLf) formula into its equivalent LTLf (resp., LDLf) formula is quite expensive.  ... 
doi:10.24963/ijcai.2020/679 fatcat:25epl6f75zg6jhuzrw3naoksju

Task-Oriented Conversation Generation Using Heterogeneous Memory Networks

Zehao Lin, Xinjing Huang, Feng Ji, Haiqing Chen, Yin Zhang
2019 Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP)  
How to incorporate external knowledge into a neural dialogue model is critically important for dialogue systems to behave like real humans.  ...  In our method, historical sequential dialogues are encoded and stored into the context-aware memory enhanced by gating mechanism while grounding knowledge tuples are encoded and stored into the context-free  ...  This work was supported by the NSFC (No.61402403), DAMO Academy (Alibaba Group), Alibaba-Zhejiang University Joint Institute of Frontier Technologies, Chinese Knowledge Center for Engineering Sciences  ... 
doi:10.18653/v1/d19-1463 dblp:conf/emnlp/LinHJCZ19 fatcat:djelamkro5cpvi3qo5zpuftfne
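The context-aware memory read "enhanced by gating mechanism" that this abstract describes can be sketched as one key-value attention step followed by a sigmoid gate. Everything below (function names, shapes, and the exact gating form) is an illustrative assumption, not the authors' architecture:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())                 # shift for numerical stability
    return e / e.sum()

def gated_memory_read(query, keys, values, gate_weight):
    """One read over a memory: attend over slot keys, fetch a weighted
    sum of slot values, then blend it with the query through a gate."""
    attn = softmax(keys @ query)            # attention weights over memory slots
    read = attn @ values                    # weighted sum of slot values
    g = 1.0 / (1.0 + np.exp(-(query @ gate_weight)))  # scalar sigmoid gate
    return g * read + (1.0 - g) * query    # gated combination of read and query

rng = np.random.default_rng(1)
q = rng.normal(size=8)                      # encoded dialogue state
keys = rng.normal(size=(4, 8))              # 4 memory slots (keys)
vals = rng.normal(size=(4, 8))              # corresponding slot values
w = rng.normal(size=8)
out = gated_memory_read(q, keys, vals, w)
```

The "heterogeneous" aspect would correspond to maintaining two such memories, one for dialogue history and one for knowledge tuples, and reading from both.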

Task-Oriented Conversation Generation Using Heterogeneous Memory Networks [article]

Zehao Lin, Xinjing Huang, Feng Ji, Haiqing Chen, Yin Zhang
2019 arXiv   pre-print
How to incorporate external knowledge into a neural dialogue model is critically important for dialogue systems to behave like real humans.  ...  In our method, historical sequential dialogues are encoded and stored into the context-aware memory enhanced by gating mechanism while grounding knowledge tuples are encoded and stored into the context-free  ...  This work was supported by the NSFC (No.61402403), DAMO Academy (Alibaba Group), Alibaba-Zhejiang University Joint Institute of Frontier Technologies, Chinese Knowledge Center for Engineering Sciences  ... 
arXiv:1909.11287v1 fatcat:lilx5hyyljej7janwebybltzau

The Lost Combinator

Mark Steedman
2018 Computational Linguistics  
The first question concerns the way that Combinatory Categorial Grammar (CCG) was developed with a number of colleagues, over a number of stages and in slightly different forms.  ...  with all the wonderful students and colleagues that have made many essential contributions to this work over many years.  ...  To answer the question, we use the knowledge graph and the entailment graph, and the following rule: (18) a. if Q or anything that entails Q is in the knowledge graph then answer in the positive. b.  ... 
doi:10.1162/coli_a_00328 fatcat:moiccaptzbfdnotycnf24yinae

Relation/Entity-Centric Reading Comprehension [article]

Takeshi Onishi
2020 arXiv   pre-print
More specifically, we focus on question answering tasks designed to measure reading comprehension.  ...  This thesis addresses this challenge through studies of reading comprehension with a focus on understanding entities and their relationships.  ...  In the first stage, a pointer network [56] selects a part of the passage that is likely essential for solving the question.  ... 
arXiv:2008.11940v1 fatcat:ka44mwkforb4lfenw3odkosmem

An Attentive Survey of Attention Models [article]

Sneha Chaudhari, Varun Mithal, Gungor Polatkan, Rohan Ramanath
2021 arXiv   pre-print
We hope this survey will provide a succinct introduction to attention models and guide practitioners in developing approaches for their applications.  ...  The attention model has now become an important concept in neural networks and has been researched across diverse application domains.  ...  In the second part, the final answer to the query is calculated using the context vector over relevant facts with the help of attention.  ... 
arXiv:1904.02874v3 fatcat:fyqgqn7sxzdy3efib3rrqexs74

The Natural Language Decathlon: Multitask Learning as Question Answering [article]

Bryan McCann and Nitish Shirish Keskar and Caiming Xiong and Richard Socher
2018 arXiv   pre-print
We cast all tasks as question answering over a context.  ...  Furthermore, we present a new Multitask Question Answering Network (MQAN) that jointly learns all tasks in decaNLP without any task-specific modules or parameters in the multitask setting.  ...  metrics for decaNLP baselines: sequence-to-sequence (S2S) with self-attentive transformer layers (w/SAtt), the addition of coattention (+CAtt) over a split context and question, and a question pointer  ... 
arXiv:1806.08730v1 fatcat:pdvwr3fqfrdnjdzwotzahsjf3e
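The decaNLP idea of casting every task as question answering over a context can be illustrated with (question, context, answer) triples. The task names and question strings below are made up for illustration and are not decaNLP's actual prompts:

```python
# Casting heterogeneous NLP tasks as question answering over a context
# (hypothetical triples; decaNLP's real prompts and format differ).
def as_qa(task, context, target):
    questions = {
        "summarization": "What is the summary?",
        "sentiment": "Is this review positive or negative?",
        "translation_en_de": "What is the translation from English to German?",
    }
    return {"question": questions[task], "context": context, "answer": target}

ex = as_qa("sentiment", "A delightful, moving film.", "positive")
```

Once every task is in this shape, a single model with one input/output interface (such as MQAN) can be trained on all of them jointly.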

Complex Knowledge Base Question Answering: A Survey [article]

Yunshi Lan, Gaole He, Jinhao Jiang, Jing Jiang, Wayne Xin Zhao, Ji-Rong Wen
2021 arXiv   pre-print
Knowledge base question answering (KBQA) aims to answer a question over a knowledge base (KB). Early studies mainly focused on answering simple questions over KBs and achieved great success.  ...  Therefore, in recent years, researchers have proposed a large number of novel methods that address the challenges of answering complex questions.  ...  [131] managed to model the flow of the focus in a conversation via an entity transition graph. For a comprehensive understanding of the conversation context, Plepi et al.  ... 
arXiv:2108.06688v1 fatcat:qbnneptnrbguxcygni7jsn6q24
Showing results 1 — 15 out of 1,844 results