3,113 Hits in 5.0 sec

Deep learning-based question answering system for intelligent humanoid robot

Widodo Budiharto, Vincent Andreas, Alexander Agung Santoso Gunawan
2020 Journal of Big Data  
Findings: The question from the user will be processed using deep learning, and the result will be compared to the knowledge base on the system.  ...  The humanoid robot should consider the style of the question and conclude the answer through conversation between robot and user.  ...  Experimental results and discussion: For the experiments, we test the model using an RNN (GRU/LSTM)-based encoder and a CNN-based encoder.  ...
doi:10.1186/s40537-020-00341-6 fatcat:y43vepzy5fewjbtjworapxbqfy

Comparative Study of CNN and RNN for Natural Language Processing [article]

Wenpeng Yin, Katharina Kann, Mo Yu, Hinrich Schütze
2017 arXiv   pre-print
The state of the art on many NLP tasks often switches due to the battle between CNNs and RNNs.  ...  CNNs are supposed to be good at extracting position-invariant features and RNNs at modeling units in sequence.  ...  Answer Selection (AS) on WikiQA (Yang et al., 2015), an open-domain question-answer dataset. We use the subtask that assumes that there is at least one correct answer for a question.  ...
arXiv:1702.01923v1 fatcat:2vnyqcszqjgmxmk3a7an6fqeme

Inner Attention based Recurrent Neural Networks for Answer Selection

Bingning Wang, Kang Liu, Jun Zhao
2016 Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)  
Despite the improvement over non-attentive models, the attention mechanism under RNNs is not well studied.  ...  Then we present three new RNN models that add attention information before the RNN hidden representation, which shows an advantage in representing sentences and achieves new state-of-the-art results in answer selection.  ...  There are two common baseline systems for the above three datasets: • GRU: a non-attentive GRU-RNN that models the question and answer separately. • OARNN: outer attention-based RNN models (OARNN) with GRU  ...
doi:10.18653/v1/p16-1122 dblp:conf/acl/WangL016 fatcat:rxqr4lwkzbbzjcbphlyu377rxq
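Several of the results above (the non-attentive GRU baselines in Wang et al., the GRU/LSTM encoders in Budiharto et al.) take the GRU recurrence for granted. As a minimal reference sketch, with toy dimensions and random weights that are purely illustrative and not taken from any cited paper, the update can be written in a few lines of NumPy:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h, params):
    """One GRU update: gates decide how much of the old state to keep."""
    Wz, Uz, bz, Wr, Ur, br, Wh, Uh, bh = params
    z = sigmoid(Wz @ x + Uz @ h + bz)               # update gate
    r = sigmoid(Wr @ x + Ur @ h + br)               # reset gate
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h) + bh)   # candidate state
    return (1.0 - z) * h + z * h_tilde

def gru_encode(xs, hidden_size, rng):
    """Run a GRU over a sequence and return the final hidden state."""
    d = xs[0].shape[0]
    params = []
    for _ in range(3):  # one (W, U, b) triple each for z, r, and the candidate
        params += [rng.standard_normal((hidden_size, d)) * 0.1,
                   rng.standard_normal((hidden_size, hidden_size)) * 0.1,
                   np.zeros(hidden_size)]
    h = np.zeros(hidden_size)
    for x in xs:
        h = gru_step(x, h, params)
    return h

rng = np.random.default_rng(0)
question = [rng.standard_normal(8) for _ in range(5)]  # 5 toy "word vectors"
h = gru_encode(question, hidden_size=16, rng=rng)
print(h.shape)  # (16,)
```

The final state `h` is the fixed-size question representation that the answer-selection models in these results compare against candidate answers; real systems learn the weights by backpropagation rather than sampling them.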

Dynamic Graph Generation Network: Generating Relational Knowledge from Diagrams [article]

Daesik Kim, Youngjoon Yoo, Jeesoo Kim, Sangkuk Lee, Nojun Kwak
2017 arXiv   pre-print
Moreover, further experiments on question answering show the potential of the proposed method for various applications.  ...  To tackle this problem, we propose a unified diagram-parsing network for generating knowledge from diagrams, based on an object detector and a recurrent neural network designed for a graphical structure.  ...  [14] proposed gated graph neural networks (GG-NNs), which apply a GRU as the RNN model for the task. In contrast to the RNN-based models, Marino et al.  ...
arXiv:1711.09528v1 fatcat:odt4vse6ozfbjkaq2362hf74e4

End-To-End Neural Network for Paraphrased Question Answering Architecture with Single Supporting Line in Bangla Language

Md. Mohsin Uddin, East West University, Dhaka Bangladesh, Nazmus Sakib Patwary, Md. Mohaiminul Hasan, Tanvir Rahman, Mir Tanveer Islam
2020 International Journal of Future Computer and Communication  
To predict the appropriate answer, the model is trained with question-answer pairs and a supporting line.  ...  Comparing variations of the basic Recurrent Neural Network (RNN) for our task, Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU), different accuracies have been found.  ...  A question answering system must interpret the question and give the appropriate answer from the knowledge base.  ...
doi:10.18178/ijfcc.2020.9.3.565 fatcat:gp726p3afnczboj6ihgmpzhone

Deep Knowledge Tracing with Side Information [article]

Zhiwei Wang, Xiaoqin Feng, Jiliang Tang, Gale Yan Huang, Zitao Liu
2019 arXiv   pre-print
Despite its inherent challenges, recent deep neural network-based knowledge tracing models have achieved great success, which largely stems from the models' ability to learn sequential dependencies of questions  ...  However, in addition to sequential information, questions inherently exhibit side relations, which can enrich our understanding of student knowledge states and have great potential to advance knowledge  ...  Recently, a framework named Deep Knowledge Tracing (DKT), based on deep neural networks, has shown superior performance over previously proposed knowledge tracing models [12].  ...
arXiv:1909.00372v1 fatcat:tv6qncznmjhu5fog5ikww3kngu

Linguistic Knowledge as Memory for Recurrent Neural Networks [article]

Bhuwan Dhingra, Zhilin Yang, William W. Cohen, Ruslan Salakhutdinov
2017 arXiv   pre-print
Hence, we propose to use external linguistic knowledge as an explicit signal to inform the model which memories it should utilize.  ...  Specifically, external knowledge is used to augment a sequence with typed edges between arbitrarily distant elements, and the resulting graph is decomposed into directed acyclic subgraphs.  ...  Recently, there has been interest in incorporating symbolic knowledge, such as that from a Knowledge Base or coreference information, within RNN-based language models (Yang et al., 2016; Ahn et al., 2016  ... 
arXiv:1703.02620v1 fatcat:3xg3s5hugzaqzfizhq2x7pv3am

Student Performance Prediction Using Dynamic Neural Models [article]

Marina Delianidi, Konstantinos Diamantaras, George Chrysogonidis, Vasileios Nikiforidis
2021 arXiv   pre-print
The first part employs a dynamic neural network (either TDNN or RNN) to trace the student's knowledge state.  ...  We address the problem of predicting the correctness of the student's response on the next exam question based on their previous interactions in the course of their learning and evaluation process.  ...  RNN Approach: Bi-GRU Model. The model architecture based on the RNN method for the knowledge tracing task is shown in Figure 3.  ...
arXiv:2106.00524v1 fatcat:7zqonylycnf2tiqno2nqn4lyiu

Dynamic Graph Generation Network: Generating Relational Knowledge from Diagrams

Daesik Kim, YoungJoon Yoo, Jeesoo Kim, Sangkuk Lee, Nojun Kwak
2018 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition  
Moreover, further experiments on question answering show the potential of the proposed method for various applications.  ...  To tackle this problem, we propose a unified diagram-parsing network for generating knowledge from diagrams, based on an object detector and a recurrent neural network designed for a graphical structure.  ...  [15] proposed gated graph neural networks (GG-NNs), which apply a GRU as the RNN model for the task. In contrast to the RNN-based models, Marino et al.  ...
doi:10.1109/cvpr.2018.00438 dblp:conf/cvpr/KimYKLK18 fatcat:j5gyp63swzf35aclv2dtxzv4i4

Recurrent One-Hop Predictions for Reasoning over Knowledge Graphs [article]

Wenpeng Yin, Yadollah Yaghoobzadeh, Hinrich Schütze
2018 arXiv   pre-print
Reasoning over multi-hop (mh) KG paths is thus an important capability that is needed for question answering or other NLP tasks that require knowledge about the world. mh-KG reasoning includes diverse  ...  Our models achieve state-of-the-art results on two important multi-hop KG reasoning tasks: Knowledge Base Completion and Path Query Answering.  ...  This task corresponds to answering compositional natural language questions. For example, the question "Where do Brandon Lee's parents live?"
arXiv:1806.04523v1 fatcat:5nprpxdqprh6pp44s6lv2n5q7m

Towards Open Domain Chatbots—A GRU Architecture for Data Driven Conversations [chapter]

Åsmund Kamphaug, Ole-Christoffer Granmo, Morten Goodwin, Vladimir I. Zadorozhny
2018 Lecture Notes in Computer Science  
To evaluate our architecture, we have compiled 10 years of questions and answers from a youth information service: 200,083 questions spanning a wide range of content, altogether 289 topics, involving law  ...  In this paper, we propose a novel deep learning architecture for content recognition that consists of multiple levels of gated recurrent units (GRUs).  ...  This research uses a knowledge base that already contains the topics of all the available questions, which makes it different from our approach.  ...
doi:10.1007/978-3-319-77547-0_16 fatcat:ttv6udjf4jfnpjpf222cshdaoq

Neural Network-based Question Answering over Knowledge Graphs on Word and Character Level

Denis Lukovnikov, Asja Fischer, Jens Lehmann, Sören Auer
2017 Proceedings of the 26th International Conference on World Wide Web - WWW '17  
Question Answering (QA) systems over Knowledge Graphs (KG) automatically answer natural language questions using facts contained in a knowledge graph.  ...  a large knowledge resource.  ...  billion facts), Microsoft Bing's Satori Knowledge Base 3 or Yandex' Object Answer 4 .  ... 
doi:10.1145/3038912.3052675 dblp:conf/www/LukovnikovFLA17 fatcat:clazggqz6bdkplcsnuw64yipvi

A Review on Methods and Applications in Multimodal Deep Learning [article]

Jabeen Summaira, Xi Li, Amin Muhammad Shoib, Jabbar Abdul
2022 arXiv   pre-print
Schwing [75] proposed a framework for the VQA task using external knowledge resources that contain a set of facts. This framework can answer both fact-based and visual-based questions. K.  ...  Table 4: Comparative analysis of Visual Question Answering models, where MMJEM = Multimodal Joint-Embedding Models, MMAM = Multimodal Attention-based Models, MMEKM = Multimodal External Knowledge-base Models  ...
arXiv:2202.09195v1 fatcat:wwxrmrwmerfabbenleylwmmj7y

On the Significance of Question Encoder Sequence Model in the Out-of-Distribution Performance in Visual Question Answering [article]

Gouthaman KV, Anurag Mittal
2021 arXiv   pre-print
It has been shown that current Visual Question Answering (VQA) models are over-dependent on the language priors (spurious correlations between question types and their most frequent answers) from the train  ...  To demonstrate this, we performed a detailed analysis of various existing RNN-based and Transformer-based question encoders, and, along the way, we proposed a novel Graph Attention Network (GAT)-based question encoder.  ...  Sum-pooling aggregation: In this experiment, as in the self-attention-based question encoders, we apply the sum-pooling operation over the hidden states of the GRU cells in GRU-QE to get the question context  ...
arXiv:2108.12585v2 fatcat:e5kak3jup5drbj3r2u2o6hew6q

Neural Models for Reasoning over Multiple Mentions using Coreference [article]

Bhuwan Dhingra, Qiao Jin, Zhilin Yang, William W. Cohen, Ruslan Salakhutdinov
2018 arXiv   pre-print
Existing Recurrent Neural Network (RNN) layers are biased towards short-term dependencies and hence not suited to such tasks.  ...  QA models which directly read text to answer questions (commonly known as Reading Comprehension systems) (Hermann et al., 2015; Seo et al., 2017a), typically consist of RNN layers.  ...  In this manner only 80% of the questions are answerable, but the performance increases substantially compared to pure language-modeling-based approaches.  ...
arXiv:1804.05922v1 fatcat:qftnsqqopjemxmveu4j5tbkxvy
Showing results 1 — 15 out of 3,113 results