
Word concept model: a knowledge representation for dialogue agents

Yang Li, Tong Zhang, Stephen E. Levinson
2000 6th International Conference on Spoken Language Processing (ICSLP 2000)   unpublished
An intelligent agent is necessary for a dialogue system when meanings are strictly defined by using a world state model.  ...  Information extraction is a key component in dialogue systems. Knowledge about the world as well as knowledge specific to each word should be used for robust semantic processing.  ...  However, no current knowledge representation method is good for storing information associated with each word.  ... 
doi:10.21437/icslp.2000-445 fatcat:iqownxpsuzdzlkivcyczqqjr2a

Can Visual Dialogue Models Do Scorekeeping? Exploring How Dialogue Representations Incrementally Encode Shared Knowledge

Brielen Madureira, David Schlangen
2022 Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)   unpublished
We propose a theory-based evaluation method for investigating to what degree models pretrained on the VisDial dataset incrementally build representations that appropriately do scorekeeping.  ...  Cognitively plausible visual dialogue models should keep a mental scoreboard of shared established facts in the dialogue context.  ...  Acknowledgements We are thankful to the anonymous reviewers for their feedback and suggestions, to Wencke Liermann for implementing the interface for the human evaluation and to the student assistants  ... 
doi:10.18653/v1/2022.acl-short.73 fatcat:r7tr3fbrlbaxlmbweusdaactra

Incorporating Commonsense Knowledge into Abstractive Dialogue Summarization via Heterogeneous Graph Networks [article]

Xiachong Feng, Xiaocheng Feng, Bing Qin, Ting Liu
2020 arXiv   pre-print
In detail, we consider utterance and commonsense knowledge as two different types of data and design a Dialogue Heterogeneous Graph Network (D-HGN) for modeling both types of information.  ...  In this paper, we present a novel multi-speaker dialogue summarizer to demonstrate how large-scale commonsense knowledge can facilitate dialogue understanding and summary generation.  ...  Recent works that incorporate additional commonsense knowledge in dialogue generation (Zhou et al., 2018) and dialogue context representation learning show that even though neural models have strong  ... 
arXiv:2010.10044v1 fatcat:dpxkgfhyzjehnd4rrhoa4ukt4e
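The D-HGN snippet treats utterances and commonsense facts as two node types in one graph. As a rough, hypothetical illustration of that data structure (the node and edge labels below are invented, not the paper's construction), a heterogeneous dialogue graph might be assembled like this:

```python
# Toy heterogeneous dialogue graph: utterance nodes plus commonsense-concept
# nodes, linked by typed edges. All labels are illustrative assumptions only.
import networkx as nx

g = nx.Graph()
g.add_node("u1", kind="utterance", text="I just ran a marathon.")
g.add_node("u2", kind="utterance", text="You must be exhausted, get some rest.")
g.add_node("c1", kind="concept", text="marathon -> causes -> fatigue")
g.add_edge("u1", "u2", kind="dialogue_flow")   # conversational order
g.add_edge("u1", "c1", kind="grounded_in")     # utterance <-> commonsense link
g.add_edge("u2", "c1", kind="grounded_in")

# A graph encoder would then pass messages across both node types.
utterances = [n for n, d in g.nodes(data=True) if d["kind"] == "utterance"]
```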

Hierarchical Knowledge Distillation for Dialogue Sequence Labeling [article]

Shota Orihashi, Yoshihiro Yamazaki, Naoki Makishima, Mana Ihori, Akihiko Takashima, Tomohiro Tanaka, Ryo Masumura
2021 arXiv   pre-print
This paper presents a novel knowledge distillation method for dialogue sequence labeling.  ...  Dialogue sequence labeling is a supervised learning task that estimates labels for each utterance in the target dialogue document, and is useful for many applications such as dialogue act estimation.  ...  Knowledge distillation is seen as potentially able to overcome the difficulty of using a large model for dialogue sequence labeling, but no truly effective knowledge distillation technique for dialogue  ... 
arXiv:2111.10957v1 fatcat:ohanql6srfd3bc3wkmga72fcka
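Distilling a large dialogue-sequence-labeling model into a smaller one typically mixes a soft-target term against the teacher with the usual hard-label loss. A minimal per-utterance sketch follows; the temperature, mixing weight, and tensor shapes are assumptions, and this is the standard single-level recipe, not the paper's hierarchical scheme:

```python
# Soft-target distillation over utterance-level label distributions;
# a simplified stand-in, not the paper's hierarchical method.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, gold_labels,
                      T=2.0, alpha=0.5):
    """student_logits / teacher_logits: (num_utterances, num_labels);
    gold_labels: (num_utterances,) integer label ids."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)  # rescale so gradients match the hard-label term
    hard = F.cross_entropy(student_logits, gold_labels)
    return alpha * soft + (1 - alpha) * hard
```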

History-Adaption Knowledge Incorporation Mechanism for Multi-Turn Dialogue System

Yajing Sun, Yue Hu, Luxi Xing, Jing Yu, Yuqiang Xie
2020 Proceedings of the Thirty-Fourth AAAI Conference on Artificial Intelligence (AAAI-20)
So we design a history-adaption knowledge incorporation mechanism to build an effective multi-turn dialogue model.  ...  And the knowledge-grounded history representation also enhances the conversation consistency.  ...  On the one hand, the module integrates the knowledge into the current dialogue and gets a knowledge-aware representation, which helps keep dialogue consistency.  ... 
doi:10.1609/aaai.v34i05.6425 fatcat:vfhq4lvm65gfhdsjlvkaq4mmuq

Improving Knowledge-aware Dialogue Generation via Knowledge Base Question Answering [article]

Jian Wang, Junhao Liu, Wei Bi, Xiaojiang Liu, Kejing He, Ruifeng Xu, Min Yang
2019 arXiv   pre-print
In this paper, we propose a novel knowledge-aware dialogue generation model (called TransDG), which transfers question representation and knowledge matching abilities from the knowledge base question answering (KBQA) task to facilitate utterance understanding and factual knowledge selection for dialogue generation.  ...  Knowledge Base Question Answering: As shown in Figure 2, our model contains two parts: a KBQA model and a dialogue generation model, where knowledge learned from the KBQA task is transferred to dialogue  ... 
arXiv:1912.07491v1 fatcat:p2cxea2ay5alpfd6kvccgamjsy

RT-KGD: Relation Transition Aware Knowledge-Grounded Dialogue Generation [article]

Kexin Wang, Zhixu Li, Jiaan Wang, Jianfeng Qu, Ying He, An Liu, Lei Zhao
2022 arXiv   pre-print
To this end, we propose a Relation Transition aware Knowledge-Grounded Dialogue Generation model (RT-KGD).  ...  Most existing works adopt knowledge graphs (KGs) as external resources, paying attention to the contribution of entities in the last utterance of the dialogue for context understanding and response  ...  As illustrated in Fig. 2, our model first constructs the multi-turn heterogeneous knowledge transition path (MHKT-Path) for the given dialogue context (Sec. 3.2) and then encodes the MHKT-Path by a knowledge  ... 
arXiv:2207.08212v1 fatcat:z7s4egdztjahvnijwdbpmlxcoq

Dynamically Retrieving Knowledge via Query Generation for informative dialogue response [article]

Zhongtian Hu, Yangqi Chen, Yushuang Liu, Lifang Wang
2022 arXiv   pre-print
In order to solve the problem, we design a knowledge-driven dialogue system named DRKQG (Dynamically Retrieving Knowledge via Query Generation for informative dialogue response).  ...  Compared with general dialogue systems, superior knowledge-driven dialogue systems can generate more informative and knowledgeable responses with pre-provided knowledge.  ...  For a dialogue session, our model needs to dynamically generate a query and retrieve knowledge, and then produce a response based on the retrieved knowledge in each round of the dialogue session.  ... 
arXiv:2208.00128v1 fatcat:2qf635nkhzh4jh5zw7hhwaq46y
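At a high level, the DRKQG pipeline the snippet describes is a per-round loop: generate a query from the dialogue context, retrieve knowledge with it, then condition the response on what was retrieved. A minimal sketch, assuming a toy lexical retriever and stub generation functions (none of these are DRKQG's actual modules):

```python
# Query-then-retrieve-then-respond loop; retrieve() is a toy word-overlap
# ranker, and generate_query / generate_response are caller-supplied stubs.
def retrieve(query, knowledge, top_k=2):
    """Rank knowledge sentences by word overlap with the generated query."""
    q = set(query.lower().split())
    ranked = sorted(knowledge, key=lambda s: -len(q & set(s.lower().split())))
    return ranked[:top_k]

def dialogue_round(history, user_turn, generate_query, generate_response,
                   knowledge):
    history = history + [user_turn]
    query = generate_query(history)               # dynamic, context-aware query
    passages = retrieve(query, knowledge)         # fetch supporting knowledge
    reply = generate_response(history, passages)  # knowledge-grounded response
    return reply, history + [reply]
```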

Knowledgeable Dialogue Reading Comprehension on Key Turns [article]

Junlong Li, Zhuosheng Zhang, Hai Zhao
2020 arXiv   pre-print
The original context, question and external knowledge are encoded with the pre-trained language model, then the language representation and key turns are combined with a well-designed mechanism  ...  Our research focuses on dialogue-based MRC, where the passages are multi-turn dialogues.  ...  Conclusion: In this paper, we propose modeling multi-turn dialogue with knowledge and key turns for dialogue-based multi-choice MRC.  ... 
arXiv:2004.13988v2 fatcat:gfd2mp7iavdnxjupm5xxe7ko3u

Improving Knowledge-Aware Dialogue Generation via Knowledge Base Question Answering

Jian Wang, Junhao Liu, Wei Bi, Xiaojiang Liu, Kejing He, Ruifeng Xu, Min Yang
2020 Proceedings of the Thirty-Fourth AAAI Conference on Artificial Intelligence (AAAI-20)
In this paper, we propose a novel knowledge-aware dialogue generation model (called TransDG), which transfers question representation and knowledge matching abilities from the knowledge base question answering (KBQA) task to facilitate utterance understanding and factual knowledge selection for dialogue generation.  ...  Knowledge Base Question Answering: As shown in Figure 2, our model contains two parts: a KBQA model and a dialogue generation model, where knowledge learned from the KBQA task is transferred to dialogue  ... 
doi:10.1609/aaai.v34i05.6453 fatcat:rqq2zwgywvdmrn5joec6anww7i

Sequence-to-Sequence Learning for Task-oriented Dialogue with Dialogue State Representation [article]

Haoyang Wen, Yijia Liu, Wanxiang Che, Libo Qin, Ting Liu
2018 arXiv   pre-print
Classic pipeline models for task-oriented dialogue systems require explicitly modeling the dialogue states and hand-crafted action spaces to query a domain-specific knowledge base.  ...  Our framework models a dialogue state as a fixed-size distributed representation and uses this representation to query a knowledge base via an attention mechanism.  ...  Acknowledgments: We thank the anonymous reviewers for their helpful comments and suggestions.  ... 
arXiv:1806.04441v1 fatcat:w7qvnqk6uzamviexm33fbb3djm
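The attention-based KB query the snippet mentions can be written in a few lines: score each KB entry against the dialogue-state vector, softmax the scores, and return the weighted sum. A minimal sketch with made-up dimensions; dot-product scoring is one common choice, and the paper's exact parameterization may differ:

```python
# Soft KB lookup: attention weights from a fixed-size dialogue state.
import torch
import torch.nn.functional as F

def attend_kb(state, kb_keys, kb_values):
    """state: (d,); kb_keys: (n, d); kb_values: (n, d_v).
    Returns an attention-weighted summary of KB entries."""
    scores = kb_keys @ state            # (n,) dot-product relevance
    weights = F.softmax(scores, dim=0)  # distribution over KB rows
    return weights @ kb_values          # (d_v,) soft lookup result

state = torch.randn(64)
kb_keys, kb_values = torch.randn(10, 64), torch.randn(10, 32)
summary = attend_kb(state, kb_keys, kb_values)  # would feed the decoder
```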

MTSS: Learn from Multiple Domain Teachers and Become a Multi-Domain Dialogue Expert

Shuke Peng, Feng Ji, Zehao Lin, Shaobo Cui, Haiqing Chen, Yin Zhang
2020 Proceedings of the Thirty-Fourth AAAI Conference on Artificial Intelligence (AAAI-20)
Then, these domain-specific teachers impart their domain knowledge and policies to a universal student model and collectively make this student model a multi-domain dialogue expert.  ...  Each individual teacher only focuses on one specific domain and learns its corresponding domain knowledge and dialogue policy based on a precisely extracted single domain dialogue state representation.  ...  Our model subtly circumvents the knotty multi-domain dialogue state representation problem by using multiple teacher models to learn domain-specific dialogue knowledge.  ... 
doi:10.1609/aaai.v34i05.6384 fatcat:szi56ouwvradlasfdamx6wyp4m
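MTSS, as summarized above, routes each training dialogue to its own domain's teacher and distills all teachers into one student. A hedged sketch of that objective; the routing-by-domain dictionary and KL form are assumptions about the setup, not the paper's exact loss:

```python
# Multi-teacher distillation: each example is supervised by the teacher
# matching its domain id; an illustrative sketch only.
import torch
import torch.nn.functional as F

def multi_teacher_loss(student_logits, domain_ids, teacher_logits_by_domain):
    """student_logits: (batch, num_actions); domain_ids: (batch,);
    teacher_logits_by_domain: dict domain_id -> (batch, num_actions)."""
    losses = []
    for i, d in enumerate(domain_ids.tolist()):
        teacher = teacher_logits_by_domain[d][i]  # the matching domain teacher
        losses.append(F.kl_div(
            F.log_softmax(student_logits[i], dim=-1),
            F.softmax(teacher, dim=-1),
            reduction="sum",
        ))
    return torch.stack(losses).mean()
```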

DialogZoo: Large-Scale Dialog-Oriented Task Learning [article]

Zhi Chen, Jijia Bao, Lu Chen, Yuncong Liu, Da Ma, Bei Chen, Mengyue Wu, Su Zhu, Jian-Guang Lou, Kai Yu
2022 arXiv   pre-print
The experimental results show that our method not only improves the ability of dialogue generation and knowledge distillation, but also improves the representation ability of the models.  ...  We evaluate our model on various downstream dialogue tasks.  ...  Pre-trained Dialogue Models: Pre-trained dialogue models are pretrained language models (PLMs) designed for dialogue tasks.  ... 
arXiv:2205.12662v1 fatcat:bjjvinmorzbljk7274juubbevm

Knowledge-Grounded Dialogue Generation with a Unified Knowledge Representation [article]

Yu Li, Baolin Peng, Yelong Shen, Yi Mao, Lars Liden, Zhou Yu, Jianfeng Gao
2022 arXiv   pre-print
To address these challenges, we present PLUG, a language model that homogenizes different knowledge sources into a unified knowledge representation for knowledge-grounded dialogue generation tasks.  ...  PLUG is pre-trained on a dialogue generation task conditioned on a unified essential knowledge representation.  ...  a unified knowledge representation in a large-scale language model.  ... 
arXiv:2112.07924v2 fatcat:f22kzln675d5vh4b255b6zm5pi
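"Homogenizing" heterogeneous sources, as the PLUG snippet describes, amounts to serializing triples, passages, and the like into one textual knowledge string prepended to the dialogue context. The serialization format below is purely an illustrative assumption, not PLUG's actual scheme:

```python
# Hypothetical linearization of mixed knowledge sources into one text prefix;
# the tags and separators are invented for illustration.
def unify_knowledge(triples, passages):
    kg_text = " ; ".join(f"{s} {r} {o}" for s, r, o in triples)
    doc_text = " ".join(passages)
    return f"knowledge: {kg_text} {doc_text}".strip()

context = unify_knowledge(
    [("Eiffel Tower", "located_in", "Paris")],
    ["The Eiffel Tower was completed in 1889."],
)
model_input = context + " dialogue: User: Tell me about the Eiffel Tower."
```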

MTSS: Learn from Multiple Domain Teachers and Become a Multi-domain Dialogue Expert [article]

Shuke Peng, Feng Ji, Zehao Lin, Shaobo Cui, Haiqing Chen, Yin Zhang
2020 arXiv   pre-print
Then, these domain-specific teachers impart their domain knowledge and policies to a universal student model and collectively make this student model a multi-domain dialogue expert.  ...  Each individual teacher only focuses on one specific domain and learns its corresponding domain knowledge and dialogue policy based on a precisely extracted single domain dialogue state representation.  ...  Our model subtly circumvents the knotty multi-domain dialogue state representation problem by using multiple teacher models to learn domain-specific dialogue knowledge.  ... 
arXiv:2005.10450v1 fatcat:s3f6eorecbeafnpwdtfstgd3sa