
Specialized Language Models using Dialogue Predictions [article]

Cosmin Popovici, Paolo Baggia (CSELT - Turin, Italy)
1996 arXiv   pre-print
The use of several language models obtained by exploiting dialogue predictions gives better results than the use of a single model for the whole dialogue interaction. ... This paper analyses language modeling in spoken dialogue systems for accessing a database. ... Table 1: Results of single models and models with DP (single context-independent models vs. language models with dialogue predictions). A set of two models with DP was tested. ...
arXiv:cmp-lg/9612002v2 fatcat:zndyktpa2jgzhnn5k5dp3vketu
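The dialogue-prediction idea in the entry above amounts to swapping in a context-specific language model once the dialogue manager has predicted the next dialogue state, falling back to a single general model otherwise. A minimal sketch of that selection step; the state names, model files, and mapping are hypothetical, not the authors' code:

```python
# Hypothetical sketch: pick a specialized language model from the predicted
# dialogue state, falling back to the single general model. The model names
# and the DIALOGUE_STATE_MODELS mapping are illustrative, not from the paper.

DIALOGUE_STATE_MODELS = {
    "ask_departure": "lm_departure.arpa",  # LM trained on turns that follow
    "ask_date": "lm_date.arpa",            # the corresponding system prompt
}
GENERAL_MODEL = "lm_all.arpa"

def select_language_model(predicted_state: str) -> str:
    """Return the language model to use for recognizing the next user turn."""
    return DIALOGUE_STATE_MODELS.get(predicted_state, GENERAL_MODEL)

print(select_language_model("ask_date"))   # lm_date.arpa
print(select_language_model("greeting"))   # lm_all.arpa (no prediction available)
```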

Specialized language models using dialogue predictions

C. Popovici, P. Baggia
1997 IEEE International Conference on Acoustics, Speech, and Signal Processing  
The use of several language models obtained by exploiting dialogue predictions gives better results than the use of a single model for the whole dialogue interaction. ... This paper analyses language modeling in spoken dialogue systems for accessing a database. ... Single context-independent models ... Language models with dialogue predictions: a set of two models with DP was tested. The first one, ALL_PRED, was created as described in Section 3.2. ...
doi:10.1109/icassp.1997.596055 dblp:conf/icassp/PopoviciB97 fatcat:zofjt6o6vjgvng3fbce4iwozee

DSBERT: Unsupervised Dialogue Structure Learning with BERT [article]

Bingkun Chen, Shaobing Dai, Shenghua Zheng, Lei Liao, Yang Li
2021 arXiv   pre-print
... can be used for dialogue structure learning. ... Unsupervised dialogue structure learning is an important and meaningful task in natural language processing. ... The encoder of DSBERT only uses the features of the special token to predict the latent states. In the decoding stage, we also only use the features of the special token to restore the dialogue. ...
arXiv:2111.04933v1 fatcat:7fcj7mfkrrbhpl6os73ppoxjee
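As a rough illustration of the mechanism mentioned in the snippet (using only the special-token feature of the encoder to predict latent dialogue states), here is a small PyTorch sketch; the encoder output is faked with a random tensor and all dimensions are assumptions, not the DSBERT implementation:

```python
import torch
import torch.nn as nn

class LatentStatePredictor(nn.Module):
    """Toy stand-in for the idea in the abstract: project the special-token
    ([CLS]) feature of an utterance encoding onto a distribution over K latent
    dialogue states. Hidden size and number of states are illustrative."""

    def __init__(self, hidden_size: int = 768, num_states: int = 10):
        super().__init__()
        self.state_head = nn.Linear(hidden_size, num_states)

    def forward(self, encoder_output: torch.Tensor) -> torch.Tensor:
        # encoder_output: (batch, seq_len, hidden); position 0 is the special token
        cls_feature = encoder_output[:, 0, :]
        return torch.softmax(self.state_head(cls_feature), dim=-1)

# Usage with a random tensor standing in for a BERT-style encoder output:
probs = LatentStatePredictor()(torch.randn(2, 32, 768))
print(probs.shape)  # torch.Size([2, 10])
```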

CAiRE: An Empathetic Neural Chatbot [article]

Zhaojiang Lin, Peng Xu, Genta Indra Winata, Farhad Bin Siddique, Zihan Liu, Jamin Shin, Pascale Fung
2020 arXiv   pre-print
... learning approach that fine-tunes a large-scale pre-trained language model with multi-task objectives: response language modeling, response prediction, and dialogue emotion detection. ... We evaluate our model on the recently proposed empathetic-dialogues dataset (Rashkin et al., 2019); the experimental results show that CAiRE achieves state-of-the-art performance on dialogue emotion detection ... objectives: response language modeling, response prediction, and dialogue emotion detection. ...
arXiv:1907.12108v4 fatcat:4qh7cs5un5chjp3wrxy3uqlthe
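The multi-task objective described above combines three losses. Below is a hedged PyTorch sketch of such a weighted combination, with dummy logits standing in for the model outputs and arbitrarily chosen loss weights; this is an illustration of the general recipe, not the CAiRE code:

```python
import torch
import torch.nn.functional as F

def multitask_loss(lm_logits, lm_targets,
                   resp_logits, resp_labels,
                   emo_logits, emo_labels,
                   w_lm=1.0, w_resp=1.0, w_emo=1.0):
    """Weighted sum of the three objectives named in the abstract: response
    language modeling, response prediction, and dialogue emotion detection.
    The weights are illustrative assumptions."""
    loss_lm = F.cross_entropy(lm_logits.view(-1, lm_logits.size(-1)),
                              lm_targets.view(-1))
    loss_resp = F.cross_entropy(resp_logits, resp_labels)
    loss_emo = F.cross_entropy(emo_logits, emo_labels)
    return w_lm * loss_lm + w_resp * loss_resp + w_emo * loss_emo

# Dummy tensors standing in for model outputs and labels:
loss = multitask_loss(torch.randn(2, 16, 100), torch.randint(0, 100, (2, 16)),
                      torch.randn(2, 2), torch.randint(0, 2, (2,)),
                      torch.randn(2, 7), torch.randint(0, 7, (2,)))
print(float(loss))
```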

TOD-BERT: Pre-trained Natural Language Understanding for Task-Oriented Dialogue [article]

Chien-Sheng Wu, Steven Hoi, Richard Socher, Caiming Xiong
2020 arXiv   pre-print
The underlying difference in linguistic patterns between general text and task-oriented dialogue makes existing pre-trained language models less useful in practice. ... dialogue act prediction, and response selection. ... Figure 1: Dialogue pre-training based on a Transformer encoder with user and system special tokens. Two objective functions are used: masked language modeling and response contrastive learning. ...
arXiv:2004.06871v3 fatcat:plfdyo7j3nfdhiqdrztyzbou3m
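The figure caption in the snippet mentions prepending user and system special tokens to dialogue turns before masked language modeling and response contrastive learning. A minimal sketch of that flattening step, using illustrative token strings rather than the paper's exact vocabulary:

```python
def flatten_dialogue(turns):
    """Prefix each turn with a speaker special token before feeding the
    flattened dialogue to a Transformer encoder. The [USR]/[SYS] strings
    here are illustrative placeholders."""
    pieces = []
    for speaker, utterance in turns:
        token = "[USR]" if speaker == "user" else "[SYS]"
        pieces.append(f"{token} {utterance}")
    return " ".join(pieces)

dialogue = [("user", "book a table for two"),
            ("system", "for what time?"),
            ("user", "seven tonight")]
print(flatten_dialogue(dialogue))
# [USR] book a table for two [SYS] for what time? [USR] seven tonight
```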

Goal-Oriented Multi-Task BERT-Based Dialogue State Tracker [article]

Pavel Gulyaev, Eugenia Elistratova, Vasily Konovalov, Yuri Kuratov, Leonid Pugachev, Mikhail Burtsev
2020 arXiv   pre-print
The organizers introduced the Schema-Guided Dialogue (SGD) dataset with multi-domain conversations and released a zero-shot dialogue state tracking model. ... The model "queries" dialogue history with descriptions of slots and services as well as possible values of slots. ... To predict a dialogue state update, our model solves several classification tasks and a span-prediction task. ...
arXiv:2002.02450v1 fatcat:2mwuotfnsrhdvhk7cfjmlzla44
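To make the "several classification tasks and a span-prediction task" concrete, here is a toy PyTorch sketch of the two kinds of output heads one might place on top of a BERT encoding of the queried dialogue history; the sizes and number of classes are assumptions, not the authors' architecture:

```python
import torch
import torch.nn as nn

class SlotUpdateHeads(nn.Module):
    """Illustrative heads for the two task types the snippet mentions: a
    classification head (e.g., slot status or categorical value) and a span
    head producing start/end logits over the dialogue history."""

    def __init__(self, hidden_size: int = 768, num_classes: int = 3):
        super().__init__()
        self.classifier = nn.Linear(hidden_size, num_classes)
        self.span_head = nn.Linear(hidden_size, 2)  # start and end logits

    def forward(self, encoded):                     # (batch, seq, hidden)
        class_logits = self.classifier(encoded[:, 0, :])
        start_logits, end_logits = self.span_head(encoded).split(1, dim=-1)
        return class_logits, start_logits.squeeze(-1), end_logits.squeeze(-1)

cls_l, start_l, end_l = SlotUpdateHeads()(torch.randn(2, 64, 768))
print(cls_l.shape, start_l.shape, end_l.shape)  # (2, 3) (2, 64) (2, 64)
```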

Different Strokes for Different Folks: Investigating Appropriate Further Pre-training Approaches for Diverse Dialogue Tasks [article]

Yao Qiu, Jinchao Zhang, Jie Zhou
2021 arXiv   pre-print
Loading models pre-trained on a large-scale corpus in the general domain and fine-tuning them on specific downstream tasks is gradually becoming a paradigm in Natural Language Processing. ... However, most of these further pre-training works just keep running the conventional pre-training task, e.g., masked language modeling, which can be regarded as domain adaptation to bridge the data distribution ... prediction (DSP), context response matching (CRM), and dialogue coherence verification (DCV), each of which has special characteristics. ...
arXiv:2109.06524v1 fatcat:bjxrje5ryfhrde3ehcnwsq3o2i

The SPPD System for Schema Guided Dialogue State Tracking Challenge [article]

Miao Li, Haoqi Xiong, Yunbo Cao
2020 arXiv   pre-print
The key components of the system are a number of BERT-based zero-shot NLU models that can effectively capture semantic relations between natural language descriptions of services' schemas and utterances from dialogue turns. ... For each user turn of a dialogue, the proposed dialogue state tracking model must make predictions for all three fields, and the metrics mentioned above are used to evaluate the model predictions. ...

STN4DST: A Scalable Dialogue State Tracking based on Slot Tagging Navigation [article]

Puhai Yang, Heyan Huang, Xianling Mao
2021 arXiv   pre-print
... extract slot values quickly and accurately by the joint learning of slot tagging and slot value position prediction in the dialogue context, especially for unknown slot values. ... Extensive experiments over several benchmark datasets show that the proposed model greatly outperforms state-of-the-art baselines. ... When dealing with more complex unknown slot values, STN4DST shows better generalization and scalability than the widely used span extraction, indicating greater research potential and application prospects ...
arXiv:2010.10811v2 fatcat:rgdb64ncbndnnbijhntf3h4tyi
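A rough sketch of the joint learning described in the snippet: a token-level slot tagger plus a pointer that scores each context position as the start of a slot value. Everything here (layer sizes, tag inventory, head design) is illustrative rather than the STN4DST implementation:

```python
import torch
import torch.nn as nn

class SlotTaggingNavigator(nn.Module):
    """Minimal sketch of joint slot tagging and slot value position
    prediction over encoded dialogue-context tokens."""

    def __init__(self, hidden_size: int = 256, num_tags: int = 5):
        super().__init__()
        self.tagger = nn.Linear(hidden_size, num_tags)  # per-token slot tags
        self.pointer = nn.Linear(hidden_size, 1)        # per-token value-start score

    def forward(self, token_states):                    # (batch, seq, hidden)
        tag_logits = self.tagger(token_states)
        position_logits = self.pointer(token_states).squeeze(-1)
        return tag_logits, position_logits

tags, positions = SlotTaggingNavigator()(torch.randn(2, 40, 256))
print(tags.shape, positions.shape)  # (2, 40, 5) (2, 40)
```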

DIA-MOLE: An Unsupervised Learning Approach to Adaptive Dialogue Models for Spoken Dialogue Systems

Jens-Uwe Moeller (Natural Language Systems Division, Dept. of Computer Science, Univ. of Hamburg)
1997 arXiv   pre-print
The DIAlogue MOdel Learning Environment supports an engineering-oriented approach towards dialogue modelling for a spoken-language interface. ... A major step towards dialogue models is to know about the basic units that are used to construct a dialogue model and their possible sequences. ... E.g., the word recognizer may use dialogue act predictions to choose a specific language model trained on these DDA classes to improve word recognition. ...
arXiv:cmp-lg/9708009v1 fatcat:zog4bfysyreqnpximemhicboq4

The Use of Psycholinguistic Patterns in Interactive Systems of Active Information Retrieval

Valery Evgenevich Sachkov, Dmitry Aleksandrovich Akimov, Sergey Aleksandrovich Pavelyev
2018 International Journal of Engineering & Technology  
The article explores the possibility of using psycholinguistic patterns in a dialogue with Internet visitors. ... A model for building psycholinguistic patterns that reveal the semantic information in dialogues is given. The patterns are based on associative links between words and word combinations. ... Conclusion: as the research shows, the use of specialized psycholinguistic patterns increases the level of interaction between the user and the machine, since they allow dialogue in natural ...
doi:10.14419/ijet.v7i4.38.24595 fatcat:iojc3pq3jfg6jfczjswhlkgvyq

Affect Recognition for Multimodal Natural Language Processing

Soujanya Poria, Ong Yew Soon, Bing Liu, Lidong Bing
2020 Cognitive Computation  
Acknowledgments: The guest editors are grateful to the Editor-in-Chief, Amir Hussain, and to the many reviewers who kindly agreed to serve for this special issue and submitted their insightful reviews in ... It expands the horizons of NLP to study language used in face-to-face communication and in online multimedia. ... The challenges of modeling human multimodal language can be split into two major categories: (1) studying each modality individually and modeling each in a manner that can be linked to other modalities ...
doi:10.1007/s12559-020-09738-0 fatcat:lmospfzn3barvk6fwnnk2rvw3i

Amendable Generation for Dialogue State Tracking [article]

Xin Tian, Liankai Huang, Yingzhan Lin, Siqi Bao, Huang He, Yunyi Yang, Hua Wu, Fan Wang, Shuqi Sun
2021 arXiv   pre-print
With the additional amending generation pass, our model is tasked to learn more robust dialogue state tracking by amending the errors that still exist in the primitive dialogue state, which plays the role ... The mistakes these models make at the current turn are prone to be carried over to the next turn, causing error propagation. ... Besides, some methods treat dialogue state tracking as a causal language modeling task, using the dialogue of the current turn and the previous dialogue state as the input sequence (Lin et al., 2020; ...). ...
arXiv:2110.15659v1 fatcat:g6dxqmm4dzd7fibipomrdc4w2y
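The snippet refers to generation-based trackers that feed the current turn together with the previous dialogue state to a causal language model. A minimal sketch of how such an input sequence might be serialized; the marker tokens and slot-value format are assumptions, not the paper's exact scheme:

```python
def build_dst_input(previous_state: dict, current_turn: str) -> str:
    """Serialize the previous dialogue state and the current turn into one
    input sequence for a causal language model. [STATE]/[TURN]/[BELIEF] are
    illustrative marker strings, not taken from the paper."""
    state_str = " ; ".join(f"{slot} = {value}"
                           for slot, value in previous_state.items())
    return f"[STATE] {state_str} [TURN] {current_turn} [BELIEF]"

prev = {"restaurant-area": "centre", "restaurant-food": "italian"}
print(build_dst_input(prev, "make it a cheap one please"))
# [STATE] restaurant-area = centre ; restaurant-food = italian [TURN] make it a cheap one please [BELIEF]
```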

Deal or No Deal? End-to-End Learning of Negotiation Dialogues

Mike Lewis, Denis Yarats, Yann Dauphin, Devi Parikh, Dhruv Batra
2017 Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing  
For the first time, we show it is possible to train end-to-end models for negotiation, which must learn both linguistic and reasoning skills with no annotated dialogue states. ... dialogue. ... Acknowledgments: We would like to thank Luke Zettlemoyer and the anonymous EMNLP reviewers for their insightful comments, and the Mechanical Turk workers who helped us collect data. ...
doi:10.18653/v1/d17-1259 dblp:conf/emnlp/LewisYDPB17 fatcat:t32yw2avuzed5mms7a45ag2tyq

Dialogue State Tracking with a Language Model using Schema-Driven Prompting [article]

Chia-Hsuan Lee, Hao Cheng, Mari Ostendorf
2021 arXiv   pre-print
Recently, good results have been obtained using more general architectures based on pretrained language models. ... Here, we introduce a new variation of the language modeling approach that uses schema-driven prompting to provide task-aware history encoding that is used for both categorical and non-categorical slots ... Moreover, using descriptions consistently improves the performance of both models. All our models outperform baselines that do not use extra dialogue data. ...
arXiv:2109.07506v1 fatcat:vda2nn7f55etneaxe5ugv47zvi
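As a rough illustration of schema-driven prompting, the sketch below appends a slot's natural-language description (and candidate values for categorical slots) to the dialogue history before decoding the value; the prompt format and marker strings are hypothetical, not taken from the paper:

```python
def schema_prompt(dialogue_history: str, slot_name: str, slot_description: str,
                  possible_values=None) -> str:
    """Build a task-aware prompt: dialogue history followed by the slot's
    schema description, plus candidate values for categorical slots.
    The [slot]/[values] markers are illustrative."""
    prompt = f"{dialogue_history} [slot] {slot_name} : {slot_description}"
    if possible_values:                       # categorical slot
        prompt += " [values] " + " , ".join(possible_values)
    return prompt

history = "[user] i need a hotel in the north [system] any price range?"
print(schema_prompt(history, "hotel-pricerange",
                    "price budget of the hotel",
                    ["cheap", "moderate", "expensive"]))
```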
Showing results 1 — 15 out of 85,364 results