A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2020; you can also visit the original URL.
The file type is application/pdf.
Diversifying Reply Suggestions using a Matching-Conditional Variational Autoencoder
[article]
2019
arXiv
pre-print
We consider the problem of diversifying automated reply suggestions for a commercial instant-messaging (IM) system (Skype). ...
To diversify responses, we formulate the model as a generative latent variable model with Conditional Variational Auto-Encoder (M-CVAE). ...
To this end, we propose the Matching-CVAE (M-CVAE) architecture, which introduces a generative LVM on the Matching-IR model using the neural variational autoencoder (VAE) framework (Kingma and Welling ...
arXiv:1903.10630v1
fatcat:lgoqayqpobajjevekrmjxbvuse
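The M-CVAE abstract above describes placing a conditional VAE on top of a matching-style retrieval model, so that sampling the latent variable yields diverse reply rankings. As a rough illustration only (the module names, dimensions, and wiring below are assumptions, not the authors' code), a conditional VAE head over pre-computed message and reply encodings could look like this:

```python
# Illustrative sketch: a minimal conditional VAE head on top of a message
# encoder, in the spirit of the M-CVAE description above. All module names
# and sizes are assumptions, not the authors' implementation.
import torch
import torch.nn as nn

class ConditionalVAEHead(nn.Module):
    def __init__(self, msg_dim=256, latent_dim=64):
        super().__init__()
        # Recognition network q(z | message, reply) and prior p(z | message).
        self.recognition = nn.Linear(msg_dim * 2, latent_dim * 2)
        self.prior = nn.Linear(msg_dim, latent_dim * 2)
        # Projects [message; z] back into the reply-matching space.
        self.decoder = nn.Linear(msg_dim + latent_dim, msg_dim)

    def reparameterize(self, mu, logvar):
        return mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)

    def forward(self, msg_vec, reply_vec):
        # Posterior from (message, reply); prior from the message alone.
        post_mu, post_logvar = self.recognition(
            torch.cat([msg_vec, reply_vec], dim=-1)).chunk(2, dim=-1)
        prior_mu, prior_logvar = self.prior(msg_vec).chunk(2, dim=-1)
        z = self.reparameterize(post_mu, post_logvar)
        match_vec = self.decoder(torch.cat([msg_vec, z], dim=-1))
        # KL(q || p) between two diagonal Gaussians, summed over latent dims.
        kl = 0.5 * (prior_logvar - post_logvar
                    + (post_logvar.exp() + (post_mu - prior_mu) ** 2)
                    / prior_logvar.exp() - 1.0).sum(-1)
        return match_vec, kl
```

At inference time, z would be drawn from the prior network alone, so each sample can surface a different set of matching replies.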
Diversifying Reply Suggestions Using a Matching-Conditional Variational Autoencoder
2019
Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics (NAACL-HLT)
We consider the problem of diversifying automated reply suggestions for a commercial instant-messaging (IM) system (Skype). ...
To diversify responses, we formulate the model as a generative latent variable model with Conditional Variational Auto-Encoder (M-CVAE). ...
To this end, we propose the Matching-CVAE (M-CVAE) architecture, which introduces a generative LVM on the Matching-IR model using the neural variational autoencoder (VAE) framework (Kingma and Welling ...
doi:10.18653/v1/n19-2006
dblp:conf/naacl/DebBS19
fatcat:wfrcwx7j7zbtpohkvi7dynucxi
A Conditional Generative Matching Model for Multi-lingual Reply Suggestion
[article]
2021
arXiv
pre-print
While prior works largely focus on monolingual models, we propose Conditional Generative Matching models (CGM), optimized within a Variational Autoencoder framework to address challenges arising from multi-lingual ...
We study the problem of multilingual automated reply suggestions (RS) model serving many languages simultaneously. ...
Diversifying reply suggestions using a matching-conditional variational autoencoder. In NAACL-HLT.
Jacob Devlin, Ming-Wei Chang, Kenton Lee, and Kristina Toutanova. 2019. ...
arXiv:2109.07046v1
fatcat:36qxed7xlnbvrfiyz2wz5whiji
Guiding Variational Response Generator to Exploit Persona
[article]
2020
arXiv
pre-print
This paper proposes to adopt the personality-related characteristics of human conversations into variational response generators, by designing a specific conditional variational autoencoder based deep ...
We sincerely thank the anonymous reviewers for their thorough reviewing and valuable suggestions. ...
CVAE: Conditional Variational AutoEncoder with user information as prior knowledge for modeling persona. Similar to VAE, a bag-of-words loss is applied in CVAE. ...
arXiv:1911.02390v2
fatcat:nklapps6mvd3zfv4jfcqkbsjsq
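The persona-CVAE entry above notes that, as in the VAE baseline, a bag-of-words loss is applied. That auxiliary loss (popularized by Zhao et al., 2017) asks the latent code to predict the response's words as an unordered bag, which discourages posterior collapse. A minimal sketch, with assumed tensor names and shapes:

```python
# Illustrative sketch of a bag-of-words auxiliary loss for a (C)VAE dialogue
# model; tensor names, shapes, and sizes below are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class BagOfWordsLoss(nn.Module):
    def __init__(self, latent_dim=64, vocab_size=30000):
        super().__init__()
        # Predicts a distribution over the whole vocabulary from the latent code.
        self.bow_projection = nn.Linear(latent_dim, vocab_size)

    def forward(self, z, response_token_ids, pad_id=0):
        # z: [batch, latent_dim]; response_token_ids: [batch, seq_len].
        # log p(w | z) for every vocabulary word, shared across positions.
        log_probs = F.log_softmax(self.bow_projection(z), dim=-1)
        # Log-probability of each reference token, with padding masked out.
        token_log_probs = log_probs.gather(-1, response_token_ids)
        mask = (response_token_ids != pad_id).float()
        # Negative log-likelihood of the response as an unordered bag of words.
        return -(token_log_probs * mask).sum(-1).mean()
```

This term is typically added to the usual reconstruction and KL terms with a small weight, forcing z to carry content information about the response.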
MojiTalk: Generating Emotional Responses at Scale
2018
Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
We investigate several conditional variational autoencoders trained on these conversations, which allow us to use emojis to control the emotion of the generated text. ...
Generating emotional language is a key step towards building empathetic natural language processing agents. ...
Conditional Variational Autoencoder (CVAE): Having similar encoder-decoder structures, SEQ2SEQ can be easily extended to a Conditional Variational Autoencoder (CVAE) (Sohn et al., 2015). ...
doi:10.18653/v1/p18-1104
dblp:conf/acl/WangZ18
fatcat:u6cjgc42vrd5ldz23waw6lov5e
Diversifying Dialogue Generation with Non-Conversational Text
[article]
2020
arXiv
pre-print
In this paper, we propose a new perspective to diversify dialogue generation by leveraging non-conversational text. ...
We further present a training paradigm to effectively incorporate these texts via iterative back translation. ...
CVAE: The conditional variational autoencoder (Serban et al., 2017b; Zhao et al., 2017), which injects diversity by imposing stochastic latent variables. ...
arXiv:2005.04346v2
fatcat:iduxvzafffa7vizrd6u3g3ud3a
Jointly Optimizing Diversity and Relevance in Neural Response Generation
2019
Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics (NAACL-HLT)
As a result, our approach induces a latent space in which the distance and direction from the predicted response vector roughly match the relevance and diversity, respectively. ...
In this paper, we propose a SPACEFUSION model to jointly optimize diversity and relevance that essentially fuses the latent space of a sequence-to-sequence model and that of an autoencoder model by leveraging ...
For instance, Zhao et al. (2017) present an approach to enhancing diversity by mapping diverse responses to a probability distribution using a conditional variational autoencoder (CVAE). ...
doi:10.18653/v1/n19-1125
dblp:conf/naacl/GaoLZBGGD19
fatcat:mrrunqfozje3ngxt6bmlcqhndi
Jointly Optimizing Diversity and Relevance in Neural Response Generation
[article]
2019
arXiv
pre-print
As a result, our approach induces a latent space in which the distance and direction from the predicted response vector roughly match the relevance and diversity, respectively. ...
In this paper, we propose a SpaceFusion model to jointly optimize diversity and relevance that essentially fuses the latent space of a sequence-to-sequence model and that of an autoencoder model by leveraging ...
For instance, Zhao et al. (2017) present an approach to enhancing diversity by mapping diverse responses to a probability distribution using a conditional variational autoencoder (CVAE). ...
arXiv:1902.11205v3
fatcat:g6p5rvlq7fdsrlcuuxwu743oyu
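Both SpaceFusion entries above describe a latent space in which the distance and direction from the predicted response vector roughly track relevance and diversity. A toy sketch of how candidate points might be drawn around that predicted vector at generation time (the names, shapes, and fixed-radius choice are assumptions, not the paper's exact procedure):

```python
# Illustrative sketch of the latent-space geometry described above: replies are
# decoded from points around the predicted response vector, where the radius
# loosely trades relevance for diversity. All names and shapes are assumptions.
import torch

def sample_response_vectors(predicted_vec, num_samples=5, radius=1.0):
    """Sample points at a fixed distance from a single predicted vector.

    predicted_vec: tensor of shape [latent_dim].
    """
    # Random unit directions in the latent space.
    directions = torch.randn(num_samples, predicted_vec.size(-1))
    directions = directions / directions.norm(dim=-1, keepdim=True)
    # Larger radius -> more diverse but potentially less relevant replies.
    return predicted_vec.unsqueeze(0) + radius * directions

# Usage: z_points = sample_response_vectors(z_pred, radius=0.5); each point
# would then be fed to the decoder to produce one candidate reply.
```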
Conditional Text Generation for Harmonious Human-Machine Interaction
[article]
2020
arXiv
pre-print
Conditional Text Generation (CTG) has thus become a research hotspot; as a promising research field, it has attracted considerable research effort. ...
Therefore, we aim to give a comprehensive review of the new research trends of CTG. ...
CONCLUSION We have made a systematic review of the research trends of conditional text generation (c-TextGen). ...
arXiv:1909.03409v2
fatcat:s2zfmwxtubgwjks4luoby6vdoq
Natural Language Generation with Neural Variational Models
[article]
2018
arXiv
pre-print
Specifically, we implement two sequence-to-sequence neural variational models - variational autoencoders (VAE) and variational encoder-decoders (VED). ...
In order to circumvent this issue, we propose the variational attention mechanism where the attention context vector is modeled as a random variable that can be sampled from a distribution. ...
Acknowledgements This thesis would not have been possible without the constant support that I received from a number of people. ...
arXiv:1808.09012v1
fatcat:2s5l5k5cr5bg3oqzvpbb5yt3zi
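The thesis entry above proposes a variational attention mechanism in which the attention context vector is treated as a random variable sampled from a distribution. One plausible minimal reading, with the context reparameterized as a diagonal Gaussian whose mean is the ordinary attention sum (module names and dimensions are assumptions):

```python
# Illustrative sketch of variational attention: the attention context vector
# becomes a Gaussian random variable instead of a deterministic sum.
# Module names and dimensions below are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class VariationalAttention(nn.Module):
    def __init__(self, hidden_dim=256):
        super().__init__()
        self.to_logvar = nn.Linear(hidden_dim, hidden_dim)

    def forward(self, decoder_state, encoder_states):
        # decoder_state: [batch, dim]; encoder_states: [batch, src_len, dim].
        # Standard dot-product attention weights over the source states.
        scores = torch.einsum('bd,btd->bt', decoder_state, encoder_states)
        weights = F.softmax(scores, dim=-1)
        # The deterministic context is reused as the mean of the distribution.
        mean = torch.einsum('bt,btd->bd', weights, encoder_states)
        logvar = self.to_logvar(mean)
        # Reparameterized sample of the context vector.
        context = mean + torch.randn_like(mean) * torch.exp(0.5 * logvar)
        return context, mean, logvar
```

A KL term on (mean, logvar) against a chosen prior would then be added to the training objective, mirroring the standard VAE recipe.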
Deep Latent-Variable Models for Text Generation
[article]
2022
arXiv
pre-print
The end-to-end approach conflates all sub-modules, which used to be designed by complex handcrafted rules, into a holistic encode-decode architecture. ...
As a result, it is difficult to trust the output from them in real-life applications. ...
Conditional Variational Autoencoder. ...
arXiv:2203.02055v1
fatcat:sq3upxl7xvfnhigoc7apszomwu
Recent Advances in Neural Text Generation: A Task-Agnostic Survey
[article]
2022
arXiv
pre-print
This paper presents a task-agnostic survey of recent advances in neural text generation. ...
., 2014) can search best-match text as the reply to a user-issued utterance. ...
The aforementioned Variational Autoencoder (VAE), for instance, is modified to a Variational Recurrent Autoencoder (VRAE) for text generation (Fabius et al., 2015; Chien and Wang, 2019) . ...
arXiv:2203.03047v1
fatcat:iupgvcw2hbge5ioy6quiotnra4
Challenges in Building Intelligent Open-domain Dialog Systems
[article]
2020
arXiv
pre-print
Consistency requires the system to demonstrate a consistent personality to win users' trust and gain their long-term confidence. ...
Semantics requires a dialog system to not only understand the content of the dialog but also identify user's social needs during the conversation. ...
In [195], a conditional variational autoencoder is proposed to generate more emotional responses conditioned on an input post and some pre-specified emojis. Huber et al. ...
arXiv:1905.05709v3
fatcat:vdibhr4sobgufgcab2cyfyg7wy
Data Manipulation: Towards Effective Instance Learning for Neural Dialogue Generation via Learning to Augment and Reweight
[article]
2020
arXiv
pre-print
As such, a reliable training corpus is the crux of building a robust and well-behaved dialogue model. ...
In this paper, we propose a data manipulation framework to proactively reshape the data distribution towards reliable samples by augmenting and highlighting effective learning samples as well as reducing ...
Acknowledgments We would like to thank all the reviewers for their insightful and valuable comments and suggestions. ...
arXiv:2004.02594v5
fatcat:yxg4fizbgndubh6y62q4sigagy
Cryptocurrency trading: a comprehensive survey
2022
Financial Innovation
and extreme condition, prediction of volatility and return, crypto-assets portfolio construction and crypto-assets, technical trading and others). ...
This paper provides a comprehensive survey of cryptocurrency trading research, by covering 146 research papers on various aspects of cryptocurrency trading (e.g., cryptocurrency trading systems, bubble ...
Cryptocurrency exchanges can be market makers (usually using the bid-ask spread as a commission for services) or a matching platform (simply charging fees). ...
doi:10.1186/s40854-021-00321-6
fatcat:d3d2pkxy5fgcfa4s6gi4h2snua
Showing results 1 — 15 out of 39 results