743 Hits in 9.7 sec

Controllable Dialogue Generation with Disentangled Multi-grained Style Specification and Attribute Consistency Reward [article]

Zhe Hu, Zhiwei Cao, Hou Pong Chan, Jiachen Liu, Xinyan Xiao, Jinsong Su, Hua Wu
2021 arXiv   pre-print
Furthermore, we train our model with an attribute consistency reward to promote response control with explicit supervision signals.  ...  In this paper, we propose a controllable dialogue generation model to steer response generation under multi-attribute constraints.  ...  In this paper, we propose CRAYON, a framework to generate Controllable Response with multi-grAined stYle specification and attribute cONsistency reward.  ... 
arXiv:2109.06717v1 fatcat:mt3cill7hvg5xb6x3uplcmw3qu
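The attribute-consistency reward described in this entry can be sketched as a classifier-based reward that scales a REINFORCE-style loss. This is a minimal, hypothetical sketch: the function names, attribute dictionary interface, and baseline term are illustrative assumptions, not the paper's actual implementation.

```python
# Hedged sketch of an attribute-consistency reward for controllable generation.
# The reward is the fraction of target attribute constraints that an external
# attribute classifier detects in the generated response; it then weights a
# policy-gradient (REINFORCE-style) loss.

def attribute_consistency_reward(predicted_attrs, target_attrs):
    """Fraction of target attributes the generated response satisfies."""
    if not target_attrs:
        return 1.0
    matched = sum(1 for k, v in target_attrs.items()
                  if predicted_attrs.get(k) == v)
    return matched / len(target_attrs)

def reinforce_loss(sum_log_prob, reward, baseline=0.0):
    """Negative log-likelihood of the sampled response, weighted by the
    advantage (reward minus a baseline)."""
    return -(reward - baseline) * sum_log_prob
```

For example, a response matching one of two requested attributes receives a reward of 0.5, and a reward above the baseline pushes the loss to reinforce that sample.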

Text Style Transfer: A Review and Experimental Evaluation [article]

Zhiqiang Hu, Roy Ka-Wei Lee, Charu C. Aggarwal, Aston Zhang
2021 arXiv   pre-print
Specifically, researchers have investigated the Text Style Transfer (TST) task, which aims to change the stylistic properties of the text while retaining its style-independent content.  ...  Finally, we expand on current trends and provide new perspectives on the new and exciting developments in the TST field.  ...  For example, pre-trained style attribute codes can be leveraged in machine translation methods to translate language and generate text in specific styles. 5.6.1 Attribute Control Generation with Content-Style  ... 
arXiv:2010.12742v2 fatcat:gmkjxf7f7jhivbo6mayaxjsk7q

Neural Language Generation: Formulation, Methods, and Evaluation [article]

Cristina Garbacea, Qiaozhu Mei
2020 arXiv   pre-print
Recent advances in neural network-based generative modeling have reignited hopes of having computer systems capable of seamlessly conversing with humans and of understanding natural language.  ...  In this survey we formally define and categorize the problem of natural language generation.  ...  Style-independent content representations are learnt via disentangled latent representations for generating sentences with controllable style attributes (Shen et al., 2017).  ... 
arXiv:2007.15780v1 fatcat:oixtreazxvbgvclicpxiqzbxrm

Deep Learning for Text Style Transfer: A Survey [article]

Di Jin, Zhijing Jin, Zhiting Hu, Olga Vechtomova, Rada Mihalcea
2021 arXiv   pre-print
Text style transfer is an important task in natural language generation, which aims to control certain attributes in the generated text, such as politeness, emotion, humor, and many others.  ...  In this paper, we present a systematic survey of the research on neural text style transfer, spanning over 100 representative articles since the first neural text style transfer work in 2017.  ... 
arXiv:2011.00416v5 fatcat:wfw3jfh2mjfupbzrmnztsqy4ny

Emotion Intensity and its Control for Emotional Voice Conversion [article]

Kun Zhou, Berrak Sisman, Rajib Rana, Björn W. Schuller, Haizhou Li
2022 arXiv   pre-print
We propose to disentangle the speaker style from linguistic content and encode the speaker style into a style embedding in a continuous space that forms the prototype of emotion embedding.  ...  We further learn the actual emotion encoder from an emotion-labelled database and study the use of relative attributes to represent fine-grained emotion intensity.  ...  The authors would like to thank the anonymous reviewers for their insightful comments, Dr Bin Wang for valuable discussions and Dr Rui Liu for sharing part of the codes.  ... 
arXiv:2201.03967v2 fatcat:22h7iuofrnd33cf23xzrjun37m

From Theories on Styles to their Transfer in Text: Bridging the Gap with a Hierarchical Survey [article]

Enrica Troiano and Aswathy Velutharambath and Roman Klinger
2021 arXiv   pre-print
As a natural language generation task, style transfer aims at re-writing existing texts, and specifically, it creates paraphrases that exhibit some desired stylistic attributes.  ...  Hence, our review shows how the groups relate to one another, and where specific styles, including some that have never been explored, belong in the hierarchy.  ...  Acknowledgements This work was supported by Deutsche Forschungsgemeinschaft (project CEAT, KL 2869/1-2) and the Leibniz WissenschaftsCampus Tübingen "Cognitive Interfaces".  ... 
arXiv:2110.15871v2 fatcat:ddpowdm6pbazzd5mwl65nrge5q

Deep Learning for Text Style Transfer: A Survey

Di Jin, Zhijing Jin, Zhiting Hu, Olga Vechtomova, Rada Mihalcea
2021 Computational Linguistics  
Text style transfer is an important task in natural language generation, which aims to control certain attributes in the generated text, such as politeness, emotion, humor, and many others.  ...  In this paper, we present a systematic survey of the research on neural text style transfer, spanning over 100 representative articles since the first neural text style transfer work in 2017.  ... 
doi:10.1162/coli_a_00426 fatcat:v7vmb62ckfcu5k5mpu2pydnrxy

"Transforming" Delete, Retrieve, Generate Approach for Controlled Text Style Transfer

Akhilesh Sudhakar, Bhargav Upadhyay, Arjun Maheswaran
2019 Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP)  
GST is a part of a larger 'Delete Retrieve Generate' framework, in which we also propose a novel method of deleting style attributes from the source sentence by exploiting the inner workings of the Transformer  ...  In this work we introduce the Generative Style Transformer (GST) -a new approach to rewriting sentences to a target style in the absence of parallel style corpora.  ...  Acknowledgments The authors of this paper would like to thank Swati Tiwari, Nishant Thakur and Akshit Mittal for their help with evaluation of results, and the anonymous reviewers for their suggestions  ... 
doi:10.18653/v1/d19-1322 dblp:conf/emnlp/SudhakarUM19 fatcat:k4qsnrsu4bdihaixlyn5h2snpa
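The "Delete" step this entry describes can be illustrated with a minimal sketch: tokens whose salience toward the source style exceeds a threshold are treated as candidate style-attribute markers and removed, leaving the content words behind. The per-token scores are assumed inputs here; in the paper they are derived from the inner workings of a Transformer-based classifier.

```python
# Hedged sketch of the "Delete" step in a Delete-Retrieve-Generate pipeline.
# Salience scores (in [0, 1]) are assumed to come from an attention-based
# style classifier; tokens above the threshold are removed as style markers.

def delete_style_attributes(tokens, salience, threshold=0.5):
    """Split tokens into kept content words and removed style-marker words."""
    content = [t for t, s in zip(tokens, salience) if s < threshold]
    removed = [t for t, s in zip(tokens, salience) if s >= threshold]
    return content, removed
```

For a negative-sentiment source such as "the food was terrible", a well-calibrated classifier would assign high salience to "terrible", leaving "the food was" as the style-neutral content to be rewritten in the target style.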

Transforming Delete, Retrieve, Generate Approach for Controlled Text Style Transfer [article]

Akhilesh Sudhakar, Bhargav Upadhyay, Arjun Maheswaran
2019 arXiv   pre-print
GST is a part of a larger 'Delete Retrieve Generate' framework, in which we also propose a novel method of deleting style attributes from the source sentence by exploiting the inner workings of the Transformer  ...  In this work we introduce the Generative Style Transformer (GST) - a new approach to rewriting sentences to a target style in the absence of parallel style corpora.  ...  Acknowledgments The authors of this paper would like to thank Swati Tiwari, Nishant Thakur and Akshit Mittal for their help with evaluation of results, and the anonymous reviewers for their suggestions  ... 
arXiv:1908.09368v1 fatcat:5myr2ihujndhbb47yrqf7kxe3i

Challenges in Building Intelligent Open-domain Dialog Systems [article]

Minlie Huang, Xiaoyan Zhu, Jianfeng Gao
2020 arXiv   pre-print
Consistency requires the system to demonstrate a consistent personality to win users' trust and gain their long-term confidence.  ...  Interactiveness refers to the system's ability to generate interpersonal responses to achieve particular social goals such as entertainment, conforming, and task completion.  ...  [116] discussed four attributes that are associated with the control of open-domain dialog generation: repetition, specificity, response-relatedness, and question-asking.  ... 
arXiv:1905.05709v3 fatcat:vdibhr4sobgufgcab2cyfyg7wy

Multimodal Research in Vision and Language: A Review of Current and Emerging Trends [article]

Shagun Uppal, Sarthak Bhagat, Devamanyu Hazarika, Navonil Majumdar, Soujanya Poria, Roger Zimmermann, Amir Zadeh
2020 arXiv   pre-print
We also address task-specific trends, along with their evaluation strategies and upcoming challenges.  ...  We look at its applications in their task formulations and how to solve various problems related to semantic perception and content generation.  ...  Also, a graph attention framework with multi-view memory was used for the task of top-n recommendation as per user-specific attributes [297].  ... 
arXiv:2010.09522v2 fatcat:l4npstkoqndhzn6hznr7eeys4u

Spoken Language Interaction with Robots: Research Issues and Recommendations, Report from the NSF Future Directions Workshop [article]

Matthew Marge, Carol Espy-Wilson, Nigel Ward
2020 arXiv   pre-print
The result is this report, in which we identify key scientific and engineering advances needed. Our recommendations broadly relate to eight general themes.  ...  Third, for robustness, robots need higher-bandwidth communication with users and better handling of uncertainty, including simultaneous consideration of multiple hypotheses and goals.  ...  Hill, Nia Peters, Erion Plaku, Christopher Reardon, and Clare Voss. A hearty thanks also to Erin Zaroukian for her careful reading and thoughtful comments on the writing, Chad M.  ... 
arXiv:2011.05533v1 fatcat:gm4e3fj6sjhstm62naqzetblgi

Vision-Language Navigation: A Survey and Taxonomy [article]

Wansen Wu, Tao Chang, Xinmeng Li
2022 arXiv   pre-print
., single-turn and multi-turn tasks.  ...  Finally, we discuss several open issues of VLN and point out some opportunities in the future, i.e., incorporating knowledge with VLN models and implementing them in the real physical world.  ...  ACKNOWLEDGMENT The work described in this paper was sponsored in part by the National Natural Science Foundation of China under Grant No. 62103420 and 62103428 , the Natural Science Fund of Hunan Province  ... 
arXiv:2108.11544v3 fatcat:qo5g237si5cwtewxiaeqtjwqpy

Deep Latent-Variable Models for Text Generation [article]

Xiaoyu Shen
2022 arXiv   pre-print
It covers a wide range of applications like machine translation, document summarization, dialogue generation and so on.  ...  Nonetheless, deep learning models are known to be extremely data-hungry, and text generated from them usually suffers from low diversity, interpretability and controllability.  ...  How latent variables can improve the controllability and interpretability of text generation by adding finer-grained, latent specifications on the intermediate generation process.  ... 
arXiv:2203.02055v1 fatcat:sq3upxl7xvfnhigoc7apszomwu

Recent Trends in Deep Learning Based Natural Language Processing [article]

Tom Young, Devamanyu Hazarika, Soujanya Poria, Erik Cambria
2018 arXiv   pre-print
We also summarize, compare and contrast the various models and put forward a detailed understanding of the past, present and future of deep learning in NLP.  ...  In this paper, we review significant deep learning related models and methods that have been employed for numerous NLP tasks and provide a walk-through of their evolution.  ...  [135] proposed generating sentences whose attributes are controlled by learning disentangled latent representations with designated semantics.  ... 
arXiv:1708.02709v8 fatcat:guliplxoqfb43pw7jtlrhebcui
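The disentangled-representation idea cited in this last entry (attributes controlled via designated latent codes) can be sketched in toy form: the decoder input concatenates a content vector with a one-hot attribute code, so flipping the code alone changes the controlled attribute while the content half stays fixed. The two-attribute vocabulary and function names below are illustrative assumptions, not the cited paper's interface.

```python
# Toy sketch of attribute control via a disentangled latent code: the decoder
# is conditioned on [content ; attribute_code], so the attribute can be
# switched independently of the content representation.

ATTRS = ["negative", "positive"]  # illustrative attribute vocabulary

def one_hot(attr):
    """One-hot code for the designated, controllable attribute."""
    vec = [0.0] * len(ATTRS)
    vec[ATTRS.index(attr)] = 1.0
    return vec

def decoder_input(content_vec, attr):
    """Concatenate the fixed content representation with the attribute code;
    a decoder conditioned on this vector regenerates the content under the
    requested attribute."""
    return list(content_vec) + one_hot(attr)
```

Transferring style then amounts to re-decoding the same content vector with a different attribute code, which is the mechanism several of the surveyed disentanglement approaches share.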
Showing results 1 — 15 out of 743 results