258 Hits in 4.1 sec

Reformulating Unsupervised Style Transfer as Paraphrase Generation [article]

Kalpesh Krishna, John Wieting, Mohit Iyyer
2020 arXiv   pre-print
In this paper, we reformulate unsupervised style transfer as a paraphrase generation problem, and present a simple methodology based on fine-tuning pretrained language models on automatically generated  ...  Modern NLP defines the task of style transfer as modifying the style of a given sentence without appreciably changing its semantics, which implies that the outputs of style transfer systems should be paraphrases  ...  Motivated by this result, we reformulate style transfer as a controlled paraphrase generation task. We call our method STRAP, or Style Transfer via Paraphrasing.  ... 
arXiv:2010.05700v1 fatcat:6oe76tfx7zc3flkr5vt32baecq

Reformulating Unsupervised Style Transfer as Paraphrase Generation

Kalpesh Krishna, John Wieting, Mohit Iyyer
2020 Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)   unpublished
In this paper, we reformulate unsupervised style transfer as a paraphrase generation problem, and present a simple methodology based on fine-tuning pretrained language models on automatically generated  ...  Modern NLP defines the task of style transfer as modifying the style of a given sentence without appreciably changing its semantics, which implies that the outputs of style transfer systems should be paraphrases  ...  Motivated by this result, we reformulate style transfer as a controlled paraphrase generation task. We call our method STRAP, or Style Transfer via Paraphrasing.  ... 
doi:10.18653/v1/2020.emnlp-main.55 fatcat:xag5vgbg4jatznd7bydkozakzu

From Paraphrasing to Semantic Parsing: Unsupervised Semantic Parsing via Synchronous Semantic Decoding [article]

Shan Wu, Bo Chen, Chunlei Xin, Xianpei Han, Le Sun, Weipeng Zhang, Jiansong Chen, Fan Yang, Xunliang Cai
2021 arXiv   pre-print
Specifically, we reformulate semantic parsing as a constrained paraphrasing problem: given an utterance, our model synchronously generates its canonical utterance and meaning representation.  ...  In this paper, we propose an unsupervised semantic parsing method - Synchronous Semantic Decoding (SSD), which can simultaneously resolve the semantic gap and the structure gap by jointly leveraging paraphrasing  ...  Given an utterance, SSD reformulates semantic parsing as a constrained paraphrasing problem, and synchronously generates its canonical utterance and logical form.  ... 
arXiv:2106.06228v1 fatcat:qjz7uhi4ivctpnzcdazge5jsm4

Data-driven Paraphrasing and Stylistic Harmonization

Gerold Hintz
2016 Proceedings of the NAACL Student Research Workshop  
This thesis proposal outlines the use of unsupervised data-driven methods for paraphrasing tasks.  ...  We describe a method for unsupervised relation extraction, which we aim to leverage in lexical substitution as a replacement for knowledge-based resources.  ...  Acknowledgments This work has been supported by the German Research Foundation as part of the Research Training Group "Adaptive Preparation of Information from Heterogeneous Sources" (AIPHES) under grant  ... 
doi:10.18653/v1/n16-2006 dblp:conf/naacl/Hintz16 fatcat:csble2feovdangevaskolklqsi

Exploring Cross-lingual Textual Style Transfer with Large Multilingual Language Models [article]

Daniil Moskovskiy, Daryna Dementieva, Alexander Panchenko
2022 arXiv   pre-print
Experiments show that multilingual models are capable of performing multilingual style transfer.  ...  Detoxification is the task of generating text in a polite style while preserving the meaning and fluency of the original toxic text.  ...  In the multilingual setup we experimentally show that reformulating detoxification (textual style transfer) as an NMT task boosts the performance of the models given enough parallel data for training.  ... 
arXiv:2206.02252v1 fatcat:qw4scxif3bgjxdyv3fhzqbocmu

Learning semantic similarity in a continuous space

Michel Deudon
2018 Neural Information Processing Systems  
We first learn to repeat, reformulate questions to infer intents as normal distributions with a deep generative model [2] (variational autoencoder).  ...  Our work sheds light on how deep generative models can approximate distributions (semantic representations) to effectively measure semantic similarity with meaningful distance metrics from Information  ...  We also thank Professor Francis Bach and Professor Guillaume Obozinski for their insightful course on probabilistic graphical models at ENS Cachan, as well as Professor Michalis Vazirgiannis for his course  ... 
dblp:conf/nips/Deudon18 fatcat:4wl4phdspffinomr5i5of76i7e

Conditional Text Paraphrasing: A Survey and Taxonomy

Ahmed H. Al-Ghidani, Aly A.
2018 International Journal of Advanced Computer Science and Applications  
The target of this taxonomy is to expand the definition of text paraphrasing by adding conditional constraints as features that either control paraphrase generation or discrimination.  ...  This work introduces a survey of the text paraphrasing task.  ...  Recently, the great successes of deep generative models [17] such as Variational Autoencoders (VAEs) [18] and Generative Adversarial Networks (GANs) [19] have had a great impact on unsupervised learning  ... 
doi:10.14569/ijacsa.2018.091182 fatcat:iox6cqoab5hjlnczf5znpcg2xe

Deep Learning for Text Style Transfer: A Survey [article]

Di Jin, Zhijing Jin, Zhiting Hu, Olga Vechtomova, Rada Mihalcea
2021 arXiv   pre-print
Text style transfer is an important task in natural language generation, which aims to control certain attributes in the generated text, such as politeness, emotion, humor, and many others.  ...  In this paper, we present a systematic survey of the research on neural text style transfer, spanning over 100 representative articles since the first neural text style transfer work in 2017.  ...  Reformulating unsupervised style transfer as paraphrase generation.  ...  obfuscation (best of the labs track at CLEF-2017).  ... 
arXiv:2011.00416v5 fatcat:wfw3jfh2mjfupbzrmnztsqy4ny

Training Naturalized Semantic Parsers with Very Little Data [article]

Subendhu Rongali, Konstantine Arkoudas, Melanie Rubino, Wael Hamza
2022 arXiv   pre-print
Our method is based on a novel synthesis of four techniques: joint training with auxiliary unsupervised tasks; constrained decoding; self-training; and paraphrasing.  ...  Semantic parsing is an important NLP problem, particularly for voice assistants such as Alexa and Google Assistant.  ...  We also introduce self-training and paraphrasing steps to augment the initial data and further improve model performance.  ... 
arXiv:2204.14243v2 fatcat:fl4xc7l7ybfr5ltsd7yi5dgime

Deep Learning for Text Style Transfer: A Survey

Di Jin, Zhijing Jin, Zhiting Hu, Olga Vechtomova, Rada Mihalcea
2021 Computational Linguistics  
Text style transfer is an important task in natural language generation, which aims to control certain attributes in the generated text, such as politeness, emotion, humor, and many others.  ...  In this paper, we present a systematic survey of the research on neural text style transfer, spanning over 100 representative articles since the first neural text style transfer work in 2017.  ...  Reformulating unsupervised style transfer as paraphrase generation.  ...  obfuscation (best of the labs track at CLEF-2017).  ... 
doi:10.1162/coli_a_00426 fatcat:v7vmb62ckfcu5k5mpu2pydnrxy

From Theories on Styles to their Transfer in Text: Bridging the Gap with a Hierarchical Survey [article]

Enrica Troiano and Aswathy Velutharambath and Roman Klinger
2022 arXiv   pre-print
As a natural language generation task, style transfer aims at rewriting existing texts, and specifically, it creates paraphrases that exhibit some desired stylistic attributes.  ...  Several style-aware paraphrasing methods have attempted to tackle style transfer.  ...  Chakrabarty et al. (2020b) were the first to frame simile generation as a style transfer task.  ... 
arXiv:2110.15871v4 fatcat:zqzpmd6ennhqzhwedwdlqizf7y

Preventing Author Profiling through Zero-Shot Multilingual Back-Translation [article]

David Ifeoluwa Adelani, Miaoran Zhang, Xiaoyu Shen, Ali Davody, Thomas Kleinbauer, Dietrich Klakow
2021 arXiv   pre-print
Style transfer is an effective way of transforming texts in order to remove any information that enables author profiling.  ...  We compare our models with five representative text style transfer models on three datasets across different domains.  ...  Reformulating unsupervised style transfer as paraphrase generation. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), pages 737-762, Online.  ... 
arXiv:2109.09133v1 fatcat:7ut56tbw4zbyhnbtzqhtgrjds4

On The Ingredients of an Effective Zero-shot Semantic Parser [article]

Pengcheng Yin, John Wieting, Avirup Sil, Graham Neubig
2021 arXiv   pre-print
We propose bridging these gaps using improved grammars, stronger paraphrasers, and efficient learning methods using canonical examples that most likely reflect real user intents.  ...  Recent studies have performed zero-shot learning by synthesizing training examples of canonical utterances and programs from a grammar, and further paraphrasing these utterances to improve linguistic diversity  ...  Reformulating unsupervised style transfer as paraphrase generation. In Proceedings of EMNLP. Catherine Finegan-Dollak, Jonathan K.  ... 
arXiv:2110.08381v1 fatcat:xw4l4jg6fbhxxbh5f2xdrtzuvq

Diversifying Task-oriented Dialogue Response Generation with Prototype Guided Paraphrasing [article]

Phillip Lippe, Pengjie Ren, Hinda Haned, Bart Voorn, Maarten de Rijke
2020 arXiv   pre-print
To introduce diversity, P2-Net randomly samples previous conversational utterances as prototypes, from which the model can then extract speaking style information.  ...  Instead of generating a response from scratch, P2-Net generates system responses by paraphrasing template-based responses.  ...  Paraphrasing Corpus-based DRG task differs from previous work on diversifying text generation through style transfer [11, 12] , which aims to rewrite a sentence with a target style, while keeping the  ... 
arXiv:2008.03391v1 fatcat:tzsqscbvorbodglacz7dyw5jci

Entailment as Few-Shot Learner [article]

Sinong Wang, Han Fang, Madian Khabsa, Hanzi Mao, Hao Ma
2021 arXiv   pre-print
The key idea of this approach is to reformulate a potential NLP task as an entailment one, and then fine-tune the model with as few as 8 examples.  ...  We further demonstrate our proposed method can be: (i) naturally combined with an unsupervised contrastive learning-based data augmentation method; (ii) easily extended to multilingual few-shot learning  ...  Unsupervised Contrastive Learning In our proposed framework, we reformulate various NLP tasks as a textual entailment task, which takes a sentence pair as input.  ... 
arXiv:2104.14690v1 fatcat:ya4uq45hcrgsrmgypjwucjlx4y
Showing results 1–15 of 258