
Unsupervised Text Style Transfer with Content Embeddings

Keith Carlson (Department of Computer Science, Dartmouth College, Hanover, NH 03755, USA), Allen Riddell (Department of Information and Library Science, Indiana University Bloomington, Bloomington, IN 47405, USA), Daniel Rockmore (Departments of Computer Science and Mathematics, Dartmouth College, Hanover, NH 03755, USA; The Santa Fe Institute, Santa Fe, NM 87501, USA)
2021 Proceedings of the Conference Recent Advances in Natural Language Processing - Deep Learning for Natural Language Processing Methods and Applications   unpublished
Unsupervised Text Style Transfer with Content Embeddings Keith Carlson Allen Riddell Department of Computer  ...  Style transfer for texts: Retrain, report errors, compare with rewrites.  ... 
doi:10.26615/978-954-452-072-4_027 fatcat:fs6lx3k7cvbwldsflt4emqt7tq

Transductive Learning for Unsupervised Text Style Transfer [article]

Fei Xiao, Liang Pang, Yanyan Lan, Yan Wang, Huawei Shen, Xueqi Cheng
2021 arXiv   pre-print
Unsupervised style transfer models are mainly based on an inductive learning approach, which represents the style as embeddings, decoder parameters, or discriminator parameters and directly applies these  ...  Specifically, an attentional encoder-decoder with a retriever framework is utilized. It involves top-K relevant sentences in the target style in the transfer process.  ...  Then, decode or rewrite the style-independent content with the target style embedding.  ... 
arXiv:2109.07812v1 fatcat:vgjhlt2eafh3lcacjzxrqukpwe
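
Editorial note: the retrieval step mentioned in this snippet (involving the top-K relevant target-style sentences) can be illustrated with a minimal sketch, assuming precomputed sentence embeddings. The function name `retrieve_top_k` and the random vectors are placeholders for illustration, not the paper's actual encoder or index.

```python
import numpy as np

def retrieve_top_k(source_emb: np.ndarray, target_corpus_embs: np.ndarray, k: int = 4):
    """Return indices of the k target-style sentences most similar to the source.

    source_emb:          (d,)   embedding of the input sentence
    target_corpus_embs:  (n, d) embeddings of sentences written in the target style
    """
    # Cosine similarity = dot product of L2-normalized vectors.
    src = source_emb / (np.linalg.norm(source_emb) + 1e-8)
    corpus = target_corpus_embs / (np.linalg.norm(target_corpus_embs, axis=1, keepdims=True) + 1e-8)
    sims = corpus @ src                 # (n,) similarity of each corpus sentence to the source
    return np.argsort(-sims)[:k]        # indices of the top-k neighbours

# Toy usage with random vectors standing in for real sentence embeddings.
rng = np.random.default_rng(0)
idx = retrieve_top_k(rng.normal(size=256), rng.normal(size=(1000, 256)), k=4)
```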

Gradient-guided Unsupervised Text Style Transfer via Contrastive Learning [article]

Chenghao Fan, Ziao Li, Wei Wei
2022 arXiv   pre-print
Text style transfer is a challenging text generation problem, which aims at altering the style of a given sentence to a target one while keeping its content unchanged.  ...  Previous approaches lack explicit modeling of content invariance and are thus susceptible to content shift between the original sentence and the transferred one, as well as to style misclassification.  ...  Style transfer is a task aimed at changing the stylistic attribute while retaining the content of the input text.  ... 
arXiv:2202.00469v1 fatcat:cg2dcekiizah3exbc52mx3sglu
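
Editorial note: the content-invariance idea described in this snippet is commonly enforced with a contrastive objective. The following is a generic InfoNCE-style sketch, not the paper's exact loss; `content_contrastive_loss` is a hypothetical name.

```python
import torch
import torch.nn.functional as F

def content_contrastive_loss(src_repr: torch.Tensor, trans_repr: torch.Tensor, temperature: float = 0.1):
    """InfoNCE-style loss: each transferred sentence should stay closest to the
    content representation of its own source, relative to other sentences in the batch.

    src_repr, trans_repr: (batch, dim) content representations.
    """
    src = F.normalize(src_repr, dim=-1)
    trans = F.normalize(trans_repr, dim=-1)
    logits = trans @ src.t() / temperature      # (batch, batch) similarity matrix
    targets = torch.arange(src.size(0))         # positives sit on the diagonal
    return F.cross_entropy(logits, targets)

# Toy usage with random representations.
loss = content_contrastive_loss(torch.randn(8, 128), torch.randn(8, 128))
```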

On the Importance of Word and Sentence Representation Learning in Implicit Discourse Relation Classification

Xin Liu, Jiefu Ou, Yangqiu Song, Xin Jiang
2020 Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence  
Implicit discourse relation classification is one of the most difficult parts of shallow discourse parsing, as relation prediction without explicit connectives requires language understanding at both the text  ...  Text style transfer aims to endow a sentence with a different style and meanwhile keep its main semantic content unaltered, which could benefit various downstream applications such as text  ...  The first paradigm explicitly disentangles text into separate content and style representations, and then combines the content with a target style to achieve transfer.  ... 
doi:10.24963/ijcai.2020/526 dblp:conf/ijcai/YiLLS20 fatcat:3cg6bqvmojfw3m6gl4p2svnsvm

Unet-TTS: Improving Unseen Speaker and Style Transfer in One-shot Voice Cloning [article]

Rui Li, Dong Pu, Minnie Huang, Bill Huang
2022 arXiv   pre-print
According to both subjective and objective evaluations of similarity, the new model outperforms both speaker embedding and unsupervised style modeling (GST) approaches on an unseen emotional corpus.  ...  One-shot voice cloning aims to transform the speaker voice and speaking style in speech synthesized from a text-to-speech (TTS) system, where only a single recording of the target reference speech can be used  ...  They use the speaker embedding in SV and the style embedding from an unsupervised style modeling method named GST [10] to provide speaker and style information, respectively.  ... 
arXiv:2109.11115v3 fatcat:t4pb4gdhajc3xecq6rdqnzyqky
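
Editorial note: the conditioning described in this snippet (a speaker embedding from SV plus a GST style embedding) can be sketched under the assumption that the two embeddings are projected and added to the text-encoder output; `SpeakerStyleConditioner` and all dimensions are illustrative, not the Unet-TTS implementation.

```python
import torch
import torch.nn as nn

class SpeakerStyleConditioner(nn.Module):
    """Project a speaker embedding and a style embedding into the text-encoder
    dimension and add them to every frame of the encoder output."""

    def __init__(self, spk_dim=256, style_dim=256, enc_dim=512):
        super().__init__()
        self.proj = nn.Linear(spk_dim + style_dim, enc_dim)

    def forward(self, enc_out, spk_emb, style_emb):
        # enc_out: (batch, time, enc_dim); spk_emb: (batch, spk_dim); style_emb: (batch, style_dim)
        cond = self.proj(torch.cat([spk_emb, style_emb], dim=-1))   # (batch, enc_dim)
        return enc_out + cond.unsqueeze(1)                          # broadcast over time

# Toy usage with random tensors.
cond = SpeakerStyleConditioner()
out = cond(torch.randn(2, 50, 512), torch.randn(2, 256), torch.randn(2, 256))
```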

SentiInc: Incorporating Sentiment Information into Sentiment Transfer Without Parallel Data [chapter]

Kartikey Pant, Yash Verma, Radhika Mamidi
2020 Lecture Notes in Computer Science  
This is done by incorporating a sentiment-based loss in the back-translation-based style transfer.  ...  Sentiment-to-sentiment transfer involves changing the sentiment of the given text while preserving the underlying information.  ...  Style Embedding [3]: In this method, the model learns a representation of the input sentence that contains only the content information, after which it learns style embeddings in addition to the content  ... 
doi:10.1007/978-3-030-45442-5_39 fatcat:dwaexzdi2bh6jmswj6w7kpqgue
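
Editorial note: a hedged sketch of how a sentiment-based loss might be combined with a back-translation reconstruction loss, as this snippet describes. The weighting `lam` and the function name are illustrative assumptions, not SentiInc's actual formulation.

```python
import torch
import torch.nn.functional as F

def sentiinc_style_objective(reconstruction_logits, reconstruction_targets,
                             sentiment_logits, target_sentiment, lam=1.0):
    """Combine a back-translation reconstruction loss with a sentiment loss that
    pushes the transferred sentence toward the target sentiment label.

    reconstruction_logits:  (batch, seq_len, vocab) decoder outputs for the round trip
    reconstruction_targets: (batch, seq_len) token ids of the original sentence
    sentiment_logits:       (batch, num_classes) classifier scores for the transferred text
    target_sentiment:       (batch,) desired sentiment labels
    """
    recon = F.cross_entropy(reconstruction_logits.transpose(1, 2), reconstruction_targets)
    senti = F.cross_entropy(sentiment_logits, target_sentiment)
    return recon + lam * senti

# Toy usage with random logits and labels.
loss = sentiinc_style_objective(torch.randn(2, 7, 100), torch.randint(0, 100, (2, 7)),
                                torch.randn(2, 2), torch.tensor([1, 0]))
```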

Style Transfer as Unsupervised Machine Translation [article]

Zhirui Zhang, Shuo Ren, Shujie Liu, Jianyong Wang, Peng Chen, Mu Li, Ming Zhou, Enhong Chen
2018 arXiv   pre-print
Language style transfer rephrases text with specific stylistic attributes while preserving the original attribute-independent content.  ...  With this constraint, in this paper, we adapt unsupervised machine translation methods for the task of automatic style transfer.  ...  To learn style transfer using non-parallel text, we design an unsupervised sequence-to-sequence training method as illustrated in Figure 2.  ... 
arXiv:1808.07894v1 fatcat:mefiv5n6bra4fcbcblxq5dx7s4
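
Editorial note: the unsupervised sequence-to-sequence training this snippet refers to is typically built around iterative back-translation. The schematic sketch below uses `model_a2b`, `model_b2a`, and `train_step` as placeholder callables rather than the paper's models.

```python
def back_translation_round(model_a2b, model_b2a, corpus_a, corpus_b, train_step):
    """One round of iterative back-translation for unsupervised style transfer.

    model_a2b / model_b2a: callables mapping a list of sentences in one style to the other.
    corpus_a / corpus_b:   monolingual (non-parallel) corpora for styles A and B.
    train_step:            callable(model, sources, targets) that updates the model
                           on the given pseudo-parallel pairs.
    All of these are placeholders for whatever encoder-decoder implementation is used.
    """
    # 1) Use the current A->B model to create synthetic style-B sources for corpus_a,
    #    then train the B->A model to reconstruct the original style-A sentences.
    synthetic_b = model_a2b(corpus_a)
    train_step(model_b2a, synthetic_b, corpus_a)

    # 2) Symmetrically, create synthetic style-A sources and train the A->B model.
    synthetic_a = model_b2a(corpus_b)
    train_step(model_a2b, synthetic_a, corpus_b)

# Toy usage with identity-like "models" just to show the data flow.
logged = []
back_translation_round(
    model_a2b=lambda xs: [x.upper() for x in xs],
    model_b2a=lambda xs: [x.lower() for x in xs],
    corpus_a=["this movie was great"],
    corpus_b=["THIS MOVIE WAS AWFUL"],
    train_step=lambda model, src, tgt: logged.append((src, tgt)),
)
```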

Plug and Play Autoencoders for Conditional Text Generation [article]

Florian Mai
2020 arXiv   pre-print
Text autoencoders are commonly used for conditional generation tasks such as style transfer.  ...  Evaluations on style transfer tasks both with and without sequence-to-sequence supervision show that our method performs better than or comparable to strong baselines while being up to four times faster  ...  In this study, we apply it to supervised and unsupervised text style transfer.  ... 
arXiv:2010.02983v2 fatcat:t27lepghkbb6ddycs5ozshqtf4

Bag-of-Vectors Autoencoders for Unsupervised Conditional Text Generation [article]

Florian Mai, James Henderson
2021 arXiv   pre-print
However, their method is restricted to autoencoders with a single-vector embedding, which limits how much information can be retained.  ...  Our experimental evaluations on unsupervised sentiment transfer and sentence summarization show that our method performs substantially better than a standard autoencoder.  ...  As both content retention and style transfer are important for style transfer, the further a graph is to the top right, the better the model.  ... 
arXiv:2110.07002v1 fatcat:b5bfzyxzjve4nhwnzegbew72bq
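
Editorial note: the single-vector bottleneck versus bag-of-vectors distinction in this snippet can be shown in a few lines; the toy GRU encoder and dimensions below are illustrative only, not the paper's architecture.

```python
import torch
import torch.nn as nn

# A toy encoder that produces per-token hidden states. In a single-vector setting
# they are pooled into one embedding, while a bag-of-vectors setting keeps them
# all, so more of the input can be retained.
encoder = nn.GRU(input_size=64, hidden_size=128, batch_first=True)
tokens = torch.randn(2, 20, 64)                  # (batch, seq_len, emb_dim)

per_token_states, _ = encoder(tokens)            # (2, 20, 128): "bag of vectors"
single_vector = per_token_states.mean(dim=1)     # (2, 128): single-vector bottleneck
```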

Tweet to News Conversion: An Investigation into Unsupervised Controllable Text Generation [article]

Zishan Ahmad, Mukuntha N S, Asif Ekbal, Pushpak Bhattacharyya
2020 arXiv   pre-print
The first system focuses on unsupervised style transfer and converts the individual tweets into news sentences.  ...  Text generator systems have become extremely popular with the advent of recent deep learning models such as encoder-decoder.  ...  We intend this to be an initial step towards solving the problem of unsupervised guided text generation, where we intend to control the style and the content of generated text.  ... 
arXiv:2008.09333v1 fatcat:zxjwac5kc5at7hyubn3xng6e2y

Text Style Transfer: A Review and Experimental Evaluation [article]

Zhiqiang Hu, Roy Ka-Wei Lee, Charu C. Aggarwal, Aston Zhang
2021 arXiv   pre-print
Specifically, researchers have investigated the Text Style Transfer (TST) task, which aims to change the stylistic properties of the text while retaining its style-independent content.  ...  This article aims to provide a comprehensive review of recent research efforts on text style transfer.  ...  The style embedding (Style-Emb) model concatenates the style embedding vector with the content representation to generate text in different styles with one decoder. This method is discussed in Section 5.4.  ... 
arXiv:2010.12742v2 fatcat:gmkjxf7f7jhivbo6mayaxjsk7q
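
Editorial note: the Style-Emb model described in this snippet (a style embedding concatenated with a content representation, decoded by one shared decoder) can be sketched as follows; the module name and sizes are assumptions for illustration.

```python
import torch
import torch.nn as nn

class StyleEmbDecoderInput(nn.Module):
    """Concatenate a learned style embedding with a content representation and
    project the result to a decoder's initial hidden state (one shared decoder)."""

    def __init__(self, num_styles=2, style_dim=64, content_dim=256, dec_dim=256):
        super().__init__()
        self.style_table = nn.Embedding(num_styles, style_dim)
        self.to_decoder = nn.Linear(content_dim + style_dim, dec_dim)

    def forward(self, content_repr, style_id):
        # content_repr: (batch, content_dim); style_id: (batch,) integer style labels
        style_vec = self.style_table(style_id)
        return torch.tanh(self.to_decoder(torch.cat([content_repr, style_vec], dim=-1)))

# Toy usage: same content vectors decoded under different target styles.
init_hidden = StyleEmbDecoderInput()(torch.randn(4, 256), torch.tensor([0, 1, 1, 0]))
```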

Controllable Unsupervised Text Attribute Transfer via Editing Entangled Latent Representation [article]

Ke Wang, Hang Hua, Xiaojun Wan
2019 arXiv   pre-print
To address the above problems, we propose a more flexible unsupervised text attribute transfer framework which replaces the process of modeling attribute with minimal editing of latent representations  ...  Unsupervised text attribute transfer automatically transforms a text to alter a specific attribute (e.g. sentiment) without using any parallel data, while simultaneously preserving its attribute-independent  ...  The dominant methods of unsupervised text attribute transfer are to separately model attribute and content representations, such as using multiple attribute-specific decoders [5] or combining the content  ... 
arXiv:1905.12926v2 fatcat:thtto4pgerfgto6l6yv5xmj27u
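
Editorial note: editing an entangled latent representation is often done by following gradients from an attribute classifier in latent space; the sketch below shows that general mechanism with a stand-in linear classifier, not the paper's specific architecture or optimizer.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def edit_latent(z, attribute_clf, target_label, steps=30, step_size=0.5):
    """Minimally edit a latent code z so that an attribute classifier operating on
    the latent space predicts the target attribute; a decoder would then generate
    text from the edited code. The classifier here is a stand-in module."""
    z = z.clone().detach().requires_grad_(True)
    target = torch.full((z.size(0),), target_label, dtype=torch.long)
    for _ in range(steps):
        loss = F.cross_entropy(attribute_clf(z), target)
        grad, = torch.autograd.grad(loss, z)
        z = (z - step_size * grad).detach().requires_grad_(True)   # gradient step on z itself
    return z.detach()

# Toy usage: push a random latent code toward attribute label 1.
clf = nn.Linear(128, 2)                       # toy latent-space attribute classifier
z_edited = edit_latent(torch.randn(1, 128), clf, target_label=1)
```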

Formality Style Transfer with Hybrid Textual Annotations [article]

Ruochen Xu, Tao Ge, Furu Wei
2019 arXiv   pre-print
Furthermore, our model can be readily adapted to other unsupervised text style transfer tasks like unsupervised sentiment transfer and achieve competitive results on three widely recognized benchmarks.  ...  Formality style transformation is the task of modifying the formality of a given sentence without changing its content. Its challenge is the lack of large-scale sentence-aligned parallel data.  ...  to other text style transfer tasks with competitive performance.  ... 
arXiv:1903.06353v1 fatcat:4q3x555lszemlakd3cv4pc4pri

Structured Content Preservation for Unsupervised Text Style Transfer [article]

Youzhi Tian, Zhiting Hu, Zhou Yu
2018 arXiv   pre-print
Text style transfer aims to modify the style of a sentence while keeping its content unchanged. Recent style transfer systems often fail to faithfully preserve the content after changing the style.  ...  Our model achieves significant improvement in terms of both content preservation and style transfer in automatic and human evaluation.  ...  When applying adversarial training to unsupervised text style transfer, the content encoder aims to fool the style discriminator by removing style information from the content embedding.  ... 
arXiv:1810.06526v2 fatcat:oz7h6qlnzbcltl3pe6pyb26cje
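
Editorial note: the adversarial setup in this snippet (a content encoder trying to fool a style discriminator) boils down to two opposing losses; the toy modules below are placeholders illustrating only the loss structure, not the paper's model.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Toy modules standing in for the real content encoder and style discriminator.
content_encoder = nn.Linear(300, 256)        # sentence features -> content embedding
style_discriminator = nn.Linear(256, 2)      # content embedding -> style logits

features = torch.randn(8, 300)
style_labels = torch.randint(0, 2, (8,))

# Discriminator step: learn to recover the style from the (detached) content embedding.
d_loss = F.cross_entropy(style_discriminator(content_encoder(features).detach()), style_labels)

# Encoder (adversarial) step: remove style information by *maximizing* the
# discriminator's loss, i.e. minimizing its negative.
adv_loss = -F.cross_entropy(style_discriminator(content_encoder(features)), style_labels)
```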

Civil Rephrases Of Toxic Texts With Self-Supervised Transformers [article]

Leo Laugier, John Pavlopoulos, Jeffrey Sorensen, Lucas Dixon
2021 arXiv   pre-print
style transfer systems which we compare with using several scoring systems and human evaluation.  ...  Experimenting with the largest toxicity detection dataset to date (Civil Comments), our model generates sentences that are more fluent and better at preserving the initial content compared to earlier text  ...  Fighting offensive language on social media with unsupervised text style transfer.  ... 
arXiv:2102.05456v2 fatcat:6ky4ma4d6zcqrc6ryueqbkesfq
Showing results 1 — 15 of 4,894