Style Transfer for Texts: Retrain, Report Errors, Compare with Rewrites
[article]
2019
arXiv
pre-print
This paper shows that standard assessment methodology for style transfer has several significant problems. ...
First, the standard metrics for style accuracy and semantics preservation vary significantly on different re-runs. Therefore one has to report error margins for the obtained results. ...
Second, it shows that reporting error margins of several consecutive retrains for the same model is crucial for the comparison of different architectures, since error margins for some of the models overlap ...
arXiv:1908.06809v2
fatcat:hzwywebq35a4jatdn23fn63r5u
Style Transfer for Texts: Retrain, Report Errors, Compare with Rewrites
2019
Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP)
This paper shows that standard assessment methodology for style transfer has several significant problems. ...
First, the standard metrics for style accuracy and semantics preservation vary significantly on different re-runs. Therefore one has to report error margins for the obtained results. ...
Second, it shows that reporting error margins of several consecutive retrains for the same model is crucial for the comparison of different architectures, since error margins for some of the models overlap ...
doi:10.18653/v1/d19-1406
dblp:conf/emnlp/TikhonovSNNY19
fatcat:upx5o46jrzh4ldej33dmjrd444
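The error-margin argument quoted in the two entries above can be illustrated with a short numerical sketch. The scores below are hypothetical stand-ins for style-accuracy measurements from repeated retrains; the point is only the procedure of reporting mean and standard deviation and checking whether the resulting intervals overlap.

```python
# Hypothetical illustration of the "report error margins over retrains" point.
# The accuracy numbers below are made up; only the procedure matters.
from statistics import mean, stdev

# Style-accuracy scores from five independent retrains of two models.
retrains = {
    "model_A": [0.83, 0.79, 0.86, 0.81, 0.84],
    "model_B": [0.85, 0.80, 0.82, 0.87, 0.83],
}

margins = {}
for name, scores in retrains.items():
    mu, sigma = mean(scores), stdev(scores)
    margins[name] = (mu - sigma, mu + sigma)
    print(f"{name}: {mu:.3f} +/- {sigma:.3f}")

(lo_a, hi_a), (lo_b, hi_b) = margins["model_A"], margins["model_B"]
# If the one-sigma intervals overlap, a comparison based on a single run of
# each architecture is not conclusive.
overlap = lo_a <= hi_b and lo_b <= hi_a
print("error margins overlap:", overlap)
```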
Deep Learning for Text Style Transfer: A Survey
[article]
2021
arXiv
pre-print
In this paper, we present a systematic survey of the research on neural text style transfer, spanning over 100 representative articles since the first neural text style transfer work in 2017. ...
Text style transfer is an important task in natural language generation, which aims to control certain attributes in the generated text, such as politeness, emotion, humor, and many others. ...
... 7-12, 2018, European Language Resources Association (ELRA). Style transfer for texts: Retrain, report errors, compare with rewrites. ...
arXiv:2011.00416v5
fatcat:wfw3jfh2mjfupbzrmnztsqy4ny
Deep Learning for Text Style Transfer: A Survey
2021
Computational Linguistics
In this paper, we present a systematic survey of the research on neural text style transfer, spanning over 100 representative articles since the first neural text style transfer work in 2017. ...
Text style transfer is an important task in natural language generation, which aims to control certain attributes in the generated text, such as politeness, emotion, humor, and many others. ...
... 7-12, 2018, European Language Resources Association (ELRA). Style transfer for texts: Retrain, report errors, compare with rewrites. ...
doi:10.1162/coli_a_00426
fatcat:v7vmb62ckfcu5k5mpu2pydnrxy
Style Transformer: Unpaired Text Style Transfer without Disentangled Latent Representation
2019
Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics
Disentangling the content and style in the latent space is prevalent in unpaired text style transfer. ...
style transfer and better content preservation. ...
Acknowledgment We would like to thank the anonymous reviewers for their valuable comments. ...
doi:10.18653/v1/p19-1601
dblp:conf/acl/DaiLQH19
fatcat:5azwlvklzfb6ldnrbvigb3po3i
Dear Sir or Madam, May I introduce the GYAFC Dataset: Corpus, Benchmarks and Metrics for Formality Style Transfer
[article]
2018
arXiv
pre-print
Style transfer is the task of automatically transforming a piece of text in one particular style into another. ...
In this work, we create the largest corpus for a particular stylistic transfer (formality) and show that techniques from the machine translation community can serve as strong baselines for future work. ...
... Ellie Pavlick, Maksym Bezva, Dimitrios Alikaniotis and Kyunghyun Cho for helpful discussion and the three anonymous reviewers for their useful comments and suggestions. ...
arXiv:1803.06535v2
fatcat:qvtd526hyfd2bh25fc7umrgcje
Few-shot Controllable Style Transfer for Low-Resource Multilingual Settings
[article]
2022
arXiv
pre-print
We report promising qualitative results for several attribute transfer tasks (sentiment transfer, simplification, gender neutralization, text anonymization) all without retraining the model. ...
We push the state-of-the-art for few-shot style transfer with a new method modeling the stylistic difference between paraphrases. ...
Acknowledgements We are very grateful to the Task Mate team (especially Auric Bonifacio Quintana) for their support and helping us crowdsource data and evaluate models on their platform. ...
arXiv:2110.07385v2
fatcat:vil7dulr3jdubgplda5k3ft764
Dear Sir or Madam, May I Introduce the GYAFC Dataset: Corpus, Benchmarks and Metrics for Formality Style Transfer
2018
Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long Papers)
Style transfer is the task of automatically transforming a piece of text in one particular style into another. ...
In this work, we create the largest corpus for a particular stylistic transfer (formality) and show that techniques from the machine translation community can serve as strong baselines for future work. ...
... Ellie Pavlick, Maksym Bezva, Dimitrios Alikaniotis and Kyunghyun Cho for helpful discussion and the three anonymous reviewers for their useful comments and suggestions. ...
doi:10.18653/v1/n18-1012
dblp:conf/naacl/RaoT18
fatcat:dwhvecacpzccfikx5af7t7kl7a
Text Rewriting Improves Semantic Role Labeling
2014
The Journal of Artificial Intelligence Research
In this paper we use text rewriting as a means of increasing the amount of labeled data available for model training. ...
Our method uses automatically extracted rewrite rules from comparable corpora and bitexts to generate multiple versions of sentences annotated with gold standard labels. ...
They pilot their idea in semantic role labeling using hand-written rewrite rules and show that it compares favorably with approaches that retrain their model on the target domain. ...
doi:10.1613/jair.4431
fatcat:eqassdmyvndzzbivitswcij7s4
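As a rough illustration of the augmentation idea described in the entry above, the toy sketch below applies one-for-one lexical rewrite rules to a labeled sentence and copies its gold role labels to each rewritten version. The rules and the sentence are invented for the example; rewrite rules extracted from comparable corpora and bitexts would be far richer and could change token positions, which this sketch ignores.

```python
# Toy sketch of rule-based data augmentation: generate rewritten variants of a
# sentence and reuse its gold labels. The rules and labels here are invented.
REWRITE_RULES = {"purchased": ["bought", "acquired"], "automobile": ["car"]}

def rewrite_variants(tokens):
    # Return the original sentence plus every single-token substitution
    # licensed by the rewrite rules.
    variants = [list(tokens)]
    for i, tok in enumerate(tokens):
        for alt in REWRITE_RULES.get(tok, []):
            new = list(tokens)
            new[i] = alt
            variants.append(new)
    return variants

# A labeled example: token list plus gold role labels (role -> token index).
labeled = (["Mary", "purchased", "an", "automobile"], {"ARG0": 0, "V": 1, "ARG1": 3})

# One-for-one substitutions keep the indices valid, so labels carry over as-is.
augmented = [(variant, labeled[1]) for variant in rewrite_variants(labeled[0])]
for sent, roles in augmented:
    print(" ".join(sent), roles)
```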
Style Transformer: Unpaired Text Style Transfer without Disentangled Latent Representation
[article]
2019
arXiv
pre-print
Disentangling the content and style in the latent space is prevalent in unpaired text style transfer. ...
style transfer and better content preservation. ...
Acknowledgment We would like to thank the anonymous reviewers for their valuable comments. ...
arXiv:1905.05621v3
fatcat:7aakcfgb2rdzzjamyt33twic5y
Controlled Text Generation as Continuous Optimization with Multiple Constraints
[article]
2021
arXiv
pre-print
We evaluate our approach on controllable machine translation and style transfer with multiple sentence-level attributes and observe significant improvements over baselines. ...
As large-scale language model pretraining pushes the state-of-the-art in text generation, recent work has turned to controlling attributes of the text such models generate. ...
Style Transfer: We begin with a style-transfer task, a task aiming to faithfully and fluently rewrite a given sentence such that a desired writing style is reflected in the generation. ...
arXiv:2108.01850v1
fatcat:eb2ahhvuwfh6hchmbendf37rim
Mix and Match: Learning-free Controllable Text Generation using Energy Language Models
[article]
2022
arXiv
pre-print
... with the base autoregressive LM. ...
In this work, we propose Mix and Match LM, a global score-based alternative for controllable text generation that combines arbitrary pre-trained black-box models for achieving the desired attributes in ...
We also thank our colleagues at the UCSD/CMU Berg Lab for their helpful comments and feedback. ...
arXiv:2203.13299v2
fatcat:ggsbncwf3jajnexgqjg24zyf2y
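The "global score-based" combination described in the entry above can be sketched in a few lines: a weighted sum of scores from otherwise black-box experts defines an energy over sentences, and Metropolis-Hastings accepts or rejects local token edits against that energy. Everything here (the vocabulary, the two scorers, the weights, the uniform random proposals) is a toy stand-in to keep the sketch self-contained, not the masked-language-model machinery used in the paper.

```python
# Minimal sketch of sampling from an energy built out of black-box scorers.
import math
import random

VOCAB = ["good", "bad", "great", "terrible", "movie", "plot", "acting", "the", "was"]

def fluency_score(tokens):
    # Stand-in for a language-model score; here it just penalizes repeats.
    return -0.1 * sum(1 for a, b in zip(tokens, tokens[1:]) if a == b)

def attribute_score(tokens):
    # Stand-in for a black-box attribute classifier favouring positive words.
    positive = {"good", "great"}
    return sum(1.0 for t in tokens if t in positive)

def energy(tokens, weights=(1.0, 2.0)):
    # Lower energy = more desirable; a weighted combination of expert scores.
    return -(weights[0] * fluency_score(tokens) + weights[1] * attribute_score(tokens))

def metropolis_hastings_edit(tokens, steps=200, temperature=1.0):
    current = list(tokens)
    for _ in range(steps):
        proposal = list(current)
        proposal[random.randrange(len(proposal))] = random.choice(VOCAB)
        delta = energy(proposal) - energy(current)
        # Accept downhill moves always, uphill moves with Boltzmann probability.
        if delta <= 0 or random.random() < math.exp(-delta / temperature):
            current = proposal
    return current

print(metropolis_hastings_edit(["the", "movie", "was", "terrible"]))
```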
Text Counterfactuals via Latent Optimization and Shapley-Guided Search
[article]
2021
arXiv
pre-print
We study the problem of generating counterfactual text for a classifier as a means for understanding and debugging classification. ...
We then use these estimates to guide a beam search for the final counterfactual text. ...
Another line of related work is style transfer (Sudhakar et al., 2019; Wang et al., 2019; Hu et al., 2017), which aims to modify a given text according to a target style. ...
arXiv:2110.11589v1
fatcat:qutmqhxb5va7tnpmftkwdnhb2e
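The guided-search step mentioned in the entry above can be illustrated with a toy beam search: per-token importance estimates (hand-coded stand-ins for the Shapley-value estimates in the paper) decide which positions to edit first, and the beam keeps the candidates that move a dummy classifier furthest toward the target label. All names, rules, and scores here are hypothetical.

```python
# Toy score-guided beam search for a counterfactual edit of a sentence.
TOKENS = ["the", "service", "was", "slow", "and", "rude"]
SUBSTITUTES = {"slow": ["fast", "quick"], "rude": ["friendly", "polite"]}

def classifier(tokens):
    # Stand-in for a black-box sentiment classifier: returns P(positive).
    positive = {"fast", "quick", "friendly", "polite"}
    return sum(t in positive for t in tokens) / 2.0

def importance(tokens):
    # Stand-in for per-token Shapley-value estimates of label influence.
    return {i: (1.0 if t in SUBSTITUTES else 0.0) for i, t in enumerate(tokens)}

def beam_search_counterfactual(tokens, beam_width=2, target=0.5):
    scores = importance(tokens)
    order = sorted(scores, key=scores.get, reverse=True)  # edit important positions first
    beams = [list(tokens)]
    for pos in order:
        candidates = list(beams)
        for beam in beams:
            for sub in SUBSTITUTES.get(tokens[pos], []):
                edited = list(beam)
                edited[pos] = sub
                candidates.append(edited)
        # Keep the candidates closest to flipping the classifier decision.
        beams = sorted(candidates, key=classifier, reverse=True)[:beam_width]
        if classifier(beams[0]) >= target:
            return beams[0]
    return beams[0]

print(beam_search_counterfactual(TOKENS))
```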
Improving Data-to-Text Generation via Preserving High-Frequency Phrases and Fact-Checking
2021
Italian Journal of Computational Linguistics
We use transfer learning with an auxiliary task of keeping high-frequency word sequences from the training data for text generation. ...
In this work, we propose a generate-extract-correct pipeline for the task. ...
Knight went 6-for-14 from the field and 1-for-3 from the three-point line to score a team-high of 17 points, while also adding five assists and two steals. ...
doi:10.4000/ijcol.909
fatcat:cdtgxqgp3nc27js5yyvbll3f2q
A Survey of Controllable Text Generation using Transformer-based Pre-trained Language Models
[article]
2022
arXiv
pre-print
We hope it can help researchers in related fields to quickly track the academic frontier, providing them with a landscape of the area and a roadmap for future research. ...
It is regarded as crucial for the development of advanced text generation technologies that are more natural and better meet the specific constraints in practical applications. ...
The second way is to retrain or refactor the PLMs for controlled text generation. ...
arXiv:2201.05337v1
fatcat:lqr6ulndhrcjbiy7etejwtdghy
Showing results 1 — 15 out of 512 results