ParaGuide: Guided Diffusion Paraphrasers for Plug-and-Play Textual Style
Transfer
- URL: http://arxiv.org/abs/2308.15459v3
- Date: Thu, 22 Feb 2024 23:38:26 GMT
- Title: ParaGuide: Guided Diffusion Paraphrasers for Plug-and-Play Textual Style
Transfer
- Authors: Zachary Horvitz, Ajay Patel, Chris Callison-Burch, Zhou Yu, Kathleen
McKeown
- Abstract summary: Textual style transfer is the task of transforming stylistic properties of text while preserving meaning.
We introduce a novel diffusion-based framework for general-purpose style transfer that can be flexibly adapted to arbitrary target styles.
We validate the method on the Enron Email Corpus, with both human and automatic evaluations, and find that it outperforms strong baselines on formality, sentiment, and even authorship style transfer.
- Score: 57.6482608202409
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Textual style transfer is the task of transforming stylistic properties of
text while preserving meaning. Target "styles" can be defined in numerous ways,
ranging from single attributes (e.g., formality) to authorship (e.g.,
Shakespeare). Previous unsupervised style-transfer approaches generally rely on
significant amounts of labeled data for only a fixed set of styles or require
large language models. In contrast, we introduce a novel diffusion-based
framework for general-purpose style transfer that can be flexibly adapted to
arbitrary target styles at inference time. Our parameter-efficient approach,
ParaGuide, leverages paraphrase-conditioned diffusion models alongside
gradient-based guidance from both off-the-shelf classifiers and strong existing
style embedders to transform the style of text while preserving semantic
information. We validate the method on the Enron Email Corpus, with both human
and automatic evaluations, and find that it outperforms strong baselines on
formality, sentiment, and even authorship style transfer.
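To make the method description concrete, here is a minimal sketch of one gradient-guided denoising step in the spirit of the abstract: a paraphrase-conditioned denoiser predicts clean text embeddings, an off-the-shelf attribute classifier or style embedder scores the prediction, and the gradient of that score steers the reverse update. All names below (denoiser, style_scorer, alpha_bar) are hypothetical stand-ins under assumed PyTorch conventions, not the authors' released implementation.

    import torch

    def guided_denoise_step(x_t, t, paraphrase_emb, denoiser, style_scorer,
                            alpha_bar, guidance_scale=1.0):
        """One reverse-diffusion step with gradient-based style guidance.

        x_t: noisy word-embedding latents at step t
        alpha_bar: 1-D tensor of cumulative noise-schedule coefficients
        """
        x_t = x_t.detach().requires_grad_(True)

        # Predict clean embeddings x0 from the noisy latents, conditioned
        # on an encoding of the source paraphrase.
        x0_hat = denoiser(x_t, t, cond=paraphrase_emb)

        # Score the prediction with an off-the-shelf classifier or a
        # style-embedding similarity; higher means closer to the target style.
        style_score = style_scorer(x0_hat).sum()

        # The gradient w.r.t. the noisy latents gives the guidance direction;
        # plug-and-play in the sense that the diffusion model is never retrained.
        grad = torch.autograd.grad(style_score, x_t)[0]

        with torch.no_grad():
            # Deterministic DDIM-style update toward x0_hat, then a nudge
            # along the style gradient.
            a_t, a_prev = alpha_bar[t], alpha_bar[max(t - 1, 0)]
            eps_hat = (x_t - a_t.sqrt() * x0_hat) / (1.0 - a_t).sqrt()
            x_prev = a_prev.sqrt() * x0_hat + (1.0 - a_prev).sqrt() * eps_hat
            x_prev = x_prev + guidance_scale * grad
        return x_prev

Running this step from t = T down to t = 1 and decoding the final embeddings back to tokens would yield the restyled text; swapping in a different classifier or style embedder retargets the transfer without any retraining.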
Related papers
- TinyStyler: Efficient Few-Shot Text Style Transfer with Authorship Embeddings [51.30454130214374]
We introduce TinyStyler, a lightweight but effective approach to perform efficient, few-shot text style transfer.
We evaluate TinyStyler's ability to perform text attribute style transfer with automatic and human evaluations.
Our model has been made publicly available at https://huggingface.co/tinystyler/tinystyler.
arXiv Detail & Related papers (2024-06-21T18:41:22Z)
- Prefix-Tuning Based Unsupervised Text Style Transfer [29.86587278794342]
Unsupervised text style transfer aims at training a generative model that can alter the style of the input sentence while preserving its content.
In this paper, we employ powerful pre-trained large language models and present a new prefix-tuning-based method for unsupervised text style transfer.
arXiv Detail & Related papers (2023-10-23T06:13:08Z)
- Don't lose the message while paraphrasing: A study on content preserving style transfer [61.38460184163704]
Content preservation is critical for real-world applications of style transfer.
We conduct a precise comparative study of several state-of-the-art style transfer models, using formality transfer as the example domain.
arXiv Detail & Related papers (2023-08-17T15:41:08Z)
- StoryTrans: Non-Parallel Story Author-Style Transfer with Discourse Representations and Content Enhancing [73.81778485157234]
Long texts usually involve more complicated author linguistic preferences, such as discourse structures, than individual sentences do.
We formulate the task of non-parallel story author-style transfer, which requires transferring an input story into a specified author style.
We use an additional training objective to disentangle stylistic features from the learned discourse representation to prevent the model from degenerating to an auto-encoder.
arXiv Detail & Related papers (2022-08-29T08:47:49Z)
- Reformulating Unsupervised Style Transfer as Paraphrase Generation [48.83148014000888]
We reformulate unsupervised style transfer as a paraphrase generation problem.
We present a simple methodology based on fine-tuning pretrained language models on automatically generated paraphrase data.
We also pivot to a more real-world style transfer setting by collecting a large dataset of 15M sentences in 11 diverse styles.
arXiv Detail & Related papers (2020-10-12T13:31:01Z)
- TextSETTR: Few-Shot Text Style Extraction and Tunable Targeted Restyling [23.60231661500702]
We present a novel approach to the problem of text style transfer.
Our method makes use of readily available unlabeled text by relying on the implicit connection in style between adjacent sentences.
We demonstrate that training on unlabeled Amazon reviews data results in a model that is competitive on sentiment transfer.
arXiv Detail & Related papers (2020-10-08T07:06:38Z)
- Exploring Contextual Word-level Style Relevance for Unsupervised Style Transfer [60.07283363509065]
Unsupervised style transfer aims to change the style of an input sentence while preserving its original content.
We propose a novel attentional sequence-to-sequence model that exploits the relevance of each output word to the target style.
Experimental results show that our proposed model achieves state-of-the-art performance in terms of both transfer accuracy and content preservation.
arXiv Detail & Related papers (2020-05-05T10:24:28Z)