Reformulating Unsupervised Style Transfer as Paraphrase Generation
- URL: http://arxiv.org/abs/2010.05700v1
- Date: Mon, 12 Oct 2020 13:31:01 GMT
- Title: Reformulating Unsupervised Style Transfer as Paraphrase Generation
- Authors: Kalpesh Krishna, John Wieting, Mohit Iyyer
- Abstract summary: We reformulate unsupervised style transfer as a paraphrase generation problem.
We present a simple methodology based on fine-tuning pretrained language models on automatically generated paraphrase data.
We also pivot to a more real-world style transfer setting by collecting a large dataset of 15M sentences in 11 diverse styles.
- Score: 48.83148014000888
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Modern NLP defines the task of style transfer as modifying the style of a
given sentence without appreciably changing its semantics, which implies that
the outputs of style transfer systems should be paraphrases of their inputs.
However, many existing systems purportedly designed for style transfer
inherently warp the input's meaning through attribute transfer, which changes
semantic properties such as sentiment. In this paper, we reformulate
unsupervised style transfer as a paraphrase generation problem, and present a
simple methodology based on fine-tuning pretrained language models on
automatically generated paraphrase data. Despite its simplicity, our method
significantly outperforms state-of-the-art style transfer systems on both human
and automatic evaluations. We also survey 23 style transfer papers and discover
that existing automatic metrics can be easily gamed and propose fixed variants.
Finally, we pivot to a more real-world style transfer setting by collecting a
large dataset of 15M sentences in 11 diverse styles, which we use for an
in-depth analysis of our system.
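Below is a minimal, illustrative sketch of the fine-tuning idea described in the abstract: a pretrained language model is fine-tuned on automatically generated paraphrase data so that it learns to rewrite a style-neutral paraphrase back into a stylized sentence. The model choice (GPT-2), the toy paraphrase pairs, and the " >>> " separator are assumptions for illustration, not the authors' exact setup.

```python
# Sketch: fine-tune a pretrained LM on (paraphrase -> stylized sentence) pairs.
# All data and formatting choices here are hypothetical placeholders.
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token
model = GPT2LMHeadModel.from_pretrained("gpt2")

# Hypothetical pseudo-parallel pairs: (automatically generated paraphrase, original stylized sentence).
pairs = [
    ("how are you", "how art thou"),
    ("he left quickly", "he departed in great haste"),
]

optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)
model.train()
for paraphrase, stylized in pairs:
    # Condition on the style-neutral paraphrase and train the LM to continue with the
    # stylized original (for simplicity, the loss here covers the whole sequence).
    text = paraphrase + " >>> " + stylized + tokenizer.eos_token
    batch = tokenizer(text, return_tensors="pt")
    loss = model(**batch, labels=batch["input_ids"]).loss
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()

# At inference time, an arbitrary input sentence would first be paraphrased (normalizing
# its style) and then completed by the fine-tuned model to produce a target-style rewrite.
```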
Related papers
- ParaGuide: Guided Diffusion Paraphrasers for Plug-and-Play Textual Style Transfer [57.6482608202409]
Textual style transfer is the task of transforming stylistic properties of text while preserving meaning.
We introduce a novel diffusion-based framework for general-purpose style transfer that can be flexibly adapted to arbitrary target styles.
We validate the method on the Enron Email Corpus, with both human and automatic evaluations, and find that it outperforms strong baselines on formality, sentiment, and even authorship style transfer.
arXiv Detail & Related papers (2023-08-29T17:36:02Z)
- Conversation Style Transfer using Few-Shot Learning [56.43383396058639]
In this paper, we introduce conversation style transfer as a few-shot learning problem.
We propose a novel in-context learning approach to solve the task with style-free dialogues as a pivot.
We show that conversation style transfer can also benefit downstream tasks.
arXiv Detail & Related papers (2023-02-16T15:27:00Z)
- Prompt-Based Editing for Text Style Transfer [25.863546922455498]
We present a prompt-based editing approach for text style transfer.
We transform the prompt-based generation problem into a classification problem, which makes the process training-free.
Our approach substantially outperforms state-of-the-art systems that have 20 times more parameters.
arXiv Detail & Related papers (2023-01-27T21:31:14Z)
- Few-shot Controllable Style Transfer for Low-Resource Settings: A Study in Indian Languages [13.980482277351523]
Style transfer is the task of rewriting an input sentence into a target style while preserving its content.
We push the state-of-the-art for few-shot style transfer with a new method modeling the stylistic difference between paraphrases.
Our model achieves 2-3x better performance and output diversity in formality transfer and code-mixing addition across five Indian languages.
arXiv Detail & Related papers (2021-10-14T14:16:39Z)
- StylePTB: A Compositional Benchmark for Fine-grained Controllable Text Style Transfer [90.6768813620898]
Style transfer aims to controllably generate text with targeted stylistic changes while keeping the core meaning of the source sentence intact.
We introduce a large-scale benchmark, StylePTB, with paired sentences undergoing 21 fine-grained stylistic changes spanning atomic lexical, syntactic, semantic, and thematic transfers of text.
We find that existing methods on StylePTB struggle to model fine-grained changes and have an even more difficult time composing multiple styles.
arXiv Detail & Related papers (2021-04-12T04:25:09Z)
- Exploring Contextual Word-level Style Relevance for Unsupervised Style Transfer [60.07283363509065]
Unsupervised style transfer aims to change the style of an input sentence while preserving its original content.
We propose a novel attentional sequence-to-sequence model that exploits the relevance of each output word to the target style.
Experimental results show that our proposed model achieves state-of-the-art performance in terms of both transfer accuracy and content preservation.
arXiv Detail & Related papers (2020-05-05T10:24:28Z)
- Politeness Transfer: A Tag and Generate Approach [167.9924201435888]
This paper introduces a new task of politeness transfer.
It involves converting non-polite sentences to polite sentences while preserving the meaning.
We design a tag-and-generate pipeline that identifies stylistic attributes and then generates a sentence in the target style.
arXiv Detail & Related papers (2020-04-29T15:08:53Z)
- Fair Transfer of Multiple Style Attributes in Text [26.964711594103566]
We show that the transfer of multiple styles cannot be achieved by sequentially performing multiple single-style transfers.
We propose a neural network architecture for fairly transferring multiple style attributes in a given text.
arXiv Detail & Related papers (2020-01-18T15:38:04Z)
This list is automatically generated from the titles and abstracts of the papers on this site.