Prompt-Based Editing for Text Style Transfer
- URL: http://arxiv.org/abs/2301.11997v2
- Date: Fri, 22 Dec 2023 05:49:33 GMT
- Title: Prompt-Based Editing for Text Style Transfer
- Authors: Guoqing Luo, Yu Tong Han, Lili Mou, Mauajama Firdaus
- Abstract summary: We present a prompt-based editing approach for text style transfer.
We transform a prompt-based generation problem into a classification one, which is a training-free process.
Our approach largely outperforms the state-of-the-art systems that have 20 times more parameters.
- Score: 25.863546922455498
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Prompting approaches have been recently explored in text style transfer,
where a textual prompt is used to query a pretrained language model to generate
style-transferred texts word by word in an autoregressive manner. However, such
a generation process is less controllable and early prediction errors may
affect future word predictions. In this paper, we present a prompt-based
editing approach for text style transfer. Specifically, we prompt a pretrained
language model for style classification and use the classification probability
to compute a style score. Then, we perform discrete search with word-level
editing to maximize a comprehensive scoring function for the style-transfer
task. In this way, we transform a prompt-based generation problem into a
classification one, which is a training-free process and more controllable than
the autoregressive generation of sentences. In our experiments, we performed
both automatic and human evaluation on three style-transfer benchmark datasets,
and show that our approach largely outperforms the state-of-the-art systems
that have 20 times more parameters. Additional empirical analyses further
demonstrate the effectiveness of our approach.
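The pipeline the abstract describes (prompt a classifier for a style probability, use it as a style score, then run a discrete search over word-level edits to maximize a combined scoring function) can be sketched as follows. This is an illustrative toy, not the paper's implementation: `style_score` stands in for the prompt-based language-model classification probability, `NEG_TO_POS` is a hypothetical substitution lexicon, and `content_score` is a crude content-preservation proxy.

```python
# Toy sketch of prompt-based editing for style transfer.
# A real system would prompt a pretrained LM (e.g. "The sentiment of
# '<text>' is ...") and read off the class-token probability as the
# style score; a simple lexicon-based scorer stands in here so the
# sketch runs self-contained.

# Hypothetical substitution lexicon (negative word -> positive word).
NEG_TO_POS = {"terrible": "wonderful", "bad": "good", "awful": "great"}

def style_score(words):
    """Stand-in for the prompt-based classifier's P(positive | text)."""
    pos = sum(w in NEG_TO_POS.values() for w in words)
    neg = sum(w in NEG_TO_POS for w in words)
    return pos / max(pos + neg, 1)

def content_score(words, original):
    """Crude content-preservation proxy: fraction of words kept."""
    kept = sum(w in original for w in words)
    return kept / max(len(original), 1)

def edit_transfer(sentence, steps=10):
    """Greedy discrete search over word-level substitution edits,
    maximizing a comprehensive score = style + content preservation."""
    original = sentence.split()
    words = list(original)
    total = lambda ws: style_score(ws) + content_score(ws, original)
    for _ in range(steps):
        best, best_score = None, total(words)
        for i, w in enumerate(words):
            if w in NEG_TO_POS:  # propose one word-level edit
                cand = words[:i] + [NEG_TO_POS[w]] + words[i + 1:]
                if total(cand) > best_score:
                    best, best_score = cand, total(cand)
        if best is None:  # no edit improves the score: stop
            break
        words = best
    return " ".join(words)

print(edit_transfer("the movie was terrible and the plot was bad"))
```

Because each step is scored and accepted only if it improves the objective, the search is training-free and more controllable than autoregressive generation, which is the contrast the abstract draws.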
Related papers
- Prefix-Tuning Based Unsupervised Text Style Transfer [29.86587278794342]
Unsupervised text style transfer aims at training a generative model that can alter the style of the input sentence while preserving its content.
In this paper, we employ powerful pre-trained large language models and present a new prefix-tuning-based method for unsupervised text style transfer.
arXiv Detail & Related papers (2023-10-23T06:13:08Z)
- ParaGuide: Guided Diffusion Paraphrasers for Plug-and-Play Textual Style Transfer [57.6482608202409]
Textual style transfer is the task of transforming stylistic properties of text while preserving meaning.
We introduce a novel diffusion-based framework for general-purpose style transfer that can be flexibly adapted to arbitrary target styles.
We validate the method on the Enron Email Corpus, with both human and automatic evaluations, and find that it outperforms strong baselines on formality, sentiment, and even authorship style transfer.
arXiv Detail & Related papers (2023-08-29T17:36:02Z)
- Evolutionary Verbalizer Search for Prompt-based Few Shot Text Classification [5.583948835737293]
In this paper, we focus on automatically constructing the optimal verbalizer and propose a novel evolutionary verbalizer search (EVS) algorithm to improve prompt-based tuning with a high-performance verbalizer.
arXiv Detail & Related papers (2023-06-18T10:03:11Z)
- Text Revision by On-the-Fly Representation Optimization [76.11035270753757]
Current state-of-the-art methods formulate these tasks as sequence-to-sequence learning problems.
We present an iterative in-place editing approach for text revision, which requires no parallel data.
It achieves competitive and even better performance than state-of-the-art supervised methods on text simplification.
arXiv Detail & Related papers (2022-04-15T07:38:08Z)
- GTAE: Graph-Transformer based Auto-Encoders for Linguistic-Constrained Text Style Transfer [119.70961704127157]
Non-parallel text style transfer has attracted increasing research interests in recent years.
Current approaches still lack the ability to preserve the content and even logic of original sentences.
We propose GTAE, a Graph-Transformer-based Auto-Encoder, which models a sentence as a linguistic graph and performs feature extraction and style transfer at the graph level.
arXiv Detail & Related papers (2021-02-01T11:08:45Z)
- On Learning Text Style Transfer with Direct Rewards [101.97136885111037]
Lack of parallel corpora makes it impossible to directly train supervised models for the text style transfer task.
We leverage semantic similarity metrics originally used for fine-tuning neural machine translation models.
Our model provides significant gains in both automatic and human evaluation over strong baselines.
arXiv Detail & Related papers (2020-10-24T04:30:02Z)
- Reformulating Unsupervised Style Transfer as Paraphrase Generation [48.83148014000888]
We reformulate unsupervised style transfer as a paraphrase generation problem.
We present a simple methodology based on fine-tuning pretrained language models on automatically generated paraphrase data.
We also pivot to a more real-world style transfer setting by collecting a large dataset of 15M sentences in 11 diverse styles.
arXiv Detail & Related papers (2020-10-12T13:31:01Z)
- Exploring Contextual Word-level Style Relevance for Unsupervised Style Transfer [60.07283363509065]
Unsupervised style transfer aims to change the style of an input sentence while preserving its original content.
We propose a novel attentional sequence-to-sequence model that exploits the relevance of each output word to the target style.
Experimental results show that our proposed model achieves state-of-the-art performance in terms of both transfer accuracy and content preservation.
arXiv Detail & Related papers (2020-05-05T10:24:28Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.