TinyStyler: Efficient Few-Shot Text Style Transfer with Authorship Embeddings
- URL: http://arxiv.org/abs/2406.15586v2
- Date: Thu, 07 Nov 2024 17:56:19 GMT
- Title: TinyStyler: Efficient Few-Shot Text Style Transfer with Authorship Embeddings
- Authors: Zachary Horvitz, Ajay Patel, Kanishk Singh, Chris Callison-Burch, Kathleen McKeown, Zhou Yu
- Abstract summary: We introduce TinyStyler, a lightweight but effective approach to perform efficient, few-shot text style transfer.
We evaluate TinyStyler's ability to perform text attribute style transfer with automatic and human evaluations.
Our model has been made publicly available at https://huggingface.co/tinystyler/tinystyler.
- Score: 51.30454130214374
- License:
- Abstract: The goal of text style transfer is to transform the style of texts while preserving their original meaning, often with only a few examples of the target style. Existing style transfer methods generally rely on the few-shot capabilities of large language models or on complex controllable text generation approaches that are inefficient and underperform on fluency metrics. We introduce TinyStyler, a lightweight but effective approach, which leverages a small language model (800M params) and pre-trained authorship embeddings to perform efficient, few-shot text style transfer. We evaluate on the challenging task of authorship style transfer and find TinyStyler outperforms strong approaches such as GPT-4. We also evaluate TinyStyler's ability to perform text attribute style transfer (formal $\leftrightarrow$ informal) with automatic and human evaluations and find that the approach outperforms recent controllable text generation methods. Our model has been made publicly available at https://huggingface.co/tinystyler/tinystyler .
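As a rough, hedged illustration of the recipe the abstract describes (embed a few target-style exemplars with an authorship encoder, average them, and condition a small seq2seq generator on the result), the sketch below uses a generic sentence encoder and t5-small as stand-ins; the specific models, the linear projection, and the way the style vector is prepended to the encoder states are assumptions for illustration, not TinyStyler's released implementation.

```python
# Hedged sketch of few-shot style transfer conditioned on an averaged
# authorship-style embedding. The encoder, generator, and conditioning
# scheme below are illustrative stand-ins, not TinyStyler's actual code.
import torch
from sentence_transformers import SentenceTransformer
from transformers import AutoTokenizer, T5ForConditionalGeneration

# 1) Embed a handful of exemplars of the target author's style and average them.
#    (TinyStyler uses pre-trained authorship embeddings; MiniLM is a stand-in.)
style_encoder = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")
exemplars = [
    "gonna grab coffee, anyone want smth?",
    "lol ok be there in 5",
    "cant today, swamped w/ work :(",
]
style_vec = torch.tensor(style_encoder.encode(exemplars)).mean(dim=0)  # (384,)

# 2) A small seq2seq model rewrites the source text. One simple way to condition
#    it is to project the style vector into the model dimension and prepend it
#    to the encoder states. In practice the projection and generator would be
#    fine-tuned (e.g., to reconstruct texts from their author's embedding).
tokenizer = AutoTokenizer.from_pretrained("t5-small")
generator = T5ForConditionalGeneration.from_pretrained("t5-small")
proj = torch.nn.Linear(style_vec.shape[0], generator.config.d_model)

source = "I will not be able to attend the meeting today."
inputs = tokenizer(source, return_tensors="pt")
encoder_out = generator.encoder(
    input_ids=inputs.input_ids, attention_mask=inputs.attention_mask
)
style_state = proj(style_vec).view(1, 1, -1)
encoder_out.last_hidden_state = torch.cat(
    [style_state, encoder_out.last_hidden_state], dim=1
)
# Extend the attention mask to cover the prepended style "token".
attention_mask = torch.cat(
    [torch.ones(1, 1, dtype=inputs.attention_mask.dtype), inputs.attention_mask],
    dim=1,
)

output_ids = generator.generate(
    encoder_outputs=encoder_out, attention_mask=attention_mask, max_new_tokens=64
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```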
Related papers
- Authorship Style Transfer with Policy Optimization [26.34892894935038]
Authorship style transfer aims to rewrite a given text into a specified target style while preserving the original meaning in the source.
Existing approaches rely on the availability of a large number of target style exemplars for model training.
arXiv Detail & Related papers (2024-03-12T19:34:54Z)
- STEER: Unified Style Transfer with Expert Reinforcement [71.3995732115262]
STEER (Unified Style Transfer with Expert Reinforcement) is a unified framework developed to overcome the challenge of limited parallel data for style transfer.
We show STEER is robust, maintaining its style transfer capabilities on out-of-domain data, and surpassing nearly all baselines across various styles.
arXiv Detail & Related papers (2023-11-13T09:02:30Z)
- ParaGuide: Guided Diffusion Paraphrasers for Plug-and-Play Textual Style Transfer [57.6482608202409]
Textual style transfer is the task of transforming stylistic properties of text while preserving meaning.
We introduce a novel diffusion-based framework for general-purpose style transfer that can be flexibly adapted to arbitrary target styles.
We validate the method on the Enron Email Corpus, with both human and automatic evaluations, and find that it outperforms strong baselines on formality, sentiment, and even authorship style transfer.
arXiv Detail & Related papers (2023-08-29T17:36:02Z)
- MSSRNet: Manipulating Sequential Style Representation for Unsupervised Text Style Transfer [82.37710853235535]
The unsupervised text style transfer task aims to rewrite a text into a target style while preserving its main content.
Traditional methods rely on a fixed-size vector to regulate text style, which makes it difficult to accurately convey the style strength for each individual token.
The proposed method addresses this issue by assigning an individual style vector to each token in a text, allowing fine-grained control and manipulation of the style strength.
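A toy sketch of that token-level idea (the module names and dimensions below are invented for illustration and are not MSSRNet's architecture): each token position gets its own style vector, whose strength can be scaled per token, instead of one fixed-size style code shared by the whole sentence.

```python
# Toy illustration of per-token style vectors: every token position receives
# its own style vector (with its own strength) rather than sharing a single
# fixed-size style code for the entire sentence.
import torch
import torch.nn as nn

vocab_size, d_model, d_style, seq_len = 1000, 256, 64, 8

token_embedding = nn.Embedding(vocab_size, d_model)
style_generator = nn.Linear(d_model, d_style)  # predicts a style vector per token
style_injector = nn.Linear(d_style, d_model)   # maps style back into model space

tokens = torch.randint(0, vocab_size, (1, seq_len))
content = token_embedding(tokens)              # (1, seq_len, d_model)
per_token_style = style_generator(content)     # (1, seq_len, d_style)

# Scale style strength token by token, e.g. to push some tokens harder toward
# the target style than others.
strength = torch.linspace(0.0, 1.0, seq_len).view(1, seq_len, 1)
stylized = content + style_injector(per_token_style * strength)
print(stylized.shape)  # torch.Size([1, 8, 256])
```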
arXiv Detail & Related papers (2023-06-12T13:12:29Z)
- StylerDALLE: Language-Guided Style Transfer Using a Vector-Quantized Tokenizer of a Large-Scale Generative Model [64.26721402514957]
We propose StylerDALLE, a style transfer method that uses natural language to describe abstract art styles.
Specifically, we formulate the language-guided style transfer task as a non-autoregressive token sequence translation.
To incorporate style information, we propose a Reinforcement Learning strategy with CLIP-based language supervision.
arXiv Detail & Related papers (2023-03-16T12:44:44Z)
- Conversation Style Transfer using Few-Shot Learning [56.43383396058639]
In this paper, we introduce conversation style transfer as a few-shot learning problem.
We propose a novel in-context learning approach to solve the task with style-free dialogues as a pivot.
We show that conversation style transfer can also benefit downstream tasks.
arXiv Detail & Related papers (2023-02-16T15:27:00Z)
- SimpleStyle: An Adaptable Style Transfer Approach [6.993665837027786]
We present SimpleStyle, a minimalist yet effective approach for style-transfer composed of two simple ingredients: controlled denoising and output filtering.
We apply SimpleStyle to transfer a wide range of text attributes appearing in real-world textual data from social networks.
We show that teaching a student model to generate the output of SimpleStyle can result in a system that performs style transfer of equivalent quality with only a single greedy-decoded sample.
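As a hedged illustration of the output-filtering ingredient (not the paper's models or code), the snippet below samples several candidate rewrites from a stand-in text-to-text model and keeps the candidate that an off-the-shelf sentiment classifier scores highest for the target attribute; a real system would pair a trained controlled-denoising model with a classifier for the attribute being transferred.

```python
# Hedged sketch of output filtering: sample several candidate rewrites, then
# keep the one an attribute classifier rates best for the target style.
# t5-small is only a stand-in generator for this illustration.
from transformers import pipeline

generator = pipeline("text2text-generation", model="t5-small")
classifier = pipeline(
    "text-classification",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

source = "this movie was a complete waste of time"
candidates = [
    out["generated_text"]
    for out in generator(
        source, do_sample=True, num_return_sequences=4, max_new_tokens=32
    )
]

def target_score(text: str, target_label: str = "POSITIVE") -> float:
    """Probability the classifier assigns to the target attribute."""
    pred = classifier(text)[0]
    return pred["score"] if pred["label"] == target_label else 1.0 - pred["score"]

best = max(candidates, key=target_score)
print(best)
```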
arXiv Detail & Related papers (2022-12-20T18:12:49Z)
- ST$^2$: Small-data Text Style Transfer via Multi-task Meta-Learning [14.271083093944753]
Text style transfer aims to paraphrase a sentence in one style into another while preserving content.
Due to a lack of parallel training data, state-of-the-art methods are unsupervised and rely on large datasets that share content.
In this work, we develop a meta-learning framework to transfer between any kind of text styles.
arXiv Detail & Related papers (2020-04-24T13:36:38Z)
- Fair Transfer of Multiple Style Attributes in Text [26.964711594103566]
We show that the transfer of multiple styles cannot be achieved by sequentially performing multiple single-style transfers.
We propose a neural network architecture for fairly transferring multiple style attributes in a given text.
arXiv Detail & Related papers (2020-01-18T15:38:04Z)