StylePTB: A Compositional Benchmark for Fine-grained Controllable Text
Style Transfer
- URL: http://arxiv.org/abs/2104.05196v1
- Date: Mon, 12 Apr 2021 04:25:09 GMT
- Title: StylePTB: A Compositional Benchmark for Fine-grained Controllable Text
Style Transfer
- Authors: Yiwei Lyu, Paul Pu Liang, Hai Pham, Eduard Hovy, Barnabás Póczos,
Ruslan Salakhutdinov, Louis-Philippe Morency
- Abstract summary: Style transfer aims to controllably generate text with targeted stylistic changes while keeping the core meaning of the source sentence constant.
We introduce a large-scale benchmark, StylePTB, with paired sentences undergoing 21 fine-grained stylistic changes spanning atomic lexical, syntactic, semantic, and thematic transfers of text.
We find that existing methods on StylePTB struggle to model fine-grained changes and have an even more difficult time composing multiple styles.
- Score: 90.6768813620898
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Text style transfer aims to controllably generate text with targeted
stylistic changes while keeping the core meaning of the source sentence
constant. Many of the existing style transfer benchmarks primarily focus on
individual high-level semantic changes (e.g. positive to negative), which
enable controllability at a high level but do not offer fine-grained control
involving sentence structure, emphasis, and content of the sentence. In this
paper, we introduce a large-scale benchmark, StylePTB, with (1) paired
sentences undergoing 21 fine-grained stylistic changes spanning atomic lexical,
syntactic, semantic, and thematic transfers of text, as well as (2)
compositions of multiple transfers which allow modeling of fine-grained
stylistic changes as building blocks for more complex, high-level transfers. By
benchmarking existing methods on StylePTB, we find that they struggle to model
fine-grained changes and have an even more difficult time composing multiple
styles. As a result, StylePTB brings novel challenges that we hope will
encourage future research in controllable text style transfer, compositional
models, and learning disentangled representations. Solving these challenges
would present important steps towards controllable text generation.
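The compositional framing above — atomic stylistic changes used as building blocks for more complex, high-level transfers — can be sketched as function composition: each atomic transfer maps a sentence to a sentence, and a compound transfer chains several atomic ones. This is only an illustrative sketch; the two toy transfers below are hypothetical placeholders, not any of StylePTB's actual 21 transfers.

```python
from functools import reduce
from typing import Callable

# An atomic transfer is just a sentence-to-sentence function.
Transfer = Callable[[str], str]

def compose(*transfers: Transfer) -> Transfer:
    """Chain atomic transfers left-to-right into one compound transfer."""
    def composed(sentence: str) -> str:
        return reduce(lambda s, t: t(s), transfers, sentence)
    return composed

# Toy atomic transfers (placeholders for illustration only):
def lowercase(s: str) -> str:
    # a trivial lexical-level change
    return s.lower()

def drop_exclamations(s: str) -> str:
    # a trivial emphasis-level change
    return s.replace("!", ".")

compound = compose(lowercase, drop_exclamations)
print(compound("The Cat Sat!"))  # -> "the cat sat."
```

In this framing, learning each atomic transfer well is the prerequisite for composing them — which is exactly where the benchmarked methods are reported to struggle.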
Related papers
- Towards Visual Text Design Transfer Across Languages [49.78504488452978]
We introduce the novel task of multimodal style translation, together with MuST-Bench.
MuST-Bench is a benchmark designed to evaluate the ability of visual text generation models to perform translation across different writing systems.
In response, we introduce SIGIL, a framework for multimodal style translation that eliminates the need for style descriptions.
arXiv Detail & Related papers (2024-10-24T15:15:01Z)
- ParaGuide: Guided Diffusion Paraphrasers for Plug-and-Play Textual Style Transfer [57.6482608202409]
Textual style transfer is the task of transforming stylistic properties of text while preserving meaning.
We introduce a novel diffusion-based framework for general-purpose style transfer that can be flexibly adapted to arbitrary target styles.
We validate the method on the Enron Email Corpus, with both human and automatic evaluations, and find that it outperforms strong baselines on formality, sentiment, and even authorship style transfer.
arXiv Detail & Related papers (2023-08-29T17:36:02Z)
- MSSRNet: Manipulating Sequential Style Representation for Unsupervised Text Style Transfer [82.37710853235535]
The unsupervised text style transfer task aims to rewrite a text in a target style while preserving its main content.
Traditional methods rely on a fixed-sized vector to regulate text style, which makes it difficult to accurately convey the style strength for each individual token.
Our proposed method addresses this issue by assigning an individual style vector to each token in a text, allowing for fine-grained control and manipulation of the style strength.
arXiv Detail & Related papers (2023-06-12T13:12:29Z)
- StoryTrans: Non-Parallel Story Author-Style Transfer with Discourse Representations and Content Enhancing [73.81778485157234]
Compared to individual sentences, long texts usually involve more complicated author linguistic preferences, such as discourse structures.
We formulate the task of non-parallel story author-style transfer, which requires transferring an input story into a specified author style.
We use an additional training objective to disentangle stylistic features from the learned discourse representation to prevent the model from degenerating to an auto-encoder.
arXiv Detail & Related papers (2022-08-29T08:47:49Z)
- VAE based Text Style Transfer with Pivot Words Enhancement Learning [5.717913255287939]
We propose a VAE based Text Style Transfer with pivOt Words Enhancement leaRning (VT-STOWER) method.
We introduce pivot words learning, which is applied to learn decisive words for a specific style.
The proposed VT-STOWER can be scaled to different TST scenarios with a novel and flexible style strength control mechanism.
arXiv Detail & Related papers (2021-12-06T16:41:26Z)
- Contextual Text Style Transfer [73.66285813595616]
Contextual Text Style Transfer aims to translate a sentence into a desired style with its surrounding context taken into account.
We propose a Context-Aware Style Transfer (CAST) model, which uses two separate encoders for each input sentence and its surrounding context.
Two new benchmarks, Enron-Context and Reddit-Context, are introduced for formality and offensiveness style transfer.
arXiv Detail & Related papers (2020-04-30T23:01:12Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.