Text Style Transfer: A Review and Experimental Evaluation
- URL: http://arxiv.org/abs/2010.12742v3
- Date: Tue, 21 Jun 2022 10:57:48 GMT
- Title: Text Style Transfer: A Review and Experimental Evaluation
- Authors: Zhiqiang Hu, Roy Ka-Wei Lee, Charu C. Aggarwal, Aston Zhang
- Abstract summary: The Text Style Transfer (TST) task aims to change the stylistic properties of the text while retaining its style-independent content.
Many novel TST algorithms have been developed, while the industry has leveraged these algorithms to enable exciting TST applications.
This article aims to provide a comprehensive review of recent research efforts on text style transfer.
- Score: 26.946157705298685
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The stylistic properties of text have intrigued computational linguistics
researchers in recent years. Specifically, researchers have investigated the
Text Style Transfer (TST) task, which aims to change the stylistic properties
of the text while retaining its style-independent content. Over the last few
years, many novel TST algorithms have been developed, while the industry has
leveraged these algorithms to enable exciting TST applications. The field of
TST research has burgeoned because of this symbiosis. This article aims to
provide a comprehensive review of recent research efforts on text style
transfer. More concretely, we create a taxonomy to organize the TST models and
provide a comprehensive summary of the state of the art. We review the existing
evaluation methodologies for TST tasks and conduct a large-scale
reproducibility study where we experimentally benchmark 19 state-of-the-art TST
algorithms on two publicly available datasets. Finally, we expand on current
trends and provide new perspectives on the new and exciting developments in the
TST field.
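As context for the evaluation methodologies and benchmarking mentioned in the abstract, the snippet below is a minimal sketch of the two automatic metrics most TST evaluations combine: transfer accuracy from a pre-trained style classifier and BLEU against the source text for content preservation. The classifier, sentences, and target label are illustrative placeholders rather than the exact configuration benchmarked in the article; human judgments of fluency and content are typically reported alongside these automatic scores.

```python
# Minimal sketch of common automatic TST evaluation: style-transfer accuracy
# via a pre-trained sentiment classifier plus BLEU for content preservation.
# The model, example sentences, and target label are illustrative placeholders,
# not the exact setup benchmarked in the survey.
from transformers import pipeline
import sacrebleu

style_clf = pipeline("sentiment-analysis")  # stands in for a Yelp-style classifier

sources = ["the food was terrible and the staff was rude ."]
transfers = ["the food was delicious and the staff was friendly ."]
target_label = "POSITIVE"

# Style-transfer accuracy: fraction of outputs classified as the target style.
preds = style_clf(transfers)
acc = sum(p["label"] == target_label for p in preds) / len(preds)

# Content preservation: BLEU between the transferred text and the source.
bleu = sacrebleu.corpus_bleu(transfers, [sources]).score

print(f"style accuracy: {acc:.2f}, self-BLEU vs. source: {bleu:.1f}")
```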
Related papers
- A Survey of Text Style Transfer: Applications and Ethical Implications [4.749824105387292]
Text style transfer (TST) aims to control selected attributes of language use, such as politeness, formality, or sentiment, without altering the style-independent content of the text.
This paper presents a comprehensive review of TST applications that have been researched over the years, using both traditional linguistic approaches and more recent deep learning methods.
arXiv Detail & Related papers (2024-07-23T17:15:23Z)
- Text Style Transfer: An Introductory Overview [0.1534667887016089]
Text Style Transfer (TST) is a pivotal task in natural language generation to manipulate text style attributes while preserving style-independent content.
The attributes targeted in TST can vary widely, including politeness, authorship, mitigation of offensive language, modification of feelings, and adjustment of text formality.
This paper provides an introductory overview of TST, addressing its challenges, existing approaches, datasets, evaluation measures, subtasks, and applications.
arXiv Detail & Related papers (2024-07-20T09:54:55Z)
- Distilling Text Style Transfer With Self-Explanation From LLMs [28.595450029172124]
Text Style Transfer (TST) seeks to alter the style of text while retaining its core content.
We propose a framework that leverages large language models (LLMs) alongside chain-of-thought (CoT) prompting.
The framework is shown to surpass traditional supervised fine-tuning and knowledge distillation methods.
arXiv Detail & Related papers (2024-03-02T06:38:15Z)
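The chain-of-thought framework above is only summarized here; as a rough, hypothetical illustration of the general idea (not the authors' actual prompt or distillation pipeline), a CoT-style prompt asks the model to explain which words carry the source style before producing the rewrite. The `complete` callable below is a stand-in for whatever LLM client is used.

```python
# Illustrative sketch of chain-of-thought prompting for style transfer.
# This is NOT the paper's prompt or pipeline; `complete` is a hypothetical
# stand-in for an LLM completion call (a hosted API or a local model).
from typing import Callable

COT_TEMPLATE = (
    "Rewrite the sentence from {src_style} to {tgt_style}.\n"
    "First, list the words that carry the {src_style} style.\n"
    "Then explain how to replace them while keeping the content.\n"
    "Finally, give the rewritten sentence after the tag 'Output:'.\n\n"
    "Sentence: {sentence}\n"
)

def cot_style_transfer(sentence: str, src_style: str, tgt_style: str,
                       complete: Callable[[str], str]) -> str:
    """Prompt an LLM with a chain-of-thought template and return the rewrite."""
    prompt = COT_TEMPLATE.format(sentence=sentence, src_style=src_style,
                                 tgt_style=tgt_style)
    response = complete(prompt)
    # The self-explanation (the reasoning steps) precedes the final answer.
    return response.split("Output:")[-1].strip()
```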
- VAE based Text Style Transfer with Pivot Words Enhancement Learning [5.717913255287939]
We propose a VAE based Text Style Transfer with pivOt Words Enhancement leaRning (VT-STOWER) method.
We introduce pivot words learning, which is applied to learn decisive words for a specific style.
The proposed VT-STOWER can be scaled to different TST scenarios with a novel and flexible style strength control mechanism.
arXiv Detail & Related papers (2021-12-06T16:41:26Z)
- Artificial Text Detection via Examining the Topology of Attention Maps [58.46367297712477]
We propose three novel types of interpretable topological features for this task based on Topological Data Analysis (TDA).
We empirically show that the features derived from the BERT model outperform count- and neural-based baselines by up to 10% on three common datasets.
The probing analysis of the features reveals their sensitivity to surface and syntactic properties.
arXiv Detail & Related papers (2021-09-10T12:13:45Z)
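The topological features above are only named in this summary; the sketch below shows one plausible way such features can be computed from a single attention head with persistent homology (via the `ripser` package), treating one minus the symmetrized attention weights as a distance. It illustrates the general technique rather than the paper's exact feature set.

```python
# Rough sketch: persistent-homology features from one BERT attention head.
# Treating (1 - symmetrized attention) as a distance matrix is a common
# convention; the exact features used in the paper may differ.
import numpy as np
from ripser import ripser

def topological_features(attn: np.ndarray) -> dict:
    """attn: square attention matrix for a single head (rows sum to 1)."""
    sym = (attn + attn.T) / 2.0   # symmetrize the attention weights
    dist = 1.0 - sym              # high attention -> small distance
    np.fill_diagonal(dist, 0.0)
    dgms = ripser(dist, distance_matrix=True, maxdim=1)["dgms"]
    h0, h1 = dgms[0], dgms[1]

    def total_persistence(dgm: np.ndarray) -> float:
        # Summarize a diagram by the total lifetime of its finite bars.
        finite = dgm[np.isfinite(dgm[:, 1])]
        return float((finite[:, 1] - finite[:, 0]).sum())

    return {
        "h0_persistence": total_persistence(h0),
        "h1_persistence": total_persistence(h1),
        "n_h1_bars": int(len(h1)),
    }
```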
- Syntax Matters! Syntax-Controlled in Text Style Transfer [24.379552683296392]
Existing text style transfer (TST) methods rely on style classifiers to disentangle the text's content and style attributes.
We propose a novel Syntax-Aware Controllable Generation (SACG) model, which includes a syntax-aware style classifier.
We show that our proposed method significantly outperforms the state-of-the-art methods.
arXiv Detail & Related papers (2021-08-12T17:35:23Z)
- A Survey on Neural Speech Synthesis [110.39292386792555]
Text to speech (TTS) is a hot research topic in speech, language, and machine learning communities.
We conduct a comprehensive survey on neural TTS, aiming to provide a good understanding of current research and future trends.
We focus on the key components in neural TTS, including text analysis, acoustic models, and vocoders, as well as several advanced topics such as fast TTS, low-resource TTS, robust TTS, expressive TTS, and adaptive TTS.
arXiv Detail & Related papers (2021-06-29T16:50:51Z)
- StylePTB: A Compositional Benchmark for Fine-grained Controllable Text Style Transfer [90.6768813620898]
Style transfer aims to controllably generate text with targeted stylistic changes while keeping the core meaning of the source sentence constant.
We introduce a large-scale benchmark, StylePTB, with paired sentences undergoing 21 fine-grained stylistic changes spanning atomic lexical, syntactic, semantic, and thematic transfers of text.
We find that existing methods on StylePTB struggle to model fine-grained changes and have an even more difficult time composing multiple styles.
arXiv Detail & Related papers (2021-04-12T04:25:09Z)
- Deep Learning for Text Style Transfer: A Survey [71.8870854396927]
Text style transfer is an important task in natural language generation, which aims to control certain attributes in the generated text.
We present a systematic survey of the research on neural text style transfer, spanning over 100 representative articles since the first neural text style transfer work in 2017.
We discuss the task formulation, existing datasets and subtasks, evaluation, as well as the rich methodologies in the presence of parallel and non-parallel data.
arXiv Detail & Related papers (2020-11-01T04:04:43Z)
- A Survey on Text Classification: From Shallow to Deep Learning [83.47804123133719]
The last decade has seen a surge of research in text classification due to the unprecedented success of deep learning.
This paper fills the gap by reviewing the state-of-the-art approaches from 1961 to 2021.
We create a taxonomy for text classification according to the text involved and the models used for feature extraction and classification.
arXiv Detail & Related papers (2020-08-02T00:09:03Z)
- Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer [64.22926988297685]
Transfer learning, where a model is first pre-trained on a data-rich task before being fine-tuned on a downstream task, has emerged as a powerful technique in natural language processing (NLP).
In this paper, we explore the landscape of transfer learning techniques for NLP by introducing a unified framework that converts all text-based language problems into a text-to-text format.
arXiv Detail & Related papers (2019-10-23T17:37:36Z)
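The text-to-text framework above casts every task as mapping a task-prefixed input string to an output string. The snippet below illustrates this format with the public t5-small checkpoint from Hugging Face Transformers; it is an illustration of the idea, not the paper's experimental setup.

```python
# Sketch of the text-to-text format: tasks are expressed as prefixed input
# strings, and answers are decoded as plain text. Uses the public t5-small
# checkpoint purely as an illustration.
from transformers import T5ForConditionalGeneration, T5Tokenizer

tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")

examples = [
    "translate English to German: The house is wonderful.",
    "cola sentence: The course is jumping well.",  # linguistic acceptability
    "summarize: state authorities dispatched emergency crews on tuesday to survey the damage.",
]

for text in examples:
    inputs = tokenizer(text, return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=32)
    print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```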