Don't lose the message while paraphrasing: A study on content preserving
style transfer
- URL: http://arxiv.org/abs/2308.09055v1
- Date: Thu, 17 Aug 2023 15:41:08 GMT
- Title: Don't lose the message while paraphrasing: A study on content preserving
style transfer
- Authors: Nikolay Babakov, David Dale, Ilya Gusev, Irina Krotova, Alexander
Panchenko
- Abstract summary: Content preservation is critical for real-world applications of style transfer studies.
We compare various style transfer models on the example of the formality transfer domain.
We conduct a precise comparative study of several state-of-the-art techniques for style transfer.
- Score: 61.38460184163704
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Text style transfer techniques are gaining popularity in natural language
processing, allowing text to be paraphrased into a required form: from toxic to
neutral, from formal to informal, from old to modern English, etc.
It is not sufficient to simply generate some neutral/informal/modern
text; it is also important to keep the original content unchanged. This
requirement becomes even more critical in applications such as style
transfer of goal-oriented dialogues, where factual information must be kept
to preserve the original message, e.g. ordering a certain type of pizza to a
certain address at a certain time. The aspect of content preservation is
critical for real-world applications of style transfer studies, but it has
received little attention. To bridge this gap, we compare
various style transfer models in the formality transfer domain.
To study the content preservation abilities of various style
transfer methods, we create a parallel dataset of formal vs. informal
task-oriented dialogues. The key difference between our dataset and the
existing ones like GYAFC [17] is the presence of goal-oriented dialogues with
predefined semantic slots that must be preserved during paraphrasing, e.g. named
entities. This additional annotation allowed us to conduct a precise
comparative study of several state-of-the-art techniques for style transfer.
Another result of our study is a modification of the unsupervised method LEWIS
[19] which yields a substantial improvement over the original method and all
evaluated baselines on the proposed task.
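The slot-preservation requirement described above can be illustrated with a minimal sketch. This is a hypothetical helper, not the paper's evaluation code: it simply checks whether every annotated slot value (e.g. a named entity such as an address) still appears verbatim in the paraphrase.

```python
# Hypothetical illustration of the slot-preservation requirement:
# a paraphrase fails if any annotated slot value no longer appears in it.
def slots_preserved(paraphrase: str, slot_values: list[str]) -> bool:
    """Return True if every annotated slot value survives the paraphrase
    (case-insensitive verbatim match)."""
    text = paraphrase.lower()
    return all(value.lower() in text for value in slot_values)


# Informal source: "could u order a large pepperoni pizza to 5 Main St"
formal = "Could you please order a large pepperoni pizza to 5 Main St?"
slots = ["large pepperoni pizza", "5 Main St"]
print(slots_preserved(formal, slots))  # True: both slot values kept
```

A real evaluation would need fuzzier matching (inflections, reorderings, entity linking), but even this strict check captures the core idea: style should change while predefined semantic slots must not.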
Related papers
- ParaGuide: Guided Diffusion Paraphrasers for Plug-and-Play Textual Style
Transfer [57.6482608202409]
Textual style transfer is the task of transforming stylistic properties of text while preserving meaning.
We introduce a novel diffusion-based framework for general-purpose style transfer that can be flexibly adapted to arbitrary target styles.
We validate the method on the Enron Email Corpus, with both human and automatic evaluations, and find that it outperforms strong baselines on formality, sentiment, and even authorship style transfer.
arXiv Detail & Related papers (2023-08-29T17:36:02Z) - Conversation Style Transfer using Few-Shot Learning [56.43383396058639]
In this paper, we introduce conversation style transfer as a few-shot learning problem.
We propose a novel in-context learning approach to solve the task with style-free dialogues as a pivot.
We show that conversation style transfer can also benefit downstream tasks.
arXiv Detail & Related papers (2023-02-16T15:27:00Z) - StoryTrans: Non-Parallel Story Author-Style Transfer with Discourse
Representations and Content Enhancing [73.81778485157234]
Long texts usually involve more complicated author linguistic preferences such as discourse structures than sentences.
We formulate the task of non-parallel story author-style transfer, which requires transferring an input story into a specified author style.
We use an additional training objective to disentangle stylistic features from the learned discourse representation to prevent the model from degenerating to an auto-encoder.
arXiv Detail & Related papers (2022-08-29T08:47:49Z) - Studying the role of named entities for content preservation in text
style transfer [65.40394342240558]
We focus on the role of named entities in content preservation for formality text style transfer.
We collect a new dataset for the evaluation of content similarity measures in text style transfer.
We perform an error analysis of a pre-trained formality transfer model and introduce a simple technique to use information about named entities to enhance the performance of baseline content similarity measures used in text style transfer.
arXiv Detail & Related papers (2022-06-20T09:31:47Z) - A Review of Text Style Transfer using Deep Learning [0.0]
Text style transfer is a task of adapting and/or changing the stylistic manner in which a sentence is written.
We point out the technological advances in deep neural networks that have been the driving force behind current successes in the fields of natural language understanding and generation.
The review is structured around two key stages in the text style transfer process, namely, representation learning and sentence generation in a new style.
arXiv Detail & Related papers (2021-09-30T14:06:36Z) - Improving Disentangled Text Representation Learning with
Information-Theoretic Guidance [99.68851329919858]
The discrete nature of natural language makes disentangling textual representations more challenging.
Inspired by information theory, we propose a novel method that effectively manifests disentangled representations of text.
Experiments on both conditional text generation and text-style transfer demonstrate the high quality of our disentangled representation.
arXiv Detail & Related papers (2020-06-01T03:36:01Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences.