Exploring Contextual Word-level Style Relevance for Unsupervised Style Transfer
- URL: http://arxiv.org/abs/2005.02049v2
- Date: Fri, 11 Mar 2022 07:47:13 GMT
- Title: Exploring Contextual Word-level Style Relevance for Unsupervised Style Transfer
- Authors: Chulun Zhou, Liangyu Chen, Jiachen Liu, Xinyan Xiao, Jinsong Su, Sheng Guo, Hua Wu
- Abstract summary: Unsupervised style transfer aims to change the style of an input sentence while preserving its original content.
We propose a novel attentional sequence-to-sequence model that exploits the relevance of each output word to the target style.
Experimental results show that our proposed model achieves state-of-the-art performance in terms of both transfer accuracy and content preservation.
- Score: 60.07283363509065
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Unsupervised style transfer aims to change the style of an input sentence while preserving its original content, without using parallel training data. Current dominant approaches lack fine-grained control over the influence of the target style, and are therefore unable to yield desirable output sentences. In this paper, we propose a novel attentional sequence-to-sequence (Seq2seq) model that dynamically exploits the relevance of each output word to the target style for unsupervised style transfer. Specifically, we first pretrain a style classifier, from which the relevance of each input word to the original style can be quantified via layer-wise relevance propagation. In a denoising auto-encoding manner, we then train an attentional Seq2seq model to reconstruct input sentences and simultaneously re-predict the previously quantified word-level style relevance. In this way, the model learns to automatically predict the style relevance of each output word. We then equip the decoder of this model with a neural style component that exploits the predicted word-level style relevance for better style transfer. In particular, we fine-tune this model using a carefully designed objective function that combines style transfer, style relevance consistency, content preservation, and fluency modeling loss terms. Experimental results show that our proposed model achieves state-of-the-art performance in terms of both transfer accuracy and content preservation.
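To make the first stage concrete, below is a minimal sketch (not the authors' code) of pretraining-time word relevance scoring: a small style classifier is queried for how much each input word contributes to the sentence's original style. The paper uses layer-wise relevance propagation; this sketch substitutes gradient-times-input, a simpler attribution, and all model sizes and names are illustrative.

```python
import torch
import torch.nn as nn

class StyleClassifier(nn.Module):
    """Toy sentence-level style classifier operating on word embeddings."""
    def __init__(self, vocab_size=10000, emb_dim=128, hidden=256, num_styles=2):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim)
        self.rnn = nn.GRU(emb_dim, hidden, batch_first=True)
        self.out = nn.Linear(hidden, num_styles)

    def forward(self, emb):                    # emb: (batch, seq, emb_dim)
        _, h = self.rnn(emb)                   # h: (1, batch, hidden)
        return self.out(h.squeeze(0))          # logits: (batch, num_styles)

def word_style_relevance(clf, token_ids, style_label):
    """Relevance of each input word to the sentence's original style."""
    emb = clf.emb(token_ids)                   # (batch, seq, emb_dim)
    emb.retain_grad()                          # keep grads on a non-leaf tensor
    clf(emb)[:, style_label].sum().backward()  # score of the original style
    rel = (emb.grad * emb).sum(-1).abs()       # gradient-times-input per token
    return rel / rel.sum(-1, keepdim=True)     # normalized relevance weights

clf = StyleClassifier()
tokens = torch.randint(0, 10000, (1, 7))       # a toy 7-word "sentence"
print(word_style_relevance(clf, tokens, style_label=1))
```

In the full method, these per-word scores become re-prediction targets for the attentional Seq2seq model, and fine-tuning then combines the style transfer, style relevance consistency, content preservation, and fluency loss terms listed in the abstract.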
Related papers
- Prefix-Tuning Based Unsupervised Text Style Transfer [29.86587278794342]
Unsupervised text style transfer aims at training a generative model that can alter the style of the input sentence while preserving its content.
In this paper, we employ powerful pre-trained large language models and present a new prefix-tuning-based method for unsupervised text style transfer.
arXiv Detail & Related papers (2023-10-23T06:13:08Z)
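The summary names prefix-tuning over a powerful pre-trained LM. Below is a minimal sketch of the general prefix-tuning technique (not this paper's exact architecture), using GPT-2 via the transformers library for concreteness; the prefix length and initialization scale are arbitrary choices.

```python
import torch
import torch.nn as nn
from transformers import GPT2LMHeadModel

class PrefixTunedLM(nn.Module):
    """Frozen GPT-2 steered by a small set of trainable prefix embeddings."""
    def __init__(self, prefix_len=10):
        super().__init__()
        self.lm = GPT2LMHeadModel.from_pretrained("gpt2")
        for p in self.lm.parameters():         # freeze the pre-trained LM
            p.requires_grad = False
        emb_dim = self.lm.config.n_embd
        self.prefix = nn.Parameter(torch.randn(prefix_len, emb_dim) * 0.02)

    def forward(self, input_ids):
        tok = self.lm.transformer.wte(input_ids)               # (B, T, D)
        pre = self.prefix.unsqueeze(0).expand(tok.size(0), -1, -1)
        return self.lm(inputs_embeds=torch.cat([pre, tok], dim=1))
```

Only `self.prefix` receives gradients, so the transfer signal lives in a few thousand parameters while the language model itself stays intact.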
- ParaGuide: Guided Diffusion Paraphrasers for Plug-and-Play Textual Style Transfer [57.6482608202409]
Textual style transfer is the task of transforming stylistic properties of text while preserving meaning.
We introduce a novel diffusion-based framework for general-purpose style transfer that can be flexibly adapted to arbitrary target styles.
We validate the method on the Enron Email Corpus, with both human and automatic evaluations, and find that it outperforms strong baselines on formality, sentiment, and even authorship style transfer.
arXiv Detail & Related papers (2023-08-29T17:36:02Z)
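A common way plug-and-play diffusion frameworks adapt to arbitrary target styles is classifier guidance. The sketch below is a heavily simplified, assumed illustration of one guided denoising step, not the paper's actual sampler; `denoiser` and `style_classifier` are hypothetical callables.

```python
import torch

def guided_denoising_step(x_t, t, denoiser, style_classifier,
                          target_style, scale=1.0):
    """One simplified denoising step steered by an external style classifier."""
    x_t = x_t.detach().requires_grad_(True)
    x0_hat = denoiser(x_t, t)                          # model's clean estimate
    log_p = style_classifier(x0_hat).log_softmax(-1)   # style log-probabilities
    score = log_p[:, target_style].sum()
    grad = torch.autograd.grad(score, x_t)[0]          # d(style score)/d(input)
    return x0_hat + scale * grad                       # nudge toward the style
```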
- Conversation Style Transfer using Few-Shot Learning [56.43383396058639]
In this paper, we introduce conversation style transfer as a few-shot learning problem.
We propose a novel in-context learning approach to solve the task with style-free dialogues as a pivot.
We show that conversation style transfer can also benefit downstream tasks.
arXiv Detail & Related papers (2023-02-16T15:27:00Z)
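The "style-free pivot" idea lends itself to a two-step prompting scheme: first rewrite the source utterance into neutral form, then restyle the neutral text using a few target-style examples. The sketch below only builds the prompts; all wording and helper names are invented for illustration.

```python
def build_pivot_prompts(utterance, target_style_examples):
    """Return the two prompts of an assumed style-free-pivot pipeline."""
    to_neutral = (
        "Rewrite the following utterance in plain, style-free language, "
        f"keeping its meaning:\n{utterance}\nNeutral version:"
    )
    demos = "\n".join(
        f"Neutral: {n}\nStyled: {s}" for n, s in target_style_examples
    )
    def to_target(neutral_text):
        return (
            "Rewrite the neutral utterance in the same style as the "
            f"examples.\n{demos}\nNeutral: {neutral_text}\nStyled:"
        )
    return to_neutral, to_target
```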
- StyleFlow: Disentangle Latent Representations via Normalizing Flow for Unsupervised Text Style Transfer [5.439842512864442]
Style transfer aims to alter the style of a sentence while preserving its content.
In this paper, we propose a novel disentanglement-based style transfer model StyleFlow to enhance content preservation.
arXiv Detail & Related papers (2022-12-19T17:59:18Z)
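The key ingredient here is a normalizing flow over latent representations. Below is a minimal affine-coupling layer, the standard invertible building block of such flows; it illustrates the generic technique only, and the layer sizes are arbitrary.

```python
import torch
import torch.nn as nn

class AffineCoupling(nn.Module):
    """Invertible transform: one half of the latent conditions the other."""
    def __init__(self, dim, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim // 2, hidden), nn.ReLU(),
            nn.Linear(hidden, dim),        # predicts scale and shift jointly
        )

    def forward(self, x):
        x1, x2 = x.chunk(2, dim=-1)        # split the latent in half
        s, t = self.net(x1).chunk(2, dim=-1)
        y2 = x2 * torch.exp(s) + t         # transform x2 conditioned on x1
        log_det = s.sum(-1)                # log-determinant of the Jacobian
        return torch.cat([x1, y2], -1), log_det

    def inverse(self, y):
        y1, y2 = y.chunk(2, dim=-1)
        s, t = self.net(y1).chunk(2, dim=-1)
        return torch.cat([y1, (y2 - t) * torch.exp(-s)], -1)
```

Because the transform is exactly invertible, content encoded in the latent can be recovered after manipulating the style dimensions, which is what makes flows attractive for content preservation.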
- StoryTrans: Non-Parallel Story Author-Style Transfer with Discourse Representations and Content Enhancing [73.81778485157234]
Long texts usually involve more complicated author linguistic preferences, such as discourse structures, than individual sentences do.
We formulate the task of non-parallel story author-style transfer, which requires transferring an input story into a specified author style.
We use an additional training objective to disentangle stylistic features from the learned discourse representation to prevent the model from degenerating to an auto-encoder.
arXiv Detail & Related papers (2022-08-29T08:47:49Z)
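The summary mentions an extra objective that strips stylistic features from the discourse representation. One common way to implement such disentanglement (assumed here, not confirmed by the paper) is adversarial training with a gradient-reversal layer:

```python
import torch
import torch.nn as nn

class GradReverse(torch.autograd.Function):
    """Identity on the forward pass; flips gradients on the backward pass."""
    @staticmethod
    def forward(ctx, x):
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad):
        return -grad

def disentangle_loss(discourse_repr, style_labels, style_probe):
    """The probe learns to detect style; the encoder, receiving the reversed
    gradient, learns to remove style from the discourse representation."""
    logits = style_probe(GradReverse.apply(discourse_repr))
    return nn.functional.cross_entropy(logits, style_labels)
```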
- GTAE: Graph-Transformer based Auto-Encoders for Linguistic-Constrained Text Style Transfer [119.70961704127157]
Non-parallel text style transfer has attracted increasing research interest in recent years.
Current approaches still lack the ability to preserve the content and even the logic of original sentences.
We propose a Graph-Transformer based Auto-Encoder (GTAE), which models a sentence as a linguistic graph and performs feature extraction and style transfer at the graph level.
arXiv Detail & Related papers (2021-02-01T11:08:45Z)
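A toy illustration of "feature extraction at the graph level": self-attention restricted to words linked in the sentence's linguistic (e.g. dependency) graph. This sketches the idea only, not GTAE's architecture.

```python
import torch
import torch.nn as nn

class GraphMaskedAttention(nn.Module):
    """Self-attention masked by the edges of a linguistic graph."""
    def __init__(self, dim):
        super().__init__()
        self.q = nn.Linear(dim, dim)
        self.k = nn.Linear(dim, dim)
        self.v = nn.Linear(dim, dim)

    def forward(self, h, adj):
        # h: (batch, words, dim); adj: (batch, words, words) 0/1 mask.
        # adj should include self-loops so every row attends somewhere.
        scores = self.q(h) @ self.k(h).transpose(-2, -1) / h.size(-1) ** 0.5
        scores = scores.masked_fill(adj == 0, float("-inf"))
        return torch.softmax(scores, dim=-1) @ self.v(h)
```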
- TextSETTR: Few-Shot Text Style Extraction and Tunable Targeted Restyling [23.60231661500702]
We present a novel approach to the problem of text style transfer.
Our method makes use of readily-available unlabeled text by relying on the implicit connection in style between adjacent sentences.
We demonstrate that training on unlabeled Amazon reviews data results in a model that is competitive on sentiment transfer.
arXiv Detail & Related papers (2020-10-08T07:06:38Z)
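The unlabeled-data trick described above can be sketched as follows: adjacent sentences are assumed to share a style, so a style vector extracted from one sentence supervises the reconstruction of its corrupted neighbor. The callables here are placeholders, not TextSETTR's actual modules.

```python
def make_training_example(prev_sentence, next_sentence, corrupt, style_encoder):
    """Build one self-supervised example from two adjacent sentences."""
    style_vec = style_encoder(prev_sentence)   # style taken from the neighbor
    noisy_input = corrupt(next_sentence)       # e.g. drop or shuffle tokens
    target = next_sentence                     # reconstruct under that style
    return noisy_input, style_vec, target
```

At inference, style vectors from user-provided exemplars replace the neighbor's vector, which is what enables few-shot, tunable restyling.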
- Contextual Text Style Transfer [73.66285813595616]
Contextual Text Style Transfer aims to translate a sentence into a desired style with its surrounding context taken into account.
We propose a Context-Aware Style Transfer (CAST) model, which uses two separate encoders for each input sentence and its surrounding context.
Two new benchmarks, Enron-Context and Reddit-Context, are introduced for formality and offensiveness style transfer.
arXiv Detail & Related papers (2020-04-30T23:01:12Z)
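A minimal two-encoder sketch matching the summary's description of CAST: one encoder reads the sentence, another reads its surrounding context, and the decoder would condition on their fused representation. Layer types and sizes are arbitrary stand-ins.

```python
import torch
import torch.nn as nn

class ContextAwareEncoder(nn.Module):
    """Separate encoders for the sentence and its surrounding context."""
    def __init__(self, emb_dim=128, hidden=256):
        super().__init__()
        self.sent_enc = nn.GRU(emb_dim, hidden, batch_first=True)
        self.ctx_enc = nn.GRU(emb_dim, hidden, batch_first=True)
        self.fuse = nn.Linear(2 * hidden, hidden)

    def forward(self, sent_emb, ctx_emb):
        _, hs = self.sent_enc(sent_emb)        # sentence representation
        _, hc = self.ctx_enc(ctx_emb)          # context representation
        return torch.tanh(self.fuse(torch.cat([hs, hc], dim=-1)))
```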