Politeness Transfer: A Tag and Generate Approach
- URL: http://arxiv.org/abs/2004.14257v2
- Date: Fri, 1 May 2020 22:33:41 GMT
- Title: Politeness Transfer: A Tag and Generate Approach
- Authors: Aman Madaan, Amrith Setlur, Tanmay Parekh, Barnabas Poczos, Graham
Neubig, Yiming Yang, Ruslan Salakhutdinov, Alan W Black, Shrimai Prabhumoye
- Abstract summary: This paper introduces a new task of politeness transfer.
It involves converting non-polite sentences to polite sentences while preserving the meaning.
We design a tag and generate pipeline that identifies stylistic attributes and subsequently generates a sentence in the target style.
- Score: 167.9924201435888
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: This paper introduces a new task of politeness transfer which involves
converting non-polite sentences to polite sentences while preserving the
meaning. We also provide a dataset of more than 1.39 million instances automatically
labeled for politeness to encourage benchmark evaluations on this new task. We
design a tag and generate pipeline that identifies stylistic attributes and
subsequently generates a sentence in the target style while preserving most of
the source content. For politeness as well as five other transfer tasks, our
model outperforms the state-of-the-art methods on automatic metrics for content
preservation, with a comparable or better performance on style transfer
accuracy. Additionally, our model surpasses existing methods on human
evaluations for grammaticality, meaning preservation and transfer accuracy
across all six style transfer tasks. The data and code are located at
https://github.com/tag-and-generate.
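For readers who want a concrete feel for the two-stage pipeline described in the abstract, here is a minimal, self-contained Python sketch of the tag-and-generate idea. It is not the authors' released implementation (see the GitHub repository above for that); the toy corpora, the unigram salience ratio, and the hand-written polite phrase table are all illustrative assumptions.

```python
# Minimal sketch of the two-stage "tag and generate" idea: (1) tag style-carrying
# tokens in the source sentence, (2) generate target-style phrasing in their place.
# NOT the authors' released code; corpora, threshold, and phrase table are toy assumptions.

from collections import Counter

# Toy style corpora (assumption: in practice these would come from the
# politeness-labeled dataset released with the paper).
NON_POLITE = ["send me the report now", "give me the file", "do it now"]
POLITE = ["could you please send me the report", "would you kindly share the file"]

def ngram_counts(sentences, n=1):
    """Count n-grams over whitespace-tokenized sentences."""
    counts = Counter()
    for s in sentences:
        toks = s.split()
        for i in range(len(toks) - n + 1):
            counts[" ".join(toks[i:i + n])] += 1
    return counts

def style_markers(src_corpus, tgt_corpus, threshold=2.0):
    """Phrases markedly more frequent in the source style than in the target style."""
    src, tgt = ngram_counts(src_corpus), ngram_counts(tgt_corpus)
    return {g for g, c in src.items() if (c + 1) / (tgt.get(g, 0) + 1) >= threshold}

def tag(sentence, markers):
    """Stage 1: replace style-carrying tokens with a [TAG] placeholder."""
    return " ".join("[TAG]" if tok in markers else tok for tok in sentence.split())

def generate(tagged, phrase_table=("please",)):
    """Stage 2: stand-in for the trained generator; fill each placeholder
    with a polite phrase from a tiny hand-written table."""
    out, i = [], 0
    for tok in tagged.split():
        if tok == "[TAG]":
            out.append(phrase_table[i % len(phrase_table)])
            i += 1
        else:
            out.append(tok)
    return " ".join(out)

markers = style_markers(NON_POLITE, POLITE)
tagged = tag("send me the report now", markers)
print(tagged)            # e.g. "send me the report [TAG]" (depends on the toy corpora)
print(generate(tagged))  # e.g. "send me the report please"
```

In the paper the generation stage is a trained sequence-to-sequence model conditioned on the tagged sentence; the phrase-table lookup above merely stands in for it to keep the sketch runnable without downloading a model.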
Related papers
- Don't lose the message while paraphrasing: A study on content preserving
style transfer [61.38460184163704]
Content preservation is critical for real-world applications of style transfer studies.
We compare various style transfer models, using formality transfer as an example domain.
We conduct a precise comparative study of several state-of-the-art techniques for style transfer.
arXiv Detail & Related papers (2023-08-17T15:41:08Z) - StoryTrans: Non-Parallel Story Author-Style Transfer with Discourse
Representations and Content Enhancing [73.81778485157234]
Long texts usually involve more complex markers of an author's linguistic preferences, such as discourse structure, than individual sentences do.
We formulate the task of non-parallel story author-style transfer, which requires transferring an input story into a specified author style.
We use an additional training objective to disentangle stylistic features from the learned discourse representation to prevent the model from degenerating to an auto-encoder.
arXiv Detail & Related papers (2022-08-29T08:47:49Z) - Few-shot Controllable Style Transfer for Low-Resource Settings: A Study
in Indian Languages [13.980482277351523]
Style transfer is the task of rewriting an input sentence into a target style while preserving its content.
We push the state-of-the-art for few-shot style transfer with a new method modeling the stylistic difference between paraphrases.
Our model achieves 2-3x better performance and output diversity in formality transfer and code-mixing addition across five Indian languages.
arXiv Detail & Related papers (2021-10-14T14:16:39Z) - Teach me how to Label: Labeling Functions from Natural Language with
Text-to-text Transformers [0.5330240017302619]
This paper focuses on the task of turning natural language descriptions into Python labeling functions.
We follow a novel approach to semantic parsing with pre-trained text-to-text Transformers.
Our approach can be regarded as a stepping stone towards models that are taught how to label in natural language.
arXiv Detail & Related papers (2021-01-18T16:04:15Z) - Conditioned Text Generation with Transfer for Closed-Domain Dialogue
Systems [65.48663492703557]
We show how to optimally train and control the generation of intent-specific sentences using a conditional variational autoencoder.
We introduce a new protocol called query transfer that allows leveraging a large unlabelled dataset.
arXiv Detail & Related papers (2020-11-03T14:06:10Z) - Reformulating Unsupervised Style Transfer as Paraphrase Generation [48.83148014000888]
We reformulate unsupervised style transfer as a paraphrase generation problem.
We present a simple methodology based on fine-tuning pretrained language models on automatically generated paraphrase data.
We also pivot to a more real-world style transfer setting by collecting a large dataset of 15M sentences in 11 diverse styles.
arXiv Detail & Related papers (2020-10-12T13:31:01Z) - Contextual Text Style Transfer [73.66285813595616]
Contextual Text Style Transfer aims to translate a sentence into a desired style with its surrounding context taken into account.
We propose a Context-Aware Style Transfer (CAST) model, which uses two separate encoders, one for the input sentence and one for its surrounding context.
Two new benchmarks, Enron-Context and Reddit-Context, are introduced for formality and offensiveness style transfer.
arXiv Detail & Related papers (2020-04-30T23:01:12Z)
This list is automatically generated from the titles and abstracts of the papers on this site.