Towards Attribute-Entangled Controllable Text Generation: A Pilot Study
of Blessing Generation
- URL: http://arxiv.org/abs/2210.16557v1
- Date: Sat, 29 Oct 2022 10:19:48 GMT
- Title: Towards Attribute-Entangled Controllable Text Generation: A Pilot Study
of Blessing Generation
- Authors: Shulin Huang, Shirong Ma, Yinghui Li, Yangning Li, Shiyang Lin,
Hai-Tao Zheng and Ying Shen
- Abstract summary: Controllable Text Generation (CTG) has achieved great success due to the fine-grained generation ability it obtains by focusing on multiple attributes.
We focus on a novel CTG scenario, blessing generation, which is challenging because high-quality blessing texts require CTG models to comprehensively consider the entanglement between multiple attributes.
We present EBleT, a large-scale Entangled Blessing Text dataset containing 293K English sentences annotated with multiple attributes.
- Score: 24.921292269906175
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Controllable Text Generation (CTG) has achieved great success due to
the fine-grained generation ability it obtains by focusing on multiple
attributes. However, most existing CTG research overlooks how to exploit
attribute entanglement to enhance the diversity of the controlled generated
texts. Facing this dilemma, we focus on a novel CTG scenario, blessing
generation, which is challenging because high-quality blessing texts require
CTG models to comprehensively consider the entanglement between multiple
attributes (e.g., objects and occasions). To promote research on blessing
generation, we present EBleT, a large-scale Entangled Blessing Text dataset
containing 293K English sentences annotated with multiple attributes.
Furthermore, we propose novel evaluation metrics to measure the quality of the
blessing texts generated by the baseline models we design. Our study opens a
new research direction for controllable text generation and enables the
development of attribute-entangled CTG models. Our dataset and source code are
available at \url{https://github.com/huangshulin123/Blessing-Generation}.
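The kind of attribute annotation the abstract describes can be pictured with a minimal sketch. The field names below (`sentence`, `object`, `occasion`) and the example records are illustrative assumptions, not the actual EBleT schema:

```python
from collections import Counter

# Hypothetical attribute-annotated blessing records; the layout is an
# assumption for illustration, not the real EBleT format.
records = [
    {"sentence": "Wishing you joy on your graduation day!",
     "object": "friend", "occasion": "graduation"},
    {"sentence": "Happy birthday, and may every wish come true.",
     "object": "family member", "occasion": "birthday"},
]

# Counting entangled attribute pairs (object, occasion) shows which
# attribute combinations a model must learn to handle jointly.
pair_counts = Counter((r["object"], r["occasion"]) for r in records)
print(pair_counts.most_common())
```

Inspecting pair frequencies like this is one simple way to check how well a dataset covers the joint attribute space rather than each attribute in isolation.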
Related papers
- Retrieval is Accurate Generation [99.24267226311157]
We introduce a novel method that selects context-aware phrases from a collection of supporting documents.
Our model achieves the best performance and the lowest latency among several retrieval-augmented baselines.
arXiv Detail & Related papers (2024-02-27T14:16:19Z)
- Personalized Text Generation with Fine-Grained Linguistic Control [9.668216418094316]
We focus on controlling fine-grained attributes spanning multiple linguistic dimensions.
We introduce a novel benchmark to train generative models and evaluate their ability to generate personalized text.
arXiv Detail & Related papers (2024-02-07T14:41:08Z)
- Successor Features for Efficient Multisubject Controlled Text Generation [48.37713738712319]
We introduce SF-GEN, which is grounded in two primary concepts: successor features (SFs) and language model rectification.
SF-GEN seamlessly integrates the two to enable dynamic steering of text generation with no need to alter the LLM's parameters.
To the best of our knowledge, our research represents the first application of successor features in text generation.
arXiv Detail & Related papers (2023-11-03T00:17:08Z)
- Deliberate then Generate: Enhanced Prompting Framework for Text Generation [70.10319005141888]
The Deliberate then Generate (DTG) prompting framework consists of error-detection instructions and candidate outputs that may contain errors.
We conduct extensive experiments on 20+ datasets across 7 text generation tasks, including summarization, translation, dialogue, and more.
We show that DTG consistently outperforms existing prompting methods and achieves state-of-the-art performance on multiple text generation tasks.
arXiv Detail & Related papers (2023-05-31T13:23:04Z)
- Learning to Transfer Prompts for Text Generation [97.64625999380425]
We propose a novel prompt-based method (PTG) for text generation in a transferable setting.
First, PTG learns a set of source prompts for various source generation tasks and then transfers these prompts as target prompts to perform target generation tasks.
In extensive experiments, PTG yields competitive or better results than fine-tuning methods.
arXiv Detail & Related papers (2022-05-03T14:53:48Z)
- A Survey on Retrieval-Augmented Text Generation [53.04991859796971]
Retrieval-augmented text generation has remarkable advantages and has achieved state-of-the-art performance in many NLP tasks.
The survey first highlights the generic paradigm of retrieval-augmented generation and then reviews notable approaches according to different tasks.
arXiv Detail & Related papers (2022-02-02T16:18:41Z)
- Outline to Story: Fine-grained Controllable Story Generation from Cascaded Events [39.577220559911055]
We propose a new task named "Outline to Story" (O2S) as a test bed for fine-grained controllable generation of long text.
We then create datasets for future benchmarks, built with state-of-the-art keyword extraction techniques.
arXiv Detail & Related papers (2021-01-04T08:16:21Z)
- GenAug: Data Augmentation for Finetuning Text Generators [21.96895115572357]
We propose and evaluate various augmentation methods, including some that incorporate external knowledge, for finetuning GPT-2 on a subset of Yelp Reviews.
Our experiments demonstrate that insertion of character-level synthetic noise and keyword replacement with hypernyms are effective augmentation methods.
arXiv Detail & Related papers (2020-10-05T05:46:39Z)
- Controllable Text Generation with Focused Variation [71.07811310799664]
Focused-Variation Network (FVN) is a novel model to control language generation.
FVN learns disjoint discrete latent spaces for each attribute inside codebooks, which allows for both controllability and diversity.
We evaluate FVN on two text generation datasets with annotated content and style, and show state-of-the-art performance as assessed by automatic and human evaluations.
arXiv Detail & Related papers (2020-09-25T06:31:06Z)
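The per-attribute codebook idea in the FVN summary above can be pictured with a small lookup sketch. The codebook sizes and the nearest-neighbour assignment are generic vector-quantization-style assumptions, not the actual FVN architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

# One small discrete codebook per attribute, kept disjoint so that each
# attribute controls its own region of the latent space. Sizes (8 codes,
# 4 dimensions) are arbitrary illustrative choices.
codebooks = {
    "content": rng.normal(size=(8, 4)),
    "style": rng.normal(size=(8, 4)),
}

def quantize(attr: str, z: np.ndarray) -> np.ndarray:
    """Snap a continuous latent vector to the nearest code in the given
    attribute's codebook (a generic VQ-style assignment, assumed here)."""
    book = codebooks[attr]
    idx = int(np.argmin(np.linalg.norm(book - z, axis=1)))
    return book[idx]

# Quantizing against separate codebooks lets a decoder condition on one
# discrete code per attribute, which is what enables controllability.
z = rng.normal(size=4)
code = quantize("style", z)
```

Because each attribute has its own codebook, swapping the code for one attribute changes that attribute alone, while sampling different codes yields diversity.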
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences.