Outline to Story: Fine-grained Controllable Story Generation from
Cascaded Events
- URL: http://arxiv.org/abs/2101.00822v1
- Date: Mon, 4 Jan 2021 08:16:21 GMT
- Title: Outline to Story: Fine-grained Controllable Story Generation from
Cascaded Events
- Authors: Le Fang, Tao Zeng, Chaochun Liu, Liefeng Bo, Wen Dong, Changyou Chen
- Abstract summary: We propose a new task named "Outline to Story" (O2S) as a test bed for fine-grained controllable generation of long text.
We then create datasets for future benchmarks, built with state-of-the-art keyword extraction techniques.
- Score: 39.577220559911055
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Large-scale pretrained language models have shown thrilling generation
capabilities, especially when they generate consistent long text in thousands
of words with ease. However, users of these models can only control the prefix
of sentences or certain global aspects of generated text. It is challenging to
simultaneously achieve fine-grained controllability and preserve the
state-of-the-art unconditional text generation capability. In this paper, we
first propose a new task named "Outline to Story" (O2S) as a test bed for
fine-grained controllable generation of long text, which generates a
multi-paragraph story from cascaded events, i.e. a sequence of outline events
that guide subsequent paragraph generation. We then create dedicated datasets
for future benchmarks, built with state-of-the-art keyword extraction techniques.
Finally, we propose an extremely simple yet strong baseline method for the O2S
task, which fine-tunes pre-trained language models on augmented sequences of
outline-story pairs with a simple language modeling objective. Our method does
not introduce any new parameters or perform any architecture modification,
except several special tokens as delimiters to build augmented sequences.
Extensive experiments on various datasets demonstrate state-of-the-art
conditional story generation performance with our model, achieving better
fine-grained controllability and user flexibility. To our knowledge, our paper
is among the first to propose a model and to create datasets for the task of
"outline to story". Our work also instantiates research interest in
fine-grained controllable generation of open-domain long text, where the
controlling inputs are represented as short text.
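The baseline described in the abstract amounts to ordinary language-model fine-tuning on flattened outline-story sequences. The snippet below is a minimal sketch of that idea, assuming GPT-2 via Hugging Face transformers and hypothetical delimiter tokens (<otl>, <e>, <sty>); the paper does not specify the exact tokens or framework it uses.

```python
# Minimal sketch of the baseline idea: fine-tune a pretrained LM on augmented
# sequences that join an outline and its story with special delimiter tokens,
# using the ordinary language-modeling loss. The token names <otl>, <e>, <sty>
# and the GPT-2 / Hugging Face setup are assumptions, not the authors' exact
# configuration.
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

# Hypothetical delimiters: <otl> opens the outline, <e> separates events,
# <sty> opens the story. Adding them only grows the embedding table.
tokenizer.add_special_tokens({"additional_special_tokens": ["<otl>", "<e>", "<sty>"]})
model.resize_token_embeddings(len(tokenizer))

def build_augmented_sequence(outline_events, story):
    """Flatten one outline-story pair into a single training sequence."""
    outline = " <e> ".join(outline_events)
    return f"<otl> {outline} <sty> {story}{tokenizer.eos_token}"

# One fine-tuning step with the standard LM objective: labels are the inputs,
# so nothing is added beyond the delimiter embeddings.
text = build_augmented_sequence(
    ["a knight leaves the castle", "a storm traps him in the forest"],
    "The knight rode out at dawn...",
)
batch = tokenizer(text, return_tensors="pt")
loss = model(**batch, labels=batch["input_ids"]).loss
loss.backward()
```

At inference time, the same delimiter layout would let a user supply the outline as a prefix and sample the story as a continuation.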
Related papers
- Retrieval is Accurate Generation [99.24267226311157]
We introduce a novel method that selects context-aware phrases from a collection of supporting documents.
Our model achieves the best performance and the lowest latency among several retrieval-augmented baselines.
arXiv Detail & Related papers (2024-02-27T14:16:19Z) - RSTGen: Imbuing Fine-Grained Interpretable Control into Long-FormText
Generators [26.27412809287025]
RSTGen is a framework that controls the discourse structure, semantics and topics of generated text.
We demonstrate our model's ability to control structural discourse and semantic features of generated text in open generation evaluation.
arXiv Detail & Related papers (2022-05-25T09:06:04Z) - Event Transition Planning for Open-ended Text Generation [55.729259805477376]
Open-ended text generation tasks require models to generate a coherent continuation given limited preceding context.
We propose a novel two-stage method which explicitly arranges the ensuing events in open-ended text generation.
Our approach can be understood as a specially-trained coarse-to-fine algorithm.
arXiv Detail & Related papers (2022-04-20T13:37:51Z) - Data-to-text Generation with Variational Sequential Planning [74.3955521225497]
We consider the task of data-to-text generation, which aims to create textual output from non-linguistic input.
We propose a neural model enhanced with a planning component responsible for organizing high-level information in a coherent and meaningful way.
We infer latent plans sequentially with a structured variational model, while interleaving the steps of planning and generation.
arXiv Detail & Related papers (2022-02-28T13:17:59Z) - Facts2Story: Controlling Text Generation by Key Facts [0.0]
We propose a controlled generation task based on expanding a sequence of facts, expressed in natural language, into a longer narrative.
We show that while auto-regressive, unidirectional language models such as GPT-2 produce better fluency, they struggle to adhere to the requested facts.
We propose a plan-and-cloze model (using fine-tuned XLNet) which produces competitive fluency while adhering to the requested content.
arXiv Detail & Related papers (2020-12-08T10:14:29Z) - KGPT: Knowledge-Grounded Pre-Training for Data-to-Text Generation [100.79870384880333]
We propose a knowledge-grounded pre-training (KGPT) to generate knowledge-enriched text.
We adopt three settings, namely fully-supervised, zero-shot, and few-shot, to evaluate its effectiveness.
Under the zero-shot setting, our model achieves over 30 ROUGE-L on WebNLG while all other baselines fail.
arXiv Detail & Related papers (2020-10-05T19:59:05Z) - POINTER: Constrained Progressive Text Generation via Insertion-based
Generative Pre-training [93.79766670391618]
We present POINTER, a novel insertion-based approach for hard-constrained text generation.
The proposed method operates by progressively inserting new tokens between existing tokens in a parallel manner.
The resulting coarse-to-fine hierarchy makes the generation process intuitive and interpretable.
arXiv Detail & Related papers (2020-05-01T18:11:54Z)