Text-Blueprint: An Interactive Platform for Plan-based Conditional
Generation
- URL: http://arxiv.org/abs/2305.00034v1
- Date: Fri, 28 Apr 2023 18:14:48 GMT
- Title: Text-Blueprint: An Interactive Platform for Plan-based Conditional
Generation
- Authors: Fantine Huot, Joshua Maynez, Shashi Narayan, Reinald Kim Amplayo,
Kuzman Ganchev, Annie Louis, Anders Sandholm, Dipanjan Das, Mirella Lapata
- Abstract summary: Planning can be a useful intermediate step to render conditional generation less opaque and more grounded.
We present a web browser-based demonstration for query-focused summarization that uses a sequence of question-answer pairs as a blueprint plan.
- Score: 84.95981645040281
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: While conditional generation models can now generate natural language well
enough to create fluent text, it is still difficult to control the generation
process, leading to irrelevant, repetitive, and hallucinated content. Recent
work shows that planning can be a useful intermediate step to render
conditional generation less opaque and more grounded. We present a web
browser-based demonstration for query-focused summarization that uses a
sequence of question-answer pairs as a blueprint plan for guiding text
generation (i.e., what to say and in what order). We illustrate how users may
interact with the generated text and associated plan visualizations, e.g., by
editing and modifying the blueprint in order to improve or control the
generated output.
A short video demonstrating our system is available at
https://goo.gle/text-blueprint-demo.
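To make the blueprint idea concrete, here is a minimal sketch of how an ordered question-answer plan might be represented and serialized into the conditioning input of a generic encoder-decoder summarizer. The class names, serialization format, and build_model_input helper are illustrative assumptions, not the authors' released code.

```python
# Illustrative sketch only: the serialization format and helper names are
# assumptions, not the Text-Blueprint system's actual implementation.
from dataclasses import dataclass
from typing import List


@dataclass
class QAPair:
    question: str
    answer: str


def serialize_blueprint(plan: List[QAPair]) -> str:
    # Flatten the ordered QA plan into a single plan string.
    return " ".join(f"Q: {p.question} A: {p.answer}" for p in plan)


def build_model_input(query: str, document: str, plan: List[QAPair]) -> str:
    # Concatenate the user query, the blueprint plan, and the source document.
    # A blueprint-conditioned seq2seq model would decode the summary from this
    # string, so the plan controls what to say and in what order.
    return f"query: {query} plan: {serialize_blueprint(plan)} document: {document}"


if __name__ == "__main__":
    plan = [
        QAPair("What is presented?", "A web-based demo for query-focused summarization."),
        QAPair("How is generation controlled?", "By editing a question-answer blueprint."),
    ]
    # Editing the plan (reordering, rewriting, or deleting a pair) changes the
    # conditioning string, and hence the text the model generates.
    print(build_model_input("What does the demo do?", "<source document text>", plan))
```

In the demo itself, such edits are made through the plan visualization in the browser rather than by constructing conditioning strings by hand.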
Related papers
- Plug-and-Play Recipe Generation with Content Planning [28.65323853250831]
We propose a framework which explicitly models the global content plan of the generated text.
It optimizes the joint distribution of the natural language sequence and the global content plan in a plug-and-play manner.
Our model achieves the state-of-the-art performance on the task of recipe generation.
arXiv Detail & Related papers (2022-12-09T19:39:10Z)
- Conditional Generation with a Question-Answering Blueprint [84.95981645040281]
We advocate planning as a useful intermediate representation for rendering conditional generation less opaque and more grounded.
We obtain blueprints automatically by exploiting state-of-the-art question generation technology.
We develop Transformer-based models, each varying in how they incorporate the blueprint in the generated output.
arXiv Detail & Related papers (2022-07-01T13:10:19Z)
- Event Transition Planning for Open-ended Text Generation [55.729259805477376]
Open-ended text generation tasks require models to generate a coherent continuation given limited preceding context.
We propose a novel two-stage method which explicitly arranges the ensuing events in open-ended text generation.
Our approach can be understood as a specially-trained coarse-to-fine algorithm.
arXiv Detail & Related papers (2022-04-20T13:37:51Z)
- PLANET: Dynamic Content Planning in Autoregressive Transformers for Long-form Text Generation [47.97523895218194]
We propose a novel generation framework that leverages the autoregressive self-attention mechanism to conduct content planning and surface realization dynamically.
Our framework enriches the Transformer decoder with latent representations to maintain sentence-level semantic plans grounded by bag-of-words.
arXiv Detail & Related papers (2022-03-17T05:52:35Z)
- Outline to Story: Fine-grained Controllable Story Generation from Cascaded Events [39.577220559911055]
We propose a new task named "Outline to Story" (O2S) as a test bed for fine-grained controllable generation of long text.
We then create datasets for future benchmarks, built with state-of-the-art keyword extraction techniques.
arXiv Detail & Related papers (2021-01-04T08:16:21Z)
- POINTER: Constrained Progressive Text Generation via Insertion-based Generative Pre-training [93.79766670391618]
We present POINTER, a novel insertion-based approach for hard-constrained text generation.
The proposed method operates by progressively inserting new tokens between existing tokens in a parallel manner.
The resulting coarse-to-fine hierarchy makes the generation process intuitive and interpretable.
arXiv Detail & Related papers (2020-05-01T18:11:54Z)
- PALM: Pre-training an Autoencoding&Autoregressive Language Model for Context-conditioned Generation [92.7366819044397]
Self-supervised pre-training has emerged as a powerful technique for natural language understanding and generation.
This work presents PALM with a novel scheme that jointly pre-trains an autoencoding and autoregressive language model on a large unlabeled corpus.
An extensive set of experiments shows that PALM achieves new state-of-the-art results on a variety of language generation benchmarks.
arXiv Detail & Related papers (2020-04-14T06:25:36Z)
This list is automatically generated from the titles and abstracts of the papers on this site.