Sequentially Controlled Text Generation
- URL: http://arxiv.org/abs/2301.02299v1
- Date: Thu, 5 Jan 2023 21:23:51 GMT
- Title: Sequentially Controlled Text Generation
- Authors: Alexander Spangher, Xinyu Hua, Yao Ming, Nanyun Peng
- Abstract summary: While GPT-2 generates sentences that are remarkably human-like, longer documents can ramble and fail to follow human-like writing structure.
We study the problem of imposing structure on long-range text.
We develop a sequential controlled text generation pipeline with generation and editing.
- Score: 97.22539956688443
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: While GPT-2 generates sentences that are remarkably human-like, longer documents can ramble and fail to follow human-like writing structure. We study the problem of imposing structure on long-range text. We propose a novel controlled text generation task, sequentially controlled text generation, and identify a dataset, NewsDiscourse, as a starting point for this task. We develop a sequential controlled text generation pipeline with generation and editing. We test different degrees of structural awareness and show that, in general, more structural awareness results in higher control accuracy, grammaticality, coherence, and topicality, approaching human-level writing performance.
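As a rough illustration of the pipeline (not the authors' code), the sketch below generates a document sentence by sentence under a discourse-role plan and re-samples sentences whose realized role drifts from the target. `lm_generate`, `classify_role`, and the role labels are hypothetical stand-ins, with labels in the style of NewsDiscourse.

```python
# Illustrative sketch of sequentially controlled generation with editing.
# `lm_generate` and `classify_role` are hypothetical stand-ins for a
# control-code language model and a discourse-role classifier.

PLAN = ["Main Event", "Consequence", "Current Context", "Evaluation"]

def lm_generate(context: str, role: str) -> str:
    # Stand-in for a fine-tuned LM sampling p(sentence | context, role).
    return f"<sentence realizing '{role}' after {len(context)} chars>"

def classify_role(sentence: str) -> str:
    # Stand-in for a sentence-level discourse-role classifier.
    return "Main Event"  # placeholder prediction

def generate_document(headline: str, plan: list[str], max_edits: int = 2) -> list[str]:
    sentences: list[str] = []
    for role in plan:
        context = " ".join([headline] + sentences)
        sent = lm_generate(context, role)
        # Editing pass: re-sample while the realized role misses the target.
        for _ in range(max_edits):
            if classify_role(sent) == role:
                break
            sent = lm_generate(context, role)
        sentences.append(sent)
    return sentences

print(generate_document("City council passes new housing bill.", PLAN))
```

In this framing, "more structural awareness" corresponds to how much of the loop is conditioned on the discourse plan.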
Related papers
- Instruct-SCTG: Guiding Sequential Controlled Text Generation through Instructions [42.67608830386934]
Instruct-SCTG is a sequential framework that harnesses instruction-tuned language models to generate structurally coherent text.
Our framework generates articles section by section, aligned with the desired human writing structure using natural language instructions.
arXiv Detail & Related papers (2023-12-19T16:20:49Z)
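A rough sketch of this section-by-section, instruction-driven loop follows; the prompt template and the `instruct_lm` stub are invented for illustration and are not the paper's exact interface.

```python
# Sketch of instruction-driven, section-by-section article generation.
# `instruct_lm` is a stand-in for any instruction-tuned language model.

def instruct_lm(prompt: str) -> str:
    # Stand-in for an instruction-tuned LM (e.g. a chat-completions call).
    return f"<section text for a prompt of {len(prompt)} chars>"

def write_article(topic: str, structure: list[str]) -> str:
    article = ""
    for i, role in enumerate(structure, 1):
        prompt = (
            f"Topic: {topic}\n"
            f"Article so far:\n{article}\n"
            f"Instruction: write section {i}, which should serve as the "
            f"'{role}' of the article."
        )
        article += instruct_lm(prompt) + "\n\n"
    return article

print(write_article("quantum computing startups",
                    ["lead", "background", "expert reaction", "outlook"]))
```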
- Automatic and Human-AI Interactive Text Generation [27.05024520190722]
This tutorial aims to provide an overview of state-of-the-art natural language generation research.
Text-to-text generation tasks are more constrained in terms of semantic consistency and targeted language styles.
arXiv Detail & Related papers (2023-10-05T20:26:15Z)
- RSTGen: Imbuing Fine-Grained Interpretable Control into Long-Form Text Generators [26.27412809287025]
RSTGen is a framework that controls the discourse structure, semantics and topics of generated text.
We demonstrate our model's ability to control the structural discourse and semantic features of generated text in open-ended generation evaluation.
arXiv Detail & Related papers (2022-05-25T09:06:04Z)
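One plausible way to expose such discourse control to a generator is to linearize an RST-style plan into control tokens, sketched below; the token format and the `controlled_lm` stub are illustrative assumptions, not RSTGen's actual interface.

```python
# Sketch: condition generation on a linearized RST-style discourse plan.
# The control-token format and `controlled_lm` are illustrative stubs.

def controlled_lm(control_prefix: str, prompt: str) -> str:
    # Stand-in for a generator conditioned on the control prefix.
    return f"<text continuing {prompt!r} under {control_prefix!r}>"

# A tiny RST-like plan: (relation, nucleus/satellite order) per step.
rst_plan = [("Elaboration", "NS"), ("Contrast", "NN"), ("Cause", "SN")]

# Linearize the plan into control tokens prepended to the input.
control = " ".join(f"<rel={rel}> <order={order}>" for rel, order in rst_plan)
print(controlled_lm(control, "The new policy took effect on Monday."))
```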
- Event Transition Planning for Open-ended Text Generation [55.729259805477376]
Open-ended text generation tasks require models to generate a coherent continuation given limited preceding context.
We propose a novel two-stage method which explicitly arranges the ensuing events in open-ended text generation.
Our approach can be understood as a specially-trained coarse-to-fine algorithm.
arXiv Detail & Related papers (2022-04-20T13:37:51Z)
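A minimal sketch of the two-stage, coarse-to-fine idea, with the hypothetical stubs `plan_events` and `realize` standing in for the paper's fine-tuned planner and generator:

```python
# Sketch of coarse-to-fine, two-stage generation: plan an event chain,
# then realize each planned event as text. Both stubs are hypothetical.

def plan_events(context: str, n_events: int = 3) -> list[str]:
    # Stage 1: an event-transition planner would extend the event chain
    # extracted from the preceding context.
    return [f"<event {i} following '{context[:24]}...'>" for i in range(n_events)]

def realize(story: str, event: str) -> str:
    # Stage 2: the generator conditions on the story so far and the
    # planned event to produce the next sentence.
    return f"<sentence realizing {event}>"

def continue_story(context: str) -> str:
    story = context
    for event in plan_events(context):
        story += " " + realize(story, event)
    return story

print(continue_story("Mia found an old key under the floorboards."))
```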
- PLANET: Dynamic Content Planning in Autoregressive Transformers for Long-form Text Generation [47.97523895218194]
We propose a novel generation framework that leverages the autoregressive self-attention mechanism to conduct content planning and surface realization dynamically.
Our framework enriches the Transformer decoder with latent representations to maintain sentence-level semantic plans grounded by bag-of-words.
arXiv Detail & Related papers (2022-03-17T05:52:35Z)
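The bag-of-words grounding can be pictured with a toy auxiliary loss in which a latent sentence plan must predict its sentence's content words; the tiny vocabulary, plan scores, and softmax projection below are illustrative assumptions, not PLANET's architecture.

```python
# Toy auxiliary objective: a latent sentence plan predicts the bag of
# content words of its sentence. All values here are illustrative.
import math

VOCAB = {"council": 0, "housing": 1, "bill": 2, "vote": 3}

def bow_probs(plan_scores: list[float]) -> list[float]:
    # Stand-in for softmax(W @ z): here a softmax over raw plan scores.
    exps = [math.exp(s) for s in plan_scores]
    total = sum(exps)
    return [e / total for e in exps]

def bow_loss(plan_scores: list[float], sentence: str) -> float:
    # Average negative log-likelihood of the sentence's in-vocabulary
    # words under the plan's predicted bag-of-words distribution.
    probs = bow_probs(plan_scores)
    ids = [VOCAB[w] for w in sentence.split() if w in VOCAB]
    return -sum(math.log(probs[i]) for i in ids) / max(len(ids), 1)

plan = [0.2, 1.5, 1.1, -0.3]  # hypothetical latent plan scores over VOCAB
print(round(bow_loss(plan, "the council passed the housing bill"), 3))
```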
- Data-to-text Generation with Variational Sequential Planning [74.3955521225497]
We consider the task of data-to-text generation, which aims to create textual output from non-linguistic input.
We propose a neural model enhanced with a planning component responsible for organizing high-level information in a coherent and meaningful way.
We infer latent plans sequentially with a structured variational model, while interleaving the steps of planning and generation.
arXiv Detail & Related papers (2022-02-28T13:17:59Z)
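The interleaving of planning and generation might look like the following toy loop, in which the latent plan at each step picks the input record to verbalize next; the uniform sampler and template generator are stand-ins for the paper's variational components.

```python
# Toy interleaving of planning and generation for data-to-text: sample a
# latent plan z_t (which record to verbalize), then generate a sentence.
import random

def sample_plan(records: list[dict], history: list[int]) -> int:
    # Stand-in for sampling z_t from the model's (approximate) posterior;
    # here: uniformly pick a record not yet verbalized.
    unused = [i for i in range(len(records)) if i not in history]
    return random.choice(unused)

def verbalize(record: dict) -> str:
    # Stand-in for the decoder p(sentence | z_t, records, history).
    return f"{record['entity']} recorded {record['value']} {record['field']}."

records = [
    {"entity": "L. James", "field": "points", "value": 32},
    {"entity": "L. James", "field": "assists", "value": 9},
    {"entity": "A. Davis", "field": "rebounds", "value": 14},
]

history: list[int] = []
for _ in records:
    z = sample_plan(records, history)  # planning step
    print(verbalize(records[z]))       # generation step, conditioned on z
    history.append(z)
```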
- SCROLLS: Standardized CompaRison Over Long Language Sequences [62.574959194373264]
We introduce SCROLLS, a suite of tasks that require reasoning over long texts.
SCROLLS contains summarization, question answering, and natural language inference tasks.
We make all datasets available in a unified text-to-text format and host a live leaderboard to facilitate research on model architecture and pretraining methods.
arXiv Detail & Related papers (2022-01-10T18:47:15Z)
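As a usage illustration, and assuming the suite is mirrored on the Hugging Face hub under an ID like `tau/scrolls` (an assumption; check the paper and leaderboard for the canonical source), one task can be read in the unified text-to-text format roughly like this:

```python
# Hedged example: assumes the `datasets` library and a hub mirror named
# "tau/scrolls"; field names follow the unified text-to-text format.
from datasets import load_dataset

ds = load_dataset("tau/scrolls", "gov_report", split="validation")
example = ds[0]
print(example["input"][:300])   # long source document
print(example["output"][:300])  # target text (here, a summary)
```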
- Long Text Generation by Modeling Sentence-Level and Discourse-Level Coherence [59.51720326054546]
We propose a long text generation model that represents the prefix sentences at both the sentence and discourse levels during decoding.
Our model can generate more coherent texts than state-of-the-art baselines.
arXiv Detail & Related papers (2021-05-19T07:29:08Z)
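A toy picture of a discourse-level coherence signal is a sentence-ordering scorer that should prefer the original order over a shuffled one; the hand-rolled `embed` and the antisymmetric scorer below are stand-ins for the paper's learned representations, not its actual objective.

```python
# Toy discourse-level signal: score adjacent-sentence order so training
# can prefer the original ordering. `embed` and the scorer are stand-ins.

def embed(sentence: str) -> list[float]:
    # Stand-in for a pooled sentence representation from decoder states.
    return [len(sentence) / 100.0, sentence.count(" ") / 10.0]

def order_score(prev: str, nxt: str) -> float:
    a, b = embed(prev), embed(nxt)
    # Antisymmetric stand-in scorer so that order matters; a real model
    # would learn this, e.g. with a bilinear form over representations.
    return a[0] * b[1] - a[1] * b[0]

doc = ["Mia found a key.", "She tried it on the attic door.", "It opened."]
coherent = sum(order_score(a, b) for a, b in zip(doc, doc[1:]))
rev = doc[::-1]
shuffled = sum(order_score(a, b) for a, b in zip(rev, rev[1:]))
# An auxiliary margin loss would push `coherent` above `shuffled`.
print(coherent, shuffled)
```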
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.