Automated Story Generation as Question-Answering
- URL: http://arxiv.org/abs/2112.03808v1
- Date: Tue, 7 Dec 2021 16:32:30 GMT
- Title: Automated Story Generation as Question-Answering
- Authors: Louis Castricato, Spencer Frazier, Jonathan Balloch, Nitya Tarakad, Mark Riedl
- Abstract summary: We propose a novel approach to automated story generation that treats the problem as one of generative question-answering.
Our proposed story generation system starts with sentences encapsulating the final event of the story.
- Score: 5.669790037378093
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Neural language model-based approaches to automated story generation suffer
from two important limitations. First, language model-based story generators
generally do not work toward a given goal or ending. Second, they often lose
coherence as the story gets longer. We propose a novel approach to automated
story generation that treats the problem as one of generative
question-answering. Our proposed story generation system starts with sentences
encapsulating the final event of the story. The system then iteratively (1)
analyzes the text describing the most recent event, (2) generates a question
about "why" a character is doing the thing they are doing in the event, and
then (3) attempts to generate another, preceding event that answers this
question.
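The iterative loop described in the abstract can be sketched in code. This is an illustrative sketch, not the authors' implementation: the `generate` function is a hypothetical stand-in for the paper's generative question-answering model, and the prompt wording is invented for demonstration.

```python
def generate(prompt: str) -> str:
    """Hypothetical stand-in for a generative QA model (e.g. a fine-tuned LM).

    A real system would query a neural language model here; this stub just
    echoes a canned "preceding event" so the loop structure is runnable.
    """
    return f"[an earlier event answering: {prompt}]"


def backward_story(final_event: str, num_events: int = 3) -> list[str]:
    """Build a story backward from its final event, following the three steps
    in the abstract: (1) read the most recent event, (2) ask *why* the
    character is doing what they are doing, (3) generate a preceding event
    that answers the question."""
    events = [final_event]
    for _ in range(num_events - 1):
        current = events[0]                                   # step (1)
        question = f'Why does the character do this: "{current}"?'  # step (2)
        preceding = generate(question)                        # step (3)
        events.insert(0, preceding)  # prepend: the story grows back-to-front
    return events


story = backward_story("The knight lays down his sword.", num_events=3)
```

Because the story is generated back-to-front, each event is produced with its consequence already fixed, which is how the approach works toward a given ending.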
Related papers
- Generating Visual Stories with Grounded and Coreferent Characters [63.07511918366848]
We present the first model capable of predicting visual stories with consistently grounded and coreferent character mentions.
Our model is finetuned on a new dataset which we build on top of the widely used VIST benchmark.
We also propose new evaluation metrics to measure the richness of characters and coreference in stories.
arXiv Detail & Related papers (2024-09-20T14:56:33Z)
- MoPS: Modular Story Premise Synthesis for Open-Ended Automatic Story Generation [50.01780173691132]
We introduce Modular Story Premise Synthesis (MoPS).
MoPS breaks down story premises into modules like background and persona for automated design and generation.
Thorough evaluations demonstrate that our synthesized premises excel in diversity, fascination, completeness, and originality.
arXiv Detail & Related papers (2024-06-09T08:31:14Z)
- Creating Suspenseful Stories: Iterative Planning with Large Language Models [2.6923151107804055]
We propose a novel iterative-prompting-based planning method that is grounded in two theoretical foundations of story suspense.
To the best of our knowledge, this paper is the first attempt at suspenseful story generation with large language models.
arXiv Detail & Related papers (2024-02-27T01:25:52Z)
- Neural Story Planning [8.600049807193413]
We present an approach to story plot generation that unifies causal planning with neural language models.
Our system infers the preconditions for events in the story and then events that will cause those conditions to become true.
Results indicate that our proposed method produces more coherent plotlines than several strong baselines.
arXiv Detail & Related papers (2022-12-16T21:29:41Z)
- Event Transition Planning for Open-ended Text Generation [55.729259805477376]
Open-ended text generation tasks require models to generate a coherent continuation given limited preceding context.
We propose a novel two-stage method which explicitly arranges the ensuing events in open-ended text generation.
Our approach can be understood as a specially-trained coarse-to-fine algorithm.
arXiv Detail & Related papers (2022-04-20T13:37:51Z)
- Goal-Directed Story Generation: Augmenting Generative Language Models with Reinforcement Learning [7.514717103747824]
We present two automated techniques grounded in deep reinforcement learning and reward shaping to control the plot of computer-generated stories.
The first utilizes proximal policy optimization to fine-tune an existing transformer-based language model to generate text continuations but also be goal-seeking.
The second extracts a knowledge graph from the unfolding story, which is used by a policy network with graph attention to select a candidate continuation generated by a language model.
arXiv Detail & Related papers (2021-12-16T03:34:14Z)
- Inferring the Reader: Guiding Automated Story Generation with Commonsense Reasoning [12.264880519328353]
We introduce Commonsense-inference Augmented neural StoryTelling (CAST), a framework for introducing commonsense reasoning into the generation process.
We find that our CAST method produces significantly more coherent, on-topic, enjoyable and fluent stories than existing models in both the single-character and two-character settings.
arXiv Detail & Related papers (2021-05-04T06:40:33Z)
- Sketch and Customize: A Counterfactual Story Generator [71.34131541754674]
We propose a sketch-and-customize generation model guided by the causality implicated in the conditions and endings.
Experimental results show that the proposed model generates much better endings, as compared with the traditional sequence-to-sequence model.
arXiv Detail & Related papers (2021-04-02T08:14:22Z)
- Cue Me In: Content-Inducing Approaches to Interactive Story Generation [74.09575609958743]
We focus on the task of interactive story generation, where the user provides the model mid-level sentence abstractions.
We present two content-inducing approaches to effectively incorporate this additional information.
Experimental results from both automatic and human evaluations show that these methods produce more topically coherent and personalized stories.
arXiv Detail & Related papers (2020-10-20T00:36:15Z)
- A Knowledge-Enhanced Pretraining Model for Commonsense Story Generation [98.25464306634758]
We propose to utilize commonsense knowledge from external knowledge bases to generate reasonable stories.
We employ multi-task learning, incorporating a discriminative objective that distinguishes true from fake stories.
Our model can generate more reasonable stories than state-of-the-art baselines, particularly in terms of logic and global coherence.
arXiv Detail & Related papers (2020-01-15T05:42:27Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.