Neural Story Planning
- URL: http://arxiv.org/abs/2212.08718v1
- Date: Fri, 16 Dec 2022 21:29:41 GMT
- Title: Neural Story Planning
- Authors: Anbang Ye, Christopher Cui, Taiwei Shi, Mark O. Riedl
- Abstract summary: We present an approach to story plot generation that unifies causal planning with neural language models.
Our system infers the preconditions for events in the story and then events that will cause those conditions to become true.
Results indicate that our proposed method produces more coherent plotlines than several strong baselines.
- Score: 8.600049807193413
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Automated plot generation is the challenge of generating a sequence of events
that will be perceived by readers as the plot of a coherent story. Traditional
symbolic planners plan a story from a goal state and guarantee logical causal
plot coherence but rely on a library of hand-crafted actions with their
preconditions and effects. This closed world setting limits the length and
diversity of what symbolic planners can generate. On the other hand,
pre-trained neural language models can generate stories with great diversity
but are generally incapable of ending a story in a specified manner and can
have trouble maintaining coherence. In this paper, we present an approach to
story plot generation that unifies causal planning with neural language models.
We propose to use commonsense knowledge extracted from large language models to
recursively expand a story plot in a backward chaining fashion. Specifically,
our system infers the preconditions for events in the story and then events
that will cause those conditions to become true. We performed automatic
evaluation to measure narrative coherence as indicated by the ability to answer
questions about whether different events in the story are causally related to
other events. Results indicate that our proposed method produces more coherent
plotlines than several strong baselines.
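The backward-chaining loop the abstract describes can be sketched as follows. Assumption: the paper's system asks a large language model for (a) the preconditions of an event and (b) an event that would make an unmet precondition true; the two toy dictionaries below stand in for those model calls, and all event strings are invented for illustration.

```python
# Toy "commonsense" tables standing in for LLM precondition/enabler inference.
PRECONDITIONS = {  # event -> conditions that must hold before it
    "the knight slays the dragon": ["the knight has a sword",
                                    "the knight is at the lair"],
    "the knight buys a sword": [],
    "the knight travels to the lair": [],
}

ENABLERS = {  # condition -> an event that causes it to become true
    "the knight has a sword": "the knight buys a sword",
    "the knight is at the lair": "the knight travels to the lair",
}

def plan(event, satisfied=None):
    """Recursively expand the plot backward from `event`.

    Returns the events in chronological order, ending with `event`.
    """
    if satisfied is None:
        satisfied = set()
    plot = []
    for condition in PRECONDITIONS.get(event, []):
        if condition in satisfied:
            continue  # already established earlier in the plot
        satisfied.add(condition)
        enabler = ENABLERS.get(condition)
        if enabler is not None:
            plot.extend(plan(enabler, satisfied))
    plot.append(event)
    return plot

plotline = plan("the knight slays the dragon")
# plotline == ["the knight buys a sword", "the knight travels to the lair",
#              "the knight slays the dragon"]
```

Because expansion recurses from the goal event, the returned list reads forward in causal order: each event's preconditions are established by the events before it.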
Related papers
- Generating Visual Stories with Grounded and Coreferent Characters [63.07511918366848]
We present the first model capable of predicting visual stories with consistently grounded and coreferent character mentions.
Our model is finetuned on a new dataset which we build on top of the widely used VIST benchmark.
We also propose new evaluation metrics to measure the richness of characters and coreference in stories.
arXiv Detail & Related papers (2024-09-20T14:56:33Z)
- StoryVerse: Towards Co-authoring Dynamic Plot with LLM-based Character Simulation via Narrative Planning [8.851718319632973]
Large Language Models (LLMs) drive the behavior of virtual characters, allowing plots to emerge from interactions between characters and their environments.
We propose a novel plot creation workflow that mediates between a writer's authorial intent and the emergent behaviors from LLM-driven character simulation.
The process creates "living stories" that dynamically adapt to various game world states, resulting in narratives co-created by the author, character simulation, and player.
arXiv Detail & Related papers (2024-05-17T23:04:51Z)
- Visual Storytelling with Question-Answer Plans [70.89011289754863]
We present a novel framework which integrates visual representations with pretrained language models and planning.
Our model translates the image sequence into a visual prefix, a sequence of continuous embeddings which language models can interpret.
It also leverages a sequence of question-answer pairs as a blueprint plan for selecting salient visual concepts and determining how they should be assembled into a narrative.
arXiv Detail & Related papers (2023-10-08T21:45:34Z)
- Persona-Guided Planning for Controlling the Protagonist's Persona in Story Generation [71.24817035071176]
We propose a planning-based generation model named CONPER to explicitly model the relationship between personas and events.
Both automatic and manual evaluation results demonstrate that CONPER outperforms state-of-the-art baselines for generating more coherent and persona-controllable stories.
arXiv Detail & Related papers (2022-04-22T13:45:02Z)
- Computational Lens on Cognition: Study Of Autobiographical Versus Imagined Stories With Large-Scale Language Models [95.88620740809004]
We study differences in the narrative flow of events in autobiographical versus imagined stories using GPT-3.
We found that imagined stories have higher sequentiality than autobiographical stories.
In comparison to imagined stories, autobiographical stories contain more concrete words and words related to the first person.
arXiv Detail & Related papers (2022-01-07T20:10:47Z)
- Automated Story Generation as Question-Answering [5.669790037378093]
We propose a novel approach to automated story generation that treats the problem as one of generative question-answering.
Our proposed story generation system starts with sentences encapsulating the final event of the story.
arXiv Detail & Related papers (2021-12-07T16:32:30Z)
- GraphPlan: Story Generation by Planning with Event Graph [31.29515089313627]
We focus on planning a sequence of events assisted by event graphs, and use the events to guide the generator.
Instead of using a sequence-to-sequence model to output a storyline, we propose to generate an event sequence by walking on an event graph.
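GraphPlan's core idea of producing a storyline by walking an event graph rather than decoding it token by token can be sketched like this. Assumptions: the actual paper scores candidate successor events with a learned model, whereas this toy version simply follows the highest-weighted outgoing edge, and all event names and weights are invented for illustration.

```python
# Toy event graph: event -> {successor event: edge weight}.
EVENT_GRAPH = {
    "hero receives a quest": {"hero gathers allies": 0.6,
                              "hero sets out alone": 0.4},
    "hero gathers allies": {"party faces a betrayal": 0.7},
    "hero sets out alone": {"hero is ambushed": 0.9},
    "party faces a betrayal": {},
    "hero is ambushed": {},
}

def walk(graph, start, max_len=5):
    """Greedy walk: at each step follow the highest-weighted outgoing edge."""
    sequence = [start]
    current = start
    while len(sequence) < max_len:
        successors = graph.get(current, {})
        if not successors:
            break  # no outgoing edges: the storyline ends here
        current = max(successors, key=successors.get)
        sequence.append(current)
    return sequence

storyline = walk(EVENT_GRAPH, "hero receives a quest")
# storyline == ["hero receives a quest", "hero gathers allies",
#               "party faces a betrayal"]
```

The resulting event sequence would then be handed to a separate generator that verbalizes each event into story text.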
arXiv Detail & Related papers (2021-02-05T03:18:55Z)
- Content Planning for Neural Story Generation with Aristotelian Rescoring [39.07607377794395]
Long-form narrative text generated by neural language models manages a fluent impersonation of human writing, but only at the local sentence level, and lacks structure or global cohesion.
We posit that many of the problems of story generation can be addressed via high-quality content planning, and present a system that focuses on how to learn good plot structures to guide story generation.
arXiv Detail & Related papers (2020-09-21T13:41:32Z)
- Automated Storytelling via Causal, Commonsense Plot Ordering [20.032706455801353]
Causal relations between plot events are believed to increase the perception of story and plot coherence.
We introduce the concept of soft causal relations as causal relations inferred from commonsense reasoning.
arXiv Detail & Related papers (2020-09-02T05:37:03Z)
- PlotMachines: Outline-Conditioned Generation with Dynamic Plot State Tracking [128.76063992147016]
We present PlotMachines, a neural narrative model that learns to transform an outline into a coherent story by tracking the dynamic plot states.
In addition, we enrich PlotMachines with high-level discourse structure so that the model can learn different writing styles corresponding to different parts of the narrative.
arXiv Detail & Related papers (2020-04-30T17:16:31Z)
- A Knowledge-Enhanced Pretraining Model for Commonsense Story Generation [98.25464306634758]
We propose to utilize commonsense knowledge from external knowledge bases to generate reasonable stories.
We employ multi-task learning which combines a discriminative objective to distinguish true and fake stories.
Our model can generate more reasonable stories than state-of-the-art baselines, particularly in terms of logic and global coherence.
arXiv Detail & Related papers (2020-01-15T05:42:27Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.