Generating Long-form Story Using Dynamic Hierarchical Outlining with Memory-Enhancement
- URL: http://arxiv.org/abs/2412.13575v1
- Date: Wed, 18 Dec 2024 07:50:54 GMT
- Title: Generating Long-form Story Using Dynamic Hierarchical Outlining with Memory-Enhancement
- Authors: Qianyue Wang, Jinwu Hu, Zhengping Li, Yufeng Wang, Daiyuan Li, Yu Hu, Mingkui Tan
- Abstract summary: We propose a long-form story generation method named Dynamic Hierarchical Outlining with Memory-Enhancement (DOME) to generate long-form stories with coherent content and plots.
A Memory-Enhancement Module (MEM) based on temporal knowledge graphs is introduced to store and access the generated content.
Experiments demonstrate that DOME significantly improves the fluency, coherence, and overall quality of generated long stories compared to state-of-the-art methods.
- Score: 29.435378306293583
- Abstract: The long-form story generation task aims to produce coherent and sufficiently lengthy text, essential for applications such as novel writing and interactive storytelling. However, existing methods, including LLMs, rely on rigid outlines or lack macro-level planning, making it difficult to achieve both contextual consistency and coherent plot development in long-form story generation. To address these issues, we propose a long-form story generation method named Dynamic Hierarchical Outlining with Memory-Enhancement (DOME) to generate long-form stories with coherent content and plots. Specifically, the Dynamic Hierarchical Outline (DHO) mechanism incorporates novel-writing theory into outline planning and fuses the planning and writing stages, improving plot coherence by ensuring plot completeness and adapting to uncertainty during story generation. A Memory-Enhancement Module (MEM) based on temporal knowledge graphs is introduced to store and access the generated content, reducing contextual conflicts and improving story coherence. Finally, we propose a Temporal Conflict Analyzer that leverages temporal knowledge graphs to automatically evaluate the contextual consistency of long-form stories. Experiments demonstrate that DOME significantly improves the fluency, coherence, and overall quality of generated long stories compared to state-of-the-art methods.
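The paper does not publish code here; as an illustrative sketch only, the MEM idea of storing generated content as time-stamped knowledge-graph facts, together with a crude conflict check in the spirit of the Temporal Conflict Analyzer, might look like the following. The class, relation names, and conflict rule are all hypothetical, not taken from the paper.

```python
from collections import defaultdict

class TemporalKGMemory:
    """Hypothetical MEM-style memory: stores story facts as
    (subject, relation, object, chapter) quadruples and flags a
    conflict when a functional relation (one that should have a
    single value at a time) records contradictory values for the
    same chapter."""

    def __init__(self, functional_relations=("located_in", "holds")):
        self.functional = set(functional_relations)
        # (subject, relation) -> list of (object, chapter), in story order
        self.facts = defaultdict(list)

    def add(self, subject, relation, obj, chapter):
        """Record a fact extracted from newly generated text."""
        self.facts[(subject, relation)].append((obj, chapter))

    def current(self, subject, relation):
        """Latest known value for a (subject, relation) pair."""
        history = self.facts.get((subject, relation))
        return history[-1][0] if history else None

    def conflicts(self):
        """Functional relations with contradictory values in one chapter."""
        issues = []
        for (subj, rel), history in self.facts.items():
            if rel not in self.functional:
                continue
            by_chapter = defaultdict(set)
            for obj, chapter in history:
                by_chapter[chapter].add(obj)
            for chapter, objs in by_chapter.items():
                if len(objs) > 1:
                    issues.append((subj, rel, chapter, sorted(objs)))
        return issues
```

For example, recording that a character is in two places within the same chapter would surface as a conflict, while a location change across chapters would not.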
Related papers
- ContextualStory: Consistent Visual Storytelling with Spatially-Enhanced and Storyline Context [50.572907418430155]
Existing autoregressive methods struggle with high memory usage, slow generation speeds, and limited context integration.
We propose ContextualStory, a novel framework designed to generate coherent story frames and extend frames for story continuation.
In experiments on PororoSV and FlintstonesSV benchmarks, ContextualStory significantly outperforms existing methods in both story visualization and story continuation.
arXiv Detail & Related papers (2024-07-13T05:02:42Z) - SEED-Story: Multimodal Long Story Generation with Large Language Model [66.37077224696242]
SEED-Story is a novel method that leverages a Multimodal Large Language Model (MLLM) to generate extended multimodal stories.
We propose a multimodal attention sink mechanism to enable the generation of stories with up to 25 sequences (only 10 seen during training) in a highly efficient autoregressive manner.
We present a large-scale and high-resolution dataset named StoryStream for training our model and quantitatively evaluating the task of multimodal story generation in various aspects.
arXiv Detail & Related papers (2024-07-11T17:21:03Z) - Guiding and Diversifying LLM-Based Story Generation via Answer Set Programming [1.7889842797216124]
Large language models (LLMs) are capable of generating stories in response to open-ended user requests.
We propose using a higher-level and more abstract symbolic specification of high-level story structure to guide and diversify story generation.
arXiv Detail & Related papers (2024-06-01T21:14:25Z) - Improving Pacing in Long-Form Story Planning [55.39443681232538]
We propose a CONCrete Outline ConTrol system to improve pacing when automatically generating story outlines.
We first train a concreteness evaluator to judge which of two events is more concrete.
In this work, we explore a vaguest-first expansion procedure that aims for uniform pacing.
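As a hedged illustration of the vaguest-first idea only (not the paper's actual system), the loop below always expands the least concrete outline leaf next, using stand-in callables for the trained concreteness evaluator and the event expander:

```python
import heapq
import itertools

def vaguest_first_expand(root, concreteness, expand, max_nodes=10):
    """Sketch of vaguest-first outline expansion: repeatedly expand the
    least concrete leaf so detail accrues uniformly across the outline.

    concreteness(event) -> comparable score (stand-in for the evaluator;
                           lower means vaguer, so it is expanded first)
    expand(event)       -> list of child events (stand-in for the expander)
    """
    counter = itertools.count()  # tie-breaker so the heap never compares events
    leaves = [(concreteness(root), next(counter), root)]
    outline = [root]
    # May slightly overshoot max_nodes, since children are added in batches.
    while leaves and len(outline) < max_nodes:
        _, _, event = heapq.heappop(leaves)  # vaguest remaining leaf
        for child in expand(event):
            outline.append(child)
            heapq.heappush(leaves, (concreteness(child), next(counter), child))
    return outline
```

A real system would score concreteness with the trained evaluator and expand events with an LLM; the priority-queue skeleton stays the same.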
arXiv Detail & Related papers (2023-11-08T04:58:29Z) - GROVE: A Retrieval-augmented Complex Story Generation Framework with A Forest of Evidence [26.90143556633735]
We propose a retrieval-augmented story generation framework with a forest of evidence (GROVE) to enhance stories' complexity.
We design an "asking-why" prompting scheme that extracts a forest of evidence, compensating for the ambiguities that may occur in the generated story.
arXiv Detail & Related papers (2023-10-09T03:55:55Z) - Re3: Generating Longer Stories With Recursive Reprompting and Revision [83.99558005056817]
We consider the problem of automatically generating longer stories of over two thousand words.
Compared to prior work on shorter stories, long-range plot coherence and relevance are more central challenges here.
We propose the Recursive Reprompting and Revision framework (Re3) to address these challenges.
arXiv Detail & Related papers (2022-10-13T06:29:57Z) - Consistency and Coherency Enhanced Story Generation [35.08911595854691]
We propose a two-stage generation framework to enhance consistency and coherency of generated stories.
The first stage is to organize the story outline which depicts the story plots and events, and the second stage is to expand the outline into a complete story.
In addition, coreference supervision signals are incorporated to reduce coreference errors and improve the coreference consistency.
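A minimal sketch of the two-stage scheme described above, with `plan_outline` and `expand_event` as hypothetical stand-ins for the underlying models (the paper's own models and signatures are not shown here):

```python
def generate_story(premise, plan_outline, expand_event):
    """Sketch of two-stage generation: stage 1 plans an outline of plot
    events, stage 2 expands each event into prose, conditioning each
    expansion on what was just written to keep the story consistent."""
    outline = plan_outline(premise)          # stage 1: plots and events
    paragraphs = []
    context = premise
    for event in outline:
        paragraph = expand_event(event, context)  # stage 2: outline -> story
        paragraphs.append(paragraph)
        context = paragraph                  # carry context forward
    return " ".join(paragraphs)
```

The coreference supervision the summary mentions would be an extra training signal on the stage-2 model, not a change to this pipeline shape.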
arXiv Detail & Related papers (2020-10-17T16:40:37Z) - Content Planning for Neural Story Generation with Aristotelian Rescoring [39.07607377794395]
Generated long-form narrative text manages a fluent impersonation of human writing, but only at the local sentence level; it lacks structure and global cohesion.
We posit that many of the problems of story generation can be addressed via high-quality content planning, and present a system that focuses on how to learn good plot structures to guide story generation.
arXiv Detail & Related papers (2020-09-21T13:41:32Z) - PlotMachines: Outline-Conditioned Generation with Dynamic Plot State Tracking [128.76063992147016]
We present PlotMachines, a neural narrative model that learns to transform an outline into a coherent story by tracking the dynamic plot states.
In addition, we enrich PlotMachines with high-level discourse structure so that the model can learn different writing styles corresponding to different parts of the narrative.
arXiv Detail & Related papers (2020-04-30T17:16:31Z) - A Knowledge-Enhanced Pretraining Model for Commonsense Story Generation [98.25464306634758]
We propose to utilize commonsense knowledge from external knowledge bases to generate reasonable stories.
We employ multi-task learning, combining the generation objective with a discriminative objective to distinguish true from fake stories.
Our model can generate more reasonable stories than state-of-the-art baselines, particularly in terms of logic and global coherence.
arXiv Detail & Related papers (2020-01-15T05:42:27Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.