Narrative Interpolation for Generating and Understanding Stories
- URL: http://arxiv.org/abs/2008.07466v1
- Date: Mon, 17 Aug 2020 16:45:50 GMT
- Title: Narrative Interpolation for Generating and Understanding Stories
- Authors: Su Wang, Greg Durrett, Katrin Erk
- Abstract summary: We propose a method for controlled narrative/story generation where we are able to guide the model to produce coherent narratives with user-specified target endings.
The core of our method is an interpolation model based on GPT-2 which conditions on a previous sentence and a next sentence in a narrative and fills in the gap.
We show that ending-guided generation results in narratives which are coherent, faithful to the given ending guide, and require less manual effort on the part of the human guide writer than past approaches.
- Score: 52.463747140762145
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We propose a method for controlled narrative/story generation where we are
able to guide the model to produce coherent narratives with user-specified
target endings by interpolation: for example, we are told that Jim went hiking
and at the end Jim needed to be rescued, and we want the model to incrementally
generate steps along the way. The core of our method is an interpolation model
based on GPT-2 which conditions on a previous sentence and a next sentence in a
narrative and fills in the gap. Additionally, a reranker helps control for
coherence of the generated text. With human evaluation, we show that
ending-guided generation results in narratives which are coherent, faithful to
the given ending guide, and require less manual effort on the part of the human
guide writer than past approaches.
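The abstract describes an iterative scheme: condition on the most recent sentence and the fixed target ending, propose gap sentences, and let a reranker keep the most coherent candidate. The following is a minimal sketch of that loop, not the authors' implementation; the `generate` and `coherence` callables stand in for the GPT-2 interpolation model and the coherence reranker, and the toy stubs below are purely illustrative.

```python
from typing import Callable, List

def interpolate_story(
    first: str,
    ending: str,
    generate: Callable[[str, str], List[str]],     # stand-in for the GPT-2 interpolation model
    coherence: Callable[[List[str], str], float],  # stand-in for the coherence reranker
    n_steps: int = 3,
) -> List[str]:
    """Ending-guided generation sketch: repeatedly fill the gap between the
    most recent sentence and the fixed ending, keeping the top-ranked candidate."""
    story = [first]
    for _ in range(n_steps):
        # Propose candidate sentences conditioned on (previous sentence, ending).
        candidates = generate(story[-1], ending)
        # Rerank: keep the candidate the coherence scorer prefers in context.
        best = max(candidates, key=lambda c: coherence(story, c))
        story.append(best)
    return story + [ending]

# Toy stubs (hypothetical, for illustration only): a fixed candidate pool
# and a scorer that prefers sentences of roughly 20 characters.
def toy_generate(prev: str, nxt: str) -> List[str]:
    return [f"Then something happened after '{prev[:10]}'.", "He kept walking."]

def toy_coherence(context: List[str], cand: str) -> float:
    return -abs(len(cand) - 20)

story = interpolate_story(
    "Jim went hiking.", "Jim needed to be rescued.",
    toy_generate, toy_coherence, n_steps=2,
)
```

With real models, `generate` would sample several continuations from GPT-2 given the surrounding sentences, and `coherence` would score each candidate against the story so far; the loop structure stays the same.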
Related papers
- Returning to the Start: Generating Narratives with Related Endpoints [27.61802620856587]
We propose RENarGen, a story-generation paradigm that generates narratives by ensuring the first and last sentences are related and then infilling the middle sentences.
Our contributions include an initial exploration of how various methods of bookending from Narratology affect language modeling for stories.
arXiv Detail & Related papers (2024-03-31T23:48:50Z)
- Robust Preference Learning for Storytelling via Contrastive Reinforcement Learning [53.92465205531759]
Controlled automated story generation seeks to generate natural language stories satisfying constraints from natural language critiques or preferences.
We train a contrastive bi-encoder model to align stories with human critiques, building a general purpose preference model.
We further fine-tune the contrastive reward model using a prompt-learning technique to increase story generation robustness.
arXiv Detail & Related papers (2022-10-14T13:21:33Z)
- SNaC: Coherence Error Detection for Narrative Summarization [73.48220043216087]
We introduce SNaC, a narrative coherence evaluation framework rooted in fine-grained annotations for long summaries.
We develop a taxonomy of coherence errors in generated narrative summaries and collect span-level annotations for 6.6k sentences across 150 book and movie screenplay summaries.
Our work provides the first characterization of coherence errors generated by state-of-the-art summarization models and a protocol for eliciting coherence judgments from crowd annotators.
arXiv Detail & Related papers (2022-05-19T16:01:47Z)
- Persona-Guided Planning for Controlling the Protagonist's Persona in Story Generation [71.24817035071176]
We propose a planning-based generation model named CONPER to explicitly model the relationship between personas and events.
Both automatic and manual evaluation results demonstrate that CONPER outperforms state-of-the-art baselines for generating more coherent and persona-controllable stories.
arXiv Detail & Related papers (2022-04-22T13:45:02Z)
- Inferring the Reader: Guiding Automated Story Generation with Commonsense Reasoning [12.264880519328353]
We introduce Commonsense-inference Augmented neural StoryTelling (CAST), a framework for introducing commonsense reasoning into the generation process.
We find that our CAST method produces significantly more coherent, on-topic, enjoyable and fluent stories than existing models in both the single-character and two-character settings.
arXiv Detail & Related papers (2021-05-04T06:40:33Z)
- Cue Me In: Content-Inducing Approaches to Interactive Story Generation [74.09575609958743]
We focus on the task of interactive story generation, where the user provides the model mid-level sentence abstractions.
We present two content-inducing approaches to effectively incorporate this additional information.
Experimental results from both automatic and human evaluations show that these methods produce more topically coherent and personalized stories.
arXiv Detail & Related papers (2020-10-20T00:36:15Z)
- Consistency and Coherency Enhanced Story Generation [35.08911595854691]
We propose a two-stage generation framework to enhance consistency and coherency of generated stories.
The first stage is to organize the story outline which depicts the story plots and events, and the second stage is to expand the outline into a complete story.
In addition, coreference supervision signals are incorporated to reduce coreference errors and improve the coreference consistency.
arXiv Detail & Related papers (2020-10-17T16:40:37Z)
- Narrative Text Generation with a Latent Discrete Plan [39.71663365273463]
We propose a deep latent variable model that first samples a sequence of anchor words, one per sentence in the story, as part of its generative process.
During training, our model treats the sequence of anchor words as a latent variable and attempts to induce anchoring sequences that help guide generation in an unsupervised fashion.
We conduct human evaluations which demonstrate that the stories produced by our model are rated better in comparison with baselines which do not consider story plans.
arXiv Detail & Related papers (2020-10-07T08:45:37Z)
- PlotMachines: Outline-Conditioned Generation with Dynamic Plot State Tracking [128.76063992147016]
We present PlotMachines, a neural narrative model that learns to transform an outline into a coherent story by tracking the dynamic plot states.
In addition, we enrich PlotMachines with high-level discourse structure so that the model can learn different writing styles corresponding to different parts of the narrative.
arXiv Detail & Related papers (2020-04-30T17:16:31Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.