On Narrative Information and the Distillation of Stories
- URL: http://arxiv.org/abs/2211.12423v1
- Date: Tue, 22 Nov 2022 17:30:36 GMT
- Title: On Narrative Information and the Distillation of Stories
- Authors: Dylan R. Ashley, Vincent Herrmann, Zachary Friggstad, Jürgen Schmidhuber
- Abstract summary: We show how modern artificial neural networks can be leveraged to distill stories.
We then demonstrate how evolutionary algorithms can leverage this to extract a set of narrative templates.
In the process of doing so, we give strong statistical evidence that these narrative information templates are present in existing albums.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The act of telling stories is a fundamental part of what it means to be
human. This work introduces the concept of narrative information, which we
define to be the overlap in information space between a story and the items
that compose the story. Using contrastive learning methods, we show how modern
artificial neural networks can be leveraged to distill stories and extract a
representation of the narrative information. We then demonstrate how
evolutionary algorithms can leverage this to extract a set of narrative
templates and how these templates -- in tandem with a novel curve-fitting
algorithm we introduce -- can reorder music albums to automatically induce
stories in them. In the process of doing so, we give strong statistical
evidence that these narrative information templates are present in existing
albums. While we experiment only with music albums here, the premises of our
work extend to any form of (largely) independent media.
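The abstract's pipeline can be illustrated with a toy sketch of its final step: given a scalar "narrative essence" per track and a target narrative template curve, find the track ordering that best follows the template. This is a hypothetical simplification for illustration only (the function names, the scalar-essence reduction, and the brute-force search are assumptions, not the authors' actual contrastive-learning or curve-fitting algorithm).

```python
# Hypothetical sketch: reorder an album so a per-track "narrative essence"
# value follows a target narrative template curve. Brute force over
# permutations is fine for album-sized track counts.
from itertools import permutations

def fit_to_template(essences, template):
    """Return the ordering of track indices whose essence sequence has
    the smallest squared distance to the template curve."""
    best_order, best_cost = None, float("inf")
    for order in permutations(range(len(essences))):
        cost = sum((essences[i] - t) ** 2
                   for i, t in zip(order, template))
        if cost < best_cost:
            best_order, best_cost = order, cost
    return list(best_order)

# A rising-then-falling "arc" template over five tracks:
template = [0.2, 0.6, 1.0, 0.7, 0.3]
essences = [1.0, 0.3, 0.6, 0.2, 0.7]   # one value per original track
order = fit_to_template(essences, template)  # e.g. [3, 2, 0, 4, 1]
```

Brute force is O(n!) and only workable because albums rarely exceed ~15 tracks; a real implementation would need a smarter search for longer sequences.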
Related papers
- Are Large Language Models Capable of Generating Human-Level Narratives? [114.34140090869175]
This paper investigates the capability of LLMs in storytelling, focusing on narrative development and plot progression.
We introduce a novel computational framework to analyze narratives through three discourse-level aspects.
We show that explicit integration of discourse features can enhance storytelling, as demonstrated by an over 40% improvement in neural storytelling.
arXiv Detail & Related papers (2024-07-18T08:02:49Z)
- SCStory: Self-supervised and Continual Online Story Discovery [53.72745249384159]
SCStory helps people digest rapidly published news article streams in real-time without human annotations.
SCStory employs self-supervised and continual learning with a novel idea of story-indicative adaptive modeling of news article streams.
arXiv Detail & Related papers (2023-11-27T04:50:01Z)
- Album Storytelling with Iterative Story-aware Captioning and Large Language Models [86.6548090965982]
We study how to transform an album into vivid and coherent stories, a task we refer to as "album storytelling".
With recent advances in Large Language Models (LLMs), it is now possible to generate lengthy, coherent text.
Our method effectively generates more accurate and engaging stories for albums, with enhanced coherence and vividness.
arXiv Detail & Related papers (2023-05-22T11:45:10Z)
- NarraSum: A Large-Scale Dataset for Abstractive Narrative Summarization [26.80378373420446]
NarraSum is a large-scale narrative summarization dataset.
It contains 122K narrative documents, which are collected from plot descriptions of movies and TV episodes with diverse genres, and their corresponding abstractive summaries.
Experiments show that there is a large performance gap between humans and the state-of-the-art summarization models on NarraSum.
arXiv Detail & Related papers (2022-12-02T22:51:51Z)
- Computational Lens on Cognition: Study Of Autobiographical Versus Imagined Stories With Large-Scale Language Models [95.88620740809004]
We study differences in the narrative flow of events in autobiographical versus imagined stories using GPT-3.
We found that imagined stories have higher sequentiality than autobiographical stories.
In comparison to imagined stories, autobiographical stories contain more concrete words and words related to the first person.
arXiv Detail & Related papers (2022-01-07T20:10:47Z)
- Guiding Neural Story Generation with Reader Models [5.935317028008691]
We introduce Story generation with Reader Models (StoRM), a framework in which a reader model is used to reason about how the story should progress.
Experiments show that our model produces significantly more coherent and on-topic stories, outperforming baselines in dimensions including plot plausibility and staying on topic.
arXiv Detail & Related papers (2021-12-16T03:44:01Z)
- CompRes: A Dataset for Narrative Structure in News [2.4578723416255754]
We introduce CompRes -- the first dataset for narrative structure in news media.
We use the annotated dataset to train several supervised models to identify the different narrative elements.
arXiv Detail & Related papers (2020-07-09T15:21:59Z)
- PlotMachines: Outline-Conditioned Generation with Dynamic Plot State Tracking [128.76063992147016]
We present PlotMachines, a neural narrative model that learns to transform an outline into a coherent story by tracking the dynamic plot states.
In addition, we enrich PlotMachines with high-level discourse structure so that the model can learn different writing styles corresponding to different parts of the narrative.
arXiv Detail & Related papers (2020-04-30T17:16:31Z)
- Hide-and-Tell: Learning to Bridge Photo Streams for Visual Storytelling [86.42719129731907]
We propose to explicitly learn to imagine a storyline that bridges the visual gap.
We train the network to produce a full, plausible story even with missing photo(s).
In experiments, we show that our scheme of hide-and-tell, and the network design are indeed effective at storytelling.
arXiv Detail & Related papers (2020-02-03T14:22:18Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences arising from its use.