Modeling Event Salience in Narratives via Barthes' Cardinal Functions
- URL: http://arxiv.org/abs/2011.01785v1
- Date: Tue, 3 Nov 2020 15:28:07 GMT
- Authors: Takaki Otake, Sho Yokoi, Naoya Inoue, Ryo Takahashi, Tatsuki Kuribayashi, Kentaro Inui
- Abstract summary: Estimating event salience is useful for tasks such as story generation and text analysis in narratology and folkloristics.
To compute event salience without any annotations, we propose several unsupervised methods that require only a pre-trained language model.
We show that the proposed methods outperform baseline methods and find that fine-tuning a language model on narrative texts is a key factor in improving them.
- Score: 38.44885682996472
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Events in a narrative differ in salience: some are more important to the
story than others. Estimating event salience is useful for tasks such as story
generation, and as a tool for text analysis in narratology and folkloristics.
To compute event salience without any annotations, we adopt Barthes' definition
of event salience and propose several unsupervised methods that require only a
pre-trained language model. Evaluating the proposed methods on folktales with
event salience annotation, we show that the proposed methods outperform
baseline methods and find that fine-tuning a language model on narrative texts is a
key factor in improving them.
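In Barthes' terms, cardinal functions are the events whose removal would disturb the logic of the narrative, which suggests a removal-based score computable with nothing but a language model: an event is salient to the extent that deleting it makes the rest of the story harder to predict. Below is a minimal sketch of that idea, not the authors' released code; the GPT-2 checkpoint, sentence-sized event units, and whitespace joining are illustrative assumptions.

```python
# Minimal sketch of removal-based event salience (an illustration under
# the assumptions stated above, not the paper's implementation). A
# sentence is scored by how much deleting it lowers a pre-trained
# language model's log-likelihood of the story that follows it.
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2").eval()

def continuation_logprob(context: str, continuation: str) -> float:
    """Log-probability of `continuation` given `context` under the LM."""
    context = context or tokenizer.bos_token  # guard against empty context
    ctx = tokenizer(context, return_tensors="pt").input_ids
    cont = tokenizer(" " + continuation, return_tensors="pt").input_ids
    ids = torch.cat([ctx, cont], dim=1)
    labels = ids.clone()
    labels[:, : ctx.size(1)] = -100  # score only the continuation's tokens
    with torch.no_grad():
        loss = model(input_ids=ids, labels=labels).loss  # mean NLL per scored token
    return -loss.item() * cont.size(1)

def removal_salience(sentences: list[str], i: int) -> float:
    """Salience of sentences[i]: drop in the log-likelihood of the
    subsequent story when sentences[i] is removed from the context.
    Assumes i is not the last sentence; larger scores = more salient."""
    pre = " ".join(sentences[:i])
    post = " ".join(sentences[i + 1:])
    with_event = continuation_logprob((pre + " " + sentences[i]).strip(), post)
    without_event = continuation_logprob(pre, post)
    return with_event - without_event

# Toy folktale-style usage (invented example):
story = ["A poor miller had a beautiful daughter.",
         "He boasted that she could spin straw into gold.",
         "The king locked her in a room full of straw."]
print([round(removal_salience(story, i), 2) for i in range(len(story) - 1)])
```

The abstract's key finding slots into this picture directly: fine-tuning the language model on narrative text sharpens exactly the conditional likelihoods such a score depends on.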
Related papers
- Pixel Sentence Representation Learning [67.4775296225521]
In this work, we conceptualize the learning of sentence-level textual semantics as a visual representation learning process.
We employ visually-grounded text perturbation methods such as typos and word order shuffling, which resonate with human cognitive patterns and allow perturbations to be perceived as continuous.
Our approach is further bolstered by large-scale unsupervised topical alignment training and natural language inference supervision.
arXiv Detail & Related papers (2024-02-13T02:46:45Z)
- Semi-supervised News Discourse Profiling with Contrastive Learning [27.28989421841165]
News discourse profiling seeks to scrutinize the event-related role of each sentence in a news article.
We present a novel approach, denoted as Intra-document Contrastive Learning with Distillation (ICLD), for addressing the news discourse profiling task.
arXiv Detail & Related papers (2023-09-20T23:51:34Z)
- Disco-Bench: A Discourse-Aware Evaluation Benchmark for Language Modelling [70.23876429382969]
We propose a benchmark that can evaluate intra-sentence discourse properties across a diverse set of NLP tasks.
Disco-Bench consists of 9 document-level testsets in the literature domain, which contain rich discourse phenomena.
For linguistic analysis, we also design a diagnostic test suite that can examine whether the target models learn discourse knowledge.
arXiv Detail & Related papers (2023-07-16T15:18:25Z)
- Global Constraints with Prompting for Zero-Shot Event Argument Classification [49.84347224233628]
We propose to use global constraints with prompting to tackle event argument classification without any annotation or task-specific training.
Our novel prompt templates can easily adapt to all events and argument types without manual effort, and a pre-trained language model scores the resulting passages to make the initial prediction.
arXiv Detail & Related papers (2023-02-09T06:39:29Z)
- A Generative Approach for Script Event Prediction via Contrastive Fine-tuning [35.87615178251874]
Script event prediction aims to predict the subsequent event given the context.
Recent works have attempted to improve event correlation reasoning by using pretrained language models and incorporating external knowledge.
We propose a novel generative approach for this task, in which a pretrained language model is fine-tuned with an event-centric pretraining objective.
arXiv Detail & Related papers (2022-12-07T07:32:47Z)
- Semantic Pivoting Model for Effective Event Detection [19.205550116466604]
Event Detection aims to identify and classify mentions of event instances from unstructured articles.
Existing techniques for event detection use only homogeneous one-hot vectors to represent the event type classes, ignoring the fact that the semantic meaning of the types is important to the task.
We propose a Semantic Pivoting Model for Effective Event Detection (SPEED), which explicitly incorporates prior information during training and captures semantically meaningful correlations between input and events.
arXiv Detail & Related papers (2022-11-01T19:20:34Z)
- Probing via Prompting [71.7904179689271]
This paper introduces a novel model-free approach to probing, by formulating probing as a prompting task.
We conduct experiments on five probing tasks and show that our approach is comparable to or better than diagnostic probes at extracting information.
We then examine the usefulness of a specific linguistic property for pre-training by removing the heads that are essential to that property and evaluating the resulting model's performance on language modeling.
arXiv Detail & Related papers (2022-07-04T22:14:40Z)
- Salience-Aware Event Chain Modeling for Narrative Understanding [22.27295378297949]
We introduce methods for extracting the principal chain from natural language text by filtering away non-salient events and supportive sentences.
We show that by pre-training large language models on our extracted chains, we obtain improvements in two tasks that benefit from a clear understanding of event chains.
arXiv Detail & Related papers (2021-09-22T01:34:03Z)
- Memory and Knowledge Augmented Language Models for Inferring Salience in Long-Form Stories [21.99104738567138]
This paper builds on a recent unsupervised method for salience detection derived from Barthes' Cardinal Functions and theories of surprise.
We improve the standard transformer language model by incorporating an external knowledgebase and adding a memory mechanism.
Our evaluation demonstrates that the resulting salience detection model outperforms a language model without the knowledgebase and memory augmentation.
arXiv Detail & Related papers (2021-09-08T16:15:50Z)
- Toward Better Storylines with Sentence-Level Language Models [54.91921545103256]
We propose a sentence-level language model which selects the next sentence in a story from a finite set of fluent alternatives.
We demonstrate the effectiveness of our approach with state-of-the-art accuracy on the unsupervised Story Cloze task (see the sketch after this list).
arXiv Detail & Related papers (2020-05-11T16:54:19Z)
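The storyline paper above frames next-sentence selection as scoring a finite candidate set. As a companion to the salience sketch, here is a hedged, self-contained illustration of sentence-level scoring: the paper trains a model to predict the next sentence's embedding, whereas this stand-in simply ranks candidates by embedding similarity to the context, and the encoder name and example texts are invented for illustration.

```python
# Hedged stand-in for sentence-level next-sentence selection: rank a
# finite set of candidate continuations by cosine similarity between
# the context embedding and each candidate embedding. The paper learns
# a prediction model over sentence embeddings; plain similarity is a
# deliberately crude substitute.
from sentence_transformers import SentenceTransformer, util

encoder = SentenceTransformer("all-MiniLM-L6-v2")  # illustrative choice

def pick_next_sentence(story_so_far: str, candidates: list[str]) -> str:
    """Return the candidate whose embedding best matches the context."""
    ctx = encoder.encode(story_so_far, convert_to_tensor=True)
    cands = encoder.encode(candidates, convert_to_tensor=True)
    scores = util.cos_sim(ctx, cands)[0]  # one score per candidate
    return candidates[int(scores.argmax())]

# Invented two-candidate example in the style of the Story Cloze task:
story = "The knight rode out at dawn. The bridge had washed away."
options = ["He followed the river in search of a crossing.",
           "The stock market closed higher on Tuesday."]
print(pick_next_sentence(story, options))
```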