TIMELINE: Exhaustive Annotation of Temporal Relations Supporting the
Automatic Ordering of Events in News Articles
- URL: http://arxiv.org/abs/2310.17802v1
- Date: Thu, 26 Oct 2023 22:23:38 GMT
- Title: TIMELINE: Exhaustive Annotation of Temporal Relations Supporting the
Automatic Ordering of Events in News Articles
- Authors: Sarah Alsayyahi and Riza Batista-Navarro
- Abstract summary: This paper presents a new annotation scheme that clearly defines the criteria based on which temporal relations should be annotated.
We also propose a method for annotating all temporal relations -- including long-distance ones -- which automates the process.
The result is a new dataset, the TIMELINE corpus, in which improved inter-annotator agreement was obtained.
- Score: 4.314956204483074
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Temporal relation extraction models have thus far been hindered by a number
of issues in existing temporal relation-annotated news datasets, including: (1)
low inter-annotator agreement due to the lack of specificity of their
annotation guidelines in terms of what counts as a temporal relation; (2) the
exclusion of long-distance relations within a given document (those spanning
across different paragraphs); and (3) the exclusion of events that are not
centred on verbs. This paper aims to alleviate these issues by presenting a new
annotation scheme that clearly defines the criteria based on which temporal
relations should be annotated. Additionally, the scheme includes events even if
they are not expressed as verbs (e.g., nominalised events). Furthermore, we
propose a method for annotating all temporal relations -- including
long-distance ones -- which automates the process, hence reducing time and
manual effort on the part of annotators. The result is a new dataset, the
TIMELINE corpus, in which improved inter-annotator agreement was obtained, in
comparison with previously reported temporal relation datasets. We report the
results of training and evaluating baseline temporal relation extraction models
on the new corpus, and compare them with results obtained on the widely used
MATRES corpus.
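The abstract does not detail the automated annotation step, but the idea of deriving every pairwise relation, including long-distance ones, can be illustrated with a minimal sketch: assuming each annotated event (verbal or nominalised) carries a comparable time anchor, all event pairs in a document can be related mechanically, no matter how many paragraphs separate them. The event list, field names, and label set below are hypothetical, not the TIMELINE corpus format.

```python
from itertools import combinations

# Hypothetical events with annotator-assigned, comparable time anchors
# (e.g., resolved order indices); the field names are illustrative only.
events = [
    {"id": "e1", "text": "elected",     "paragraph": 1, "anchor": 1},
    {"id": "e2", "text": "announced",   "paragraph": 1, "anchor": 3},
    {"id": "e3", "text": "resignation", "paragraph": 4, "anchor": 5},  # nominalised, long-distance
]

def relation(a, b):
    """Derive a pairwise temporal relation from the two anchors."""
    if a["anchor"] < b["anchor"]:
        return "BEFORE"
    if a["anchor"] > b["anchor"]:
        return "AFTER"
    return "EQUAL"

# Exhaustive annotation: every event pair receives a relation, regardless of
# how far apart the two events occur in the document.
for a, b in combinations(events, 2):
    print(a["id"], relation(a, b), b["id"])
```

In this toy setting annotators only supply per-event anchors; the quadratic set of pairwise labels is then generated automatically, which is what keeps the manual effort manageable.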
Related papers
- More than Classification: A Unified Framework for Event Temporal Relation Extraction [61.44799147458621]
Event temporal relation extraction (ETRE) is usually formulated as a multi-label classification task.
We observe that all relations can be interpreted using the start and end time points of events.
We propose a unified event temporal relation extraction framework, which transforms temporal relations into logical expressions of time points.
arXiv Detail & Related papers (2023-05-28T02:09:08Z)
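The entry above interprets every temporal relation through the start and end time points of the two events. Below is a minimal sketch of that time-point view, with an invented label set and comparison rules rather than the paper's exact logical expressions:

```python
# Time-point view of temporal relations: a relation between events e1 and e2
# is a logical condition over their start and end points. The label set and
# conditions here are illustrative, not the paper's exact formulation.
def classify(e1_start, e1_end, e2_start, e2_end):
    if e1_end <= e2_start:
        return "BEFORE"
    if e2_end <= e1_start:
        return "AFTER"
    if e1_start == e2_start and e1_end == e2_end:
        return "EQUAL"
    return "OVERLAP"

print(classify(0, 2, 3, 5))  # BEFORE
print(classify(1, 4, 2, 6))  # OVERLAP
```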
- Relational Sentence Embedding for Flexible Semantic Matching [86.21393054423355]
We present Relational Sentence Embedding (RSE), a new paradigm that further explores the potential of sentence embeddings.
RSE is effective and flexible in modeling sentence relations and outperforms a series of state-of-the-art embedding methods.
arXiv Detail & Related papers (2022-12-17T05:25:17Z)
- MAVEN-ERE: A Unified Large-scale Dataset for Event Coreference, Temporal, Causal, and Subevent Relation Extraction [78.61546292830081]
We construct a large-scale human-annotated ERE dataset MAVEN-ERE with improved annotation schemes.
It contains 103,193 event coreference chains, 1,216,217 temporal relations, 57,992 causal relations, and 15,841 subevent relations.
Experiments show that ERE on MAVEN-ERE is quite challenging, and considering relation interactions with joint learning can improve performance.
arXiv Detail & Related papers (2022-11-14T13:34:49Z)
- Extracting Temporal Event Relation with Syntactic-Guided Temporal Graph Transformer [17.850316385809617]
We propose a new Temporal Graph Transformer network to explicitly find the connection between two events from a syntactic graph constructed from one or two consecutive sentences.
Experiments on MATRES and TB-Dense datasets show that our approach significantly outperforms previous state-of-the-art methods on both end-to-end temporal relation extraction and temporal relation classification.
arXiv Detail & Related papers (2021-04-19T19:00:45Z)
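As a rough illustration of the syntactic-graph idea in the entry above (only the graph construction, not the Temporal Graph Transformer model itself), one can connect two event mentions through dependency edges; the example sentence, event words, and use of spaCy/networkx are assumptions of this sketch:

```python
# Connect two event triggers through a dependency graph built from a parse.
# Requires: pip install spacy networkx && python -m spacy download en_core_web_sm
import spacy
import networkx as nx

nlp = spacy.load("en_core_web_sm")
doc = nlp("Regulators approved the deal before the company announced the merger.")

# Build an undirected graph over dependency edges.
graph = nx.Graph()
for token in doc:
    for child in token.children:
        graph.add_edge(token.i, child.i, dep=child.dep_)

# Hypothetical event triggers in the example sentence.
e1 = next(t.i for t in doc if t.text == "approved")
e2 = next(t.i for t in doc if t.text == "announced")

path = nx.shortest_path(graph, source=e1, target=e2)
print(" -> ".join(doc[i].text for i in path))
```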
- One-shot Learning for Temporal Knowledge Graphs [49.41854171118697]
We propose a one-shot learning framework for link prediction in temporal knowledge graphs.
Our proposed method employs a self-attention mechanism to effectively encode temporal interactions between entities.
Our experiments show that the proposed algorithm outperforms state-of-the-art baselines on two well-studied benchmarks.
arXiv Detail & Related papers (2020-10-23T03:24:44Z)
- Learning to Decouple Relations: Few-Shot Relation Classification with Entity-Guided Attention and Confusion-Aware Training [49.9995628166064]
We propose CTEG, a model equipped with two mechanisms to learn to decouple easily-confused relations.
On the one hand, an Entity-Guided Attention (EGA) mechanism is introduced to guide the attention to filter out information that causes confusion.
On the other hand, a Confusion-Aware Training (CAT) method is proposed to explicitly learn to distinguish relations.
arXiv Detail & Related papers (2020-10-21T11:07:53Z)
- Joint Constrained Learning for Event-Event Relation Extraction [94.3499255880101]
We propose a joint constrained learning framework for modeling event-event relations.
Specifically, the framework enforces logical constraints within and across multiple temporal and subevent relations.
We show that our joint constrained learning approach effectively compensates for the lack of jointly labeled data.
arXiv Detail & Related papers (2020-10-13T22:45:28Z)
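One concrete example of the logical constraints mentioned in the entry above is transitivity of BEFORE. The sketch below only checks predictions for such violations; the labels and the predictions dictionary are hypothetical, and the actual framework enforces constraints during joint learning rather than as a post-hoc check:

```python
from itertools import permutations

# Hypothetical pairwise predictions over three events; the last one violates
# transitivity (e1 BEFORE e2 and e2 BEFORE e3 imply e1 BEFORE e3).
predictions = {
    ("e1", "e2"): "BEFORE",
    ("e2", "e3"): "BEFORE",
    ("e1", "e3"): "AFTER",
}

def transitivity_violations(preds):
    events = {e for pair in preds for e in pair}
    violations = []
    for a, b, c in permutations(sorted(events), 3):
        if (preds.get((a, b)) == "BEFORE" and preds.get((b, c)) == "BEFORE"
                and preds.get((a, c)) not in (None, "BEFORE")):
            violations.append((a, b, c))
    return violations

print(transitivity_violations(predictions))  # [('e1', 'e2', 'e3')]
```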
- Predicting Event Time by Classifying Sub-Level Temporal Relations Induced from a Unified Representation of Time Anchors [10.67457147373144]
We propose an effective method to decompose complex temporal relations into sub-level relations.
Our approach outperforms the state-of-the-art decision tree model.
arXiv Detail & Related papers (2020-08-14T16:30:07Z)
- Temporal Information Extraction by Predicting Relative Time-lines [30.060559390097314]
We propose a new method to construct a linear time-line from a set of (extracted) temporal relations.
Within this paradigm, we propose two models that predict in linear complexity, and a new training loss using TimeML-style annotations.
arXiv Detail & Related papers (2018-08-28T16:46:30Z)
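If, as in the relative time-line idea above, a model predicts a scalar start and duration for every event, then the document time-line follows directly from those per-event predictions and any pairwise relation can be read off by comparing the resulting intervals. The predicted values below are invented for illustration; this is not the paper's actual models or training loss:

```python
# Hypothetical per-event predictions: (start, duration) on a relative scale.
predicted = {
    "elected":   (0.0, 1.0),
    "announced": (2.5, 0.5),
    "resigned":  (4.0, 0.2),
}

# Order events by predicted start and print their intervals; pairwise
# relations then reduce to interval comparisons, with no per-pair classifier.
for name, (start, dur) in sorted(predicted.items(), key=lambda kv: kv[1][0]):
    print(f"{name}: [{start}, {start + dur}]")
```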