Temporal Information Extraction by Predicting Relative Time-lines
- URL: http://arxiv.org/abs/1808.09401v2
- Date: Thu, 30 Nov 2023 09:58:01 GMT
- Title: Temporal Information Extraction by Predicting Relative Time-lines
- Authors: Artuur Leeuwenberg, Marie-Francine Moens
- Abstract summary: We propose a new method to construct a linear time-line from a set of (extracted) temporal relations.
Within this paradigm, we propose two models that predict in linear complexity, and a new training loss using TimeML-style annotations.
- Score: 30.060559390097314
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The current leading paradigm for temporal information extraction from text
consists of three phases: (1) recognition of events and temporal expressions,
(2) recognition of temporal relations among them, and (3) time-line
construction from the temporal relations. In contrast to the first two phases,
the last phase, time-line construction, received little attention and is the
focus of this work. In this paper, we propose a new method to construct a
linear time-line from a set of (extracted) temporal relations. But more
importantly, we propose a novel paradigm in which we directly predict start and
end-points for events from the text, constituting a time-line without going
through the intermediate step of prediction of temporal relations as in earlier
work. Within this paradigm, we propose two models that predict in linear
complexity, and a new training loss using TimeML-style annotations, yielding
promising results.
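As a concrete illustration of this paradigm, the following is a minimal sketch (not the authors' released code; the names, the margin value, and the exact relation-to-constraint mapping are assumptions) of how predicted event start points and durations can be trained against TimeML-style relation annotations with a margin-based loss:

```python
# Hedged sketch of the relative time-line paradigm. Each event gets a
# predicted start point and a duration (assumed non-negative, e.g. produced
# via a softplus in the model); TimeML-style relations are mapped to
# point-wise constraints on those values, and violations are penalised
# with a hinge loss.
import torch
import torch.nn.functional as F

def hinge(gap, margin=0.025):
    # Penalise a constraint "lhs < rhs" whenever rhs - lhs falls below the margin.
    return F.relu(margin - gap)

def timeline_loss(starts, durations, relations):
    """starts, durations: tensors of shape (num_events,);
    relations: list of (i, j, label) with TimeML-style labels."""
    ends = starts + durations
    loss = starts.new_zeros(())
    for i, j, label in relations:
        if label == "BEFORE":        # event i ends before event j starts
            loss = loss + hinge(starts[j] - ends[i])
        elif label == "AFTER":       # event i starts after event j ends
            loss = loss + hinge(starts[i] - ends[j])
        elif label == "INCLUDES":    # event i temporally contains event j
            loss = loss + hinge(starts[j] - starts[i]) + hinge(ends[i] - ends[j])
        elif label == "SIMULTANEOUS":
            loss = loss + (starts[i] - starts[j]).abs() + (ends[i] - ends[j]).abs()
    return loss
```

Because each event receives its own start and duration, prediction cost grows linearly with the number of events rather than quadratically with the number of event pairs.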
Related papers
- Learning Temporal Distances: Contrastive Successor Features Can Provide a Metric Structure for Decision-Making [66.27188304203217]
Temporal distances lie at the heart of many algorithms for planning, control, and reinforcement learning.
Prior attempts to define such temporal distances in stochastic settings have been stymied by an important limitation: the resulting distances do not satisfy the triangle inequality.
We show how successor features learned by contrastive learning form a temporal distance that does satisfy the triangle inequality.
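For reference, the triangle inequality that the learned temporal distance is said to satisfy can be stated as follows (standard form; $d$ denotes the temporal distance and $s$, $w$, $g$ are arbitrary states, e.g. a start, a waypoint, and a goal):

```latex
d(s, g) \;\le\; d(s, w) + d(w, g)
```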
arXiv Detail & Related papers (2024-06-24T19:36:45Z)
- TimeSiam: A Pre-Training Framework for Siamese Time-Series Modeling [67.02157180089573]
Time series pre-training has recently garnered wide attention for its potential to reduce labeling expenses and benefit various downstream tasks.
This paper proposes TimeSiam as a simple but effective self-supervised pre-training framework for time series based on Siamese networks.
arXiv Detail & Related papers (2024-02-04T13:10:51Z)
- TIMELINE: Exhaustive Annotation of Temporal Relations Supporting the Automatic Ordering of Events in News Articles [4.314956204483074]
This paper presents a new annotation scheme that clearly defines the criteria based on which temporal relations should be annotated.
We also propose a method for annotating all temporal relations -- including long-distance ones -- which automates the process.
The result is a new dataset, the TIMELINE corpus, in which improved inter-annotator agreement was obtained.
arXiv Detail & Related papers (2023-10-26T22:23:38Z)
- Revisiting the Temporal Modeling in Spatio-Temporal Predictive Learning under A Unified View [73.73667848619343]
We introduce USTEP (Unified Spatio-TEmporal Predictive learning), an innovative framework that reconciles the recurrent-based and recurrent-free methods by integrating both micro-temporal and macro-temporal scales.
arXiv Detail & Related papers (2023-10-09T16:17:42Z)
- Temporal Smoothness Regularisers for Neural Link Predictors [8.975480841443272]
We show that a simple method like TNTComplEx can produce significantly more accurate results than state-of-the-art methods.
We also evaluate the impact of a wide range of temporal smoothing regularisers on two state-of-the-art temporal link prediction models.
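One common form of temporal smoothing regulariser for such models penalises differences between embeddings of adjacent timestamps; a general formulation (not necessarily the exact variant evaluated in the paper, with $\mathbf{t}_i$ denoting the embedding of the $i$-th timestamp) is:

```latex
\Lambda_p(\mathbf{t}_1, \dots, \mathbf{t}_{|T|}) \;=\; \frac{1}{|T| - 1} \sum_{i=1}^{|T| - 1} \left\lVert \mathbf{t}_{i+1} - \mathbf{t}_i \right\rVert_p^p
```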
arXiv Detail & Related papers (2023-09-16T16:52:49Z)
- Direct Embedding of Temporal Network Edges via Time-Decayed Line Graphs [51.51417735550026]
Methods for machine learning on temporal networks generally exhibit at least one of two limitations.
We present a simple method that avoids both shortcomings: construct the line graph of the network, which includes a node for each interaction, and weight the edges of this graph based on the difference in time between interactions.
Empirical results on real-world networks demonstrate our method's efficacy and efficiency on both edge classification and temporal link prediction.
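A minimal sketch of the construction described above (illustrative only; the exponential decay function and all names are assumptions, not the paper's implementation):

```python
# Hedged sketch: builds a line graph for a temporal network. Each
# timestamped interaction (u, v, t) becomes a node; two interactions that
# share an endpoint are connected, with an edge weight that decays with the
# time difference between them.
import math
from collections import defaultdict
from itertools import combinations

def time_decayed_line_graph(interactions, decay=1.0):
    """interactions: list of (u, v, t) tuples ->
    dict mapping (interaction_index_a, interaction_index_b) -> weight."""
    by_node = defaultdict(list)          # node -> indices of incident interactions
    for idx, (u, v, t) in enumerate(interactions):
        by_node[u].append(idx)
        by_node[v].append(idx)

    weights = {}
    for incident in by_node.values():
        for a, b in combinations(incident, 2):   # interactions sharing an endpoint
            dt = abs(interactions[a][2] - interactions[b][2])
            w = math.exp(-decay * dt)            # exponential time decay (assumed form)
            key = (min(a, b), max(a, b))
            weights[key] = max(weights.get(key, 0.0), w)
    return weights

# Example: three interactions on a tiny temporal graph
edges = [("a", "b", 0.0), ("b", "c", 1.0), ("a", "c", 5.0)]
print(time_decayed_line_graph(edges))
```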
arXiv Detail & Related papers (2022-09-30T18:24:13Z)
- Temporal Relation Extraction with a Graph-Based Deep Biaffine Attention Model [0.0]
We propose a novel temporal information extraction model based on deep biaffine attention.
We experimentally demonstrate that our model achieves state-of-the-art performance in temporal relation extraction.
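Deep biaffine attention typically scores a pair of event representations $\mathbf{h}_i$, $\mathbf{h}_j$ with a bilinear term plus a linear term; a standard formulation (not necessarily the paper's exact parameterisation) is:

```latex
s(e_i, e_j) \;=\; \mathbf{h}_i^{\top} \mathbf{U}\, \mathbf{h}_j \;+\; \mathbf{W}\,[\mathbf{h}_i ; \mathbf{h}_j] \;+\; \mathbf{b}
```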
arXiv Detail & Related papers (2022-01-16T19:40:08Z)
- Extracting Temporal Event Relation with Syntactic-Guided Temporal Graph Transformer [17.850316385809617]
We propose a new Temporal Graph Transformer network to explicitly find the connection between two events from a syntactic graph constructed from one or two continuous sentences.
Experiments on MATRES and TB-Dense datasets show that our approach significantly outperforms previous state-of-the-art methods on both end-to-end temporal relation extraction and temporal relation classification.
arXiv Detail & Related papers (2021-04-19T19:00:45Z)
- Predicting Temporal Sets with Deep Neural Networks [50.53727580527024]
We propose an integrated solution based on the deep neural networks for temporal sets prediction.
A unique perspective is to learn element relationship by constructing set-level co-occurrence graph.
We design an attention-based module to adaptively learn the temporal dependency of elements and sets.
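As an illustration of the set-level co-occurrence graph mentioned above, a minimal sketch (illustrative names, not the paper's code) that counts how often two elements appear together in the same historical set:

```python
# Hedged sketch: builds an element co-occurrence graph from a sequence of sets.
from collections import defaultdict
from itertools import combinations

def co_occurrence_graph(sets):
    """sets: iterable of element collections -> dict {(a, b): count}."""
    counts = defaultdict(int)
    for s in sets:
        for a, b in combinations(sorted(set(s)), 2):
            counts[(a, b)] += 1
    return dict(counts)

print(co_occurrence_graph([{"milk", "bread"}, {"milk", "eggs", "bread"}]))
```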
arXiv Detail & Related papers (2020-06-20T03:29:02Z)
- Supporting Optimal Phase Space Reconstructions Using Neural Network Architecture for Time Series Modeling [68.8204255655161]
We propose an artificial neural network with a mechanism to implicitly learn the properties of the phase space.
Our approach is competitive with, or better than, most state-of-the-art strategies.
arXiv Detail & Related papers (2020-06-19T21:04:47Z)