Temporal Information Extraction by Predicting Relative Time-lines
- URL: http://arxiv.org/abs/1808.09401v2
- Date: Thu, 30 Nov 2023 09:58:01 GMT
- Title: Temporal Information Extraction by Predicting Relative Time-lines
- Authors: Artuur Leeuwenberg, Marie-Francine Moens
- Abstract summary: We propose a new method to construct a linear time-line from a set of (extracted) temporal relations.
Within this paradigm, we propose two models that predict in linear complexity, and a new training loss using TimeML-style annotations.
- Score: 30.060559390097314
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The current leading paradigm for temporal information extraction from text
consists of three phases: (1) recognition of events and temporal expressions,
(2) recognition of temporal relations among them, and (3) time-line
construction from the temporal relations. In contrast to the first two phases,
the last phase, time-line construction, received little attention and is the
focus of this work. In this paper, we propose a new method to construct a
linear time-line from a set of (extracted) temporal relations. But more
importantly, we propose a novel paradigm in which we directly predict start and
end-points for events from the text, constituting a time-line without going
through the intermediate step of prediction of temporal relations as in earlier
work. Within this paradigm, we propose two models that predict in linear
complexity, and a new training loss using TimeML-style annotations, yielding
promising results.
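The paradigm described above can be illustrated with a minimal sketch: each event is mapped to a start and end point on a shared relative time-line, pairwise relations are read off from the resulting intervals, and a hinge-style loss penalises intervals that violate annotated relations. All names here (`relation`, `timeline_loss`, the margin value) are illustrative assumptions, not the paper's actual API or exact loss.

```python
# Sketch: events as (start, end) intervals on a relative time-line.
# TimeML-style relations are derived from interval endpoints, and a
# simplified hinge loss enforces annotated relations during training.

def relation(a, b):
    """Read off a TimeML-style relation from two (start, end) intervals."""
    a_s, a_e = a
    b_s, b_e = b
    if a_e < b_s:
        return "BEFORE"
    if b_e < a_s:
        return "AFTER"
    if a_s <= b_s and b_e <= a_e:
        return "INCLUDES"
    if b_s <= a_s and a_e <= b_e:
        return "IS_INCLUDED"
    return "OVERLAP"

def timeline_loss(pred, gold_relations, margin=0.1):
    """Hinge-style penalty for interval pairs that satisfy an annotated
    relation by less than `margin` (a simplified stand-in for the
    paper's TimeML-based training loss)."""
    loss = 0.0
    for i, j, rel in gold_relations:
        (si, ei), (sj, ej) = pred[i], pred[j]
        if rel == "BEFORE":       # event i must end before event j starts
            loss += max(0.0, margin - (sj - ei))
        elif rel == "AFTER":      # event i must start after event j ends
            loss += max(0.0, margin - (si - ej))
        elif rel == "INCLUDES":   # event i's interval must contain event j's
            loss += max(0.0, margin - (sj - si)) + max(0.0, margin - (ei - ej))
    return loss

# Example: two events placed on a relative time-line.
pred = {"e1": (0.0, 1.0), "e2": (2.0, 3.0)}
print(relation(pred["e1"], pred["e2"]))                 # BEFORE
print(timeline_loss(pred, [("e1", "e2", "BEFORE")]))    # 0.0
```

Because relations are derived from the predicted points rather than classified pairwise, a full time-line follows directly from one pass over the events, which is what gives the proposed models their linear complexity.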
Related papers
- Relational Conformal Prediction for Correlated Time Series [56.59852921638328]
We propose a novel distribution-free approach based on the conformal prediction framework and quantile regression.
We fill this void by introducing a novel conformal prediction method based on graph deep learning operators.
Our approach provides accurate coverage and achieves state-of-the-art uncertainty quantification on relevant benchmarks.
arXiv Detail & Related papers (2025-02-13T16:12:17Z) - Temporal reasoning for timeline summarisation in social media [9.475065787773017]
We introduce NarrativeReason, a novel dataset focused on temporal relationships among sequential events within narratives.
We then combine temporal reasoning with timeline summarisation through a knowledge distillation framework.
Experimental results demonstrate that our model achieves superior performance on out-of-domain mental health-related timeline summarisation tasks.
arXiv Detail & Related papers (2024-12-30T21:54:33Z) - TimeSiam: A Pre-Training Framework for Siamese Time-Series Modeling [67.02157180089573]
Time series pre-training has recently garnered wide attention for its potential to reduce labeling expenses and benefit various downstream tasks.
This paper proposes TimeSiam as a simple but effective self-supervised pre-training framework for time series based on Siamese networks.
arXiv Detail & Related papers (2024-02-04T13:10:51Z) - TIMELINE: Exhaustive Annotation of Temporal Relations Supporting the
Automatic Ordering of Events in News Articles [4.314956204483074]
This paper presents a new annotation scheme that clearly defines the criteria based on which temporal relations should be annotated.
We also propose a method for annotating all temporal relations -- including long-distance ones -- which automates the process.
The result is a new dataset, the TIMELINE corpus, in which improved inter-annotator agreement was obtained.
arXiv Detail & Related papers (2023-10-26T22:23:38Z) - Temporal Smoothness Regularisers for Neural Link Predictors [8.975480841443272]
We show that a simple method like TNTComplEx can produce significantly more accurate results than state-of-the-art methods.
We also evaluate the impact of a wide range of temporal smoothing regularisers on two state-of-the-art temporal link prediction models.
arXiv Detail & Related papers (2023-09-16T16:52:49Z) - Direct Embedding of Temporal Network Edges via Time-Decayed Line Graphs [51.51417735550026]
Methods for machine learning on temporal networks generally exhibit at least one of two limitations.
We present a simple method that avoids both shortcomings: construct the line graph of the network, which includes a node for each interaction, and weigh the edges of this graph based on the difference in time between interactions.
Empirical results on real-world networks demonstrate our method's efficacy and efficiency on both edge classification and temporal link prediction.
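The construction described above is concrete enough to sketch: each timestamped interaction becomes a node of the line graph, and two interactions that share an endpoint are linked with a weight that decays with their time difference. The exponential decay form and the function name below are illustrative assumptions, not necessarily the paper's exact weighting.

```python
import math
from itertools import combinations

# Sketch of a time-decayed line graph: one node per timestamped
# interaction (u, v, t); interactions sharing an endpoint are linked,
# weighted by exp(-decay * |t1 - t2|) so that temporally distant
# interactions contribute less.

def time_decayed_line_graph(interactions, decay=1.0):
    nodes = list(interactions)                  # one node per interaction
    edges = {}
    for a, b in combinations(range(len(nodes)), 2):
        (u1, v1, t1), (u2, v2, t2) = nodes[a], nodes[b]
        if {u1, v1} & {u2, v2}:                 # interactions share an endpoint
            edges[(a, b)] = math.exp(-decay * abs(t1 - t2))
    return nodes, edges

# Example: three interactions on a tiny temporal network.
inter = [("a", "b", 0.0), ("b", "c", 1.0), ("c", "d", 5.0)]
nodes, edges = time_decayed_line_graph(inter)
print(sorted(edges))            # pairs of interactions sharing a node
print(round(edges[(0, 1)], 3))  # weight decays with the 1.0 time gap
```

The resulting weighted graph can then be fed to any static graph-embedding method, which is how the approach sidesteps bespoke temporal architectures.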
arXiv Detail & Related papers (2022-09-30T18:24:13Z) - Temporal Relation Extraction with a Graph-Based Deep Biaffine Attention
Model [0.0]
We propose a novel temporal information extraction model based on deep biaffine attention.
We experimentally demonstrate that our model achieves state-of-the-art performance in temporal relation extraction.
arXiv Detail & Related papers (2022-01-16T19:40:08Z) - An Empirical Study: Extensive Deep Temporal Point Process [61.14164208094238]
We first review recent research emphases and difficulties in modeling asynchronous event sequences with deep temporal point processes.
We propose a Granger causality discovery framework for exploiting the relations among multi-types of events.
arXiv Detail & Related papers (2021-10-19T10:15:00Z) - Extracting Temporal Event Relation with Syntactic-Guided Temporal Graph
Transformer [17.850316385809617]
We propose a new Temporal Graph Transformer network to explicitly find the connection between two events from a syntactic graph constructed from one or two continuous sentences.
Experiments on MATRES and TB-Dense datasets show that our approach significantly outperforms previous state-of-the-art methods on both end-to-end temporal relation extraction and temporal relation classification.
arXiv Detail & Related papers (2021-04-19T19:00:45Z) - Predicting Temporal Sets with Deep Neural Networks [50.53727580527024]
We propose an integrated solution based on the deep neural networks for temporal sets prediction.
A unique perspective is to learn element relationships by constructing a set-level co-occurrence graph.
We design an attention-based module to adaptively learn the temporal dependencies of elements and sets.
arXiv Detail & Related papers (2020-06-20T03:29:02Z) - Supporting Optimal Phase Space Reconstructions Using Neural Network
Architecture for Time Series Modeling [68.8204255655161]
We propose an artificial neural network with a mechanism to implicitly learn the properties of the phase space.
Our approach is either as competitive as or better than most state-of-the-art strategies.
arXiv Detail & Related papers (2020-06-19T21:04:47Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.