Extracting Temporal Event Relation with Syntactic-Guided Temporal Graph
Transformer
- URL: http://arxiv.org/abs/2104.09570v1
- Date: Mon, 19 Apr 2021 19:00:45 GMT
- Title: Extracting Temporal Event Relation with Syntactic-Guided Temporal Graph
Transformer
- Authors: Shuaicheng Zhang, Lifu Huang, Qiang Ning
- Abstract summary: We propose a new Temporal Graph Transformer network to explicitly find the connection between two events from a syntactic graph constructed from one or two consecutive sentences.
Experiments on MATRES and TB-Dense datasets show that our approach significantly outperforms previous state-of-the-art methods on both end-to-end temporal relation extraction and temporal relation classification.
- Score: 17.850316385809617
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: Extracting temporal relations (e.g., before, after, concurrent) among events
is crucial to natural language understanding. Previous studies mainly rely on
neural networks to learn effective features or hand-crafted linguistic
features for temporal relation extraction, which usually fail when the context
between two events is complex or spans a long distance. Inspired by an examination of available
temporal relation annotations and human-like cognitive procedures, we propose a
new Temporal Graph Transformer network to (1) explicitly find the connection
between two events from a syntactic graph constructed from one or two
consecutive sentences, and (2) automatically locate the most indicative temporal
cues from the path of the two event mentions as well as their surrounding
concepts in the syntactic graph with a new temporal-oriented attention
mechanism. Experiments on MATRES and TB-Dense datasets show that our approach
significantly outperforms previous state-of-the-art methods on both end-to-end
temporal relation extraction and temporal relation classification.
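The two steps the abstract describes can be illustrated with a minimal sketch (not the authors' code): a toy dependency graph over the sentence "He left before she arrived" is searched for the path connecting the two event mentions ("left", "arrived"), and a softmax over hypothetical temporal-cue scores stands in for the temporal-oriented attention that weights tokens along that path. The sentence, edge list, and cue scores are all illustrative assumptions.

```python
# Sketch of the two steps from the abstract, under assumed toy data:
# (1) find the syntactic path between two event mentions, and
# (2) weight tokens on that path with an attention-style softmax.
import math
from collections import deque

# Hand-built undirected dependency edges (head, dependent) for the
# toy sentence "He left before she arrived"; events: "left", "arrived".
edges = [("left", "He"), ("left", "before"),
         ("before", "arrived"), ("arrived", "she")]
graph = {}
for head, dep in edges:
    graph.setdefault(head, []).append(dep)
    graph.setdefault(dep, []).append(head)

def syntactic_path(graph, src, dst):
    """BFS shortest path between two event mentions in the syntactic graph."""
    queue, seen = deque([[src]]), {src}
    while queue:
        path = queue.popleft()
        if path[-1] == dst:
            return path
        for nxt in graph.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

def attention_weights(cue_scores):
    """Softmax over (hypothetical) temporal-cue scores for path tokens."""
    exps = [math.exp(s) for s in cue_scores]
    total = sum(exps)
    return [e / total for e in exps]

path = syntactic_path(graph, "left", "arrived")
print(path)  # ['left', 'before', 'arrived']
# The connective "before" gets the highest (made-up) cue score,
# so attention concentrates on the indicative temporal cue.
weights = attention_weights([0.1, 2.0, 0.1])
print(weights)
```

In the paper itself, the graph comes from a dependency parser, the attention scores are learned, and path plus context concepts feed a Transformer; the sketch only shows why a syntactic path narrows the search for temporal cues.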
Related papers
- TimeGraphs: Graph-based Temporal Reasoning [64.18083371645956]
TimeGraphs is a novel approach that characterizes dynamic interactions as a hierarchical temporal graph.
Our approach models the interactions using a compact graph-based representation, enabling adaptive reasoning across diverse time scales.
We evaluate TimeGraphs on multiple datasets with complex, dynamic agent interactions, including a football simulator, the Resistance game, and the MOMA human activity dataset.
arXiv Detail & Related papers (2024-01-06T06:26:49Z)
- TIMELINE: Exhaustive Annotation of Temporal Relations Supporting the
Automatic Ordering of Events in News Articles [4.314956204483074]
This paper presents a new annotation scheme that clearly defines the criteria based on which temporal relations should be annotated.
We also propose a method for annotating all temporal relations -- including long-distance ones -- which automates the process.
The result is a new dataset, the TIMELINE corpus, in which improved inter-annotator agreement was obtained.
arXiv Detail & Related papers (2023-10-26T22:23:38Z)
- Learning Complete Topology-Aware Correlations Between Relations for Inductive Link Prediction [121.65152276851619]
We show that semantic correlations between relations are inherently edge-level and entity-independent.
We propose a novel subgraph-based method, namely TACO, to model Topology-Aware COrrelations between relations.
To further exploit the potential of RCN, we propose the Complete Common Neighbor induced subgraph.
arXiv Detail & Related papers (2023-09-20T08:11:58Z)
- Exploring the Limits of Historical Information for Temporal Knowledge
Graph Extrapolation [59.417443739208146]
We propose a new event forecasting model based on a novel training framework of historical contrastive learning.
CENET learns both historical and non-historical dependencies to distinguish the most plausible entities.
We evaluate our proposed model on five benchmark graphs.
arXiv Detail & Related papers (2023-08-29T03:26:38Z)
- Relational Temporal Graph Reasoning for Dual-task Dialogue Language
Understanding [39.76268402567324]
Dual-task dialog language understanding aims to tackle two correlated dialog language understanding tasks simultaneously by exploiting their inherent correlations.
We put forward a new framework, whose core is relational temporal graph reasoning.
Our models outperform state-of-the-art models by a large margin.
arXiv Detail & Related papers (2023-06-15T13:19:08Z)
- Temporal Relation Extraction with a Graph-Based Deep Biaffine Attention
Model [0.0]
We propose a novel temporal information extraction model based on deep biaffine attention.
We experimentally demonstrate that our model achieves state-of-the-art performance in temporal relation extraction.
arXiv Detail & Related papers (2022-01-16T19:40:08Z)
- An Empirical Study: Extensive Deep Temporal Point Process [23.9359814366167]
We first review recent research emphases and difficulties in modeling asynchronous event sequences with deep temporal point processes.
We propose a Granger causality discovery framework for exploiting the relations among multi-types of events.
arXiv Detail & Related papers (2021-10-19T10:15:00Z)
- Extracting Event Temporal Relations via Hyperbolic Geometry [18.068466562913923]
We introduce two approaches to encode events and their temporal relations in hyperbolic spaces.
One approach leverages hyperbolic embeddings to directly infer event relations through simple geometrical operations.
In the second one, we devise an end-to-end architecture composed of hyperbolic neural units tailored for the temporal relation extraction task.
arXiv Detail & Related papers (2021-09-12T14:40:13Z)
- Learning Relation Prototype from Unlabeled Texts for Long-tail Relation
Extraction [84.64435075778988]
We propose a general approach to learn relation prototypes from unlabeled texts.
We learn relation prototypes as an implicit factor between entities.
We conduct experiments on two publicly available datasets: New York Times and Google Distant Supervision.
arXiv Detail & Related papers (2020-11-27T06:21:12Z)
- One-shot Learning for Temporal Knowledge Graphs [49.41854171118697]
We propose a one-shot learning framework for link prediction in temporal knowledge graphs.
Our proposed method employs a self-attention mechanism to effectively encode temporal interactions between entities.
Our experiments show that the proposed algorithm outperforms the state of the art baselines for two well-studied benchmarks.
arXiv Detail & Related papers (2020-10-23T03:24:44Z)
- Temporal Embeddings and Transformer Models for Narrative Text
Understanding [72.88083067388155]
We present two approaches to narrative text understanding for character relationship modelling.
The temporal evolution of these relations is described by dynamic word embeddings, that are designed to learn semantic changes over time.
A supervised learning approach based on the state-of-the-art transformer model BERT is used instead to detect static relations between characters.
arXiv Detail & Related papers (2020-03-19T14:23:12Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences arising from its use.