Temporal Relation Extraction with a Graph-Based Deep Biaffine Attention Model
- URL: http://arxiv.org/abs/2201.06125v1
- Date: Sun, 16 Jan 2022 19:40:08 GMT
- Title: Temporal Relation Extraction with a Graph-Based Deep Biaffine Attention Model
- Authors: Bo-Ying Su, Shang-Ling Hsu, Kuan-Yin Lai, Amarnath Gupta
- Abstract summary: We propose a novel temporal information extraction model based on deep biaffine attention.
We experimentally demonstrate that our model achieves state-of-the-art performance in temporal relation extraction.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Temporal information extraction plays a critical role in natural language
understanding. Previous systems have incorporated advanced neural language
models and have successfully enhanced the accuracy of temporal information
extraction tasks. However, these systems have two major shortcomings. First,
they fail to make use of the two-sided nature of temporal relations in
prediction. Second, they involve non-parallelizable pipelines in the inference
process that bring little performance gain. To address these shortcomings, we propose a novel
temporal information extraction model based on deep biaffine attention to
extract temporal relationships between events in unstructured text efficiently
and accurately. Our model is performant because it carries out relation extraction
directly instead of treating event annotation as a prerequisite of
relation extraction. Moreover, our architecture uses Multilayer Perceptrons
(MLP) with biaffine attention to predict arcs and relation labels separately,
improving relation detection accuracy by exploiting the two-sided nature of
temporal relationships. We experimentally demonstrate that our model achieves
state-of-the-art performance in temporal relation extraction.
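To make the arc and label scoring concrete, the following is a minimal PyTorch sketch of a deep biaffine scorer in the style the abstract describes (separate MLP projections feeding a bias-augmented bilinear product, as in Dozat-and-Manning-style parsers). The class name, dimensions, and initialization are illustrative assumptions, not the authors' released code.

```python
import torch
import torch.nn as nn

class BiaffineScorer(nn.Module):
    """Deep biaffine scorer sketch: two MLPs project contextual encodings into
    separate 'source' and 'target' spaces, then a bilinear product (with a
    bias term appended to each side) scores every ordered pair of tokens."""

    def __init__(self, enc_dim: int, mlp_dim: int, n_labels: int = 1):
        super().__init__()
        self.source_mlp = nn.Sequential(nn.Linear(enc_dim, mlp_dim), nn.ReLU())
        self.target_mlp = nn.Sequential(nn.Linear(enc_dim, mlp_dim), nn.ReLU())
        # One bilinear slice per output label (a single slice for arc scores).
        self.U = nn.Parameter(torch.empty(n_labels, mlp_dim + 1, mlp_dim + 1))
        nn.init.xavier_uniform_(self.U)

    def forward(self, enc: torch.Tensor) -> torch.Tensor:
        # enc: (batch, seq_len, enc_dim) contextual token/event encodings.
        s = self.source_mlp(enc)                      # (B, T, d)
        t = self.target_mlp(enc)                      # (B, T, d)
        ones = enc.new_ones(s.shape[0], s.shape[1], 1)
        s = torch.cat([s, ones], dim=-1)              # append bias feature
        t = torch.cat([t, ones], dim=-1)
        # scores[b, l, i, j] = s_i^T U_l t_j for every ordered pair (i, j),
        # so the two directions of a pair are scored independently.
        return torch.einsum("bid,ldk,bjk->blij", s, self.U, t)

# Sketch of the two heads described in the abstract: one scorer with
# n_labels=1 for arc detection, and another with n_labels set to the number
# of temporal relation types for labelling the detected arcs.
```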
Related papers
- TimeGraphs: Graph-based Temporal Reasoning [64.18083371645956]
TimeGraphs is a novel approach that characterizes dynamic interactions as a hierarchical temporal graph.
Our approach models the interactions using a compact graph-based representation, enabling adaptive reasoning across diverse time scales.
We evaluate TimeGraphs on multiple datasets with complex, dynamic agent interactions, including a football simulator, the Resistance game, and the MOMA human activity dataset.
arXiv Detail & Related papers (2024-01-06T06:26:49Z) - Re-Temp: Relation-Aware Temporal Representation Learning for Temporal
Knowledge Graph Completion [11.699431017532367]
Temporal Knowledge Graph Completion (TKGC) under the extrapolation setting aims to predict the missing entity from a fact in the future.
We propose our model, Re-Temp, which leverages explicit temporal embedding as input and incorporates skip information flow after each timestamp to skip unnecessary information for prediction.
We demonstrate that our model outperforms all eight recent state-of-the-art models by a significant margin.
arXiv Detail & Related papers (2023-10-24T10:58:33Z) - Temporal Smoothness Regularisers for Neural Link Predictors [8.975480841443272]
We show that a simple method like TNTComplEx can produce significantly more accurate results than state-of-the-art methods.
We also evaluate the impact of a wide range of temporal smoothing regularisers on two state-of-the-art temporal link prediction models.
arXiv Detail & Related papers (2023-09-16T16:52:49Z) - TC-GAT: Graph Attention Network for Temporal Causality Discovery [6.974417592057705]
- TC-GAT: Graph Attention Network for Temporal Causality Discovery [6.974417592057705]
Causality is frequently intertwined with temporal elements, as the progression from cause to effect is not instantaneous but rather ensconced in a temporal dimension.
We propose a method for extracting causality from the text that integrates both temporal and causal relations.
We present a novel model, TC-GAT, which employs a graph attention mechanism to assign weights to the temporal relationships and leverages a causal knowledge graph to determine the adjacency matrix.
arXiv Detail & Related papers (2023-04-21T02:26:42Z) - Extracting or Guessing? Improving Faithfulness of Event Temporal
Relation Extraction [87.04153383938969]
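The TC-GAT entry above combines graph attention with an adjacency matrix taken from a causal knowledge graph. Below is a generic adjacency-masked graph-attention layer in PyTorch that shows the mechanism; the class name and layer details are assumptions and do not reproduce the TC-GAT architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MaskedGATLayer(nn.Module):
    """Single graph-attention layer whose attention is restricted by an
    adjacency matrix (in the TC-GAT setting the adjacency would come from a
    causal knowledge graph). Generic sketch, not the paper's architecture."""

    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.proj = nn.Linear(in_dim, out_dim, bias=False)
        self.attn = nn.Linear(2 * out_dim, 1, bias=False)

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # x: (N, in_dim) node features; adj: (N, N) 0/1 adjacency matrix.
        h = self.proj(x)                                   # (N, out_dim)
        n = h.size(0)
        pairs = torch.cat(
            [h.unsqueeze(1).expand(n, n, -1), h.unsqueeze(0).expand(n, n, -1)],
            dim=-1,
        )                                                  # (N, N, 2*out_dim)
        e = F.leaky_relu(self.attn(pairs).squeeze(-1))     # raw pairwise scores
        e = e.masked_fill(adj == 0, float("-inf"))         # keep graph edges only
        alpha = torch.softmax(e, dim=-1)                   # normalise per node
        alpha = torch.nan_to_num(alpha)                    # isolated nodes -> 0
        return F.elu(alpha @ h)                            # aggregated features
```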
- Extracting or Guessing? Improving Faithfulness of Event Temporal Relation Extraction [87.04153383938969]
We improve the faithfulness of TempRel extraction models from two perspectives.
The first perspective is to extract genuinely based on contextual description.
The second perspective is to provide proper uncertainty estimation.
arXiv Detail & Related papers (2022-10-10T19:53:13Z) - Temporal Relevance Analysis for Video Action Models [70.39411261685963]
- Temporal Relevance Analysis for Video Action Models [70.39411261685963]
We first propose a new approach to quantify the temporal relationships between frames captured by CNN-based action models.
We then conduct comprehensive experiments and in-depth analysis to provide a better understanding of how temporal modeling is affected.
arXiv Detail & Related papers (2022-04-25T19:06:48Z) - SAIS: Supervising and Augmenting Intermediate Steps for Document-Level
Relation Extraction [51.27558374091491]
We propose to explicitly teach the model to capture relevant contexts and entity types by supervising and augmenting intermediate steps (SAIS) for relation extraction.
Based on a broad spectrum of carefully designed tasks, our proposed SAIS method not only extracts relations of better quality due to more effective supervision, but also retrieves the corresponding supporting evidence more accurately.
arXiv Detail & Related papers (2021-09-24T17:37:35Z) - Extracting Temporal Event Relation with Syntactic-Guided Temporal Graph
Transformer [17.850316385809617]
We propose a new Temporal Graph Transformer network to explicitly find the connection between two events from a syntactic graph constructed from one or two continuous sentences.
Experiments on MATRES and TB-Dense datasets show that our approach significantly outperforms previous state-of-the-art methods on both end-to-end temporal relation extraction and temporal relation classification.
arXiv Detail & Related papers (2021-04-19T19:00:45Z) - Learning Relation Prototype from Unlabeled Texts for Long-tail Relation
Extraction [84.64435075778988]
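As a small illustration of the "syntactic graph constructed from one or two continuous sentences" in the entry above, the sketch below builds a token-level adjacency matrix from dependency edges with spaCy; the paper's graph construction, and the transformer that consumes the graph, are more involved than this.

```python
import numpy as np
import spacy

nlp = spacy.load("en_core_web_sm")  # any dependency parser would do here

def syntactic_adjacency(text: str) -> np.ndarray:
    """Build a symmetric token-level adjacency matrix from dependency edges.

    Illustrative sketch only: each token is connected to its syntactic head,
    and self-loops keep every token reachable.
    """
    doc = nlp(text)
    n = len(doc)
    adj = np.eye(n, dtype=np.float32)
    for token in doc:
        if token.head.i != token.i:      # skip the root's self-edge
            adj[token.i, token.head.i] = 1.0
            adj[token.head.i, token.i] = 1.0
    return adj
```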
- Learning Relation Prototype from Unlabeled Texts for Long-tail Relation Extraction [84.64435075778988]
We propose a general approach to learn relation prototypes from unlabeled texts.
We learn relation prototypes as an implicit factor between entities.
We conduct experiments on two publicly available datasets: New York Times and Google Distant Supervision.
arXiv Detail & Related papers (2020-11-27T06:21:12Z) - One-shot Learning for Temporal Knowledge Graphs [49.41854171118697]
We propose a one-shot learning framework for link prediction in temporal knowledge graphs.
Our proposed method employs a self-attention mechanism to effectively encode temporal interactions between entities.
Our experiments show that the proposed algorithm outperforms the state of the art baselines for two well-studied benchmarks.
arXiv Detail & Related papers (2020-10-23T03:24:44Z) - Domain Knowledge Empowered Structured Neural Net for End-to-End Event
Temporal Relation Extraction [44.95973272921582]
We propose a framework that enhances deep neural network with distributional constraints constructed by probabilistic domain knowledge.
We solve the constrained inference problem via Lagrangian Relaxation and apply it on end-to-end event temporal relation extraction tasks.
arXiv Detail & Related papers (2020-09-15T22:20:27Z)