Re-Temp: Relation-Aware Temporal Representation Learning for Temporal
Knowledge Graph Completion
- URL: http://arxiv.org/abs/2310.15722v1
- Date: Tue, 24 Oct 2023 10:58:33 GMT
- Title: Re-Temp: Relation-Aware Temporal Representation Learning for Temporal
Knowledge Graph Completion
- Authors: Kunze Wang, Soyeon Caren Han, Josiah Poon
- Abstract summary: Temporal Knowledge Graph Completion (TKGC) under the extrapolation setting aims to predict the missing entity from a fact in the future.
We propose our model, Re-Temp, which leverages explicit temporal embedding as input and incorporates skip information flow after each timestamp to skip unnecessary information for prediction.
We demonstrate that our model outperforms all eight recent state-of-the-art models by a significant margin.
- Score: 11.699431017532367
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Temporal Knowledge Graph Completion (TKGC) under the extrapolation setting
aims to predict the missing entity from a fact in the future, posing a
challenge that aligns more closely with real-world prediction problems.
Existing research mostly encodes entities and relations using sequential graph
neural networks applied to recent snapshots. However, these approaches tend to
overlook the ability to skip irrelevant snapshots according to entity-related
relations in the query and disregard the importance of explicit temporal
information. To address this, we propose our model, Re-Temp (Relation-Aware
Temporal Representation Learning), which leverages explicit temporal embedding
as input and incorporates skip information flow after each timestamp to skip
unnecessary information for prediction. Additionally, we introduce a two-phase
forward propagation method to prevent information leakage. Through the
evaluation on six TKGC (extrapolation) datasets, we demonstrate that our model
outperforms all eight recent state-of-the-art models by a significant margin.
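For a concrete picture of the two ideas named in the abstract, the sketch below illustrates (i) explicit temporal embeddings fed in as input and (ii) a relation-aware skip gate applied after each timestamp. This is a minimal illustration under assumptions, not the authors' implementation: all class and parameter names are hypothetical, a Linear layer stands in for the per-snapshot graph encoder, and the two-phase forward propagation is omitted.

```python
import torch
import torch.nn as nn

class RelationAwareSkipEncoder(nn.Module):
    """Illustrative sketch only: explicit temporal embeddings as input plus a
    relation-aware skip gate after each timestamp. A Linear layer stands in
    for the per-snapshot graph encoder used by the actual model."""

    def __init__(self, dim: int, num_timestamps: int):
        super().__init__()
        self.time_emb = nn.Embedding(num_timestamps, dim)   # explicit temporal embedding
        self.snapshot_enc = nn.Linear(2 * dim, dim)          # placeholder for a relational GNN
        self.skip_gate = nn.Linear(2 * dim, dim)

    def forward(self, ent_emb, query_rel_emb, timestamps):
        # ent_emb: (num_entities, dim); query_rel_emb: (dim,); timestamps: iterable of ints
        h = ent_emb
        for t in timestamps:                                  # the k most recent snapshots
            te = self.time_emb(torch.tensor(t)).expand_as(h)  # inject explicit time information
            h_new = torch.relu(self.snapshot_enc(torch.cat([h, te], dim=-1)))
            rel = query_rel_emb.expand_as(h_new)
            # Gate conditioned on the query relation: g -> 0 skips this snapshot
            g = torch.sigmoid(self.skip_gate(torch.cat([h_new, rel], dim=-1)))
            h = g * h_new + (1 - g) * h
        return h

# Toy usage:
# enc = RelationAwareSkipEncoder(dim=32, num_timestamps=10)
# h = enc(torch.randn(500, 32), torch.randn(32), timestamps=range(5, 10))
```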
Related papers
- TempME: Towards the Explainability of Temporal Graph Neural Networks via
Motif Discovery [15.573944320072284]
We propose TempME, which uncovers the most pivotal temporal motifs guiding the prediction of temporal graph neural networks (TGNNs).
TempME extracts the most interaction-related motifs while minimizing the amount of contained information to preserve the sparsity and succinctness of the explanation.
Experiments validate the superiority of TempME, with up to 8.21% increase in terms of explanation accuracy across six real-world datasets and up to 22.96% increase in boosting the prediction Average Precision of current TGNNs.
arXiv Detail & Related papers (2023-10-30T07:51:41Z) - Temporal Smoothness Regularisers for Neural Link Predictors [8.975480841443272]
We show that a simple method like TNTComplEx can produce significantly more accurate results than state-of-the-art methods.
We also evaluate the impact of a wide range of temporal smoothing regularisers on two state-of-the-art temporal link prediction models (a generic sketch of this kind of regulariser is given after this list).
arXiv Detail & Related papers (2023-09-16T16:52:49Z) - Exploring the Limits of Historical Information for Temporal Knowledge
Graph Extrapolation [59.417443739208146]
We propose a new event forecasting model based on a novel training framework of historical contrastive learning.
CENET learns both the historical and non-historical dependency to distinguish the most potential entities.
We evaluate our proposed model on five benchmark graphs.
arXiv Detail & Related papers (2023-08-29T03:26:38Z) - TempSAL -- Uncovering Temporal Information for Deep Saliency Prediction [64.63645677568384]
We introduce a novel saliency prediction model that learns to output saliency maps in sequential time intervals.
Our approach locally modulates the saliency predictions by combining the learned temporal maps.
Our code will be publicly available on GitHub.
arXiv Detail & Related papers (2023-01-05T22:10:16Z) - Temporal Relevance Analysis for Video Action Models [70.39411261685963]
We first propose a new approach to quantify the temporal relationships between frames captured by CNN-based action models.
We then conduct comprehensive experiments and in-depth analysis to provide a better understanding of how temporal modeling is affected.
arXiv Detail & Related papers (2022-04-25T19:06:48Z) - Temporal Knowledge Graph Reasoning with Low-rank and Model-agnostic
Representations [1.8262547855491458]
We introduce Time-LowFER, a family of parameter-efficient and time-aware extensions of the low-rank tensor factorization model LowFER.
Noting several limitations in current approaches to represent time, we propose a cycle-aware time-encoding scheme for time features.
We implement our methods in a unified temporal knowledge graph embedding framework, focusing on time-sensitive data processing.
arXiv Detail & Related papers (2022-04-10T22:24:11Z) - Temporal Relation Extraction with a Graph-Based Deep Biaffine Attention
Model [0.0]
We propose a novel temporal information extraction model based on deep biaffine attention.
We experimentally demonstrate that our model achieves state-of-the-art performance in temporal relation extraction.
arXiv Detail & Related papers (2022-01-16T19:40:08Z) - Exploring the Limits of Few-Shot Link Prediction in Knowledge Graphs [49.6661602019124]
We study a spectrum of models derived by generalizing the current state of the art for few-shot link prediction.
We find that a simple zero-shot baseline - which ignores any relation-specific information - achieves surprisingly strong performance.
Experiments on carefully crafted synthetic datasets show that having only a few examples of a relation fundamentally limits models from using fine-grained structural information.
arXiv Detail & Related papers (2021-02-05T21:04:31Z) - One-shot Learning for Temporal Knowledge Graphs [49.41854171118697]
We propose a one-shot learning framework for link prediction in temporal knowledge graphs.
Our proposed method employs a self-attention mechanism to effectively encode temporal interactions between entities.
Our experiments show that the proposed algorithm outperforms the state of the art baselines for two well-studied benchmarks.
arXiv Detail & Related papers (2020-10-23T03:24:44Z) - Software Engineering Event Modeling using Relative Time in Temporal
Knowledge Graphs [15.22542676866305]
We present a multi-relational temporal knowledge graph based on the daily interactions between artifacts in GitHub.
We introduce two new datasets for i) interpolated time-conditioned link prediction and ii) extrapolated time-conditioned link/time prediction queries.
Our experiments on these datasets highlight the potential of adapting knowledge graphs to answer broad software engineering questions.
arXiv Detail & Related papers (2020-07-02T16:28:43Z) - Predicting Temporal Sets with Deep Neural Networks [50.53727580527024]
We propose an integrated solution based on the deep neural networks for temporal sets prediction.
A unique perspective is to learn element relationship by constructing set-level co-occurrence graph.
We design an attention-based module to adaptively learn the temporal dependency of elements and sets.
arXiv Detail & Related papers (2020-06-20T03:29:02Z)
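As referenced in the Temporal Smoothness Regularisers entry above, a common form of temporal smoothness regularisation penalises the difference between embeddings of consecutive timestamps so that temporally adjacent representations stay close. The sketch below shows one generic variant (an Lp penalty on first differences); it is an assumed, generic formulation, not necessarily any of the specific regularisers evaluated in that paper, and the function name and weight `lambda_t` are illustrative.

```python
import torch

def temporal_smoothness_penalty(time_emb: torch.Tensor, p: int = 2) -> torch.Tensor:
    """time_emb: (num_timestamps, dim) learned timestamp embeddings.
    Penalises the Lp^p norm of first differences between consecutive
    timestamps, encouraging adjacent time embeddings to stay close."""
    diffs = time_emb[1:] - time_emb[:-1]          # first differences over time
    return diffs.abs().pow(p).sum() / diffs.shape[0]

# Typical use: add to the link-prediction loss with a weight lambda_t, e.g.
#   loss = task_loss + lambda_t * temporal_smoothness_penalty(model.time_emb.weight)
```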