T-GAP: Learning to Walk across Time for Temporal Knowledge Graph Completion
- URL: http://arxiv.org/abs/2012.10595v1
- Date: Sat, 19 Dec 2020 04:45:32 GMT
- Title: T-GAP: Learning to Walk across Time for Temporal Knowledge Graph Completion
- Authors: Jaehun Jung, Jinhong Jung, U Kang
- Abstract summary: Temporal knowledge graphs (TKGs) inherently reflect the transient nature of real-world knowledge, as opposed to static knowledge graphs.
We propose T-GAP, a novel model for TKG completion that maximally utilizes both temporal information and graph structure in its encoder and decoder.
Our experiments demonstrate that T-GAP achieves superior performance against state-of-the-art baselines, and competently generalizes to queries with unseen timestamps.
- Score: 13.209193437124881
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Temporal knowledge graphs (TKGs) inherently reflect the transient nature of
real-world knowledge, as opposed to static knowledge graphs. Naturally,
automatic TKG completion has drawn much research interest for more realistic
modeling of relational reasoning. However, most of the existing models for TKG
completion extend static KG embeddings that do not fully exploit TKG structure,
thus lacking in 1) accounting for temporally relevant events already residing
in the local neighborhood of a query, and 2) path-based inference that
facilitates multi-hop reasoning and better interpretability. In this paper, we
propose T-GAP, a novel model for TKG completion that maximally utilizes both
temporal information and graph structure in its encoder and decoder. T-GAP
encodes the query-specific substructure of the TKG by focusing on the temporal
displacement between each event and the query timestamp, and performs
path-based inference by propagating attention through the graph. Our empirical
experiments demonstrate that T-GAP not only achieves superior performance
against state-of-the-art baselines, but also competently generalizes to queries
with unseen timestamps. Through extensive qualitative analyses, we also show
that T-GAP enjoys transparent interpretability and follows human intuition in
its reasoning process.
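To make the encoder's core idea more concrete, below is a minimal, hypothetical sketch of attention weighted by temporal displacement: each edge in the query's neighborhood is scored from its relation together with a learned embedding of the signed gap between the edge timestamp and the query timestamp, and attention already assigned to the source nodes is redistributed along the highest-scoring edges. The class name `TemporalDisplacementAttention`, the displacement bucketing, and the toy dimensions are illustrative assumptions, not the paper's actual architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TemporalDisplacementAttention(nn.Module):
    """Toy one-hop attention step: edges are scored by relation + signed
    temporal displacement to the query timestamp, and attention mass on the
    source nodes is propagated along the best-scoring edges.
    (Hypothetical sketch; not the published T-GAP layer.)"""

    def __init__(self, num_relations: int, dim: int, max_disp: int = 365):
        super().__init__()
        self.max_disp = max_disp
        self.rel_emb = nn.Embedding(num_relations, dim)
        # one bucket per signed displacement in [-max_disp, max_disp]
        self.disp_emb = nn.Embedding(2 * max_disp + 1, dim)
        self.score = nn.Linear(dim, 1)

    def forward(self, rel_ids, edge_times, src_attention, query_time):
        # signed temporal displacement of each edge w.r.t. the query
        disp = (query_time - edge_times).clamp(-self.max_disp, self.max_disp)
        edge_feat = self.rel_emb(rel_ids) + self.disp_emb(disp + self.max_disp)
        logits = self.score(torch.tanh(edge_feat)).squeeze(-1)
        # edges compete via softmax, weighted by the attention already
        # sitting on their source nodes
        return F.softmax(logits, dim=0) * src_attention


# toy usage: four candidate edges around the query entity, query day = 10
layer = TemporalDisplacementAttention(num_relations=10, dim=16)
rel_ids = torch.tensor([0, 3, 3, 7])
edge_times = torch.tensor([2, 5, 9, 11])            # event day indices
src_attention = torch.tensor([1.0, 1.0, 0.5, 0.5])  # attention on source nodes
print(layer(rel_ids, edge_times, src_attention, query_time=torch.tensor(10)))
```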
Related papers
- Learning Granularity Representation for Temporal Knowledge Graph Completion [2.689675451882683]
Temporal Knowledge Graphs (TKGs) incorporate temporal information to reflect the dynamic structural knowledge and evolutionary patterns of real-world facts.
This paper proposes Learning Granularity Representation (termed LGRe) for TKG completion.
It comprises two main components: Granularity Representation Learning (GRL) and Adaptive Granularity Balancing (AGB).
arXiv Detail & Related papers (2024-08-27T08:19:34Z)
- EasyDGL: Encode, Train and Interpret for Continuous-time Dynamic Graph Learning [92.71579608528907]
This paper aims to design an easy-to-use pipeline (termed EasyDGL) composed of three key modules with both strong fitting ability and interpretability.
EasyDGL can effectively quantify the predictive power of frequency content that a model learns from the evolving graph data.
arXiv Detail & Related papers (2023-03-22T06:35:08Z)
- Meta-Learning Based Knowledge Extrapolation for Temporal Knowledge Graph [4.103806361930888]
Temporal KGs (TKGs) extend traditional Knowledge Graphs by associating static triples with timestamps forming quadruples.
We propose a Meta-Learning based Temporal Knowledge Graph Extrapolation (MTKGE) model, which is trained on link prediction tasks sampled from the existing TKGs.
We show that MTKGE consistently outperforms the existing state-of-the-art models for knowledge graph extrapolation.
arXiv Detail & Related papers (2023-02-11T09:52:26Z)
- Search to Pass Messages for Temporal Knowledge Graph Completion [97.40256786473516]
We propose to use neural architecture search (NAS) to design a data-specific message passing architecture for temporal knowledge graph (TKG) completion.
In particular, we develop a generalized framework to explore topological and temporal information in TKGs.
We adopt a search algorithm that trains a supernet structure by sampling a single path, enabling efficient search at lower cost.
arXiv Detail & Related papers (2022-10-30T04:05:06Z)
- DyTed: Disentangled Representation Learning for Discrete-time Dynamic Graph [59.583555454424]
We propose a novel disenTangled representation learning framework for discrete-time Dynamic graphs, namely DyTed.
We specially design a temporal-clips contrastive learning task together with a structure contrastive learning task to effectively identify the time-invariant and time-varying representations, respectively.
arXiv Detail & Related papers (2022-10-19T14:34:12Z)
- ExpressivE: A Spatio-Functional Embedding For Knowledge Graph Completion [78.8942067357231]
ExpressivE embeds pairs of entities as points and relations as hyper-parallelograms in the virtual triple space.
We show that ExpressivE is competitive with state-of-the-art KGEs and even significantly outperforms them on WN18RR.
arXiv Detail & Related papers (2022-06-08T23:34:39Z)
- Learning Meta Representations of One-shot Relations for Temporal Knowledge Graph Link Prediction [33.36701435886095]
Few-shot relational learning for static knowledge graphs (KGs) has drawn greater interest in recent years.
TKGs contain rich temporal information, thus requiring temporal reasoning techniques for modeling.
This poses a greater challenge in learning few-shot relations in the temporal context.
arXiv Detail & Related papers (2022-05-21T15:17:52Z)
- Temporal Knowledge Graph Reasoning Based on Evolutional Representation Learning [59.004025528223025]
The key to predicting future facts is to thoroughly understand the historical facts.
A TKG is actually a sequence of KGs corresponding to different timestamps.
We propose a novel Recurrent Evolution network based on a Graph Convolution Network (GCN).
arXiv Detail & Related papers (2021-04-21T05:12:21Z)
- TeMP: Temporal Message Passing for Temporal Knowledge Graph Completion [45.588053447288566]
Inferring missing facts in temporal knowledge graphs (TKGs) is a fundamental and challenging task.
We propose the Temporal Message Passing (TeMP) framework to address these challenges by combining graph neural networks, temporal dynamics models, data imputation and frequency-based gating techniques.
arXiv Detail & Related papers (2020-10-07T17:11:53Z)
- TeRo: A Time-aware Knowledge Graph Embedding via Temporal Rotation [12.138550487430807]
We present a new approach to TKG embedding, TeRo, which defines the temporal evolution of entity embeddings as a rotation in the complex vector space.
We show that our proposed model overcomes the limitations of the existing KG embedding models and TKG embedding models.
Experimental results on four different TKGs show that TeRo significantly outperforms existing state-of-the-art models for link prediction.
arXiv Detail & Related papers (2020-10-02T14:35:27Z)
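To make the "temporal rotation" idea in the entry above more concrete, here is a rough, assumption-laden sketch in which entity embeddings live in complex space, each timestamp contributes a learned phase that rotates them, and a quadruple is scored translationally between the rotated head and tail. The class name `TimeRotationScorer`, the embedding sizes, and the L1 scoring are illustrative choices rather than the published TeRo formulation.

```python
import torch
import torch.nn as nn

class TimeRotationScorer(nn.Module):
    """Toy time-as-rotation scorer for quadruples (h, r, t, time): entity
    embeddings are complex vectors, each timestamp rotates them by a learned
    phase, and the relation acts as a translation between the rotated head
    and tail. (Illustrative sketch; not the published TeRo model.)"""

    def __init__(self, num_entities, num_relations, num_times, dim):
        super().__init__()
        self.ent = nn.Embedding(num_entities, 2 * dim)   # real and imaginary halves
        self.rel = nn.Embedding(num_relations, 2 * dim)
        self.phase = nn.Embedding(num_times, dim)        # rotation angle per timestamp

    def rotate(self, e, tau):
        re, im = e.chunk(2, dim=-1)
        theta = self.phase(tau)
        cos, sin = torch.cos(theta), torch.sin(theta)
        # complex multiplication by the unit-modulus phase e^{i*theta}
        return torch.cat([re * cos - im * sin, re * sin + im * cos], dim=-1)

    def score(self, h, r, t, tau):
        h_tau = self.rotate(self.ent(h), tau)
        t_tau = self.rotate(self.ent(t), tau)
        # higher (less negative) score means a more plausible quadruple
        return -(h_tau + self.rel(r) - t_tau).norm(p=1, dim=-1)


# toy usage: score a single quadruple (entity 0, relation 1, entity 2, time 5)
model = TimeRotationScorer(num_entities=50, num_relations=10, num_times=30, dim=8)
print(model.score(torch.tensor([0]), torch.tensor([1]),
                  torch.tensor([2]), torch.tensor([5])))
```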