TeRo: A Time-aware Knowledge Graph Embedding via Temporal Rotation
- URL: http://arxiv.org/abs/2010.01029v2
- Date: Sat, 24 Oct 2020 22:42:26 GMT
- Title: TeRo: A Time-aware Knowledge Graph Embedding via Temporal Rotation
- Authors: Chengjin Xu, Mojtaba Nayyeri, Fouad Alkhoury, Hamed Shariat Yazdi,
Jens Lehmann
- Abstract summary: We present a new approach to TKG embedding, TeRo, which defines the temporal evolution of entity embeddings as a rotation in the complex vector space.
We show that our proposed model overcomes the limitations of existing KG embedding models and TKG embedding models.
Experimental results on four different TKGs show that TeRo significantly outperforms existing state-of-the-art models for link prediction.
- Score: 12.138550487430807
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In the last few years, there has been a surge of interest in learning
representations of entities and relations in knowledge graphs (KGs). However, the
recent availability of temporal knowledge graphs (TKGs) that contain time
information for each fact has created the need for reasoning over time in such TKGs.
In this regard, we present a new approach to TKG embedding, TeRo, which defines
the temporal evolution of an entity embedding as a rotation from the initial time
to the current time in the complex vector space. Specifically, for facts involving
time intervals, each relation is represented as a pair of dual complex
embeddings to handle the beginning and the end of the relation, respectively. We
show that our proposed model overcomes the limitations of existing KG embedding
models and TKG embedding models and is able to learn and infer various relation
patterns over time. Experimental results on four different TKGs show that TeRo
significantly outperforms existing state-of-the-art models for link prediction.
In addition, we analyze the effect of time granularity on link prediction over
TKGs, which, as far as we know, has not been investigated in previous literature.
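As a rough, non-authoritative illustration of the temporal rotation idea described in the abstract, the sketch below scores a quadruple (s, r, o, t): both entity embeddings are rotated by a unit-modulus complex time embedding, and the score is the distance between the rotated subject translated by the relation and the complex conjugate of the rotated object. The function name, embedding dimension, and random initialization are placeholders rather than the authors' implementation; for interval facts the relation would be duplicated into begin/end embeddings as the abstract describes.

```python
import numpy as np

def tero_style_score(e_s, e_o, r, w_t):
    """Lower score = more plausible quadruple (s, r, o, t).

    e_s, e_o : complex entity embeddings
    r        : complex relation embedding (for interval facts, TeRo uses
               a pair of such embeddings for the begin and end time points)
    w_t      : unit-modulus complex time embedding acting as a rotation
    """
    e_s_t = e_s * w_t                      # rotate the subject to timestamp t
    e_o_t = e_o * w_t                      # rotate the object to timestamp t
    # translate by the relation and compare with the conjugated object
    return np.linalg.norm(e_s_t + r - np.conj(e_o_t), ord=1)

# Toy usage with a made-up dimension and random embeddings.
d = 4
rng = np.random.default_rng(0)
e_s = rng.normal(size=d) + 1j * rng.normal(size=d)
e_o = rng.normal(size=d) + 1j * rng.normal(size=d)
r   = rng.normal(size=d) + 1j * rng.normal(size=d)
w_t = np.exp(1j * rng.uniform(0.0, 2.0 * np.pi, size=d))  # |w_t| = 1, a pure rotation
print(tero_style_score(e_s, e_o, r, w_t))
```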
Related papers
- Learning Granularity Representation for Temporal Knowledge Graph Completion [2.689675451882683]
Temporal Knowledge Graphs (TKGs) incorporate temporal information to reflect the dynamic structural knowledge and evolutionary patterns of real-world facts.
This paper proposes Learning Granularity Representation (termed LGRe) for TKG completion.
It comprises two main components: Granularity Learning (GRL) and Adaptive Granularity Balancing (AGB).
arXiv Detail & Related papers (2024-08-27T08:19:34Z) - Selective Temporal Knowledge Graph Reasoning [70.11788354442218]
Temporal Knowledge Graph (TKG) aims to predict future facts based on given historical ones.
Existing TKG reasoning models are unable to abstain from predictions about which they are uncertain.
We propose an abstention mechanism for TKG reasoning, which helps existing models make selective, instead of indiscriminate, predictions (a generic sketch of selective prediction appears after this list).
arXiv Detail & Related papers (2024-04-02T06:56:21Z) - Learning Multi-graph Structure for Temporal Knowledge Graph Reasoning [3.3571415078869955]
This paper proposes an innovative reasoning approach that focuses on Learning Multi-graph Structure (LMS)
LMS incorporates an adaptive gate for merging entity representations both along and across timestamps effectively.
It also integrates timestamp semantics into graph attention calculations and time-aware decoders.
arXiv Detail & Related papers (2023-12-04T08:23:09Z) - A Survey on Temporal Knowledge Graph Completion: Taxonomy, Progress, and
Prospects [73.44022660932087]
Temporal characteristics are prominently evident in a substantial volume of knowledge.
The continuous emergence of new knowledge, the weakness of algorithms for extracting structured information from unstructured data, and the lack of information in source datasets are cited as causes of incompleteness in knowledge graphs.
The task of Temporal Knowledge Graph Completion (TKGC) has attracted increasing attention, aiming to predict missing items based on the available information.
arXiv Detail & Related papers (2023-08-04T16:49:54Z) - Meta-Learning Based Knowledge Extrapolation for Temporal Knowledge Graph [4.103806361930888]
Temporal KGs (TKGs) extend traditional Knowledge Graphs by associating static triples with timestamps, forming quadruples.
We propose a Meta-Learning based Temporal Knowledge Graph Extrapolation (MTKGE) model, which is trained on link prediction tasks sampled from the existing TKGs.
We show that MTKGE consistently outperforms the existing state-of-the-art models for knowledge graph extrapolation.
arXiv Detail & Related papers (2023-02-11T09:52:26Z) - A Survey of Knowledge Graph Reasoning on Graph Types: Static, Dynamic,
and Multimodal [57.8455911689554]
Knowledge graph reasoning (KGR) aims to deduce new facts from existing facts based on mined logic rules underlying knowledge graphs (KGs).
It has been proven to significantly benefit the use of KGs in many AI applications, such as question answering and recommendation systems.
arXiv Detail & Related papers (2022-12-12T08:40:04Z) - Learning Meta Representations of One-shot Relations for Temporal
Knowledge Graph Link Prediction [33.36701435886095]
Few-shot relational learning for static knowledge graphs (KGs) has drawn greater interest in recent years.
TKGs contain rich temporal information, thus requiring temporal reasoning techniques for modeling.
This poses a greater challenge in learning few-shot relations in the temporal context.
arXiv Detail & Related papers (2022-05-21T15:17:52Z) - Temporal Knowledge Graph Reasoning Based on Evolutional Representation
Learning [59.004025528223025]
The key to predicting future facts is to thoroughly understand historical facts.
A TKG is actually a sequence of KGs corresponding to different timestamps.
We propose a novel Recurrent Evolution network based on Graph Convolution Network (GCN).
arXiv Detail & Related papers (2021-04-21T05:12:21Z) - Exploring the Limits of Few-Shot Link Prediction in Knowledge Graphs [49.6661602019124]
We study a spectrum of models derived by generalizing the current state of the art for few-shot link prediction.
We find that a simple zero-shot baseline - which ignores any relation-specific information - achieves surprisingly strong performance.
Experiments on carefully crafted synthetic datasets show that having only a few examples of a relation fundamentally limits models from using fine-grained structural information.
arXiv Detail & Related papers (2021-02-05T21:04:31Z) - T-GAP: Learning to Walk across Time for Temporal Knowledge Graph
Completion [13.209193437124881]
Temporal knowledge graphs (TKGs) inherently reflect the transient nature of real-world knowledge, as opposed to static knowledge graphs.
We propose T-GAP, a novel model for TKG completion that maximally utilizes both temporal information and graph structure in its encoder and decoder.
Our experiments demonstrate that T-GAP achieves superior performance against state-of-the-art baselines, and competently generalizes to queries with unseen timestamps.
arXiv Detail & Related papers (2020-12-19T04:45:32Z) - TeMP: Temporal Message Passing for Temporal Knowledge Graph Completion [45.588053447288566]
Inferring missing facts in temporal knowledge graphs (TKGs) is a fundamental and challenging task.
We propose the Temporal Message Passing (TeMP) framework to address these challenges by combining graph neural networks, temporal dynamics models, data imputation and frequency-based gating techniques.
arXiv Detail & Related papers (2020-10-07T17:11:53Z)
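To make the abstention idea from the "Selective Temporal Knowledge Graph Reasoning" entry above concrete, here is a generic margin-based selective-prediction sketch. It is not the mechanism proposed in that paper; the margin criterion, threshold value, and function name are assumptions chosen only for illustration. The model answers only when its top candidate beats the runner-up by a sufficient margin and abstains otherwise.

```python
from typing import Dict, Optional

def selective_predict(scores: Dict[str, float], threshold: float) -> Optional[str]:
    """Generic selective prediction: return the top-scoring candidate
    only if the model is confident enough, otherwise abstain (None).

    scores    : candidate entity -> plausibility score (higher = better)
    threshold : minimum margin between the top two candidates required
                to make a prediction (hypothetical criterion)
    """
    ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    if len(ranked) < 2:
        return ranked[0][0] if ranked else None
    (best, s1), (_, s2) = ranked[0], ranked[1]
    # Abstain when the margin between the best and the runner-up is small.
    return best if (s1 - s2) >= threshold else None

# Toy usage with made-up candidate scores.
print(selective_predict({"Berlin": 0.91, "Paris": 0.40, "Bonn": 0.38}, threshold=0.2))  # "Berlin"
print(selective_predict({"Berlin": 0.52, "Paris": 0.50}, threshold=0.2))                # None (abstain)
```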