Learning Meta Representations of One-shot Relations for Temporal
Knowledge Graph Link Prediction
- URL: http://arxiv.org/abs/2205.10621v2
- Date: Wed, 24 May 2023 17:01:36 GMT
- Title: Learning Meta Representations of One-shot Relations for Temporal
Knowledge Graph Link Prediction
- Authors: Zifeng Ding, Bailan He, Yunpu Ma, Zhen Han, Volker Tresp
- Abstract summary: Few-shot relational learning for static knowledge graphs (KGs) has drawn greater interest in recent years.
TKGs contain rich temporal information, thus requiring temporal reasoning techniques for modeling.
This poses a greater challenge in learning few-shot relations in the temporal context.
- Score: 33.36701435886095
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Few-shot relational learning for static knowledge graphs (KGs) has drawn
greater interest in recent years, while few-shot learning for temporal
knowledge graphs (TKGs) has hardly been studied. Compared to KGs, TKGs contain
rich temporal information, thus requiring temporal reasoning techniques for
modeling. This poses a greater challenge in learning few-shot relations in the
temporal context. In this paper, we follow the previous work that focuses on
few-shot relational learning on static KGs and extend two fundamental TKG
reasoning tasks, i.e., interpolated and extrapolated link prediction, to the
one-shot setting. We propose four new large-scale benchmark datasets and
develop a TKG reasoning model for learning one-shot relations in TKGs.
Experimental results show that our model can achieve superior performance on
all datasets in both TKG link prediction tasks.
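To make the one-shot setting concrete, the sketch below (illustrative Python, not the authors' code; names such as `Quadruple`, `OneShotEpisode`, and `is_extrapolation` are assumptions) shows how an episode can be organized: each sparse relation contributes a single support quadruple, and the model must complete query quadruples of the same relation. Interpolated and extrapolated link prediction then differ only in whether the query timestamps fall inside or after the observed time range.

```python
# Illustrative sketch of a one-shot TKG link prediction episode
# (not the paper's code; all names are assumptions for illustration).
from typing import List, NamedTuple

class Quadruple(NamedTuple):
    head: str
    relation: str
    tail: str
    timestamp: int  # discrete time step at which the fact holds

class OneShotEpisode(NamedTuple):
    relation: str
    support: Quadruple         # the single known example of this relation
    queries: List[Quadruple]   # facts of the same relation to be completed

def is_extrapolation(episode: OneShotEpisode, last_train_timestamp: int) -> bool:
    # Extrapolated link prediction asks about times after the training range;
    # interpolated link prediction asks about times inside it.
    return all(q.timestamp > last_train_timestamp for q in episode.queries)

# Hypothetical example episode for a sparse relation.
ep = OneShotEpisode(
    relation="playsFor",
    support=Quadruple("PlayerA", "playsFor", "ClubX", 2021),
    queries=[Quadruple("PlayerA", "playsFor", "?", 2023)],
)
```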
Related papers
- Graph Stochastic Neural Process for Inductive Few-shot Knowledge Graph Completion [63.68647582680998]
We focus on a task called inductive few-shot knowledge graph completion (I-FKGC).
Inspired by the idea of inductive reasoning, we cast I-FKGC as an inductive reasoning problem.
We present a neural process-based hypothesis extractor that models the joint distribution of hypotheses, from which we can sample a hypothesis for predictions.
In the second module, based on the hypothesis, we propose a graph attention-based predictor to test if the triple in the query set aligns with the extracted hypothesis.
arXiv Detail & Related papers (2024-08-03T13:37:40Z)
- zrLLM: Zero-Shot Relational Learning on Temporal Knowledge Graphs with Large Language Models [33.10218179341504]
We use large language models to generate relation representations for embedding-based TKGF methods.
We show that our approach helps TKGF models to achieve much better performance in forecasting the facts with previously unseen relations.
arXiv Detail & Related papers (2023-11-15T21:25:15Z)
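The zrLLM entry above describes generating relation representations from language models. A minimal sketch of that general recipe follows; it is an assumption-laden illustration rather than zrLLM's code (the class name, layer sizes, and the learned projection are ours): a frozen text encoder turns a relation's textual description into a vector, and a small projection maps it into the relation-embedding space of an embedding-based TKG forecasting model, so unseen relations still receive usable embeddings.

```python
# Hedged sketch of LLM-derived relation representations for TKG forecasting
# (not zrLLM's implementation; names and dimensions are assumptions).
import torch
import torch.nn as nn

class TextToRelationEmbedding(nn.Module):
    def __init__(self, text_dim: int, relation_dim: int):
        super().__init__()
        # Projection from the language-model space to the TKGF embedding space.
        self.project = nn.Sequential(
            nn.Linear(text_dim, relation_dim),
            nn.ReLU(),
            nn.Linear(relation_dim, relation_dim),
        )

    def forward(self, description_embedding: torch.Tensor) -> torch.Tensor:
        # description_embedding: (batch, text_dim), a sentence embedding
        # produced by any frozen LLM/text encoder (assumed given here).
        return self.project(description_embedding)

# Usage: an unseen relation described as text receives an embedding that a
# downstream TKG forecasting scorer can consume like any trained relation.
bridge = TextToRelationEmbedding(text_dim=768, relation_dim=200)
unseen_rel = bridge(torch.randn(1, 768))  # stand-in for an LLM text embedding
```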
- Meta-Learning Based Knowledge Extrapolation for Temporal Knowledge Graph [4.103806361930888]
Temporal KGs (TKGs) extend traditional knowledge graphs by associating static triples with timestamps, forming quadruples.
We propose a Meta-Learning based Temporal Knowledge Graph Extrapolation (MTKGE) model, which is trained on link prediction tasks sampled from the existing TKGs.
We show that MTKGE consistently outperforms the existing state-of-the-art models for knowledge graph extrapolation.
arXiv Detail & Related papers (2023-02-11T09:52:26Z)
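MTKGE above is trained on link prediction tasks sampled from existing TKGs. The sketch below shows one generic way such meta-training tasks could be sampled; the helper name and the support/query sizes are assumptions, and the actual MTKGE sampling procedure may differ.

```python
# Generic sketch of sampling a link prediction meta-task from a TKG
# (an assumption-based illustration, not the MTKGE code).
import random
from collections import defaultdict

def sample_meta_task(quadruples, support_size=1, query_size=5):
    """quadruples: list of (head, relation, tail, timestamp) tuples."""
    by_relation = defaultdict(list)
    for q in quadruples:
        by_relation[q[1]].append(q)
    # Only relations with enough facts can form a task.
    eligible = [r for r, qs in by_relation.items()
                if len(qs) >= support_size + query_size]
    rel = random.choice(eligible)
    facts = random.sample(by_relation[rel], support_size + query_size)
    return {"relation": rel,
            "support": facts[:support_size],
            "query": facts[support_size:]}
```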
- A Survey of Knowledge Graph Reasoning on Graph Types: Static, Dynamic, and Multimodal [57.8455911689554]
Knowledge graph reasoning (KGR) aims to deduce new facts from existing facts based on mined logic rules underlying knowledge graphs (KGs).
It has been proven to significantly benefit the use of KGs in many AI applications, such as question answering and recommendation systems.
arXiv Detail & Related papers (2022-12-12T08:40:04Z)
- Few-Shot Inductive Learning on Temporal Knowledge Graphs using Concept-Aware Information [31.10140298420744]
We propose a few-shot out-of-graph (OOG) link prediction task for temporal knowledge graphs (TKGs).
We predict the missing entities from the links concerning unseen entities by employing a meta-learning framework.
Our model achieves superior performance on all three datasets.
arXiv Detail & Related papers (2022-11-15T14:23:07Z)
- Temporal Knowledge Graph Reasoning Based on Evolutional Representation Learning [59.004025528223025]
The key to predicting future facts is to thoroughly understand historical facts.
A TKG is actually a sequence of KGs corresponding to different timestamps.
We propose a novel Recurrent Evolution network based on Graph Convolution Network (GCN).
arXiv Detail & Related papers (2021-04-21T05:12:21Z)
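The evolutional-representation entry above treats a TKG as a sequence of KG snapshots. The simplified sketch below shows the general recurrent-evolution pattern (not the paper's model; module names and the single-layer graph convolution are assumptions): one graph convolution per snapshot, with a GRU carrying entity states from one timestamp to the next.

```python
# Hedged sketch of recurrent evolution of entity representations over
# TKG snapshots (a simplification, not the paper's architecture).
import torch
import torch.nn as nn

class RecurrentGraphEvolution(nn.Module):
    def __init__(self, num_entities: int, dim: int):
        super().__init__()
        self.entity_emb = nn.Embedding(num_entities, dim)
        self.gcn_weight = nn.Linear(dim, dim)
        self.gru = nn.GRUCell(dim, dim)

    def forward(self, adjacency_per_step):
        # adjacency_per_step: list of row-normalized (N, N) adjacency matrices,
        # one per timestamp snapshot of the TKG.
        h = self.entity_emb.weight
        for adj in adjacency_per_step:
            neighbor_msg = torch.relu(self.gcn_weight(adj @ h))  # one GCN step
            h = self.gru(neighbor_msg, h)  # evolve entity states through time
        return h  # entity representations at the final timestamp
```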
- Learning Intents behind Interactions with Knowledge Graph for Recommendation [93.08709357435991]
Knowledge graphs (KGs) play an increasingly important role in recommender systems.
Existing GNN-based models fail to identify user-item relations at the fine-grained level of intents.
We propose a new model, Knowledge Graph-based Intent Network (KGIN).
arXiv Detail & Related papers (2021-02-14T03:21:36Z)
- Exploring the Limits of Few-Shot Link Prediction in Knowledge Graphs [49.6661602019124]
We study a spectrum of models derived by generalizing the current state of the art for few-shot link prediction.
We find that a simple zero-shot baseline, which ignores any relation-specific information, achieves surprisingly strong performance.
Experiments on carefully crafted synthetic datasets show that having only a few examples of a relation fundamentally limits models from using fine-grained structural information.
arXiv Detail & Related papers (2021-02-05T21:04:31Z)
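For intuition, the zero-shot baseline mentioned above can be pictured as ranking candidate tails purely from entity embeddings, with the relation ignored entirely. The tiny sketch below is a guess at the spirit of such a baseline, not the paper's exact formulation.

```python
# Illustration of a relation-agnostic ("zero-shot") scoring baseline
# (an assumption about the general idea, not the paper's definition).
import torch

def zero_shot_scores(head_emb: torch.Tensor,
                     candidate_embs: torch.Tensor) -> torch.Tensor:
    # head_emb: (dim,), candidate_embs: (num_candidates, dim)
    # Higher score = more plausible tail, regardless of which relation is asked.
    return candidate_embs @ head_emb

ranking = torch.argsort(
    zero_shot_scores(torch.randn(64), torch.randn(1000, 64)),
    descending=True)
```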
- T-GAP: Learning to Walk across Time for Temporal Knowledge Graph Completion [13.209193437124881]
Temporal knowledge graphs (TKGs) inherently reflect the transient nature of real-world knowledge, as opposed to static knowledge graphs.
We propose T-GAP, a novel model for TKG completion that maximally utilizes both temporal information and graph structure in its encoder and decoder.
Our experiments demonstrate that T-GAP achieves superior performance against state-of-the-art baselines, and competently generalizes to queries with unseen timestamps.
arXiv Detail & Related papers (2020-12-19T04:45:32Z)
- One-shot Learning for Temporal Knowledge Graphs [49.41854171118697]
We propose a one-shot learning framework for link prediction in temporal knowledge graphs.
Our proposed method employs a self-attention mechanism to effectively encode temporal interactions between entities.
Our experiments show that the proposed algorithm outperforms state-of-the-art baselines on two well-studied benchmarks.
arXiv Detail & Related papers (2020-10-23T03:24:44Z)
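The one-shot TKG entry above encodes temporal interactions between entities with self-attention. A generic sketch of that pattern follows (module and parameter names are assumptions, not the authors' architecture): each neighboring fact is embedded together with a simple time encoding, and self-attention pools these interactions into a context vector.

```python
# Generic sketch of self-attention over an entity's temporal neighborhood
# (an assumption-based illustration, not the paper's model).
import torch
import torch.nn as nn

class TemporalNeighborhoodEncoder(nn.Module):
    def __init__(self, dim: int, num_heads: int = 4):
        super().__init__()
        self.time_proj = nn.Linear(1, dim)  # crude learned time encoding
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)

    def forward(self, neighbor_embs: torch.Tensor,
                timestamps: torch.Tensor) -> torch.Tensor:
        # neighbor_embs: (batch, num_neighbors, dim) entity+relation features
        # timestamps:    (batch, num_neighbors, 1) times of those interactions
        x = neighbor_embs + self.time_proj(timestamps.float())
        out, _ = self.attn(x, x, x)   # self-attention over temporal neighbors
        return out.mean(dim=1)        # pooled temporal context vector
```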
- TeRo: A Time-aware Knowledge Graph Embedding via Temporal Rotation [12.138550487430807]
We present a new approach to TKG embedding, TeRo, which defines the temporal evolution of entity embeddings as a rotation in the complex vector space.
We show our proposed model overcomes the limitations of the existing KG embedding models and TKG embedding models.
Experimental results on four different TKGs show that TeRo significantly outperforms existing state-of-the-art models for link prediction.
arXiv Detail & Related papers (2020-10-02T14:35:27Z)
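TeRo's temporal rotation can be sketched roughly as follows; this is a schematic reading of the idea, and the exact scoring function in the paper (for instance its use of conjugation) may differ. Entities live in complex space, each timestamp owns a unit-modulus rotation, and a quadruple is scored after rotating head and tail to that timestamp.

```python
# Schematic sketch of temporal rotation for TKG embedding
# (our reading of the idea; details may differ from TeRo's formulation).
import torch

def rotate(entity: torch.Tensor, time_phase: torch.Tensor) -> torch.Tensor:
    # entity: complex tensor (dim,); time_phase: real angles (dim,)
    rotation = torch.exp(1j * time_phase)   # unit-modulus complex rotation
    return entity * rotation                # element-wise rotation in C^dim

def score(head, rel, tail, time_phase):
    # Lower distance = more plausible (head, rel, tail, time) quadruple.
    h_t = rotate(head, time_phase)
    t_t = rotate(tail, time_phase)
    return -torch.linalg.vector_norm(h_t + rel - t_t)

dim = 100
h, r, t = (torch.randn(dim, dtype=torch.cfloat) for _ in range(3))
phase = torch.randn(dim)
print(score(h, r, t, phase))
```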