Adaptive Path-Memory Network for Temporal Knowledge Graph Reasoning
- URL: http://arxiv.org/abs/2304.12604v1
- Date: Tue, 25 Apr 2023 06:33:08 GMT
- Title: Adaptive Path-Memory Network for Temporal Knowledge Graph Reasoning
- Authors: Hao Dong, Zhiyuan Ning, Pengyang Wang, Ziyue Qiao, Pengfei Wang,
Yuanchun Zhou, Yanjie Fu
- Abstract summary: Temporal knowledge graph (TKG) reasoning aims to predict future missing facts based on historical information.
We propose a novel architecture that models TKGs with relation features, namely the aDAptivE path-MemOry Network (DaeMon).
DaeMon adaptively models the temporal path information between the query subject and each object candidate across historical time.
- Score: 25.84105067110878
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Temporal knowledge graph (TKG) reasoning aims to predict future
missing facts based on historical information and has gained increasing
research interest recently. Many works have been proposed to model the
historical structural and temporal characteristics for the reasoning task.
Most existing works model the graph structure mainly depending on entity
representation. However, the magnitude of TKG entities in real-world scenarios
is considerable, and an increasing number of new entities arise as time goes
on. Therefore, we propose a novel architecture that models TKGs with relation
features, namely the aDAptivE path-MemOry Network (DaeMon), which adaptively
models the temporal path information between the query subject and each object
candidate across historical time. It models the historical information without
depending on entity representation. Specifically, DaeMon uses path memory to
record the temporal path information produced by a path aggregation unit along
the timeline, with a memory passing strategy between adjacent timestamps.
Extensive experiments on four real-world TKG datasets demonstrate that our
model obtains substantial performance improvements and outperforms the state
of the art by up to 4.8% absolute in MRR.
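As a rough, hypothetical illustration of this path-memory idea (module names, shapes, and the readout below are assumptions, not the authors' code), the sketch seeds a memory at the query subject, aggregates it along each historical snapshot's edges using relation features only, and passes it between adjacent timestamps with a gated update:

```python
# Minimal sketch of a DaeMon-style path memory (illustrative, not the paper's code).
import torch
import torch.nn as nn

class PathMemoryCell(nn.Module):
    def __init__(self, num_relations, dim):
        super().__init__()
        self.rel_emb = nn.Embedding(num_relations, dim)  # relation features only, no entity embeddings
        self.msg = nn.Linear(2 * dim, dim)               # combine incoming memory with the edge relation
        self.gate = nn.GRUCell(dim, dim)                 # memory passing between adjacent timestamps
        self.readout = nn.Linear(dim, 1)                 # illustrative scoring head

    def forward(self, memory, edges):
        # memory: [num_nodes, dim] path memory w.r.t. the query subject
        # edges:  (src, rel, dst) index tensors for one historical snapshot
        src, rel, dst = edges
        msg = torch.relu(self.msg(torch.cat([memory[src], self.rel_emb(rel)], dim=-1)))
        agg = torch.zeros_like(memory).index_add_(0, dst, msg)  # sum messages per target node
        return self.gate(agg, memory)                           # gated pass to the next timestamp

def score_objects(cell, snapshots, num_nodes, subject, dim):
    memory = torch.zeros(num_nodes, dim)
    memory[subject] = 1.0                    # seed the memory at the query subject only
    for edges in snapshots:                  # walk the history timeline, one snapshot at a time
        memory = cell(memory, edges)
    return cell.readout(memory).squeeze(-1)  # one score per object candidate
```

For a query (subject, relation, ?), such a loop would run over the historical snapshots and rank object candidates by the resulting scores; the actual DaeMon aggregation and scoring functions differ in detail.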
Related papers
- Learning Granularity Representation for Temporal Knowledge Graph Completion [2.689675451882683]
Temporal Knowledge Graphs (TKGs) incorporate temporal information to reflect the dynamic structural knowledge and evolutionary patterns of real-world facts.
This paper proposes Learning Granularity Representation (termed $\mathsf{LGRe}$) for TKG completion.
It comprises two main components: Granularity Representation Learning (GRL) and Adaptive Granularity Balancing (AGB).
arXiv Detail & Related papers (2024-08-27T08:19:34Z)
- Temporal Inductive Path Neural Network for Temporal Knowledge Graph Reasoning [16.984588879938947]
Reasoning on Temporal Knowledge Graph (TKG) aims to predict future facts based on historical occurrences.
Most existing approaches model TKGs by relying on entity modeling, as nodes in the graph play a crucial role in knowledge representation.
We propose the Temporal Inductive Path Neural Network (TiPNN), which models historical information from an entity-independent perspective.
arXiv Detail & Related papers (2023-09-06T17:37:40Z)
- Exploring the Limits of Historical Information for Temporal Knowledge Graph Extrapolation [59.417443739208146]
We propose a new event forecasting model based on a novel training framework of historical contrastive learning.
CENET learns both historical and non-historical dependencies to distinguish the most likely candidate entities.
We evaluate our proposed model on five benchmark graphs.
arXiv Detail & Related papers (2023-08-29T03:26:38Z)
- Temporal Graph Benchmark for Machine Learning on Temporal Graphs [54.52243310226456]
Temporal Graph Benchmark (TGB) is a collection of challenging and diverse benchmark datasets.
We benchmark each dataset and find that the performance of common models can vary drastically across datasets.
TGB provides an automated machine learning pipeline for reproducible and accessible temporal graph research.
arXiv Detail & Related papers (2023-07-03T13:58:20Z)
- Search to Pass Messages for Temporal Knowledge Graph Completion [97.40256786473516]
We propose to use neural architecture search (NAS) to design data-specific message-passing architectures for temporal knowledge graph (TKG) completion.
In particular, we develop a generalized framework to explore topological and temporal information in TKGs.
We adopt a search algorithm that trains a supernet structure by sampling a single path per step, enabling efficient search at lower cost (a rough sketch follows this entry).
arXiv Detail & Related papers (2022-10-30T04:05:06Z)
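A generic, hedged sketch of single-path supernet sampling (the candidate operations below are simplified placeholders, not the paper's message-passing search space): each training step samples one candidate per layer and updates only the weights along that path.

```python
# Hypothetical sketch of single-path supernet sampling (illustrative only).
import random
import torch
import torch.nn as nn

class SuperLayer(nn.Module):
    """One supernet layer holding several candidate operations."""
    def __init__(self, dim):
        super().__init__()
        self.candidates = nn.ModuleList([
            nn.Linear(dim, dim),                            # plain linear transform
            nn.Sequential(nn.Linear(dim, dim), nn.ReLU()),  # non-linear variant
            nn.Sequential(nn.Linear(dim, dim), nn.Tanh()),  # alternative activation
        ])

    def forward(self, x, choice):
        return self.candidates[choice](x)                   # run only the sampled op

class SuperNet(nn.Module):
    def __init__(self, dim, num_layers=3):
        super().__init__()
        self.layers = nn.ModuleList([SuperLayer(dim) for _ in range(num_layers)])

    def sample_path(self):
        # one randomly chosen candidate per layer = one "single path"
        return [random.randrange(len(layer.candidates)) for layer in self.layers]

    def forward(self, x, path):
        for layer, choice in zip(self.layers, path):
            x = layer(x, choice)
        return x

# Each step trains only the sampled path, keeping search cost close to
# training a single architecture.
net = SuperNet(dim=32)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
x, target = torch.randn(8, 32), torch.randn(8, 32)
path = net.sample_path()
loss = nn.functional.mse_loss(net(x, path), target)
opt.zero_grad(); loss.backward(); opt.step()
```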
- HiSMatch: Historical Structure Matching based Temporal Knowledge Graph Reasoning [59.38797474903334]
This paper proposes the Historical Structure Matching (HiSMatch) model.
It applies two structure encoders to capture the semantic information contained in the historical structures of the query and candidate entities.
Experiments on six benchmark datasets demonstrate the significant improvement of the proposed HiSMatch model, with up to 5.6% performance improvement in MRR, compared to the state-of-the-art baselines.
arXiv Detail & Related papers (2022-10-18T09:39:26Z)
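A loose sketch of the two-encoder matching idea in the HiSMatch entry above (the encoders and dot-product scoring are generic assumptions, not the paper's architecture): encode the query's historical structure and each candidate's historical structure separately, then score candidates by how well the two encodings match.

```python
# Hypothetical sketch of matching two historical-structure encodings (not the paper's model).
import torch
import torch.nn as nn

class HistoryEncoder(nn.Module):
    """Encodes a sequence of per-snapshot features into a single vector."""
    def __init__(self, dim):
        super().__init__()
        self.rnn = nn.GRU(dim, dim, batch_first=True)

    def forward(self, snapshot_feats):
        # snapshot_feats: [batch, num_timestamps, dim] pooled features per snapshot
        _, h = self.rnn(snapshot_feats)
        return h.squeeze(0)                                # [batch, dim]

class StructureMatcher(nn.Module):
    """Two encoders: one for the query history, one for candidate histories."""
    def __init__(self, dim):
        super().__init__()
        self.query_enc = HistoryEncoder(dim)
        self.cand_enc = HistoryEncoder(dim)

    def forward(self, query_hist, cand_hist):
        q = self.query_enc(query_hist)                     # [1, dim]
        c = self.cand_enc(cand_hist)                       # [num_candidates, dim]
        return c @ q.squeeze(0)                            # matching score per candidate
```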
- Temporal Relevance Analysis for Video Action Models [70.39411261685963]
We first propose a new approach to quantify the temporal relationships between frames captured by CNN-based action models.
We then conduct comprehensive experiments and in-depth analysis to provide a better understanding of how temporal modeling is affected.
arXiv Detail & Related papers (2022-04-25T19:06:48Z)
- Temporal Knowledge Graph Reasoning Based on Evolutional Representation Learning [59.004025528223025]
The key to predicting future facts is a thorough understanding of the historical facts.
A TKG is actually a sequence of KGs corresponding to different timestamps.
We propose a novel Recurrent Evolution network based on Graph Convolution Network (GCN); a rough sketch follows this entry.
arXiv Detail & Related papers (2021-04-21T05:12:21Z)
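Reading the entry above as "one graph convolution per snapshot, recurrently evolved across timestamps", a minimal illustrative sketch (module names and details are assumptions, not the authors' implementation) might look like this:

```python
# Illustrative sketch of recurrent evolutional representation learning over a
# TKG viewed as a sequence of KG snapshots (not the paper's actual code).
import torch
import torch.nn as nn

class SnapshotGCN(nn.Module):
    """One relation-aware graph convolution over a single KG snapshot."""
    def __init__(self, num_relations, dim):
        super().__init__()
        self.rel_emb = nn.Embedding(num_relations, dim)
        self.lin = nn.Linear(2 * dim, dim)

    def forward(self, ent, edges):
        src, rel, dst = edges
        msg = self.lin(torch.cat([ent[src], self.rel_emb(rel)], dim=-1))
        out = torch.zeros_like(ent).index_add_(0, dst, msg)
        return torch.relu(out + ent)          # residual keeps isolated entities informed

class RecurrentEvolution(nn.Module):
    """Evolves entity embeddings across timestamps with a GRU cell."""
    def __init__(self, num_entities, num_relations, dim):
        super().__init__()
        self.ent_emb = nn.Parameter(torch.randn(num_entities, dim) * 0.1)
        self.gcn = SnapshotGCN(num_relations, dim)
        self.gru = nn.GRUCell(dim, dim)

    def forward(self, snapshots):
        ent = self.ent_emb
        for edges in snapshots:               # history as a sequence of KG snapshots
            structural = self.gcn(ent, edges) # structure within one timestamp
            ent = self.gru(structural, ent)   # temporal evolution across timestamps
        return ent                            # embeddings used to score future facts
```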