EvoKG: Jointly Modeling Event Time and Network Structure for Reasoning
over Temporal Knowledge Graphs
- URL: http://arxiv.org/abs/2202.07648v2
- Date: Wed, 16 Feb 2022 06:24:34 GMT
- Title: EvoKG: Jointly Modeling Event Time and Network Structure for Reasoning
over Temporal Knowledge Graphs
- Authors: Namyong Park, Fuchen Liu, Purvanshi Mehta, Dana Cristofor, Christos
Faloutsos, Yuxiao Dong
- Abstract summary: Reasoning over temporal knowledge graphs (TKGs) is crucial for many applications to provide intelligent services.
We present a problem formulation that unifies the two major problems that need to be addressed for effective reasoning over TKGs, namely modeling the event time and the evolving network structure.
Our proposed method EvoKG jointly models both tasks in an effective framework, which captures the ever-changing structural and temporal dynamics in TKGs.
- Score: 25.408246523764085
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: How can we perform knowledge reasoning over temporal knowledge graphs (TKGs)?
TKGs represent facts about entities and their relations, where each fact is
associated with a timestamp. Reasoning over TKGs, i.e., inferring new facts
from time-evolving KGs, is crucial for many applications to provide intelligent
services. However, despite the prevalence of real-world data that can be
represented as TKGs, most methods focus on reasoning over static knowledge
graphs, or cannot predict future events. In this paper, we present a problem
formulation that unifies the two major problems that need to be addressed for
an effective reasoning over TKGs, namely, modeling the event time and the
evolving network structure. Our proposed method EvoKG jointly models both tasks
in an effective framework, which captures the ever-changing structural and
temporal dynamics in TKGs via recurrent event modeling, and models the
interactions between entities based on the temporal neighborhood aggregation
framework. Further, EvoKG achieves an accurate modeling of event time, using
flexible and efficient mechanisms based on neural density estimation.
Experiments show that EvoKG outperforms existing methods in terms of
effectiveness (up to 77% and 116% more accurate time and link prediction) and
efficiency.
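
The abstract sketches two coupled components: recurrent, temporal-neighborhood aggregation of entity representations and flexible neural density estimation of event times. The PyTorch snippet below is a minimal illustration of those two ideas, not the authors' implementation; the module names (TemporalNeighborhoodAggregator, EventTimeDensity), the GRU-based state update, the mean message aggregation, the embedding dimension, and the log-normal mixture (a common choice in neural temporal point processes) are all assumptions made for this sketch.

```python
import math

import torch
import torch.nn as nn
import torch.nn.functional as F


class TemporalNeighborhoodAggregator(nn.Module):
    """Aggregate relation-aware messages from the edges of the current
    snapshot, then update entity states recurrently across timestamps."""

    def __init__(self, num_entities: int, num_relations: int, dim: int = 64):
        super().__init__()
        self.entity_state = nn.Parameter(0.1 * torch.randn(num_entities, dim))  # learned initial states
        self.rel_emb = nn.Embedding(num_relations, dim)
        self.msg = nn.Linear(2 * dim, dim)
        self.gru = nn.GRUCell(dim, dim)

    def forward(self, h: torch.Tensor, edges: torch.Tensor) -> torch.Tensor:
        # edges: LongTensor of shape (E, 3) with columns (subject, relation, object)
        s, r, o = edges[:, 0], edges[:, 1], edges[:, 2]
        m = F.relu(self.msg(torch.cat([h[s], self.rel_emb(r)], dim=-1)))
        agg = torch.zeros_like(h).index_add_(0, o, m)            # sum messages per object entity
        deg = torch.zeros(h.size(0), 1, device=h.device).index_add_(
            0, o, torch.ones(o.size(0), 1, device=h.device))
        agg = agg / deg.clamp(min=1.0)                           # mean over temporal neighbors
        return self.gru(agg, h)                                  # recurrent state update


class EventTimeDensity(nn.Module):
    """Neural density estimator for the inter-event time of a (subject,
    relation, object) triple, here a mixture of log-normal components."""

    def __init__(self, dim: int = 64, n_components: int = 3):
        super().__init__()
        self.head = nn.Linear(3 * dim, 3 * n_components)         # mixture weights, means, log-stds

    def log_prob(self, hs, hr, ho, dt):
        # hs, hr, ho: (B, dim) embeddings; dt: (B,) positive inter-event times
        w, mu, log_sigma = self.head(torch.cat([hs, hr, ho], dim=-1)).chunk(3, dim=-1)
        log_w = F.log_softmax(w, dim=-1)
        sigma = log_sigma.exp().clamp(min=1e-3)
        x = torch.log(dt.clamp(min=1e-6)).unsqueeze(-1)          # log elapsed time, (B, 1)
        comp = (-0.5 * ((x - mu) / sigma) ** 2                   # per-component log-normal
                - sigma.log() - x - 0.5 * math.log(2 * math.pi))  # log-density of dt
        return torch.logsumexp(log_w + comp, dim=-1)             # (B,) mixture log-density


# Usage sketch: roll entity states forward over snapshots, then score event times.
# agg = TemporalNeighborhoodAggregator(num_entities=100, num_relations=20)
# h = agg.entity_state
# for edges_t in snapshots:   # one (E_t, 3) edge tensor per timestamp
#     h = agg(h, edges_t)
```

Training such a sketch would presumably maximize the mixture log-likelihood of observed event times jointly with a link-prediction loss over the recurrently updated entity states, mirroring the joint structure-and-time objective the abstract describes at a high level.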
Related papers
- Large Language Models-guided Dynamic Adaptation for Temporal Knowledge Graph Reasoning [87.10396098919013]
Large Language Models (LLMs) have demonstrated extensive knowledge and remarkable proficiency in temporal reasoning.
We propose a Large Language Models-guided Dynamic Adaptation (LLM-DA) method for reasoning on Temporal Knowledge Graphs.
LLM-DA harnesses the capabilities of LLMs to analyze historical data and extract temporal logical rules.
arXiv Detail & Related papers (2024-05-23T04:54:37Z)
- Selective Temporal Knowledge Graph Reasoning [70.11788354442218]
Temporal Knowledge Graph (TKG) reasoning aims to predict future facts based on given historical ones.
Existing TKG reasoning models are unable to abstain from predictions about which they are uncertain.
We propose an abstention mechanism for TKG reasoning, which helps the existing models make selective, instead of indiscriminate, predictions.
arXiv Detail & Related papers (2024-04-02T06:56:21Z)
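
As a rough illustration of what "selective" prediction means in the entry above, the snippet below abstains whenever the softmax confidence of the top-ranked candidate entity falls below a threshold. This is a generic confidence-threshold sketch, not the abstention mechanism proposed in that paper; the function name, threshold value, and scoring interface are hypothetical.

```python
import torch


def selective_predict(scores: torch.Tensor, threshold: float = 0.5):
    """For each query, return the top-scoring entity index, or None (abstain)
    when the softmax confidence of that entity is below `threshold`.

    scores: (num_queries, num_entities) raw scores from any TKG reasoning model.
    """
    probs = scores.softmax(dim=-1)
    conf, pred = probs.max(dim=-1)
    return [int(p) if float(c) >= threshold else None   # abstain on low confidence
            for c, p in zip(conf, pred)]
```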
- TimeGraphs: Graph-based Temporal Reasoning [64.18083371645956]
TimeGraphs is a novel approach that characterizes dynamic interactions as a hierarchical temporal graph.
Our approach models the interactions using a compact graph-based representation, enabling adaptive reasoning across diverse time scales.
We evaluate TimeGraphs on multiple datasets with complex, dynamic agent interactions, including a football simulator, the Resistance game, and the MOMA human activity dataset.
arXiv Detail & Related papers (2024-01-06T06:26:49Z)
- Learning Multi-graph Structure for Temporal Knowledge Graph Reasoning [3.3571415078869955]
This paper proposes an innovative reasoning approach that focuses on Learning Multi-graph Structure (LMS).
LMS incorporates an adaptive gate for merging entity representations both along and across timestamps effectively.
It also integrates timestamp semantics into graph attention calculations and time-aware decoders.
arXiv Detail & Related papers (2023-12-04T08:23:09Z)
- Learning Meta Representations of One-shot Relations for Temporal Knowledge Graph Link Prediction [33.36701435886095]
Few-shot relational learning for static knowledge graphs (KGs) has drawn greater interest in recent years.
TKGs contain rich temporal information, thus requiring temporal reasoning techniques for modeling.
This poses a greater challenge in learning few-shot relations in the temporal context.
arXiv Detail & Related papers (2022-05-21T15:17:52Z)
- Leveraging the structure of dynamical systems for data-driven modeling [111.45324708884813]
We consider the impact of the training set and its structure on the quality of the long-term prediction.
We show how an informed design of the training set, based on invariants of the system and the structure of the underlying attractor, significantly improves the resulting models.
arXiv Detail & Related papers (2021-12-15T20:09:20Z)
- Temporal Knowledge Graph Reasoning Based on Evolutional Representation Learning [59.004025528223025]
The key to predicting future facts is to thoroughly understand the historical facts.
A TKG is actually a sequence of KGs corresponding to different timestamps.
We propose a novel Recurrent Evolution network based on Graph Convolution Network (GCN).
arXiv Detail & Related papers (2021-04-21T05:12:21Z)
- T-GAP: Learning to Walk across Time for Temporal Knowledge Graph Completion [13.209193437124881]
Temporal knowledge graphs (TKGs) inherently reflect the transient nature of real-world knowledge, as opposed to static knowledge graphs.
We propose T-GAP, a novel model for TKG completion that maximally utilizes both temporal information and graph structure in its encoder and decoder.
Our experiments demonstrate that T-GAP achieves superior performance against state-of-the-art baselines, and competently generalizes to queries with unseen timestamps.
arXiv Detail & Related papers (2020-12-19T04:45:32Z)
- TeMP: Temporal Message Passing for Temporal Knowledge Graph Completion [45.588053447288566]
Inferring missing facts in temporal knowledge graphs (TKGs) is a fundamental and challenging task.
We propose the Temporal Message Passing (TeMP) framework to address these challenges by combining graph neural networks, temporal dynamics models, data imputation and frequency-based gating techniques.
arXiv Detail & Related papers (2020-10-07T17:11:53Z)