Local-Global History-aware Contrastive Learning for Temporal Knowledge
Graph Reasoning
- URL: http://arxiv.org/abs/2312.01601v1
- Date: Mon, 4 Dec 2023 03:27:01 GMT
- Title: Local-Global History-aware Contrastive Learning for Temporal Knowledge
Graph Reasoning
- Authors: Wei Chen, Huaiyu Wan, Yuting Wu, Shuyuan Zhao, Jiayaqi Cheng, Yuxin Li
and Youfang Lin
- Abstract summary: We propose a novel Local-global history-aware Contrastive Learning model (LogCL) for temporal knowledge graphs.
For the first challenge, LogCL proposes an entity-aware attention mechanism applied to the local and global historical facts encoder.
For the latter issue, LogCL designs four historical query contrast patterns, effectively improving the robustness of the model.
- Score: 25.497749629866757
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Temporal knowledge graphs (TKGs) have been identified as a promising approach
to represent the dynamics of facts along the timeline. The extrapolation of TKG
is to predict unknowable facts happening in the future, holding significant
practical value across diverse fields. Most extrapolation studies in TKGs focus
on modeling global historical fact repeating and cyclic patterns, as well as
local historical adjacent fact evolution patterns, showing promising
performance in predicting future unknown facts. Yet, existing methods still
face two major challenges: (1) They usually neglect the importance of
historical information in KG snapshots related to the queries when encoding the
local and global historical information; (2) They exhibit weak anti-noise
capabilities, which hinders their performance when the inputs are contaminated
with noise. To this end, we propose a novel Local-global history-aware
Contrastive Learning model (LogCL) for TKG
reasoning, which adopts contrastive learning to better guide the fusion of
local and global historical information and enhance the ability to resist
interference. Specifically, for the first challenge, LogCL proposes an
entity-aware attention mechanism applied to the local and global historical
facts encoder, which captures the key historical information related to
queries. For the latter issue, LogCL designs four historical query contrast
patterns, effectively improving the robustness of the model. The experimental
results on four benchmark datasets demonstrate that LogCL delivers better and
more robust performance than the state-of-the-art baselines.
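The abstract describes LogCL's contrastive objective only at a high level. As a rough, hypothetical illustration (the function, variable names, and temperature are assumptions, not taken from the paper), a symmetric InfoNCE-style loss that pulls a query's local-history encoding toward its global-history encoding, treating other queries in the batch as negatives, might look like:

```python
import numpy as np

def info_nce(local_q, global_q, temperature=0.1):
    """Symmetric InfoNCE-style contrastive loss (illustrative sketch).

    local_q, global_q: (batch, dim) arrays; row i of each is the same query
    encoded from its local vs. global history. Matching rows are positives,
    all other rows in the batch serve as in-batch negatives.
    """
    def normalize(x):
        return x / np.linalg.norm(x, axis=-1, keepdims=True)

    l, g = normalize(local_q), normalize(global_q)
    logits = l @ g.T / temperature  # (batch, batch) cosine similarities

    def xent(z):
        # cross-entropy with the diagonal (matching pairs) as the positives
        z = z - z.max(axis=1, keepdims=True)  # numerical stability
        log_probs = z - np.log(np.exp(z).sum(axis=1, keepdims=True))
        return -np.mean(np.diag(log_probs))

    return 0.5 * (xent(logits) + xent(logits.T))
```

Aligned pairs drive the loss toward zero, while mismatched pairs are penalized; this is the generic contrastive mechanism, not LogCL's exact formulation.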
Related papers
- DPCL-Diff: The Temporal Knowledge Graph Reasoning based on Graph Node Diffusion Model with Dual-Domain Periodic Contrastive Learning [3.645855411897217]
We propose a graph node diffusion model with dual-domain periodic contrastive learning (DPCL-Diff).
GNDiff introduces noise into sparsely related events to simulate new events, generating high-quality data that better conforms to the actual distribution.
DPCL-Diff maps periodic and non-periodic event entities to Poincaré and Euclidean spaces, leveraging their characteristics to distinguish similar periodic events effectively.
arXiv Detail & Related papers (2024-11-03T08:30:29Z)
- Historically Relevant Event Structuring for Temporal Knowledge Graph Reasoning [5.510391547468202]
Temporal Knowledge Graph (TKG) reasoning focuses on predicting events through historical information within snapshots distributed on a timeline.
We propose an innovative TKG reasoning approach towards Historically Relevant Events Structuring (HisRES).
arXiv Detail & Related papers (2024-05-17T08:33:43Z)
- AMCEN: An Attention Masking-based Contrastive Event Network for Two-stage Temporal Knowledge Graph Reasoning [29.68279984719722]
Temporal knowledge graphs (TKGs) can effectively model the ever-evolving nature of real-world knowledge, and their completeness and enhancement can be achieved by reasoning new events from existing ones.
However, reasoning accuracy is adversely affected by the imbalance between new and recurring events in the datasets.
We propose an attention masking-based contrastive event network (AMCEN) with local-global temporal patterns for the two-stage prediction of future events.
arXiv Detail & Related papers (2024-05-16T01:39:50Z)
- Selective Temporal Knowledge Graph Reasoning [70.11788354442218]
Temporal Knowledge Graph (TKG) aims to predict future facts based on given historical ones.
Existing TKG reasoning models are unable to abstain from predictions about which they are uncertain.
We propose an abstention mechanism for TKG reasoning, which helps the existing models make selective, instead of indiscriminate, predictions.
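The abstention mechanism is only summarized here; a minimal, hypothetical confidence-threshold rule (the function name and threshold value are assumptions, not from the paper) illustrates the general idea of selective rather than indiscriminate prediction:

```python
import numpy as np

def selective_predict(scores, threshold=0.5):
    """Hypothetical abstention rule: return the top-scoring entity only
    when its softmax probability clears a confidence threshold;
    otherwise abstain (return None)."""
    z = scores - scores.max()               # numerical stability
    probs = np.exp(z) / np.exp(z).sum()     # softmax over candidates
    best = int(probs.argmax())
    return best if probs[best] >= threshold else None
```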
arXiv Detail & Related papers (2024-04-02T06:56:21Z)
- Continual Learning with Pre-Trained Models: A Survey [61.97613090666247]
Continual Learning (CL) aims to overcome the catastrophic forgetting of former knowledge when acquiring new knowledge.
This paper presents a comprehensive survey of the latest advancements in pre-trained model (PTM)-based CL.
arXiv Detail & Related papers (2024-01-29T18:27:52Z)
- Memory Consistency Guided Divide-and-Conquer Learning for Generalized Category Discovery [56.172872410834664]
Generalized category discovery (GCD) aims at addressing a more realistic and challenging setting of semi-supervised learning.
We propose a Memory Consistency guided Divide-and-conquer Learning framework (MCDL).
Our method outperforms state-of-the-art models by a large margin on both seen and unseen classes in generic image recognition.
arXiv Detail & Related papers (2024-01-24T09:39:45Z)
- Exploring the Limits of Historical Information for Temporal Knowledge Graph Extrapolation [59.417443739208146]
We propose a new event forecasting model based on a novel training framework of historical contrastive learning.
CENET learns both the historical and non-historical dependency to distinguish the most potential entities.
We evaluate our proposed model on five benchmark graphs.
arXiv Detail & Related papers (2023-08-29T03:26:38Z)
- Temporal Knowledge Graph Reasoning with Historical Contrastive Learning [24.492458924487863]
We propose a new event forecasting model called Contrastive Event Network (CENET).
CENET learns both the historical and non-historical dependency to distinguish the most potential entities that can best match the given query.
During the inference process, CENET employs a mask-based strategy to generate the final results.
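CENET's mask-based strategy is not spelled out in this summary. As a hedged sketch (all names are hypothetical), the core idea of restricting the final ranking to either historical or non-historical candidate entities, according to a per-query decision, could look like:

```python
import numpy as np

def masked_scores(scores, hist_idx, prefer_historical):
    """Hypothetical sketch of mask-based inference.

    scores: (num_entities,) candidate scores for one query.
    hist_idx: indices of entities appearing in the query's history.
    prefer_historical: per-query decision (True -> rank only historical
    candidates, False -> rank only non-historical ones); the rest are
    masked out with -inf so they can never be selected.
    """
    hist_mask = np.zeros(scores.shape, dtype=bool)
    hist_mask[hist_idx] = True
    keep = hist_mask if prefer_historical else ~hist_mask
    return np.where(keep, scores, -np.inf)
```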
arXiv Detail & Related papers (2022-11-20T08:32:59Z)
- Large Language Models with Controllable Working Memory [64.71038763708161]
Large language models (LLMs) have led to a series of breakthroughs in natural language processing (NLP).
What further sets these models apart is the massive amounts of world knowledge they internalize during pretraining.
How the model's world knowledge interacts with the factual information presented in the context remains underexplored.
arXiv Detail & Related papers (2022-11-09T18:58:29Z)
- Temporal Knowledge Graph Reasoning Based on Evolutional Representation Learning [59.004025528223025]
Key to predict future facts is to thoroughly understand the historical facts.
A TKG is actually a sequence of KGs corresponding to different timestamps.
We propose a novel Recurrent Evolution network based on Graph Convolution Network (GCN).
arXiv Detail & Related papers (2021-04-21T05:12:21Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.