A Simple But Powerful Graph Encoder for Temporal Knowledge Graph Completion
- URL: http://arxiv.org/abs/2112.07791v1
- Date: Tue, 14 Dec 2021 23:30:42 GMT
- Title: A Simple But Powerful Graph Encoder for Temporal Knowledge Graph Completion
- Authors: Zifeng Ding, Yunpu Ma, Bailan He, Volker Tresp
- Abstract summary: We propose a simple but powerful graph encoder, TARGCN, for temporal knowledge graphs (TKGs).
Our model achieves a more than 42% relative improvement on the GDELT dataset compared with the state-of-the-art model.
It outperforms the strongest baseline on the ICEWS05-15 dataset with around 18.5% fewer parameters.
- Score: 13.047205680129094
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: While knowledge graphs contain rich semantic knowledge of various entities
and the relational information among them, temporal knowledge graphs (TKGs)
further indicate the interactions of the entities over time. To study how to
better model TKGs, automatic temporal knowledge graph completion (TKGC) has
gained great interest. Recent TKGC methods aim to integrate advanced deep
learning techniques, e.g., attention mechanisms and Transformers, to boost model
performance. However, we find that, rather than adopting various kinds of complex
modules, it is more beneficial to fully utilize the temporal information along the
whole time axis. In this paper, we propose a simple
but powerful graph encoder TARGCN for TKGC. TARGCN is parameter-efficient, and
it extensively utilizes the information from the whole temporal context. We
perform experiments on three benchmark datasets. Our model achieves a more than
42% relative improvement on the GDELT dataset compared with the state-of-the-art
model. Meanwhile, it outperforms the strongest baseline on the ICEWS05-15 dataset
with around 18.5% fewer parameters.
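As a rough illustration of the idea sketched in the abstract (encoding a query entity by aggregating its temporal neighbors from the whole time axis, with the relative time difference as an explicit feature), the following minimal PyTorch sketch shows one way such an encoder can look; the module names, dimensions, and aggregation choices are assumptions, not the authors' exact TARGCN design.

```python
# Minimal sketch (not the authors' exact TARGCN implementation): encode a query
# entity at query time t_q by aggregating messages from neighbors observed at
# ANY timestamp, using the relative time difference as an extra feature.
import torch
import torch.nn as nn


class TemporalContextEncoder(nn.Module):
    def __init__(self, num_entities, num_relations, dim=64):
        super().__init__()
        self.ent = nn.Embedding(num_entities, dim)
        self.rel = nn.Embedding(num_relations, dim)
        # time-difference encoding: a small MLP over the scalar (t_neighbor - t_query)
        self.time_mlp = nn.Sequential(nn.Linear(1, dim), nn.Tanh())
        self.msg = nn.Linear(3 * dim, dim)   # neighbor entity + relation + time features
        self.out = nn.Linear(2 * dim, dim)   # combine self embedding with the aggregate

    def forward(self, query_ent, query_time, nbr_ent, nbr_rel, nbr_time):
        # query_ent, query_time: (B,); nbr_*: (B, K) neighbors sampled from the whole timeline
        dt = (nbr_time - query_time.unsqueeze(1)).float().unsqueeze(-1)   # (B, K, 1)
        t_feat = self.time_mlp(dt)                                        # (B, K, d)
        m = self.msg(torch.cat([self.ent(nbr_ent), self.rel(nbr_rel), t_feat], dim=-1))
        agg = torch.relu(m).mean(dim=1)                                   # mean over neighbors
        return self.out(torch.cat([self.ent(query_ent), agg], dim=-1))


# toy usage: 5 entities, 3 relations, batch of 2 queries with 4 sampled neighbors each
enc = TemporalContextEncoder(num_entities=5, num_relations=3)
h = enc(torch.tensor([0, 1]), torch.tensor([10, 20]),
        torch.randint(0, 5, (2, 4)), torch.randint(0, 3, (2, 4)),
        torch.randint(0, 30, (2, 4)))
print(h.shape)  # torch.Size([2, 64])
```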
Related papers
- KERMIT: Knowledge Graph Completion of Enhanced Relation Modeling with Inverse Transformation [19.31783654838732]
We use large language models to generate coherent descriptions, bridging the semantic gap between queries and answers.
We also utilize inverse relations to create a symmetric graph, thereby providing augmented training samples for KGC.
Our approach achieves a 4.2% improvement in Hit@1 on WN18RR and a 3.4% improvement in Hit@3 on FB15k-237, demonstrating superior performance.
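The inverse-relation augmentation mentioned above can be illustrated with a small, self-contained sketch (a generic data-augmentation step, not KERMIT's actual code): every triple (h, r, t) is mirrored as (t, r_inv, h), doubling the training samples.

```python
# Generic sketch of inverse-relation augmentation (illustrative, not KERMIT's code):
# for every triple (head, relation, tail), also add (tail, relation_inv, head).
def add_inverse_triples(triples):
    augmented = list(triples)
    for head, relation, tail in triples:
        augmented.append((tail, f"{relation}_inv", head))
    return augmented

triples = [("Berlin", "capital_of", "Germany"), ("Germany", "member_of", "EU")]
print(add_inverse_triples(triples))
# [('Berlin', 'capital_of', 'Germany'), ('Germany', 'member_of', 'EU'),
#  ('Germany', 'capital_of_inv', 'Berlin'), ('EU', 'member_of_inv', 'Germany')]
```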
arXiv Detail & Related papers (2023-09-26T09:03:25Z)
- Challenging the Myth of Graph Collaborative Filtering: a Reasoned and Reproducibility-driven Analysis [50.972595036856035]
We present a code that successfully replicates results from six popular and recent graph recommendation models.
We compare these graph models with traditional collaborative filtering models that historically performed well in offline evaluations.
By investigating the information flow from users' neighborhoods, we aim to identify which models are influenced by intrinsic features in the dataset structure.
arXiv Detail & Related papers (2023-08-01T09:31:44Z)
- Temporal Graph Benchmark for Machine Learning on Temporal Graphs [54.52243310226456]
Temporal Graph Benchmark (TGB) is a collection of challenging and diverse benchmark datasets.
We benchmark each dataset and find that the performance of common models can vary drastically across datasets.
TGB provides an automated machine learning pipeline for reproducible and accessible temporal graph research.
arXiv Detail & Related papers (2023-07-03T13:58:20Z)
- Deep Temporal Graph Clustering [77.02070768950145]
We propose a general framework for deep Temporal Graph Clustering (TGC).
TGC introduces deep clustering techniques to suit the interaction sequence-based batch-processing pattern of temporal graphs.
Our framework can effectively improve the performance of existing temporal graph learning methods.
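"Deep clustering techniques" here usually means objectives like the DEC-style soft-assignment/KL loss; the sketch below shows that generic loss applied to node embeddings, as an example of the family of techniques rather than the paper's exact objective.

```python
# Generic DEC-style deep clustering loss over node embeddings (an illustration of
# "deep clustering techniques"; not the exact objective used by the TGC paper).
import torch
import torch.nn.functional as F

def soft_assignment(z, centers, alpha=1.0):
    # Student's t-kernel similarity between embeddings z (N, d) and cluster centers (K, d)
    dist_sq = torch.cdist(z, centers) ** 2                  # (N, K)
    q = (1.0 + dist_sq / alpha) ** (-(alpha + 1.0) / 2.0)
    return q / q.sum(dim=1, keepdim=True)

def target_distribution(q):
    # sharpened targets that emphasize high-confidence assignments
    p = (q ** 2) / q.sum(dim=0, keepdim=True)
    return p / p.sum(dim=1, keepdim=True)

z = torch.randn(8, 16)          # toy node embeddings, e.g. from a temporal graph encoder
centers = torch.randn(3, 16)    # toy cluster centers
q = soft_assignment(z, centers)
p = target_distribution(q).detach()
kl_loss = F.kl_div(q.log(), p, reduction="batchmean")
print(kl_loss.item())
```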
arXiv Detail & Related papers (2023-05-18T06:17:50Z)
- A Survey of Knowledge Graph Reasoning on Graph Types: Static, Dynamic, and Multimodal [57.8455911689554]
Knowledge graph reasoning (KGR) aims to deduce new facts from existing facts based on mined logic rules underlying knowledge graphs (KGs).
It has been proven to significantly benefit the usage of KGs in many AI applications, such as question answering and recommendation systems.
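A mined logic rule in this setting is typically a Horn clause such as a composition rule; the toy example below applies one such rule to deduce a new fact from existing ones (the rule and facts are made up for illustration, not taken from the survey).

```python
# Toy illustration of rule-based KG reasoning: apply the composition rule
#   (x, born_in, y) AND (y, located_in, z)  =>  (x, lives_in_country, z)
# to deduce new facts from existing ones.
facts = {("alice", "born_in", "paris"), ("paris", "located_in", "france")}

def apply_composition_rule(facts, r1, r2, r_new):
    deduced = set()
    for (x, rel1, y) in facts:
        if rel1 != r1:
            continue
        for (y2, rel2, z) in facts:
            if rel2 == r2 and y2 == y:
                deduced.add((x, r_new, z))
    return deduced

print(apply_composition_rule(facts, "born_in", "located_in", "lives_in_country"))
# {('alice', 'lives_in_country', 'france')}
```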
arXiv Detail & Related papers (2022-12-12T08:40:04Z)
- Few-Shot Inductive Learning on Temporal Knowledge Graphs using Concept-Aware Information [31.10140298420744]
We propose a few-shot out-of-graph (OOG) link prediction task for temporal knowledge graphs (TKGs).
We predict the missing entities from the links concerning unseen entities by employing a meta-learning framework.
Our model achieves superior performance on all three datasets.
arXiv Detail & Related papers (2022-11-15T14:23:07Z)
- A Simple Temporal Information Matching Mechanism for Entity Alignment Between Temporal Knowledge Graphs [18.451872649228196]
We propose a simple graph neural network (GNN) model combined with a temporal information matching mechanism.
We also propose a method to generate unsupervised alignment seeds via the temporal information of TKG.
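One simple way to turn temporal information into unsupervised alignment seeds is to match entities across two TKGs by the overlap of the timestamp sets they occur with; the sketch below uses Jaccard similarity as an assumed stand-in for the paper's matching mechanism.

```python
# Sketch: generate unsupervised alignment seeds by pairing entities across two TKGs
# whose sets of occurrence timestamps overlap most (Jaccard similarity is an assumed
# stand-in for the paper's temporal matching mechanism).
def jaccard(a, b):
    return len(a & b) / len(a | b) if (a | b) else 0.0

def temporal_seeds(times_kg1, times_kg2, threshold=0.8):
    # times_kg1/2: dict entity -> set of timestamps at which the entity appears
    seeds = []
    for e1, t1 in times_kg1.items():
        best_e2, best_t2 = max(times_kg2.items(), key=lambda kv: jaccard(t1, kv[1]))
        if jaccard(t1, best_t2) >= threshold:
            seeds.append((e1, best_e2))
    return seeds

kg1 = {"Obama": {2009, 2012, 2015}, "Merkel": {2010, 2013}}
kg2 = {"B._Obama": {2009, 2012, 2015}, "A._Merkel": {2010, 2013, 2017}}
print(temporal_seeds(kg1, kg2))  # [('Obama', 'B._Obama')] with threshold 0.8
```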
arXiv Detail & Related papers (2022-09-20T12:27:34Z)
- Learning Meta Representations of One-shot Relations for Temporal Knowledge Graph Link Prediction [33.36701435886095]
Few-shot relational learning for static knowledge graphs (KGs) has drawn greater interest in recent years.
TKGs contain rich temporal information, thus requiring temporal reasoning techniques for modeling.
This poses a greater challenge in learning few-shot relations in the temporal context.
arXiv Detail & Related papers (2022-05-21T15:17:52Z)
- Temporal Knowledge Graph Reasoning with Low-rank and Model-agnostic Representations [1.8262547855491458]
We introduce Time-LowFER, a family of parameter-efficient and time-aware extensions of the low-rank tensor factorization model LowFER.
Noting several limitations in current approaches to represent time, we propose a cycle-aware time-encoding scheme for time features.
We implement our methods in a unified temporal knowledge graph embedding framework, focusing on time-sensitive data processing.
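A "cycle-aware" time encoding generally maps periodic components of a timestamp (e.g., month of year, day of week) onto the unit circle with sine/cosine pairs so that, for instance, December and January end up close together; the sketch below is a generic version of that idea, not Time-LowFER's exact scheme.

```python
# Generic cycle-aware time features (illustrative; not Time-LowFER's exact scheme):
# each periodic component is mapped to a (sin, cos) pair so that neighboring values
# on the cycle get nearby encodings.
import math
from datetime import date

def cyclic_time_features(d: date):
    feats = []
    for value, period in [(d.month - 1, 12), (d.weekday(), 7), (d.day - 1, 31)]:
        angle = 2.0 * math.pi * value / period
        feats.extend([math.sin(angle), math.cos(angle)])
    return feats

print(cyclic_time_features(date(2014, 12, 31)))
print(cyclic_time_features(date(2015, 1, 1)))   # month dims stay near the December encoding
```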
arXiv Detail & Related papers (2022-04-10T22:24:11Z)
- Node Feature Extraction by Self-Supervised Multi-scale Neighborhood Prediction [123.20238648121445]
We propose a new self-supervised learning framework, Graph Information Aided Node feature exTraction (GIANT).
GIANT makes use of the eXtreme Multi-label Classification (XMC) formalism, which is crucial for fine-tuning the language model based on graph information.
We demonstrate the superior performance of GIANT over the standard GNN pipeline on Open Graph Benchmark datasets.
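In XMC terms, neighborhood prediction treats each node's neighbor set as a multi-hot label vector and fine-tunes a text encoder to predict it; the sketch below uses a toy bag-of-words encoder and single-scale labels as a simplified, assumed version of that setup (GIANT itself fine-tunes a language model with multi-scale XMC).

```python
# Simplified sketch of neighborhood prediction as multi-label classification
# (single-scale, toy bag-of-words encoder; treat this only as an illustration
# of the XMC formalism, not GIANT's actual pipeline).
import torch
import torch.nn as nn

num_nodes, vocab, dim = 4, 10, 16
node_text = torch.randint(0, 2, (num_nodes, vocab)).float()   # toy bag-of-words features
adj = torch.tensor([[0, 1, 1, 0],                              # multi-hot neighbor labels
                    [1, 0, 0, 1],
                    [1, 0, 0, 1],
                    [0, 1, 1, 0]]).float()

encoder = nn.Sequential(nn.Linear(vocab, dim), nn.ReLU(), nn.Linear(dim, num_nodes))
optimizer = torch.optim.Adam(encoder.parameters(), lr=1e-2)
loss_fn = nn.BCEWithLogitsLoss()

for step in range(100):
    optimizer.zero_grad()
    logits = encoder(node_text)        # predict each node's neighbors from its text
    loss = loss_fn(logits, adj)
    loss.backward()
    optimizer.step()

# the hidden-layer activations can then be reused as graph-aware node features
features = encoder[0](node_text)
print(loss.item(), features.shape)
```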
arXiv Detail & Related papers (2021-10-29T19:55:12Z)
- Heuristic Semi-Supervised Learning for Graph Generation Inspired by Electoral College [80.67842220664231]
We propose a novel pre-processing technique, namely ELectoral COllege (ELCO), which automatically expands new nodes and edges to refine the label similarity within a dense subgraph.
In all setups tested, our method boosts the average score of base models by a large margin of 4.7 points, as well as consistently outperforms the state-of-the-art.
arXiv Detail & Related papers (2020-06-10T14:48:48Z)