Relational Message Passing for Fully Inductive Knowledge Graph Completion
- URL: http://arxiv.org/abs/2210.03994v1
- Date: Sat, 8 Oct 2022 10:35:52 GMT
- Title: Relational Message Passing for Fully Inductive Knowledge Graph Completion
- Authors: Yuxia Geng, Jiaoyan Chen, Wen Zhang, Jeff Z. Pan, Mingyang Chen,
Huajun Chen, Song Jiang
- Abstract summary: In knowledge graph completion (KGC), predicting triples involving emerging entities and/or relations, which are unseen when KG embeddings are learned, has become a critical challenge.
Subgraph reasoning with message passing is a promising and popular solution.
We propose a new method named RMPI, which uses a novel Relational Message Passing network for fully inductive KGC.
- Score: 37.29833710603933
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In knowledge graph completion (KGC), predicting triples involving emerging
entities and/or relations, which are unseen when the KG embeddings are learned,
has become a critical challenge. Subgraph reasoning with message passing is a
promising and popular solution. Some recent methods have achieved good
performance, but they (i) usually can only predict triples involving unseen
entities alone, failing to address more realistic fully inductive situations
with both unseen entities and unseen relations, and (ii) often conduct message
passing over the entities with the relation patterns not fully utilized. In
this study, we propose a new method named RMPI which uses a novel Relational
Message Passing network for fully Inductive KGC. It passes messages directly
between relations to make full use of the relation patterns for subgraph
reasoning, with new techniques for graph transformation, graph pruning,
relation-aware neighborhood attention, and handling empty subgraphs, and it
can utilize the relation semantics defined in the ontological schema of the KG.
Extensive evaluation on multiple benchmarks has shown the effectiveness of
techniques involved in RMPI and its better performance compared with the
existing methods that support fully inductive KGC. RMPI also achieves very
promising results, comparable to the state-of-the-art partially inductive KGC
methods. Our codes and data are available at
https://github.com/zjukg/RMPI.
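The core idea of the abstract's graph transformation step, turning the entity-level KG into a relation-level graph so that messages flow directly between relations, can be sketched in a few lines. This is a toy illustration with made-up helper names, not the authors' RMPI implementation: two relations are connected whenever their edges share an entity, and features are then mean-aggregated over relation neighbors.

```python
# Toy sketch of relation-level message passing (NOT the RMPI implementation).
# A KG is given as (head, relation, tail) triples; we build a relation graph
# and run a simple mean-aggregation over it.
from collections import defaultdict

def build_relation_graph(triples):
    """Graph transformation: connect two relations if their edges share an entity."""
    touching = defaultdict(set)          # entity -> relations incident to it
    for h, r, t in triples:
        touching[h].add(r)
        touching[t].add(r)
    neighbors = defaultdict(set)
    for rels in touching.values():
        for r in rels:
            neighbors[r] |= rels - {r}
    return neighbors

def pass_messages(neighbors, features, steps=2):
    """Mean-aggregate neighbor features so each relation absorbs relation patterns."""
    for _ in range(steps):
        new = {}
        for r, feat in features.items():
            nbrs = neighbors.get(r, set())
            new[r] = [
                (f + sum(features[n][i] for n in nbrs)) / (len(nbrs) + 1)
                for i, f in enumerate(feat)
            ]
        features = new
    return features
```

Because all state lives on relations rather than entities, an unseen entity changes the relation graph's connectivity but requires no new embedding, which is what makes relation-level passing attractive for the fully inductive setting.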
Related papers
- Inference over Unseen Entities, Relations and Literals on Knowledge Graphs [1.7474352892977463]
Knowledge graph embedding models have been successfully applied in the transductive setting to tackle various challenging tasks.
We propose the attentive byte-pair encoding layer (BytE) to construct a triple embedding from a sequence of byte-pair encoded subword units of entities and relations.
BytE leads to massive feature reuse via weight tying, since it forces a knowledge graph embedding model to learn embeddings for subword units instead of entities and relations directly.
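The subword idea behind this summary can be illustrated with a toy sketch. Here character bigrams stand in for byte-pair encoding and hashed vectors stand in for learned parameters; none of this is the paper's actual BytE layer, it only shows why a shared subword table lets unseen names receive embeddings:

```python
# Minimal sketch of subword-composed embeddings (illustrative, not BytE).
# Names are split into subword units and embedded from a SHARED subword
# table, so unseen entities/relations still map to known subwords.
import hashlib

DIM = 4

def subwords(name):
    """Stand-in for byte-pair encoding: character bigrams of the name."""
    return [name[i:i + 2] for i in range(len(name) - 1)] or [name]

def subword_vector(unit):
    """Deterministic pseudo-embedding for a subword unit (in a real model
    these vectors would be learned parameters)."""
    h = hashlib.md5(unit.encode()).digest()
    return [b / 255.0 for b in h[:DIM]]

def embed(name):
    """Compose a name embedding by averaging its subword vectors."""
    vecs = [subword_vector(u) for u in subwords(name)]
    return [sum(v[i] for v in vecs) / len(vecs) for i in range(DIM)]
```

The weight tying mentioned above corresponds to every name drawing from the same `subword_vector` table, so vocabulary size is bounded by the number of subword units rather than the number of entities and relations.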
arXiv Detail & Related papers (2024-10-09T10:20:54Z)
- Towards Better Benchmark Datasets for Inductive Knowledge Graph Completion [34.58496513149175]
We find that the current procedure for constructing inductive KGC datasets inadvertently creates a shortcut that can be exploited.
Specifically, we observe that the Personalized PageRank (PPR) score can achieve strong or near SOTA performance on most inductive datasets.
We propose an alternative strategy for constructing inductive KGC datasets that helps mitigate the PPR shortcut.
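The PPR shortcut described above is easy to reproduce in miniature: rank candidate entities purely by their Personalized PageRank score from the query head, with relations ignored entirely. The toy graph and plain power iteration below are illustrative and not tied to any benchmark:

```python
# Sketch of the Personalized PageRank (PPR) shortcut: score candidate tails
# by proximity to the query head, ignoring relation labels entirely.
def ppr(adj, source, alpha=0.15, iters=50):
    """Power iteration for PPR with restart probability alpha at `source`."""
    nodes = list(adj)
    score = {v: 1.0 if v == source else 0.0 for v in nodes}
    for _ in range(iters):
        nxt = {v: alpha * (1.0 if v == source else 0.0) for v in nodes}
        for u in nodes:
            out = adj[u]
            if not out:
                continue
            share = (1 - alpha) * score[u] / len(out)
            for v in out:
                nxt[v] += share
        score = nxt
    return score

# Toy adjacency: two paths lead from the head "h" to "t1", one to "t2",
# so PPR ranks t1 above t2 without ever looking at a relation.
adj = {"h": ["a", "b"], "a": ["t1"], "b": ["t1", "t2"], "t1": [], "t2": []}
scores = ppr(adj, "h")
```

If such proximity ranking alone matches trained models on a dataset, the dataset is rewarding structural closeness rather than relational reasoning, which is the shortcut the paper identifies.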
arXiv Detail & Related papers (2024-06-14T21:01:46Z)
- Logical Reasoning with Relation Network for Inductive Knowledge Graph Completion [9.815135283458808]
We propose a novel iNfOmax RelAtion Network, namely NORAN, for inductive KG completion.
Our framework substantially outperforms the state-of-the-art KGC methods.
arXiv Detail & Related papers (2024-06-03T09:30:43Z)
- zrLLM: Zero-Shot Relational Learning on Temporal Knowledge Graphs with Large Language Models [33.10218179341504]
We use large language models to generate relation representations for embedding-based TKGF methods.
We show that our approach helps TKGF models to achieve much better performance in forecasting the facts with previously unseen relations.
arXiv Detail & Related papers (2023-11-15T21:25:15Z)
- Learning Complete Topology-Aware Correlations Between Relations for Inductive Link Prediction [121.65152276851619]
We show that semantic correlations between relations are inherently edge-level and entity-independent.
We propose a novel subgraph-based method, namely TACO, to model Topology-Aware COrrelations between relations.
To further exploit the potential of RCN, we propose Complete Common Neighbor induced subgraph.
arXiv Detail & Related papers (2023-09-20T08:11:58Z)
- Normalizing Flow-based Neural Process for Few-Shot Knowledge Graph Completion [69.55700751102376]
Few-shot knowledge graph completion (FKGC) aims to predict missing facts for unseen relations with few-shot associated facts.
Existing FKGC methods are based on metric learning or meta-learning, which often suffer from the out-of-distribution and overfitting problems.
In this paper, we propose a normalizing flow-based neural process for few-shot knowledge graph completion (NP-FKGC).
arXiv Detail & Related papers (2023-04-17T11:42:28Z)
- RAILD: Towards Leveraging Relation Features for Inductive Link Prediction In Knowledge Graphs [1.5469452301122175]
Relation Aware Inductive Link preDiction (RAILD) is proposed for Knowledge Graph completion.
RAILD learns representations for both unseen entities and unseen relations.
arXiv Detail & Related papers (2022-11-21T12:35:30Z)
- Explainable Sparse Knowledge Graph Completion via High-order Graph Reasoning Network [111.67744771462873]
This paper proposes a novel explainable model for sparse Knowledge Graphs (KGs).
It integrates high-order reasoning into a graph convolutional network, namely HoGRN.
It can not only improve the generalization ability to mitigate the information insufficiency issue but also provide interpretability.
arXiv Detail & Related papers (2022-07-14T10:16:56Z)
- DisenKGAT: Knowledge Graph Embedding with Disentangled Graph Attention Network [48.38954651216983]
We propose a novel Disentangled Knowledge Graph Attention Network (DisenKGAT) for knowledge graphs.
DisenKGAT uses both micro-disentanglement and macro-disentanglement to exploit representations behind knowledge graphs.
Our work has strong robustness and flexibility to adapt to various score functions.
arXiv Detail & Related papers (2021-08-22T04:10:35Z)
- Learning the Implicit Semantic Representation on Graph-Structured Data [57.670106959061634]
Existing representation learning methods in graph convolutional networks are mainly designed by describing the neighborhood of each node as a perceptual whole.
We propose a Semantic Graph Convolutional Networks (SGCN) that explores the implicit semantics by learning latent semantic-paths in graphs.
arXiv Detail & Related papers (2021-01-16T16:18:43Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.