Exploring & Exploiting High-Order Graph Structure for Sparse Knowledge Graph Completion
- URL: http://arxiv.org/abs/2306.17034v1
- Date: Thu, 29 Jun 2023 15:35:34 GMT
- Title: Exploring & Exploiting High-Order Graph Structure for Sparse Knowledge Graph Completion
- Authors: Tao He, Ming Liu, Yixin Cao, Zekun Wang, Zihao Zheng, Zheng Chu, and Bing Qin
- Abstract summary: We present a novel framework, LR-GCN, that automatically captures valuable long-range dependencies among entities.
The proposed approach comprises two main components: a GNN-based predictor and a reasoning path distiller.
- Score: 20.45256490854869
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Sparse knowledge graph (KG) scenarios pose a challenge for previous Knowledge Graph Completion (KGC) methods: completion performance degrades rapidly as graph sparsity increases. The problem is further exacerbated by the widespread presence of sparse KGs in practical applications. To alleviate this challenge, we present a novel framework, LR-GCN, that automatically captures valuable long-range dependencies among entities to supplement insufficient structural features and distills logical reasoning knowledge for sparse KGC. The proposed approach comprises two main components: a GNN-based predictor and a reasoning path distiller. The reasoning path distiller explores high-order graph structures such as reasoning paths and encodes them as rich-semantic edges, explicitly compositing long-range dependencies into the predictor. This step also densifies the KG, effectively alleviating the sparsity issue. Furthermore, the path distiller transfers logical reasoning knowledge from the mined reasoning paths into the predictor. The two components are jointly optimized with a carefully designed variational EM algorithm. Extensive experiments and analyses on four sparse benchmarks demonstrate the effectiveness of the proposed method.
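To make the two-component design concrete, here is a minimal, self-contained sketch of the core densification idea. The class name, the GRU composer, and all sizes are illustrative assumptions, not the authors' implementation:

```python
import torch
import torch.nn as nn

# Sketch: the reasoning path distiller composes the relations along a mined
# multi-hop path into a single rich-semantic edge embedding, and that edge is
# added to the KG so the GNN-based predictor sees the long-range dependency
# as a direct neighbor. Names and sizes are invented for illustration.

class PathDistiller(nn.Module):
    """Encode a reasoning path (a sequence of relation ids) as one edge."""

    def __init__(self, n_rel: int, dim: int):
        super().__init__()
        self.rel_emb = nn.Embedding(n_rel, dim)
        self.rnn = nn.GRU(dim, dim, batch_first=True)  # hop-by-hop composition

    def forward(self, path_rel_ids: torch.Tensor) -> torch.Tensor:
        seq = self.rel_emb(path_rel_ids).unsqueeze(0)  # (1, path_len, dim)
        _, h = self.rnn(seq)
        return h.squeeze()                             # rich-semantic edge

distiller = PathDistiller(n_rel=10, dim=32)

# KG as an edge list of (head, relation_embedding, tail) triples.
edges = [
    (0, distiller.rel_emb(torch.tensor(1)), 2),
    (2, distiller.rel_emb(torch.tensor(4)), 5),
]

# Densify: the mined 2-hop path 0 -r1-> 2 -r4-> 5 becomes one direct edge
# 0 -> 5 carrying the composed semantics, supplementing the sparse structure.
edges.append((0, distiller(torch.tensor([1, 4])), 5))

# In the paper, the distiller additionally distills logical reasoning
# knowledge from mined paths into the GNN-based predictor, and the two
# components are optimized jointly with a variational EM algorithm.
```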
Related papers
- Preserving Node Distinctness in Graph Autoencoders via Similarity Distillation [9.395697548237333]
Graph autoencoders (GAEs) rely on distance-based criteria, such as mean-square error (MSE), to reconstruct the input graph.
However, relying solely on a single reconstruction criterion can cause the reconstructed graph to lose node distinctiveness.
We develop a simple yet effective similarity-distillation strategy to preserve the necessary distinctness in the reconstructed graph.
arXiv Detail & Related papers (2024-06-25T12:54:35Z)
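A tiny sketch of the distinctness-preserving loss described in the entry above, under assumed choices (cosine similarity, an alpha weighting) that the abstract does not specify:

```python
import torch

# Add a similarity-distillation term to the usual MSE reconstruction loss so
# that pairwise node similarities in the reconstruction stay close to those
# of the input. Weighting and the similarity measure are assumptions.

def pairwise_cosine(x: torch.Tensor) -> torch.Tensor:
    x = torch.nn.functional.normalize(x, dim=-1)
    return x @ x.T

def gae_loss(x: torch.Tensor, x_rec: torch.Tensor, alpha: float = 0.5):
    mse = (x_rec - x).pow(2).mean()                    # reconstruction term
    distill = (pairwise_cosine(x_rec) - pairwise_cosine(x)).pow(2).mean()
    return mse + alpha * distill                       # keep nodes distinct

# Toy usage with random "features" and a noisy reconstruction.
x = torch.randn(6, 8)
x_rec = x + 0.1 * torch.randn(6, 8)
loss = gae_loss(x, x_rec)
```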
- RobGC: Towards Robust Graph Condensation [61.259453496191696]
Graph neural networks (GNNs) have attracted widespread attention for their impressive capability in graph representation learning.
However, the increasing prevalence of large-scale graphs presents a significant challenge for GNN training due to their computational demands.
We propose graph condensation (GC) to generate an informative compact graph that enables efficient training of GNNs while retaining performance.
arXiv Detail & Related papers (2024-06-19T04:14:57Z)
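The abstract does not spell out RobGC's procedure; the toy sketch below shows the common gradient-matching recipe that graph condensation methods build on, with all sizes and the fixed synthetic adjacency as assumptions:

```python
import torch
import torch.nn.functional as F

# Gradient-matching condensation sketch (the general GC recipe; RobGC's
# robustness-oriented refinements are not reproduced here).

torch.manual_seed(0)
n_real, n_syn, d, c = 100, 10, 16, 3

# Toy "real" graph: random features, labels, row-normalized adjacency.
X = torch.randn(n_real, d)
y = torch.randint(0, c, (n_real,))
A = (torch.rand(n_real, n_real) < 0.05).float()
A = (A + A.T).clamp(max=1) + torch.eye(n_real)
A = A / A.sum(dim=1, keepdim=True)

# Learnable condensed graph: tiny feature matrix with a fixed identity
# adjacency (real methods also parameterize the synthetic structure).
Xs = torch.randn(n_syn, d, requires_grad=True)
ys = torch.arange(n_syn) % c
As = torch.eye(n_syn)

W = torch.randn(d, c, requires_grad=True)  # shared one-layer GCN weight

def logits(adj, feats, weight):
    return adj @ feats @ weight  # one propagation step, then linear head

opt = torch.optim.Adam([Xs], lr=0.01)
for step in range(200):
    g_real = torch.autograd.grad(
        F.cross_entropy(logits(A, X, W), y), W)[0].detach()
    g_syn = torch.autograd.grad(
        F.cross_entropy(logits(As, Xs, W), ys), W, create_graph=True)[0]
    match_loss = (g_real - g_syn).pow(2).sum()  # align training gradients
    opt.zero_grad()
    match_loss.backward()
    opt.step()
```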
- DEGREE: Decomposition Based Explanation For Graph Neural Networks [55.38873296761104]
We propose DEGREE to provide a faithful explanation for GNN predictions.
By decomposing the information generation and aggregation mechanism of GNNs, DEGREE allows tracking the contributions of specific components of the input graph to the final prediction.
We also design a subgraph-level interpretation algorithm to reveal complex interactions between graph nodes that are overlooked by previous methods.
arXiv Detail & Related papers (2023-05-22T10:29:52Z)
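A toy illustration of the decomposition idea for a single linear GCN layer, where per-source contributions sum exactly to the layer output. DEGREE's full algorithm extends such bookkeeping through nonlinearities and depth; this is only the linear core:

```python
import torch

# For a linear GCN layer h' = A_hat @ X @ W, the output at node i decomposes
# exactly into additive contributions from each source node j, namely
# A_hat[i, j] * (X @ W)[j].

torch.manual_seed(0)
n, d, k = 5, 4, 2
X = torch.randn(n, d)                        # node features
A = torch.eye(n)
A[0, 1] = A[1, 0] = A[1, 2] = A[2, 1] = 1.0  # a small path graph
A_hat = A / A.sum(1, keepdim=True)           # row-normalized adjacency
W = torch.randn(d, k)

out = A_hat @ X @ W                          # full layer output

# Contribution of source node j to target node i's representation.
contrib = A_hat.unsqueeze(-1) * (X @ W).unsqueeze(0)   # shape (n, n, k)

# Per-source contributions sum back to the full output (exact: the layer
# is linear).
assert torch.allclose(contrib.sum(dim=1), out, atol=1e-6)
print("contribution of node 1 to node 0:", contrib[0, 1])
```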
- River of No Return: Graph Percolation Embeddings for Efficient Knowledge Graph Reasoning [16.143817898963785]
We study graph neural network (GNN)-based embedding techniques for knowledge graph (KG) reasoning.
For the first time, we link the path redundancy issue in state-of-the-art KG reasoning models based on path encoding and message passing to the transformation error in model training.
We propose an efficient graph percolation process motivated by the percolation model in fluid mechanics, and design a lightweight GNN-based KG reasoning framework called Graph Percolation Embeddings (GraPE).
arXiv Detail & Related papers (2023-05-17T06:13:28Z)
- Cardinality Estimation over Knowledge Graphs with Embeddings and Graph Neural Networks [0.552480439325792]
Cardinality estimation over knowledge graphs (KGs) is crucial for query optimization.
We propose GNCE, a novel approach that leverages knowledge graph embeddings and graph neural networks (GNNs) to accurately predict the cardinality of conjunctive queries.
arXiv Detail & Related papers (2023-03-02T10:39:13Z)
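A minimal sketch of that setup, with an assumed one-layer message-passing encoder and mean-pooling readout; the paper's exact architecture may differ:

```python
import torch
import torch.nn as nn

# Encode a conjunctive query graph (whose nodes carry pretrained KG
# embeddings) with a small GNN and regress the log-scaled cardinality.
# Layer sizes and the readout are illustrative assumptions.

class CardinalityEstimator(nn.Module):
    def __init__(self, emb_dim: int, hidden: int = 64):
        super().__init__()
        self.msg = nn.Linear(emb_dim, hidden)
        self.upd = nn.Linear(emb_dim + hidden, hidden)
        self.out = nn.Linear(hidden, 1)

    def forward(self, x, adj):
        # x: (n, emb_dim) pretrained KG embeddings of query-graph nodes
        # adj: (n, n) row-normalized adjacency of the query graph
        m = adj @ torch.relu(self.msg(x))      # aggregate neighbor messages
        h = torch.relu(self.upd(torch.cat([x, m], dim=-1)))
        return self.out(h.mean(dim=0))         # predicted log-cardinality

# Toy usage: a 3-node query graph with 32-dimensional entity embeddings.
x = torch.randn(3, 32)
adj = torch.tensor([[0., 1., 0.], [.5, 0., .5], [0., 1., 0.]])
log_card = CardinalityEstimator(32)(x, adj)
```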
- A Comprehensive Study on Large-Scale Graph Training: Benchmarking and Rethinking [124.21408098724551]
Large-scale graph training is a notoriously challenging problem for graph neural networks (GNNs).
We present a new ensembling training scheme, named EnGCN, to address the existing issues.
Our proposed method achieves new state-of-the-art (SOTA) performance on large-scale datasets.
arXiv Detail & Related papers (2022-10-14T03:43:05Z)
- Mastering Spatial Graph Prediction of Road Networks [18.321172168775472]
We propose a graph-based framework that simulates the addition of sequences of graph edges.
In particular, given a partially generated graph associated with a satellite image, an RL agent nominates modifications that maximize a cumulative reward.
arXiv Detail & Related papers (2022-10-03T11:26:09Z)
- Comprehensive Graph Gradual Pruning for Sparse Training in Graph Neural Networks [52.566735716983956]
We propose a graph gradual pruning framework, termed CGP, to dynamically prune GNNs during training.
Unlike lottery-ticket-hypothesis (LTH) based methods, the proposed CGP approach requires no re-training, which significantly reduces the computation costs.
Our proposed strategy greatly improves both training and inference efficiency while matching or even exceeding the accuracy of existing methods.
arXiv Detail & Related papers (2022-07-18T14:23:31Z)
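As a rough illustration of the gradual-pruning family CGP belongs to (not the paper's specific graph-pruning criteria), one can grow sparsity along a schedule and mask small-magnitude weights in place, with no separate re-training round:

```python
import torch

# Gradual magnitude pruning: sparsity follows a cubic schedule, and the
# smallest-magnitude entries are zeroed in place at each step. The schedule
# shape and the final sparsity target are illustrative assumptions.

def sparsity_at(step: int, total: int, final_sparsity: float = 0.9) -> float:
    # Prune slowly at first, faster toward the end of training.
    frac = min(step / total, 1.0)
    return final_sparsity * (1.0 - (1.0 - frac) ** 3)

def prune_by_magnitude(weight: torch.Tensor, sparsity: float) -> torch.Tensor:
    # Zero out the smallest-magnitude fraction of entries; return the mask.
    k = int(sparsity * weight.numel())
    if k == 0:
        return torch.ones_like(weight)
    threshold = weight.abs().flatten().kthvalue(k).values
    mask = (weight.abs() > threshold).float()
    weight.data.mul_(mask)  # prune in place; no re-training round needed
    return mask

# Toy usage inside a training loop (parameter updates omitted).
w = torch.randn(64, 64)
for step in range(1, 101):
    mask = prune_by_magnitude(w, sparsity_at(step, total=100))
```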
- Explainable Sparse Knowledge Graph Completion via High-order Graph Reasoning Network [111.67744771462873]
This paper proposes a novel explainable model for sparse knowledge graphs (KGs).
It incorporates high-order reasoning into a graph convolutional network, named HoGRN.
HoGRN not only improves generalization to mitigate the information insufficiency issue but also provides interpretability.
arXiv Detail & Related papers (2022-07-14T10:16:56Z)
- RelWalk: A Latent Variable Model Approach to Knowledge Graph Embedding [50.010601631982425]
This paper extends the random walk model of word embeddings (Arora et al., 2016a) to Knowledge Graph Embeddings (KGEs).
We derive a scoring function that evaluates the strength of a relation R between two entities h (head) and t (tail).
We propose a learning objective motivated by the theoretical analysis to learn KGEs from a given knowledge graph.
arXiv Detail & Related papers (2021-01-25T13:31:29Z)
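For flavor, a sketch of the general shape such a scoring function can take, with relation-specific matrices acting on the head and tail embeddings; the paper's exact functional form and normalization follow its theoretical derivation and are not reproduced here:

```python
import torch

# Illustrative relation-strength score: the relation R is represented by two
# matrices mapping the head and tail embeddings into a shared space, and the
# score grows with the squared norm of their combined image. This is an
# assumed general shape, not the paper's derived formula.

torch.manual_seed(0)
d = 8
R1, R2 = torch.randn(d, d), torch.randn(d, d)   # relation-specific maps
h, t = torch.randn(d), torch.randn(d)           # head / tail embeddings

def score(h: torch.Tensor, t: torch.Tensor,
          R1: torch.Tensor, R2: torch.Tensor) -> float:
    z = R1 @ h + R2 @ t
    return 0.5 * float(z @ z)   # larger = stronger evidence for (h, R, t)

print(score(h, t, R1, R2))
```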