Enhancing Knowledge Graph Completion with GNN Distillation and Probabilistic Interaction Modeling
- URL: http://arxiv.org/abs/2505.12272v1
- Date: Sun, 18 May 2025 07:22:53 GMT
- Title: Enhancing Knowledge Graph Completion with GNN Distillation and Probabilistic Interaction Modeling
- Authors: Lingzhi Wang, Pengcheng Huang, Haotian Li, Yuliang Wei, Guodong Xin, Rui Zhang, Donglin Zhang, Zhenzhou Ji, Wei Wang
- Abstract summary: Knowledge graphs serve as fundamental structures for organizing interconnected data across diverse domains. Existing methods face critical challenges: deep graph neural networks (GNNs) suffer from over-smoothing, while embedding-based models fail to capture abstract relational features. This study proposes a unified framework that integrates GNN distillation and abstract probabilistic interaction modeling (APIM).
- Score: 14.550518793230404
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Knowledge graphs (KGs) serve as fundamental structures for organizing interconnected data across diverse domains. However, most KGs remain incomplete, limiting their effectiveness in downstream applications. Knowledge graph completion (KGC) aims to address this issue by inferring missing links, but existing methods face critical challenges: deep graph neural networks (GNNs) suffer from over-smoothing, while embedding-based models fail to capture abstract relational features. This study aims to overcome these limitations by proposing a unified framework that integrates GNN distillation and abstract probabilistic interaction modeling (APIM). The GNN distillation approach introduces an iterative message-feature filtering process to mitigate over-smoothing, preserving the discriminative power of node representations. The APIM module complements this by learning structured, abstract interaction patterns through probabilistic signatures and transition matrices, allowing for a richer, more flexible representation of entity and relation interactions. We apply GNN distillation to GNN-based models and APIM to embedding-based KGC models, conducting extensive evaluations on the widely used WN18RR and FB15K-237 datasets. Our results demonstrate significant performance gains over baseline models, showcasing the effectiveness of the proposed techniques. The findings highlight the importance of both controlling information propagation and leveraging structured probabilistic modeling, offering new avenues for advancing knowledge graph completion. Our code is available at https://anonymous.4open.science/r/APIM_and_GNN-Distillation-461C.
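To make the APIM idea concrete, here is a minimal NumPy sketch of scoring a triple with probabilistic signatures and transition matrices. All names, shapes, and the scoring form (head signature, relation transition, tail signature) are illustrative assumptions, not the paper's exact formulation; the paper should be consulted for the actual objective and parameterization.

```python
import numpy as np

rng = np.random.default_rng(0)
n_entities, n_relations, n_types = 5, 2, 4  # toy sizes

# Entity signatures: each row is a distribution over abstract interaction types
# (softmax over random logits stands in for learned parameters).
logits = rng.normal(size=(n_entities, n_types))
signatures = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)

# Relation transition matrices: row-stochastic maps between interaction types.
rel_logits = rng.normal(size=(n_relations, n_types, n_types))
transitions = np.exp(rel_logits) / np.exp(rel_logits).sum(axis=2, keepdims=True)

def apim_score(h: int, r: int, t: int) -> float:
    """Score a triple (h, r, t) by propagating the head's interaction
    pattern through the relation's transition matrix and matching it
    against the tail's pattern."""
    return float(signatures[h] @ transitions[r] @ signatures[t])

print(apim_score(0, 1, 3))  # probability-like score in (0, 1)
```

Because every factor is a proper distribution, the score stays in (0, 1) by construction; in a trained model the logits would be learned so that observed triples score higher than corrupted ones.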
Related papers
- Do We Really Need GNNs with Explicit Structural Modeling? MLPs Suffice for Language Model Representations [50.45261187796993]
Graph Neural Networks (GNNs) fail to fully utilize structural information, whereas Multi-Layer Perceptrons (MLPs) exhibit a surprising ability in structure-aware tasks. This paper introduces a comprehensive probing framework from an information-theoretic perspective.
arXiv Detail & Related papers (2025-06-26T18:10:28Z) - Large Language Model Enhancers for Graph Neural Networks: An Analysis from the Perspective of Causal Mechanism Identification [19.389891710579022]
We study the use of large language models (LLMs) as feature enhancers to optimize node representations, which are then used as inputs for graph neural networks (GNNs). Building on the analytical results, we design a plug-and-play optimization module to improve the information transfer between LLM enhancers and GNNs.
arXiv Detail & Related papers (2025-05-13T06:29:25Z) - Neighborhood-Order Learning Graph Attention Network for Fake News Detection [2.34863357088666]
We propose a novel model called Neighborhood-Order Learning Graph Attention Network (NOL-GAT) for fake news detection. This model allows each node in each layer to independently learn its optimal neighborhood order. To evaluate the model's performance, experiments are conducted on various fake news datasets.
arXiv Detail & Related papers (2025-02-10T18:51:57Z) - DeltaGNN: Graph Neural Network with Information Flow Control [5.563171090433323]
Graph Neural Networks (GNNs) are designed to process graph-structured data through neighborhood aggregations in the message passing process. Message passing enables GNNs to understand short-range spatial interactions, but also causes them to suffer from over-smoothing and over-squashing. We propose a mechanism called information flow control to address over-smoothing and over-squashing with linear computational overhead. We benchmark our model across 10 real-world datasets, including graphs with varying sizes, topologies, densities, and homophilic ratios, showing superior performance.
arXiv Detail & Related papers (2025-01-10T14:34:20Z) - Deep Sparse Latent Feature Models for Knowledge Graph Completion [24.342670268545085]
In this paper, we introduce a novel framework of sparse latent feature models for knowledge graphs.
Our approach not only effectively completes missing triples but also provides clear interpretability of the latent structures.
Our method significantly improves performance by revealing latent communities and producing interpretable representations.
arXiv Detail & Related papers (2024-11-24T03:17:37Z) - Graph as a feature: improving node classification with non-neural graph-aware logistic regression [2.952177779219163]
Graph-aware Logistic Regression (GLR) is a non-neural model designed for node classification tasks.
Unlike traditional graph algorithms that use only a fraction of the information accessible to GNNs, our proposed model simultaneously leverages both node features and the relationships between entities.
arXiv Detail & Related papers (2024-11-19T08:32:14Z) - DGNN: Decoupled Graph Neural Networks with Structural Consistency between Attribute and Graph Embedding Representations [62.04558318166396]
Graph neural networks (GNNs) demonstrate a robust capability for representation learning on graphs with complex structures.
A novel GNNs framework, dubbed Decoupled Graph Neural Networks (DGNN), is introduced to obtain a more comprehensive embedding representation of nodes.
Experimental results conducted on several graph benchmark datasets verify DGNN's superiority in node classification task.
arXiv Detail & Related papers (2024-01-28T06:43:13Z) - Factor Graph Neural Networks [20.211455592922736]
Graph Neural Networks (GNNs) can learn powerful representations in an end-to-end fashion with great success in many real-world applications.
We propose Factor Graph Neural Networks (FGNNs) to effectively capture higher-order relations for inference and learning.
arXiv Detail & Related papers (2023-08-02T00:32:02Z) - DEGREE: Decomposition Based Explanation For Graph Neural Networks [55.38873296761104]
We propose DEGREE to provide a faithful explanation for GNN predictions.
By decomposing the information generation and aggregation mechanism of GNNs, DEGREE allows tracking the contributions of specific components of the input graph to the final prediction.
We also design a subgraph level interpretation algorithm to reveal complex interactions between graph nodes that are overlooked by previous methods.
arXiv Detail & Related papers (2023-05-22T10:29:52Z) - Relation Embedding based Graph Neural Networks for Handling Heterogeneous Graph [58.99478502486377]
We propose a simple yet efficient framework to make the homogeneous GNNs have adequate ability to handle heterogeneous graphs.
Specifically, we propose Relation Embedding based Graph Neural Networks (RE-GNNs), which employ only one parameter per relation to embed the importance of edge type relations and self-loop connections.
arXiv Detail & Related papers (2022-09-23T05:24:18Z) - EIGNN: Efficient Infinite-Depth Graph Neural Networks [51.97361378423152]
Graph neural networks (GNNs) are widely used for modelling graph-structured data in numerous applications.
Motivated by this limitation, we propose a GNN model with infinite depth, which we call Efficient Infinite-Depth Graph Neural Networks (EIGNN).
We show that EIGNN has a better ability to capture long-range dependencies than recent baselines, and consistently achieves state-of-the-art performance.
arXiv Detail & Related papers (2022-02-22T08:16:58Z) - Learning Intents behind Interactions with Knowledge Graph for Recommendation [93.08709357435991]
Knowledge graph (KG) plays an increasingly important role in recommender systems.
Existing GNN-based models fail to identify user-item relation at a fine-grained level of intents.
We propose a new model, Knowledge Graph-based Intent Network (KGIN).
arXiv Detail & Related papers (2021-02-14T03:21:36Z) - Amortized Probabilistic Detection of Communities in Graphs [39.56798207634738]
We propose a simple framework for amortized community detection.
We combine the expressive power of GNNs with recent methods for amortized clustering.
We evaluate several models from our framework on synthetic and real datasets.
arXiv Detail & Related papers (2020-10-29T16:18:48Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.