Pure Message Passing Can Estimate Common Neighbor for Link Prediction
- URL: http://arxiv.org/abs/2309.00976v4
- Date: Mon, 14 Oct 2024 14:11:22 GMT
- Title: Pure Message Passing Can Estimate Common Neighbor for Link Prediction
- Authors: Kaiwen Dong, Zhichun Guo, Nitesh V. Chawla
- Abstract summary: We study the proficiency of MPNNs in approximating Common Neighbor (CN) heuristics and introduce the Message Passing Link Predictor (MPLP), a novel link prediction model.
- Score: 25.044734252779975
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Message Passing Neural Networks (MPNNs) have emerged as the *de facto* standard in graph representation learning. However, when it comes to link prediction, they often struggle, surpassed by simple heuristics such as Common Neighbor (CN). This discrepancy stems from a fundamental limitation: while MPNNs excel in node-level representation, they stumble with encoding the joint structural features essential to link prediction, like CN. To bridge this gap, we posit that, by harnessing the orthogonality of input vectors, pure message-passing can indeed capture joint structural features. Specifically, we study the proficiency of MPNNs in approximating CN heuristics. Based on our findings, we introduce the Message Passing Link Predictor (MPLP), a novel link prediction model. MPLP taps into quasi-orthogonal vectors to estimate link-level structural features, all while preserving the node-level complexities. Moreover, our approach demonstrates that leveraging message-passing to capture structural features could offset MPNNs' expressiveness limitations at the expense of estimation variance. We conduct experiments on benchmark datasets from various domains, where our method consistently outperforms the baseline methods.
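The core idea of the abstract can be sketched in a few lines. This is a minimal illustration of the quasi-orthogonality argument, not the authors' MPLP implementation; function names and dimensions are illustrative. Random unit vectors in high dimension are nearly orthogonal, so after one round of untrained message passing, inner products of aggregated vectors approximate common-neighbor counts:

```python
import numpy as np

rng = np.random.default_rng(0)

def quasi_orthogonal_vectors(n_nodes, dim):
    """Random unit vectors in a high-dimensional space are quasi-orthogonal:
    <x_i, x_i> = 1 while <x_i, x_j> concentrates around 0 for i != j."""
    X = rng.normal(size=(n_nodes, dim))
    return X / np.linalg.norm(X, axis=1, keepdims=True)

def estimate_common_neighbors(adj, X):
    """One round of untrained message passing sums each node's neighbor
    vectors; the inner product of two aggregated vectors then concentrates
    around the exact common-neighbor count |N(u) ∩ N(v)|."""
    H = adj @ X      # H[u] = sum of x_j over all neighbors j of u
    return H @ H.T   # entry (u, v) estimates the CN count
```

The exact counts are the entries of A², and the estimator's noise shrinks like 1/sqrt(dim), which matches the abstract's point that message passing buys structural expressiveness "at the expense of estimation variance".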
Related papers
- RelGNN: Composite Message Passing for Relational Deep Learning [56.48834369525997]
We introduce RelGNN, a novel GNN framework specifically designed to capture the unique characteristics of relational databases.
At the core of our approach is the introduction of atomic routes, which are sequences of nodes forming high-order tripartite structures.
RelGNN consistently achieves state-of-the-art accuracy with up to 25% improvement.
arXiv Detail & Related papers (2025-02-10T18:58:40Z)
- Can GNNs Learn Link Heuristics? A Concise Review and Evaluation of Link Prediction Methods [16.428742189544955]
This paper explores the ability of Graph Neural Networks (GNNs) in learning various forms of information for link prediction.
Our analysis reveals that GNNs cannot effectively learn structural information related to the number of common neighbors between two nodes.
Also, our extensive experiments indicate that trainable node embeddings can improve the performance of GNN-based link prediction models.
arXiv Detail & Related papers (2024-11-22T03:38:20Z)
- Link Prediction with Untrained Message Passing Layers [0.716879432974126]
We study the use of various untrained message passing layers in graph neural networks.
We find that untrained message passing layers can lead to competitive and even superior performance compared to fully trained MPNNs.
arXiv Detail & Related papers (2024-06-24T14:46:34Z)
- A Pure Transformer Pretraining Framework on Text-attributed Graphs [50.833130854272774]
We introduce a feature-centric pretraining perspective by treating graph structure as a prior.
Our framework, Graph Sequence Pretraining with Transformer (GSPT), samples node contexts through random walks.
GSPT can be easily adapted to both node classification and link prediction, demonstrating promising empirical success on various datasets.
arXiv Detail & Related papers (2024-06-19T22:30:08Z)
- Learning Scalable Structural Representations for Link Prediction with Bloom Signatures [39.63963077346406]
Graph neural networks (GNNs) are known to perform sub-optimally on link prediction tasks.
We propose to learn structural link representations by augmenting the message-passing framework of GNNs with Bloom signatures.
Our proposed model achieves comparable or better performance than existing edge-wise GNN models.
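The paper's learned signatures are not reproduced here, but the underlying mechanism can be sketched with plain Bloom filters: hash each neighbor set into a fixed-size bit array, estimate set sizes from bit counts, and recover the intersection via inclusion-exclusion (the union's filter is just the bitwise OR). All names and parameter values below are illustrative:

```python
import hashlib
import math

M, K = 4096, 4  # filter size in bits, hashes per item (illustrative values)

def bloom_signature(items, m=M, k=K):
    """Hash a neighbor set into an m-bit Bloom filter, stored as an int."""
    bits = 0
    for item in items:
        for i in range(k):
            h = hashlib.blake2b(f"{i}:{item}".encode(), digest_size=8)
            bits |= 1 << (int.from_bytes(h.digest(), "big") % m)
    return bits

def bloom_cardinality(bits, m=M, k=K):
    """Standard set-size estimate from the number of set bits."""
    z = bin(bits).count("1")
    return -(m / k) * math.log(1.0 - z / m)

def estimate_cn(sig_u, sig_v, m=M, k=K):
    """|N(u) ∩ N(v)| = |N(u)| + |N(v)| - |N(u) ∪ N(v)|."""
    return (bloom_cardinality(sig_u, m, k) + bloom_cardinality(sig_v, m, k)
            - bloom_cardinality(sig_u | sig_v, m, k))
```

Each node's signature is a constant-size summary of its neighborhood, so link-level structure can be estimated without materializing pairwise neighbor sets.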
arXiv Detail & Related papers (2023-12-28T02:21:40Z)
- Revisiting Link Prediction: A Data Perspective [59.296773787387224]
Link prediction, a fundamental task on graphs, has proven indispensable in various applications, e.g., friend recommendation, protein analysis, and drug interaction prediction.
Evidence in existing literature underscores the absence of a universally best algorithm suitable for all datasets.
We recognize three fundamental factors critical to link prediction: local structural proximity, global structural proximity, and feature proximity.
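As a concrete illustration of the three factors, each can be instantiated with a simple heuristic for a candidate node pair. The specific choices below (common-neighbor count, BFS distance, attribute cosine similarity) are illustrative stand-ins, not the paper's definitions:

```python
from collections import deque

import numpy as np

def local_proximity(adj, u, v):
    """Local structural proximity: common-neighbor count."""
    return int(np.dot(adj[u], adj[v]))

def global_proximity(adj, u, v):
    """Global structural proximity: shortest-path distance via BFS
    (-1 if v is unreachable from u)."""
    n = len(adj)
    dist = [-1] * n
    dist[u] = 0
    queue = deque([u])
    while queue:
        x = queue.popleft()
        if x == v:
            return dist[x]
        for y in range(n):
            if adj[x][y] and dist[y] < 0:
                dist[y] = dist[x] + 1
                queue.append(y)
    return -1

def feature_proximity(feat_u, feat_v):
    """Feature proximity: cosine similarity of node attributes."""
    return float(feat_u @ feat_v /
                 (np.linalg.norm(feat_u) * np.linalg.norm(feat_v)))
```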
arXiv Detail & Related papers (2023-10-01T21:09:59Z)
- Neural Common Neighbor with Completion for Link Prediction [25.871382203332903]
We introduce MPNN-then-SF, an innovative architecture leveraging structural feature (SF) to guide MPNN's representation pooling.
We investigate the impact of graph incompleteness on SFs such as common neighbors.
To address this issue, we propose to use a link prediction model to complete the common neighbor structure.
arXiv Detail & Related papers (2023-02-02T05:45:09Z)
- Graph Neural Networks are Inherently Good Generalizers: Insights by Bridging GNNs and MLPs [71.93227401463199]
This paper traces the major source of GNNs' performance gain to their intrinsic generalization capability, by introducing an intermediate model class dubbed Propagational MLP (PMLP).
We observe that PMLPs consistently perform on par with (or even exceed) their GNN counterparts, while being much more efficient in training.
arXiv Detail & Related papers (2022-12-18T08:17:32Z)
- Link Prediction with Contextualized Self-Supervision [63.25455976593081]
Link prediction aims to infer the existence of a link between two nodes in a network.
Traditional link prediction algorithms are hindered by three major challenges -- link sparsity, node attribute noise and network dynamics.
We propose a Contextualized Self-Supervised Learning framework that fully exploits structural context prediction for link prediction.
arXiv Detail & Related papers (2022-01-25T03:12:32Z)
- VQ-GNN: A Universal Framework to Scale up Graph Neural Networks using Vector Quantization [70.8567058758375]
VQ-GNN is a universal framework to scale up any convolution-based GNNs using Vector Quantization (VQ) without compromising the performance.
Our framework avoids the "neighbor explosion" problem of GNNs using quantized representations combined with a low-rank version of the graph convolution matrix.
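The following is a generic sketch of the vector-quantization idea, not VQ-GNN's actual learned codebook or low-rank machinery: each node's feature is snapped to its nearest codeword, so neighbor aggregation depends only on a small codebook plus per-node code indices rather than on every out-of-batch neighbor's full feature vector. All names are illustrative:

```python
import numpy as np

def vector_quantize(features, codebook):
    """Assign each feature row to its nearest codeword (L2 distance)."""
    dists = ((features[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
    return dists.argmin(axis=1)

def aggregate_with_vq(adj, features, codebook):
    """Neighbor aggregation over quantized features: messages are
    reconstructed from the codebook, so memory scales with the codebook
    size instead of the exploding neighborhood."""
    codes = vector_quantize(features, codebook)
    return adj @ codebook[codes]
```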
arXiv Detail & Related papers (2021-10-27T11:48:50Z)
- Channel-Wise Early Stopping without a Validation Set via NNK Polytope Interpolation [36.479195100553085]
Convolutional neural networks (ConvNets) comprise high-dimensional feature spaces formed by the aggregation of multiple channels.
We present channel-wise DeepNNK, a novel generalization estimate based on non-negative kernel regression (NNK) graphs.
arXiv Detail & Related papers (2021-07-27T17:33:30Z)
- Explaining and Improving Model Behavior with k Nearest Neighbor Representations [107.24850861390196]
We propose using k nearest neighbor representations to identify training examples responsible for a model's predictions.
We show that kNN representations are effective at uncovering learned spurious associations.
Our results indicate that the kNN approach makes the finetuned model more robust to adversarial inputs.
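The retrieval step behind this approach can be sketched as follows; this is a generic cosine-similarity kNN over representation vectors, with illustrative names, rather than the paper's exact pipeline:

```python
import numpy as np

def nearest_training_examples(train_reps, query_rep, k=3):
    """Rank training examples by cosine similarity to a query's hidden
    representation; the top-k serve as candidate explanations for the
    model's prediction on that query."""
    t = train_reps / np.linalg.norm(train_reps, axis=1, keepdims=True)
    q = query_rep / np.linalg.norm(query_rep)
    return np.argsort(-(t @ q))[:k]
```

Inspecting the retrieved neighbors (e.g. spotting a shared surface artifact rather than a semantic feature) is what surfaces the learned spurious associations mentioned above.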
arXiv Detail & Related papers (2020-10-18T16:55:25Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.