Sign is Not a Remedy: Multiset-to-Multiset Message Passing for Learning on Heterophilic Graphs
- URL: http://arxiv.org/abs/2405.20652v1
- Date: Fri, 31 May 2024 07:39:22 GMT
- Title: Sign is Not a Remedy: Multiset-to-Multiset Message Passing for Learning on Heterophilic Graphs
- Authors: Langzhang Liang, Sunwoo Kim, Kijung Shin, Zenglin Xu, Shirui Pan, Yuan Qi
- Abstract summary: We propose a novel message passing function called Multiset-to-Multiset GNN (M2M-GNN).
Our theoretical analyses and extensive experiments demonstrate that M2M-GNN effectively alleviates the aforementioned limitations of SMP, yielding superior performance in comparison.
- Score: 77.42221150848535
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph Neural Networks (GNNs) have gained significant attention as a powerful modeling and inference method, especially for homophilic graph-structured data. To empower GNNs in heterophilic graphs, where adjacent nodes exhibit dissimilar labels or features, Signed Message Passing (SMP) has been widely adopted. However, there is a lack of theoretical and empirical analysis regarding the limitations of SMP. In this work, we unveil some potential pitfalls of SMP and their remedies. We first identify two limitations of SMP: undesirable representation update for multi-hop neighbors and vulnerability against oversmoothing issues. To overcome these challenges, we propose a novel message passing function called Multiset-to-Multiset GNN (M2M-GNN). Our theoretical analyses and extensive experiments demonstrate that M2M-GNN effectively alleviates the aforementioned limitations of SMP, yielding superior performance in comparison.
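To make the contrast concrete, below is a minimal numpy sketch of one signed message passing step, together with a hypothetical multiset-style aggregator that keeps several permutation-invariant statistics of the neighbor multiset instead of collapsing it into a single signed sum. The function names and choice of statistics are illustrative assumptions; the actual M2M-GNN operator is defined in the paper.

```python
import numpy as np

def signed_message_passing(H, A, S):
    """One signed message passing (SMP) step: aggregate neighbor features,
    flipping the sign on edges judged heterophilic.
    H: (n, d) features; A: (n, n) 0/1 adjacency; S: (n, n) signs in {+1, -1}."""
    return (S * A) @ H

def multiset_style_aggregation(H, A):
    """Hypothetical multiset-style alternative (NOT the paper's M2M operator):
    summarize each neighbor multiset with several permutation-invariant
    statistics (mean and max) rather than collapsing it into one signed sum."""
    n, d = H.shape
    deg = np.clip(A.sum(axis=1, keepdims=True), 1, None)
    mean_agg = (A @ H) / deg
    max_agg = np.stack([H[A[i] > 0].max(axis=0) if A[i].any() else np.zeros(d)
                        for i in range(n)])
    return np.concatenate([mean_agg, max_agg], axis=1)
```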
Related papers
- Better Not to Propagate: Understanding Edge Uncertainty and Over-smoothing in Signed Graph Neural Networks [3.4498722449655066]
We propose a novel method for estimating homophily and edge error ratio, integrated with dynamic selection between blocked and signed propagation during training.
Our theoretical analysis, supported by extensive experiments, demonstrates that blocking MP can be more effective than signed propagation under high edge error ratios.
arXiv Detail & Related papers (2024-08-09T06:46:06Z)
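A minimal sketch of the selection idea above, assuming a simple thresholded switch (the threshold, names, and signature here are illustrative, not the paper's estimator):

```python
import numpy as np

def select_propagation(H, A, S, est_edge_error_ratio, threshold=0.5):
    """Choose between blocked and signed propagation based on an estimated
    edge error ratio (the fraction of edges whose sign is wrong).
    Threshold and names are illustrative assumptions."""
    if est_edge_error_ratio > threshold:
        return H              # blocked: propagating noisy signs would hurt
    return (S * A) @ H        # otherwise signed propagation still helps
```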
- Spatio-Spectral Graph Neural Networks [50.277959544420455]
We propose Spatio-Spectral Graph Neural Networks (S$2$GNNs).
S$2$GNNs combine spatially and spectrally parametrized graph filters.
We show that S$2$GNNs vanquish over-squashing and yield strictly tighter approximation-theoretic error bounds than MPGNNs.
arXiv Detail & Related papers (2024-05-29T14:28:08Z)
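As a rough illustration of combining a spatially parametrized filter with a spectral one (not the S$2$GNN architecture itself), a one-hop local component can be summed with a global low-pass component built from the lowest Laplacian eigenvectors:

```python
import numpy as np

def spatio_spectral_layer(H, A, k=8):
    """Sum of a spatial filter (one-hop mean aggregation) and a spectral
    filter (projection onto the k lowest-frequency Laplacian eigenvectors).
    Illustrative only; S^2GNNs use learned parametrizations of both parts."""
    deg = A.sum(axis=1)
    L = np.diag(deg) - A                      # combinatorial Laplacian
    _, U = np.linalg.eigh(L)                  # eigenvectors, ascending eigenvalue
    Uk = U[:, :k]
    spectral = Uk @ (Uk.T @ H)                # global low-pass component
    spatial = (A @ H) / np.clip(deg, 1, None)[:, None]  # local component
    return spatial + spectral
```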
- Probabilistic Graph Rewiring via Virtual Nodes [21.273828055299408]
Message-passing graph neural networks (MPNNs) have emerged as a powerful paradigm for graph-based machine learning.
MPNNs face challenges such as under-reaching and over-squashing, where limited receptive fields and structural bottlenecks hinder information flow in the graph.
Here, we propose implicitly rewired message-passing neural networks (IPR-MPNNs), a novel approach that integrates implicit probabilistic graph rewiring into MPNNs.
arXiv Detail & Related papers (2024-05-27T16:11:49Z)
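The toy sketch below conveys the virtual-node rewiring idea: connect original nodes to a few added virtual nodes so that distant nodes gain short two-hop paths. IPR-MPNNs learn these assignments end-to-end; the i.i.d. sampling and all parameters here are assumptions for illustration.

```python
import numpy as np

def rewire_with_virtual_nodes(A, H, num_virtual=2, p=0.3, seed=0):
    """Append `num_virtual` virtual nodes and sample original-to-virtual
    edges with probability p each, shortening paths between distant nodes.
    Toy version: IPR-MPNNs learn these assignments rather than sampling."""
    rng = np.random.default_rng(seed)
    n, d = H.shape
    assign = rng.random((n, num_virtual)) < p
    A_new = np.zeros((n + num_virtual, n + num_virtual))
    A_new[:n, :n] = A
    A_new[:n, n:] = assign
    A_new[n:, :n] = assign.T
    H_new = np.vstack([H, np.zeros((num_virtual, d))])  # zero-init virtual features
    return A_new, H_new
```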
- How does over-squashing affect the power of GNNs? [39.52168593457813]
Graph Neural Networks (GNNs) are the state-of-the-art model for machine learning on graph-structured data.
We provide a rigorous analysis to determine which function classes of node features can be learned by an MPNN of a given capacity.
We prove that, to guarantee sufficient communication between pairs of nodes, the capacity of the MPNN must be large enough.
arXiv Detail & Related papers (2023-06-06T11:15:53Z)
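To see over-squashing concretely, the standard sensitivity argument bounds the influence of node v on node u after many message passing rounds by an entry of a power of the propagation matrix; below is a minimal proxy in that spirit (not this paper's analysis):

```python
import numpy as np

def influence_bound(A, u, v, num_layers):
    """Proxy for pairwise communication after `num_layers` rounds of message
    passing: the (u, v) entry of the row-normalized adjacency raised to that
    power. Vanishingly small values indicate over-squashed node pairs.
    Illustrative sketch of the usual sensitivity bound only."""
    deg = np.clip(A.sum(axis=1), 1, None)
    A_hat = A / deg[:, None]
    return np.linalg.matrix_power(A_hat, num_layers)[u, v]
```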
- MGNNI: Multiscale Graph Neural Networks with Implicit Layers [53.75421430520501]
Implicit graph neural networks (GNNs) have been proposed to capture long-range dependencies in underlying graphs.
We introduce and justify two weaknesses of implicit GNNs: the constrained expressiveness due to their limited effective range for capturing long-range dependencies, and their lack of ability to capture multiscale information on graphs at multiple resolutions.
We propose a multiscale graph neural network with implicit layers (MGNNI) which is able to model multiscale structures on graphs and has an expanded effective range for capturing long-range dependencies.
arXiv Detail & Related papers (2022-10-15T18:18:55Z)
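A minimal fixed-point sketch of an implicit layer, which decouples the effective receptive field from a fixed layer count (the contraction setup is an illustrative assumption; MGNNI additionally mixes propagation matrices at several scales):

```python
import numpy as np

def implicit_gnn_layer(H0, A, W, gamma=0.5, iters=100, tol=1e-6):
    """Solve Z = gamma * A_hat @ Z @ W + H0 by fixed-point iteration.
    Converges when gamma * ||A_hat|| * ||W|| < 1, so W should be scaled
    accordingly. Illustrative sketch, not the MGNNI parameterization."""
    deg = np.clip(A.sum(axis=1), 1, None)
    A_hat = A / deg[:, None]                  # row-normalized propagation
    Z = H0.copy()
    for _ in range(iters):
        Z_next = gamma * A_hat @ Z @ W + H0
        if np.linalg.norm(Z_next - Z) < tol:
            break
        Z = Z_next
    return Z
```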
- RU-Net: Regularized Unrolling Network for Scene Graph Generation [92.95032610978511]
Scene graph generation (SGG) aims to detect objects and predict the relationship between each pair of objects.
Existing SGG methods usually suffer from several issues, including 1) ambiguous object representations, and 2) low diversity in relationship predictions.
We propose a regularized unrolling network (RU-Net) to address both problems.
arXiv Detail & Related papers (2022-05-03T04:21:15Z)
- Boosting Graph Neural Networks by Injecting Pooling in Message Passing [4.952681349410351]
We propose a new, adaptable, and powerful MP framework to prevent over-smoothing.
Our bilateral-MP estimates a pairwise modular gradient by utilizing the class information of nodes.
Experiments on five medium-size benchmark datasets indicate that the bilateral-MP improves performance by alleviating over-smoothing.
arXiv Detail & Related papers (2022-02-08T08:21:20Z)
- Permutation-equivariant and Proximity-aware Graph Neural Networks with Stochastic Message Passing [88.30867628592112]
Graph neural networks (GNNs) are an emerging class of machine learning models on graphs.
Permutation-equivariance and proximity-awareness are two important properties highly desirable for GNNs.
We show that existing GNNs, mostly based on the message-passing mechanism, cannot simultaneously preserve the two properties.
In order to preserve node proximities, we augment the existing GNNs with stochastic node representations.
arXiv Detail & Related papers (2020-09-05T16:46:56Z)
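A minimal sketch of why stochastic node representations help with proximity: propagating i.i.d. random signatures makes nearby nodes' features correlated, encoding proximity that purely permutation-equivariant message passing cannot preserve. The dimension and hop count below are illustrative assumptions.

```python
import numpy as np

def stochastic_proximity_features(A, dim=16, hops=3, seed=0):
    """Assign each node an i.i.d. random signature and smooth it over the
    graph; nodes that are close end up with correlated features, so
    concatenating these with ordinary GNN features injects proximity
    information. Illustrative sketch of the stochastic-MP idea."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    E = rng.standard_normal((n, dim))         # random node signatures
    deg = np.clip(A.sum(axis=1), 1, None)
    A_hat = A / deg[:, None]
    for _ in range(hops):
        E = A_hat @ E                         # smooth signatures over the graph
    return E
```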
- Optimization and Generalization Analysis of Transduction through Gradient Boosting and Application to Multi-scale Graph Neural Networks [60.22494363676747]
It is known that current graph neural networks (GNNs) are difficult to make deep due to the problem known as over-smoothing.
Multi-scale GNNs are a promising approach for mitigating the over-smoothing problem.
We derive the optimization and generalization guarantees of transductive learning algorithms that include multi-scale GNNs.
arXiv Detail & Related papers (2020-06-15T17:06:17Z)
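As a toy illustration of transductive gradient boosting over multi-scale graph features (the paper's analysis is far more general; the names and hyperparameters below are assumptions):

```python
import numpy as np

def multiscale_boosting(H, A, y, train_idx, scales=(0, 1, 2), rounds=5, lr=0.5):
    """Transductive boosting sketch: build features A_hat^k @ H at several
    scales k, then repeatedly fit a least-squares weak learner to the
    residual on the labeled nodes and update predictions on all nodes.
    Illustrative setup, not the paper's algorithm."""
    deg = np.clip(A.sum(axis=1), 1, None)
    A_hat = A / deg[:, None]
    X = np.concatenate([np.linalg.matrix_power(A_hat, k) @ H for k in scales],
                       axis=1)
    pred = np.zeros_like(y, dtype=float)
    for _ in range(rounds):
        resid = y - pred
        w, *_ = np.linalg.lstsq(X[train_idx], resid[train_idx], rcond=None)
        pred += lr * (X @ w)                  # weak learner applied transductively
    return pred
```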
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.