Over-Squashing in Riemannian Graph Neural Networks
- URL: http://arxiv.org/abs/2311.15945v1
- Date: Mon, 27 Nov 2023 15:51:07 GMT
- Title: Over-Squashing in Riemannian Graph Neural Networks
- Authors: Julia Balla
- Abstract summary: Most graph neural networks (GNNs) are prone to the phenomenon of over-squashing.
Recent works have shown that the topology of the graph has the greatest impact on over-squashing.
We explore whether over-squashing can be mitigated through the embedding space of the GNN.
- Score: 1.6317061277457001
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Most graph neural networks (GNNs) are prone to the phenomenon of
over-squashing in which node features become insensitive to information from
distant nodes in the graph. Recent works have shown that the topology of the
graph has the greatest impact on over-squashing, suggesting graph rewiring
approaches as a suitable solution. In this work, we explore whether
over-squashing can be mitigated through the embedding space of the GNN. In
particular, we consider the generalization of Hyperbolic GNNs (HGNNs) to
Riemannian manifolds of variable curvature in which the geometry of the
embedding space is faithful to the graph's topology. We derive bounds on the
sensitivity of the node features in these Riemannian GNNs as the number of
layers increases, which yield promising theoretical and empirical results for
alleviating over-squashing in graphs with negative curvature.
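Sensitivity analyses of over-squashing typically bound the Jacobian norm $\|\partial h_u^{(L)} / \partial x_v\|$ by a constant times $(\hat{A}^L)_{uv}$, where $\hat{A}$ is the normalized adjacency matrix. The snippet below is a minimal sketch of that Euclidean sensitivity proxy on a bottlenecked graph; it is illustrative only and is not the paper's Riemannian bound (the barbell graph, node choices, and use of numpy/networkx are assumptions for the demo).

```python
# Minimal sketch (not the paper's method): over-squashing analyses bound the
# Jacobian sensitivity |d h_u^(L) / d x_v| by C^L * (A_hat^L)_{uv}, where
# A_hat = D^{-1/2} A D^{-1/2}. We compute that proxy on a barbell graph,
# whose single bridge edge creates a bottleneck between two cliques.
import numpy as np
import networkx as nx

G = nx.barbell_graph(5, 0)                   # two 5-cliques joined by one edge
A = nx.to_numpy_array(G)
deg = A.sum(axis=1)
A_hat = A / np.sqrt(np.outer(deg, deg))      # symmetric normalization

u, v = 0, 9                                  # nodes on opposite sides of the bridge
for L in range(1, 9):
    proxy = np.linalg.matrix_power(A_hat, L)[u, v]
    print(f"L={L}: (A_hat^L)[u,v] = {proxy:.2e}")
```

For small $L$ the proxy across the bridge remains much smaller than the within-clique entries of $\hat{A}^L$, which is exactly the insensitivity to distant nodes described above; the paper asks whether a negatively curved embedding space can loosen the corresponding bound.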
Related papers
- Spiking Graph Neural Network on Riemannian Manifolds [51.15400848660023]
Graph neural networks (GNNs) have become the dominant solution for learning on graphs.
Existing spiking GNNs consider graphs in Euclidean space, ignoring the structural geometry.
We present a Manifold-valued Spiking GNN (MSG).
MSG achieves superior performance to previous spiking GNNs and better energy efficiency than conventional GNNs.
arXiv Detail & Related papers (2024-10-23T15:09:02Z)
- Generalization of Geometric Graph Neural Networks [84.01980526069075]
We study the generalization capabilities of geometric graph neural networks (GNNs).
We prove a bound on the gap between the optimal empirical risk and the optimal statistical risk of this GNN.
The key observation is that generalization can be achieved with a single large graph, rather than being limited by the graph's size as in previous results.
arXiv Detail & Related papers (2024-09-08T18:55:57Z)
- A Manifold Perspective on the Statistical Generalization of Graph Neural Networks [84.01980526069075]
We take a manifold perspective to establish the statistical generalization theory of GNNs on graphs sampled from a manifold in the spectral domain.
We prove that the generalization bounds of GNNs decrease linearly with graph size on a logarithmic scale, and increase linearly with the spectral continuity constants of the filter functions.
arXiv Detail & Related papers (2024-06-07T19:25:02Z)
- DeepRicci: Self-supervised Graph Structure-Feature Co-Refinement for Alleviating Over-squashing [72.70197960100677]
Graph Structure Learning (GSL) plays an important role in boosting Graph Neural Networks (GNNs) with a refined graph.
GSL solutions usually focus on structure refinement with task-specific supervision (e.g., node classification), or overlook the inherent weaknesses of GNNs themselves.
We propose to study self-supervised graph structure-feature co-refinement for effectively alleviating the issue of over-squashing in typical GNNs.
arXiv Detail & Related papers (2024-01-23T14:06:08Z)
- Robust Graph Neural Network based on Graph Denoising [10.564653734218755]
Graph Neural Networks (GNNs) have emerged as a prominent alternative for learning problems on non-Euclidean datasets.
This work proposes a robust implementation of GNNs that explicitly accounts for the presence of perturbations in the observed topology.
arXiv Detail & Related papers (2023-12-11T17:43:57Z)
- How Curvature Enhance the Adaptation Power of Framelet GCNs [27.831929635701886]
Graph neural networks (GNNs) have been shown to be powerful for modeling graph-structured data.
This paper introduces a new approach to enhancing GNNs with discrete graph Ricci curvature.
We show that our curvature-based GNN model outperforms state-of-the-art baselines on both homophilic and heterophilic graph datasets.
arXiv Detail & Related papers (2023-07-19T06:05:33Z)
- DEGREE: Decomposition Based Explanation For Graph Neural Networks [55.38873296761104]
We propose DEGREE to provide a faithful explanation for GNN predictions.
By decomposing the information generation and aggregation mechanism of GNNs, DEGREE allows tracking the contributions of specific components of the input graph to the final prediction.
We also design a subgraph-level interpretation algorithm to reveal complex interactions between graph nodes that are overlooked by previous methods.
arXiv Detail & Related papers (2023-05-22T10:29:52Z)
- Graph Neural Network Bandits [89.31889875864599]
We consider the bandit optimization problem with the reward function defined over graph-structured data.
Key challenges in this setting are scaling to large domains and to graphs with many nodes.
We show that graph neural networks (GNNs) can be used to estimate the reward function.
arXiv Detail & Related papers (2022-07-13T18:12:36Z)
- Understanding over-squashing and bottlenecks on graphs via curvature [17.359098638324546]
Over-squashing arises when the number of $k$-hop neighbors grows rapidly with $k$, so that information from an exponentially growing receptive field is compressed into fixed-size node vectors.
We introduce a new edge-based curvature and prove that negatively curved edges are responsible for over-squashing.
We also propose and experimentally test a curvature-based rewiring method to alleviate over-squashing (see the curvature sketch below).
arXiv Detail & Related papers (2021-11-29T13:27:56Z)
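To make the curvature connection concrete, the snippet below is a minimal sketch that uses the simple augmented Forman curvature $F(u,v) = 4 - d_u - d_v + 3\,\#\triangle(u,v)$ as a cheap stand-in for the balanced Forman curvature this paper actually introduces (the formula, graph, and networkx usage are illustrative assumptions, not the paper's definition).

```python
# Minimal sketch (simplified stand-in, not the paper's balanced Forman
# curvature): augmented Forman curvature F(u,v) = 4 - deg(u) - deg(v)
# + 3 * #triangles(u,v). Strongly negative edges flag bottlenecks, the
# natural targets for curvature-based rewiring.
import networkx as nx

def forman_curvature(G, u, v):
    triangles = len(set(G[u]) & set(G[v]))   # common neighbors closing a triangle
    return 4 - G.degree(u) - G.degree(v) + 3 * triangles

G = nx.barbell_graph(5, 0)                    # bridge edge (4, 5) is the bottleneck
curv = {(u, v): forman_curvature(G, u, v) for u, v in G.edges()}
worst = min(curv, key=curv.get)
print("most negatively curved edge:", worst, "curvature:", curv[worst])
```

On this graph the bridge edge comes out most negative (curvature $-6$ versus positive values inside the cliques), matching the finding that negatively curved edges are where over-squashing happens; a rewiring method in this spirit adds edges around such edges to widen the bottleneck.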
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the accuracy of this information and is not responsible for any consequences arising from its use.