Revisiting Over-smoothing and Over-squashing Using Ollivier-Ricci
Curvature
- URL: http://arxiv.org/abs/2211.15779v3
- Date: Wed, 31 May 2023 07:50:48 GMT
- Title: Revisiting Over-smoothing and Over-squashing Using Ollivier-Ricci
Curvature
- Authors: Khang Nguyen and Hieu Nong and Vinh Nguyen and Nhat Ho and Stanley
Osher and Tan Nguyen
- Abstract summary: Our study reveals the key connection between the local graph geometry and the occurrence of over-smoothing and over-squashing.
We propose the Batch Ollivier-Ricci Flow, a novel rewiring algorithm capable of simultaneously addressing both over-smoothing and over-squashing.
- Score: 11.592567773739411
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph Neural Networks (GNNs) have been shown to be inherently
susceptible to the problems of over-smoothing and over-squashing. These issues
limit the ability of GNNs to model complex graph interactions by reducing
their effectiveness at incorporating information from distant nodes. Our study
reveals the key connection between the local graph geometry and the occurrence
of both of these issues, thereby providing a unified framework for studying
them at a local scale using the Ollivier-Ricci curvature. Specifically, we
demonstrate that over-smoothing is linked to positive graph curvature while
over-squashing is linked to negative graph curvature. Based on our theory, we
propose the Batch Ollivier-Ricci Flow, a novel rewiring algorithm capable of
simultaneously addressing both over-smoothing and over-squashing.
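The sign pattern stated in the abstract is easy to probe numerically. The following is a minimal sketch, assuming an unweighted NetworkX graph, of the standard Ollivier-Ricci curvature $\kappa(u,v) = 1 - W_1(\mu_u, \mu_v)/d(u,v)$ with lazy random-walk neighbor measures; the helper names and the idleness value $\alpha = 0.5$ are illustrative choices, not the authors' reference implementation. On a barbell graph, edges inside a clique come out positively curved (the over-smoothing regime), while the bridge between the two cliques comes out negatively curved (the over-squashing regime).

```python
# Minimal Ollivier-Ricci curvature sketch (alpha-lazy measures, unweighted graph).
import networkx as nx
import numpy as np
from scipy.optimize import linprog


def neighbour_measure(G, x, alpha=0.5):
    """Lazy measure: mass alpha stays on x, (1 - alpha) is split over N(x)."""
    nbrs = list(G.neighbors(x))
    support = [x] + nbrs
    mass = [alpha] + [(1.0 - alpha) / len(nbrs)] * len(nbrs)
    return support, np.array(mass)


def wasserstein_1(G, mu, nu):
    """W1 distance between two measures, solved as a transport linear program."""
    (supp_u, mass_u), (supp_v, mass_v) = mu, nu
    # Ground cost: shortest-path distances between the support points.
    cost = np.array([[nx.shortest_path_length(G, a, b) for b in supp_v]
                     for a in supp_u], dtype=float)
    m, n = cost.shape
    A_eq, b_eq = [], []
    for i in range(m):              # row sums of the plan match mass_u
        row = np.zeros(m * n)
        row[i * n:(i + 1) * n] = 1.0
        A_eq.append(row)
        b_eq.append(mass_u[i])
    for j in range(n):              # column sums of the plan match mass_v
        col = np.zeros(m * n)
        col[j::n] = 1.0
        A_eq.append(col)
        b_eq.append(mass_v[j])
    res = linprog(cost.ravel(), A_eq=np.array(A_eq), b_eq=np.array(b_eq),
                  bounds=(0, None), method="highs")
    return res.fun


def ollivier_ricci_curvature(G, u, v, alpha=0.5):
    """kappa(u, v) = 1 - W1(mu_u, mu_v) / d(u, v)."""
    mu_u = neighbour_measure(G, u, alpha)
    mu_v = neighbour_measure(G, v, alpha)
    return 1.0 - wasserstein_1(G, mu_u, mu_v) / nx.shortest_path_length(G, u, v)


# Example: edges inside a clique are positively curved (over-smoothing regime),
# while the bridge between the two cliques is negatively curved (over-squashing).
G = nx.barbell_graph(5, 0)
print("clique edge:", round(ollivier_ricci_curvature(G, 0, 1), 3))
print("bridge edge:", round(ollivier_ricci_curvature(G, 4, 5), 3))
```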
Related papers
- Revealing Decurve Flows for Generalized Graph Propagation [108.80758541147418]
This study addresses the limitations of the traditional analysis of message-passing, central to graph learning, by defining generalized propagation with directed and weighted graphs.
We include a preliminary exploration of learned propagation patterns in datasets, a first in the field.
arXiv Detail & Related papers (2024-02-13T14:13:17Z) - DeepRicci: Self-supervised Graph Structure-Feature Co-Refinement for
Alleviating Over-squashing [72.70197960100677]
Graph Structure Learning (GSL) plays an important role in boosting Graph Neural Networks (GNNs) with a refined graph.
GSL solutions usually focus on structure refinement with task-specific supervision (e.g., node classification) or overlook the inherent weakness of GNNs themselves.
We propose to study self-supervised graph structure-feature co-refinement for effectively alleviating the issue of over-squashing in typical GNNs.
arXiv Detail & Related papers (2024-01-23T14:06:08Z) - Learning to Reweight for Graph Neural Network [63.978102332612906]
Graph Neural Networks (GNNs) show promising results for graph tasks.
The generalization ability of existing GNNs degrades when there are distribution shifts between training and testing graph data.
We propose a novel nonlinear graph decorrelation method, which can substantially improve the out-of-distribution generalization ability.
arXiv Detail & Related papers (2023-12-19T12:25:10Z) - Over-Squashing in Riemannian Graph Neural Networks [1.6317061277457001]
Most graph neural networks (GNNs) are prone to the phenomenon of over-squashing.
Recent works have shown that the topology of the graph has the greatest impact on over-squashing.
We explore whether over-squashing can be mitigated through the embedding space of the GNN.
arXiv Detail & Related papers (2023-11-27T15:51:07Z) - Mitigating Over-Smoothing and Over-Squashing using Augmentations of Forman-Ricci Curvature [1.1126342180866644]
We propose a rewiring technique based on Augmented Forman-Ricci curvature (AFRC), a scalable curvature notion.
We prove that AFRC effectively characterizes over-smoothing and over-squashing effects in message-passing GNNs.
arXiv Detail & Related papers (2023-09-17T21:43:18Z) - Curvature-based Pooling within Graph Neural Networks [3.1733862899654643]
Over-squashing and over-smoothing limit the capabilities of graph neural networks (GNNs).
CurvPool exploits the notion of curvature of a graph to adaptively identify structures responsible for both over-smoothing and over-squashing.
We compare it to other state-of-the-art pooling approaches and establish its competitiveness in terms of classification accuracy, computational complexity, and flexibility.
arXiv Detail & Related papers (2023-08-31T08:00:08Z) - Geometric Graph Filters and Neural Networks: Limit Properties and
Discriminability Trade-offs [122.06927400759021]
We study the relationship between a graph neural network (GNN) and a manifold neural network (MNN) when the graph is constructed from a set of points sampled from the manifold.
We prove non-asymptotic error bounds showing that convolutional filters and neural networks on these graphs converge to convolutional filters and neural networks on the continuous manifold.
arXiv Detail & Related papers (2023-05-29T08:27:17Z) - Contrastive Graph Clustering in Curvature Spaces [74.03252813800334]
We present a novel end-to-end contrastive graph clustering model named CONGREGATE.
To support geometric clustering, we construct a theoretically grounded Heterogeneous Curvature Space.
We then train the graph clusters by an augmentation-free reweighted contrastive approach.
arXiv Detail & Related papers (2023-05-05T14:04:52Z) - FoSR: First-order spectral rewiring for addressing oversquashing in GNNs [0.0]
Graph neural networks (GNNs) are able to leverage the structure of graph data by passing messages along the edges of the graph.
We propose a computationally efficient algorithm that prevents oversquashing by systematically adding edges to the graph.
We find experimentally that our algorithm outperforms existing graph rewiring methods in several graph classification tasks.
arXiv Detail & Related papers (2022-10-21T07:58:03Z) - Transferability Properties of Graph Neural Networks [125.71771240180654]
Graph neural networks (GNNs) are provably successful at learning representations from data supported on moderate-scale graphs.
We study the problem of training GNNs on graphs of moderate size and transferring them to large-scale graphs.
Our results show that (i) the transference error decreases with the graph size, and (ii) that graph filters have a transferability-discriminability tradeoff that in GNNs is alleviated by the scattering behavior of the nonlinearity.
arXiv Detail & Related papers (2021-12-09T00:08:09Z) - Understanding over-squashing and bottlenecks on graphs via curvature [17.359098638324546]
Over-squashing arises when information from an exponentially growing number of $k$-hop neighbors is compressed into fixed-size node representations.
We introduce a new edge-based curvature and prove that negatively curved edges are responsible for over-squashing.
We also propose and experimentally test a curvature-based rewiring method to alleviate over-squashing (see the rewiring sketch after this list).
arXiv Detail & Related papers (2021-11-29T13:27:56Z)
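Both the curvature-based rewiring above and the Batch Ollivier-Ricci Flow proposed in the main paper share the same intuition: add edges where curvature is strongly negative (to relieve over-squashing) and remove edges where it is strongly positive (to relieve over-smoothing). The sketch below illustrates one such rewiring step; it reuses the `ollivier_ricci_curvature` helper from the earlier sketch, and the edge-selection heuristic is an illustrative simplification, not the authors' exact algorithm.

```python
# A hedged sketch of curvature-guided rewiring in the spirit of the papers above:
# bridge the neighbourhoods of the most negatively curved edge, and drop the most
# positively curved edge if the graph stays connected. Reuses
# ollivier_ricci_curvature() from the earlier sketch; the heuristic below is an
# illustrative simplification, not the exact Batch Ollivier-Ricci Flow procedure.
import itertools
import networkx as nx


def rewire_step(G, alpha=0.5):
    """One rewiring step guided by the current edge curvatures."""
    curv = {(u, v): ollivier_ricci_curvature(G, u, v, alpha) for u, v in G.edges()}

    # Over-squashing relief: add one (arbitrarily chosen) support edge between
    # the neighbourhoods of the most negatively curved edge.
    (u, v), _ = min(curv.items(), key=lambda kv: kv[1])
    candidates = [(a, b) for a, b in itertools.product(G[u], G[v])
                  if a != b and not G.has_edge(a, b)]
    if candidates:
        G.add_edge(*candidates[0])

    # Over-smoothing relief: drop the most positively curved edge,
    # but never disconnect the graph.
    (p, q), _ = max(curv.items(), key=lambda kv: kv[1])
    G.remove_edge(p, q)
    if not nx.is_connected(G):
        G.add_edge(p, q)
    return G


# Example: a few rewiring steps on the barbell graph from the previous sketch.
G = nx.barbell_graph(5, 0)
for _ in range(3):
    G = rewire_step(G)
print(G.number_of_edges(), "edges after rewiring")
```

Recomputing curvature for every edge at each step dominates the cost of this naive loop, which is presumably why the entries above emphasize scalable curvature notions (AFRC) and computationally efficient rewiring (FoSR).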
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.