Deeper with Riemannian Geometry: Overcoming Oversmoothing and Oversquashing for Graph Foundation Models
- URL: http://arxiv.org/abs/2510.17457v1
- Date: Mon, 20 Oct 2025 11:41:45 GMT
- Title: Deeper with Riemannian Geometry: Overcoming Oversmoothing and Oversquashing for Graph Foundation Models
- Authors: Li Sun, Zhenhao Huang, Ming Zhang, Philip S. Yu
- Abstract summary: Message Passing Neural Networks (MPNNs) are the building blocks of graph foundation models, but they suffer from oversmoothing and oversquashing. We propose a \textbf{local} approach that adjusts message passing based on local structures.
- Score: 47.23316001059971
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Message Passing Neural Networks (MPNNs) are the building blocks of graph foundation models, but they fundamentally suffer from oversmoothing and oversquashing. There has recently been a surge of interest in fixing both issues. Existing efforts primarily adopt global approaches, which may be beneficial in some regions but detrimental in others, ultimately leading to suboptimal expressiveness. In this paper, we begin by revisiting oversquashing through a global measure -- the spectral gap $\lambda$ -- and prove that increasing $\lambda$ leads to vanishing gradients with respect to the input features, thereby undermining the effectiveness of message passing. Motivated by these theoretical insights, we propose a \textbf{local} approach that adaptively adjusts message passing based on local structures. To achieve this, we connect local Riemannian geometry with MPNNs and establish a novel nonhomogeneous boundary condition that addresses both oversquashing and oversmoothing. Building on the Robin condition, we design the GBN network with local bottleneck adjustment, coupled with theoretical guarantees. Extensive experiments on homophilic and heterophilic graphs demonstrate the expressiveness of GBN. Furthermore, GBN exhibits no performance degradation even when the network depth exceeds $256$ layers.
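The spectral gap the abstract refers to is a standard quantity in spectral graph theory. As a minimal sketch (assuming the common convention that $\lambda$ is the second-smallest eigenvalue of the symmetric normalized Laplacian; the paper's exact definition may differ), it can be computed with networkx and numpy:

```python
# Sketch: computing the spectral gap of a graph, assuming it is defined as
# the second-smallest eigenvalue of the symmetric normalized Laplacian.
# The paper may use a different normalization; this is illustrative only.
import networkx as nx
import numpy as np

def spectral_gap(G: nx.Graph) -> float:
    # L_sym = I - D^{-1/2} A D^{-1/2}; eigenvalues lie in [0, 2].
    L = nx.normalized_laplacian_matrix(G).toarray()
    eigvals = np.sort(np.linalg.eigvalsh(L))
    # For a connected graph the smallest eigenvalue is 0; the gap is the next one.
    return float(eigvals[1])

# Two cliques joined by a single bridge edge: a classic bottleneck,
# so the spectral gap is small and oversquashing is expected to be severe.
G = nx.disjoint_union(nx.complete_graph(10), nx.complete_graph(10))
G.add_edge(0, 10)
print(spectral_gap(G))                       # small gap -> strong bottleneck
print(spectral_gap(nx.complete_graph(20)))   # large gap -> no bottleneck
```

The abstract's claim is that globally increasing $\lambda$ (for example, by rewiring) can cause gradients with respect to the input features to vanish, which is what motivates a local rather than global adjustment.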
Related papers
- Mean-Field Control on Sparse Graphs: From Local Limits to GNNs via Neighborhood Distributions [5.081469534056712]
Mean-field control (MFC) offers a scalable solution to the curse of dimensionality in multi-agent systems. We bridge the gap to real-world network structures by proposing a rigorous framework for MFC on large sparse graphs.
arXiv Detail & Related papers (2026-01-29T09:57:48Z)
- Gaussian Primitive Optimized Deformable Retinal Image Registration [19.882820812725523]
Deformable retinal image registration is notoriously difficult due to large homogeneous regions and sparse but critical vascular features. We introduce a novel iterative framework that performs structured message passing to overcome these challenges. Experiments on the FIRE dataset show that GPO reduces the target registration error from 6.2 px to 2.4 px and increases the AUC at 25 px from 0.770 to 0.938.
arXiv Detail & Related papers (2025-08-23T00:44:50Z)
- Simplifying Graph Convolutional Networks with Redundancy-Free Neighbors [8.793707476780304]
We analyze the intrinsic message passing mechanism of Graph Convolutional Networks (GCNs) and find that each layer repeatedly aggregates information from low-order neighbors. This repeated reliance on low-order neighbors leads to redundant information aggregation, a phenomenon we term over-aggregation. Our analysis demonstrates that over-aggregation not only introduces significant redundancy but also serves as the fundamental cause of over-smoothing in GCNs.
arXiv Detail & Related papers (2025-04-18T02:56:21Z)
- A Signed Graph Approach to Understanding and Mitigating Oversmoothing in GNNs [54.62268052283014]
We present a unified theoretical perspective based on the framework of signed graphs. We show that many existing strategies implicitly introduce negative edges that alter message passing to resist oversmoothing. We propose Structural Balanced Propagation (SBP), a plug-and-play method that assigns signed edges based on either labels or feature similarity.
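The description suggests a simple rule: edges between same-label (or feature-similar) endpoints get a positive sign, the rest a negative sign, and propagation uses the signed weights. A hedged sketch under that reading follows; the sign rule and normalization are assumptions, not the paper's implementation:

```python
# Sketch of label-based signed propagation in the spirit of SBP.
# The sign-assignment rule and the normalization here are assumptions;
# the paper's actual method may differ in detail.
import numpy as np

def signed_propagate(A: np.ndarray, X: np.ndarray, labels: np.ndarray) -> np.ndarray:
    # +1 for edges joining same-label nodes, -1 for cross-label edges.
    sign = np.where(labels[:, None] == labels[None, :], 1.0, -1.0)
    S = A * sign                                        # signed adjacency
    deg = np.abs(S).sum(axis=1, keepdims=True).clip(min=1.0)
    return (S / deg) @ X                                # one signed message-passing step

# Toy example: 4 nodes, two classes; the cross-class edge repels features.
A = np.array([[0, 1, 1, 0],
              [1, 0, 0, 0],
              [1, 0, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = np.random.randn(4, 8)
labels = np.array([0, 0, 1, 1])
X_new = signed_propagate(A, X, labels)
```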
arXiv Detail & Related papers (2025-02-17T03:25:36Z)
- Tackling Oversmoothing in GNN via Graph Sparsification: A Truss-based Approach [1.4854797901022863]
We propose a novel and flexible truss-based graph sparsification model that prunes edges from dense regions of the graph.
We then apply our sparsification model to state-of-the-art baseline GNN and pooling models such as GIN, SAGPool, GMT, DiffPool, MinCutPool, HGP-SL, DMonPool, and AdamGNN.
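As a rough illustration of the idea (not the paper's exact model), dense regions can be located with a k-truss decomposition and a fraction of their edges pruned; networkx provides k_truss:

```python
# Sketch: prune a fraction of edges that lie inside dense (high-truss)
# regions, leaving sparse regions untouched. The pruning rule and the
# choice of k are assumptions for illustration, not the paper's model.
import random
import networkx as nx

def truss_sparsify(G: nx.Graph, k: int = 4, drop_frac: float = 0.3,
                   seed: int = 0) -> nx.Graph:
    dense = nx.k_truss(G, k)          # subgraph whose edges have support >= k - 2
    dense_edges = list(dense.edges())
    rng = random.Random(seed)
    n_drop = int(drop_frac * len(dense_edges))
    H = G.copy()
    H.remove_edges_from(rng.sample(dense_edges, n_drop))
    return H

G = nx.karate_club_graph()
H = truss_sparsify(G, k=4, drop_frac=0.3)
print(G.number_of_edges(), "->", H.number_of_edges())
```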
arXiv Detail & Related papers (2024-07-16T17:21:36Z)
- Minimum Topology Attacks for Graph Neural Networks [70.17791814425148]
Graph Neural Networks (GNNs) have received significant attention for their robustness to adversarial topology attacks.
We propose a new type of topology attack, named minimum-budget topology attack, aiming to adaptively find the minimum perturbation sufficient for a successful attack on each node.
arXiv Detail & Related papers (2024-03-05T07:29:12Z)
- DeepRicci: Self-supervised Graph Structure-Feature Co-Refinement for Alleviating Over-squashing [72.70197960100677]
Graph Structure Learning (GSL) plays an important role in boosting Graph Neural Networks (GNNs) with a refined graph.
GSL solutions usually focus on structure refinement with task-specific supervision (i.e., node classification) or overlook the inherent weakness of GNNs themselves.
We propose to study self-supervised graph structure-feature co-refinement for effectively alleviating the issue of over-squashing in typical GNNs.
arXiv Detail & Related papers (2024-01-23T14:06:08Z)
- Neighborhood Homophily-based Graph Convolutional Network [4.511171093050241]
Graph neural networks (GNNs) have proven powerful in graph-oriented tasks.
Many real-world graphs are heterophilous, challenging the homophily assumption of classical GNNs.
Recent studies propose new metrics to characterize homophily, but rarely consider the correlation between the proposed metrics and models.
In this paper, we first design a new metric, Neighborhood Homophily (NH), to measure the label complexity or purity in node neighborhoods.
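The abstract does not give the formula; one natural formalization of neighborhood label purity (an assumption for illustration, not necessarily the paper's exact definition of NH) is the fraction of the most common label among a node's neighbors:

```python
# Sketch: a purity-style neighborhood homophily score. This is one
# plausible formalization of "label purity in node neighborhoods";
# the paper's exact NH definition may differ.
from collections import Counter
import networkx as nx

def neighborhood_purity(G: nx.Graph, labels: dict) -> dict:
    scores = {}
    for v in G:
        neigh = list(G.neighbors(v))
        if not neigh:
            scores[v] = 1.0                  # convention for isolated nodes
            continue
        counts = Counter(labels[u] for u in neigh)
        scores[v] = max(counts.values()) / len(neigh)
    return scores

G = nx.karate_club_graph()
labels = {v: G.nodes[v]["club"] for v in G}  # built-in 2-community labels
purity = neighborhood_purity(G, labels)
print(sum(purity.values()) / len(purity))    # average neighborhood purity
```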
arXiv Detail & Related papers (2023-01-24T07:56:44Z)
- ResNorm: Tackling Long-tailed Degree Distribution Issue in Graph Neural Networks via Normalization [80.90206641975375]
This paper focuses on improving the performance of GNNs via normalization.
By studying the long-tailed distribution of node degrees in the graph, we propose a novel normalization method for GNNs.
The scale operation of ResNorm reshapes the node-wise standard deviation (NStd) distribution so as to improve the accuracy of tail nodes.
arXiv Detail & Related papers (2022-06-16T13:49:09Z)
- Tackling Over-Smoothing for General Graph Convolutional Networks [88.71154017107257]
We study how general GCN variants behave as depth increases, including generic GCN, GCN with bias, ResGCN, and APPNP.
We propose DropEdge to alleviate over-smoothing by randomly removing a certain number of edges at each training epoch.
arXiv Detail & Related papers (2020-08-22T16:14:01Z)
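DropEdge itself is simple to sketch: at every training epoch, a random subset of edges is removed before message passing. A minimal version over an edge list follows (assuming an undirected graph stored as a symmetric edge_index; integration with a specific GNN library is left out):

```python
# Sketch: DropEdge-style random edge removal, resampled every epoch.
# Details such as handling of symmetric edge pairs and renormalization
# follow common practice and may differ from the original implementation.
import numpy as np

def drop_edge(edge_index: np.ndarray, p: float, rng: np.random.Generator) -> np.ndarray:
    # edge_index has shape (2, E); keep each edge with probability 1 - p.
    keep = rng.random(edge_index.shape[1]) >= p
    return edge_index[:, keep]

rng = np.random.default_rng(0)
edge_index = np.array([[0, 1, 1, 2, 2, 3],
                       [1, 0, 2, 1, 3, 2]])     # symmetric toy graph
for epoch in range(3):
    ei = drop_edge(edge_index, p=0.5, rng=rng)  # fresh sample each epoch
    # ... run message passing / training step on `ei` here ...
    print(epoch, ei.shape[1], "edges kept")
```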