On the Stability of Graph Convolutional Neural Networks: A Probabilistic Perspective
- URL: http://arxiv.org/abs/2506.01213v3
- Date: Thu, 12 Jun 2025 22:02:52 GMT
- Title: On the Stability of Graph Convolutional Neural Networks: A Probabilistic Perspective
- Authors: Ning Zhang, Henry Kenlay, Li Zhang, Mihai Cucuringu, Xiaowen Dong
- Abstract summary: We study how perturbations in the graph topology affect GCNN outputs and propose a novel formulation for analyzing model stability. Unlike prior studies that focus only on worst-case perturbations, our distribution-aware formulation characterizes output perturbations across a broad range of input data.
- Score: 24.98112303106984
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph convolutional neural networks (GCNNs) have emerged as powerful tools for analyzing graph-structured data, achieving remarkable success across diverse applications. However, the theoretical understanding of the stability of these models, i.e., their sensitivity to small changes in the graph structure, has so far been established only in rather limited settings, hampering the development and deployment of robust and trustworthy models in practice. To fill this gap, we study how perturbations in the graph topology affect GCNN outputs and propose a novel formulation for analyzing model stability. Unlike prior studies that focus only on worst-case perturbations, our distribution-aware formulation characterizes output perturbations across a broad range of input data. This way, our framework enables, for the first time, a probabilistic perspective on the interplay between the statistical properties of the node data and perturbations in the graph topology. We conduct extensive experiments to validate our theoretical findings and demonstrate their benefits over existing baselines, in terms of both representation stability and adversarial attacks on downstream tasks. Our results demonstrate the practical significance of the proposed formulation and highlight the importance of incorporating data distribution into stability analysis.
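To make the distribution-aware formulation concrete, here is a minimal NumPy sketch (an illustration under assumed settings, not the paper's actual method): it Monte Carlo-estimates the expected output perturbation of a single graph-convolution layer when each edge is dropped independently at random and node features are drawn from an assumed Gaussian distribution. The graph model, drop probability, and layer definition are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def gcn_layer(A, X, W):
    """One graph-convolution layer: self-loops, symmetric normalization, ReLU."""
    A_hat = A + np.eye(A.shape[0])
    d_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
    return np.maximum(d_inv_sqrt @ A_hat @ d_inv_sqrt @ X @ W, 0.0)

def drop_edges(A, p):
    """Delete each existing edge independently with probability p (undirected)."""
    keep = np.triu(rng.random(A.shape) > p, k=1)
    upper = np.triu(A, k=1) * keep
    return upper + upper.T

# Toy setup: Erdos-Renyi graph, Gaussian node features, fixed random weights.
n, f_in, f_out, p_edge, p_drop = 50, 8, 4, 0.2, 0.05
A = np.triu((rng.random((n, n)) < p_edge).astype(float), k=1)
A = A + A.T
W = rng.normal(size=(f_in, f_out)) / np.sqrt(f_in)

# Monte Carlo estimate of E[ ||f(A', X) - f(A, X)|| ], where the expectation
# runs over BOTH the node-feature distribution and the random topology change.
diffs = []
for _ in range(500):
    X = rng.normal(size=(n, f_in))      # sample node data
    A_pert = drop_edges(A, p_drop)      # sample a perturbed topology
    diffs.append(np.linalg.norm(gcn_layer(A_pert, X, W) - gcn_layer(A, X, W)))
print(f"estimated expected output perturbation: {np.mean(diffs):.4f}")
```

A worst-case analysis would instead maximize this output difference over all admissible perturbations; averaging over the data and perturbation distributions is what makes the formulation distribution-aware.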
Related papers
- Variational Graph Convolutional Neural Networks [72.67088029389764]
Uncertainty can help improve the explainability of Graph Convolutional Networks. Uncertainty can also be used in critical applications to verify the results of the model.
arXiv Detail & Related papers (2025-07-02T13:28:37Z)
- Hierarchical Uncertainty-Aware Graph Neural Network [3.4498722449655066]
This work introduces a novel architecture, the Hierarchical Uncertainty-Aware Graph Neural Network (HU-GNN). It unifies multi-scale representation learning, principled uncertainty estimation, and self-supervised embedding diversity within a single end-to-end framework. Specifically, HU-GNN adaptively forms node clusters and estimates uncertainty at multiple structural scales, from individual nodes to higher levels.
arXiv Detail & Related papers (2025-04-28T14:22:18Z)
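As rough intuition for multi-scale uncertainty estimation in HU-GNN (a heavily simplified stand-in, not the proposed architecture): cluster node embeddings, then read off uncertainty at the node level (distance to centroid) and at the cluster level (within-cluster variance). The embeddings, the clustering method, and both uncertainty proxies are assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy stand-in for learned node embeddings: n nodes, d dims, k clusters.
n, d, k = 60, 16, 4
Z = rng.normal(size=(n, d))

# Crude stand-in for adaptive cluster formation: plain k-means on embeddings.
centers = Z[rng.choice(n, size=k, replace=False)]
for _ in range(20):
    assign = ((Z[:, None, :] - centers[None]) ** 2).sum(-1).argmin(1)
    centers = np.stack([Z[assign == c].mean(0) if (assign == c).any()
                        else centers[c] for c in range(k)])
assign = ((Z[:, None, :] - centers[None]) ** 2).sum(-1).argmin(1)

# Node-level uncertainty: distance of each node to its cluster centroid.
node_unc = np.linalg.norm(Z - centers[assign], axis=1)
# Cluster-level uncertainty: within-cluster embedding variance.
cluster_unc = np.array([Z[assign == c].var() if (assign == c).any() else 0.0
                        for c in range(k)])
print("node:", node_unc[:3].round(2), " cluster:", cluster_unc.round(2))
```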
- On the Relationship Between Robustness and Expressivity of Graph Neural Networks [7.161966906570077]
Graph Neural Networks (GNNs) are vulnerable to bit-flip attacks (BFAs). We introduce an analytical framework to study the influence of architectural features, graph properties, and their interaction. We derive theoretical bounds for the number of bit flips required to degrade GNN expressivity on a dataset.
arXiv Detail & Related papers (2025-04-18T16:38:33Z)
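For background on what a single bit flip can do to a model weight (a generic illustration, unrelated to the paper's analytical framework): flipping one exponent bit of a float32 parameter changes its magnitude by many orders of magnitude. The tensor and bit position below are arbitrary.

```python
import numpy as np

# One float32 weight, reinterpreted as raw bits (shared memory buffer).
w = np.array([0.5], dtype=np.float32)
bits = w.view(np.uint32)

# Flip the top exponent bit (bit 30): 0.5 becomes roughly 1.7e38.
bits ^= np.uint32(1 << 30)
print(w[0])  # the weight is corrupted in place through the shared buffer
```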
- Conformal Prediction for Federated Graph Neural Networks with Missing Neighbor Information [2.404163279345609]
This study extends the applicability of Conformal Prediction (CP) to federated graph learning.
We tackle the missing links issue in distributed subgraphs to minimize its adverse effects on CP set sizes.
We introduce a Variational Autoencoder-based approach for reconstructing missing neighbors to mitigate the negative impact of missing data.
arXiv Detail & Related papers (2024-10-17T20:22:25Z)
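As background on conformal prediction sets (the standard split-conformal recipe, not the paper's federated variant): calibrate a score threshold on held-out nodes, then include in each prediction set every class that clears it. The synthetic softmax outputs below stand in for a trained GNN's; missing neighbors would degrade such scores and inflate the resulting sets.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic stand-ins for a GNN's softmax outputs on calibration/test nodes.
n_cal, n_test, n_classes, alpha = 500, 5, 7, 0.1
cal_probs = rng.dirichlet(np.ones(n_classes), size=n_cal)
cal_labels = rng.integers(n_classes, size=n_cal)
test_probs = rng.dirichlet(np.ones(n_classes), size=n_test)

# Nonconformity score: 1 minus the probability assigned to the true class.
scores = 1.0 - cal_probs[np.arange(n_cal), cal_labels]
q_level = np.ceil((n_cal + 1) * (1 - alpha)) / n_cal
qhat = np.quantile(scores, q_level, method="higher")

# Prediction set: all classes whose score clears the calibrated threshold.
for probs in test_probs:
    print(np.where(1.0 - probs <= qhat)[0])
```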
- Uncertainty in Graph Neural Networks: A Survey [47.785948021510535]
Graph Neural Networks (GNNs) have been extensively used in various real-world applications. However, the predictive uncertainty of GNNs stemming from diverse sources can lead to unstable and erroneous predictions. This survey aims to provide a comprehensive overview of GNNs from the perspective of uncertainty.
arXiv Detail & Related papers (2024-03-11T21:54:52Z)
- Probabilistically Rewired Message-Passing Neural Networks [41.554499944141654]
Message-passing graph neural networks (MPNNs) have emerged as powerful tools for processing graph-structured input.
MPNNs operate on a fixed input graph structure, ignoring potential noise and missing information.
We devise probabilistically rewired MPNNs (PR-MPNNs) which learn to add relevant edges while omitting less beneficial ones.
arXiv Detail & Related papers (2023-10-03T15:43:59Z)
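A minimal sketch of the probabilistic-rewiring idea (an illustration of Gumbel top-k edge sampling, not the PR-MPNN implementation): score node pairs, perturb the scores with Gumbel noise, and add the k highest-scoring edges, so the sampled rewiring follows the learned scores. The logits here are random stand-ins for learned ones.

```python
import numpy as np

rng = np.random.default_rng(3)

n, k = 8, 3  # number of nodes; number of candidate edges to add

# Stand-in for learned pairwise edge logits (symmetrized).
logits = rng.normal(size=(n, n))
logits = (logits + logits.T) / 2

# Consider each unordered node pair exactly once.
iu, ju = np.triu_indices(n, k=1)

# Gumbel top-k: perturbing logits with Gumbel noise and keeping the k
# largest samples k edges without replacement according to their scores.
gumbel = -np.log(-np.log(rng.random(iu.shape)))
top = np.argsort(logits[iu, ju] + gumbel)[-k:]
print("edges to add:", [(int(iu[t]), int(ju[t])) for t in top])
```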
- Dynamic Causal Explanation Based Diffusion-Variational Graph Neural Network for Spatio-temporal Forecasting [60.03169701753824]
We propose a novel Dynamic Diffusion-Variational Graph Neural Network (DVGNN) for spatio-temporal forecasting.
The proposed DVGNN model outperforms state-of-the-art approaches and achieves outstanding Root Mean Squared Error (RMSE) results.
arXiv Detail & Related papers (2023-05-16T11:38:19Z)
- Graph-Time Convolutional Neural Networks: Architecture and Theoretical Analysis [12.995632804090198]
We introduce Graph-Time Convolutional Neural Networks (GTCNNs) as a principled architecture to aid learning.
The approach can work with any type of product graph, and we also introduce a parametric product graph to learn the spatiotemporal coupling.
Extensive numerical results on benchmark datasets corroborate our findings and show that the GTCNN compares favorably with state-of-the-art solutions.
arXiv Detail & Related papers (2022-06-30T10:20:52Z)
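To unpack what a product graph is in this setting (a generic Cartesian-product construction with assumed sizes, not GTCNN code): combining a spatial graph with a temporal path graph yields one shift operator on which an ordinary graph convolution mixes space and time jointly.

```python
import numpy as np

rng = np.random.default_rng(4)

# Spatial graph on N sensors and a path graph over T time steps.
N, T = 5, 4
S_G = np.triu((rng.random((N, N)) < 0.4).astype(float), k=1)
S_G = S_G + S_G.T
S_T = np.diag(np.ones(T - 1), 1) + np.diag(np.ones(T - 1), -1)

# Cartesian product: each node is linked to itself across adjacent time
# steps and to its spatial neighbors within each time step.
S = np.kron(S_T, np.eye(N)) + np.kron(np.eye(T), S_G)  # (N*T, N*T)

# A graph-time signal (one value per node per step, time-major), and one
# diffusion step over the product graph, mixing space and time jointly.
x = rng.normal(size=N * T)
print((S @ x).shape)  # (20,)
```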
- coVariance Neural Networks [119.45320143101381]
Graph neural networks (GNNs) are an effective framework that exploits inter-relationships within graph-structured data for learning.
We propose a GNN architecture, called coVariance neural network (VNN), that operates on sample covariance matrices as graphs.
We show that VNN performance is indeed more stable than PCA-based statistical approaches.
arXiv Detail & Related papers (2022-05-31T15:04:43Z)
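The core idea of a VNN can be sketched in a few lines (a generic polynomial covariance filter under assumed details, not the authors' code): estimate a sample covariance matrix from data and use it as the graph shift operator of a polynomial filter.

```python
import numpy as np

rng = np.random.default_rng(5)

# m samples of d-dimensional data; their sample covariance acts as the graph.
m, d = 200, 10
X = rng.normal(size=(m, d)) @ rng.normal(size=(d, d)) / np.sqrt(d)
C = np.cov(X, rowvar=False)   # d x d sample covariance = graph shift operator

def covariance_filter(C, x, h):
    """Polynomial filter sum_k h[k] * C^k x, analogous to a graph convolution."""
    out, Ck_x = np.zeros_like(x), x.copy()
    for hk in h:
        out += hk * Ck_x
        Ck_x = C @ Ck_x
    return out

# One 'coVariance filter' pass over a single data sample; taps h are assumed.
y = covariance_filter(C, X[0], h=[0.5, 0.3, 0.1])
print(y.round(3))
```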
- Handling Distribution Shifts on Graphs: An Invariance Perspective [78.31180235269035]
We formulate the OOD problem on graphs and develop a new invariant learning approach, Explore-to-Extrapolate Risk Minimization (EERM).
EERM resorts to multiple context explorers that are adversarially trained to maximize the variance of risks from multiple virtual environments.
We prove the validity of our method by theoretically showing that it guarantees a valid OOD solution.
arXiv Detail & Related papers (2022-02-05T02:31:01Z)
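The quantity at the heart of EERM can be written down compactly; below is a deliberately tiny sketch (toy per-environment risks, no graphs, no adversarial explorers) of the mean-plus-variance-of-risks objective: the explorers are trained to maximize the variance term while the predictor minimizes the whole objective. The penalty weight beta is an assumption.

```python
import numpy as np

rng = np.random.default_rng(6)

# Per-environment empirical risks of a shared predictor across K virtual
# environments (random stand-ins here for actual training losses).
K, beta = 5, 1.0
risks = rng.random(K)

# EERM-style objective: mean risk plus a variance penalty. The context
# explorers maximize the variance term adversarially; the predictor
# minimizes the full objective.
objective = risks.mean() + beta * risks.var()
print(f"mean {risks.mean():.3f}  var {risks.var():.3f}  objective {objective:.3f}")
```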
- Training Stable Graph Neural Networks Through Constrained Learning [116.03137405192356]
Graph Neural Networks (GNNs) rely on graph convolutions to learn features from network data.
GNNs are stable to different types of perturbations of the underlying graph, a property that they inherit from graph filters.
We propose a novel constrained learning approach by imposing a constraint on the stability condition of the GNN within a perturbation of choice.
arXiv Detail & Related papers (2021-10-07T15:54:42Z)
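Constraint-based training of this kind is commonly solved with primal-dual updates; the sketch below shows the generic pattern on a scalar toy problem (an assumed stand-in constraint, not the paper's GNN stability condition): descend on the Lagrangian in the model parameter, ascend in the multiplier whenever the constraint is violated.

```python
# Toy problem: minimize loss f(w) subject to a stability-style constraint
# g(w) <= 0 (both invented stand-ins), via gradient primal-dual updates.
def f(w):  return (w - 3.0) ** 2      # training loss
def g(w):  return w ** 2 - 4.0        # constraint: |w| <= 2
def df(w): return 2.0 * (w - 3.0)
def dg(w): return 2.0 * w

w, lam, lr = 0.0, 0.0, 0.05
for _ in range(500):
    w -= lr * (df(w) + lam * dg(w))   # primal descent on the Lagrangian
    lam = max(0.0, lam + lr * g(w))   # projected dual ascent on lambda

print(f"w = {w:.3f}  (unconstrained optimum is 3.0; constraint caps it near 2.0)")
```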
- Stability of Graph Convolutional Neural Networks to Stochastic Perturbations [122.12962842842349]
Graph convolutional neural networks (GCNNs) are nonlinear processing tools to learn representations from network data.
Current analysis considers deterministic perturbations but fails to provide relevant insights when topological changes are random.
This paper investigates the stability of GCNNs to stochastic graph perturbations induced by link losses.
arXiv Detail & Related papers (2021-06-19T16:25:28Z)