Learning Stable Graph Neural Networks via Spectral Regularization
- URL: http://arxiv.org/abs/2211.06966v1
- Date: Sun, 13 Nov 2022 17:27:21 GMT
- Title: Learning Stable Graph Neural Networks via Spectral Regularization
- Authors: Zhan Gao and Elvin Isufi
- Abstract summary: Stability of graph neural networks (GNNs) characterizes how GNNs react to graph perturbations and provides guarantees for architecture performance in noisy scenarios.
This paper develops a self-regularized graph neural network (SR-GNN) that improves the architecture stability by regularizing the filter frequency responses in the graph spectral domain.
- Score: 18.32587282139282
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Stability of graph neural networks (GNNs) characterizes how GNNs react to
graph perturbations and provides guarantees for architecture performance in
noisy scenarios. This paper develops a self-regularized graph neural network
(SR-GNN) solution that improves the architecture stability by regularizing the
filter frequency responses in the graph spectral domain. The SR-GNN considers
not only the graph signal as input but also the eigenvectors of the underlying
graph, where the signal is processed to generate task-relevant features and the
eigenvectors to characterize the frequency responses at each layer. We train
the SR-GNN by minimizing the cost function while regularizing the maximal
frequency response to be close to one. The former improves the architecture
performance, while the latter tightens the perturbation stability and
alleviates the information loss through multi-layer propagation. We further
show the SR-GNN preserves permutation equivariance, which makes it possible to
exploit the internal symmetries of graph signals and to transfer to similar
graph structures. Numerical results on source localization and movie
recommendation corroborate our findings and show that the SR-GNN yields
performance comparable to the vanilla GNN on the unperturbed graph while
substantially improving stability.
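The abstract's core idea, regularizing the filter frequency responses evaluated at the graph's eigenvalues so that the maximal response stays close to one, can be sketched as follows. This is a hypothetical illustration, not the authors' code: the polynomial filter, the penalty form `(max|H(lambda)| - 1)^2`, and the weight `mu` are assumptions chosen to match the abstract's description.

```python
import numpy as np

def filter_frequency_response(h, lam):
    """Frequency response H(lam) = sum_k h[k] * lam**k of a polynomial graph filter."""
    return sum(h_k * lam**k for k, h_k in enumerate(h))

def spectral_regularizer(h, eigenvalues):
    """Penalty (max_i |H(lambda_i)| - 1)^2 pushing the maximal response toward one."""
    responses = np.abs(filter_frequency_response(h, eigenvalues))
    return (responses.max() - 1.0) ** 2

# Toy example: 5-node path graph, its adjacency eigenvalues, random filter taps.
rng = np.random.default_rng(0)
A = np.diag(np.ones(4), 1) + np.diag(np.ones(4), -1)
eigenvalues = np.linalg.eigvalsh(A)
h = rng.normal(size=3)  # 3 filter taps

task_loss = 0.0  # placeholder for the task cost (e.g., MSE on labels)
mu = 0.1         # regularization weight (assumed hyperparameter)
total_loss = task_loss + mu * spectral_regularizer(h, eigenvalues)
print(total_loss)
```

A filter with taps `h = [1, 0, 0]` has a flat response of one at every eigenvalue and therefore incurs zero penalty; in the paper's setting such a term is added to the training objective per layer, with the eigenvectors of the underlying graph supplied as an extra input to evaluate the responses.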
Related papers
- Adaptive Least Mean Squares Graph Neural Networks and Online Graph
Signal Estimation [3.6448362316632115]
We propose an efficient Neural Network architecture for the online estimation of time-varying graph signals.
The Adaptive Least Mean Squares Graph Neural Network (LMS-GNN) combines adaptive graph filters with graph neural networks (GNNs).
Experimenting on real-world temperature data reveals that our LMS-GNN achieves more accurate online predictions compared to graph-based methods.
arXiv Detail & Related papers (2024-01-27T05:47:12Z) - Stable and Transferable Hyper-Graph Neural Networks [95.07035704188984]
We introduce an architecture for processing signals supported on hypergraphs via graph neural networks (GNNs).
We provide a framework for bounding the stability and transferability error of GNNs across arbitrary graphs via spectral similarity.
arXiv Detail & Related papers (2022-11-11T23:44:20Z) - Space-Time Graph Neural Networks with Stochastic Graph Perturbations [100.31591011966603]
Space-time graph neural networks (ST-GNNs) learn efficient graph representations of time-varying data.
In this paper we revisit the properties of ST-GNNs and prove that they are stable to stochastic graph perturbations.
Our analysis suggests that ST-GNNs are suitable for transfer learning on time-varying graphs.
arXiv Detail & Related papers (2022-10-28T16:59:51Z) - Adaptive Kernel Graph Neural Network [21.863238974404474]
Graph neural networks (GNNs) have demonstrated great success in representation learning for graph-structured data.
In this paper, we propose a novel framework, the Adaptive Kernel Graph Neural Network (AKGNN).
AKGNN is a first attempt at learning to adapt to the optimal graph kernel in a unified manner.
Experiments are conducted on acknowledged benchmark datasets and promising results demonstrate the outstanding performance of our proposed AKGNN.
arXiv Detail & Related papers (2021-12-08T20:23:58Z) - Training Stable Graph Neural Networks Through Constrained Learning [116.03137405192356]
Graph Neural Networks (GNNs) rely on graph convolutions to learn features from network data.
GNNs are stable to different types of perturbations of the underlying graph, a property that they inherit from graph filters.
We propose a novel constrained learning approach by imposing a constraint on the stability condition of the GNN within a perturbation of choice.
arXiv Detail & Related papers (2021-10-07T15:54:42Z) - Stability of Graph Convolutional Neural Networks to Stochastic
Perturbations [122.12962842842349]
Graph convolutional neural networks (GCNNs) are nonlinear processing tools to learn representations from network data.
Current analysis considers deterministic perturbations but fails to provide relevant insights when topological changes are random.
This paper investigates the stability of GCNNs to stochastic graph perturbations induced by link losses.
arXiv Detail & Related papers (2021-06-19T16:25:28Z) - Stochastic Graph Neural Networks [123.39024384275054]
Graph neural networks (GNNs) model nonlinear representations in graph data with applications in distributed agent coordination, control, and planning.
Current GNN architectures assume ideal scenarios and ignore link fluctuations that occur due to environment, human factors, or external attacks.
In these situations, the GNN fails to address its distributed task if the topological randomness is not considered accordingly.
arXiv Detail & Related papers (2020-06-04T08:00:00Z) - Graphs, Convolutions, and Neural Networks: From Graph Filters to Graph
Neural Networks [183.97265247061847]
We leverage graph signal processing to characterize the representation space of graph neural networks (GNNs).
We discuss the role of graph convolutional filters in GNNs and show that any architecture built with such filters has the fundamental properties of permutation equivariance and stability to changes in the topology.
We also study the use of GNNs in recommender systems and learning decentralized controllers for robot swarms.
arXiv Detail & Related papers (2020-03-08T13:02:15Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences.