Stable and Transferable Hyper-Graph Neural Networks
- URL: http://arxiv.org/abs/2211.06513v1
- Date: Fri, 11 Nov 2022 23:44:20 GMT
- Title: Stable and Transferable Hyper-Graph Neural Networks
- Authors: Mikhail Hayhoe, Hans Riess, Victor M. Preciado, and Alejandro Ribeiro
- Abstract summary: We introduce an architecture for processing signals supported on hypergraphs via graph neural networks (GNNs).
We provide a framework for bounding the stability and transferability error of GNNs across arbitrary graphs via spectral similarity.
- Score: 95.07035704188984
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: We introduce an architecture for processing signals supported on hypergraphs
via graph neural networks (GNNs), which we call a Hyper-graph Expansion Neural
Network (HENN), and provide the first bounds on the stability and
transferability error of a hypergraph signal processing model. To do so, we
provide a framework for bounding the stability and transferability error of
GNNs across arbitrary graphs via spectral similarity. By bounding the
difference between two graph shift operators (GSOs) in the positive
semi-definite sense via their eigenvalue spectrum, we show that this error
depends only on the properties of the GNN and the magnitude of spectral
similarity of the GSOs. Moreover, we show that our approach recovers existing
transferability results that assume the graphs are small perturbations of one
another, that the graphs are random and drawn from the same distribution, or
that they are sampled from the same graphon. Thus, both GNNs and our HENNs
(trained using normalized Laplacians as graph shift operators) will be
increasingly stable and transferable as the graphs become larger. Experimental
results illustrate the importance of considering multiple graph representations
in HENN, and show its superior performance when transferability is desired.
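To make the spectral-similarity condition concrete: two GSOs S1 and S2 are epsilon-spectrally similar when (1 - eps) x^T S1 x <= x^T S2 x <= (1 + eps) x^T S1 x for all x, so the smallest such eps can be read off the extreme generalized eigenvalues of the pencil (S2, S1). The sketch below is our illustration, not the authors' code; the random graphs, the ridge term `reg` (needed because normalized Laplacians are singular), and all function names are assumptions.

```python
# Minimal sketch: estimate the spectral-similarity constant eps between
# two graph shift operators (illustrative, not the paper's implementation).
import numpy as np
from scipy.linalg import eigh

def normalized_laplacian(A):
    """Symmetric normalized Laplacian L = I - D^{-1/2} A D^{-1/2}."""
    d = A.sum(axis=1)
    d_inv_sqrt = np.where(d > 0, d, np.inf) ** -0.5  # isolated nodes -> 0
    return np.eye(len(A)) - d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]

def spectral_similarity(S1, S2, reg=1e-9):
    """Smallest eps with (1-eps) S1 <= S2 <= (1+eps) S1 in the PSD order.

    A tiny ridge keeps the pencil positive definite, since normalized
    Laplacians have a nontrivial nullspace.
    """
    n = len(S1)
    gen_eigs = eigh(S2 + reg * np.eye(n), S1 + reg * np.eye(n),
                    eigvals_only=True)        # solves S2 v = lam * S1 v
    return max(1.0 - gen_eigs.min(), gen_eigs.max() - 1.0)

rng = np.random.default_rng(0)
A1 = np.triu(rng.random((50, 50)) < 0.3, 1)   # Erdos-Renyi-style graph
A1 = (A1 | A1.T).astype(float)
flip = np.triu(rng.random((50, 50)) < 0.02, 1)
A2 = np.abs(A1 - (flip | flip.T).astype(float))  # toggle a few edges

eps = spectral_similarity(normalized_laplacian(A1), normalized_laplacian(A2))
print(f"estimated spectral similarity: eps = {eps:.3f}")
```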
Related papers
- Higher-Order GNNs Meet Efficiency: Sparse Sobolev Graph Neural Networks [6.080095317098909]
Graph Neural Networks (GNNs) have shown great promise in modeling relationships between nodes in a graph.
Previous studies have primarily attempted to utilize the information from higher-order neighbors in the graph.
We make a fundamental observation: the regular and the Hadamard power of the Laplacian matrix behave similarly in the spectrum.
We propose a novel graph convolutional operator based on the sparse Sobolev norm of graph signals.
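As a quick numeric illustration of that observation (our sketch, not the paper's code): compare the sorted spectra of the matrix power L^2 and the Hadamard power L*L of a graph Laplacian, and note that only the Hadamard power preserves the sparsity pattern of L.

```python
# Sketch: matrix power vs. Hadamard power of a graph Laplacian
# (illustrative check of the quoted spectral observation).
import numpy as np

rng = np.random.default_rng(1)
A = np.triu(rng.random((80, 80)) < 0.1, 1)
A = (A | A.T).astype(float)
L = np.diag(A.sum(1)) - A                    # combinatorial Laplacian

mat_pow = L @ L                              # fills in two-hop entries
had_pow = L * L                              # keeps the sparsity of L

ev_mat = np.sort(np.linalg.eigvalsh(mat_pow))
ev_had = np.sort(np.linalg.eigvalsh(had_pow))
corr = np.corrcoef(ev_mat / ev_mat.max(), ev_had / ev_had.max())[0, 1]

print(f"nonzeros: L^2 -> {np.count_nonzero(mat_pow)}, "
      f"L*L -> {np.count_nonzero(had_pow)}")
print(f"correlation of sorted, scaled spectra: {corr:.3f}")
```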
arXiv Detail & Related papers (2024-11-07T09:53:11Z)
- Graph neural networks and non-commuting operators [4.912318087940015]
We develop a limit theory of graphon-tuple neural networks and use it to prove a universal transferability theorem.
Our theoretical results extend well-known transferability theorems for GNNs to the case of several simultaneous graphs.
We derive a training procedure that provably enforces the stability of the resulting model.
arXiv Detail & Related papers (2024-11-06T21:17:14Z)
- Transferability of Graph Neural Networks using Graphon and Sampling Theories [0.0]
Graph neural networks (GNNs) have become powerful tools for processing graph-based information in various domains.
A desirable property of GNNs is transferability, where a network trained on one graph can be applied to a different graph, without retraining, while retaining its accuracy.
We contribute to the application of graphons to GNNs by presenting an explicit two-layer graphon neural network (WNN) architecture.
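As a sketch of what transferability means operationally (our construction with an exponential graphon and hand-picked filter taps, not the paper's WNN architecture): the same fixed filter is applied, without retraining, to two graphs of different sizes sampled from one graphon, and the outputs are compared at shared latent points. Graphon theory predicts the gap shrinks as both graphs grow.

```python
# Sketch: one fixed graph filter transferred across graphs of different
# sizes sampled from the same graphon (illustrative, not the paper's WNN).
import numpy as np

def sample_graph(n, rng, graphon=lambda u, v: np.exp(-4 * np.abs(u - v))):
    u = np.arange(n) / n                          # regular latent grid
    P = graphon(u[:, None], u[None, :])
    A = np.triu(rng.random((n, n)) < P, 1).astype(float)
    A = A + A.T
    d_is = np.where(A.sum(1) > 0, A.sum(1), np.inf) ** -0.5
    L = np.eye(n) - d_is[:, None] * A * d_is[None, :]  # normalized Laplacian
    return u, L

def graph_filter(L, x, h=(0.5, -0.3, 0.1)):
    """y = h0*x + h1*L@x + h2*L@L@x -- the taps never change."""
    y, z = h[0] * x, x
    for hk in h[1:]:
        z = L @ z
        y = y + hk * z
    return y

rng = np.random.default_rng(2)
u_s, L_s = sample_graph(100, rng)
u_l, L_l = sample_graph(400, rng)
y_s = graph_filter(L_s, np.cos(2 * np.pi * u_s))
y_l = graph_filter(L_l, np.cos(2 * np.pi * u_l))

# Latents of the small graph coincide with every 4th latent of the large one.
gap = np.linalg.norm(y_s - y_l[::4]) / np.linalg.norm(y_s)
print(f"relative output gap without retraining: {gap:.3f}")
```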
arXiv Detail & Related papers (2023-07-25T02:11:41Z)
- Learning Stable Graph Neural Networks via Spectral Regularization [18.32587282139282]
Stability of graph neural networks (GNNs) characterizes how GNNs react to graph perturbations and provides guarantees for architecture performance in noisy scenarios.
This paper develops a self-regularized graph neural network (SR-GNN) that improves the architecture stability by regularizing the filter frequency responses in the graph spectral domain.
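A minimal sketch of the regularization idea as we read it from the abstract (not the SR-GNN code): penalize the variation of a polynomial filter's frequency response h(lam) = sum_k h_k lam^k over the spectral interval of a normalized Laplacian, so that training prefers flat, hence more stable, responses. The penalty form, grid, and filter taps below are our assumptions.

```python
# Sketch: a spectral-domain smoothness penalty on a polynomial graph
# filter's frequency response (our reading of the abstract, not SR-GNN).
import numpy as np

def frequency_response(h, lam):
    """h(lam) = sum_k h[k] * lam**k on a grid of eigenvalues lam."""
    return sum(hk * lam ** k for k, hk in enumerate(h))

def spectral_smoothness_penalty(h, lam_max=2.0, n_grid=200):
    """Approximate integral of h'(lam)^2 over [0, lam_max].

    Normalized-Laplacian eigenvalues lie in [0, 2], hence lam_max = 2.
    """
    lam = np.linspace(0.0, lam_max, n_grid)
    dh = np.gradient(frequency_response(h, lam), lam)
    return float(np.mean(dh ** 2) * lam_max)

# In training this would enter as: loss = task_loss + mu * penalty.
for name, h in [("wiggly", [0.0, 1.0, -1.0, 0.3]),
                ("flat", [1.0, -0.05, 0.0, 0.0])]:
    print(name, f"penalty = {spectral_smoothness_penalty(h):.3f}")
```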
arXiv Detail & Related papers (2022-11-13T17:27:21Z)
- Graph Convolutional Neural Networks Sensitivity under Probabilistic Error Model [24.215504503548864]
This paper proposes an analysis framework to investigate the sensitivity of GCNNs to probabilistic graph perturbations.
Our study establishes tight expected GSO error bounds, which are explicitly linked to the error model parameters, and reveals a linear relationship between GSO perturbations and the resulting output differences.
Experiments validate our theoretical derivations and the effectiveness of our approach.
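A small Monte Carlo sketch of the setting (our illustration; the paper derives closed-form expected bounds rather than simulating): drop each edge independently with probability p and track both the expected GSO error and the induced change in a fixed filter's output. The graph, filter taps, and trial count are assumptions.

```python
# Sketch: expected GSO error and filter-output error under independent
# link losses (Monte Carlo illustration of a probabilistic error model).
import numpy as np

rng = np.random.default_rng(3)
n, trials = 60, 200
A = np.triu(rng.random((n, n)) < 0.15, 1)
A = (A | A.T).astype(float)                  # adjacency used as the GSO
x = rng.standard_normal(n)
filt = lambda S, v: 0.6 * v + 0.3 * (S @ v) + 0.1 * (S @ (S @ v))

for p in (0.05, 0.10, 0.20):
    gso_err = out_err = 0.0
    for _ in range(trials):
        keep = np.triu(rng.random((n, n)) >= p, 1)
        Ap = A * (keep | keep.T)             # random link losses
        gso_err += np.linalg.norm(A - Ap, 2) / trials   # spectral norm
        out_err += np.linalg.norm(filt(A, x) - filt(Ap, x)) / trials
    print(f"p={p:.2f}: E||S - S_p|| ~ {gso_err:.2f}, "
          f"E||y - y_p|| ~ {out_err:.2f}")
```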
arXiv Detail & Related papers (2022-03-15T12:40:10Z)
- Spectral Graph Convolutional Networks With Lifting-based Adaptive Graph Wavelets [81.63035727821145]
Spectral graph convolutional networks (SGCNs) have been attracting increasing attention in graph representation learning.
We propose a novel class of spectral graph convolutional networks that implement graph convolutions with adaptive graph wavelets.
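For intuition about the lifting machinery (a generic predict/update step of our own; the paper's lifting operators are adaptive and learned, which this sketch is not): any predict/update pair splits a graph signal into approximation and detail coefficients and is invertible by construction.

```python
# Sketch: one generic lifting step (predict/update) on a graph signal.
# Illustrative only; the paper's operators are adaptive and learned.
import numpy as np

rng = np.random.default_rng(4)
n = 40
A = np.triu(rng.random((n, n)) < 0.2, 1)
A = (A | A.T).astype(float)
x = np.sin(np.linspace(0, 3, n)) + 0.1 * rng.standard_normal(n)

even = np.arange(n) % 2 == 0                 # crude even/odd node split
odd = ~even

def neighbor_average(A, rows, cols):
    """Row-normalized adjacency block: average over neighbors in cols."""
    B = A[np.ix_(rows.nonzero()[0], cols.nonzero()[0])]
    d = B.sum(1, keepdims=True)
    return np.divide(B, d, out=np.zeros_like(B), where=d > 0)

P = neighbor_average(A, odd, even)           # predict odd from even
U = 0.5 * neighbor_average(A, even, odd)     # update even with details

detail = x[odd] - P @ x[even]                # high-pass coefficients
approx = x[even] + U @ detail                # low-pass coefficients

# Inversion: undo update, then undo predict (exact up to rounding).
x_even = approx - U @ detail
x_odd = detail + P @ x_even
err = max(np.max(np.abs(x_even - x[even])), np.max(np.abs(x_odd - x[odd])))
print(f"reconstruction error: {err:.2e}")
```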
arXiv Detail & Related papers (2021-08-03T17:57:53Z)
- Stability of Graph Convolutional Neural Networks to Stochastic Perturbations [122.12962842842349]
Graph convolutional neural networks (GCNNs) are nonlinear processing tools to learn representations from network data.
Current analysis considers deterministic perturbations but fails to provide relevant insights when topological changes are random.
This paper investigates the stability of GCNNs to stochastic graph perturbations induced by link losses.
arXiv Detail & Related papers (2021-06-19T16:25:28Z)
- Graph and graphon neural network stability [122.06927400759021]
Graph neural networks (GNNs) are learning architectures that rely on knowledge of the graph structure to generate meaningful representations of network data.
We analyze GNN stability using kernel objects called graphons.
arXiv Detail & Related papers (2020-10-23T16:55:56Z)
- A Unified View on Graph Neural Networks as Graph Signal Denoising [49.980783124401555]
Graph Neural Networks (GNNs) have risen to prominence in learning representations for graph-structured data.
In this work, we establish mathematically that the aggregation processes in a group of representative GNN models can be regarded as solving a graph denoising problem.
We instantiate a novel GNN model, ADA-UGNN, derived from UGNN, to handle graphs with adaptive smoothness across nodes.
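The correspondence can be made explicit with a standard worked example (our code, following the usual derivation, not the paper's implementation): one gradient step on the denoising objective ||F - X||_F^2 + c * tr(F^T L F), started at F = X with step size 1/(2c), reproduces GCN-style aggregation.

```python
# Worked example: one gradient step of graph-signal denoising equals
# GCN aggregation (standard derivation; our code, not the paper's).
import numpy as np

rng = np.random.default_rng(5)
n = 30
A = np.triu(rng.random((n, n)) < 0.3, 1)
A = (A | A.T).astype(float) + np.eye(n)      # self-loops keep degrees > 0
d_is = np.diag(A.sum(1) ** -0.5)
A_norm = d_is @ A @ d_is
L = np.eye(n) - A_norm                       # normalized Laplacian
X = rng.standard_normal((n, 4))              # node features

c, lr = 1.0, 0.5                             # lr = 1 / (2c)
grad_at_X = 2 * c * (L @ X)  # full objective's gradient at F = X
                             # (the fidelity term's gradient vanishes there)
one_step = X - lr * grad_at_X
gcn_agg = A_norm @ X                         # GCN-style propagation
print("max |one_step - gcn_agg| =", np.max(np.abs(one_step - gcn_agg)))

# The exact minimizer is the low-pass graph filter (I + c L)^{-1} X.
F_star = np.linalg.solve(np.eye(n) + c * L, X)
```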
arXiv Detail & Related papers (2020-10-05T04:57:18Z)
- Permutation-equivariant and Proximity-aware Graph Neural Networks with Stochastic Message Passing [88.30867628592112]
Graph neural networks (GNNs) are emerging machine learning models on graphs.
Permutation-equivariance and proximity-awareness are two important properties highly desirable for GNNs.
We show that existing GNNs, mostly based on the message-passing mechanism, cannot simultaneously preserve the two properties.
In order to preserve node proximities, we augment the existing GNNs with stochastic node representations.
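The permutation-equivariance half of that claim is easy to check numerically; below is our sketch using a plain GCN-style layer (not the paper's model): relabeling the nodes permutes the output rows identically, f(P A P^T, P X) = P f(A, X).

```python
# Sketch: numeric check that a GCN-style layer is permutation-equivariant.
import numpy as np

rng = np.random.default_rng(6)
n, f_in, f_out = 20, 5, 3
A = np.triu(rng.random((n, n)) < 0.3, 1)
A = (A | A.T).astype(float) + np.eye(n)      # self-loops keep degrees > 0
X = rng.standard_normal((n, f_in))
W = rng.standard_normal((f_in, f_out))       # shared weights

def gcn_layer(A, X, W):
    d_is = np.diag(A.sum(1) ** -0.5)
    return np.maximum(d_is @ A @ d_is @ X @ W, 0.0)   # relu(A_norm X W)

perm = rng.permutation(n)
P = np.eye(n)[perm]                          # permutation matrix
lhs = gcn_layer(P @ A @ P.T, P @ X, W)
rhs = P @ gcn_layer(A, X, W)
print("equivariance gap:", np.max(np.abs(lhs - rhs)))
```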
arXiv Detail & Related papers (2020-09-05T16:46:56Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.