Stability of Aggregation Graph Neural Networks
- URL: http://arxiv.org/abs/2207.03678v2
- Date: Wed, 23 Aug 2023 02:22:37 GMT
- Title: Stability of Aggregation Graph Neural Networks
- Authors: Alejandro Parada-Mayorga, Zhiyang Wang, Fernando Gama, and Alejandro Ribeiro
- Abstract summary: We study the stability properties of aggregation graph neural networks (Agg-GNNs) considering perturbations of the underlying graph.
We prove that the stability bounds are defined by the properties of the filters in the first layer of the CNN that acts on each node.
We also conclude that in Agg-GNNs the selectivity of the mapping operators is tied to the properties of the filters only in the first layer of the CNN stage.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In this paper we study the stability properties of aggregation graph neural
networks (Agg-GNNs) considering perturbations of the underlying graph. An
Agg-GNN is a hybrid architecture where information is defined on the nodes of a
graph, but it is processed block-wise by Euclidean CNNs on the nodes after
several diffusions on the graph shift operator. We derive stability bounds for
the mapping operator associated with a generic Agg-GNN, and we specify conditions
under which such operators can be stable to deformations. We prove that the
stability bounds are defined by the properties of the filters in the first
layer of the CNN that acts on each node. Additionally, we show that there is a
close relationship between the number of aggregations, the filter's
selectivity, and the size of the stability constants. We also conclude that in
Agg-GNNs the selectivity of the mapping operators is tied to the properties of
the filters only in the first layer of the CNN stage. This marks a substantial
difference from selection GNNs, where stability constrains the selectivity of
the filters in all layers. We provide numerical evidence corroborating the
derived results, testing the behavior of Agg-GNNs in real-life application
scenarios under perturbations of different magnitudes.
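To make the aggregation scheme concrete, below is a minimal sketch of an Agg-GNN forward pass in Python/NumPy. It assumes a single input feature per node and one shared first-layer 1D convolution per node; the toy graph, the number of diffusions K, and the filter taps h are illustrative choices, not the authors' exact configuration. A crude empirical stability check under a small perturbation of the shift operator is included at the end.

```python
# Minimal Agg-GNN sketch (illustrative, not the paper's exact setup):
# each node gathers the Euclidean sequence [x_v, (Sx)_v, ..., (S^{K-1}x)_v]
# and a shared 1D CNN layer processes that sequence node by node.
import numpy as np

rng = np.random.default_rng(0)

def aggregation_sequence(S, x, K):
    """Stack x, Sx, ..., S^(K-1)x into an (N, K) array; row v is the
    time-like sequence collected at node v by repeated diffusion."""
    z = np.empty((x.shape[0], K))
    xk = x.copy()
    for k in range(K):
        z[:, k] = xk
        xk = S @ xk
    return z

def cnn_per_node(z, h):
    """Shared first-layer CNN: convolve each node's sequence with the
    filter taps h (valid mode), then apply a ReLU."""
    y = np.stack([np.convolve(row, h, mode="valid") for row in z])
    return np.maximum(y, 0.0)

# Toy undirected graph; normalize the adjacency by its spectral radius,
# a common choice of graph shift operator S.
N, K = 20, 8
A = np.triu((rng.random((N, N)) < 0.2).astype(float), 1)
A = A + A.T
S = A / max(np.abs(np.linalg.eigvalsh(A)).max(), 1e-12)

x = rng.standard_normal(N)        # graph signal, one feature per node
h = rng.standard_normal(3) / 3.0  # hypothetical first-layer filter taps

y = cnn_per_node(aggregation_sequence(S, x, K), h)

# Crude empirical stability check: perturb S slightly, rerun, compare.
E = 0.05 * rng.standard_normal((N, N))
E = (E + E.T) / 2.0               # keep the perturbation symmetric
y_pert = cnn_per_node(aggregation_sequence(S + E, x, K), h)
print(y.shape, np.linalg.norm(y - y_pert) / np.linalg.norm(y))
```

In this toy setting, sweeping K or smoothing the taps h illustrates the trend the paper formalizes: the relative output change is governed by the number of aggregations and by the selectivity of the first-layer filter.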
Related papers
- On the Trade-Off between Stability and Representational Capacity in Graph Neural Networks
We study the stability of EdgeNet: a general GNN framework that unifies more than twenty solutions.
By studying the effect of different EdgeNet categories on the stability, we show that GNNs with fewer degrees of freedom in their parameter space, linked to a lower representational capacity, are more stable.
arXiv Detail & Related papers (2023-12-04T22:07:17Z)
- Geometric Graph Filters and Neural Networks: Limit Properties and Discriminability Trade-offs
We study the relationship between a graph neural network (GNN) and a manifold neural network (MNN) when the graph is constructed from a set of points sampled from the manifold.
We prove non-asymptotic error bounds showing that convolutional filters and neural networks on these graphs converge to convolutional filters and neural networks on the continuous manifold.
arXiv Detail & Related papers (2023-05-29T08:27:17Z)
- Limitless stability for Graph Convolutional Networks
This work establishes rigorous, novel and widely applicable stability guarantees and transferability bounds for graph convolutional networks.
It is shown that graph convolutional networks are stable under graph coarse-graining procedures precisely when the graph shift operator (GSO) is the graph Laplacian and the filters are regular at infinity.
arXiv Detail & Related papers (2023-01-26T22:17:00Z)
- Stability of Neural Networks on Manifolds to Relative Perturbations
Graph Neural Networks (GNNs) show impressive performance in many practical scenarios.
Although GNNs scale well to large graphs, existing stability bounds grow with the number of nodes, at odds with that scalability.
arXiv Detail & Related papers (2021-10-10T04:37:19Z)
- Training Stable Graph Neural Networks Through Constrained Learning
Graph Neural Networks (GNNs) rely on graph convolutions to learn features from network data.
GNNs are stable to different types of perturbations of the underlying graph, a property that they inherit from graph filters.
We propose a novel constrained learning approach by imposing a constraint on the stability condition of the GNN within a perturbation of choice.
arXiv Detail & Related papers (2021-10-07T15:54:42Z)
- Stability of Graph Convolutional Neural Networks to Stochastic Perturbations
Graph convolutional neural networks (GCNNs) are nonlinear processing tools to learn representations from network data.
Current analysis considers deterministic perturbations but fails to provide relevant insights when topological changes are random.
This paper investigates the stability of GCNNs to stochastic graph perturbations induced by link losses.
arXiv Detail & Related papers (2021-06-19T16:25:28Z)
- Permutation-equivariant and Proximity-aware Graph Neural Networks with Stochastic Message Passing
Graph neural networks (GNNs) are emerging machine learning models on graphs.
Permutation-equivariance and proximity-awareness are two important properties highly desirable for GNNs.
We show that existing GNNs, mostly based on the message-passing mechanism, cannot simultaneously preserve the two properties.
To preserve node proximities, we augment the existing GNNs with stochastic node representations.
arXiv Detail & Related papers (2020-09-05T16:46:56Z)