Convergence and Stability of Graph Convolutional Networks on Large
Random Graphs
- URL: http://arxiv.org/abs/2006.01868v2
- Date: Fri, 23 Oct 2020 13:17:58 GMT
- Title: Convergence and Stability of Graph Convolutional Networks on Large
Random Graphs
- Authors: Nicolas Keriven and Alberto Bietti and Samuel Vaiter
- Abstract summary: We study properties of Graph Convolutional Networks (GCNs) by analyzing their behavior on standard models of random graphs.
We first study the convergence of GCNs to their continuous counterpart as the number of nodes grows.
We then analyze the stability of GCNs to small deformations of the random graph model.
- Score: 22.387735135790706
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We study properties of Graph Convolutional Networks (GCNs) by analyzing their
behavior on standard models of random graphs, where nodes are represented by
random latent variables and edges are drawn according to a similarity kernel.
This allows us to overcome the difficulties of dealing with discrete notions
such as isomorphisms on very large graphs, by considering instead more natural
geometric aspects. We first study the convergence of GCNs to their continuous
counterpart as the number of nodes grows. Our results are fully non-asymptotic
and are valid for relatively sparse graphs with an average degree that grows
logarithmically with the number of nodes. We then analyze the stability of GCNs
to small deformations of the random graph model. In contrast to previous
studies of stability in discrete settings, our continuous setup allows us to
provide more intuitive deformation-based metrics for understanding stability,
which have proven useful for explaining the success of convolutional
representations on Euclidean domains.
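The model lends itself to a quick numerical illustration. The sketch below is a toy rendering of this setup, not the paper's exact construction: latent variables are drawn uniformly on [0, 1], edges appear with probability proportional to a Gaussian similarity kernel (scaled so the average degree grows logarithmically with the number of nodes), and one degree-normalized graph-convolution step is compared against its continuous counterpart, the normalized kernel integral, estimated by Monte Carlo. The kernel, the sparsity scaling, and the normalization are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def kernel(u, v, bw=0.2):
    # Illustrative Gaussian similarity kernel on latent positions in [0, 1].
    return np.exp(-((u[:, None] - v[None, :]) ** 2) / (2 * bw ** 2))

def sample_graph(n):
    # Latent variables plus a relatively sparse random graph: edges are drawn
    # with probability alpha * K(x_i, x_j), where alpha ~ log(n)/n keeps the
    # average degree growing logarithmically with n (a toy scaling choice).
    x = rng.uniform(0, 1, n)
    alpha = 10 * np.log(n) / n
    P = np.clip(alpha * kernel(x, x), 0, 1)
    A = (rng.uniform(size=(n, n)) < P).astype(float)
    A = np.triu(A, 1)
    A = A + A.T                              # symmetric, no self-loops
    return x, A

def gcn_layer(A, f):
    # One degree-normalized graph convolution: average f over neighbors.
    deg = np.maximum(A.sum(axis=1), 1.0)
    return (A @ f) / deg

f = lambda t: np.sin(2 * np.pi * t)
# Continuous counterpart at latent position x0: the normalized kernel
# integral (int K(x0, y) f(y) dy) / (int K(x0, y) dy), by Monte Carlo.
y = rng.uniform(0, 1, 200_000)
for n in (300, 1000, 3000):
    x, A = sample_graph(n)
    discrete = gcn_layer(A, f(x))[0]         # output at node 0
    w = kernel(x[:1], y)[0]
    continuous = (w * f(y)).sum() / w.sum()
    print(n, round(discrete, 3), round(continuous, 3))
```

As n grows, the discrete output at a node should approach the kernel integral at that node's latent position, which is the convergence phenomenon the paper quantifies non-asymptotically.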
Related papers
- Unitary convolutions for learning on graphs and groups [0.9899763598214121]
We study unitary group convolutions, which allow for deeper networks that are more stable during training.
The main focus of the paper is graph neural networks, where we show that unitary graph convolutions provably avoid over-smoothing.
Our experimental results confirm that unitary graph convolutional networks achieve competitive performance on benchmark datasets.
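As a rough picture of why unitarity helps with over-smoothing, the sketch below (a toy with assumed constructions, not the paper's architecture) builds U = exp(itL) from a graph Laplacian L; since iL is skew-Hermitian, U is unitary and preserves the norm of node features under repeated application, while repeated neighborhood averaging contracts features toward a constant.

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)

# Toy graph: symmetric adjacency, no self-loops, and its Laplacian.
n = 50
A = (rng.uniform(size=(n, n)) < 0.1).astype(float)
A = np.triu(A, 1)
A = A + A.T
L = np.diag(A.sum(axis=1)) - A

# U = exp(i t L) is unitary because i L is skew-Hermitian, so it preserves
# the norm of node features; plain neighborhood averaging contracts them
# toward a constant (over-smoothing).
U = expm(1j * 0.5 * L)
P = A / np.maximum(A.sum(axis=1, keepdims=True), 1.0)   # averaging operator

x = rng.standard_normal(n)
x_unitary = x.astype(complex)
x_avg = x.copy()
for _ in range(20):
    x_unitary = U @ x_unitary                # norm preserved at every step
    x_avg = P @ x_avg                        # norm shrinks layer after layer
print(np.linalg.norm(x), np.linalg.norm(x_unitary), np.linalg.norm(x_avg))
```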
arXiv Detail & Related papers (2024-10-07T21:09:14Z)
- Estimating the stability number of a random graph using convolutional neural networks [0.0]
We explore the feasibility of using convolutional neural networks (CNNs) on images of graphs to predict the cardinality of graph properties, such as the stability number, of random graphs and networks.
The model and results presented in this study suggest potential for applying deep learning to optimization problems not previously tackled with simple deep learning techniques.
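A minimal sketch of the idea, in which the image encoding and the tiny CNN are placeholders rather than the paper's model: treat the adjacency matrix of a random graph as a one-channel image and regress a scalar graph quantity, such as the stability number, with a small convolutional network.

```python
import numpy as np
import torch
import torch.nn as nn

rng = np.random.default_rng(0)

def graph_image(n=32, p=0.3):
    # Render an Erdos-Renyi adjacency matrix as a one-channel "image".
    A = (rng.uniform(size=(n, n)) < p).astype(np.float32)
    A = np.triu(A, 1)
    A = A + A.T
    return torch.from_numpy(A)[None, None]   # shape (1, 1, n, n)

# A tiny CNN regressor from the adjacency image to a scalar target
# (e.g. the stability number); untrained here, just the shape of the idea.
model = nn.Sequential(
    nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(4), nn.Flatten(),
    nn.Linear(8 * 16, 1),
)
print(model(graph_image()).item())
```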
arXiv Detail & Related papers (2024-07-10T16:50:59Z)
- Limitless stability for Graph Convolutional Networks [8.1585306387285]
This work establishes rigorous, novel and widely applicable stability guarantees and transferability bounds for graph convolutional networks.
It is shown that graph convolutional networks are stable under graph coarse-graining procedures precisely when the graph shift operator (GSO) is the graph Laplacian and the filters are regular at infinity.
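To make the filter condition concrete, here is a toy spectral filter with the graph Laplacian as GSO; the response h(λ) = 1/(1 + λ) is regular at infinity in the sense that it has a finite limit as λ → ∞. The graph and the choice of h are illustrative assumptions, not the paper's examples.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy graph and its Laplacian, used as the graph shift operator (GSO).
n = 30
A = (rng.uniform(size=(n, n)) < 0.2).astype(float)
A = np.triu(A, 1)
A = A + A.T
L = np.diag(A.sum(axis=1)) - A

lam, V = np.linalg.eigh(L)                  # eigendecomposition of the GSO
h = lambda lam: 1.0 / (1.0 + lam)           # regular at infinity: h -> 0
x = rng.standard_normal(n)                  # a graph signal
y = V @ (h(lam) * (V.T @ x))                # filtered signal h(L) x
print(y[:5])
```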
arXiv Detail & Related papers (2023-01-26T22:17:00Z)
- Stable and Transferable Hyper-Graph Neural Networks [95.07035704188984]
We introduce an architecture for processing signals supported on hypergraphs via graph neural networks (GNNs).
We provide a framework for bounding the stability and transferability error of GNNs across arbitrary graphs via spectral similarity.
arXiv Detail & Related papers (2022-11-11T23:44:20Z)
- Stability of Aggregation Graph Neural Networks [153.70485149740608]
We study the stability properties of aggregation graph neural networks (Agg-GNNs) considering perturbations of the underlying graph.
We prove that the stability bounds are determined by the properties of the filters in the first layer of the CNN that acts on each node.
We also conclude that in Agg-GNNs the selectivity of the mapping operators is tied to the properties of the filters only in the first layer of the CNN stage.
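The aggregation stage can be sketched as follows, with the shift operator, signal, and filter taps as placeholders: each node i collects the regular sequence [x_i, (Sx)_i, (S^2 x)_i, ...] generated by repeated application of a graph shift operator S, and a conventional 1-D convolution, whose first-layer taps carry the stability properties discussed above, is then applied per node.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy graph and a normalized graph shift operator S.
n, K = 40, 8
A = (rng.uniform(size=(n, n)) < 0.15).astype(float)
A = np.triu(A, 1)
A = A + A.T
S = A / np.maximum(A.sum(axis=1, keepdims=True), 1.0)

# Aggregation: Z[i] = [x_i, (Sx)_i, ..., (S^{K-1}x)_i] is a regular,
# time-series-like signal at node i.
x = rng.standard_normal(n)
Z = np.stack([np.linalg.matrix_power(S, k) @ x for k in range(K)], axis=1)

# Per-node 1-D convolution; the taps here are random placeholders for the
# first-layer CNN filters whose properties govern the stability bounds.
taps = rng.standard_normal(3)
out = np.stack([np.convolve(Z[i], taps, mode="valid") for i in range(n)])
print(out.shape)                             # (n, K - 2)
```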
arXiv Detail & Related papers (2022-07-08T03:54:52Z)
- Stability of Neural Networks on Manifolds to Relative Perturbations [118.84154142918214]
Graph Neural Networks (GNNs) show impressive performance in many practical scenarios.
GNNs are expected to scale to large graphs, yet existing stability bounds grow with the number of nodes, contradicting that expectation.
arXiv Detail & Related papers (2021-10-10T04:37:19Z)
- Training Stable Graph Neural Networks Through Constrained Learning [116.03137405192356]
Graph Neural Networks (GNNs) rely on graph convolutions to learn features from network data.
GNNs are stable to different types of perturbations of the underlying graph, a property that they inherit from graph filters.
We propose a novel constrained learning approach by imposing a constraint on the stability condition of the GNN within a perturbation of choice.
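One way to picture such a constraint, as a hedged toy rather than the paper's algorithm, is a primal-dual loop that penalizes violations of a stability condition ||Φ(Ŝ)x − Φ(S)x|| ≤ ε under a chosen perturbation Ŝ of the graph; the one-layer model, the data, and the single-edge perturbation below are all placeholders.

```python
import torch

torch.manual_seed(0)

# Toy symmetric graph operator S and a perturbed copy S_hat (one edge flip).
n = 20
S = torch.rand(n, n)
S = torch.triu(S, 1)
S = ((S + S.T) > 0.8).float()
S_hat = S.clone()
S_hat[0, 1] = S_hat[1, 0] = 1 - S_hat[0, 1]

x = torch.randn(n, 1)                        # placeholder node features
y = torch.randn(n, 1)                        # placeholder regression targets
w = torch.randn(1, 1, requires_grad=True)    # one-layer "GNN" weight
eps, mu = 0.1, torch.tensor(0.0)             # stability bound, dual variable

opt = torch.optim.SGD([w], lr=0.01)
for _ in range(200):
    out = torch.relu(S @ x @ w)              # Phi(S) x
    out_hat = torch.relu(S_hat @ x @ w)      # Phi(S_hat) x
    task = ((out - y) ** 2).mean()
    slack = torch.norm(out - out_hat) - eps  # stability constraint slack
    loss = task + mu * slack                 # Lagrangian of the constraint
    opt.zero_grad()
    loss.backward()
    opt.step()
    mu = torch.clamp(mu + 0.05 * slack.detach(), min=0.0)  # dual ascent
print(float(task), float(slack))
```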
arXiv Detail & Related papers (2021-10-07T15:54:42Z)
- Graph and graphon neural network stability [122.06927400759021]
Graph neural networks (GNNs) are learning architectures that rely on knowledge of the graph structure to generate meaningful representations of network data.
We analyze GNN stability using kernel objects called graphons.
arXiv Detail & Related papers (2020-10-23T16:55:56Z)
- Block-Approximated Exponential Random Graphs [77.4792558024487]
An important challenge in the field of exponential random graphs (ERGs) is the fitting of non-trivial ERGs on large graphs.
We propose an approximation framework for such non-trivial ERGs that results in dyadically independent (i.e., edge-independent) distributions.
Our methods are scalable to sparse graphs consisting of millions of nodes.
arXiv Detail & Related papers (2020-02-14T11:42:16Z)
- The Power of Graph Convolutional Networks to Distinguish Random Graph Models: Short Version [27.544219236164764]
Graph convolutional networks (GCNs) are a widely used method for graph representation learning.
We investigate the power of GCNs to distinguish between different random graph models on the basis of the embeddings of their sample graphs.
arXiv Detail & Related papers (2020-02-13T17:58:42Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.