On the Stability of Low Pass Graph Filter With a Large Number of Edge
Rewires
- URL: http://arxiv.org/abs/2110.07234v1
- Date: Thu, 14 Oct 2021 09:00:35 GMT
- Title: On the Stability of Low Pass Graph Filter With a Large Number of Edge
Rewires
- Authors: Hoang-Son Nguyen, Yiran He, Hoi-To Wai
- Abstract summary: We show that the stability of a graph filter depends on the perturbation to the community structure.
For stochastic block model graphs, the graph filter distance converges to zero as the number of nodes approaches infinity.
- Score: 21.794739464247687
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Recently, the stability of graph filters has been studied as one of the key
theoretical properties driving the highly successful graph convolutional neural
networks (GCNs). The stability of a graph filter characterizes the effect of
topology perturbation on the output of a graph filter, a fundamental building
block for GCNs. Many existing results have focused on the regime of small
perturbation with a small number of edge rewires. However, the number of edge
rewires can be large in many applications. To study the latter case, this work
departs from the previous analysis and proves a bound on the stability of graph
filters that relies on the filter's frequency response. Assuming the graph filter is
low pass, we show that the stability of the filter depends on the perturbation to
the community structure. As an application, we show that for stochastic block
model graphs, the graph filter distance converges to zero when the number of
nodes approaches infinity. Numerical simulations validate our findings.
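A minimal numerical sketch of the setting described in the abstract (hypothetical code, not the authors' implementation): it builds a two-community stochastic block model (SBM) graph, applies a low-pass filter, here the illustrative choice H(L) = (I + L)^{-1} with L the combinatorial Laplacian, and measures the spectral-norm filter distance after a large number of random edge rewires.

```python
import numpy as np

rng = np.random.default_rng(0)

def sbm_adjacency(n, p_in, p_out):
    """Symmetric adjacency matrix of a 2-community SBM, n nodes per community."""
    N = 2 * n
    labels = np.repeat([0, 1], n)
    probs = np.where(labels[:, None] == labels[None, :], p_in, p_out)
    upper = np.triu(rng.random((N, N)) < probs, k=1)
    return (upper | upper.T).astype(float)

def low_pass_filter(A):
    """Low-pass graph filter H(L) = (I + L)^{-1}, L the combinatorial Laplacian."""
    L = np.diag(A.sum(axis=1)) - A
    return np.linalg.inv(np.eye(len(A)) + L)

def rewire(A, k):
    """Flip k distinct upper-triangular entries of A (random edge rewires)."""
    B = A.copy()
    iu = np.triu_indices(len(A), k=1)
    for idx in rng.choice(len(iu[0]), size=k, replace=False):
        i, j = iu[0][idx], iu[1][idx]
        B[i, j] = B[j, i] = 1.0 - B[i, j]
    return B

n = 100
A = sbm_adjacency(n, p_in=0.5, p_out=0.05)
A_pert = rewire(A, k=200)  # a large number of edge rewires

# Spectral-norm distance between the filter outputs on the two topologies
dist = np.linalg.norm(low_pass_filter(A) - low_pass_filter(A_pert), ord=2)
print(f"filter distance after 200 rewires: {dist:.4f}")
```

The paper's claim can be probed by increasing n while keeping the rewiring rate fixed: if the filter is low pass and the community structure is preserved, the measured filter distance should shrink as the graph grows.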
Related papers
- Geometric Graph Filters and Neural Networks: Limit Properties and
Discriminability Trade-offs [122.06927400759021]
We study the relationship between a graph neural network (GNN) and a manifold neural network (MNN) when the graph is constructed from a set of points sampled from the manifold.
We prove non-asymptotic error bounds showing that convolutional filters and neural networks on these graphs converge to convolutional filters and neural networks on the continuous manifold.
arXiv Detail & Related papers (2023-05-29T08:27:17Z) - Limitless stability for Graph Convolutional Networks [8.1585306387285]
This work establishes rigorous, novel and widely applicable stability guarantees and transferability bounds for graph convolutional networks.
It is showcased that graph convolutional networks are stable under graph-coarse-graining procedures precisely if the GSO is the graph Laplacian and filters are regular at infinity.
arXiv Detail & Related papers (2023-01-26T22:17:00Z) - Online Filtering over Expanding Graphs [14.84852576248587]
We propose an online update of the filter, based on the principles of online machine learning.
We demonstrate the performance of our method on signals at the incoming nodes.
These findings lay the foundation for efficient filtering over expanding graphs.
arXiv Detail & Related papers (2023-01-17T14:07:52Z) - Stability of Aggregation Graph Neural Networks [153.70485149740608]
We study the stability properties of aggregation graph neural networks (Agg-GNNs) considering perturbations of the underlying graph.
We prove that the stability bounds are defined by the properties of the filters in the first layer of the CNN that acts on each node.
We also conclude that in Agg-GNNs the selectivity of the mapping operators is tied to the properties of the filters only in the first layer of the CNN stage.
arXiv Detail & Related papers (2022-07-08T03:54:52Z) - Beyond Low-pass Filtering: Graph Convolutional Networks with Automatic
Filtering [61.315598419655224]
We propose Automatic Graph Convolutional Networks (AutoGCN) to capture the full spectrum of graph signals.
While it is based on graph spectral theory, our AutoGCN is also localized in space and has a spatial form.
arXiv Detail & Related papers (2021-07-10T04:11:25Z) - Stability to Deformations of Manifold Filters and Manifold Neural Networks [89.53585099149973]
The paper defines and studies manifold (M) convolutional filters and manifold neural networks (MNNs).
The main technical contribution of the paper is to analyze the stability of manifold filters and MNNs to smooth deformations of the manifold.
arXiv Detail & Related papers (2021-06-07T15:41:03Z) - Graph Neural Networks with Adaptive Frequency Response Filter [55.626174910206046]
We develop a graph neural network framework AdaGNN with a well-smooth adaptive frequency response filter.
We empirically validate the effectiveness of the proposed framework on various benchmark datasets.
arXiv Detail & Related papers (2021-04-26T19:31:21Z) - Interpretable Stability Bounds for Spectral Graph Filters [12.590415345079991]
We study filter stability and provide a novel and interpretable upper bound on the change of filter output.
This upper bound allows us to reason, in terms of structural properties of the graph, when a spectral graph filter will be stable.
arXiv Detail & Related papers (2021-02-18T19:25:52Z) - Unrolling of Deep Graph Total Variation for Image Denoising [106.93258903150702]
In this paper, we combine classical graph signal filtering with deep feature learning into a competitive hybrid design.
We employ interpretable analytical low-pass graph filters and use 80% fewer network parameters than the state-of-the-art DL denoising scheme DnCNN.
arXiv Detail & Related papers (2020-10-21T20:04:22Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.