A Robust Alternative for Graph Convolutional Neural Networks via Graph Neighborhood Filters
- URL: http://arxiv.org/abs/2110.00844v1
- Date: Sat, 2 Oct 2021 17:05:27 GMT
- Title: A Robust Alternative for Graph Convolutional Neural Networks via Graph Neighborhood Filters
- Authors: Victor M. Tenorio, Samuel Rey, Fernando Gama, Santiago Segarra and Antonio G. Marques
- Abstract summary: We present the neighborhood graph filters (NGFs), a family of graph filters that replace the powers of the graph shift operator with $k$-hop neighborhood adjacency matrices.
NGFs help to alleviate the numerical issues of traditional GFs, allow for the design of deeper GCNNs, and enhance the robustness to errors in the topology of the graph.
- Score: 84.20468404544047
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Graph convolutional neural networks (GCNNs) are popular deep learning
architectures that, upon replacing regular convolutions with graph filters
(GFs), generalize CNNs to irregular domains. However, classical GFs are prone
to numerical errors since they consist of high-order polynomials. This problem
is aggravated when several filters are applied in cascade, limiting the
practical depth of GCNNs. To tackle this issue, we present the neighborhood
graph filters (NGFs), a family of GFs that replaces the powers of the graph
shift operator with $k$-hop neighborhood adjacency matrices. NGFs help to
alleviate the numerical issues of traditional GFs, allow for the design of
deeper GCNNs, and enhance the robustness to errors in the topology of the
graph. To illustrate the advantage over traditional GFs in practical
applications, we use NGFs in the design of deep neighborhood GCNNs to solve
graph signal denoising and node classification problems over both synthetic and
real-world data.
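To make the construction concrete, the following minimal sketch (our illustration, not the authors' code) contrasts a classical polynomial graph filter, $y = \sum_{k=0}^{K} h_k S^k x$, with an NGF in which each power $S^k$ is replaced by a $k$-hop neighborhood matrix. We assume here that the $k$-hop neighborhood matrix marks node pairs at shortest-path distance exactly $k$; the paper's precise definition may differ.

```python
# Sketch contrasting a classical polynomial graph filter with a neighborhood
# graph filter (NGF). Assumption (not taken from the paper's code): the k-hop
# neighborhood matrix N_k has N_k[i, j] = 1 iff dist(i, j) == k.
import numpy as np

def khop_neighborhood_matrices(A, K):
    """Return [N_0, ..., N_K] for a binary, symmetric adjacency matrix A."""
    n = A.shape[0]
    mats = [np.eye(n)]                        # N_0 = I: each node is 0 hops from itself
    reach = np.eye(n, dtype=bool)             # pairs already within < k hops
    frontier = np.eye(n, dtype=bool)
    for _ in range(K):
        one_more = (frontier.astype(float) @ A) > 0   # reachable in one extra hop
        frontier = one_more & ~reach                  # first reached at exactly k hops
        reach |= frontier
        mats.append(frontier.astype(float))
    return mats

def classical_gf(S, h, x):
    """Classical GF: y = sum_k h[k] * S^k x. High powers of S cause numerical issues."""
    y, Skx = np.zeros_like(x), x.copy()
    for hk in h:
        y += hk * Skx
        Skx = S @ Skx
    return y

def ngf(A, h, x):
    """NGF: the powers S^k are replaced by k-hop neighborhood matrices N_k."""
    return sum(hk * (Nk @ x)
               for hk, Nk in zip(h, khop_neighborhood_matrices(A, len(h) - 1)))

rng = np.random.default_rng(0)
A = (rng.random((20, 20)) < 0.2).astype(float)
A = np.triu(A, 1); A = A + A.T                # symmetric adjacency, no self-loops
x = rng.standard_normal(20)                   # graph signal
h = 0.5 * rng.standard_normal(5)              # filter taps h_0, ..., h_4
print("classical GF:", classical_gf(A, h, x)[:3])
print("NGF:         ", ngf(A, h, x)[:3])
```

Because the entries of each N_k are binary, they stay bounded no matter how large k gets, whereas the entries of S^k can grow or vanish; this is the intuition behind the improved numerical behavior of deeper filters claimed in the abstract.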
Related papers
- Spatio-Spectral Graph Neural Networks [50.277959544420455]
We propose Spatio-Spectral Graph Neural Networks (S$^2$GNNs), which combine spatially and spectrally parametrized graph filters.
We show that S$^2$GNNs vanquish over-squashing and yield strictly tighter approximation-theoretic error bounds than MPGNNs.
arXiv Detail & Related papers (2024-05-29T14:28:08Z)
- Robust Graph Neural Network based on Graph Denoising [10.564653734218755]
Graph Neural Networks (GNNs) have emerged as a promising alternative for addressing learning problems on non-Euclidean datasets.
This work proposes a robust implementation of GNNs that explicitly accounts for the presence of perturbations in the observed topology.
arXiv Detail & Related papers (2023-12-11T17:43:57Z)
- Training Graph Neural Networks on Growing Stochastic Graphs [114.75710379125412]
Graph Neural Networks (GNNs) rely on graph convolutions to exploit meaningful patterns in networked data.
We propose to learn GNNs on very large graphs by leveraging the limit object of a sequence of growing graphs, the graphon (see the sketch at the end of this list).
arXiv Detail & Related papers (2022-10-27T16:00:45Z)
- KerGNNs: Interpretable Graph Neural Networks with Graph Kernels [14.421535610157093]
Graph neural networks (GNNs) have become the state-of-the-art method in downstream graph-related tasks.
We propose a novel GNN framework, termed Kernel Graph Neural Networks (KerGNNs).
KerGNNs integrate graph kernels into the message passing process of GNNs.
We show that our method achieves competitive performance compared with existing state-of-the-art methods.
arXiv Detail & Related papers (2022-01-03T06:16:30Z)
- Transferability Properties of Graph Neural Networks [125.71771240180654]
Graph neural networks (GNNs) are provably successful at learning representations from data supported on moderate-scale graphs.
We study the problem of training GNNs on graphs of moderate size and transferring them to large-scale graphs.
Our results show that (i) the transference error decreases with the graph size, and (ii) graph filters have a transferability-discriminability tradeoff that, in GNNs, is alleviated by the scattering behavior of the nonlinearity.
arXiv Detail & Related papers (2021-12-09T00:08:09Z)
- Framework for Designing Filters of Spectral Graph Convolutional Neural Networks in the Context of Regularization Theory [1.0152838128195467]
Graph convolutional neural networks (GCNNs) have been widely used in graph learning.
It has been observed that the smoothness functional on graphs can be defined in terms of the graph Laplacian.
In this work, we explore the regularization properties of the graph Laplacian and propose a generalized framework for regularized filter design in spectral GCNNs.
arXiv Detail & Related papers (2020-09-29T06:19:08Z)
- Graph Neural Networks: Architectures, Stability and Transferability [176.3960927323358]
Graph Neural Networks (GNNs) are information processing architectures for signals supported on graphs.
They are generalizations of convolutional neural networks (CNNs) in which individual layers contain banks of graph convolutional filters.
arXiv Detail & Related papers (2020-08-04T18:57:36Z)
- Graph Neural Networks for Motion Planning [108.51253840181677]
We present two techniques, GNNs over dense fixed graphs for low-dimensional problems and sampling-based GNNs for high-dimensional problems.
We examine the ability of a GNN to tackle planning problems such as identifying critical nodes or learning the sampling distribution in Rapidly-exploring Random Trees (RRTs).
Experiments with critical sampling, a pendulum, and a six-DoF robot arm show that GNNs improve on traditional analytic methods as well as on learning approaches using fully connected or convolutional neural networks.
arXiv Detail & Related papers (2020-06-11T08:19:06Z)
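On the growing-graphs entry above: a graphon is a symmetric function $W: [0,1]^2 \to [0,1]$ that serves as the limit object of a sequence of growing graphs, and graphs of any size can be sampled from it. The sketch below is our illustration of that sampling step only, not the paper's training pipeline; the two-block graphon and all parameter values are assumptions chosen for the example.

```python
# Illustrative sketch: sample a growing sequence of graphs from one fixed
# graphon W, the limit object referenced in "Training Graph Neural Networks
# on Growing Stochastic Graphs". W and its parameters are our assumptions.
import numpy as np

def sbm_graphon(u, v, p=0.8, q=0.1):
    """Two-block graphon: edge probability p within a block, q across blocks."""
    same_block = (u < 0.5) == (v < 0.5)
    return np.where(same_block, p, q)

def sample_graph(W, n, rng):
    """Draw latent u_i ~ Uniform[0,1]; connect i and j with probability W(u_i, u_j)."""
    u = rng.random(n)
    P = W(u[:, None], u[None, :])
    A = (rng.random((n, n)) < P).astype(float)
    A = np.triu(A, 1)
    return A + A.T                            # symmetric adjacency, no self-loops

rng = np.random.default_rng(0)
for n in (50, 100, 200, 400):                 # a growing sequence with one shared limit
    A = sample_graph(sbm_graphon, n, rng)
    print(n, round(A.sum() / (n * (n - 1)), 3))   # edge density stabilizes as n grows
```

Because every graph in the sequence is drawn from the same limit object, parameters learned on the small sampled graphs remain meaningful as the graphs grow, which is the premise of training on growing stochastic graphs.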