Nonlinear State-Space Generalizations of Graph Convolutional Neural Networks
- URL: http://arxiv.org/abs/2010.14585v2
- Date: Thu, 11 Feb 2021 19:54:58 GMT
- Title: Nonlinear State-Space Generalizations of Graph Convolutional Neural Networks
- Authors: Luana Ruiz, Fernando Gama, Alejandro Ribeiro, Elvin Isufi
- Abstract summary: Graph convolutional neural networks (GCNNs) learn compositional representations from network data by nesting linear graph convolutions into nonlinearities.
In this work, we approach GCNNs from a state-space perspective revealing that the graph convolutional module is a minimalistic linear state-space model.
We show that this state update may be problematic because it is nonparametric, and depending on the graph spectrum it may explode or vanish.
We propose a novel family of nodal aggregation rules that aggregate node features within a layer in a nonlinear, parametric state-space fashion, allowing for a better trade-off between feature extraction and stability.
- Score: 172.18295279061607
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph convolutional neural networks (GCNNs) learn compositional
representations from network data by nesting linear graph convolutions into
nonlinearities. In this work, we approach GCNNs from a state-space perspective
revealing that the graph convolutional module is a minimalistic linear
state-space model, in which the state update matrix is the graph shift
operator. We show that this state update may be problematic because it is
nonparametric, and depending on the graph spectrum it may explode or vanish.
Therefore, the GCNN has to trade its degrees of freedom between extracting
features from data and handling these instabilities. To improve this
trade-off, we propose a novel family of nodal aggregation rules that
aggregate node features within a layer in a nonlinear, parametric
state-space fashion. We develop two architectures within this family
inspired by the recurrence with and without nodal gating mechanisms. The
proposed solutions generalize the GCNN and provide an additional handle to
control the state update and learn from the data. Numerical results on source
localization and authorship attribution show the superiority of the nonlinear
state-space generalization models over the baseline GCNN.
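To make the state-space reading concrete: the graph convolution y = sum_k h_k S^k x can be written as the linear recursion z_0 = x, z_{k+1} = S z_k, y = sum_k h_k z_k, whose state update matrix is the graph shift operator S. The sketch below (illustrative NumPy, not the authors' implementation) shows how the nonparametric update z -> S z explodes or vanishes with the spectrum of S; the gated variant at the end is an assumption in the spirit of the paper's nodal gating mechanisms, not its exact aggregation rule.

```python
# Minimal sketch of the state-space view of a graph convolution
# (illustrative only; not the authors' implementation):
#   z_0 = x,  z_{k+1} = S z_k,  y = sum_k h_k z_k
import numpy as np

rng = np.random.default_rng(0)

# Random undirected graph; the adjacency matrix plays the role of the
# graph shift operator S.
n, K = 50, 8
A = (rng.random((n, n)) < 0.1).astype(float)
S = np.triu(A, 1)
S = S + S.T

x = rng.standard_normal(n)   # graph signal (one feature per node)
h = rng.standard_normal(K)   # filter taps: the only learnable parameters

def graph_conv(S, x, h):
    """Graph convolution as a linear state-space recursion. The state
    update z -> S z is nonparametric: its norm is governed entirely by
    the spectrum of S."""
    z, y = x.copy(), h[0] * x
    for hk in h[1:]:
        z = S @ z            # explodes/vanishes like |lambda_max(S)|**k
        y = y + hk * z
    return y

y = graph_conv(S, x, h)
print(f"||y|| = {np.linalg.norm(y):.3e}")

# The instability the abstract points at: ||S^k x|| scales as |lambda_max|^k.
lam_max = np.max(np.abs(np.linalg.eigvalsh(S)))
z = x.copy()
for k in range(1, K):
    z = S @ z
    print(f"k={k}: ||S^k x|| = {np.linalg.norm(z):.3e} (lambda_max = {lam_max:.2f})")

# A hedged sketch of the paper's direction: make the state update itself
# nonlinear and parametric. The per-node sigmoid gate below is an
# assumption for illustration (in the spirit of "nodal gating"); the
# paper's exact aggregation rules are defined in the full text.
def gated_state_update(S, z, W, b):
    """A learned gate decides, per node, how much of the shifted state
    S z to keep, giving the model a handle on explosion/vanishing."""
    g = 1.0 / (1.0 + np.exp(-np.clip(W @ z + b, -30.0, 30.0)))
    return g * (S @ z) + (1.0 - g) * z

W, b = 0.1 * rng.standard_normal((n, n)), np.zeros(n)
z = x.copy()
for _ in range(1, K):
    z = gated_state_update(S, z, W, b)
print(f"gated state norm after {K - 1} updates: {np.linalg.norm(z):.3e}")
```

Because the taps h_k are the only learnable parameters of the plain convolution, they must simultaneously extract features and compensate for the growth of ||S^k x||; giving the state update its own parameters, as in the gated sketch, separates these two roles.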
Related papers
- Scalable Graph Compressed Convolutions [68.85227170390864]
We propose a differentiable method that applies permutations to calibrate input graphs for Euclidean convolution.
Based on the graph calibration, we propose the Compressed Convolution Network (CoCN) for hierarchical graph representation learning.
arXiv Detail & Related papers (2024-07-26T03:14:13Z)
- L^2GC: Lorentzian Linear Graph Convolutional Networks for Node Classification [12.69417276887153]
We propose a novel framework for Lorentzian linear GCN.
We map the learned features of graph nodes into hyperbolic space.
We then perform a Lorentzian linear feature transformation to capture the underlying tree-like structure of data.
arXiv Detail & Related papers (2024-03-10T02:16:13Z)
- Implicit Graph Neural Diffusion Networks: Convergence, Generalization, and Over-Smoothing [7.984586585987328]
Implicit Graph Neural Networks (GNNs) have achieved significant success in addressing graph learning problems.
We introduce a geometric framework for designing implicit graph diffusion layers based on a parameterized graph Laplacian operator.
We show how implicit GNN layers can be viewed as the fixed-point equation of a Dirichlet energy minimization problem.
arXiv Detail & Related papers (2023-08-07T05:22:33Z)
- Geometric Graph Filters and Neural Networks: Limit Properties and Discriminability Trade-offs [122.06927400759021]
We study the relationship between a graph neural network (GNN) and a manifold neural network (MNN) when the graph is constructed from a set of points sampled from the manifold.
We prove non-asymptotic error bounds showing that convolutional filters and neural networks on these graphs converge to convolutional filters and neural networks on the continuous manifold.
arXiv Detail & Related papers (2023-05-29T08:27:17Z)
- Understanding Non-linearity in Graph Neural Networks from the Bayesian-Inference Perspective [33.01636846541052]
Graph neural networks (GNNs) have shown superiority in many prediction tasks over graphs.
We investigate the functions of non-linearity in GNNs for node classification tasks.
arXiv Detail & Related papers (2022-07-22T19:36:12Z)
- Stability of Graph Convolutional Neural Networks to Stochastic Perturbations [122.12962842842349]
Graph convolutional neural networks (GCNNs) are nonlinear processing tools to learn representations from network data.
Current analysis considers deterministic perturbations but fails to provide relevant insights when topological changes are random.
This paper investigates the stability of GCNNs to stochastic graph perturbations induced by link losses.
arXiv Detail & Related papers (2021-06-19T16:25:28Z)
- Graph Neural Networks: Architectures, Stability and Transferability [176.3960927323358]
Graph Neural Networks (GNNs) are information processing architectures for signals supported on graphs.
They are generalizations of convolutional neural networks (CNNs) in which individual layers contain banks of graph convolutional filters.
arXiv Detail & Related papers (2020-08-04T18:57:36Z)
- Graphs, Convolutions, and Neural Networks: From Graph Filters to Graph Neural Networks [183.97265247061847]
We leverage graph signal processing to characterize the representation space of graph neural networks (GNNs).
We discuss the role of graph convolutional filters in GNNs and show that any architecture built with such filters has the fundamental properties of permutation equivariance and stability to changes in the topology (see the sketch after this list).
We also study the use of GNNs in recommender systems and learning decentralized controllers for robot swarms.
arXiv Detail & Related papers (2020-03-08T13:02:15Z)
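The permutation-equivariance claim in the last entry admits a compact numerical check. The snippet below is a self-contained sketch under the standard definition of a graph convolutional filter (not code from any cited paper): relabeling the nodes and the signal consistently permutes the filter output the same way, i.e. Phi(P^T S P; P^T x) = P^T Phi(S; x).

```python
# Minimal check of permutation equivariance for a graph convolutional
# filter y = sum_k h_k S^k x (standard definition; illustrative sketch).
import numpy as np

rng = np.random.default_rng(1)

n, K = 20, 5
A = (rng.random((n, n)) < 0.2).astype(float)
S = np.triu(A, 1)
S = S + S.T                      # symmetric shift operator (adjacency)
x = rng.standard_normal(n)       # graph signal
h = rng.standard_normal(K)       # filter taps

def graph_filter(S, x, h):
    z, y = x.copy(), h[0] * x
    for hk in h[1:]:
        z = S @ z
        y = y + hk * z
    return y

P = np.eye(n)[rng.permutation(n)]          # random permutation matrix
y = graph_filter(S, x, h)
y_perm = graph_filter(P.T @ S @ P, P.T @ x, h)

# Relabeling nodes permutes the output: Phi(P^T S P, P^T x) = P^T Phi(S, x)
print("permutation equivariance holds:", np.allclose(y_perm, P.T @ y))
```

The check succeeds because (P^T S P)^k P^T x = P^T S^k x for any permutation matrix P, so every term of the filter is permuted consistently.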
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.