DiffGCN: Graph Convolutional Networks via Differential Operators and
Algebraic Multigrid Pooling
- URL: http://arxiv.org/abs/2006.04115v2
- Date: Thu, 22 Oct 2020 15:36:52 GMT
- Title: DiffGCN: Graph Convolutional Networks via Differential Operators and
Algebraic Multigrid Pooling
- Authors: Moshe Eliasof, Eran Treister
- Abstract summary: Graph Convolutional Networks (GCNs) have been shown to be effective in handling unordered data like point clouds and meshes.
We propose novel approaches for graph convolution, pooling and unpooling, inspired by finite differences and algebraic multigrid frameworks.
- Score: 7.23389716633927
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph Convolutional Networks (GCNs) have been shown to be effective in
handling unordered data like point clouds and meshes. In this work we propose
novel approaches for graph convolution, pooling and unpooling, inspired by
finite differences and algebraic multigrid frameworks. We form a parameterized
convolution kernel based on discretized differential operators, leveraging the
graph mass, gradient and Laplacian. This way, the parameterization does not
depend on the graph structure, only on the meaning of the network convolutions
as differential operators. To allow hierarchical representations of the input,
we propose pooling and unpooling operations that are based on algebraic
multigrid methods, which are mainly used to solve partial differential
equations on unstructured grids. To motivate and explain our method, we compare
it to standard convolutional neural networks and show their similarities and
relations in the case of a regular grid. Our proposed method is demonstrated in
various experiments such as classification and part-segmentation, achieving
results on par with or better than the state of the art. We also analyze the
computational cost of our method compared to that of other GCNs.
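The abstract's core idea can be illustrated in a few lines: build a discrete gradient and Laplacian from the graph's edges, combine them with learnable weights to form a convolution, and use a restriction matrix for multigrid-style pooling. The following is a minimal NumPy sketch under those assumptions, not the authors' implementation; the function names, the scalar weights `a`, `b`, `c` (matrices in a real network), and the hand-written restriction `P` are all illustrative.

```python
import numpy as np

def graph_operators(edges, n):
    # Build the discrete gradient (edge-by-node incidence), graph Laplacian,
    # and a mass term for an undirected graph. Hypothetical helper.
    m = len(edges)
    G = np.zeros((m, n))
    for k, (i, j) in enumerate(edges):
        G[k, i] = -1.0   # edge k carries the difference x[j] - x[i]
        G[k, j] = 1.0
    L = G.T @ G          # graph Laplacian = divergence of the gradient
    M = np.eye(n)        # mass term (plain identity in this sketch)
    return M, G, L

def diff_conv(x, M, G, L, a, b, c):
    # A DiffGCN-flavored convolution: a weighted combination of the mass,
    # node-averaged gradient, and Laplacian responses.
    grad_x = G @ x                          # per-edge feature differences
    node_grad = 0.5 * np.abs(G).T @ grad_x  # average edge values back to nodes
    return a * (M @ x) + b * node_grad + c * (L @ x)

# Toy example: a 4-node path graph with a linear feature on the nodes.
edges = [(0, 1), (1, 2), (2, 3)]
M, G, L = graph_operators(edges, 4)
x = np.arange(4, dtype=float).reshape(4, 1)
y = diff_conv(x, M, G, L, a=1.0, b=0.0, c=1.0)  # identity + Laplacian

# AMG-style pooling sketch: a restriction matrix P aggregates fine nodes
# into coarse nodes; its (scaled) transpose unpools coarse features back.
P = np.array([[0.5, 0.5, 0.0, 0.0],
              [0.0, 0.0, 0.5, 0.5]])
x_coarse = P @ x               # pooling: 4 nodes -> 2 coarse nodes
x_fine = 2.0 * P.T @ x_coarse  # unpooling back to 4 nodes
```

On a regular grid this construction reduces to familiar stencils: the path-graph Laplacian above is the 1-D second-difference stencil, which is the connection to standard CNNs the abstract draws.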
Related papers
- Scalable Graph Compressed Convolutions [68.85227170390864]
We propose a differentiable method that applies permutations to calibrate input graphs for Euclidean convolution.
Based on the graph calibration, we propose the Compressed Convolution Network (CoCN) for hierarchical graph representation learning.
arXiv Detail & Related papers (2024-07-26T03:14:13Z)
- Degree-based stratification of nodes in Graph Neural Networks [66.17149106033126]
We modify the Graph Neural Network (GNN) architecture so that the weight matrices are learned, separately, for the nodes in each group.
This simple-to-implement modification seems to improve performance across datasets and GNN methods.
arXiv Detail & Related papers (2023-12-16T14:09:23Z)
- Discrete Graph Auto-Encoder [52.50288418639075]
We introduce a new framework named Discrete Graph Auto-Encoder (DGAE)
We first use a permutation-equivariant auto-encoder to convert graphs into sets of discrete latent node representations.
In the second step, we sort the sets of discrete latent representations and learn their distribution with a specifically designed auto-regressive model.
arXiv Detail & Related papers (2023-06-13T12:40:39Z)
- Geometric Graph Filters and Neural Networks: Limit Properties and Discriminability Trade-offs [122.06927400759021]
We study the relationship between a graph neural network (GNN) and a manifold neural network (MNN) when the graph is constructed from a set of points sampled from the manifold.
We prove non-asymptotic error bounds showing that convolutional filters and neural networks on these graphs converge to convolutional filters and neural networks on the continuous manifold.
arXiv Detail & Related papers (2023-05-29T08:27:17Z)
- DiffWire: Inductive Graph Rewiring via the Lovász Bound [1.0323063834827415]
Graph Neural Networks (GNNs) have been shown to achieve competitive results to tackle graph-related tasks.
Message-passing neural networks (MPNNs) have been reported to suffer from over-smoothing, over-squashing and under-reaching.
We propose DiffWire, a novel framework for graph rewiring in MPNNs that is principled, fully differentiable and parameter-free.
arXiv Detail & Related papers (2022-06-15T08:22:07Z)
- Effects of Graph Convolutions in Deep Networks [8.937905773981702]
We present a rigorous theoretical understanding of the effects of graph convolutions in multi-layer networks.
We show that a single graph convolution expands the regime of the distance between the means where multi-layer networks can classify the data.
We provide both theoretical and empirical insights into the performance of graph convolutions placed in different combinations among the layers of a network.
arXiv Detail & Related papers (2022-04-20T08:24:43Z)
- Graph Kernel Neural Networks [53.91024360329517]
We propose to use graph kernels, i.e. kernel functions that compute an inner product on graphs, to extend the standard convolution operator to the graph domain.
This allows us to define an entirely structural model that does not require computing the embedding of the input graph.
Our architecture allows plugging in any type of graph kernel and has the added benefit of providing some interpretability.
arXiv Detail & Related papers (2021-12-14T14:48:08Z)
- Simple Graph Convolutional Networks [72.92604941595019]
We propose simple graph convolution operators that can be implemented in single-layer graph convolutional networks.
We show that our convolution operators are more theoretically grounded than many proposals in the literature, and exhibit state-of-the-art predictive performance on the considered benchmark datasets.
arXiv Detail & Related papers (2021-06-10T15:23:59Z)
- Action Recognition with Kernel-based Graph Convolutional Networks [14.924672048447338]
Learning graph convolutional networks (GCNs) aims to generalize deep learning to arbitrary non-regular domains.
We introduce a novel GCN framework that achieves spatial graph convolution in a reproducing kernel Hilbert space (RKHS).
The particularity of our GCN model also resides in its ability to achieve convolutions without explicitly realigning nodes in the receptive fields of the learned graph filters with those of the input graphs.
arXiv Detail & Related papers (2020-12-28T11:02:51Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.