pathGCN: Learning General Graph Spatial Operators from Paths
- URL: http://arxiv.org/abs/2207.07408v1
- Date: Fri, 15 Jul 2022 11:28:11 GMT
- Title: pathGCN: Learning General Graph Spatial Operators from Paths
- Authors: Moshe Eliasof, Eldad Haber, Eran Treister
- Abstract summary: We propose pathGCN, a novel approach to learn the spatial operator from random paths on the graph.
Our experiments on numerous datasets suggest that by properly learning both the spatial and point-wise convolutions, phenomena like over-smoothing can be inherently avoided.
- Score: 9.539495585692007
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph Convolutional Networks (GCNs), similarly to Convolutional Neural
Networks (CNNs), are typically based on two main operations - spatial and
point-wise convolutions. In the context of GCNs, differently from CNNs, a
pre-determined spatial operator based on the graph Laplacian is often chosen,
allowing only the point-wise operations to be learnt. However, learning a
meaningful spatial operator is critical for developing more expressive GCNs for
improved performance. In this paper we propose pathGCN, a novel approach to
learn the spatial operator from random paths on the graph. We analyze the
convergence of our method and its difference from existing GCNs. Furthermore,
we discuss several options of combining our learnt spatial operator with
point-wise convolutions. Our extensive experiments on numerous datasets suggest
that by properly learning both the spatial and point-wise convolutions,
phenomena like over-smoothing can be inherently avoided, and new
state-of-the-art performance is achieved.
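The core idea in the abstract, learning the spatial operator from random paths rather than fixing it to a Laplacian polynomial, can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the function names, the uniform random-walk sampler, and the fixed demo coefficients are all assumptions; in the actual method the per-position path weights would be trained end-to-end.

```python
import numpy as np

def sample_random_walk(adj_list, start, length, rng):
    """Sample one uniform random walk of `length` steps starting at `start`."""
    walk = [start]
    for _ in range(length):
        nbrs = adj_list[walk[-1]]
        walk.append(int(rng.choice(nbrs)) if nbrs else walk[-1])
    return walk

def path_spatial_conv(x, adj_list, weights, n_walks, rng):
    """Learned spatial aggregation over random paths.

    Each position along a sampled walk gets its own coefficient
    (a learnable parameter in a trained model; fixed here for the demo).

    x:        (n_nodes, n_feat) node features
    adj_list: dict mapping node -> list of neighbor nodes
    weights:  (path_len + 1,) one coefficient per walk position
    """
    n_nodes = x.shape[0]
    path_len = len(weights) - 1
    out = np.zeros_like(x)
    for v in range(n_nodes):
        acc = np.zeros(x.shape[1])
        for _ in range(n_walks):
            walk = sample_random_walk(adj_list, v, path_len, rng)
            # Weighted sum of features along the path; averaging over
            # several walks approximates the expected path aggregation.
            for w, u in zip(weights, walk):
                acc += w * x[u]
        out[v] = acc / n_walks
    return out
```

With `weights = [1.0, 0.0]` the operator reduces to the identity (only position 0, the start node, contributes), which makes the averaging easy to sanity-check; nonzero weights on later positions mix in multi-hop neighborhood information.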
Related papers
- Scalable Graph Compressed Convolutions [68.85227170390864]
We propose a differentiable method that applies permutations to calibrate input graphs for Euclidean convolution.
Based on the graph calibration, we propose the Compressed Convolution Network (CoCN) for hierarchical graph representation learning.
arXiv Detail & Related papers (2024-07-26T03:14:13Z)
- Neural Tangent Kernels Motivate Graph Neural Networks with Cross-Covariance Graphs [94.44374472696272]
We investigate NTKs and alignment in the context of graph neural networks (GNNs).
Our results establish the theoretical guarantees on the optimality of the alignment for a two-layer GNN.
These guarantees are characterized by the graph shift operator being a function of the cross-covariance between the input and the output data.
arXiv Detail & Related papers (2023-10-16T19:54:21Z)
- Tuning the Geometry of Graph Neural Networks [0.7614628596146599]
Spatial graph convolution operators have been heralded as key to the success of Graph Neural Networks (GNNs).
We show that this aggregation operator is in fact tunable, and identify explicit regimes in which certain choices of operators, and therefore embedding geometries, might be more appropriate.
arXiv Detail & Related papers (2022-07-12T23:28:03Z)
- Learning Connectivity with Graph Convolutional Networks for Skeleton-based Action Recognition [14.924672048447338]
We introduce a novel framework for graph convolutional networks that learns the topological properties of graphs.
The design principle of our method is based on the optimization of a constrained objective function.
Experiments conducted on the challenging task of skeleton-based action recognition show the superiority of the proposed method.
arXiv Detail & Related papers (2021-12-06T19:43:26Z)
- Positional Encoder Graph Neural Networks for Geographic Data [1.840220263320992]
Graph neural networks (GNNs) provide a powerful and scalable solution for modeling continuous spatial data.
In this paper, we propose PE-GNN, a new framework that incorporates spatial context and correlation explicitly into the models.
arXiv Detail & Related papers (2021-11-19T10:41:49Z)
- Towards Efficient Graph Convolutional Networks for Point Cloud Handling [181.59146413326056]
We aim at improving the computational efficiency of graph convolutional networks (GCNs) for learning on point clouds.
A series of experiments show that optimized networks have reduced computational complexity, decreased memory consumption, and accelerated inference speed.
arXiv Detail & Related papers (2021-04-12T17:59:16Z)
- Learning Chebyshev Basis in Graph Convolutional Networks for Skeleton-based Action Recognition [14.924672048447338]
Spectral graph convolutional networks (GCNs) are a particular class of deep models that aim at extending neural networks to arbitrary irregular domains.
We introduce a novel spectral GCN that learns not only the usual convolutional parameters but also the Laplacian operators.
arXiv Detail & Related papers (2021-04-12T14:08:58Z)
- Action Recognition with Kernel-based Graph Convolutional Networks [14.924672048447338]
Learning graph convolutional networks (GCNs) aims at generalizing deep learning to arbitrary non-regular domains.
We introduce a novel GCN framework that achieves spatial graph convolution in a reproducing kernel Hilbert space (RKHS).
The particularity of our GCN model also resides in its ability to achieve convolutions without explicitly realigning nodes in the receptive fields of the learned graph filters with those of the input graphs.
arXiv Detail & Related papers (2020-12-28T11:02:51Z)
- Bi-GCN: Binary Graph Convolutional Network [57.733849700089955]
We propose a Binary Graph Convolutional Network (Bi-GCN), which binarizes both the network parameters and input node features.
Our Bi-GCN can reduce the memory consumption by an average of 30x for both the network parameters and input data, and accelerate the inference speed by an average of 47x.
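The memory and speed gains come from replacing full-precision features and weights with binary values. A minimal sketch of one such layer, assuming a common XNOR-Net-style quantizer (sign of each entry scaled by the mean absolute value); the Bi-GCN paper's exact quantization and training scheme may differ:

```python
import numpy as np

def binarize(t):
    """Binarize to {-alpha, +alpha}: sign of each entry scaled by the
    mean absolute value (an assumed XNOR-Net-style scheme)."""
    alpha = np.abs(t).mean()
    return np.where(t >= 0, alpha, -alpha)

def binary_gcn_layer(a_hat, x, w):
    """One GCN layer with both input features and weights binarized.

    a_hat: normalized adjacency (n, n)
    x:     node features (n, f)
    w:     layer weights (f, f_out)
    """
    # Binarized matmuls are where hardware implementations replace
    # float multiply-accumulate with XNOR/popcount operations.
    return np.maximum(a_hat @ binarize(x) @ binarize(w), 0.0)  # ReLU
```

Because every binarized entry is one of two values, each matrix can be stored as a bitmap plus a single float scale, which is the source of the ~30x memory reduction the abstract reports.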
arXiv Detail & Related papers (2020-10-15T07:26:23Z)
- Binarized Graph Neural Network [65.20589262811677]
We develop a binarized graph neural network to learn the binary representations of the nodes with binary network parameters.
Our proposed method can be seamlessly integrated into the existing GNN-based embedding approaches.
Experiments indicate that the proposed binarized graph neural network, namely BGN, is orders of magnitude more efficient in terms of both time and space.
arXiv Detail & Related papers (2020-04-19T09:43:14Z)
- Node Masking: Making Graph Neural Networks Generalize and Scale Better [71.51292866945471]
Graph Neural Networks (GNNs) have recently received a lot of interest.
In this paper, we utilize some theoretical tools to better visualize the operations performed by state-of-the-art spatial GNNs.
We introduce a simple concept, Node Masking, that allows them to generalize and scale better.
arXiv Detail & Related papers (2020-01-17T06:26:40Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.