Autobahn: Automorphism-based Graph Neural Nets
- URL: http://arxiv.org/abs/2103.01710v1
- Date: Tue, 2 Mar 2021 13:34:29 GMT
- Title: Autobahn: Automorphism-based Graph Neural Nets
- Authors: Erik Henning Thiede, Wenda Zhou, Risi Kondor
- Abstract summary: We introduce Automorphism-based graph neural networks (Autobahn)
In an Autobahn, we decompose the graph into a collection of subgraphs and apply local convolutions that are equivariant to each subgraph's automorphism group.
We validate our approach by applying Autobahn to molecular graphs, where it achieves state-of-the-art results.
- Score: 12.029647699164315
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We introduce Automorphism-based graph neural networks (Autobahn), a new
family of graph neural networks. In an Autobahn, we decompose the graph into a
collection of subgraphs and apply local convolutions that are equivariant to
each subgraph's automorphism group. Specific choices of local neighborhoods and
subgraphs recover existing architectures such as message passing neural
networks. However, our formalism also encompasses novel architectures: as an
example, we introduce a graph neural network that decomposes the graph into
paths and cycles. The resulting convolutions reflect the natural way that parts
of the graph can transform, preserving the intuitive meaning of convolution
without sacrificing global permutation equivariance. We validate our approach
by applying Autobahn to molecular graphs, where it achieves state-of-the-art
results.
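The core idea above — convolutions equivariant to a subgraph's automorphism group — can be illustrated on the simplest case: a path subgraph, whose only nontrivial automorphism is reversal. The sketch below is our own minimal illustration (not the authors' implementation); symmetrizing a 1-D filter response over the reversal makes the output equivariant to the path's automorphism group.

```python
# Minimal sketch (ours, not the paper's code): a path's automorphism group
# is Z/2 (reversal). Averaging a filter response with its reversed
# counterpart yields a reversal-equivariant convolution on the path.

def path_automorphism_equivariant_conv(features, weights):
    """Apply a 1-D convolution along a path, symmetrized under reversal."""
    n, k = len(features), len(weights)
    pad = k // 2  # zero-pad so output length matches input (odd k assumed)

    def conv(x):
        padded = [0.0] * pad + x + [0.0] * pad
        return [sum(weights[j] * padded[i + j] for j in range(k))
                for i in range(n)]

    forward = conv(features)
    # Same filter applied to the reversed path, then mapped back.
    backward = list(reversed(conv(list(reversed(features)))))
    return [(f + b) / 2 for f, b in zip(forward, backward)]

feats = [1.0, 2.0, 3.0, 4.0]
w = [0.5, 1.0, 0.25]
out = path_automorphism_equivariant_conv(feats, w)
out_rev = path_automorphism_equivariant_conv(list(reversed(feats)), w)
# Equivariance: convolving the reversed path reverses the output.
assert out_rev == list(reversed(out))
```

In a full Autobahn-style layer, such subgraph-local operations would be combined with narrowing and promotion steps across the decomposition; this sketch only isolates the equivariance property.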
Related papers
- Graph Neural Networks for Learning Equivariant Representations of Neural Networks [55.04145324152541]
We propose to represent neural networks as computational graphs of parameters.
Our approach enables a single model to encode neural computational graphs with diverse architectures.
We showcase the effectiveness of our method on a wide range of tasks, including classification and editing of implicit neural representations.
arXiv Detail & Related papers (2024-03-18T18:01:01Z)
- GNN-LoFI: a Novel Graph Neural Network through Localized Feature-based Histogram Intersection [51.608147732998994]
Graph neural networks are increasingly becoming the framework of choice for graph-based machine learning.
We propose a new graph neural network architecture that substitutes classical message passing with an analysis of the local distribution of node features.
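As a hedged toy illustration of the "local distribution of node features" idea (our own sketch, not the GNN-LoFI code): two nodes can be compared by the histogram intersection of the discrete feature values in their neighborhoods.

```python
# Toy sketch (ours): compare nodes via the intersection of the feature
# histograms of their neighborhoods.
from collections import Counter

def neighborhood_histogram(adj, node, features):
    """Histogram of (discrete) feature values over a node's neighbors."""
    return Counter(features[v] for v in adj[node])

def histogram_intersection(h1, h2):
    """K(h1, h2) = sum_x min(h1[x], h2[x]) -- a valid kernel on histograms."""
    return sum(min(h1[x], h2[x]) for x in set(h1) | set(h2))

# A small graph: adjacency lists and one discrete feature per node.
adj = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1], 3: [1]}
features = {0: "C", 1: "C", 2: "O", 3: "H"}

h0 = neighborhood_histogram(adj, 0, features)  # Counter({"C": 1, "O": 1})
h1 = neighborhood_histogram(adj, 1, features)  # Counter({"C": 1, "O": 1, "H": 1})
print(histogram_intersection(h0, h1))  # prints 2
```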
arXiv Detail & Related papers (2024-01-17T13:04:23Z)
- Learning Graph Structure from Convolutional Mixtures [119.45320143101381]
We propose a graph convolutional relationship between the observed and latent graphs, and formulate the graph learning task as a network inverse (deconvolution) problem.
In lieu of eigendecomposition-based spectral methods, we unroll and truncate proximal gradient iterations to arrive at a parameterized neural network architecture that we call a Graph Deconvolution Network (GDN)
GDNs can learn a distribution of graphs in a supervised fashion, perform link prediction or edge-weight regression tasks by adapting the loss function, and they are inherently inductive.
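The unrolling step can be sketched with a generic ISTA-style layer (our illustration under simplifying assumptions, not the paper's GDN): truncate proximal gradient iterations for an elementwise l1-regularized problem; in a learned network, the step size and threshold would be trainable per-layer parameters.

```python
# Hedged sketch of unrolled, truncated proximal gradient iterations
# (a generic ISTA-style layer, not the paper's GDN).

def soft_threshold(x, t):
    """Proximal operator of t*|x|: shrink x toward zero by t."""
    if x > t:
        return x - t
    if x < -t:
        return x + t
    return 0.0

def unrolled_proximal_gradient(y, step, tau, n_layers):
    """n_layers truncated iterations for argmin_x 0.5*(x - y)^2 + tau*|x|,
    applied elementwise; each 'layer' is one proximal gradient step."""
    x = [0.0] * len(y)
    for _ in range(n_layers):
        x = [soft_threshold(xi - step * (xi - yi), step * tau)
             for xi, yi in zip(x, y)]
    return x

# With enough layers the iterates approach soft_threshold(y, tau) elementwise.
out = unrolled_proximal_gradient([2.0, -0.3, 0.1], step=0.5, tau=0.5, n_layers=50)
```

Replacing the fixed `step` and `tau` with learned parameters per layer is exactly the "unroll and truncate" recipe; the paper applies it to a graph deconvolution objective rather than this scalar toy problem.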
arXiv Detail & Related papers (2022-05-19T14:08:15Z) - A generative neural network model for random dot product graphs [1.1421942894219896]
GraphMoE is a novel approach to learning generative models for random graphs.
The neural network is trained to match the distribution of a class of random graphs by way of a moment estimator.
arXiv Detail & Related papers (2022-04-15T19:59:22Z) - SpeqNets: Sparsity-aware Permutation-equivariant Graph Networks [16.14454388348814]
We present a class of universal, permutation-equivariant graph networks.
They offer a fine-grained control between expressivity and scalability and adapt to the sparsity of the graph.
These architectures lead to vastly reduced computation times compared to standard higher-order graph networks.
arXiv Detail & Related papers (2022-03-25T21:17:09Z)
- Graph Kernel Neural Networks [53.91024360329517]
We propose to use graph kernels, i.e. kernel functions that compute an inner product on graphs, to extend the standard convolution operator to the graph domain.
This allows us to define an entirely structural model that does not require computing the embedding of the input graph.
Our architecture allows plugging in any type of graph kernel and has the added benefit of providing some interpretability.
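To make the notion of "an inner product on graphs" concrete, here is a minimal toy example (ours, not the paper's kernel): a degree-histogram kernel computes an inner product between two graphs via their degree distributions.

```python
# Toy sketch (ours): a degree-histogram kernel, a simple instance of a
# kernel function computing an inner product between graphs.
from collections import Counter

def degree_histogram(edges):
    """How many nodes have each degree, from an undirected edge list."""
    deg = Counter()
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    return Counter(deg.values())

def degree_histogram_kernel(edges_a, edges_b):
    """Inner product of the two graphs' degree histograms."""
    ha, hb = degree_histogram(edges_a), degree_histogram(edges_b)
    return sum(ha[d] * hb[d] for d in ha)

triangle = [(0, 1), (1, 2), (2, 0)]   # all three nodes have degree 2
path = [(0, 1), (1, 2)]               # degrees 1, 2, 1
print(degree_histogram_kernel(triangle, triangle))  # prints 9
print(degree_histogram_kernel(triangle, path))      # prints 3
```

Richer kernels (e.g. Weisfeiler-Lehman subtree kernels) follow the same pattern with more expressive feature maps; the paper's point is that any such kernel can be plugged into the convolution operator.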
arXiv Detail & Related papers (2021-12-14T14:48:08Z)
- Graph Representation Learning for Road Type Classification [13.227651826285014]
We present a learning-based approach to graph representations of road networks employing state-of-the-art graph convolutional neural networks.
Our approach is applied to realistic road networks of 17 cities from OpenStreetMap.
arXiv Detail & Related papers (2021-07-16T09:32:58Z)
- Dirichlet Graph Variational Autoencoder [65.94744123832338]
We present Dirichlet Graph Variational Autoencoder (DGVAE) with graph cluster memberships as latent factors.
Motivated by the low pass characteristics in balanced graph cut, we propose a new variant of GNN named Heatts to encode the input graph into cluster memberships.
arXiv Detail & Related papers (2020-10-09T07:35:26Z)
- Natural Graph Networks [80.77570956520482]
We show that the more general concept of naturality is sufficient for a graph network to be well-defined.
We define global and local natural graph networks, the latter of which are as scalable as conventional message passing graph neural networks.
arXiv Detail & Related papers (2020-07-16T14:19:06Z)
- Convolutional Kernel Networks for Graph-Structured Data [37.13712126432493]
We introduce a family of multilayer graph kernels and establish new links between graph convolutional neural networks and kernel methods.
Our approach generalizes convolutional kernel networks to graph-structured data, by representing graphs as a sequence of kernel feature maps.
Our model can also be trained end-to-end on large-scale data, leading to new types of graph convolutional neural networks.
arXiv Detail & Related papers (2020-03-11T09:44:03Z)
- A Convolutional Neural Network into graph space [5.6326241162252755]
We propose a new convolutional neural network architecture, defined directly in graph space.
We show its usability in a back-propagation context.
It shows robustness to graph-domain changes and improvements over other Euclidean and non-Euclidean convolutional architectures.
arXiv Detail & Related papers (2020-02-20T15:14:21Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.