E(n)-equivariant Graph Neural Cellular Automata
- URL: http://arxiv.org/abs/2301.10497v1
- Date: Wed, 25 Jan 2023 10:17:07 GMT
- Title: E(n)-equivariant Graph Neural Cellular Automata
- Authors: Gennaro Gala, Daniele Grattarola and Erik Quaeghebeur
- Abstract summary: We propose a class of isotropic automata that we call E(n)-GNCAs.
These models are lightweight, but can nevertheless handle large graphs, capture complex dynamics and exhibit emergent self-organising behaviours.
We showcase the broad and successful applicability of E(n)-GNCAs on three different tasks.
- Score: 4.168157981135698
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Cellular automata (CAs) are computational models exhibiting rich dynamics
emerging from the local interaction of cells arranged in a regular lattice.
Graph CAs (GCAs) generalise standard CAs by allowing for arbitrary graphs
rather than regular lattices, similar to how Graph Neural Networks (GNNs)
generalise Convolutional NNs. Recently, Graph Neural CAs (GNCAs) have been
proposed as models built on top of standard GNNs that can be trained to
approximate the transition rule of any arbitrary GCA. Existing GNCAs are
anisotropic in the sense that their transition rules are not equivariant to
translation, rotation, and reflection of the nodes' spatial locations. However,
it is desirable for instances related by such transformations to be treated
identically by the model. By replacing standard graph convolutions with
E(n)-equivariant ones, we avoid anisotropy by design and propose a class of
isotropic automata that we call E(n)-GNCAs. These models are lightweight, but
can nevertheless handle large graphs, capture complex dynamics and exhibit
emergent self-organising behaviours. We showcase the broad and successful
applicability of E(n)-GNCAs on three different tasks: (i) pattern formation,
(ii) graph auto-encoding, and (iii) simulation of E(n)-equivariant dynamical
systems.
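The key idea of the abstract — replacing standard graph convolutions with E(n)-equivariant ones so that rotated, translated, or reflected inputs are processed identically — can be illustrated with a minimal NumPy sketch of an EGNN-style layer. This is not the authors' implementation: the weight matrices `Wm`, `Wx`, `Wh` and the `tanh` nonlinearities are illustrative assumptions. Messages depend only on E(n)-invariant quantities (features and squared distances), and coordinates are updated along difference vectors, which makes the layer equivariant by construction.

```python
import numpy as np

def egnn_layer(h, x, edges, Wm, Wx, Wh):
    """One E(n)-equivariant message-passing step (EGNN-style sketch).

    h: (n, d) node features; x: (n, k) coordinates;
    edges: iterable of directed (i, j) pairs;
    Wm: (2d+1, dm), Wx: (dm,), Wh: (d+dm, d) weight matrices.
    """
    n, d = h.shape
    m = np.zeros((n, Wm.shape[1]))
    dx = np.zeros_like(x)
    for i, j in edges:
        # invariant inputs: sender/receiver features and squared distance
        dist2 = np.sum((x[i] - x[j]) ** 2)
        inp = np.concatenate([h[i], h[j], [dist2]])
        mij = np.tanh(inp @ Wm)                 # E(n)-invariant message
        m[i] += mij
        # coordinate update along the difference vector: equivariant
        dx[i] += (x[i] - x[j]) * (mij @ Wx)
    x_new = x + dx
    h_new = np.tanh(np.concatenate([h, m], axis=1) @ Wh)
    return h_new, x_new
```

Because the messages use only invariants, applying a rotation/reflection `Q` and translation `t` to the input coordinates rotates and translates the output coordinates in exactly the same way while leaving the output features unchanged, which can be verified numerically.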
Related papers
- Relaxing Continuous Constraints of Equivariant Graph Neural Networks for Physical Dynamics Learning [39.25135680793105]
We propose a general Discrete Equivariant Graph Neural Network (DEGNN) that guarantees equivariance to a given discrete point group.
Specifically, we show that such discrete equivariant message passing could be constructed by transforming geometric features into permutation-invariant embeddings.
We show that DEGNN is data efficient, learning with less data, and can generalize across scenarios such as unobserved orientation.
arXiv Detail & Related papers (2024-06-24T03:37:51Z)
- Degree-based stratification of nodes in Graph Neural Networks [66.17149106033126]
We modify the Graph Neural Network (GNN) architecture so that the weight matrices are learned, separately, for the nodes in each group.
This simple-to-implement modification seems to improve performance across datasets and GNN methods.
arXiv Detail & Related papers (2023-12-16T14:09:23Z)
- SEGNO: Generalizing Equivariant Graph Neural Networks with Physical Inductive Biases [66.61789780666727]
We show how the second-order continuity can be incorporated into GNNs while maintaining the equivariant property.
We also offer theoretical insights into SEGNO, highlighting that it can learn a unique trajectory between adjacent states.
Our model yields a significant improvement over the state-of-the-art baselines.
arXiv Detail & Related papers (2023-08-25T07:15:58Z)
- Learning Graph Cellular Automata [25.520299226767946]
We focus on a generalised version of typical cellular automata: graph cellular automata (GCAs).
In particular, we extend previous work that used convolutional neural networks to learn the transition rule of conventional GCA.
We show that the resulting model can represent any GCA with finite and discrete state space.
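For concreteness, here is a small sketch of the kind of discrete transition rule a GNCA would be trained to approximate. The majority-vote rule below is purely illustrative (it is not the rule used in the paper): each cell takes the majority state of its neighbours, and the global dynamics emerge from repeatedly applying this local rule.

```python
def gca_step(state, neighbors):
    """One synchronous step of an example binary graph cellular automaton.

    state: list of 0/1 cell states; neighbors: neighbors[i] is the list
    of neighbour indices of cell i. Rule (illustrative): majority vote,
    ties resolved to 1.
    """
    new = state.copy()
    for i, nbrs in enumerate(neighbors):
        if nbrs:
            ones = sum(state[j] for j in nbrs)
            new[i] = 1 if 2 * ones >= len(nbrs) else 0
    return new
```

A GNCA replaces this hand-written rule with a learned message-passing update and applies it repeatedly in the same synchronous fashion.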
arXiv Detail & Related papers (2021-10-27T07:42:48Z)
- Orthogonal Graph Neural Networks [53.466187667936026]
Graph neural networks (GNNs) have received tremendous attention due to their superiority in learning node representations.
However, stacking more convolutional layers significantly decreases the performance of GNNs.
We propose a novel Ortho-GConv, which could generally augment the existing GNN backbones to stabilize the model training and improve the model's generalization performance.
arXiv Detail & Related papers (2021-09-23T12:39:01Z)
- Continuous-Depth Neural Models for Dynamic Graph Prediction [16.89981677708299]
We introduce the framework of continuous-depth graph neural networks (GNNs).
Neural graph differential equations (Neural GDEs) are formalized as the counterpart to GNNs.
Results prove the effectiveness of the proposed models across applications, such as traffic forecasting or prediction in genetic regulatory networks.
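The continuous-depth idea can be sketched as follows: a GNN defines the vector field of an ODE over node states, which is then integrated numerically. This is only a minimal sketch, not the paper's implementation; the specific vector field and the fixed-step Euler integrator are assumptions made for illustration.

```python
import numpy as np

def gnn_vector_field(h, A_norm, W):
    # simple graph-convolutional vector field: f(h) = tanh(A_norm @ h @ W)
    return np.tanh(A_norm @ h @ W)

def neural_gde(h0, A_norm, W, t1=1.0, steps=100):
    """Integrate dh/dt = f(h) from t=0 to t=t1 with fixed-step Euler.

    h0: (n, d) initial node states; A_norm: (n, n) normalised adjacency;
    W: (d, d) weight matrix shared across the continuous depth.
    """
    h, dt = h0.copy(), t1 / steps
    for _ in range(steps):
        h = h + dt * gnn_vector_field(h, A_norm, W)
    return h
```

In practice an adaptive ODE solver would replace the Euler loop, but the structure is the same: depth becomes a continuous integration time rather than a discrete layer index.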
arXiv Detail & Related papers (2021-06-22T07:30:35Z)
- E(n) Equivariant Graph Neural Networks [86.75170631724548]
This paper introduces a new model to learn graph neural networks equivariant to rotations, translations, reflections and permutations, called E(n)-Equivariant Graph Neural Networks (EGNNs).
In contrast with existing methods, our work does not require computationally expensive higher-order representations in intermediate layers while it still achieves competitive or better performance.
arXiv Detail & Related papers (2021-02-19T10:25:33Z)
- A Unified View on Graph Neural Networks as Graph Signal Denoising [49.980783124401555]
Graph Neural Networks (GNNs) have risen to prominence in learning representations for graph structured data.
In this work, we establish mathematically that the aggregation processes in a group of representative GNN models can be regarded as solving a graph denoising problem.
We instantiate a novel GNN model, ADA-UGNN, derived from UGNN, to handle graphs with adaptive smoothness across nodes.
arXiv Detail & Related papers (2020-10-05T04:57:18Z)
- Gauge Equivariant Mesh CNNs: Anisotropic convolutions on geometric graphs [81.12344211998635]
A common approach to define convolutions on meshes is to interpret them as a graph and apply graph convolutional networks (GCNs).
We propose Gauge Equivariant Mesh CNNs which generalize GCNs to apply anisotropic gauge equivariant kernels.
Our experiments validate the significantly improved expressivity of the proposed model over conventional GCNs and other methods.
arXiv Detail & Related papers (2020-03-11T17:21:15Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it contains and is not responsible for any consequences of its use.