Theoretical Aspects of Group Equivariant Neural Networks
- URL: http://arxiv.org/abs/2004.05154v2
- Date: Thu, 30 Apr 2020 02:10:51 GMT
- Title: Theoretical Aspects of Group Equivariant Neural Networks
- Authors: Carlos Esteves
- Abstract summary: Group equivariant neural networks have been explored in the past few years and are interesting from theoretical and practical standpoints.
They leverage concepts from group representation theory, non-commutative harmonic analysis and differential geometry.
In practice, they have been shown to reduce sample and model complexity, notably in challenging tasks where input transformations such as arbitrary rotations are present.
- Score: 9.449391486456209
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Group equivariant neural networks have been explored in the past few years
and are interesting from theoretical and practical standpoints. They leverage
concepts from group representation theory, non-commutative harmonic analysis
and differential geometry that do not often appear in machine learning. In
practice, they have been shown to reduce sample and model complexity, notably
in challenging tasks where input transformations such as arbitrary rotations
are present. We begin this work with an exposition of group representation
theory and the machinery necessary to define and evaluate integrals and
convolutions on groups. Then, we show applications to recent SO(3) and SE(3)
equivariant networks, namely the Spherical CNNs, Clebsch-Gordan Networks, and
3D Steerable CNNs. We proceed to discuss two recent theoretical results. The
first, by Kondor and Trivedi (ICML'18), shows that a neural network is group
equivariant if and only if it has a convolutional structure. The second, by
Cohen et al. (NeurIPS'19), generalizes the first to a larger class of networks,
with feature maps as fields on homogeneous spaces.
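As a minimal, self-contained illustration of the convolution-equivariance connection (ours, not code from the paper), the following Python sketch implements group convolution on the cyclic group Z_8 and checks numerically that it commutes with the group action:

```python
import numpy as np

def cyclic_conv(f, psi):
    """Group convolution on the cyclic group Z_n:
    (f * psi)(g) = sum_h f(h) psi(g^{-1} h), where g^{-1} h = (h - g) mod n."""
    n = len(f)
    return np.array([sum(f[h] * psi[(h - g) % n] for h in range(n))
                     for g in range(n)])

rng = np.random.default_rng(0)
f, psi = rng.normal(size=8), rng.normal(size=8)

# Equivariance check: for every group element (cyclic shift by t),
# convolving the shifted signal equals shifting the convolution.
for t in range(8):
    assert np.allclose(cyclic_conv(np.roll(f, t), psi),
                       np.roll(cyclic_conv(f, psi), t))
```

The converse direction is the substance of the Kondor-Trivedi theorem: a linear equivariant map between feature spaces on homogeneous spaces of a compact group must take this convolutional form.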
Related papers
- Enhancing lattice kinetic schemes for fluid dynamics with Lattice-Equivariant Neural Networks [79.16635054977068]
We present a new class of equivariant neural networks, dubbed Lattice-Equivariant Neural Networks (LENNs).
Our approach builds on a recently introduced framework for learning neural-network surrogate models of Lattice Boltzmann collision operators.
Our work opens the way towards practical use of machine-learning-augmented Lattice Boltzmann CFD in real-world simulations.
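As a hedged sketch of how lattice symmetry can be enforced (the LENN construction itself uses weight sharing; group averaging below is a generic alternative, and all names are illustrative), one can symmetrize an arbitrary learned collision operator over the 8-element point group of the D2Q9 lattice:

```python
import numpy as np

# D2Q9 velocities: 0:(0,0) 1:(1,0) 2:(0,1) 3:(-1,0) 4:(0,-1)
#                  5:(1,1) 6:(-1,1) 7:(-1,-1) 8:(1,-1)
rot = np.array([0, 2, 3, 4, 1, 6, 7, 8, 5])  # 90-degree rotation as a permutation
ref = np.array([0, 1, 4, 3, 2, 8, 7, 6, 5])  # reflection across the x-axis

# The 8 elements of the square lattice's point group (dihedral D4).
group, r = [], np.arange(9)
for _ in range(4):
    group += [r.copy(), ref[r]]
    r = rot[r]

def collision(f, W):
    """Stand-in for a learned collision operator (a single nonlinear layer)."""
    return np.tanh(W @ f)

def sym_collision(f, W):
    """Group-averaged operator: exactly equivariant to every g in the group."""
    return sum(collision(f[g], W)[np.argsort(g)] for g in group) / len(group)

rng = np.random.default_rng(1)
W, f = rng.normal(size=(9, 9)), rng.normal(size=9)
for g in group:  # check: sym_collision(g.f) == g.sym_collision(f)
    assert np.allclose(sym_collision(f[g], W), sym_collision(f, W)[g])
```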
arXiv Detail & Related papers (2024-05-22T17:23:15Z)
- G-RepsNet: A Fast and General Construction of Equivariant Networks for Arbitrary Matrix Groups [8.24167511378417]
Group equivariant networks are useful in a wide range of deep learning tasks.
Here, we introduce Group Representation Networks (G-RepsNets), lightweight equivariant networks for arbitrary matrix groups.
We show that G-RepsNet is competitive with EGNN (Satorras et al., 2021) on N-body prediction and with G-FNO (Helwig et al., 2023) on solving PDEs.
arXiv Detail & Related papers (2024-02-23T16:19:49Z)
- Investigating how ReLU-networks encode symmetries [13.935148870831396]
We investigate whether equivariance of a network implies that all layers are equivariant.
We conjecture that CNNs trained to be equivariant will exhibit layerwise equivariance.
We show that it is typically easier to merge a network with a group-transformed version of itself than to merge two different networks.
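The layerwise question can be probed directly. Below is a minimal numpy sketch (ours, not the paper's; single-channel 3x3 convolutions with periodic padding) that measures, per layer, how far phi_l(g.x) is from g.phi_l(x) under 90-degree rotation:

```python
import numpy as np

def conv3x3(x, k):
    """3x3 correlation with periodic padding; x: (H, W), k: (3, 3)."""
    out = np.zeros_like(x)
    for di in (-1, 0, 1):
        for dj in (-1, 0, 1):
            out += k[di + 1, dj + 1] * np.roll(x, (-di, -dj), axis=(0, 1))
    return out

def layerwise_equivariance_error(x, kernels):
    """Per-layer max deviation between phi_l(rot90(x)) and rot90(phi_l(x))."""
    a, b, errs = np.rot90(x), x, []
    for k in kernels:
        a = np.maximum(conv3x3(a, k), 0)  # features of the rotated input
        b = np.maximum(conv3x3(b, k), 0)  # features of the original input
        errs.append(np.abs(a - np.rot90(b)).max())
    return errs

rng = np.random.default_rng(2)
x = rng.normal(size=(8, 8))
iso = np.array([[0., 1., 0.], [1., 4., 1.], [0., 1., 0.]])  # rotation-invariant kernel
rnd = rng.normal(size=(3, 3))                               # generic kernel
print(layerwise_equivariance_error(x, [iso, iso]))  # ~0 at both layers
print(layerwise_equivariance_error(x, [iso, rnd]))  # layer 2 breaks equivariance
```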
arXiv Detail & Related papers (2023-05-26T15:23:20Z)
- Equivalence Between SE(3) Equivariant Networks via Steerable Kernels and Group Convolution [90.67482899242093]
A wide range of techniques have been proposed in recent years for designing neural networks for 3D data that are equivariant under rotation and translation of the input.
We provide an in-depth analysis of the two constructions and their equivalence, and relate them to multiview convolutional networks.
We also derive new TFN non-linearities from our equivalence principle and test them on practical benchmark datasets.
arXiv Detail & Related papers (2022-11-29T03:42:11Z)
- Equivariant Transduction through Invariant Alignment [71.45263447328374]
We introduce a novel group-equivariant architecture that incorporates a group-invariant hard alignment mechanism.
We find that our network's structure allows it to develop stronger equivariant properties than existing group-equivariant approaches.
We additionally find that it outperforms previous group-equivariant networks empirically on the SCAN task.
arXiv Detail & Related papers (2022-09-22T11:19:45Z)
- Universality of group convolutional neural networks based on ridgelet analysis on groups [10.05944106581306]
We investigate the approximation properties of group convolutional neural networks (GCNNs) based on ridgelet theory.
We formulate a versatile GCNN as a nonlinear mapping between group representations.
arXiv Detail & Related papers (2022-05-30T02:52:22Z)
- Geometric Deep Learning and Equivariant Neural Networks [0.9381376621526817]
We survey the mathematical foundations of geometric deep learning, focusing on group equivariant and gauge equivariant neural networks.
We develop gauge equivariant convolutional neural networks on an arbitrary manifold $\mathcal{M}$ using principal bundles with structure group $K$ and equivariant maps between sections of associated vector bundles.
We analyze several applications of this formalism, including semantic segmentation and object detection networks.
arXiv Detail & Related papers (2021-05-28T15:41:52Z)
- A Practical Method for Constructing Equivariant Multilayer Perceptrons for Arbitrary Matrix Groups [115.58550697886987]
We provide a completely general algorithm for solving for the equivariant layers of matrix groups.
In addition to recovering solutions from other works as special cases, we construct multilayer perceptrons equivariant to multiple groups that have never been tackled before.
Our approach outperforms non-equivariant baselines, with applications to particle physics and dynamical systems.
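As a hedged reconstruction of the core idea (a dense-SVD toy under our own naming, not the paper's code, which also handles Lie algebra generators and scales far better), one can solve the equivariance constraint rho_out(g) W = W rho_in(g) directly in vectorized form:

```python
import numpy as np

def equivariant_basis(rho_in_gens, rho_out_gens, tol=1e-8):
    """Basis of all W with rho_out(g) W = W rho_in(g) for each generator g.
    Uses vec(A W B) = (B^T kron A) vec(W), then an SVD nullspace."""
    n_out, n_in = rho_out_gens[0].shape[0], rho_in_gens[0].shape[0]
    # Constraint per generator: (rho_in(g)^{-T} kron rho_out(g) - I) vec(W) = 0
    C = np.concatenate([
        np.kron(np.linalg.inv(ri).T, ro) - np.eye(n_out * n_in)
        for ri, ro in zip(rho_in_gens, rho_out_gens)])
    _, s, vt = np.linalg.svd(C)
    return [vt[i].reshape(n_out, n_in) for i in range(len(s)) if s[i] < tol]

# Example: C4 acting on R^2 by 90-degree rotation; the equivariant 2x2 maps
# form the commutant {a*I + b*R}, which is 2-dimensional.
R = np.array([[0., -1.], [1., 0.]])
print(len(equivariant_basis([R], [R])))  # -> 2
```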
arXiv Detail & Related papers (2021-04-19T17:21:54Z)
- Group Equivariant Neural Architecture Search via Group Decomposition and Reinforcement Learning [17.291131923335918]
We prove a new group-theoretic result in the context of equivariant neural networks.
We also design an algorithm to construct equivariant networks that significantly reduces computational cost.
We use deep Q-learning to search for group equivariant networks that maximize performance.
arXiv Detail & Related papers (2021-04-10T19:37:25Z)
- LieTransformer: Equivariant self-attention for Lie Groups [49.9625160479096]
Group equivariant neural networks are used as building blocks of group invariant neural networks.
We extend the scope of the literature to self-attention, which is emerging as a prominent building block of deep learning models.
We propose the LieTransformer, an architecture composed of LieSelfAttention layers that are equivariant to arbitrary Lie groups and their discrete subgroups.
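LieSelfAttention itself lifts features to the group; as a much simpler illustration of the principle it relies on (attention coefficients built from group invariants yield an equivariant layer), here is a toy E(3)-equivariant attention over point clouds, which is not the paper's construction:

```python
import numpy as np

def invariant_attention(x):
    """Attention logits depend only on pairwise distances (an E(n) invariant);
    outputs are convex combinations y_i = sum_j alpha_ij x_j, sum_j alpha_ij = 1,
    so the layer commutes with rotations, reflections, and translations."""
    d2 = ((x[:, None, :] - x[None, :, :]) ** 2).sum(-1)  # squared distances
    a = np.exp(-d2 + d2.min(axis=1, keepdims=True))      # stabilized softmax
    a /= a.sum(axis=1, keepdims=True)
    return a @ x

# Numerical equivariance check under a random rotation/reflection q and shift t.
rng = np.random.default_rng(3)
x = rng.normal(size=(5, 3))
q, _ = np.linalg.qr(rng.normal(size=(3, 3)))  # random orthogonal matrix
t = rng.normal(size=3)
assert np.allclose(invariant_attention(x @ q.T + t),
                   invariant_attention(x) @ q.T + t)
```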
arXiv Detail & Related papers (2020-12-20T11:02:49Z)
- MDP Homomorphic Networks: Group Symmetries in Reinforcement Learning [90.20563679417567]
This paper introduces MDP homomorphic networks for deep reinforcement learning.
MDP homomorphic networks are neural networks that are equivariant under symmetries in the joint state-action space of an MDP.
We show that such networks converge faster than unstructured networks on CartPole, a grid world and Pong.
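For CartPole, the symmetry is state negation paired with swapping the two push actions. A minimal sketch of one way to impose it (the paper builds equivariant layers with a weight symmetrizer; output averaging below is an illustrative shortcut, with all names ours):

```python
import numpy as np

rng = np.random.default_rng(4)
W1, W2 = rng.normal(size=(16, 4)), rng.normal(size=(2, 16))

def logits(s):
    return W2 @ np.tanh(W1 @ s)  # an arbitrary, non-equivariant policy network

def sym_logits(s):
    """Enforce pi(swap(a) | -s) = pi(a | s): average the network's output on s
    with its action-swapped output on the mirrored state -s."""
    return 0.5 * (logits(s) + logits(-s)[::-1])

s = rng.normal(size=4)  # CartPole state: position, velocity, angle, ang. velocity
assert np.allclose(sym_logits(-s), sym_logits(s)[::-1])
```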
arXiv Detail & Related papers (2020-06-30T15:38:37Z)
This list is automatically generated from the titles and abstracts of the papers on this site.