Geometric Deep Learning and Equivariant Neural Networks
- URL: http://arxiv.org/abs/2105.13926v1
- Date: Fri, 28 May 2021 15:41:52 GMT
- Title: Geometric Deep Learning and Equivariant Neural Networks
- Authors: Jan E. Gerken, Jimmy Aronsson, Oscar Carlsson, Hampus Linander,
Fredrik Ohlsson, Christoffer Petersson, Daniel Persson
- Abstract summary: We survey the mathematical foundations of geometric deep learning, focusing on group equivariant and gauge equivariant neural networks.
We develop gauge equivariant convolutional neural networks on arbitrary manifolds $\mathcal{M}$ using principal bundles with structure group $K$ and equivariant maps between sections of associated vector bundles.
We analyze several applications of this formalism, including semantic segmentation and object detection networks.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We survey the mathematical foundations of geometric deep learning, focusing
on group equivariant and gauge equivariant neural networks. We develop gauge
equivariant convolutional neural networks on arbitrary manifolds $\mathcal{M}$
using principal bundles with structure group $K$ and equivariant maps between
sections of associated vector bundles. We also discuss group equivariant neural
networks for homogeneous spaces $\mathcal{M}=G/K$, which are instead
equivariant with respect to the global symmetry $G$ on $\mathcal{M}$. Group
equivariant layers can be interpreted as intertwiners between induced
representations of $G$, and we show their relation to gauge equivariant
convolutional layers. We analyze several applications of this formalism,
including semantic segmentation and object detection networks. We also discuss
the case of spherical networks in great detail, corresponding to the case
$\mathcal{M}=S^2=\mathrm{SO}(3)/\mathrm{SO}(2)$. Here we emphasize the use of
Fourier analysis involving Wigner matrices, spherical harmonics and
Clebsch-Gordan coefficients for $G=\mathrm{SO}(3)$, illustrating the power of
representation theory for deep learning.
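As a concrete illustration of the equivariance property at the heart of this formalism, the sketch below implements a group convolution on the finite cyclic group $C_n$ (a toy stand-in for the continuous groups treated in the paper) and checks numerically that it commutes with the group action. The function names and the choice of group are illustrative, not from the paper.

```python
import numpy as np

def group_conv(f, psi):
    # Group convolution on the cyclic group C_n (group law: addition mod n):
    # (f * psi)(g) = sum_h f(h) psi(g - h mod n)
    n = len(f)
    return np.array([sum(f[h] * psi[(g - h) % n] for h in range(n))
                     for g in range(n)])

def act(s, f):
    # Left action of a group element s on signals: (s . f)(g) = f(g - s mod n)
    return np.roll(f, s)

rng = np.random.default_rng(0)
f, psi = rng.normal(size=8), rng.normal(size=8)

# Equivariance: convolving the transformed signal equals transforming the
# convolved signal, (s . f) * psi = s . (f * psi)
lhs = group_conv(act(3, f), psi)
rhs = act(3, group_conv(f, psi))
print(np.allclose(lhs, rhs))
```

The same commutation property is what gauge and group equivariant layers enforce for continuous structure groups, where the sum over group elements becomes an integral against the Haar measure.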
Related papers
- Lie Neurons: Adjoint-Equivariant Neural Networks for Semisimple Lie Algebras [5.596048634951087]
This paper proposes an equivariant neural network that takes data in any semi-simple Lie algebra as input.
The corresponding group acts on the Lie algebra as adjoint operations, making our proposed network adjoint-equivariant.
Our framework generalizes the Vector Neurons, a simple $\mathrm{SO}(3)$-equivariant network, from 3-D Euclidean space to Lie algebra spaces.
arXiv Detail & Related papers (2023-10-06T18:34:27Z) - Graph Automorphism Group Equivariant Neural Networks [1.9643748953805935]
Permutation equivariant neural networks are typically used to learn from data that lives on a graph.
We show how to construct neural networks that are equivariant to Aut$(G)$ by obtaining a full characterisation of the learnable, linear, Aut$(G)$-equivariant functions.
arXiv Detail & Related papers (2023-07-15T14:19:42Z) - Geometric Clifford Algebra Networks [53.456211342585824]
We propose Geometric Clifford Algebra Networks (GCANs) for modeling dynamical systems.
GCANs are based on symmetry group transformations using geometric (Clifford) algebras.
arXiv Detail & Related papers (2023-02-13T18:48:33Z) - How Jellyfish Characterise Alternating Group Equivariant Neural Networks [0.0]
We find a basis for the learnable, linear, $A_n$-equivariant layer functions between such tensor power spaces in the standard basis of $\mathbb{R}^n$.
We also describe how our approach generalises to the construction of neural networks that are equivariant to local symmetries.
arXiv Detail & Related papers (2023-01-24T17:39:10Z) - Deep Learning Symmetries and Their Lie Groups, Algebras, and Subalgebras
from First Principles [55.41644538483948]
We design a deep-learning algorithm for the discovery and identification of the continuous group of symmetries present in a labeled dataset.
We use fully connected neural networks to model the symmetry transformations and the corresponding generators.
Our study also opens the door for using a machine learning approach in the mathematical study of Lie groups and their properties.
arXiv Detail & Related papers (2023-01-13T16:25:25Z) - Brauer's Group Equivariant Neural Networks [0.0]
We provide a full characterisation of all of the possible group equivariant neural networks whose layers are some tensor power of $\mathbb{R}^n$.
We find a spanning set of matrices for the learnable, linear, equivariant layer functions between such tensor power spaces.
arXiv Detail & Related papers (2022-12-16T18:08:51Z) - Equivalence Between SE(3) Equivariant Networks via Steerable Kernels and
Group Convolution [90.67482899242093]
A wide range of techniques has been proposed in recent years for designing neural networks for 3D data that are equivariant under rotation and translation of the input.
We provide an in-depth analysis of both methods and their equivalence and relate the two constructions to multiview convolutional networks.
We also derive new TFN non-linearities from our equivalence principle and test them on practical benchmark datasets.
arXiv Detail & Related papers (2022-11-29T03:42:11Z) - Homogeneous vector bundles and $G$-equivariant convolutional neural
networks [0.0]
$G$-equivariant convolutional neural networks (GCNNs) are a geometric deep learning model for data defined on a homogeneous $G$-space $\mathcal{M}$.
In this paper, we analyze GCNNs on homogeneous spaces $\mathcal{M} = G/K$ in the case of unimodular Lie groups $G$ and compact subgroups $K \leq G$.
arXiv Detail & Related papers (2021-05-12T02:06:04Z) - A Practical Method for Constructing Equivariant Multilayer Perceptrons
for Arbitrary Matrix Groups [115.58550697886987]
We provide a completely general algorithm for solving for the equivariant layers of matrix groups.
In addition to recovering solutions from other works as special cases, we construct multilayer perceptrons equivariant to multiple groups that have never been tackled before.
Our approach outperforms non-equivariant baselines, with applications to particle physics and dynamical systems.
arXiv Detail & Related papers (2021-04-19T17:21:54Z) - MDP Homomorphic Networks: Group Symmetries in Reinforcement Learning [90.20563679417567]
This paper introduces MDP homomorphic networks for deep reinforcement learning.
MDP homomorphic networks are neural networks that are equivariant under symmetries in the joint state-action space of an MDP.
We show that such networks converge faster than unstructured networks on CartPole, a grid world and Pong.
arXiv Detail & Related papers (2020-06-30T15:38:37Z) - Stochastic Flows and Geometric Optimization on the Orthogonal Group [52.50121190744979]
We present a new class of geometrically-driven optimization algorithms on the orthogonal group $O(d)$.
We show that our methods can be applied in various fields of machine learning including deep, convolutional and recurrent neural networks, reinforcement learning, flows and metric learning.
arXiv Detail & Related papers (2020-03-30T15:37:50Z)
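Several of the related papers above characterize the learnable, linear, equivariant layer functions for a given group. For the symmetric group $S_n$ acting on $\mathbb{R}^n$ by permutation, such layers are spanned by the identity and the averaging map (the DeepSets-style layer). A minimal sketch, with hypothetical coefficients `a` and `b` standing in for learned weights:

```python
import numpy as np

def equivariant_layer(x, a=1.5, b=-0.5):
    # Every S_n-equivariant linear map on R^n has the form
    # L(x) = a * x + b * mean(x) * ones, for scalar weights a, b.
    return a * x + b * x.mean() * np.ones_like(x)

rng = np.random.default_rng(1)
x = rng.normal(size=6)
perm = rng.permutation(6)

# Permuting then applying the layer equals applying the layer then permuting.
lhs = equivariant_layer(x[perm])
rhs = equivariant_layer(x)[perm]
print(np.allclose(lhs, rhs))
```

The characterizations in the papers above generalize this two-dimensional weight space to tensor power spaces and to other groups, where the spanning sets are larger but play the same role.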
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.