Clifford Group Equivariant Neural Networks
- URL: http://arxiv.org/abs/2305.11141v5
- Date: Sun, 22 Oct 2023 16:54:33 GMT
- Title: Clifford Group Equivariant Neural Networks
- Authors: David Ruhe, Johannes Brandstetter, Patrick Forré
- Abstract summary: We introduce Clifford Group Equivariant Neural Networks, a novel approach for constructing $\mathrm{O}(n)$- and $\mathrm{E}(n)$-equivariant models.
We demonstrate, notably from a single core implementation, state-of-the-art performance on several distinct tasks.
- Score: 14.260561321140976
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We introduce Clifford Group Equivariant Neural Networks: a novel approach for
constructing $\mathrm{O}(n)$- and $\mathrm{E}(n)$-equivariant models. We
identify and study the $\textit{Clifford group}$, a subgroup inside the
Clifford algebra tailored to achieve several favorable properties. Primarily,
the group's action forms an orthogonal automorphism that extends beyond the
typical vector space to the entire Clifford algebra while respecting the
multivector grading. This leads to several non-equivalent subrepresentations
corresponding to the multivector decomposition. Furthermore, we prove that the
action respects not just the vector space structure of the Clifford algebra but
also its multiplicative structure, i.e., the geometric product. These findings
imply that every polynomial in multivectors, including their grade projections,
constitutes an equivariant map with respect to the Clifford group, allowing us
to parameterize equivariant neural network layers. An advantage worth mentioning is
that we obtain expressive layers that can elegantly generalize to inner-product
spaces of any dimension. We demonstrate, notably from a single core
implementation, state-of-the-art performance on several distinct tasks,
including a three-dimensional $n$-body experiment, a four-dimensional
Lorentz-equivariant high-energy physics experiment, and a five-dimensional
convex hull experiment.
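To make the central claim concrete, here is a minimal numpy sketch (my own illustration, not the authors' released code) that verifies in the small algebra $\mathrm{Cl}(2,0)$ that an orthogonal transformation, extended gradewise to the whole algebra, commutes with the geometric product; this is precisely the property that makes polynomials in multivectors equivariant maps:

```python
import numpy as np

# Basis of Cl(2,0): index 0 -> 1, 1 -> e1, 2 -> e2, 3 -> e12 (e1^2 = e2^2 = 1).
# GP[i, j] holds the basis expansion of the geometric product e_i e_j.
GP = np.zeros((4, 4, 4))
table = {
    (0, 0): (0, 1), (0, 1): (1, 1), (0, 2): (2, 1), (0, 3): (3, 1),
    (1, 0): (1, 1), (1, 1): (0, 1), (1, 2): (3, 1), (1, 3): (2, 1),
    (2, 0): (2, 1), (2, 1): (3, -1), (2, 2): (0, 1), (2, 3): (1, -1),
    (3, 0): (3, 1), (3, 1): (2, -1), (3, 2): (1, 1), (3, 3): (0, -1),
}
for (i, j), (k, s) in table.items():
    GP[i, j, k] = s

def gp(x, y):
    """Geometric product of two multivectors given as length-4 coefficient arrays."""
    return np.einsum("i,j,ijk->k", x, y, GP)

def rho(theta):
    """Action of a rotation by theta, extended gradewise to all of Cl(2,0):
    scalars are fixed, vectors rotate, the pseudoscalar picks up det(R) = 1."""
    c, s = np.cos(theta), np.sin(theta)
    out = np.eye(4)
    out[1:3, 1:3] = [[c, -s], [s, c]]
    return out

rng = np.random.default_rng(0)
x, y = rng.normal(size=4), rng.normal(size=4)
R = rho(rng.uniform(0, 2 * np.pi))
# The action is an algebra automorphism: rho(w)(xy) = rho(w)(x) rho(w)(y).
assert np.allclose(R @ gp(x, y), gp(R @ x, R @ y))
# Hence any polynomial in multivectors (e.g. x*y + y*x) is itself equivariant.
```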
Related papers
- Clifford Group Equivariant Simplicial Message Passing Networks [7.598439350696356]
We introduce Clifford Group Equivariant Simplicial Message Passing Networks.
Our method integrates the expressivity of Clifford group-equivariant layers with simplicial message passing.
Our method is able to outperform both equivariant and simplicial graph neural networks on a variety of geometric tasks.
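As a purely structural sketch under loud assumptions: the skeleton below runs message passing over a toy simplicial complex of vertices and edges, with ordinary linear maps (W_up, W_down, and friends are hypothetical stand-ins) in place of the Clifford group-equivariant message functions the paper actually uses:

```python
import numpy as np

rng = np.random.default_rng(0)

# A toy simplicial complex: 4 vertices (0-simplices) and 4 edges (1-simplices).
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
d = 8                                        # feature width
h_node = rng.normal(size=(4, d))             # features on 0-simplices
h_edge = rng.normal(size=(len(edges), d))    # features on 1-simplices

# Hypothetical message/update maps; in the paper these would be
# Clifford group-equivariant layers, not plain linear maps.
W_up, W_down, W_node, W_edge = (rng.normal(size=(d, d)) * 0.1 for _ in range(4))

def simplicial_step(h_node, h_edge):
    # Upward pass: each edge aggregates messages from its boundary vertices.
    m_edge = np.stack([(h_node[u] + h_node[v]) @ W_up for u, v in edges])
    # Downward pass: each vertex aggregates messages from its incident edges.
    m_node = np.zeros_like(h_node)
    for k, (u, v) in enumerate(edges):
        m_node[u] += h_edge[k] @ W_down
        m_node[v] += h_edge[k] @ W_down
    # Residual updates on every simplex level.
    return h_node + m_node @ W_node, h_edge + m_edge @ W_edge

h_node, h_edge = simplicial_step(h_node, h_edge)
```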
arXiv Detail & Related papers (2024-02-15T15:18:53Z)
- Lie Neurons: Adjoint-Equivariant Neural Networks for Semisimple Lie Algebras [5.596048634951087]
This paper proposes an equivariant neural network that takes data in any semi-simple Lie algebra as input.
The corresponding group acts on the Lie algebra as adjoint operations, making our proposed network adjoint-equivariant.
Our framework generalizes the Vector Neurons, a simple $\mathrm{SO}(3)$-equivariant network, from 3-D Euclidean space to Lie algebra spaces.
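A small numpy check of the symmetry in question, assuming features in $\mathfrak{so}(3)$ as a concrete example: the adjoint action of a rotation commutes with the Lie bracket, which is the equivariance property such a network is designed around:

```python
import numpy as np

def hat(w):
    """Map a 3-vector to the corresponding skew-symmetric matrix in so(3)."""
    return np.array([[0, -w[2], w[1]], [w[2], 0, -w[0]], [-w[1], w[0], 0]])

def rodrigues(axis, angle):
    """Rotation matrix exp(angle * hat(axis)) for a unit axis (Rodrigues' formula)."""
    K = hat(axis)
    return np.eye(3) + np.sin(angle) * K + (1 - np.cos(angle)) * (K @ K)

rng = np.random.default_rng(0)
axis = rng.normal(size=3); axis /= np.linalg.norm(axis)
R = rodrigues(axis, rng.uniform(0, np.pi))
x, y = hat(rng.normal(size=3)), hat(rng.normal(size=3))

Ad = lambda X: R @ X @ R.T            # adjoint action of SO(3) on so(3)
bracket = lambda A, B: A @ B - B @ A  # Lie bracket

# The bracket is adjoint-equivariant: Ad_R [x, y] = [Ad_R x, Ad_R y].
assert np.allclose(Ad(bracket(x, y)), bracket(Ad(x), Ad(y)))
# So layers built from brackets and linear combinations commute with Ad.
```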
arXiv Detail & Related papers (2023-10-06T18:34:27Z)
- Geometric Clifford Algebra Networks [53.456211342585824]
We propose Geometric Clifford Algebra Networks (GCANs) for modeling dynamical systems.
GCANs are based on symmetry group transformations using geometric (Clifford) algebras.
arXiv Detail & Related papers (2023-02-13T18:48:33Z)
- Clifford Neural Layers for PDE Modeling [61.07764203014727]
Partial differential equations (PDEs) see widespread use in the sciences and engineering to describe the simulation of physical processes as scalar and vector fields that interact and coevolve over time.
Current methods do not explicitly take into account the relationship between different fields and their internal components, which are often correlated.
This paper presents the first use of multivector representations, together with Clifford convolutions and Clifford Fourier transforms, in the context of deep learning.
The resulting Clifford neural layers are universally applicable and will find direct use in the areas of fluid dynamics, weather forecasting, and the modeling of physical systems in general.
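As a hedged illustration of the mechanism rather than the paper's layers: in the smallest nontrivial signature $\mathrm{Cl}(0,1)$, a multivector is a pair (scalar, $e_1$-component) with $e_1^2 = -1$, the geometric product is complex multiplication, and a one-dimensional Clifford convolution therefore reduces to a complex-valued convolution:

```python
import numpy as np

# In Cl(0,1) a multivector (scalar, e1) with e1^2 = -1 is a complex number,
# so a Clifford convolution over such 2-component fields is a complex
# convolution: the kernel multiplies field values by the geometric product.
rng = np.random.default_rng(0)
field = rng.normal(size=(64, 2))          # 1-D field: one multivector per point
kernel = rng.normal(size=(5, 2)) * 0.3    # multivector-valued kernel taps

z = field[:, 0] + 1j * field[:, 1]
k = kernel[:, 0] + 1j * kernel[:, 1]
out = np.convolve(z, k, mode="valid")     # geometric product at every offset

clifford_out = np.stack([out.real, out.imag], axis=-1)
print(clifford_out.shape)                 # (60, 2): output multivector field
```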
arXiv Detail & Related papers (2022-09-08T17:35:30Z)
- Unified Fourier-based Kernel and Nonlinearity Design for Equivariant Networks on Homogeneous Spaces [52.424621227687894]
We introduce a unified framework for group equivariant networks on homogeneous spaces.
We take advantage of the sparsity of Fourier coefficients of the lifted feature fields.
We show that other methods treating features as the Fourier coefficients in the stabilizer subgroup are special cases of our activation.
arXiv Detail & Related papers (2022-06-16T17:59:01Z)
- Geometric Deep Learning and Equivariant Neural Networks [0.9381376621526817]
We survey the mathematical foundations of geometric deep learning, focusing on group equivariant and gauge equivariant neural networks.
We develop gauge equivariant convolutional neural networks on an arbitrary manifold $\mathcal{M}$ using principal bundles with structure group $K$ and equivariant maps between sections of associated vector bundles.
We analyze several applications of this formalism, including semantic segmentation and object detection networks.
arXiv Detail & Related papers (2021-05-28T15:41:52Z)
- A Practical Method for Constructing Equivariant Multilayer Perceptrons for Arbitrary Matrix Groups [115.58550697886987]
We provide a completely general algorithm for solving for the equivariant layers of matrix groups.
In addition to recovering solutions from other works as special cases, we construct multilayer perceptrons equivariant to multiple groups that have never been tackled before.
Our approach outperforms non-equivariant baselines, with applications to particle physics and dynamical systems.
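A minimal sketch of this style of constraint solving (an assumption-laden toy, not the paper's EMLP library): for a matrix Lie group specified by Lie algebra generators, an equivariant linear layer $W$ must satisfy $AW - WA = 0$ for every generator $A$, so a basis falls out of the null space of stacked Kronecker-product constraints:

```python
import numpy as np

def equivariant_basis(generators, n, tol=1e-8):
    """Return a basis of n x n matrices W with A W - W A = 0 for every Lie
    algebra generator A, i.e. linear layers commuting with the group action.
    Uses row-major vec(W): the constraint matrix is A (x) I - I (x) A^T."""
    C = np.concatenate(
        [np.kron(A, np.eye(n)) - np.kron(np.eye(n), A.T) for A in generators]
    )
    _, s, Vt = np.linalg.svd(C)
    return [v.reshape(n, n) for v in Vt[s < tol]]

# Example: SO(2) acting on R^2 via its single generator. The commutant is
# two-dimensional, spanned by the identity and the generator itself.
A = np.array([[0.0, -1.0], [1.0, 0.0]])
for W in equivariant_basis([A], 2):
    print(np.round(W, 2))
```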
arXiv Detail & Related papers (2021-04-19T17:21:54Z)
- LieTransformer: Equivariant self-attention for Lie Groups [49.9625160479096]
Group equivariant neural networks are used as building blocks of group invariant neural networks.
We extend the scope of the literature to self-attention, which is emerging as a prominent building block of deep learning models.
We propose the LieTransformer, an architecture composed of LieSelfAttention layers that are equivariant to arbitrary Lie groups and their discrete subgroups.
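A discrete toy analogue (my construction, not the paper's): on the cyclic group $C_n$, self-attention whose scores mix a content term with a bias indexed by the relative group element $g_i^{-1} g_j$ commutes with left translation, which is the equivariance property LieSelfAttention generalizes to continuous Lie groups:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 8, 4                             # cyclic group C_8, feature width 4
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
rel_bias = rng.normal(size=n)           # score term indexed by g_i^{-1} g_j

def softmax(s):
    e = np.exp(s - s.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def self_attention(f):
    """f[i] is the feature attached to group element i of C_n."""
    q, k, v = f @ Wq, f @ Wk, f @ Wv
    i, j = np.indices((n, n))
    scores = q @ k.T + rel_bias[(j - i) % n]   # content + relative-element bias
    return softmax(scores) @ v

f = rng.normal(size=(n, d))
h = 3                                          # translate by group element h
# Left translation permutes features: (L_h f)[i] = f[(i - h) % n].
shift = lambda f: np.roll(f, h, axis=0)
assert np.allclose(self_attention(shift(f)), shift(self_attention(f)))
```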
arXiv Detail & Related papers (2020-12-20T11:02:49Z)
- A deep network construction that adapts to intrinsic dimensionality beyond the domain [79.23797234241471]
We study the approximation of two-layer compositions $f(x) = g(\phi(x))$ via deep networks with ReLU activation.
We focus on two intuitive and practically relevant choices for $\phi$: the projection onto a low-dimensional embedded submanifold and a distance to a collection of low-dimensional sets.
arXiv Detail & Related papers (2020-08-06T09:50:29Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information (including all content) and is not responsible for any consequences of its use.