Lie Neurons: Adjoint-Equivariant Neural Networks for Semisimple Lie Algebras
- URL: http://arxiv.org/abs/2310.04521v3
- Date: Thu, 6 Jun 2024 18:01:49 GMT
- Title: Lie Neurons: Adjoint-Equivariant Neural Networks for Semisimple Lie Algebras
- Authors: Tzu-Yuan Lin, Minghan Zhu, Maani Ghaffari
- Abstract summary: This paper proposes an equivariant neural network that takes data in any semi-simple Lie algebra as input.
The corresponding group acts on the Lie algebra as adjoint operations, making our proposed network adjoint-equivariant.
Our framework generalizes the Vector Neurons, a simple $\mathrm{SO}(3)$-equivariant network, from 3-D Euclidean space to Lie algebra spaces.
- Score: 5.596048634951087
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper proposes an equivariant neural network that takes data in any semi-simple Lie algebra as input. The corresponding group acts on the Lie algebra as adjoint operations, making our proposed network adjoint-equivariant. Our framework generalizes the Vector Neurons, a simple $\mathrm{SO}(3)$-equivariant network, from 3-D Euclidean space to Lie algebra spaces, building upon the invariance property of the Killing form. Furthermore, we propose novel Lie bracket layers and geometric channel mixing layers that extend the modeling capacity. Experiments are conducted for the $\mathfrak{so}(3)$, $\mathfrak{sl}(3)$, and $\mathfrak{sp}(4)$ Lie algebras on various tasks, including fitting equivariant and invariant functions, learning system dynamics, point cloud registration, and homography-based shape classification. Our proposed equivariant network shows wide applicability and competitive performance in various domains.
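Two properties the abstract relies on are easy to check numerically. The sketch below is a minimal NumPy illustration (not the authors' released code; `hat`, `ad`, and `exp_so3` are helper names introduced here) that verifies, for $\mathfrak{so}(3)$: Ad-invariance of the Killing form, adjoint-equivariance of a channel-mixing linear layer, and equivariance of the Lie bracket.
```python
# Minimal sketch (assumptions: so(3) only, NumPy only) of the facts behind
# adjoint-equivariant "Lie Neuron"-style layers:
#   1. Ad-invariance of the Killing form B(X, Y) = tr(ad_X ad_Y),
#   2. adjoint-equivariance of a channel-mixing linear layer,
#   3. adjoint-equivariance of a Lie-bracket operation.
import numpy as np

# Basis of so(3): skew-symmetric generators E_i = hat(e_i).
E = np.array([
    [[0, 0, 0], [0, 0, -1], [0, 1, 0]],
    [[0, 0, 1], [0, 0, 0], [-1, 0, 0]],
    [[0, -1, 0], [1, 0, 0], [0, 0, 0]],
], dtype=float)

def hat(v):
    """Coordinate vector in R^3 -> so(3) matrix."""
    return np.einsum('i,ijk->jk', v, E)

def ad(X):
    """Matrix of ad_X = [X, .] expressed in the basis E."""
    cols = []
    for Ei in E:
        bracket = X @ Ei - Ei @ X
        # Coordinates of [X, E_i] in the basis E (note tr(E_j^T E_j) = 2).
        cols.append([np.trace(Ej.T @ bracket) / 2 for Ej in E])
    return np.array(cols).T

def killing(X, Y):
    """Killing form B(X, Y) = tr(ad_X ad_Y); for so(3) this equals tr(X Y)."""
    return np.trace(ad(X) @ ad(Y))

def exp_so3(w):
    """Rodrigues' formula: so(3) coordinates -> rotation matrix."""
    theta = np.linalg.norm(w)
    K = hat(w / theta)
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

rng = np.random.default_rng(0)
R = exp_so3(rng.normal(size=3))                       # a random rotation
X, Y = hat(rng.normal(size=3)), hat(rng.normal(size=3))

# 1) The Killing form is invariant under Ad_R(X) = R X R^T.
assert np.isclose(killing(R @ X @ R.T, R @ Y @ R.T), killing(X, Y))

# 2) Channel mixing commutes with the adjoint action. Features are several
#    so(3) elements stored as coordinate columns (shape (3, K)); in these
#    coordinates Ad_R acts as left multiplication by R.
F = rng.normal(size=(3, 4))                           # 4 algebra-valued channels
W = rng.normal(size=(4, 8))                           # learnable channel mixing
assert np.allclose(R @ (F @ W), (R @ F) @ W)

# 3) The Lie bracket is equivariant: [Ad_R x, Ad_R y] = Ad_R [x, y]
#    (in so(3) coordinates the bracket is the cross product).
x, y = rng.normal(size=3), rng.normal(size=3)
assert np.allclose(np.cross(R @ x, R @ y), R @ np.cross(x, y))
```
The same channel-mixing argument is what lets the framework carry over from the 3-D vectors of Vector Neurons to coordinates of any semi-simple Lie algebra, with the Ad-invariant Killing form playing the role of the Euclidean inner product.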
Related papers
- Geometric Clifford Algebra Networks [53.456211342585824]
We propose Geometric Clifford Algebra Networks (GCANs) for modeling dynamical systems.
GCANs are based on symmetry group transformations using geometric (Clifford) algebras.
arXiv Detail & Related papers (2023-02-13T18:48:33Z)
- How Jellyfish Characterise Alternating Group Equivariant Neural Networks [0.0]
We find a basis for the learnable, linear, $A_n$-equivariant layer functions between such tensor power spaces in the standard basis of $\mathbb{R}^n$.
We also describe how our approach generalises to the construction of neural networks that are equivariant to local symmetries.
arXiv Detail & Related papers (2023-01-24T17:39:10Z)
- Deep Learning Symmetries and Their Lie Groups, Algebras, and Subalgebras from First Principles [55.41644538483948]
We design a deep-learning algorithm for the discovery and identification of the continuous group of symmetries present in a labeled dataset.
We use fully connected neural networks to model the symmetry transformations and the corresponding generators.
Our study also opens the door for using a machine learning approach in the mathematical study of Lie groups and their properties.
arXiv Detail & Related papers (2023-01-13T16:25:25Z)
- Connecting Permutation Equivariant Neural Networks and Partition Diagrams [0.0]
We show that all of the weight matrices that appear in permutation equivariant neural networks can be obtained from Schur-Weyl duality.
In particular, we adapt Schur-Weyl duality to derive a simple, diagrammatic method for calculating the weight matrices themselves.
arXiv Detail & Related papers (2022-12-16T18:48:54Z)
- Geometric and Physical Quantities improve E(3) Equivariant Message Passing [59.98327062664975]
We introduce Steerable E(3) Equivariant Graph Neural Networks (SEGNNs) that generalise equivariant graph networks.
This model, composed of steerables, is able to incorporate geometric and physical information in both the message and update functions.
We demonstrate the effectiveness of our method on several tasks in computational physics and chemistry.
arXiv Detail & Related papers (2021-10-06T16:34:26Z)
- Geometric Deep Learning and Equivariant Neural Networks [0.9381376621526817]
We survey the mathematical foundations of geometric deep learning, focusing on group equivariant and gauge equivariant neural networks.
We develop gauge equivariant convolutional neural networks on an arbitrary manifold $\mathcal{M}$ using principal bundles with structure group $K$ and equivariant maps between sections of associated vector bundles.
We analyze several applications of this formalism, including semantic segmentation and object detection networks.
arXiv Detail & Related papers (2021-05-28T15:41:52Z)
- Vector Neurons: A General Framework for SO(3)-Equivariant Networks [32.81671803104126]
In this paper, we introduce a general framework built on top of what we call Vector Neuron representations.
Our vector neurons enable a simple mapping of SO(3) actions to latent spaces.
We also show, for the first time, a rotation-equivariant reconstruction network. (A minimal sketch of the vector-neuron building blocks appears after this list.)
arXiv Detail & Related papers (2021-04-25T18:48:15Z)
- A Practical Method for Constructing Equivariant Multilayer Perceptrons for Arbitrary Matrix Groups [115.58550697886987]
We provide a completely general algorithm for solving for the equivariant layers of matrix groups.
In addition to recovering solutions from other works as special cases, we construct multilayer perceptrons equivariant to multiple groups that have never been tackled before.
Our approach outperforms non-equivariant baselines, with applications to particle physics and dynamical systems. (A toy version of the layer-solving recipe is sketched after this list.)
arXiv Detail & Related papers (2021-04-19T17:21:54Z)
- Stability of Algebraic Neural Networks to Small Perturbations [179.55535781816343]
Algebraic neural networks (AlgNNs) are composed of a cascade of layers, each one associated with an algebraic signal model.
We show how any architecture that uses a formal notion of convolution can be stable beyond particular choices of the shift operator. (A minimal polynomial-filter sketch of this notion of convolution appears after this list.)
arXiv Detail & Related papers (2020-10-22T09:10:16Z)
- Algebraic Neural Networks: Stability to Deformations [179.55535781816343]
We study algebraic neural networks (AlgNNs) with commutative algebras.
AlgNNs unify diverse architectures such as Euclidean convolutional neural networks, graph neural networks, and group neural networks.
arXiv Detail & Related papers (2020-09-03T03:41:38Z)
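For reference, a minimal sketch of the Vector Neurons construction listed above: a linear layer that only mixes vector channels and an equivariant "vector ReLU" that clips each output vector against a learned direction. This is an illustration under assumed shapes, not the authors' implementation; `vn_linear` and `vn_relu` are names introduced here.
```python
# Sketch of Vector Neuron building blocks (assumed layout: (N, C, 3) arrays
# holding C 3-D vector channels per sample). Not the authors' code.
import numpy as np

def vn_linear(X, W):
    """Channel-mixing linear layer: (N, C_in, 3) -> (N, C_out, 3).
    W acts only on the channel axis, so rotating every input vector by R
    rotates every output vector by the same R."""
    return np.einsum('oc,ncd->nod', W, X)

def vn_relu(Q, K, eps=1e-8):
    """Equivariant ReLU: keep q where <q, k> >= 0, otherwise remove the
    component of q along the learned direction k."""
    dot = np.sum(Q * K, axis=-1, keepdims=True)
    projected = Q - dot / (np.sum(K * K, axis=-1, keepdims=True) + eps) * K
    return np.where(dot >= 0, Q, projected)

# Equivariance check: rotating the input rotates the output identically.
rng = np.random.default_rng(0)
X = rng.normal(size=(2, 5, 3))                        # 2 samples, 5 vector channels
W_q, W_k = rng.normal(size=(8, 5)), rng.normal(size=(8, 5))
R, _ = np.linalg.qr(rng.normal(size=(3, 3)))          # random orthogonal matrix
if np.linalg.det(R) < 0:                              # make it a proper rotation
    R[:, 0] *= -1

out = vn_relu(vn_linear(X, W_q), vn_linear(X, W_k))
out_rot = vn_relu(vn_linear(X @ R.T, W_q), vn_linear(X @ R.T, W_k))
assert np.allclose(out @ R.T, out_rot)
```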
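A toy version of the layer-solving recipe referenced in the "A Practical Method for Constructing Equivariant Multilayer Perceptrons for Arbitrary Matrix Groups" entry: stack the equivariance constraints for a set of group generators and take their common null space. `equivariant_basis` is a name introduced here, not the paper's API; the sanity check is the classic permutation-equivariant case, which is also the setting of the partition-diagrams entry above.
```python
# Toy solver for equivariant linear layers: find all W with
# rho_out(g) @ W == W @ rho_in(g) for the group generators g.
import numpy as np

def equivariant_basis(gens_in, gens_out, tol=1e-8):
    """Return a basis of the equivariant weight space, shape (num_basis, d_out, d_in)."""
    d_in, d_out = gens_in[0].shape[0], gens_out[0].shape[0]
    blocks = []
    for g_in, g_out in zip(gens_in, gens_out):
        # With row-major vec(): vec(A @ W @ B) = kron(A, B.T) @ vec(W), so the
        # map W -> g_out @ W - W @ g_in becomes the matrix below on vec(W).
        blocks.append(np.kron(g_out, np.eye(d_in)) - np.kron(np.eye(d_out), g_in.T))
    constraint = np.concatenate(blocks, axis=0)
    _, s, vt = np.linalg.svd(constraint)              # null space via SVD
    return vt[s < tol].reshape(-1, d_out, d_in)

# Sanity check: S_5 permuting the coordinates of R^5 (generators: a transposition
# and a 5-cycle). The solver recovers the classic 2-dimensional space of
# permutation-equivariant linear maps, spanned by I and the all-ones matrix.
n = 5
swap = np.eye(n)[[1, 0] + list(range(2, n))]
cycle = np.eye(n)[list(range(1, n)) + [0]]
basis = equivariant_basis([swap, cycle], [swap, cycle])
print(basis.shape)   # -> (2, 5, 5)
```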
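Finally, the "formal notion of convolution" behind the two algebraic-neural-network entries can be illustrated by a filter that is a polynomial in a shift operator S: the same code realizes a circular convolution or a graph convolution depending on the choice of S. A minimal sketch, not the papers' framework; `algebraic_filter` is a name introduced here.
```python
# Algebraic convolution sketch: h(S) x = sum_k h[k] * S^k @ x.
import numpy as np

def algebraic_filter(S, h, x):
    """Apply the polynomial filter h(S) to the signal x."""
    out = np.zeros_like(x, dtype=float)
    Sk = np.eye(S.shape[0])                      # S^0
    for hk in h:
        out = out + hk * (Sk @ x)
        Sk = S @ Sk                              # advance to the next power of S
    return out

rng = np.random.default_rng(0)
n = 6
x = rng.normal(size=n)
h = [0.5, 0.3, 0.2]                              # filter taps (polynomial coefficients)

# Same filter, two shift operators:
cyclic = np.eye(n)[list(range(1, n)) + [0]]      # time shift -> circular convolution
A = np.triu(rng.integers(0, 2, size=(n, n)), 1)  # random graph adjacency
A = (A + A.T).astype(float)                      # -> graph convolution
print(algebraic_filter(cyclic, h, x))
print(algebraic_filter(A, h, x))
```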
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.