Lie Group Decompositions for Equivariant Neural Networks
- URL: http://arxiv.org/abs/2310.11366v2
- Date: Wed, 10 Jul 2024 17:12:45 GMT
- Title: Lie Group Decompositions for Equivariant Neural Networks
- Authors: Mircea Mironenco, Patrick Forré
- Abstract summary: We show how convolution kernels can be parametrized to build models equivariant with respect to affine transformations.
We evaluate the robustness and out-of-distribution generalisation capability of our model on the benchmark affine-invariant classification task.
- Score: 12.139222986297261
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Invariance and equivariance to geometrical transformations have proven to be very useful inductive biases when training (convolutional) neural network models, especially in the low-data regime. Much work has focused on the case where the symmetry group employed is compact or abelian, or both. Recent work has explored enlarging the class of transformations used to the case of Lie groups, principally through the use of their Lie algebra, as well as the group exponential and logarithm maps. The applicability of such methods is limited by the fact that depending on the group of interest $G$, the exponential map may not be surjective. Further limitations are encountered when $G$ is neither compact nor abelian. Using the structure and geometry of Lie groups and their homogeneous spaces, we present a framework by which it is possible to work with such groups primarily focusing on the groups $G = \text{GL}^{+}(n, \mathbb{R})$ and $G = \text{SL}(n, \mathbb{R})$, as well as their representation as affine transformations $\mathbb{R}^{n} \rtimes G$. Invariant integration as well as a global parametrization is realized by a decomposition into subgroups and submanifolds which can be handled individually. Under this framework, we show how convolution kernels can be parametrized to build models equivariant with respect to affine transformations. We evaluate the robustness and out-of-distribution generalisation capability of our model on the benchmark affine-invariant classification task, outperforming previous proposals.
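The abstract's key construction is a global parametrization of a non-compact, non-abelian group via a decomposition into pieces the exponential map handles individually. A minimal numpy sketch of one such decomposition (the polar/Cartan split $\text{GL}^{+}(n) \cong \text{SPD}(n) \times \text{SO}(n)$, with the exponential surjective onto each factor): this illustrates the idea only and is not the paper's implementation; the helper names `expm` and `glp_element` are illustrative.

```python
import numpy as np

def expm(M, terms=30):
    """Matrix exponential via a truncated Taylor series (adequate for small norms)."""
    out = np.eye(M.shape[0])
    term = np.eye(M.shape[0])
    for i in range(1, terms):
        term = term @ M / i
        out = out + term
    return out

def glp_element(s_params, w_params, n):
    """Parametrize GL+(n, R) via the polar decomposition
    GL+(n) ~= SPD(n) x SO(n): A = exp(S) exp(W), S symmetric, W skew.
    exp is surjective onto each factor, so every A in GL+(n) is reachable
    even though exp is not surjective onto GL+(n) itself."""
    S = np.zeros((n, n))
    S[np.triu_indices(n)] = s_params          # n(n+1)/2 symmetric parameters
    S = S + S.T - np.diag(np.diag(S))
    W = np.zeros((n, n))
    W[np.triu_indices(n, k=1)] = w_params     # n(n-1)/2 skew parameters
    W = W - W.T
    return expm(S) @ expm(W)

A = glp_element(np.array([0.3, -0.5, 0.1]), np.array([1.2]), 2)
print(np.linalg.det(A) > 0)  # True: the image always lies in GL+(n)
```

Since $\det(\exp(S)) = e^{\operatorname{tr} S} > 0$ and $\exp(W) \in \text{SO}(n)$, every output has positive determinant, so the map never leaves $\text{GL}^{+}(n)$.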
Related papers
- Deep Learning Symmetries and Their Lie Groups, Algebras, and Subalgebras from First Principles [55.41644538483948]
We design a deep-learning algorithm for the discovery and identification of the continuous group of symmetries present in a labeled dataset.
We use fully connected neural networks to model the symmetry transformations and the corresponding generators.
Our study also opens the door for using a machine learning approach in the mathematical study of Lie groups and their properties.
arXiv Detail & Related papers (2023-01-13T16:25:25Z) - Capacity of Group-invariant Linear Readouts from Equivariant Representations: How Many Objects can be Linearly Classified Under All Possible Views? [21.06669693699965]
We find that the fraction of separable dichotomies is determined by the dimension of the space that is fixed by the group action.
We show how this relation extends to operations such as convolutions, element-wise nonlinearities, and global and local pooling.
arXiv Detail & Related papers (2021-10-14T15:46:53Z) - Commutative Lie Group VAE for Disentanglement Learning [96.32813624341833]
We view disentanglement learning as discovering an underlying structure that equivariantly reflects the factorized variations shown in data.
A simple model named Commutative Lie Group VAE is introduced to realize the group-based disentanglement learning.
Experiments show that our model can effectively learn disentangled representations without supervision, and can achieve state-of-the-art performance without extra constraints.
arXiv Detail & Related papers (2021-06-07T07:03:14Z) - A Practical Method for Constructing Equivariant Multilayer Perceptrons for Arbitrary Matrix Groups [115.58550697886987]
We provide a completely general algorithm for solving for the equivariant layers of matrix groups.
In addition to recovering solutions from other works as special cases, we construct multilayer perceptrons equivariant to multiple groups that have never been tackled before.
Our approach outperforms non-equivariant baselines, with applications to particle physics and dynamical systems.
arXiv Detail & Related papers (2021-04-19T17:21:54Z) - On the geometric and Riemannian structure of the spaces of group equivariant non-expansive operators [0.0]
Group equivariant non-expansive operators have been recently proposed as basic components in topological data analysis and deep learning.
We show how a space $\mathcal{F}$ of group equivariant non-expansive operators can be endowed with the structure of a Riemannian manifold.
We also describe a procedure to select a finite set of representative group equivariant non-expansive operators in the considered manifold.
arXiv Detail & Related papers (2021-03-03T17:29:25Z) - Group Equivariant Conditional Neural Processes [30.134634059773703]
We present the group equivariant conditional neural process (EquivCNP).
We show that EquivCNP achieves comparable performance to conventional conditional neural processes in a 1D regression task.
arXiv Detail & Related papers (2021-02-17T13:50:07Z) - LieTransformer: Equivariant self-attention for Lie Groups [49.9625160479096]
Group equivariant neural networks are used as building blocks of group invariant neural networks.
We extend the scope of the literature to self-attention, which is emerging as a prominent building block of deep learning models.
We propose the LieTransformer, an architecture composed of LieSelfAttention layers that are equivariant to arbitrary Lie groups and their discrete subgroups.
arXiv Detail & Related papers (2020-12-20T11:02:49Z) - Applying Lie Groups Approaches for Rigid Registration of Point Clouds [3.308743964406687]
We use Lie groups and Lie algebras to find the rigid transformation that best registers two surfaces represented by point clouds.
This so-called pairwise rigid registration can be formulated by comparing intrinsic second-order orientation tensors.
We show promising results when embedding orientation tensor fields in Lie algebras.
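For context on the registration problem this entry addresses: pairwise rigid registration asks for the rotation and translation aligning two point sets. A hedged sketch of the classical closed-form baseline (the Kabsch/Procrustes algorithm, not the paper's orientation-tensor method; the function name `kabsch` is illustrative):

```python
import numpy as np

def kabsch(P, Q):
    """Closed-form rigid registration: find R in SO(3) and t minimizing
    sum_i ||R p_i + t - q_i||^2, via SVD of the cross-covariance."""
    cP, cQ = P.mean(0), Q.mean(0)
    H = (P - cP).T @ (Q - cQ)                 # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))    # fix reflection: keep det(R) = +1
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R, cQ - R @ cP

rng = np.random.default_rng(1)
P = rng.normal(size=(10, 3))
theta = 0.7
R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0, 0, 1]])
Q = P @ R_true.T + np.array([1.0, -2.0, 0.5])
R, t = kabsch(P, Q)
print(np.allclose(R, R_true))  # True: recovers the ground-truth rotation
```

The sign correction on the last singular direction is what keeps the solution in $\text{SO}(3)$ rather than $\text{O}(3)$.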
arXiv Detail & Related papers (2020-06-23T21:26:57Z) - Stochastic Flows and Geometric Optimization on the Orthogonal Group [52.50121190744979]
We present a new class of geometrically-driven optimization algorithms on the orthogonal group $O(d)$.
We show that our methods can be applied in various fields of machine learning including deep, convolutional and recurrent neural networks, reinforcement learning, flows and metric learning.
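A flavor of what geometric optimization on $O(d)$ involves: each update must stay on the manifold. A minimal sketch using the standard Cayley retraction (a common deterministic scheme, not the paper's stochastic-flow algorithms; `cayley_step` is an illustrative name):

```python
import numpy as np

def cayley_step(X, G, lr):
    """One feasible gradient step on O(d) via the Cayley retraction.
    G is the Euclidean gradient of the loss at X; W is skew-symmetric,
    so (I + a W)^{-1} (I - a W) is orthogonal and the iterate stays on O(d)."""
    W = G @ X.T - X @ G.T
    I = np.eye(X.shape[0])
    return np.linalg.solve(I + lr / 2 * W, (I - lr / 2 * W) @ X)

d = 4
rng = np.random.default_rng(2)
X = np.linalg.qr(rng.normal(size=(d, d)))[0]   # random starting point on O(d)
G = rng.normal(size=(d, d))                     # a dummy Euclidean gradient
X_new = cayley_step(X, G, 0.1)
print(np.allclose(X_new.T @ X_new, np.eye(d)))  # True: still orthogonal
```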
arXiv Detail & Related papers (2020-03-30T15:37:50Z) - Generalizing Convolutional Neural Networks for Equivariance to Lie Groups on Arbitrary Continuous Data [52.78581260260455]
We propose a general method to construct a convolutional layer that is equivariant to transformations from any specified Lie group.
We apply the same model architecture to images, ball-and-stick molecular data, and Hamiltonian dynamical systems.
arXiv Detail & Related papers (2020-02-25T17:40:38Z)
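The group-convolution idea underlying several entries above can be seen in the smallest possible setting. A sketch on a cyclic discretization of $\text{SO}(2)$, where $(f * k)(g) = \sum_h f(h)\, k(h^{-1}g)$ reduces to circular convolution and equivariance can be checked numerically (illustrative code, not any paper's implementation):

```python
import numpy as np

def group_conv_so2(f, k):
    """Group convolution on SO(2) discretized as N rotations (the cyclic
    group Z_N): (f * k)(g) = sum_h f(h) k(h^{-1} g) = sum_h f(h) k(g - h)."""
    N = len(f)
    out = np.zeros(N)
    for g in range(N):
        for h in range(N):
            out[g] += f[h] * k[(g - h) % N]
    return out

rng = np.random.default_rng(0)
f = rng.normal(size=8)
k = rng.normal(size=8)
y = group_conv_so2(f, k)
y_shift = group_conv_so2(np.roll(f, 3), k)   # rotate the input by 3 steps
print(np.allclose(np.roll(y, 3), y_shift))   # True: output rotates identically
```

Rotating the input rotates the output by the same amount, which is exactly the equivariance property the layer constructions above generalize to continuous and non-compact groups.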
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.