Computing equivariant matrices on homogeneous spaces for Geometric Deep Learning and Automorphic Lie Algebras
- URL: http://arxiv.org/abs/2303.07157v2
- Date: Mon, 15 Apr 2024 12:03:24 GMT
- Title: Computing equivariant matrices on homogeneous spaces for Geometric Deep Learning and Automorphic Lie Algebras
- Authors: Vincent Knibbeler
- Abstract summary: We compute equivariant maps from a homogeneous space $G/H$ of a Lie group $G$ to a module of this group.
This work has applications in the theoretical development of geometric deep learning and also in the theory of automorphic Lie algebras.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We develop an elementary method to compute spaces of equivariant maps from a homogeneous space $G/H$ of a Lie group $G$ to a module of this group. The Lie group is not required to be compact. More generally, we study spaces of invariant sections in homogeneous vector bundles, and take a special interest in the case where the fibres are algebras. These latter cases have a natural global algebra structure. We classify these automorphic algebras for the case where the homogeneous space has compact stabilisers. This work has applications in the theoretical development of geometric deep learning and also in the theory of automorphic Lie algebras.
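For orientation, a standard characterisation of the objects in question (a textbook fact, not a summary of the paper's own construction): a map $\Phi: G/H \to V$ into a $G$-module $(V, \rho)$ is equivariant when $\Phi(g \cdot x) = \rho(g)\,\Phi(x)$ for all $g \in G$ and $x \in G/H$. Such a map is fixed by its value at the base point, since $h \cdot eH = eH$ forces $\rho(h)\,\Phi(eH) = \Phi(eH)$, so $\Phi(eH)$ lies in the fixed subspace $V^H$; conversely, every $v \in V^H$ defines an equivariant map by $\Phi(gH) = \rho(g)\,v$.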
Related papers
- Hyperpolyadic structures [0.0]
We introduce a new class of division algebras, the hyperpolyadic algebras, which correspond to the binary division algebras $\mathbb{R}$, $\mathbb{C}$, $\mathbb{H}$, $\mathbb{O}$ without considering new elements.
For each invertible element we define a new norm which is polyadically multiplicative, and the corresponding map is a $n$-ary homomorphism.
We show that the ternary division algebra of imaginary "half-octonions" is unitless and totally associative.
arXiv Detail & Related papers (2023-12-03T12:27:53Z)
- Lie Group Decompositions for Equivariant Neural Networks [12.139222986297261]
We show how convolution kernels can be parametrized to build models equivariant with respect to affine transformations.
We evaluate the robustness and out-of-distribution generalisation capability of our model on the benchmark affine-invariant classification task.
arXiv Detail & Related papers (2023-10-17T16:04:33Z)
- Discovering Sparse Representations of Lie Groups with Machine Learning [55.41644538483948]
We show that our method reproduces the canonical representations of the generators of the Lorentz group.
This approach is completely general and can be used to find the infinitesimal generators for any Lie group.
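As an illustration of the kind of object being recovered (a generic check, not the paper's algorithm; the generator and metric below are standard textbook choices), the sketch below exponentiates a boost generator of the Lorentz group and verifies that the resulting matrix preserves the Minkowski metric:

```python
import numpy as np
from scipy.linalg import expm

# Minkowski metric with signature (+, -, -, -)
eta = np.diag([1.0, -1.0, -1.0, -1.0])

# Infinitesimal generator of a boost along the x-axis
K_x = np.zeros((4, 4))
K_x[0, 1] = K_x[1, 0] = 1.0

# Exponentiate the generator to obtain a finite Lorentz transformation
rapidity = 0.7
Lambda = expm(rapidity * K_x)

# A Lorentz transformation must preserve the metric: Lambda^T eta Lambda = eta
assert np.allclose(Lambda.T @ eta @ Lambda, eta)
print("Lambda preserves the Minkowski metric")
```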
arXiv Detail & Related papers (2023-02-10T17:12:05Z)
- Deep Learning Symmetries and Their Lie Groups, Algebras, and Subalgebras from First Principles [55.41644538483948]
We design a deep-learning algorithm for the discovery and identification of the continuous group of symmetries present in a labeled dataset.
We use fully connected neural networks to model the symmetry transformations and the corresponding generators.
Our study also opens the door for using a machine learning approach in the mathematical study of Lie groups and their properties.
arXiv Detail & Related papers (2023-01-13T16:25:25Z)
- Invertible subalgebras [0.30458514384586394]
We introduce invertible subalgebras of local operator algebras on lattices.
On a two-dimensional lattice, an invertible subalgebra hosts a chiral anyon theory by a commuting Hamiltonian.
We consider a metric on the group of all QCA on infinite lattices and prove that the metric completion contains the time evolution by local Hamiltonians.
arXiv Detail & Related papers (2022-11-03T18:31:32Z)
- Machine learning and invariant theory [10.178220223515956]
We introduce the topic and explain a couple of methods to explicitly parameterize equivariant functions.
We explicate a general procedure, attributed to Malgrange, to express all maps between linear spaces that are equivariant under the action of a group $G$.
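A standard worked example of such a parametrisation (a textbook case, included for illustration rather than taken from the paper): for $G = SO(2)$ acting on $\mathbb{R}^2$ by rotations $R_\theta$, every polynomial map $f: \mathbb{R}^2 \to \mathbb{R}^2$ satisfying $f(R_\theta x) = R_\theta f(x)$ can be written as $f(x) = p(|x|^2)\, x + q(|x|^2)\, Jx$, where $J$ is the rotation by $\pi/2$ and $p$, $q$ are arbitrary polynomials in the invariant $|x|^2$.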
arXiv Detail & Related papers (2022-09-29T17:52:17Z)
- Unified Fourier-based Kernel and Nonlinearity Design for Equivariant Networks on Homogeneous Spaces [52.424621227687894]
We introduce a unified framework for group equivariant networks on homogeneous spaces.
We take advantage of the sparsity of Fourier coefficients of the lifted feature fields.
We show that other methods treating features as the Fourier coefficients in the stabilizer subgroup are special cases of our activation.
arXiv Detail & Related papers (2022-06-16T17:59:01Z)
- Coordinate Independent Convolutional Networks -- Isometry and Gauge Equivariant Convolutions on Riemannian Manifolds [70.32518963244466]
A major complication in comparison to flat spaces is that it is unclear in which alignment a convolution kernel should be applied on a manifold.
We argue that the particular choice of coordinatization should not affect a network's inference -- it should be coordinate independent.
A simultaneous demand for coordinate independence and weight sharing is shown to result in a requirement on the network to be equivariant.
arXiv Detail & Related papers (2021-06-10T19:54:19Z)
- Hilbert Spaces of Entire Functions and Toeplitz Quantization of Euclidean Planes [0.0]
We extend the theory of Toeplitz quantization to include diverse and interesting non-commutative realizations of the classical Euclidean plane.
The Toeplitz operators are geometrically constructed as special elements from this algebra.
Various illustrative examples are computed.
arXiv Detail & Related papers (2021-05-18T09:52:48Z)
- A Practical Method for Constructing Equivariant Multilayer Perceptrons for Arbitrary Matrix Groups [115.58550697886987]
We provide a completely general algorithm for solving for the equivariant layers of matrix groups.
In addition to recovering solutions from other works as special cases, we construct multilayer perceptrons equivariant to multiple groups that have never been tackled before.
Our approach outperforms non-equivariant baselines, with applications to particle physics and dynamical systems.
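A minimal sketch of the underlying linear-algebra step (an illustrative reimplementation under simplifying assumptions, not the paper's library or API): for a finite set of group generators acting by matrices, the equivariant linear layers $W$ satisfy $\rho_{out}(g)\, W = W \rho_{in}(g)$, and a basis of solutions can be read off from the null space of a stacked constraint matrix:

```python
import numpy as np

def equivariant_basis(rho_in, rho_out, tol=1e-10):
    """Basis of matrices W with rho_out @ W == W @ rho_in for every generator.

    rho_in, rho_out: lists of square matrices (group generators acting on the
    input and output spaces, respectively).
    """
    n_in, n_out = rho_in[0].shape[0], rho_out[0].shape[0]
    # vec(rho_out W - W rho_in) = (I (x) rho_out - rho_in^T (x) I) vec(W)
    blocks = [np.kron(np.eye(n_in), r_out) - np.kron(r_in.T, np.eye(n_out))
              for r_in, r_out in zip(rho_in, rho_out)]
    constraint = np.vstack(blocks)
    # Null space via SVD: right singular vectors with (near-)zero singular values
    _, s, vt = np.linalg.svd(constraint)
    null_mask = np.concatenate([s, np.zeros(vt.shape[0] - len(s))]) < tol
    return [v.reshape(n_out, n_in, order="F") for v in vt[null_mask]]

# Example: maps R^4 -> R^4 equivariant to cyclic shifts (circulant matrices)
shift = np.roll(np.eye(4), 1, axis=0)  # generator of the cyclic group C_4
basis = equivariant_basis([shift], [shift])
print(len(basis))  # 4 independent equivariant maps, as expected for circulants
```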
arXiv Detail & Related papers (2021-04-19T17:21:54Z)
- LieTransformer: Equivariant self-attention for Lie Groups [49.9625160479096]
Group equivariant neural networks are used as building blocks of group invariant neural networks.
We extend the scope of the literature to self-attention, which is emerging as a prominent building block of deep learning models.
We propose the LieTransformer, an architecture composed of LieSelfAttention layers that are equivariant to arbitrary Lie groups and their discrete subgroups.
arXiv Detail & Related papers (2020-12-20T11:02:49Z)