Representing and Learning Functions Invariant Under Crystallographic
Groups
- URL: http://arxiv.org/abs/2306.05261v1
- Date: Thu, 8 Jun 2023 15:02:04 GMT
- Title: Representing and Learning Functions Invariant Under Crystallographic
Groups
- Authors: Ryan P. Adams and Peter Orbanz
- Abstract summary: Crystallographic groups describe the symmetries of crystals and other repetitive structures encountered in nature and the sciences.
We derive linear and nonlinear representations of functions that are (1) smooth and (2) invariant under such a group.
We show that such a basis exists for each crystallographic group, that it is orthonormal in the relevant $L_2$ space, and recover the standard Fourier basis as a special case for pure shift groups.
- Score: 18.6870237776672
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Crystallographic groups describe the symmetries of crystals and other
repetitive structures encountered in nature and the sciences. These groups
include the wallpaper and space groups. We derive linear and nonlinear
representations of functions that are (1) smooth and (2) invariant under such a
group. The linear representation generalizes the Fourier basis to
crystallographically invariant basis functions. We show that such a basis
exists for each crystallographic group, that it is orthonormal in the relevant
$L_2$ space, and recover the standard Fourier basis as a special case for pure
shift groups. The nonlinear representation embeds the orbit space of the group
into a finite-dimensional Euclidean space. We show that such an embedding
exists for every crystallographic group, and that it factors functions through
a generalization of a manifold called an orbifold. We describe algorithms that,
given a standardized description of the group, compute the Fourier basis and an
embedding map. As examples, we construct crystallographically invariant neural
networks, kernel machines, and Gaussian processes.
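A minimal sketch of the central idea, invariance under a crystallographic group: a function is invariant if averaging it over the group's symmetry operations leaves it unchanged. The example below uses the wallpaper group p4 (unit-cell translations plus 90-degree rotations) acting on fractional coordinates; the base function `g` and all names are illustrative, not the paper's construction.

```python
import numpy as np

def p4_orbit(p):
    """The four rotation images of a point, in fractional (mod-1) coordinates.

    Reducing mod 1 accounts for the translation part of the group."""
    x, y = p
    return [np.mod([x, y], 1.0),
            np.mod([-y, x], 1.0),
            np.mod([-x, -y], 1.0),
            np.mod([y, -x], 1.0)]

def g(p):
    """Arbitrary smooth base function, periodic with period 1 in each coordinate."""
    x, y = p
    return np.cos(2 * np.pi * x) + 0.3 * np.sin(2 * np.pi * (x + 2 * y))

def f(p):
    """Group average of g over the p4 point operations: invariant under the group."""
    return np.mean([g(q) for q in p4_orbit(p)])
```

Because the average runs over a whole orbit, rotating or translating the input only reorders the terms of the mean, so `f` takes the same value on every point of an orbit: it is effectively a function on the orbit space.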
Related papers
- Algebras of actions in an agent's representations of the world [51.06229789727133]
We use our framework to reproduce the symmetry-based representations from the symmetry-based disentangled representation learning formalism.
We then study the algebras of the transformations of worlds with features that occur in simple reinforcement learning scenarios.
Using computational methods that we developed, we extract the algebras of the transformations of these worlds and classify them according to their properties.
arXiv Detail & Related papers (2023-10-02T18:24:51Z)
- Computing equivariant matrices on homogeneous spaces for Geometric Deep Learning and Automorphic Lie Algebras [0.0]
We compute equivariant maps from a homogeneous space $G/H$ of a Lie group $G$ to a module of this group.
This work has applications in the theoretical development of geometric deep learning and also in the theory of automorphic Lie algebras.
arXiv Detail & Related papers (2023-03-13T14:32:49Z)
- Fast computation of permutation equivariant layers with the partition algebra [0.0]
Linear neural network layers that are either equivariant or invariant to permutations of their inputs form core building blocks of modern deep learning architectures.
Examples include the layers of DeepSets, as well as linear layers occurring in attention blocks of transformers and some graph neural networks.
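The permutation-equivariant linear layers mentioned above have a very compact form; in DeepSets, for instance, each element is transformed identically while mixing in a pooled summary of the whole set. A minimal sketch (weights `a`, `b` are illustrative scalars, not the paper's parameterization):

```python
import numpy as np

def equivariant_layer(X, a=1.5, b=-0.5):
    """Permutation-equivariant linear layer in the DeepSets style.

    X: (n, d) array of n set elements; output has the same shape.
    Each row gets the same pointwise weight a plus a shared mean term b."""
    return a * X + b * X.mean(axis=0, keepdims=True)

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))
perm = rng.permutation(5)

# Equivariance: permuting the inputs permutes the outputs the same way.
assert np.allclose(equivariant_layer(X[perm]), equivariant_layer(X)[perm])
```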
arXiv Detail & Related papers (2023-03-10T21:13:12Z)
- Deep Learning Symmetries and Their Lie Groups, Algebras, and Subalgebras from First Principles [55.41644538483948]
We design a deep-learning algorithm for the discovery and identification of the continuous group of symmetries present in a labeled dataset.
We use fully connected neural networks to model the symmetry transformations and the corresponding generators.
Our study also opens the door for using a machine learning approach in the mathematical study of Lie groups and their properties.
arXiv Detail & Related papers (2023-01-13T16:25:25Z)
- Unified Fourier-based Kernel and Nonlinearity Design for Equivariant Networks on Homogeneous Spaces [52.424621227687894]
We introduce a unified framework for group equivariant networks on homogeneous spaces.
We take advantage of the sparsity of Fourier coefficients of the lifted feature fields.
We show that other methods treating features as the Fourier coefficients in the stabilizer subgroup are special cases of our activation.
arXiv Detail & Related papers (2022-06-16T17:59:01Z)
- A Practical Method for Constructing Equivariant Multilayer Perceptrons for Arbitrary Matrix Groups [115.58550697886987]
We provide a completely general algorithm for solving for the equivariant layers of matrix groups.
In addition to recovering solutions from other works as special cases, we construct multilayer perceptrons equivariant to multiple groups that have never been tackled before.
Our approach outperforms non-equivariant baselines, with applications to particle physics and dynamical systems.
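One way to make "solving for the equivariant layers of matrix groups" concrete: a linear layer W is equivariant when it commutes with every group generator, and vectorizing that commutator turns the condition into a linear system whose null space spans all equivariant layers. The sketch below (the cyclic-shift example and all names are illustrative, and this is a generic null-space construction rather than the paper's exact algorithm):

```python
import numpy as np

def equivariant_basis(generators, n, tol=1e-10):
    """Basis of all n x n matrices W with g @ W == W @ g for every generator g.

    Uses the row-major vec identities vec(A @ W) = kron(A, I) vec(W) and
    vec(W @ A) = kron(I, A.T) vec(W), then reads the null space off an SVD."""
    I = np.eye(n)
    C = np.vstack([np.kron(g, I) - np.kron(I, g.T) for g in generators])
    _, s, Vt = np.linalg.svd(C)
    null_rows = Vt[np.sum(s > tol):]   # rows spanning the null space of C
    return [v.reshape(n, n) for v in null_rows]

# C4 acting by cyclic shifts: the equivariant maps are the circulant matrices,
# a 4-dimensional space.
P = np.roll(np.eye(4), 1, axis=0)      # generator: cyclic permutation matrix
basis = equivariant_basis([P], 4)
```

The same routine works for any finite set of generators; for the cyclic-shift representation it recovers the classical fact that the commutant consists of circulant matrices.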
arXiv Detail & Related papers (2021-04-19T17:21:54Z)
- Manifold Topology, Observables and Gauge Group [0.0]
The relation between manifold topology, observables and gauge group is clarified.
The implications on the observability of the Permutation Group in Particle Statistics are discussed.
arXiv Detail & Related papers (2021-02-17T18:05:13Z)
- LieTransformer: Equivariant self-attention for Lie Groups [49.9625160479096]
Group equivariant neural networks are used as building blocks of group invariant neural networks.
We extend the scope of the literature to self-attention, which is emerging as a prominent building block of deep learning models.
We propose the LieTransformer, an architecture composed of LieSelfAttention layers that are equivariant to arbitrary Lie groups and their discrete subgroups.
arXiv Detail & Related papers (2020-12-20T11:02:49Z)
- Algebraic Neural Networks: Stability to Deformations [179.55535781816343]
We study algebraic neural networks (AlgNNs) with commutative algebras.
AlgNNs unify diverse architectures such as Euclidean convolutional neural networks, graph neural networks, and group neural networks.
arXiv Detail & Related papers (2020-09-03T03:41:38Z)
- Applying Lie Groups Approaches for Rigid Registration of Point Clouds [3.308743964406687]
We use Lie groups and Lie algebras to find the rigid transformation that best registers two surfaces represented by point clouds.
The so-called pairwise rigid registration problem can be formulated by comparing intrinsic second-order orientation tensors.
We show promising results when embedding orientation tensor fields in Lie algebras.
arXiv Detail & Related papers (2020-06-23T21:26:57Z)
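The Lie-algebra machinery such registration methods rely on can be sketched briefly: rotations are parameterized in so(3) and mapped to SO(3) with the exponential map (Rodrigues' formula), so the rigid transformation can be optimized in a flat vector space. A minimal, self-contained version (not the paper's tensor-based formulation):

```python
import numpy as np

def hat(w):
    """so(3) element (skew-symmetric matrix) for an axis-angle vector w."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def exp_so3(w):
    """Rodrigues' formula: the matrix exponential of hat(w) as a rotation."""
    theta = np.linalg.norm(w)
    if theta < 1e-12:
        return np.eye(3)
    K = hat(w / theta)
    return np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)

# A 90-degree rotation about the z-axis sends the x-axis to the y-axis.
R = exp_so3(np.array([0.0, 0.0, np.pi / 2]))
```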
This list is automatically generated from the titles and abstracts of the papers in this site.