Detecting Symmetries with Neural Networks
- URL: http://arxiv.org/abs/2003.13679v1
- Date: Mon, 30 Mar 2020 17:58:24 GMT
- Title: Detecting Symmetries with Neural Networks
- Authors: Sven Krippendorf, Marc Syvaeri
- Abstract summary: We make extensive use of the structure in the embedding layer of the neural network.
This structure allows us to identify whether a symmetry is present and to find the orbits of the symmetry in the input.
For the example of classifying complete intersection Calabi-Yau manifolds, we present a novel data representation in terms of graphs.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Identifying symmetries in data sets is generally difficult, but knowledge
about them is crucial for efficient data handling. Here we present a method that
uses neural networks to identify symmetries. We make extensive use of the
structure in the embedding layer of the neural network, which allows us to
identify whether a symmetry is present and to find the orbits of the symmetry in
the input. To determine which continuous or discrete symmetry group is present,
we analyse the invariant orbits in the input. We present examples based on the
rotation groups $SO(n)$ and the unitary group $SU(2)$. Further, we find that
this method is useful for the classification of complete intersection
Calabi-Yau manifolds, where it is crucial to identify discrete symmetries on the
input space. For this example, we present a novel data representation in terms
of graphs.
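As a minimal sketch of this idea (illustrative only, not the authors' implementation): we train a small numpy regressor on a rotation-invariant toy target and then test whether inputs lying on the same $SO(2)$ orbit (a circle) land close together in the hidden embedding layer. The target $f(x)=|x|^2$, the network sizes, and the spread diagnostic are assumptions made for this example.

```python
# Hedged sketch of the embedding-layer diagnostic; not the paper's code.
# Train on a rotation-invariant target, then compare embedding distances
# within one SO(2) orbit (fixed radius) to the distance between orbits.
import numpy as np

rng = np.random.default_rng(0)

X = rng.uniform(-1.0, 1.0, size=(2000, 2))      # 2D inputs
y = (X ** 2).sum(axis=1, keepdims=True)         # invariant target f(x) = |x|^2

W1 = rng.normal(0.0, 0.5, (2, 32)); b1 = np.zeros(32)
W2 = rng.normal(0.0, 0.5, (32, 1)); b2 = np.zeros(1)

def forward(x):
    h = np.tanh(x @ W1 + b1)                    # the "embedding" layer
    return h, h @ W2 + b2

for _ in range(3000):                           # plain full-batch gradient descent
    h, pred = forward(X)
    err = (pred - y) / len(X)                   # gradient of 0.5 * MSE w.r.t. pred
    gW2, gb2 = h.T @ err, err.sum(axis=0)
    dh = (err @ W2.T) * (1.0 - h ** 2)
    gW1, gb1 = X.T @ dh, dh.sum(axis=0)
    W1 -= 0.5 * gW1; b1 -= 0.5 * gb1; W2 -= 0.5 * gW2; b2 -= 0.5 * gb2

def orbit(radius, n=64):                        # one SO(2) orbit: a circle
    t = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
    return radius * np.stack([np.cos(t), np.sin(t)], axis=1)

e_a, _ = forward(orbit(0.5))
e_b, _ = forward(orbit(0.9))
within = np.linalg.norm(e_a - e_a.mean(axis=0), axis=1).mean()
between = np.linalg.norm(e_a.mean(axis=0) - e_b.mean(axis=0))
print(f"within-orbit spread {within:.3f} vs between-orbit distance {between:.3f}")
```

If the within-orbit spread is small relative to the distance between orbits, same-orbit inputs cluster in the embedding, which is the kind of structure the paper uses to flag a symmetry and to read off its orbits.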
Related papers
- Learning Infinitesimal Generators of Continuous Symmetries from Data [15.42275880523356]
We propose a novel symmetry learning algorithm based on transformations defined with one-parameter groups.
Our method is built upon minimal inductive biases, covering not only the commonly used symmetries rooted in Lie groups but also symmetries derived from nonlinear generators. A toy recovery of a linear generator from an invariance condition is sketched below.
arXiv Detail & Related papers (2024-10-29T08:28:23Z)
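As a toy illustration of the one-parameter-group idea above (a sketch under simplifying assumptions, not the paper's algorithm): if $f$ is invariant under $x \mapsto e^{tL}x$, differentiating at $t=0$ gives the linear condition $\nabla f(x)^T (Lx) = 0$, so a linear generator can be recovered from samples as the null direction of a least-squares system. The oracle $f(x)=|x|^2$ is assumed here; the recovered $L$ should be proportional to the $so(2)$ rotation generator.

```python
# Hedged sketch: recover an infinitesimal generator L from the invariance
# condition grad f(x)^T (L x) = 0, the linearisation of f(exp(tL) x) = f(x).
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 2))                 # sample points
G = 2.0 * X                                   # grad f for the toy oracle f(x) = |x|^2

# Each sample yields one linear equation in the entries of L:
#   sum_ij g_i L_ij x_j = 0  ->  row = outer(g, x).ravel()
A = np.einsum('ni,nj->nij', G, X).reshape(len(X), 4)

# The best unit-norm solution is the right singular vector belonging to the
# smallest singular value (the unit-norm constraint rules out L = 0).
_, s, Vt = np.linalg.svd(A, full_matrices=False)
L = Vt[-1].reshape(2, 2)
print("residual singular value:", s[-1])      # ~0 signals an exact symmetry
print("recovered generator:\n", L)            # proportional to [[0, 1], [-1, 0]]
```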
- Symmetry Discovery for Different Data Types [52.2614860099811]
Equivariant neural networks incorporate symmetries into their architecture, achieving higher generalization performance.
We propose LieSD, a method for discovering symmetries via trained neural networks which approximate the input-output mappings of the tasks.
We validate the performance of LieSD on tasks with symmetries such as the two-body problem, the moment of inertia matrix prediction, and top quark tagging.
arXiv Detail & Related papers (2024-10-13T13:39:39Z)
- The Empirical Impact of Neural Parameter Symmetries, or Lack Thereof [50.49582712378289]
We investigate the impact of neural parameter symmetries by introducing new neural network architectures.
We develop two methods, with some provable guarantees, for modifying standard neural networks to reduce parameter-space symmetries.
Our experiments reveal several interesting observations on the empirical impact of parameter symmetries; a toy example of such a symmetry is sketched after this entry.
arXiv Detail & Related papers (2024-05-30T16:32:31Z)
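For context, the kind of parameter-space symmetry studied above can be demonstrated in a few lines (an illustrative sketch, not code from the paper): permuting the hidden units of a one-hidden-layer MLP, together with the matching columns and rows of its weight matrices, changes the parameters but not the function the network computes.

```python
# Sketch of a classic parameter symmetry: permuting hidden units (and the
# corresponding weights) leaves the network's input-output map unchanged.
import numpy as np

rng = np.random.default_rng(2)
W1, b1 = rng.normal(size=(3, 8)), rng.normal(size=8)
W2, b2 = rng.normal(size=(8, 1)), rng.normal(size=1)

def mlp(x, W1, b1, W2, b2):
    return np.tanh(x @ W1 + b1) @ W2 + b2

perm = rng.permutation(8)                        # relabel the 8 hidden units
W1p, b1p, W2p = W1[:, perm], b1[perm], W2[perm, :]

x = rng.normal(size=(5, 3))
diff = mlp(x, W1, b1, W2, b2) - mlp(x, W1p, b1p, W2p, b2)
print("max difference:", np.abs(diff).max())     # ~0: same function, new parameters
```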
- Enhancing lattice kinetic schemes for fluid dynamics with Lattice-Equivariant Neural Networks [79.16635054977068]
We present a new class of equivariant neural networks, dubbed Lattice-Equivariant Neural Networks (LENNs).
Our approach develops within a recently introduced framework aimed at learning neural-network-based surrogate models of Lattice Boltzmann collision operators.
Our work opens the way towards practical utilization of machine-learning-augmented Lattice Boltzmann CFD in real-world simulations.
arXiv Detail & Related papers (2024-05-22T17:23:15Z)
- Self-Supervised Detection of Perfect and Partial Input-Dependent Symmetries [11.54837584979607]
Group equivariance can overly constrain models if the symmetries in the group differ from those observed in data.
We propose a method able to detect the level of symmetry of each input without the need for labels.
Our framework is general enough to accommodate different families of both continuous and discrete symmetry distributions.
arXiv Detail & Related papers (2023-12-19T15:11:46Z)
- Learning Layer-wise Equivariances Automatically using Gradients [66.81218780702125]
Convolutions encode equivariance symmetries into neural networks, leading to better generalisation performance.
However, symmetries provide fixed hard constraints on the functions a network can represent, need to be specified in advance, and cannot be adapted.
Our goal is to allow flexible symmetry constraints that can automatically be learned from data using gradients.
arXiv Detail & Related papers (2023-10-09T20:22:43Z)
- Latent Space Symmetry Discovery [31.28537696897416]
We propose a novel generative model, Latent LieGAN, which can discover symmetries of nonlinear group actions.
We show that our model can express nonlinear symmetries under certain conditions on the group action.
LaLiGAN also results in a well-structured latent space that is useful for downstream tasks including equation discovery and long-term forecasting.
arXiv Detail & Related papers (2023-09-29T19:33:01Z)
- Approximately Equivariant Graph Networks [14.312312714312046]
Graph neural networks (GNNs) are commonly described as being permutation equivariant with respect to node relabeling in the graph.
We focus on the active symmetries of GNNs by considering a learning setting where signals are supported on a fixed graph. A minimal numerical check of the passive permutation equivariance is sketched below.
arXiv Detail & Related papers (2023-08-21T03:13:38Z)
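The permutation equivariance mentioned above can be verified numerically (a generic sketch, not this paper's construction): a single message-passing layer $H \mapsto \tanh(AHW)$ commutes with any relabeling of the nodes.

```python
# Sketch: a message-passing layer is permutation equivariant, i.e. relabeling
# the nodes of the input (P A P^T, P H) permutes the output by the same P.
import numpy as np

rng = np.random.default_rng(3)
n, d = 6, 4
A = np.triu(rng.integers(0, 2, size=(n, n)), 1)  # random undirected graph
A = A + A.T
H = rng.normal(size=(n, d))                      # node features
W = rng.normal(size=(d, d))

def layer(A, H):
    return np.tanh(A @ H @ W)

P = np.eye(n)[rng.permutation(n)]                # permutation matrix
lhs = layer(P @ A @ P.T, P @ H)                  # relabel first, then propagate
rhs = P @ layer(A, H)                            # propagate first, then relabel
print("equivariance gap:", np.abs(lhs - rhs).max())   # ~0 up to float error
```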
- Oracle-Preserving Latent Flows [58.720142291102135]
We develop a methodology for the simultaneous discovery of multiple nontrivial continuous symmetries across an entire labelled dataset.
The symmetry transformations and the corresponding generators are modeled with fully connected neural networks trained with a specially constructed loss function.
The two new elements in this work are the use of a reduced-dimensionality latent space and the generalization to transformations invariant with respect to high-dimensional oracles.
arXiv Detail & Related papers (2023-02-02T00:13:32Z)
- Deep Learning Symmetries and Their Lie Groups, Algebras, and Subalgebras from First Principles [55.41644538483948]
We design a deep-learning algorithm for the discovery and identification of the continuous group of symmetries present in a labeled dataset.
We use fully connected neural networks to model the symmetry transformations and the corresponding generators.
Our study also opens the door for using a machine learning approach in the mathematical study of Lie groups and their properties.
arXiv Detail & Related papers (2023-01-13T16:25:25Z)
- LieGG: Studying Learned Lie Group Generators [1.5293427903448025]
Symmetries built into a neural network have proven to be very beneficial for a wide range of tasks, as they remove the need to learn them from data.
We present a method to extract symmetries learned by a neural network and to evaluate the degree to which the network is invariant to them. A toy invariance check in this spirit is sketched below.
arXiv Detail & Related papers (2022-10-09T20:42:37Z)
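In the spirit of evaluating how invariant a network actually is (a hedged sketch only; LieGG itself works with the network's gradients and learned generators): one can quantify an empirical invariance error by comparing outputs on inputs and on their rotated copies. The stand-in "network" f below is an assumption, chosen to be exactly invariant so the error is near zero.

```python
# Hedged sketch (not LieGG itself): measure the degree of rotational
# invariance of a model by comparing f(x) with f(R x) over sampled angles.
import numpy as np

rng = np.random.default_rng(4)

def f(x):                                   # stand-in "trained network",
    return np.sin((x ** 2).sum(axis=1))     # exactly SO(2)-invariant by design

def rot(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

X = rng.normal(size=(1000, 2))
errs = [np.abs(f(X @ rot(t).T) - f(X)).max() for t in np.linspace(0.0, np.pi, 8)]
print("max invariance error:", max(errs))   # larger values quantify how far
                                            # a network is from invariant
```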
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.