SymmetryLens: Unsupervised Symmetry Learning via Locality and Density Preservation
- URL: http://arxiv.org/abs/2410.05232v2
- Date: Fri, 04 Jul 2025 15:19:05 GMT
- Title: SymmetryLens: Unsupervised Symmetry Learning via Locality and Density Preservation
- Authors: Onur Efe, Arkadas Ozakin
- Abstract summary: We develop a new unsupervised symmetry learning method that starts with raw data and provides the minimal generator of an underlying Lie group of symmetries. The method is able to learn the pixel translation operator from a dataset with only an approximate translation symmetry. We demonstrate that this coupling between symmetry and locality, together with an optimization technique developed for entropy estimation, results in a stable system.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We develop a new unsupervised symmetry learning method that starts with raw data and provides the minimal generator of an underlying Lie group of symmetries, together with a symmetry-equivariant representation of the data, which turns the hidden symmetry into an explicit one. The method is able to learn the pixel translation operator from a dataset with only an approximate translation symmetry and can learn quite different types of symmetries that are not apparent to the naked eye. The method is based on the formulation of an information-theoretic loss function that measures both the degree of symmetry of a dataset under a candidate symmetry generator and a proposed notion of locality of the samples, which is coupled to symmetry. We demonstrate that this coupling between symmetry and locality, together with an optimization technique developed for entropy estimation, results in a stable system that provides reproducible results.
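For a rough sense of how such a setup could look, here is a minimal, hypothetical sketch, not the authors' implementation: a single candidate generator G is learned so that the one-parameter group exp(tG) approximately preserves the empirical distribution of a data batch. The paper's entropy-based symmetry measure is replaced below by a kernel two-sample surrogate, the locality term is only a placeholder for the paper's sample-locality notion, and all names, weights, and the norm constraint are assumptions.

```python
# Hypothetical sketch (not the paper's code): learn a generator G so that the
# one-parameter group exp(t*G) approximately preserves the data distribution.
import torch

def mmd(x, y, bandwidth=1.0):
    """Gaussian-kernel two-sample surrogate standing in for the entropy-based
    symmetry measure of the paper."""
    def k(a, b):
        return torch.exp(-torch.cdist(a, b) ** 2 / (2 * bandwidth ** 2)).mean()
    return k(x, x) + k(y, y) - 2 * k(x, y)

def locality(x_t, x):
    """Toy placeholder for the paper's sample-locality notion."""
    return ((x_t - x) ** 2).mean()

dim = 16                                              # toy data dimension
G = torch.nn.Parameter(torch.randn(dim, dim) * 0.01)  # candidate generator
opt = torch.optim.Adam([G], lr=1e-3)
data = torch.randn(4096, dim)                         # stand-in dataset

for step in range(2000):
    x = data[torch.randint(len(data), (256,))]
    t = torch.rand(x.shape[0], 1, 1) * 2 - 1          # random group parameters
    g = torch.matrix_exp(t * G)                       # exp(t*G), batched
    x_t = torch.einsum('bij,bj->bi', g, x)            # act on the samples
    loss = mmd(x_t, x) + 0.1 * locality(x_t, x)
    opt.zero_grad(); loss.backward(); opt.step()
    with torch.no_grad():                             # fix ||G|| so the trivial
        G *= dim ** 0.5 / G.norm()                    # solution G = 0 is excluded
```

The norm constraint on G is only a crude substitute for whatever excludes the trivial generator in the actual method.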
Related papers
- Learning Infinitesimal Generators of Continuous Symmetries from Data [15.42275880523356]
We propose a novel symmetry learning algorithm based on transformations defined with one-parameter groups.
Our method is built upon minimal inductive biases, encompassing not only commonly utilized symmetries rooted in Lie groups but also extending to symmetries derived from nonlinear generators.
arXiv Detail & Related papers (2024-10-29T08:28:23Z)
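As a hedged toy illustration of the one-parameter groups mentioned in the entry above (not that paper's method): a one-parameter group is the flow of a generator, so a linear generator v(x) = Ax yields the matrix group exp(tA), while a nonlinear generator has to be integrated numerically.

```python
# Hedged toy illustration: one-parameter groups as flows of a generator.
import numpy as np
from scipy.linalg import expm

A = np.array([[0.0, -1.0], [1.0, 0.0]])   # linear generator of 2-D rotations
x0 = np.array([1.0, 0.0])
print(expm(0.5 * A) @ x0)                 # rotate x0 by 0.5 radians

def flow(v, x, t, steps=1000):
    """Integrate dx/dt = v(x) with Euler steps: the one-parameter flow of v."""
    h = t / steps
    for _ in range(steps):
        x = x + h * v(x)
    return x

v = lambda x: np.array([-x[1], x[0] * (1.0 + x[1] ** 2)])  # a nonlinear generator
print(flow(v, x0, 0.5))
```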
- Symmetry Discovery for Different Data Types [52.2614860099811]
Equivariant neural networks incorporate symmetries into their architecture, achieving higher generalization performance.
We propose LieSD, a method for discovering symmetries via trained neural networks which approximate the input-output mappings of the tasks.
We validate the performance of LieSD on tasks with symmetries such as the two-body problem, the moment of inertia matrix prediction, and top quark tagging.
arXiv Detail & Related papers (2024-10-13T13:39:39Z)
- A Generative Model of Symmetry Transformations [44.87295754993983]
We build a generative model that explicitly aims to capture the data's approximate symmetries.
We empirically demonstrate its ability to capture symmetries under affine and color transformations.
arXiv Detail & Related papers (2024-03-04T11:32:18Z)
- Self-Supervised Detection of Perfect and Partial Input-Dependent Symmetries [11.54837584979607]
Group equivariance can overly constrain models if the symmetries in the group differ from those observed in data.
We propose a method able to detect the level of symmetry of each input without the need for labels.
Our framework is general enough to accommodate different families of both continuous and discrete symmetry distributions.
arXiv Detail & Related papers (2023-12-19T15:11:46Z)
- Learning Layer-wise Equivariances Automatically using Gradients [66.81218780702125]
Convolutions encode equivariance symmetries into neural networks, leading to better generalisation performance.
However, such symmetries provide fixed hard constraints on the functions a network can represent: they must be specified in advance and cannot be adapted.
Our goal is to allow flexible symmetry constraints that can automatically be learned from data using gradients.
arXiv Detail & Related papers (2023-10-09T20:22:43Z)
- Latent Space Symmetry Discovery [31.28537696897416]
We propose a novel generative model, Latent LieGAN, which can discover symmetries of nonlinear group actions.
We show that our model can express nonlinear symmetries under some conditions on the group action.
LaLiGAN also results in a well-structured latent space that is useful for downstream tasks including equation discovery and long-term forecasting.
arXiv Detail & Related papers (2023-09-29T19:33:01Z)
- Regularizing Towards Soft Equivariance Under Mixed Symmetries [23.603875905608565]
We present a regularizer-based method for building a model for a dataset with mixed approximate symmetries.
We show that our method achieves better accuracy than prior approaches while discovering the approximate symmetry levels correctly.
arXiv Detail & Related papers (2023-06-01T05:33:41Z)
- Oracle-Preserving Latent Flows [58.720142291102135]
We develop a methodology for the simultaneous discovery of multiple nontrivial continuous symmetries across an entire labelled dataset.
The symmetry transformations and the corresponding generators are modeled with fully connected neural networks trained with a specially constructed loss function.
The two new elements in this work are the use of a reduced-dimensionality latent space and the generalization to transformations invariant with respect to high-dimensional oracles.
arXiv Detail & Related papers (2023-02-02T00:13:32Z)
- Generative Adversarial Symmetry Discovery [19.098785309131458]
LieGAN represents symmetry as an interpretable Lie algebra basis and can discover various symmetries.
The learned symmetry can also be readily used in several existing equivariant neural networks to improve accuracy and generalization in prediction.
arXiv Detail & Related papers (2023-02-01T04:28:36Z)
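A hedged sketch of the "interpretable Lie algebra basis" idea from the entry above; the sampling scheme and all names below are assumptions, not LieGAN's actual code.

```python
# Hedged sketch: sample group elements from a learnable Lie algebra basis.
import torch

k, n = 2, 3                                          # basis size, data dimension
L = torch.nn.Parameter(torch.randn(k, n, n) * 0.1)   # learnable Lie algebra basis

def sample_group_elements(batch_size):
    w = torch.randn(batch_size, k)                   # random coefficients
    algebra = torch.einsum('bk,kij->bij', w, L)      # sum_i w_i * L_i
    return torch.matrix_exp(algebra)                 # group elements g = exp(.)

x = torch.randn(32, n)                               # a batch of data
gx = torch.einsum('bij,bj->bi', sample_group_elements(32), x)
# In an adversarial setup, a discriminator would be trained to tell x from gx,
# and L updated so that transformed samples stay on the data distribution,
# making the learned basis an interpretable description of the symmetry.
```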
- Deep Learning Symmetries and Their Lie Groups, Algebras, and Subalgebras from First Principles [55.41644538483948]
We design a deep-learning algorithm for the discovery and identification of the continuous group of symmetries present in a labeled dataset.
We use fully connected neural networks to model the symmetry transformations and the corresponding generators.
Our study also opens the door for using a machine learning approach in the mathematical study of Lie groups and their properties.
arXiv Detail & Related papers (2023-01-13T16:25:25Z)
- Machine-learning hidden symmetries [0.0]
We present an automated method for finding hidden symmetries, defined as symmetries that become manifest only in a new coordinate system that must be discovered.
Its core idea is to quantify asymmetry as violation of certain partial differential equations, and to numerically minimize such violation over the space of all invertible transformations, parametrized as invertible neural networks.
arXiv Detail & Related papers (2021-09-20T17:55:02Z)
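As a hedged illustration of the kind of PDE criterion the entry above refers to (the symbols are assumptions, not the paper's exact equations): a function H is manifestly translation-symmetric in a learned coordinate system y = φ(x) exactly when it does not depend on the first new coordinate, so asymmetry can be scored as the mean squared residual of that partial differential equation and minimized over invertible φ.

```latex
% Translation symmetry along y_1 is manifest iff \partial_{y_1}(H \circ \varphi^{-1}) = 0.
\ell(\varphi) \;=\; \mathbb{E}_{x}\Big[\big(\partial_{y_1}\,\tilde{H}(y)\big)^{2}\Big],
\qquad y = \varphi(x), \quad \tilde{H} = H \circ \varphi^{-1},
\qquad \ell(\varphi) = 0 \;\Longleftrightarrow\; \text{the symmetry is manifest in } y.
```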
- Asymptotic Analysis of an Ensemble of Randomly Projected Linear Discriminants [94.46276668068327]
In [1], an ensemble of randomly projected linear discriminants is used to classify datasets.
We develop a consistent estimator of the misclassification probability as an alternative to the computationally-costly cross-validation estimator.
We also demonstrate the use of our estimator for tuning the projection dimension on both real and synthetic data.
arXiv Detail & Related papers (2020-04-17T12:47:04Z)
- Inverse Learning of Symmetries [71.62109774068064]
We learn the symmetry transformation with a model consisting of two latent subspaces.
Our approach is based on the deep information bottleneck in combination with a continuous mutual information regulariser.
Our model outperforms state-of-the-art methods on artificial and molecular datasets.
arXiv Detail & Related papers (2020-02-07T13:48:52Z)