Self-Supervised Detection of Perfect and Partial Input-Dependent Symmetries
- URL: http://arxiv.org/abs/2312.12223v4
- Date: Wed, 3 Jul 2024 07:15:51 GMT
- Title: Self-Supervised Detection of Perfect and Partial Input-Dependent Symmetries
- Authors: Alonso Urbano, David W. Romero
- Abstract summary: Group equivariance can overly constrain models if the symmetries in the group differ from those observed in data.
We propose a method able to detect the level of symmetry of each input without the need for labels.
Our framework is general enough to accommodate different families of both continuous and discrete symmetry distributions.
- Score: 11.54837584979607
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Group equivariance can overly constrain models if the symmetries in the group differ from those observed in data. While common methods address this by determining the appropriate level of symmetry at the dataset level, they are limited to supervised settings and ignore scenarios in which multiple levels of symmetry co-exist in the same dataset. In this paper, we propose a method able to detect the level of symmetry of each input without the need for labels. Our framework is general enough to accommodate different families of both continuous and discrete symmetry distributions, such as arbitrary unimodal, symmetric distributions and discrete groups. We validate the effectiveness of our approach on synthetic datasets with different per-class levels of symmetries, and demonstrate practical applications such as the detection of out-of-distribution symmetries. Our code is publicly available at https://github.com/aurban0/ssl-sym.
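As an illustrative sketch only (not the authors' actual method, which is detailed in the paper and repository above), the notion of a per-input "level of symmetry" can be made concrete by checking, for each input, how many group transformations leave a function's output unchanged. The function `symmetry_score` and the choice of the C4 rotation group are assumptions for this toy example.

```python
import numpy as np

def rotations(x):
    """Yield the four 90-degree rotations of a 2D array (the C4 group)."""
    for k in range(4):
        yield np.rot90(x, k)

def symmetry_score(f, x):
    """Fraction of C4 group elements under which f's output on x is unchanged.

    A per-input proxy for 'level of symmetry': 1.0 means f treats x as fully
    C4-invariant, lower values indicate partial or no invariance.
    """
    base = f(x)
    agree = [np.allclose(f(g), base) for g in rotations(x)]
    return sum(agree) / len(agree)

x = np.arange(16, dtype=float).reshape(4, 4)
# A fully rotation-invariant function scores 1.0 on any input,
# while a function reading a single corner pixel agrees only with itself.
assert symmetry_score(lambda a: a.sum(), x) == 1.0
assert symmetry_score(lambda a: a[0, 0], x) == 0.25
```

Computing such a score per input, rather than once per dataset, is what distinguishes input-dependent symmetry detection from dataset-level approaches.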
Related papers
- Level statistics detect generalized symmetries [0.0]
Level statistics are a useful probe for detecting symmetries and distinguishing integrable and non-integrable systems.
I consider non-invertible symmetries through the example of Kramers-Wannier duality at an Ising critical point.
Via level statistics, I discovered a $q$-deformed generalization of inversion that is interesting in its own right and may protect $q$-deformed SPT phases.
arXiv Detail & Related papers (2024-06-06T11:54:28Z) - Learning Layer-wise Equivariances Automatically using Gradients [66.81218780702125]
Convolutions encode equivariance symmetries into neural networks leading to better generalisation performance.
However, symmetries provide fixed hard constraints on the functions a network can represent: they need to be specified in advance and cannot be adapted.
Our goal is to allow flexible symmetry constraints that can automatically be learned from data using gradients.
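A minimal sketch of the idea of relaxing a hard symmetry constraint into a learnable one (the paper's actual gradient-based machinery is more involved; the interpolation below and its names are assumptions for illustration):

```python
import numpy as np

def effective_weights(w_shared, w_free, alpha):
    """Soft symmetry constraint: interpolate between a weight-tied
    (equivariant) parameterization and an unconstrained one.

    alpha = 1.0 recovers the hard constraint, alpha = 0.0 removes it;
    treating alpha as a trainable parameter lets gradient descent pick
    the constraint strength per layer instead of fixing it in advance.
    """
    return alpha * w_shared + (1.0 - alpha) * w_free

w_shared = np.full((3, 3), 0.5)  # e.g. a tied convolution kernel
w_free = np.random.default_rng(1).normal(size=(3, 3))
assert np.allclose(effective_weights(w_shared, w_free, 1.0), w_shared)
```

Because `alpha` is continuous, the degree of symmetry becomes just another parameter that the training signal can adjust layer by layer.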
arXiv Detail & Related papers (2023-10-09T20:22:43Z) - Latent Space Symmetry Discovery [31.28537696897416]
We propose a novel generative model, Latent LieGAN, which can discover symmetries of nonlinear group actions.
We show that our method can express any nonlinear symmetry under some conditions about the group action.
LaLiGAN also results in a well-structured latent space that is useful for downstream tasks including equation discovery and long-term forecasting.
arXiv Detail & Related papers (2023-09-29T19:33:01Z) - Learning Probabilistic Symmetrization for Architecture Agnostic Equivariance [16.49488981364657]
We present a novel framework to overcome the limitations of equivariant architectures in learning functions with group symmetries.
We use an arbitrary base model, such as an MLP or a transformer, and symmetrize it to be equivariant to the given group.
Empirical tests show competitive results against tailored equivariant architectures.
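Symmetrization in its simplest deterministic form averages a function's outputs over a group, which is invariant by construction; the paper instead samples group elements probabilistically to handle large or continuous groups. The following sketch, restricted to the C4 rotation group, is an assumption-laden special case for illustration:

```python
import numpy as np

def c4_symmetrize(f, x):
    """Make an arbitrary function invariant to 90-degree rotations by
    averaging its outputs over the C4 group. Rotating x merely permutes
    the four terms of the average, so the result cannot change.
    """
    return np.mean([f(np.rot90(x, k)) for k in range(4)], axis=0)

x = np.random.default_rng(0).normal(size=(4, 4))
f = lambda a: a[0, 0]  # reads one corner: not invariant on its own
# Invariance check: the symmetrized output is unchanged under rotation.
assert np.isclose(c4_symmetrize(f, np.rot90(x)), c4_symmetrize(f, x))
```

The appeal of this family of methods is that the base model `f` is architecture-agnostic: nothing about it needs to be equivariant before symmetrization.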
arXiv Detail & Related papers (2023-06-05T13:40:54Z) - Regularizing Towards Soft Equivariance Under Mixed Symmetries [23.603875905608565]
We present a regularizer-based method for building a model for a dataset with mixed approximate symmetries.
We show that our method achieves better accuracy than prior approaches while discovering the approximate symmetry levels correctly.
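A regularizer-based approach penalizes deviation from equivariance in the loss rather than hard-wiring it into the architecture, which naturally accommodates mixed or approximate symmetries. The penalty below is a generic sketch of that idea, not the paper's exact formulation:

```python
import numpy as np

def equivariance_penalty(f, x, group_ops):
    """Mean squared deviation of f from invariance under group_ops.

    Usable as a soft regularizer: minimize task_loss + lam * penalty,
    so the optimizer trades off fit against symmetry instead of being
    forced to respect the symmetry exactly.
    """
    base = f(x)
    return float(np.mean([(f(g(x)) - base) ** 2 for g in group_ops]))

ops = [lambda a, k=k: np.rot90(a, k) for k in range(4)]  # the C4 group
x = np.ones((3, 3))
# A rotation-invariant function incurs zero penalty.
assert equivariance_penalty(lambda a: a.sum(), x, ops) == 0.0
```

Setting the weight `lam` per symmetry (or learning it) is what lets a single model handle datasets where different approximate symmetries coexist.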
arXiv Detail & Related papers (2023-06-01T05:33:41Z) - Oracle-Preserving Latent Flows [58.720142291102135]
We develop a methodology for the simultaneous discovery of multiple nontrivial continuous symmetries across an entire labelled dataset.
The symmetry transformations and the corresponding generators are modeled with fully connected neural networks trained with a specially constructed loss function.
The two new elements in this work are the use of a reduced-dimensionality latent space and the generalization to transformations invariant with respect to high-dimensional oracles.
arXiv Detail & Related papers (2023-02-02T00:13:32Z) - Deep Learning Symmetries and Their Lie Groups, Algebras, and Subalgebras from First Principles [55.41644538483948]
We design a deep-learning algorithm for the discovery and identification of the continuous group of symmetries present in a labeled dataset.
We use fully connected neural networks to model the symmetry transformations and the corresponding generators.
Our study also opens the door for using a machine learning approach in the mathematical study of Lie groups and their properties.
arXiv Detail & Related papers (2023-01-13T16:25:25Z) - Multi-Agent MDP Homomorphic Networks [100.74260120972863]
In cooperative multi-agent systems, complex symmetries arise between different configurations of the agents and their local observations.
Existing work on symmetries in single agent reinforcement learning can only be generalized to the fully centralized setting.
This paper introduces Multi-Agent MDP Homomorphic Networks, a class of networks that allows distributed execution using only local information.
arXiv Detail & Related papers (2021-10-09T07:46:25Z) - Sampling asymmetric open quantum systems for artificial neural networks [77.34726150561087]
We present a hybrid sampling strategy which takes asymmetric properties explicitly into account, achieving fast convergence times and high scalability for asymmetric open systems.
We highlight the universal applicability of artificial neural networks to asymmetric open quantum systems.
arXiv Detail & Related papers (2020-12-20T18:25:29Z) - Asymptotic Analysis of an Ensemble of Randomly Projected Linear Discriminants [94.46276668068327]
In [1], an ensemble of randomly projected linear discriminants is used to classify datasets.
We develop a consistent estimator of the misclassification probability as an alternative to the computationally-costly cross-validation estimator.
We also demonstrate the use of our estimator for tuning the projection dimension on both real and synthetic data.
arXiv Detail & Related papers (2020-04-17T12:47:04Z) - Detecting Symmetries with Neural Networks [0.0]
We make extensive use of the structure in the embedding layer of the neural network.
We identify whether a symmetry is present and, if so, identify the orbits of the symmetry in the input.
For this example we present a novel data representation in terms of graphs.
arXiv Detail & Related papers (2020-03-30T17:58:24Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it contains and is not responsible for any consequences of its use.