Learning Lie Group Symmetry Transformations with Neural Networks
- URL: http://arxiv.org/abs/2307.01583v1
- Date: Tue, 4 Jul 2023 09:23:24 GMT
- Title: Learning Lie Group Symmetry Transformations with Neural Networks
- Authors: Alex Gabel, Victoria Klein, Riccardo Valperga, Jeroen S. W. Lamb,
Kevin Webster, Rick Quax, Efstratios Gavves
- Abstract summary: This work focuses on discovering and characterizing unknown symmetries present in the dataset, namely, Lie group symmetry transformations.
Our goal is to characterize the transformation group and the distribution of the parameter values.
Results showcase the effectiveness of the approach in both settings: recovering the transformation group and recovering the distribution of its parameters.
- Score: 17.49001206996365
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The problem of detecting and quantifying the presence of symmetries in
datasets is useful for model selection, generative modeling, and data analysis,
amongst others. While existing methods for hard-coding transformations in
neural networks require prior knowledge of the symmetries of the task at hand,
this work focuses on discovering and characterizing unknown symmetries present
in the dataset, namely, Lie group symmetry transformations beyond the
traditional ones usually considered in the field (rotation, scaling, and
translation). Specifically, we consider a scenario in which a dataset has been
transformed by a one-parameter subgroup of transformations with different
parameter values for each data point. Our goal is to characterize the
transformation group and the distribution of the parameter values. The results
showcase the effectiveness of the approach in both these settings.
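To make the setting concrete, here is a minimal sketch (assuming numpy and scipy are available) that generates a toy dataset acted on by a one-parameter subgroup exp(θL) with a different parameter θ per data point, and then recovers a generator with a generic alternating least-squares fit. The shear generator, sample sizes, and fitting procedure are illustrative assumptions; the abstract does not describe the authors' actual network or training objective.

```python
# Minimal sketch of the problem setting, NOT the authors' method: data points are
# transformed by exp(theta_i * L) with a per-sample parameter theta_i, and we try
# to recover the generator L and the parameters from (x_i, y_i) pairs alone.
import numpy as np
from scipy.linalg import expm  # matrix exponential

rng = np.random.default_rng(0)
N = 256

# Ground-truth generator: a shear, i.e. a "non-traditional" one-parameter group.
# For this nilpotent L, exp(theta * L) = I + theta * L holds exactly.
L_true = np.array([[0.0, 1.0],
                   [0.0, 0.0]])

X = rng.normal(size=(N, 2))                    # original data
theta = rng.uniform(0.1, 1.0, size=N)          # per-sample parameters
Y = np.stack([expm(t * L_true) @ x for t, x in zip(theta, X)])  # observed data

# Generic recovery by alternating least squares on || t_i * L x_i - (y_i - x_i) ||^2.
# Note the inherent ambiguity: (L, theta) and (c * L, theta / c) fit equally well.
D = Y - X
t_hat = np.ones(N)                             # crude initialisation
for _ in range(50):
    # L-step: closed-form linear least squares for L given the current parameters.
    A = np.einsum('n,na,nb->ab', t_hat**2, X, X)
    B = np.einsum('n,na,nb->ab', t_hat, D, X)
    L_hat = B @ np.linalg.inv(A)
    # t-step: closed-form per-sample parameter given the current L.
    LX = X @ L_hat.T                           # rows are L_hat @ x_i
    t_hat = np.einsum('na,na->n', D, LX) / np.einsum('na,na->n', LX, LX)

print("fitted generator (proportional to L_true):\n", L_hat)
print("correlation between fitted and true parameters:",
      np.corrcoef(t_hat, theta)[0, 1])
```

In the paper's actual setting the transformations act on images and the model is a neural network, so the closed-form steps above would be replaced by learned components; the sketch only illustrates the identifiability question: the generator is recoverable up to scale, together with the distribution of the per-sample parameters.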
Related papers
- Group Crosscoders for Mechanistic Analysis of Symmetry [0.0]
Group crosscoders systematically discover and analyse symmetrical features in neural networks.
We show that group crosscoders can provide systematic insights into how neural networks represent symmetry.
arXiv Detail & Related papers (2024-10-31T17:47:01Z)
- Learning Infinitesimal Generators of Continuous Symmetries from Data [15.42275880523356]
We propose a novel symmetry learning algorithm based on transformations defined with one-parameter groups.
Our method is built upon minimal inductive biases, encompassing not only commonly utilized symmetries rooted in Lie groups but also extending to symmetries derived from nonlinear generators.
arXiv Detail & Related papers (2024-10-29T08:28:23Z)
- Symmetry Discovery for Different Data Types [52.2614860099811]
Equivariant neural networks incorporate symmetries into their architecture, achieving higher generalization performance.
We propose LieSD, a method for discovering symmetries via trained neural networks which approximate the input-output mappings of the tasks.
We validate the performance of LieSD on tasks with symmetries such as the two-body problem, the moment of inertia matrix prediction, and top quark tagging.
arXiv Detail & Related papers (2024-10-13T13:39:39Z)
- Relative Representations: Topological and Geometric Perspectives [53.88896255693922]
Relative representations are an established approach to zero-shot model stitching.
We introduce a normalization procedure in the relative transformation, resulting in invariance to non-isotropic rescalings and permutations.
We also propose to deploy topological densification when fine-tuning relative representations: a topological regularization loss that encourages clustering within classes.
arXiv Detail & Related papers (2024-09-17T08:09:22Z)
- The Empirical Impact of Neural Parameter Symmetries, or Lack Thereof [50.49582712378289]
We investigate the impact of neural parameter symmetries by introducing new neural network architectures.
We develop two methods, with some provable guarantees, of modifying standard neural networks to reduce parameter space symmetries.
Our experiments reveal several interesting observations on the empirical impact of parameter symmetries.
arXiv Detail & Related papers (2024-05-30T16:32:31Z)
- Oracle-Preserving Latent Flows [58.720142291102135]
We develop a methodology for the simultaneous discovery of multiple nontrivial continuous symmetries across an entire labelled dataset.
The symmetry transformations and the corresponding generators are modeled with fully connected neural networks trained with a specially constructed loss function.
The two new elements in this work are the use of a reduced-dimensionality latent space and the generalization to transformations invariant with respect to high-dimensional oracles.
arXiv Detail & Related papers (2023-02-02T00:13:32Z)
- LieGG: Studying Learned Lie Group Generators [1.5293427903448025]
Symmetries built into a neural network have proven beneficial for a wide range of tasks, since they do not have to be learned from data.
We present a method to extract symmetries learned by a neural network and to evaluate the degree to which a network is invariant to them; a generic numerical invariance check is sketched after this list.
arXiv Detail & Related papers (2022-10-09T20:42:37Z)
- Equivariant Mesh Attention Networks [10.517110532297021]
We present an attention-based architecture for mesh data that is provably equivariant to the local and global transformations considered in that work.
Our results confirm that our proposed architecture is equivariant, and therefore robust, to these local/global transformations.
arXiv Detail & Related papers (2022-05-21T19:53:14Z)
- Asymptotic Analysis of an Ensemble of Randomly Projected Linear Discriminants [94.46276668068327]
In [1], an ensemble of randomly projected linear discriminants is used to classify datasets.
We develop a consistent estimator of the misclassification probability as an alternative to the computationally-costly cross-validation estimator.
We also demonstrate the use of our estimator for tuning the projection dimension on both real and synthetic data.
arXiv Detail & Related papers (2020-04-17T12:47:04Z)
- Inverse Learning of Symmetries [71.62109774068064]
We learn the symmetry transformation with a model consisting of two latent subspaces.
Our approach is based on the deep information bottleneck in combination with a continuous mutual information regulariser.
Our model outperforms state-of-the-art methods on artificial and molecular datasets.
arXiv Detail & Related papers (2020-02-07T13:48:52Z)
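As referenced in the LieGG entry above, here is a generic numerical invariance check under stated assumptions (a toy rotation-invariant model and hand-picked candidate generators); it is not the LieGG procedure itself, only an illustration of what "degree of invariance to a symmetry" can mean operationally.

```python
# A generic numerical invariance check, in the spirit of (but not identical to)
# the LieGG entry above: measure how much a model's output changes when its
# input is pushed along a one-parameter group exp(t * G). The toy model and the
# candidate generators are illustrative assumptions.
import numpy as np
from scipy.linalg import expm

def invariance_gap(f, G, X, ts):
    """Mean absolute change of f(X) under x -> exp(t * G) x over the given ts."""
    base = f(X)
    return float(np.mean([np.abs(f(X @ expm(t * G).T) - base).mean() for t in ts]))

f = lambda X: (X ** 2).sum(axis=1)               # toy model: invariant to rotations
rot = np.array([[0.0, -1.0], [1.0, 0.0]])        # rotation generator
scale = np.eye(2)                                # scaling generator

X = np.random.default_rng(1).normal(size=(128, 2))
ts = np.linspace(-1.0, 1.0, 9)
print("rotation gap:", invariance_gap(f, rot, X, ts))  # ~0: symmetry present
print("scaling gap:", invariance_gap(f, scale, X, ts)) # clearly nonzero
```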
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences.