Equivariant Representation Learning in the Presence of Stabilizers
- URL: http://arxiv.org/abs/2301.05231v2
- Date: Sat, 16 Sep 2023 20:56:32 GMT
- Title: Equivariant Representation Learning in the Presence of Stabilizers
- Authors: Luis Armando Pérez Rey, Giovanni Luca Marchetti, Danica Kragic,
Dmitri Jarnikov, Mike Holenderski
- Abstract summary: EquIN is suitable for group actions that are not free, i.e., that stabilize data via nontrivial symmetries.
EquIN is theoretically grounded in the orbit-stabilizer theorem from group theory.
- Score: 13.11108596589607
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We introduce Equivariant Isomorphic Networks (EquIN) -- a method for learning
representations that are equivariant with respect to general group actions over
data. Differently from existing equivariant representation learners, EquIN is
suitable for group actions that are not free, i.e., that stabilize data via
nontrivial symmetries. EquIN is theoretically grounded in the orbit-stabilizer
theorem from group theory. This guarantees that an ideal learner infers
isomorphic representations while trained on equivariance alone and thus fully
extracts the geometric structure of data. We provide an empirical investigation
on image datasets with rotational symmetries and show that taking stabilizers
into account improves the quality of the representations.
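The orbit-stabilizer theorem that grounds EquIN can be illustrated with a minimal sketch (not the paper's implementation): the cyclic group C4 of 90-degree rotations acting on a small image that is invariant under 180-degree rotation, so the action is not free. The pattern and variable names below are illustrative assumptions.

```python
# Minimal sketch of the orbit-stabilizer theorem behind EquIN.
# C4 = rotations by multiples of 90 degrees, acting on a 2x2 pattern
# that is symmetric under 180-degree rotation (a nontrivial stabilizer).
import numpy as np

G = [0, 1, 2, 3]  # group elements: rotation by k * 90 degrees

# Pattern with a nontrivial stabilizer: invariant under 180-degree rotation,
# so C4 does not act freely on it.
x = np.array([[1, 0],
              [0, 1]])

def act(k, img):
    """Apply the rotation by k * 90 degrees to an image."""
    return np.rot90(img, k)

# Orbit: distinct images reachable from x under the group action.
orbit = {act(k, x).tobytes() for k in G}

# Stabilizer: group elements that leave x unchanged.
stabilizer = [k for k in G if np.array_equal(act(k, x), x)]

# Orbit-stabilizer theorem: |G| = |orbit| * |stabilizer|.
assert len(G) == len(orbit) * len(stabilizer)
print(len(orbit), len(stabilizer))  # 2 distinct images, stabilizer {0, 2}
```

Because the stabilizer here is {0, 2} rather than just the identity, a learner that assumes a free action would conflate the two rotations that fix the image; accounting for the stabilizer is exactly what EquIN adds.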
Related papers
- Equivariant score-based generative models provably learn distributions with symmetries efficiently [7.90752151686317]
Empirical studies have demonstrated that incorporating symmetries into generative models can provide better generalization and sampling efficiency.
We provide the first theoretical analysis and guarantees of score-based generative models (SGMs) for learning distributions that are invariant with respect to some group symmetry.
arXiv Detail & Related papers (2024-10-02T05:14:28Z)
- Algebras of actions in an agent's representations of the world [51.06229789727133]
We use our framework to reproduce the symmetry-based representations from the symmetry-based disentangled representation learning formalism.
We then study the algebras of the transformations of worlds with features that occur in simple reinforcement learning scenarios.
Using computational methods that we developed, we extract the algebras of the transformations of these worlds and classify them according to their properties.
arXiv Detail & Related papers (2023-10-02T18:24:51Z)
- Evaluating the Robustness of Interpretability Methods through Explanation Invariance and Equivariance [72.50214227616728]
Interpretability methods are valuable only if their explanations faithfully describe the explained model.
We consider neural networks whose predictions are invariant under a specific symmetry group.
arXiv Detail & Related papers (2023-04-13T17:59:03Z)
- Equivariant Disentangled Transformation for Domain Generalization under Combination Shift [91.38796390449504]
Combinations of domains and labels are not observed during training but appear in the test environment.
We provide a unique formulation of the combination shift problem based on the concepts of homomorphism, equivariance, and a refined definition of disentanglement.
arXiv Detail & Related papers (2022-08-03T12:31:31Z)
- Equivariant Representation Learning via Class-Pose Decomposition [17.032782230538388]
We introduce a general method for learning representations that are equivariant to symmetries of data.
The learned components semantically correspond to intrinsic data classes and poses, respectively.
Results show that our representations capture the geometry of data and outperform other equivariant representation learning frameworks.
arXiv Detail & Related papers (2022-07-07T06:55:52Z)
- Unified Fourier-based Kernel and Nonlinearity Design for Equivariant Networks on Homogeneous Spaces [52.424621227687894]
We introduce a unified framework for group equivariant networks on homogeneous spaces.
We take advantage of the sparsity of Fourier coefficients of the lifted feature fields.
We show that other methods treating features as the Fourier coefficients in the stabilizer subgroup are special cases of our activation.
arXiv Detail & Related papers (2022-06-16T17:59:01Z)
- Equivariant Mesh Attention Networks [10.517110532297021]
We present an attention-based architecture for mesh data that is provably equivariant to the relevant local and global transformations.
Our results confirm that our proposed architecture is equivariant, and therefore robust, to these transformations.
arXiv Detail & Related papers (2022-05-21T19:53:14Z)
- Learning Symmetric Embeddings for Equivariant World Models [9.781637768189158]
We propose learning symmetric embedding networks (SENs) that encode an input space (e.g., images) and can be trained end-to-end with an equivariant task network to learn an explicitly symmetric representation.
Our experiments demonstrate that SENs facilitate the application of equivariant networks to data with complex symmetry representations.
arXiv Detail & Related papers (2022-04-24T22:31:52Z)
- Commutative Lie Group VAE for Disentanglement Learning [96.32813624341833]
We view disentanglement learning as discovering an underlying structure that equivariantly reflects the factorized variations shown in data.
A simple model named Commutative Lie Group VAE is introduced to realize the group-based disentanglement learning.
Experiments show that our model can effectively learn disentangled representations without supervision, and can achieve state-of-the-art performance without extra constraints.
arXiv Detail & Related papers (2021-06-07T07:03:14Z)
- Group Equivariant Conditional Neural Processes [30.134634059773703]
We present the group equivariant conditional neural process (EquivCNP).
We show that EquivCNP achieves comparable performance to conventional conditional neural processes in a 1D regression task.
arXiv Detail & Related papers (2021-02-17T13:50:07Z)
- Generalizing Convolutional Neural Networks for Equivariance to Lie Groups on Arbitrary Continuous Data [52.78581260260455]
We propose a general method to construct a convolutional layer that is equivariant to transformations from any specified Lie group.
We apply the same model architecture to images, ball-and-stick molecular data, and Hamiltonian dynamical systems.
arXiv Detail & Related papers (2020-02-25T17:40:38Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.