Learning Symmetrization for Equivariance with Orbit Distance Minimization
- URL: http://arxiv.org/abs/2311.07143v1
- Date: Mon, 13 Nov 2023 08:14:29 GMT
- Title: Learning Symmetrization for Equivariance with Orbit Distance Minimization
- Authors: Tien Dat Nguyen, Jinwoo Kim, Hongseok Yang, Seunghoon Hong
- Abstract summary: We present a framework for symmetrizing an arbitrary neural-network architecture and making it equivariant with respect to a given group.
We experimentally show our method's competitiveness on the SO(2) image classification task.
- Score: 27.284125807569115
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We present a general framework for symmetrizing an arbitrary neural-network
architecture and making it equivariant with respect to a given group. We build
upon the proposals of Kim et al. (2023); Kaba et al. (2023) for symmetrization,
and improve them by replacing their conversion of neural features into group
representations, with an optimization whose loss intuitively measures the
distance between group orbits. This change makes our approach applicable to a
broader range of matrix groups, such as the Lorentz group O(1, 3), than these
two proposals. We experimentally show our method's competitiveness on the SO(2)
image classification task, and also its increased generality on the task with
O(1, 3). Our implementation will be made accessible at
https://github.com/tiendatnguyen-vision/Orbit-symmetrize.
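The abstract does not spell out the orbit-distance optimization itself, but the symmetrization idea it builds on (Kim et al., 2023; Kaba et al., 2023) is to make an arbitrary base model equivariant by averaging g · f(g⁻¹x) over the group. A minimal illustrative sketch for SO(2) acting on R², where the base model, grid size, and function names are assumptions for demonstration, not the paper's implementation:

```python
import numpy as np

def rotation(theta):
    """2x2 rotation matrix for an SO(2) element."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

def base_model(x):
    """Arbitrary non-equivariant map R^2 -> R^2 (stand-in for a neural net)."""
    return np.tanh(x) + np.array([0.5, -0.3]) * x[0]

def symmetrize(f, x, n=256):
    """Group averaging over SO(2): (1/n) * sum_i g_i f(g_i^{-1} x),
    a discretization of the Reynolds operator E_g[g . f(g^{-1} x)]."""
    thetas = np.linspace(0.0, 2 * np.pi, n, endpoint=False)
    out = np.zeros(2)
    for t in thetas:
        g = rotation(t)
        out += g @ f(g.T @ x)  # g^{-1} = g^T for rotations
    return out / n
```

Averaging over a dense grid of group elements is the textbook route and only works for small or low-dimensional groups; the symmetrization papers listed here replace this exhaustive average with learned or optimized group elements to keep the cost manageable.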
Related papers
- Learning Symmetries via Weight-Sharing with Doubly Stochastic Tensors [46.59269589647962]
Group equivariance has emerged as a valuable inductive bias in deep learning.
Group equivariant methods require the groups of interest to be known beforehand.
We show that when the dataset exhibits strong symmetries, the permutation matrices will converge to regular group representations.
arXiv Detail & Related papers (2024-12-05T20:15:34Z)
- Optimal Symmetries in Binary Classification [0.0]
We show that selecting the appropriate group symmetries is crucial for optimising generalisation and sample efficiency.
We develop a theoretical foundation for designing group equivariant neural networks that align the choice of symmetries with the underlying probability distributions of the data.
arXiv Detail & Related papers (2024-08-16T16:15:18Z)
- Equivariance via Minimal Frame Averaging for More Symmetries and Efficiency [48.81897136561015]
Minimal Frame Averaging (MFA) is a mathematical framework for constructing provably minimal frames that are exactly equivariant.
Results demonstrate the efficiency and effectiveness of encoding symmetries via MFA across a diverse range of tasks.
arXiv Detail & Related papers (2024-06-11T15:58:56Z)
- Enhancing lattice kinetic schemes for fluid dynamics with Lattice-Equivariant Neural Networks [79.16635054977068]
We present a new class of equivariant neural networks, dubbed Lattice-Equivariant Neural Networks (LENNs).
Our approach develops within a recently introduced framework aimed at learning neural-network-based surrogate models of Lattice Boltzmann collision operators.
Our work opens the way towards practical utilization of machine-learning-augmented Lattice Boltzmann CFD in real-world simulations.
arXiv Detail & Related papers (2024-05-22T17:23:15Z)
- Learning Probabilistic Symmetrization for Architecture Agnostic Equivariance [16.49488981364657]
We present a novel framework to overcome the limitations of equivariant architectures in learning functions with group symmetries.
We use an arbitrary base model such as an MLP or a transformer and symmetrize it to be equivariant to the given group.
Empirical tests show competitive results against tailored equivariant architectures.
arXiv Detail & Related papers (2023-06-05T13:40:54Z)
- Deep Learning Symmetries and Their Lie Groups, Algebras, and Subalgebras from First Principles [55.41644538483948]
We design a deep-learning algorithm for the discovery and identification of the continuous group of symmetries present in a labeled dataset.
We use fully connected neural networks to model the symmetry transformations and the corresponding generators.
Our study also opens the door for using a machine learning approach in the mathematical study of Lie groups and their properties.
arXiv Detail & Related papers (2023-01-13T16:25:25Z)
- Semi-Supervised Subspace Clustering via Tensor Low-Rank Representation [64.49871502193477]
We propose a novel semi-supervised subspace clustering method, which is able to simultaneously augment the initial supervisory information and construct a discriminative affinity matrix.
Comprehensive experimental results on six commonly-used benchmark datasets demonstrate the superiority of our method over state-of-the-art methods.
arXiv Detail & Related papers (2022-05-21T01:47:17Z)
- GeomNet: A Neural Network Based on Riemannian Geometries of SPD Matrix Space and Cholesky Space for 3D Skeleton-Based Interaction Recognition [2.817412580574242]
We propose a novel method for representation and classification of two-person interactions from 3D skeleton sequences.
We show that the proposed method achieves competitive results in two-person interaction recognition on three benchmarks for 3D human activity understanding.
arXiv Detail & Related papers (2021-11-25T13:57:43Z)
- Frame Averaging for Invariant and Equivariant Network Design [50.87023773850824]
We introduce Frame Averaging (FA), a framework for adapting known (backbone) architectures to become invariant or equivariant to new symmetry types.
We show that FA-based models have maximal expressive power in a broad setting.
We propose a new class of universal Graph Neural Networks (GNNs), universal Euclidean motion invariant point cloud networks, and Euclidean motion invariant Message Passing (MP) GNNs.
arXiv Detail & Related papers (2021-10-07T11:05:23Z)
- A Practical Method for Constructing Equivariant Multilayer Perceptrons for Arbitrary Matrix Groups [115.58550697886987]
We provide a completely general algorithm for solving for the equivariant layers of matrix groups.
In addition to recovering solutions from other works as special cases, we construct multilayer perceptrons equivariant to multiple groups that have never been tackled before.
Our approach outperforms non-equivariant baselines, with applications to particle physics and dynamical systems.
arXiv Detail & Related papers (2021-04-19T17:21:54Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.