A Unified Framework for Discovering Discrete Symmetries
- URL: http://arxiv.org/abs/2309.02898v2
- Date: Fri, 27 Oct 2023 09:24:20 GMT
- Title: A Unified Framework for Discovering Discrete Symmetries
- Authors: Pavan Karjol, Rohan Kashyap, Aditya Gopalan, Prathosh A.P
- Abstract summary: We consider the problem of learning a function respecting a symmetry from among a class of symmetries.
We develop a unified framework that enables symmetry discovery across a broad range of subgroups.
- Score: 17.687122467264487
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We consider the problem of learning a function respecting a symmetry from
among a class of symmetries. We develop a unified framework that enables
symmetry discovery across a broad range of subgroups including locally
symmetric, dihedral and cyclic subgroups. At the core of the framework is a
novel architecture composed of linear, matrix-valued and non-linear functions
that expresses functions invariant to these subgroups in a principled manner.
The structure of the architecture enables us to leverage multi-armed bandit
algorithms and gradient descent to efficiently optimize over the linear and the
non-linear functions, respectively, and to infer the symmetry that is
ultimately learnt. We also discuss the necessity of the matrix-valued functions
in the architecture. Experiments on image-digit sum and polynomial regression
tasks demonstrate the effectiveness of our approach.
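The abstract describes optimizing the linear part of the architecture with a multi-armed bandit and the non-linear part with gradient descent. The sketch below is not the authors' code; it is a minimal illustration of that division of labor under simplifying assumptions: each bandit arm is a candidate subgroup of S_3 (identity, the cyclic group Z_3, or a transposition), the "linear" choice symmetrizes the input by group averaging, and a single least-squares fit on fixed features stands in for the gradient-descent phase.

```python
# Hypothetical sketch: UCB1 bandit over candidate subgroups of S_3,
# with a cheap regression fit standing in for gradient descent.
import numpy as np

rng = np.random.default_rng(0)

def perm_matrix(p):
    # Permutation (p[0], p[1], p[2]) as a 3x3 matrix.
    m = np.zeros((3, 3))
    m[np.arange(3), p] = 1.0
    return m

# Candidate subgroups (the bandit's arms), each a list of permutations.
identity = [(0, 1, 2)]
cyclic3 = [(0, 1, 2), (1, 2, 0), (2, 0, 1)]   # Z_3
swap01 = [(0, 1, 2), (1, 0, 2)]               # Z_2, swapping coords 0 and 1
arms = [identity, cyclic3, swap01]

def symmetrize(X, group):
    # Group averaging: projects X onto the subspace invariant to the group.
    mats = [perm_matrix(p) for p in group]
    return sum(X @ m.T for m in mats) / len(mats)

# Toy target f(x) = (x0 + x1 + x2)^2, invariant under coordinate
# permutations, so the Z_3 arm preserves all information it needs.
X = rng.normal(size=(256, 3))
y = X.sum(axis=1) ** 2

def fit_and_score(group):
    # Stand-in for the gradient-descent phase: fit squared features by
    # least squares on the symmetrized input; reward = negative MSE.
    Xs = symmetrize(X, group)
    feats = np.c_[Xs, Xs ** 2, np.ones(len(Xs))]
    w, *_ = np.linalg.lstsq(feats, y, rcond=None)
    return -np.mean((feats @ w - y) ** 2)

# UCB1 over the three arms.
counts = np.zeros(3)
means = np.zeros(3)
for t in range(1, 31):
    ucb = means + np.sqrt(2 * np.log(t) / np.maximum(counts, 1e-9))
    a = int(np.argmax(ucb))
    r = fit_and_score(arms[a]) + rng.normal(scale=1e-3)  # noisy reward
    counts[a] += 1
    means[a] += (r - means[a]) / counts[a]

best = int(np.argmax(means))
print("selected arm:", best)  # the Z_3 arm (index 1) wins on this target
```

Averaging over Z_3 replaces every coordinate with the mean, which determines (x0 + x1 + x2)^2 exactly, so that arm reaches near-zero error while the other arms cannot represent the cross terms; the bandit therefore concentrates on arm 1. The paper's actual architecture and bandit algorithm differ in the details.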
Related papers
- Structured Regularization for Constrained Optimization on the SPD Manifold [1.1126342180866644]
We introduce a class of structured regularizers, based on symmetric gauge functions, which allow for solving constrained optimization on the SPD manifold with faster unconstrained methods.
We show that our structured regularizers can be chosen to preserve or induce desirable structure, in particular convexity and "difference of convex" structure.
arXiv Detail & Related papers (2024-10-12T22:11:22Z)
- Symmetry From Scratch: Group Equivariance as a Supervised Learning Task [1.8570740863168362]
In machine learning datasets with symmetries, the paradigm for backward compatibility with symmetry-breaking has been to relax equivariant architectural constraints.
We introduce symmetry-cloning, a method for inducing equivariance in machine learning models.
arXiv Detail & Related papers (2024-10-05T00:44:09Z)
- Symmetry-Based Structured Matrices for Efficient Approximately Equivariant Networks [5.187307904567701]
Group Matrices (GMs) are a forgotten precursor to the modern notion of regular representations of finite groups.
GMs can be employed to extend all the elementary operations of CNNs to general discrete groups.
arXiv Detail & Related papers (2024-09-18T07:52:33Z)
- Understanding Matrix Function Normalizations in Covariance Pooling through the Lens of Riemannian Geometry [63.694184882697435]
Global Covariance Pooling (GCP) has been demonstrated to improve the performance of Deep Neural Networks (DNNs) by exploiting second-order statistics of high-level representations.
arXiv Detail & Related papers (2024-07-15T07:11:44Z)
- Nonlinear SVD with Asymmetric Kernels: feature learning and asymmetric Nyström method [14.470859959783995]
Asymmetric data naturally exist in real life, such as directed graphs.
This paper tackles the asymmetric kernel-based learning problem.
Experiments show that asymmetric KSVD learns features that outperform those from Mercer kernels.
arXiv Detail & Related papers (2023-06-12T11:39:34Z)
- Deep Learning Symmetries and Their Lie Groups, Algebras, and Subalgebras from First Principles [55.41644538483948]
We design a deep-learning algorithm for the discovery and identification of the continuous group of symmetries present in a labeled dataset.
We use fully connected neural networks to model the symmetry transformations and the corresponding generators.
Our study also opens the door for using a machine learning approach in the mathematical study of Lie groups and their properties.
arXiv Detail & Related papers (2023-01-13T16:25:25Z)
- Learning Graphical Factor Models with Riemannian Optimization [70.13748170371889]
This paper proposes a flexible algorithmic framework for graph learning under low-rank structural constraints.
The problem is expressed as penalized maximum likelihood estimation of an elliptical distribution.
We leverage geometries of positive definite matrices and positive semi-definite matrices of fixed rank that are well suited to elliptical models.
arXiv Detail & Related papers (2022-10-21T13:19:45Z)
- The Dynamics of Riemannian Robbins-Monro Algorithms [101.29301565229265]
We propose a family of Riemannian algorithms generalizing and extending the seminal approximation framework of Robbins and Monro.
Compared to their Euclidean counterparts, Riemannian algorithms are much less understood due to the lack of a global linear structure on the manifold.
We provide a general template of almost sure convergence results that mirrors and extends the existing theory for Euclidean Robbins-Monro schemes.
arXiv Detail & Related papers (2022-06-14T12:30:11Z)
- A Functional Perspective on Learning Symmetric Functions with Neural Networks [48.80300074254758]
We study the learning and representation of neural networks defined on measures.
We establish approximation and generalization bounds under different choices of regularization.
The resulting models can be learned efficiently and enjoy generalization guarantees that extend across input sizes.
arXiv Detail & Related papers (2020-08-16T16:34:33Z)
- Nonconvex Matrix Completion with Linearly Parameterized Factors [10.163102766021373]
Parametric factorization holds for important examples, including subspace and matrix completion models.
The effectiveness of our unified unconstrained matrix optimization method is also illustrated.
arXiv Detail & Related papers (2020-03-29T22:40:47Z)
- Invariant Feature Coding using Tensor Product Representation [75.62232699377877]
We prove that the group-invariant feature vector contains sufficient discriminative information when learning a linear classifier.
A novel feature model that explicitly considers group actions is proposed for principal component analysis and k-means clustering.
arXiv Detail & Related papers (2019-06-05T07:15:17Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.