Group Invariant Dictionary Learning
- URL: http://arxiv.org/abs/2007.07550v2
- Date: Sat, 5 Jun 2021 04:58:41 GMT
- Title: Group Invariant Dictionary Learning
- Authors: Yong Sheng Soh
- Abstract summary: We develop a framework for learning dictionaries for data under the constraint that the collection of basic building blocks remains invariant under such symmetries.
Our framework specializes to the convolutional dictionary learning problem when we consider integer shifts.
Our numerical experiments on synthetic data and ECG data show that incorporating such symmetries as priors is most valuable when the dataset has few data points.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The dictionary learning problem concerns the task of representing data as
sparse linear sums drawn from a smaller collection of basic building blocks. In
application domains where such techniques are deployed, we frequently encounter
datasets where some form of symmetry or invariance is present. Motivated by
this observation, we develop a framework for learning dictionaries for data
under the constraint that the collection of basic building blocks remains
invariant under such symmetries. Our procedure for learning such dictionaries
relies on representing the symmetry as the action of a matrix group acting on
the data, and subsequently introducing a convex penalty function so as to
induce sparsity with respect to the collection of matrix group elements. Our
framework specializes to the convolutional dictionary learning problem when we
consider integer shifts. Using properties of positive semidefinite Hermitian
Toeplitz matrices, we develop an extension that learns dictionaries that are
invariant under continuous shifts. Our numerical experiments on synthetic data
and ECG data show that the incorporation of such symmetries as priors is most
valuable when the dataset has few data points, or when the full range of
symmetries is inadequately expressed in the dataset.
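The integer-shift specialization described in the abstract can be illustrated with a minimal sketch: augment a base atom with all of its integer (circular) shifts to form a group-structured dictionary, then solve a standard l1-penalized sparse coding step with ISTA. This is an illustrative reconstruction of the idea, not the paper's implementation; all function names and parameters are assumptions.

```python
import numpy as np

def shift_dictionary(atoms, n):
    """Build a shift-invariant dictionary: every integer (circular)
    shift of every atom becomes one column."""
    # atoms: list of length-n arrays; result: (n, len(atoms)*n)
    rows = np.concatenate(
        [np.stack([np.roll(a, s) for s in range(n)]) for a in atoms]
    )
    return rows.T  # column s of atom k is np.roll(atoms[k], s)

def ista(D, y, lam=0.01, iters=500):
    """Solve min_x 0.5*||y - D x||^2 + lam*||x||_1 by ISTA."""
    L = np.linalg.norm(D, 2) ** 2  # Lipschitz constant of the gradient
    x = np.zeros(D.shape[1])
    for _ in range(iters):
        g = D.T @ (D @ x - y)          # gradient of the smooth term
        z = x - g / L                  # gradient step
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # shrinkage
    return x
```

A shifted copy of the atom is then recovered as a sparse code concentrated on the corresponding shift, which is the sense in which the group-augmented dictionary induces sparsity with respect to the matrix group elements.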
Related papers
- Learning Infinitesimal Generators of Continuous Symmetries from Data [15.42275880523356]
We propose a novel symmetry learning algorithm based on transformations defined with one-parameter groups.
Our method is built upon minimal inductive biases, encompassing not only commonly utilized symmetries rooted in Lie groups but also extending to symmetries derived from nonlinear generators.
arXiv Detail & Related papers (2024-10-29T08:28:23Z)
- Symmetry Discovery for Different Data Types [52.2614860099811]
Equivariant neural networks incorporate symmetries into their architecture, achieving higher generalization performance.
We propose LieSD, a method for discovering symmetries via trained neural networks that approximate the input-output mappings of the tasks.
We validate the performance of LieSD on tasks with symmetries such as the two-body problem, the moment of inertia matrix prediction, and top quark tagging.
arXiv Detail & Related papers (2024-10-13T13:39:39Z)
- Weakly supervised covariance matrices alignment through Stiefel matrices estimation for MEG applications [64.20396555814513]
This paper introduces a novel domain adaptation technique for time series data, called Mixing model Stiefel Adaptation (MSA).
We exploit abundant unlabeled data in the target domain to ensure effective prediction by establishing pairwise correspondence with equivalent signal variances between domains.
MSA outperforms recent methods in brain-age regression with task variations using magnetoencephalography (MEG) signals from the Cam-CAN dataset.
arXiv Detail & Related papers (2024-01-24T19:04:49Z)
- Accelerated Discovery of Machine-Learned Symmetries: Deriving the Exceptional Lie Groups G2, F4 and E6 [55.41644538483948]
This letter introduces two improved algorithms that significantly speed up the discovery of symmetry transformations.
Given the significant complexity of the exceptional Lie groups, our results demonstrate that this machine-learning method for discovering symmetries is completely general and can be applied to a wide variety of labeled datasets.
arXiv Detail & Related papers (2023-07-10T20:25:44Z)
- Learning Lie Group Symmetry Transformations with Neural Networks [17.49001206996365]
This work focuses on discovering and characterizing unknown symmetries present in the dataset, namely, Lie group symmetry transformations.
Our goal is to characterize the transformation group and the distribution of the parameter values.
Results showcase the effectiveness of the approach in both these settings.
arXiv Detail & Related papers (2023-07-04T09:23:24Z)
- Dictionary Learning under Symmetries via Group Representations [1.304892050913381]
We study the problem of learning a dictionary that is invariant under a pre-specified group of transformations.
We apply our paradigm to investigate the dictionary learning problem for the groups SO(2) and SO(3).
arXiv Detail & Related papers (2023-05-31T04:54:06Z)
- Oracle-Preserving Latent Flows [58.720142291102135]
We develop a methodology for the simultaneous discovery of multiple nontrivial continuous symmetries across an entire labelled dataset.
The symmetry transformations and the corresponding generators are modeled with fully connected neural networks trained with a specially constructed loss function.
The two new elements in this work are the use of a reduced-dimensionality latent space and the generalization to transformations invariant with respect to high-dimensional oracles.
arXiv Detail & Related papers (2023-02-02T00:13:32Z)
- Deep Learning Symmetries and Their Lie Groups, Algebras, and Subalgebras from First Principles [55.41644538483948]
We design a deep-learning algorithm for the discovery and identification of the continuous group of symmetries present in a labeled dataset.
We use fully connected neural networks to model the symmetry transformations and the corresponding generators.
Our study also opens the door for using a machine learning approach in the mathematical study of Lie groups and their properties.
arXiv Detail & Related papers (2023-01-13T16:25:25Z)
- Equivariant Representation Learning via Class-Pose Decomposition [17.032782230538388]
We introduce a general method for learning representations that are equivariant to symmetries of data.
The components semantically correspond to intrinsic data classes and poses respectively.
Results show that our representations capture the geometry of data and outperform other equivariant representation learning frameworks.
arXiv Detail & Related papers (2022-07-07T06:55:52Z)
- Learning Log-Determinant Divergences for Positive Definite Matrices [47.61701711840848]
In this paper, we propose to learn similarity measures in a data-driven manner.
We capitalize on the alpha-beta log-det divergence, which is a meta-divergence parametrized by scalars alpha and beta.
Our key idea is to cast these parameters in a continuum and learn them from data.
arXiv Detail & Related papers (2021-04-13T19:09:43Z)
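The alpha-beta-parametrized meta-divergence in the entry above belongs to the log-det family of Cichocki, Cruces, and Amari; one common parametrization (for alpha, beta > 0) is D(P‖Q) = (1/(alpha*beta)) * sum_i log[(alpha*lam_i^beta + beta*lam_i^(-alpha)) / (alpha + beta)], where lam_i are the generalized eigenvalues of the pair (P, Q). The sketch below assumes this form; the paper's exact parametrization may differ.

```python
import numpy as np

def ab_logdet_divergence(P, Q, alpha=0.5, beta=0.5):
    """Alpha-beta log-det divergence between SPD matrices P and Q
    (one common parametrization; assumes alpha, beta > 0)."""
    # Generalized eigenvalues of (P, Q) = eigenvalues of Q^{-1} P;
    # these are real and positive for symmetric positive definite inputs,
    # so discarding the (numerically tiny) imaginary parts is safe.
    lam = np.real(np.linalg.eigvals(np.linalg.solve(Q, P)))
    terms = (alpha * lam**beta + beta * lam**(-alpha)) / (alpha + beta)
    return np.sum(np.log(terms)) / (alpha * beta)
```

Casting (alpha, beta) as learnable continuous parameters, as the entry describes, then amounts to optimizing this scalar-valued function of the pair of matrices with respect to alpha and beta; by weighted AM-GM each term is at least 1, so the divergence is nonnegative and vanishes only when P = Q.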
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this content (including all information) and is not responsible for any consequences of its use.