On the finite representation of group equivariant operators via
permutant measures
- URL: http://arxiv.org/abs/2008.06340v2
- Date: Wed, 9 Mar 2022 19:05:50 GMT
- Title: On the finite representation of group equivariant operators via
permutant measures
- Authors: Giovanni Bocchi, Stefano Botteghi, Martina Brasini, Patrizio Frosini,
Nicola Quercioli
- Abstract summary: We show that each linear $G$-equivariant operator can be produced by a suitable permutant measure.
This result makes available a new method to build linear $G$-equivariant operators in the finite setting.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The study of $G$-equivariant operators is of great interest for
explaining and understanding the architecture of neural networks. In this paper
we show that each linear $G$-equivariant operator can be produced by a suitable
permutant measure, provided that the group $G$ acts transitively on a finite
signal domain $X$. This result makes available a new method for building linear
$G$-equivariant operators in the finite setting.
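As a minimal illustration of the abstract's construction (the variable names and the choice of group are ours, not the paper's): take $G = \mathbb{Z}_n$ acting transitively on $X = \{0,\dots,n-1\}$ by cyclic shifts, and take the permutant $H$ to be the set of all shifts, with a measure $\mu$ assigning a weight to each. The induced operator $F_\mu(\varphi)(x) = \sum_{h \in H} \mu(h)\,\varphi(h^{-1}(x))$ is linear and $G$-equivariant; in this particular example it reduces to a circular convolution.

```python
import numpy as np

# Hypothetical sketch: G = Z_n acting transitively on X = {0,...,n-1} by shifts.
# The permutant H is the set of all cyclic shifts h_k : x -> x + k (mod n),
# which is closed under conjugation by G. A permutant measure mu assigns a
# weight mu[k] to each h_k and induces the linear operator
#   F_mu(phi)(x) = sum_k mu[k] * phi(x - k mod n),
# which is G-equivariant (here, a circular convolution).

n = 6
rng = np.random.default_rng(0)
mu = rng.random(n)                 # permutant measure: weight of each shift h_k

def shift(phi, k):
    """Action of the cyclic shift x -> x + k (mod n) on a signal phi."""
    return np.roll(phi, k)

def F_mu(phi):
    """Linear operator induced by the permutant measure mu."""
    return sum(mu[k] * shift(phi, k) for k in range(n))

# Check G-equivariance: F_mu(g.phi) == g.(F_mu(phi)) for every g in G.
phi = rng.random(n)
for g in range(n):
    assert np.allclose(F_mu(shift(phi, g)), shift(F_mu(phi), g))
```

The equivariance check passes for every shift because conjugating a shift by a shift gives back an element of $H$ with the same weight; this invariance of $\mu$ under conjugation is exactly what the permutant-measure construction requires.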
Related papers
- Lie Group Decompositions for Equivariant Neural Networks [12.139222986297261]
We show how convolution kernels can be parametrized to build models equivariant with respect to affine transformations.
We evaluate the robustness and out-of-distribution generalisation capability of our model on the benchmark affine-invariant classification task.
arXiv Detail & Related papers (2023-10-17T16:04:33Z)
- Discovering Sparse Representations of Lie Groups with Machine Learning [55.41644538483948]
We show that our method reproduces the canonical representations of the generators of the Lorentz group.
This approach is completely general and can be used to find the infinitesimal generators for any Lie group.
arXiv Detail & Related papers (2023-02-10T17:12:05Z)
- How Jellyfish Characterise Alternating Group Equivariant Neural Networks [0.0]
We find a basis for the learnable, linear, $A_n$-equivariant layer functions between such tensor power spaces in the standard basis of $\mathbb{R}^n$.
We also describe how our approach generalises to the construction of neural networks that are equivariant to local symmetries.
arXiv Detail & Related papers (2023-01-24T17:39:10Z)
- Equivariant Transduction through Invariant Alignment [71.45263447328374]
We introduce a novel group-equivariant architecture that incorporates a group-invariant hard alignment mechanism.
We find that our network's structure allows it to develop stronger equivariant properties than existing group-equivariant approaches.
We additionally find that it outperforms previous group-equivariant networks empirically on the SCAN task.
arXiv Detail & Related papers (2022-09-22T11:19:45Z)
- On the geometric and Riemannian structure of the spaces of group equivariant non-expansive operators [0.0]
Group equivariant non-expansive operators have been recently proposed as basic components in topological data analysis and deep learning.
We show how a space $\mathcal{F}$ of group equivariant non-expansive operators can be endowed with the structure of a Riemannian manifold.
We also describe a procedure to select a finite set of representative group equivariant non-expansive operators in the considered manifold.
arXiv Detail & Related papers (2021-03-03T17:29:25Z)
- Abelian Neural Networks [48.52497085313911]
We first construct a neural network architecture for Abelian group operations and derive a universal approximation property.
We extend it to Abelian semigroup operations using the characterization of associative symmetric polynomials.
We train our models over fixed word embeddings and demonstrate improved performance over the original word2vec.
arXiv Detail & Related papers (2021-02-24T11:52:21Z)
- Structured Sparsity Inducing Adaptive Optimizers for Deep Learning [94.23102887731417]
In this paper, we derive the weighted proximal operator, which is a necessary component of proximal gradient methods.
We show that this adaptive method, together with the weighted proximal operators derived here, is indeed capable of finding solutions with structure in their sparsity patterns.
arXiv Detail & Related papers (2021-02-07T18:06:23Z)
- A New Neural Network Architecture Invariant to the Action of Symmetry Subgroups [11.812645659940237]
We propose a $G$-invariant neural network that approximates functions invariant to the action of a given permutation subgroup on input data.
The key element of the proposed network architecture is a new $G$-invariant transformation module, which produces a $G$-invariant latent representation of the input data.
arXiv Detail & Related papers (2020-12-11T16:19:46Z)
- Stochastic Flows and Geometric Optimization on the Orthogonal Group [52.50121190744979]
We present a new class of geometrically-driven optimization algorithms on the orthogonal group $O(d)$.
We show that our methods can be applied in various fields of machine learning including deep, convolutional and recurrent neural networks, reinforcement learning, flows and metric learning.
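As a loose illustration of geometric optimization on the orthogonal group $O(d)$ (our own sketch under simplifying assumptions, not the algorithm from the paper above): a Euclidean gradient step can be retracted onto the group with the Cayley transform, so every iterate remains exactly orthogonal.

```python
import numpy as np

# Hypothetical sketch: minimize f(X) = -trace(A^T X) over X in O(d) with a
# fixed-step Cayley retraction (Wen-Yin style), which keeps X orthogonal.
d = 4
rng = np.random.default_rng(1)
A = rng.random((d, d))
X = np.eye(d)                      # start on the group
I = np.eye(d)
lr = 0.1

for _ in range(500):
    G = -A                         # Euclidean gradient of f at X
    W = G @ X.T - X @ G.T          # skew-symmetric tangent direction
    # Cayley retraction: X <- (I + lr/2 W)^{-1} (I - lr/2 W) X, stays in O(d)
    X = np.linalg.solve(I + lr / 2 * W, (I - lr / 2 * W) @ X)

assert np.allclose(X @ X.T, np.eye(d), atol=1e-8)
```

Because $I + \frac{\tau}{2}W$ is always invertible for skew-symmetric $W$, the step is well defined for any step size, and orthogonality is preserved exactly rather than only approximately as with projected gradient steps.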
arXiv Detail & Related papers (2020-03-30T15:37:50Z)
- A Computationally Efficient Neural Network Invariant to the Action of Symmetry Subgroups [12.654871396334668]
A new $G$-invariant transformation module produces a $G$-invariant latent representation of the input data.
This latent representation is then processed with a multi-layer perceptron in the network.
We prove the universality of the proposed architecture, discuss its properties and highlight its computational and memory efficiency.
arXiv Detail & Related papers (2020-02-18T12:50:56Z) - Invariant Feature Coding using Tensor Product Representation [75.62232699377877]
We prove that the group-invariant feature vector contains sufficient discriminative information when learning a linear classifier.
A novel feature model that explicitly considers the group action is proposed for principal component analysis and k-means clustering.
arXiv Detail & Related papers (2019-06-05T07:15:17Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.