Scattering Networks on Noncommutative Finite Groups
- URL: http://arxiv.org/abs/2505.20950v1
- Date: Tue, 27 May 2025 09:41:39 GMT
- Title: Scattering Networks on Noncommutative Finite Groups
- Authors: Maria Teresa Arias, Davide Barbieri, Eugenio Hernández
- Abstract summary: We introduce a scattering transform on an arbitrary finite group (not necessarily abelian) within the context of group-equivariant convolutional neural networks (G-CNNs). We demonstrate that, under certain conditions on the wavelet coefficients, the scattering transform is non-expansive, stable under deformations, energy-preserving, and equivariant with respect to left and right group translations, and that, as depth increases, the scattering coefficients become less sensitive to group translations of the signal.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Scattering Networks were initially designed to elucidate the behavior of early layers in Convolutional Neural Networks (CNNs) over Euclidean spaces and are grounded in wavelets. In this work, we introduce a scattering transform on an arbitrary finite group (not necessarily abelian) within the context of group-equivariant convolutional neural networks (G-CNNs). We present wavelets on finite groups and analyze their similarity to classical wavelets. We demonstrate that, under certain conditions on the wavelet coefficients, the scattering transform is non-expansive, stable under deformations, energy-preserving, and equivariant with respect to left and right group translations, and that, as depth increases, the scattering coefficients become less sensitive to group translations of the signal; these are all desirable properties of convolutional neural networks. Furthermore, we provide examples illustrating the application of the scattering transform to the classification of data whose domains involve abelian and nonabelian groups.
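To make the central objects concrete, here is a minimal, self-contained sketch (our own illustration, not the authors' implementation) of a group convolution on the nonabelian symmetric group S3, together with a numerical check that the modulus of a wavelet convolution, the basic first-order scattering coefficient, is equivariant to left group translations. The filter `psi` is a random stand-in for a wavelet, and all function names here are hypothetical.

```python
import itertools
import numpy as np

# Elements of S3 as permutation tuples; compose(g, h) applies h first, then g.
ELEMS = list(itertools.permutations(range(3)))
IDX = {g: i for i, g in enumerate(ELEMS)}

def compose(g, h):
    return tuple(g[h[k]] for k in range(3))

def inverse(g):
    inv = [0, 0, 0]
    for k in range(3):
        inv[g[k]] = k
    return tuple(inv)

def group_conv(f, psi):
    """(f * psi)(g) = sum_h f(h) psi(h^{-1} g); signals are indexed by ELEMS."""
    return np.array([sum(f[IDX[h]] * psi[IDX[compose(inverse(h), g)]]
                         for h in ELEMS)
                     for g in ELEMS])

def left_translate(f, u):
    """(L_u f)(g) = f(u^{-1} g)."""
    return np.array([f[IDX[compose(inverse(u), g)]] for g in ELEMS])

rng = np.random.default_rng(0)
f = rng.standard_normal(len(ELEMS))    # a random signal on S3
psi = rng.standard_normal(len(ELEMS))  # random stand-in for one wavelet filter
u = ELEMS[3]                           # an arbitrary group translation

# First-order scattering path: modulus of a wavelet convolution.
lhs = np.abs(group_conv(left_translate(f, u), psi))  # scatter the translated signal
rhs = left_translate(np.abs(group_conv(f, psi)), u)  # translate the scattered signal
print(np.allclose(lhs, rhs))                          # True: left-equivariance
```

The check prints True for any filter: substituting h ↦ uh in the convolution sum shows (L_u f) * psi = L_u (f * psi), and the pointwise modulus commutes with translation.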
Related papers
- Non Commutative Convolutional Signal Models in Neural Networks: Stability to Small Deformations (arXiv, 2023-10-05)
  We study the filtering and stability properties of non commutative convolutional filters. Our results have direct implications for group neural networks, multigraph neural networks, and quaternion neural networks.
- On the Implicit Bias of Linear Equivariant Steerable Networks (arXiv, 2023-03-07)
  We study the implicit bias of gradient flow on linear equivariant steerable networks in group-invariant binary classification. Under a unitary assumption on the input representation, we establish the equivalence between steerable networks and data augmentation.
- Bispectral Neural Networks (arXiv, 2022-09-07)
  We present Bispectral Neural Networks (BNNs), a neural network architecture able to simultaneously learn groups, their irreducible representations, and corresponding equivariant and complete-invariant maps.
- Unified Fourier-based Kernel and Nonlinearity Design for Equivariant Networks on Homogeneous Spaces (arXiv, 2022-06-16)
  We introduce a unified framework for group equivariant networks on homogeneous spaces that takes advantage of the sparsity of Fourier coefficients of the lifted feature fields. We show that other methods treating features as Fourier coefficients in the stabilizer subgroup are special cases of our activation.
- On the Effective Number of Linear Regions in Shallow Univariate ReLU Networks: Convergence Guarantees and Implicit Bias (arXiv, 2022-05-18)
  We show that gradient flow converges in direction when labels are determined by the sign of a target network with $r$ neurons. Our result may already hold for mild over-parameterization, where the width is $\tilde{\mathcal{O}}(r)$ and independent of the sample size.
- Convolutional Filtering and Neural Networks with Non Commutative Algebras (arXiv, 2021-08-23)
  We study the generalization of non commutative convolutional neural networks and show that non commutative convolutional architectures can be stable to deformations on the space of operators.
- LieTransformer: Equivariant self-attention for Lie Groups (arXiv, 2020-12-20)
  Group equivariant neural networks are used as building blocks of group invariant neural networks. We extend the scope of the literature to self-attention, which is emerging as a prominent building block of deep learning models, and propose the LieTransformer, an architecture composed of LieSelfAttention layers that are equivariant to arbitrary Lie groups and their discrete subgroups.
- Lorentz Group Equivariant Neural Network for Particle Physics (arXiv, 2020-06-08)
  We present a neural network architecture that is fully equivariant with respect to transformations under the Lorentz group. For classification tasks in particle physics, we demonstrate that such an equivariant architecture leads to drastically simpler models with relatively few learnable parameters.
- Geometric Wavelet Scattering Networks on Compact Riemannian Manifolds (arXiv, 2019-05-24)
  Like the Euclidean scattering transform, the geometric scattering transform is based on a cascade of wavelet filters and pointwise nonlinearities, and empirical results demonstrate its utility on several geometric learning tasks. A minimal sketch of such a cascade follows this list.
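The cascade structure shared by Euclidean, geometric, and group scattering transforms is alternating wavelet filtering with a pointwise modulus, collecting coefficient paths at every depth. The following self-contained sketch illustrates this under assumptions of our own: it works on the cyclic group Z_8 (where group convolution reduces to circular convolution), uses random filters as wavelet stand-ins, and omits the final low-pass averaging that usually produces the output coefficients.

```python
import numpy as np

N = 8  # signals on the cyclic group Z_8; convolution is circular

def cyclic_conv(f, psi):
    """(f * psi)(g) = sum_h f(h) psi((g - h) mod N)."""
    return np.array([sum(f[h] * psi[(g - h) % N] for h in range(N))
                     for g in range(N)])

def scattering(f, psis, depth=2):
    """Collect modulus paths ||f * psi_1| * psi_2| ... up to `depth`."""
    paths, coeffs = [f], []
    for _ in range(depth):
        next_paths = []
        for p in paths:
            for psi in psis:
                u = np.abs(cyclic_conv(p, psi))  # filter, then pointwise modulus
                coeffs.append(u)                 # keep the coefficients of this path
                next_paths.append(u)
        paths = next_paths                       # cascade: feed all paths forward
    return np.stack(coeffs)

rng = np.random.default_rng(1)
f = rng.standard_normal(N)                         # a random test signal
psis = [rng.standard_normal(N) for _ in range(2)]  # random wavelet stand-ins
print(scattering(f, psis).shape)                   # (6, 8): 2 first-order + 4 second-order paths
```

With 2 filters and depth 2, the transform yields 2 first-order and 4 second-order paths; deeper layers grow the path count geometrically, which is why practical scattering networks truncate the cascade at small depth.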