Non Commutative Convolutional Signal Models in Neural Networks:
Stability to Small Deformations
- URL: http://arxiv.org/abs/2310.03879v1
- Date: Thu, 5 Oct 2023 20:27:22 GMT
- Title: Non Commutative Convolutional Signal Models in Neural Networks:
Stability to Small Deformations
- Authors: Alejandro Parada-Mayorga, Landon Butler, and Alejandro Ribeiro
- Abstract summary: We study the filtering and stability properties of non commutative convolutional filters.
Our results have direct implications for group neural networks, multigraph neural networks and quaternion neural networks.
- Score: 111.27636893711055
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In this paper we discuss the results recently published in [1] about
algebraic signal models (ASMs) based on non commutative algebras and their use
in convolutional neural networks. Relying on the general tools from algebraic
signal processing (ASP), we study the filtering and stability properties of non
commutative convolutional filters. We show how non commutative filters can be
stable to small perturbations on the space of operators. We also show that
although the spectral components of the Fourier representation in a non
commutative signal model are associated with spaces of dimension larger than one,
there is a trade-off between stability and selectivity similar to that observed
for commutative models. Our results have direct implications for group neural
networks, multigraph neural networks and quaternion neural networks, among
other non commutative architectures. We conclude by corroborating these results
through numerical experiments.
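The abstract's central objects can be illustrated numerically: a non-commutative convolutional filter is a polynomial in several shift operators in which the order of the operators matters. The following is a minimal sketch under assumed choices, not a construction from the paper; the operators S1 and S2 (e.g., the two adjacency matrices of a multigraph), the filter taps, and the perturbation are all illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

# Two hypothetical shift operators on the same node set, e.g., the two
# adjacency matrices of a multigraph (symmetrized random matrices here).
S1 = rng.standard_normal((n, n)); S1 = (S1 + S1.T) / 2
S2 = rng.standard_normal((n, n)); S2 = (S2 + S2.T) / 2
x = rng.standard_normal(n)  # a signal on the nodes

def noncomm_filter(A, B, x, taps):
    """Degree-two non-commutative filter:
    h0*x + h1*A x + h2*B x + h12*(A B) x + h21*(B A) x.
    The words A@B and B@A get separate taps because they differ."""
    h0, h1, h2, h12, h21 = taps
    return (h0 * x + h1 * (A @ x) + h2 * (B @ x)
            + h12 * (A @ (B @ x)) + h21 * (B @ (A @ x)))

taps = (1.0, 0.5, 0.5, 0.25, -0.25)
y = noncomm_filter(S1, S2, x, taps)

# Non-commutativity: exchanging the roles of S1 and S2 changes the
# output, because in general S1 @ S2 != S2 @ S1.
y_swapped = noncomm_filter(S2, S1, x, taps)

# Stability: a small symmetric perturbation of one operator yields a
# proportionally small change in the output -- the qualitative behavior
# the paper's stability bounds make precise.
E = rng.standard_normal((n, n)); E = (E + E.T) / 2
E *= 1e-3 / np.linalg.norm(E, 2)  # spectral norm ||E|| = 1e-3
y_pert = noncomm_filter(S1 + E, S2, x, taps)

print(np.allclose(S1 @ S2, S2 @ S1))   # the operators do not commute
print(np.linalg.norm(y_pert - y))      # small, on the order of ||E||
```

In the algebraic signal processing view, such a filter is an element of the free (non-commutative) polynomial algebra generated by S1 and S2, which is what distinguishes this setting from the single-shift commutative case.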
Related papers
- Interrelation of equivariant Gaussian processes and convolutional neural
networks [77.34726150561087]
There is currently a rather promising trend in machine learning (ML) based on the relationship between neural networks (NNs) and Gaussian processes (GPs).
In this work we establish a relationship between the many-channel limit of CNNs equivariant with respect to the two-dimensional Euclidean group with vector-valued neuron activations and the corresponding, independently introduced, equivariant Gaussian processes (GPs).
arXiv Detail & Related papers (2022-09-17T17:02:35Z) - Bispectral Neural Networks [1.0323063834827415]
We present a neural network architecture, Bispectral Neural Networks (BNNs).
BNNs are able to simultaneously learn groups, their irreducible representations, and the corresponding equivariant and complete-invariant maps.
arXiv Detail & Related papers (2022-09-07T18:34:48Z) - Preserving gauge invariance in neural networks [0.0]
This work develops lattice gauge equivariant convolutional neural networks (L-CNNs).
We show how L-CNNs can represent a large class of gauge invariant and equivariant functions on the lattice.
arXiv Detail & Related papers (2021-12-21T14:08:12Z) - Stability of Neural Networks on Manifolds to Relative Perturbations [118.84154142918214]
Graph Neural Networks (GNNs) show impressive performance in many practical scenarios.
GNNs scale well to large graphs in practice, yet existing stability bounds grow with the number of nodes, contradicting this observed scalability.
arXiv Detail & Related papers (2021-10-10T04:37:19Z) - Designing Rotationally Invariant Neural Networks from PDEs and
Variational Methods [8.660429288575367]
We investigate how diffusion and variational models achieve rotation invariance and transfer these ideas to neural networks.
We propose activation functions which couple network channels by combining information from several oriented filters.
Our findings help to translate diffusion and variational models into mathematically well-founded network architectures, and provide novel concepts for model-based CNN design.
arXiv Detail & Related papers (2021-08-31T17:34:40Z) - Convolutional Filtering and Neural Networks with Non Commutative
Algebras [153.20329791008095]
We study the generalization of non commutative convolutional neural networks.
We show that non commutative convolutional architectures can be stable to deformations on the space of operators.
arXiv Detail & Related papers (2021-08-23T04:22:58Z) - Stability of Algebraic Neural Networks to Small Perturbations [179.55535781816343]
Algebraic neural networks (AlgNNs) are composed of a cascade of layers, each one associated with an algebraic signal model.
We show how any architecture that uses a formal notion of convolution can be stable beyond particular choices of the shift operator.
arXiv Detail & Related papers (2020-10-22T09:10:16Z) - Algebraic Neural Networks: Stability to Deformations [179.55535781816343]
We study algebraic neural networks (AlgNNs) with commutative algebras.
AlgNNs unify diverse architectures such as Euclidean convolutional neural networks, graph neural networks, and group neural networks.
arXiv Detail & Related papers (2020-09-03T03:41:38Z)