Convolutional Filtering and Neural Networks with Non Commutative
Algebras
- URL: http://arxiv.org/abs/2108.09923v3
- Date: Thu, 6 Jul 2023 16:49:58 GMT
- Title: Convolutional Filtering and Neural Networks with Non Commutative
Algebras
- Authors: Alejandro Parada-Mayorga, Landon Butler and Alejandro Ribeiro
- Abstract summary: We study the generalization of non commutative convolutional neural networks.
We show that non commutative convolutional architectures can be stable to deformations on the space of operators.
- Score: 153.20329791008095
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In this paper we introduce and study the algebraic generalization of non
commutative convolutional neural networks. We leverage the theory of algebraic
signal processing to model convolutional non commutative architectures, and we
derive concrete stability bounds that extend those obtained in the literature
for commutative convolutional neural networks. We show that non commutative
convolutional architectures can be stable to deformations on the space of
operators. We develop the spectral representation of non commutative signal
models to show that non commutative filters process Fourier components
independently of each other. In particular we prove that although the spectral
decompositions of signals in non commutative models are associated to
eigenspaces of dimension larger than one, there exists a trade-off between
stability and selectivity, which is controlled by matrix polynomial functions
in spaces of matrices of low dimension. This trade-off shows that when the
filters in the algebra are restricted to be stable, there is a loss in
discriminability that is compensated in the network by the pointwise
nonlinearities. The results derived in this paper have direct applications and
implications in non commutative convolutional architectures such as group
neural networks, multigraph neural networks, and quaternion neural networks,
for which we provide a set of numerical experiments showing their behavior when
perturbations are present.
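In algebraic signal processing, the convolutional filters described in the abstract are polynomial functions of one or more shift operators; in the noncommutative setting (e.g., multigraph neural networks) there are several shift operators that do not commute, so the order of factors in each monomial matters. A minimal sketch of this idea, using hypothetical random matrices as stand-ins for the shift operators (the matrices, coefficients, and helper name are illustrative assumptions, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

# Two shift operators on the same node set (e.g., the two edge sets of a
# multigraph). Random matrices here, purely to illustrate the algebra.
S1 = rng.standard_normal((n, n))
S2 = rng.standard_normal((n, n))
x = rng.standard_normal(n)

def filter_response(coeffs, words, x):
    """Apply a noncommutative polynomial filter y = sum_k c_k * W_k x,
    where each 'word' W_k is an ordered product of shift operators."""
    y = np.zeros_like(x)
    for c, word in zip(coeffs, words):
        z = x
        for S in reversed(word):  # apply the rightmost factor first
            z = S @ z
        y += c * z
    return y

# Words of length <= 2 over {S1, S2}; the empty word is the identity.
words = [[], [S1], [S2], [S1, S2], [S2, S1]]
coeffs = [1.0, 0.5, 0.5, 0.3, 0.1]

y = filter_response(coeffs, words, x)

# Noncommutativity: reordering the factors in the length-2 words
# changes the filter, because S1 @ S2 != S2 @ S1 in general.
y_swapped = filter_response(coeffs, [[], [S1], [S2], [S2, S1], [S1, S2]], x)
assert not np.allclose(y, y_swapped)
```

In the commutative case (a single shift operator, or operators that commute) the two length-2 words coincide and the last assertion would fail; keeping both orderings as distinct monomials is precisely what enlarges the filter class in the noncommutative models studied here.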
Related papers
- Non Commutative Convolutional Signal Models in Neural Networks:
Stability to Small Deformations [111.27636893711055]
We study the filtering and stability properties of non commutative convolutional filters.
Our results have direct implications for group neural networks, multigraph neural networks and quaternion neural networks.
arXiv Detail & Related papers (2023-10-05T20:27:22Z)
- Function-Space Optimality of Neural Architectures With Multivariate Nonlinearities [30.762063524541638]
We prove a representer theorem that states that the solution sets to learning problems posed over Banach spaces are completely characterized by neural architectures with nonlinearities.
Our results shed light on the regularity of functions learned by neural networks trained on data, and provide new theoretical motivation for several architectural choices found in practice.
arXiv Detail & Related papers (2023-10-05T17:13:16Z)
- Bispectral Neural Networks [1.0323063834827415]
We present a neural network architecture, Bispectral Neural Networks (BNNs).
BNNs are able to simultaneously learn groups, their irreducible representations, and corresponding equivariant and complete-invariant maps.
arXiv Detail & Related papers (2022-09-07T18:34:48Z)
- Exploring Linear Feature Disentanglement For Neural Networks [63.20827189693117]
Non-linear activation functions, e.g., Sigmoid, ReLU, and Tanh, have achieved great success in neural networks (NNs).
Due to the complex non-linear characteristics of samples, the objective of those activation functions is to project samples from their original feature space to a linearly separable feature space.
This phenomenon ignites our interest in exploring whether all features need to be transformed by all non-linear functions in current typical NNs.
arXiv Detail & Related papers (2022-03-22T13:09:17Z)
- Stability of Algebraic Neural Networks to Small Perturbations [179.55535781816343]
Algebraic neural networks (AlgNNs) are composed of a cascade of layers, each associated with an algebraic signal model.
We show how any architecture that uses a formal notion of convolution can be stable beyond particular choices of the shift operator.
arXiv Detail & Related papers (2020-10-22T09:10:16Z)
- Sparse Quantized Spectral Clustering [85.77233010209368]
We exploit tools from random matrix theory to make precise statements about how the eigenspectrum of a matrix changes under such nonlinear transformations.
We show that very little change occurs in the informative eigenstructure even under drastic sparsification/quantization.
arXiv Detail & Related papers (2020-10-03T15:58:07Z)
- Algebraic Neural Networks: Stability to Deformations [179.55535781816343]
We study algebraic neural networks (AlgNNs) with commutative algebras.
AlgNNs unify diverse architectures such as Euclidean convolutional neural networks, graph neural networks, and group neural networks.
arXiv Detail & Related papers (2020-09-03T03:41:38Z)