Algebraic Neural Networks: Stability to Deformations
- URL: http://arxiv.org/abs/2009.01433v5
- Date: Wed, 30 Jun 2021 23:17:55 GMT
- Title: Algebraic Neural Networks: Stability to Deformations
- Authors: Alejandro Parada-Mayorga and Alejandro Ribeiro
- Abstract summary: We study algebraic neural networks (AlgNNs) with commutative algebras.
AlgNNs unify diverse architectures such as Euclidean convolutional neural networks, graph neural networks, and group neural networks.
- Score: 179.55535781816343
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We study algebraic neural networks (AlgNNs) with commutative algebras which
unify diverse architectures such as Euclidean convolutional neural networks,
graph neural networks, and group neural networks under the umbrella of
algebraic signal processing. An AlgNN is a stacked layered information
processing structure in which each layer comprises an algebra, a vector
space, and a homomorphism from the algebra to the space of endomorphisms of
the vector space. Signals are modeled as elements of the vector space and are
processed by convolutional filters that are defined as the images of the
elements of the algebra under the action of the homomorphism. We analyze
stability of algebraic filters and AlgNNs to deformations of the homomorphism
and derive conditions on filters that lead to Lipschitz stable operators. We
conclude that stable algebraic filters have frequency responses (their
eigenvalue-domain representations) whose derivative is inversely proportional
to the frequency (the eigenvalue magnitude). It follows that for a
given level of discriminability, AlgNNs are more stable than algebraic filters,
thereby explaining their better empirical performance. This same phenomenon has
been proven for Euclidean convolutional neural networks and graph neural
networks. Our analysis shows that this is a deep algebraic property shared by a
number of architectures.
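For concreteness, here is a minimal sketch of one AlgNN layer under the most familiar instantiation: the algebra is the polynomials in one generator, and the homomorphism sends the generator to a shift operator S (for example a graph adjacency matrix), so a filter acts as p(S) = sum_k h_k S^k. Function names, the nonlinearity, and the example graph are illustrative assumptions, not code from the paper.

```python
import numpy as np

def algebraic_filter(h, S, x):
    """Apply the filter p(S) = sum_k h[k] * S^k to a signal x.

    In algebraic signal processing terms, the homomorphism maps the
    polynomial sum_k h[k] t^k (an element of the algebra) to the
    endomorphism sum_k h[k] S^k acting on the signal space.
    """
    y = np.zeros_like(x, dtype=float)
    Skx = x.astype(float)          # S^0 x
    for hk in h:
        y += hk * Skx
        Skx = S @ Skx              # advance from S^k x to S^(k+1) x
    return y

def algnn_layer(h, S, x, sigma=np.tanh):
    """One AlgNN layer: an algebraic filter followed by a pointwise
    nonlinearity (tanh here is an illustrative choice)."""
    return sigma(algebraic_filter(h, S, x))

# Usage: a 3-tap filter on a 3-node path graph.
S = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])       # shift operator (adjacency matrix)
x = np.array([1.0, -1.0, 0.5])     # signal in the vector space R^3
h = [0.5, 0.3, 0.2]                # filter coefficients h_k
print(algnn_layer(h, S, x))
```

Choosing S as a graph shift operator recovers a graph neural network layer, while choosing the cyclic shift recovers discrete-time convolution; this is the sense in which AlgNNs unify these architectures.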
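The derivative condition in the abstract can be stated compactly. In the eigenvalue (frequency) domain, a filter with frequency response h(lambda) is stable in the paper's sense when, roughly (exact constants and hypotheses are in the paper; the paper calls such filters integral Lipschitz):

```latex
% Integral Lipschitz condition: the derivative of the frequency
% response decays inversely with the frequency magnitude.
\left|\lambda\, h'(\lambda)\right| \le C
\quad\Longleftrightarrow\quad
\left|h'(\lambda)\right| \le \frac{C}{|\lambda|}.
```

Such filters can be selective at low frequencies but must flatten out at high frequencies; the pointwise nonlinearities in an AlgNN spread high-frequency energy toward low frequencies where stable filters remain selective, which is the mechanism behind the claim that AlgNNs beat individual filters at a fixed level of discriminability.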
Related papers
- Non Commutative Convolutional Signal Models in Neural Networks: Stability to Small Deformations [111.27636893711055]
  We study the filtering and stability properties of non-commutative convolutional filters.
  Our results have direct implications for group neural networks, multigraph neural networks, and quaternion neural networks.
  arXiv Detail & Related papers (2023-10-05T20:27:22Z)
- Geometric Graph Filters and Neural Networks: Limit Properties and Discriminability Trade-offs [122.06927400759021]
  We study the relationship between a graph neural network (GNN) and a manifold neural network (MNN) when the graph is constructed from a set of points sampled from the manifold.
  We prove non-asymptotic error bounds showing that convolutional filters and neural networks on these graphs converge to convolutional filters and neural networks on the continuous manifold.
  arXiv Detail & Related papers (2023-05-29T08:27:17Z)
- Fast computation of permutation equivariant layers with the partition algebra [0.0]
  Linear neural network layers that are either equivariant or invariant to permutations of their inputs form core building blocks of modern deep learning architectures.
  Examples include the layers of DeepSets, as well as linear layers occurring in attention blocks of transformers and some graph neural networks (a minimal sketch of such a layer appears after this list).
  arXiv Detail & Related papers (2023-03-10T21:13:12Z)
- Bispectral Neural Networks [1.0323063834827415]
  We present a neural network architecture, Bispectral Neural Networks (BNNs).
  BNNs are able to simultaneously learn groups, their irreducible representations, and corresponding equivariant and complete-invariant maps.
  arXiv Detail & Related papers (2022-09-07T18:34:48Z)
- Stability of Aggregation Graph Neural Networks [153.70485149740608]
  We study the stability properties of aggregation graph neural networks (Agg-GNNs) considering perturbations of the underlying graph.
  We prove that the stability bounds are defined by the properties of the filters in the first layer of the CNN that acts on each node.
  We also conclude that in Agg-GNNs the selectivity of the mapping operators is tied to the properties of the filters only in the first layer of the CNN stage.
  arXiv Detail & Related papers (2022-07-08T03:54:52Z)
- Convolutional Filtering and Neural Networks with Non Commutative Algebras [153.20329791008095]
  We study the generalization of non-commutative convolutional neural networks.
  We show that non-commutative convolutional architectures can be stable to deformations on the space of operators.
  arXiv Detail & Related papers (2021-08-23T04:22:58Z)
- Stability of Algebraic Neural Networks to Small Perturbations [179.55535781816343]
  Algebraic neural networks (AlgNNs) are composed of a cascade of layers, each one associated with an algebraic signal model.
  We show how any architecture that uses a formal notion of convolution can be stable beyond particular choices of the shift operator.
  arXiv Detail & Related papers (2020-10-22T09:10:16Z)
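As a companion to the partition-algebra entry above, here is a minimal sketch of the simplest permutation-equivariant linear layer (the DeepSets parametrization on a set of n scalars): every such layer combines an identity term with a sum-pooling term. Parameter names and the check below are ours, not from that paper.

```python
import numpy as np

def equivariant_linear(x, lam, gamma):
    """Permutation-equivariant linear layer f(x) = lam*x + gamma*sum(x)*1.

    The identity matrix and the all-ones matrix both commute with every
    permutation matrix, so f(P x) = P f(x) for any permutation P.
    """
    return lam * x + gamma * x.sum() * np.ones_like(x)

# Equivariance check: permuting the input permutes the output identically.
rng = np.random.default_rng(0)
x = rng.normal(size=5)
perm = rng.permutation(5)
out_of_permuted_input = equivariant_linear(x[perm], lam=2.0, gamma=-0.5)
permuted_output = equivariant_linear(x, lam=2.0, gamma=-0.5)[perm]
assert np.allclose(out_of_permuted_input, permuted_output)
```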