Convolutional Filtering with RKHS Algebras
- URL: http://arxiv.org/abs/2411.01341v1
- Date: Sat, 02 Nov 2024 18:53:44 GMT
- Title: Convolutional Filtering with RKHS Algebras
- Authors: Alejandro Parada-Mayorga, Leopoldo Agorio, Alejandro Ribeiro, Juan Bazerque
- Abstract summary: We develop a theory of convolutional signal processing and neural networks for Reproducing Kernel Hilbert Spaces (RKHS).
We show that any RKHS allows the formal definition of multiple algebraic convolutional models.
We present a set of numerical experiments on real data in which wireless coverage is predicted from measurements captured by unmanned aerial vehicles.
- Score: 110.06688302593349
- Abstract: In this paper, we develop a generalized theory of convolutional signal processing and neural networks for Reproducing Kernel Hilbert Spaces (RKHS). Leveraging the theory of algebraic signal processing (ASP), we show that any RKHS allows the formal definition of multiple algebraic convolutional models. We show that any RKHS induces algebras whose elements determine convolutional operators acting on RKHS elements. This approach allows us to achieve scalable filtering and learning as a byproduct of the convolutional model, while simultaneously taking advantage of the well-known benefits of processing information in an RKHS. To emphasize the generality and usefulness of our approach, we show how algebraic RKHS models can be used to define convolutional signal models on groups, graphons, and traditional Euclidean signal spaces. Furthermore, using algebraic RKHS models, we build convolutional networks, formally defining the notion of pointwise nonlinearities and deriving explicit expressions for training. Such derivations are obtained in terms of the algebraic representation of the RKHS. We present a set of numerical experiments on real data in which wireless coverage is predicted from measurements captured by unmanned aerial vehicles. This real-life scenario emphasizes the benefits of convolutional RKHS models in neural networks compared to fully connected and standard convolutional operators.
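To make the filtering model concrete, the sketch below is a minimal, hypothetical instantiation rather than the paper's exact construction: it assumes a Gaussian kernel and uses the normalized Gram matrix of the sample points as the shift operator S, then applies a polynomial filter h(S) = sum_k h_k S^k to the expansion coefficients of an RKHS element f = sum_i alpha_i k(., x_i).

```python
import numpy as np

def gaussian_kernel(X, Y, sigma=1.0):
    # Gram matrix K[i, j] = exp(-||x_i - y_j||^2 / (2 * sigma^2))
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma**2))

def rkhs_filter(alpha, S, h):
    # Apply the polynomial filter h(S) = sum_k h[k] * S^k to the
    # coefficient vector alpha of f = sum_i alpha_i k(., x_i).
    out = np.zeros_like(alpha)
    Sk_alpha = alpha.copy()
    for hk in h:
        out += hk * Sk_alpha      # accumulate h_k * (S^k @ alpha)
        Sk_alpha = S @ Sk_alpha
    return out

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 2))      # sample locations (e.g., UAV measurement points)
alpha = rng.normal(size=50)       # expansion coefficients of the input signal
K = gaussian_kernel(X, X)
S = K / np.linalg.norm(K, 2)      # normalized Gram matrix as one admissible shift operator
beta = rkhs_filter(alpha, S, h=[0.5, 0.3, 0.2])  # coefficients of the filtered element
```

A convolutional layer in this spirit would make the taps h learnable and compose h(S) with a pointwise nonlinearity, as the abstract describes.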
Related papers
- Stability Analysis of Equivariant Convolutional Representations Through The Lens of Equivariant Multi-layered CKNs [0.0]
We construct and theoretically analyse group equivariant convolutional kernel networks (CKNs).
We study the stability of such equiv-CNNs under the action of diffeomorphisms.
The goal is to analyse the geometry of the inductive biases of equiv-CNNs through the lens of reproducing kernel Hilbert spaces (RKHSs).
arXiv Detail & Related papers (2024-08-08T07:31:22Z) - A Sampling Theory Perspective on Activations for Implicit Neural Representations [73.6637608397055]
Implicit Neural Representations (INRs) have gained popularity for encoding signals as compact, differentiable entities.
We conduct a comprehensive analysis of these activations from a sampling theory perspective.
Our investigation reveals that sinc activations, previously unused in conjunction with INRs, are theoretically optimal for signal encoding.
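This optimality claim rests on classical Shannon sampling, where sinc is the ideal interpolation kernel for bandlimited signals. A minimal illustrative sketch (not taken from the paper) reconstructs a bandlimited tone from its Nyquist-rate samples by sinc interpolation:

```python
import numpy as np

B = 4.0                                  # assumed bandwidth in Hz
T = 1.0 / (2.0 * B)                      # Nyquist sampling period
n = np.arange(-100, 101)                 # sample indices
x_n = np.cos(2 * np.pi * 3.0 * n * T)    # 3 Hz tone, bandlimited to B

t = np.linspace(-1.0, 1.0, 1001)         # dense evaluation grid
# Shannon reconstruction: x(t) = sum_n x[n] * sinc((t - n*T) / T)
x_hat = (x_n[None, :] * np.sinc((t[:, None] - n[None, :] * T) / T)).sum(-1)
print(np.max(np.abs(x_hat - np.cos(2 * np.pi * 3.0 * t))))  # small; shrinks as more samples are kept
```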
arXiv Detail & Related papers (2024-02-08T05:52:45Z) - Learning in RKHM: a $C^*$-Algebraic Twist for Kernel Machines [13.23700804428796]
Supervised learning in reproducing kernel Hilbert space (RKHS) and vector-valued RKHS (vvRKHS) has been investigated for more than 30 years.
We provide a new twist by generalizing supervised learning in RKHS and vvRKHS to reproducing kernel Hilbert $C^*$-modules (RKHMs).
We show how to construct effective positive-definite kernels by considering the perspective of $C^*$-algebra.
arXiv Detail & Related papers (2022-10-21T10:23:54Z) - Inducing Gaussian Process Networks [80.40892394020797]
We propose inducing Gaussian process networks (IGN), a simple framework for simultaneously learning the feature space as well as the inducing points.
The inducing points, in particular, are learned directly in the feature space, enabling a seamless representation of complex structured domains.
We report on experimental results for real-world data sets showing that IGNs provide significant advances over state-of-the-art methods.
arXiv Detail & Related papers (2022-04-21T05:27:09Z) - Learning primal-dual sparse kernel machines [10.230121160034674]
Traditionally, kernel methods rely on the representer theorem, which states that the solution to a learning problem is obtained as a linear combination of the data mapped into the reproducing kernel Hilbert space (RKHS).
We propose to search for a solution in RKHS that has a pre-image decomposition in the original data space, where the elements don't necessarily correspond to the elements in the training set.
Our gradient-based method then hinges on optimisation over possibly sparse elements in the input space, and enables us to obtain a kernel-based model with both primal and dual sparsity.
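For context, the classical setting being relaxed here is exemplified by kernel ridge regression, whose representer-theorem solution expands only over the training points. A minimal textbook sketch (standard KRR, not this paper's primal-dual method):

```python
import numpy as np

def rbf(X, Y, sigma=0.5):
    # Gram matrix for the Gaussian (RBF) kernel
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma**2))

rng = np.random.default_rng(1)
X = rng.uniform(-1.0, 1.0, size=(30, 1))   # training inputs
y = np.sin(3.0 * X[:, 0]) + 0.1 * rng.normal(size=30)

lam = 1e-2                                 # ridge regularizer
# Representer theorem: f* = sum_i alpha_i k(., x_i) with alpha = (K + lam*I)^{-1} y
alpha = np.linalg.solve(rbf(X, X) + lam * np.eye(30), y)

X_new = np.linspace(-1.0, 1.0, 5)[:, None]
f_new = rbf(X_new, X) @ alpha              # evaluate f* at new points
```

The paper's proposal replaces the fixed expansion points x_i with possibly sparse, learned elements of the input space.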
arXiv Detail & Related papers (2021-08-27T09:38:53Z) - Convolutional Filtering and Neural Networks with Non Commutative Algebras [153.20329791008095]
We study the generalization of non-commutative convolutional neural networks.
We show that non-commutative convolutional architectures can be stable to deformations on the space of operators.
arXiv Detail & Related papers (2021-08-23T04:22:58Z) - Action Recognition with Kernel-based Graph Convolutional Networks [14.924672048447338]
Learning graph convolutional networks (GCNs) aims at generalizing deep learning to arbitrary non-regular domains.
We introduce a novel GCN framework that achieves spatial graph convolution in a reproducing kernel Hilbert space (RKHS).
The particularity of our GCN model also resides in its ability to achieve convolutions without explicitly realigning nodes in the receptive fields of the learned graph filters with those of the input graphs.
arXiv Detail & Related papers (2020-12-28T11:02:51Z) - Stability of Algebraic Neural Networks to Small Perturbations [179.55535781816343]
Algebraic neural networks (AlgNNs) are composed of a cascade of layers, each one associated with an algebraic signal model.
We show how any architecture that uses a formal notion of convolution can be stable beyond particular choices of the shift operator.
arXiv Detail & Related papers (2020-10-22T09:10:16Z) - Algebraic Neural Networks: Stability to Deformations [179.55535781816343]
We study algebraic neural networks (AlgNNs) with commutative algebras.
AlgNNs unify diverse architectures such as Euclidean convolutional neural networks, graph neural networks, and group neural networks.
arXiv Detail & Related papers (2020-09-03T03:41:38Z) - Analysis via Orthonormal Systems in Reproducing Kernel Hilbert $C^*$-Modules and Applications [12.117553807794382]
We propose a novel data analysis framework with reproducing kernel Hilbert $C^*$-modules (RKHMs).
We show the theoretical validity of the construction of orthonormal systems in Hilbert $C^*$-modules, and derive concrete procedures for orthonormalization in RKHMs.
We apply these results to generalize kernel principal component analysis with RKHMs and to analyze dynamical systems with Perron-Frobenius operators.
arXiv Detail & Related papers (2020-03-02T10:01:14Z)