RKHS Representation of Algebraic Convolutional Filters with Integral Operators
- URL: http://arxiv.org/abs/2602.19094v1
- Date: Sun, 22 Feb 2026 08:28:34 GMT
- Title: RKHS Representation of Algebraic Convolutional Filters with Integral Operators
- Authors: Alejandro Parada-Mayorga, Alejandro Ribeiro, Juan Bazerque
- Abstract summary: In this paper, we develop a theory showing that the range of integral operators naturally induces RKHS convolutional signal models. We show that filtering with integral operators corresponds to iterated box products, giving rise to a unital kernel algebra. Our results establish precise connections between eigendecompositions and RKHS representations in graphon signal processing, extend naturally to directed graphons, and enable novel spatial-spectral localization results.
- Score: 111.57971404925486
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Integral operators play a central role in signal processing, underpinning classical convolution and filtering on continuous network models such as graphons. While these operators are traditionally analyzed through spectral decompositions, their connection to reproducing kernel Hilbert spaces (RKHS) has not been systematically explored within the algebraic signal processing framework. In this paper, we develop a comprehensive theory showing that the range of integral operators naturally induces RKHS convolutional signal models whose reproducing kernels are determined by a box product of the operator symbols. We characterize the algebraic and spectral properties of these induced RKHS and show that polynomial filtering with integral operators corresponds to iterated box products, giving rise to a unital kernel algebra. This perspective yields pointwise RKHS representations of filters via the reproducing property, providing an alternative to operator-based implementations. Our results establish precise connections between eigendecompositions and RKHS representations in graphon signal processing, extend naturally to directed graphons, and enable novel spatial-spectral localization results. Furthermore, we show that when the spectral domain is a subset of the original domain of the signals, optimal filters for regularized learning problems admit finite-dimensional RKHS representations, providing a principled foundation for learnable filters in integral-operator-based neural architectures.
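The following NumPy sketch illustrates the equivalence the abstract builds on: a polynomial filter of an integral operator can be evaluated either by iterating the operator or through its eigendecomposition. The graphon W(x, y) = min(x, y)(1 - max(x, y)), the signal, and the coefficients are illustrative choices, not taken from the paper, and the box-product kernel construction itself is not reproduced here.

```python
import numpy as np

# Discretize a graphon W(x, y) on [0, 1] and apply a polynomial filter
# h(T_W) f = sum_k h_k T_W^k f in two ways: by iterating the integral
# operator, and through its eigendecomposition. Both routes agree.

n = 200
x = (np.arange(n) + 0.5) / n                         # uniform grid on [0, 1]
W = np.minimum.outer(x, x) * (1 - np.maximum.outer(x, x))  # illustrative graphon
T = W / n                                            # discretized operator T_W

f = np.sin(2 * np.pi * x)                            # a signal on [0, 1]
h = [0.5, 1.0, -0.25]                                # filter coefficients h_0..h_2

# Operator-based implementation: accumulate h_k * T^k f.
g_op = np.zeros(n)
Tk_f = f.copy()
for hk in h:
    g_op += hk * Tk_f
    Tk_f = T @ Tk_f

# Spectral implementation: h(T) f = V diag(h(lambda)) V^T f.
lam, V = np.linalg.eigh(T)
g_spec = V @ (np.polyval(h[::-1], lam) * (V.T @ f))

print(np.allclose(g_op, g_spec))                     # True
```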
Related papers
- The Vekua Layer: Exact Physical Priors for Implicit Neural Representations via Generalized Analytic Functions [0.0]
Implicit Neural Representations (INRs) have emerged as a powerful paradigm for parameterizing physical fields. We introduce a differentiable spectral method grounded in the theory of generalized analytic functions. We show that our method can effectively act as a physics-informed spectral filter.
arXiv Detail & Related papers (2025-12-11T21:57:21Z)
- A Spectral Interpretation of Redundancy in a Graph Reservoir [51.40366905583043]
This work revisits the definition of the reservoir in the Multiresolution Reservoir Graph Neural Network (MRGNN). It proposes a variant based on a fairing algorithm originally introduced in the field of surface design in computer graphics. The core contribution of the paper lies in the theoretical analysis of the algorithm from a random-walks perspective, sketched below.
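As a hedged sketch (it is an assumption that a Taubin-style lambda|mu scheme is the kind of fairing algorithm meant), the following shows why fairing reads naturally as a spectral low-pass graph filter:

```python
import numpy as np

# Taubin-style "lambda|mu" fairing (an assumed stand-in for the paper's
# fairing algorithm): alternating a smoothing step and an inflating step
# yields a stable low-pass graph filter, which is what invites a
# random-walk / spectral analysis.

rng = np.random.default_rng(4)
A = (rng.random((50, 50)) < 0.1).astype(float)
A = np.triu(A, 1)
A = A + A.T                                         # undirected random graph
d = np.maximum(A.sum(1), 1.0)
L = np.eye(50) - A / np.sqrt(np.outer(d, d))        # normalized Laplacian, spectrum in [0, 2]

x0 = rng.standard_normal(50)                        # rough input signal
x = x0.copy()
lam, mu = 0.5, -0.53                                # classic shrink/inflate pair
for _ in range(20):
    x = x - lam * (L @ x)                           # shrink (smooth)
    x = x - mu * (L @ x)                            # inflate (compensate shrinkage)

# Dirichlet energy x^T L x measures roughness; fairing should reduce it.
print(x @ L @ x < x0 @ L @ x0)                      # True: output is smoother
```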
arXiv Detail & Related papers (2025-07-17T10:02:57Z)
- Convolutional Filtering with RKHS Algebras [110.06688302593349]
We develop a theory of convolutional signal processing and neural networks for reproducing kernel Hilbert spaces (RKHS). We show that any RKHS allows the formal definition of multiple algebraic convolutional models. We present a set of numerical experiments on real data in which wireless coverage is predicted from measurements captured by unmanned aerial vehicles.
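As a hedged illustration of the finite-dimensional RKHS representations these papers rely on (the classical representer theorem, not the paper's specific convolutional algebra), the sketch below fits a regularized estimate in a Gaussian-kernel RKHS and evaluates it pointwise via the reproducing property; the kernel, bandwidth, and data are invented for the example.

```python
import numpy as np

# Representer theorem in action: the regularized optimum lies in
# span{K(., x_i)}, so it is found by solving (K + n*mu*I) alpha = y
# and evaluated pointwise as f(x) = sum_i alpha_i K(x, x_i).

rng = np.random.default_rng(0)
xs = np.sort(rng.uniform(0, 1, 40))                 # sample locations
y = np.sin(4 * np.pi * xs) + 0.1 * rng.standard_normal(40)  # noisy signal

def K(a, b, ell=0.1):
    """Gaussian reproducing kernel (an illustrative choice)."""
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ell) ** 2)

mu = 1e-3                                           # regularization weight
alpha = np.linalg.solve(K(xs, xs) + len(xs) * mu * np.eye(len(xs)), y)

xq = np.linspace(0, 1, 200)                         # query grid
f_hat = K(xq, xs) @ alpha                           # pointwise RKHS evaluation
print(f_hat.shape)                                  # (200,)
```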
arXiv Detail & Related papers (2024-11-02T18:53:44Z)
- Energy-filtered excited states and real-time dynamics served in a contour integral [0.0]
The Cauchy integral formula (CIF) can be used to represent holomorphic functions of diagonalizable operators on a finite domain. I showcase a novel real-time electron dynamics (RT-EOM-CCSD) algorithm based on the CIF form of the exponential time-evolution operator.
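A minimal numerical check of the CIF recipe, assuming a symmetric matrix and f(z) = exp(z); this is the generic contour-integral representation f(A) = (1/2πi) ∮ f(z)(zI − A)⁻¹ dz, not the paper's RT-EOM-CCSD algorithm.

```python
import numpy as np

# Evaluate f(A) = (1/2*pi*i) \oint f(z) (zI - A)^{-1} dz by trapezoidal
# quadrature on a circle enclosing spec(A), and compare against the
# eigendecomposition of the symmetric matrix A.

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))
A = 0.5 * (A + A.T)                                 # symmetric => real spectrum

radius = np.abs(np.linalg.eigvals(A)).max() + 1.0   # circle enclosing the spectrum
m = 400                                             # quadrature nodes
theta = 2 * np.pi * np.arange(m) / m
z = radius * np.exp(1j * theta)                     # contour points
dz = 1j * z * (2 * np.pi / m)                       # dz = i z dtheta

I = np.eye(4)
fA = np.zeros((4, 4), dtype=complex)
for zk, dzk in zip(z, dz):
    fA += np.exp(zk) * np.linalg.solve(zk * I - A, I) * dzk
fA /= 2j * np.pi

lam, V = np.linalg.eigh(A)                          # reference: exp(A) spectrally
print(np.allclose(fA.real, V @ np.diag(np.exp(lam)) @ V.T, atol=1e-8))  # True
```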
arXiv Detail & Related papers (2024-09-11T15:39:50Z)
- Parseval Convolution Operators and Neural Networks [16.78532039510369]
We first identify the Parseval convolution operators as the class of energy-preserving filterbanks.
We then present a constructive approach for the design/specification of such filterbanks via the chaining of elementary Parseval modules.
We demonstrate the usage of those tools with the design of a CNN-based algorithm for the iterative reconstruction of biomedical images.
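A hedged sketch of the defining energy-preservation property, using a one-level orthonormal Haar filterbank as a stand-in for an elementary Parseval module; the paper's actual modules are not reproduced here.

```python
import numpy as np

# A Parseval filterbank preserves energy: the analysis channels together
# satisfy ||y_low||^2 + ||y_high||^2 == ||x||^2, and chaining such modules
# keeps the property. Here: a toy one-level orthonormal Haar pair.

def haar_analysis(x):
    """One-level orthonormal Haar filterbank (even-length input)."""
    x = x.reshape(-1, 2)
    low = (x[:, 0] + x[:, 1]) / np.sqrt(2)
    high = (x[:, 0] - x[:, 1]) / np.sqrt(2)
    return low, high

rng = np.random.default_rng(2)
x = rng.standard_normal(64)
low, high = haar_analysis(x)

# Chain a second module on the lowpass channel.
low2, high2 = haar_analysis(low)
energy = np.sum(low2**2) + np.sum(high2**2) + np.sum(high**2)
print(np.isclose(energy, np.sum(x**2)))             # True: energy preserved
```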
arXiv Detail & Related papers (2024-08-19T13:31:16Z)
- Tangent Bundle Convolutional Learning: from Manifolds to Cellular Sheaves and Back [84.61160272624262]
We define tangent bundle filters and tangent bundle neural networks (TNNs) based on this convolution operation.
Tangent bundle filters admit a spectral representation that generalizes those of scalar manifold filters, graph filters and standard convolutional filters in continuous time.
We numerically evaluate the effectiveness of the proposed architecture on various learning tasks.
arXiv Detail & Related papers (2023-03-20T17:57:15Z)
- Convolutional Filtering and Neural Networks with Non Commutative Algebras [153.20329791008095]
We study the generalization of non-commutative convolutional neural networks.
We show that non-commutative convolutional architectures can be stable to deformations on the space of operators.
arXiv Detail & Related papers (2021-08-23T04:22:58Z)
- Stability to Deformations of Manifold Filters and Manifold Neural Networks [89.53585099149973]
The paper defines and studies manifold convolutional filters and manifold neural networks (MNNs).
The main technical contribution of the paper is to analyze the stability of manifold filters and MNNs to smooth deformations of the manifold.
arXiv Detail & Related papers (2021-06-07T15:41:03Z)
- Learning Chebyshev Basis in Graph Convolutional Networks for Skeleton-based Action Recognition [14.924672048447338]
Spectral graph convolutional networks (GCNs) are deep models that extend neural networks to arbitrary irregular domains.
We introduce a novel spectral GCN that learns not only the usual convolutional parameters but also the Laplacian operators.
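A hedged sketch of the Chebyshev-basis graph filter that such spectral GCNs build on (the paper's additional step of learning the Laplacian itself is omitted); the graph, signal, and coefficients are invented for the example.

```python
import numpy as np

# Chebyshev graph filter: y = sum_k theta_k T_k(L_hat) x, with T_0 = I,
# T_1 = L_hat, T_k = 2 L_hat T_{k-1} - T_{k-2}, where L_hat is the
# Laplacian rescaled so its spectrum lies in [-1, 1].

rng = np.random.default_rng(3)
A = (rng.random((30, 30)) < 0.15).astype(float)
A = np.triu(A, 1)
A = A + A.T                                        # undirected graph
L = np.diag(A.sum(1)) - A                          # combinatorial Laplacian
lmax = np.linalg.eigvalsh(L).max()
L_hat = 2 * L / lmax - np.eye(30)                  # spectrum mapped into [-1, 1]

x = rng.standard_normal(30)                        # graph signal
theta = np.array([1.0, -0.5, 0.25])                # filter coefficients (learnable)

Tk_prev, Tk = x, L_hat @ x                         # T_0 x and T_1 x
y = theta[0] * Tk_prev + theta[1] * Tk
for k in range(2, len(theta)):
    Tk_prev, Tk = Tk, 2 * L_hat @ Tk - Tk_prev     # Chebyshev recurrence
    y += theta[k] * Tk
print(y.shape)                                     # (30,)
```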
arXiv Detail & Related papers (2021-04-12T14:08:58Z)