Duality for Neural Networks through Reproducing Kernel Banach Spaces
- URL: http://arxiv.org/abs/2211.05020v2
- Date: Thu, 10 Nov 2022 11:11:21 GMT
- Title: Duality for Neural Networks through Reproducing Kernel Banach Spaces
- Authors: Len Spek, Tjeerd Jan Heeringa, Christoph Brune
- Abstract summary: We show that Reproducing Kernel Banach spaces (RKBS) can be understood as an infinite union of RKHSs.
As the RKBS is not a Hilbert space, it is not its own dual space. However, we show that its dual space is again an RKBS where the roles of the data and parameters are interchanged.
This allows us to construct the saddle point problem for neural networks, which can be used in the whole field of primal-dual optimisation.
- Score: 1.3750624267664155
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Reproducing Kernel Hilbert spaces (RKHS) have been a very successful tool in
various areas of machine learning. Recently, Barron spaces have been used to
prove bounds on the generalisation error for neural networks. Unfortunately,
Barron spaces cannot be understood in terms of RKHS due to the strong nonlinear
coupling of the weights. We show that this can be solved by using the more
general Reproducing Kernel Banach spaces (RKBS). This class of integral RKBS
can be understood as an infinite union of RKHSs. As the RKBS is not a
Hilbert space, it is not its own dual space. However, we show that its dual
space is again an RKBS where the roles of the data and parameters are
interchanged, forming an adjoint pair of RKBSs including a reproducing property
in the dual space. This allows us to construct the saddle point problem for
neural networks, which can be used in the whole field of primal-dual
optimisation.
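To make the duality concrete, here is a minimal sketch in generic notation; the activation $\sigma$, parameter domain $\Omega$, measures $\mu, \nu$, regularisation weight $\lambda$, and loss $\mathcal{L}$ are illustrative assumptions, not the paper's exact symbols. A function in the integral RKBS is a superposition of neurons weighted by a signed measure $\mu$ over the parameters $(w, b)$:
$$f_\mu(x) = \int_\Omega \sigma(\langle w, x\rangle + b)\,d\mu(w,b), \qquad \|f\|_{\mathcal{B}} = \inf\{\|\mu\|_{TV} : f = f_\mu\}.$$
Interchanging the roles of data and parameters gives the dual-side objects: functions of $(w, b)$ built from a measure $\nu$ on the data,
$$g_\nu(w,b) = \int_X \sigma(\langle w, x\rangle + b)\,d\nu(x).$$
Pairing the two spaces through this bilinear form and applying Fenchel duality to a regularised learning problem then yields a saddle point of the schematic form
$$\min_\mu \max_\nu \; \int \sigma(\langle w, x\rangle + b)\,d\nu(x)\,d\mu(w,b) - \mathcal{L}^*(\nu) + \lambda\|\mu\|_{TV},$$
with $\mathcal{L}^*$ the convex conjugate of the data-fit term; this is the form that standard primal-dual methods operate on.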
Related papers
- Which Spaces can be Embedded in $L_p$-type Reproducing Kernel Banach Space? A Characterization via Metric Entropy [4.256898347232072]
We establish a novel connection between the metric entropy growth and the embeddability of function spaces into reproducing kernel Hilbert/Banach spaces.
Our results shed new light on the power and limitations of kernel methods for learning complex function spaces.
arXiv Detail & Related papers (2024-10-14T21:53:19Z)
- Learning with Norm Constrained, Over-parameterized, Two-layer Neural Networks [54.177130905659155]
Recent studies show that a reproducing kernel Hilbert space (RKHS) is not a suitable space to model functions by neural networks.
In this paper, we study a suitable function space for over-parameterized two-layer neural networks with bounded norms.
arXiv Detail & Related papers (2024-04-29T15:04:07Z)
- Hypothesis Spaces for Deep Learning [7.695772976072261]
This paper introduces a hypothesis space for deep learning that employs deep neural networks (DNNs).
By treating a DNN as a function of two variables, we consider the primitive set of DNNs with the parameter variable ranging over a set of weight matrices and biases determined by a prescribed depth and widths of the DNNs.
We prove that the Banach space so constructed is a reproducing kernel Banach space (RKBS) and construct its reproducing kernel.
arXiv Detail & Related papers (2024-03-05T22:42:29Z)
- Barron Space for Graph Convolution Neural Networks [4.980329703241598]
Graph convolutional neural networks (GCNNs) operate on graph domains.
In this paper, we introduce a Barron space of functions on a compact domain of graph signals.
arXiv Detail & Related papers (2023-11-06T02:58:05Z)
- Gradient Descent in Neural Networks as Sequential Learning in RKBS [63.011641517977644]
We construct an exact power-series representation of the neural network in a finite neighborhood of the initial weights (see the sketch after this entry).
We prove that, regardless of width, the training sequence produced by gradient descent can be exactly replicated by regularized sequential learning.
arXiv Detail & Related papers (2023-02-01T03:18:07Z)
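As a hedged illustration of the power-series claim above (generic notation, not the paper's): for an analytic activation, the network output admits, in a finite neighborhood of the initial weights $w_0$, the expansion
$$f(x; w) = \sum_{k=0}^{\infty} \frac{1}{k!}\, D_w^k f(x; w_0)\,[w - w_0]^{\otimes k},$$
so each gradient-descent step updates the coefficient sequence of this series, which is what lets the training trajectory be matched step by step by regularised sequential learning in an RKBS built on these features.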
- Continuous percolation in a Hilbert space for a large system of qubits [58.720142291102135]
The percolation transition is defined through the appearance of the infinite cluster.
We show that the exponentially increasing dimensionality of the Hilbert space makes its covering by finite-size hyperspheres inefficient.
Our approach to the percolation transition in compact metric spaces may prove useful for its rigorous treatment in other contexts.
arXiv Detail & Related papers (2022-10-15T13:53:21Z)
- Non-Hermitian $C_{NH} = 2$ Chern insulator protected by generalized rotational symmetry [85.36456486475119]
A non-Hermitian system is protected by the generalized rotational symmetry $H^{+} = U H U^{+}$ of the system.
Our finding paves the way towards novel non-Hermitian topological systems characterized by large values of topological invariants.
arXiv Detail & Related papers (2021-11-24T15:50:22Z)
- The Separation Capacity of Random Neural Networks [78.25060223808936]
We show that a sufficiently large two-layer ReLU network with standard Gaussian weights and uniformly distributed biases can solve this separation problem with high probability.
We quantify the relevant structure of the data in terms of a novel notion of mutual complexity.
arXiv Detail & Related papers (2021-07-31T10:25:26Z)
- Kolmogorov Width Decay and Poor Approximators in Machine Learning: Shallow Neural Networks, Random Feature Models and Neural Tangent Kernels [8.160343645537106]
We establish a scale separation of Kolmogorov width type between subspaces of a given Banach space.
We show that reproducing kernel Hilbert spaces are poor $L^2$-approximators for the class of two-layer neural networks in high dimension.
arXiv Detail & Related papers (2020-05-21T17:40:38Z)
- Kernel-Based Reinforcement Learning: A Finite-Time Analysis [53.47210316424326]
We introduce Kernel-UCBVI, a model-based optimistic algorithm that leverages the smoothness of the MDP and a non-parametric kernel estimator of the rewards.
We empirically validate our approach in continuous MDPs with sparse rewards.
arXiv Detail & Related papers (2020-04-12T12:23:46Z)