Interpolation with the polynomial kernels
- URL: http://arxiv.org/abs/2212.07658v1
- Date: Thu, 15 Dec 2022 08:30:23 GMT
- Title: Interpolation with the polynomial kernels
- Authors: Giacomo Elefante and Wolfgang Erb and Francesco Marchetti and Emma
Perracchione and Davide Poggiali and Gabriele Santin
- Abstract summary: The polynomial kernels are widely used in machine learning and are one of the default choices for developing kernel-based regression models.
However, they are rarely used or considered in numerical analysis due to their lack of strict positive definiteness.
This paper is devoted to establishing some initial results for the study of these kernels and their related algorithms.
- Score: 5.8720142291102135
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The polynomial kernels are widely used in machine learning and they are one
of the default choices to develop kernel-based classification and regression
models. However, they are rarely used or considered in numerical analysis due
to their lack of strict positive definiteness. In particular they do not enjoy
the usual property of unisolvency for arbitrary point sets, which is one of the
key properties used to build kernel-based interpolation methods. This paper is
devoted to establishing some initial results for the study of these kernels, and
their related interpolation algorithms, in the context of approximation theory.
We will first prove necessary and sufficient conditions on point sets which
guarantee the existence and uniqueness of an interpolant. We will then study
the Reproducing Kernel Hilbert Spaces (or native spaces) of these kernels and
their norms, and provide inclusion relations between spaces corresponding to
different kernel parameters. With these spaces at hand, it will be further
possible to derive generic error estimates which apply to sufficiently smooth
functions, thus escaping the native space. Finally, we will show how to employ
an efficient stable algorithm to these kernels to obtain accurate interpolants,
and we will test them in some numerical experiment. After this analysis several
computational and theoretical aspects remain open, and we will outline possible
further research directions in a concluding section. This work builds some
bridges between kernel and polynomial interpolation, two topics to which the
authors, to different extents, have been introduced under the supervision or
through the work of Stefano De Marchi. For this reason, they wish to dedicate
this work to him on the occasion of his 60th birthday.
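To make the interpolation problem in the abstract concrete, here is a minimal NumPy sketch (not code from the paper) that assembles the polynomial-kernel matrix K(x, y) = (x·y + c)^d and fits an interpolant. Because the kernel is not strictly positive definite, the kernel matrix can be singular for arbitrary point sets, so the sketch uses a least-squares solve instead of a direct inverse; all function names and parameter choices are illustrative.

```python
import numpy as np

def polynomial_kernel(X, Y, degree=3, c=1.0):
    """Polynomial kernel K(x, y) = (x . y + c)^degree."""
    return (X @ Y.T + c) ** degree

def fit_interpolant(X, f_vals, degree=3, c=1.0):
    """Solve K @ a = f for the coefficients of s(x) = sum_j a_j K(x, x_j).

    A polynomial kernel is not strictly positive definite, so the
    kernel matrix may be singular for arbitrary point sets; we use a
    least-squares solve rather than a direct inverse.
    """
    K = polynomial_kernel(X, X, degree, c)
    coef, *_ = np.linalg.lstsq(K, f_vals, rcond=None)
    return coef

def evaluate_interpolant(X_train, coef, X_new, degree=3, c=1.0):
    """Evaluate s at new points."""
    return polynomial_kernel(X_new, X_train, degree, c) @ coef

# Demo: scattered data in 2D with a polynomial target.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(50, 2))
f = X[:, 0] ** 2 + X[:, 0] * X[:, 1]

coef = fit_interpolant(X, f, degree=3)
X_test = rng.uniform(-1.0, 1.0, size=(200, 2))
f_test = X_test[:, 0] ** 2 + X_test[:, 0] * X_test[:, 1]
pred = evaluate_interpolant(X, coef, X_test, degree=3)
print(f"max error: {np.abs(pred - f_test).max():.2e}")
```

Since the target here lies in the span of degree-3 polynomials, the least-squares interpolant recovers it essentially exactly on unseen points.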
Related papers
- Optimal Kernel Choice for Score Function-based Causal Discovery [92.65034439889872]
We propose a kernel selection method within the generalized score function that automatically selects the optimal kernel that best fits the data.
We conduct experiments on both synthetic data and real-world benchmarks, and the results demonstrate that our proposed method outperforms existing kernel selection methods.
arXiv Detail & Related papers (2024-07-14T09:32:20Z)
- On the Approximation of Kernel functions [0.0]
The paper addresses approximations of the kernel itself.
For the Hilbert Gauss kernel on the unit cube, the paper establishes an upper bound on the associated eigenfunctions.
This improved bound supports low-rank approximation methods such as the Nyström method (a generic sketch follows this entry).
arXiv Detail & Related papers (2024-03-11T13:50:07Z)
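The summary above mentions the Nyström method; the following is a minimal sketch of the classical rank-m Nyström approximation K ≈ C W⁺ Cᵀ for a Gaussian kernel matrix. This is the generic textbook construction, not the paper's algorithm; function names and parameters are illustrative.

```python
import numpy as np

def gaussian_kernel(X, Y, gamma=1.0):
    """Gaussian kernel K(x, y) = exp(-gamma * ||x - y||^2)."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def nystroem(X, m, gamma=1.0, seed=0):
    """Rank-m Nystroem approximation K ~ C W^+ C^T, where C holds
    m sampled columns of K and W is the corresponding m x m block."""
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X), size=m, replace=False)
    C = gaussian_kernel(X, X[idx], gamma)   # n x m sampled columns
    W = C[idx, :]                           # m x m subblock
    return C @ np.linalg.pinv(W) @ C.T

rng = np.random.default_rng(1)
X = rng.standard_normal((500, 3))
K = gaussian_kernel(X, X)
K_hat = nystroem(X, m=50)
print("relative error:", np.linalg.norm(K - K_hat) / np.linalg.norm(K))
```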
- Approximation by non-symmetric networks for cross-domain learning [0.0]
We study the approximation capabilities of kernel-based networks using non-symmetric kernels.
We obtain estimates on the accuracy of uniform approximation of functions in a Sobolev class by ReLU$^r$ networks when $r$ is not necessarily an integer.
arXiv Detail & Related papers (2023-05-06T01:33:26Z)
- Kernelized Cumulants: Beyond Kernel Mean Embeddings [11.448622437140022]
We extend cumulants to reproducing kernel Hilbert spaces (RKHS) using tools from tensor algebras.
We argue that going beyond degree one has several advantages and can be achieved with the same computational complexity and minimal overhead.
arXiv Detail & Related papers (2023-01-29T15:31:06Z)
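To make "going beyond degree one" in the entry above concrete: the degree-two analogue of a kernel mean embedding is a covariance-type operator, whose Hilbert-Schmidt norm gives the classical HSIC statistic. The sketch below computes the standard biased HSIC estimate from centered Gram matrices; it is a generic illustration of a degree-two kernelized statistic, not necessarily the estimator proposed in the paper.

```python
import numpy as np

def gaussian_gram(X, gamma=1.0):
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def hsic(X, Y, gamma=1.0):
    """Biased HSIC estimate, a degree-two statistic on RKHS embeddings:
    HSIC = trace(K H L H) / n^2 with centering matrix H = I - 11^T / n."""
    n = len(X)
    K = gaussian_gram(X, gamma)
    L = gaussian_gram(Y, gamma)
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(K @ H @ L @ H) / n**2

rng = np.random.default_rng(2)
X = rng.standard_normal((200, 1))
Y_dep = X + 0.1 * rng.standard_normal((200, 1))   # dependent pair
Y_ind = rng.standard_normal((200, 1))             # independent pair
print("dependent:  ", hsic(X, Y_dep))
print("independent:", hsic(X, Y_ind))
```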
- Reconstructing Kernel-based Machine Learning Force Fields with Super-linear Convergence [0.18416014644193063]
We consider the broad class of Nyström-type methods to construct preconditioners.
All considered methods aim to identify a representative subset of inducing (kernel) columns to approximate the dominant kernel spectrum.
arXiv Detail & Related papers (2022-12-24T13:45:50Z)
- On the Benefits of Large Learning Rates for Kernel Methods [110.03020563291788]
We show that this phenomenon can be precisely characterized in the context of kernel methods.
We consider the minimization of a quadratic objective in a separable Hilbert space, and show that with early stopping, the choice of learning rate influences the spectral decomposition of the obtained solution.
arXiv Detail & Related papers (2022-02-28T13:01:04Z)
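A standard way to see how the learning rate shapes the spectral content of an early-stopped solution is the spectral-filtering view of kernel gradient descent. The sketch below is a finite-dimensional toy version of this idea (the paper works in a separable Hilbert space and proves much sharper statements); all names and constants are illustrative.

```python
import numpy as np

# Gradient descent on the kernel least-squares objective
#   J(alpha) = 0.5 * ||K alpha - y||^2.
# Starting from alpha_0 = 0 with learning rate eta, after t steps each
# eigencomponent of the solution is filtered by
#   phi_t(s) = (1 - (1 - eta * s^2)^t) / s   (s = eigenvalue of K),
# so the learning rate and stopping time jointly decide which part of
# the spectrum the iterate has fit.

def kernel_gd(K, y, eta, steps):
    alpha = np.zeros_like(y)
    for _ in range(steps):
        alpha -= eta * K.T @ (K @ alpha - y)
    return alpha

rng = np.random.default_rng(3)
X = np.sort(rng.uniform(-1, 1, 80))
K = np.exp(-10 * (X[:, None] - X[None, :]) ** 2)
y = np.sign(np.sin(4 * X))

lam_max = np.linalg.eigvalsh(K).max()
for eta in (0.5 / lam_max**2, 1.9 / lam_max**2):
    alpha = kernel_gd(K, y, eta, steps=200)
    print(f"eta={eta:.2e}  residual={np.linalg.norm(K @ alpha - y):.3f}")
```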
- Fast Sketching of Polynomial Kernels of Polynomial Degree [61.83993156683605]
The polynomial kernel is especially important as other kernels can often be approximated by it via a Taylor series expansion.
Recent techniques in oblivious sketching reduce the dependence of the running time on the degree $q$ of the kernel.
We give a new sketch which greatly improves upon this running time, by removing the dependence on $q$ in the leading order term.
arXiv Detail & Related papers (2021-08-21T02:14:55Z)
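To illustrate the Taylor-series connection mentioned in the entry above: the Gaussian kernel factors as exp(-||x-y||²/2) = e^{-||x||²/2} e^{-||y||²/2} exp(x·y), and truncating the series of exp(x·y) yields a weighted sum of polynomial kernels. The sketch shows the truncation error shrinking with the degree q; it illustrates the expansion only, not the paper's sketching algorithm.

```python
import math
import numpy as np

def gaussian_kernel(X, Y):
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2)

def taylor_poly_approx(X, Y, q):
    """Degree-q Taylor approximation of the Gaussian kernel:
    exp(-||x-y||^2/2) = e^{-||x||^2/2} e^{-||y||^2/2} exp(x.y),
    with exp(x.y) truncated to sum_{k<=q} (x.y)^k / k!,
    i.e. a weighted sum of homogeneous polynomial kernels."""
    G = X @ Y.T
    series = sum(G ** k / math.factorial(k) for k in range(q + 1))
    sx = np.exp(-0.5 * (X ** 2).sum(-1))
    sy = np.exp(-0.5 * (Y ** 2).sum(-1))
    return sx[:, None] * series * sy[None, :]

rng = np.random.default_rng(4)
X = rng.standard_normal((100, 5)) / np.sqrt(5.0)  # keep norms moderate
K = gaussian_kernel(X, X)
for q in (2, 4, 8):
    err = np.abs(K - taylor_poly_approx(X, X, q)).max()
    print(f"q = {q}: max abs error {err:.2e}")
```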
- Taming Nonconvexity in Kernel Feature Selection---Favorable Properties of the Laplace Kernel [77.73399781313893]
A challenge is to establish the objective function of kernel-based feature selection.
The gradient-based algorithms available for this nonconvex optimization problem are only able to guarantee convergence to local minima.
arXiv Detail & Related papers (2021-06-17T11:05:48Z)
- Kernel Identification Through Transformers [54.3795894579111]
Kernel selection plays a central role in determining the performance of Gaussian Process (GP) models.
This work addresses the challenge of constructing custom kernel functions for high-dimensional GP regression models.
We introduce a novel approach named KITT: Kernel Identification Through Transformers.
arXiv Detail & Related papers (2021-06-15T14:32:38Z)
- Generalized vec trick for fast learning of pairwise kernel models [3.867363075280544]
We present a comprehensive review of pairwise kernels that have been proposed for incorporating prior knowledge about the relationship between the objects.
We show how all the reviewed kernels can be expressed as sums of Kronecker products, allowing the use of the generalized vec trick for speeding up their computation (see the sketch after this entry).
arXiv Detail & Related papers (2020-09-02T13:27:51Z)
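The vec trick referred to above is the identity (A ⊗ B) vec(V) = vec(B V Aᵀ), which lets one multiply by a Kronecker product without ever forming it. A minimal sketch of the basic identity (not the paper's generalized pairwise-kernel version):

```python
import numpy as np

def kron_matvec(A, B, v):
    """Compute (A kron B) v without forming the Kronecker product,
    via the vec trick: (A kron B) vec(V) = vec(B V A^T),
    where v = vec(V) with column-major (Fortran) stacking."""
    n, m = B.shape[1], A.shape[1]
    V = v.reshape(n, m, order="F")      # unstack v into the n x m matrix V
    return (B @ V @ A.T).reshape(-1, order="F")

rng = np.random.default_rng(5)
A = rng.standard_normal((4, 3))
B = rng.standard_normal((5, 6))
v = rng.standard_normal(3 * 6)

fast = kron_matvec(A, B, v)             # O(cheap matrix products)
slow = np.kron(A, B) @ v                # explicit Kronecker product
print("max difference:", np.abs(fast - slow).max())
```

For n x n factors this replaces an O(n^4) multiplication by an O(n^3) one, which is the source of the speedups the paper exploits.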
- Learning Deep Kernels for Non-Parametric Two-Sample Tests [50.92621794426821]
We propose a class of kernel-based two-sample tests, which aim to determine whether two sets of samples are drawn from the same distribution.
Our tests are constructed from kernels parameterized by deep neural nets, trained to maximize test power.
arXiv Detail & Related papers (2020-02-21T03:54:23Z)
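The core quantity behind such kernel two-sample tests is the maximum mean discrepancy (MMD). The sketch below computes the standard biased squared-MMD estimate with a fixed Gaussian kernel; the paper's contribution, parameterizing the kernel by a deep network trained to maximize test power, is omitted here.

```python
import numpy as np

def gaussian_gram(X, Y, gamma=1.0):
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def mmd2_biased(X, Y, gamma=1.0):
    """Biased squared MMD: mean(Kxx) + mean(Kyy) - 2 mean(Kxy)."""
    return (gaussian_gram(X, X, gamma).mean()
            + gaussian_gram(Y, Y, gamma).mean()
            - 2 * gaussian_gram(X, Y, gamma).mean())

rng = np.random.default_rng(6)
X = rng.standard_normal((200, 2))
Y_same = rng.standard_normal((200, 2))          # same distribution
Y_diff = rng.standard_normal((200, 2)) + 0.5    # shifted distribution
print("same dist:", mmd2_biased(X, Y_same))
print("diff dist:", mmd2_biased(X, Y_diff))
```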
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences of its use.