An appointment with Reproducing Kernel Hilbert Space generated by Generalized Gaussian RBF as $L^2$-measure
- URL: http://arxiv.org/abs/2312.10693v1
- Date: Sun, 17 Dec 2023 12:02:10 GMT
- Title: An appointment with Reproducing Kernel Hilbert Space generated by Generalized Gaussian RBF as $L^2$-measure
- Authors: Himanshu Singh
- Abstract summary: Gaussian Radial Basis Function (RBF) kernels are among the most frequently employed kernels in artificial intelligence and machine learning routines.
This manuscript demonstrates the application of the Generalized Gaussian RBF in the kernel sense to standard machine learning routines, together with comparisons against commonly used kernels and activation functions.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Gaussian Radial Basis Function (RBF) kernels are the most frequently employed kernels in artificial intelligence and machine learning routines, typically yielding the best results relative to their counterparts. However, little is known about the application of the Generalized Gaussian Radial Basis Function to various machine learning algorithms, namely kernel regression, support vector machines (SVM), and pattern recognition via neural networks. The results yielded by the Generalized Gaussian RBF in the kernel sense markedly outperform those of the Gaussian RBF kernel, the sigmoid function, and the ReLU function. This manuscript demonstrates the application of the Generalized Gaussian RBF in the kernel sense to the aforementioned machine learning routines, together with comparisons against those functions.
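As a minimal sketch of the idea in the abstract, a generalized Gaussian RBF raises the scaled squared distance to a power β, recovering the standard Gaussian kernel at β = 1; the exact parametrization used in the paper may differ, so the form below is an assumption. It can be dropped into kernel ridge regression, one of the routines the abstract names:

```python
import numpy as np

def generalized_gaussian_rbf(x, y, sigma=1.0, beta=1.0):
    """Generalized Gaussian RBF: exp(-(||x - y||^2 / (2 sigma^2))**beta).

    beta = 1 recovers the standard Gaussian RBF kernel. The exact
    parametrization in the manuscript may differ (assumption).
    """
    r2 = np.sum((np.asarray(x) - np.asarray(y)) ** 2)
    return np.exp(-((r2 / (2.0 * sigma ** 2)) ** beta))

def _cross_gram(A, B, sigma, beta):
    # Pairwise squared distances between rows of A and B, then the
    # generalized Gaussian map applied elementwise.
    d2 = (np.sum(A ** 2, axis=1)[:, None]
          + np.sum(B ** 2, axis=1)[None, :]
          - 2.0 * A @ B.T)
    d2 = np.maximum(d2, 0.0)  # guard against tiny negative round-off
    return np.exp(-((d2 / (2.0 * sigma ** 2)) ** beta))

def kernel_ridge_fit_predict(X, y, X_test, lam=1e-3, sigma=1.0, beta=1.0):
    # Kernel ridge regression: alpha = (K + lam I)^{-1} y,
    # predictions = K(X_test, X) @ alpha.
    K = _cross_gram(X, X, sigma, beta)
    alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)
    return _cross_gram(X_test, X, sigma, beta) @ alpha
```

For 0 < β ≤ 1 the resulting function of the distance remains positive definite, so the Gram matrix stays a valid kernel matrix; an SVM comparison would use the same Gram-matrix construction with a precomputed-kernel solver.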
Related papers
- Learning Analysis of Kernel Ridgeless Regression with Asymmetric Kernel Learning [33.34053480377887]
This paper enhances kernel ridgeless regression with Locally-Adaptive-Bandwidths (LAB) RBF kernels.
For the first time, we demonstrate that functions learned from LAB RBF kernels belong to an integral space of Reproducing Kernel Hilbert Spaces (RKHSs).
arXiv Detail & Related papers (2024-06-03T15:28:12Z)
- On the Sublinear Regret of GP-UCB [58.25014663727544]
We show that the Gaussian Process Upper Confidence Bound (GP-UCB) algorithm enjoys nearly optimal regret rates.
Our improvements rely on a key technical contribution -- regularizing kernel ridge estimators in proportion to the smoothness of the underlying kernel.
arXiv Detail & Related papers (2023-07-14T13:56:11Z)
- Promises and Pitfalls of the Linearized Laplace in Bayesian Optimization [73.80101701431103]
The linearized-Laplace approximation (LLA) has been shown to be effective and efficient in constructing Bayesian neural networks.
We study the usefulness of the LLA in Bayesian optimization and highlight its strong performance and flexibility.
arXiv Detail & Related papers (2023-04-17T14:23:43Z)
- An approach to the Gaussian RBF kernels via Fock spaces [0.0]
We use methods from the Fock space and Segal-Bargmann theories to prove several results on the Gaussian RBF kernel.
We show how the RBF kernels can be related to some of the most used operators in quantum mechanics and time frequency analysis.
arXiv Detail & Related papers (2022-10-25T16:59:00Z)
- Learning "best" kernels from data in Gaussian process regression. With application to aerodynamics [0.4588028371034406]
We introduce algorithms to select/design kernels in Gaussian process regression/kriging surrogate modeling techniques.
A first class of algorithms is kernel flow, which was introduced in the context of classification in machine learning.
A second class of algorithms is called spectral kernel ridge regression, and aims at selecting a "best" kernel such that the norm of the function to be approximated is minimal.
arXiv Detail & Related papers (2022-06-03T07:50:54Z)
- Vector-valued Gaussian Processes on Riemannian Manifolds via Gauge Equivariant Projected Kernels [108.60991563944351]
We present a recipe for constructing gauge equivariant kernels, which induce vector-valued Gaussian processes coherent with geometry.
We extend standard Gaussian process training methods, such as variational inference, to this setting.
arXiv Detail & Related papers (2021-10-27T13:31:10Z)
- Hybrid Random Features [60.116392415715275]
We propose a new class of random feature methods for linearizing softmax and Gaussian kernels, called hybrid random features (HRFs).
HRFs automatically adapt the quality of kernel estimation to provide the most accurate approximation in the defined regions of interest.
arXiv Detail & Related papers (2021-10-08T20:22:59Z)
- A Robust Asymmetric Kernel Function for Bayesian Optimization, with Application to Image Defect Detection in Manufacturing Systems [2.4278445972594525]
We propose a robust kernel function, the Asymmetric Elastic Net Radial Basis Function (AEN-RBF).
We show theoretically that AEN-RBF can realize smaller mean squared prediction error under mild conditions.
We also show that the AEN-RBF kernel function is less sensitive to outliers.
arXiv Detail & Related papers (2021-09-22T17:59:05Z)
- Random Features for the Neural Tangent Kernel [57.132634274795066]
We propose an efficient feature map construction for the Neural Tangent Kernel (NTK) of a fully-connected ReLU network.
We show that the dimension of the resulting features is much smaller than in other baseline feature map constructions achieving comparable error bounds, in both theory and practice.
arXiv Detail & Related papers (2021-03-29T22:37:06Z)
- Flow-based Kernel Prior with Application to Blind Super-Resolution [143.21527713002354]
Kernel estimation is generally one of the key problems for blind image super-resolution (SR).
This paper proposes a normalizing flow-based kernel prior (FKP) for kernel modeling.
Experiments on synthetic and real-world images demonstrate that the proposed FKP can significantly improve the kernel estimation accuracy.
arXiv Detail & Related papers (2021-02-05T22:07:56Z)
- Advanced Stationary and Non-Stationary Kernel Designs for Domain-Aware Gaussian Processes [0.0]
We propose advanced kernel designs that only allow functions with certain desirable characteristics to be elements of the reproducing kernel Hilbert space (RKHS).
We will show the impact of advanced kernel designs on Gaussian processes using several synthetic and two scientific data sets.
arXiv Detail & Related papers (2021-02-05T22:07:56Z)
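As a generic illustration of the kernel-design idea in the last entry above, and not the authors' construction, multiplying a periodic kernel by a squared-exponential kernel restricts a Gaussian process prior toward functions that are periodic at short range but may drift over longer ranges:

```python
import numpy as np

def rbf(x1, x2, length=1.0):
    # Squared-exponential (stationary) base kernel.
    return np.exp(-0.5 * ((x1 - x2) / length) ** 2)

def periodic(x1, x2, period=1.0, length=1.0):
    # Periodic kernel: only periodic functions lie in its RKHS.
    return np.exp(-2.0 * np.sin(np.pi * np.abs(x1 - x2) / period) ** 2
                  / length ** 2)

def locally_periodic(x1, x2, period=1.0, k_len=1.0, decay=3.0):
    # Product kernel: encodes the domain knowledge that the signal is
    # periodic locally while allowing the pattern to evolve slowly.
    return periodic(x1, x2, period, k_len) * rbf(x1, x2, decay)
```

Products and sums of valid kernels remain valid kernels, which is what makes this kind of domain-aware composition possible.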
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.