Spectrum Gaussian Processes Based On Tunable Basis Functions
- URL: http://arxiv.org/abs/2107.06473v1
- Date: Wed, 14 Jul 2021 03:51:24 GMT
- Title: Spectrum Gaussian Processes Based On Tunable Basis Functions
- Authors: Wenqi Fang, Guanlin Wu, Jingjing Li, Zheng Wang, Jiang Cao, Yang Ping
- Abstract summary: We introduce a novel basis function, which is tunable, local and bounded, to approximate the kernel function in the Gaussian process.
We conduct extensive experiments on open-source datasets to validate its performance.
- Score: 15.088239458693003
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Spectral approximation and variational inducing learning for the Gaussian
process are two popular methods to reduce computational complexity. However,
previous research on these methods has tended to adopt orthonormal basis
functions, such as eigenvectors in the Hilbert space in the spectrum method,
or decoupled orthogonal components in the variational framework. In this paper,
inspired by quantum physics, we introduce a novel basis function, which is
tunable, local and bounded, to approximate the kernel function in the Gaussian
process. These functions have two adjustable parameters, which control their
orthogonality to each other and their boundedness. We conduct extensive
experiments on open-source datasets to validate the method's performance.
Compared to several state-of-the-art methods, the proposed method obtains
satisfactory or even better results, especially with poorly chosen kernel
functions.
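The paper's exact quantum-inspired basis is not reproduced in this summary, but the general recipe it describes (replace the kernel with a finite set of tunable, local, bounded basis functions, then regress in the resulting low-rank feature space) can be sketched. The Gaussian-windowed cosine basis below is a hypothetical stand-in, with `a` and `b` playing the role of the two adjustable parameters:

```python
import numpy as np

def local_basis(x, centers, a=1.0, b=1.0):
    """Hypothetical tunable, local, bounded basis: Gaussian windows of
    width `a` modulated by cosines of frequency `b`. An illustrative
    stand-in, not the paper's exact quantum-inspired basis."""
    d = x[:, None] - centers[None, :]          # (n_points, n_basis)
    return np.exp(-(d / a) ** 2) * np.cos(b * d)

rng = np.random.default_rng(0)
x = np.linspace(-3, 3, 200)
y = np.sin(x) + 0.1 * rng.standard_normal(x.size)

centers = np.linspace(-3, 3, 25)               # M = 25 basis centers
Phi = local_basis(x, centers, a=1.5, b=2.0)    # (n, M) feature matrix
noise = 0.1 ** 2

# Weight-space GP regression: solve an (M x M) system instead of the
# exact GP's (n x n) one, so cost drops from O(n^3) to O(n M^2).
A = Phi.T @ Phi + noise * np.eye(centers.size)
post_mean = Phi @ np.linalg.solve(A, Phi.T @ y)
```

Narrowing `a` makes each basis function more local (and the set less mutually orthogonal), which is the kind of trade-off the two parameters expose.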
Related papers
- Learning dissipative Hamiltonian dynamics with reproducing kernel Hilbert spaces and random Fourier features [0.7510165488300369]
This paper presents a new method for learning dissipative Hamiltonian dynamics from a limited and noisy dataset.
The performance of the method is validated in simulations for two dissipative Hamiltonian systems.
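The random-Fourier-feature building block referenced here is the standard Rahimi-Recht construction; a minimal numpy sketch of that block alone (the paper's dissipative-dynamics model is not reproduced):

```python
import numpy as np

def rff_features(X, n_features=200, lengthscale=1.0, seed=0):
    """Random Fourier features approximating the RBF kernel
    k(x, x') = exp(-||x - x'||^2 / (2 * lengthscale^2))."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], n_features)) / lengthscale
    b = rng.uniform(0.0, 2 * np.pi, n_features)   # random phase shifts
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

X = np.random.default_rng(1).standard_normal((5, 3))
Z = rff_features(X)
K_approx = Z @ Z.T   # -> exact RBF Gram matrix as n_features grows
```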
arXiv Detail & Related papers (2024-10-24T11:35:39Z)
- MEP: Multiple Kernel Learning Enhancing Relative Positional Encoding Length Extrapolation [5.298814565953444]
Relative position encoding methods address the length extrapolation challenge exclusively through the implementation of a single kernel function.
This study proposes a novel relative positional encoding method, called MEP, which employs a weighted average to combine distinct kernel functions.
We present two distinct versions of our method: a parameter-free variant that requires no new learnable parameters, and a parameterized variant capable of integrating state-of-the-art techniques.
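The weighted-average construction described above can be sketched generically; the two kernels and uniform weights below are illustrative assumptions, not MEP's actual choices:

```python
import numpy as np

def exp_kernel(d, ls):                  # exponential decay with distance
    return np.exp(-np.abs(d) / ls)

def gauss_kernel(d, ls):                # Gaussian decay with distance
    return np.exp(-(d / ls) ** 2)

def combined_kernel(d, weights=(0.5, 0.5), ls=(8.0, 32.0)):
    """Convex combination of kernels evaluated on relative distances."""
    w = np.asarray(weights) / np.sum(weights)
    return w[0] * exp_kernel(d, ls[0]) + w[1] * gauss_kernel(d, ls[1])

rel_pos = np.arange(128)                # relative token distances
bias = combined_kernel(rel_pos)         # positional bias per distance
```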
arXiv Detail & Related papers (2024-03-26T13:38:06Z)
- Quantum-Assisted Hilbert-Space Gaussian Process Regression [0.0]
We propose a Hilbert-space approximation-based quantum algorithm for Gaussian process regression.
Our method consists of a combination of classical basis function expansion with quantum computing techniques.
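The classical half of this combination is the standard Hilbert-space (reduced-rank) GP approximation of Solin and Särkkä; a minimal 1-D numpy sketch of that part (the quantum routine itself is not reproduced):

```python
import numpy as np

def phi(x, j, L):
    """j-th Dirichlet Laplacian eigenfunction on the interval [-L, L]."""
    return np.sqrt(1.0 / L) * np.sin(np.pi * j * (x + L) / (2 * L))

def rbf_spectral_density(w, ls=1.0, var=1.0):
    """Spectral density of the 1-D squared-exponential kernel."""
    return var * np.sqrt(2 * np.pi) * ls * np.exp(-0.5 * (ls * w) ** 2)

L, m = 4.0, 32                     # domain half-width, number of basis funcs
x = np.linspace(-3, 3, 200)
j = np.arange(1, m + 1)
Phi = phi(x[:, None], j[None, :], L)         # (n, m) basis matrix
lam = (np.pi * j / (2 * L)) ** 2             # Laplacian eigenvalues
K_approx = Phi @ np.diag(rbf_spectral_density(np.sqrt(lam))) @ Phi.T
```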
arXiv Detail & Related papers (2024-02-01T12:13:35Z)
- Rational kernel-based interpolation for complex-valued frequency response functions [0.0]
This work is concerned with the kernel-based approximation of a complex-valued function from data.
We introduce new reproducing kernel Hilbert spaces of complex-valued functions, and formulate the problem of complex-valued interpolation with a kernel pair as a minimum-norm interpolation problem in these spaces.
Numerical results on examples from different fields, including electromagnetics and acoustics, illustrate the performance of the method.
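In the simplest setting, a minimum-norm kernel interpolant just solves a kernel linear system over the complex field; the sketch below uses a plain RBF kernel and a toy second-order frequency response rather than the paper's rational kernel pair:

```python
import numpy as np

def rbf(x1, x2, ls=0.3):
    return np.exp(-((x1[:, None] - x2[None, :]) / ls) ** 2)

freq = np.linspace(0.1, 2.0, 15)                 # sampled frequencies
y = 1.0 / (1.0 - freq ** 2 + 0.2j * freq)        # toy second-order FRF

# Minimum-norm interpolant f(x) = sum_i alpha_i k(x, x_i), with the
# coefficients solved over the complex field; jitter aids conditioning.
K = rbf(freq, freq) + 1e-8 * np.eye(freq.size)
alpha = np.linalg.solve(K.astype(complex), y)
grid = np.linspace(0.1, 2.0, 200)
f_hat = rbf(grid, freq) @ alpha                  # complex-valued interpolant
```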
arXiv Detail & Related papers (2023-07-25T13:21:07Z)
- Promises and Pitfalls of the Linearized Laplace in Bayesian Optimization [73.80101701431103]
The linearized-Laplace approximation (LLA) has been shown to be effective and efficient in constructing Bayesian neural networks.
We study the usefulness of the LLA in Bayesian optimization and highlight its strong performance and flexibility.
arXiv Detail & Related papers (2023-04-17T14:23:43Z)
- Gaussian Processes and Statistical Decision-making in Non-Euclidean Spaces [96.53463532832939]
We develop techniques for broadening the applicability of Gaussian processes.
We introduce a wide class of efficient approximations built from this viewpoint.
We develop a collection of Gaussian process models over non-Euclidean spaces.
arXiv Detail & Related papers (2022-02-22T01:42:57Z)
- Scalable Variational Gaussian Processes via Harmonic Kernel Decomposition [54.07797071198249]
We introduce a new scalable variational Gaussian process approximation which provides a high fidelity approximation while retaining general applicability.
We demonstrate that, on a range of regression and classification problems, our approach can exploit input space symmetries such as translations and reflections.
Notably, our approach achieves state-of-the-art results on CIFAR-10 among pure GP models.
arXiv Detail & Related papers (2021-06-10T18:17:57Z)
- A Discrete Variational Derivation of Accelerated Methods in Optimization [68.8204255655161]
We introduce variational integrators which allow us to derive different methods for optimization.
We derive two families of optimization methods in one-to-one correspondence.
The preservation of symplecticity of autonomous systems occurs here solely on the fibers.
arXiv Detail & Related papers (2021-06-04T20:21:53Z)
- Advanced Stationary and Non-Stationary Kernel Designs for Domain-Aware Gaussian Processes [0.0]
We propose advanced kernel designs that only allow for functions with certain desirable characteristics to be elements of the reproducing kernel Hilbert space (RKHS).
We will show the impact of advanced kernel designs on Gaussian processes using several synthetic and two scientific data sets.
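One simple instance of such a constrained kernel design (an illustration, not necessarily the paper's construction) is symmetrization: a kernel of the form k(x, x') + k(x, -x') has an RKHS containing only even functions, so every posterior mean is symmetric about the origin by construction:

```python
import numpy as np

def rbf(x1, x2, ls=1.0):
    return np.exp(-((x1[:, None] - x2[None, :]) / ls) ** 2)

def symmetric_kernel(x1, x2, ls=1.0):
    """Every k_sym(., xi) is even in x, so the RKHS holds only even
    functions and posterior means are symmetric about the origin."""
    return rbf(x1, x2, ls) + rbf(x1, -x2, ls)

x = np.array([-2.0, -0.5, 1.0, 2.5])
y = x ** 2                                     # an even target function
K = symmetric_kernel(x, x) + 1e-8 * np.eye(x.size)
grid = np.linspace(-3, 3, 7)
mean = symmetric_kernel(grid, x) @ np.linalg.solve(K, y)
# mean[i] == mean[-(i + 1)]: symmetric by construction
```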
arXiv Detail & Related papers (2021-02-05T22:07:56Z)
- Local optimization on pure Gaussian state manifolds [63.76263875368856]
We exploit insights into the geometry of bosonic and fermionic Gaussian states to develop an efficient local optimization algorithm.
The method is based on notions of gradient descent attuned to the local geometry.
We use the presented methods to collect numerical and analytical evidence for the conjecture that Gaussian purifications are sufficient to compute the entanglement of purification of arbitrary mixed Gaussian states.
arXiv Detail & Related papers (2020-09-24T18:00:36Z)
- SLEIPNIR: Deterministic and Provably Accurate Feature Expansion for Gaussian Process Regression with Derivatives [86.01677297601624]
We propose a novel approach for scaling GP regression with derivatives based on quadrature Fourier features.
We prove deterministic, non-asymptotic and exponentially fast decaying error bounds which apply for both the approximated kernel as well as the approximated posterior.
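Quadrature Fourier features replace the random spectral samples of random Fourier features with deterministic Gauss-Hermite nodes; a minimal 1-D sketch of that idea (SLEIPNIR's derivative handling is not reproduced):

```python
import numpy as np

def qff_features(x, n_nodes=16, ls=1.0):
    """Deterministic Fourier features for the 1-D RBF kernel via
    Gauss-Hermite quadrature of its Gaussian spectral density."""
    t, w = np.polynomial.hermite.hermgauss(n_nodes)  # nodes for e^{-t^2}
    omega = np.sqrt(2.0) * t / ls                    # spectral frequencies
    scale = np.sqrt(w / np.sqrt(np.pi))              # quadrature weights
    return np.concatenate([scale * np.cos(np.outer(x, omega)),
                           scale * np.sin(np.outer(x, omega))], axis=1)

x = np.linspace(-2, 2, 50)
Z = qff_features(x)
K_exact = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2)   # RBF, ls = 1
print(np.max(np.abs(Z @ Z.T - K_exact)))                  # small approx. error
```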
arXiv Detail & Related papers (2020-03-05T14:33:20Z)
This list is automatically generated from the titles and abstracts of the papers on this site.