Projecting basis functions with tensor networks for Gaussian process
regression
- URL: http://arxiv.org/abs/2310.20630v1
- Date: Tue, 31 Oct 2023 16:59:07 GMT
- Title: Projecting basis functions with tensor networks for Gaussian process
regression
- Authors: Clara Menzen, Eva Memmel, Kim Batselier, Manon Kok
- Abstract summary: We develop an approach that allows us to use an exponential number of basis functions without the corresponding exponential computational complexity.
We project the resulting weights back to the original space to make GP predictions.
In an experiment with an 18-dimensional benchmark data set, we show the applicability of our method to an inverse dynamics problem.
- Score: 5.482420806459269
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper presents a method for approximate Gaussian process (GP) regression
with tensor networks (TNs). A parametric approximation of a GP uses a linear
combination of basis functions, where the accuracy of the approximation depends
on the total number of basis functions $M$. We develop an approach that allows
us to use an exponential number of basis functions without the corresponding
exponential computational complexity. The key idea to enable this is using
low-rank TNs. We first find a suitable low-dimensional subspace from the data,
described by a low-rank TN. In this low-dimensional subspace, we then infer the
weights of our model by solving a Bayesian inference problem. Finally, we
project the resulting weights back to the original space to make GP
predictions. The benefit of our approach comes from the projection to a smaller
subspace: it adapts the shape of the basis functions to the given data, and it
allows for efficient computations in the
smaller subspace. In an experiment with an 18-dimensional benchmark data set,
we show the applicability of our method to an inverse dynamics problem.
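To make the pipeline described in the abstract concrete, the following is a minimal NumPy sketch of the three steps under stated assumptions: it uses random Fourier features as the $M$ basis functions and a truncated SVD of the feature matrix as a stand-in for the low-rank tensor-network subspace, so the function names, parameters, and subspace construction are illustrative assumptions rather than the authors' implementation.
```python
import numpy as np

# Illustrative sketch only (not the paper's algorithm): parametric GP regression
# f(x) ~= phi(x)^T w with M basis functions, where the weights are inferred in a
# smaller r-dimensional subspace and then projected back for prediction. Here the
# subspace comes from a truncated SVD of the feature matrix; in the paper this
# role is played by a low-rank tensor network.

def fourier_features(X, M, lengthscale=1.0, seed=0):
    """Random Fourier features approximating a squared-exponential kernel."""
    rng = np.random.default_rng(seed)
    W = rng.normal(0.0, 1.0 / lengthscale, size=(X.shape[1], M // 2))
    Z = X @ W
    return np.hstack([np.cos(Z), np.sin(Z)]) / np.sqrt(M // 2)

def fit_projected_weights(Phi, y, r, noise_var=0.01, prior_var=1.0):
    """Bayesian linear regression on weights restricted to an r-dim subspace."""
    _, _, Vt = np.linalg.svd(Phi, full_matrices=False)
    P = Vt[:r].T                                   # M x r subspace basis
    Phi_r = Phi @ P                                # features seen through the subspace
    A = Phi_r.T @ Phi_r / noise_var + np.eye(r) / prior_var
    w_r = np.linalg.solve(A, Phi_r.T @ y / noise_var)
    return P @ w_r                                 # project back to M-dim weights

# Toy usage on synthetic 2-D data
rng = np.random.default_rng(1)
X = rng.uniform(-3, 3, size=(200, 2))
y = np.sin(X[:, 0]) * np.cos(X[:, 1]) + 0.05 * rng.normal(size=200)

M, r = 512, 32
w = fit_projected_weights(fourier_features(X, M), y, r)

X_test = rng.uniform(-3, 3, size=(5, 2))
print(fourier_features(X_test, M) @ w)             # approximate GP posterior mean
```
The point of the projection is that the Bayesian inference is solved with an r-by-r system instead of an M-by-M one; the tensor-network machinery in the paper is what makes an exponentially large M tractable in the first place.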
Related papers
- Highly Adaptive Ridge [84.38107748875144]
We propose a regression method that achieves an $n^{-2/3}$ dimension-free L2 convergence rate in the class of right-continuous functions with square-integrable sectional derivatives.
HAR is exactly kernel ridge regression with a specific data-adaptive kernel based on a saturated zero-order tensor-product spline basis expansion.
We demonstrate empirical performance better than state-of-the-art algorithms for small datasets in particular.
arXiv Detail & Related papers (2024-10-03T17:06:06Z)
- Deep Horseshoe Gaussian Processes [1.0742675209112622]
We introduce the deep Horseshoe Gaussian process (Deep-HGP), a new simple prior based on deep Gaussian processes with a squared-exponential kernel.
We show that the associated tempered posterior distribution recovers the unknown true regression curve optimally in terms of quadratic loss, up to a logarithmic factor.
arXiv Detail & Related papers (2024-03-04T05:30:43Z)
- Promises and Pitfalls of the Linearized Laplace in Bayesian Optimization [73.80101701431103]
The linearized-Laplace approximation (LLA) has been shown to be effective and efficient in constructing Bayesian neural networks.
We study the usefulness of the LLA in Bayesian optimization and highlight its strong performance and flexibility.
arXiv Detail & Related papers (2023-04-17T14:23:43Z)
- Active Nearest Neighbor Regression Through Delaunay Refinement [79.93030583257597]
We introduce an algorithm for active function approximation based on nearest neighbor regression.
Our Active Nearest Neighbor Regressor (ANNR) relies on the Voronoi-Delaunay framework from computational geometry to subdivide the space into cells with constant estimated function value.
arXiv Detail & Related papers (2022-06-16T10:24:03Z)
- Sensing Cox Processes via Posterior Sampling and Positive Bases [56.82162768921196]
We study adaptive sensing of point processes, a widely used model from spatial statistics.
We model the intensity function as a sample from a truncated Gaussian process, represented in a specially constructed positive basis.
Our adaptive sensing algorithms use Langevin dynamics and are based on posterior sampling (Cox-Thompson) and top-two posterior sampling (Top2) principles.
arXiv Detail & Related papers (2021-10-21T14:47:06Z)
- Gaussian Process Subspace Regression for Model Reduction [7.41244589428771]
Subspace-valued functions arise in a wide range of problems, including parametric reduced order modeling (PROM).
In PROM, each parameter point can be associated with a subspace, which is used for Petrov-Galerkin projections of large system matrices.
We propose a novel Bayesian nonparametric model for subspace prediction: the Gaussian Process Subspace regression (GPS) model.
arXiv Detail & Related papers (2021-07-09T20:41:23Z)
- Deep FPF: Gain function approximation in high-dimensional setting [8.164433158925592]
We present a novel approach to approximate the gain function of the feedback particle filter (FPF).
The numerical problem is to approximate the exact gain function using only finitely many particles sampled from the probability distribution.
Inspired by the recent success of the deep learning methods, we represent the gain function as a gradient of the output of a neural network.
arXiv Detail & Related papers (2020-10-02T20:17:21Z)
- Improving predictions of Bayesian neural nets via local linearization [79.21517734364093]
We argue that the Gauss-Newton approximation should be understood as a local linearization of the underlying Bayesian neural network (BNN).
Because we use this linearized model for posterior inference, we should also predict using this modified model instead of the original one.
We refer to this modified predictive as "GLM predictive" and show that it effectively resolves common underfitting problems of the Laplace approximation.
arXiv Detail & Related papers (2020-08-19T12:35:55Z)
- Projection Pursuit Gaussian Process Regression [5.837881923712394]
A primary goal of computer experiments is to reconstruct the function given by the computer code via scattered evaluations.
Traditional isotropic Gaussian process models suffer from the curse of dimensionality when the input dimension is relatively high given limited data points.
We consider a projection pursuit model, in which the nonparametric part is driven by an additive Gaussian process regression.
arXiv Detail & Related papers (2020-04-01T19:12:01Z)
- SLEIPNIR: Deterministic and Provably Accurate Feature Expansion for Gaussian Process Regression with Derivatives [86.01677297601624]
We propose a novel approach for scaling GP regression with derivatives based on quadrature Fourier features.
We prove deterministic, non-asymptotic and exponentially fast decaying error bounds which apply for both the approximated kernel as well as the approximated posterior.
arXiv Detail & Related papers (2020-03-05T14:33:20Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.