Projection Pursuit Gaussian Process Regression
- URL: http://arxiv.org/abs/2004.00667v2
- Date: Tue, 30 Aug 2022 19:17:59 GMT
- Title: Projection Pursuit Gaussian Process Regression
- Authors: Gecheng Chen, Rui Tuo
- Abstract summary: A primary goal of computer experiments is to reconstruct the function given by the computer code via scattered evaluations.
Traditional isotropic Gaussian process models suffer from the curse of dimensionality when the input dimension is high relative to the number of available data points.
We consider a projection pursuit model, in which the nonparametric part is driven by an additive Gaussian process regression.
- Score: 5.837881923712394
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: A primary goal of computer experiments is to reconstruct the function given by the computer code via scattered evaluations. Traditional isotropic Gaussian process models suffer from the curse of dimensionality when the input dimension is high relative to the number of available data points. Gaussian process models with additive correlation functions scale well with the input dimension, but they are more restrictive because they only work for additive functions. In this work, we consider a projection pursuit model, in which the nonparametric part is driven by an additive Gaussian process regression. We choose the dimension of the additive function to be higher than the original input dimension, and call this strategy "dimension expansion". We show that dimension expansion can help approximate more complex functions. A gradient descent algorithm is proposed for model training based on maximum likelihood estimation. Simulation studies show that the proposed method outperforms traditional Gaussian process models. The Supplementary Materials are available online.
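For readers who want the mechanics, here is a minimal sketch of the model the abstract describes: an additive GP whose one-dimensional components act on learned projections w_k^T x, with more projections than input dimensions ("dimension expansion"), trained by maximizing the log marginal likelihood. The Gaussian kernel, the L-BFGS-B optimizer standing in for the paper's gradient descent, and every variable name below are our own assumptions, not the authors' implementation.

```python
# A minimal sketch of projection pursuit GP regression as described in the
# abstract (our illustration, not the authors' code). The prior is an
# additive GP over M one-dimensional projections z_k = X @ w_k with M > d,
# i.e. "dimension expansion"; projections and hyperparameters are fit by
# maximum likelihood.
import numpy as np
from scipy.optimize import minimize

def additive_projection_kernel(X1, X2, W, lengthscales, variances):
    """Sum of 1-D Gaussian kernels, each acting on a projection X @ w_k."""
    K = np.zeros((X1.shape[0], X2.shape[0]))
    for k in range(W.shape[1]):
        z1, z2 = X1 @ W[:, k], X2 @ W[:, k]
        sqdist = (z1[:, None] - z2[None, :]) ** 2
        K += variances[k] * np.exp(-0.5 * sqdist / lengthscales[k] ** 2)
    return K

def neg_log_marginal_likelihood(theta, X, y, d, M):
    """Standard GP evidence (additive constant dropped), parameterized by
    the projection matrix W and log-scale kernel/noise hyperparameters."""
    W = theta[: d * M].reshape(d, M)
    ls = np.exp(theta[d * M : d * M + M])            # lengthscales > 0
    var = np.exp(theta[d * M + M : d * M + 2 * M])   # signal variances > 0
    noise = np.exp(theta[-1])                        # noise variance > 0
    K = additive_projection_kernel(X, X, W, ls, var) + noise * np.eye(len(y))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return 0.5 * y @ alpha + np.sum(np.log(np.diag(L)))

rng = np.random.default_rng(0)
d, M, n = 3, 6, 60                                   # dimension expansion: M > d
X = rng.uniform(-1.0, 1.0, size=(n, d))
y = np.sin(3.0 * X[:, 0] * X[:, 1]) + X[:, 2]        # non-additive test function
theta0 = np.concatenate([rng.normal(size=d * M), np.zeros(2 * M + 1)])
res = minimize(neg_log_marginal_likelihood, theta0, args=(X, y, d, M),
               method="L-BFGS-B")
print("final negative log marginal likelihood:", res.fun)
```

Because each component kernel depends on x only through w_k^T x, the implied regression function has the projection pursuit form f(x) = sum_k g_k(w_k^T x); choosing M larger than d is what the abstract calls dimension expansion.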
Related papers
- Scaling and renormalization in high-dimensional regression [72.59731158970894]
This paper presents a succinct derivation of the training and generalization performance of a variety of high-dimensional ridge regression models.
We provide an introduction and review of recent results on these topics, aimed at readers with backgrounds in physics and deep learning.
arXiv Detail & Related papers (2024-05-01T15:59:00Z)
- Combining additivity and active subspaces for high-dimensional Gaussian process modeling [2.7140711924836816]
We show how to combine high-dimensional Gaussian process modeling with a multi-fidelity strategy, showcasing the advantages through experiments on synthetic functions and datasets.
arXiv Detail & Related papers (2024-02-06T08:49:27Z)
- Projecting basis functions with tensor networks for Gaussian process regression [5.482420806459269]
We develop an approach that allows us to use an exponential amount of basis functions without the corresponding exponential computational complexity.
We project the resulting weights back to the original space to make GP predictions.
In an experiment with an 18-dimensional benchmark data set, we show the applicability of our method to an inverse dynamics problem.
arXiv Detail & Related papers (2023-10-31T16:59:07Z)
- Implicit Manifold Gaussian Process Regression [49.0787777751317]
Gaussian process regression is widely used to provide well-calibrated uncertainty estimates.
It struggles with high-dimensional data, which in practice often lies on an implicit low-dimensional manifold.
In this paper we propose a technique capable of inferring implicit structure directly from data (labeled and unlabeled) in a fully differentiable way.
arXiv Detail & Related papers (2023-10-30T09:52:48Z)
- Probabilistic Unrolling: Scalable, Inverse-Free Maximum Likelihood Estimation for Latent Gaussian Models [69.22568644711113]
We introduce probabilistic unrolling, a method that combines Monte Carlo sampling with iterative linear solvers to circumvent matrix inversions (a generic sketch of this inverse-free idea follows the list below).
Our theoretical analyses reveal that unrolling and backpropagation through the iterations of the solver can accelerate gradient estimation for maximum likelihood estimation.
In experiments on simulated and real data, we demonstrate that probabilistic unrolling learns latent Gaussian models up to an order of magnitude faster than gradient EM, with minimal losses in model performance.
arXiv Detail & Related papers (2023-06-05T21:08:34Z)
- Score-based Diffusion Models in Function Space [140.792362459734]
Diffusion models have recently emerged as a powerful framework for generative modeling.
We introduce a mathematically rigorous framework called Denoising Diffusion Operators (DDOs) for training diffusion models in function space.
We show that the corresponding discretized algorithm generates accurate samples at a fixed cost independent of the data resolution.
arXiv Detail & Related papers (2023-02-14T23:50:53Z)
- Additive Gaussian Processes Revisited [13.158344774468413]
We propose a new class of flexible non-parametric GP models with additive structure, built around the Orthogonal Additive Kernel (OAK).
With only a small number of additive low-dimensional terms, we demonstrate that the OAK model achieves similar or better predictive performance compared to black-box models.
arXiv Detail & Related papers (2022-06-20T15:52:59Z)
- RMFGP: Rotated Multi-fidelity Gaussian process with Dimension Reduction for High-dimensional Uncertainty Quantification [12.826754199680474]
Multi-fidelity modelling enables accurate inference even when only a small set of accurate data is available.
By combining the realizations of the high-fidelity model with one or more low-fidelity models, the multi-fidelity method can make accurate predictions of quantities of interest.
This paper proposes a new dimension reduction framework based on rotated multi-fidelity Gaussian process regression and a Bayesian active learning scheme.
arXiv Detail & Related papers (2022-04-11T01:20:35Z)
- Gaussian Processes and Statistical Decision-making in Non-Euclidean Spaces [96.53463532832939]
We develop techniques for broadening the applicability of Gaussian processes.
We introduce a wide class of efficient approximations built from this viewpoint.
We develop a collection of Gaussian process models over non-Euclidean spaces.
arXiv Detail & Related papers (2022-02-22T01:42:57Z)
- Multi-fidelity data fusion for the approximation of scalar functions with low intrinsic dimensionality through active subspaces [0.0]
We present a multi-fidelity approach involving active subspaces and we test it on two different high-dimensional benchmarks.
arXiv Detail & Related papers (2020-10-16T12:35:49Z)
- SLEIPNIR: Deterministic and Provably Accurate Feature Expansion for Gaussian Process Regression with Derivatives [86.01677297601624]
We propose a novel approach for scaling GP regression with derivatives based on quadrature Fourier features.
We prove deterministic, non-asymptotic and exponentially fast decaying error bounds which apply for both the approximated kernel as well as the approximated posterior.
arXiv Detail & Related papers (2020-03-05T14:33:20Z)
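The probabilistic-unrolling entry above promises a sketch of its inverse-free ingredient. Below is a generic version of that idea for the gradient of a Gaussian log marginal likelihood: conjugate gradients stand in for K^{-1} and Hutchinson probes estimate the trace term. This is our illustration of the standard technique, with made-up names, not the paper's unrolled algorithm.

```python
# Generic inverse-free gradient of log N(y | 0, K) w.r.t. a kernel
# parameter theta with derivative matrix dK:
#   d/dtheta log p(y) = 0.5 * (y^T K^{-1} dK K^{-1} y - tr(K^{-1} dK)).
# CG replaces the explicit inverse; Hutchinson probes replace the trace.
import numpy as np
from scipy.sparse.linalg import cg

def grad_log_evidence_mc(K, dK, y, n_probes=32, rng=None):
    rng = rng or np.random.default_rng(0)
    n = len(y)
    Kinv_y, _ = cg(K, y)                      # K^{-1} y via conjugate gradients
    quad = Kinv_y @ dK @ Kinv_y               # data-fit term
    trace = 0.0
    for _ in range(n_probes):                 # E[z^T K^{-1} dK z] = tr(K^{-1} dK)
        z = rng.choice([-1.0, 1.0], size=n)   # Rademacher probe
        Kinv_z, _ = cg(K, z)
        trace += Kinv_z @ dK @ z
    return 0.5 * (quad - trace / n_probes)

# Quick check against the exact gradient on a small problem:
rng = np.random.default_rng(1)
n = 50
A = rng.normal(size=(n, n))
K = A @ A.T + n * np.eye(n)                   # well-conditioned SPD covariance
dK = np.eye(n)                                # derivative w.r.t. a noise term
y = rng.normal(size=n)
Kinv = np.linalg.inv(K)
exact = 0.5 * (y @ Kinv @ dK @ Kinv @ y - np.trace(Kinv @ dK))
print("exact:", exact, "estimate:", grad_log_evidence_mc(K, dK, y, 200, rng))
```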
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences.