Combining additivity and active subspaces for high-dimensional Gaussian
process modeling
- URL: http://arxiv.org/abs/2402.03809v1
- Date: Tue, 6 Feb 2024 08:49:27 GMT
- Title: Combining additivity and active subspaces for high-dimensional Gaussian
process modeling
- Authors: Mickael Binois (ACUMES), Victor Picheny
- Abstract summary: We show how to combine additivity and active subspaces for high-dimensional Gaussian process modeling with a multi-fidelity strategy, showcasing the advantages through experiments on synthetic functions and datasets.
- Score: 2.7140711924836816
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Gaussian processes are a widely embraced technique for regression and
classification due to their good prediction accuracy, analytical tractability
and built-in capabilities for uncertainty quantification. However, they suffer
from the curse of dimensionality whenever the number of variables increases.
This challenge is generally addressed by assuming additional structure in
the problem, the preferred options being either additivity or low intrinsic
dimensionality. Our contribution for high-dimensional Gaussian process modeling
is to combine them with a multi-fidelity strategy, showcasing the advantages
through experiments on synthetic functions and datasets.
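The two structural assumptions named in the abstract can be illustrated with a toy kernel construction. The sketch below (NumPy, all function names and the random orthonormal projection `W` are illustrative assumptions, not the paper's actual parameterization) shows an additive kernel, an active-subspace kernel operating on a low-dimensional projection of the inputs, and their combination:

```python
import numpy as np

def rbf(a, b, lengthscale=1.0):
    """Squared-exponential kernel between 1-D coordinate arrays a and b."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

def additive_kernel(X, Z):
    """Additivity: one 1-D RBF kernel per input dimension, summed."""
    return sum(rbf(X[:, j], Z[:, j]) for j in range(X.shape[1]))

def active_subspace_kernel(X, Z, W):
    """Low intrinsic dimensionality: an RBF kernel on the projection X @ W (W: d x r)."""
    Xp, Zp = X @ W, Z @ W
    sq = ((Xp[:, None, :] - Zp[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * sq)

rng = np.random.default_rng(0)
X = rng.standard_normal((5, 10))                    # 5 points in 10 dimensions
W = np.linalg.qr(rng.standard_normal((10, 2)))[0]   # random 2-D "active" subspace
K = additive_kernel(X, X) + active_subspace_kernel(X, X, W)
print(K.shape)  # (5, 5)
```

A sum of valid kernels is a valid kernel, so the combined `K` remains a legitimate GP covariance while encoding both assumptions at once.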
Related papers
- Scaling and renormalization in high-dimensional regression [72.59731158970894]
This paper presents a succinct derivation of the training and generalization performance of a variety of high-dimensional ridge regression models.
We provide an introduction and review of recent results on these topics, aimed at readers with backgrounds in physics and deep learning.
arXiv Detail & Related papers (2024-05-01T15:59:00Z)
- Scaling Riemannian Diffusion Models [68.52820280448991]
We show that our method enables us to scale to high-dimensional tasks on nontrivial manifolds.
We model QCD densities on $SU(n)$ lattices and contrastively learned embeddings on high dimensional hyperspheres.
arXiv Detail & Related papers (2023-10-30T21:27:53Z)
- Implicit Manifold Gaussian Process Regression [49.0787777751317]
Gaussian process regression is widely used to provide well-calibrated uncertainty estimates.
It struggles with high-dimensional data, which often lie on an implicit low-dimensional manifold that standard models do not exploit.
In this paper we propose a technique capable of inferring implicit structure directly from data (labeled and unlabeled) in a fully differentiable way.
arXiv Detail & Related papers (2023-10-30T09:52:48Z)
- Multi-Response Heteroscedastic Gaussian Process Models and Their Inference [1.52292571922932]
We propose a novel framework for the modeling of heteroscedastic covariance functions.
We employ variational inference to approximate the posterior and facilitate posterior predictive modeling.
We show that our proposed framework offers a robust and versatile tool for a wide array of applications.
arXiv Detail & Related papers (2023-08-29T15:06:47Z)
- Score-based Diffusion Models in Function Space [140.792362459734]
Diffusion models have recently emerged as a powerful framework for generative modeling.
We introduce a mathematically rigorous framework called Denoising Diffusion Operators (DDOs) for training diffusion models in function space.
We show that the corresponding discretized algorithm generates accurate samples at a fixed cost independent of the data resolution.
arXiv Detail & Related papers (2023-02-14T23:50:53Z)
- Random Sampling High Dimensional Model Representation Gaussian Process Regression (RS-HDMR-GPR) for representing multidimensional functions with machine-learned lower-dimensional terms allowing insight with a general method [0.0]
A Python implementation is provided for RS-HDMR-GPR (Random Sampling High Dimensional Model Representation Gaussian Process Regression).
Code allows for imputation of missing values of the variables and for a significant pruning of the useful number of HDMR terms.
The capabilities of this regression tool are demonstrated on test cases involving synthetic analytic functions, the potential energy surface of the water molecule, kinetic energy densities of materials, and financial market data.
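The HDMR idea of representing a multidimensional function as a sum of lower-dimensional GP terms can be sketched with simple backfitting. This is an illustrative toy, not the RS-HDMR-GPR package itself; the test function, the choice of variable subsets, and the hyperparameters are all assumptions:

```python
import numpy as np

def gp_fit_predict(Xtr, ytr, Xte, ls=1.0, noise=1e-2):
    """Exact GP regression with an RBF kernel (posterior mean only)."""
    def k(a, b):
        sq = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
        return np.exp(-0.5 * sq / ls**2)
    K = k(Xtr, Xtr) + noise * np.eye(len(Xtr))
    return k(Xte, Xtr) @ np.linalg.solve(K, ytr)

def hdmr_gpr(X, y, subsets, sweeps=20):
    """Backfitting: each low-dimensional GP term fits the residual of the others."""
    comps = [np.zeros(len(y)) for _ in subsets]
    for _ in range(sweeps):
        for i, s in enumerate(subsets):
            resid = y - sum(c for j, c in enumerate(comps) if j != i)
            comps[i] = gp_fit_predict(X[:, s], resid, X[:, s])
    return sum(comps)

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, (60, 5))
y = np.sin(2 * X[:, 0]) + X[:, 1] * X[:, 2]   # sum of a 1-D and a 2-D term
fhat = hdmr_gpr(X, y, subsets=[[0], [1, 2]])
```

Because the target really is a sum of a 1-D and a 2-D component, two low-dimensional GP terms suffice here; in general, pruning decides which subsets to keep.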
arXiv Detail & Related papers (2020-11-24T00:12:05Z)
- Factorized Gaussian Process Variational Autoencoders [6.866104126509981]
Variational autoencoders often assume isotropic Gaussian priors and mean-field posteriors, hence do not exploit structure in scenarios where we may expect similarity or consistency across latent variables.
We propose a more scalable extension of these models by leveraging the independence of the auxiliary features, which is present in many datasets.
arXiv Detail & Related papers (2020-11-14T10:24:10Z)
- Multi-fidelity data fusion for the approximation of scalar functions with low intrinsic dimensionality through active subspaces [0.0]
We present a multi-fidelity approach involving active subspaces and we test it on two different high-dimensional benchmarks.
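An active subspace is typically estimated from the eigendecomposition of the averaged outer product of gradients, C = E[∇f ∇fᵀ]: the top eigenvectors span the directions along which the function varies most. A minimal sketch (the test function and sample size are illustrative assumptions):

```python
import numpy as np

def active_subspace(grads, r):
    """Top-r eigenvectors of C = E[grad f grad f^T], estimated from samples."""
    C = grads.T @ grads / len(grads)
    vals, vecs = np.linalg.eigh(C)
    order = np.argsort(vals)[::-1]          # sort eigenvalues descending
    return vals[order], vecs[:, order[:r]]

# f(x) = sin(w^T x) varies only along w, so its active subspace is span{w}.
rng = np.random.default_rng(2)
w = np.array([3.0, 0.0, 4.0]) / 5.0
X = rng.standard_normal((200, 3))
grads = np.cos(X @ w)[:, None] * w          # gradient of sin(w^T x)
vals, W = active_subspace(grads, r=1)
print(np.abs(W[:, 0] @ w))                  # ≈ 1: recovered direction aligns with w
```

A sharp drop in the eigenvalue spectrum after the r-th eigenvalue is the usual diagnostic that an r-dimensional active subspace is adequate.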
arXiv Detail & Related papers (2020-10-16T12:35:49Z)
- Multiplicative noise and heavy tails in stochastic optimization [62.993432503309485]
Stochastic optimization is central to modern machine learning, but the role of the stochasticity in its success is still unclear.
We show that heavy tails commonly arise in the parameters as a consequence of multiplicative noise.
A detailed analysis is conducted describing the key factors, including step size and data, with similar behavior observed on state-of-the-art neural network models.
arXiv Detail & Related papers (2020-06-11T09:58:01Z)
- Projection Pursuit Gaussian Process Regression [5.837881923712394]
A primary goal of computer experiments is to reconstruct the function given by the computer code via scattered evaluations.
Traditional isotropic Gaussian process models suffer from the curse of dimensionality, when the input dimension is relatively high given limited data points.
We consider a projection pursuit model, in which the nonparametric part is driven by an additive Gaussian process regression.
arXiv Detail & Related papers (2020-04-01T19:12:01Z)
- Efficiently Sampling Functions from Gaussian Process Posteriors [76.94808614373609]
We propose an easy-to-use and general-purpose approach for fast posterior sampling.
We demonstrate how decoupled sample paths accurately represent Gaussian process posteriors at a fraction of the usual cost.
arXiv Detail & Related papers (2020-02-21T14:03:16Z)
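The decoupled-sample-paths idea builds on Matheron's rule: draw one sample from the GP prior, then add a deterministic data-driven update to turn it into a posterior sample. The bare-bones version below uses an exact prior sample rather than the paper's Fourier-feature approximation; the kernel, noise level, and grid sizes are illustrative assumptions:

```python
import numpy as np

def rbf(a, b, ls=1.0):
    """Squared-exponential kernel between 1-D input arrays."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ls**2)

rng = np.random.default_rng(3)
Xtr = np.linspace(-2.0, 2.0, 8)      # training inputs
ytr = np.sin(Xtr)                    # observations (nearly noiseless)
Xte = np.linspace(-3.0, 3.0, 100)    # test inputs
noise = 1e-6

# 1) Draw one joint sample from the GP prior at train + test locations.
Xall = np.concatenate([Xtr, Xte])
Kall = rbf(Xall, Xall) + 1e-8 * np.eye(len(Xall))   # jitter for stability
f_prior = np.linalg.cholesky(Kall) @ rng.standard_normal(len(Xall))

# 2) Matheron's rule: correct the prior sample with a data-driven update,
#    f_post(x) = f_prior(x) + K(x,Xtr) (K(Xtr,Xtr) + noise I)^{-1} (y - f_prior(Xtr) - eps)
eps = np.sqrt(noise) * rng.standard_normal(len(Xtr))
Ktr = rbf(Xtr, Xtr) + noise * np.eye(len(Xtr))
weights = np.linalg.solve(Ktr, ytr - f_prior[: len(Xtr)] - eps)
f_post = f_prior + rbf(Xall, Xtr) @ weights          # posterior sample everywhere
```

The update is a single linear solve against the training data, which is what makes drawing many posterior paths cheap once the prior samples are available.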
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the list (including all information) and is not responsible for any consequences.