Multi-fidelity data fusion for the approximation of scalar functions
with low intrinsic dimensionality through active subspaces
- URL: http://arxiv.org/abs/2010.08349v1
- Date: Fri, 16 Oct 2020 12:35:49 GMT
- Title: Multi-fidelity data fusion for the approximation of scalar functions
with low intrinsic dimensionality through active subspaces
- Authors: Francesco Romor, Marco Tezzele, Gianluigi Rozza
- Abstract summary: We present a multi-fidelity approach involving active subspaces and we test it on two different high-dimensional benchmarks.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Gaussian processes are employed for non-parametric regression in a Bayesian
setting. They generalize linear regression, embedding the inputs in a latent
manifold inside an infinite-dimensional reproducing kernel Hilbert space. We
can augment the inputs with the observations of low-fidelity models in order to
learn a more expressive latent manifold and thus increase the model's
accuracy. This can be realized recursively with a chain of Gaussian processes
with incrementally higher fidelity. We would like to extend these
multi-fidelity model realizations to case studies characterized by a
high-dimensional input space but low intrinsic dimensionality. In these
cases, physically supported or purely numerical low-order models are still
affected by the curse of dimensionality when queried for responses. When the
model's gradient information is provided, the presence of an active subspace
can be exploited to design low-fidelity response surfaces and thus enable
Gaussian process multi-fidelity regression, without the need to perform new
simulations. This is particularly useful in the case of data scarcity. In this
work we present a multi-fidelity approach involving active subspaces and we
test it on two different high-dimensional benchmarks.
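As a concrete, hedged illustration of the pipeline the abstract describes, here is a minimal Python sketch (not the authors' implementation; the toy function, sample sizes, and helper names are illustrative assumptions). It estimates the active subspace from the eigendecomposition of the gradient covariance matrix C = E[grad f grad f^T], builds a cheap low-fidelity response surface over the active variable without new simulations, and trains a high-fidelity Gaussian process on the inputs augmented with the low-fidelity prediction, in the spirit of nonlinear autoregressive multi-fidelity schemes.

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(0)
dim, n_grad, n_hf = 10, 200, 25

# Toy high-dimensional function with one-dimensional intrinsic structure:
# f(x) = g(w^T x), so its active subspace is span{w} (illustrative choice).
w = rng.standard_normal(dim)
w /= np.linalg.norm(w)
f = lambda X: np.sin(X @ w) + 0.5 * (X @ w) ** 2
grad_f = lambda X: (np.cos(X @ w) + X @ w)[:, None] * w

# 1) Active subspace: eigendecomposition of C = E[grad f grad f^T],
#    estimated by Monte Carlo over the input distribution.
X_mc = rng.uniform(-1.0, 1.0, (n_grad, dim))
G = grad_f(X_mc)
C = G.T @ G / n_grad
eigvals, eigvecs = np.linalg.eigh(C)   # eigenvalues in ascending order
W1 = eigvecs[:, -1:]                   # leading active direction

# 2) Low-fidelity response surface over the active variable W1^T x,
#    reusing the gradient-sample inputs (no extra simulations needed).
gp_lf = GaussianProcessRegressor(ConstantKernel() * RBF(), alpha=1e-8)
gp_lf.fit(X_mc @ W1, f(X_mc))
lofi = lambda X: gp_lf.predict(X @ W1)

# 3) Multi-fidelity GP: augment the scarce high-fidelity inputs with the
#    low-fidelity prediction (a simple nonlinear autoregressive scheme).
X_hf = rng.uniform(-1.0, 1.0, (n_hf, dim))
gp_mf = GaussianProcessRegressor(ConstantKernel() * RBF(), alpha=1e-8,
                                 normalize_y=True)
gp_mf.fit(np.column_stack([X_hf, lofi(X_hf)]), f(X_hf))

# Prediction chains the two surrogates.
X_new = rng.uniform(-1.0, 1.0, (500, dim))
pred = gp_mf.predict(np.column_stack([X_new, lofi(X_new)]))
err = np.linalg.norm(pred - f(X_new)) / np.linalg.norm(f(X_new))
print(f"relative L2 error: {err:.3e}")

In a real study the gradients would come from an adjoint solver or automatic differentiation, and the low-fidelity surface could be any cheap model over the active variable; the point is that the high-fidelity GP sees one extra input coordinate that already encodes most of the response's variability.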
Related papers
- Combining additivity and active subspaces for high-dimensional Gaussian
process modeling [2.7140711924836816]
We combine high-dimensional Gaussian process modeling, based on additivity and active subspaces, with a multi-fidelity strategy, showcasing the advantages through experiments on synthetic functions and datasets.
arXiv Detail & Related papers (2024-02-06T08:49:27Z)
- Scaling Riemannian Diffusion Models [68.52820280448991]
We show that our method enables us to scale to high-dimensional tasks on nontrivial manifolds.
We model QCD densities on $SU(n)$ lattices and contrastively learned embeddings on high-dimensional hyperspheres.
arXiv Detail & Related papers (2023-10-30T21:27:53Z)
- On Learning Gaussian Multi-index Models with Gradient Flow [57.170617397894404]
We study gradient flow on the multi-index regression problem for high-dimensional Gaussian data.
We consider a two-timescale algorithm, whereby the low-dimensional link function is learnt with a non-parametric model infinitely faster than the subspace parametrizing the low-rank projection.
arXiv Detail & Related papers (2023-10-30T17:55:28Z)
- Implicit Manifold Gaussian Process Regression [49.0787777751317]
Gaussian process regression is widely used to provide well-calibrated uncertainty estimates.
It can struggle with high-dimensional data, even when the data actually lie on an implicit low-dimensional manifold.
In this paper we propose a technique capable of inferring implicit structure directly from data (labeled and unlabeled) in a fully differentiable way.
arXiv Detail & Related papers (2023-10-30T09:52:48Z)
- Geometric Neural Diffusion Processes [55.891428654434634]
We extend the framework of diffusion models to incorporate a series of geometric priors in infinite-dimensional modelling.
We show that with these conditions, the generative functional model admits the same symmetry.
arXiv Detail & Related papers (2023-07-11T16:51:38Z)
- Random Smoothing Regularization in Kernel Gradient Descent Learning [24.383121157277007]
We present a framework for random smoothing regularization that can adaptively learn a wide range of ground truth functions belonging to the classical Sobolev spaces.
Our estimator can adapt to the structural assumptions of the underlying data and avoid the curse of dimensionality.
arXiv Detail & Related papers (2023-05-05T13:37:34Z)
- Learning in latent spaces improves the predictive accuracy of deep neural operators [0.0]
L-DeepONet is an extension of standard DeepONet, which leverages latent representations of high-dimensional PDE input and output functions identified with suitable autoencoders.
We show that L-DeepONet outperforms the standard approach in terms of both accuracy and computational efficiency across diverse time-dependent PDEs.
arXiv Detail & Related papers (2023-04-15T17:13:09Z)
- VTAE: Variational Transformer Autoencoder with Manifolds Learning [144.0546653941249]
Deep generative models have demonstrated successful applications in learning non-linear data distributions through a number of latent variables.
The nonlinearity of the generator implies that the latent space provides an unsatisfactory projection of the data space, which results in poor representation learning.
We show that geodesics and accurate computation can substantially improve the performance of deep generative models.
arXiv Detail & Related papers (2023-04-03T13:13:19Z)
- Score-based Diffusion Models in Function Space [140.792362459734]
Diffusion models have recently emerged as a powerful framework for generative modeling.
We introduce a mathematically rigorous framework called Denoising Diffusion Operators (DDOs) for training diffusion models in function space.
We show that the corresponding discretized algorithm generates accurate samples at a fixed cost independent of the data resolution.
arXiv Detail & Related papers (2023-02-14T23:50:53Z)
- Infinite-Fidelity Coregionalization for Physical Simulation [22.524773932668023]
Multi-fidelity modeling and learning are important in physical simulation-related applications.
We propose Infinite Fidelity Coregionalization (IFC) to exploit rich information within continuous, infinite fidelities.
We show the advantage of our method in several benchmark tasks in computational physics.
arXiv Detail & Related papers (2022-07-01T23:01:10Z)
- Projection Pursuit Gaussian Process Regression [5.837881923712394]
A primary goal of computer experiments is to reconstruct the function given by the computer code via scattered evaluations.
Traditional isotropic Gaussian process models suffer from the curse of dimensionality when the input dimension is relatively high given limited data points.
We consider a projection pursuit model, in which the nonparametric part is driven by an additive Gaussian process regression.
arXiv Detail & Related papers (2020-04-01T19:12:01Z)
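The projection pursuit construction in the last entry above is also straightforward to prototype. The following minimal sketch, which is not the paper's estimation procedure, greedily fits one-dimensional Gaussian process ridge functions along learned directions to the current residuals, giving an additive model f(x) ~ sum_k g_k(w_k^T x); choosing each direction by linear least squares on the residuals is an illustrative simplification, and all names are hypothetical.

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def fit_ppgp(X, y, n_terms=3):
    # Greedy projection pursuit with 1-D GP ridge functions:
    # f(x) ~ sum_k g_k(w_k^T x).
    terms, resid = [], y.astype(float).copy()
    for _ in range(n_terms):
        # Direction: linear least squares against the current residuals
        # (illustrative simplification of the direction search).
        w, *_ = np.linalg.lstsq(X, resid, rcond=None)
        w /= np.linalg.norm(w)
        # Ridge function: a one-dimensional GP on the projected inputs.
        gp = GaussianProcessRegressor(ConstantKernel() * RBF(), alpha=1e-6)
        gp.fit((X @ w)[:, None], resid)
        resid = resid - gp.predict((X @ w)[:, None])
        terms.append((w, gp))
    return terms

def predict_ppgp(terms, X):
    return sum(gp.predict((X @ w)[:, None]) for w, gp in terms)

# Usage on a toy two-ridge function in eight dimensions.
rng = np.random.default_rng(1)
X = rng.uniform(-1.0, 1.0, (150, 8))
y = np.sin(2.0 * X[:, 0]) + 0.5 * X[:, 1] ** 3
terms = fit_ppgp(X, y)
print("max training error:", np.abs(predict_ppgp(terms, X) - y).max())

Because each ridge function is one-dimensional, the GP fits stay cheap regardless of the ambient input dimension, which is exactly how the projection pursuit structure sidesteps the curse of dimensionality.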