Multi-fidelity modeling with different input domain definitions using
Deep Gaussian Processes
- URL: http://arxiv.org/abs/2006.15924v1
- Date: Mon, 29 Jun 2020 10:44:06 GMT
- Title: Multi-fidelity modeling with different input domain definitions using
Deep Gaussian Processes
- Authors: Ali Hebbal, Loic Brevault, Mathieu Balesdent, El-Ghazali Talbi and
Nouredine Melab
- Abstract summary: Multi-fidelity approaches combine different models built on a scarce but accurate data-set (high-fidelity data-set) and a large but approximate one (low-fidelity data-set). Deep Gaussian Processes (DGPs), which are functional compositions of GPs, have also been adapted to multi-fidelity through the Multi-Fidelity Deep Gaussian Process model (MF-DGP).
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Multi-fidelity approaches combine different models built on a scarce but
accurate data-set (high-fidelity data-set), and a large but approximate one
(low-fidelity data-set) in order to improve the prediction accuracy. Gaussian
Processes (GPs) are a popular approach to capturing the correlations between
these fidelity levels. Deep Gaussian Processes (DGPs), which are functional
compositions of GPs, have also been adapted to multi-fidelity through the
Multi-Fidelity Deep Gaussian Process model (MF-DGP). This model
increases the expressive power compared to GPs by considering non-linear
correlations between fidelities within a Bayesian framework. However, these
multi-fidelity methods consider only the case where the inputs of the
different fidelity models are defined over the same domain (e.g., same
variables, same dimensions). In practice, owing to simplifications in the
low-fidelity model, some variables may be omitted or a different
parametrization may be used compared to the high-fidelity model. In this
paper, Deep Gaussian
Processes for multi-fidelity (MF-DGP) are extended to the case where a
different parametrization is used for each fidelity. The performance of the
proposed multi-fidelity modeling technique is assessed on analytical test cases
and on structural and aerodynamic real physical problems.
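To illustrate the kind of setting the paper addresses — not the MF-DGP model itself — the following is a minimal two-fidelity sketch in Python with scikit-learn. The low-fidelity model is defined on a smaller input domain (it omits one variable), and the low-fidelity prediction is mapped into the high-fidelity input space as an extra feature, a standard GP composition trick; the test functions, sample sizes, and kernels are illustrative assumptions.

```python
# Two-fidelity sketch with different input domains (illustrative only,
# NOT the paper's MF-DGP): the low-fidelity model depends on x1 alone,
# while the high-fidelity model uses (x1, x2).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

def f_low(x1):              # cheap model: x2 omitted by simplification
    return np.sin(8.0 * x1)

def f_high(x1, x2):         # expensive model: full parametrization
    return np.sin(8.0 * x1) + 0.3 * x2

# Large low-fidelity data-set on its own (1-D) domain.
X_lo = rng.uniform(0.0, 1.0, (50, 1))
y_lo = f_low(X_lo[:, 0])

# Scarce high-fidelity data-set on the full (2-D) domain.
X_hi = rng.uniform(0.0, 1.0, (8, 2))
y_hi = f_high(X_hi[:, 0], X_hi[:, 1])

gp_lo = GaussianProcessRegressor(kernel=RBF(), normalize_y=True)
gp_lo.fit(X_lo, y_lo)

# Bridge the domains: evaluate the low-fidelity GP on the shared
# variable x1 and append its prediction as an input feature for the
# high-fidelity GP.
def augment(X):
    return np.column_stack([X, gp_lo.predict(X[:, :1])])

gp_hi = GaussianProcessRegressor(kernel=RBF(length_scale=[1.0, 1.0, 1.0]),
                                 normalize_y=True)
gp_hi.fit(augment(X_hi), y_hi)

X_test = rng.uniform(0.0, 1.0, (5, 2))
pred = gp_hi.predict(augment(X_test))
```

MF-DGP replaces this deterministic feature mapping with a Bayesian composition of GP layers, which is what allows non-linear fidelity correlations and calibrated uncertainty; the paper's contribution is to handle exactly the domain mismatch that the `augment` step papers over here.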
Related papers
- Gradient-enhanced deep Gaussian processes for multifidelity modelling [0.0]
Multifidelity models integrate data from multiple sources to produce a single approximator for the underlying process.
Deep Gaussian processes (GPs) are attractive for multifidelity modelling as they are non-parametric, robust to overfitting, and perform well on small datasets.
arXiv Detail & Related papers (2024-02-25T11:08:19Z)
- Heterogeneous Multi-Task Gaussian Cox Processes [61.67344039414193]
We present a novel extension of multi-task Gaussian Cox processes for modeling heterogeneous correlated tasks jointly.
A MOGP prior over the parameters of the dedicated likelihoods for classification, regression and point process tasks can facilitate sharing of information between heterogeneous tasks.
We derive a mean-field approximation to realize closed-form iterative updates for estimating model parameters.
arXiv Detail & Related papers (2023-08-29T15:01:01Z)
- Multi-fidelity Hierarchical Neural Processes [79.0284780825048]
Multi-fidelity surrogate modeling reduces the computational cost by fusing different simulation outputs.
We propose Multi-fidelity Hierarchical Neural Processes (MF-HNP), a unified neural latent variable model for multi-fidelity surrogate modeling.
We evaluate MF-HNP on epidemiology and climate modeling tasks, achieving competitive performance in terms of accuracy and uncertainty estimation.
arXiv Detail & Related papers (2022-06-10T04:54:13Z)
- Dynamically-Scaled Deep Canonical Correlation Analysis [77.34726150561087]
Canonical Correlation Analysis (CCA) is a method for feature extraction of two views by finding maximally correlated linear projections of them.
We introduce a novel dynamic scaling method for training an input-dependent canonical correlation model.
arXiv Detail & Related papers (2022-03-23T12:52:49Z)
- Deep Variational Models for Collaborative Filtering-based Recommender Systems [63.995130144110156]
Deep learning provides accurate collaborative filtering models to improve recommender system results.
Our proposed models apply the variational concept to inject noise in the latent space of the deep architecture.
Results show the superiority of the proposed approach in scenarios where the variational enrichment exceeds the injected noise effect.
arXiv Detail & Related papers (2021-07-27T08:59:39Z)
- Multifidelity Modeling for Physics-Informed Neural Networks (PINNs) [13.590496719224987]
Physics-informed Neural Networks (PINNs) are candidates for multifidelity simulation approaches.
We propose a particular multifidelity approach applied to PINNs that exploits low-rank structure.
arXiv Detail & Related papers (2021-06-25T00:19:19Z)
- Post-mortem on a deep learning contest: a Simpson's paradox and the complementary roles of scale metrics versus shape metrics [61.49826776409194]
We analyze a corpus of models made publicly-available for a contest to predict the generalization accuracy of neural network (NN) models.
We identify what amounts to a Simpson's paradox: "scale" metrics perform well overall but perform poorly on sub-partitions of the data.
We present two novel shape metrics, one data-independent, and the other data-dependent, which can predict trends in the test accuracy of a series of NNs.
arXiv Detail & Related papers (2021-06-01T19:19:49Z)
- Probabilistic Circuits for Variational Inference in Discrete Graphical Models [101.28528515775842]
Inference in discrete graphical models with variational methods is difficult.
Many sampling-based methods have been proposed for estimating the Evidence Lower Bound (ELBO).
We propose a new approach that leverages the tractability of probabilistic circuit models, such as Sum-Product Networks (SPNs).
We show that selective SPNs are suitable as an expressive variational distribution, and prove that when the log-density of the target model is a polynomial the corresponding ELBO can be computed analytically.
arXiv Detail & Related papers (2020-10-22T05:04:38Z)
- Overview of Gaussian process based multi-fidelity techniques with variable relationship between fidelities [0.0]
Multi-fidelity modeling is a way to merge different fidelity models to provide engineers with accurate results with a limited computational cost.
The relationship between the fidelity models is a key aspect in multi-fidelity modeling.
This paper provides an overview of Gaussian process-based multi-fidelity modeling techniques for variable relationship between the fidelity models.
arXiv Detail & Related papers (2020-06-30T12:37:41Z)
- Multi-Fidelity High-Order Gaussian Processes for Physical Simulation [24.033468062984458]
High-fidelity simulations of partial differential equations (PDEs) are more expensive than low-fidelity ones.
We propose Multi-Fidelity High-Order Gaussian Process (MFHoGP) that can capture complex correlations.
MFHoGP propagates bases throughout fidelities to fuse information, and places a deep matrix GP prior over the basis weights.
arXiv Detail & Related papers (2020-06-08T22:31:59Z)
- Conditional Deep Gaussian Processes: multi-fidelity kernel learning [6.599344783327053]
We propose the conditional DGP model in which the latent GPs are directly supported by the fixed lower fidelity data.
Experiments with synthetic and high dimensional data show comparable performance against other multi-fidelity regression methods.
We conclude that, given the low-fidelity data and the hierarchical DGP structure, the effective kernel encodes the inductive bias for the true function.
arXiv Detail & Related papers (2020-02-07T14:56:11Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences.