Context-aware learning of hierarchies of low-fidelity models for
multi-fidelity uncertainty quantification
- URL: http://arxiv.org/abs/2211.10835v1
- Date: Sun, 20 Nov 2022 01:12:51 GMT
- Title: Context-aware learning of hierarchies of low-fidelity models for
multi-fidelity uncertainty quantification
- Authors: Ionut-Gabriel Farcas and Benjamin Peherstorfer and Tobias Neckel and
Frank Jenko and Hans-Joachim Bungartz
- Abstract summary: Multi-fidelity Monte Carlo methods leverage low-fidelity and surrogate models for variance reduction to make uncertainty quantification tractable.
This work proposes a context-aware multi-fidelity Monte Carlo method that optimally balances the costs of training low-fidelity models with the costs of Monte Carlo sampling.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Multi-fidelity Monte Carlo methods leverage low-fidelity and surrogate models
for variance reduction to make uncertainty quantification tractable even when
numerically simulating the physical systems of interest with high-fidelity
models is computationally expensive. This work proposes a context-aware
multi-fidelity Monte Carlo method that optimally balances the costs of training
low-fidelity models with the costs of Monte Carlo sampling. It generalizes the
previously developed context-aware bi-fidelity Monte Carlo method to
hierarchies of multiple models and to more general types of low-fidelity
models. When training low-fidelity models, the proposed approach takes into
account the context in which the learned low-fidelity models will be used,
namely for variance reduction in Monte Carlo estimation, which allows it to
find optimal trade-offs between training and sampling to minimize upper bounds
of the mean-squared errors of the estimators for given computational budgets.
This is in stark contrast to traditional surrogate modeling and model reduction
techniques that construct low-fidelity models with the primary goal of
approximating well the high-fidelity model outputs and typically ignore the
context in which the learned models will be used in upstream tasks. The
proposed context-aware multi-fidelity Monte Carlo method applies to hierarchies
of a wide range of types of low-fidelity models such as sparse-grid and
deep-network models. Numerical experiments with the gyrokinetic simulation code
GENE show speedups of up to two orders of magnitude compared to
standard estimators when quantifying uncertainties in small-scale fluctuations
in confined plasma in fusion reactors. This corresponds to a runtime reduction
from 72 days to about four hours on one node of the Lonestar6 supercomputer at
the Texas Advanced Computing Center.
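As a concrete illustration, below is a minimal sketch of the kind of multi-fidelity Monte Carlo control-variate estimator whose sampling costs are balanced against training costs in this setting. It is a simplified example using placeholder models, sample sizes, and coefficient estimates, not the authors' implementation, and it omits the context-aware training of the low-fidelity models.
```python
# Minimal sketch (not the paper's implementation) of a multi-fidelity Monte Carlo
# (MFMC) control-variate estimator of E[f_1(Z)], the kind of estimator whose
# sampling cost the context-aware approach balances against training cost.
# The model functions and sample sizes below are illustrative placeholders.
import numpy as np

rng = np.random.default_rng(0)

def high_fidelity(z):
    # Placeholder for an expensive high-fidelity model evaluation.
    return np.sin(z) + 0.1 * z**2

def low_fidelity(z):
    # Placeholder for a cheap learned low-fidelity / surrogate model.
    return np.sin(z)

def mfmc_estimate(models, sample_sizes, z_samples):
    """Multi-fidelity Monte Carlo estimator with low-fidelity control variates.

    models:       [f_1, ..., f_k] with f_1 the high-fidelity model
    sample_sizes: m_1 <= ... <= m_k evaluations per model (nested samples)
    z_samples:    array of m_k i.i.d. draws of the uncertain inputs
    """
    k = len(models)
    # Evaluate model i on its first m_i input samples (nested sampling).
    evals = [np.array([models[i](z) for z in z_samples[:sample_sizes[i]]])
             for i in range(k)]
    y1 = evals[0]
    estimate = y1.mean()
    for i in range(1, k):
        yi = evals[i]
        # Control-variate coefficient alpha_i = rho_i * sigma_1 / sigma_i,
        # estimated here from the shared m_1 samples (a pilot-style shortcut;
        # in practice these statistics come from offline pilot runs).
        yi_on_m1 = yi[:sample_sizes[0]]
        rho_i = np.corrcoef(y1, yi_on_m1)[0, 1]
        alpha_i = rho_i * y1.std(ddof=1) / yi_on_m1.std(ddof=1)
        # Correction term: mean over m_i cheap samples minus mean over the
        # previous (smaller) sample set, weighted by alpha_i.
        estimate += alpha_i * (yi.mean() - yi[:sample_sizes[i - 1]].mean())
    return estimate

# Usage: two-model hierarchy, 50 expensive and 5000 cheap evaluations.
z = rng.normal(size=5000)
est = mfmc_estimate([high_fidelity, low_fidelity],
                    sample_sizes=[50, 5000],
                    z_samples=z)
print(f"MFMC estimate of E[f_1(Z)]: {est:.4f}")
```
In the context-aware setting described above, the correlations and per-evaluation costs of the low-fidelity models themselves depend on how much of the budget is invested in training them, and the method chooses this training/sampling split to minimize an upper bound on the estimator's mean-squared error for the given budget.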
Related papers
- Supervised Score-Based Modeling by Gradient Boosting [49.556736252628745]
We propose a Supervised Score-based Model (SSM), which can be viewed as a gradient boosting algorithm combined with score matching.
We provide a theoretical analysis of learning and sampling for SSM to balance inference time and prediction accuracy.
Our model outperforms existing models in both accuracy and inference time.
arXiv Detail & Related papers (2024-11-02T07:06:53Z)
- Bayesian computation with generative diffusion models by Multilevel Monte Carlo [0.16874375111244327]
This paper presents a Multilevel Monte Carlo strategy that significantly reduces the cost of Bayesian computation with diffusion models.
The effectiveness of the proposed Multilevel Monte Carlo approach is demonstrated with three canonical computational imaging problems.
arXiv Detail & Related papers (2024-09-23T19:57:08Z)
- Practical multi-fidelity machine learning: fusion of deterministic and Bayesian models [0.34592277400656235]
Multi-fidelity machine learning methods integrate scarce, resource-intensive high-fidelity data with abundant but less accurate low-fidelity data.
We propose a practical multi-fidelity strategy for problems spanning low- and high-dimensional domains.
arXiv Detail & Related papers (2024-07-21T10:40:50Z)
- General multi-fidelity surrogate models: Framework and active learning strategies for efficient rare event simulation [1.708673732699217]
Estimating the probability of failure for complex real-world systems is often prohibitively expensive.
This paper presents a robust multi-fidelity surrogate modeling strategy.
It is shown to be highly accurate while drastically reducing the number of high-fidelity model calls.
arXiv Detail & Related papers (2022-12-07T00:03:21Z)
- Closed-form Continuous-Depth Models [99.40335716948101]
Continuous-depth neural models rely on advanced numerical differential equation solvers.
We present a new family of models, termed Closed-form Continuous-depth (CfC) networks, that are simple to describe and at least one order of magnitude faster than their differential equation-based counterparts.
arXiv Detail & Related papers (2021-06-25T22:08:51Z)
- Generative Temporal Difference Learning for Infinite-Horizon Prediction [101.59882753763888]
We introduce the $\gamma$-model, a predictive model of environment dynamics with an infinite probabilistic horizon.
We discuss how its training reflects an inescapable tradeoff between training-time and testing-time compounding errors.
arXiv Detail & Related papers (2020-10-27T17:54:12Z)
- Improving the Reconstruction of Disentangled Representation Learners via Multi-Stage Modeling [54.94763543386523]
Current autoencoder-based disentangled representation learning methods achieve disentanglement by penalizing the (aggregate) posterior to encourage statistical independence of the latent factors.
We present a novel multi-stage modeling approach where the disentangled factors are first learned using a penalty-based disentangled representation learning method.
Then, the low-quality reconstruction produced in the first stage is improved with another deep generative model that is trained to model the missing correlated latent variables.
arXiv Detail & Related papers (2020-10-25T18:51:15Z)
- Context-aware surrogate modeling for balancing approximation and sampling costs in multi-fidelity importance sampling and Bayesian inverse problems [0.0]
Multi-fidelity methods leverage low-cost surrogate models to speed up computations.
Because surrogate and high-fidelity models are used together, poor predictions by surrogate models can be compensated for by frequent recourse to high-fidelity models.
This work considers multi-fidelity importance sampling and, both theoretically and computationally, trades off increasing the fidelity of the surrogate models against spending the computational budget on sampling.
arXiv Detail & Related papers (2020-10-22T13:31:51Z)
- Goal-directed Generation of Discrete Structures with Conditional Generative Models [85.51463588099556]
We introduce a novel approach to directly optimize a reinforcement learning objective, maximizing an expected reward.
We test our methodology on two tasks: generating molecules with user-defined properties and identifying short Python expressions which evaluate to a given target value.
arXiv Detail & Related papers (2020-10-05T20:03:13Z)
- Hybrid modeling: Applications in real-time diagnosis [64.5040763067757]
We outline a novel hybrid modeling approach that combines machine-learning-inspired models and physics-based models.
We are using such models for real-time diagnosis applications.
arXiv Detail & Related papers (2020-03-04T00:44:57Z)