Dimensionality reduction can be used as a surrogate model for
high-dimensional forward uncertainty quantification
- URL: http://arxiv.org/abs/2402.04582v1
- Date: Wed, 7 Feb 2024 04:47:19 GMT
- Title: Dimensionality reduction can be used as a surrogate model for
high-dimensional forward uncertainty quantification
- Authors: Jungho Kim, Sang-ri Yi, Ziqi Wang
- Abstract summary: We introduce a method to construct a surrogate model from the results of dimensionality reduction in uncertainty quantification.
The proposed approach differs from a sequential application of dimensionality reduction followed by surrogate modeling.
The proposed method is demonstrated through two uncertainty quantification problems characterized by high-dimensional input uncertainties.
- Score: 3.218294891039672
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: We introduce a method to construct a stochastic surrogate model from the
results of dimensionality reduction in forward uncertainty quantification. The
hypothesis is that the high-dimensional input augmented by the output of a
computational model admits a low-dimensional representation. This assumption
can be met by numerous uncertainty quantification applications with
physics-based computational models. The proposed approach differs from a
sequential application of dimensionality reduction followed by surrogate
modeling, as we "extract" a surrogate model from the results of dimensionality
reduction in the input-output space. This feature becomes desirable when the
input space is genuinely high-dimensional. The proposed method also diverges
from the Probabilistic Learning on Manifold, as a reconstruction mapping from
the feature space to the input-output space is circumvented. The final product
of the proposed method is a stochastic simulator that propagates a
deterministic input into a stochastic output, preserving the convenience of a
sequential "dimensionality reduction + Gaussian process regression" approach
while overcoming some of its limitations. The proposed method is demonstrated
through two uncertainty quantification problems characterized by
high-dimensional input uncertainties.
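The abstract contrasts the method with a sequential "dimensionality reduction + Gaussian process regression" surrogate. For orientation, here is a minimal sketch of that conventional baseline, not the authors' implementation: PCA compresses the high-dimensional input and a GP is then fit on the reduced coordinates, so a new deterministic input yields a predictive mean and standard deviation. The sample sizes, dimensions, and toy model below are illustrative assumptions.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

# Hypothetical training data: n realizations of a d-dimensional random input
# and the scalar output of an expensive computational model (toy stand-in here).
n, d = 200, 1000
X = rng.standard_normal((n, d))
y = np.sin(X[:, :10].sum(axis=1)) + 0.01 * rng.standard_normal(n)

# Step 1: dimensionality reduction of the input space alone (PCA to 5 components).
pca = PCA(n_components=5).fit(X)
Z = pca.transform(X)

# Step 2: Gaussian process regression on the reduced coordinates.
gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
gp.fit(Z, y)

# A new deterministic input is propagated to a predictive mean and standard
# deviation; the proposed method keeps this convenience while instead reducing
# the joint input-output space and extracting the surrogate from that result.
x_new = rng.standard_normal((1, d))
mean, std = gp.predict(pca.transform(x_new), return_std=True)
print(mean, std)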
Related papers
- Shape-informed surrogate models based on signed distance function domain encoding [8.052704959617207]
We propose a non-intrusive method to build surrogate models that approximate the solution of parameterized partial differential equations (PDEs).
Our approach is based on the combination of two neural networks (NNs).
arXiv Detail & Related papers (2024-09-19T01:47:04Z)
- Learning Latent Space Dynamics with Model-Form Uncertainties: A Stochastic Reduced-Order Modeling Approach [0.0]
This paper presents a probabilistic approach to represent and quantify model-form uncertainties in the reduced-order modeling of complex systems.
The proposed method captures these uncertainties by expanding the approximation space through the randomization of the projection matrix.
The efficacy of the approach is assessed on canonical problems in fluid mechanics by identifying and quantifying the impact of model-form uncertainties on the inferred operators.
arXiv Detail & Related papers (2024-08-30T19:25:28Z)
- Total Uncertainty Quantification in Inverse PDE Solutions Obtained with Reduced-Order Deep Learning Surrogate Models [50.90868087591973]
We propose an approximate Bayesian method for quantifying the total uncertainty in inverse PDE solutions obtained with machine learning surrogate models.
We test the proposed framework by comparing it with the iterative ensemble smoother and deep ensembling methods for a non-linear diffusion equation.
arXiv Detail & Related papers (2024-08-20T19:06:02Z)
- Latent Energy-Based Odyssey: Black-Box Optimization via Expanded Exploration in the Energy-Based Latent Space [65.44449711359724]
The high-dimensional and highly multimodal input design space of black-box functions poses inherent challenges for existing methods.
We consider finding a latent space that serves as a compressed yet accurate representation of the design-value joint space.
We propose a Noise-intensified Telescoping density-Ratio Estimation scheme for variational learning of an accurate latent space model.
arXiv Detail & Related papers (2024-05-27T00:11:53Z)
- Data-free Weight Compress and Denoise for Large Language Models [101.53420111286952]
We propose a novel approach termed Data-free Joint Rank-k Approximation for compressing the parameter matrices.
We achieve a model pruning of 80% parameters while retaining 93.43% of the original performance without any calibration data.
arXiv Detail & Related papers (2024-02-26T05:51:47Z)
- Score-based Diffusion Models in Function Space [140.792362459734]
Diffusion models have recently emerged as a powerful framework for generative modeling.
We introduce a mathematically rigorous framework called Denoising Diffusion Operators (DDOs) for training diffusion models in function space.
We show that the corresponding discretized algorithm generates accurate samples at a fixed cost independent of the data resolution.
arXiv Detail & Related papers (2023-02-14T23:50:53Z)
- Extension of Dynamic Mode Decomposition for dynamic systems with incomplete information based on t-model of optimal prediction [69.81996031777717]
The Dynamic Mode Decomposition has proved to be a very efficient technique to study dynamic data.
The application of this approach becomes problematic if the available data is incomplete because some smaller-scale dimensions are either missing or unmeasured.
We consider a first-order approximation of the Mori-Zwanzig decomposition, state the corresponding optimization problem and solve it with the gradient-based optimization method.
arXiv Detail & Related papers (2022-02-23T11:23:59Z)
- A survey of unsupervised learning methods for high-dimensional uncertainty quantification in black-box-type problems [0.0]
We construct surrogate models for uncertainty quantification (UQ) on complex partial differential equations (PDEs).
The curse of dimensionality can be mitigated by identifying a low-dimensional subspace with suitable unsupervised learning techniques.
We demonstrate both the advantages and limitations of the m-PCE model and conclude that it provides a cost-effective approach to high-dimensional UQ.
arXiv Detail & Related papers (2022-02-09T16:33:40Z)
- Manifold learning-based polynomial chaos expansions for high-dimensional surrogate models [0.0]
We introduce a manifold learning-based method for uncertainty quantification (UQ) in systems describing complex spatiotemporal processes.
The proposed method is able to achieve highly accurate approximations which ultimately lead to the significant acceleration of UQ tasks.
arXiv Detail & Related papers (2021-07-21T00:24:15Z)
- A Fully Bayesian Gradient-Free Supervised Dimension Reduction Method using Gaussian Processes [3.2636291418858474]
The proposed methodology is gradient-free and fully Bayesian, as it quantifies uncertainty in both the low-dimensional subspace and the surrogate model parameters.
It is validated on multiple datasets from engineering and science and compared to two other state-of-the-art methods.
arXiv Detail & Related papers (2020-08-08T14:24:25Z)
- Understanding Implicit Regularization in Over-Parameterized Single Index Model [55.41685740015095]
We design regularization-free algorithms for the high-dimensional single index model.
We provide theoretical guarantees for the induced implicit regularization phenomenon.
arXiv Detail & Related papers (2020-07-16T13:27:47Z)