GenMod: A generative modeling approach for spectral representation of
PDEs with random inputs
- URL: http://arxiv.org/abs/2201.12973v1
- Date: Mon, 31 Jan 2022 02:56:20 GMT
- Title: GenMod: A generative modeling approach for spectral representation of
PDEs with random inputs
- Authors: Jacqueline Wentz and Alireza Doostan
- Abstract summary: We present an approach where we assume the coefficients are close to the range of a generative model that maps from a low- to a high-dimensional space of coefficients.
Using results from PDE theory on coefficient decay rates, we construct an explicit generative model that predicts the polynomial chaos coefficient magnitudes.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We propose a method for quantifying uncertainty in high-dimensional PDE
systems with random parameters, where the number of solution evaluations is
small. Parametric PDE solutions are often approximated using a spectral
decomposition based on polynomial chaos expansions. For the class of systems we
consider (i.e., high dimensional with limited solution evaluations), the
coefficients are given by an underdetermined linear system in a regression
formulation. This implies additional assumptions, such as sparsity of the
coefficient vector, are needed to approximate the solution. Here, we present an
approach where we assume the coefficients are close to the range of a
generative model that maps from a low- to a high-dimensional space of
coefficients. Our approach is inspired by recent work examining how generative
models can be used for compressed sensing in systems with random Gaussian
measurement matrices. Using results from PDE theory on coefficient decay rates,
we construct an explicit generative model that predicts the polynomial chaos
coefficient magnitudes. The algorithm we developed to find the coefficients,
which we call GenMod, is composed of two main steps. First, we predict the
coefficient signs using Orthogonal Matching Pursuit. Then, we assume the
coefficients are within a sparse deviation from the range of a sign-adjusted
generative model. This allows us to find the coefficients by solving a
nonconvex optimization problem, over the input space of the generative model
and the space of sparse vectors. We obtain theoretical recovery results for a
Lipschitz continuous generative model and for a more specific generative model,
based on coefficient decay rate bounds. We examine three high-dimensional
problems and show that, for all three examples, the generative model approach
outperforms sparsity promoting methods at small sample sizes.
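
To make the regression formulation concrete, here is a minimal sketch of how such an underdetermined system arises. It is not the paper's code: the sizes d, p, n, the helper names, and the Legendre construction via numpy.polynomial are illustrative assumptions; only the structure (one row per solution evaluation, one column per PC basis function, far fewer rows than columns) reflects the abstract.

```python
import numpy as np
from itertools import product

def total_degree_multi_indices(d, p):
    """All multi-indices alpha in N^d with |alpha| <= p (total-degree set)."""
    return [alpha for alpha in product(range(p + 1), repeat=d)
            if sum(alpha) <= p]

def legendre_design_matrix(Y, indices):
    """Psi[i, j] = prod_k L_{alpha_j[k]}(Y[i, k]), a tensorized Legendre basis."""
    Psi = np.ones((Y.shape[0], len(indices)))
    for j, alpha in enumerate(indices):
        for k, deg in enumerate(alpha):
            if deg > 0:
                coef = np.zeros(deg + 1)
                coef[deg] = 1.0  # coefficient vector selecting L_deg
                Psi[:, j] *= np.polynomial.legendre.legval(Y[:, k], coef)
    return Psi

rng = np.random.default_rng(0)
d, p, n = 10, 3, 60                       # P = C(10 + 3, 3) = 286 coefficients
indices = total_degree_multi_indices(d, p)
Y = rng.uniform(-1.0, 1.0, size=(n, d))   # n parameter samples in [-1, 1]^d
Psi = legendre_design_matrix(Y, indices)  # measurement matrix, n x P
print(Psi.shape)                          # (60, 286): n << P, underdetermined
```

With 60 equations for 286 unknowns, plain least squares cannot pin down the coefficients; this is the gap that sparsity assumptions, or a generative-model prior, are meant to fill.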
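The two-step GenMod algorithm itself can then be sketched as follows. This is a hedged reading of the abstract, not the authors' implementation: the exponential-decay form of G(z), defaulting signs that OMP leaves undetected to +1, the penalty weight lam, and the alternating L-BFGS/Lasso treatment of the nonconvex problem are all assumptions; only the overall structure (OMP sign prediction, then optimization over the generative model's latent input z plus a sparse deviation nu) comes from the paper.

```python
import numpy as np
from itertools import product
from scipy.optimize import minimize
from sklearn.linear_model import OrthogonalMatchingPursuit, Lasso

rng = np.random.default_rng(1)

# Toy problem: multi-index set, synthetic decaying coefficients, and a random
# stand-in for the PC measurement matrix (the paper uses the basis matrix).
d, p = 4, 4
alphas = np.array([a for a in product(range(p + 1), repeat=d) if sum(a) <= p])
P, n = len(alphas), len(alphas) // 4              # deliberately underdetermined
Psi = rng.standard_normal((n, P)) / np.sqrt(n)
c_true = rng.choice([-1.0, 1.0], P) * np.exp(-1.2 * alphas.sum(axis=1))
u = Psi @ c_true + 1e-3 * rng.standard_normal(n)  # noisy solution evaluations

# Step 1: predict coefficient signs with Orthogonal Matching Pursuit.
omp = OrthogonalMatchingPursuit(n_nonzero_coefs=n // 2).fit(Psi, u)
s = np.where(omp.coef_ < 0, -1.0, 1.0)  # undetected entries default to +1

def G(z):
    """Hypothetical decay model: magnitudes shrink with the polynomial degree."""
    return np.exp(-alphas @ np.abs(z))  # one decay rate per random input

def solve_genmod(lam=1e-3, iters=10):
    """Minimize ||Psi(s*G(z) + nu) - u||^2 + lam*||nu||_1 by alternation."""
    z, nu = np.ones(d), np.zeros(P)
    for _ in range(iters):
        # (a) fix nu, fit the latent decay rates z (smooth, low-dimensional)
        obj = lambda zz: np.sum((Psi @ (s * G(zz) + nu) - u) ** 2)
        z = minimize(obj, z, method="L-BFGS-B").x
        # (b) fix z, fit the sparse deviation nu as a Lasso sub-problem
        r = u - Psi @ (s * G(z))
        nu = Lasso(alpha=lam, fit_intercept=False, max_iter=5000).fit(Psi, r).coef_
    return s * G(z) + nu

c_hat = solve_genmod()
print("relative error:", np.linalg.norm(c_hat - c_true) / np.linalg.norm(c_true))
```

The appeal of this split is that step (a) searches only the generative model's low-dimensional latent space, while step (b) is a standard convex sparse-recovery problem; the abstract's theoretical guarantees concern when such a combination recovers the true coefficients.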
Related papers
- Modeling High-Dimensional Dependent Data in the Presence of Many Explanatory Variables and Weak Signals
This article considers a novel and widely applicable approach to modeling high-dimensional dependent data when a large number of explanatory variables are available and the signal-to-noise ratio is low.
arXiv Detail & Related papers (2024-12-06T02:54:31Z)
- A Stein Gradient Descent Approach for Doubly Intractable Distributions
We propose a novel Monte Carlo Stein variational gradient descent (MC-SVGD) approach to inference for doubly intractable distributions.
The proposed method achieves substantial computational gains over existing algorithms, while providing comparable inferential performance for the posterior distributions.
arXiv Detail & Related papers (2024-10-28T13:42:27Z)
- Total Uncertainty Quantification in Inverse PDE Solutions Obtained with Reduced-Order Deep Learning Surrogate Models
We propose an approximate Bayesian method for quantifying the total uncertainty in inverse PDE solutions obtained with machine learning surrogate models.
We test the proposed framework by comparing it with the iterative ensemble smoother and deep ensembling methods for a non-linear diffusion equation.
arXiv Detail & Related papers (2024-08-20T19:06:02Z)
- Probabilistic Unrolling: Scalable, Inverse-Free Maximum Likelihood Estimation for Latent Gaussian Models
We introduce probabilistic unrolling, a method that combines Monte Carlo sampling with iterative linear solvers to circumvent matrix inversions.
Our theoretical analyses reveal that unrolling and backpropagation through the iterations of the solver can accelerate gradient estimation for maximum likelihood estimation.
In experiments on simulated and real data, we demonstrate that probabilistic unrolling learns latent Gaussian models up to an order of magnitude faster than gradient EM, with minimal losses in model performance.
arXiv Detail & Related papers (2023-06-05T21:08:34Z)
- High-dimensional scaling limits and fluctuations of online least-squares SGD with smooth covariance
We derive high-dimensional scaling limits and fluctuations for the online least-squares stochastic gradient descent (SGD) algorithm.
Our results have several applications, including characterization of the limiting mean-square estimation or prediction errors and their fluctuations.
arXiv Detail & Related papers (2023-04-03T03:50:00Z)
- Score-based Diffusion Models in Function Space
Diffusion models have recently emerged as a powerful framework for generative modeling.
This work introduces a mathematically rigorous framework called Denoising Diffusion Operators (DDOs) for training diffusion models in function space.
We show that the corresponding discretized algorithm generates accurate samples at a fixed cost independent of the data resolution.
arXiv Detail & Related papers (2023-02-14T23:50:53Z)
- Sparse Bayesian Learning for Complex-Valued Rational Approximations
Surrogate models are used to alleviate the computational burden in engineering tasks.
These models show a strongly non-linear dependence on their input parameters.
We apply a sparse learning approach to the rational approximation.
arXiv Detail & Related papers (2022-06-06T12:06:13Z)
- Inverting brain grey matter models with likelihood-free inference: a tool for trustable cytoarchitecture measurements
Characterisation of the brain grey matter cytoarchitecture with quantitative sensitivity to soma density and volume remains an unsolved challenge in dMRI.
We propose a new forward model, specifically a new system of equations, requiring a few relatively sparse b-shells.
We then apply modern tools from Bayesian analysis known as likelihood-free inference (LFI) to invert our proposed model.
arXiv Detail & Related papers (2021-11-15T09:08:27Z)
- Probabilistic learning on manifolds constrained by nonlinear partial differential equations for small datasets
A novel extension of the Probabilistic Learning on Manifolds (PLoM) is presented.
It makes it possible to synthesize solutions to a wide range of nonlinear boundary value problems.
Three applications are presented.
arXiv Detail & Related papers (2020-10-27T14:34:54Z)
- Probabilistic Circuits for Variational Inference in Discrete Graphical Models
Inference in discrete graphical models with variational methods is difficult.
Many sampling-based methods have been proposed for estimating the Evidence Lower Bound (ELBO).
We propose a new approach that leverages the tractability of probabilistic circuit models, such as Sum Product Networks (SPNs).
We show that selective-SPNs are suitable as an expressive variational distribution, and prove that when the log-density of the target model is a polynomial, the corresponding ELBO can be computed analytically.
arXiv Detail & Related papers (2020-10-22T05:04:38Z)
- Estimation of Switched Markov Polynomial NARX models
We identify a class of models for hybrid dynamical systems characterized by nonlinear autoregressive (NARX) components.
The proposed approach is demonstrated on an SMNARX problem composed of three nonlinear sub-models with specific regressors.
arXiv Detail & Related papers (2020-09-29T15:00:47Z)
This list is automatically generated from the titles and abstracts of the papers on this site.