Uncertainty Quantification of Darcy Flow through Porous Media using Deep
Gaussian Process
- URL: http://arxiv.org/abs/2011.01647v2
- Date: Thu, 5 Nov 2020 11:31:03 GMT
- Title: Uncertainty Quantification of Darcy Flow through Porous Media using Deep
Gaussian Process
- Authors: A. Daneshkhah, O. Chatrabgoun, M. Esmaeilbeigi, T. Sedighi, S.
Abolfathi
- Abstract summary: The method is also used for reducing dimensionality of model output.
Deep GPs are multi-layer hierarchical generalisations of GPs with multiple, infinitely wide hidden layers.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: A computational method based on non-linear Gaussian processes (GPs), known
as deep Gaussian processes (deep GPs), for uncertainty quantification and
propagation in modelling of flow through heterogeneous porous media is
presented. The method is also used to reduce the dimensionality of the model
output and consequently to emulate the highly complex relationship between
hydrogeological properties and the reduced-order fluid velocity field in a tractable manner. Deep
GPs are multi-layer hierarchical generalisations of GPs with multiple,
infinitely wide hidden layers. They are efficient models for deep learning
and for modelling high-dimensional complex systems, tackling the complexity
through several hidden layers connected by non-linear mappings. In
this approach, the hydrogeological data are modelled as the output of a
multivariate GP whose inputs are governed by another GP, such that each single
layer is either a standard GP or a Gaussian process latent variable model. A
variational approximation framework is used so that the posterior distribution
of the model outputs associated with given inputs can be analytically
approximated. In contrast to other dimensionality reduction methods, which
do not provide any information about the dimensionality of each hidden layer,
the proposed method automatically selects the dimensionality of each hidden
layer and can be used to propagate the uncertainty obtained in each layer across
the hierarchy. In this way, the dimensionality of the full input space, consisting of
both the geometrical parameters of the modelling domain and the stochastic
hydrogeological parameters, can be simultaneously reduced without the
simplifications generally assumed in stochastic modelling of subsurface
flow problems. This allows estimation of the flow statistics with greatly reduced
computational effort compared to other stochastic approaches such as the Monte
Carlo method.
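The hierarchical construction described in the abstract (each layer's outputs feeding the next GP's inputs) can be illustrated with a minimal sketch that is not taken from the paper: a two-layer deep GP prior sampled with plain NumPy, assuming an RBF kernel with unit hyperparameters. Layer sizes, kernel choice, and function names are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(x, y, lengthscale=1.0, variance=1.0):
    """Squared-exponential (RBF) covariance between two sets of inputs."""
    sq_dists = (np.sum(x**2, 1)[:, None] + np.sum(y**2, 1)[None, :]
                - 2.0 * x @ y.T)
    return variance * np.exp(-0.5 * sq_dists / lengthscale**2)

def sample_gp_layer(inputs, rng, jitter=1e-8):
    """Draw one function sample from a zero-mean GP evaluated at `inputs`."""
    cov = rbf_kernel(inputs, inputs) + jitter * np.eye(len(inputs))
    return rng.multivariate_normal(np.zeros(len(inputs)), cov)[:, None]

rng = np.random.default_rng(0)
x = np.linspace(-2.0, 2.0, 50)[:, None]  # e.g. spatial coordinates of the domain
h = sample_gp_layer(x, rng)              # hidden layer: a GP over the inputs
y = sample_gp_layer(h, rng)              # output layer: a GP over the hidden layer
```

Composing the two draws gives a sample from a (two-layer) deep GP prior; the paper's variational framework is what makes posterior inference over such compositions tractable.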
Related papers
- Deep learning joint extremes of metocean variables using the SPAR model [0.0]
This paper presents a novel deep learning framework for estimating multivariate joint extremes of metocean variables. It is based on the Semi-Parametric Angular-Radial (SPAR) model. We show how the method can be applied in higher dimensions, using a case study for five metocean variables.
arXiv Detail & Related papers (2024-12-20T11:39:07Z) - Pushing the Limits of Large Language Model Quantization via the Linearity Theorem [71.3332971315821]
We present a "linearity theorem" establishing a direct relationship between the layer-wise $\ell_2$ reconstruction error and the model perplexity increase due to quantization.
This insight enables two novel applications: (1) a simple data-free LLM quantization method using Hadamard rotations and MSE-optimal grids, dubbed HIGGS, and (2) an optimal solution to the problem of finding non-uniform per-layer quantization levels.
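The combination of a Hadamard rotation with grid quantization mentioned above can be sketched generically; this is not the HIGGS method itself, and the matrix size, level count, and helper names are illustrative assumptions.

```python
import numpy as np

def sylvester_hadamard(n):
    """Orthonormal Hadamard matrix of size n (a power of two), Sylvester construction."""
    h = np.array([[1.0]])
    while h.shape[0] < n:
        h = np.block([[h, h], [h, -h]])
    return h / np.sqrt(n)

def quantize_uniform(w, n_levels=16):
    """Round each entry to the nearest level of a uniform grid over [min, max]."""
    lo, hi = w.min(), w.max()
    step = (hi - lo) / (n_levels - 1)
    return lo + np.round((w - lo) / step) * step

rng = np.random.default_rng(1)
weights = rng.standard_normal((64, 64))
h = sylvester_hadamard(64)
rotated = h @ weights                      # rotation makes entries near-Gaussian
dequant = h.T @ quantize_uniform(rotated)  # quantize on the grid, rotate back
err = np.mean((weights - dequant) ** 2)    # layer-wise reconstruction error
```

Because the rotation is orthonormal, the reconstruction error is preserved under the inverse rotation, which is what lets a layer-wise error bound translate into a statement about the whole model.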
arXiv Detail & Related papers (2024-11-26T15:35:44Z) - Gradient-enhanced deep Gaussian processes for multifidelity modelling [0.0]
Multifidelity models integrate data from multiple sources to produce a single approximator for the underlying process.
Deep Gaussian processes (GPs) are attractive for multifidelity modelling as they are non-parametric, robust to overfitting, and perform well for small datasets.
arXiv Detail & Related papers (2024-02-25T11:08:19Z) - Subsurface Characterization using Ensemble-based Approaches with Deep
Generative Models [2.184775414778289]
Inverse modeling is limited for ill-posed, high-dimensional applications due to computational costs and poor prediction accuracy with sparse datasets.
We combine the Wasserstein Generative Adversarial Network with Gradient Penalty (WGAN-GP) and the Ensemble Smoother with Multiple Data Assimilation (ES-MDA).
WGAN-GP is trained to generate high-dimensional K fields from a low-dimensional latent space and ES-MDA updates the latent variables by assimilating available measurements.
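The ES-MDA update of latent variables described above can be sketched in a generic form; the function signature, ensemble shapes, and handling of the inflation coefficient are assumptions, not the authors' implementation.

```python
import numpy as np

def es_mda_update(latents, predictions, d_obs, obs_cov, alpha, rng):
    """
    One ES-MDA assimilation step on an ensemble of latent vectors.
    latents:      (n_ens, n_latent) ensemble of latent variables
    predictions:  (n_ens, n_obs) simulated measurements per ensemble member
    d_obs:        (n_obs,) observed data
    obs_cov:      (n_obs, n_obs) measurement-error covariance
    alpha:        inflation coefficient for this step (the 1/alpha_i sum to 1)
    """
    n_ens = latents.shape[0]
    dz = latents - latents.mean(0)
    dd = predictions - predictions.mean(0)
    c_zd = dz.T @ dd / (n_ens - 1)   # latent-data cross-covariance
    c_dd = dd.T @ dd / (n_ens - 1)   # data auto-covariance
    # Perturb observations with inflated noise, then apply the Kalman-type gain.
    perturbed = d_obs + rng.multivariate_normal(
        np.zeros(len(d_obs)), alpha * obs_cov, size=n_ens)
    gain = c_zd @ np.linalg.inv(c_dd + alpha * obs_cov)
    return latents + (perturbed - predictions) @ gain.T
```

In the paper's setting, `predictions` would come from running the forward simulator on K fields decoded by the trained WGAN-GP generator; repeating the step over several `alpha` values completes the multiple-data-assimilation schedule.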
arXiv Detail & Related papers (2023-10-02T01:27:10Z) - Towards Efficient Modeling and Inference in Multi-Dimensional Gaussian
Process State-Space Models [11.13664702335756]
We propose to integrate the efficient transformed Gaussian process (ETGP) into the GPSSM to efficiently model the transition function in high-dimensional latent state space.
We also develop a corresponding variational inference algorithm that surpasses existing methods in terms of parameter count and computational complexity.
arXiv Detail & Related papers (2023-09-03T04:34:33Z) - Heterogeneous Multi-Task Gaussian Cox Processes [61.67344039414193]
We present a novel extension of multi-task Gaussian Cox processes for modeling heterogeneous correlated tasks jointly.
A MOGP prior over the parameters of the dedicated likelihoods for classification, regression and point process tasks can facilitate sharing of information between heterogeneous tasks.
We derive a mean-field approximation to realize closed-form iterative updates for estimating model parameters.
arXiv Detail & Related papers (2023-08-29T15:01:01Z) - Inverting brain grey matter models with likelihood-free inference: a
tool for trustable cytoarchitecture measurements [62.997667081978825]
Characterisation of the brain grey matter cytoarchitecture with quantitative sensitivity to soma density and volume remains an unsolved challenge in dMRI.
We propose a new forward model, specifically a new system of equations, requiring a few relatively sparse b-shells.
We then apply modern tools from Bayesian analysis known as likelihood-free inference (LFI) to invert our proposed model.
arXiv Detail & Related papers (2021-11-15T09:08:27Z) - Moser Flow: Divergence-based Generative Modeling on Manifolds [49.04974733536027]
Moser Flow (MF) is a new class of generative models within the family of continuous normalizing flows (CNFs).
MF does not require invoking or backpropagating through an ODE solver during training.
We demonstrate for the first time the use of flow models for sampling from general curved surfaces.
arXiv Detail & Related papers (2021-08-18T09:00:24Z) - Large-Scale Wasserstein Gradient Flows [84.73670288608025]
We introduce a scalable scheme to approximate Wasserstein gradient flows.
Our approach relies on input-convex neural networks (ICNNs) to discretize the JKO steps.
As a result, we can sample from the measure at each step of the gradient flow and compute its density.
arXiv Detail & Related papers (2021-06-01T19:21:48Z) - Deep Gaussian Processes for Biogeophysical Parameter Retrieval and Model
Inversion [14.097477944789484]
This paper introduces the use of deep Gaussian Processes (DGPs) for bio-geo-physical model inversion.
Unlike shallow GP models, DGPs account for complicated (modular, hierarchical) processes and provide an efficient solution that scales well to big datasets.
arXiv Detail & Related papers (2021-04-16T10:42:01Z) - Bayesian multiscale deep generative model for the solution of
high-dimensional inverse problems [0.0]
A novel multiscale Bayesian inference approach is introduced based on deep probabilistic generative models.
The method allows high-dimensional parameter estimation while exhibiting stability, efficiency and accuracy.
arXiv Detail & Related papers (2021-02-04T11:47:21Z) - A Near-Optimal Gradient Flow for Learning Neural Energy-Based Models [93.24030378630175]
We propose a novel numerical scheme to optimize the gradient flows for learning energy-based models (EBMs).
We derive a second-order Wasserstein gradient flow of the global relative entropy from Fokker-Planck equation.
Compared with existing schemes, Wasserstein gradient flow is a smoother and near-optimal numerical scheme to approximate real data densities.
arXiv Detail & Related papers (2019-10-31T02:26:20Z)
This list is automatically generated from the titles and abstracts of the papers in this site.