Bayesian Nonlocal Operator Regression (BNOR): A Data-Driven Learning
Framework of Nonlocal Models with Uncertainty Quantification
- URL: http://arxiv.org/abs/2211.01330v1
- Date: Thu, 6 Oct 2022 22:37:59 GMT
- Authors: Yiming Fan, Marta D'Elia, Yue Yu, Habib N. Najm, Stewart Silling
- Abstract summary: We consider the problem of modeling heterogeneous materials where micro-scale dynamics and interactions affect global behavior.
We develop a Bayesian framework for uncertainty quantification (UQ) in material response prediction when using nonlocal models.
This work is a first step towards statistical characterization of nonlocal model discrepancy in the context of homogenization.
- Score: 4.705624984585247
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: We consider the problem of modeling heterogeneous materials where micro-scale
dynamics and interactions affect global behavior. In the presence of
heterogeneities in material microstructure it is often impractical, if not
impossible, to provide quantitative characterization of material response. The
goal of this work is to develop a Bayesian framework for uncertainty
quantification (UQ) in material response prediction when using nonlocal models.
Our approach combines the nonlocal operator regression (NOR) technique and
Bayesian inference. Specifically, we use a Markov chain Monte Carlo (MCMC)
method to sample the posterior probability distribution on parameters involved
in the nonlocal constitutive law, and associated modeling discrepancies
relative to higher fidelity computations. As an application, we consider the
propagation of stress waves through a one-dimensional heterogeneous bar with
randomly generated microstructure. Several numerical tests illustrate the
construction, enabling UQ in nonlocal model predictions. Although nonlocal
models have become popular means for homogenization, their statistical
calibration with respect to high-fidelity models has not been presented before.
This work is a first step towards statistical characterization of nonlocal
model discrepancy in the context of homogenization.
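As a rough illustration of the ingredients described above (a nonlocal constitutive law with unknown parameters, calibrated by MCMC against higher-fidelity data), the sketch below runs random-walk Metropolis on a toy nonlocal response model fitted to synthetic "high-fidelity" observations. The response function, data, flat prior, and step size are all hypothetical stand-ins, not the paper's actual operator, dataset, or sampler settings.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for high-fidelity computations: a smooth response
# with additive noise. Not the authors' actual heterogeneous-bar data.
x = np.linspace(0.0, 1.0, 50)
theta_true = np.array([1.5, 0.3])  # [kernel amplitude, horizon-like scale]

def nonlocal_response(theta, x):
    # Toy surrogate: exponentially decaying kernel weight times a smooth
    # loading; stands in for the learned nonlocal operator.
    a, delta = theta
    return a * np.exp(-x / max(delta, 1e-6)) * np.sin(2 * np.pi * x)

y_obs = nonlocal_response(theta_true, x) + 0.05 * rng.standard_normal(x.size)

def log_post(theta, sigma=0.05):
    # Gaussian likelihood with a flat prior on the positive quadrant.
    a, delta = theta
    if a <= 0 or delta <= 0:
        return -np.inf
    r = y_obs - nonlocal_response(theta, x)
    return -0.5 * np.sum(r**2) / sigma**2

# Random-walk Metropolis over the kernel parameters.
theta = np.array([1.0, 0.5])
lp = log_post(theta)
samples = []
for _ in range(5000):
    prop = theta + 0.05 * rng.standard_normal(2)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    samples.append(theta)
samples = np.array(samples)
post_mean = samples[2000:].mean(axis=0)  # discard burn-in
```

The posterior samples then feed forward into predictive uncertainty on the material response, which is the UQ step the abstract refers to.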
Related papers
- Embedded Nonlocal Operator Regression (ENOR): Quantifying model error in learning nonlocal operators [8.585650361148558]
We propose a new framework to learn a nonlocal homogenized surrogate model and its structural model error.
This framework provides discrepancy-adaptive uncertainty quantification for homogenized material response predictions in long-term simulations.
arXiv Detail & Related papers (2024-10-27T04:17:27Z) - A Bias-Variance-Covariance Decomposition of Kernel Scores for Generative Models [13.527864898609398]
We introduce the first bias-variance-covariance decomposition for kernel scores.
We derive a kernel-based variance and entropy for uncertainty estimation.
Based on the wide applicability of kernels, we demonstrate our framework via generalization and uncertainty experiments for image, audio, and language generation.
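A minimal sketch of the kernel-based variance idea: for samples drawn from a generative model, E[k(X, X)] - E[k(X, X')] measures spread under a kernel. The RBF kernel and the Monte-Carlo estimator below are generic assumptions for illustration, not the paper's exact decomposition.

```python
import numpy as np

def rbf(a, b, gamma=1.0):
    # Gaussian (RBF) kernel between two batches of vectors.
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kernel_variance(samples, gamma=1.0):
    # Monte-Carlo estimate of E[k(X, X)] - E[k(X, X')], a kernel-based
    # spread measure over model samples (a generic stand-in for a
    # kernel-score variance term).
    K = rbf(samples, samples, gamma)
    n = len(samples)
    diag = np.trace(K) / n                          # E[k(X, X)] term
    off = (K.sum() - np.trace(K)) / (n * (n - 1))   # E[k(X, X')] term
    return diag - off

rng = np.random.default_rng(1)
tight = rng.normal(0.0, 0.1, size=(200, 2))  # low-spread "generator"
wide = rng.normal(0.0, 1.0, size=(200, 2))   # high-spread "generator"
# A wider sample cloud yields a larger kernel variance.
```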
arXiv Detail & Related papers (2023-10-09T16:22:11Z) - Work statistics, quantum signatures and enhanced work extraction in
quadratic fermionic models [62.997667081978825]
In quadratic fermionic models we determine a quantum correction to the work statistics after a sudden and a time-dependent driving.
Such a correction stems from the non-commutativity of the initial quantum state and the time-dependent Hamiltonian.
Thanks to this non-commutativity, one can assess the onset of non-classical signatures in the Kirkwood-Dirac quasiprobability (KDQ) distribution of work.
arXiv Detail & Related papers (2023-02-27T13:42:40Z) - Score-based Continuous-time Discrete Diffusion Models [102.65769839899315]
We extend diffusion models to discrete variables by introducing a Markov jump process where the reverse process denoises via a continuous-time Markov chain.
We show that an unbiased estimator can be obtained by simply matching the conditional marginal distributions.
We demonstrate the effectiveness of the proposed method on a set of synthetic and real-world music and image benchmarks.
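The reverse denoiser in this setting is a continuous-time Markov chain; the sketch below simulates only a generic forward corruption step (a uniform-rate CTMC over a discrete vocabulary via Gillespie-style sampling) to illustrate the kind of process involved. The rate, vocabulary size, and per-token independence are illustrative assumptions, not the paper's learned reverse chain.

```python
import numpy as np

def ctmc_corrupt(tokens, rate, t_end, n_states, rng):
    # Corrupt each token independently with a uniform-rate continuous-time
    # Markov chain: wait an Exp(rate) time, then jump to a uniformly
    # random state, until time t_end is exceeded (Gillespie sampling).
    out = tokens.copy()
    for i in range(len(out)):
        t = 0.0
        while True:
            t += rng.exponential(1.0 / rate)  # waiting time to next jump
            if t > t_end:
                break
            out[i] = rng.integers(n_states)   # jump to a uniform state
    return out
```

Running the chain for longer (larger `t_end`) drives the tokens toward the uniform distribution, which is the stationary law this simple corruption process mixes to.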
arXiv Detail & Related papers (2022-11-30T05:33:29Z) - Generative models and Bayesian inversion using Laplace approximation [0.3670422696827525]
Recently, inverse problems have been solved using generative models as highly informative priors.
We show that the derived Bayes estimates are consistent, in contrast to the approach that employs the low-dimensional manifold of the generative model.
arXiv Detail & Related papers (2022-03-15T10:05:43Z) - Uncertainty estimation under model misspecification in neural network
regression [3.2622301272834524]
We study the effect of the model choice on uncertainty estimation.
We highlight that under model misspecification, aleatoric uncertainty is not properly captured.
arXiv Detail & Related papers (2021-11-23T10:18:41Z) - Closed-form discovery of structural errors in models of chaotic systems
by integrating Bayesian sparse regression and data assimilation [0.0]
We introduce a framework named MEDIDA: Model Error Discovery with Interpretability and Data Assimilation.
In MEDIDA, first the model error is estimated from differences between the observed states and model-predicted states.
If observations are noisy, a data assimilation technique such as ensemble Kalman filter (EnKF) is first used to provide a noise-free analysis state of the system.
Finally, an equation-discovery technique, such as the relevance vector machine (RVM), a sparsity-promoting Bayesian method, is used to identify an interpretable, parsimonious, closed-form representation of the model error.
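The MEDIDA pipeline above (estimate the model error from observed-minus-predicted residuals, then run sparsity-promoting regression over a library of candidate terms) can be sketched as follows. Sequential thresholded least squares stands in for the RVM, and the dynamical system and candidate library are hypothetical; noise-free "analysis" states are assumed, so the EnKF step is omitted.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic setup: the "truth" has an extra cubic term the imperfect model lacks.
x = rng.uniform(-2, 2, size=200)
true_tendency = 1.0 * x - 0.5 * x**3   # observed tendency (noise-free analysis)
model_tendency = 1.0 * x               # imperfect model prediction
residual = true_tendency - model_tendency  # step 1: estimated model error

# Step 2: library of candidate closed-form terms.
library = np.column_stack([x, x**2, x**3, np.sin(x)])
names = ["x", "x^2", "x^3", "sin(x)"]

# Step 3: sparsity-promoting regression (sequential thresholded least
# squares, a simple stand-in for the RVM used in MEDIDA).
coef = np.linalg.lstsq(library, residual, rcond=None)[0]
for _ in range(5):
    small = np.abs(coef) < 0.1
    coef[small] = 0.0
    active = ~small
    if active.any():
        coef[active] = np.linalg.lstsq(library[:, active], residual,
                                       rcond=None)[0]

discovered = {n: c for n, c in zip(names, coef) if c != 0.0}
# The procedure should single out the missing cubic term.
```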
arXiv Detail & Related papers (2021-10-01T17:19:28Z) - Estimation of Bivariate Structural Causal Models by Variational Gaussian
Process Regression Under Likelihoods Parametrised by Normalising Flows [74.85071867225533]
Causal mechanisms can be described by structural causal models.
One major drawback of state-of-the-art artificial intelligence is its lack of explainability.
arXiv Detail & Related papers (2021-09-06T14:52:58Z) - Leveraging Global Parameters for Flow-based Neural Posterior Estimation [90.21090932619695]
Inferring the parameters of a model based on experimental observations is central to the scientific method.
A particularly challenging setting is when the model is strongly indeterminate, i.e., when distinct sets of parameters yield identical observations.
We present a method for cracking such indeterminacy by exploiting additional information conveyed by an auxiliary set of observations sharing global parameters.
arXiv Detail & Related papers (2021-02-12T12:23:13Z) - Unlabelled Data Improves Bayesian Uncertainty Calibration under
Covariate Shift [100.52588638477862]
We develop an approximate Bayesian inference scheme based on posterior regularisation.
We demonstrate the utility of our method in the context of transferring prognostic models of prostate cancer across globally diverse populations.
arXiv Detail & Related papers (2020-06-26T13:50:19Z) - Multiplicative noise and heavy tails in stochastic optimization [62.993432503309485]
Stochastic optimization is central to modern machine learning, but the precise role it plays in that success remains unclear.
We show that multiplicative noise commonly arises in the parameter updates due to their variance, leading to heavy-tailed behavior.
A detailed analysis is conducted describing how key factors, including step size and data, shape this behavior, with similar results observed on state-of-the-art neural network models.
arXiv Detail & Related papers (2020-06-11T09:58:01Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.