Learning Functional Priors and Posteriors from Data and Physics
- URL: http://arxiv.org/abs/2106.05863v1
- Date: Tue, 8 Jun 2021 03:03:24 GMT
- Title: Learning Functional Priors and Posteriors from Data and Physics
- Authors: Xuhui Meng, Liu Yang, Zhiping Mao, Jose del Aguila Ferrandis, George
Em Karniadakis
- Abstract summary: We develop a new Bayesian framework based on deep neural networks to extrapolate in space-time using historical data.
We employ the physics-informed Generative Adversarial Networks (PI-GAN) to learn a functional prior.
At the second stage, we employ the Hamiltonian Monte Carlo (HMC) method to estimate the posterior in the latent space of PI-GANs.
- Score: 3.537267195871802
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: We develop a new Bayesian framework based on deep neural networks to
extrapolate in space-time using historical data and to quantify
uncertainties arising from both noisy and gappy data in physical problems.
Specifically, the proposed approach has two stages: (1) prior learning and (2)
posterior estimation. At the first stage, we employ the physics-informed
Generative Adversarial Networks (PI-GAN) to learn a functional prior either
from a prescribed function distribution, e.g., Gaussian process, or from
historical data and physics. At the second stage, we employ the Hamiltonian
Monte Carlo (HMC) method to estimate the posterior in the latent space of
PI-GANs. In addition, we use two different approaches to encode the physics:
(1) automatic differentiation, used in the physics-informed neural networks
(PINNs) for scenarios with explicitly known partial differential equations
(PDEs), and (2) operator regression using the deep operator network (DeepONet)
for PDE-agnostic scenarios. We then test the proposed method for (1)
meta-learning for one-dimensional regression, and forward/inverse PDE problems
(combined with PINNs); (2) PDE-agnostic physical problems (combined with
DeepONet), e.g., fractional diffusion as well as saturated stochastic
(100-dimensional) flows in heterogeneous porous media; and (3) spatial-temporal
regression problems, i.e., inference of a marine riser displacement field. The
results demonstrate that the proposed approach can provide accurate predictions
as well as uncertainty quantification given very limited scattered and noisy
data, since historical data could be available to provide informative priors.
In summary, the proposed method is capable of learning flexible functional
priors, and can be extended to big data problems using stochastic HMC or
normalizing flows since the latent space is generally characterized as low
dimensional.
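The two-stage recipe above, with a deep generator in stage 1 and HMC over its low-dimensional latent space in stage 2, can be sketched in a few lines. The snippet below is a minimal illustration, not the paper's implementation: the trained PI-GAN generator is replaced by a hypothetical fixed linear map, and a standard leapfrog HMC samples the latent posterior given noisy observations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a trained PI-GAN generator: maps a low-dimensional latent z
# to function values at sensor locations. (Assumption: a random linear map
# replaces the deep generator purely to illustrate stage-2 latent-space HMC.)
latent_dim, n_obs, sigma = 4, 10, 0.1
W = rng.normal(size=(n_obs, latent_dim))

def generator(z):
    return W @ z

z_true = rng.normal(size=latent_dim)
y = generator(z_true) + sigma * rng.normal(size=n_obs)  # noisy, scattered data

def potential(z):
    # Negative log posterior: Gaussian likelihood plus N(0, I) latent prior
    r = generator(z) - y
    return 0.5 * r @ r / sigma**2 + 0.5 * z @ z

def grad_potential(z):
    return W.T @ (W @ z - y) / sigma**2 + z

def hmc_step(z, eps=0.01, n_leapfrog=20):
    p = rng.normal(size=z.shape)
    z_new, p_new = z.copy(), p.copy()
    # Leapfrog integration of the Hamiltonian dynamics
    p_new -= 0.5 * eps * grad_potential(z_new)
    for _ in range(n_leapfrog - 1):
        z_new += eps * p_new
        p_new -= eps * grad_potential(z_new)
    z_new += eps * p_new
    p_new -= 0.5 * eps * grad_potential(z_new)
    # Metropolis correction keeps the exact latent posterior invariant
    dH = (potential(z_new) + 0.5 * p_new @ p_new) - (potential(z) + 0.5 * p @ p)
    return (z_new, True) if np.log(rng.uniform()) < -dH else (z, False)

z, samples, accepted = np.zeros(latent_dim), [], 0
for i in range(2000):
    z, ok = hmc_step(z)
    accepted += ok
    if i >= 500:  # discard burn-in
        samples.append(z.copy())
samples = np.asarray(samples)
print("acceptance rate:", accepted / 2000)
print("data misfit of posterior mean:",
      np.linalg.norm(generator(samples.mean(axis=0)) - y))
```

Pushing each latent sample through the generator yields an ensemble of functions whose spread quantifies the uncertainty; because the latent space is low dimensional, plain HMC remains cheap even when the generated functions are high dimensional.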
Related papers
- Leveraging viscous Hamilton-Jacobi PDEs for uncertainty quantification in scientific machine learning [1.8175282137722093]
Uncertainty quantification (UQ) in scientific machine learning (SciML) combines the predictive power of SciML with methods for quantifying the reliability of the learned models.
We provide a new interpretation for UQ problems by establishing a new theoretical connection between some Bayesian inference problems arising in SciML and viscous Hamilton-Jacobi partial differential equations (HJ PDEs)
We develop a new Riccati-based methodology that provides computational advantages when continuously updating the model predictions.
arXiv Detail & Related papers (2024-04-12T20:54:01Z)
- SPDE priors for uncertainty quantification of end-to-end neural data assimilation schemes [4.213142548113385]
Recent advances in deep learning make it possible to address this problem with neural architectures that embed a variational data assimilation framework.
In this work, we draw from SPDE-based Processes to estimate prior models able to handle non-stationary covariances in both space and time.
Our neural variational scheme is modified to embed an augmented state formulation, estimating both the state and the SPDE parametrization.
arXiv Detail & Related papers (2024-02-02T19:18:12Z)
- Assessing Neural Network Representations During Training Using Noise-Resilient Diffusion Spectral Entropy [55.014926694758195]
Entropy and mutual information in neural networks provide rich information on the learning process.
We leverage data geometry to access the underlying manifold and reliably compute these information-theoretic measures.
We show that they form noise-resistant measures of intrinsic dimensionality and relationship strength in high-dimensional simulated data.
arXiv Detail & Related papers (2023-12-04T01:32:42Z)
- Implicit Stochastic Gradient Descent for Training Physics-informed Neural Networks [51.92362217307946]
Physics-informed neural networks (PINNs) have effectively been demonstrated in solving forward and inverse differential equation problems.
However, PINNs can be trapped in training failures when the target functions to be approximated exhibit high-frequency or multi-scale features.
In this paper, we propose to employ the implicit stochastic gradient descent (ISGD) method to train PINNs, improving the stability of the training process.
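The stability benefit of an implicit update can be seen on a toy problem. The sketch below is illustrative only (the paper applies ISGD to PINN parameters): on a stiff one-dimensional quadratic loss, the implicit step is solved in closed form and stays stable at a learning rate for which explicit gradient descent diverges.

```python
# Implicit vs. explicit gradient descent on the stiff quadratic loss
# L(t) = 0.5 * a * t^2 (toy illustration of the implicit update
# t_{k+1} = t_k - lr * L'(t_{k+1}), which for this loss solves to
# t_{k+1} = t_k / (1 + lr * a) and is stable for any lr > 0).
a, lr = 100.0, 0.05          # lr * a = 5 > 2: explicit update is unstable
t_exp, t_imp = 1.0, 1.0
for _ in range(50):
    t_exp = t_exp - lr * a * t_exp   # explicit step: multiplies by (1 - lr*a) = -4
    t_imp = t_imp / (1.0 + lr * a)   # implicit step: multiplies by 1/6
print(abs(t_exp), abs(t_imp))
```

The explicit iterate blows up while the implicit one converges to the minimizer at zero, which is the mechanism the ISGD paper exploits for stiff PINN losses.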
arXiv Detail & Related papers (2023-03-03T08:17:47Z)
- Random Grid Neural Processes for Parametric Partial Differential Equations [5.244037702157957]
We introduce a new class of spatially probabilistic physics and data informed deep latent models for PDEs.
We solve forward and inverse problems for parametric PDEs in a way that leads to the construction of Gaussian process models of solution fields.
We show how to incorporate noisy data in a principled manner into our physics informed model to improve predictions for problems where data may be available.
arXiv Detail & Related papers (2023-01-26T11:30:56Z)
- Human Trajectory Prediction via Neural Social Physics [63.62824628085961]
Trajectory prediction has been widely pursued in many fields, and many model-based and model-free methods have been explored.
We propose a new method combining both methodologies based on a new Neural Differential Equation model.
Our new model (Neural Social Physics or NSP) is a deep neural network within which we use an explicit physics model with learnable parameters.
arXiv Detail & Related papers (2022-07-21T12:11:18Z)
- Learning Physics-Informed Neural Networks without Stacked Back-propagation [82.26566759276105]
We develop a novel approach that can significantly accelerate the training of Physics-Informed Neural Networks.
In particular, we parameterize the PDE solution by the Gaussian smoothed model and show that, derived from Stein's Identity, the second-order derivatives can be efficiently calculated without back-propagation.
Experimental results show that our proposed method can achieve competitive error compared to standard PINN training but is two orders of magnitude faster.
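Stein's identity turns derivatives of a Gaussian-smoothed function into expectations of the function values alone, so no back-propagation through the model is needed. The snippet below is a one-dimensional Monte Carlo sketch of that idea (an assumption for illustration; the paper applies it to PINN training in higher dimensions), checked against the known smoothed derivatives of the sine function.

```python
import numpy as np

rng = np.random.default_rng(1)

def stein_derivatives(f, x, s=0.1, n=500_000):
    """Monte Carlo estimates, via Stein's identity, of the first and second
    derivatives of the Gaussian-smoothed surrogate f_s(x) = E[f(x + d)],
    d ~ N(0, s^2), using only evaluations of f (no back-propagation)."""
    d = s * rng.normal(size=n)
    fx = f(x + d)
    g1 = np.mean(d * fx) / s**2              # estimates f_s'(x)
    g2 = np.mean((d**2 - s**2) * fx) / s**4  # estimates f_s''(x)
    return g1, g2

# Sanity check: Gaussian smoothing of sin gives f_s(x) = exp(-s^2/2) sin(x),
# so its derivatives are known in closed form.
s, x0 = 0.1, 0.7
g1, g2 = stein_derivatives(np.sin, x0, s=s)
print(g1, np.exp(-s**2 / 2) * np.cos(x0))   # first derivative vs. exact
print(g2, -np.exp(-s**2 / 2) * np.sin(x0))  # second derivative vs. exact
```

The estimator's variance grows as the smoothing scale `s` shrinks (note the `s**4` in the denominator), which is the usual trade-off when replacing exact second derivatives with this kind of zeroth-order estimate.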
arXiv Detail & Related papers (2022-02-18T18:07:54Z)
- Characterizing possible failure modes in physics-informed neural networks [55.83255669840384]
Recent work in scientific machine learning has developed so-called physics-informed neural network (PINN) models.
We demonstrate that, while existing PINN methodologies can learn good models for relatively trivial problems, they can easily fail to learn relevant physical phenomena even for simple PDEs.
We show that these possible failure modes are not due to the lack of expressivity in the NN architecture, but that the PINN's setup makes the loss landscape very hard to optimize.
arXiv Detail & Related papers (2021-09-02T16:06:45Z)
- Bayesian Deep Learning for Partial Differential Equation Parameter Discovery with Sparse and Noisy Data [0.0]
We propose to use Bayesian neural networks (BNN) in order to recover the full system states from measurement data.
We show that it is possible to accurately capture physics of varying complexity without overfitting.
We demonstrate our approach on a handful of examples applied to physics and non-linear dynamics.
arXiv Detail & Related papers (2021-08-05T19:43:15Z)
- Simultaneous boundary shape estimation and velocity field de-noising in Magnetic Resonance Velocimetry using Physics-informed Neural Networks [70.7321040534471]
Magnetic resonance velocimetry (MRV) is a non-invasive technique widely used in medicine and engineering to measure the velocity field of a fluid.
Previous studies have required the shape of the boundary (for example, a blood vessel) to be known a priori.
We present a physics-informed neural network that instead uses the noisy MRV data alone to infer the most likely boundary shape and de-noised velocity field.
arXiv Detail & Related papers (2021-07-16T12:56:09Z)
- Multi-fidelity Bayesian Neural Networks: Algorithms and Applications [0.0]
We propose a new class of Bayesian neural networks (BNNs) that can be trained using noisy data of variable fidelity.
We apply them to learn function approximations as well as to solve inverse problems based on partial differential equations (PDEs).
arXiv Detail & Related papers (2020-12-19T02:03:53Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of this information and is not responsible for any consequences of its use.