Bayesian Hidden Physics Models: Uncertainty Quantification for Discovery
of Nonlinear Partial Differential Operators from Data
- URL: http://arxiv.org/abs/2006.04228v1
- Date: Sun, 7 Jun 2020 18:48:43 GMT
- Title: Bayesian Hidden Physics Models: Uncertainty Quantification for Discovery
of Nonlinear Partial Differential Operators from Data
- Authors: Steven Atkinson
- Abstract summary: There has been a surge of interest in using machine learning models to discover physical laws such as differential equations from data.
We introduce a novel model comprising "leaf" modules that learn to represent spatiotemporal functional data as neural networks.
Our approach quantifies the reliability of the learned physics in terms of a posterior distribution over operators and propagates this uncertainty to solutions of novel initial-boundary value problem instances.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: What do data tell us about physics, and what don't they tell us? There has
been a surge of interest in using machine learning models to discover governing
physical laws such as differential equations from data, but current methods
lack uncertainty quantification to communicate their credibility. This work
addresses this shortcoming from a Bayesian perspective. We introduce a novel
model comprising "leaf" modules that learn to represent distinct experiments'
spatiotemporal functional data as neural networks and a single "root" module
that expresses a nonparametric distribution over their governing nonlinear
differential operator as a Gaussian process. Automatic differentiation is used
to compute the required partial derivatives from the leaf functions as inputs
to the root. Our approach quantifies the reliability of the learned physics in
terms of a posterior distribution over operators and propagates this
uncertainty to solutions of novel initial-boundary value problem instances.
Numerical experiments demonstrate the method on several nonlinear PDEs.
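The leaf/root architecture in the abstract can be sketched in a few lines. The following is a minimal Python stand-in, not the paper's code: the "leaf" is a closed-form field (the paper learns it as a neural network and differentiates it automatically), its exact derivatives play the role of autodiff outputs, and the "root" is a Gaussian-process regression from pointwise (u, u_x, u_xx) to u_t, whose posterior variance is the uncertainty over the operator. The toy problem is the heat equation u_t = u_xx, and all function names are illustrative.

```python
import numpy as np

def rbf(A, B, ell=1.0):
    # squared-exponential kernel between rows of A and B
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ell ** 2)

# "Leaf": closed-form stand-in for one experiment's learned field.
# u solves the heat equation u_t = u_xx, so the true operator is
# N(u, u_x, u_xx) = u_xx; these exact derivatives stand in for autodiff.
u    = lambda x, t:  np.exp(-t) * np.sin(x)
u_x  = lambda x, t:  np.exp(-t) * np.cos(x)
u_xx = lambda x, t: -np.exp(-t) * np.sin(x)
u_t  = lambda x, t: -np.exp(-t) * np.sin(x)

rng = np.random.default_rng(0)
xs, ts = rng.uniform(0, 2 * np.pi, 200), rng.uniform(0, 1, 200)

# "Root": GP regression from pointwise (u, u_x, u_xx) to u_t.
Phi = np.stack([u(xs, ts), u_x(xs, ts), u_xx(xs, ts)], axis=1)
y = u_t(xs, ts)
K = rbf(Phi, Phi) + 1e-6 * np.eye(len(y))
alpha = np.linalg.solve(K, y)

def operator_posterior(x, t):
    # posterior mean and variance of the operator N at one collocation point
    phi = np.array([[u(x, t), u_x(x, t), u_xx(x, t)]])
    k = rbf(phi, Phi)
    mean = (k @ alpha)[0]
    var = (rbf(phi, phi) - k @ np.linalg.solve(K, k.T))[0, 0]
    return mean, var

mean, var = operator_posterior(1.0, 0.5)  # mean should be close to u_xx(1.0, 0.5)
```

Solving a novel initial-boundary value problem then amounts to propagating this posterior over the operator through a time-stepper, which is how the abstract's uncertainty propagation would enter.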
Related papers
- DimOL: Dimensional Awareness as A New 'Dimension' in Operator Learning [63.5925701087252]
We introduce DimOL (Dimension-aware Operator Learning), drawing insights from dimensional analysis.
To implement DimOL, we propose the ProdLayer, which can be seamlessly integrated into FNO-based and Transformer-based PDE solvers.
Empirically, DimOL models achieve up to 48% performance gain within the PDE datasets.
arXiv Detail & Related papers (2024-10-08T10:48:50Z)
- Linearization Turns Neural Operators into Function-Valued Gaussian Processes [23.85470417458593]
We introduce a new framework for approximate Bayesian uncertainty quantification in neural operators.
Our approach can be interpreted as a probabilistic analogue of the concept of currying from functional programming.
We showcase the efficacy of our approach through applications to different types of partial differential equations.
arXiv Detail & Related papers (2024-06-07T16:43:54Z)
- Diffusion models as probabilistic neural operators for recovering unobserved states of dynamical systems [49.2319247825857]
We show that diffusion-based generative models exhibit many properties favourable for neural operators.
We propose to train a single model adaptable to multiple tasks, by alternating between the tasks during training.
arXiv Detail & Related papers (2024-05-11T21:23:55Z)
- Physics-Informed Quantum Machine Learning: Solving nonlinear differential equations in latent spaces without costly grid evaluations [21.24186888129542]
We propose a physics-informed quantum algorithm to solve nonlinear and multidimensional differential equations.
By measuring the overlaps between states which are representations of DE terms, we construct a loss that does not require independent sequential function evaluations on grid points.
When the loss is trained variationally, our approach can be related to the differentiable quantum circuit protocol.
arXiv Detail & Related papers (2023-08-03T15:38:31Z)
- Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z)
- Physics-informed Information Field Theory for Modeling Physical Systems with Uncertainty Quantification [0.0]
Information field theory (IFT) provides the tools necessary to perform statistics over fields that are not necessarily Gaussian.
We extend IFT to physics-informed IFT (PIFT) by encoding the functional priors with information about the physical laws which describe the field.
The posteriors derived from this PIFT remain independent of any numerical scheme and can capture multiple modes.
We numerically demonstrate that the method correctly identifies when the physics cannot be trusted, in which case it automatically treats learning the field as a regression problem.
arXiv Detail & Related papers (2023-01-18T15:40:19Z)
- Approximate Bayesian Neural Operators: Uncertainty Quantification for Parametric PDEs [34.179984253109346]
We provide a mathematically detailed Bayesian formulation of the ''shallow'' (linear) version of neural operators.
We then extend this analytic treatment to general deep neural operators using approximate methods from Bayesian deep learning.
As a result, our approach is able to identify cases, and provide structured uncertainty estimates, where the neural operator fails to predict well.
arXiv Detail & Related papers (2022-08-02T16:10:27Z)
- Hessian-based toolbox for reliable and interpretable machine learning in physics [58.720142291102135]
We present a toolbox for interpretability and reliability that is agnostic of the model architecture.
It provides a notion of the influence of the input data on the prediction at a given test point, an estimation of the uncertainty of the model predictions, and an agnostic score for the model predictions.
Our work opens the road to the systematic use of interpretability and reliability methods in ML applied to physics and, more generally, science.
arXiv Detail & Related papers (2021-08-04T16:32:59Z)
- Leveraging Global Parameters for Flow-based Neural Posterior Estimation [90.21090932619695]
Inferring the parameters of a model based on experimental observations is central to the scientific method.
A particularly challenging setting is when the model is strongly indeterminate, i.e., when distinct sets of parameters yield identical observations.
We present a method for cracking such indeterminacy by exploiting additional information conveyed by an auxiliary set of observations sharing global parameters.
arXiv Detail & Related papers (2021-02-12T12:23:13Z)
- Non-parametric Models for Non-negative Functions [48.7576911714538]
We provide the first model for non-negative functions that enjoys the same good properties as linear models.
We prove that it admits a representer theorem and provide an efficient dual formulation for convex problems.
arXiv Detail & Related papers (2020-07-08T07:17:28Z)
- Physics Informed Deep Learning for Transport in Porous Media. Buckley Leverett Problem [0.0]
We present a new hybrid physics-based machine-learning approach to reservoir modeling.
The methodology relies on a series of deep adversarial neural network architectures with physics-based regularization.
The proposed methodology is a simple and elegant way to instill physical knowledge into machine-learning algorithms.
arXiv Detail & Related papers (2020-01-15T08:20:11Z)
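The physics-based regularization idea that recurs in several entries above (most directly the porous-media paper) is a data-fit loss augmented with a penalty on the residual of the governing equation. A minimal Python sketch, assuming a toy ODE y' + y = 0 (true solution y = exp(-t)) and a model that is linear in its coefficients so the penalized fit has a closed form; the problem and all names are illustrative, not from any paper listed:

```python
import numpy as np

# Physics-regularized fit: polynomial model for y(t), penalizing the
# residual of the governing ODE y' + y = 0 at collocation points.
rng = np.random.default_rng(0)
t = rng.uniform(0, 2, 30)
y = np.exp(-t) + 0.05 * rng.normal(size=30)   # noisy observations

deg = 4
A = np.vander(t, deg + 1, increasing=True)    # data features: A[i, j] = t_i**j

tc = np.linspace(0, 2, 50)                    # collocation points
Vc = np.vander(tc, deg + 1, increasing=True)  # y(tc) features
Dc = np.zeros_like(Vc)                        # y'(tc) features
for j in range(1, deg + 1):
    Dc[:, j] = j * tc ** (j - 1)
R = Dc + Vc                                   # rows evaluate y'(tc) + y(tc)

lam = 1.0
c_plain = np.linalg.lstsq(A, y, rcond=None)[0]                  # data fit only
c_phys = np.linalg.solve(A.T @ A + lam * R.T @ R, A.T @ y)      # + physics term

resid = lambda c: np.mean((R @ c) ** 2)       # mean squared ODE residual
```

Because the model is linear in the coefficients, the physics penalty simply adds lam * RᵀR to the normal equations; with nonlinear models such as deep networks, the same residual term is instead minimized by gradient descent.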
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.