Discovering stochastic partial differential equations from limited data
using variational Bayes inference
- URL: http://arxiv.org/abs/2306.15873v1
- Date: Wed, 28 Jun 2023 02:18:04 GMT
- Title: Discovering stochastic partial differential equations from limited data
using variational Bayes inference
- Authors: Yogesh Chandrakant Mathpati and Tapas Tripura and Rajdip Nayek and
Souvik Chakraborty
- Abstract summary: We propose a novel framework for discovering Stochastic Partial Differential Equations (SPDEs) from data.
The proposed approach combines the concepts of stochastic calculus, variational Bayes theory, and sparse learning.
This is the first attempt at discovering SPDEs from data, and it has significant implications for various scientific applications.
- Score: 0.0
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: We propose a novel framework for discovering Stochastic Partial Differential
Equations (SPDEs) from data. The proposed approach combines the concepts of
stochastic calculus, variational Bayes theory, and sparse learning. We propose
the extended Kramers-Moyal expansion to express the drift and diffusion terms
of an SPDE in terms of state responses and use Spike-and-Slab priors with
sparse learning techniques to efficiently and accurately discover the
underlying SPDEs. The proposed approach has been applied to three canonical
SPDEs, (a) stochastic heat equation, (b) stochastic Allen-Cahn equation, and
(c) stochastic Nagumo equation. Our results demonstrate that the proposed
approach can accurately identify the underlying SPDEs with limited data. This
is the first attempt at discovering SPDEs from data, and it has significant
implications for various scientific applications, such as climate modeling,
financial forecasting, and chemical kinetics.
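The core recipe in the abstract, estimating drift and diffusion from Kramers-Moyal-style conditional moments of state increments and then selecting a sparse model over a candidate library, can be illustrated with a minimal sketch. This is not the paper's method: it uses a scalar SDE rather than an SPDE, and sequential thresholded least squares as a crude stand-in for spike-and-slab variational Bayes; the basis library and all parameter values are hypothetical choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in data: Euler-Maruyama simulation of the scalar SDE
#   du = -u dt + 0.5 dW,
# i.e. true drift f(u) = -u and true squared diffusion g(u)^2 = 0.25.
dt, n = 1e-3, 200_000
dW = np.sqrt(dt) * rng.standard_normal(n - 1)
u = np.empty(n)
u[0] = 1.0
for k in range(n - 1):
    u[k + 1] = u[k] - u[k] * dt + 0.5 * dW[k]

x, du = u[:-1], np.diff(u)

# Kramers-Moyal moment targets: drift ~ E[du | u] / dt, g^2 ~ E[du^2 | u] / dt.
drift_target = du / dt
diff_target = du**2 / dt

# Candidate library of polynomial basis functions of the state (a hypothetical choice).
library = np.column_stack([np.ones_like(x), x, x**2, x**3])

def sparse_fit(A, b, thresh=0.1, iters=5):
    """Sequential thresholded least squares: a crude stand-in for
    spike-and-slab sparse learning."""
    coef = np.linalg.lstsq(A, b, rcond=None)[0]
    for _ in range(iters):
        small = np.abs(coef) < thresh
        coef[small] = 0.0
        keep = ~small
        if keep.any():
            coef[keep] = np.linalg.lstsq(A[:, keep], b, rcond=None)[0]
    return coef

drift_coef = sparse_fit(library, drift_target)  # should recover roughly [0, -1, 0, 0]
diff_coef = sparse_fit(library, diff_target)    # should recover roughly [0.25, 0, 0, 0]
print("drift coefficients:", np.round(drift_coef, 3))
print("squared-diffusion coefficients:", np.round(diff_coef, 3))
```

The moment targets are very noisy per sample, which is why the fit pools many observations; the paper's spike-and-slab priors additionally provide posterior inclusion probabilities for each basis term, which the thresholding stand-in here does not.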
Related papers
- Total Uncertainty Quantification in Inverse PDE Solutions Obtained with Reduced-Order Deep Learning Surrogate Models [50.90868087591973]
We propose an approximate Bayesian method for quantifying the total uncertainty in inverse PDE solutions obtained with machine learning surrogate models.
We test the proposed framework by comparing it with the iterative ensemble smoother and deep ensembling methods for a non-linear diffusion equation.
arXiv Detail & Related papers (2024-08-20T19:06:02Z)
- Kinetic Interacting Particle Langevin Monte Carlo [0.0]
This paper introduces and analyses interacting underdamped Langevin algorithms, for statistical inference in latent variable models.
We propose a diffusion process that evolves jointly in the space of parameters and latent variables.
We provide two explicit discretisations of this diffusion as practical algorithms to estimate parameters of statistical models.
arXiv Detail & Related papers (2024-07-08T09:52:46Z)
- Closure Discovery for Coarse-Grained Partial Differential Equations Using Grid-based Reinforcement Learning [2.9611509639584304]
We propose a systematic approach for identifying closures in under-resolved PDEs using grid-based Reinforcement Learning.
We demonstrate the capabilities and limitations of our framework through numerical solutions of the advection equation and the Burgers' equation.
arXiv Detail & Related papers (2024-02-01T19:41:04Z)
- Evaluating Uncertainty Quantification approaches for Neural PDEs in scientific applications [0.0]
This work evaluates various Uncertainty Quantification (UQ) approaches for both Forward and Inverse Problems in scientific applications.
Specifically, we investigate the effectiveness of Bayesian methods, such as Hamiltonian Monte Carlo (HMC) and Monte-Carlo Dropout (MCD).
Our results indicate that Neural PDEs can effectively reconstruct flow systems and predict the associated unknown parameters.
arXiv Detail & Related papers (2023-11-08T04:52:20Z)
- A Geometric Perspective on Diffusion Models [57.27857591493788]
We inspect the ODE-based sampling of a popular variance-exploding SDE.
We establish a theoretical relationship between the optimal ODE-based sampling and the classic mean-shift (mode-seeking) algorithm.
arXiv Detail & Related papers (2023-05-31T15:33:16Z)
- Non-Parametric Learning of Stochastic Differential Equations with Non-asymptotic Fast Rates of Convergence [65.63201894457404]
We propose a novel non-parametric learning paradigm for the identification of drift and diffusion coefficients of non-linear differential equations.
The key idea essentially consists of fitting an RKHS-based approximation of the corresponding Fokker-Planck equation to such observations.
arXiv Detail & Related papers (2023-05-24T20:43:47Z)
- Mixed Effects Neural ODE: A Variational Approximation for Analyzing the Dynamics of Panel Data [50.23363975709122]
We propose a probabilistic model called ME-NODE to incorporate (fixed + random) mixed effects for analyzing panel data.
We show that our model can be derived using smooth approximations of SDEs provided by the Wong-Zakai theorem.
We then derive Evidence Based Lower Bounds for ME-NODE, and develop (efficient) training algorithms.
arXiv Detail & Related papers (2022-02-18T22:41:51Z)
- Pseudo-Spherical Contrastive Divergence [119.28384561517292]
We propose pseudo-spherical contrastive divergence (PS-CD) to generalize maximum likelihood learning of energy-based models.
PS-CD avoids the intractable partition function and provides a generalized family of learning objectives.
arXiv Detail & Related papers (2021-11-01T09:17:15Z)
- Parsimony-Enhanced Sparse Bayesian Learning for Robust Discovery of Partial Differential Equations [5.584060970507507]
A Parsimony Enhanced Sparse Bayesian Learning (PeSBL) method is developed for discovering the governing Partial Differential Equations (PDEs) of nonlinear dynamical systems.
Results of numerical case studies indicate that the governing PDEs of many canonical dynamical systems can be correctly identified using the proposed PeSBL method.
arXiv Detail & Related papers (2021-07-08T00:56:11Z)
- Bayesian data-driven discovery of partial differential equations with variable coefficients [9.331440154110117]
We propose an advanced Bayesian sparse learning algorithm for PDE discovery with variable coefficients.
In the experiments, we show that the tBGL-SS method is more robust than the baseline methods under noisy environments.
arXiv Detail & Related papers (2021-02-02T11:05:34Z)
- Stochastic Normalizing Flows [52.92110730286403]
We introduce stochastic normalizing flows for maximum likelihood estimation and variational inference (VI) using stochastic differential equations (SDEs).
Using the theory of rough paths, the underlying Brownian motion is treated as a latent variable and approximated, enabling efficient training of neural SDEs.
These SDEs can be used for constructing efficient chains to sample from the underlying distribution of a given dataset.
arXiv Detail & Related papers (2020-02-21T20:47:55Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.