Normalizing Field Flows: Solving forward and inverse stochastic
differential equations using Physics-Informed flow model
- URL: http://arxiv.org/abs/2108.12956v1
- Date: Mon, 30 Aug 2021 01:58:01 GMT
- Title: Normalizing Field Flows: Solving forward and inverse stochastic
differential equations using Physics-Informed flow model
- Authors: Ling Guo, Hao Wu, Tao Zhou
- Abstract summary: We introduce in this work the normalizing field flows (NFF) for learning random fields from scattered measurements.
We demonstrate the capability of the proposed NFF model for learning non-Gaussian processes, mixed Gaussian processes, and forward & inverse stochastic partial differential equations.
- Score: 8.455584500599807
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We introduce in this work the normalizing field flows (NFF) for learning
random fields from scattered measurements. More precisely, we construct a
bijective transformation (a normalizing flow characterized by neural networks)
between a reference random field (say, a Gaussian random field with the
Karhunen-Loève expansion structure) and the target stochastic field, where
the KL expansion coefficients and the invertible networks are trained by
maximizing the sum of the log-likelihood on scattered measurements. This NFF
model can be used to solve data-driven forward, inverse, and mixed
forward/inverse stochastic partial differential equations in a unified
framework. We demonstrate the capability of the proposed NFF model for learning
non-Gaussian processes, mixed Gaussian processes, and forward & inverse
stochastic partial differential equations.
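As a rough illustration of the training objective described in the abstract, below is a minimal sketch, assuming a RealNVP-style coupling flow, a cosine basis as a stand-in for the Karhunen-Loève eigenfunctions, and illustrative names (NFFSketch, kl_basis, Coupling). It maps scattered measurements back to a KL-structured Gaussian reference and maximizes the change-of-variables log-likelihood; it is not the authors' implementation.

import math
import torch
import torch.nn as nn

def kl_basis(x, n_modes):
    # Illustrative stand-in for the KL eigenfunctions of the reference field,
    # evaluated at sensor locations x in [0, 1]: a cosine basis.
    k = torch.arange(1, n_modes + 1, dtype=x.dtype)
    return math.sqrt(2.0) * torch.cos(math.pi * x[:, None] * k[None, :])  # (d, m)

class Coupling(nn.Module):
    # RealNVP-style affine coupling used in the normalizing direction
    # (measurements -> reference); returns z and log|det dz/dy|.
    def __init__(self, dim, hidden=64, flip=False):
        super().__init__()
        self.flip, self.half = flip, dim // 2
        self.net = nn.Sequential(nn.Linear(self.half, hidden), nn.Tanh(),
                                 nn.Linear(hidden, 2 * (dim - self.half)))

    def forward(self, y):
        if self.flip:
            y = y.flip(-1)
        y1, y2 = y[:, :self.half], y[:, self.half:]
        s, t = self.net(y1).chunk(2, dim=-1)
        s = torch.tanh(s)                                 # bounded scales for stability
        z = torch.cat([y1, (y2 - t) * torch.exp(-s)], dim=-1)
        if self.flip:
            z = z.flip(-1)
        return z, -s.sum(-1)

class NFFSketch(nn.Module):
    def __init__(self, sensors, n_modes=8, n_layers=4):
        super().__init__()
        self.register_buffer("phi", kl_basis(sensors, n_modes))
        self.log_sqrt_lam = nn.Parameter(torch.zeros(n_modes))   # trainable KL scales
        self.layers = nn.ModuleList(
            Coupling(sensors.numel(), flip=(i % 2 == 1)) for i in range(n_layers))

    def log_prob(self, y):
        # Change of variables: push measurements back to the KL-structured
        # Gaussian reference and accumulate the Jacobian log-determinant.
        z, logdet = y, 0.0
        for layer in self.layers:
            z, ld = layer(z)
            logdet = logdet + ld
        cov = self.phi @ torch.diag(torch.exp(2 * self.log_sqrt_lam)) @ self.phi.T
        cov = cov + 1e-4 * torch.eye(cov.shape[0])                # jitter
        ref = torch.distributions.MultivariateNormal(torch.zeros(cov.shape[0]), cov)
        return ref.log_prob(z) + logdet

# Toy usage with synthetic stand-in "measurements" at d scattered sensors.
d, batch = 6, 128
sensors = torch.linspace(0.05, 0.95, d)
model = NFFSketch(sensors)
y = 0.3 * torch.randn(batch, d).cumsum(-1)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(200):
    loss = -model.log_prob(y).mean()          # maximize log-likelihood on measurements
    opt.zero_grad(); loss.backward(); opt.step()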
Related papers
- Understanding Diffusion Models by Feynman's Path Integral [2.4373900721120285]
We introduce a novel formulation of diffusion models using Feynman's path integral.
We find that this formulation provides a comprehensive description of score-based generative models.
We also demonstrate the derivation of backward differential equations and loss functions.
arXiv Detail & Related papers (2024-03-17T16:24:29Z) - Data-driven Modeling and Inference for Bayesian Gaussian Process ODEs
via Double Normalizing Flows [28.62579476863723]
We introduce normalizing flows to reparameterize the ODE vector field, resulting in a data-driven prior distribution.
We also apply normalizing flows to the posterior inference of GP ODEs to resolve the issue of strong mean-field assumptions.
We validate the effectiveness of our approach on simulated dynamical systems and real-world human motion data.
arXiv Detail & Related papers (2023-09-17T09:28:47Z) - A probabilistic, data-driven closure model for RANS simulations with aleatoric, model uncertainty [1.8416014644193066]
We propose a data-driven closure model for Reynolds-averaged Navier-Stokes (RANS) simulations that incorporates aleatoric, model uncertainty.
A fully Bayesian formulation is proposed, combined with a sparsity-inducing prior in order to identify regions in the problem domain where the parametric closure is insufficient.
arXiv Detail & Related papers (2023-07-05T16:53:31Z) - Mean-field Variational Inference via Wasserstein Gradient Flow [8.05603983337769]
Variational inference, such as the mean-field (MF) approximation, requires certain conjugacy structures for efficient computation.
We introduce a general computational framework to implement MF variational inference for Bayesian models, with or without latent variables, using the Wasserstein gradient flow (WGF).
We propose a new constraint-free function approximation method using neural networks to numerically realize our algorithm.
arXiv Detail & Related papers (2022-07-17T04:05:32Z) - Efficient CDF Approximations for Normalizing Flows [64.60846767084877]
We build upon the diffeomorphic properties of normalizing flows to estimate the cumulative distribution function (CDF) over a closed region.
Our experiments on popular flow architectures and UCI datasets show a marked improvement in sample efficiency as compared to traditional estimators.
arXiv Detail & Related papers (2022-02-23T06:11:49Z) - Moser Flow: Divergence-based Generative Modeling on Manifolds [49.04974733536027]
Moser Flow (MF) is a new class of generative models within the family of continuous normalizing flows (CNF).
MF does not require invoking or backpropagating through an ODE solver during training.
We demonstrate for the first time the use of flow models for sampling from general curved surfaces.
arXiv Detail & Related papers (2021-08-18T09:00:24Z) - E(n) Equivariant Normalizing Flows for Molecule Generation in 3D [87.12477361140716]
This paper introduces a generative model equivariant to Euclidean symmetries: E(n) Equivariant Normalizing Flows (E-NFs).
To the best of our knowledge, this is the first likelihood-based deep generative model that generates molecules in 3D.
arXiv Detail & Related papers (2021-05-19T09:28:54Z) - Gaussianization Flows [113.79542218282282]
We propose a new type of normalizing flow model that enables both efficient computation of likelihoods and efficient inversion for sample generation.
Because of this guaranteed expressivity, they can capture multimodal target distributions without compromising the efficiency of sample generation.
arXiv Detail & Related papers (2020-03-04T08:15:06Z) - Stochastic Normalizing Flows [52.92110730286403]
We introduce normalizing flows for maximum likelihood estimation and variational inference (VI) using stochastic differential equations (SDEs).
Using the theory of rough paths, the underlying Brownian motion is treated as a latent variable and approximated, enabling efficient training of neural SDEs.
These SDEs can be used for constructing efficient chains to sample from the underlying distribution of a given dataset.
arXiv Detail & Related papers (2020-02-21T20:47:55Z) - Learning Likelihoods with Conditional Normalizing Flows [54.60456010771409]
Conditional normalizing flows (CNFs) are efficient in sampling and inference.
We present a study of CNFs where the mapping from the base density to the output space is conditioned on an input x, to model conditional densities p(y|x) (see the sketch after this list).
arXiv Detail & Related papers (2019-11-29T19:17:58Z)
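To make the conditional-flow idea in the last entry above concrete, here is a minimal sketch (assumed names ConditionalCoupling, log_prob_term; not the paper's code): a single affine coupling layer whose scale and shift networks also receive the conditioning input x, so that the change-of-variables likelihood under a standard Gaussian base gives log p(y|x).

import torch
import torch.nn as nn

class ConditionalCoupling(nn.Module):
    def __init__(self, y_dim, x_dim, hidden=64):
        super().__init__()
        self.half = y_dim // 2
        self.net = nn.Sequential(
            nn.Linear(self.half + x_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * (y_dim - self.half)))

    def log_prob_term(self, y, x):
        # Normalizing direction: map y toward a standard Gaussian base,
        # conditioning the affine parameters on the input x.
        y1, y2 = y[:, :self.half], y[:, self.half:]
        s, t = self.net(torch.cat([y1, x], dim=-1)).chunk(2, dim=-1)
        s = torch.tanh(s)
        z = torch.cat([y1, (y2 - t) * torch.exp(-s)], dim=-1)
        return z, -s.sum(-1)                  # z and log|det dz/dy|

# log p(y|x) for a single layer: base log-density plus the Jacobian term.
layer = ConditionalCoupling(y_dim=4, x_dim=3)
y, x = torch.randn(8, 4), torch.randn(8, 3)
z, logdet = layer.log_prob_term(y, x)
log_py_given_x = torch.distributions.Normal(0.0, 1.0).log_prob(z).sum(-1) + logdet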