PI-VAE: Physics-Informed Variational Auto-Encoder for stochastic
differential equations
- URL: http://arxiv.org/abs/2203.11363v1
- Date: Mon, 21 Mar 2022 21:51:19 GMT
- Title: PI-VAE: Physics-Informed Variational Auto-Encoder for stochastic
differential equations
- Authors: Weiheng Zhong and Hadi Meidani
- Abstract summary: We propose a new class of physics-informed neural networks, called physics-informed Variational Autoencoder (PI-VAE).
PI-VAE consists of a variational autoencoder (VAE), which generates samples of system variables and parameters.
The satisfactory accuracy and efficiency of the proposed method are numerically demonstrated in comparison with the physics-informed Wasserstein generative adversarial network (PI-WGAN).
- Score: 2.741266294612776
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We propose a new class of physics-informed neural networks, called
physics-informed Variational Autoencoder (PI-VAE), to solve stochastic
differential equations (SDEs) or inverse problems involving SDEs. In these
problems, the governing equations are known but only a limited number of
measurements of system parameters are available. PI-VAE consists of a
variational autoencoder (VAE), which generates samples of system variables and
parameters. This generative model is integrated with the governing equations.
In this integration, the derivatives of VAE outputs are readily calculated
using automatic differentiation, and used in the physics-based loss term. In
this work, the loss function is chosen to be the Maximum Mean Discrepancy (MMD)
for improved performance, and neural network parameters are updated iteratively
using the stochastic gradient descent algorithm. We first test the proposed
method on approximating stochastic processes. Then we study three types of
problems related to SDEs: forward and inverse problems together with mixed
problems where system parameters and solutions are simultaneously calculated.
The satisfactory accuracy and efficiency of the proposed method are numerically
demonstrated in comparison with the physics-informed Wasserstein generative
adversarial network (PI-WGAN).
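
The abstract names three ingredients: a generative model that produces samples of the system variables, a physics residual computed from those samples by automatic differentiation, and an MMD-based loss minimized by stochastic gradient descent. The following is a minimal, illustrative PyTorch sketch of how these pieces can combine into one objective; it is not the authors' implementation. The encoder half of the VAE, the network sizes, the toy equation u''(x) = f(x), and the synthetic sensor data are all assumptions made for the example.

```python
# Hypothetical sketch: generator + MMD loss + autodiff physics residual.
# Not the paper's reference code; the SDE, data, and architecture are stand-ins.
import torch
import torch.nn as nn

class Decoder(nn.Module):
    """Maps a latent sample z and locations x to a realization u(x; z)."""
    def __init__(self, latent_dim=4, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(latent_dim + 1, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, 1),
        )

    def forward(self, z, x):
        # z: [batch, latent_dim]; x: [batch, n_points, 1] -> u: [batch, n_points]
        zx = torch.cat([z.unsqueeze(1).expand(-1, x.shape[1], -1), x], dim=-1)
        return self.net(zx).squeeze(-1)

def mmd(a, b, scales=(0.25, 1.0, 4.0)):
    """Multi-scale Gaussian-kernel MMD^2 between two sample sets (rows = samples)."""
    def k(x, y):
        d2 = torch.cdist(x, y).pow(2)
        return sum(torch.exp(-d2 / (2.0 * s)) for s in scales)
    return k(a, a).mean() + k(b, b).mean() - 2.0 * k(a, b).mean()

def physics_residual(dec, z, x_coll, forcing):
    """Residual of the stand-in equation u''(x) = f(x), evaluated per latent
    sample at the collocation points via automatic differentiation."""
    b = z.shape[0]
    x = x_coll.unsqueeze(0).expand(b, -1, -1).clone().requires_grad_(True)  # [b, n, 1]
    u = dec(z, x)                                                           # [b, n]
    du = torch.autograd.grad(u.sum(), x, create_graph=True)[0]              # [b, n, 1]
    d2u = torch.autograd.grad(du.sum(), x, create_graph=True)[0]            # [b, n, 1]
    return d2u.squeeze(-1) - forcing(x).squeeze(-1)                         # [b, n]

# --- one stochastic-gradient step on the combined MMD + physics loss ---
torch.manual_seed(0)
n_samples, n_sensors, latent_dim = 128, 8, 4
x_sens = torch.linspace(0.0, 1.0, n_sensors).unsqueeze(-1)       # sensor locations [8, 1]
u_data = torch.sin(3.0 * x_sens.T) + 0.05 * torch.randn(n_samples, n_sensors)  # synthetic measurements
x_coll = torch.rand(32, 1)                                       # collocation points
forcing = lambda x: -9.0 * torch.sin(3.0 * x)                    # f(x) consistent with u = sin(3x)

dec = Decoder(latent_dim)
opt = torch.optim.Adam(dec.parameters(), lr=1e-3)

z = torch.randn(n_samples, latent_dim)                           # draws from the latent prior
u_gen = dec(z, x_sens.unsqueeze(0).expand(n_samples, -1, -1))    # generated sensor readings [128, 8]
loss = mmd(u_gen, u_data) + physics_residual(dec, z, x_coll, forcing).pow(2).mean()

opt.zero_grad()
loss.backward()
opt.step()
print(f"combined loss after one step: {loss.item():.4f}")
```

In the full method the VAE encoder and its reconstruction terms would also enter the objective; the sketch only shows how a sample-based MMD term and an autodiff-based physics residual can be summed into a single loss and updated by stochastic gradient descent, as the abstract describes.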
Related papers
- PICL: Physics Informed Contrastive Learning for Partial Differential Equations [7.136205674624813]
We develop a novel contrastive pretraining framework that improves neural operator generalization across multiple governing equations simultaneously.
A combination of physics-informed system evolution and latent-space model output is anchored to input data and used in our distance function.
We find that physics-informed contrastive pretraining improves accuracy for the Fourier Neural Operator in fixed-future and autoregressive rollout tasks for the 1D and 2D Heat, Burgers', and linear advection equations.
arXiv Detail & Related papers (2024-01-29T17:32:22Z)
- Physics-Informed Generator-Encoder Adversarial Networks with Latent Space Matching for Stochastic Differential Equations [14.999611448900822]
We propose a new class of physics-informed neural networks to address the challenges posed by forward, inverse, and mixed problems in differential equations.
Our model consists of two key components: the generator and the encoder, both updated alternately by gradient descent.
In contrast to previous approaches, we employ an indirect matching that operates within the lower-dimensional latent feature space.
arXiv Detail & Related papers (2023-11-03T04:29:49Z)
- PI-VEGAN: Physics Informed Variational Embedding Generative Adversarial Networks for Stochastic Differential Equations [14.044012646069552]
We present a new category of physics-informed neural networks, called physics-informed variational embedding generative adversarial networks (PI-VEGAN).
PI-VEGAN effectively tackles forward, inverse, and mixed problems of differential equations.
We evaluate the effectiveness of PI-VEGAN in addressing forward, inverse, and mixed problems that require the concurrent calculation of system parameters and solutions.
arXiv Detail & Related papers (2023-07-21T01:18:02Z)
- A Stable and Scalable Method for Solving Initial Value PDEs with Neural Networks [52.5899851000193]
We develop an ODE-based IVP solver which prevents the network from getting ill-conditioned and runs in time linear in the number of parameters.
We show that current methods based on this approach suffer from two key issues.
First, following the ODE produces an uncontrolled growth in the conditioning of the problem, ultimately leading to unacceptably large numerical errors.
arXiv Detail & Related papers (2023-04-28T17:28:18Z)
- Mixed formulation of physics-informed neural networks for thermo-mechanically coupled systems and heterogeneous domains [0.0]
Physics-informed neural networks (PINNs) are a new tool for solving boundary value problems.
Recent investigations have shown that when designing loss functions for many engineering problems, using first-order derivatives and combining equations from both strong and weak forms can lead to much better accuracy.
In this work, we propose applying the mixed formulation to solve multi-physical problems, specifically a stationary thermo-mechanically coupled system of equations.
arXiv Detail & Related papers (2023-02-09T21:56:59Z)
- Physics-informed Neural Network: The Effect of Reparameterization in Solving Differential Equations [0.0]
Complicated physics mostly involves difficult differential equations, which are hard to solve analytically.
In recent years, physics-informed neural networks have been shown to perform very well in solving systems with various differential equations.
arXiv Detail & Related papers (2023-01-28T07:53:26Z)
- Symbolic Recovery of Differential Equations: The Identifiability Problem [52.158782751264205]
Symbolic recovery of differential equations is the ambitious attempt at automating the derivation of governing equations.
We provide both necessary and sufficient conditions for a function to uniquely determine the corresponding differential equation.
We then use our results to devise numerical algorithms aiming to determine whether a function solves a differential equation uniquely.
arXiv Detail & Related papers (2022-10-15T17:32:49Z)
- Tunable Complexity Benchmarks for Evaluating Physics-Informed Neural Networks on Coupled Ordinary Differential Equations [64.78260098263489]
In this work, we assess the ability of physics-informed neural networks (PINNs) to solve increasingly complex coupled ordinary differential equations (ODEs).
We show that PINNs eventually fail to produce correct solutions to these benchmarks as their complexity increases.
We identify several reasons why this may be the case, including insufficient network capacity, poor conditioning of the ODEs, and high local curvature, as measured by the Laplacian of the PINN loss.
arXiv Detail & Related papers (2022-10-14T15:01:32Z)
- Learning to Solve PDE-constrained Inverse Problems with Graph Networks [51.89325993156204]
In many application domains across science and engineering, we are interested in solving inverse problems with constraints defined by a partial differential equation (PDE).
Here we explore GNNs to solve such PDE-constrained inverse problems.
We demonstrate computational speedups of up to 90x using GNNs compared to principled solvers.
arXiv Detail & Related papers (2022-06-01T18:48:01Z)
- Message Passing Neural PDE Solvers [60.77761603258397]
We build a neural message passing solver, replacing all heuristically designed components in the graph with backprop-optimized neural function approximators.
We show that neural message passing solvers representationally contain some classical methods, such as finite differences, finite volumes, and WENO schemes.
We validate our method on various fluid-like flow problems, demonstrating fast, stable, and accurate performance across different domain topologies, equation parameters, discretizations, etc., in 1D and 2D.
arXiv Detail & Related papers (2022-02-07T17:47:46Z)
- Large-scale Neural Solvers for Partial Differential Equations [48.7576911714538]
Solving partial differential equations (PDE) is an indispensable part of many branches of science as many processes can be modelled in terms of PDEs.
Recent numerical solvers require manual discretization of the underlying equation as well as sophisticated, tailored code for distributed computing.
We examine the applicability of continuous, mesh-free neural solvers for partial differential equations, physics-informed neural networks (PINNs).
We discuss the accuracy of GatedPINN with respect to analytical solutions -- as well as state-of-the-art numerical solvers, such as spectral solvers.
arXiv Detail & Related papers (2020-09-08T13:26:51Z)