LatentPINNs: Generative physics-informed neural networks via a latent
representation learning
- URL: http://arxiv.org/abs/2305.07671v1
- Date: Thu, 11 May 2023 16:54:17 GMT
- Title: LatentPINNs: Generative physics-informed neural networks via a latent
representation learning
- Authors: Mohammad H. Taufik and Tariq Alkhalifah
- Abstract summary: We introduce latentPINN, a framework that utilizes latent representations of the PDE parameters as additional (to the coordinates) inputs into PINNs.
We use a two-stage training scheme: in the first stage, we learn the latent representations for the distribution of PDE parameters.
In the second stage, we train a physics-informed neural network over inputs given by randomly drawn samples from the coordinate space within the solution domain and from the learned latent space.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Physics-informed neural networks (PINNs) promise to replace
conventional partial differential equation (PDE) solvers by offering more
accurate and flexible PDE solutions. However, they are hampered by relatively
slow convergence and by the need to perform additional, potentially expensive,
training for different PDE parameters. To address this limitation, we
introduce latentPINN, a framework that uses latent representations of the PDE
parameters as additional (to the coordinates) inputs into PINNs and allows for
training over the distribution of these parameters. Motivated by recent
progress in generative models, we use latent diffusion models to learn
compressed latent representations of the PDE parameter distribution, which
then serve as inputs to the NN functional solutions. We use a two-stage
training scheme: in the first stage, we learn the latent representations for
the distribution of PDE parameters. In the second stage, we train a
physics-informed neural network over inputs given by randomly drawn samples
from the coordinate space within the solution domain and samples from the
learned latent representation of the PDE parameters. We test the approach on a
class of level set equations given by the nonlinear Eikonal equation. We
specifically share results corresponding to three different sets of Eikonal
parameters (velocity models). The proposed method performs well on new phase
velocity models without the need for any additional training.
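As a concrete illustration of the second training stage, below is a minimal sketch of a latentPINN-style network and a single optimization step on the Eikonal residual |grad T(x)| = 1/v(x). The network sizes, optimizer settings, and the placeholder latent codes and slowness values are illustrative assumptions; in the paper the latent codes come from the stage-one latent diffusion model, which is not shown here.

```python
# Minimal sketch of stage two: a PINN that takes coordinates plus a latent code
# for the velocity model and is trained on the Eikonal residual |grad T| = 1/v.
# All sizes and the placeholder latent/slowness samples are assumptions.
import torch
import torch.nn as nn

class LatentPINN(nn.Module):
    """MLP mapping (coordinates, latent code) -> traveltime T."""
    def __init__(self, coord_dim=2, latent_dim=16, width=128, depth=4):
        super().__init__()
        layers, in_dim = [], coord_dim + latent_dim
        for _ in range(depth):
            layers += [nn.Linear(in_dim, width), nn.Tanh()]
            in_dim = width
        layers.append(nn.Linear(width, 1))
        self.net = nn.Sequential(*layers)

    def forward(self, x, z):
        return self.net(torch.cat([x, z], dim=-1))

def eikonal_residual(model, x, z, slowness):
    """Residual |grad_x T| - 1/v, using autograd for coordinate derivatives."""
    x = x.clone().requires_grad_(True)
    T = model(x, z)
    grad_T = torch.autograd.grad(T.sum(), x, create_graph=True)[0]
    return grad_T.norm(dim=-1, keepdim=True) - slowness

model = LatentPINN()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.rand(1024, 2)         # random coordinates from the solution domain
z = torch.randn(1024, 16)       # stand-in for stage-one latent codes
slowness = torch.ones(1024, 1)  # 1/v at the sampled points (placeholder)
loss = eikonal_residual(model, x, z, slowness).pow(2).mean()
opt.zero_grad(); loss.backward(); opt.step()
```

Because the latent code z is an input rather than a fixed network parameter, a new velocity model only needs to be encoded into the learned latent space; the PINN itself is not retrained, which is what the generalization claim above rests on.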
Related papers
- Parameterized Physics-informed Neural Networks for Parameterized PDEs [24.926311700375948]
In this paper, we propose a novel extension, parameterized physics-informed neural networks (P$^2$INNs).
P$^2$INNs enable modeling the solutions of parameterized partial differential equations (PDEs) via explicitly encoding a latent representation of the PDE parameters.
We demonstrate that P$^2$INNs outperform the baselines in both accuracy and parameter efficiency on benchmark 1D and 2D parameterized PDEs.
arXiv Detail & Related papers (2024-08-18T11:58:22Z)
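The explicit parameter encoding described above can be sketched as follows; the separate encoders, layer sizes, and concatenation rule are hypothetical illustrations, not the published P$^2$INN architecture.

```python
# Hypothetical sketch of explicit PDE-parameter encoding: a small encoder maps
# raw parameters mu to a latent vector that conditions the coordinate network.
import torch
import torch.nn as nn

class P2INNSketch(nn.Module):
    def __init__(self, coord_dim=2, param_dim=3, latent_dim=32):
        super().__init__()
        self.param_enc = nn.Sequential(nn.Linear(param_dim, 64), nn.Tanh(),
                                       nn.Linear(64, latent_dim))
        self.coord_enc = nn.Sequential(nn.Linear(coord_dim, 64), nn.Tanh(),
                                       nn.Linear(64, latent_dim))
        self.decoder = nn.Sequential(nn.Linear(2 * latent_dim, 64), nn.Tanh(),
                                     nn.Linear(64, 1))

    def forward(self, x, mu):
        h = torch.cat([self.coord_enc(x), self.param_enc(mu)], dim=-1)
        return self.decoder(h)  # u(x; mu), trained with a PINN residual loss

u_hat = P2INNSketch()(torch.rand(8, 2), torch.rand(8, 3))  # (8, 1) solutions
```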
- Unisolver: PDE-Conditional Transformers Are Universal PDE Solvers [55.0876373185983]
We present the Universal PDE solver (Unisolver), capable of solving a wide range of PDEs.
Our key finding is that a PDE solution is fundamentally governed by a series of PDE components.
Unisolver achieves consistent state-of-the-art results on three challenging large-scale benchmarks.
arXiv Detail & Related papers (2024-05-27T15:34:35Z)
- Deep Equilibrium Based Neural Operators for Steady-State PDEs [100.88355782126098]
We study the benefits of weight-tied neural network architectures for steady-state PDEs.
We propose FNO-DEQ, a deep equilibrium variant of the FNO architecture that directly solves for the solution of a steady-state PDE.
arXiv Detail & Related papers (2023-11-30T22:34:57Z)
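A minimal sketch of the deep-equilibrium idea behind FNO-DEQ: a single weight-tied layer is iterated to a fixed point u* = G(u*, a). The convolutional stand-in below replaces the Fourier layers of a real FNO, and the naive fixed-point loop replaces the root solver and implicit differentiation a DEQ would use; both are simplifying assumptions.

```python
# Weight-tied fixed-point iteration for a steady-state PDE, conditioned on the
# (encoded) coefficient field a. G is an illustrative stand-in, not an FNO.
import torch
import torch.nn as nn

class G(nn.Module):
    def __init__(self, channels=16):
        super().__init__()
        self.mix = nn.Conv1d(2 * channels, channels, kernel_size=5, padding=2)

    def forward(self, u, a):
        return torch.tanh(self.mix(torch.cat([u, a], dim=1)))

g = G()
a = torch.randn(4, 16, 64)   # encoded coefficient field (batch, channels, grid)
u = torch.zeros(4, 16, 64)   # initial iterate
for _ in range(50):          # naive iteration; DEQs typically use a root finder
    u_next = g(u, a)
    if (u_next - u).norm() < 1e-5:
        break
    u = u_next                # at convergence, u approximates the equilibrium u*
```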
- Reduced-order modeling for parameterized PDEs via implicit neural representations [4.135710717238787]
We present a new data-driven reduced-order modeling approach to efficiently solve parametrized partial differential equations (PDEs).
The proposed framework encodes the PDE solution into a latent space and utilizes a parametrized neural ODE (PNODE) to learn latent dynamics characterized by multiple PDE parameters.
We evaluate the proposed method at a large Reynolds number and obtain speedups of up to O(10^3) with about 1% relative error with respect to the ground-truth values.
arXiv Detail & Related papers (2023-11-28T01:35:06Z)
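The PNODE idea above amounts to latent dynamics dz/dt = f(z, mu) conditioned on the PDE parameters mu; the sketch below uses a hypothetical encoder-free setup with forward-Euler integration in place of a proper ODE solver.

```python
# Sketch of a parameterized neural ODE over latent states: dz/dt = f(z, mu).
# Sizes, the missing encoder/decoder, and the Euler integrator are assumptions.
import torch
import torch.nn as nn

class PNODE(nn.Module):
    def __init__(self, latent_dim=8, param_dim=2):
        super().__init__()
        self.f = nn.Sequential(nn.Linear(latent_dim + param_dim, 64), nn.Tanh(),
                               nn.Linear(64, latent_dim))

    def forward(self, z0, mu, dt=0.01, steps=100):
        z, traj = z0, [z0]
        for _ in range(steps):   # forward Euler; a real implementation would
            z = z + dt * self.f(torch.cat([z, mu], dim=-1))  # use an ODE solver
            traj.append(z)
        return torch.stack(traj, dim=1)

traj = PNODE()(torch.randn(4, 8), torch.rand(4, 2))  # (batch, steps+1, latent)
```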
- A Stable and Scalable Method for Solving Initial Value PDEs with Neural Networks [52.5899851000193]
We show that current methods based on this approach suffer from two key issues.
First, following the ODE produces an uncontrolled growth in the conditioning of the problem, ultimately leading to unacceptably large numerical errors.
We develop an ODE-based IVP solver that prevents the network from becoming ill-conditioned and runs in time linear in the number of parameters.
arXiv Detail & Related papers (2023-04-28T17:28:18Z)
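The ODE these methods follow evolves the network parameters theta(t) so that the network tracks the PDE solution over time. A minimal sketch of that baseline (the approach the paper stabilizes) is given below for the heat equation u_t = u_xx, with a toy flat-parameter MLP, naive least squares, and forward Euler, all of which are illustrative assumptions.

```python
# Parameter-evolution ODE baseline: for u_theta(x) and u_t = u_xx, pick
# theta_dot by least squares so that J(theta) theta_dot ~= u_xx, J = du/dtheta.
import torch

def u(theta, x):
    # one-hidden-layer MLP with a flat 49-parameter vector (toy assumption)
    W1, b1 = theta[:16].view(16, 1), theta[16:32]
    W2, b2 = theta[32:48].view(1, 16), theta[48:49]
    return (torch.tanh(x @ W1.T + b1) @ W2.T + b2).squeeze(-1)

theta = 0.1 * torch.randn(49)
x = torch.linspace(-1.0, 1.0, 64).unsqueeze(-1)
dt = 1e-3
for _ in range(10):  # a few Euler steps of theta(t)
    xg = x.clone().requires_grad_(True)
    ux = torch.autograd.grad(u(theta, xg).sum(), xg, create_graph=True)[0]
    uxx = torch.autograd.grad(ux.sum(), xg)[0]                # RHS of u_t = u_xx
    J = torch.autograd.functional.jacobian(lambda th: u(th, x), theta)  # (64, 49)
    theta_dot = torch.linalg.lstsq(J, uxx.reshape(-1, 1)).solution.squeeze(-1)
    theta = theta + dt * theta_dot
```

The least-squares system J theta_dot = u_xx is exactly where the conditioning of J can blow up over time, which is the failure mode the paper addresses.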
- Neural Partial Differential Equations with Functional Convolution [30.35306295442881]
We present a lightweight neural PDE representation to discover the hidden structure and predict the solution of different nonlinear PDEs.
We leverage the prior of "translational similarity" of numerical PDE differential operators to drastically reduce the scale of the learning model and training data.
arXiv Detail & Related papers (2023-03-10T04:25:38Z)
- Fully probabilistic deep models for forward and inverse problems in parametric PDEs [1.9599274203282304]
We introduce a physics-driven deep latent variable model (PDDLVM) to learn simultaneously parameter-to-solution (forward) and solution-to-parameter (inverse) maps of PDEs.
The proposed framework can be easily extended to seamlessly integrate observed data to solve inverse problems and to build generative models.
We demonstrate the efficiency and robustness of our method on finite element discretized parametric PDE problems.
arXiv Detail & Related papers (2022-08-09T15:40:53Z)
- Learning Physics-Informed Neural Networks without Stacked Back-propagation [82.26566759276105]
We develop a novel approach that can significantly accelerate the training of Physics-Informed Neural Networks.
In particular, we parameterize the PDE solution with a Gaussian-smoothed model and show that, using Stein's identity, the second-order derivatives can be calculated efficiently without back-propagation.
Experimental results show that our proposed method can achieve competitive error compared to standard PINN training but is two orders of magnitude faster.
arXiv Detail & Related papers (2022-02-18T18:07:54Z)
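The trick above can be illustrated concretely: if the solution is parameterized as a Gaussian-smoothed network u(x) = E_delta[f(x + delta)] with delta ~ N(0, sigma^2 I), Stein's identity turns second derivatives into Monte Carlo averages of zeroth-order evaluations. A toy sketch, where the sample count, sigma, and the test function are arbitrary choices:

```python
# Back-propagation-free Laplacian of a Gaussian-smoothed function via Stein's
# identity: Lap u(x) = E[ (|delta|^2/sigma^2 - d) f(x + delta) ] / sigma^2.
import torch

def laplacian_stein(f, x, sigma=0.1, n_samples=4096):
    # x: (d,) query point; returns a Monte Carlo estimate of the Laplacian
    d = x.shape[0]
    delta = sigma * torch.randn(n_samples, d)
    fx = f(x + delta)                                   # (n_samples,) evaluations
    weight = (delta.pow(2).sum(dim=1) / sigma**2 - d) / sigma**2
    return (weight * fx).mean()

f = lambda y: (y ** 2).sum(dim=-1)   # toy f; smoothed Laplacian is exactly 2d
x = torch.tensor([0.3, -0.5])
print(laplacian_stein(f, x))         # approximately 4 for d = 2
```

In practice such estimators need variance reduction (e.g., antithetic sampling); the point is that no second-order autograd graph is ever built.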
- Lie Point Symmetry Data Augmentation for Neural PDE Solvers [69.72427135610106]
We present a method that can partially alleviate the large data requirements of neural PDE solvers by improving their sample complexity.
In the context of PDEs, it turns out that we are able to quantitatively derive an exhaustive list of data transformations.
We show how it can easily be deployed to improve neural PDE solver sample complexity by an order of magnitude.
arXiv Detail & Related papers (2022-02-15T18:43:17Z)
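The simplest instance of such a data transformation is spatial translation, a Lie point symmetry of any translation-invariant PDE; the sketch below applies it as augmentation on a periodic grid, with illustrative tensor shapes.

```python
# Symmetry-based data augmentation: for PDEs invariant under spatial
# translation (e.g. Burgers or the heat equation on a periodic domain),
# shifting a solution snapshot yields another valid solution for free.
import torch

def translate_augment(u, shift=None):
    # u: (batch, time, x_grid) solution trajectories on a periodic spatial grid
    if shift is None:
        shift = int(torch.randint(u.shape[-1], (1,)))
    return torch.roll(u, shifts=shift, dims=-1)

u = torch.randn(8, 20, 128)      # stand-in for simulated PDE trajectories
u_aug = translate_augment(u)     # same PDE, new training sample
```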
- PDE-constrained Models with Neural Network Terms: Optimization and Global Convergence [0.0]
Recent research has used deep learning to develop partial differential equation (PDE) models in science and engineering.
We rigorously study the optimization of a class of linear elliptic PDEs with neural network terms.
We train a neural network model for an application in fluid mechanics, in which the neural network functions as a closure model for the Reynolds-averaged Navier-Stokes equations.
arXiv Detail & Related papers (2021-05-18T16:04:33Z)
- dNNsolve: an efficient NN-based PDE solver [62.997667081978825]
We introduce dNNsolve, which makes use of dual neural networks to solve ODEs/PDEs.
We show that dNNsolve is capable of solving a broad range of ODEs/PDEs in 1, 2 and 3 spacetime dimensions.
arXiv Detail & Related papers (2021-03-15T19:14:41Z)