Learning the solution operator of parametric partial differential
equations with physics-informed DeepOnets
- URL: http://arxiv.org/abs/2103.10974v1
- Date: Fri, 19 Mar 2021 18:15:42 GMT
- Title: Learning the solution operator of parametric partial differential
equations with physics-informed DeepOnets
- Authors: Sifan Wang, Hanwen Wang, Paris Perdikaris
- Abstract summary: Deep operator networks (DeepONets) are receiving increased attention thanks to their demonstrated capability to approximate nonlinear operators between infinite-dimensional Banach spaces.
We propose a novel model class coined as physics-informed DeepONets, which introduces an effective regularization mechanism for biasing the outputs of DeepONet models towards ensuring physical consistency.
We demonstrate that this simple, yet remarkably effective extension can not only yield a significant improvement in the predictive accuracy of DeepONets, but also greatly reduce the need for large training data-sets.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Deep operator networks (DeepONets) are receiving increased attention thanks
to their demonstrated capability to approximate nonlinear operators between
infinite-dimensional Banach spaces. However, despite their remarkable early
promise, they typically require large training data-sets consisting of paired
input-output observations which may be expensive to obtain, while their
predictions may not be consistent with the underlying physical principles that
generated the observed data. In this work, we propose a novel model class
coined as physics-informed DeepONets, which introduces an effective
regularization mechanism for biasing the outputs of DeepONet models towards
ensuring physical consistency. This is accomplished by leveraging automatic
differentiation to impose the underlying physical laws via soft penalty
constraints during model training. We demonstrate that this simple, yet
remarkably effective extension can not only yield a significant improvement in
the predictive accuracy of DeepONets, but also greatly reduce the need for
large training data-sets. To this end, a remarkable observation is that
physics-informed DeepONets are capable of solving parametric partial
differential equations (PDEs) without any paired input-output observations,
except for a set of given initial or boundary conditions. We illustrate the
effectiveness of the proposed framework through a series of comprehensive
numerical studies across various types of PDEs. Strikingly, a trained
physics-informed DeepONet model can predict the solution of $\mathcal{O}(10^3)$
time-dependent PDEs in a fraction of a second -- up to three orders of
magnitude faster compared to a conventional PDE solver. The data and code
accompanying this manuscript are publicly available at
\url{https://github.com/PredictiveIntelligenceLab/Physics-informed-DeepONets}.
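To make the training mechanism concrete, below is a minimal sketch (in JAX, which the accompanying repository also uses, though this sketch is not taken from it) of how a physics-informed DeepONet loss can be assembled for a simple diffusion equation u_t = u_xx: a branch network encodes the sampled input function, a trunk network encodes the query coordinates, and automatic differentiation supplies the PDE residual that is penalized as a soft constraint alongside the initial/boundary data. All function and variable names here are illustrative assumptions, not the authors' API.

```python
import jax
import jax.numpy as jnp

def init_mlp(key, sizes):
    # Plain MLP parameters: a list of (W, b) pairs.
    keys = jax.random.split(key, len(sizes) - 1)
    return [(jax.random.normal(k, (m, n)) / jnp.sqrt(m), jnp.zeros(n))
            for k, m, n in zip(keys, sizes[:-1], sizes[1:])]

def mlp(params, x):
    for W, b in params[:-1]:
        x = jnp.tanh(x @ W + b)
    W, b = params[-1]
    return x @ W + b

def deeponet(params, u_sensors, y):
    # DeepONet prediction G(u)(y): inner product of branch and trunk features.
    branch_params, trunk_params = params
    b = mlp(branch_params, u_sensors)  # encodes the input function u at m sensor locations
    t = mlp(trunk_params, y)           # encodes the query coordinates y = (x, t)
    return jnp.sum(b * t)

def pde_residual(params, u_sensors, y):
    # Residual of u_t - u_xx at a collocation point, via automatic differentiation.
    s = lambda y_: deeponet(params, u_sensors, y_)
    ds = jax.grad(s)(y)                                    # [ds/dx, ds/dt]
    d2s_dx2 = jax.grad(lambda y_: jax.grad(s)(y_)[0])(y)[0]
    return ds[1] - d2s_dx2

def loss(params, u_sensors, y_ic, s_ic, y_col):
    # Data loss on initial/boundary points plus a soft PDE-residual penalty;
    # no paired input-output solution data is required.
    pred_ic = jax.vmap(lambda y: deeponet(params, u_sensors, y))(y_ic)
    loss_ic = jnp.mean((pred_ic - s_ic) ** 2)
    res = jax.vmap(lambda y: pde_residual(params, u_sensors, y))(y_col)
    loss_pde = jnp.mean(res ** 2)
    return loss_ic + loss_pde

# Illustrative setup: 100 sensor points for u, 2D query coordinates (x, t),
# and a shared latent dimension of 50 for the branch and trunk outputs.
key = jax.random.PRNGKey(0)
k1, k2 = jax.random.split(key)
params = (init_mlp(k1, [100, 64, 50]), init_mlp(k2, [2, 64, 50]))
u_sensors = jnp.zeros(100)                  # placeholder input-function samples
y_ic = jnp.zeros((16, 2))                   # placeholder initial-condition points
s_ic = jnp.zeros(16)                        # placeholder initial-condition values
y_col = jax.random.uniform(key, (32, 2))    # random collocation points
grads = jax.grad(loss)(params, u_sensors, y_ic, s_ic, y_col)
```

In practice the parameters would be optimized with a standard gradient-based optimizer (e.g., Adam) over batches of input functions and collocation points; the PDE term acts as a soft penalty rather than a hard constraint, which is the regularization mechanism described in the abstract.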
Related papers
- DimOL: Dimensional Awareness as A New 'Dimension' in Operator Learning [63.5925701087252]
We introduce DimOL (Dimension-aware Operator Learning), drawing insights from dimensional analysis.
To implement DimOL, we propose the ProdLayer, which can be seamlessly integrated into FNO-based and Transformer-based PDE solvers.
Empirically, DimOL models achieve up to 48% performance gain within the PDE datasets.
arXiv Detail & Related papers (2024-10-08T10:48:50Z) - PhyMPGN: Physics-encoded Message Passing Graph Network for spatiotemporal PDE systems [31.006807854698376]
We propose a new graph learning approach, namely, Physics-encoded Message Passing Graph Network (PhyMPGN)
We incorporate a GNN into a numerical integrator to approximate the temporal marching of spatiotemporal dynamics for a given PDE system.
PhyMPGN is capable of accurately predicting various types of spatiotemporal dynamics on coarse unstructured meshes.
arXiv Detail & Related papers (2024-10-02T08:54:18Z) - Characteristic Performance Study on Solving Oscillator ODEs via Soft-constrained Physics-informed Neural Network with Small Data [6.3295494018089435]
This paper compares physics-informed neural networks (PINNs), conventional neural networks (NNs) and traditional numerical discretization methods for solving differential equations (DEs).
We focus on the soft-constrained PINN approach and formalize its mathematical framework and computational flow for solving ordinary and partial DEs.
We demonstrate that the DeepXDE-based implementation of PINN is not only lightweight in code and efficient in training, but also flexible across CPU/GPU platforms.
arXiv Detail & Related papers (2024-08-19T13:02:06Z) - Physics-informed Discretization-independent Deep Compositional Operator Network [1.2430809884830318]
We introduce a novel physics-informed model architecture which can generalize to various discrete representations of PDE parameters and irregular domain shapes.
Inspired by deep operator neural networks, our model repeatedly learns a discretization-independent embedding of the PDE parameters.
Numerical results demonstrate the accuracy and efficiency of the proposed method.
arXiv Detail & Related papers (2024-04-21T12:41:30Z) - Learning time-dependent PDE via graph neural networks and deep operator
network for robust accuracy on irregular grids [14.93012615797081]
GraphDeepONet is an autoregressive model based on graph neural networks (GNNs).
It exhibits robust accuracy in predicting solutions compared to existing GNN-based PDE solver models.
Unlike traditional DeepONet and its variants, GraphDeepONet enables time extrapolation for time-dependent PDE solutions.
arXiv Detail & Related papers (2024-02-13T03:14:32Z) - Deep Equilibrium Based Neural Operators for Steady-State PDEs [100.88355782126098]
We study the benefits of weight-tied neural network architectures for steady-state PDEs.
We propose FNO-DEQ, a deep equilibrium variant of the FNO architecture that directly solves for the solution of a steady-state PDE.
arXiv Detail & Related papers (2023-11-30T22:34:57Z) - Training Deep Surrogate Models with Large Scale Online Learning [48.7576911714538]
Deep learning algorithms have emerged as a viable alternative for obtaining fast solutions for PDEs.
Models are usually trained on synthetic data generated by solvers, stored on disk and read back for training.
This work proposes an open-source online training framework for deep surrogate models.
arXiv Detail & Related papers (2023-06-28T12:02:27Z) - Learning Neural Constitutive Laws From Motion Observations for
Generalizable PDE Dynamics [97.38308257547186]
Many NN approaches learn an end-to-end model that implicitly models both the governing PDE and material models.
We argue that the governing PDEs are often well-known and should be explicitly enforced rather than learned.
We introduce a new framework termed "Neural Constitutive Laws" (NCLaw) which utilizes a network architecture that strictly guarantees standard priors.
arXiv Detail & Related papers (2023-04-27T17:42:24Z) - Solving High-Dimensional PDEs with Latent Spectral Models [74.1011309005488]
We present Latent Spectral Models (LSM) toward an efficient and precise solver for high-dimensional PDEs.
Inspired by classical spectral methods in numerical analysis, we design a neural spectral block to solve PDEs in the latent space.
LSM achieves consistent state-of-the-art and yields a relative gain of 11.5% averaged on seven benchmarks.
arXiv Detail & Related papers (2023-01-30T04:58:40Z) - Physics-Informed Neural Operator for Learning Partial Differential
Equations [55.406540167010014]
PINO is the first hybrid approach incorporating data and PDE constraints at different resolutions to learn the operator.
The resulting PINO model can accurately approximate the ground-truth solution operator for many popular PDE families.
arXiv Detail & Related papers (2021-11-06T03:41:34Z) - PhyCRNet: Physics-informed Convolutional-Recurrent Network for Solving
Spatiotemporal PDEs [8.220908558735884]
Partial differential equations (PDEs) play a fundamental role in modeling and simulating problems across a wide range of disciplines.
Recent advances in deep learning have shown the great potential of physics-informed neural networks (PINNs) to solve PDEs as a basis for data-driven inverse analysis.
We propose the novel physics-informed convolutional-recurrent learning architectures (PhyCRNet and PhyCRNet-s) for solving PDEs without any labeled data.
arXiv Detail & Related papers (2021-06-26T22:22:19Z)
This list is automatically generated from the titles and abstracts of the papers on this site.