Learning the solution operator of parametric partial differential
equations with physics-informed DeepOnets
- URL: http://arxiv.org/abs/2103.10974v1
- Date: Fri, 19 Mar 2021 18:15:42 GMT
- Title: Learning the solution operator of parametric partial differential
equations with physics-informed DeepOnets
- Authors: Sifan Wang, Hanwen Wang, Paris Perdikaris
- Abstract summary: Deep operator networks (DeepONets) are receiving increased attention thanks to their demonstrated capability to approximate nonlinear operators between infinite-dimensional Banach spaces.
We propose a novel model class coined physics-informed DeepONets, which introduces an effective regularization mechanism for biasing the outputs of DeepONet models towards ensuring physical consistency.
We demonstrate that this simple, yet remarkably effective, extension can not only yield a significant improvement in the predictive accuracy of DeepONets, but also greatly reduce the need for large training datasets.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Deep operator networks (DeepONets) are receiving increased attention thanks
to their demonstrated capability to approximate nonlinear operators between
infinite-dimensional Banach spaces. However, despite their remarkable early
promise, they typically require large training datasets consisting of paired
input-output observations, which may be expensive to obtain, while their
predictions may not be consistent with the underlying physical principles that
generated the observed data. In this work, we propose a novel model class
coined physics-informed DeepONets, which introduces an effective
regularization mechanism for biasing the outputs of DeepONet models towards
ensuring physical consistency. This is accomplished by leveraging automatic
differentiation to impose the underlying physical laws via soft penalty
constraints during model training. We demonstrate that this simple, yet
remarkably effective, extension can not only yield a significant improvement in
the predictive accuracy of DeepONets, but also greatly reduce the need for
large training datasets. To this end, a remarkable observation is that
physics-informed DeepONets are capable of solving parametric partial
differential equations (PDEs) without any paired input-output observations,
except for a set of given initial or boundary conditions. We illustrate the
effectiveness of the proposed framework through a series of comprehensive
numerical studies across various types of PDEs. Strikingly, a trained
physics-informed DeepONet model can predict the solution of $\mathcal{O}(10^3)$
time-dependent PDEs in a fraction of a second -- up to three orders of
magnitude faster than a conventional PDE solver. The data and code
accompanying this manuscript are publicly available at
\url{https://github.com/PredictiveIntelligenceLab/Physics-informed-DeepONets}.
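To make the idea of "soft penalty constraints" concrete, here is a minimal sketch of a physics-informed DeepONet loss for the antiderivative operator G(u)(y), which satisfies the ODE dG/dy = u(y) with G(u)(0) = 0. The network sizes, the use of finite differences in place of the paper's automatic differentiation, and the toy input function are illustrative assumptions, not the authors' actual setup or code.

```python
# Sketch: physics-informed DeepONet loss for the antiderivative operator.
# DeepONet factorizes G(u)(y) as a dot product of a branch net (which sees
# the input function u at fixed sensor locations) and a trunk net (which
# sees the query coordinate y).  The loss penalizes the PDE residual and
# the initial condition instead of requiring paired input-output data.
import numpy as np

rng = np.random.default_rng(0)

def mlp_init(sizes):
    """Random weights for a small tanh MLP (illustrative, untrained)."""
    return [(rng.normal(0, 1 / np.sqrt(m), (m, n)), np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

def mlp_apply(params, x):
    for W, b in params[:-1]:
        x = np.tanh(x @ W + b)
    W, b = params[-1]
    return x @ W + b

m, p = 50, 20                      # number of sensors, latent dimension
branch = mlp_init([m, 40, p])      # encodes the input function u
trunk = mlp_init([1, 40, p])       # encodes the query location y

def deeponet(u_sensors, y):
    """G(u)(y) ~ <branch(u), trunk(y)> evaluated at a batch of points y."""
    b = mlp_apply(branch, u_sensors[None, :])      # shape (1, p)
    t = mlp_apply(trunk, y.reshape(-1, 1))         # shape (n, p)
    return (t * b).sum(axis=1)                     # shape (n,)

def physics_informed_loss(u_fn, y, eps=1e-4):
    """Soft-penalty loss: PDE residual term + initial-condition term.
    dG/dy is approximated by central differences here for simplicity;
    the paper uses automatic differentiation instead."""
    sensors = np.linspace(0, 1, m)
    u_s = u_fn(sensors)
    dGdy = (deeponet(u_s, y + eps) - deeponet(u_s, y - eps)) / (2 * eps)
    residual = dGdy - u_fn(y)                # enforce dG/dy = u(y)
    ic = deeponet(u_s, np.zeros(1))          # enforce G(u)(0) = 0
    return np.mean(residual**2) + np.mean(ic**2)

y = rng.uniform(0, 1, 32)                    # random collocation points
loss = physics_informed_loss(np.sin, y)
print(f"physics-informed loss at random init: {loss:.4f}")
```

Minimizing this loss over the branch and trunk weights (by any gradient-based optimizer) would bias G toward physically consistent outputs without a single paired (u, G(u)) observation beyond the initial condition, which is the mechanism the abstract describes.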
Related papers
- Learning time-dependent PDE via graph neural networks and deep operator network for robust accuracy on irregular grids [14.93012615797081]
GraphDeepONet is an autoregressive model based on graph neural networks (GNNs).
It exhibits robust accuracy in predicting solutions compared to existing GNN-based PDE solver models.
Unlike traditional DeepONet and its variants, GraphDeepONet enables time extrapolation for time-dependent PDE solutions.
arXiv Detail & Related papers (2024-02-13T03:14:32Z)
- Deep Equilibrium Based Neural Operators for Steady-State PDEs [100.88355782126098]
We study the benefits of weight-tied neural network architectures for steady-state PDEs.
We propose FNO-DEQ, a deep equilibrium variant of the FNO architecture that directly solves for the solution of a steady-state PDE.
arXiv Detail & Related papers (2023-11-30T22:34:57Z)
- Training Deep Surrogate Models with Large Scale Online Learning [48.7576911714538]
Deep learning algorithms have emerged as a viable alternative for obtaining fast solutions for PDEs.
Models are usually trained on synthetic data generated by solvers, stored on disk, and read back for training.
This paper proposes an open-source online training framework for deep surrogate models.
arXiv Detail & Related papers (2023-06-28T12:02:27Z)
- Learning Neural Constitutive Laws From Motion Observations for Generalizable PDE Dynamics [97.38308257547186]
Many NN approaches learn an end-to-end model that implicitly models both the governing PDE and material models.
We argue that the governing PDEs are often well-known and should be explicitly enforced rather than learned.
We introduce a new framework termed "Neural Constitutive Laws" (NCLaw), which utilizes a network architecture that strictly guarantees standard priors.
arXiv Detail & Related papers (2023-04-27T17:42:24Z)
- Solving High-Dimensional PDEs with Latent Spectral Models [74.1011309005488]
We present Latent Spectral Models (LSM) toward an efficient and precise solver for high-dimensional PDEs.
Inspired by classical spectral methods in numerical analysis, we design a neural spectral block to solve PDEs in the latent space.
LSM achieves consistent state-of-the-art results and yields a relative gain of 11.5% averaged over seven benchmarks.
arXiv Detail & Related papers (2023-01-30T04:58:40Z)
- Reliable extrapolation of deep neural operators informed by physics or sparse observations [2.887258133992338]
Deep neural operators can learn nonlinear mappings between infinite-dimensional function spaces via deep neural networks.
DeepONets provide a new simulation paradigm in science and engineering.
We propose five reliable learning methods that guarantee a safe prediction under extrapolation.
arXiv Detail & Related papers (2022-12-13T03:02:46Z)
- Physics-Informed Neural Operator for Learning Partial Differential Equations [55.406540167010014]
PINO is the first hybrid approach incorporating data and PDE constraints at different resolutions to learn the operator.
The resulting PINO model can accurately approximate the ground-truth solution operator for many popular PDE families.
arXiv Detail & Related papers (2021-11-06T03:41:34Z)
- Improved architectures and training algorithms for deep operator networks [0.0]
Operator learning techniques have emerged as a powerful tool for learning maps between infinite-dimensional Banach spaces.
We analyze the training dynamics of deep operator networks (DeepONets) through the lens of Neural Tangent Kernel (NTK) theory.
arXiv Detail & Related papers (2021-10-04T18:34:41Z)
- Spline-PINN: Approaching PDEs without Data using Fast, Physics-Informed Hermite-Spline CNNs [4.560331122656578]
Partial Differential Equations (PDEs) are notoriously difficult to solve.
In this paper, we propose to approach the solution of PDEs based on a novel technique that combines the advantages of two recently emerging machine learning based approaches.
arXiv Detail & Related papers (2021-09-15T08:10:23Z)
- PhyCRNet: Physics-informed Convolutional-Recurrent Network for Solving Spatiotemporal PDEs [8.220908558735884]
Partial differential equations (PDEs) play a fundamental role in modeling and simulating problems across a wide range of disciplines.
Recent advances in deep learning have shown the great potential of physics-informed neural networks (NNs) to solve PDEs as a basis for data-driven inverse analysis.
We propose novel physics-informed convolutional-recurrent learning architectures (PhyCRNet and PhyCRNet-s) for solving PDEs without any labeled data.
arXiv Detail & Related papers (2021-06-26T22:22:19Z)
- ForceNet: A Graph Neural Network for Large-Scale Quantum Calculations [86.41674945012369]
We develop a scalable and expressive Graph Neural Network model, ForceNet, to approximate atomic forces.
Our proposed ForceNet is able to predict atomic forces more accurately than state-of-the-art physics-based GNNs.
arXiv Detail & Related papers (2021-03-02T03:09:06Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.