Discovering Nonlinear PDEs from Scarce Data with Physics-encoded
Learning
- URL: http://arxiv.org/abs/2201.12354v1
- Date: Fri, 28 Jan 2022 07:49:48 GMT
- Title: Discovering Nonlinear PDEs from Scarce Data with Physics-encoded
Learning
- Authors: Chengping Rao, Pu Ren, Yang Liu, Hao Sun
- Abstract summary: We propose a physics-encoded discrete learning framework for discovering PDEs from noisy and scarce data.
We validate our method on three nonlinear PDE systems.
- Score: 11.641708412097659
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: There has been growing interest in leveraging experimental measurements to
discover the underlying partial differential equations (PDEs) that govern
complex physical phenomena. Although past research has achieved great success
in data-driven PDE discovery, the robustness of existing methods cannot be
guaranteed when dealing with low-quality measurement data. To overcome this
challenge, we propose a novel physics-encoded discrete learning framework for
discovering spatiotemporal PDEs from scarce and noisy data. The general idea is
to (1) first introduce a novel deep convolutional-recurrent network, which
encodes prior physics knowledge (e.g., known PDE terms, assumed PDE structure,
and initial/boundary conditions) while remaining flexible in representation
capability, to accurately reconstruct high-fidelity data, and (2) then perform
sparse regression on the reconstructed data to identify the explicit form of
the governing PDEs. We validate our method on three nonlinear PDE systems and
demonstrate its effectiveness and superiority over baseline models.
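The abstract outlines a two-stage pipeline: the physics-encoded convolutional-recurrent network first reconstructs a high-fidelity field from scarce, noisy measurements, and sparse regression over a library of candidate terms then identifies the explicit PDE. The regression stage is not detailed in this listing, so the snippet below is only a minimal sketch assuming a standard sequential-threshold ridge regression (STRidge, common in PDE-FIND-style discovery) applied to a toy diffusion field; the field, candidate library, and threshold are illustrative choices, not the authors' implementation.

```python
import numpy as np

def stridge(Theta, u_t, lam=1e-5, tol=0.02, n_iter=10):
    """Sequential-threshold ridge regression: fit u_t ~ Theta @ xi, then
    repeatedly zero out small coefficients and refit on the survivors."""
    d = Theta.shape[1]
    xi = np.linalg.solve(Theta.T @ Theta + lam * np.eye(d), Theta.T @ u_t)
    for _ in range(n_iter):
        keep = np.abs(xi) >= tol
        xi[~keep] = 0.0
        if keep.any():
            A = Theta[:, keep]
            xi[keep] = np.linalg.solve(A.T @ A + lam * np.eye(keep.sum()),
                                       A.T @ u_t)
    return xi

# Toy stand-in for the reconstructed field: an analytic solution of the
# diffusion equation u_t = 0.1 * u_xx on a periodic domain.
nu, nx, nt = 0.1, 256, 100
x = np.linspace(0, 2 * np.pi, nx, endpoint=False)
t = np.linspace(0, 1, nt)
dx, dt = x[1] - x[0], t[1] - t[0]
u = (np.exp(-nu * t)[:, None] * np.sin(x)
     + 0.5 * np.exp(-4 * nu * t)[:, None] * np.sin(2 * x))

# Finite-difference derivatives and a small library of candidate terms.
u_t = np.gradient(u, dt, axis=0).ravel()
u_x = np.gradient(u, dx, axis=1)
u_xx = np.gradient(u_x, dx, axis=1)
Theta = np.stack([np.ones(u.size), u.ravel(), u_x.ravel(), u_xx.ravel(),
                  (u * u_x).ravel(), (u * u_xx).ravel()], axis=1)

# Should return a sparse vector with a coefficient of ~0.1 on u_xx only.
print(stridge(Theta, u_t))
```

With a reasonably clean reconstruction, the hard-thresholding loop drives all inactive terms to zero, which is why the quality of the field recovered in stage (1) is so important for stage (2).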
Related papers
- Unisolver: PDE-Conditional Transformers Are Universal PDE Solvers [55.0876373185983]
We present the Universal PDE solver (Unisolver), capable of solving a wide range of PDEs.
Our key finding is that a PDE solution is fundamentally governed by a series of PDE components.
Unisolver achieves consistent state-of-the-art results on three challenging large-scale benchmarks.
arXiv Detail & Related papers (2024-05-27T15:34:35Z)
- Deep Equilibrium Based Neural Operators for Steady-State PDEs [100.88355782126098]
We study the benefits of weight-tied neural network architectures for steady-state PDEs.
We propose FNO-DEQ, a deep equilibrium variant of the FNO architecture that directly solves for the solution of a steady-state PDE.
arXiv Detail & Related papers (2023-11-30T22:34:57Z)
- Physics-constrained robust learning of open-form partial differential equations from limited and noisy data [1.50528618730365]
This study proposes a framework to robustly uncover open-form partial differential equations (PDEs) from limited and noisy data.
A neural network-based predictive model fits the system response and serves as the reward evaluator for the generated PDEs.
Numerical experiments demonstrate our framework's capability to uncover governing equations from nonlinear dynamic systems with limited and highly noisy data.
arXiv Detail & Related papers (2023-09-14T12:34:42Z)
- Elliptic PDE learning is provably data-efficient [7.097838977449412]
PDE learning combines physics and machine learning to recover unknown physical systems from experimental data.
Our work provides theoretical guarantees on the number of input-output training pairs required in PDE learning.
Specifically, we exploit randomized numerical linear algebra and PDE theory to derive a provably data-efficient algorithm that recovers solution operators of 3D uniformly elliptic PDEs from input-output data.
arXiv Detail & Related papers (2023-02-24T20:51:23Z)
- Mixed Effects Neural ODE: A Variational Approximation for Analyzing the Dynamics of Panel Data [50.23363975709122]
We propose a probabilistic model called ME-NODE to incorporate (fixed + random) mixed effects for analyzing panel data.
We show that our model can be derived using smooth approximations of SDEs provided by the Wong-Zakai theorem.
We then derive evidence lower bounds for ME-NODE and develop efficient training algorithms.
arXiv Detail & Related papers (2022-02-18T22:41:51Z)
- Learning Physics-Informed Neural Networks without Stacked Back-propagation [82.26566759276105]
We develop a novel approach that can significantly accelerate the training of Physics-Informed Neural Networks.
In particular, we parameterize the PDE solution by a Gaussian smoothed model and show that, via Stein's identity, the second-order derivatives can be calculated efficiently without back-propagation (a sketch of this estimator follows the related-papers list).
Experimental results show that our proposed method can achieve competitive error compared to standard PINN training but is two orders of magnitude faster.
arXiv Detail & Related papers (2022-02-18T18:07:54Z)
- Lie Point Symmetry Data Augmentation for Neural PDE Solvers [69.72427135610106]
We present a method that can partially alleviate the heavy data requirements of neural PDE solvers by improving their sample complexity.
In the context of PDEs, we are able to quantitatively derive an exhaustive list of data transformations.
We show how it can easily be deployed to improve neural PDE solver sample complexity by an order of magnitude.
arXiv Detail & Related papers (2022-02-15T18:43:17Z)
- Physics-Informed Neural Operator for Learning Partial Differential Equations [55.406540167010014]
PINO is the first hybrid approach incorporating data and PDE constraints at different resolutions to learn the operator.
The resulting PINO model can accurately approximate the ground-truth solution operator for many popular PDE families.
arXiv Detail & Related papers (2021-11-06T03:41:34Z)
- PDE-READ: Human-readable Partial Differential Equation Discovery using Deep Learning [0.0]
We introduce a new approach for PDE discovery that uses two Rational Neural Networks and a principled sparse regression algorithm.
We successfully identify the Heat, Burgers, and Korteweg-De Vries equations with remarkable consistency.
Our approach is unprecedentedly robust to both sparsity and noise and is, therefore, applicable to real-world observational data.
arXiv Detail & Related papers (2021-11-01T15:00:16Z)
- PhyCRNet: Physics-informed Convolutional-Recurrent Network for Solving Spatiotemporal PDEs [8.220908558735884]
Partial differential equations (PDEs) play a fundamental role in modeling and simulating problems across a wide range of disciplines.
Recent advances in deep learning have shown the great potential of physics-informed neural networks (NNs) to solve PDEs as a basis for data-driven inverse analysis.
We propose the novel physics-informed convolutional-recurrent learning architectures (PhyCRNet and PhyCRNet-s) for solving PDEs without any labeled data.
arXiv Detail & Related papers (2021-06-26T22:22:19Z)
- Physics-Guided Discovery of Highly Nonlinear Parametric Partial Differential Equations [29.181177365252925]
Partial differential equations (PDEs) that fit scientific data can represent physical laws with explainable mechanisms.
We propose a novel physics-guided learning method, which encodes observation knowledge and incorporates basic physical principles and laws.
Experiments show that our proposed method is more robust against data noise, and can reduce the estimation error by a large margin.
arXiv Detail & Related papers (2021-06-02T11:24:49Z)
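The "Learning Physics-Informed Neural Networks without Stacked Back-propagation" entry above rests on the fact that derivatives of a Gaussian-smoothed function can be written as plain expectations via Stein's identity: for f_sigma(x) = E_{d ~ N(0, sigma^2)}[f(x + d)], the second derivative is f_sigma''(x) = E[f(x + d) * (d^2 - sigma^2)] / sigma^4, so it can be estimated by sampling alone, without back-propagating through f. The sketch below only illustrates this identity on a scalar function; it is an assumed, simplified stand-in, not the paper's training procedure.

```python
import numpy as np

def smoothed_second_derivative(f, x, sigma=0.1, n_samples=500_000, seed=0):
    """Monte Carlo estimate of the second derivative of the Gaussian-smoothed
    function f_sigma(x) = E_{d ~ N(0, sigma^2)}[f(x + d)] via Stein's identity:
        f_sigma''(x) = E[(f(x + d) - f(x)) * (d**2 - sigma**2)] / sigma**4.
    Subtracting f(x) leaves the expectation unchanged (E[d**2 - sigma**2] = 0)
    but reduces the estimator's variance. No derivatives of f are required.
    """
    rng = np.random.default_rng(seed)
    d = rng.normal(0.0, sigma, size=n_samples)
    return np.mean((f(x + d) - f(x)) * (d**2 - sigma**2)) / sigma**4

# Sanity check with f = sin: the smoothed second derivative is
# -exp(-sigma**2 / 2) * sin(x), which is close to -sin(x) for small sigma.
x0 = 0.7
print(smoothed_second_derivative(np.sin, x0),  # noisy Monte Carlo estimate
      -np.exp(-0.1**2 / 2) * np.sin(x0))       # exact value for comparison
```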