Robust discovery of partial differential equations in complex situations
- URL: http://arxiv.org/abs/2106.00008v1
- Date: Mon, 31 May 2021 02:11:59 GMT
- Title: Robust discovery of partial differential equations in complex situations
- Authors: Hao Xu and Dongxiao Zhang
- Abstract summary: A robust deep learning-genetic algorithm (R-DLGA) that incorporates the physics-informed neural network (PINN) is proposed in this work.
The stability and accuracy of the proposed R-DLGA in several complex situations are examined as a proof of concept.
Results prove that the proposed framework is able to calculate derivatives accurately with the optimization of PINN.
- Score: 3.7314701799132686
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Data-driven discovery of partial differential equations (PDEs) has achieved
considerable development in recent years. Several aspects of the problem have been
resolved by sparse regression-based and neural network-based methods. However,
the performance of existing methods lacks stability when dealing with complex
situations, including sparse data with high noise, high-order derivatives, and
shock waves, all of which hinder accurate calculation of derivatives.
Therefore, a robust PDE discovery framework, called the robust deep
learning-genetic algorithm (R-DLGA), that incorporates the physics-informed
neural network (PINN), is proposed in this work. In the framework, a
preliminary result of potential terms provided by the deep learning-genetic
algorithm is added into the loss function of the PINN as physical constraints
to improve the accuracy of derivative calculation. This helps to optimize the
preliminary result and obtain the ultimately discovered PDE by eliminating the
error compensation terms. The stability and accuracy of the proposed R-DLGA in
several complex situations are examined as a proof of concept, and the results
prove that the proposed framework is able to calculate derivatives accurately
with the optimization of PINN and possesses surprising robustness to complex
situations, including sparse data with high noise, high-order derivatives, and
shock waves.
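The core mechanism, adding the candidate terms found by the DL-GA stage to the PINN loss as a physical constraint, can be sketched as follows. This is a minimal illustration, not the paper's implementation: it uses finite differences on a regular grid instead of automatic differentiation through a network, and the candidate library {u_x, u_xx}, the heat-equation test data, and the weight `lam` are assumptions.

```python
import numpy as np

def pde_residual(u, x, t, coeffs):
    """Residual of the candidate PDE u_t = c0*u_x + c1*u_xx on a grid.

    u has shape (len(x), len(t)); derivatives are computed with
    central finite differences (np.gradient).
    """
    dx, dt = x[1] - x[0], t[1] - t[0]
    u_t = np.gradient(u, dt, axis=1)
    u_x = np.gradient(u, dx, axis=0)
    u_xx = np.gradient(u_x, dx, axis=0)
    return u_t - (coeffs[0] * u_x + coeffs[1] * u_xx)

def constrained_loss(u_pred, u_obs, x, t, coeffs, lam=1.0):
    """Data-mismatch loss plus the physical constraint built from the
    preliminary candidate terms, as in the R-DLGA loss composition."""
    data_loss = np.mean((u_pred - u_obs) ** 2)
    phys_loss = np.mean(pde_residual(u_pred, x, t, coeffs) ** 2)
    return data_loss + lam * phys_loss

# u(x,t) = exp(-t) sin(x) solves the heat equation u_t = u_xx
x = np.linspace(0.0, 2.0 * np.pi, 101)
t = np.linspace(0.0, 1.0, 101)
u = np.exp(-t)[None, :] * np.sin(x)[:, None]

loss_true = constrained_loss(u, u, x, t, coeffs=(0.0, 1.0))    # correct candidate terms
loss_wrong = constrained_loss(u, u, x, t, coeffs=(0.0, -1.0))  # wrong sign on u_xx
```

With the correct candidate terms the physics residual is near zero, so `loss_true` is far smaller than `loss_wrong`; in R-DLGA the same constraint is differentiated through the network rather than evaluated on a grid.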
Related papers
- UGrid: An Efficient-And-Rigorous Neural Multigrid Solver for Linear PDEs [18.532617548168123]
This paper articulates a mathematically rigorous neural solver for linear PDEs.
The proposed UGrid solver, built upon a principled integration of U-Net and MultiGrid, comes with a mathematically rigorous proof of both convergence and correctness.
arXiv Detail & Related papers (2024-08-09T03:46:35Z) - Physics-informed deep learning and compressive collocation for high-dimensional diffusion-reaction equations: practical existence theory and numerics [5.380276949049726]
We develop and analyze an efficient high-dimensional Partial Differential Equation solver based on Deep Learning (DL).
We show, both theoretically and numerically, that it can compete with a novel stable and accurate compressive spectral collocation method.
arXiv Detail & Related papers (2024-06-03T17:16:11Z) - Learning Semilinear Neural Operators: A Unified Recursive Framework For Prediction And Data Assimilation [21.206744437644982]
We propose a learning-based state-space approach to compute solution operators to infinite-dimensional semilinear PDEs.
We develop a flexible method that allows for both prediction and data assimilation by combining prediction and correction operations.
We show through experiments on Kuramoto-Sivashinsky, Navier-Stokes and Korteweg-de Vries equations that the proposed model is robust to noise and can leverage arbitrary amounts of measurements to correct its prediction over a long time horizon with little computational overhead.
arXiv Detail & Related papers (2024-02-24T00:10:51Z) - Implicit Stochastic Gradient Descent for Training Physics-informed
Neural Networks [51.92362217307946]
Physics-informed neural networks (PINNs) have effectively been demonstrated in solving forward and inverse differential equation problems.
However, PINNs can become trapped in training failures when the target functions to be approximated exhibit high-frequency or multi-scale features.
In this paper, we propose to employ the implicit stochastic gradient descent (ISGD) method to train PINNs, improving the stability of the training process.
arXiv Detail & Related papers (2023-03-03T08:17:47Z) - Enhanced Physics-Informed Neural Networks with Augmented Lagrangian
Relaxation Method (AL-PINNs) [1.7403133838762446]
Physics-Informed Neural Networks (PINNs) are powerful approximators of solutions to nonlinear partial differential equations (PDEs).
We propose an Augmented Lagrangian relaxation method for PINNs (AL-PINNs).
We demonstrate through various numerical experiments that AL-PINNs yield a much smaller relative error compared with that of state-of-the-art adaptive loss-balancing algorithms.
arXiv Detail & Related papers (2022-04-29T08:33:11Z) - Learning Physics-Informed Neural Networks without Stacked
Back-propagation [82.26566759276105]
We develop a novel approach that can significantly accelerate the training of Physics-Informed Neural Networks.
In particular, we parameterize the PDE solution by the Gaussian smoothed model and show that, derived from Stein's Identity, the second-order derivatives can be efficiently calculated without back-propagation.
Experimental results show that our proposed method can achieve competitive error compared to standard PINN training but is two orders of magnitude faster.
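The derivative trick described above can be illustrated with a small Monte Carlo sketch. For a Gaussian-smoothed function f_sigma(x) = E[f(x + sigma*z)] with z ~ N(0,1), Stein's identity gives f_sigma''(x) = E[f(x + sigma*z) * (z^2 - 1)] / sigma^2, so a second derivative is estimated from function evaluations alone, with no back-propagation. The test function, sigma, and sample count below are illustrative assumptions, not the paper's setup.

```python
import numpy as np

def smoothed_second_derivative(f, x, sigma=0.5, n_samples=200_000, seed=0):
    """Monte Carlo estimate of the second derivative of the Gaussian-smoothed
    f_sigma(x) = E[f(x + sigma*z)], z ~ N(0,1), via Stein's identity:

        f_sigma''(x) = E[f(x + sigma*z) * (z**2 - 1)] / sigma**2

    Only evaluations of f are needed, never its gradients.
    """
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(n_samples)
    return np.mean(f(x + sigma * z) * (z**2 - 1)) / sigma**2

# For the quadratic f(v) = v**2, Gaussian smoothing leaves the second
# derivative unchanged, so the true value is exactly 2 at any point.
est = smoothed_second_derivative(lambda v: v**2, x=1.0)
```

The estimator is unbiased for smooth f; the variance (and hence the required sample count) grows as sigma shrinks, which is the usual trade-off for smoothing-based derivative estimates.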
arXiv Detail & Related papers (2022-02-18T18:07:54Z) - Physics-Informed Neural Operator for Learning Partial Differential
Equations [55.406540167010014]
PINO is the first hybrid approach incorporating data and PDE constraints at different resolutions to learn the operator.
The resulting PINO model can accurately approximate the ground-truth solution operator for many popular PDE families.
arXiv Detail & Related papers (2021-11-06T03:41:34Z) - Fractal Structure and Generalization Properties of Stochastic
Optimization Algorithms [71.62575565990502]
We prove that the generalization error of an optimization algorithm can be bounded by the complexity of the fractal structure that underlies its generalization measure.
We further specialize our results to specific problems (e.g., linear/logistic regression, one-hidden-layer neural networks) and algorithms.
arXiv Detail & Related papers (2021-06-09T08:05:36Z) - Incorporating NODE with Pre-trained Neural Differential Operator for
Learning Dynamics [73.77459272878025]
We propose to enhance the supervised signal in learning dynamics by pre-training a neural differential operator (NDO).
The NDO is pre-trained on a class of symbolic functions, learning the mapping from trajectory samples of these functions to their derivatives.
We provide a theoretical guarantee that the output of the NDO can approximate the ground-truth derivatives well by properly tuning the complexity of the library.
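The pre-training idea can be sketched in miniature: generate a library of symbolic functions with known derivatives, then fit an operator that maps sampled function values to derivative values. In this sketch the library is random polynomials and the operator is a plain least-squares linear map rather than a neural network; both are simplifying assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
grid = np.linspace(0.0, 1.0, 30)
degree, n_funcs = 5, 500

# Library of symbolic functions: random polynomials with exact derivatives.
coeffs = rng.standard_normal((n_funcs, degree + 1))
powers = np.arange(degree + 1)
U = coeffs @ grid[None, :] ** powers[:, None]    # function values on the grid
dU = (coeffs * powers)[:, 1:] @ grid[None, :] ** (powers[1:] - 1)[:, None]  # exact derivatives

# Fit a linear "differential operator" D with D @ u ~= u' by least squares
# over the whole library (the pre-training step, in miniature).
D = np.linalg.lstsq(U, dU, rcond=None)[0].T

# Apply the pre-trained operator to an unseen function: since sin is well
# approximated by low-degree polynomials on [0, 1], D @ sin ~= cos.
d_sin = D @ np.sin(grid)
```

The operator generalizes exactly as the NDO abstract suggests: it is accurate on functions that the pre-training library approximates well, which is why the complexity of the library matters.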
arXiv Detail & Related papers (2021-06-08T08:04:47Z) - Accurate and Reliable Forecasting using Stochastic Differential
Equations [48.21369419647511]
It is critical yet challenging for deep learning models to properly characterize uncertainty that is pervasive in real-world environments.
This paper develops SDE-HNN to characterize the interaction between the predictive mean and variance of HNNs for accurate and reliable regression.
Experiments on the challenging datasets show that our method significantly outperforms the state-of-the-art baselines in terms of both predictive performance and uncertainty quantification.
arXiv Detail & Related papers (2021-03-28T04:18:11Z) - Deep-learning based discovery of partial differential equations in
integral form from sparse and noisy data [2.745859263816099]
A new framework combining deep-learning and integral form is proposed to handle the above-mentioned problems simultaneously.
Our proposed algorithm is more robust to noise and more accurate than existing methods due to the use of the integral form.
arXiv Detail & Related papers (2020-11-24T09:18:39Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.