Discovering Governing Equations by Machine Learning implemented with Invariance
- URL: http://arxiv.org/abs/2203.15586v1
- Date: Tue, 29 Mar 2022 14:01:03 GMT
- Title: Discovering Governing Equations by Machine Learning implemented with Invariance
- Authors: Chao Chen, Xiaowei Jin, Hui Li
- Abstract summary: This paper proposes GSNN (Galileo Symbolic Neural Network) and LSNN (Lorentz Symbolic Neural Network) as machine learning methods for discovering governing equations.
The mandatory embedding of physical constraints in the network architecture is fundamentally different from PINN, which imposes physics through the loss function.
Numerical experiments show that the proposed methods have better accuracy, parsimony, and interpretability.
- Score: 9.014669470289965
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The partial differential equation (PDE) plays a significant role in many fields of science and engineering. Conventionally, the derivation of PDEs relies mainly on first principles and empirical observation. However, the development of machine learning technology allows us to mine potential governing equations from massive amounts of stored data in a fresh way. Although there has been considerable progress in the data-driven discovery of PDEs, the extant literature mostly focuses on improvements to discovery methods, without substantial breakthroughs in the discovery process itself, including the principles for constructing candidate terms and how to incorporate physical priors. In this paper, through rigorous derivation of formulas, novel physically enhanced machine learning methods for discovering governing equations, GSNN (Galileo Symbolic Neural Network) and LSNN (Lorentz Symbolic Neural Network), are first proposed based on Galilean invariance and Lorentz invariance respectively, setting forth guidelines for building the candidate terms of the discovered equations. The mandatory embedding of physical constraints in the network architecture is fundamentally different from PINN, which imposes them through the loss function; it ensures that the designed neural network strictly obeys the invariance prior and enhances the interpretability of the network. Comparisons with PDE-NET in numerical experiments on the Burgers equation and the Sine-Gordon equation show that the methods presented in this study have better accuracy, parsimony, and interpretability.
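The following is a conceptual sketch of the role invariance plays in candidate construction, not the authors' GSNN: the candidate library is restricted to Galilean-invariant terms (derivatives of u and their products, never bare x or t), and a SINDy-style sequentially thresholded least-squares fit, used here as a stand-in for the symbolic network, recovers the viscous Burgers equation u_t = -u u_x + nu u_xx from simulated data. All numerical settings are illustrative.

```python
import numpy as np

# 1) Simulate the viscous Burgers equation u_t = -u*u_x + nu*u_xx with a simple
#    explicit finite-difference scheme on a periodic grid.
nu, nx, nt = 0.1, 256, 2000
dx, dt = 2 * np.pi / nx, 1e-4
x = np.arange(nx) * dx
u = np.sin(x) + 0.5
snaps = []
for _ in range(nt):
    ux  = (np.roll(u, -1) - np.roll(u, 1)) / (2 * dx)
    uxx = (np.roll(u, -1) - 2 * u + np.roll(u, 1)) / dx**2
    snaps.append(u.copy())
    u = u + dt * (-u * ux + nu * uxx)
U = np.array(snaps)                                   # shape (nt, nx)

# 2) Candidate library restricted to Galilean-invariant terms (no bare x or t).
Ux  = (np.roll(U, -1, axis=1) - np.roll(U, 1, axis=1)) / (2 * dx)
Uxx = (np.roll(U, -1, axis=1) - 2 * U + np.roll(U, 1, axis=1)) / dx**2
Ut  = np.gradient(U, dt, axis=0)
names = ["u", "u_x", "u_xx", "u*u_x", "u*u_xx"]
Theta = np.stack([U, Ux, Uxx, U * Ux, U * Uxx], axis=-1).reshape(-1, len(names))
target = Ut.reshape(-1)

# 3) Sequentially thresholded least squares (SINDy-style sparse regression).
coef = np.linalg.lstsq(Theta, target, rcond=None)[0]
for _ in range(10):
    small = np.abs(coef) < 0.05
    coef[small] = 0.0
    keep = ~small
    coef[keep] = np.linalg.lstsq(Theta[:, keep], target, rcond=None)[0]
print(dict(zip(names, np.round(coef, 3))))            # expect u*u_x ~ -1.0, u_xx ~ nu
```

The same construction excludes non-invariant candidates such as x*u or t*u_x from the outset, which is precisely how the invariance prior narrows the search space before any fitting takes place.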
Related papers
- Quantifying Training Difficulty and Accelerating Convergence in Neural Network-Based PDE Solvers [9.936559796069844]
We investigate the training dynamics of neural network-based PDE solvers.
We find that two techniques, partition of unity (PoU) and variance scaling (VS), enhance the effective rank; a minimal sketch of an effective-rank diagnostic is given below.
Experiments using popular PDE-solving frameworks, such as PINN, Deep Ritz, and the operator-learning framework DeepONet, confirm that these techniques consistently speed up convergence.
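The paper's exact diagnostic may differ; as a point of reference, the sketch below computes the standard entropy-based effective rank (Roy & Vetterli) of a feature or Jacobian matrix sampled at collocation points. The matrix A is a hypothetical stand-in for whatever feature map one wishes to monitor.

```python
import numpy as np

# Entropy-based effective rank (Roy & Vetterli) of a matrix, e.g. a feature or
# Jacobian matrix sampled at collocation points; the paper's diagnostic may differ.
def effective_rank(mat, eps=1e-12):
    s = np.linalg.svd(mat, compute_uv=False)
    p = s / (s.sum() + eps)                  # normalize the singular-value spectrum
    entropy = -(p * np.log(p + eps)).sum()   # Shannon entropy of the spectrum
    return float(np.exp(entropy))            # exp(entropy) = effective rank

# A nearly rank-1 matrix has effective rank close to 1; a well-conditioned random
# matrix has a much larger effective rank.
A = np.outer(np.random.randn(100), np.random.randn(50)) + 0.01 * np.random.randn(100, 50)
print(effective_rank(A))
```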
arXiv Detail & Related papers (2024-10-08T19:35:19Z)
- DimOL: Dimensional Awareness as A New 'Dimension' in Operator Learning [63.5925701087252]
We introduce DimOL (Dimension-aware Operator Learning), drawing insights from dimensional analysis.
To implement DimOL, we propose the ProdLayer, which can be seamlessly integrated into FNO-based and Transformer-based PDE solvers.
Empirically, DimOL models achieve up to 48% performance gain within the PDE datasets.
arXiv Detail & Related papers (2024-10-08T10:48:50Z)
- Characteristic Performance Study on Solving Oscillator ODEs via Soft-constrained Physics-informed Neural Network with Small Data [6.3295494018089435]
This paper compares the physics-informed neural network (PINN), conventional neural networks (NN), and traditional numerical discretization methods for solving differential equations (DEs).
We focus on the soft-constrained PINN approach and formalize its mathematical framework and computational flow for solving ordinary and partial DEs.
We demonstrate that the DeepXDE-based implementation of PINN is not only lightweight in code and efficient to train, but also flexible across CPU/GPU platforms; a minimal DeepXDE sketch for an oscillator ODE is given below.
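Not the authors' code: the sketch below follows the pattern of DeepXDE's ODE examples and solves the harmonic oscillator written as the first-order system y1' = y2, y2' = -y1 with a soft-constrained PINN; the network width, point counts, and iteration budget are illustrative choices.

```python
import deepxde as dde

# Harmonic oscillator as a first-order system:
#   y1' = y2,  y2' = -y1,  with y1(0) = 0, y2(0) = 1  (solution: sin t, cos t).
def ode_system(t, y):
    y1, y2 = y[:, 0:1], y[:, 1:2]
    dy1_dt = dde.grad.jacobian(y, t, i=0)
    dy2_dt = dde.grad.jacobian(y, t, i=1)
    return [dy1_dt - y2, dy2_dt + y1]   # residuals enter the loss as soft constraints

geom = dde.geometry.TimeDomain(0, 10)
ic1 = dde.icbc.IC(geom, lambda t: 0, lambda _, on_initial: on_initial, component=0)
ic2 = dde.icbc.IC(geom, lambda t: 1, lambda _, on_initial: on_initial, component=1)

data = dde.data.PDE(geom, ode_system, [ic1, ic2], num_domain=64, num_boundary=2)
net = dde.nn.FNN([1] + [50] * 3 + [2], "tanh", "Glorot uniform")

model = dde.Model(data, net)
model.compile("adam", lr=1e-3)
model.train(iterations=20000)
```

The ODE residuals and the initial conditions all enter the loss as weighted penalty terms, which is what makes this a soft-constrained formulation.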
arXiv Detail & Related papers (2024-08-19T13:02:06Z)
- Knowledge-Based Convolutional Neural Network for the Simulation and Prediction of Two-Phase Darcy Flows [3.5707423185282656]
Physics-informed neural networks (PINNs) have gained significant prominence as a powerful tool in the field of scientific computing and simulations.
We propose to combine the power of neural networks with the dynamics imposed by the discretized differential equations.
By discretizing the governing equations, the PINN learns to account for the discontinuities and accurately capture the underlying relationships between inputs and outputs; a generic sketch of a discretized-residual penalty is given below.
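A minimal, generic sketch of penalizing a discretized governing equation alongside a data-fit loss, under the simplifying assumption of a single-phase Darcy-type equation -k * Laplacian(p) = f with constant k on a uniform grid; the function names and weighting are hypothetical, and the paper's two-phase formulation is more involved.

```python
import torch
import torch.nn.functional as F

# Generic illustration (not the paper's two-phase formulation): penalize the residual of
# a discretized Darcy-type equation  -k * Laplacian(p) = f  on a uniform grid, together
# with the usual data-mismatch term. Boundary handling is omitted for brevity.
def discretized_residual(p, f, k=1.0, h=1.0):
    """p, f: tensors of shape (batch, 1, H, W); h: grid spacing."""
    stencil = torch.tensor([[0., 1., 0.],
                            [1., -4., 1.],
                            [0., 1., 0.]], device=p.device).view(1, 1, 3, 3) / h**2
    lap_p = F.conv2d(p, stencil, padding=1)   # 5-point finite-difference Laplacian
    return -k * lap_p - f                     # residual of the discretized PDE

def total_loss(p_pred, p_obs, f, weight=1.0):
    data_loss = F.mse_loss(p_pred, p_obs)     # fit observed pressure data
    physics_loss = discretized_residual(p_pred, f).pow(2).mean()
    return data_loss + weight * physics_loss
```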
arXiv Detail & Related papers (2024-04-04T06:56:32Z)
- Spectral-Bias and Kernel-Task Alignment in Physically Informed Neural Networks [4.604003661048267]
Physically informed neural networks (PINNs) are a promising emerging method for solving differential equations.
We propose a comprehensive theoretical framework that sheds light on this important problem.
We derive an integro-differential equation that governs PINN prediction in the large data-set limit.
arXiv Detail & Related papers (2023-07-12T18:00:02Z)
- Learning Neural Constitutive Laws From Motion Observations for Generalizable PDE Dynamics [97.38308257547186]
Many NN approaches learn an end-to-end model that implicitly models both the governing PDE and material models.
We argue that the governing PDEs are often well-known and should be explicitly enforced rather than learned.
We introduce a new framework termed "Neural Constitutive Laws" (NCLaw) which utilizes a network architecture that strictly guarantees standard priors.
arXiv Detail & Related papers (2023-04-27T17:42:24Z)
- A mixed formulation for physics-informed neural networks as a potential solver for engineering problems in heterogeneous domains: comparison with finite element method [0.0]
Physics-informed neural networks (PINNs) are capable of finding the solution for a given boundary value problem.
We employ several ideas from the finite element method (FEM) to enhance the performance of existing PINNs in engineering problems.
arXiv Detail & Related papers (2022-06-27T08:18:08Z)
- Evaluating the Adversarial Robustness for Fourier Neural Operators [78.36413169647408]
Fourier Neural Operator (FNO) was the first to simulate turbulent flow with zero-shot super-resolution.
We generate adversarial examples for FNO based on norm-bounded data input perturbations; a generic PGD-style sketch is given below.
Our results show that the model's robustness degrades rapidly with increasing perturbation levels.
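Not the paper's implementation: the sketch below shows a generic projected-gradient attack that maximizes a prediction loss under an L-infinity bound on the input field; `model` and `loss_fn` are placeholders for any trained operator and error metric.

```python
import torch

# Generic projected-gradient (PGD) attack under an L-infinity bound; `model` and
# `loss_fn` are placeholders for any trained operator (e.g., an FNO) and error metric.
def pgd_perturb(model, loss_fn, x, y, eps=0.01, step=0.002, n_steps=10):
    """Return an input perturbed within the L-inf ball of radius eps around x.
    x, y are assumed to be detached tensors."""
    x_adv = x.clone().detach()
    for _ in range(n_steps):
        x_adv.requires_grad_(True)
        loss = loss_fn(model(x_adv), y)                  # prediction error to maximize
        grad, = torch.autograd.grad(loss, x_adv)
        x_adv = x_adv.detach() + step * grad.sign()      # ascend the loss
        x_adv = x + (x_adv - x).clamp(-eps, eps)         # project back into the eps-ball
    return x_adv.detach()
```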
arXiv Detail & Related papers (2022-04-08T19:19:42Z)
- Learning Physics-Informed Neural Networks without Stacked Back-propagation [82.26566759276105]
We develop a novel approach that can significantly accelerate the training of Physics-Informed Neural Networks.
In particular, we parameterize the PDE solution by a Gaussian-smoothed model and show that, via Stein's identity, the second-order derivatives can be calculated efficiently without back-propagation; a minimal Monte Carlo sketch of the identity is given below.
Experimental results show that our proposed method can achieve competitive error compared to standard PINN training but is two orders of magnitude faster.
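A minimal NumPy sketch of the Gaussian-smoothing/Stein's-identity estimators the entry refers to, not the authors' training code: for f_s(x) = E[f(x + s*eps)] with eps ~ N(0, I), the gradient and Hessian of f_s can be estimated from forward evaluations of f alone.

```python
import numpy as np

# For the Gaussian-smoothed function f_s(x) = E[f(x + s*eps)], eps ~ N(0, I),
# Stein's identity gives estimators that need only forward evaluations of f:
#   grad f_s(x) = E[eps * f(x + s*eps)] / s
#   hess f_s(x) = E[(eps eps^T - I) * f(x + s*eps)] / s^2
def smoothed_derivatives(f, x, s=0.1, n_samples=20000, seed=0):
    rng = np.random.default_rng(seed)
    d = x.shape[0]
    eps = rng.standard_normal((n_samples, d))
    # Subtracting f(x) is a variance-reduction baseline; it leaves the expectations
    # unchanged because E[eps] = 0 and E[eps eps^T - I] = 0.
    fx = f(x[None, :] + s * eps) - f(x[None, :])
    grad = (eps * fx[:, None]).mean(axis=0) / s
    outer = eps[:, :, None] * eps[:, None, :] - np.eye(d)
    hess = (outer * fx[:, None, None]).mean(axis=0) / s**2
    return grad, hess

# Quick check on f(x) = x1**2 + 3*x2: gradient ~ [2*x1, 3], Hessian diag ~ [2, 0].
g, H = smoothed_derivatives(lambda X: X[:, 0]**2 + 3 * X[:, 1], np.array([1.0, 2.0]))
```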
arXiv Detail & Related papers (2022-02-18T18:07:54Z)
- Physics-Informed Neural Operator for Learning Partial Differential Equations [55.406540167010014]
PINO is the first hybrid approach incorporating data and PDE constraints at different resolutions to learn the operator.
The resulting PINO model can accurately approximate the ground-truth solution operator for many popular PDE families.
arXiv Detail & Related papers (2021-11-06T03:41:34Z)
- Incorporating NODE with Pre-trained Neural Differential Operator for Learning Dynamics [73.77459272878025]
We propose to enhance the supervised signal in learning dynamics by pre-training a neural differential operator (NDO).
The NDO is pre-trained on a class of symbolic functions and learns the mapping from trajectory samples of these functions to their derivatives.
We provide a theoretical guarantee that the output of the NDO can approximate the ground-truth derivatives well by properly tuning the complexity of the function library; a toy pre-training sketch is given below.
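A toy sketch of the pre-training step, not the authors' NDO: the symbolic library here is arbitrarily chosen to be random sinusoids, and the network maps trajectory samples on a fixed grid to the corresponding derivative values; all sizes are illustrative.

```python
import torch
import torch.nn as nn

# Toy pre-training sketch (illustrative library and sizes, not the paper's setup):
# sample functions from a symbolic library of random sinusoids, feed their trajectory
# samples on a fixed grid to an MLP, and regress the corresponding derivative values.
t = torch.linspace(0.0, 1.0, 64)

def sample_library_batch(batch_size=128):
    a = torch.rand(batch_size, 1) * 2 + 0.5        # random amplitude
    w = torch.rand(batch_size, 1) * 8 + 1.0        # random frequency
    traj = a * torch.sin(w * t)                    # trajectory samples, shape (batch, 64)
    deriv = a * w * torch.cos(w * t)               # exact derivatives (regression targets)
    return traj, deriv

ndo = nn.Sequential(nn.Linear(64, 256), nn.Tanh(), nn.Linear(256, 64))
opt = torch.optim.Adam(ndo.parameters(), lr=1e-3)

for step in range(2000):
    traj, deriv = sample_library_batch()
    loss = nn.functional.mse_loss(ndo(traj), deriv)  # map trajectories to derivatives
    opt.zero_grad()
    loss.backward()
    opt.step()
```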
arXiv Detail & Related papers (2021-06-08T08:04:47Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.