PhyCRNet: Physics-informed Convolutional-Recurrent Network for Solving
Spatiotemporal PDEs
- URL: http://arxiv.org/abs/2106.14103v1
- Date: Sat, 26 Jun 2021 22:22:19 GMT
- Title: PhyCRNet: Physics-informed Convolutional-Recurrent Network for Solving
Spatiotemporal PDEs
- Authors: Pu Ren, Chengping Rao, Yang Liu, Jianxun Wang, Hao Sun
- Abstract summary: Partial differential equations (PDEs) play a fundamental role in modeling and simulating problems across a wide range of disciplines.
Recent advances in deep learning have shown the great potential of physics-informed neural networks (NNs) to solve PDEs as a basis for data-driven inverse analysis.
We propose the novel physics-informed convolutional-recurrent learning architectures (PhyCRNet and PhyCRNet-s) for solving PDEs without any labeled data.
- Score: 8.220908558735884
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Partial differential equations (PDEs) play a fundamental role in modeling and
simulating problems across a wide range of disciplines. Recent advances in deep
learning have shown the great potential of physics-informed neural networks
(PINNs) to solve PDEs as a basis for data-driven modeling and inverse analysis.
However, the majority of existing PINN methods, based on fully-connected NNs,
pose intrinsic limitations to low-dimensional spatiotemporal parameterizations.
Moreover, since the initial/boundary conditions (I/BCs) are softly imposed via
penalty, the solution quality heavily relies on hyperparameter tuning. To this
end, we propose the novel physics-informed convolutional-recurrent learning
architectures (PhyCRNet and PhyCRNet-s) for solving PDEs without any labeled
data. Specifically, an encoder-decoder convolutional long short-term memory
network is proposed for low-dimensional spatial feature extraction and temporal
evolution learning. The loss function is defined as the aggregated discretized
PDE residuals, while the I/BCs are hard-encoded in the network to ensure
forcible satisfaction (e.g., periodic boundary padding). The networks are
further enhanced by autoregressive and residual connections that explicitly
simulate time marching. The performance of our proposed methods has been
assessed by solving three nonlinear PDEs (i.e., the 2D Burgers' equations, the
$\lambda$-$\omega$ and FitzHugh-Nagumo reaction-diffusion equations), and
compared against the state-of-the-art baseline algorithms. The numerical
results demonstrate the superiority of our proposed methodology in the context
of solution accuracy, extrapolability and generalizability.
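To make the mechanisms named in the abstract concrete, here is a minimal PyTorch-style sketch (not the authors' released implementation) of hard-encoded periodic boundary conditions via circular padding, a physics loss built from aggregated discretized PDE residuals (shown for the 2D Burgers' equations), and autoregressive time marching with a residual connection. The grid spacing dx, time step dt, viscosity nu, and the stand-in step_net are illustrative assumptions; in the paper the per-step network is an encoder-decoder ConvLSTM.

```python
# Minimal sketch, assuming a 2D Burgers' system on a periodic grid.
# dx, dt, nu and step_net are illustrative assumptions, not values from the paper.
import torch
import torch.nn.functional as F

dx, dt, nu = 1.0 / 128, 1e-3, 5e-3

# Central finite-difference stencils, shaped (out_channels, in_channels, kH, kW).
lap = torch.tensor([[0., 1., 0.], [1., -4., 1.], [0., 1., 0.]]).view(1, 1, 3, 3) / dx**2
ddx = torch.tensor([[-1., 0., 1.]]).view(1, 1, 1, 3) / (2 * dx)
ddy = torch.tensor([[-1.], [0.], [1.]]).view(1, 1, 3, 1) / (2 * dx)

def fd(field, kernel):
    """Apply one stencil with circular padding, so periodic BCs hold exactly."""
    kh, kw = kernel.shape[-2:]
    padded = F.pad(field, (kw // 2, kw // 2, kh // 2, kh // 2), mode="circular")
    return F.conv2d(padded, kernel)

def burgers_residual(u_prev, u_next):
    """Discretized residual of the 2D Burgers' equations between two snapshots.

    u_prev, u_next: tensors of shape (batch, 2, H, W) holding the fields (u, v).
    """
    u, v = u_next[:, :1], u_next[:, 1:]
    dudt = (u_next - u_prev) / dt
    r_u = dudt[:, :1] + u * fd(u, ddx) + v * fd(u, ddy) - nu * fd(u, lap)
    r_v = dudt[:, 1:] + u * fd(v, ddx) + v * fd(v, ddy) - nu * fd(v, lap)
    return (r_u ** 2).mean() + (r_v ** 2).mean()

def rollout_loss(step_net, u0, n_steps):
    """Autoregressive rollout with a residual connection that mimics forward-Euler
    time marching: u_{k+1} = u_k + dt * step_net(u_k). The training loss is the
    aggregated PDE residual over all steps, so no labeled solution data is needed."""
    u, loss = u0, 0.0
    for _ in range(n_steps):
        u_next = u + dt * step_net(u)
        loss = loss + burgers_residual(u, u_next)
        u = u_next
    return loss

# Usage with a hypothetical stand-in network (the paper uses an encoder-decoder
# ConvLSTM; any module mapping (B, 2, H, W) -> (B, 2, H, W) works here):
# step_net = torch.nn.Sequential(
#     torch.nn.Conv2d(2, 16, 3, padding=1, padding_mode="circular"),
#     torch.nn.Tanh(),
#     torch.nn.Conv2d(16, 2, 3, padding=1, padding_mode="circular"),
# )
# loss = rollout_loss(step_net, torch.rand(1, 2, 128, 128), n_steps=10)
# loss.backward()
```

Because the padding is circular, the periodic boundary conditions hold exactly by construction rather than through a soft penalty term, which is what the abstract means by forcible satisfaction of I/BCs; the initial condition is imposed simply by starting the rollout from u0.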
Related papers
- P$^2$C$^2$Net: PDE-Preserved Coarse Correction Network for efficient prediction of spatiotemporal dynamics [38.53011684603394]
We introduce a new PDE-Preserved Coarse Correction Network (P$^2$C$^2$Net) to solve PDE problems on coarse mesh grids in small data regimes.
The model consists of two synergistic modules: (1) a trainable PDE block that learns to update the coarse solution (i.e., the system state), based on a high-order numerical scheme with boundary condition encoding, and (2) a neural network block that consistently corrects the solution on the fly.
arXiv Detail & Related papers (2024-10-29T14:45:07Z)
- PhyMPGN: Physics-encoded Message Passing Graph Network for spatiotemporal PDE systems [31.006807854698376]
We propose a new graph learning approach, namely, Physics-encoded Message Passing Graph Network (PhyMPGN)
We incorporate a GNN into a numerical integrator to approximate the temporal marching of spatiotemporal dynamics for a given PDE system.
PhyMPGN is capable of accurately predicting various types of spatiotemporal dynamics on coarse unstructured meshes.
arXiv Detail & Related papers (2024-10-02T08:54:18Z)
- A Stable and Scalable Method for Solving Initial Value PDEs with Neural Networks [52.5899851000193]
We develop an ODE-based IVP solver which prevents the network from getting ill-conditioned and runs in time linear in the number of parameters.
We show that current methods based on this approach suffer from two key issues.
First, following the ODE produces an uncontrolled growth in the conditioning of the problem, ultimately leading to unacceptably large numerical errors.
arXiv Detail & Related papers (2023-04-28T17:28:18Z)
- Implicit Stochastic Gradient Descent for Training Physics-informed Neural Networks [51.92362217307946]
Physics-informed neural networks (PINNs) have effectively been demonstrated in solving forward and inverse differential equation problems.
However, PINNs can become trapped in training failures when the target functions to be approximated exhibit high-frequency or multi-scale features.
In this paper, we propose to employ the implicit stochastic gradient descent (ISGD) method to train PINNs to improve the stability of the training process.
arXiv Detail & Related papers (2023-03-03T08:17:47Z)
- NeuralStagger: Accelerating Physics-constrained Neural PDE Solver with Spatial-temporal Decomposition [67.46012350241969]
This paper proposes a general acceleration methodology called NeuralStagger.
It decomposes the original learning task into several coarser-resolution subtasks.
We demonstrate the successful application of NeuralStagger on 2D and 3D fluid dynamics simulations.
arXiv Detail & Related papers (2023-02-20T19:36:52Z)
- Characteristics-Informed Neural Networks for Forward and Inverse Hyperbolic Problems [0.0]
We propose characteristics-informed neural networks (CINN) for solving forward and inverse problems involving hyperbolic PDEs.
CINN encodes the characteristics of the PDE in a general-purpose deep neural network trained with the usual MSE data-fitting regression loss.
Preliminary results indicate that CINN is able to improve on the accuracy of the baseline PINN, while being nearly twice as fast to train and avoiding non-physical solutions.
arXiv Detail & Related papers (2022-12-28T18:38:53Z)
- MAgNet: Mesh Agnostic Neural PDE Solver [68.8204255655161]
Climate predictions require fine spatio-temporal resolutions to resolve all turbulent scales in the fluid simulations.
Current numerical models solve PDEs on grids that are too coarse (3km to 200km on each side).
We design a novel architecture that predicts the spatially continuous solution of a PDE given a spatial position query.
arXiv Detail & Related papers (2022-10-11T14:52:20Z)
- Neural Basis Functions for Accelerating Solutions to High Mach Euler Equations [63.8376359764052]
We propose an approach to solving partial differential equations (PDEs) using a set of neural networks.
We regress a set of neural networks onto a reduced order Proper Orthogonal Decomposition (POD) basis.
These networks are then used in combination with a branch network that ingests the parameters of the prescribed PDE to compute a reduced order approximation to the PDE.
arXiv Detail & Related papers (2022-08-02T18:27:13Z)
- Physics-informed attention-based neural network for solving non-linear partial differential equations [6.103365780339364]
Physics-Informed Neural Networks (PINNs) have enabled significant improvements in modelling physical processes.
PINNs are based on simple architectures, and learn the behavior of complex physical systems by optimizing the network parameters to minimize the residual of the underlying PDE.
Here, we address the question of which network architectures are best suited to learn the complex behavior of non-linear PDEs.
arXiv Detail & Related papers (2021-05-17T14:29:08Z)
- dNNsolve: an efficient NN-based PDE solver [62.997667081978825]
We introduce dNNsolve, which makes use of dual Neural Networks to solve ODEs/PDEs.
We show that dNNsolve is capable of solving a broad range of ODEs/PDEs in 1, 2 and 3 spacetime dimensions.
arXiv Detail & Related papers (2021-03-15T19:14:41Z)
- Physics informed deep learning for computational elastodynamics without labeled data [13.084113582897965]
We present a physics-informed neural network (PINN) with mixed-variable output to model elastodynamics problems without resort to labeled data.
Results show the promise of PINN in the context of computational mechanics applications.
arXiv Detail & Related papers (2020-06-10T19:05:08Z)