Evolutional Deep Neural Network
- URL: http://arxiv.org/abs/2103.09959v1
- Date: Thu, 18 Mar 2021 00:33:11 GMT
- Title: Evolutional Deep Neural Network
- Authors: Yifan Du, Tamer A. Zaki
- Abstract summary: An Evolutional Deep Neural Network (EDNN) is introduced for the solution of partial differential equations (PDE)
By marching the neural network weights in the parameter space, EDNN can predict state-space trajectories that are indefinitely long.
Several applications including the heat equation, the advection equation, the Burgers equation, the Kuramoto-Sivashinsky equation, and the Navier-Stokes equations are solved.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The notion of an Evolutional Deep Neural Network (EDNN) is introduced for the
solution of partial differential equations (PDE). The parameters of the network
are trained to represent the initial state of the system only, and are
subsequently updated dynamically, without any further training, to provide an
accurate prediction of the evolution of the PDE system. In this framework, the
network parameters are treated as functions with respect to the appropriate
coordinate and are numerically updated using the governing equations. By
marching the neural network weights in the parameter space, EDNN can predict
state-space trajectories that are indefinitely long, which is difficult for
other neural network approaches. Boundary conditions of the PDEs are treated as
hard constraints, are embedded into the neural network, and are therefore
exactly satisfied throughout the entire solution trajectory. Several
applications including the heat equation, the advection equation, the Burgers
equation, the Kuramoto-Sivashinsky equation, and the Navier-Stokes equations are
solved to demonstrate the versatility and accuracy of EDNN. The application of
EDNN to the incompressible Navier-Stokes equation embeds the divergence-free
constraint into the network design so that the projection of the momentum
equation to solenoidal space is implicitly achieved. The numerical results
verify the accuracy of EDNN solutions relative to analytical and benchmark
numerical solutions, both for the transient dynamics and statistics of the
system.
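The weight-marching idea in the abstract can be sketched in a few lines of NumPy for the 1D heat equation u_t = u_xx with homogeneous Dirichlet boundary conditions. This is an illustrative sketch, not the paper's implementation: the network size, random-feature initialization, collocation grid, finite-difference Jacobian, and pseudo-inverse cutoff are all assumptions made here for brevity. The prefactor x(1 - x) hard-wires u(0) = u(1) = 0, mirroring the hard-constraint treatment of boundary conditions, and the weights are advanced by solving the least-squares system J dw/dt = u_xx at each step.

```python
import numpy as np

# Ansatz: u(x; w) = x(1 - x) * sum_j c_j tanh(a_j x + b_j),
# with the flat weight vector w = [a, b, c] marched in time.
H = 12                                  # hidden units (illustrative)
rng = np.random.default_rng(0)

def u(x, w):
    a, b, c = w[:H], w[H:2*H], w[2*H:]
    return x * (1 - x) * (np.tanh(np.outer(x, a) + b) @ c)

def u_xx(x, w, h=1e-4):
    # second spatial derivative by central differences
    return (u(x + h, w) - 2 * u(x, w) + u(x - h, w)) / h**2

def jacobian(x, w, eps=1e-6):
    # J[i, k] = d u(x_i) / d w_k, by central finite differences
    J = np.empty((x.size, w.size))
    for k in range(w.size):
        wp, wm = w.copy(), w.copy()
        wp[k] += eps
        wm[k] -= eps
        J[:, k] = (u(x, wp) - u(x, wm)) / (2 * eps)
    return J

# --- train on the initial condition only: fit c by least squares --------
x = np.linspace(0.0, 1.0, 66)[1:-1]     # interior collocation points
a = rng.uniform(-4, 4, H)
b = rng.uniform(-4, 4, H)
Phi = (x * (1 - x))[:, None] * np.tanh(np.outer(x, a) + b)
c = np.linalg.lstsq(Phi, np.sin(np.pi * x), rcond=None)[0]
w = np.concatenate([a, b, c])

# --- march the weights: forward Euler on J dw/dt = u_xx, no retraining --
dt, nsteps = 5e-4, 40                   # final time T = 0.02
for _ in range(nsteps):
    dwdt = np.linalg.lstsq(jacobian(x, w), u_xx(x, w), rcond=1e-3)[0]
    w = w + dt * dwdt

# compare against the exact decaying mode of the heat equation
exact = np.exp(-np.pi**2 * dt * nsteps) * np.sin(np.pi * x)
err = np.max(np.abs(u(x, w) - exact))
```

In the paper itself, the Jacobian is obtained by automatic differentiation rather than finite differences, and higher-order time integrators can replace the forward Euler step; the pseudo-inverse cutoff here plays the role of regularizing the ill-conditioned least-squares system.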
Related papers
- Multi evolutional deep neural networks (Multi-EDNN)
Evolutional deep neural networks (EDNN) solve partial differential equations (PDEs) by marching the network weights in time.
Use of a single network to solve coupled PDEs on large domains requires a large number of network parameters and incurs a significant computational cost.
We introduce coupled EDNN (C-EDNN) to solve systems of PDEs by using independent networks for each state variable, which are only coupled through the governing equations.
We also introduce distributed EDNN (D-EDNN) by spatially partitioning the global domain into several elements and assigning individual EDNNs to each element to solve the local evolution of the PDE.
arXiv Detail & Related papers (2024-07-17T03:26:03Z)
- Solving Poisson Equations using Neural Walk-on-Spheres
We propose Neural Walk-on-Spheres (NWoS), a novel neural PDE solver for the efficient solution of high-dimensional Poisson equations.
We demonstrate the superiority of NWoS in accuracy, speed, and computational costs.
arXiv Detail & Related papers (2024-06-05T17:59:22Z)
- Grad-Shafranov equilibria via data-free physics informed neural networks
We show that PINNs can accurately and effectively solve the Grad-Shafranov equation with several different boundary conditions.
We introduce a parameterized PINN framework, expanding the input space to include variables such as pressure, aspect ratio, elongation, and triangularity.
arXiv Detail & Related papers (2023-11-22T16:08:38Z)
- A Stable and Scalable Method for Solving Initial Value PDEs with Neural Networks
We show that current methods based on this approach suffer from two key issues. First, following the ODE produces an uncontrolled growth in the conditioning of the problem, ultimately leading to unacceptably large numerical errors.
We develop an ODE-based IVP solver which prevents the network from becoming ill-conditioned and runs in time linear in the number of parameters.
arXiv Detail & Related papers (2023-04-28T17:28:18Z)
- Implicit Stochastic Gradient Descent for Training Physics-informed Neural Networks
Physics-informed neural networks (PINNs) have effectively been demonstrated in solving forward and inverse differential equation problems.
PINNs can become trapped in training failures when the target functions to be approximated exhibit high-frequency or multi-scale features.
In this paper, we propose to employ the implicit stochastic gradient descent (ISGD) method to train PINNs, improving the stability of the training process.
arXiv Detail & Related papers (2023-03-03T08:17:47Z)
- MAgNet: Mesh Agnostic Neural PDE Solver
Climate predictions require fine spatio-temporal resolution to resolve all turbulent scales in fluid simulations.
Current numerical models solve PDEs on grids that are too coarse (3 km to 200 km on each side).
We design a novel architecture that predicts the spatially continuous solution of a PDE given a spatial position query.
arXiv Detail & Related papers (2022-10-11T14:52:20Z)
- Neural Basis Functions for Accelerating Solutions to High Mach Euler Equations
We propose an approach to solving partial differential equations (PDEs) using a set of neural networks.
We regress a set of neural networks onto a reduced order Proper Orthogonal Decomposition (POD) basis.
These networks are then used in combination with a branch network that ingests the parameters of the prescribed PDE to compute a reduced order approximation to the PDE.
arXiv Detail & Related papers (2022-08-02T18:27:13Z)
- PhyCRNet: Physics-informed Convolutional-Recurrent Network for Solving Spatiotemporal PDEs
Partial differential equations (PDEs) play a fundamental role in modeling and simulating problems across a wide range of disciplines.
Recent advances in deep learning have shown the great potential of physics-informed neural networks (PINNs) to solve PDEs as a basis for data-driven inverse analysis.
We propose the novel physics-informed convolutional-recurrent learning architectures (PhyCRNet and PhyCRNet-s) for solving PDEs without any labeled data.
arXiv Detail & Related papers (2021-06-26T22:22:19Z)
- Physics-informed attention-based neural network for solving non-linear partial differential equations
Physics-Informed Neural Networks (PINNs) have enabled significant improvements in modelling physical processes.
PINNs are based on simple architectures, and learn the behavior of complex physical systems by optimizing the network parameters to minimize the residual of the underlying PDE.
Here, we address the question of which network architectures are best suited to learn the complex behavior of non-linear PDEs.
arXiv Detail & Related papers (2021-05-17T14:29:08Z)
- dNNsolve: an efficient NN-based PDE solver
We introduce dNNsolve, which makes use of dual neural networks to solve ODEs/PDEs.
We show that dNNsolve is capable of solving a broad range of ODEs/PDEs in 1, 2 and 3 spacetime dimensions.
arXiv Detail & Related papers (2021-03-15T19:14:41Z)
- A nonlocal physics-informed deep learning framework using the peridynamic differential operator
We develop a nonlocal PINN approach using the Peridynamic Differential Operator (PDDO), a numerical method which incorporates long-range interactions and removes spatial derivatives from the governing equations.
Because the PDDO functions can be readily incorporated in the neural network architecture, the nonlocality does not degrade the performance of modern deep-learning algorithms.
We document the superior behavior of nonlocal PINN with respect to local PINN in both solution accuracy and parameter inference.
arXiv Detail & Related papers (2020-05-31T06:26:21Z)
This list is automatically generated from the titles and abstracts of the papers in this site.