Multi evolutional deep neural networks (Multi-EDNN)
- URL: http://arxiv.org/abs/2407.12293v1
- Date: Wed, 17 Jul 2024 03:26:03 GMT
- Title: Multi evolutional deep neural networks (Multi-EDNN)
- Authors: Hadden Kim, Tamer A. Zaki
- Abstract summary: Evolutional deep neural networks (EDNN) solve partial differential equations (PDEs) by marching the network representation of the solution fields using the governing equations.
Use of a single network to solve coupled PDEs on large domains requires a large number of network parameters and incurs a significant computational cost.
We introduce coupled EDNN (C-EDNN) to solve systems of PDEs by using independent networks for each state variable, which are only coupled through the governing equations.
We also introduce distributed EDNN (D-EDNN) by spatially partitioning the global domain into several elements and assigning individual EDNNs to each element to solve the local evolution of the PDE.
- Score: 0.0
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Evolutional deep neural networks (EDNN) solve partial differential equations (PDEs) by marching the network representation of the solution fields, using the governing equations. Use of a single network to solve coupled PDEs on large domains requires a large number of network parameters and incurs a significant computational cost. We introduce coupled EDNN (C-EDNN) to solve systems of PDEs by using independent networks for each state variable, which are only coupled through the governing equations. We also introduce distributed EDNN (D-EDNN) by spatially partitioning the global domain into several elements and assigning individual EDNNs to each element to solve the local evolution of the PDE. The networks then exchange the solution and fluxes at their interfaces, similar to flux-reconstruction methods, and ensure that the PDE dynamics are accurately preserved between neighboring elements. Together C-EDNN and D-EDNN form the general class of Multi-EDNN methods. We demonstrate these methods with aid of canonical problems including linear advection, the heat equation, and the compressible Navier-Stokes equations in Couette and Taylor-Green flows.
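The abstract describes the core EDNN idea: the solution field is represented by a network u(x; W(t)), and the weights W are marched in time so that the network's rate of change matches the PDE right-hand side at a set of collocation points. Below is a minimal, hedged sketch of one such time step in JAX for a single scalar 1D equation, using the heat equation as a stand-in; the MLP architecture, collocation points, viscosity value, and forward-Euler update are illustrative assumptions, not the paper's implementation. In C-EDNN, each state variable would have its own network of this kind, coupled only through the right-hand side; D-EDNN would apply the same update on each spatial element while exchanging interface solution values and fluxes.

```python
# Hedged sketch of a single EDNN update: march the network weights W so that
# J dW/dt ~= N(u) in a least-squares sense over collocation points, where
# J = du/dW is the network Jacobian and N is the PDE right-hand side.
# The MLP, heat-equation RHS, and Euler step below are illustrative choices.
import jax
import jax.numpy as jnp
from jax.flatten_util import ravel_pytree

def init_mlp(key, sizes=(1, 16, 16, 1)):
    # small tanh MLP: list of (weight, bias) pairs
    params = []
    for m, n in zip(sizes[:-1], sizes[1:]):
        key, sub = jax.random.split(key)
        params.append((0.1 * jax.random.normal(sub, (n, m)), jnp.zeros(n)))
    return params

def mlp(params, x):
    # scalar x -> scalar u(x; W)
    h = jnp.atleast_1d(x)
    for W, b in params[:-1]:
        h = jnp.tanh(W @ h + b)
    W, b = params[-1]
    return (W @ h + b)[0]

def pde_rhs(u_fn, x, nu=0.1):
    # example PDE: heat equation u_t = nu * u_xx (nu is an assumed value)
    return nu * jax.grad(jax.grad(u_fn))(x)

def ednn_rate(params, xs):
    # Jacobian J[i, j] = d u(x_i; W) / d W_j over the flattened parameters
    flat, unravel = ravel_pytree(params)
    u_at = lambda x, w: mlp(unravel(w), x)
    J = jax.vmap(lambda x: jax.grad(u_at, argnums=1)(x, flat))(xs)
    # PDE right-hand side N(u) evaluated at the collocation points
    rhs = jax.vmap(lambda x: pde_rhs(lambda y: mlp(params, y), x))(xs)
    # least-squares solve for dW/dt such that J dW/dt ~= N(u)
    dwdt, *_ = jnp.linalg.lstsq(J, rhs, rcond=None)
    return unravel(dwdt)

def ednn_step(params, xs, dt):
    # forward-Euler march of the weights; higher-order schemes (e.g. RK) also apply
    dwdt = ednn_rate(params, xs)
    return jax.tree_util.tree_map(lambda w, dw: w + dt * dw, params, dwdt)

# usage: evolve the represented field over a few steps on fixed collocation points
# (in practice the network would first be fit to the initial condition)
params = init_mlp(jax.random.PRNGKey(0))
xs = jnp.linspace(0.0, 1.0, 64)
for _ in range(10):
    params = ednn_step(params, xs, dt=1e-3)
```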
Related papers
- Learning Traveling Solitary Waves Using Separable Gaussian Neural Networks [0.9065034043031668]
We apply a machine-learning approach to learn traveling solitary waves across various families of partial differential equations (PDEs).
Our approach integrates a novel interpretable neural network (NN) architecture into the framework of Physics-Informed Neural Networks (PINNs).
arXiv Detail & Related papers (2024-03-07T20:16:18Z)
- A Stable and Scalable Method for Solving Initial Value PDEs with Neural Networks [52.5899851000193]
We show that current methods based on this approach suffer from two key issues. First, following the ODE produces an uncontrolled growth in the conditioning of the problem, ultimately leading to unacceptably large numerical errors.
We develop an ODE-based IVP solver that prevents the network from becoming ill-conditioned and runs in time linear in the number of parameters.
arXiv Detail & Related papers (2023-04-28T17:28:18Z)
- iPINNs: Incremental learning for Physics-informed neural networks [66.4795381419701]
Physics-informed neural networks (PINNs) have recently become a powerful tool for solving partial differential equations (PDEs).
We propose incremental PINNs that can learn multiple tasks sequentially without additional parameters for new tasks and improve performance for every equation in the sequence.
Our approach learns multiple PDEs starting from the simplest one by creating a subnetwork for each PDE and allowing each subnetwork to overlap with previously learned subnetworks.
arXiv Detail & Related papers (2023-04-10T20:19:20Z)
- Anisotropic, Sparse and Interpretable Physics-Informed Neural Networks for PDEs [0.0]
We present ASPINN, an anisotropic extension of our earlier work SPINN (Sparse, Physics-informed, and Interpretable Neural Networks), to solve PDEs.
ASPINNs generalize radial basis function networks.
We also streamline the training of ASPINNs into a form that is closer to that of supervised learning algorithms.
arXiv Detail & Related papers (2022-07-01T12:24:43Z)
- Improved Training of Physics-Informed Neural Networks with Model Ensembles [81.38804205212425]
We propose to expand the solution interval gradually to make the PINN converge to the correct solution.
All ensemble members converge to the same solution in the vicinity of observed data.
We show experimentally that the proposed method can improve the accuracy of the computed solution.
arXiv Detail & Related papers (2022-04-11T14:05:34Z)
- Learning Autonomy in Management of Wireless Random Networks [102.02142856863563]
This paper presents a machine learning strategy that tackles a distributed optimization task in a wireless network with an arbitrary number of randomly interconnected nodes.
We develop a flexible deep neural network formalism termed distributed message-passing neural network (DMPNN) with forward and backward computations independent of the network topology.
arXiv Detail & Related papers (2021-06-15T09:03:28Z)
- Evolutional Deep Neural Network [0.0]
An Evolutional Deep Neural Network (EDNN) is introduced for the solution of partial differential equations (PDEs).
By marching the neural network weights in the parameter space, EDNN can predict state-space trajectories that are indefinitely long.
Several applications, including the heat equation, the advection equation, the Burgers equation, the Kuramoto-Sivashinsky equation, and the Navier-Stokes equations, are solved.
arXiv Detail & Related papers (2021-03-18T00:33:11Z)
- dNNsolve: an efficient NN-based PDE solver [62.997667081978825]
We introduce dNNsolve, which makes use of dual neural networks to solve ODEs/PDEs.
We show that dNNsolve is capable of solving a broad range of ODEs/PDEs in 1, 2 and 3 spacetime dimensions.
arXiv Detail & Related papers (2021-03-15T19:14:41Z)
- Multipole Graph Neural Operator for Parametric Partial Differential Equations [57.90284928158383]
One of the main challenges in using deep learning-based methods for simulating physical systems is formulating physics-based data.
We propose a novel multi-level graph neural network framework that captures interaction at all ranges with only linear complexity.
Experiments confirm our multi-graph network learns discretization-invariant solution operators to PDEs and can be evaluated in linear time.
arXiv Detail & Related papers (2020-06-16T21:56:22Z)