Deep Neural Network Modeling of Unknown Partial Differential Equations
in Nodal Space
- URL: http://arxiv.org/abs/2106.03603v1
- Date: Mon, 7 Jun 2021 13:27:09 GMT
- Title: Deep Neural Network Modeling of Unknown Partial Differential Equations
in Nodal Space
- Authors: Zhen Chen, Victor Churchill, Kailiang Wu, Dongbin Xiu
- Abstract summary: We present a framework for deep neural network (DNN) modeling of unknown time-dependent partial differential equations (PDE) using trajectory data.
We present a DNN structure that has a direct correspondence to the evolution operator of the underlying PDE.
A trained DNN defines a predictive model for the underlying unknown PDE over structureless grids.
- Score: 1.8010196131724825
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We present a numerical framework for deep neural network (DNN) modeling of
unknown time-dependent partial differential equations (PDE) using their
trajectory data. Unlike the recent work of [Wu and Xiu, J. Comput. Phys. 2020],
where the learning takes place in modal/Fourier space, the current method
conducts the learning and modeling in physical space and uses measurement data
as nodal values. We present a DNN structure that has a direct correspondence to
the evolution operator of the underlying PDE, thus establishing the existence
of the DNN model. The DNN model also does not require any geometric information
of the data nodes. Consequently, a trained DNN defines a predictive model for
the underlying unknown PDE over structureless grids. A set of examples,
including linear and nonlinear scalar PDEs, systems of PDEs, in both one
and two dimensions, over structured and unstructured grids, is
presented to demonstrate the effectiveness of the proposed DNN modeling.
Extension to other equations such as differential-integral equations is also
discussed.
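As a concrete illustration of the nodal-space learning described above, the sketch below trains a residual DNN that maps the nodal values u^n at a fixed set of grid nodes directly to u^{n+1}, without using any node coordinates, and then predicts forward in time by composing the trained map. This is a minimal PyTorch sketch: the layer sizes and residual form are illustrative choices, not the authors' exact architecture.

```python
import torch
import torch.nn as nn

class NodalEvolutionNet(nn.Module):
    """Sketch of a nodal-space evolution model: u^{n+1} ~ u^n + N(u^n).

    The input is the vector of solution values at the (possibly unstructured)
    grid nodes; no node coordinates are used, matching the "structureless
    grid" setting described in the abstract.
    """
    def __init__(self, num_nodes, hidden=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(num_nodes, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, num_nodes),
        )

    def forward(self, u_n):
        # Residual form: the network learns the one-step increment,
        # mimicking a one-step approximation of the evolution operator.
        return u_n + self.net(u_n)

def train_step(model, optimizer, u_pairs):
    # u_pairs: (batch, 2, num_nodes) snapshot pairs (u^n, u^{n+1}) from trajectory data.
    u_n, u_np1 = u_pairs[:, 0], u_pairs[:, 1]
    loss = torch.mean((model(u_n) - u_np1) ** 2)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# Long-time prediction from a new initial condition: u_{k+1} = model(u_k), repeated.
```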
Related papers
- Learning Neural Constitutive Laws From Motion Observations for
Generalizable PDE Dynamics [97.38308257547186]
Many NN approaches learn an end-to-end model that implicitly models both the governing PDE and material models.
We argue that the governing PDEs are often well-known and should be explicitly enforced rather than learned.
We introduce a new framework termed "Neural Constitutive Laws" (NCLaw) which utilizes a network architecture that strictly guarantees standard priors.
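As a toy illustration of constraining a learned constitutive law by architecture (the 1D setting and the specific prior below are assumptions for illustration, not NCLaw's actual priors), the sketch hard-wires two properties regardless of the learned weights: the stress vanishes at zero strain and always has the same sign as the strain. Such a constrained network would then be plugged into an explicitly enforced governing equation or simulator rather than learned end to end.

```python
import torch
import torch.nn as nn

class SignConsistentConstitutiveLaw(nn.Module):
    """Toy 1D constitutive network: sigma(eps) = eps * softplus(net(eps)).

    By construction sigma(0) = 0 and sigma has the same sign as eps for any
    weights, so this simple prior is guaranteed rather than learned.  (The
    prior itself is an illustrative assumption, not one from the NCLaw paper.)
    """
    def __init__(self, hidden=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(1, hidden), nn.Tanh(), nn.Linear(hidden, 1)
        )

    def forward(self, eps):
        # eps: (batch, 1) strain values; returns a stress of the same shape.
        return eps * torch.nn.functional.softplus(self.net(eps))

# Usage idea: stress = SignConsistentConstitutiveLaw()(strain), with the stress
# fed into a known, explicitly enforced momentum balance instead of a learned PDE.
```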
arXiv Detail & Related papers (2023-04-27T17:42:24Z) - A Survey on Solving and Discovering Differential Equations Using Deep
Neural Networks [1.0055663066199056]
Ordinary and partial differential equations (DE) are used extensively in scientific and mathematical domains to model physical systems.
Current literature has focused primarily on deep neural network (DNN) based methods for solving a specific DE or a family of DEs.
This paper surveys and classifies previous works and provides an educational tutorial for senior practitioners, professionals, and graduate students in engineering and computer science.
arXiv Detail & Related papers (2023-04-26T20:14:25Z) - Learning Low Dimensional State Spaces with Overparameterized Recurrent
Neural Nets [57.06026574261203]
We provide theoretical evidence for learning low-dimensional state spaces, which can also model long-term memory.
Experiments corroborate our theory, demonstrating extrapolation via learning low-dimensional state spaces with both linear and non-linear RNNs.
arXiv Detail & Related papers (2022-10-25T14:45:15Z) - Neural Laplace: Learning diverse classes of differential equations in
the Laplace domain [86.52703093858631]
We propose a unified framework for learning diverse classes of differential equations (DEs).
Instead of modelling the dynamics in the time domain, we model it in the Laplace domain, where the history-dependencies and discontinuities in time can be represented as summations of complex exponentials.
In the experiments, Neural Laplace shows superior performance in modelling and extrapolating the trajectories of diverse classes of DEs.
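A small numerical illustration of why the Laplace domain is convenient here: a trajectory that is a sum of complex exponentials corresponds to a sum of simple poles in the Laplace domain, so damped oscillations and history effects have a compact representation. The poles and residues below are arbitrary example values; this is only a toy check, not the Neural Laplace model.

```python
import numpy as np

# Assumed example poles and residues for illustration only.
s_poles = np.array([-0.5 + 3.0j, -0.1 + 0.7j])
coeffs  = np.array([1.0 + 0.0j, 0.5 - 0.2j])

def x_time(t):
    # Time-domain trajectory: x(t) = sum_k Re(c_k * exp(s_k * t)).
    return np.real(sum(c * np.exp(s * t) for c, s in zip(coeffs, s_poles)))

def X_laplace(s):
    # Laplace-domain representation: a rational function with simple poles,
    # X(s) = sum_k c_k / (s - s_k).
    return sum(c / (s - p) for c, p in zip(coeffs, s_poles))

t = np.linspace(0.0, 10.0, 200)
print(x_time(t)[:3])          # a few trajectory samples
print(X_laplace(1.0 + 0.0j))  # value of the Laplace-domain representation at a test point
```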
arXiv Detail & Related papers (2022-06-10T02:14:59Z) - GCN-FFNN: A Two-Stream Deep Model for Learning Solution to Partial
Differential Equations [3.5665681694253903]
This paper introduces a novel two-stream deep model based on graph convolutional network (GCN) architecture and feed-forward neural networks (FFNN).
The proposed GCN-FFNN model learns from two types of input representations, i.e. grid and graph data, obtained via the discretization of the PDE domain.
The obtained numerical results demonstrate the applicability and efficiency of the proposed GCN-FFNN model over individual GCN and FFNN models.
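A hedged sketch of a two-stream model in this spirit is shown below: one stream applies a simple graph convolution to the graph discretization of the domain, the other a feed-forward network to the grid representation, and the outputs are fused. The hand-rolled graph-convolution layer, the layer sizes, and the averaging fusion are illustrative assumptions rather than the paper's configuration.

```python
import torch
import torch.nn as nn

class SimpleGCNLayer(nn.Module):
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, h, a_hat):
        # h: (num_nodes, in_dim) node features; a_hat: normalized adjacency
        # (num_nodes, num_nodes).  Graph convolution H' = tanh(A_hat H W).
        return torch.tanh(a_hat @ self.lin(h))

class TwoStreamPDEModel(nn.Module):
    def __init__(self, num_nodes, node_feat, grid_dim, hidden=64):
        super().__init__()
        self.gcn1 = SimpleGCNLayer(node_feat, hidden)
        self.gcn2 = SimpleGCNLayer(hidden, 1)
        self.ffnn = nn.Sequential(
            nn.Linear(grid_dim, hidden), nn.Tanh(), nn.Linear(hidden, num_nodes)
        )

    def forward(self, node_feats, a_hat, grid_input):
        graph_out = self.gcn2(self.gcn1(node_feats, a_hat), a_hat).squeeze(-1)
        grid_out = self.ffnn(grid_input)
        # Fuse the two streams; a simple average is used here for illustration.
        return 0.5 * (graph_out + grid_out)
```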
arXiv Detail & Related papers (2022-04-28T19:16:31Z) - Neural Operator with Regularity Structure for Modeling Dynamics Driven
by SPDEs [70.51212431290611]
Stochastic partial differential equations (SPDEs) are significant tools for modeling dynamics in many areas, including atmospheric sciences and physics.
We propose the Neural Operator with Regularity Structure (NORS) which incorporates the feature vectors for modeling dynamics driven by SPDEs.
We conduct experiments on a variety of SPDEs, including the dynamic Phi^4_1 model and the 2D Navier-Stokes equation.
arXiv Detail & Related papers (2022-04-13T08:53:41Z) - NeuralPDE: Modelling Dynamical Systems from Data [0.44259821861543996]
We propose NeuralPDE, a model which combines convolutional neural networks (CNNs) with differentiable ODE solvers to model dynamical systems.
We show that the Method of Lines used in standard PDE solvers can be represented using convolutions which makes CNNs the natural choice to parametrize arbitrary PDE dynamics.
Our model can be applied to any data without requiring any prior knowledge about the governing PDE.
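The method-of-lines view suggests a simple sketch: a CNN parameterizes the semi-discrete right-hand side du/dt, and a differentiable time integrator rolls it forward so the whole trajectory can be fit to snapshot data. The snippet below uses a fixed-step RK4 loop in place of a full ODE-solver library; kernel sizes and widths are illustrative, not the NeuralPDE reference implementation.

```python
import torch
import torch.nn as nn

class CNNRHS(nn.Module):
    """CNN parameterizing the right-hand side du/dt of a method-of-lines model."""
    def __init__(self, channels=1, hidden=16):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(channels, hidden, kernel_size=5, padding=2), nn.Tanh(),
            nn.Conv1d(hidden, channels, kernel_size=5, padding=2),
        )

    def forward(self, u):
        # u: (batch, channels, num_grid_points); output approximates du/dt.
        return self.net(u)

def rk4_rollout(rhs, u0, dt, steps):
    # Classical fixed-step RK4 integration of du/dt = rhs(u).  Every operation
    # is differentiable, so the rollout can be trained end-to-end on snapshots.
    u, traj = u0, [u0]
    for _ in range(steps):
        k1 = rhs(u)
        k2 = rhs(u + 0.5 * dt * k1)
        k3 = rhs(u + 0.5 * dt * k2)
        k4 = rhs(u + dt * k3)
        u = u + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)
        traj.append(u)
    return torch.stack(traj, dim=1)  # (batch, steps + 1, channels, points)
```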
arXiv Detail & Related papers (2021-11-15T10:59:52Z) - GrADE: A graph based data-driven solver for time-dependent nonlinear
partial differential equations [0.0]
We propose a novel framework referred to as the Graph Attention Differential Equation (GrADE) for solving time dependent nonlinear PDEs.
The proposed approach couples FNN, graph neural network, and recently developed Neural ODE framework.
Results obtained illustrate the capability of the proposed framework in modeling PDE and its scalability to larger domains without the need for retraining.
arXiv Detail & Related papers (2021-08-24T10:49:03Z) - Incorporating NODE with Pre-trained Neural Differential Operator for
Learning Dynamics [73.77459272878025]
We propose to enhance the supervised signal in learning dynamics by pre-training a neural differential operator (NDO).
The NDO is pre-trained on a class of symbolic functions, learning the mapping from trajectory samples of these functions to their derivatives.
We provide a theoretical guarantee that the output of the NDO can approximate the ground-truth derivatives well by properly tuning the complexity of the library.
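A rough sketch of the pre-training step: sample trajectories from a small library of symbolic functions with known closed-form derivatives and train a network to map sampled trajectories to derivative values. The two-family library, network size, and training loop below are illustrative assumptions, not the paper's actual library or hyperparameters.

```python
import numpy as np
import torch
import torch.nn as nn

def sample_library_batch(n_points=32, batch=64):
    # Draw trajectories from an assumed toy library (sinusoids and quadratics)
    # whose derivatives are known exactly.
    t = np.linspace(0.0, 1.0, n_points)
    xs, dxs = [], []
    for _ in range(batch):
        a, w = np.random.uniform(0.5, 2.0, size=2)
        if np.random.rand() < 0.5:
            xs.append(a * np.sin(w * t)); dxs.append(a * w * np.cos(w * t))
        else:
            xs.append(a * t ** 2 + w * t); dxs.append(2 * a * t + w)
    return (torch.tensor(np.array(xs), dtype=torch.float32),
            torch.tensor(np.array(dxs), dtype=torch.float32))

# Pre-train the operator: sampled trajectory -> sampled derivative.
ndo = nn.Sequential(nn.Linear(32, 128), nn.Tanh(), nn.Linear(128, 32))
opt = torch.optim.Adam(ndo.parameters(), lr=1e-3)
for _ in range(200):
    x, dx = sample_library_batch()
    loss = torch.mean((ndo(x) - dx) ** 2)
    opt.zero_grad(); loss.backward(); opt.step()
# The pre-trained operator can then supply estimated derivatives as an extra
# supervised signal when fitting a neural ODE to new trajectory data.
```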
arXiv Detail & Related papers (2021-06-08T08:04:47Z) - Evolutional Deep Neural Network [0.0]
An Evolutional Deep Neural Network (EDNN) is introduced for the solution of partial differential equations (PDE).
By marching the neural network weights in the parameter space, EDNN can predict state-space trajectories that are indefinitely long.
Several applications, including the heat equation, the advection equation, the Burgers equation, the Kuramoto-Sivashinsky equation, and the Navier-Stokes equations, are solved.
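A minimal sketch of the weight-marching idea for the 1D heat equation u_t = u_xx: the field is represented by a small network u_theta(x), the weight velocity dtheta/dt is obtained from a least-squares fit of (du/dtheta) dtheta/dt = u_xx at collocation points, and the weights are advanced in time. The network size, collocation points, pseudo-inverse solve, and explicit Euler update are illustrative choices, not the paper's exact formulation.

```python
import torch

torch.manual_seed(0)

# Small network u_theta(x) = tanh(x W1^T + b1) W2^T + b2 representing the field.
theta = [torch.randn(8, 1) * 0.5, torch.zeros(8),
         torch.randn(1, 8) * 0.5, torch.zeros(1)]
x = torch.linspace(-1.0, 1.0, 40).reshape(-1, 1)  # collocation points

def u_fn(params, xs):
    w1, b1, w2, b2 = params
    return torch.tanh(xs @ w1.T + b1) @ w2.T + b2            # (n, 1)

def flat(params):
    return torch.cat([p.reshape(-1) for p in params])

def unflat(vec, like):
    out, i = [], 0
    for p in like:
        out.append(vec[i:i + p.numel()].reshape(p.shape)); i += p.numel()
    return out

def heat_rhs(params, xs):
    # PDE right-hand side u_xx at the collocation points via nested autograd.
    xg = xs.clone().requires_grad_(True)
    u = u_fn(params, xg)
    ux = torch.autograd.grad(u.sum(), xg, create_graph=True)[0]
    uxx = torch.autograd.grad(ux.sum(), xg)[0]
    return uxx.detach().reshape(-1)

def u_of_flat(vec):
    return u_fn(unflat(vec, theta), x).reshape(-1)

dt = 1e-3
for _ in range(10):  # march the network weights in time
    J = torch.autograd.functional.jacobian(u_of_flat, flat(theta))  # (n_pts, n_params)
    dtheta = torch.linalg.pinv(J) @ heat_rhs(theta, x)              # least-squares dtheta/dt
    theta = unflat(flat(theta) + dt * dtheta, theta)                # explicit Euler step
```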
arXiv Detail & Related papers (2021-03-18T00:33:11Z) - dNNsolve: an efficient NN-based PDE solver [62.997667081978825]
We introduce dNNsolve, which makes use of dual neural networks to solve ODEs/PDEs.
We show that dNNsolve is capable of solving a broad range of ODEs/PDEs in 1, 2 and 3 spacetime dimensions.
arXiv Detail & Related papers (2021-03-15T19:14:41Z)