A Deep Neural Network Framework for Solving Forward and Inverse Problems in Delay Differential Equations
- URL: http://arxiv.org/abs/2408.09202v2
- Date: Sat, 24 Aug 2024 16:25:32 GMT
- Title: A Deep Neural Network Framework for Solving Forward and Inverse Problems in Delay Differential Equations
- Authors: Housen Wang, Yuxing Chen, Sirong Cao, Xiaoli Wang, Qiang Liu
- Abstract summary: We propose a unified framework for delay differential equations (DDEs) based on deep neural networks (DNNs).
The framework embeds delay differential equations into neural networks to accommodate the diverse requirements of DDEs.
In addressing inverse problems, the NDDE framework can use observational data to precisely estimate single or multiple delay parameters.
- Score: 12.888147363070749
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We propose a unified framework for delay differential equations (DDEs) based on deep neural networks (DNNs) - the neural delay differential equations (NDDEs) - aimed at solving the forward and inverse problems of delay differential equations. The framework embeds delay differential equations into neural networks to accommodate the diverse requirements of DDEs in terms of initial conditions, governing equations, and known data. NDDEs adjust the network parameters through automatic differentiation and optimization algorithms to minimize the loss function, thereby obtaining numerical solutions to the delay differential equations without the grid dependence and polynomial interpolation typical of traditional numerical methods. In addressing inverse problems, the NDDE framework can use observational data to precisely estimate single or multiple delay parameters, which is very important in practical mathematical modeling. Multiple numerical experiments show that NDDEs achieve high precision in both forward and inverse problems, demonstrating their effectiveness and promise for delay differential equation problems.
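To make the training objective concrete, the following is a minimal PINN-style sketch of the idea (not the authors' implementation) for the scalar test DDE y'(t) = -y(t - tau) with constant history y(t) = 1 for t <= 0. The network architecture, collocation points, placeholder observation data, and the trainable delay parameter log_tau are all illustrative assumptions; for a pure forward problem the delay would simply be held fixed.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

class DDENet(nn.Module):
    """Small MLP approximating the solution y(t)."""
    def __init__(self, width=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(1, width), nn.Tanh(),
            nn.Linear(width, width), nn.Tanh(),
            nn.Linear(width, 1),
        )
    def forward(self, t):
        return self.net(t)

def history(t):
    # Known history function on t <= 0 (here simply y(t) = 1).
    return torch.ones_like(t)

model = DDENet()
# Inverse problem: the delay is a trainable parameter (kept positive via exp);
# for the forward problem it would instead be fixed, e.g. tau = 0.5.
log_tau = nn.Parameter(torch.tensor(0.3).log())
optimizer = torch.optim.Adam(list(model.parameters()) + [log_tau], lr=1e-3)

t_col = torch.linspace(0.0, 2.0, 200).reshape(-1, 1)   # collocation points
t_obs = torch.linspace(0.0, 2.0, 20).reshape(-1, 1)    # observation times (assumed given)
y_obs = torch.exp(-t_obs)                              # placeholder for observed data

for step in range(5000):
    optimizer.zero_grad()
    tau = log_tau.exp()

    t = t_col.clone().requires_grad_(True)
    y = model(t)
    dy_dt = torch.autograd.grad(y, t, torch.ones_like(y), create_graph=True)[0]

    # Delayed state: query the network for t - tau > 0, the known history otherwise.
    t_lag = t - tau
    y_lag = torch.where(t_lag > 0, model(t_lag), history(t_lag))

    loss_residual = ((dy_dt + y_lag) ** 2).mean()                   # DDE residual y' + y(t - tau)
    loss_initial = (model(torch.zeros(1, 1)) - 1.0).pow(2).mean()   # y(0) = 1 from the history
    loss_data = ((model(t_obs) - y_obs) ** 2).mean()                # drives the delay estimate
    (loss_residual + loss_initial + loss_data).backward()
    optimizer.step()

print("estimated delay:", log_tau.exp().item())
```

The single loss combines the DDE residual, the initial/history condition, and a data-fit term; it is the data-fit term that makes the delay identifiable in the inverse setting.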
Related papers
- Reduced-order modeling for parameterized PDEs via implicit neural representations [4.135710717238787]
We present a new data-driven reduced-order modeling approach to efficiently solve parametrized partial differential equations (PDEs).
The proposed framework encodes the PDE and utilizes a parametrized neural ODE (PNODE) to learn latent dynamics characterized by multiple PDE parameters.
We evaluate the proposed method at a large Reynolds number and obtain speedups of up to O(10^3) with about 1% relative error with respect to the ground-truth values.
arXiv Detail & Related papers (2023-11-28T01:35:06Z) - Time and State Dependent Neural Delay Differential Equations [0.5249805590164901]
Delayed terms are encountered in the governing equations of a large class of problems ranging from physics and engineering to medicine and economics.
We introduce Neural State-Dependent DDE, a framework that can model multiple and state- and time-dependent delays.
We show that our method is competitive and outperforms other continuous-class models on a wide variety of delayed dynamical systems.
arXiv Detail & Related papers (2023-06-26T09:35:56Z) - A Stable and Scalable Method for Solving Initial Value PDEs with Neural Networks [52.5899851000193]
We show that current methods based on this approach suffer from two key issues.
First, following the ODE produces an uncontrolled growth in the conditioning of the problem, ultimately leading to unacceptably large numerical errors.
We develop an ODE-based IVP solver that prevents the network from becoming ill-conditioned and runs in time linear in the number of parameters.
arXiv Detail & Related papers (2023-04-28T17:28:18Z) - Learning the Delay Using Neural Delay Differential Equations [0.5505013339790825]
We develop a continuous-time neural network approach based on Delay Differential Equations (DDEs).
Our model uses the adjoint sensitivity method to learn the model parameters and delay directly from data.
We conclude our discussion with potential future directions and applications.
arXiv Detail & Related papers (2023-04-03T19:50:36Z) - NeuralStagger: Accelerating Physics-constrained Neural PDE Solver with Spatial-temporal Decomposition [67.46012350241969]
This paper proposes a general acceleration methodology called NeuralStagger.
It decomposes the original learning task into several coarser-resolution subtasks.
We demonstrate the successful application of NeuralStagger on 2D and 3D fluid dynamics simulations.
arXiv Detail & Related papers (2023-02-20T19:36:52Z) - Tunable Complexity Benchmarks for Evaluating Physics-Informed Neural Networks on Coupled Ordinary Differential Equations [64.78260098263489]
In this work, we assess the ability of physics-informed neural networks (PINNs) to solve increasingly-complex coupled ordinary differential equations (ODEs).
We show that PINNs eventually fail to produce correct solutions to these benchmarks as their complexity increases.
We identify several reasons why this may be the case, including insufficient network capacity, poor conditioning of the ODEs, and high local curvature, as measured by the Laplacian of the PINN loss.
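As a concrete picture of the kind of objective these benchmarks stress, here is a minimal sketch (an assumed system and assumed names, not the benchmark suite itself) of a PINN residual loss for a two-component coupled ODE; increasing the coupling frequency omega is one way the difficulty of such a problem can be dialed up.

```python
import torch
import torch.nn as nn

# Coupled system x'(t) = -omega * y(t), y'(t) = omega * x(t), with x(0) = 1, y(0) = 0.
net = nn.Sequential(nn.Linear(1, 64), nn.Tanh(), nn.Linear(64, 64), nn.Tanh(), nn.Linear(64, 2))
omega = 5.0  # raising omega tightens the coupling and makes training harder

def pinn_loss(t_col):
    t = t_col.clone().requires_grad_(True)
    out = net(t)
    x, y = out[:, :1], out[:, 1:]
    dx = torch.autograd.grad(x, t, torch.ones_like(x), create_graph=True)[0]
    dy = torch.autograd.grad(y, t, torch.ones_like(y), create_graph=True)[0]
    residual = ((dx + omega * y) ** 2 + (dy - omega * x) ** 2).mean()              # ODE residual
    initial = ((net(torch.zeros(1, 1)) - torch.tensor([[1.0, 0.0]])) ** 2).mean()  # initial conditions
    return residual + initial

loss = pinn_loss(torch.linspace(0.0, 1.0, 128).reshape(-1, 1))
loss.backward()  # gradients for a standard optimizer step
```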
arXiv Detail & Related papers (2022-10-14T15:01:32Z) - High Precision Differentiation Techniques for Data-Driven Solution of Nonlinear PDEs by Physics-Informed Neural Networks [0.0]
Time-dependent Partial Differential Equations with given initial conditions are considered in this paper.
New differentiation techniques of the unknown solution with respect to the time variable are proposed.
arXiv Detail & Related papers (2022-10-02T13:36:01Z) - Neural Basis Functions for Accelerating Solutions to High Mach Euler Equations [63.8376359764052]
We propose an approach to solving partial differential equations (PDEs) using a set of neural networks.
We regress a set of neural networks onto a reduced order Proper Orthogonal Decomposition (POD) basis.
These networks are then used in combination with a branch network that ingests the parameters of the prescribed PDE to compute a reduced order approximation to the PDE.
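A schematic sketch of this construction, with assumed names and sizes rather than the paper's code: neural basis functions phi_i(x) are weighted by a branch network b(mu) over the PDE parameters, giving the reduced-order approximation u(x; mu) ≈ sum_i b_i(mu) phi_i(x).

```python
import torch
import torch.nn as nn

n_basis = 8
# Neural basis functions phi_1..phi_n over the spatial coordinate x
basis = nn.Sequential(nn.Linear(1, 64), nn.Tanh(), nn.Linear(64, n_basis))
# Branch network mapping PDE parameters mu (e.g. Mach number, angle of attack) to coefficients
branch = nn.Sequential(nn.Linear(2, 64), nn.Tanh(), nn.Linear(64, n_basis))

def reduced_order_solution(x, mu):
    # x: (N, 1) spatial points, mu: (1, 2) PDE parameters; returns u(x; mu) of shape (N, 1)
    return (basis(x) * branch(mu)).sum(dim=-1, keepdim=True)

u = reduced_order_solution(torch.linspace(0.0, 1.0, 50).reshape(-1, 1), torch.tensor([[0.8, 2.0]]))
```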
arXiv Detail & Related papers (2022-08-02T18:27:13Z) - LordNet: An Efficient Neural Network for Learning to Solve Parametric Partial Differential Equations without Simulated Data [47.49194807524502]
We propose LordNet, a tunable and efficient neural network for modeling entanglements.
The experiments on solving Poisson's equation and (2D and 3D) Navier-Stokes equation demonstrate that the long-range entanglements can be well modeled by the LordNet.
arXiv Detail & Related papers (2022-06-19T14:41:08Z) - Learning Physics-Informed Neural Networks without Stacked Back-propagation [82.26566759276105]
We develop a novel approach that can significantly accelerate the training of Physics-Informed Neural Networks.
In particular, we parameterize the PDE solution by the Gaussian smoothed model and show that, derived from Stein's Identity, the second-order derivatives can be efficiently calculated without back-propagation.
Experimental results show that our proposed method can achieve competitive error compared to standard PINN training but is two orders of magnitude faster.
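A Monte-Carlo reading of that idea is sketched below; the symmetrized form used here is a common variance-reduction choice and an assumption on my part, not necessarily the paper's exact estimator.

```python
import torch

def second_derivative_stein(f, x, sigma=0.1, n_samples=8192):
    """Estimate the second derivative of the Gaussian-smoothed model at points x (shape [N, 1]),
    using only forward evaluations of f and no stacked back-propagation."""
    d = sigma * torch.randn(n_samples, *x.shape)
    # Symmetrized Stein estimator (assumed variance-reduction form, not necessarily the paper's).
    centered = f(x + d) + f(x - d) - 2.0 * f(x)
    weight = (d ** 2 - sigma ** 2) / (2.0 * sigma ** 4)
    return (centered * weight).mean(dim=0)

# Sanity check on f(x) = sin(x), whose exact second derivative is -sin(x).
x = torch.linspace(0.0, 3.0, 5).reshape(-1, 1)
print(second_derivative_stein(torch.sin, x).flatten())
print((-torch.sin(x)).flatten())
```

Because only forward evaluations of f appear, no second-order back-propagation graph has to be built.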
arXiv Detail & Related papers (2022-02-18T18:07:54Z) - Neural Delay Differential Equations [9.077775405204347]
We propose a new class of continuous-depth neural networks with delay, named Neural Delay Differential Equations (NDDEs).
For computing the corresponding gradients, we use the adjoint sensitivity method to obtain the delayed dynamics of the adjoint.
Our results reveal that appropriately articulating the elements of dynamical systems into the network design is genuinely beneficial for improving network performance.
arXiv Detail & Related papers (2021-02-22T06:53:51Z)
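To illustrate the delayed continuous-depth dynamics in that last entry, here is a minimal forward pass for h'(t) = f_theta(h(t), h(t - tau)) with a constant history, integrated by fixed-step explicit Euler. The function and variable names are assumptions; the paper itself obtains gradients with the adjoint sensitivity method, whereas this sketch simply back-propagates through the stored trajectory.

```python
import torch
import torch.nn as nn

class NDDEFunc(nn.Module):
    """Vector field f_theta(h(t), h(t - tau)) driving the delayed hidden state."""
    def __init__(self, dim=2, width=64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(2 * dim, width), nn.Tanh(), nn.Linear(width, dim))
    def forward(self, h, h_lag):
        return self.net(torch.cat([h, h_lag], dim=-1))

def ndde_forward(func, h0, tau=0.5, t1=2.0, dt=0.01):
    """Integrate h'(t) = func(h(t), h(t - tau)) with constant history h(t) = h0 for t <= 0."""
    n_lag = int(round(tau / dt))
    traj, h = [h0], h0
    for step in range(int(round(t1 / dt))):
        h_lag = traj[step - n_lag] if step >= n_lag else h0  # delayed state from the stored trajectory
        h = h + dt * func(h, h_lag)                          # explicit Euler step
        traj.append(h)
    return torch.stack(traj)

func = NDDEFunc()
h0 = torch.randn(16, 2)                  # a batch of initial hidden states
trajectory = ndde_forward(func, h0)      # shape: (time steps + 1, batch, dim)
trajectory[-1].pow(2).mean().backward()  # plain backprop through the solver for this sketch
```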