Learning the Delay Using Neural Delay Differential Equations
- URL: http://arxiv.org/abs/2304.01329v2
- Date: Wed, 14 Jun 2023 16:51:05 GMT
- Title: Learning the Delay Using Neural Delay Differential Equations
- Authors: Maria Oprea, Mark Walth, Robert Stephany, Gabriella Torres Nothaft, Arnaldo Rodriguez-Gonzalez, and William Clark
- Abstract summary: We develop a continuous-time neural network approach based on Delay Differential Equations (DDEs).
Our model uses the adjoint sensitivity method to learn the model parameters and delay directly from data.
We conclude our discussion with potential future directions and applications.
- Score: 0.5505013339790825
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The intersection of machine learning and dynamical systems has generated
considerable interest recently. Neural Ordinary Differential Equations (NODEs)
represent a rich overlap between these fields. In this paper, we develop a
continuous-time neural network approach based on Delay Differential Equations
(DDEs). Our model uses the adjoint sensitivity method to learn the model
parameters and delay directly from data. Our approach is inspired by that of
NODEs and extends earlier neural DDE models, which have assumed that the value
of the delay is known a priori. We perform a sensitivity analysis on our
proposed approach and demonstrate its ability to learn DDE parameters from
benchmark systems. We conclude our discussion with potential future directions
and applications.
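To make the idea concrete, here is a minimal sketch of a DDE network with a trainable delay. It is not the authors' implementation: the adjoint sensitivity method is replaced by plain backpropagation through a fixed-step Euler solver, and all names and sizes are illustrative.

```python
# Minimal sketch of a neural DDE x'(t) = f_theta(x(t), x(t - tau)) with a
# trainable delay tau. Backprop through the solver stands in for the paper's
# adjoint sensitivity method; names and sizes are illustrative.
import torch
import torch.nn as nn

class NeuralDDE(nn.Module):
    def __init__(self, dim=2, hidden=32, tau_init=1.0):
        super().__init__()
        self.f = nn.Sequential(
            nn.Linear(2 * dim, hidden), nn.Tanh(), nn.Linear(hidden, dim)
        )
        # raw parameter mapped through softplus so tau stays positive
        self.raw_tau = nn.Parameter(torch.tensor(tau_init))

    @property
    def tau(self):
        return torch.nn.functional.softplus(self.raw_tau)

    def forward(self, x0, t_max=5.0, dt=0.01):
        """Euler integration with the delayed state linearly interpolated
        from the stored trajectory (constant history x(t) = x0 for t <= 0)."""
        n_steps = int(t_max / dt)
        hist = [x0]
        x = x0
        for k in range(n_steps):
            t_lag = k * dt - self.tau          # time we must look back to
            if t_lag <= 0:
                x_lag = hist[0]                # constant pre-history
            else:
                idx = t_lag / dt
                i0 = int(idx.floor())
                w = idx - i0                   # gradient w.r.t. tau flows via w
                x_lag = (1 - w) * hist[i0] + w * hist[min(i0 + 1, len(hist) - 1)]
            x = x + dt * self.f(torch.cat([x, x_lag], dim=-1))
            hist.append(x)
        return torch.stack(hist)               # (n_steps + 1, dim)

model = NeuralDDE()
traj = model(torch.full((2,), 0.5))
loss = traj[-1].pow(2).sum()   # placeholder objective; real training fits data
loss.backward()                # gradients reach both f's weights and tau
print(float(model.tau), model.raw_tau.grad)
```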
Related papers
- A Deep Neural Network Framework for Solving Forward and Inverse Problems in Delay Differential Equations [12.888147363070749]
We propose a unified framework for delay differential equations (DDEs) based on deep neural networks (DNNs).
The framework embeds delay differential equations into neural networks to accommodate the diverse requirements of DDEs.
For inverse problems, the framework can use observational data to estimate single or multiple delay parameters precisely, as sketched below.
arXiv Detail & Related papers (2024-08-17T13:41:34Z) - Time and State Dependent Neural Delay Differential Equations [0.5249805590164901]
- Time and State Dependent Neural Delay Differential Equations [0.5249805590164901]
Delayed terms are encountered in the governing equations of a large class of problems, ranging from physics and engineering to medicine and economics.
We introduce Neural State-Dependent DDEs, a framework that can model multiple delays as well as state- and time-dependent delays.
We show that our method is competitive and outperforms other continuous-class models on a wide variety of delayed dynamical systems.
arXiv Detail & Related papers (2023-06-26T09:35:56Z) - Neural Delay Differential Equations: System Reconstruction and Image
- Neural Delay Differential Equations: System Reconstruction and Image Classification [14.59919398960571]
We propose a new class of continuous-depth neural networks with delay, named Neural Delay Differential Equations (NDDEs).
Compared to NODEs, NDDEs have a stronger capacity for nonlinear representation.
We achieve lower loss and higher accuracy not only on synthetically produced data but also on CIFAR-10, a well-known image dataset.
arXiv Detail & Related papers (2023-04-11T16:09:28Z) - Neural Basis Functions for Accelerating Solutions to High Mach Euler
- Neural Basis Functions for Accelerating Solutions to High Mach Euler Equations [63.8376359764052]
We propose an approach to solving partial differential equations (PDEs) using a set of neural networks.
We regress a set of neural networks onto a reduced-order Proper Orthogonal Decomposition (POD) basis.
These networks are then used in combination with a branch network that ingests the parameters of the prescribed PDE to compute a reduced-order approximation to the PDE.
arXiv Detail & Related papers (2022-08-02T18:27:13Z) - Learning Time Delay Systems with Neural Ordinary Differential Equations [0.0]
- Learning Time Delay Systems with Neural Ordinary Differential Equations [0.0]
A neural network with trainable delays is used to approximate a delay differential equation.
An example of learning the dynamics of the Mackey-Glass equation from data in the chaotic regime is given (a data-generation sketch follows below).
arXiv Detail & Related papers (2022-06-28T20:59:44Z) - Neural Laplace: Learning diverse classes of differential equations in
- Neural Laplace: Learning diverse classes of differential equations in the Laplace domain [86.52703093858631]
We propose a unified framework for learning diverse classes of differential equations (DEs), including those with delays and other history dependence.
Instead of modelling the dynamics in the time domain, we model them in the Laplace domain, where history dependencies and discontinuities in time can be represented as summations of complex exponentials.
In the experiments, Neural Laplace shows superior performance in modelling and extrapolating the trajectories of diverse classes of DEs.
arXiv Detail & Related papers (2022-06-10T02:14:59Z) - Incorporating NODE with Pre-trained Neural Differential Operator for
- Incorporating NODE with Pre-trained Neural Differential Operator for Learning Dynamics [73.77459272878025]
We propose to enhance the supervised signal in learning dynamics by pre-training a neural differential operator (NDO).
The NDO is pre-trained on a class of symbolic functions, and it learns the mapping from trajectory samples of these functions to their derivatives.
We provide a theoretical guarantee that the output of the NDO can approximate the ground-truth derivatives well when the complexity of the library is properly tuned.
arXiv Detail & Related papers (2021-06-08T08:04:47Z) - Neural ODE Processes [64.10282200111983]
- Neural ODE Processes [64.10282200111983]
We introduce Neural ODE Processes (NDPs), a new class of processes determined by a distribution over Neural ODEs.
We show that our model can successfully capture the dynamics of low-dimensional systems from just a few data-points.
arXiv Detail & Related papers (2021-03-23T09:32:06Z) - Learning continuous-time PDEs from sparse data with graph neural
- Learning continuous-time PDEs from sparse data with graph neural networks [10.259254824702555]
We propose a continuous-time differential model for dynamical systems whose governing equations are parameterized by message passing graph neural networks.
We demonstrate the model's ability to work with unstructured grids, arbitrary time steps, and noisy observations.
We compare our method with existing approaches on several well-known physical systems involving first- and higher-order PDEs and achieve state-of-the-art predictive performance.
arXiv Detail & Related papers (2020-06-16T07:15:40Z) - Liquid Time-constant Networks [117.57116214802504]
- Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations.
arXiv Detail & Related papers (2020-06-08T09:53:35Z) - Time Dependence in Non-Autonomous Neural ODEs [74.78386661760662]
- Time Dependence in Non-Autonomous Neural ODEs [74.78386661760662]
We propose a novel family of Neural ODEs with time-varying weights.
We outperform previous Neural ODE variants in both speed and representational capacity.
arXiv Detail & Related papers (2020-05-05T01:41:46Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.