Learning Time Delay Systems with Neural Ordinary Differential Equations
- URL: http://arxiv.org/abs/2206.14288v1
- Date: Tue, 28 Jun 2022 20:59:44 GMT
- Title: Learning Time Delay Systems with Neural Ordinary Differential Equations
- Authors: Xunbi A. Ji and Gabor Orosz
- Abstract summary: A neural network with trainable delays is used to approximate a delay differential equation.
An example on learning the dynamics of the Mackey-Glass equation using data from chaotic behavior is given.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: A novel way of using neural networks to learn the dynamics of time delay
systems from sequential data is proposed. A neural network with trainable
delays is used to approximate the right-hand side of a delay differential
equation. We relate the delay differential equation to an ordinary differential
equation by discretizing the time history and train the corresponding neural
ordinary differential equation (NODE) to learn the dynamics. An example on
learning the dynamics of the Mackey-Glass equation using data from chaotic
behavior is given. After learning both the nonlinearity and the time delay, we
demonstrate that the bifurcation diagram of the neural network matches that of
the original system.
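The core idea of the abstract, relating a DDE to an ODE by discretizing the time history, can be illustrated on the Mackey-Glass equation itself. The sketch below is not the authors' implementation: it only shows the forward side of the construction, keeping the last tau/dt samples in a buffer so the delayed state is an ordinary component of a finite-dimensional state, and integrating with explicit Euler. The parameter values (beta=0.2, gamma=0.1, n=10, tau=17) are the standard chaotic-regime choice for Mackey-Glass, not necessarily those used in the paper.

```python
import numpy as np

def mackey_glass_rhs(x_t, x_delayed, beta=0.2, gamma=0.1, n=10.0):
    # Mackey-Glass right-hand side:
    #   dx/dt = beta * x(t - tau) / (1 + x(t - tau)^n) - gamma * x(t)
    return beta * x_delayed / (1.0 + x_delayed ** n) - gamma * x_t

def simulate_dde(rhs, tau=17.0, dt=0.01, steps=5000, x0=1.2):
    # Discretize the time history: keep the last d = tau/dt samples in a
    # buffer, so the DDE becomes a finite-dimensional ODE whose state is
    # the sampled history (the same discretization idea the paper uses to
    # obtain a trainable neural ODE).
    d = int(round(tau / dt))
    history = np.full(d + 1, x0)      # history[0] = x(t - tau), history[-1] = x(t)
    traj = [x0]
    for _ in range(steps):
        x_t, x_delayed = history[-1], history[0]
        x_next = x_t + dt * rhs(x_t, x_delayed)   # explicit Euler step
        history = np.roll(history, -1)            # shift the history window
        history[-1] = x_next
        traj.append(x_next)
    return np.array(traj)

traj = simulate_dde(mackey_glass_rhs)
```

In the paper's setting, the hand-written `mackey_glass_rhs` would be replaced by a neural network with a trainable delay, and the trajectory data above would serve as training data.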
Related papers
- A Deep Neural Network Framework for Solving Forward and Inverse Problems in Delay Differential Equations [12.888147363070749]
We propose a unified framework for delay differential equations (DDEs) based on deep neural networks (DNNs)
This framework could embed delay differential equations into neural networks to accommodate the diverse requirements of DDEs.
In addressing inverse problems, the NDDE framework can utilize observational data to perform precise estimation of single or multiple delay parameters.
arXiv Detail & Related papers (2024-08-17T13:41:34Z) - Learning the Delay Using Neural Delay Differential Equations [0.5505013339790825]
We develop a continuous time neural network approach based on Delay Differential Equations (DDEs)
Our model uses the adjoint sensitivity method to learn the model parameters and delay directly from data.
We conclude our discussion with potential future directions and applications.
arXiv Detail & Related papers (2023-04-03T19:50:36Z) - Neural Laplace: Learning diverse classes of differential equations in the Laplace domain [86.52703093858631]
We propose a unified framework for learning diverse classes of differential equations (DEs) including all the aforementioned ones.
Instead of modelling the dynamics in the time domain, we model it in the Laplace domain, where the history-dependencies and discontinuities in time can be represented as summations of complex exponentials.
In the experiments, Neural Laplace shows superior performance in modelling and extrapolating the trajectories of diverse classes of DEs.
arXiv Detail & Related papers (2022-06-10T02:14:59Z) - On the balance between the training time and interpretability of neural ODE for time series modelling [77.34726150561087]
The paper shows that modern neural ODEs cannot be reduced to simpler models for time-series modelling applications.
The complexity of neural ODEs is comparable to, or exceeds, that of conventional time-series modelling tools.
We propose a new view on time-series modelling using combined neural networks and an ODE system approach.
arXiv Detail & Related papers (2022-06-07T13:49:40Z) - Physics Informed RNN-DCT Networks for Time-Dependent Partial Differential Equations [62.81701992551728]
We present a physics-informed framework for solving time-dependent partial differential equations.
Our model utilizes discrete cosine transforms to encode spatial frequencies and recurrent neural networks to process the time evolution.
We show experimental results on the Taylor-Green vortex solution to the Navier-Stokes equations.
arXiv Detail & Related papers (2022-02-24T20:46:52Z) - On Neural Differential Equations [13.503274710499971]
In particular, neural differential equations (NDEs) demonstrate that neural networks and differential equations are two sides of the same coin.
NDEs are suitable for tackling generative problems, dynamical systems, and time series.
NDEs offer high-capacity function approximation, strong priors on model space, the ability to handle irregular data, memory efficiency, and a wealth of available theory on both sides.
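The "two sides of the same coin" view above reduces, in its simplest form, to parameterizing a vector field dx/dt = f_theta(x) with a neural network and integrating it. A minimal sketch, with a tiny fixed random MLP standing in for a trained network (the weights and fixed-step Euler solver are illustrative assumptions, not any specific library's API):

```python
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = 0.5 * rng.normal(size=(16, 2)), np.zeros(16)
W2, b2 = 0.5 * rng.normal(size=(2, 16)), np.zeros(2)

def f_theta(x):
    # Small MLP standing in for the learned vector field dx/dt = f_theta(x).
    return W2 @ np.tanh(W1 @ x + b1) + b2

def odeint_euler(f, x0, t0, t1, dt=0.01):
    # Fixed-step explicit Euler integration of the neural ODE; in practice
    # an adaptive solver plus the adjoint method would be used for training.
    x, t = np.array(x0, dtype=float), t0
    while t < t1:
        x = x + dt * f(x)
        t += dt
    return x

x1 = odeint_euler(f_theta, [1.0, 0.0], 0.0, 1.0)
```

Because tanh is bounded, the vector field is bounded and the rollout stays finite; training would fit the weights so the integrated trajectory matches observed data.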
arXiv Detail & Related papers (2022-02-04T23:32:29Z) - Time-Reversal Symmetric ODE Network [138.02741983098454]
Time-reversal symmetry is a fundamental property that frequently holds in classical and quantum mechanics.
We propose a novel loss function that measures how well our ordinary differential equation (ODE) networks comply with this time-reversal symmetry.
We show that, even for systems that do not possess the full time-reversal symmetry, TRS-ODENs can achieve better predictive performances over baselines.
arXiv Detail & Related papers (2020-07-22T12:19:40Z) - Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations.
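The construction above, networks of linear first-order dynamical systems modulated by nonlinear gates, can be sketched in a few lines. This is a rough reading of the liquid time-constant update, where a sigmoid gate f makes the effective time constant input-dependent; the parameter shapes and single Euler step are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def ltc_step(x, u, W, tau=1.0, A=1.0, dt=0.05):
    # One Euler step of a liquid time-constant cell:
    #   dx/dt = -(1/tau + f(x, u)) * x + f(x, u) * A
    # The gate f shifts the effective time constant with the input,
    # hence "liquid"; each unit remains a stable linear first-order system.
    f = 1.0 / (1.0 + np.exp(-(W @ np.concatenate([x, u]))))  # sigmoid gate
    return x + dt * (-(1.0 / tau + f) * x + f * A)

rng = np.random.default_rng(1)
W = 0.5 * rng.normal(size=(4, 5))   # 4 hidden units; inputs = 4 state + 1 drive
x = np.zeros(4)
for t in range(200):
    x = ltc_step(x, np.array([np.sin(0.1 * t)]), W)
```

Since the gate lies in (0, 1), each unit relaxes toward a point between 0 and A, which is one way to see the stable, bounded behavior claimed in the summary.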
arXiv Detail & Related papers (2020-06-08T09:53:35Z) - Finite Difference Neural Networks: Fast Prediction of Partial Differential Equations [5.575293536755126]
We propose a novel neural network framework, finite difference neural networks (FDNet), to learn partial differential equations from data.
Specifically, our proposed finite difference inspired network is designed to learn the underlying governing partial differential equations from trajectory data.
arXiv Detail & Related papers (2020-06-02T19:17:58Z) - Time Dependence in Non-Autonomous Neural ODEs [74.78386661760662]
We propose a novel family of Neural ODEs with time-varying weights.
We outperform previous Neural ODE variants in both speed and representational capacity.
arXiv Detail & Related papers (2020-05-05T01:41:46Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.