Neural State-Dependent Delay Differential Equations
- URL: http://arxiv.org/abs/2306.14545v1
- Date: Mon, 26 Jun 2023 09:35:56 GMT
- Title: Neural State-Dependent Delay Differential Equations
- Authors: Thibault Monsel (DATAFLOT, TAU), Onofrio Semeraro (DATAFLOT), Lionel
Mathelin (DATAFLOT), Guillaume Charpiat (TAU)
- Abstract summary: Discontinuities and delayed terms are encountered in the governing equations of a large class of problems ranging from physics to medicine.
We revisit the recently proposed Neural DDE by introducing Neural State-Dependent DDE (SDDDE), a general and flexible framework featuring multiple and state-dependent delays.
We show that our method is competitive and outperforms other continuous-class models on a wide variety of delayed dynamical systems.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Discontinuities and delayed terms are encountered in the governing
equations of a large class of problems, ranging from physics and engineering to
medicine and economics. These systems cannot be properly modelled or simulated
with standard Ordinary Differential Equations (ODEs) or with any data-driven
approximation, including Neural Ordinary Differential Equations (NODEs). To
circumvent this issue, latent variables are typically introduced to solve the
dynamics of the system in a higher dimensional space and obtain the solution as
a projection to the original space. However, this solution lacks physical
interpretability. In contrast, Delay Differential Equations (DDEs) and their
data-driven, approximated counterparts naturally appear as good candidates to
characterize such complicated systems. In this work we revisit the recently
proposed Neural DDE by introducing Neural State-Dependent DDE (SDDDE), a
general and flexible framework featuring multiple and state-dependent delays.
The developed framework is auto-differentiable and runs efficiently on multiple
backends. We show that our method is competitive and outperforms other
continuous-class models on a wide variety of delayed dynamical systems.
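The idea of a vector field that depends on the state both now and at a state-dependent earlier time can be sketched with a minimal integrator. This is a toy illustration, not the paper's implementation: the random MLP weights, the delay function `tau`, and the explicit Euler scheme with interpolated history are all assumed choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# A tiny random MLP stands in for the learned vector field
# f_theta(x(t), x(t - tau(x(t)))); the weights are illustrative, not trained.
W1 = 0.5 * rng.normal(size=(8, 2))
w2 = 0.5 * rng.normal(size=8)

def f(x, x_delayed):
    return float(w2 @ np.tanh(W1 @ np.array([x, x_delayed])))

def tau(x):
    # State-dependent delay, kept in (0.5, 1.5)
    return 1.0 + 0.5 * np.tanh(x)

# Explicit Euler with a stored trajectory; the delayed state is read back
# from the history by linear interpolation (constant history x = 1 for t <= 0).
dt, n_steps = 0.01, 500
ts, xs = [0.0], [1.0]
x = 1.0
for k in range(n_steps):
    t = k * dt
    t_del = t - tau(x)
    x_delayed = 1.0 if t_del <= 0.0 else np.interp(t_del, ts, xs)
    x = x + dt * f(x, x_delayed)
    ts.append(t + dt)
    xs.append(x)

print(xs[-1])
```

Note the key difference from a Neural ODE: each step must look up a past value of the solution, so the integrator has to carry the whole trajectory (or an interpolant of it) rather than only the current state.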
Related papers
- A Deep Neural Network Framework for Solving Forward and Inverse Problems in Delay Differential Equations [12.888147363070749]
We propose a unified framework for delay differential equations (DDEs) based on deep neural networks (DNNs).
This framework embeds DDEs into neural networks to accommodate the diverse requirements of DDEs.
In addressing inverse problems, the NDDE framework can utilize observational data to perform precise estimation of single or multiple delay parameters.
arXiv Detail & Related papers (2024-08-17T13:41:34Z)
- Individualized Dosing Dynamics via Neural Eigen Decomposition [51.62933814971523]
We introduce the Neural Eigen Differential Equation algorithm (NESDE).
NESDE provides individualized modeling, tunable generalization to new treatment policies, and fast, continuous, closed-form prediction.
We demonstrate the robustness of NESDE in both synthetic and real medical problems, and use the learned dynamics to publish simulated medical gym environments.
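The kind of fast, continuous, closed-form prediction mentioned above can be illustrated on a plain linear system: once the dynamics matrix is eigendecomposed, the state at any time is a single matrix product. The matrix, eigenvalues, and initial state below are illustrative choices, not NESDE's learned quantities.

```python
import numpy as np

# Linear latent dynamics dx/dt = A x. Diagonalizing A = V diag(lam) V^-1
# gives the closed-form solution x(t) = V diag(exp(lam * t)) V^-1 x0,
# so prediction at any continuous time t needs no step-by-step integration.
lam = np.array([-0.5, -2.0])            # stable eigenvalues (assumed)
V = np.array([[1.0, 1.0], [0.0, 1.0]])  # chosen eigenvectors (assumed)
A = V @ np.diag(lam) @ np.linalg.inv(V)
x0 = np.array([1.0, 1.0])

def predict(t):
    return V @ (np.exp(lam * t) * (np.linalg.inv(V) @ x0))

# Cross-check against a fine explicit-Euler integration of dx/dt = A x
x, dt = x0.copy(), 1e-4
for _ in range(int(1.0 / dt)):
    x = x + dt * (A @ x)

print(predict(1.0), x)
```

The eigen form also makes generalization to irregular observation times trivial: `predict(t)` is valid for any real `t`, not only multiples of a solver step.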
arXiv Detail & Related papers (2023-06-24T17:01:51Z)
- Neural Laplace: Learning diverse classes of differential equations in the Laplace domain [86.52703093858631]
We propose a unified framework for learning diverse classes of differential equations (DEs) including all the aforementioned ones.
Instead of modelling the dynamics in the time domain, we model it in the Laplace domain, where the history-dependencies and discontinuities in time can be represented as summations of complex exponentials.
In the experiments, Neural Laplace shows superior performance in modelling and extrapolating the trajectories of diverse classes of DEs.
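The core idea, that time-domain behaviour corresponds to a summation of complex exponentials tied to Laplace-domain poles, can be checked numerically. The pole pair below is an illustrative example, not a representation learned by Neural Laplace.

```python
import numpy as np

# A Laplace-domain pole at s = -1 + 3j corresponds to the time-domain term
# exp(s t); summing it with its conjugate gives a real damped oscillation.
s = -1.0 + 3.0j
t = np.linspace(0.0, 4.0, 200)
f_exp = 0.5 * (np.exp(s * t) + np.exp(np.conj(s) * t))  # sum of complex exponentials
f_ref = np.exp(-t) * np.cos(3.0 * t)                    # closed-form target

print(np.max(np.abs(f_exp.real - f_ref)))
```

Delayed terms fit the same picture: a delay by tau multiplies a Laplace transform by exp(-s * tau), which is again smooth in s even though it produces discontinuities in time.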
arXiv Detail & Related papers (2022-06-10T02:14:59Z)
- Physics Informed RNN-DCT Networks for Time-Dependent Partial Differential Equations [62.81701992551728]
We present a physics-informed framework for solving time-dependent partial differential equations.
Our model utilizes discrete cosine transforms to encode spatial frequencies and recurrent neural networks to model the time evolution.
We show experimental results on the Taylor-Green vortex solution to the Navier-Stokes equations.
arXiv Detail & Related papers (2022-02-24T20:46:52Z)
- NeuralPDE: Modelling Dynamical Systems from Data [0.44259821861543996]
We propose NeuralPDE, a model which combines convolutional neural networks (CNNs) with differentiable ODE solvers to model dynamical systems.
We show that the Method of Lines used in standard PDE solvers can be represented using convolutions which makes CNNs the natural choice to parametrize arbitrary PDE dynamics.
Our model can be applied to any data without requiring any prior knowledge about the governing PDE.
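The claim that the Method of Lines reduces to a convolution can be verified directly for the 1-D heat equation. This is a minimal check using the standard central-difference stencil, not code from the paper.

```python
import numpy as np

# Method of Lines for the heat equation u_t = u_xx: discretizing space with
# central differences turns the right-hand side into a convolution of u with
# the stencil [1, -2, 1] / dx^2 -- exactly what a fixed 1-D CNN kernel computes.
dx = 0.1
u = np.sin(np.linspace(0.0, np.pi, 32))
kernel = np.array([1.0, -2.0, 1.0]) / dx**2

# Interior points via explicit finite differences
fd = (u[:-2] - 2.0 * u[1:-1] + u[2:]) / dx**2
# The same values via convolution ('valid' keeps interior points only)
conv = np.convolve(u, kernel, mode="valid")

print(np.max(np.abs(fd - conv)))
```

Replacing the fixed stencil with learnable kernel weights is what lets a CNN parametrize an unknown PDE right-hand side.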
arXiv Detail & Related papers (2021-11-15T10:59:52Z)
- Continuous Convolutional Neural Networks: Coupled Neural PDE and ODE [1.1897857181479061]
This work proposes a variant of Convolutional Neural Networks (CNNs) that can learn the hidden dynamics of a physical system.
Instead of treating the physical system (such as an image or a time-series) as a system of multiple layers, this technique models the system in the form of Differential Equations (DEs).
arXiv Detail & Related papers (2021-10-30T21:45:00Z)
- Neural ODE Processes [64.10282200111983]
We introduce Neural ODE Processes (NDPs), a new class of processes determined by a distribution over Neural ODEs.
We show that our model can successfully capture the dynamics of low-dimensional systems from just a few data-points.
arXiv Detail & Related papers (2021-03-23T09:32:06Z)
- Neural SDEs as Infinite-Dimensional GANs [18.07683058213448]
We show that the classical approach to fitting SDEs may be treated as a special case of (Wasserstein) GANs.
We obtain Neural SDEs as (in modern machine learning parlance) continuous-time generative time series models.
arXiv Detail & Related papers (2021-02-06T19:59:15Z)
- Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations.
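The construction from linear first-order dynamical systems can be sketched for a single scalar unit. This is a toy version: the nonlinearity `f`, the parameters `tau` and `A`, and the semi-implicit update are illustrative stand-ins, not the trained networks from the paper.

```python
import numpy as np

# Toy scalar liquid time-constant unit. The state follows
#   dx/dt = -[1/tau + f(x, I)] * x + f(x, I) * A,
# so the effective time constant varies with the input -- the "liquid" part.
tau, A, dt = 1.0, 2.0, 0.05

def f(x, I):
    return 1.0 / (1.0 + np.exp(-(x + I)))  # stand-in bounded nonlinearity

x = 0.0
for k in range(200):
    I = 1.0 if k < 100 else -1.0           # step input (assumed)
    fx = f(x, I)
    # Fused semi-implicit Euler step: keeps the state bounded in [0, A)
    x = (x + dt * fx * A) / (1.0 + dt * (1.0 / tau + fx))

print(x)
```

Because `f` is bounded and appears in both the decay and drive terms, the update cannot leave the interval between 0 and `A`, which illustrates the stable, bounded behavior claimed above.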
arXiv Detail & Related papers (2020-06-08T09:53:35Z)
- Time Dependence in Non-Autonomous Neural ODEs [74.78386661760662]
We propose a novel family of Neural ODEs with time-varying weights.
We outperform previous Neural ODE variants in both speed and representational capacity.
arXiv Detail & Related papers (2020-05-05T01:41:46Z)
This list is automatically generated from the titles and abstracts of the papers in this site.