Neural Delay Differential Equations: System Reconstruction and Image
Classification
- URL: http://arxiv.org/abs/2304.05310v1
- Date: Tue, 11 Apr 2023 16:09:28 GMT
- Title: Neural Delay Differential Equations: System Reconstruction and Image
Classification
- Authors: Qunxi Zhu, Yao Guo, Wei Lin
- Abstract summary: We propose a new class of continuous-depth neural networks with delay, named Neural Delay Differential Equations (NDDEs)
Compared to NODEs, NDDEs have a stronger capacity of nonlinear representations.
We achieve lower loss and higher accuracy not only for the data produced synthetically but also for the CIFAR10, a well-known image dataset.
- Score: 14.59919398960571
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Neural Ordinary Differential Equations (NODEs), a framework of
continuous-depth neural networks, have been widely applied, showing exceptional
efficacy in coping with representative datasets. Recently, an augmented
framework has been developed to overcome some limitations that emerged in the
application of the original framework. In this paper, we propose a new class of
continuous-depth neural networks with delay, named Neural Delay Differential
Equations (NDDEs). To compute the corresponding gradients, we use the adjoint
sensitivity method to obtain the delayed dynamics of the adjoint. Differential
equations with delays are typically seen as dynamical systems of infinite
dimension that possess more fruitful dynamics. Compared to NODEs, NDDEs have a
stronger capacity of nonlinear representations. We use several illustrative
examples to demonstrate this outstanding capacity. Firstly, we successfully
model the delayed dynamics where the trajectories in the lower-dimensional
phase space could be mutually intersected and even chaotic in a model-free or
model-based manner. Traditional NODEs, without any augmentation, are not
directly applicable to such modeling. Secondly, we achieve lower loss and
higher accuracy not only for the data produced synthetically by complex models
but also for the CIFAR10, a well-known image dataset. Our results on the NDDEs
demonstrate that appropriately articulating the elements of dynamical systems
into the network design is truly beneficial in promoting network performance.
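The delayed dynamics an NDDE integrates can be illustrated with a minimal, hypothetical sketch (this is not the authors' implementation, and the adjoint-based training described above is omitted). The vector field depends on both the current state z(t) and the delayed state z(t - tau), so the integrator must keep a history buffer of past states; `toy_field` is a hand-written stand-in for the trained network.

```python
def ndde_euler(f, z0, tau, t_end, dt):
    """Integrate dz/dt = f(z(t), z(t - tau)) with forward Euler.

    For t < tau the delayed state is read from the constant initial
    history z(t) = z0, a common choice of DDE initial condition.
    """
    n_delay = int(round(tau / dt))   # delay expressed in whole steps
    history = [z0]                   # states at t = 0, dt, 2*dt, ...
    z = z0
    for k in range(int(round(t_end / dt))):
        # look up z(t - tau); fall back to the initial history if t < tau
        z_delayed = history[k - n_delay] if k >= n_delay else z0
        z = z + dt * f(z, z_delayed)
        history.append(z)
    return z

def toy_field(z, z_delayed):
    # stands in for a trained network f_theta(z(t), z(t - tau))
    return -z_delayed
```

For the scalar linear DDE dz/dt = -z(t - 0.5) with z(0) = 1, `ndde_euler(toy_field, 1.0, 0.5, 1.0, 0.01)` approximates the analytic value z(1) = 0.125; the history lookup is what a plain NODE integrator lacks.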
Related papers
- PhyMPGN: Physics-encoded Message Passing Graph Network for spatiotemporal PDE systems [31.006807854698376]
We propose a new graph learning approach, namely, Physics-encoded Message Passing Graph Network (PhyMPGN)
We incorporate a GNN into a numerical integrator to approximate the temporal marching of spatiotemporal dynamics for a given PDE system.
PhyMPGN is capable of accurately predicting various types of spatiotemporal dynamics on coarse unstructured meshes.
arXiv Detail & Related papers (2024-10-02T08:54:18Z)
- Latent Neural PDE Solver: a reduced-order modelling framework for partial differential equations [6.173339150997772]
We propose to learn the dynamics of the system in the latent space with much coarser discretizations.
A non-linear autoencoder is first trained to project the full-order representation of the system onto the mesh-reduced space.
We showcase that it has competitive accuracy and efficiency compared to the neural PDE solver that operates on full-order space.
arXiv Detail & Related papers (2024-02-27T19:36:27Z)
- Semi-Supervised Learning of Dynamical Systems with Neural Ordinary Differential Equations: A Teacher-Student Model Approach [10.20098335268973]
TS-NODE is the first semi-supervised approach to modeling dynamical systems with NODE.
We show significant performance improvements over a baseline Neural ODE model on multiple dynamical system modeling tasks.
arXiv Detail & Related papers (2023-10-19T19:17:12Z)
- Dynamic Causal Explanation Based Diffusion-Variational Graph Neural Network for Spatio-temporal Forecasting [60.03169701753824]
We propose a novel Dynamic Diffusion-Variational Graph Neural Network (DVGNN) for spatio-temporal forecasting.
The proposed DVGNN model outperforms state-of-the-art approaches and achieves an outstanding Root Mean Squared Error result.
arXiv Detail & Related papers (2023-05-16T11:38:19Z)
- Neural Operator with Regularity Structure for Modeling Dynamics Driven by SPDEs [70.51212431290611]
Stochastic partial differential equations (SPDEs) are significant tools for modeling dynamics in many areas, including atmospheric sciences and physics.
We propose the Neural Operator with Regularity Structure (NORS) which incorporates the feature vectors for modeling dynamics driven by SPDEs.
We conduct experiments on a variety of SPDEs, including the dynamic Phi^4_1 model and the 2d Navier-Stokes equation.
arXiv Detail & Related papers (2022-04-13T08:53:41Z)
- Accelerating Neural ODEs Using Model Order Reduction [0.0]
We show that mathematical model order reduction methods can be used for compressing and accelerating Neural ODEs.
We implement our novel compression method by developing Neural ODEs that integrate the necessary subspace-projection operations as layers of the neural network.
arXiv Detail & Related papers (2021-05-28T19:27:09Z)
- Neural ODE Processes [64.10282200111983]
We introduce Neural ODE Processes (NDPs), a new class of processes determined by a distribution over Neural ODEs.
We show that our model can successfully capture the dynamics of low-dimensional systems from just a few data-points.
arXiv Detail & Related papers (2021-03-23T09:32:06Z)
- Neural Delay Differential Equations [9.077775405204347]
We propose a new class of continuous-depth neural networks with delay, named Neural Delay Differential Equations (NDDEs).
For computing the corresponding gradients, we use the adjoint sensitivity method to obtain the delayed dynamics of the adjoint.
Our results reveal that appropriately articulating the elements of dynamical systems into the network design is truly beneficial in promoting network performance.
arXiv Detail & Related papers (2021-02-22T06:53:51Z)
- Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations.
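The construction in this summary can be sketched with a hypothetical single-neuron example (not the paper's implementation): the state obeys a linear first-order ODE whose leak rate and drive are modulated by a nonlinear sigmoid gate, which keeps the state bounded. For brevity the gate here depends only on the input, whereas in the paper it may also depend on the hidden state.

```python
import math

def ltc_step(x, inp, dt, tau=1.0, A=1.0, w=1.0, b=0.0):
    """One forward-Euler step of a single liquid time-constant neuron:

        dx/dt = -(1/tau + f(inp)) * x + f(inp) * A

    The gate f raises the effective decay rate above the base 1/tau,
    so the state is pulled toward a bounded fixed point in [0, A].
    """
    gate = 1.0 / (1.0 + math.exp(-(w * inp + b)))  # sigmoid gate f(inp)
    dxdt = -(1.0 / tau + gate) * x + gate * A
    return x + dt * dxdt
```

With a constant input, repeated steps converge to the fixed point x* = gate * A / (1/tau + gate), which is strictly below A; this boundedness is the stability property the summary refers to.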
arXiv Detail & Related papers (2020-06-08T09:53:35Z)
- Time Dependence in Non-Autonomous Neural ODEs [74.78386661760662]
We propose a novel family of Neural ODEs with time-varying weights.
We outperform previous Neural ODE variants in both speed and representational capacity.
arXiv Detail & Related papers (2020-05-05T01:41:46Z)
- Stochasticity in Neural ODEs: An Empirical Study [68.8204255655161]
Regularization of neural networks (e.g. dropout) is a widespread technique in deep learning that allows for better generalization.
We show that data augmentation during training improves the performance of both deterministic and stochastic versions of the same model.
However, the improvements obtained by data augmentation eliminate the empirical regularization gains, making the performance difference between neural ODEs and neural SDEs negligible.
arXiv Detail & Related papers (2020-02-22T22:12:56Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it contains and is not responsible for any consequences of its use.