Neural Controlled Differential Equations for Online Prediction Tasks
- URL: http://arxiv.org/abs/2106.11028v1
- Date: Mon, 21 Jun 2021 12:23:45 GMT
- Title: Neural Controlled Differential Equations for Online Prediction Tasks
- Authors: James Morrill, Patrick Kidger, Lingyi Yang, Terry Lyons
- Abstract summary: We show that, as currently implemented, Neural CDEs are not suitable for use in online prediction tasks, where predictions need to be made in real-time.
First, we identify several theoretical conditions that interpolation schemes for Neural CDEs should satisfy, such as boundedness and uniqueness.
Second, we use these to motivate the introduction of new schemes that address these conditions, offering in particular measurability.
Third, we empirically benchmark our online Neural CDE model on three continuous monitoring tasks from the MIMIC-IV medical database.
- Score: 4.6453787256723365
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Neural controlled differential equations (Neural CDEs) are a continuous-time
extension of recurrent neural networks (RNNs), achieving state-of-the-art
(SOTA) performance at modelling functions of irregular time series. In order to
interpret discrete data in continuous time, current implementations rely on
non-causal interpolations of the data. This is fine when the whole time series
is observed in advance, but means that Neural CDEs are not suitable for use in
\textit{online prediction tasks}, where predictions need to be made in
real-time: a major use case for recurrent networks. Here, we show how this
limitation may be rectified. First, we identify several theoretical conditions
that interpolation schemes for Neural CDEs should satisfy, such as boundedness
and uniqueness. Second, we use these to motivate the introduction of new
schemes that address these conditions, offering in particular measurability
(for online prediction), and smoothness (for speed). Third, we empirically
benchmark our online Neural CDE model on three continuous monitoring tasks from
the MIMIC-IV medical database: we demonstrate improved performance on all tasks
against ODE benchmarks, and on two of the three tasks against SOTA non-ODE
benchmarks.
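For intuition, here is a minimal sketch of the idea in plain PyTorch (all names hypothetical; the authors' actual implementation uses proper interpolation schemes and ODE solvers, e.g. via the torchcde library by one of the authors). A Neural CDE evolves a hidden state as dz = f_theta(z) dX, where X is a continuous path interpolating the observations; with an Euler discretisation of a piecewise-linear path, the state at each observation time depends only on data seen so far:

```python
# Minimal Neural CDE sketch (illustrative only, not the paper's implementation).
import torch
import torch.nn as nn


class CDEFunc(nn.Module):
    """Vector field f_theta: maps the hidden state z to a (hidden x input) matrix."""

    def __init__(self, input_dim, hidden_dim):
        super().__init__()
        self.input_dim, self.hidden_dim = input_dim, hidden_dim
        self.net = nn.Sequential(
            nn.Linear(hidden_dim, 128), nn.Tanh(),
            nn.Linear(128, hidden_dim * input_dim), nn.Tanh(),
        )

    def forward(self, z):
        return self.net(z).view(-1, self.hidden_dim, self.input_dim)


def neural_cde_online(func, z0, xs, ts, substeps=10):
    """Integrate dz = f_theta(z) dX with explicit Euler, where X is the linear
    interpolation of observations xs[:, i] at times ts[i] (time itself is
    usually also included as a channel of xs). The state at each observation
    time depends only on data seen so far, so predictions can be read out online."""
    z = z0
    states = [z]
    for i in range(len(ts) - 1):
        # Per-substep increment of the control path X on [ts[i], ts[i+1]]:
        # dX = (x_{i+1} - x_i) / substeps, so dz = f(z) @ dX at each substep.
        dx = (xs[:, i + 1] - xs[:, i]) / substeps
        for _ in range(substeps):
            z = z + torch.bmm(func(z), dx.unsqueeze(-1)).squeeze(-1)
        states.append(z)  # an online prediction can be emitted here
    return torch.stack(states, dim=1)
```

Note that plain linear interpolation still looks one observation ahead between observation times; the schemes proposed in the paper remove that lookahead so the path is causal at every instant.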
Related papers
- Temporal Dynamic Embedding for Irregularly Sampled Time Series [0.15346678870160887]
Temporal dynamic embedding (TDE) enables neural network models to receive data whose number of variables changes over time.
Experiments were conducted on three clinical datasets: PhysioNet 2012, MIMIC-III, and PhysioNet 2019.
arXiv Detail & Related papers (2025-04-08T07:49:22Z)
- Trajectory Flow Matching with Applications to Clinical Time Series Modeling [77.58277281319253]
Trajectory Flow Matching (TFM) trains a Neural SDE in a simulation-free manner, bypassing backpropagation through the dynamics.
We demonstrate improved performance on three clinical time series datasets in terms of absolute performance and uncertainty prediction.
arXiv Detail & Related papers (2024-10-28T15:54:50Z)
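The "simulation-free" training in the TFM entry above can be illustrated with a generic conditional flow-matching objective (a sketch under the assumption of straight-line interpolants; names hypothetical, and not TFM's exact loss): the vector field is fit by plain regression, so no backpropagation through an SDE/ODE solver is needed.

```python
# Generic conditional flow-matching training step (illustrative sketch).
import torch


def flow_matching_loss(v_net, x0, x1):
    """Sample t, build the straight-line interpolant x_t = (1 - t) * x0 + t * x1,
    and regress v_net(t, x_t) onto its known velocity (x1 - x0).
    No solver appears anywhere in the training loop."""
    t = torch.rand(x0.shape[0], 1)
    x_t = (1 - t) * x0 + t * x1
    target = x1 - x0  # d/dt of the interpolant, known in closed form
    return ((v_net(t, x_t) - target) ** 2).mean()
```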
- How neural networks learn to classify chaotic time series [77.34726150561087]
We study the inner workings of neural networks trained to classify regular-versus-chaotic time series.
We find that the relation between input periodicity and activation periodicity is key for the performance of LKCNN models.
arXiv Detail & Related papers (2023-06-04T08:53:27Z)
- Networked Time Series Imputation via Position-aware Graph Enhanced Variational Autoencoders [31.953958053709805]
We design a new model named PoGeVon, which leverages a variational autoencoder (VAE) to predict missing values over both node time series features and graph structures.
Experimental results demonstrate the effectiveness of our model over baselines.
arXiv Detail & Related papers (2023-05-29T21:11:34Z)
- Continuous time recurrent neural networks: overview and application to forecasting blood glucose in the intensive care unit [56.801856519460465]
Continuous time autoregressive recurrent neural networks (CTRNNs) are deep learning models that account for irregular observations.
We demonstrate the application of these models to probabilistic forecasting of blood glucose in a critical care setting.
arXiv Detail & Related papers (2023-04-14T09:39:06Z)
- Brain-Inspired Spiking Neural Network for Online Unsupervised Time Series Prediction [13.521272923545409]
We present a novel Continuous Learning-based Unsupervised Recurrent Spiking Neural Network Model (CLURSNN).
CLURSNN makes online predictions by reconstructing the underlying dynamical system using Random Delay Embedding.
We show that the proposed online time series prediction methodology outperforms state-of-the-art DNN models when predicting an evolving Lorenz63 dynamical system.
arXiv Detail & Related papers (2023-04-10T16:18:37Z)
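The Random Delay Embedding mentioned above builds on Takens-style state-space reconstruction; below is a minimal sketch of the standard (uniform) version, with hypothetical embedding parameters:

```python
# Standard delay embedding (the paper uses a randomized variant).
import numpy as np


def delay_embed(x, dim=3, tau=5):
    """Takens-style delay embedding of a scalar series x: row t is
    (x[t], x[t + tau], ..., x[t + (dim - 1) * tau]). Nearest neighbours in
    this reconstructed state space can then be used to predict future values."""
    n = len(x) - (dim - 1) * tau
    return np.stack([x[i * tau: i * tau + n] for i in range(dim)], axis=1)
```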
- Uncovering the Missing Pattern: Unified Framework Towards Trajectory Imputation and Prediction [60.60223171143206]
Trajectory prediction is a crucial undertaking in understanding entity movement or human behavior from observed sequences.
Current methods often assume that the observed sequences are complete while ignoring the potential for missing values.
This paper presents a unified framework, the Graph-based Conditional Variational Recurrent Neural Network (GC-VRNN), which can perform trajectory imputation and prediction simultaneously.
arXiv Detail & Related papers (2023-03-28T14:27:27Z)
- STING: Self-attention based Time-series Imputation Networks using GAN [4.052758394413726]
We propose STING (Self-attention based Time-series Imputation Networks using GAN).
We take advantage of generative adversarial networks and bidirectional recurrent neural networks to learn latent representations of the time series.
Experimental results on three real-world datasets demonstrate that STING outperforms the existing state-of-the-art methods in terms of imputation accuracy.
arXiv Detail & Related papers (2022-09-22T06:06:56Z)
- Neural ODE Processes [64.10282200111983]
We introduce Neural ODE Processes (NDPs), a new class of processes determined by a distribution over Neural ODEs.
We show that our model can successfully capture the dynamics of low-dimensional systems from just a few data-points.
arXiv Detail & Related papers (2021-03-23T09:32:06Z)
- Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations.
arXiv Detail & Related papers (2020-06-08T09:53:35Z)
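As a sketch of what a "network of linear first-order dynamical systems" means here, below is one fused Euler update of a liquid time-constant cell, following the update rule described in the LTC paper (the names and the choice of the nonlinearity f are illustrative assumptions):

```python
# One fused explicit/implicit Euler step of a liquid time-constant cell (sketch).
def ltc_fused_step(x, inputs, f, tau, A, dt=0.1):
    """Advances the linear first-order system
        dx/dt = -(1/tau + f(x, inputs)) * x + f(x, inputs) * A,
    whose effective time constant varies with the input, hence "liquid".
    Works elementwise on floats, NumPy arrays, or torch tensors."""
    fx = f(x, inputs)
    return (x + dt * fx * A) / (1.0 + dt * (1.0 / tau + fx))
```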
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented (including all content) and is not responsible for any consequences of its use.