Attentive Neural Controlled Differential Equations for Time-series
Classification and Forecasting
- URL: http://arxiv.org/abs/2109.01876v1
- Date: Sat, 4 Sep 2021 14:17:01 GMT
- Title: Attentive Neural Controlled Differential Equations for Time-series
Classification and Forecasting
- Authors: Sheo Yon Jhin, Heejoo Shin, Seoyoung Hong, Solhee Park, Noseong Park
- Abstract summary: We present the method of Attentive Neural Controlled Differential Equations (ANCDEs) for time-series classification and forecasting.
Our method consistently shows the best accuracy in all cases by non-trivial margins.
Our visualizations also show that the presented attention mechanism works as intended by focusing on crucial information.
- Score: 3.673363968661099
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Neural networks inspired by differential equations have proliferated for the
past several years. Neural ordinary differential equations (NODEs) and neural
controlled differential equations (NCDEs) are two representative examples of
them. In theory, NCDEs provide better representation learning capability for
time-series data than NODEs. In particular, it is known that NCDEs are suitable
for processing irregular time-series data. While NODEs have been successfully
extended with attention, it has not yet been studied how to integrate
attention into NCDEs. To this end, we present the method of Attentive
Neural Controlled Differential Equations (ANCDEs) for time-series
classification and forecasting, where dual NCDEs are used: one for generating
attention values, and the other for evolving hidden vectors for a downstream
machine learning task. We conduct experiments with three real-world time-series
datasets and 10 baselines. After dropping some values, we also conduct
irregular time-series experiments. Our method consistently shows the best
accuracy in all cases by non-trivial margins. Our visualizations also show that
the presented attention mechanism works as intended by focusing on crucial
information.
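
As a minimal sketch of the dual-NCDE idea described in the abstract, the snippet below wires two CDEs together in plain PyTorch with a fixed-step Euler solver: a bottom CDE evolves per-channel attention values over the raw control path, and a top CDE evolves the hidden state along the attention-gated path. All names, layer sizes, and the solver choice are illustrative assumptions, not the authors' exact formulation.

    import torch
    import torch.nn as nn

    class CDEFunc(nn.Module):
        # Vector field f(z): maps a state to a (state_dim x input_dim) matrix.
        def __init__(self, state_dim, input_dim):
            super().__init__()
            self.net = nn.Sequential(nn.Linear(state_dim, 64), nn.Tanh(),
                                     nn.Linear(64, state_dim * input_dim))
            self.state_dim, self.input_dim = state_dim, input_dim

        def forward(self, z):
            return self.net(z).view(-1, self.state_dim, self.input_dim)

    def euler_cde(func, z0, path, return_traj=False):
        # Integrate dz = f(z) dX along a piecewise-linear control path of
        # shape (batch, steps, input_dim), one Euler step per segment.
        z, traj = z0, [z0]
        for k in range(path.size(1) - 1):
            dX = (path[:, k + 1] - path[:, k]).unsqueeze(-1)
            z = z + torch.bmm(func(z), dX).squeeze(-1)
            traj.append(z)
        return torch.stack(traj, dim=1) if return_traj else z

    class ANCDESketch(nn.Module):
        def __init__(self, input_dim, hidden_dim, num_classes):
            super().__init__()
            self.attn_func = CDEFunc(input_dim, input_dim)   # bottom NCDE
            self.main_func = CDEFunc(hidden_dim, input_dim)  # top NCDE
            self.embed = nn.Linear(input_dim, hidden_dim)
            self.readout = nn.Linear(hidden_dim, num_classes)

        def forward(self, path):
            # Bottom NCDE: evolve attention over the raw path, squash to (0, 1).
            a = torch.sigmoid(euler_cde(self.attn_func, path[:, 0], path,
                                        return_traj=True))
            # Top NCDE: run on the attention-gated path.
            z = euler_cde(self.main_func, self.embed(path[:, 0]), a * path)
            return self.readout(z)

    logits = ANCDESketch(3, 32, 5)(torch.randn(8, 50, 3))  # smoke test

A production version would first interpolate the raw observations into a differentiable path (e.g., cubic splines) and use an adaptive ODE solver rather than fixed Euler steps.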
Related papers
- Trajectory Flow Matching with Applications to Clinical Time Series Modeling [77.58277281319253]
Trajectory Flow Matching (TFM) trains a Neural SDE in a simulation-free manner, bypassing backpropagation through the dynamics.
We demonstrate improved performance on three clinical time series datasets in terms of absolute performance and uncertainty prediction.
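
The "simulation-free" claim can be illustrated with a generic (conditional) flow-matching loss: the vector field is regressed onto the velocity of straight-line interpolants between paired states, so no solver appears in the training loop. This is a sketch of the general recipe, not the exact TFM objective; field, x0, and x1 are placeholder names.

    import torch

    def flow_matching_loss(field, x0, x1):
        # field: module mapping (t, x) -> velocity; x0, x1: (batch, dim)
        t = torch.rand(x0.size(0), 1)       # t ~ U(0, 1)
        xt = (1 - t) * x0 + t * x1          # straight-line interpolant x_t
        target = x1 - x0                    # its constant velocity
        return ((field(t, xt) - target) ** 2).mean()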
arXiv Detail & Related papers (2024-10-28T15:54:50Z) - PDETime: Rethinking Long-Term Multivariate Time Series Forecasting from
the perspective of partial differential equations [49.80959046861793]
We present PDETime, a novel LMTF model inspired by the principles of Neural PDE solvers.
Our experimentation across seven diverse real-world LMTF datasets reveals that PDETime adapts effectively to the intrinsic nature of the data.
arXiv Detail & Related papers (2024-02-25T17:39:44Z) - Continuous Depth Recurrent Neural Differential Equations [0.0]
We propose continuous depth recurrent neural differential equations (CDR-NDE) to generalize RNN models.
CDR-NDE considers two separate differential equations, one over the temporal dimension and one over the depth dimension, and models the evolution in both directions.
We also propose the CDR-NDE-heat model based on partial differential equations which treats the computation of hidden states as solving a heat equation over time.
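
A rough picture of the heat-equation view: treat the hidden states along the sequence as a 1-D rod and diffuse them as depth advances, with an input-driven source term. The explicit finite-difference step below is an illustrative assumption (including kappa and the source network), not the paper's exact discretization.

    import torch

    def heat_step(h, inputs, source, kappa=0.1):
        # h: (batch, seq_len, dim); discrete Laplacian along the time axis.
        lap = torch.zeros_like(h)
        lap[:, 1:-1] = h[:, :-2] - 2 * h[:, 1:-1] + h[:, 2:]
        return h + kappa * lap + source(inputs)  # advance one unit of depth

    # usage: h = embed(x); for _ in range(depth): h = heat_step(h, x, src)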
arXiv Detail & Related papers (2022-12-28T06:34:32Z) - EXIT: Extrapolation and Interpolation-based Neural Controlled
Differential Equations for Time-series Classification and Forecasting [19.37382379378985]
Neural controlled differential equations (NCDEs) are considered a breakthrough in deep learning.
In this work, we enhance NCDEs by redesigning their core part, i.e., generating a continuous path from a discrete time-series input.
Our NCDE design can use both the interpolated and the extrapolated information for downstream machine learning tasks.
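
To make the interpolation/extrapolation distinction concrete, the helper below builds a continuous control path from discrete samples: linear interpolation inside the observed range and constant-slope extrapolation beyond it. EXIT learns this path; the fixed rule here only illustrates the interface an NCDE consumes, and all names are assumptions.

    import numpy as np

    def control_path(ts, xs):
        # ts: (n,) increasing observation times; xs: (n, d) observed values.
        def X(t):
            if t <= ts[0]:                   # extrapolate before the range
                slope = (xs[1] - xs[0]) / (ts[1] - ts[0])
                return xs[0] + slope * (t - ts[0])
            if t >= ts[-1]:                  # extrapolate past the range
                slope = (xs[-1] - xs[-2]) / (ts[-1] - ts[-2])
                return xs[-1] + slope * (t - ts[-1])
            i = np.searchsorted(ts, t) - 1   # interpolate inside the range
            w = (t - ts[i]) / (ts[i + 1] - ts[i])
            return (1 - w) * xs[i] + w * xs[i + 1]
        return X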
arXiv Detail & Related papers (2022-04-19T09:37:36Z) - Graph Neural Controlled Differential Equations for Traffic Forecasting [4.012886243094023]
Traffic forecasting is one of the most popular spatio-temporal tasks in the field of machine learning.
In this paper, we present the method of graph neural controlled differential equations (NCDEs).
We extend the concept of NCDEs to design two of them: one for temporal processing and the other for spatial processing.
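
One way to picture the spatial NCDE: its vector field first mixes node states through a normalized adjacency matrix, then produces the usual hidden-by-input matrix, while the temporal NCDE works as in the ANCDE sketch above. Shapes and the parameterization are illustrative assumptions.

    import torch
    import torch.nn as nn

    class SpatialCDEFunc(nn.Module):
        def __init__(self, hidden_dim, input_dim, adj):
            super().__init__()
            self.adj = adj                    # (num_nodes, num_nodes), normalized
            self.lin = nn.Linear(hidden_dim, hidden_dim * input_dim)
            self.hidden_dim, self.input_dim = hidden_dim, input_dim

        def forward(self, z):
            # z: (batch, num_nodes, hidden_dim); aggregate neighbors first.
            z = torch.einsum('uv,bvh->buh', self.adj, z)
            out = torch.tanh(self.lin(z))
            return out.view(*z.shape[:2], self.hidden_dim, self.input_dim)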
arXiv Detail & Related papers (2021-12-07T08:14:10Z) - Time Series Forecasting with Ensembled Stochastic Differential Equations
Driven by Lévy Noise [2.3076895420652965]
We use a collection of SDEs equipped with neural networks to predict the long-term trend of noisy time series.
Our contributions are, first, we use the phase space reconstruction method to extract the intrinsic dimension of the time-series data.
Second, we explore SDEs driven by $\alpha$-stable Lévy motion to model the time-series data and solve the problem through neural network approximation.
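
Both ingredients are easy to sketch: a Takens-style delay embedding for the phase-space reconstruction, and $\alpha$-stable increments (via scipy) as the heavy-tailed driving noise. The embedding dimension m and lag tau would in practice be chosen by standard criteria such as false nearest neighbors; the values here are placeholders.

    import numpy as np
    from scipy.stats import levy_stable

    def delay_embed(x, m, tau):
        # x: (n,) scalar series -> (n - (m-1)*tau, m) reconstructed states.
        n = len(x) - (m - 1) * tau
        return np.stack([x[i:i + n] for i in range(0, m * tau, tau)], axis=1)

    states = delay_embed(np.sin(np.linspace(0, 50, 1000)), m=3, tau=10)
    noise = levy_stable.rvs(alpha=1.5, beta=0.0, size=999)  # heavy-tailed increments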
arXiv Detail & Related papers (2021-11-25T16:49:01Z) - Consistency of mechanistic causal discovery in continuous-time using
Neural ODEs [85.7910042199734]
We consider causal discovery in continuous-time for the study of dynamical systems.
We propose a causal discovery algorithm based on penalized Neural ODEs.
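
The penalization idea can be sketched as group sparsity on the vector field's first layer: if the input column for variable j is driven to zero, variable j influences nothing, so no causal edge leaves it. The penalty weight lam and the architecture are assumptions, not the paper's exact estimator.

    import torch
    import torch.nn as nn

    d = 5  # number of observed variables (placeholder)
    field = nn.Sequential(nn.Linear(d, 64), nn.Tanh(), nn.Linear(64, d))

    def group_sparsity_penalty(field, lam=1e-2):
        W = field[0].weight                  # (64, d); column j reads variable j
        return lam * W.norm(dim=0).sum()     # group lasso over input columns

    # total loss: trajectory-fitting term for the Neural ODE
    #             + group_sparsity_penalty(field)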
arXiv Detail & Related papers (2021-05-06T08:48:02Z) - Neural ODE Processes [64.10282200111983]
We introduce Neural ODE Processes (NDPs), a new class of processes determined by a distribution over Neural ODEs.
We show that our model can successfully capture the dynamics of low-dimensional systems from just a few data-points.
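
The construction resembles a Neural Process with an ODE decoder: context points are encoded into a latent distribution, and each sampled latent conditions the ODE vector field, so one sample corresponds to one plausible dynamics. The sketch below shows only the conditioned vector field; shapes and names are assumptions.

    import torch
    import torch.nn as nn

    class LatentODEField(nn.Module):
        def __init__(self, state_dim, latent_dim):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(state_dim + latent_dim, 64), nn.Tanh(),
                nn.Linear(64, state_dim))

        def forward(self, h, z):
            # dh/dt conditioned on a latent z ~ q(z | context points)
            return self.net(torch.cat([h, z], dim=-1))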
arXiv Detail & Related papers (2021-03-23T09:32:06Z) - Learning Continuous-Time Dynamics by Stochastic Differential Networks [32.63114111531396]
We propose a flexible continuous-time recurrent neural network named Variational Differential Networks (VSDN)
VSDN embeds the complicated dynamics of the sporadic time series by neural Differential Equations (SDE)
We show that VSDNs outperform state-of-the-art continuous-time deep learning models and achieve remarkable performance on prediction and tasks for sporadic time series.
arXiv Detail & Related papers (2020-06-11T01:40:34Z) - Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems modulated via nonlinear interlinked gates.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations.
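
The published LTC dynamics are dx/dt = -[1/tau + f(x, I)] * x + f(x, I) * A, where f is a learned gate and A a learned bias vector; the paper describes a fused ODE solver, and the semi-implicit Euler step below is a sketch of that idea. Layer sizes and the step size dt are assumptions.

    import torch
    import torch.nn as nn

    class LTCCell(nn.Module):
        def __init__(self, input_dim, hidden_dim):
            super().__init__()
            self.gate = nn.Sequential(
                nn.Linear(input_dim + hidden_dim, hidden_dim), nn.Sigmoid())
            self.tau = nn.Parameter(torch.ones(hidden_dim))
            self.A = nn.Parameter(torch.zeros(hidden_dim))

        def forward(self, x, I, dt=0.1):
            # x: hidden state (batch, hidden); I: input (batch, input_dim).
            f = self.gate(torch.cat([x, I], dim=-1))
            # Fused semi-implicit Euler step of the LTC ODE.
            return (x + dt * f * self.A) / (1 + dt * (1.0 / self.tau + f))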
arXiv Detail & Related papers (2020-06-08T09:53:35Z) - Stochasticity in Neural ODEs: An Empirical Study [68.8204255655161]
Regularization of neural networks (e.g. dropout) is a widespread technique in deep learning that allows for better generalization.
We show that data augmentation during training improves the performance of both the deterministic and the stochastic versions of the same model.
However, the improvements obtained by data augmentation completely eliminate the empirical gains of stochastic regularization, making the difference in performance between neural ODEs and neural SDEs negligible.
arXiv Detail & Related papers (2020-02-22T22:12:56Z)