EXIT: Extrapolation and Interpolation-based Neural Controlled
Differential Equations for Time-series Classification and Forecasting
- URL: http://arxiv.org/abs/2204.08771v1
- Date: Tue, 19 Apr 2022 09:37:36 GMT
- Title: EXIT: Extrapolation and Interpolation-based Neural Controlled
Differential Equations for Time-series Classification and Forecasting
- Authors: Sheo Yon Jhin, Jaehoon Lee, Minju Jo, Seungji Kook, Jinsung Jeon,
Jihyeon Hyeong, Jayoung Kim, Noseong Park
- Abstract summary: Neural controlled differential equations (NCDEs) are considered a breakthrough in deep learning.
In this work, we enhance NCDEs by redesigning their core part, i.e., generating a continuous path from a discrete time-series input.
Our NCDE design can use both the interpolated and the extrapolated information for downstream machine learning tasks.
- Score: 19.37382379378985
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Deep learning inspired by differential equations is a recent research
trend and has achieved state-of-the-art performance on many machine learning
tasks. Among them, time-series modeling with neural controlled differential
equations (NCDEs) is considered a breakthrough. In many cases, NCDE-based
models not only provide better accuracy than recurrent neural networks (RNNs)
but also make it possible to process irregular time-series. In this work, we
enhance NCDEs by redesigning their core part, i.e., generating a continuous
path from a discrete time-series input. NCDEs typically use interpolation
algorithms to convert discrete time-series samples to continuous paths.
However, we propose to i) generate another latent continuous path using an
encoder-decoder architecture, which corresponds to the interpolation process of
NCDEs, i.e., our neural network-based interpolation vs. the existing explicit
interpolation, and ii) exploit the generative characteristic of the decoder,
i.e., extrapolation beyond the time domain of original data if needed.
Therefore, our NCDE design can use both the interpolated and the extrapolated
information for downstream machine learning tasks. In our experiments with 5
real-world datasets and 12 baselines, our extrapolation and interpolation-based
NCDEs outperform existing baselines by non-trivial margins.
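To make the pipeline the abstract describes concrete, below is a minimal pure-PyTorch sketch of the vanilla NCDE backbone: discrete observations are turned into a continuous control path X (here plain linear interpolation, which EXIT replaces with a learned encoder-decoder path), and a hidden state is evolved by z(T) = z(0) + \int f_theta(z(t)) dX(t). The hidden size, the fixed-step Euler solver, and all module names are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

class CDEFunc(nn.Module):
    """Vector field f_theta: maps the hidden state z to a (hidden x input)
    matrix, so that dz = f_theta(z) dX."""
    def __init__(self, input_dim, hidden_dim):
        super().__init__()
        self.input_dim, self.hidden_dim = input_dim, hidden_dim
        self.net = nn.Sequential(
            nn.Linear(hidden_dim, 64), nn.Tanh(),
            nn.Linear(64, hidden_dim * input_dim), nn.Tanh(),
        )

    def forward(self, z):
        return self.net(z).view(-1, self.hidden_dim, self.input_dim)

def ncde_forward(x, func, z0, substeps=5):
    """Euler integration of z' = f(z) X' with X the linear interpolation of
    the observations x: (batch, seq_len, input_dim). Under linear
    interpolation, dX between consecutive observations is just their
    difference, split into `substeps` equal increments. (Illustrative; real
    NCDEs use adaptive ODE solvers and smoother interpolants such as cubic
    splines.)"""
    z = z0
    for i in range(x.size(1) - 1):
        dx = (x[:, i + 1] - x[:, i]) / substeps   # per-step path increment
        for _ in range(substeps):
            z = z + torch.bmm(func(z), dx.unsqueeze(-1)).squeeze(-1)
    return z  # z(T), fed to a downstream classifier or forecaster

batch, seq_len, input_dim, hidden_dim = 4, 10, 3, 8
x = torch.randn(batch, seq_len, input_dim)        # a discrete time series
func = CDEFunc(input_dim, hidden_dim)
z0 = torch.zeros(batch, hidden_dim)
print(ncde_forward(x, func, z0).shape)            # torch.Size([4, 8])
```

EXIT's change, per the abstract, is to replace the fixed interpolant X with a latent path produced by an encoder-decoder, whose decoder can also be rolled past the last observation to extrapolate.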
Related papers
- PDETime: Rethinking Long-Term Multivariate Time Series Forecasting from
the perspective of partial differential equations [49.80959046861793]
We present PDETime, a novel LMTF model inspired by the principles of Neural PDE solvers.
Our experimentation across seven diverse temporal real-world LMTF datasets reveals that PDETime adapts effectively to the intrinsic nature of the data.
arXiv Detail & Related papers (2024-02-25T17:39:44Z) - Time-Parameterized Convolutional Neural Networks for Irregularly Sampled
Time Series [26.77596449192451]
Irregularly sampled time series are ubiquitous in several application domains, leading to sparse, incompletely observed, and non-aligned observations.
Standard recurrent neural networks (RNNs) and convolutional neural networks (CNNs) assume regular spacing between observation times, which poses significant challenges for irregular time-series modeling.
We parameterize convolutional layers with kernels that are explicit functions of time, so irregular gaps between observations reshape the filters (a rough sketch of this idea follows the related-papers list below).
arXiv Detail & Related papers (2023-08-06T21:10:30Z) - Convolutional Neural Operators for robust and accurate learning of PDEs [11.562748612983956]
We present novel adaptations for convolutional neural networks to process functions as inputs and outputs.
The resulting architecture is termed convolutional neural operators (CNOs).
We prove a universality theorem to show that CNOs can approximate operators arising in PDEs to desired accuracy.
arXiv Detail & Related papers (2023-02-02T15:54:45Z) - Learnable Path in Neural Controlled Differential Equations [11.38331901271794]
Neural controlled differential equations (NCDEs) are a specialized model in (irregular) time-series processing.
We present a method to generate another latent path, which is identical to learning an appropriate interpolation method.
We design an encoder-decoder module based on NCDEs and NODEs, and a special training method for it.
arXiv Detail & Related papers (2023-01-11T07:05:27Z) - Continuous Depth Recurrent Neural Differential Equations [0.0]
We propose continuous depth recurrent neural differential equations (CDR-NDE) to generalize RNN models.
CDR-NDE considers two separate differential equations, one over the temporal dimension and one over the depth dimension, and models the evolution in both directions.
We also propose the CDR-NDE-heat model based on partial differential equations which treats the computation of hidden states as solving a heat equation over time.
arXiv Detail & Related papers (2022-12-28T06:34:32Z) - Intelligence Processing Units Accelerate Neuromorphic Learning [52.952192990802345]
Spiking neural networks (SNNs) can achieve orders-of-magnitude improvements in energy consumption and latency.
We present an IPU-optimized release of our custom SNN Python package, snnTorch.
arXiv Detail & Related papers (2022-11-19T15:44:08Z) - Semi-supervised Learning of Partial Differential Operators and Dynamical
Flows [68.77595310155365]
We present a novel method that combines a hyper-network solver with a Fourier Neural Operator architecture.
We test our method on various time evolution PDEs, including nonlinear fluid flows in one, two, and three spatial dimensions.
The results show that the new method improves the learning accuracy at the supervised time point and is able to interpolate the solutions to any intermediate time.
arXiv Detail & Related papers (2022-07-28T19:59:14Z) - Attentive Neural Controlled Differential Equations for Time-series
Classification and Forecasting [3.673363968661099]
We present the method of Attentive Neural Controlled Differential Equations (ANCDEs) for time-series classification and forecasting.
Our method consistently shows the best accuracy in all cases by non-trivial margins.
Our visualizations also show that the presented attention mechanism works as intended by focusing on crucial information.
arXiv Detail & Related papers (2021-09-04T14:17:01Z) - Neural ODE Processes [64.10282200111983]
We introduce Neural ODE Processes (NDPs), a new class of processes determined by a distribution over Neural ODEs.
We show that our model can successfully capture the dynamics of low-dimensional systems from just a few data-points.
arXiv Detail & Related papers (2021-03-23T09:32:06Z) - Multipole Graph Neural Operator for Parametric Partial Differential
Equations [57.90284928158383]
One of the main challenges in using deep learning-based methods for simulating physical systems is formulating physics-based data in the desired structure.
We propose a novel multi-level graph neural network framework that captures interaction at all ranges with only linear complexity.
Experiments confirm our multi-graph network learns discretization-invariant solution operators to PDEs and can be evaluated in linear time.
arXiv Detail & Related papers (2020-06-16T21:56:22Z) - Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems modulated via nonlinear interlinked gates.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations (a sketch of these dynamics follows the list below).
arXiv Detail & Related papers (2020-06-08T09:53:35Z)
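As noted next to the Time-Parameterized CNN entry above, one way to read "kernels as explicit functions of time" is a convolution whose weights are generated from the observation-time offsets inside each window, so irregular gaps reshape the filter. This is a sketch under our own assumptions (the window-wise MLP, its sizes, and all names are hypothetical), not the TPCNN architecture itself.

```python
import torch
import torch.nn as nn

class TimeParamConv(nn.Module):
    """Illustrative time-parameterized 1D convolution: an MLP maps the time
    offsets inside each window to that window's kernel weights."""
    def __init__(self, channels, kernel_size):
        super().__init__()
        self.kernel_size, self.channels = kernel_size, channels
        self.weight_net = nn.Sequential(
            nn.Linear(kernel_size, 32), nn.ReLU(),
            nn.Linear(32, kernel_size * channels),
        )

    def forward(self, x, t):
        # x: (batch, seq_len, channels); t: (batch, seq_len) irregular timestamps
        batch, seq_len, _ = x.shape
        k = self.kernel_size
        out = []
        for i in range(seq_len - k + 1):
            offsets = t[:, i:i + k] - t[:, i:i + 1]       # gaps within the window
            w = self.weight_net(offsets).view(batch, k, self.channels)
            out.append((w * x[:, i:i + k]).sum(dim=1))    # time-aware weighted sum
        return torch.stack(out, dim=1)                    # (batch, seq_len-k+1, channels)

x = torch.randn(2, 20, 4)
t, _ = torch.sort(torch.rand(2, 20), dim=1)  # irregular, increasing timestamps
y = TimeParamConv(channels=4, kernel_size=3)(x, t)
print(y.shape)  # torch.Size([2, 18, 4])
```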
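And for the Liquid Time-constant entry: the LTC paper's hidden-state ODE is dx/dt = -[1/tau + f(x, I; theta)] x + f(x, I; theta) A, i.e. a linear first-order system whose effective time constant is modulated by a nonlinear gate f. A minimal sketch with a fixed-step Euler solver follows; the gate network, sizes, and solver are our assumptions, not the paper's exact configuration.

```python
import torch
import torch.nn as nn

class LTCCell(nn.Module):
    """One Euler step of the liquid time-constant dynamics:
    dx/dt = -(1/tau + f(x, I)) * x + f(x, I) * A."""
    def __init__(self, input_dim, hidden_dim):
        super().__init__()
        self.gate = nn.Sequential(  # nonlinear gate f(x, I) in (0, 1)
            nn.Linear(input_dim + hidden_dim, hidden_dim), nn.Sigmoid(),
        )
        self.log_tau = nn.Parameter(torch.zeros(hidden_dim))  # base time constants
        self.A = nn.Parameter(torch.randn(hidden_dim))        # bias/reversal term

    def forward(self, x, inp, dt=0.1):
        f = self.gate(torch.cat([x, inp], dim=-1))
        dxdt = -(1.0 / self.log_tau.exp() + f) * x + f * self.A
        return x + dt * dxdt

cell = LTCCell(input_dim=3, hidden_dim=8)
x = torch.zeros(2, 8)
for inp in torch.randn(5, 2, 3):  # a length-5 input sequence
    x = cell(x, inp)
print(x.shape)  # torch.Size([2, 8])
```

Because the gated decay term -(1/tau + f) x always pulls the state back toward a bounded region, these dynamics stay stable, which is the bounded-behavior property the summary mentions.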
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information and is not responsible for any consequences of its use.