Neural ODEs for Informative Missingness in Multivariate Time Series
- URL: http://arxiv.org/abs/2005.10693v1
- Date: Wed, 20 May 2020 00:28:30 GMT
- Title: Neural ODEs for Informative Missingness in Multivariate Time Series
- Authors: Mansura Habiba, Barak A. Pearlmutter
- Abstract summary: Practical applications, e.g., sensor data, healthcare, and weather, generate data that is in truth continuous in time.
A deep learning model called GRU-D is one early attempt to address informative missingness in time series data.
A new family of neural networks called Neural ODEs is natural and efficient for processing time series data that is continuous in time.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Informative missingness is unavoidable in the digital processing of
continuous time series, where the values of one or more observations at
different time points are missing. Such missing observations are one of the
major limitations of time series processing using deep learning. Practical
applications, e.g., sensor data, healthcare, and weather, generate data that is
in truth continuous in time, and informative missingness is a common phenomenon in
these datasets. These datasets often consist of multiple variables, and often
there are missing values for one or many of these variables. This
characteristic makes time series prediction more challenging, and the impact of
missing input observations on the accuracy of the final output can be
significant. A recent novel deep learning model called GRU-D is one early
attempt to address informative missingness in time series data. On the other
hand, a new family of neural networks called Neural ODEs (Ordinary Differential
Equations) are natural and efficient for processing time series data which is
continuous in time. In this paper, a deep learning model is proposed that
leverages the effective imputation of GRU-D, and the temporal continuity of
Neural ODEs. A time series classification task performed on the PhysioNet
dataset demonstrates the performance of this architecture.
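The combination described in the abstract can be illustrated with a minimal, self-contained sketch: GRU-D-style decay imputation for missing values feeding a hidden state that evolves continuously between observations, integrated here with explicit Euler steps. This is an illustrative scalar toy under stated assumptions, not the paper's actual architecture; the decay rate `gamma`, the dynamics `f`, and all constants are hypothetical.

```python
import math

def decay_impute(x, last_obs, mean, delta, gamma=1.0):
    """GRU-D-style imputation: if x is missing, decay the last observed
    value toward the empirical mean as the gap delta since it grows.
    gamma is an assumed (normally learned) decay rate."""
    if x is not None:
        return x  # observed value is used directly
    w = math.exp(-gamma * delta)  # decay weight in (0, 1]
    return w * last_obs + (1.0 - w) * mean

def f(h, x):
    """Toy hidden-state dynamics dh/dt = tanh(a*h + b*x); a Neural ODE
    would parameterize this with a trained network."""
    return math.tanh(-0.5 * h + 0.8 * x)

def ode_step(h, x, dt, n_steps=4):
    """Evolve h across a time gap dt with explicit Euler steps, standing
    in for a proper ODE solver."""
    step = dt / n_steps
    for _ in range(n_steps):
        h = h + step * f(h, x)
    return h

def run(series, mean):
    """series: list of (t, value_or_None) in time order.
    Returns the final hidden state after all observations."""
    h, last_obs, last_t = 0.0, mean, 0.0
    for t, x in series:
        delta = t - last_t
        x_hat = decay_impute(x, last_obs, mean, delta)
        h = ode_step(h, x_hat, max(delta, 1e-6))
        if x is not None:
            last_obs, last_t = x, t
    return h
```

The design point the sketch makes is the division of labor: the decay term handles informative missingness at the input (a missing value carries information about how long ago the variable was last seen), while the ODE integration handles irregular gaps between time stamps, so no fixed sampling grid is assumed.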
Related papers
- Recent Trends in Modelling the Continuous Time Series using Deep Learning: A Survey
Continuous-time series are essential for different modern application areas, e.g. healthcare, automobiles, energy, finance, and the Internet of Things (IoT).
This paper describes the general problem domain of time series and reviews the challenges of modelling continuous time series.
arXiv Detail & Related papers (2024-09-13T14:19:44Z)
- Graph Spatiotemporal Process for Multivariate Time Series Anomaly Detection with Missing Values
We introduce a novel framework called GST-Pro, which utilizes a graph spatiotemporal process and an anomaly scorer to detect anomalies.
Our experimental results show that the GST-Pro method can effectively detect anomalies in time series data and outperforms state-of-the-art methods.
arXiv Detail & Related papers (2024-01-11T10:10:16Z)
- Correlation-aware Spatial-Temporal Graph Learning for Multivariate Time-series Anomaly Detection
We propose a correlation-aware spatial-temporal graph learning (termed CST-GL) for time series anomaly detection.
CST-GL explicitly captures the pairwise correlations via a multivariate time series correlation learning module.
A novel anomaly scoring component is further integrated into CST-GL to estimate the degree of an anomaly in a purely unsupervised manner.
arXiv Detail & Related papers (2023-07-17T11:04:27Z)
- STING: Self-attention based Time-series Imputation Networks using GAN
STING (Self-attention based Time-series Imputation Networks using GAN) is proposed.
We take advantage of generative adversarial networks and bidirectional recurrent neural networks to learn latent representations of the time series.
Experimental results on three real-world datasets demonstrate that STING outperforms the existing state-of-the-art methods in terms of imputation accuracy.
arXiv Detail & Related papers (2022-09-22T06:06:56Z)
- HyperTime: Implicit Neural Representation for Time Series
Implicit neural representations (INRs) have recently emerged as a powerful tool that provides an accurate and resolution-independent encoding of data.
In this paper, we analyze the representation of time series using INRs, comparing different activation functions in terms of reconstruction accuracy and training convergence speed.
We propose a hypernetwork architecture that leverages INRs to learn a compressed latent representation of an entire time series dataset.
arXiv Detail & Related papers (2022-08-11T14:05:51Z)
- Multivariate Time Series Forecasting with Dynamic Graph Neural ODEs
We propose a continuous model to forecast Multivariate Time series with dynamic Graph neural Ordinary Differential Equations (MTGODE).
Specifically, we first abstract multivariate time series into dynamic graphs with time-evolving node features and unknown graph structures.
Then, we design and solve a neural ODE to complement missing graph topologies and unify both spatial and temporal message passing.
arXiv Detail & Related papers (2022-02-17T02:17:31Z)
- Networked Time Series Prediction with Incomplete Data
We propose NETS-ImpGAN, a novel deep learning framework that can be trained on incomplete data with missing values in both history and future.
We conduct extensive experiments on three real-world datasets under different missing patterns and missing rates.
arXiv Detail & Related papers (2021-10-05T18:20:42Z)
- Time Series is a Special Sequence: Forecasting with Sample Convolution and Interaction
Time series is a special type of sequence data, a set of observations collected at even intervals of time and ordered chronologically.
Existing deep learning techniques use generic sequence models for time series analysis, which ignore some of its unique properties.
We propose a novel neural network architecture and apply it for the time series forecasting problem, wherein we conduct sample convolution and interaction at multiple resolutions for temporal modeling.
arXiv Detail & Related papers (2021-06-17T08:15:04Z)
- Neural ODE Processes
We introduce Neural ODE Processes (NDPs), a new class of processes determined by a distribution over Neural ODEs.
We show that our model can successfully capture the dynamics of low-dimensional systems from just a few data-points.
arXiv Detail & Related papers (2021-03-23T09:32:06Z)
- Time Series Data Imputation: A Survey on Deep Learning Approaches
Time series data imputation is a well-studied problem with different categories of methods.
Time series methods based on deep learning have made progress with the usage of models like RNN.
We review and discuss their model architectures, their pros and cons, and their effects, to show the development of time series imputation methods.
arXiv Detail & Related papers (2020-11-23T11:57:27Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.