Time Series Data Imputation: A Survey on Deep Learning Approaches
- URL: http://arxiv.org/abs/2011.11347v1
- Date: Mon, 23 Nov 2020 11:57:27 GMT
- Title: Time Series Data Imputation: A Survey on Deep Learning Approaches
- Authors: Chenguang Fang, Chen Wang
- Abstract summary: Time series data imputation is a well-studied problem with different categories of methods.
Time series methods based on deep learning have made progress with the usage of models like RNN.
We will review and discuss their model architectures, their pros and cons as well as their effects to show the development of the time series imputation methods.
- Score: 4.4458738910060775
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Time series are ubiquitous in real-world applications. However, unexpected
accidents such as broken sensors or lost signals cause missing values in time series,
making the data difficult to use. This harms downstream applications such as
classification, regression, sequential data integration, and forecasting, and thus raises
the demand for data imputation. Time series data imputation is now a well-studied problem
with several categories of methods. However, these works rarely take the temporal
relations among observations into account, treating the time series as ordinary
structured data and losing the information carried by time. Recently, deep learning
models have attracted great attention. Deep learning methods for time series imputation
have made progress by using models such as RNNs, which capture temporal information in
the data. In this paper, we focus on deep learning techniques for time series imputation,
which have recently advanced this field. We review and discuss their model architectures,
their pros and cons, and their effectiveness, to trace the development of time series
imputation methods.
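As a concrete illustration of the RNN-based direction mentioned above, below is a minimal sketch of mask-aware recurrent imputation in PyTorch. It is a generic pattern rather than any specific method covered by the survey; the class name, tensor shapes, and hyperparameters are illustrative assumptions. A GRU reads observed values together with a missingness mask, a linear head makes one-step-ahead estimates, and those estimates fill the gaps.

```python
# Minimal sketch of mask-aware RNN imputation (illustrative, not a surveyed method).
import torch
import torch.nn as nn

class RecurrentImputer(nn.Module):
    """GRU over [values, mask]; the hidden state at step t estimates step t + 1."""
    def __init__(self, n_features: int, hidden: int = 64):
        super().__init__()
        self.rnn = nn.GRU(input_size=2 * n_features, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_features)

    def forward(self, x: torch.Tensor, mask: torch.Tensor) -> torch.Tensor:
        # x, mask: (batch, time, features); the mask zeroes out missing entries here.
        h, _ = self.rnn(torch.cat([x * mask, mask], dim=-1))
        return self.head(h[:, :-1])                 # estimates for steps 1 .. T-1

# Toy usage with simulated missingness (shapes and rates are assumptions).
model = RecurrentImputer(n_features=3)
x = torch.randn(8, 50, 3)
mask = (torch.rand_like(x) > 0.2).float()           # ~20% of entries missing
x_hat = model(x, mask)
target, m = x[:, 1:], mask[:, 1:]
loss = ((x_hat - target) * m).pow(2).sum() / m.sum()   # supervise only observed entries
loss.backward()

# Impute: keep what was observed, fill gaps with the model's estimates.
imputed = x.clone()
imputed[:, 1:] = m * target + (1 - m) * x_hat.detach()
```

Supervising only the observed entries through a masked loss is the usual way to train such a model when no ground truth exists for the missing positions.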
Related papers
- Chronos: Learning the Language of Time Series [79.38691251254173]
Chronos is a framework for pretrained probabilistic time series models.
We show that Chronos models can leverage time series data from diverse domains to improve zero-shot accuracy on unseen forecasting tasks.
arXiv Detail & Related papers (2024-03-12T16:53:54Z) - Graph Spatiotemporal Process for Multivariate Time Series Anomaly Detection with Missing Values [67.76168547245237]
We introduce a novel framework called GST-Pro, which utilizes a graph spatiotemporal process and an anomaly scorer to detect anomalies.
Our experimental results show that the GST-Pro method can effectively detect anomalies in time series data and outperforms state-of-the-art methods.
arXiv Detail & Related papers (2024-01-11T10:10:16Z) - Pushing the Limits of Pre-training for Time Series Forecasting in the CloudOps Domain [54.67888148566323]
We introduce three large-scale time series forecasting datasets from the cloud operations domain.
We show it is a strong zero-shot baseline and benefits from further scaling, both in model and dataset size.
Accompanying these datasets and results is a suite of comprehensive benchmark results comparing classical and deep learning baselines to our pre-trained method.
arXiv Detail & Related papers (2023-10-08T08:09:51Z) - Development of a Neural Network-based Method for Improved Imputation of Missing Values in Time Series Data by Repurposing DataWig [1.8719295298860394]
Missing values in time series data occur often and present obstacles to successful analysis; they therefore need to be filled with alternative values, a process called imputation.
Although various approaches have been attempted for robust imputation of time series data, even the most advanced methods still face challenges.
I developed tsDataWig (time-series DataWig) by modifying DataWig, a neural network-based method that can process large datasets.
Unlike the original DataWig, tsDataWig can directly handle values of time variables and impute missing values in complex time series data.
arXiv Detail & Related papers (2023-08-18T15:53:40Z) - Time-Varying Propensity Score to Bridge the Gap between the Past and Present [104.46387765330142]
We introduce a time-varying propensity score that can detect gradual shifts in the distribution of data.
We demonstrate different ways of implementing it and evaluate it on a variety of problems.
arXiv Detail & Related papers (2022-10-04T07:21:49Z) - STING: Self-attention based Time-series Imputation Networks using GAN [4.052758394413726]
STING (Self-attention based Time-series Imputation Networks using GAN) is proposed.
We take advantage of generative adversarial networks and bidirectional recurrent neural networks to learn latent representations of the time series.
Experimental results on three real-world datasets demonstrate that STING outperforms the existing state-of-the-art methods in terms of imputation accuracy.
arXiv Detail & Related papers (2022-09-22T06:06:56Z) - Networked Time Series Prediction with Incomplete Data [59.45358694862176]
We propose NETS-ImpGAN, a novel deep learning framework that can be trained on incomplete data with missing values in both history and future.
We conduct extensive experiments on three real-world datasets under different missing patterns and missing rates.
arXiv Detail & Related papers (2021-10-05T18:20:42Z) - Deep Time Series Models for Scarce Data [8.673181404172963]
Time series data have grown at an explosive rate in numerous domains and have stimulated a surge of time series modeling research.
Data scarcity is a universal issue that occurs in a vast range of data analytics problems.
arXiv Detail & Related papers (2021-03-16T22:16:54Z) - Time-Series Imputation with Wasserstein Interpolation for Optimal Look-Ahead-Bias and Variance Tradeoff [66.59869239999459]
In finance, imputation of missing returns may be applied prior to training a portfolio optimization model.
There is an inherent trade-off between the look-ahead-bias of using the full data set for imputation and the larger variance in the imputation from using only the training data.
We propose a Bayesian posterior consensus distribution which optimally controls the variance and look-ahead-bias trade-off in the imputation.
arXiv Detail & Related papers (2021-02-25T09:05:35Z) - Adjusting for Autocorrelated Errors in Neural Networks for Time Series Regression and Forecasting [10.659189276058948]
We learn the autocorrelation coefficient jointly with the model parameters in order to adjust for autocorrelated errors.
For time series regression, large-scale experiments indicate that our method outperforms the Prais-Winsten method.
Results across a wide range of real-world datasets show that our method enhances performance in almost all cases.
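The idea of learning the autocorrelation coefficient together with the network can be pictured with the following hedged sketch: a generic Cochrane-Orcutt-style transform with a trainable coefficient, not necessarily the paper's exact formulation. The toy linear model, data, and hyperparameters are assumptions.

```python
# Hedged sketch: learn an AR(1) error coefficient rho jointly with the model by
# fitting rho-differenced residuals (a Cochrane-Orcutt-style transform).
import torch
import torch.nn as nn

model = nn.Linear(4, 1)                    # toy regressor; any network could sit here
rho = nn.Parameter(torch.zeros(1))         # autocorrelation coefficient, trained jointly
opt = torch.optim.Adam(list(model.parameters()) + [rho], lr=1e-2)

x = torch.randn(200, 4)                    # toy inputs, assumed ordered in time
y = x.sum(dim=1, keepdim=True) + 0.1 * torch.randn(200, 1)

for _ in range(100):
    pred = model(x)
    # Whiten AR(1) errors: (y_t - rho * y_{t-1}) should match (f(x_t) - rho * f(x_{t-1})).
    resid = (y[1:] - rho * y[:-1]) - (pred[1:] - rho * pred[:-1])
    loss = resid.pow(2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
```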
arXiv Detail & Related papers (2021-01-28T04:25:51Z) - Neural ODEs for Informative Missingness in Multivariate Time Series [0.7233897166339269]
Practical applications, e.g., sensor data, healthcare, and weather, generate data that are in truth continuous in time.
The deep learning model GRU-D was one early attempt to address informative missingness in time series data.
A new family of neural networks, Neural ODEs, is natural and efficient for processing time series data that are continuous in time.
arXiv Detail & Related papers (2020-05-20T00:28:30Z)
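To make the informative-missingness idea in the GRU-D entry above concrete, here is a simplified sketch of GRU-D-style input decay. It is an assumption-laden illustration rather than the original GRU-D code: each missing feature is replaced by a blend of its last observed value and its empirical mean, with a learned decay rate driven by the time elapsed since the feature was last observed. A full GRU-D also decays the hidden state, which is omitted here.

```python
# Simplified sketch of GRU-D-style input decay for informative missingness.
import torch
import torch.nn as nn

class InputDecay(nn.Module):
    """Blend each missing feature between its last observation and its mean."""
    def __init__(self, n_features: int):
        super().__init__()
        self.w = nn.Parameter(torch.zeros(n_features))   # per-feature decay rates
        self.b = nn.Parameter(torch.zeros(n_features))

    def forward(self, x, mask, delta, x_mean):
        # x, mask, delta: (batch, time, features); delta = time since each feature
        # was last observed; x_mean: (features,) empirical mean of each feature.
        gamma = torch.exp(-torch.relu(self.w * delta + self.b))   # decay in (0, 1]
        outs, last = [], x_mean.expand_as(x[:, 0])
        for t in range(x.size(1)):
            m = mask[:, t]
            last = m * x[:, t] + (1 - m) * last                   # carry last observation
            decayed = gamma[:, t] * last + (1 - gamma[:, t]) * x_mean
            outs.append(m * x[:, t] + (1 - m) * decayed)
        return torch.stack(outs, dim=1)   # filled inputs; feed to a GRU with the mask

# Toy usage (all shapes and the missing rate are assumptions).
dec = InputDecay(3)
x = torch.randn(8, 50, 3)
mask = (torch.rand(8, 50, 3) > 0.2).float()
delta = torch.rand(8, 50, 3)              # elapsed time since last observation
filled = dec(x, mask, delta, x.mean(dim=(0, 1)))
```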
This list is automatically generated from the titles and abstracts of the papers on this site.