BayOTIDE: Bayesian Online Multivariate Time series Imputation with functional decomposition
- URL: http://arxiv.org/abs/2308.14906v3
- Date: Thu, 30 May 2024 18:50:32 GMT
- Title: BayOTIDE: Bayesian Online Multivariate Time series Imputation with functional decomposition
- Authors: Shikai Fang, Qingsong Wen, Yingtao Luo, Shandian Zhe, Liang Sun
- Abstract summary: In real-world scenarios like traffic and energy, massive time-series data with missing values and noise are widely observed, and are often sampled irregularly.
While many imputation methods have been proposed, most of them work with a local horizon.
Almost all methods assume the observations are sampled at regular time stamps, and fail to handle complex, irregularly sampled time series.
- Score: 31.096125530322933
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In real-world scenarios like traffic and energy, massive time-series data with missing values and noise are widely observed, and are often sampled irregularly. While many imputation methods have been proposed, most of them work with a local horizon, meaning models are trained by splitting the long sequence into batches of fixed-size patches. This local horizon can make models ignore global trends or periodic patterns. More importantly, almost all methods assume the observations are sampled at regular time stamps, and fail to handle the complex, irregularly sampled time series arising in different applications. Thirdly, most existing methods are learned in an offline manner, and thus are not suitable for applications with fast-arriving streaming data. To overcome these limitations, we propose BayOTIDE: Bayesian Online Multivariate Time series Imputation with functional decomposition. We treat the multivariate time series as a weighted combination of groups of low-rank temporal factors with different patterns. We apply a group of Gaussian Processes (GPs) with different kernels as functional priors to fit the factors. For computational efficiency, we further convert the GPs into a state-space prior by constructing an equivalent stochastic differential equation (SDE), and develop a scalable algorithm for online inference. The proposed method can not only handle imputation over arbitrary time stamps, but also offer uncertainty quantification and interpretability for downstream applications. We evaluate our method on both synthetic and real-world datasets. We release the code at https://github.com/xuangu-fang/BayOTIDE
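To make the state-space construction above concrete, here is a minimal sketch in JAX. It is not the authors' released implementation (see the repository linked above) and handles only a single temporal factor: a Matern-1/2 (Ornstein-Uhlenbeck) GP prior is written as a linear SDE and Kalman-filtered online over irregular time stamps, with the update skipped wherever a value is missing, which yields both an imputed mean and a variance. The function name `ou_filter` and all hyperparameter values are illustrative assumptions.

```python
# A minimal sketch, NOT the authors' released BayOTIDE code: one temporal factor with a
# Matern-1/2 (Ornstein-Uhlenbeck) GP prior written as a linear SDE and Kalman-filtered
# online over irregular time stamps, skipping the update where values are missing.
# `ou_filter`, its arguments, and the hyperparameters are illustrative assumptions.
import jax
import jax.numpy as jnp

def ou_filter(ts, ys, mask, lengthscale=1.0, variance=1.0, obs_noise=0.1):
    """Filter a scalar OU state-space GP over (possibly irregular) time stamps `ts`.

    ts:   (T,) increasing time stamps
    ys:   (T,) observations (placeholder values where mask == 0)
    mask: (T,) 1.0 where observed, 0.0 where missing
    Returns filtered means and variances at every time stamp.
    """
    lam = 1.0 / lengthscale                       # OU decay rate

    def step(carry, inp):
        m, P = carry                              # previous posterior mean / variance
        dt, y, obs = inp
        # Predict: exact discretisation of the OU SDE over the (irregular) gap dt.
        A = jnp.exp(-lam * dt)
        Q = variance * (1.0 - A ** 2)
        m_pred, P_pred = A * m, A ** 2 * P + Q
        # Update: standard Kalman correction, disabled (gain 0) where the value is missing.
        S = P_pred + obs_noise
        K = obs * P_pred / S
        m_new = m_pred + K * (y - m_pred)
        P_new = (1.0 - K) * P_pred
        return (m_new, P_new), (m_new, P_new)

    dts = jnp.concatenate([jnp.zeros((1,), ts.dtype), jnp.diff(ts)])
    init = (jnp.asarray(0.0, ys.dtype), jnp.asarray(variance, ys.dtype))  # stationary prior
    _, (means, variances) = jax.lax.scan(step, init, (dts, ys, mask))
    return means, variances

# Impute a missing value at an irregular time stamp, with uncertainty.
ts = jnp.array([0.0, 0.3, 1.1, 1.15, 2.0])
ys = jnp.array([0.5, 0.7, 0.0, 0.9, 1.2])        # the value at t = 1.1 is missing
mask = jnp.array([1.0, 1.0, 0.0, 1.0, 1.0])
means, variances = ou_filter(ts, ys, mask)
print(means[2], variances[2])                    # imputed mean and variance at t = 1.1
```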
Related papers
- PeFAD: A Parameter-Efficient Federated Framework for Time Series Anomaly Detection [51.20479454379662]
We propose a parameter-efficient Federated Anomaly Detection framework named PeFAD, motivated by increasing privacy concerns.
We conduct extensive evaluations on four real datasets, where PeFAD outperforms existing state-of-the-art baselines by up to 28.74%.
arXiv Detail & Related papers (2024-06-04T13:51:08Z) - Graph Spatiotemporal Process for Multivariate Time Series Anomaly Detection with Missing Values [67.76168547245237]
We introduce a novel framework called GST-Pro, which utilizes a graph spatiotemporal process and an anomaly scorer to detect anomalies.
Our experimental results show that the GST-Pro method can effectively detect anomalies in time series data and outperforms state-of-the-art methods.
arXiv Detail & Related papers (2024-01-11T10:10:16Z) - Continuous-time Autoencoders for Regular and Irregular Time Series Imputation [21.25279298572273]
Time series imputation is one of the most fundamental tasks for time series.
Recent self-attention-based methods show the state-of-the-art imputation performance.
Designing imputation methods based on continuous-time recurrent neural networks has long been overlooked.
arXiv Detail & Related papers (2023-12-27T14:13:42Z) - Streaming Factor Trajectory Learning for Temporal Tensor Decomposition [33.18423605559094]
We propose Streaming Factor Trajectory Learning for temporal tensor decomposition.
We use Gaussian processes (GPs) to model the trajectory of factors so as to flexibly estimate their temporal evolution.
We have shown the advantage of SFTL in both synthetic tasks and real-world applications.
arXiv Detail & Related papers (2023-10-25T21:58:52Z) - Compatible Transformer for Irregularly Sampled Multivariate Time Series [75.79309862085303]
We propose a transformer-based encoder to achieve comprehensive temporal-interaction feature learning for each individual sample.
We conduct extensive experiments on 3 real-world datasets and validate that the proposed CoFormer significantly and consistently outperforms existing methods.
arXiv Detail & Related papers (2023-10-17T06:29:09Z) - Generative Time Series Forecasting with Diffusion, Denoise, and Disentanglement [51.55157852647306]
Time series forecasting has been a widely explored task of great importance in many applications.
Real-world time series are often recorded over a short time period, which leaves a big gap between deep models and the limited, noisy data available.
We address the time series forecasting problem with generative modeling, proposing a bidirectional variational auto-encoder equipped with diffusion, denoising, and disentanglement.
arXiv Detail & Related papers (2023-01-08T12:20:46Z) - Multi-scale Anomaly Detection for Big Time Series of Industrial Sensors [50.6434162489902]
We propose a reconstruction-based anomaly detection method, MissGAN, iteratively learning to decode and encode naturally smooth time series.
MissGAN does not need labels or only needs labels of normal instances, making it widely applicable.
arXiv Detail & Related papers (2022-04-18T04:34:15Z) - Elastic Product Quantization for Time Series [19.839572576189187]
We propose the use of product quantization for efficient similarity-based comparison of time series under time warping.
The proposed solution emerges as a highly efficient (both in terms of memory usage and time) replacement for elastic measures in time series applications.
arXiv Detail & Related papers (2022-01-04T09:23:06Z) - Scaling Structured Inference with Randomization [64.18063627155128]
We propose a family of randomized dynamic programming (RDP) algorithms for scaling structured models to tens of thousands of latent states.
Our method is widely applicable to classical DP-based inference.
It is also compatible with automatic differentiation, so it can be integrated with neural networks seamlessly.
arXiv Detail & Related papers (2021-12-07T11:26:41Z) - Sparse Algorithms for Markovian Gaussian Processes [18.999495374836584]
Sparse Markovian Gaussian processes combine the use of inducing variables with efficient Kalman-filter-like recursions.
We derive a general site-based approach to approximate the non-Gaussian likelihood with local Gaussian terms, called sites.
Our approach results in a suite of novel sparse extensions to algorithms from both the machine learning and signal processing literatures, including variational inference, expectation propagation, and the classical nonlinear Kalman smoothers.
The derived methods are suited to spatio-temporal data, where the model has separate inducing points in both time and space.
arXiv Detail & Related papers (2021-03-19T09:50:53Z) - Fast Variational Learning in State-Space Gaussian Process Models [29.630197272150003]
We build upon an existing method called conjugate-computation variational inference.
We provide an efficient JAX implementation which exploits just-in-time compilation.
Our approach leads to fast and stable variational inference in state-space GP models that can be scaled to time series with millions of data points.
arXiv Detail & Related papers (2020-07-09T12:06:34Z)
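The just-in-time compilation point in the last entry can be illustrated with a small, self-contained sketch. It is an assumption-laden illustration rather than the paper's code or API: a scalar linear-Gaussian filtering pass is expressed as a `jax.lax.scan` and wrapped in `jax.jit`, so the whole loop compiles to a single XLA program and subsequent calls on long series run at compiled speed. The name `kf_loglik` and the model parameters are hypothetical.

```python
# A self-contained sketch (hypothetical names/parameters, not the paper's API):
# state-space filtering written as a lax.scan and compiled once with jax.jit.
import jax
import jax.numpy as jnp

@jax.jit
def kf_loglik(ys, A=0.95, Q=0.1, obs_noise=0.2):
    """Marginal log-likelihood of a scalar linear-Gaussian state-space model."""
    def step(carry, y):
        m, P = carry
        m_pred, P_pred = A * m, A * P * A + Q            # predict
        S = P_pred + obs_noise                           # innovation variance
        ll = -0.5 * (jnp.log(2.0 * jnp.pi * S) + (y - m_pred) ** 2 / S)
        K = P_pred / S                                   # Kalman gain
        carry = (m_pred + K * (y - m_pred), (1.0 - K) * P_pred)
        return carry, ll

    init = (jnp.zeros(()), jnp.ones(()))
    _, lls = jax.lax.scan(step, init, ys)
    return jnp.sum(lls)

ys = jnp.sin(jnp.linspace(0.0, 50.0, 1_000_000)) + 0.1   # a long synthetic series
print(kf_loglik(ys))   # first call traces and compiles; later calls reuse the XLA program
```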