Low-Rank Autoregressive Tensor Completion for Multivariate Time Series
Forecasting
- URL: http://arxiv.org/abs/2006.10436v1
- Date: Thu, 18 Jun 2020 11:31:16 GMT
- Title: Low-Rank Autoregressive Tensor Completion for Multivariate Time Series
Forecasting
- Authors: Xinyu Chen and Lijun Sun
- Abstract summary: Time series collected from sensor networks are often large-scale and incomplete with considerable corruption and missing values.
We propose a low-rank autoregressive tensor completion (LATC) framework to model multivariate time series data.
Our numerical experiments on three real-world data sets demonstrate the superiority of the integration of global and local trends in LATC.
- Score: 15.927264813606637
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Time series prediction has been a long-standing research topic and an
essential application in many domains. Modern time series collected from sensor
networks (e.g., energy consumption and traffic flow) are often large-scale and
incomplete with considerable corruption and missing values, making it difficult
to perform accurate predictions. In this paper, we propose a low-rank
autoregressive tensor completion (LATC) framework to model multivariate time
series data. The key idea of LATC is to transform the original multivariate time
series matrix (e.g., sensor × time point) into a third-order tensor structure
(e.g., sensor × time of day × day) by introducing an
additional temporal dimension, which allows us to model the inherent rhythms
and seasonality of time series as global patterns. With the tensor structure,
we can transform the time series prediction and missing data imputation
problems into a universal low-rank tensor completion problem. Besides
minimizing tensor rank, we also integrate a novel autoregressive norm on the
original matrix representation into the objective function. The two components
serve different roles. The low-rank structure allows us to effectively capture
the global consistency and trends across all three dimensions (i.e., similarity
among sensors, similarity across days, and the current time vs. the same time on
historical days). The autoregressive norm can better model the
local temporal trends. Our numerical experiments on three real-world data sets
demonstrate the benefit of integrating global and local trends in LATC on both
missing data imputation and rolling prediction tasks.
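The matrix-to-tensor transformation at the core of LATC can be sketched in a few lines. The dimensions below (30 sensors, 144 time points per day, 7 days) and all variable names are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Illustrative dimensions (assumed, not from the paper):
# 30 sensors, 144 ten-minute intervals per day, 7 days.
n_sensors, n_per_day, n_days = 30, 144, 7

# Original multivariate time series matrix: sensor x time point,
# with time points ordered day by day.
rng = np.random.default_rng(0)
Y = rng.random((n_sensors, n_per_day * n_days))

# Fold the time axis into (day, time of day), then reorder axes to get
# the third-order tensor sensor x time of day x day used by LATC.
X = Y.reshape(n_sensors, n_days, n_per_day).transpose(0, 2, 1)

# Entry X[i, t, d] is sensor i at time-of-day t on day d.
print(X.shape)                                   # (30, 144, 7)
print(X[3, 10, 2] == Y[3, 2 * n_per_day + 10])   # True
```

With this structure, forecasting amounts to completing the unobserved trailing slice of the tensor and imputation to filling masked entries, which is why both tasks reduce to the same low-rank tensor completion problem.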
Related papers
- Timer-XL: Long-Context Transformers for Unified Time Series Forecasting [67.83502953961505]
We present Timer-XL, a generative Transformer for unified time series forecasting.
Timer-XL achieves state-of-the-art performance across challenging forecasting benchmarks through a unified approach.
arXiv Detail & Related papers (2024-10-07T07:27:39Z)
- Unified Training of Universal Time Series Forecasting Transformers [104.56318980466742]
We present a masked encoder-based universal time series forecasting Transformer (Moirai).
Moirai is trained on our newly introduced Large-scale Open Time Series Archive (LOTSA) featuring over 27B observations across nine domains.
Moirai achieves competitive or superior performance as a zero-shot forecaster when compared to full-shot models.
arXiv Detail & Related papers (2024-02-04T20:00:45Z)
- Learning Gaussian Mixture Representations for Tensor Time Series Forecasting [8.31607451942671]
We develop a novel TTS forecasting framework, which seeks to individually model each heterogeneity component implied in the time, the location, and the source variables.
Experiment results on two real-world TTS datasets verify the superiority of our approach compared with the state-of-the-art baselines.
arXiv Detail & Related papers (2023-06-01T06:50:47Z)
- Generative Time Series Forecasting with Diffusion, Denoise, and Disentanglement [51.55157852647306]
Time series forecasting has been a widely explored task of great importance in many applications.
It is common that real-world time series data are recorded in a short time period, which results in a big gap between the deep model and the limited and noisy time series.
We propose to address the time series forecasting problem with generative modeling and propose a bidirectional variational auto-encoder equipped with diffusion, denoise, and disentanglement.
arXiv Detail & Related papers (2023-01-08T12:20:46Z)
- TimesNet: Temporal 2D-Variation Modeling for General Time Series Analysis [80.56913334060404]
Time series analysis is of immense importance in applications, such as weather forecasting, anomaly detection, and action recognition.
Previous methods attempt to accomplish this directly from the 1D time series.
We unravel the complex temporal variations into multiple intraperiod- and interperiod-variations.
arXiv Detail & Related papers (2022-10-05T12:19:51Z)
- STING: Self-attention based Time-series Imputation Networks using GAN [4.052758394413726]
STING (Self-attention based Time-series Imputation Networks using GAN) is proposed.
We take advantage of generative adversarial networks and bidirectional recurrent neural networks to learn latent representations of the time series.
Experimental results on three real-world datasets demonstrate that STING outperforms the existing state-of-the-art methods in terms of imputation accuracy.
arXiv Detail & Related papers (2022-09-22T06:06:56Z)
- HyperTime: Implicit Neural Representation for Time Series [131.57172578210256]
Implicit neural representations (INRs) have recently emerged as a powerful tool that provides an accurate and resolution-independent encoding of data.
In this paper, we analyze the representation of time series using INRs, comparing different activation functions in terms of reconstruction accuracy and training convergence speed.
We propose a hypernetwork architecture that leverages INRs to learn a compressed latent representation of an entire time series dataset.
arXiv Detail & Related papers (2022-08-11T14:05:51Z)
- Multi-scale Attention Flow for Probabilistic Time Series Forecasting [68.20798558048678]
We propose a novel non-autoregressive deep learning model, called Multi-scale Attention Normalizing Flow (MANF).
Our model avoids the influence of cumulative error and does not increase the time complexity.
Our model achieves state-of-the-art performance on many popular multivariate datasets.
arXiv Detail & Related papers (2022-05-16T07:53:42Z)
- Hankel-structured Tensor Robust PCA for Multivariate Traffic Time Series Anomaly Detection [9.067182100565695]
This study proposes a Hankel-structured tensor version of RPCA for anomaly detection in spatial data.
We decompose the corrupted matrix into a low-rank Hankel tensor and a sparse matrix.
We evaluate the method on synthetic data and passenger flow time series.
arXiv Detail & Related papers (2021-10-08T19:35:39Z)
- Low-Rank Autoregressive Tensor Completion for Spatiotemporal Traffic Data Imputation [4.9831085918734805]
Missing data imputation has been a long-standing research topic and critical application for real-world intelligent transportation systems.
We propose a low-rank autoregressive tensor completion (LATC) framework by introducing temporal variation as a new regularization term.
We conduct extensive numerical experiments on several real-world traffic data sets, and our results demonstrate the effectiveness of LATC in diverse missing scenarios.
arXiv Detail & Related papers (2021-04-30T12:00:57Z)
- A Deep Structural Model for Analyzing Correlated Multivariate Time Series [11.009809732645888]
We present a deep learning structural time series model which can handle correlated multivariate time series input.
The model explicitly learns/extracts the trend, seasonality, and event components.
We compare our model with several state-of-the-art methods through a comprehensive set of experiments on a variety of time series data sets.
arXiv Detail & Related papers (2020-01-02T18:48:29Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it contains and is not responsible for any consequences of its use.