Temporal Latent Auto-Encoder: A Method for Probabilistic Multivariate Time Series Forecasting
- URL: http://arxiv.org/abs/2101.10460v1
- Date: Mon, 25 Jan 2021 22:29:40 GMT
- Title: Temporal Latent Auto-Encoder: A Method for Probabilistic Multivariate Time Series Forecasting
- Authors: Nam Nguyen, Brian Quanz
- Abstract summary: We introduce a novel temporal latent auto-encoder method which enables nonlinear factorization of time series.
By imposing a probabilistic latent space model, complex distributions of the input series are modeled via the decoder.
Our model achieves state-of-the-art performance on many popular multivariate datasets, with gains sometimes as high as $50\%$ for several standard metrics.
- Score: 4.131842516813833
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Probabilistic forecasting of high dimensional multivariate time series is a
notoriously challenging task, both in terms of computational burden and
distribution modeling. Most previous work either makes simple distribution
assumptions or abandons modeling cross-series correlations. A promising line of
work exploits scalable matrix factorization for latent-space forecasting, but
is limited to linear embeddings, unable to model distributions, and not
trainable end-to-end when using deep learning forecasting. We introduce a novel
temporal latent auto-encoder method which enables nonlinear factorization of
multivariate time series, learned end-to-end with a temporal deep learning
latent space forecast model. By imposing a probabilistic latent space model,
complex distributions of the input series are modeled via the decoder.
Extensive experiments demonstrate that our model achieves state-of-the-art
performance on many popular multivariate datasets, with gains sometimes as high
as $50\%$ for several standard metrics.
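As a rough illustration of the architecture described in the abstract, the sketch below pairs a nonlinear encoder/decoder with a temporal model that forecasts in the latent space and a Gaussian decoder head; the LSTM choice, layer sizes, and module names are assumptions made for the sketch, not the authors' implementation.

```python
# Minimal sketch of a temporal latent auto-encoder (illustrative, not the authors' code).
# Shapes: x has shape (batch, T, N) -- N series observed over T time steps.
import torch
import torch.nn as nn

class TemporalLatentAutoEncoder(nn.Module):
    def __init__(self, n_series, latent_dim=8, hidden=64):
        super().__init__()
        # Nonlinear "factorization": encode the N-dim cross-section into latent_dim.
        self.encoder = nn.Sequential(
            nn.Linear(n_series, hidden), nn.ReLU(), nn.Linear(hidden, latent_dim))
        # Temporal model that forecasts the next latent state (an LSTM is assumed here).
        self.temporal = nn.LSTM(latent_dim, hidden, batch_first=True)
        self.to_latent = nn.Linear(hidden, latent_dim)
        # Probabilistic decoder: maps the latent back to a Gaussian over the N series.
        self.dec_mu = nn.Sequential(
            nn.Linear(latent_dim, hidden), nn.ReLU(), nn.Linear(hidden, n_series))
        self.dec_logvar = nn.Sequential(
            nn.Linear(latent_dim, hidden), nn.ReLU(), nn.Linear(hidden, n_series))

    def forward(self, x):
        z = self.encoder(x)                      # (batch, T, latent_dim)
        h, _ = self.temporal(z)                  # forecast latent dynamics
        z_next = self.to_latent(h[:, -1])        # predicted next latent state
        return self.dec_mu(z_next), self.dec_logvar(z_next)

def gaussian_nll(mu, logvar, target):
    # Gaussian negative log-likelihood (up to a constant), trained end-to-end
    # through encoder, temporal model, and decoder.
    return 0.5 * (logvar + (target - mu) ** 2 / logvar.exp()).mean()

model = TemporalLatentAutoEncoder(n_series=32)
x = torch.randn(4, 24, 32)                       # toy batch: 4 windows of 24 steps
mu, logvar = model(x)
gaussian_nll(mu, logvar, torch.randn(4, 32)).backward()
```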
Related papers
- Timer-XL: Long-Context Transformers for Unified Time Series Forecasting [67.83502953961505]
We present Timer-XL, a generative Transformer for unified time series forecasting.
Timer-XL achieves state-of-the-art performance across challenging forecasting benchmarks through a unified approach.
arXiv Detail & Related papers (2024-10-07T07:27:39Z)
- Recurrent Interpolants for Probabilistic Time Series Prediction [10.422645245061899]
Sequential models like recurrent neural networks and transformers have become standard for probabilistic time series forecasting.
Recent work explores generative approaches using diffusion or flow-based models, extending to time series imputation and forecasting.
This work proposes a novel method combining recurrent neural networks' efficiency with diffusion models' probabilistic modeling, based on interpolants and conditional generation with control features.
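A generic sketch of the interpolant-plus-RNN recipe described above: a GRU summarizes the history, a small network learns the velocity of a linear interpolant between noise and the target, and sampling integrates that velocity field. The exact interpolant, control features, and architecture in the paper differ; everything here is an assumed simplification.

```python
# Generic sketch of interpolant-based conditional generation for forecasting.
import torch
import torch.nn as nn

class InterpolantForecaster(nn.Module):
    def __init__(self, n_series, hidden=64):
        super().__init__()
        self.rnn = nn.GRU(n_series, hidden, batch_first=True)    # encodes the history
        self.velocity = nn.Sequential(                            # v(x_s, s, context)
            nn.Linear(n_series + 1 + hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, n_series))

    def context(self, history):
        _, h = self.rnn(history)
        return h[-1]                                              # (batch, hidden)

    def loss(self, history, target):
        c = self.context(history)
        noise = torch.randn_like(target)
        s = torch.rand(target.size(0), 1)                         # interpolation time in [0, 1]
        x_s = (1 - s) * noise + s * target                        # linear interpolant
        v_hat = self.velocity(torch.cat([x_s, s, c], dim=-1))
        return ((v_hat - (target - noise)) ** 2).mean()           # match the interpolant velocity

    @torch.no_grad()
    def sample(self, history, steps=20):
        c = self.context(history)
        x = torch.randn(history.size(0), history.size(-1))        # start from noise
        for i in range(steps):                                    # Euler integration of the ODE
            s = torch.full((x.size(0), 1), i / steps)
            x = x + self.velocity(torch.cat([x, s, c], dim=-1)) / steps
        return x

model = InterpolantForecaster(n_series=8)
hist, future = torch.randn(4, 48, 8), torch.randn(4, 8)
model.loss(hist, future).backward()
print(model.sample(hist).shape)                                   # torch.Size([4, 8])
```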
arXiv Detail & Related papers (2024-09-18T03:52:48Z)
- Multiscale Representation Enhanced Temporal Flow Fusion Model for Long-Term Workload Forecasting [19.426131129034115]
This paper proposes a novel framework leveraging self-supervised multiscale representation learning to capture both long-term and near-term workload patterns.
The long-term history is encoded through multiscale representations while the near-term observations are modeled via temporal flow fusion.
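A simplified sketch of encoding the long history at several temporal scales next to a near-term branch; the paper's self-supervised objective and temporal flow fusion are replaced here with plain GRU encoders and concatenation, purely as an assumed illustration.

```python
# Simplified sketch: multiscale history encoding fused with a near-term branch.
import torch
import torch.nn as nn

class MultiscaleForecaster(nn.Module):
    def __init__(self, scales=(1, 4, 16), hidden=32, horizon=24):
        super().__init__()
        self.scales = scales
        self.scale_encoders = nn.ModuleList(
            nn.GRU(1, hidden, batch_first=True) for _ in scales)
        self.near_encoder = nn.GRU(1, hidden, batch_first=True)
        self.head = nn.Linear(hidden * (len(scales) + 1), horizon)

    def forward(self, long_history, near_window):
        # long_history, near_window: (batch, T, 1) univariate workload series
        feats = []
        for scale, enc in zip(self.scales, self.scale_encoders):
            pooled = nn.functional.avg_pool1d(
                long_history.transpose(1, 2), kernel_size=scale).transpose(1, 2)
            _, h = enc(pooled)                    # summary of the history at this scale
            feats.append(h[-1])
        _, h_near = self.near_encoder(near_window)
        feats.append(h_near[-1])
        return self.head(torch.cat(feats, dim=-1))  # multi-horizon point forecast

model = MultiscaleForecaster()
yhat = model(torch.randn(2, 256, 1), torch.randn(2, 32, 1))
print(yhat.shape)                                  # torch.Size([2, 24])
```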
arXiv Detail & Related papers (2024-07-29T04:42:18Z)
- Multi-scale Attention Flow for Probabilistic Time Series Forecasting [68.20798558048678]
We propose a novel non-autoregressive deep learning model, called the Multi-scale Attention Normalizing Flow (MANF).
Our model avoids the accumulation of prediction errors over the horizon and does not increase time complexity.
Our model achieves state-of-the-art performance on many popular multivariate datasets.
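To make the "avoids accumulated error" point concrete, the sketch below contrasts direct (non-autoregressive) multi-horizon decoding with an autoregressive roll-out; the flow and attention components of MANF itself are omitted, so this is only an assumed illustration of the decoding strategy.

```python
# Direct multi-horizon decoding vs. autoregressive roll-out (illustrative only).
import torch
import torch.nn as nn

class DirectMultiHorizon(nn.Module):
    """Emit all H future steps in one shot: no feedback of model errors."""
    def __init__(self, n_series, horizon, hidden=64):
        super().__init__()
        self.encoder = nn.GRU(n_series, hidden, batch_first=True)
        self.head = nn.Linear(hidden, horizon * n_series)
        self.horizon, self.n_series = horizon, n_series

    def forward(self, history):
        _, h = self.encoder(history)
        return self.head(h[-1]).view(-1, self.horizon, self.n_series)

def autoregressive_rollout(one_step_model, history, horizon):
    # Each prediction is fed back as input, so errors can accumulate over the horizon.
    window = history.clone()
    preds = []
    for _ in range(horizon):
        nxt = one_step_model(window)                          # (batch, n_series)
        preds.append(nxt)
        window = torch.cat([window[:, 1:], nxt.unsqueeze(1)], dim=1)
    return torch.stack(preds, dim=1)

direct = DirectMultiHorizon(n_series=4, horizon=12)
print(direct(torch.randn(3, 36, 4)).shape)                    # torch.Size([3, 12, 4])

one_step = DirectMultiHorizon(n_series=4, horizon=1)
rolled = autoregressive_rollout(lambda w: one_step(w)[:, 0], torch.randn(3, 36, 4), horizon=12)
print(rolled.shape)                                           # torch.Size([3, 12, 4])
```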
arXiv Detail & Related papers (2022-05-16T07:53:42Z)
- TACTiS: Transformer-Attentional Copulas for Time Series [76.71406465526454]
Estimation of time-varying quantities is a fundamental component of decision making in fields such as healthcare and finance.
We propose a versatile method that estimates joint distributions using an attention-based decoder.
We show that our model produces state-of-the-art predictions on several real-world datasets.
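A toy illustration of the copula idea behind this line of work, using a Gaussian copula with empirical marginals as a simplified stand-in for the paper's learned, attention-based copula: the marginals are transformed to uniforms, the dependence is modeled separately, and joint samples are mapped back through the marginals.

```python
# Gaussian-copula stand-in for joint forecast distributions (illustrative only).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Toy "forecast samples" for 3 series with very different marginals.
samples = np.column_stack([
    rng.gamma(2.0, 1.0, 500),         # skewed marginal
    rng.normal(0.0, 1.0, 500),
    rng.lognormal(0.0, 0.5, 500),
])

# 1) Map each marginal to uniform via its empirical CDF (probability integral transform).
u = (stats.rankdata(samples, axis=0) - 0.5) / samples.shape[0]

# 2) Model the dependence structure in Gaussian space.
z = stats.norm.ppf(u)
corr = np.corrcoef(z, rowvar=False)

# 3) Draw joint samples: correlated Gaussians -> uniforms -> back through the marginals.
z_new = rng.multivariate_normal(np.zeros(3), corr, size=1000)
u_new = stats.norm.cdf(z_new)
joint = np.column_stack([np.quantile(samples[:, j], u_new[:, j]) for j in range(3)])
print(joint.shape)  # (1000, 3): joint draws that preserve the original marginals
```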
arXiv Detail & Related papers (2022-02-07T21:37:29Z)
- Optimal Latent Space Forecasting for Large Collections of Short Time Series Using Temporal Matrix Factorization [0.0]
It is common practice to evaluate multiple methods and choose one of them, or an ensemble, to produce the best forecasts.
We propose a framework for forecasting short high-dimensional time series data by combining low-rank temporal matrix factorization and optimal model selection on latent time series.
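A minimal sketch of latent-space forecasting with low-rank temporal matrix factorization: factor the data matrix, forecast the few latent series, and map the forecasts back through the loadings. The alternating least squares fit and the AR(1) latent forecaster are assumed simplifications of the paper's factorization and optimal model selection.

```python
# Low-rank temporal matrix factorization + latent forecasting (illustrative).
import numpy as np

rng = np.random.default_rng(0)
N, T, rank, horizon = 50, 120, 4, 8
latent_true = rng.standard_normal((rank, T)).cumsum(axis=1)           # latent random walks
Y = rng.standard_normal((N, rank)) @ latent_true + 0.1 * rng.standard_normal((N, T))

# Factorize Y ~= F @ X with F (N x rank) and X (rank x T) by alternating least squares.
F = rng.standard_normal((N, rank))
X = rng.standard_normal((rank, T))
for _ in range(30):
    X = np.linalg.lstsq(F, Y, rcond=None)[0]            # update latent series
    F = np.linalg.lstsq(X.T, Y.T, rcond=None)[0].T      # update loadings

# Forecast each latent series with an AR(1) fit, then map back through the loadings.
X_future = np.empty((rank, horizon))
for k in range(rank):
    phi = np.dot(X[k, 1:], X[k, :-1]) / np.dot(X[k, :-1], X[k, :-1])
    x_prev = X[k, -1]
    for h in range(horizon):
        x_prev = phi * x_prev
        X_future[k, h] = x_prev
Y_future = F @ X_future                                 # (N, horizon) forecasts
print(Y_future.shape)
```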
arXiv Detail & Related papers (2021-12-15T11:39:21Z)
- Synergetic Learning of Heterogeneous Temporal Sequences for Multi-Horizon Probabilistic Forecasting [48.8617204809538]
We propose Variational Synergetic Multi-Horizon Network (VSMHN), a novel deep conditional generative model.
To learn complex correlations across heterogeneous sequences, a tailored encoder is devised that combines advances in deep point process models and variational recurrent neural networks.
Our model can be trained effectively using variational inference and generates predictions with Monte-Carlo simulation.
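A generic conditional-VAE-style sketch of variational training plus Monte-Carlo prediction, as a stand-in for the variational inference and sampling described above; the heterogeneous-sequence encoder and point-process components of VSMHN are not modeled here.

```python
# Conditional variational forecaster: ELBO training, Monte-Carlo prediction (sketch).
import torch
import torch.nn as nn

class VariationalForecaster(nn.Module):
    def __init__(self, n_series, horizon, latent=8, hidden=64):
        super().__init__()
        self.hist_enc = nn.GRU(n_series, hidden, batch_first=True)
        self.prior = nn.Linear(hidden, 2 * latent)                       # p(z | history)
        self.post = nn.Linear(hidden + horizon * n_series, 2 * latent)   # q(z | history, future)
        self.dec = nn.Sequential(nn.Linear(hidden + latent, hidden), nn.ReLU(),
                                 nn.Linear(hidden, horizon * n_series))
        self.horizon, self.n_series = horizon, n_series

    def elbo(self, history, future):
        c = self.hist_enc(history)[1][-1]
        mu_p, logv_p = self.prior(c).chunk(2, dim=-1)
        mu_q, logv_q = self.post(torch.cat([c, future.flatten(1)], -1)).chunk(2, -1)
        z = mu_q + torch.randn_like(mu_q) * (0.5 * logv_q).exp()         # reparameterization
        recon = self.dec(torch.cat([c, z], -1)).view_as(future)
        kl = 0.5 * (logv_p - logv_q + ((mu_q - mu_p) ** 2 + logv_q.exp()) / logv_p.exp() - 1).sum(-1)
        return ((recon - future) ** 2).sum((1, 2)).mean() + kl.mean()

    @torch.no_grad()
    def sample(self, history, n_samples=100):
        c = self.hist_enc(history)[1][-1]
        mu_p, logv_p = self.prior(c).chunk(2, dim=-1)
        draws = []
        for _ in range(n_samples):                                        # Monte-Carlo prediction
            z = mu_p + torch.randn_like(mu_p) * (0.5 * logv_p).exp()
            draws.append(self.dec(torch.cat([c, z], -1)))
        return torch.stack(draws).view(n_samples, -1, self.horizon, self.n_series)

model = VariationalForecaster(n_series=5, horizon=6)
hist, fut = torch.randn(4, 30, 5), torch.randn(4, 6, 5)
model.elbo(hist, fut).backward()
print(model.sample(hist).shape)                                           # (100, 4, 6, 5)
```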
arXiv Detail & Related papers (2021-01-31T11:00:55Z)
- Learning Interpretable Deep State Space Model for Probabilistic Time Series Forecasting [98.57851612518758]
Probabilistic time series forecasting involves estimating the distribution of future values based on the series' history.
We propose a deep state space model for probabilistic time series forecasting whereby the non-linear emission model and transition model are parameterized by networks.
We show in experiments that our model produces accurate and sharp probabilistic forecasts.
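A minimal sketch of a state space model whose transition and emission are parameterized by small networks, with probabilistic forecasts obtained from the spread of sampled latent paths; training (e.g., filtering or variational inference) is omitted, and all names and sizes are illustrative.

```python
# Nonlinear state space model with network-parameterized transition and emission (sketch).
import torch
import torch.nn as nn

class DeepStateSpace(nn.Module):
    def __init__(self, obs_dim, state_dim=4, hidden=32):
        super().__init__()
        # Transition: z_t = f(z_{t-1}) + process noise
        self.f = nn.Sequential(nn.Linear(state_dim, hidden), nn.Tanh(),
                               nn.Linear(hidden, state_dim))
        # Emission: x_t ~ N(g(z_t), sigma^2)
        self.g = nn.Sequential(nn.Linear(state_dim, hidden), nn.Tanh(),
                               nn.Linear(hidden, obs_dim))
        self.log_proc_std = nn.Parameter(torch.zeros(state_dim))
        self.log_obs_std = nn.Parameter(torch.zeros(obs_dim))
        self.state_dim = state_dim

    @torch.no_grad()
    def sample_paths(self, horizon, n_paths=200):
        # Roll the latent state forward and emit observations; the spread of the
        # sampled paths gives the probabilistic forecast.
        z = torch.zeros(n_paths, self.state_dim)
        xs = []
        for _ in range(horizon):
            z = self.f(z) + torch.randn_like(z) * self.log_proc_std.exp()
            x = self.g(z) + torch.randn(n_paths, self.log_obs_std.numel()) * self.log_obs_std.exp()
            xs.append(x)
        return torch.stack(xs, dim=1)            # (n_paths, horizon, obs_dim)

model = DeepStateSpace(obs_dim=3)
paths = model.sample_paths(horizon=12)
print(paths.shape, paths.std(dim=0).mean())      # uncertainty from the path spread
```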
arXiv Detail & Related papers (2021-01-31T06:49:33Z)
- Ambiguity in Sequential Data: Predicting Uncertain Futures with Recurrent Models [110.82452096672182]
We propose an extension of the Multiple Hypothesis Prediction (MHP) model to handle ambiguous predictions with sequential data.
We also introduce a novel metric for ambiguous problems, which is better suited to account for uncertainties.
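A small sketch of the multiple-hypothesis idea with a winner-takes-all style loss: several output heads propose futures and only the closest one receives the full gradient. The GRU encoder, number of heads, and loss weighting are assumptions for illustration, not the paper's exact formulation.

```python
# Multiple-hypothesis prediction with a winner-takes-all loss (illustrative).
import torch
import torch.nn as nn

class MultiHypothesisForecaster(nn.Module):
    def __init__(self, n_series, n_hypotheses=5, hidden=64):
        super().__init__()
        self.encoder = nn.GRU(n_series, hidden, batch_first=True)
        # One output head per hypothesis about the uncertain future.
        self.heads = nn.ModuleList(nn.Linear(hidden, n_series) for _ in range(n_hypotheses))

    def forward(self, history):
        _, h = self.encoder(history)
        return torch.stack([head(h[-1]) for head in self.heads], dim=1)   # (B, K, n_series)

def winner_takes_all_loss(hypotheses, target, eps=0.05):
    # Only the closest hypothesis receives the full gradient; the rest get a small
    # share (eps) so unused heads do not collapse.
    errors = ((hypotheses - target.unsqueeze(1)) ** 2).mean(-1)           # (B, K)
    best = errors.min(dim=1).values
    return ((1 - eps) * best + eps * errors.mean(dim=1)).mean()

model = MultiHypothesisForecaster(n_series=2)
hyp = model(torch.randn(8, 20, 2))
winner_takes_all_loss(hyp, torch.randn(8, 2)).backward()
```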
arXiv Detail & Related papers (2020-03-10T09:15:42Z)
- Multivariate Probabilistic Time Series Forecasting via Conditioned Normalizing Flows [8.859284959951204]
Time series forecasting is fundamental to scientific and engineering problems.
Deep learning methods are well suited for this problem.
We show that the proposed conditioned normalizing flow model improves over the state of the art for standard metrics on many real-world data sets.
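A minimal sketch of a normalizing flow over the next multivariate observation, conditioned on an RNN summary of the history; a single affine coupling layer is used here for brevity, whereas the paper's flow is deeper and its conditioning richer.

```python
# Conditioned normalizing flow over the next observation (one coupling layer, sketch).
import math
import torch
import torch.nn as nn

class ConditionalAffineCoupling(nn.Module):
    def __init__(self, dim, cond_dim, hidden=64):
        super().__init__()
        self.half = dim // 2
        self.net = nn.Sequential(
            nn.Linear(self.half + cond_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * (dim - self.half)))

    def forward(self, x, cond):
        x1, x2 = x[:, :self.half], x[:, self.half:]
        scale, shift = self.net(torch.cat([x1, cond], -1)).chunk(2, -1)
        scale = torch.tanh(scale)                         # keep the transform stable
        z = torch.cat([x1, x2 * scale.exp() + shift], -1)
        return z, scale.sum(-1)                           # transformed sample, log|det J|

class FlowForecaster(nn.Module):
    def __init__(self, n_series, hidden=64):
        super().__init__()
        self.rnn = nn.GRU(n_series, hidden, batch_first=True)
        self.coupling = ConditionalAffineCoupling(n_series, hidden)

    def log_prob(self, history, target):
        _, h = self.rnn(history)
        z, log_det = self.coupling(target, h[-1])
        base = -0.5 * (z ** 2).sum(-1) - 0.5 * z.size(-1) * math.log(2 * math.pi)
        return base + log_det                             # change-of-variables formula

model = FlowForecaster(n_series=6)
nll = -model.log_prob(torch.randn(4, 40, 6), torch.randn(4, 6)).mean()
nll.backward()
```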
arXiv Detail & Related papers (2020-02-14T16:16:51Z)
This list is automatically generated from the titles and abstracts of the papers on this site.