Probabilistic Decomposition Transformer for Time Series Forecasting
- URL: http://arxiv.org/abs/2210.17393v1
- Date: Mon, 31 Oct 2022 15:22:50 GMT
- Title: Probabilistic Decomposition Transformer for Time Series Forecasting
- Authors: Junlong Tong, Liping Xie, Wankou Yang, Kanjian Zhang
- Abstract summary: We propose a probabilistic decomposition Transformer model that combines the Transformer with a conditional generative model.
The Transformer is employed to learn temporal patterns and implement primary probabilistic forecasts.
The conditional generative model is used to achieve non-autoregressive hierarchical probabilistic forecasts.
- Score: 13.472690692157164
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Time series forecasting is crucial for many fields, such as disaster warning,
weather prediction, and energy consumption. The Transformer-based models are
considered to have revolutionized the field of sequence modeling. However, the
complex temporal patterns of the time series hinder the model from mining
reliable temporal dependencies. Furthermore, the autoregressive form of the
Transformer introduces cumulative errors in the inference step. In this paper,
we propose the probabilistic decomposition Transformer model that combines the
Transformer with a conditional generative model, which provides hierarchical
and interpretable probabilistic forecasts for intricate time series. The
Transformer is employed to learn temporal patterns and implement primary
probabilistic forecasts, while the conditional generative model is used to
achieve non-autoregressive hierarchical probabilistic forecasts by introducing
latent space feature representations. In addition, the conditional generative
model reconstructs typical features of the series, such as seasonality and
trend terms, from probability distributions in the latent space to enable
complex pattern separation and provide interpretable forecasts. Extensive
experiments on several datasets demonstrate the effectiveness and robustness of
the proposed model, indicating that it compares favorably with the state of the
art.
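The pipeline the abstract describes (separate a series into interpretable components such as trend and seasonality, then produce non-autoregressive probabilistic forecasts for the whole horizon at once) can be illustrated with a minimal NumPy sketch. The toy series, the fixed period, and the simple trend/seasonality estimators below are illustrative assumptions, not the paper's Transformer or conditional generative model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy series: linear trend + daily seasonality + noise (hypothetical data).
t = np.arange(200)
series = 0.05 * t + 2.0 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 0.5, size=t.size)

# 1) Interpretable decomposition: fit a linear trend, then average the
#    detrended values over the known period to estimate seasonality.
period = 24
trend_coef = np.polyfit(t, series, 1)
trend = np.polyval(trend_coef, t)
detrended = series - trend
seasonal = np.array([detrended[p::period].mean() for p in range(period)])
residual = detrended - seasonal[t % period]

# 2) Non-autoregressive probabilistic forecast: extrapolate each component
#    for the whole horizon in one shot and add Monte-Carlo residual noise.
horizon = 48
t_future = np.arange(t.size, t.size + horizon)
point = np.polyval(trend_coef, t_future) + seasonal[t_future % period]
samples = point + rng.normal(0, residual.std(), size=(1000, horizon))

# Quantile forecasts give calibrated uncertainty intervals per step.
q10, q50, q90 = np.quantile(samples, [0.1, 0.5, 0.9], axis=0)
print(q50.shape)  # (48,)
```

Because every horizon step is sampled jointly rather than fed back through the model, no error accumulates across steps, which is the motivation the abstract gives for the non-autoregressive design.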
Related papers
- UniTST: Effectively Modeling Inter-Series and Intra-Series Dependencies for Multivariate Time Series Forecasting [98.12558945781693]
We propose a transformer-based model UniTST containing a unified attention mechanism on the flattened patch tokens.
Although our proposed model employs a simple architecture, it offers compelling performance as shown in our experiments on several datasets for time series forecasting.
arXiv Detail & Related papers (2024-06-07T14:39:28Z)
- Towards Long-Term Time-Series Forecasting: Feature, Pattern, and Distribution [57.71199089609161]
Long-term time-series forecasting (LTTF) has become a pressing demand in many applications, such as wind power supply planning.
Transformer models have been adopted to deliver high prediction capacity, though the self-attention mechanism is computationally expensive.
We propose an efficient Transformer-based model, named Conformer, which differentiates itself from existing methods for LTTF in three aspects.
arXiv Detail & Related papers (2023-01-05T13:59:29Z)
- TACTiS: Transformer-Attentional Copulas for Time Series [76.71406465526454]
The estimation of time-varying quantities is a fundamental component of decision making in fields such as healthcare and finance.
We propose a versatile method that estimates joint distributions using an attention-based decoder.
We show that our model produces state-of-the-art predictions on several real-world datasets.
arXiv Detail & Related papers (2022-02-07T21:37:29Z)
- Transforming Autoregression: Interpretable and Expressive Time Series Forecast [0.0]
We propose Autoregressive Transformation Models (ATMs), a model class inspired by various research directions.
ATMs unite expressive distributional forecasts using a semi-parametric distribution assumption with an interpretable model specification.
We demonstrate the properties of ATMs both theoretically and through empirical evaluation on several simulated and real-world forecasting datasets.
arXiv Detail & Related papers (2021-10-15T17:58:49Z)
- Flow-based Spatio-Temporal Structured Prediction of Motion Dynamics [21.24885597341643]
Conditional Normalizing Flows (CNFs) are flexible generative models capable of representing complicated distributions with high dimensionality and interdimensional correlations.
We propose MotionFlow, a novel approach that autoregressively normalizes the output conditioned on the temporal input features.
We apply our method to different tasks, including motion prediction, time series forecasting, and binary segmentation.
arXiv Detail & Related papers (2021-04-09T14:30:35Z)
- Anomaly Detection of Time Series with Smoothness-Inducing Sequential Variational Auto-Encoder [59.69303945834122]
We present a Smoothness-Inducing Sequential Variational Auto-Encoder (SISVAE) model for robust estimation and anomaly detection of time series.
Our model parameterizes mean and variance for each time-stamp with flexible neural networks.
We show the effectiveness of our model on both synthetic datasets and public real-world benchmarks.
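The core idea, a per-time-stamp mean and variance with likelihood-based anomaly scoring, can be sketched in NumPy. The rolling-window Gaussian fit below is a simple stand-in for SISVAE's neural-network parameterization, and the series with an injected spike is hypothetical data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy series with one injected anomaly (hypothetical data).
x = rng.normal(0, 1, 300)
x[150] += 12.0

# Stand-in for the learned per-time-stamp mean/variance: a rolling Gaussian fit.
win = 20
mu = np.convolve(x, np.ones(win) / win, mode="same")
resid = x - mu
sigma = np.sqrt(np.convolve(resid**2, np.ones(win) / win, mode="same")) + 1e-6

# Anomaly score = per-time-stamp negative log-likelihood under N(mu, sigma^2);
# the injected spike should dominate the score.
nll = 0.5 * np.log(2 * np.pi * sigma**2) + 0.5 * ((x - mu) / sigma) ** 2
print(int(np.argmax(nll)))
```

Scoring each point by its likelihood under the fitted distribution, rather than by raw residual size, is what lets the variance term absorb locally noisy regions without flagging them.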
arXiv Detail & Related papers (2021-02-02T06:15:15Z)
- Synergetic Learning of Heterogeneous Temporal Sequences for Multi-Horizon Probabilistic Forecasting [48.8617204809538]
We propose Variational Synergetic Multi-Horizon Network (VSMHN), a novel deep conditional generative model.
To learn complex correlations across heterogeneous sequences, a tailored encoder is devised to combine the advances in deep point process models and variational recurrent neural networks.
Our model can be trained effectively using variational inference and generates predictions with Monte-Carlo simulation.
arXiv Detail & Related papers (2021-01-31T11:00:55Z)
- Learning Interpretable Deep State Space Model for Probabilistic Time Series Forecasting [98.57851612518758]
Probabilistic time series forecasting involves estimating the distribution of a series' future values based on its history.
We propose a deep state space model for probabilistic time series forecasting in which the non-linear emission and transition models are parameterized by neural networks.
We show in experiments that our model produces accurate and sharp probabilistic forecasts.
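A state space forecast of this kind can be illustrated with a linear-Gaussian local-level model: the classical Kalman recursions below are a simple stand-in for the paper's network-parameterized emission and transition models, and the random-walk toy data is an illustrative assumption:

```python
import numpy as np

# Local-level state space model:
#   z_t = z_{t-1} + w_t,  w_t ~ N(0, q)   (transition)
#   x_t = z_t + v_t,      v_t ~ N(0, r)   (emission)
def kalman_forecast(x, q=0.1, r=1.0, horizon=10):
    """Filter the series, then roll the latent state forward to get a
    predictive mean and variance for each future step."""
    mean, var = 0.0, 1.0
    for obs in x:
        var += q                        # predict step: state uncertainty grows
        gain = var / (var + r)          # update step: weigh the new observation
        mean += gain * (obs - mean)
        var *= (1 - gain)
    # Forecast: the level stays put; uncertainty grows with the horizon.
    means = np.full(horizon, mean)
    vars_ = var + q * np.arange(1, horizon + 1) + r
    return means, vars_

rng = np.random.default_rng(2)
x = np.cumsum(rng.normal(0, 0.3, 100)) + rng.normal(0, 1.0, 100)
m, v = kalman_forecast(x, horizon=10)
print(m.shape, v.shape)  # (10,) (10,)
```

The paper's contribution is to replace these fixed linear maps with learned non-linear networks while keeping the same predict/update structure, which is what makes the latent states interpretable.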
arXiv Detail & Related papers (2021-01-31T06:49:33Z)
- Temporal Latent Auto-Encoder: A Method for Probabilistic Multivariate Time Series Forecasting [4.131842516813833]
We introduce a novel temporal latent auto-encoder method which enables nonlinear factorization of time series.
By imposing a probabilistic latent space model, complex distributions of the input series are modeled via the decoder.
Our model achieves state-of-the-art performance on many popular multivariate datasets, with gains sometimes as high as 50% for several standard metrics.
arXiv Detail & Related papers (2021-01-25T22:29:40Z)
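The factorization idea behind this last entry can be illustrated with a linear analogue: encode a multivariate panel into a few latent trajectories, forecast in latent space, and decode back to all series at once. The SVD encoder, persistence-of-trend forecaster, and toy data below are illustrative assumptions, not the paper's nonlinear auto-encoder:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy panel: 30 channels driven by 2 shared latent factors plus noise.
T, n_series, k = 120, 30, 2
factors = np.stack([np.sin(2 * np.pi * np.arange(T) / 12),
                    0.02 * np.arange(T)], axis=1)          # (T, k)
loadings = rng.normal(size=(k, n_series))                  # (k, n_series)
X = factors @ loadings + rng.normal(0, 0.1, size=(T, n_series))

# "Encode": project the panel onto its top-k principal directions.
U, S, Vt = np.linalg.svd(X, full_matrices=False)
latent = U[:, :k] * S[:k]      # (T, k) latent trajectories
decoder = Vt[:k]               # (k, n_series) linear decoder

# Forecast each latent trajectory with a persistence-of-trend rule,
# then "decode" back to all 30 series in one matrix product.
horizon = 12
step = latent[-1] - latent[-2]
latent_future = latent[-1] + np.outer(np.arange(1, horizon + 1), step)
X_future = latent_future @ decoder   # (12, 30) multivariate forecast
print(X_future.shape)  # (12, 30)
```

Forecasting k latent trajectories instead of n_series channels is what makes the factorized approach scale to large panels; the paper replaces the linear SVD with a learned nonlinear encoder/decoder and a probabilistic latent model.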
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it provides and is not responsible for any consequences of its use.