Considering Nonstationary within Multivariate Time Series with
Variational Hierarchical Transformer for Forecasting
- URL: http://arxiv.org/abs/2403.05406v1
- Date: Fri, 8 Mar 2024 16:04:36 GMT
- Title: Considering Nonstationary within Multivariate Time Series with
Variational Hierarchical Transformer for Forecasting
- Authors: Muyao Wang, Wenchao Chen, Bo Chen
- Abstract summary: We develop a powerful hierarchical probabilistic generative module to consider the non-stationarity and intrinsic characteristics within MTS.
We then combine it with a transformer to obtain a well-defined variational generative dynamic model named Hierarchical Time series Variational Transformer (HTV-Trans).
Being a powerful probabilistic model, HTV-Trans is utilized to learn expressive representations of MTS and applied to forecasting tasks.
- Score: 12.793705636683402
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The forecasting of Multivariate Time Series (MTS) has long been an important
but challenging task. Because of non-stationarity across long time horizons,
previous studies primarily adopt stationarization methods to attenuate the
non-stationarity of the original series for better predictability. However,
existing methods always operate on the stationarized series, ignoring the
inherent non-stationarity, and they struggle to model MTS with complex
distributions due to the lack of stochasticity. To
tackle these problems, we first develop a powerful hierarchical probabilistic
generative module to consider the non-stationarity and stochastic
characteristics within MTS, and then combine it with a transformer for a
well-defined variational generative dynamic model named Hierarchical Time
series Variational Transformer (HTV-Trans), which recovers the intrinsic
non-stationary information into temporal dependencies. Being a powerful
probabilistic model, HTV-Trans is utilized to learn expressive representations
of MTS and applied to forecasting tasks. Extensive experiments on diverse
datasets show the efficiency of HTV-Trans on MTS forecasting tasks.
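As a concrete reading of the abstract, here is a minimal, hypothetical PyTorch sketch: a two-level Gaussian latent module supplies stochastic, non-stationarity-aware features that are concatenated with the raw (non-stationarized) input before a standard Transformer encoder. All module names, sizes, and the two-level choice are my assumptions rather than the authors' code, and the ELBO training loss (reconstruction plus KL terms) is omitted.

```python
# Hypothetical sketch of the HTV-Trans idea, not the authors' code: a
# hierarchical Gaussian latent module adds stochasticity on top of the raw
# (non-stationarized) series before a standard Transformer encoder.
import torch
import torch.nn as nn

class HierarchicalLatent(nn.Module):
    """Two-level amortized Gaussian latents: z2 (top) conditions z1 (bottom)."""
    def __init__(self, d_in, d_z=32):
        super().__init__()
        self.enc2 = nn.Linear(d_in, 2 * d_z)         # top-level mean/log-variance
        self.enc1 = nn.Linear(d_in + d_z, 2 * d_z)   # bottom level, sees z2

    @staticmethod
    def reparam(stats):
        mu, logvar = stats.chunk(2, dim=-1)          # reparameterization trick
        return mu + torch.randn_like(mu) * (0.5 * logvar).exp()

    def forward(self, x):                            # x: (batch, time, d_in)
        z2 = self.reparam(self.enc2(x))
        z1 = self.reparam(self.enc1(torch.cat([x, z2], dim=-1)))
        return torch.cat([z1, z2], dim=-1)           # (batch, time, 2 * d_z)

class ToyHTVTrans(nn.Module):
    def __init__(self, n_vars, d_model=64, horizon=24):
        super().__init__()
        self.latent = HierarchicalLatent(n_vars)          # stochastic branch
        self.proj = nn.Linear(n_vars + 64, d_model)       # 64 = 2 * d_z
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(d_model, n_vars)
        self.horizon = horizon

    def forward(self, x):                                 # x: (batch, time, n_vars)
        h = self.encoder(self.proj(torch.cat([x, self.latent(x)], dim=-1)))
        return self.head(h[:, -self.horizon:, :])         # last `horizon` steps

x = torch.randn(8, 96, 7)                  # toy batch: 96 steps, 7 variables
print(ToyHTVTrans(n_vars=7)(x).shape)      # torch.Size([8, 24, 7])
```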
Related papers
- LSEAttention is All You Need for Time Series Forecasting [0.0]
Transformer-based architectures have achieved remarkable success in natural language processing and computer vision.
I introduce LSEAttention, an approach designed to address entropy collapse and training instability commonly observed in transformer models (a log-sum-exp attention sketch follows this entry).
arXiv Detail & Related papers (2024-10-31T09:09:39Z)
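The summary names log-sum-exp (LSE) but not the paper's exact formulation, so the following is only the standard max-stable log-sum-exp softmax applied to attention scores, the usual guard against the numerical overflow that feeds training instability; it is not claimed to be the paper's method.

```python
# Hedged sketch: attention with a log-sum-exp (max-stable) softmax.
# softmax(s) = exp(s - logsumexp(s)); torch.logsumexp shifts by the row
# max internally, so exp() can neither overflow nor produce NaN weights.
import torch

def lse_attention(q, k, v):
    # q, k, v: (batch, heads, time, d_head)
    scores = q @ k.transpose(-2, -1) / q.shape[-1] ** 0.5
    log_norm = torch.logsumexp(scores, dim=-1, keepdim=True)
    return (scores - log_norm).exp() @ v

q = k = v = torch.randn(2, 4, 16, 8)
print(lse_attention(q, k, v).shape)   # torch.Size([2, 4, 16, 8])
```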
- Timer-XL: Long-Context Transformers for Unified Time Series Forecasting [67.83502953961505]
We present Timer-XL, a generative Transformer for unified time series forecasting.
Timer-XL achieves state-of-the-art performance across challenging forecasting benchmarks through a unified approach.
arXiv Detail & Related papers (2024-10-07T07:27:39Z)
- Robust Multivariate Time Series Forecasting against Intra- and Inter-Series Transitional Shift [40.734564394464556]
We present a unified Probabilistic Graphical Model (JointPGM) that jointly captures intra-/inter-series correlations and models the time-variant transitional distribution.
We validate the effectiveness and efficiency of JointPGM through extensive experiments on six highly non-stationary MTS datasets.
arXiv Detail & Related papers (2024-07-18T06:16:03Z)
- UniTST: Effectively Modeling Inter-Series and Intra-Series Dependencies for Multivariate Time Series Forecasting [98.12558945781693]
We propose UniTST, a transformer-based model with a unified attention mechanism over flattened patch tokens (a flattened-patch sketch follows this entry).
Although our proposed model employs a simple architecture, it offers compelling performance as shown in our experiments on several datasets for time series forecasting.
arXiv Detail & Related papers (2024-06-07T14:39:28Z)
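To make "unified attention over flattened patch tokens" concrete, here is a hedged sketch: per-variate patches are flattened across the variate and patch axes into one token sequence, so a single attention layer sees intra- and inter-series relations at once. The patch size and widths are illustrative assumptions, not the paper's settings.

```python
# Hedged sketch of the "flattened patch token" idea: variate and patch
# axes are merged into one token sequence before a single attention layer.
import torch
import torch.nn as nn

batch, n_vars, seq_len, patch = 8, 7, 96, 16
x = torch.randn(batch, n_vars, seq_len)

tokens = x.unfold(-1, patch, patch)            # (batch, n_vars, n_patches, patch)
tokens = tokens.flatten(1, 2)                  # (batch, n_vars * n_patches, patch)

embed = nn.Linear(patch, 64)
layer = nn.TransformerEncoderLayer(64, nhead=4, batch_first=True)
out = layer(embed(tokens))                     # joint attention over all tokens
print(out.shape)                               # torch.Size([8, 42, 64])
```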
- A Poisson-Gamma Dynamic Factor Model with Time-Varying Transition Dynamics [51.147876395589925]
A non-stationary PGDS is proposed that allows the underlying transition matrices to evolve over time (a generative sketch follows this entry).
A fully-conjugate and efficient Gibbs sampler is developed to perform posterior simulation.
Experiments show that, in comparison with related models, the proposed non-stationary PGDS achieves improved predictive performance.
arXiv Detail & Related papers (2024-02-26T04:39:01Z)
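A hedged NumPy sketch of the generative process this entry describes: gamma-distributed latent factors evolve through a column-stochastic transition matrix that itself drifts over time, and observed counts are Poisson given the factors. Dimensions, the drift rule, and priors are my assumptions; the paper's fully-conjugate Gibbs sampler for inference is not reproduced.

```python
# Hypothetical generative sketch of a non-stationary Poisson-Gamma
# dynamical system (PGDS); inference (the Gibbs sampler) is omitted.
import numpy as np

rng = np.random.default_rng(0)
K, V, T, tau = 4, 10, 50, 5.0               # factors, dims, steps, concentration

Phi = rng.dirichlet(np.ones(V), size=K).T   # (V, K) loadings, columns on simplex
Pi = rng.dirichlet(np.ones(K), size=K).T    # (K, K) column-stochastic transitions
theta = rng.gamma(1.0, 1.0, size=K)         # initial latent factors
X = np.zeros((T, V), dtype=int)

for t in range(T):
    # "time-varying transition dynamics": let Pi drift, staying stochastic
    Pi = 0.95 * Pi + 0.05 * rng.dirichlet(np.ones(K), size=K).T
    # gamma with shape tau * (Pi @ theta) and scale 1/tau has mean Pi @ theta
    theta = rng.gamma(tau * (Pi @ theta), 1.0 / tau)
    X[t] = rng.poisson(Phi @ theta)         # counts given the current factors

print(X.shape, X.sum())                     # (50, 10) and the total count
```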
- Multi-scale Transformer Pyramid Networks for Multivariate Time Series Forecasting [8.739572744117634]
We introduce a dimension-invariant embedding technique that captures short-term temporal dependencies.
We present a novel Multi-scale Transformer Pyramid Network (MTPNet) specifically designed to capture temporal dependencies at multiple unconstrained scales (a multi-scale patching sketch follows this entry).
arXiv Detail & Related papers (2023-08-23T06:40:05Z)
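As one reading of the multi-scale idea, here is a hedged sketch in which the same window is patched at several scales and each scale is embedded into its own token sequence, forming the inputs to a pyramid of encoders. Only the patching and embedding are shown, and the scale choices are assumptions.

```python
# Hedged sketch of multi-scale patch embedding for a transformer pyramid:
# each scale produces its own, shorter token sequence from the same window.
import torch
import torch.nn as nn

x = torch.randn(8, 96, 7)                        # (batch, time, n_vars)
scales = (4, 8, 16)
embeds = nn.ModuleList(nn.Linear(p * 7, 64) for p in scales)

pyramid = []
for p, embed in zip(scales, embeds):
    patches = x.unfold(1, p, p)                  # (batch, 96 // p, n_vars, p)
    pyramid.append(embed(patches.flatten(2)))    # (batch, 96 // p, 64)

for tokens in pyramid:
    print(tokens.shape)   # 24, 12, and 6 tokens for scales 4, 8, 16
```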
- FormerTime: Hierarchical Multi-Scale Representations for Multivariate Time Series Classification [53.55504611255664]
FormerTime is a hierarchical representation model for improving classification capacity on the multivariate time series classification task.
It exhibits three merits: (1) learning hierarchical multi-scale representations from time series data, (2) inheriting the strengths of both transformers and convolutional networks, and (3) tackling the efficiency challenges incurred by the self-attention mechanism (a conv-plus-attention sketch follows this entry).
arXiv Detail & Related papers (2023-02-20T07:46:14Z)
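A hedged sketch of merit (2) above: a stage that pairs a strided convolution (downsampling, local patterns) with a self-attention layer (global mixing); stacking stages yields the hierarchical multi-scale representations of merit (1). Channel widths and depths are my assumptions.

```python
# Hedged conv-plus-transformer stage: each stage halves the time axis with
# a strided 1-D convolution before applying self-attention.
import torch
import torch.nn as nn

class Stage(nn.Module):
    def __init__(self, d_in, d_out):
        super().__init__()
        self.down = nn.Conv1d(d_in, d_out, kernel_size=3, stride=2, padding=1)
        self.attn = nn.TransformerEncoderLayer(d_out, nhead=4, batch_first=True)

    def forward(self, x):                                 # x: (batch, time, d_in)
        x = self.down(x.transpose(1, 2)).transpose(1, 2)  # halve the time axis
        return self.attn(x)

x = torch.randn(8, 96, 7)
stages = nn.Sequential(Stage(7, 64), Stage(64, 128))
print(stages(x).shape)   # torch.Size([8, 24, 128]) -- two 2x reductions
```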
- Towards Long-Term Time-Series Forecasting: Feature, Pattern, and Distribution [57.71199089609161]
Long-term time-series forecasting (LTTF) has become a pressing demand in many applications, such as wind power supply planning.
Transformer models have been adopted to deliver high prediction capacity owing to their high-capacity self-attention mechanism.
We propose an efficient Transformer-based model, named Conformer, which differentiates itself from existing LTTF methods in three aspects.
arXiv Detail & Related papers (2023-01-05T13:59:29Z)
- Distributional Drift Adaptation with Temporal Conditional Variational Autoencoder for Multivariate Time Series Forecasting [41.206310481507565]
We propose a novel framework, the temporal conditional variational autoencoder (TCVAE), to model dynamic distributional dependencies over time (a conditional-VAE sketch follows this entry).
The TCVAE infers the dependencies as a temporal conditional distribution to leverage latent variables.
We show the TCVAE's superior robustness and effectiveness over the state-of-the-art MTS forecasting baselines.
arXiv Detail & Related papers (2022-09-01T10:06:22Z)
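A hedged sketch of one temporal conditional VAE step: a GRU summarizes the observed history, the latent is inferred conditioned on that summary so the learned distribution can drift with time, and the decoder emits the forecast window. The returned KL term would enter an ELBO loss; the GRU summarizer and all sizes are my assumptions.

```python
# Hypothetical temporal conditional VAE step: q(z | history) and
# p(future | z, history), with the history summarized by a GRU.
import torch
import torch.nn as nn

class ToyTCVAE(nn.Module):
    def __init__(self, n_vars, d_h=64, d_z=16, horizon=24):
        super().__init__()
        self.hist = nn.GRU(n_vars, d_h, batch_first=True)
        self.to_stats = nn.Linear(d_h, 2 * d_z)           # q(z | history)
        self.dec = nn.Linear(d_h + d_z, horizon * n_vars) # p(future | z, history)
        self.horizon, self.n_vars = horizon, n_vars

    def forward(self, x):                     # x: (batch, time, n_vars)
        _, h = self.hist(x)                   # h: (1, batch, d_h)
        h = h.squeeze(0)
        mu, logvar = self.to_stats(h).chunk(2, dim=-1)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()
        y = self.dec(torch.cat([h, z], dim=-1))
        # KL( N(mu, sigma^2) || N(0, 1) ), the regularizer in the ELBO
        kl = 0.5 * (mu.pow(2) + logvar.exp() - 1 - logvar).sum(-1).mean()
        return y.view(-1, self.horizon, self.n_vars), kl

y, kl = ToyTCVAE(n_vars=7)(torch.randn(8, 96, 7))
print(y.shape, float(kl))   # torch.Size([8, 24, 7]) and a scalar KL term
```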
- Non-stationary Transformers: Exploring the Stationarity in Time Series Forecasting [86.33543833145457]
We propose Non-stationary Transformers as a generic framework with two interdependent modules: Series Stationarization and De-stationary Attention.
Our framework consistently boosts mainstream Transformers by a large margin, reducing MSE by 49.43% on Transformer, 47.34% on Informer, and 46.89% on Reformer (a stationarization sketch follows this entry).
arXiv Detail & Related papers (2022-05-28T12:27:27Z)
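The Series Stationarization half is simple enough to sketch directly: normalize each input window by its own statistics before the forecaster, then restore them on the output. The wrapper below is a minimal, model-agnostic reading; De-stationary Attention, which re-injects these statistics inside the attention scores, is not shown.

```python
# Minimal Series Stationarization wrapper: normalize per window and
# per variable, forecast on the stationarized series, de-normalize.
import torch

def stationarized_forecast(model, x, eps=1e-5):
    # x: (batch, time, n_vars); per-window, per-variable statistics
    mu = x.mean(dim=1, keepdim=True)
    sigma = x.std(dim=1, keepdim=True) + eps
    y = model((x - mu) / sigma)        # forecaster sees a stationarized window
    return y * sigma + mu              # restore the removed statistics

x = torch.randn(8, 96, 7) * 10 + 3     # shifted, scaled "non-stationary" input
identity = lambda z: z[:, -24:, :]     # stand-in forecaster: echo last 24 steps
print(stationarized_forecast(identity, x).shape)   # torch.Size([8, 24, 7])
```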