Autoformer: Decomposition Transformers with Auto-Correlation for
Long-Term Series Forecasting
- URL: http://arxiv.org/abs/2106.13008v1
- Date: Thu, 24 Jun 2021 13:43:43 GMT
- Title: Autoformer: Decomposition Transformers with Auto-Correlation for
Long-Term Series Forecasting
- Authors: Haixu Wu, Jiehui Xu, Jianmin Wang, Mingsheng Long
- Abstract summary: Autoformer is a novel decomposition architecture with an Auto-Correlation mechanism.
In long-term forecasting, Autoformer yields state-of-the-art accuracy, with a 38% relative improvement on six benchmarks.
- Score: 68.86835407617778
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Extending the forecasting time is a critical demand for real applications,
such as extreme weather early warning and long-term energy consumption
planning. This paper studies the \textit{long-term forecasting} problem of time
series. Prior Transformer-based models adopt various self-attention mechanisms
to discover the long-range dependencies. However, intricate temporal patterns
of the long-term future prohibit the model from finding reliable dependencies.
Also, Transformers have to adopt the sparse versions of point-wise
self-attentions for long series efficiency, resulting in the information
utilization bottleneck. Towards these challenges, we propose Autoformer as a
novel decomposition architecture with an Auto-Correlation mechanism. We go
beyond the pre-processing convention of series decomposition and renovate it as
a basic inner block of deep models. This design empowers Autoformer with
progressive decomposition capacities for complex time series. Further, inspired
by the stochastic process theory, we design the Auto-Correlation mechanism
based on the series periodicity, which conducts the dependencies discovery and
representation aggregation at the sub-series level. Auto-Correlation
outperforms self-attention in both efficiency and accuracy. In long-term
forecasting, Autoformer yields state-of-the-art accuracy, with a 38% relative
improvement on six benchmarks, covering five practical applications: energy,
traffic, economics, weather and disease.
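To make the two mechanisms described in the abstract concrete, below is a minimal, hypothetical PyTorch sketch (not the authors' released implementation) of (i) a series-decomposition block that can serve as an inner building block, splitting a series into a moving-average trend and a seasonal residual, and (ii) an FFT-based Auto-Correlation step that scores lags and aggregates the top-k time-delayed sub-series. The kernel size, the fixed top-k, and the lag scores shared across batch and channels are simplifying assumptions.

```python
# Hypothetical sketch of Autoformer's two core ideas (simplified; not the
# official implementation).
import torch
import torch.nn.functional as F


def series_decomp(x: torch.Tensor, kernel_size: int = 25):
    """Split x of shape (batch, length, channels) into seasonal and trend parts.
    The trend is a moving average; the seasonal part is the residual."""
    pad = (kernel_size - 1) // 2
    # Repeat boundary values so the moving average keeps the original length.
    front = x[:, :1, :].repeat(1, pad, 1)
    back = x[:, -1:, :].repeat(1, kernel_size - 1 - pad, 1)
    padded = torch.cat([front, x, back], dim=1)
    trend = F.avg_pool1d(padded.transpose(1, 2), kernel_size, stride=1).transpose(1, 2)
    return x - trend, trend


def auto_correlation(q: torch.Tensor, v: torch.Tensor, top_k: int = 3):
    """Aggregate value sub-series at the lags with the highest autocorrelation.
    q, v: (batch, length, channels). The correlation is computed with an FFT,
    so this step costs O(L log L) per channel instead of O(L^2)."""
    length = q.size(1)
    q_fft = torch.fft.rfft(q, dim=1)
    v_fft = torch.fft.rfft(v, dim=1)
    corr = torch.fft.irfft(q_fft * torch.conj(v_fft), n=length, dim=1)  # (B, L, C)
    lag_scores = corr.mean(dim=(0, 2))            # one score per lag (assumption)
    weights, lags = torch.topk(lag_scores, top_k)
    weights = torch.softmax(weights, dim=0)
    # Time-delay aggregation: roll the values by each selected lag and mix them.
    out = torch.zeros_like(v)
    for w, lag in zip(weights, lags.tolist()):
        out = out + w * torch.roll(v, shifts=-lag, dims=1)
    return out


# Minimal usage: decompose, then aggregate the seasonal part by its dominant lags.
x = torch.randn(8, 96, 7)                         # (batch, length, channels)
seasonal, trend = series_decomp(x)
mixed = auto_correlation(seasonal, seasonal)
print(seasonal.shape, trend.shape, mixed.shape)
```

Stacking such decomposition blocks after each Auto-Correlation layer is what the abstract calls progressive decomposition: the trend is separated out repeatedly as the representation is refined, rather than once as pre-processing.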
Related papers
- Powerformer: A Transformer with Weighted Causal Attention for Time-series Forecasting [50.298817606660826]
We introduce Powerformer, a novel Transformer variant that replaces noncausal attention weights with causal weights that are reweighted according to a smooth heavy-tailed decay.
Our empirical results demonstrate that Powerformer achieves state-of-the-art accuracy on public time-series benchmarks.
Our analyses show that the model's locality bias is amplified during training, demonstrating an interplay between time-series data and power-law-based attention.
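(A rough, hypothetical sketch of this decay-weighted causal attention is given after the related-papers list below.)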
arXiv Detail & Related papers (2025-02-10T04:42:11Z)
- Timer-XL: Long-Context Transformers for Unified Time Series Forecasting [67.83502953961505]
We present Timer-XL, a generative Transformer for unified time series forecasting.
Timer-XL achieves state-of-the-art performance across challenging forecasting benchmarks through a unified approach.
arXiv Detail & Related papers (2024-10-07T07:27:39Z)
- Local Attention Mechanism: Boosting the Transformer Architecture for Long-Sequence Time Series Forecasting [8.841114905151152]
Local Attention Mechanism (LAM) is an efficient attention mechanism tailored for time series analysis.
LAM exploits the continuity properties of time series to reduce the number of attention scores computed.
We present an algorithm for implementing LAM in tensor algebra that runs in O(n log n) time and memory.
arXiv Detail & Related papers (2024-10-04T11:32:02Z)
- Multiscale Representation Enhanced Temporal Flow Fusion Model for Long-Term Workload Forecasting [19.426131129034115]
This paper proposes a novel framework leveraging self-supervised multiscale representation learning to capture both long-term and near-term workload patterns.
The long-term history is encoded through multiscale representations while the near-term observations are modeled via temporal flow fusion.
arXiv Detail & Related papers (2024-07-29T04:42:18Z)
- Rough Transformers for Continuous and Efficient Time-Series Modelling [46.58170057001437]
Time-series data in real-world medical settings typically exhibit long-range dependencies and are observed at non-uniform intervals.
We introduce the Rough Transformer, a variation of the Transformer model which operates on continuous-time representations of input sequences.
We find that Rough Transformers consistently outperform their vanilla attention counterparts while obtaining the benefits of Neural ODE-based models.
arXiv Detail & Related papers (2024-03-15T13:29:45Z)
- Towards Long-Term Time-Series Forecasting: Feature, Pattern, and Distribution [57.71199089609161]
Long-term time-series forecasting (LTTF) has become a pressing demand in many applications, such as wind power supply planning.
Transformer models have been adopted to deliver high prediction capacity, but their self-attention mechanism is computationally expensive.
We propose an efficient Transformer-based model, named Conformer, which differentiates itself from existing methods for LTTF in three aspects.
arXiv Detail & Related papers (2023-01-05T13:59:29Z)
- NAST: Non-Autoregressive Spatial-Temporal Transformer for Time Series Forecasting [24.510978166050293]
This work is the first attempt to propose a Non-Autoregressive Transformer architecture for time series forecasting.
We present a novel spatial-temporal attention mechanism, building a bridge by a learned temporal influence map to fill the gaps between the spatial and temporal attention.
arXiv Detail & Related papers (2021-02-10T18:36:11Z)
- Spatial-Temporal Transformer Networks for Traffic Flow Forecasting [74.76852538940746]
We propose a novel paradigm of Spatial-Temporal Transformer Networks (STTNs) to improve the accuracy of long-term traffic forecasting.
Specifically, we present a new variant of graph neural networks, named spatial transformer, by dynamically modeling directed spatial dependencies.
The proposed model enables fast and scalable training over long-range spatial-temporal dependencies.
arXiv Detail & Related papers (2020-01-09T10:21:04Z)
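As referenced in the Powerformer entry above, one way to read "causal weights reweighted according to a smooth heavy-tailed decay" is to damp each attention weight by a power-law function of the query-key lag before renormalizing. The sketch below is a hedged interpretation under that assumption; the decay form (1 + lag)^(-alpha) and the single-head shapes are illustrative choices, not the paper's exact formulation.

```python
# Hypothetical sketch: causal attention damped by a power-law decay in the lag.
import torch


def decayed_causal_attention(q, k, v, alpha: float = 1.0):
    """q, k, v: (batch, length, dim) -> (batch, length, dim)."""
    _, n, d = q.shape
    scores = q @ k.transpose(1, 2) / d ** 0.5       # (B, L, L) dot-product scores
    idx = torch.arange(n)
    lag = idx.view(-1, 1) - idx.view(1, -1)          # query index minus key index
    # Additive log-domain bias: log((1 + lag)^-alpha) for causal positions,
    # -inf for future positions, so softmax yields decay-weighted causal attention.
    bias = torch.where(
        lag >= 0,
        -alpha * torch.log1p(lag.clamp(min=0).float()),
        torch.full((n, n), float("-inf")),
    )
    weights = torch.softmax(scores + bias, dim=-1)
    return weights @ v


x = torch.randn(4, 32, 16)
out = decayed_causal_attention(x, x, x, alpha=0.5)
print(out.shape)  # torch.Size([4, 32, 16])
```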