Autoformer: Decomposition Transformers with Auto-Correlation for
Long-Term Series Forecasting
- URL: http://arxiv.org/abs/2106.13008v1
- Date: Thu, 24 Jun 2021 13:43:43 GMT
- Title: Autoformer: Decomposition Transformers with Auto-Correlation for
Long-Term Series Forecasting
- Authors: Haixu Wu, Jiehui Xu, Jianmin Wang, Mingsheng Long
- Abstract summary: Autoformer is a novel decomposition architecture with an Auto-Correlation mechanism.
In long-term forecasting, Autoformer yields state-of-the-art accuracy, with a 38% relative improvement on six benchmarks.
- Score: 68.86835407617778
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Extending the forecasting horizon is a critical demand for real applications,
such as extreme weather early warning and long-term energy consumption
planning. This paper studies the \textit{long-term forecasting} problem of time
series. Prior Transformer-based models adopt various self-attention mechanisms
to discover long-range dependencies. However, the intricate temporal patterns
of the long-term future prevent such models from finding reliable dependencies.
Moreover, Transformers have to adopt sparse versions of point-wise
self-attention for long-series efficiency, which creates an information-utilization
bottleneck. To address these challenges, we propose Autoformer, a
novel decomposition architecture with an Auto-Correlation mechanism. We go
beyond the pre-processing convention of series decomposition and renovate it as
a basic inner block of deep models. This design empowers Autoformer with
progressive decomposition capacities for complex time series. Further, inspired
by stochastic process theory, we design the Auto-Correlation mechanism
based on series periodicity, which conducts dependency discovery and
representation aggregation at the sub-series level. Auto-Correlation
outperforms self-attention in both efficiency and accuracy. In long-term
forecasting, Autoformer yields state-of-the-art accuracy, with a 38% relative
improvement on six benchmarks covering five practical applications: energy,
traffic, economics, weather and disease.
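To make the abstract's two core ideas concrete, below is a minimal PyTorch sketch of a series-decomposition block used as an inner layer of the network, as the abstract describes: a moving average extracts the trend and the residual is taken as the seasonal part. The class name, tensor layout, and default kernel size are illustrative assumptions, not the paper's released code.

```python
import torch
import torch.nn as nn


class SeriesDecomp(nn.Module):
    """Split a series into seasonal and trend parts via a moving average.

    Illustrative sketch of an inner decomposition block; kernel_size is
    an assumed hyperparameter, not a value taken from the paper.
    """

    def __init__(self, kernel_size: int = 25):
        super().__init__()
        self.kernel_size = kernel_size
        self.avg = nn.AvgPool1d(kernel_size=kernel_size, stride=1)

    def forward(self, x: torch.Tensor):
        # x: (batch, length, channels)
        # Repeat the boundary steps so the moving average preserves length.
        front = x[:, :1, :].repeat(1, (self.kernel_size - 1) // 2, 1)
        end = x[:, -1:, :].repeat(1, self.kernel_size // 2, 1)
        padded = torch.cat([front, x, end], dim=1)
        # AvgPool1d pools over the last dim, so move time to the end.
        trend = self.avg(padded.permute(0, 2, 1)).permute(0, 2, 1)
        seasonal = x - trend  # residual after removing the trend
        return seasonal, trend
```

And a simplified, single-head sketch of the Auto-Correlation idea: lag scores come from the FFT-based autocorrelation (the Wiener-Khinchin relation), only the top c·log(L) lags are kept, and the values rolled by each selected lag are aggregated with softmax weights. The signature and the `factor` hyperparameter are assumptions for illustration; the model in the paper applies this per attention head with learned projections, and the explicit loop below favors readability over speed.

```python
import math

import torch


def auto_correlation(q: torch.Tensor, k: torch.Tensor, v: torch.Tensor,
                     factor: int = 2) -> torch.Tensor:
    """Sub-series level aggregation via FFT-based autocorrelation (sketch).

    q, k, v: (batch, length, channels), all of the same length.
    """
    length = q.size(1)
    # Autocorrelation as the inverse FFT of the cross power spectrum.
    q_fft = torch.fft.rfft(q, dim=1)
    k_fft = torch.fft.rfft(k, dim=1)
    corr = torch.fft.irfft(q_fft * torch.conj(k_fft), n=length, dim=1)

    # Score each lag by its mean correlation; keep the top c * log(L) lags.
    top_k = max(1, int(factor * math.log(length)))
    mean_corr = corr.mean(dim=-1)                     # (batch, length)
    weights, delays = torch.topk(mean_corr, top_k, dim=1)
    weights = torch.softmax(weights, dim=1)           # (batch, top_k)

    # Aggregate the values rolled by each selected lag.
    out = torch.zeros_like(v)
    for b in range(v.size(0)):
        for i in range(top_k):
            rolled = torch.roll(v[b], shifts=-int(delays[b, i]), dims=0)
            out[b] = out[b] + weights[b, i] * rolled
    return out
```

A forecasting stack along these lines would interleave the decomposition block after each Auto-Correlation and feed-forward sub-layer, refining the trend progressively while the correlation-based aggregation operates on the seasonal part.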
Related papers
- Diffusion Auto-regressive Transformer for Effective Self-supervised Time Series Forecasting [47.58016750718323]
We propose a novel generative self-supervised method called TimeDART.
TimeDART captures both the global sequence dependence and local detail features within time series data.
Our code is publicly available at https://github.com/Melmaphother/TimeDART.
arXiv Detail & Related papers (2024-10-08T06:08:33Z)
- Local Attention Mechanism: Boosting the Transformer Architecture for Long-Sequence Time Series Forecasting [8.841114905151152]
Local Attention Mechanism (LAM) is an efficient attention mechanism tailored for time series analysis.
LAM exploits the continuity properties of time series to reduce the number of attention scores computed.
We present an algorithm implementing LAM in tensor algebra that runs in O(n log n) time and memory; a generic locality-masking sketch appears after this list.
arXiv Detail & Related papers (2024-10-04T11:32:02Z)
- Multiscale Representation Enhanced Temporal Flow Fusion Model for Long-Term Workload Forecasting [19.426131129034115]
This paper proposes a novel framework leveraging self-supervised multiscale representation learning to capture both long-term and near-term workload patterns.
The long-term history is encoded through multiscale representations while the near-term observations are modeled via temporal flow fusion.
arXiv Detail & Related papers (2024-07-29T04:42:18Z)
- Rough Transformers for Continuous and Efficient Time-Series Modelling [46.58170057001437]
Time-series data in real-world medical settings typically exhibit long-range dependencies and are observed at non-uniform intervals.
We introduce the Rough Transformer, a variation of the Transformer model which operates on continuous-time representations of input sequences.
We find that Rough Transformers consistently outperform their vanilla attention counterparts while obtaining the benefits of Neural ODE-based models.
arXiv Detail & Related papers (2024-03-15T13:29:45Z)
- Rethinking Urban Mobility Prediction: A Super-Multivariate Time Series Forecasting Approach [71.67506068703314]
Long-term urban mobility predictions play a crucial role in the effective management of urban facilities and services.
Traditionally, urban mobility data has been structured as videos, treating longitude and latitude as fundamental pixels.
In our research, we introduce a fresh perspective on urban mobility prediction.
Instead of oversimplifying urban mobility data as traditional video data, we regard it as a complex time series.
arXiv Detail & Related papers (2023-12-04T07:39:05Z)
- Towards Long-Term Time-Series Forecasting: Feature, Pattern, and Distribution [57.71199089609161]
Long-term time-series forecasting (LTTF) has become a pressing demand in many applications, such as wind power supply planning.
Transformer models have been adopted to deliver high prediction capacity, but their self-attention mechanism is computationally expensive for long series.
We propose an efficient Transformer-based model, named Conformer, which differentiates itself from existing methods for LTTF in three aspects.
arXiv Detail & Related papers (2023-01-05T13:59:29Z)
- NAST: Non-Autoregressive Spatial-Temporal Transformer for Time Series Forecasting [24.510978166050293]
This work is the first attempt to propose a Non-Autoregressive Transformer architecture for time series forecasting.
We present a novel spatial-temporal attention mechanism that builds a bridge between spatial and temporal attention via a learned temporal influence map.
arXiv Detail & Related papers (2021-02-10T18:36:11Z)
- Spatial-Temporal Transformer Networks for Traffic Flow Forecasting [74.76852538940746]
We propose a novel paradigm of Spatial-Temporal Transformer Networks (STTNs) to improve the accuracy of long-term traffic forecasting.
Specifically, we present a new variant of graph neural networks, named spatial transformer, by dynamically modeling directed spatial dependencies.
The proposed model enables fast and scalable training over long-range spatial-temporal dependencies.
arXiv Detail & Related papers (2020-01-09T10:21:04Z)
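As a point of comparison for the Local Attention Mechanism (LAM) entry above, the sketch below shows a generic banded local-attention layer in PyTorch: each time step attends only to neighbors within a fixed window, which is the locality idea that entry refers to. This is not LAM's actual algorithm (the linked paper gives the O(n log n) tensor-algebra formulation); the function name and `window` parameter are illustrative.

```python
import torch


def local_attention(q: torch.Tensor, k: torch.Tensor, v: torch.Tensor,
                    window: int = 16) -> torch.Tensor:
    """Banded (local) attention: each time step attends only to positions
    within `window` steps of itself. Illustrative baseline, not LAM.

    q, k, v: (batch, length, dim)
    """
    length, dim = q.size(1), q.size(-1)
    scores = q @ k.transpose(-2, -1) / dim ** 0.5     # (batch, L, L)
    # Mask out query-key pairs farther apart than `window`.
    idx = torch.arange(length, device=q.device)
    band = (idx[None, :] - idx[:, None]).abs() <= window
    scores = scores.masked_fill(~band, float("-inf"))
    return torch.softmax(scores, dim=-1) @ v
```

Note that this naive version still materializes the full L×L score matrix before masking; an efficient implementation would compute only the in-band scores to realize the sub-quadratic cost.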
This list is automatically generated from the titles and abstracts of the papers on this site.