Transformers in Time Series: A Survey
- URL: http://arxiv.org/abs/2202.07125v5
- Date: Thu, 11 May 2023 21:47:52 GMT
- Title: Transformers in Time Series: A Survey
- Authors: Qingsong Wen, Tian Zhou, Chaoli Zhang, Weiqi Chen, Ziqing Ma, Junchi Yan, Liang Sun
- Abstract summary: We systematically review Transformer schemes for time series modeling by highlighting their strengths as well as limitations.
From the perspective of network structure, we summarize the adaptations and modifications that have been made to Transformers.
From the perspective of applications, we categorize time series Transformers based on common tasks including forecasting, anomaly detection, and classification.
- Score: 66.50847574634726
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Transformers have achieved superior performance in many tasks in natural
language processing and computer vision, which also triggered great interest in
the time series community. Among multiple advantages of Transformers, the
ability to capture long-range dependencies and interactions is especially
attractive for time series modeling, leading to exciting progress in various
time series applications. In this paper, we systematically review Transformer
schemes for time series modeling by highlighting their strengths as well as
limitations. In particular, we examine the development of time series
Transformers from two perspectives. From the perspective of network structure, we
summarize the adaptations and modifications that have been made to Transformers
in order to accommodate the challenges in time series analysis. From the
perspective of applications, we categorize time series Transformers based on
common tasks including forecasting, anomaly detection, and classification.
Empirically, we perform robustness analysis, model size analysis, and
seasonal-trend decomposition analysis to study how Transformers perform in time
series. Finally, we discuss and suggest future directions to provide useful
research guidance. To the best of our knowledge, this paper is the first work
to comprehensively and systematically summarize the recent advances of
Transformers for modeling time series data. We hope this survey will ignite
further research interest in time series Transformers.
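The seasonal-trend decomposition analysis mentioned in the abstract separates a series into trend, seasonal, and residual components so that model behaviour can be inspected per component. A minimal numpy sketch of a classical additive decomposition (the helper additive_decompose and the toy series are illustrative assumptions, not code from the survey):

```python
import numpy as np

def additive_decompose(x: np.ndarray, period: int):
    """Classical additive decomposition: x = trend + seasonal + residual."""
    n = len(x)
    # Trend: moving average over one full seasonal cycle (approximate at the edges).
    trend = np.convolve(x, np.ones(period) / period, mode="same")
    # Seasonal: average the detrended values at each phase of the cycle,
    # then center the pattern so it sums to zero over one period.
    detrended = x - trend
    pattern = np.array([detrended[i::period].mean() for i in range(period)])
    pattern -= pattern.mean()
    seasonal = np.resize(pattern, n)
    # Residual: whatever trend and seasonality do not explain.
    residual = x - trend - seasonal
    return trend, seasonal, residual

# Toy example: upward drift plus a daily cycle of 24 steps plus noise.
t = np.arange(24 * 30)
series = 0.05 * t + 3.0 * np.sin(2 * np.pi * t / 24) + np.random.randn(t.size)
trend, seasonal, residual = additive_decompose(series, period=24)
```

Decomposition-based Transformers covered by the survey (e.g., Autoformer and FEDformer) build similar moving-average decomposition blocks directly into the network.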
Related papers
- Timer-XL: Long-Context Transformers for Unified Time Series Forecasting [67.83502953961505]
We present Timer-XL, a generative Transformer for unified time series forecasting.
Timer-XL achieves state-of-the-art performance across challenging forecasting benchmarks through a unified approach.
arXiv Detail & Related papers (2024-10-07T07:27:39Z) - A Systematic Review for Transformer-based Long-term Series Forecasting [7.414422194379818]
The Transformer architecture has proven to be the most successful solution for extracting semantic correlations among elements in long sequences.
Various variants have enabled the Transformer architecture to handle long-term time series forecasting tasks.
arXiv Detail & Related papers (2023-10-31T06:37:51Z) - iTransformer: Inverted Transformers Are Effective for Time Series Forecasting [62.40166958002558]
We propose iTransformer, which simply applies the attention and feed-forward network on the inverted dimensions, treating each variate's series as a token (a minimal sketch of this idea appears after this list).
The iTransformer model achieves state-of-the-art performance on challenging real-world datasets.
arXiv Detail & Related papers (2023-10-10T13:44:09Z) - U-shaped Transformer: Retain High Frequency Context in Time Series
Analysis [0.5710971447109949]
In this paper, we consider the low-pass characteristics of transformers and try to incorporate their advantages.
We introduce patch merge and split operations to extract features at different scales, and use larger datasets to make full use of the transformer backbone.
Our experiments demonstrate that the model performs at an advanced level across multiple datasets with relatively low cost.
arXiv Detail & Related papers (2023-07-18T07:15:26Z) - A Survey on Transformers in Reinforcement Learning [66.23773284875843]
The Transformer has been considered the dominant neural architecture in NLP and CV, mostly under supervised settings.
Recently, a similar surge of using Transformers has appeared in the domain of reinforcement learning (RL), but it is faced with unique design choices and challenges brought by the nature of RL.
This paper systematically reviews motivations and progress on using Transformers in RL, provides a taxonomy of existing works, discusses each sub-field, and summarizes future prospects.
arXiv Detail & Related papers (2023-01-08T14:04:26Z) - W-Transformers : A Wavelet-based Transformer Framework for Univariate
Time Series Forecasting [7.075125892721573]
We build a transformer model for non-stationary time series using a wavelet-based transformer encoder architecture.
We evaluate our framework on several publicly available benchmark time series datasets from various domains.
arXiv Detail & Related papers (2022-09-08T17:39:38Z) - Transformers in Time-series Analysis: A Tutorial [0.0]
The Transformer architecture has widespread applications, particularly in natural language processing and computer vision.
This tutorial provides an overview of the Transformer architecture, its applications, and a collection of examples from recent research papers in time-series analysis.
arXiv Detail & Related papers (2022-04-28T05:17:45Z) - Transformers in Vision: A Survey [101.07348618962111]
Transformers enable modeling long-range dependencies between input sequence elements and support parallel processing of sequences.
Transformers require minimal inductive biases for their design and are naturally suited as set-functions.
This survey aims to provide a comprehensive overview of the Transformer models in the computer vision discipline.
arXiv Detail & Related papers (2021-01-04T18:57:24Z) - Long Range Arena: A Benchmark for Efficient Transformers [115.1654897514089]
The Long-Range Arena benchmark is a suite of tasks consisting of sequences ranging from 1K to 16K tokens.
We systematically evaluate ten well-established long-range Transformer models on our newly proposed benchmark suite.
arXiv Detail & Related papers (2020-11-08T15:53:56Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
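For the iTransformer entry above, "inverted dimensions" means that each variate's whole series becomes one token, so self-attention mixes information across variates while the feed-forward network acts on each variate token's features. A minimal PyTorch sketch of one such inverted encoder layer (shapes and names are illustrative assumptions, not the authors' code):

```python
import torch
import torch.nn as nn

class InvertedEncoderLayer(nn.Module):
    """One encoder layer that attends over variate tokens instead of time-step tokens."""
    def __init__(self, seq_len: int, d_model: int = 128, n_heads: int = 4):
        super().__init__()
        self.embed = nn.Linear(seq_len, d_model)  # each variate's full series -> one token
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.ffn = nn.Sequential(nn.Linear(d_model, 4 * d_model), nn.GELU(),
                                 nn.Linear(4 * d_model, d_model))
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, variates) -> invert to (batch, variates, time) and embed.
        tokens = self.embed(x.transpose(1, 2))            # (batch, variates, d_model)
        attn_out, _ = self.attn(tokens, tokens, tokens)   # attention across variates
        tokens = self.norm1(tokens + attn_out)
        tokens = self.norm2(tokens + self.ffn(tokens))    # FFN on each variate token
        return tokens                                     # one representation per variate

# Toy example: 32 samples, 96 time steps, 7 variates.
x = torch.randn(32, 96, 7)
print(InvertedEncoderLayer(seq_len=96)(x).shape)  # torch.Size([32, 7, 128])
```

In iTransformer itself a linear head then maps each variate token to the forecast horizon; the point of the sketch is only that attention runs across variates rather than across time steps.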