STAEformer: Spatio-Temporal Adaptive Embedding Makes Vanilla Transformer
SOTA for Traffic Forecasting
- URL: http://arxiv.org/abs/2308.10425v5
- Date: Sun, 8 Oct 2023 01:58:10 GMT
- Title: STAEformer: Spatio-Temporal Adaptive Embedding Makes Vanilla Transformer
SOTA for Traffic Forecasting
- Authors: Hangchen Liu, Zheng Dong, Renhe Jiang, Jiewen Deng, Jinliang Deng,
Quanjun Chen and Xuan Song
- Abstract summary: We present a component called spatio-temporal adaptive embedding that can yield outstanding results with vanilla transformers.
Experiments demonstrate that spatio-temporal adaptive embedding plays a crucial role in traffic forecasting by capturing intrinsic spatio-temporal relations and chronological information in traffic time series.
- Score: 10.875804648633832
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: With the rapid development of the Intelligent Transportation System (ITS),
accurate traffic forecasting has emerged as a critical challenge. The key
bottleneck lies in capturing the intricate spatio-temporal traffic patterns. In
recent years, numerous neural networks with complicated architectures have been
proposed to address this issue. However, the advancements in network
architectures have encountered diminishing performance gains. In this study, we
present a novel component called spatio-temporal adaptive embedding that can
yield outstanding results with vanilla transformers. Our proposed
Spatio-Temporal Adaptive Embedding transformer (STAEformer) achieves
state-of-the-art performance on five real-world traffic forecasting datasets.
Further experiments demonstrate that spatio-temporal adaptive embedding plays a
crucial role in traffic forecasting by effectively capturing intrinsic
spatio-temporal relations and chronological information in traffic time series.
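As a rough sketch of the idea, the snippet below attaches a freely learned (steps x nodes x d) adaptive embedding to projected traffic features and runs a vanilla Transformer encoder first along the time axis, then along the node axis. Layer sizes, the temporal-then-spatial ordering, and all names are illustrative assumptions, not the authors' released implementation.

```python
import torch
import torch.nn as nn

class STAdaptiveEmbeddingSketch(nn.Module):
    """Minimal sketch: project raw traffic readings, concatenate a learned
    (in_steps x num_nodes x adaptive_dim) embedding, then apply vanilla
    Transformer encoders over the temporal and spatial axes."""

    def __init__(self, num_nodes, in_steps=12, out_steps=12,
                 in_dim=1, feat_dim=24, adaptive_dim=80, num_heads=4):
        super().__init__()
        d_model = feat_dim + adaptive_dim                    # 104 here
        self.input_proj = nn.Linear(in_dim, feat_dim)
        # spatio-temporal adaptive embedding: one vector per (step, node) pair
        self.adaptive_emb = nn.Parameter(
            torch.empty(in_steps, num_nodes, adaptive_dim))
        nn.init.xavier_uniform_(self.adaptive_emb)
        make_layer = lambda: nn.TransformerEncoderLayer(
            d_model, num_heads, dim_feedforward=4 * d_model, batch_first=True)
        self.temporal_encoder = nn.TransformerEncoder(make_layer(), num_layers=3)
        self.spatial_encoder = nn.TransformerEncoder(make_layer(), num_layers=3)
        self.head = nn.Linear(in_steps * d_model, out_steps)

    def forward(self, x):                       # x: (batch, steps, nodes, in_dim)
        b, t, n, _ = x.shape
        h = self.input_proj(x)                  # (b, t, n, feat_dim)
        adp = self.adaptive_emb.expand(b, -1, -1, -1)
        h = torch.cat([h, adp], dim=-1)         # (b, t, n, d_model)
        # attention over time, one sequence per node
        h = h.permute(0, 2, 1, 3).reshape(b * n, t, -1)
        h = self.temporal_encoder(h)
        # attention over nodes, one "sequence" per time step
        h = h.reshape(b, n, t, -1).permute(0, 2, 1, 3).reshape(b * t, n, -1)
        h = self.spatial_encoder(h)
        h = h.reshape(b, t, n, -1).permute(0, 2, 1, 3).reshape(b, n, -1)
        return self.head(h)                     # (b, nodes, out_steps)
```

For example, `STAdaptiveEmbeddingSketch(num_nodes=207)(torch.randn(8, 12, 207, 1))` returns an (8, 207, 12) prediction; the key point is that the adaptive embedding is shared across batches but specific to each step-node pair.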
Related papers
- Fusion Matrix Prompt Enhanced Self-Attention Spatial-Temporal Interactive Traffic Forecasting Framework [2.9490249935740573]
We propose a Fusion Matrix Prompt Enhanced Self-Attention Spatial-Temporal Interactive Traffic Forecasting Framework (FMPESTF).
FMPESTF is composed of spatial and temporal modules for down-sampling traffic data.
We introduce an attention mechanism into the temporal modeling and design hierarchical spatial-temporal interactive learning to help the model adapt to various traffic scenarios.
arXiv Detail & Related papers (2024-10-12T03:47:27Z)
- Navigating Spatio-Temporal Heterogeneity: A Graph Transformer Approach for Traffic Forecasting [13.309018047313801]
Traffic forecasting has emerged as a crucial research area in the development of smart cities.
Recent advancements in network modeling for spatio-temporal correlations are starting to see diminishing returns in performance.
To tackle these challenges, we introduce the Spatio-Temporal Graph Transformer (STGormer).
We design two straightforward yet effective spatial encoding methods based on the graph structure and integrate time position encodings into the vanilla transformer to capture spatio-temporal traffic patterns.
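As a toy illustration (assumed shapes and design, not the STGormer implementation), a structure-based spatial encoding can be derived from graph Laplacian eigenvectors and combined with a learned time-of-day position embedding before the tokens enter a vanilla Transformer encoder:

```python
import torch
import torch.nn as nn

def laplacian_pe(adj: torch.Tensor, k: int) -> torch.Tensor:
    """Structure-based spatial encoding: the k smallest non-trivial
    eigenvectors of the symmetric normalized graph Laplacian."""
    deg = adj.sum(dim=1).clamp(min=1e-6)
    d_inv_sqrt = torch.diag(deg.pow(-0.5))
    lap = torch.eye(adj.size(0)) - d_inv_sqrt @ adj @ d_inv_sqrt
    _, vecs = torch.linalg.eigh(lap)            # ascending eigenvalues
    return vecs[:, 1:k + 1]                     # (num_nodes, k)

num_nodes, d_model, steps_per_day, k = 6, 32, 288, 4
adj = torch.rand(num_nodes, num_nodes)
adj = (adj + adj.T) / 2                          # toy symmetric adjacency

proj = nn.Linear(1, d_model)                     # raw flow value -> token
node_pe = nn.Linear(k, d_model, bias=False)      # spatial (structural) encoding
tod_emb = nn.Embedding(steps_per_day, d_model)   # time-position encoding
encoder = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True), num_layers=2)

x = torch.randn(2, 12, num_nodes, 1)             # (batch, steps, nodes, flow)
tod = torch.randint(0, steps_per_day, (2, 12))   # time-of-day index per step
spatial = node_pe(laplacian_pe(adj, k))          # (nodes, d_model)
h = proj(x) + spatial + tod_emb(tod)[:, :, None, :]   # broadcast add
out = encoder(h.reshape(2, 12 * num_nodes, d_model))  # tokens = (step, node) pairs
```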
arXiv Detail & Related papers (2024-08-20T13:18:21Z)
- Hybrid Transformer and Spatial-Temporal Self-Supervised Learning for
Long-term Traffic Prediction [1.8531577178922987]
We propose a model that combines a hybrid Transformer with self-supervised learning.
The model applies adaptive data augmentation at the sequence level of the traffic data.
We design two self-supervised learning tasks to model the temporal and spatial dependencies, thereby improving the accuracy and generalization ability of the model.
arXiv Detail & Related papers (2024-01-29T06:17:23Z)
- Rethinking Urban Mobility Prediction: A Super-Multivariate Time Series
Forecasting Approach [71.67506068703314]
Long-term urban mobility predictions play a crucial role in the effective management of urban facilities and services.
Traditionally, urban mobility data has been structured as videos, treating longitude and latitude as fundamental pixels.
In our research, we introduce a fresh perspective on urban mobility prediction.
Instead of oversimplifying urban mobility data as traditional video data, we regard it as a complex time series.
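A minimal sketch of that change of view (grid sizes are made up): flatten each (height x width) frame of the mobility "video" so that every grid cell becomes one channel of a single long multivariate time series.

```python
import numpy as np

T, H, W = 168, 32, 32                  # hypothetical week of hourly 32x32 grids
video = np.random.rand(T, H, W)        # "video" view: frames of lat/lon pixels

# Super-multivariate view: one time series with H*W channels.
series = video.reshape(T, H * W)       # shape (168, 1024)

# Any multivariate forecaster can now consume `series` directly and
# predict all H*W cells jointly, without image-style convolutions.
print(series.shape)
```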
arXiv Detail & Related papers (2023-12-04T07:39:05Z)
- PDFormer: Propagation Delay-Aware Dynamic Long-Range Transformer for
Traffic Flow Prediction [78.05103666987655]
Spatial-temporal Graph Neural Network (GNN) models have emerged as among the most promising approaches to this problem.
We propose a novel propagation delay-aware dynamic long-range transFormer, namely PDFormer, for accurate traffic flow prediction.
Our method can not only achieve state-of-the-art performance but also exhibit competitive computational efficiency.
arXiv Detail & Related papers (2023-01-19T08:42:40Z)
- Towards Long-Term Time-Series Forecasting: Feature, Pattern, and
Distribution [57.71199089609161]
Long-term time-series forecasting (LTTF) has become a pressing demand in many applications, such as wind power supply planning.
Transformer models have been adopted for their high prediction capacity, which stems from the computationally intensive self-attention mechanism.
We propose an efficient Transformer-based model, named Conformer, which differentiates itself from existing methods for LTTF in three aspects.
arXiv Detail & Related papers (2023-01-05T13:59:29Z)
- Correlating sparse sensing for large-scale traffic speed estimation: A
Laplacian-enhanced low-rank tensor kriging approach [76.45949280328838]
We propose a Laplacian-enhanced low-rank tensor (LETC) framework featuring both low-rankness and multi-temporal correlations for large-scale traffic speed kriging.
We then design an efficient solution algorithm via several effective numeric techniques to scale up the proposed model to network-wide kriging.
arXiv Detail & Related papers (2022-10-21T07:25:57Z)
- PSTN: Periodic Spatial-temporal Deep Neural Network for Traffic
Condition Prediction [8.255993195520306]
We propose a periodic spatial-temporal deep neural network (PSTN) with three modules to improve the forecasting performance of traffic conditions.
First, the historical traffic information is folded and fed into a module consisting of a graph convolutional network and a temporal convolutional network.
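A rough, generic sketch of those two building blocks (assumed tensor shapes; not the PSTN authors' code): one graph-convolution step that aggregates each node's neighbourhood, followed by a causal 1-D convolution over time.

```python
import torch
import torch.nn as nn

class GCNTCNBlock(nn.Module):
    """One graph convolution (neighbourhood aggregation + linear map)
    followed by a causal temporal convolution, applied node by node."""

    def __init__(self, in_dim, hid_dim, kernel_size=3):
        super().__init__()
        self.theta = nn.Linear(in_dim, hid_dim)            # GCN weight
        self.pad = kernel_size - 1
        self.tcn = nn.Conv1d(hid_dim, hid_dim, kernel_size, padding=self.pad)

    def forward(self, x, adj_norm):
        # x: (batch, steps, nodes, in_dim); adj_norm: normalized (nodes, nodes) adjacency
        h = torch.einsum('nm,btmf->btnf', adj_norm, x)     # aggregate neighbours
        h = torch.relu(self.theta(h))                      # (b, t, n, hid)
        b, t, n, f = h.shape
        h = h.permute(0, 2, 3, 1).reshape(b * n, f, t)     # convolve over time
        h = self.tcn(h)[..., :t]                           # trim -> causal output
        return h.reshape(b, n, f, t).permute(0, 3, 1, 2)   # (b, t, n, hid)

block = GCNTCNBlock(in_dim=1, hid_dim=16)
out = block(torch.randn(2, 12, 5, 1), torch.eye(5))        # (2, 12, 5, 16)
```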
arXiv Detail & Related papers (2021-08-05T07:42:43Z)
- Learning dynamic and hierarchical traffic spatiotemporal features with
Transformer [4.506591024152763]
This paper proposes a novel model, Traffic Transformer, for spatial-temporal graph modeling and long-term traffic forecasting.
The Transformer is the most popular framework in Natural Language Processing (NLP).
Analyzing the attention weight matrices can reveal the influential parts of the road network, allowing us to model traffic networks better.
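A toy example of that kind of attention-weight inspection (illustrative sizes and a single self-attention layer, not the Traffic Transformer itself): average the attention weights over a batch and read off which road segments receive the most attention.

```python
import torch
import torch.nn as nn

num_segments, d_model = 8, 16
attn = nn.MultiheadAttention(d_model, num_heads=2, batch_first=True)

x = torch.randn(4, num_segments, d_model)        # (batch, segments, features)
_, weights = attn(x, x, x, need_weights=True, average_attn_weights=True)
influence = weights.mean(dim=0)                  # (segments, segments), rows sum to 1

# Column sums: how much attention each segment receives from all others.
received = influence.sum(dim=0)
print(received.argsort(descending=True))         # most "influential" segments first
```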
arXiv Detail & Related papers (2021-04-12T02:29:58Z)
- Constructing Geographic and Long-term Temporal Graph for Traffic
Forecasting [88.5550074808201]
We propose the Geographic and Long-term Temporal Graph Convolutional Recurrent Neural Network (GLT-GCRNN) for traffic forecasting.
The framework learns the rich interactions between roads sharing similar geographic or long-term temporal patterns.
arXiv Detail & Related papers (2020-04-23T03:50:46Z)
- Spatial-Temporal Transformer Networks for Traffic Flow Forecasting [74.76852538940746]
We propose a novel paradigm of Spatial-Temporal Transformer Networks (STTNs) to improve the accuracy of long-term traffic forecasting.
Specifically, we present a new variant of graph neural networks, named spatial transformer, by dynamically modeling directed spatial dependencies.
The proposed model enables fast and scalable training over long-range spatial-temporal dependencies.
arXiv Detail & Related papers (2020-01-09T10:21:04Z)
This list is automatically generated from the titles and abstracts of the papers on this site.