Hybrid Transformer and Spatial-Temporal Self-Supervised Learning for
Long-term Traffic Prediction
- URL: http://arxiv.org/abs/2401.16453v1
- Date: Mon, 29 Jan 2024 06:17:23 GMT
- Title: Hybrid Transformer and Spatial-Temporal Self-Supervised Learning for
Long-term Traffic Prediction
- Authors: Wang Zhu, Doudou Zhang, Baichao Long, Jianli Xiao
- Abstract summary: We propose a model that combines a hybrid Transformer and spatio-temporal self-supervised learning.
The model enhances its robustness by applying adaptive data augmentation techniques at the sequence-level and graph-level of the traffic data.
We design two self-supervised learning tasks to model the temporal and spatial heterogeneity, thereby improving the accuracy and generalization ability of the model.
- Score: 1.8531577178922987
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Long-term traffic prediction has always been a challenging task due to its
dynamic temporal dependencies and complex spatial dependencies. In this paper,
we propose a model that combines a hybrid Transformer and spatio-temporal
self-supervised learning. The model enhances its robustness by applying
adaptive data augmentation techniques at the sequence-level and graph-level of
the traffic data. It utilizes Transformer to overcome the limitations of
recurrent neural networks in capturing long-term sequences, and employs
Chebyshev polynomial graph convolution to capture complex spatial dependencies.
Furthermore, considering the impact of spatio-temporal heterogeneity on traffic
speed, we design two self-supervised learning tasks to model the temporal and
spatial heterogeneity, thereby improving the accuracy and generalization
ability of the model. Experimental evaluations are conducted on two real-world
datasets, PeMS04 and PeMS08, and the results are visualized and analyzed,
demonstrating the superior performance of the proposed model.
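The abstract names two concrete components: adaptive data augmentation on the traffic sequences and Chebyshev polynomial graph convolution for spatial dependencies. The sketch below illustrates both under simplifying assumptions; it is not the authors' implementation. Scalar coefficients `thetas` stand in for the learnable weight matrices of a real layer, and zero-masking stands in for the paper's adaptive augmentation; all names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def mask_augment(seq, mask_ratio=0.1):
    """Sequence-level augmentation (hypothetical): zero out a random
    fraction of time steps to create a perturbed view of the series."""
    out = seq.copy()
    n_mask = max(1, int(mask_ratio * len(out)))
    out[rng.choice(len(out), size=n_mask, replace=False)] = 0.0
    return out

def normalized_laplacian(adj):
    """Symmetric normalized Laplacian L = I - D^{-1/2} A D^{-1/2}."""
    deg = adj.sum(axis=1)
    d_inv_sqrt = np.where(deg > 0, deg ** -0.5, 0.0)
    return np.eye(len(adj)) - d_inv_sqrt[:, None] * adj * d_inv_sqrt[None, :]

def cheb_conv(adj, x, thetas):
    """Chebyshev polynomial graph convolution y = sum_k theta_k T_k(L~) x,
    using the recurrence T_0 = I, T_1 = L~, T_k = 2 L~ T_{k-1} - T_{k-2},
    where L~ = 2L / lambda_max - I rescales eigenvalues into [-1, 1]."""
    L = normalized_laplacian(adj)
    lam_max = np.linalg.eigvalsh(L).max()
    L_tilde = (2.0 / lam_max) * L - np.eye(len(adj))
    Tx_prev, Tx = x, L_tilde @ x           # T_0 x and T_1 x
    out = thetas[0] * Tx_prev
    if len(thetas) > 1:
        out = out + thetas[1] * Tx
    for theta in thetas[2:]:
        Tx_prev, Tx = Tx, 2.0 * (L_tilde @ Tx) - Tx_prev
        out = out + theta * Tx
    return out

# Toy example: 4 road sensors on a path graph, one feature per node.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
x = np.array([[1.0], [0.0], [0.0], [0.0]])
y = cheb_conv(A, x, thetas=[0.5, 0.3, 0.2])
print(y.shape)           # (4, 1)
view = mask_augment(np.linspace(50.0, 70.0, 12))
print(view.shape)        # (12,)
```

An order-K filter aggregates information from each node's K-hop neighborhood, which is why Chebyshev convolutions can capture spatial dependencies beyond immediate road-network neighbors without explicitly computing powers of the adjacency matrix.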
Related papers
- Navigating Spatio-Temporal Heterogeneity: A Graph Transformer Approach for Traffic Forecasting [13.309018047313801]
Traffic forecasting has emerged as a crucial research area in the development of smart cities.
Recent advancements in network modeling for spatio-temporal correlations are starting to see diminishing returns in performance.
To tackle these challenges, we introduce the Spatio-Temporal Graph Transformer (STGormer).
We design two straightforward yet effective spatial encoding methods based on the graph structure and integrate time position encodings into the vanilla transformer to capture spatio-temporal traffic patterns.
arXiv Detail & Related papers (2024-08-20T13:18:21Z) - A Multi-Channel Spatial-Temporal Transformer Model for Traffic Flow Forecasting [0.0]
We propose a multi-channel spatial-temporal transformer model for traffic flow forecasting.
It improves the accuracy of the prediction by fusing results from different channels of traffic data.
Experimental results on six real-world datasets demonstrate that introducing a multi-channel mechanism into the temporal model enhances performance.
arXiv Detail & Related papers (2024-05-10T06:37:07Z) - Rethinking Urban Mobility Prediction: A Super-Multivariate Time Series
Forecasting Approach [71.67506068703314]
Long-term urban mobility predictions play a crucial role in the effective management of urban facilities and services.
Traditionally, urban mobility data has been structured as videos, treating longitude and latitude as fundamental pixels.
In our research, we introduce a fresh perspective on urban mobility prediction.
Instead of oversimplifying urban mobility data as traditional video data, we regard it as a complex time series.
arXiv Detail & Related papers (2023-12-04T07:39:05Z) - Attention-based Spatial-Temporal Graph Convolutional Recurrent Networks
for Traffic Forecasting [12.568905377581647]
Traffic forecasting is one of the most fundamental problems in transportation science and artificial intelligence.
Existing methods cannot accurately model both long-term and short-term temporal correlations simultaneously.
We propose a novel spatial-temporal neural network framework, which consists of a graph convolutional recurrent module (GCRN) and a global attention module.
arXiv Detail & Related papers (2023-02-25T03:37:00Z) - PDFormer: Propagation Delay-Aware Dynamic Long-Range Transformer for
Traffic Flow Prediction [78.05103666987655]
Spatial-temporal Graph Neural Network (GNN) models have emerged as one of the most promising methods to solve this problem.
We propose a novel propagation delay-aware dynamic long-range transFormer, namely PDFormer, for accurate traffic flow prediction.
Our method can not only achieve state-of-the-art performance but also exhibit competitive computational efficiency.
arXiv Detail & Related papers (2023-01-19T08:42:40Z) - Towards Long-Term Time-Series Forecasting: Feature, Pattern, and
Distribution [57.71199089609161]
Long-term time-series forecasting (LTTF) has become a pressing demand in many applications, such as wind power supply planning.
Transformer models have been adopted to deliver high prediction capacity because of the computationally intensive self-attention mechanism.
We propose an efficient Transformer-based model, named Conformer, which differentiates itself from existing methods for LTTF in three aspects.
arXiv Detail & Related papers (2023-01-05T13:59:29Z) - Enhancing the Robustness via Adversarial Learning and Joint
Spatial-Temporal Embeddings in Traffic Forecasting [11.680589359294972]
We propose TrendGCN to address the challenge of balancing dynamics and robustness.
Our model simultaneously incorporates spatial (node-wise) embeddings and temporal (time-wise) embeddings to account for heterogeneous space-and-time convolutions.
Compared with traditional approaches that handle step-wise predictive errors independently, our approach can produce more realistic and robust forecasts.
arXiv Detail & Related papers (2022-08-05T09:36:55Z) - Multi-intersection Traffic Optimisation: A Benchmark Dataset and a
Strong Baseline [85.9210953301628]
Control of traffic signals is fundamental and critical to alleviate traffic congestion in urban areas.
Because of the high complexity of modelling the problem, experimental settings of current works are often inconsistent.
We propose a novel and strong baseline model based on deep reinforcement learning with the encoder-decoder structure.
arXiv Detail & Related papers (2021-01-24T03:55:39Z) - Convolutional Tensor-Train LSTM for Spatio-temporal Learning [116.24172387469994]
We propose a higher-order LSTM model that can efficiently learn long-term correlations in the video sequence.
This is accomplished through a novel tensor train module that performs prediction by combining convolutional features across time.
Our results achieve state-of-the-art performance in a wide range of applications and datasets.
arXiv Detail & Related papers (2020-02-21T05:00:01Z) - Spatial-Temporal Transformer Networks for Traffic Flow Forecasting [74.76852538940746]
We propose a novel paradigm of Spatial-Temporal Transformer Networks (STTNs) to improve the accuracy of long-term traffic forecasting.
Specifically, we present a new variant of graph neural networks, named spatial transformer, by dynamically modeling directed spatial dependencies.
The proposed model enables fast and scalable training over long-range spatial-temporal dependencies.
arXiv Detail & Related papers (2020-01-09T10:21:04Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information and is not responsible for any consequences of its use.