Multi-Scale Spatial-Temporal Recurrent Networks for Traffic Flow
Prediction
- URL: http://arxiv.org/abs/2310.08138v1
- Date: Thu, 12 Oct 2023 08:52:36 GMT
- Title: Multi-Scale Spatial-Temporal Recurrent Networks for Traffic Flow
Prediction
- Authors: Haiyang Liu, Chunjiang Zhu, Detian Zhang, Qing Li
- Abstract summary: We propose a Multi-Scale Spatial-Temporal Recurrent Network for traffic flow prediction, namely MSSTRN.
We propose a spatial-temporal synchronous attention mechanism that integrates adaptive position graph convolutions into the self-attention mechanism to achieve synchronous capture of spatial-temporal dependencies.
Our model achieves the best prediction accuracy with non-trivial margins compared to all twenty baseline methods.
- Score: 13.426775574655135
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Traffic flow prediction is one of the most fundamental tasks of intelligent
transportation systems. The complex and dynamic spatial-temporal dependencies
make the traffic flow prediction quite challenging. Although existing
spatial-temporal graph neural networks have shown prominent performance, they
often encounter challenges such as (1) relying on a fixed graph, which limits the predictive
performance of the model, (2) insufficiently capturing complex spatial-temporal
dependencies simultaneously, and (3) lacking attention to spatial-temporal
information at different time lengths. In this paper, we propose a Multi-Scale
Spatial-Temporal Recurrent Network for traffic flow prediction, namely MSSTRN,
which consists of two different recurrent neural networks: the single-step gate
recurrent unit and the multi-step gate recurrent unit to fully capture the
complex spatial-temporal information in the traffic data under different time
windows. Moreover, we propose a spatial-temporal synchronous attention
mechanism that integrates adaptive position graph convolutions into the
self-attention mechanism to achieve synchronous capture of spatial-temporal
dependencies. We conducted extensive experiments on four real traffic datasets
and demonstrated that our model achieves the best prediction accuracy with
non-trivial margins compared to all twenty baseline methods.
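As a rough illustration only (the abstract does not include code, and this is not the authors' implementation), the kind of spatial-temporal synchronous attention described above, an adaptive learned graph fused into self-attention over the road-network nodes, could be sketched in PyTorch as follows. The class and parameter names (AdaptiveGraphSelfAttention, node_emb) and the 50/50 mixing of attention weights with the adaptive adjacency are assumptions made for the sketch.

# Hypothetical sketch of an "adaptive-graph convolution inside self-attention"
# block, loosely inspired by the mechanism described in the abstract. It is NOT
# the authors' implementation; all names and design choices are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class AdaptiveGraphSelfAttention(nn.Module):
    """Self-attention over nodes, with values also propagated over a learned
    (adaptive) adjacency so spatial and temporal mixing happen in one step."""

    def __init__(self, num_nodes: int, dim: int, emb_dim: int = 16):
        super().__init__()
        # Learnable node embeddings define an adaptive adjacency matrix,
        # replacing a fixed, predefined graph.
        self.node_emb = nn.Parameter(torch.randn(num_nodes, emb_dim))
        self.q_proj = nn.Linear(dim, dim)
        self.k_proj = nn.Linear(dim, dim)
        self.v_proj = nn.Linear(dim, dim)
        self.out_proj = nn.Linear(dim, dim)

    def adaptive_adjacency(self) -> torch.Tensor:
        # A = softmax(relu(E E^T)): a dense, learned graph of shape (N, N).
        scores = F.relu(self.node_emb @ self.node_emb.t())
        return F.softmax(scores, dim=-1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, num_nodes, dim) -- one time step's node features.
        q, k, v = self.q_proj(x), self.k_proj(x), self.v_proj(x)
        attn = F.softmax(q @ k.transpose(-2, -1) / x.size(-1) ** 0.5, dim=-1)
        adj = self.adaptive_adjacency()               # (N, N)
        # Mix attention weights with the adaptive graph before aggregating values.
        mixed = 0.5 * attn + 0.5 * adj                # broadcast to (batch, N, N)
        return self.out_proj(mixed @ v)


if __name__ == "__main__":
    block = AdaptiveGraphSelfAttention(num_nodes=8, dim=32)
    out = block(torch.randn(4, 8, 32))                # (batch, nodes, features)
    print(out.shape)                                  # torch.Size([4, 8, 32])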
Related papers
- Attention-based Spatial-Temporal Graph Convolutional Recurrent Networks
for Traffic Forecasting [12.568905377581647]
Traffic forecasting is one of the most fundamental problems in transportation science and artificial intelligence.
Existing methods cannot accurately model both long-term and short-term temporal correlations simultaneously.
We propose a novel spatial-temporal neural network framework, which consists of a graph convolutional recurrent module (GCRN) and a global attention module.
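For readers unfamiliar with graph convolutional recurrent modules of this kind, a minimal, hypothetical PyTorch sketch of a GRU cell whose gates use a one-hop graph convolution instead of plain linear layers is shown below; the class name GraphGRUCell and all design details are assumptions, not the paper's GCRN code.

# Hypothetical sketch of a graph-convolutional GRU cell: the GRU gates use a
# simple one-hop graph convolution (A X W) instead of plain linear layers.
# This illustrates the general GCRN idea only, not the paper's implementation.
import torch
import torch.nn as nn


class GraphGRUCell(nn.Module):
    def __init__(self, in_dim: int, hidden_dim: int):
        super().__init__()
        self.gates = nn.Linear(in_dim + hidden_dim, 2 * hidden_dim)
        self.cand = nn.Linear(in_dim + hidden_dim, hidden_dim)

    def gconv(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # One-hop neighborhood aggregation: (N, N) applied to (batch, N, D).
        return torch.einsum("ij,bjd->bid", adj, x)

    def forward(self, x, h, adj):
        # x: (batch, N, in_dim), h: (batch, N, hidden_dim), adj: (N, N) row-normalized.
        xh = self.gconv(torch.cat([x, h], dim=-1), adj)
        z, r = torch.sigmoid(self.gates(xh)).chunk(2, dim=-1)
        cand_in = self.gconv(torch.cat([x, r * h], dim=-1), adj)
        h_new = torch.tanh(self.cand(cand_in))
        return (1 - z) * h + z * h_new


if __name__ == "__main__":
    cell = GraphGRUCell(in_dim=2, hidden_dim=16)
    adj = torch.softmax(torch.randn(8, 8), dim=-1)    # toy normalized adjacency
    h = torch.zeros(4, 8, 16)
    for x_t in torch.randn(12, 4, 8, 2):              # 12 time steps
        h = cell(x_t, h, adj)
    print(h.shape)                                    # torch.Size([4, 8, 16])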
arXiv Detail & Related papers (2023-02-25T03:37:00Z)
- PDFormer: Propagation Delay-Aware Dynamic Long-Range Transformer for
Traffic Flow Prediction [78.05103666987655]
Spatial-temporal Graph Neural Network (GNN) models have emerged as one of the most promising methods to solve this problem.
We propose a novel propagation delay-aware dynamic long-range transFormer, namely PDFormer, for accurate traffic flow prediction.
Our method can not only achieve state-of-the-art performance but also exhibit competitive computational efficiency.
arXiv Detail & Related papers (2023-01-19T08:42:40Z)
- Spatial-temporal traffic modeling with a fusion graph reconstructed by
tensor decomposition [10.104097475236014]
Graph convolutional networks (GCNs) have been widely used in traffic flow prediction.
The design of the spatial-temporal graph adjacency matrix is a key to the success of GCNs.
This paper proposes reconstructing the binary adjacency matrix via tensor decomposition.
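The entry only states the idea at a high level; as a simplified stand-in for the paper's tensor decomposition, the following sketch reconstructs a dense adjacency from a truncated SVD of a binary adjacency matrix. The function name low_rank_reconstruct and the clamping step are assumptions for illustration.

# Minimal sketch of the general idea of "reconstructing" a sparse binary
# adjacency from a low-rank decomposition. The paper works with a tensor
# decomposition over a spatial-temporal fusion graph; a truncated SVD of a
# single adjacency matrix is used here only as a simplified stand-in.
import torch

def low_rank_reconstruct(adj: torch.Tensor, rank: int) -> torch.Tensor:
    """Return a dense, low-rank reconstruction of a (binary) adjacency matrix."""
    u, s, vh = torch.linalg.svd(adj)
    recon = u[:, :rank] @ torch.diag(s[:rank]) @ vh[:rank, :]
    # Clamp to [0, 1] so the result can still be read as edge weights.
    return recon.clamp(0.0, 1.0)

if __name__ == "__main__":
    n = 10
    binary_adj = (torch.rand(n, n) < 0.2).float()      # toy binary graph
    dense_adj = low_rank_reconstruct(binary_adj, rank=3)
    print(dense_adj.shape, dense_adj.min().item(), dense_adj.max().item())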
arXiv Detail & Related papers (2022-12-12T01:44:52Z)
- Decoupling and Recoupling Spatiotemporal Representation for RGB-D-based
Motion Recognition [62.46544616232238]
Previous motion recognition methods have achieved promising performance through the tightly coupled multi-temporal representation.
We propose to decouple and recouple spatiotemporal representation for RGB-D-based motion recognition.
arXiv Detail & Related papers (2021-12-16T18:59:47Z)
- DMGCRN: Dynamic Multi-Graph Convolution Recurrent Network for Traffic
Forecasting [7.232141271583618]
We propose a novel dynamic multi-graph convolution recurrent network (DMGCRN) to tackle the above issues.
We use the distance-based graph to capture spatial information from nodes that are close in distance.
We also construct a novel latent graph, which encodes the structural correlations among roads, to capture spatial information from nodes that are similar in structure.
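A common way to build such a distance-based graph in this literature is a thresholded Gaussian kernel over pairwise sensor distances; the sketch below shows only that standard construction (the latent, structure-based graph is not illustrated), and the function name gaussian_distance_graph and the threshold value are assumptions.

# Sketch of a distance-based graph: a thresholded Gaussian kernel over pairwise
# node distances, a standard construction in traffic forecasting. This only
# illustrates the "nodes close in distance" graph, not the paper's latent graph.
import torch

def gaussian_distance_graph(dist: torch.Tensor, threshold: float = 0.1) -> torch.Tensor:
    """dist: (N, N) pairwise distances; returns a sparse weighted adjacency."""
    sigma = dist.std()
    weights = torch.exp(-(dist ** 2) / (2 * sigma ** 2))
    weights[weights < threshold] = 0.0                 # drop far-apart node pairs
    weights.fill_diagonal_(0.0)                        # no self-loops
    return weights

if __name__ == "__main__":
    coords = torch.rand(8, 2) * 10                     # toy sensor coordinates
    dist = torch.cdist(coords, coords)                 # (8, 8) pairwise distances
    adj = gaussian_distance_graph(dist)
    print(adj.shape, (adj > 0).float().mean().item())  # sparsity of the graph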
arXiv Detail & Related papers (2021-12-04T06:51:55Z)
- STJLA: A Multi-Context Aware Spatio-Temporal Joint Linear Attention
Network for Traffic Forecasting [7.232141271583618]
We propose a novel deep learning model for traffic forecasting named Multi-Context Aware Spatio-Temporal Joint Linear Attention (STJLA).
STJLA applies linear attention to the spatio-temporal joint graph to capture global dependence between all spatio-temporal nodes efficiently.
Experiments on two real-world traffic datasets, England and PEMSD7, demonstrate that our STJLA achieves 9.83% and 3.08% accuracy improvements in the MAE measure over state-of-the-art baselines.
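The key point of the entry is that linear attention makes global attention over all spatio-temporal nodes tractable, reducing the cost from quadratic to linear in the number of nodes. The sketch below shows the generic kernel-feature-map trick (phi(x) = elu(x) + 1), not STJLA's exact formulation.

# Generic linear attention sketch: with a positive feature map phi, attention
# can be computed as phi(Q) (phi(K)^T V) in O(N) rather than O(N^2) in the
# number of (spatio-temporal) nodes. This is the general trick, not STJLA's code.
import torch
import torch.nn.functional as F

def linear_attention(q, k, v, eps: float = 1e-6):
    # q, k: (batch, N, d); v: (batch, N, d_v); phi(x) = elu(x) + 1 keeps values positive.
    q, k = F.elu(q) + 1, F.elu(k) + 1
    kv = torch.einsum("bnd,bnv->bdv", k, v)            # (batch, d, d_v), O(N)
    z = 1.0 / (torch.einsum("bnd,bd->bn", q, k.sum(dim=1)) + eps)
    return torch.einsum("bnd,bdv,bn->bnv", q, kv, z)   # (batch, N, d_v)

if __name__ == "__main__":
    q = torch.randn(2, 1000, 32)                       # 1000 joint space-time nodes
    out = linear_attention(q, torch.randn(2, 1000, 32), torch.randn(2, 1000, 64))
    print(out.shape)                                   # torch.Size([2, 1000, 64])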
arXiv Detail & Related papers (2021-12-04T06:39:18Z)
- Spatio-Temporal Joint Graph Convolutional Networks for Traffic
Forecasting [75.10017445699532]
Recent works have shifted their focus towards formulating traffic forecasting as a spatial-temporal graph modeling problem.
We propose a novel approach for accurate traffic forecasting on road networks over multiple future time steps.
arXiv Detail & Related papers (2021-11-25T08:45:14Z)
- Spatial-Temporal Graph ODE Networks for Traffic Flow Forecasting [22.421667339552467]
Spatial-temporal forecasting has attracted tremendous attention in a wide range of applications, and traffic flow prediction is a canonical and typical example.
Existing works typically utilize shallow graph convolution networks (GCNs) and temporal extraction modules to model spatial and temporal dependencies respectively.
We propose Spatial-Temporal Graph Ordinary Differential Equation Networks (STGODE), which captures spatial-temporal dynamics through a tensor-based ordinary differential equation (ODE)
We evaluate our model on multiple real-world traffic datasets and superior performance is achieved over state-of-the-art baselines.
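As a toy illustration of the continuous-depth idea behind a graph ODE (not STGODE's tensor-based formulation), node features can be evolved under a graph-convolutional vector field and integrated with a fixed-step Euler solver; the names GraphODEFunc and euler_integrate are made up for this sketch.

# Toy sketch of a graph ODE: node features evolve under dh/dt = f(h), where f is
# a small graph-convolutional network, integrated with fixed-step Euler. This
# illustrates the continuous-depth idea only, not STGODE's tensor-based ODE.
import torch
import torch.nn as nn

class GraphODEFunc(nn.Module):
    def __init__(self, dim: int, adj: torch.Tensor):
        super().__init__()
        self.adj = adj                                  # (N, N), row-normalized
        self.lin = nn.Linear(dim, dim)

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        # dh/dt: one graph-convolution step pushed through a nonlinearity.
        return torch.tanh(self.lin(torch.einsum("ij,bjd->bid", self.adj, h)))

def euler_integrate(func: GraphODEFunc, h0: torch.Tensor, t1: float = 1.0, steps: int = 10):
    h, dt = h0, t1 / steps
    for _ in range(steps):
        h = h + dt * func(h)                            # explicit Euler update
    return h

if __name__ == "__main__":
    adj = torch.softmax(torch.randn(8, 8), dim=-1)
    func = GraphODEFunc(dim=16, adj=adj)
    h1 = euler_integrate(func, torch.randn(4, 8, 16))
    print(h1.shape)                                     # torch.Size([4, 8, 16])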
arXiv Detail & Related papers (2021-06-24T11:48:45Z)
- Spatio-temporal Modeling for Large-scale Vehicular Networks Using Graph
Convolutional Networks [110.80088437391379]
A graph-based framework called SMART is proposed to model and keep track of the statistics of vehicle-to-infrastructure (V2I) communication latency across a large geographical area.
We develop a graph reconstruction-based approach using a graph convolutional network integrated with a deep Q-network algorithm.
Our results show that the proposed method can significantly improve the modeling accuracy and efficiency, as well as the latency performance of large vehicular networks.
arXiv Detail & Related papers (2021-03-13T06:56:29Z)
- A Spatial-Temporal Attentive Network with Spatial Continuity for
Trajectory Prediction [74.00750936752418]
We propose a novel model named spatial-temporal attentive network with spatial continuity (STAN-SC)
First, a spatial-temporal attention mechanism is presented to explore the most useful and important information.
Second, we construct a joint feature sequence based on the sequence and instant state information so that the generated trajectories maintain spatial continuity.
arXiv Detail & Related papers (2020-03-13T04:35:50Z)
- Spatial-Temporal Transformer Networks for Traffic Flow Forecasting [74.76852538940746]
We propose a novel paradigm of Spatial-Temporal Transformer Networks (STTNs) to improve the accuracy of long-term traffic forecasting.
Specifically, we present a new variant of graph neural networks, named spatial transformer, by dynamically modeling directed spatial dependencies.
The proposed model enables fast and scalable training over long-range spatial-temporal dependencies.
arXiv Detail & Related papers (2020-01-09T10:21:04Z)