Forecast Network-Wide Traffic States for Multiple Steps Ahead: A Deep
Learning Approach Considering Dynamic Non-Local Spatial Correlation and
Non-Stationary Temporal Dependency
- URL: http://arxiv.org/abs/2004.02391v1
- Date: Mon, 6 Apr 2020 03:40:56 GMT
- Title: Forecast Network-Wide Traffic States for Multiple Steps Ahead: A Deep
Learning Approach Considering Dynamic Non-Local Spatial Correlation and
Non-Stationary Temporal Dependency
- Authors: Xinglei Wang, Xuefeng Guan, Jun Cao, Na Zhang, Huayi Wu
- Abstract summary: This research studies two particular problems in traffic forecasting: (1) capturing the dynamic and non-local spatial correlation between traffic links and (2) modeling the dynamics of temporal dependency for accurate multiple steps ahead predictions.
We propose a deep learning framework named Spatial-Temporal Sequence to Sequence model (STSeq2Seq) to address these issues.
This model builds on a sequence-to-sequence (seq2seq) architecture to capture temporal features and relies on graph convolution to aggregate spatial information.
- Score: 6.019104024723682
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Obtaining accurate information about future traffic flows of all links in a
traffic network is of great importance for traffic management and control
applications. This research studies two particular problems in traffic
forecasting: (1) capturing the dynamic and non-local spatial correlation between
traffic links and (2) modeling the dynamics of temporal dependency for accurate
multiple steps ahead predictions. To address these issues, we propose a deep
learning framework named Spatial-Temporal Sequence to Sequence model
(STSeq2Seq). This model builds on a sequence-to-sequence (seq2seq) architecture
to capture temporal features and relies on graph convolution to aggregate
spatial information. Moreover, STSeq2Seq defines and constructs pattern-aware
adjacency matrices (PAMs) based on the pair-wise similarity of recent traffic
patterns on traffic links and integrates them into the graph convolution
operation. It also deploys a novel seq2seq architecture that couples a
convolutional encoder with a recurrent decoder equipped with an attention
mechanism for dynamic modeling
of long-range dependence between different time steps. We conduct extensive
experiments using two publicly-available large-scale traffic datasets and
compare STSeq2Seq with other baseline models. The numerical results demonstrate
that the proposed model achieves state-of-the-art forecasting performance in
terms of various error measures. The ablation study verifies the effectiveness
of PAMs in capturing dynamic non-local spatial correlation and the superiority
of the proposed seq2seq architecture in modeling non-stationary temporal dependency
for multiple steps ahead prediction. Furthermore, qualitative analysis is
conducted on PAMs as well as the attention weights for model interpretation.
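The abstract describes two building blocks without giving formulas: pattern-aware adjacency matrices (PAMs) computed from the pair-wise similarity of recent traffic patterns, and a graph convolution that uses them to aggregate spatial information. The sketch below illustrates one way such a PAM could be built and applied; the cosine similarity measure, the top-k sparsification, and all function and variable names are assumptions for illustration, not the authors' implementation.

```python
import numpy as np


def pattern_aware_adjacency(recent_flows: np.ndarray, top_k: int = 10) -> np.ndarray:
    """Build a pattern-aware adjacency matrix (PAM) from recent traffic patterns.

    recent_flows has shape (num_links, window_len): the most recent observations
    on every traffic link. Cosine similarity and top-k sparsification are
    illustrative choices, not necessarily those used in STSeq2Seq.
    """
    # Normalise each link's recent pattern to unit length (flows are non-negative,
    # so the resulting similarities lie in [0, 1]).
    unit = recent_flows / (np.linalg.norm(recent_flows, axis=1, keepdims=True) + 1e-8)
    # Pair-wise similarity between all links, regardless of road-network distance,
    # so non-local correlations can receive weight.
    sim = unit @ unit.T
    # Keep only the k most similar neighbours per link; this sparsification step
    # is an assumption made to keep the operator cheap.
    pam = np.zeros_like(sim)
    for i in range(sim.shape[0]):
        neighbours = np.argsort(sim[i])[-top_k:]
        pam[i, neighbours] = sim[i, neighbours]
    # Row-normalise so the matrix acts as an aggregation operator.
    return pam / (pam.sum(axis=1, keepdims=True) + 1e-8)


def graph_conv(features: np.ndarray, pam: np.ndarray, weights: np.ndarray) -> np.ndarray:
    """One graph-convolution step: aggregate neighbour features with the PAM,
    then apply a learnable linear transform followed by a ReLU."""
    return np.maximum(pam @ features @ weights, 0.0)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    flows = rng.random((207, 12))              # e.g. 207 links, 12 recent observations
    pam = pattern_aware_adjacency(flows, top_k=10)
    w = rng.standard_normal((12, 16))          # hypothetical learnable weights
    hidden = graph_conv(flows, pam, w)         # (207, 16) spatially aggregated features
```

Because the PAM is recomputed from the most recent observation window at every forecasting origin, the resulting graph changes over time and can connect links that are far apart on the road network, which is how the dynamic, non-local spatial correlation described in the abstract can be captured.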
Related papers
- Improving Traffic Flow Predictions with SGCN-LSTM: A Hybrid Model for Spatial and Temporal Dependencies [55.2480439325792]
This paper introduces the Signal-Enhanced Graph Convolutional Network Long Short Term Memory (SGCN-LSTM) model for predicting traffic speeds across road networks.
Experiments on the PEMS-BAY road network traffic dataset demonstrate the SGCN-LSTM model's effectiveness.
arXiv Detail & Related papers (2024-11-01T00:37:00Z) - Fusion Matrix Prompt Enhanced Self-Attention Spatial-Temporal Interactive Traffic Forecasting Framework [2.9490249935740573]
We propose a Fusion Matrix Prompt Enhanced Self-Attention Spatial-Temporal Interactive Traffic Forecasting Framework (FMPESTF)
FMPESTF is composed of spatial and temporal modules for down-sampling traffic data.
We introduce an attention mechanism in temporal modeling and design hierarchical spatial-temporal interactive learning to help the model adapt to various traffic scenarios.
arXiv Detail & Related papers (2024-10-12T03:47:27Z) - A Multi-Graph Convolutional Neural Network Model for Short-Term Prediction of Turning Movements at Signalized Intersections [0.6215404942415159]
This study introduces a novel deep learning architecture, referred to as the multi-graph convolutional neural network (MGCNN), for turning movement prediction at intersections.
The proposed architecture combines a multigraph structure, built to model temporal variations in traffic data, with a spectral convolution operation to support modeling the spatial variations in traffic data over the graphs.
The model's ability to perform short-term predictions over 1, 2, 3, 4, and 5 minutes into the future was evaluated against four baseline state-of-the-art models.
arXiv Detail & Related papers (2024-06-02T05:41:25Z) - Dynamic Hypergraph Structure Learning for Traffic Flow Forecasting [35.0288931087826]
Traffic flow forecasting aims to predict future traffic conditions on the basis of networks and traffic conditions in the past.
The problem is typically solved by modeling complex spatio-temporal correlations in traffic data using graph neural networks (GNNs).
Existing methods follow the paradigm of message passing that aggregates neighborhood information linearly.
In this paper, we propose a model named Dynamic Hypergraph Structure Learning (DyHSL) for traffic flow prediction.
arXiv Detail & Related papers (2023-09-21T12:44:55Z) - A Dynamic Temporal Self-attention Graph Convolutional Network for
Traffic Prediction [7.23135508361981]
This paper proposes a dynamic temporal self-attention graph convolutional network (DT-SGN) model that treats the adjacency matrix as a trainable attention score matrix.
Experiments demonstrate the superiority of our method over state-of-the-art model-driven and data-driven models on real-world traffic datasets.
arXiv Detail & Related papers (2023-02-21T03:51:52Z) - PDFormer: Propagation Delay-Aware Dynamic Long-Range Transformer for
Traffic Flow Prediction [78.05103666987655]
Spatial-temporal Graph Neural Network (GNN) models have emerged as one of the most promising methods to solve this problem.
We propose a novel propagation delay-aware dynamic long-range transFormer, namely PDFormer, for accurate traffic flow prediction.
Our method can not only achieve state-of-the-art performance but also exhibit competitive computational efficiency.
arXiv Detail & Related papers (2023-01-19T08:42:40Z) - D2-TPred: Discontinuous Dependency for Trajectory Prediction under
Traffic Lights [68.76631399516823]
We present a trajectory prediction approach with respect to traffic lights, D2-TPred, using a spatial dynamic interaction graph (SDG) and a behavior dependency graph (BDG)
Our experimental results show that our model achieves improvements of more than 20.45% and 20.78% in terms of ADE and FDE, respectively, on VTP-TL.
arXiv Detail & Related papers (2022-07-21T10:19:07Z) - DMGCRN: Dynamic Multi-Graph Convolution Recurrent Network for Traffic
Forecasting [7.232141271583618]
We propose a novel dynamic multi-graph convolution recurrent network (DMGCRN) to tackle the above issues.
We use a distance-based graph to capture spatial information from nodes that are close in distance.
We also construct a novel latent graph, which encodes the structural correlations among roads, to capture spatial information from nodes that are similar in structure.
arXiv Detail & Related papers (2021-12-04T06:51:55Z) - Spatio-Temporal Joint Graph Convolutional Networks for Traffic
Forecasting [75.10017445699532]
Recent studies have shifted their focus towards formulating traffic forecasting as a spatial-temporal graph modeling problem.
We propose a novel approach for accurate traffic forecasting on road networks over multiple future time steps.
arXiv Detail & Related papers (2021-11-25T08:45:14Z) - Convolutional Tensor-Train LSTM for Spatio-temporal Learning [116.24172387469994]
We propose a higher-order LSTM model that can efficiently learn long-term correlations in the video sequence.
This is accomplished through a novel tensor train module that performs prediction by combining convolutional features across time.
Our results achieve state-of-the-art performance in a wide range of applications and datasets.
arXiv Detail & Related papers (2020-02-21T05:00:01Z) - Spatial-Temporal Transformer Networks for Traffic Flow Forecasting [74.76852538940746]
We propose a novel paradigm of Spatial-Temporal Transformer Networks (STTNs) to improve the accuracy of long-term traffic forecasting.
Specifically, we present a new variant of graph neural networks, named spatial transformer, by dynamically modeling directed spatial dependencies.
The proposed model enables fast and scalable training over long-range spatial-temporal dependencies.
arXiv Detail & Related papers (2020-01-09T10:21:04Z)