Newell's theory based feature transformations for spatio-temporal traffic prediction
- URL: http://arxiv.org/abs/2307.05949v2
- Date: Mon, 17 Jul 2023 00:09:14 GMT
- Title: Newell's theory based feature transformations for spatio-temporal traffic prediction
- Authors: Agnimitra Sengupta, S. Ilgin Guler
- Abstract summary: We propose a traffic flow physics-based feature transformation for deep learning (DL) models for traffic flow forecasting.
This transformation incorporates Newell's uncongested and congested-state estimators of traffic flows at the target locations, enabling the models to learn broader dynamics of the system.
An important advantage of our framework is its ability to be transferred to new locations where data is unavailable.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Deep learning (DL) models for spatio-temporal traffic flow forecasting employ
convolutional or graph-convolutional filters along with recurrent neural
networks to capture spatial and temporal dependencies in traffic data. These
models, such as CNN-LSTM, utilize traffic flows from neighboring detector
stations to predict flows at a specific location of interest. However, these
models are limited in their ability to capture the broader dynamics of the
traffic system, as they primarily learn features specific to the detector
configuration and traffic characteristics at the target location. Hence, the
transferability of these models to different locations becomes challenging,
particularly when data is unavailable at the new location for model training.
To address this limitation, we propose a traffic flow physics-based feature
transformation for spatio-temporal DL models. This transformation incorporates
Newell's uncongested and congested-state estimators of traffic flows at the
target locations, enabling the models to learn broader dynamics of the system.
Our methodology is empirically validated using traffic data from two different
locations. The results demonstrate that the proposed feature transformation
improves the models' performance in predicting traffic flows over different
prediction horizons, as indicated by better goodness-of-fit statistics. An
important advantage of our framework is its ability to be transferred to new
locations where data is unavailable. This is achieved by appropriately
accounting for spatial dependencies based on station distances and various
traffic parameters. In contrast, regular DL models are not easily transferable
as their inputs remain fixed. It should be noted that due to data limitations,
we were unable to perform spatial sensitivity analysis, which calls for further
research using simulated data.
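The abstract does not spell out the exact form of the uncongested and congested-state estimators, but under Newell's simplified kinematic wave theory they can be read as upstream and downstream detector measurements shifted to the target location by the forward- and backward-wave travel times, with a jam-density storage correction on the congested branch. The sketch below illustrates this reading only; the function name `newell_estimators`, the parameter values (free-flow speed, backward wave speed, jam density, detector spacings), and the final feature stacking are illustrative assumptions, not the authors' implementation.

```python
# A minimal, illustrative sketch (not the authors' code) of turning Newell's
# simplified kinematic wave theory into physics-based input features for a
# spatio-temporal DL model. All numeric values are placeholder assumptions.
import numpy as np


def newell_estimators(q_up, q_down, dx_up, dx_down, dt,
                      v_free=100.0, w_back=20.0, k_jam=150.0):
    """Estimate uncongested/congested cumulative counts at a target location.

    q_up, q_down : flows (veh/h) at the upstream/downstream detectors,
                   one sample every `dt` hours.
    dx_up        : distance (km) from the upstream detector to the target.
    dx_down      : distance (km) from the target to the downstream detector.
    v_free       : free-flow speed (km/h), forward kinematic wave speed.
    w_back       : congested (backward) kinematic wave speed (km/h).
    k_jam        : jam density (veh/km), storage between target and downstream.
    """
    # Cumulative vehicle counts observed at the two detectors.
    n_up = np.cumsum(q_up) * dt
    n_down = np.cumsum(q_down) * dt

    # Wave travel times from each detector to the target, in sampling intervals.
    lag_u = int(round(dx_up / v_free / dt))
    lag_d = int(round(dx_down / w_back / dt))

    # Uncongested estimator: upstream count delayed by the free-flow lag.
    n_uncong = np.concatenate([np.full(lag_u, n_up[0]),
                               n_up[:len(n_up) - lag_u]])

    # Congested estimator: downstream count delayed by the backward-wave lag,
    # plus the vehicles that can be stored between the target and downstream.
    n_cong = np.concatenate([np.full(lag_d, n_down[0]),
                             n_down[:len(n_down) - lag_d]]) + k_jam * dx_down

    return n_uncong, n_cong


# Usage sketch: convert the estimators back to flows and stack them with the
# raw detector flows as model inputs. The series below are random placeholders
# standing in for real loop-detector data.
rng = np.random.default_rng(0)
dt = 5 / 60                                  # 5-minute samples, in hours
q_up = rng.uniform(500, 2000, 288)           # one day of upstream flows
q_down = rng.uniform(500, 2000, 288)         # one day of downstream flows

n_u, n_c = newell_estimators(q_up, q_down, dx_up=8.0, dx_down=4.0, dt=dt)
q_u = np.diff(n_u, prepend=n_u[0]) / dt      # uncongested flow estimate
q_c = np.diff(n_c, prepend=n_c[0]) / dt      # congested flow estimate

# Physics-informed feature matrix of shape (T, 4) for a CNN-LSTM-style model.
features = np.stack([q_up, q_down, q_u, q_c], axis=-1)
```

Because the estimators encode station distances and traffic parameters explicitly, the same transformation can be recomputed for a new location without local training data, which is the transferability argument made in the abstract.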
Related papers
- Improving Traffic Flow Predictions with SGCN-LSTM: A Hybrid Model for Spatial and Temporal Dependencies [55.2480439325792]
This paper introduces the Signal-Enhanced Graph Convolutional Network Long Short Term Memory (SGCN-LSTM) model for predicting traffic speeds across road networks.
Experiments on the PEMS-BAY road network traffic dataset demonstrate the SGCN-LSTM model's effectiveness.
arXiv Detail & Related papers (2024-11-01T00:37:00Z)
- Strada-LLM: Graph LLM for traffic prediction [62.2015839597764]
A considerable challenge in traffic prediction lies in handling the diverse data distributions caused by vastly different traffic conditions.
We propose a graph-aware LLM for traffic prediction that considers proximal traffic information.
We adopt a lightweight approach for efficient domain adaptation when facing new data distributions in a few-shot fashion.
arXiv Detail & Related papers (2024-10-28T09:19:29Z)
- Physics-guided Active Sample Reweighting for Urban Flow Prediction [75.24539704456791]
Urban flow prediction is a spatio-temporal modeling task that estimates the throughput of transportation services like buses, taxis, and ride-sharing.
Some recent prediction solutions bring remedies with the notion of physics-guided machine learning (PGML).
We develop a discretized physics-guided network (PN) and propose a data-aware framework, Physics-guided Active Sample Reweighting (P-GASR).
arXiv Detail & Related papers (2024-07-18T15:44:23Z)
- A Multi-Graph Convolutional Neural Network Model for Short-Term Prediction of Turning Movements at Signalized Intersections [0.6215404942415159]
This study introduces a novel deep learning architecture, referred to as the multigraph convolutional neural network (MGCNN), for turning movement prediction at intersections.
The proposed architecture combines a multigraph structure, built to model temporal variations in traffic data, with a spectral convolution operation to support modeling the spatial variations in traffic data over the graphs.
The model's ability to perform short-term predictions over 1, 2, 3, 4, and 5 minutes into the future was evaluated against four baseline state-of-the-art models.
arXiv Detail & Related papers (2024-06-02T05:41:25Z)
- Transport-Hub-Aware Spatial-Temporal Adaptive Graph Transformer for Traffic Flow Prediction [10.722455633629883]
We propose a Transport-Hub-aware spatial-temporal adaptive graph transFormer for traffic flow prediction.
Specifically, we first design a novel spatial self-attention module to capture the dynamic spatial dependencies.
We also employ a temporal self-attention module to detect dynamic temporal patterns in the traffic flow data.
arXiv Detail & Related papers (2023-10-12T13:44:35Z)
- PDFormer: Propagation Delay-Aware Dynamic Long-Range Transformer for Traffic Flow Prediction [78.05103666987655]
Spatial-temporal Graph Neural Network (GNN) models have emerged as one of the most promising methods for traffic flow prediction.
We propose a novel propagation delay-aware dynamic long-range transFormer, namely PDFormer, for accurate traffic flow prediction.
Our method can not only achieve state-of-the-art performance but also exhibit competitive computational efficiency.
arXiv Detail & Related papers (2023-01-19T08:42:40Z)
- STLGRU: Spatio-Temporal Lightweight Graph GRU for Traffic Flow Prediction [0.40964539027092917]
We propose STLGRU, a novel traffic forecasting model for predicting traffic flow accurately.
Our proposed STLGRU can effectively capture dynamic local and global spatial-temporal relations of traffic networks.
Our method can not only achieve state-of-the-art performance but also exhibit competitive computational efficiency.
arXiv Detail & Related papers (2022-12-08T20:24:59Z)
- Correlating sparse sensing for large-scale traffic speed estimation: A Laplacian-enhanced low-rank tensor kriging approach [76.45949280328838]
We propose a Laplacian-enhanced low-rank tensor completion (LETC) framework featuring both low-rankness and multi-temporal correlations for large-scale traffic speed kriging.
We then design an efficient solution algorithm via several effective numeric techniques to scale up the proposed model to network-wide kriging.
arXiv Detail & Related papers (2022-10-21T07:25:57Z)
- Learning dynamic and hierarchical traffic spatiotemporal features with Transformer [4.506591024152763]
This paper proposes a novel model, Traffic Transformer, for spatial-temporal graph modeling and long-term traffic forecasting.
Transformer is the most popular framework in Natural Language Processing (NLP).
Analyzing the attention weight matrices can identify the influential parts of road networks, allowing us to learn the traffic networks better.
arXiv Detail & Related papers (2021-04-12T02:29:58Z)
- Spatial-Temporal Transformer Networks for Traffic Flow Forecasting [74.76852538940746]
We propose a novel paradigm of Spatial-Temporal Transformer Networks (STTNs) to improve the accuracy of long-term traffic forecasting.
Specifically, we present a new variant of graph neural networks, named spatial transformer, by dynamically modeling directed spatial dependencies.
The proposed model enables fast and scalable training over long-range spatial-temporal dependencies.
arXiv Detail & Related papers (2020-01-09T10:21:04Z)