MCST-Mamba: Multivariate Mamba-Based Model for Traffic Prediction
- URL: http://arxiv.org/abs/2507.03927v1
- Date: Sat, 05 Jul 2025 07:22:36 GMT
- Title: MCST-Mamba: Multivariate Mamba-Based Model for Traffic Prediction
- Authors: Mohamed Hamad, Mohamed Mabrok, Nizar Zorba
- Abstract summary: We introduce the Multi-Channel Spatio-Temporal (MCST) Mamba model, a forecasting framework built on the Mamba selective state-space architecture. Our results show that MCST-Mamba achieves strong predictive performance with a lower parameter count compared to baseline models.
- Score: 4.499271869809039
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Accurate traffic prediction plays a vital role in intelligent transportation systems by enabling efficient routing, congestion mitigation, and proactive traffic control. However, forecasting is challenging due to the combined effects of dynamic road conditions, varying traffic patterns across different locations, and external influences such as weather and accidents. Traffic data often consists of several interrelated measurements - such as speed, flow and occupancy - yet many deep-learning approaches either predict only one of these variables or require a separate model for each. This limits their ability to capture joint patterns across channels. To address this, we introduce the Multi-Channel Spatio-Temporal (MCST) Mamba model, a forecasting framework built on the Mamba selective state-space architecture that natively handles multivariate inputs and simultaneously models all traffic features. The proposed MCST-Mamba model integrates adaptive spatio-temporal embeddings and separates the modeling of temporal sequences and spatial sensor interactions into two dedicated Mamba blocks, improving representation learning. Unlike prior methods that evaluate on a single channel, we assess MCST-Mamba across all traffic features at once, aligning more closely with how congestion arises in practice. Our results show that MCST-Mamba achieves strong predictive performance with a lower parameter count compared to baseline models.
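The abstract's central design, one selective state-space (Mamba-style) block scanning each sensor's series along the time axis and a second block scanning across sensors at each time step, can be illustrated with a simplified diagonal selective scan. The sketch below is a hedged NumPy illustration, not the authors' implementation: the gating, dimensions, random projections, and initialization are all illustrative assumptions.

```python
import numpy as np

def selective_scan(x, dt, A, B, C):
    """Simplified diagonal selective state-space scan (Mamba-style).
    x: (T, d) input sequence; dt: (T, d) input-dependent step sizes;
    A: (d, n) per-channel state dynamics; B, C: (T, n) input-dependent
    projections. Returns y: (T, d)."""
    T, d = x.shape
    n = A.shape[1]
    h = np.zeros((d, n))                          # hidden state per channel
    y = np.zeros((T, d))
    for t in range(T):
        dA = np.exp(dt[t][:, None] * A)           # discretized decay, (d, n)
        dB = dt[t][:, None] * B[t][None, :]       # discretized input,  (d, n)
        h = dA * h + dB * x[t][:, None]           # selective recurrence
        y[t] = h @ C[t]                           # read-out
    return y

rng = np.random.default_rng(0)
T, N, d, n = 12, 4, 8, 16        # time steps, sensors, channels, state size
A = -np.abs(rng.standard_normal((d, n)))          # stable (negative) dynamics

def mamba_block(seq):
    """One illustrative block: dt, B, C depend on the input, which is the
    'selective' part of a selective SSM (parameters here are random stand-ins)."""
    dt = np.log1p(np.exp(rng.standard_normal(seq.shape)))   # softplus > 0
    B = rng.standard_normal((seq.shape[0], n))
    C = rng.standard_normal((seq.shape[0], n))
    return selective_scan(seq, dt, A, B, C)

# Multivariate traffic tensor: (time, sensors, channels), where channels
# would hold interrelated measurements such as speed, flow, and occupancy.
X = rng.standard_normal((T, N, d))

# Temporal block: scan each sensor's series along the time axis.
H_t = np.stack([mamba_block(X[:, i, :]) for i in range(N)], axis=1)
# Spatial block: treat the sensor axis as a sequence at each time step.
H_s = np.stack([mamba_block(H_t[t]) for t in range(T)], axis=0)

print(H_s.shape)  # (12, 4, 8)
```

Separating the two scans mirrors the paper's claim that dedicating one block to temporal sequences and another to spatial sensor interactions improves representation learning, while the shared channel dimension lets all traffic features be modeled jointly.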
Related papers
- Rethinking Traffic Flow Forecasting: From Transition to Generation [0.0]
We propose an Effective Multi-Branch Similarity Transformer for Traffic Flow Prediction, namely EMBSFormer. We find that the factors affecting traffic flow include node-level traffic generation and graph-level traffic transition, which describe the multi-periodicity and interaction patterns of nodes, respectively. For traffic transition, we employ a temporal and spatial self-attention mechanism to maintain global node interactions, and use GNN and temporal convolution to model local node interactions.
arXiv Detail & Related papers (2025-04-19T09:52:39Z) - Improving Traffic Flow Predictions with SGCN-LSTM: A Hybrid Model for Spatial and Temporal Dependencies [55.2480439325792]
This paper introduces the Signal-Enhanced Graph Convolutional Network Long Short Term Memory (SGCN-LSTM) model for predicting traffic speeds across road networks.
Experiments on the PEMS-BAY road network traffic dataset demonstrate the SGCN-LSTM model's effectiveness.
arXiv Detail & Related papers (2024-11-01T00:37:00Z) - A Time Series is Worth Five Experts: Heterogeneous Mixture of Experts for Traffic Flow Prediction [9.273632869779929]
We propose a Heterogeneous Mixture of Experts (TITAN) model for traffic flow prediction.
Experiments on two public traffic network datasets, METR-LA and PEMS-BAY, demonstrate that TITAN effectively captures variable-centric dependencies.
It achieves improvements in all evaluation metrics, ranging from approximately 4.37% to 11.53%, compared to previous state-of-the-art (SOTA) models.
arXiv Detail & Related papers (2024-09-26T00:26:47Z) - ST-MambaSync: The Complement of Mamba and Transformers for Spatial-Temporal in Traffic Flow Prediction [36.89741338367832]
This paper introduces ST-MambaSync, an innovative traffic flow prediction model that combines transformer technology with the ST-Mamba block.
We are the pioneers in employing the Mamba mechanism, integrated with ResNet, within a transformer framework.
arXiv Detail & Related papers (2024-04-24T14:41:41Z) - Deep Multi-View Channel-Wise Spatio-Temporal Network for Traffic Flow Prediction [18.008631008649658]
Multi-View Channel-wise Spatio-Temporal Network (MVC-STNet)
We study the novel problem of multi-channel traffic flow prediction, and propose a deep Multi-View Channel-wise Spatio-Temporal Network (MVC-STNet).
arXiv Detail & Related papers (2024-04-23T13:39:04Z) - Trajeglish: Traffic Modeling as Next-Token Prediction [67.28197954427638]
A longstanding challenge for self-driving development is simulating dynamic driving scenarios seeded from recorded driving logs.
We apply tools from discrete sequence modeling to model how vehicles, pedestrians and cyclists interact in driving scenarios.
Our model tops the Sim Agents Benchmark, surpassing prior work along the realism meta metric by 3.3% and along the interaction metric by 9.9%.
arXiv Detail & Related papers (2023-12-07T18:53:27Z) - Contextualizing MLP-Mixers Spatiotemporally for Urban Data Forecast at Scale [54.15522908057831]
We propose an adapted version of the computationally efficient MLP-Mixer for STTD forecast at scale.
Our results surprisingly show that this simple-yet-effective solution can rival SOTA baselines when tested on several traffic benchmarks.
Our findings contribute to the exploration of simple-yet-effective models for real-world STTD forecasting.
arXiv Detail & Related papers (2023-07-04T05:19:19Z) - PDFormer: Propagation Delay-Aware Dynamic Long-Range Transformer for Traffic Flow Prediction [78.05103666987655]
Spatial-temporal Graph Neural Network (GNN) models have emerged as one of the most promising methods to solve this problem.
We propose a novel propagation delay-aware dynamic long-range transFormer, namely PDFormer, for accurate traffic flow prediction.
Our method can not only achieve state-of-the-art performance but also exhibit competitive computational efficiency.
arXiv Detail & Related papers (2023-01-19T08:42:40Z) - Short-term passenger flow prediction for multi-traffic modes: A residual network and Transformer based multi-task learning method [21.13073816634534]
Res-Transformer is a learning model for short-term passenger flow prediction of multi-traffic modes.
The model is evaluated on two large-scale real-world datasets from Beijing, China.
This paper can give critical insights into short-term passenger flow prediction for multi-traffic modes.
arXiv Detail & Related papers (2022-02-27T01:09:19Z) - Multi-intersection Traffic Optimisation: A Benchmark Dataset and a Strong Baseline [85.9210953301628]
Control of traffic signals is fundamental and critical to alleviate traffic congestion in urban areas.
Because of the high complexity of modelling the problem, experimental settings of current works are often inconsistent.
We propose a novel and strong baseline model based on deep reinforcement learning with the encoder-decoder structure.
arXiv Detail & Related papers (2021-01-24T03:55:39Z) - Spatial-Temporal Transformer Networks for Traffic Flow Forecasting [74.76852538940746]
We propose a novel paradigm of Spatial-Temporal Transformer Networks (STTNs) to improve the accuracy of long-term traffic forecasting.
Specifically, we present a new variant of graph neural networks, named spatial transformer, by dynamically modeling directed spatial dependencies.
The proposed model enables fast and scalable training over long-range spatial-temporal dependencies.
arXiv Detail & Related papers (2020-01-09T10:21:04Z)
This list is automatically generated from the titles and abstracts of the papers in this site.