Deformable Dynamic Convolution for Accurate yet Efficient Spatio-Temporal Traffic Prediction
- URL: http://arxiv.org/abs/2507.11550v2
- Date: Fri, 19 Sep 2025 16:33:33 GMT
- Title: Deformable Dynamic Convolution for Accurate yet Efficient Spatio-Temporal Traffic Prediction
- Authors: Hyeonseok Jin, Geonmin Kim, Kyungbaek Kim
- Abstract summary: Deformable Dynamic Convolutional Network (DDCN) is a novel CNN-based architecture that integrates both deformable and dynamic convolution operations. DDCN achieves competitive predictive performance while significantly reducing computational costs, underscoring its potential for large-scale and real-time deployment.
- Score: 1.7268829007643391
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Traffic prediction is a critical component of intelligent transportation systems, enabling applications such as congestion mitigation and accident risk prediction. While recent research has explored both graph-based and grid-based approaches, key limitations remain. Graph-based methods effectively capture non-Euclidean spatial structures but often incur high computational overhead, limiting their practicality in large-scale systems. In contrast, grid-based methods, which primarily leverage Convolutional Neural Networks (CNNs), offer greater computational efficiency but struggle to model irregular spatial patterns due to the fixed shape of their filters. Moreover, both approaches often fail to account for inherent spatio-temporal heterogeneity, as they typically apply a shared set of parameters across diverse regions and time periods. To address these challenges, we propose the Deformable Dynamic Convolutional Network (DDCN), a novel CNN-based architecture that integrates both deformable and dynamic convolution operations. The deformable layer introduces learnable offsets to create flexible receptive fields that better align with spatial irregularities, while the dynamic layer generates region-specific filters, allowing the model to adapt to varying spatio-temporal traffic patterns. By combining these two components, DDCN effectively captures both non-Euclidean spatial structures and spatio-temporal heterogeneity. Extensive experiments on four real-world traffic datasets demonstrate that DDCN achieves competitive predictive performance while significantly reducing computational costs, underscoring its potential for large-scale and real-time deployment.
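The two operations the abstract combines can be illustrated in a few lines: a deformable tap samples the input at a base grid position plus a learnable fractional offset (via bilinear interpolation), and a dynamic filter uses per-location weights instead of one shared kernel. The following NumPy sketch is a toy single-channel illustration of that idea, not the paper's implementation; all function names and shapes are illustrative assumptions.

```python
import numpy as np

def bilinear(img, y, x):
    """Bilinearly sample img at fractional coordinates (y, x), clipped to the border."""
    H, W = img.shape
    y = np.clip(y, 0, H - 1); x = np.clip(x, 0, W - 1)
    y0, x0 = int(np.floor(y)), int(np.floor(x))
    y1, x1 = min(y0 + 1, H - 1), min(x0 + 1, W - 1)
    wy, wx = y - y0, x - x0
    return ((1 - wy) * (1 - wx) * img[y0, x0] + (1 - wy) * wx * img[y0, x1]
            + wy * (1 - wx) * img[y1, x0] + wy * wx * img[y1, x1])

def deformable_dynamic_conv(img, offsets, weights, k=3):
    """Toy deformable + dynamic convolution on one channel.

    offsets: (H, W, k*k, 2) per-tap (dy, dx) displacements (deformable branch)
    weights: (H, W, k*k) per-location filter weights (dynamic branch)
    """
    H, W = img.shape
    r = k // 2
    taps = [(dy, dx) for dy in range(-r, r + 1) for dx in range(-r, r + 1)]
    out = np.zeros((H, W))
    for i in range(H):
        for j in range(W):
            acc = 0.0
            for t, (dy, dx) in enumerate(taps):
                oy, ox = offsets[i, j, t]          # learned shift of this tap
                acc += weights[i, j, t] * bilinear(img, i + dy + oy, j + dx + ox)
            out[i, j] = acc
    return out

rng = np.random.default_rng(0)
img = rng.standard_normal((8, 8))
offsets = np.zeros((8, 8, 9, 2))        # zero offsets -> the ordinary sampling grid
weights = np.full((8, 8, 9), 1.0 / 9)   # uniform weights -> a plain box filter
smoothed = deformable_dynamic_conv(img, offsets, weights)
print(smoothed.shape)  # (8, 8)
```

With zero offsets and shared uniform weights this reduces to an ordinary convolution; learning non-zero offsets bends the receptive field around irregular road geometry, while location-dependent weights let different regions use different filters.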
Related papers
- Wireless Traffic Prediction with Large Language Model [54.07581399989292]
TIDES is a novel framework that captures spatial-temporal correlations for wireless traffic prediction. TIDES achieves efficient adaptation to domain-specific patterns without incurring excessive training overhead. Our results indicate that integrating spatial awareness into LLM-based predictors is the key to unlocking scalable and intelligent network management in future 6G systems.
arXiv Detail & Related papers (2025-12-19T04:47:40Z) - Adaptive Mesh-Quantization for Neural PDE Solvers [51.26961483962011]
Graph Neural Networks can handle the irregular meshes required for complex geometries and boundary conditions, but still apply uniform computational effort across all nodes. We propose Adaptive Mesh Quantization: spatially adaptive quantization across mesh node, edge, and cluster features, dynamically adjusting the bit-width used by a quantized model. We demonstrate our framework's effectiveness by integrating it with two state-of-the-art models, MP-PDE and GraphViT, to evaluate performance across multiple tasks.
arXiv Detail & Related papers (2025-11-23T14:47:24Z) - A Joint Topology-Data Fusion Graph Network for Robust Traffic Speed Prediction with Data Anomalism [12.43932698231744]
We propose Graph Fusion Enhanced Network (GFEN), an innovative framework for network-level traffic speed prediction. GFEN extracts spatial-temporal correlations from both data distribution and network topology using trainable methods. Experiments demonstrate that GFEN surpasses state-of-the-art methods by approximately 6.3% in prediction accuracy and converges nearly twice as fast as recent hybrid models.
arXiv Detail & Related papers (2025-06-30T06:33:47Z) - Virtual Nodes Improve Long-term Traffic Prediction [9.125554921271338]
This study introduces a novel framework that incorporates virtual nodes, which are additional nodes added to the graph and connected to existing nodes. Our proposed model incorporates virtual nodes by constructing a semi-adaptive adjacency matrix. Experimental results demonstrate that the inclusion of virtual nodes significantly enhances long-term prediction accuracy.
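The semi-adaptive adjacency idea above can be sketched as follows: keep the data-given adjacency for real-node pairs, and make the edges to and from the added virtual nodes free parameters. This is a toy NumPy sketch under that reading; the function name and initialization are illustrative, not the paper's API.

```python
import numpy as np

def augment_with_virtual_nodes(A, num_virtual, rng=None):
    """Semi-adaptive adjacency: the real-node block is the given graph;
    real<->virtual edges start as learnable (here: random) parameters."""
    rng = rng or np.random.default_rng(0)
    n = A.shape[0]
    m = n + num_virtual
    A_aug = np.zeros((m, m))
    A_aug[:n, :n] = A                            # fixed, data-given topology
    learn = rng.uniform(size=(n, num_virtual))   # adaptive real<->virtual edges
    A_aug[:n, n:] = learn
    A_aug[n:, :n] = learn.T                      # keep the graph symmetric
    return A_aug

A = np.eye(4)                                    # toy 4-node graph
A_aug = augment_with_virtual_nodes(A, num_virtual=2)
print(A_aug.shape)  # (6, 6)
```

Because every real node can reach every other real node through a virtual node in two hops, message passing over `A_aug` shortens long-range paths, which is the intuition behind the long-term accuracy gains.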
arXiv Detail & Related papers (2025-01-17T09:09:01Z) - A Multi-Layer CNN-GRUSKIP model based on transformer for Spatial-Temporal traffic flow prediction [0.06597195879147556]
Traffic flow prediction remains a cornerstone for intelligent transportation systems (ITS). The CNN-GRUSKIP model emerges as a pioneering approach. The model consistently outperformed established models such as ARIMA, Graph WaveNet, HA, LSTM, STGCN, and APT. With its potent predictive prowess and adaptive architecture, the CNN-GRUSKIP model stands to redefine ITS applications.
arXiv Detail & Related papers (2025-01-09T21:30:02Z) - Improving Traffic Flow Predictions with SGCN-LSTM: A Hybrid Model for Spatial and Temporal Dependencies [55.2480439325792]
This paper introduces the Signal-Enhanced Graph Convolutional Network Long Short Term Memory (SGCN-LSTM) model for predicting traffic speeds across road networks.
Experiments on the PEMS-BAY road network traffic dataset demonstrate the SGCN-LSTM model's effectiveness.
arXiv Detail & Related papers (2024-11-01T00:37:00Z) - Navigating Spatio-Temporal Heterogeneity: A Graph Transformer Approach for Traffic Forecasting [13.309018047313801]
Traffic forecasting has emerged as a crucial research area in the development of smart cities.
Recent advancements in modeling spatio-temporal correlations are starting to see diminishing returns in performance.
To tackle these challenges, we introduce the Spatio-Temporal Graph Transformer (STGormer)
We design two straightforward yet effective spatial encoding methods based on the graph structure and integrate time position encodings into the vanilla transformer to capture spatio-temporal traffic patterns.
arXiv Detail & Related papers (2024-08-20T13:18:21Z) - Accelerating Scalable Graph Neural Network Inference with Node-Adaptive Propagation [80.227864832092]
Graph neural networks (GNNs) have exhibited exceptional efficacy in a diverse array of applications.
The sheer size of large-scale graphs presents a significant challenge to real-time inference with GNNs.
We propose an online propagation framework and two novel node-adaptive propagation methods.
arXiv Detail & Related papers (2023-10-17T05:03:00Z) - Contextualizing MLP-Mixers Spatiotemporally for Urban Data Forecast at Scale [54.15522908057831]
We propose an adapted, computationally efficient version of the MLP-Mixer for STTD forecasting at scale.
Our results surprisingly show that this simple-yet-effective solution can rival SOTA baselines when tested on several traffic benchmarks.
Our findings contribute to the exploration of simple-yet-effective models for real-world STTD forecasting.
arXiv Detail & Related papers (2023-07-04T05:19:19Z) - Adaptive Hierarchical SpatioTemporal Network for Traffic Forecasting [70.66710698485745]
We propose an Adaptive Hierarchical SpatioTemporal Network (AHSTN) to promote traffic forecasting.
AHSTN exploits the spatial hierarchy and models multi-scale spatial correlations.
Experiments on two real-world datasets show that AHSTN achieves better performance over several strong baselines.
arXiv Detail & Related papers (2023-06-15T14:50:27Z) - PDFormer: Propagation Delay-Aware Dynamic Long-Range Transformer for Traffic Flow Prediction [78.05103666987655]
Spatial-temporal Graph Neural Network (GNN) models have emerged as one of the most promising methods to solve this problem.
We propose a novel propagation delay-aware dynamic long-range transFormer, namely PDFormer, for accurate traffic flow prediction.
Our method can not only achieve state-of-the-art performance but also exhibit competitive computational efficiency.
arXiv Detail & Related papers (2023-01-19T08:42:40Z) - Hybrid SNN-ANN: Energy-Efficient Classification and Object Detection for Event-Based Vision [64.71260357476602]
Event-based vision sensors encode local pixel-wise brightness changes in streams of events rather than image frames.
Recent progress in object recognition from event-based sensors has come from conversions of deep neural networks.
We propose a hybrid architecture for end-to-end training of deep neural networks for event-based pattern recognition and object detection.
arXiv Detail & Related papers (2021-12-06T23:45:58Z) - MAF-GNN: Multi-adaptive Spatiotemporal-flow Graph Neural Network for Traffic Speed Forecasting [3.614768552081925]
We propose a Multi-adaptive Spatiotemporal-flow Graph Neural Network (MAF-GNN) for traffic speed forecasting.
MAF-GNN introduces an effective Multi-adaptive Adjacency Matrices Mechanism to capture multiple latent spatial dependencies between traffic nodes.
It achieves better performance than other models on two real-world public traffic network datasets, METR-LA and PeMS-Bay.
arXiv Detail & Related papers (2021-08-08T09:06:43Z) - Spatial-Temporal Graph ODE Networks for Traffic Flow Forecasting [22.421667339552467]
Spatial-temporal forecasting has attracted tremendous attention in a wide range of applications, and traffic flow prediction is a canonical and typical example.
Existing works typically utilize shallow Graph Convolution Networks (GCNs) and temporal extraction modules to model spatial and temporal dependencies respectively.
We propose Spatial-Temporal Graph Ordinary Differential Equation Networks (STGODE), which captures spatial-temporal dynamics through a tensor-based ordinary differential equation (ODE).
We evaluate our model on multiple real-world traffic datasets and superior performance is achieved over state-of-the-art baselines.
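The ODE formulation above replaces stacked discrete GCN layers with continuous dynamics that can be integrated numerically. The following NumPy sketch shows the idea with a simplified stand-in dynamics `dX/dt = A·X·W - X` and an explicit Euler solver; the equation and all names are illustrative assumptions, not STGODE's actual tensor formulation.

```python
import numpy as np

def graph_ode_euler(X, A, W, steps=10, dt=0.1):
    """Euler integration of a toy graph ODE dX/dt = A @ X @ W - X.
    A propagates features over the graph, W transforms them, and the
    -X term keeps the continuous dynamics from blowing up."""
    for _ in range(steps):
        X = X + dt * (A @ X @ W - X)
    return X

n, d = 5, 3
rng = np.random.default_rng(1)
A = np.full((n, n), 1.0 / n)     # row-normalized dense adjacency (toy graph)
W = np.eye(d) * 0.5              # feature transform
X = rng.standard_normal((n, d))  # node features
X_out = graph_ode_euler(X, A, W)
print(X_out.shape)  # (5, 3)
```

Integrating for more steps plays the role of stacking more GCN layers, but with shared parameters and a depth controlled by the solver rather than the architecture.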
arXiv Detail & Related papers (2021-06-24T11:48:45Z) - Adaptive Latent Space Tuning for Non-Stationary Distributions [62.997667081978825]
We present a method for adaptive tuning of the low-dimensional latent space of deep encoder-decoder style CNNs.
We demonstrate our approach for predicting the properties of a time-varying charged particle beam in a particle accelerator.
arXiv Detail & Related papers (2021-05-08T03:50:45Z) - Learning dynamic and hierarchical traffic spatiotemporal features with Transformer [4.506591024152763]
This paper proposes a novel model, Traffic Transformer, for spatial-temporal graph modeling and long-term traffic forecasting.
The Transformer is the most popular framework in Natural Language Processing (NLP). Analyzing the attention weight matrices reveals the influential parts of road networks, allowing us to learn the traffic networks better.
arXiv Detail & Related papers (2021-04-12T02:29:58Z) - SST-GNN: Simplified Spatio-temporal Traffic forecasting model using Graph Neural Network [2.524966118517392]
We have designed a simplified Spatio-Temporal GNN (SST-GNN) that effectively encodes the dependency by separately aggregating different neighborhoods.
We have shown that our model has significantly outperformed the state-of-the-art models on three real-world traffic datasets.
arXiv Detail & Related papers (2021-03-31T18:28:44Z) - Spatial-Temporal Fusion Graph Neural Networks for Traffic Flow Forecasting [35.072979313851235]
Spatial-temporal forecasting of traffic flow is a challenging task because of complicated spatial dependencies and dynamic trends of temporal patterns between different roads.
Existing frameworks typically utilize given spatial adjacency graph and sophisticated mechanisms for modeling spatial and temporal correlations.
This paper proposes Spatial-Temporal Fusion Graph Neural Networks (STFGNN) for traffic flow forecasting.
arXiv Detail & Related papers (2020-12-15T14:03:17Z) - Spatial-Temporal Transformer Networks for Traffic Flow Forecasting [74.76852538940746]
We propose a novel paradigm of Spatial-Temporal Transformer Networks (STTNs) to improve the accuracy of long-term traffic forecasting.
Specifically, we present a new variant of graph neural networks, named spatial transformer, by dynamically modeling directed spatial dependencies.
The proposed model enables fast and scalable training over long-range spatial-temporal dependencies.
arXiv Detail & Related papers (2020-01-09T10:21:04Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.