DETNO: A Diffusion-Enhanced Transformer Neural Operator for Long-Term Traffic Forecasting
- URL: http://arxiv.org/abs/2508.19389v1
- Date: Tue, 26 Aug 2025 19:32:32 GMT
- Title: DETNO: A Diffusion-Enhanced Transformer Neural Operator for Long-Term Traffic Forecasting
- Authors: Owais Ahmad, Milad Ramezankhani, Anirudh Deodhar
- Abstract summary: Accurate long-term traffic forecasting remains a critical challenge in intelligent transportation systems. We introduce a unified Diffusion-Enhanced Transformer Neural Operator architecture. Our method demonstrates superior performance in extended rollout predictions compared to traditional and transformer-based neural operators.
- Score: 0.764671395172401
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Accurate long-term traffic forecasting remains a critical challenge in intelligent transportation systems, particularly when predicting high-frequency traffic phenomena such as shock waves and congestion boundaries over extended rollout horizons. Neural operators have recently gained attention as promising tools for modeling traffic flow. While effective at learning function space mappings, they inherently produce smooth predictions that fail to reconstruct high-frequency features such as sharp density gradients, resulting in rapid error accumulation during the multi-step rollout predictions essential for real-time traffic management. To address these fundamental limitations, we introduce a unified Diffusion-Enhanced Transformer Neural Operator (DETNO) architecture. DETNO leverages a transformer neural operator with cross-attention mechanisms, providing model expressivity and super-resolution, coupled with a diffusion-based refinement component that iteratively reconstructs high-frequency traffic details through progressive denoising. This overcomes the inherent smoothing limitations and rollout instability of standard neural operators. Through comprehensive evaluation on chaotic traffic datasets, our method demonstrates superior performance in extended rollout predictions compared to traditional and transformer-based neural operators, preserving high-frequency components and improving stability over long prediction horizons.
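As a rough illustration of the two-stage design the abstract describes, the sketch below pairs a transformer-style operator (coarse next-state prediction) with a diffusion-style refiner (progressive denoising) inside an autoregressive rollout. This is a minimal PyTorch sketch, not the authors' implementation: the module names, grid size, layer widths, number of denoising steps, and the simple residual update rule are all illustrative assumptions.

```python
# Minimal sketch of the DETNO idea: operator predicts, diffusion-style refiner
# iteratively denoises the prediction before the next rollout step.
# All sizes, names, and the update rule are illustrative assumptions.
import torch
import torch.nn as nn

class OperatorBackbone(nn.Module):
    """Coarse next-state predictor over a 1-D density profile (hypothetical)."""
    def __init__(self, n_points: int = 128, d_model: int = 64):
        super().__init__()
        self.lift = nn.Linear(1, d_model)          # lift scalar density to features
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.proj = nn.Linear(d_model, 1)          # project back to density

    def forward(self, rho):                        # rho: (batch, n_points)
        h = self.lift(rho.unsqueeze(-1))           # (batch, n_points, d_model)
        h = self.encoder(h)                        # attention over grid points
        return self.proj(h).squeeze(-1)            # smooth coarse prediction

class Refiner(nn.Module):
    """One denoising step: estimates the noise left in a coarse prediction."""
    def __init__(self, n_points: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_points + 1, 256), nn.GELU(), nn.Linear(256, n_points)
        )

    def forward(self, x, t):                       # x: (batch, n_points), t: (batch, 1)
        return self.net(torch.cat([x, t], dim=-1)) # condition on denoising step

@torch.no_grad()
def rollout(backbone, refiner, rho0, horizon=10, denoise_steps=5):
    """Autoregressive rollout: coarse predict, then progressively refine."""
    states, rho = [], rho0
    for _ in range(horizon):
        rho = backbone(rho)                        # smooth operator prediction
        for k in reversed(range(denoise_steps)):   # progressive denoising
            t = torch.full((rho.shape[0], 1), k / denoise_steps)
            rho = rho - 0.1 * refiner(rho, t)      # illustrative update rule
        states.append(rho)
    return torch.stack(states, dim=1)              # (batch, horizon, n_points)

if __name__ == "__main__":
    backbone, refiner = OperatorBackbone(), Refiner()
    rho0 = torch.rand(2, 128)                      # toy initial density profiles
    preds = rollout(backbone, refiner, rho0)
    print(preds.shape)                             # torch.Size([2, 10, 128])
```

In this arrangement the refiner only post-processes each operator output, which is what would let a denoising stage restore the high-frequency detail the smooth operator prediction loses before the next rollout step consumes it.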
Related papers
- Structured Noise Modeling for Enhanced Time-Series Forecasting [0.0]
This work introduces a forecast-blur-denoise framework that improves temporal fidelity through structured noise modeling. Experiments across electricity, traffic, and solar datasets show consistent gains in multi-horizon accuracy and stability. This framework contributes to more trustworthy AI systems used in forecasting-driven decision support across energy, infrastructure, and other time-critical domains.
arXiv Detail & Related papers (2025-11-24T19:44:46Z)
- Unveiling the Power of Noise Priors: Enhancing Diffusion Models for Mobile Traffic Prediction [8.208273046006697]
We propose NPDiff, a framework that decomposes noise into prior and residual components, with the prior derived from data dynamics. NPDiff can seamlessly integrate with various diffusion-based prediction models, delivering predictions that are effective, efficient, and robust.
arXiv Detail & Related papers (2025-01-23T16:13:08Z)
- Implicit factorized transformer approach to fast prediction of turbulent channel flows [6.70175842351963]
We introduce a modified implicit factorized transformer (IFactFormer-m) model, which replaces the original chained factorized attention with parallel factorized attention. The IFactFormer-m model successfully performs long-term predictions for turbulent channel flow.
arXiv Detail & Related papers (2024-12-25T09:05:14Z)
- A Multi-Channel Spatial-Temporal Transformer Model for Traffic Flow Forecasting [0.0]
We propose a multi-channel spatial-temporal transformer model for traffic flow forecasting.
It improves prediction accuracy by fusing results from different channels of traffic data.
Experimental results on six real-world datasets demonstrate that introducing a multi-channel mechanism into the temporal model enhances performance.
arXiv Detail & Related papers (2024-05-10T06:37:07Z)
- Accelerating Scalable Graph Neural Network Inference with Node-Adaptive Propagation [80.227864832092]
Graph neural networks (GNNs) have exhibited exceptional efficacy in a diverse array of applications.
The sheer size of large-scale graphs presents a significant challenge to real-time inference with GNNs.
We propose an online propagation framework and two novel node-adaptive propagation methods.
arXiv Detail & Related papers (2023-10-17T05:03:00Z)
- Towards Long-Term predictions of Turbulence using Neural Operators [68.8204255655161]
This work aims to develop reduced-order/surrogate models for turbulent flow simulations using Machine Learning.
Different model structures are analyzed, with U-NET structures performing better than the standard FNO in accuracy and stability.
arXiv Detail & Related papers (2023-07-25T14:09:53Z)
- PDFormer: Propagation Delay-Aware Dynamic Long-Range Transformer for Traffic Flow Prediction [78.05103666987655]
Spatial-temporal Graph Neural Network (GNN) models have emerged as among the most promising methods for this problem.
We propose a novel propagation delay-aware dynamic long-range transFormer, namely PDFormer, for accurate traffic flow prediction.
Our method can not only achieve state-of-the-art performance but also exhibit competitive computational efficiency.
arXiv Detail & Related papers (2023-01-19T08:42:40Z)
- Towards Long-Term Time-Series Forecasting: Feature, Pattern, and Distribution [57.71199089609161]
Long-term time-series forecasting (LTTF) has become a pressing demand in many applications, such as wind power supply planning.
Transformer models have been adopted for their high prediction capacity, though the self-attention mechanism is computationally expensive.
We propose an efficient Transformer-based model, named Conformer, which differentiates itself from existing methods for LTTF in three aspects.
arXiv Detail & Related papers (2023-01-05T13:59:29Z)
- Traffic Flow Prediction via Variational Bayesian Inference-based Encoder-Decoder Framework [1.181206257787103]
This paper proposes a deep encoder-decoder prediction framework based on variational Bayesian inference.
A Bayesian neural network is constructed by combining variational inference with gated recurrent units (GRU) and used as the deep neural network unit of the encoder-decoder framework.
The proposed model achieves superior prediction performance over the benchmarks on the Guangzhou urban traffic flow dataset.
arXiv Detail & Related papers (2022-12-14T12:39:47Z)
- Correlating sparse sensing for large-scale traffic speed estimation: A Laplacian-enhanced low-rank tensor kriging approach [76.45949280328838]
We propose a Laplacian-enhanced low-rank tensor (LETC) framework featuring both low-rankness and multi-temporal correlations for large-scale traffic speed kriging.
We then design an efficient solution algorithm via several effective numeric techniques to scale up the proposed model to network-wide kriging.
arXiv Detail & Related papers (2022-10-21T07:25:57Z)
- Spatial-Temporal Transformer Networks for Traffic Flow Forecasting [74.76852538940746]
We propose a novel paradigm of Spatial-Temporal Transformer Networks (STTNs) to improve the accuracy of long-term traffic forecasting.
Specifically, we present a new variant of graph neural networks, named spatial transformer, by dynamically modeling directed spatial dependencies.
The proposed model enables fast and scalable training over long-range spatial-temporal dependencies.
arXiv Detail & Related papers (2020-01-09T10:21:04Z)
This list is automatically generated from the titles and abstracts of the papers on this site.