HUTFormer: Hierarchical U-Net Transformer for Long-Term Traffic
Forecasting
- URL: http://arxiv.org/abs/2307.14596v1
- Date: Thu, 27 Jul 2023 02:43:21 GMT
- Title: HUTFormer: Hierarchical U-Net Transformer for Long-Term Traffic
Forecasting
- Authors: Zezhi Shao, Fei Wang, Zhao Zhang, Yuchen Fang, Guangyin Jin, Yongjun
Xu
- Abstract summary: We make the first attempt to explore long-term traffic forecasting, e.g., 1-day forecasting.
We propose a novel Hierarchical U-net TransFormer to address the issues of long-term traffic forecasting.
The proposed HUTFormer significantly outperforms state-of-the-art traffic forecasting and long time series forecasting baselines.
- Score: 13.49661832917228
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Traffic forecasting, which aims to predict traffic conditions based on
historical observations, has been an enduring research topic and is widely
recognized as an essential component of intelligent transportation. Recent
proposals on Spatial-Temporal Graph Neural Networks (STGNNs) have made
significant progress by combining sequential models with graph convolution
networks. However, due to their high computational complexity, STGNNs focus only on
short-term traffic forecasting, e.g., 1-hour forecasting, while ignoring the more
practical long-term setting. In this paper, we make the first attempt to
explore long-term traffic forecasting, e.g., 1-day forecasting. To this end, we
first reveal its unique challenges in exploiting multi-scale representations.
Then, we propose a novel Hierarchical U-net TransFormer (HUTFormer) to address
the issues of long-term traffic forecasting. HUTFormer consists of a
hierarchical encoder and decoder to jointly generate and utilize multi-scale
representations of traffic data. Specifically, for the encoder, we propose
window self-attention and segment merging to extract multi-scale
representations from long-term traffic data. For the decoder, we design a
cross-scale attention mechanism to effectively incorporate multi-scale
representations. In addition, HUTFormer employs an efficient input embedding
strategy to address the complexity issues. Extensive experiments on four
traffic datasets show that the proposed HUTFormer significantly outperforms
state-of-the-art traffic forecasting and long time series forecasting
baselines.
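
As a rough illustration of the mechanisms named in the abstract (window self-attention, segment merging, and cross-scale attention), the sketch below wires them together in PyTorch. All module names, window sizes, and dimensions are assumptions chosen for illustration, not details from the paper; this is a minimal sketch, not the authors' implementation.

```python
# Minimal, illustrative PyTorch sketch of the mechanisms named in the abstract:
# window self-attention and segment merging in a hierarchical encoder, plus
# cross-scale attention in the decoder. All names, window sizes, and dimensions
# are assumptions for illustration; this is not the authors' implementation.
import torch
import torch.nn as nn


class WindowSelfAttention(nn.Module):
    """Self-attention restricted to non-overlapping temporal windows."""

    def __init__(self, d_model: int, n_heads: int, window: int):
        super().__init__()
        self.window = window
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, length, d_model); length is assumed divisible by window
        b, l, d = x.shape
        xw = x.reshape(b * (l // self.window), self.window, d)
        out, _ = self.attn(xw, xw, xw)
        return out.reshape(b, l, d)


class SegmentMerging(nn.Module):
    """Halve the temporal length by merging each pair of adjacent segments."""

    def __init__(self, d_model: int):
        super().__init__()
        self.proj = nn.Linear(2 * d_model, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, l, d = x.shape
        return self.proj(x.reshape(b, l // 2, 2 * d))


class CrossScaleAttention(nn.Module):
    """Decoder queries attend to the encoder representation of one scale."""

    def __init__(self, d_model: int, n_heads: int):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)

    def forward(self, queries: torch.Tensor, scale_repr: torch.Tensor) -> torch.Tensor:
        out, _ = self.attn(queries, scale_repr, scale_repr)
        return out


class ToyHierarchicalForecaster(nn.Module):
    """Hypothetical wiring: encode at several scales, then decode across them."""

    def __init__(self, d_model=64, n_heads=4, window=12, n_scales=3, horizon=288):
        super().__init__()
        self.encoders = nn.ModuleList(
            [WindowSelfAttention(d_model, n_heads, window) for _ in range(n_scales)])
        self.mergers = nn.ModuleList(
            [SegmentMerging(d_model) for _ in range(n_scales - 1)])
        self.cross = nn.ModuleList(
            [CrossScaleAttention(d_model, n_heads) for _ in range(n_scales)])
        self.query = nn.Parameter(torch.randn(horizon, d_model))
        self.head = nn.Linear(d_model, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, history_length, d_model), already embedded
        scales = []
        for i, enc in enumerate(self.encoders):
            x = enc(x)
            scales.append(x)            # keep this scale's representation
            if i < len(self.mergers):
                x = self.mergers[i](x)  # coarsen before the next encoder stage
        q = self.query.unsqueeze(0).expand(x.size(0), -1, -1)
        for cross, s in zip(self.cross, scales):
            q = q + cross(q, s)         # fuse information from every scale
        return self.head(q)             # (batch, horizon, 1)
```

With a one-day history at 5-minute resolution (288 steps) already embedded to d_model=64, `ToyHierarchicalForecaster()(x)` returns a 288-step forecast. The efficient input embedding strategy mentioned in the abstract is assumed to happen before this module and is not sketched here.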
Related papers
- BjTT: A Large-scale Multimodal Dataset for Traffic Prediction [49.93028461584377]
Traditional traffic prediction methods rely on historical traffic data to predict traffic trends.
In this work, we explore how generative models combined with text describing the traffic system can be applied for traffic generation.
We propose ChatTraffic, the first diffusion model for text-to-traffic generation.
arXiv Detail & Related papers (2024-03-08T04:19:56Z)
- Rethinking Urban Mobility Prediction: A Super-Multivariate Time Series Forecasting Approach [71.67506068703314]
Long-term urban mobility predictions play a crucial role in the effective management of urban facilities and services.
Traditionally, urban mobility data has been structured as videos, treating longitude and latitude as fundamental pixels.
In our research, we introduce a fresh perspective on urban mobility prediction.
Instead of oversimplifying urban mobility data as traditional video data, we regard it as a complex time series.
arXiv Detail & Related papers (2023-12-04T07:39:05Z)
- Spatio-Temporal Graph Neural Point Process for Traffic Congestion Event Prediction [16.530361912832763]
We propose a spatio-temporal graph neural point process framework, named STNPP, for traffic congestion event prediction.
Our method achieves superior performance in comparison to existing state-of-the-art approaches.
arXiv Detail & Related papers (2023-11-15T01:22:47Z)
- FourierGNN: Rethinking Multivariate Time Series Forecasting from a Pure Graph Perspective [48.00240550685946]
Current state-of-the-art graph neural network (GNN)-based forecasting methods usually require both graph networks (e.g., GCN) and temporal networks (e.g., LSTM) to capture inter-series (spatial) dynamics and intra-series (temporal) dependencies, respectively.
We propose a novel Fourier Graph Neural Network (FourierGNN) by stacking our proposed Fourier Graph Operator (FGO) to perform matrix multiplications in Fourier space.
Our experiments on seven datasets have demonstrated superior performance with higher efficiency and fewer parameters compared with state-of-the-art methods.
arXiv Detail & Related papers (2023-11-10T17:13:26Z)
- PDFormer: Propagation Delay-Aware Dynamic Long-Range Transformer for Traffic Flow Prediction [78.05103666987655]
Spatial-temporal Graph Neural Network (GNN) models have emerged as one of the most promising methods for traffic flow prediction.
We propose a novel propagation delay-aware dynamic long-range transFormer, namely PDFormer, for accurate traffic flow prediction.
Our method can not only achieve state-of-the-art performance but also exhibit competitive computational efficiency.
arXiv Detail & Related papers (2023-01-19T08:42:40Z)
- Similarity-based Feature Extraction for Large-scale Sparse Traffic Forecasting [4.295541562380963]
The NeurIPS 2022 Traffic4cast challenge is dedicated to predicting the citywide traffic states with publicly available sparse loop count data.
This technical report introduces our second-place winning solution to the extended challenge of ETA prediction.
arXiv Detail & Related papers (2022-11-13T22:19:21Z)
- Explainable Graph Pyramid Autoformer for Long-Term Traffic Forecasting [3.5908670236727933]
We develop an explainable attention-based spatial-temporal graph neural network that uses a novel pyramid autocorrelation attention mechanism.
Our model achieves up to 35% better long-term traffic forecast accuracy than several state-of-the-art methods.
arXiv Detail & Related papers (2022-09-27T02:31:06Z)
- Continuous-Time and Multi-Level Graph Representation Learning for Origin-Destination Demand Prediction [52.0977259978343]
This paper proposes a Continuous-time and Multi-level dynamic graph representation learning method for Origin-Destination demand prediction (CMOD).
The state vectors keep historical transaction information and are continuously updated according to the most recently happened transactions.
Experiments are conducted on two real-world datasets from Beijing Subway and New York Taxi, and the results demonstrate the superiority of our model against the state-of-the-art approaches.
arXiv Detail & Related papers (2022-06-30T03:37:50Z)
- PSTN: Periodic Spatial-temporal Deep Neural Network for Traffic Condition Prediction [8.255993195520306]
We propose a periodic spatial-temporal deep neural network (PSTN) with three modules to improve the forecasting performance of traffic conditions.
First, the historical traffic information is folded and fed into a module consisting of a graph convolutional network and a temporal convolutional network.
arXiv Detail & Related papers (2021-08-05T07:42:43Z)
- Learning dynamic and hierarchical traffic spatiotemporal features with Transformer [4.506591024152763]
This paper proposes a novel model, Traffic Transformer, for spatial-temporal graph modeling and long-term traffic forecasting.
Transformer is the most popular framework in Natural Language Processing (NLP).
Analyzing the attention weight matrices can reveal the influential parts of the road network, allowing us to learn traffic networks better.
arXiv Detail & Related papers (2021-04-12T02:29:58Z)
- Spatio-temporal Modeling for Large-scale Vehicular Networks Using Graph Convolutional Networks [110.80088437391379]
A graph-based framework called SMART is proposed to model and keep track of the statistics of vehicle-to-infrastructure (V2I) communication latency across a large geographical area.
We develop a graph reconstruction-based approach using a graph convolutional network integrated with a deep Q-networks algorithm.
Our results show that the proposed method significantly improves the modeling accuracy and efficiency, as well as the latency performance of large vehicular networks.
arXiv Detail & Related papers (2021-03-13T06:56:29Z)