Adaptive Graph Convolution Networks for Traffic Flow Forecasting
- URL: http://arxiv.org/abs/2307.05517v1
- Date: Fri, 7 Jul 2023 09:55:41 GMT
- Title: Adaptive Graph Convolution Networks for Traffic Flow Forecasting
- Authors: Zhengdao Li, Wei Li, and Kai Hwang
- Abstract summary: We propose a novel Adaptive Graph Convolution Networks (AGC-net) to address the fixed receptive-field issue in graph neural networks (GNNs).
The AGC-net is constructed by the Adaptive Graph Convolution (AGC) based on a novel context attention mechanism.
Experimental results on two public traffic datasets demonstrate the effectiveness of the AGC-net.
- Score: 4.398745005061698
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Traffic flow forecasting is a highly challenging task due to the dynamic
spatial-temporal road conditions. Graph neural networks (GNNs) have been widely
applied to this task. However, most of these GNNs ignore the effects of
time-varying road conditions due to the fixed range of the convolution
receptive field. In this paper, we propose a novel Adaptive Graph Convolution
Networks (AGC-net) to address this issue in GNNs. The AGC-net is constructed by
the Adaptive Graph Convolution (AGC) based on a novel context attention
mechanism, which consists of a set of graph wavelets with various learnable
scales. The AGC transforms the spatial graph representations into
time-sensitive features considering the temporal context. Moreover, a shifted
graph convolution kernel is designed to enhance the AGC, which attempts to
correct the deviations caused by inaccurate topology. Experimental results on
two public traffic datasets demonstrate the effectiveness of the
AGC-net (code is available at: https://github.com/zhengdaoli/AGC-net), which outperforms other baseline models significantly.
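To make the construction above more concrete, the following is a minimal, hypothetical PyTorch sketch of a graph wavelet convolution with a bank of learnable scales that are mixed by attention weights computed from a temporal context vector. The wavelet family (a spectral heat kernel), the module and argument names, and the attention form are illustrative assumptions; the shifted graph convolution kernel is not shown, and the linked repository is the authoritative implementation.

```python
import torch
import torch.nn as nn

class AdaptiveWaveletConv(nn.Module):
    """Hypothetical sketch of an adaptive graph convolution: a bank of graph
    wavelets with learnable scales, mixed by attention over a temporal context."""
    def __init__(self, in_dim, out_dim, num_scales=4, ctx_dim=32):
        super().__init__()
        self.scales = nn.Parameter(torch.linspace(0.5, 2.0, num_scales))  # learnable wavelet scales
        self.theta = nn.Linear(in_dim, out_dim)                           # node feature transform
        self.attn = nn.Linear(ctx_dim, num_scales)                        # context attention over scales

    def forward(self, x, eigvals, eigvecs, ctx):
        # x: (N, in_dim) node features; eigvals/eigvecs: Laplacian spectrum; ctx: (ctx_dim,) temporal context
        alpha = torch.softmax(self.attn(ctx), dim=-1)                     # attention weight per scale
        out = torch.zeros(x.size(0), self.theta.out_features, device=x.device)
        for k in range(len(self.scales)):
            kernel = torch.exp(-self.scales[k] * eigvals)                 # heat-kernel response at scale k
            psi = eigvecs @ torch.diag(kernel) @ eigvecs.T                # graph wavelet operator
            out = out + alpha[k] * (psi @ self.theta(x))                  # scale-weighted wavelet convolution
        return out
```

Here eigvals and eigvecs would come from torch.linalg.eigh applied to a (normalized) graph Laplacian, and ctx would summarize recent time steps so that the effective receptive field adapts to the temporal context.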
Related papers
- Graph Condensation for Open-World Graph Learning [48.38802327346445]
Graph condensation (GC) has emerged as a promising acceleration solution for efficiently training graph neural networks (GNNs).
Existing GC methods are limited to aligning the condensed graph only with the observed static graph distribution.
In real-world scenarios, however, graphs are dynamic and constantly evolving, with new nodes and edges being continually integrated.
We propose OpenGC, a robust GC framework that integrates structure-aware distribution shift to simulate evolving graph patterns.
arXiv Detail & Related papers (2024-05-27T09:47:09Z)
- Adaptive Least Mean Squares Graph Neural Networks and Online Graph Signal Estimation [3.6448362316632115]
We propose an efficient Neural Network architecture for the online estimation of time-varying graph signals.
The Adaptive Least Mean Squares Graph Neural Network (LMS-GNN) is a combination of adaptive graph filters and graph neural networks (GNNs); a minimal sketch of the LMS-style update on graphs that it builds on appears after this list.
Experiments on real-world temperature data show that LMS-GNN achieves more accurate online predictions than other graph-based methods.
arXiv Detail & Related papers (2024-01-27T05:47:12Z)
- T-GAE: Transferable Graph Autoencoder for Network Alignment [79.89704126746204]
T-GAE is a graph autoencoder framework that leverages the transferability and stability of GNNs to achieve efficient network alignment without retraining.
Our experiments demonstrate that T-GAE outperforms the state-of-the-art optimization method and the best GNN approach by up to 38.7% and 50.8%, respectively.
arXiv Detail & Related papers (2023-10-05T02:58:29Z)
- Attention-based Dynamic Graph Convolutional Recurrent Neural Network for Traffic Flow Prediction in Highway Transportation [0.6650227510403052]
An Attention-based Dynamic Graph Convolutional Recurrent Neural Network (ADG-N) is proposed to improve traffic flow prediction in highway transportation.
A dedicated gated kernel that emphasizes highly relevant nodes is introduced on complete graphs to reduce overfitting in graph convolution operations.
arXiv Detail & Related papers (2023-09-13T13:57:21Z)
- A Multidimensional Graph Fourier Transformation Neural Network for Vehicle Trajectory Prediction [9.554569082679151]
This work introduces the multidimensional Graph Fourier Transformation Neural Network (GFTNN) for long-term trajectory prediction on highways; a minimal sketch of the graph Fourier transform it builds on appears after this list.
Similar to Graph Neural Networks (GNNs), the GFTNN is a novel architecture that operates on graph structures.
For experiments and evaluation, the publicly available datasets highD and NGSIM are used.
arXiv Detail & Related papers (2023-05-12T12:36:48Z)
- Spatial-Temporal Adaptive Graph Convolution with Attention Network for Traffic Forecasting [4.1700160312787125]
We propose a novel network, the Spatial-Temporal Adaptive graph convolution with Attention Network (STAAN), for traffic forecasting.
First, we adopt an adaptive dependency matrix, instead of a pre-defined one, during GCN processing to infer the inter-dependencies among nodes; a minimal sketch of such a learned matrix appears after this list.
Second, we integrate PW-attention, which is based on a graph attention network and designed for global dependencies, with GCN as the spatial block.
arXiv Detail & Related papers (2022-06-07T09:08:35Z)
- Learning to Drop: Robust Graph Neural Network via Topological Denoising [50.81722989898142]
We propose PTDNet, a parameterized topological denoising network, to improve the robustness and generalization performance of graph neural networks (GNNs).
PTDNet prunes task-irrelevant edges by penalizing the number of edges in the sparsified graph with parameterized networks; a minimal edge-pruning sketch appears after this list.
We show that PTDNet can improve the performance of GNNs significantly and the performance gain becomes larger for more noisy datasets.
arXiv Detail & Related papers (2020-11-13T18:53:21Z)
- Constructing Geographic and Long-term Temporal Graph for Traffic Forecasting [88.5550074808201]
We propose the Geographic and Long-term Temporal Graph Convolutional Recurrent Neural Network (GLT-GCRNN) for traffic forecasting.
This framework learns the rich interactions between roads sharing similar geographic or long-term temporal patterns.
arXiv Detail & Related papers (2020-04-23T03:50:46Z)
- Graph Highway Networks [77.38665506495553]
Graph Convolution Networks (GCN) are widely used in learning graph representations due to their effectiveness and efficiency.
They suffer from the notorious over-smoothing problem, in which the learned representations converge to nearly indistinguishable vectors when many layers are stacked.
We propose Graph Highway Networks (GHNet), which utilize gating units to balance the trade-off between homogeneity and heterogeneity in the GCN learning process; a minimal gating sketch appears after this list.
arXiv Detail & Related papers (2020-04-09T16:26:43Z)
- Gated Graph Recurrent Neural Networks [176.3960927323358]
We introduce Graph Recurrent Neural Networks (GRNNs) as a general learning framework for graph processes.
To address the problem of vanishing gradients, we put forward GRNNs with three different gating mechanisms: time, node, and edge gates; a minimal gated recurrent sketch appears after this list.
The numerical results also show that GRNNs outperform GNNs and RNNs, highlighting the importance of taking both the temporal and graph structures of a graph process into account.
arXiv Detail & Related papers (2020-02-03T22:35:14Z)
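For the LMS-GNN entry above, the following is a minimal NumPy sketch of the classical least-mean-squares update for online graph signal estimation, which that model's adaptive graph filters build on. The bandlimited-subspace projector, the sampling mask, and the step size mu are standard ingredients of LMS graph filtering rather than details taken from that paper.

```python
import numpy as np

def lms_graph_estimate(U_F, sample_mask, observations, mu=0.5):
    """Online LMS-style estimation of a time-varying, bandlimited graph signal.
    U_F: (N, F) Laplacian eigenvectors spanning the assumed signal subspace.
    sample_mask: (N,) array with 1 where a node is observed, 0 otherwise.
    observations: iterable of noisy snapshots y_t, each of shape (N,)."""
    x_hat = np.zeros(U_F.shape[0])
    P = U_F @ U_F.T                      # projector onto the bandlimited subspace
    D = np.diag(sample_mask)             # node-sampling operator
    for y_t in observations:
        err = D @ (y_t - x_hat)          # error computed on observed nodes only
        x_hat = x_hat + mu * (P @ err)   # project the correction and update the estimate
        yield x_hat
```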
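For the GFTNN entry, the sketch below shows the plain graph Fourier transform that such spectral architectures build on: a node signal is projected onto the eigenbasis of the combinatorial Laplacian. The multidimensional transform and the prediction head used in that paper are not reproduced here.

```python
import numpy as np

def graph_fourier_transform(adjacency, signal):
    """Project a graph signal onto the Laplacian eigenbasis (the graph Fourier basis)."""
    degree = np.diag(adjacency.sum(axis=1))
    laplacian = degree - adjacency                 # combinatorial graph Laplacian
    eigvals, eigvecs = np.linalg.eigh(laplacian)   # graph frequencies and Fourier basis
    spectrum = eigvecs.T @ signal                  # GFT coefficients of the signal
    return eigvals, spectrum
```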
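For the STAAN entry, here is a minimal sketch of an adaptive dependency matrix learned from node embeddings, in the spirit of Graph WaveNet's adaptive adjacency; STAAN's exact parameterization and its PW-attention block may differ.

```python
import torch
import torch.nn as nn

class AdaptiveAdjacency(nn.Module):
    """Learn a dependency matrix from node embeddings instead of a pre-defined graph."""
    def __init__(self, num_nodes, emb_dim=10):
        super().__init__()
        self.src = nn.Parameter(torch.randn(num_nodes, emb_dim))  # source-role embeddings
        self.dst = nn.Parameter(torch.randn(num_nodes, emb_dim))  # target-role embeddings

    def forward(self):
        scores = torch.relu(self.src @ self.dst.T)   # learned pairwise affinities
        return torch.softmax(scores, dim=1)          # row-normalized adaptive adjacency
```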
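For the PTDNet entry, the sketch below illustrates the general idea of parameterized edge pruning: score each edge from its endpoint features, keep a soft mask, and add a sparsity penalty on the retained edges to the task loss. PTDNet's actual formulation (e.g., its sampling and low-rank regularization) is more elaborate.

```python
import torch
import torch.nn as nn

class EdgeDenoiser(nn.Module):
    """Soft edge pruning: a small network scores edges; a sparsity term prunes them."""
    def __init__(self, feat_dim, hidden=16):
        super().__init__()
        self.scorer = nn.Sequential(
            nn.Linear(2 * feat_dim, hidden), nn.ReLU(), nn.Linear(hidden, 1))

    def forward(self, x, edge_index):
        # x: (N, feat_dim) node features; edge_index: (2, E) source/target node ids
        src, dst = edge_index
        logits = self.scorer(torch.cat([x[src], x[dst]], dim=-1)).squeeze(-1)
        mask = torch.sigmoid(logits)      # soft keep-probability for every edge
        penalty = mask.sum()              # penalizes the number of retained edges
        return mask, penalty
```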
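For the Graph Highway Networks entry, a minimal highway-style gate around a GCN layer is sketched below: a learned gate blends the freshly aggregated (homogenizing) representation with the layer input, which is one way to balance homogeneity and heterogeneity. GHNet's precise gate definition may differ.

```python
import torch
import torch.nn as nn

class HighwayGCNLayer(nn.Module):
    """GCN propagation followed by a highway gate that preserves part of the input."""
    def __init__(self, dim):
        super().__init__()
        self.weight = nn.Linear(dim, dim)   # GCN feature transform
        self.gate = nn.Linear(dim, dim)     # highway gate

    def forward(self, a_norm, h):
        # a_norm: (N, N) normalized adjacency; h: (N, dim) node features
        propagated = torch.relu(a_norm @ self.weight(h))   # standard GCN propagation
        t = torch.sigmoid(self.gate(h))                    # per-feature gate in [0, 1]
        return t * propagated + (1.0 - t) * h              # blend new and previous features
```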
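For the Gated Graph Recurrent Neural Networks entry, the sketch below shows one step of a gated graph recurrent cell: the hidden state is diffused over the graph at every step and a learned gate controls how much of the past state is kept. That paper defines separate time, node, and edge gates; only a single generic gate is shown here.

```python
import torch
import torch.nn as nn

class GatedGraphRecurrentCell(nn.Module):
    """One step of a graph recurrence with a single forget-style gate."""
    def __init__(self, in_dim, hid_dim):
        super().__init__()
        self.w_in = nn.Linear(in_dim, hid_dim)
        self.w_hid = nn.Linear(hid_dim, hid_dim)
        self.w_gate = nn.Linear(in_dim + hid_dim, hid_dim)

    def forward(self, a_norm, x_t, h_prev):
        # a_norm: (N, N) graph shift operator; x_t: (N, in_dim); h_prev: (N, hid_dim)
        cand = torch.tanh(a_norm @ self.w_in(x_t) + a_norm @ self.w_hid(h_prev))  # graph-diffused update
        gate = torch.sigmoid(self.w_gate(torch.cat([x_t, h_prev], dim=-1)))       # forget-style gate
        return gate * cand + (1.0 - gate) * h_prev                                # keep or overwrite state
```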