GPT-ST: Generative Pre-Training of Spatio-Temporal Graph Neural Networks
- URL: http://arxiv.org/abs/2311.04245v1
- Date: Tue, 7 Nov 2023 02:36:24 GMT
- Title: GPT-ST: Generative Pre-Training of Spatio-Temporal Graph Neural Networks
- Authors: Zhonghang Li, Lianghao Xia, Yong Xu, Chao Huang
- Abstract summary: This work aims to address challenges by introducing a pre-training framework that seamlessly integrates with baselines and enhances their performance.
The framework is built upon two key designs: (i) We propose a spatio-temporal
mask autoencoder as a pre-training model for learning spatio-temporal dependencies.
Its modules are specifically designed to capture customized spatio-temporal representations and intra- and inter-cluster region semantic relationships.
- Score: 24.323017830938394
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In recent years, there has been a rapid development of spatio-temporal
prediction techniques in response to the increasing demands of traffic
management and travel planning. While advanced end-to-end models have achieved
notable success in improving predictive performance, their integration and
expansion pose significant challenges. This work aims to address these
challenges by introducing a spatio-temporal pre-training framework that
seamlessly integrates with downstream baselines and enhances their performance.
The framework is built upon two key designs: (i) We propose a spatio-temporal
mask autoencoder as a pre-training model for learning spatio-temporal
dependencies. The model incorporates customized parameter learners and
hierarchical spatial pattern encoding networks. These modules are specifically
designed to capture spatio-temporal customized representations and intra- and
inter-cluster region semantic relationships, which have often been neglected in
existing approaches. (ii) We introduce an adaptive mask strategy as part of the
pre-training mechanism. This strategy guides the mask autoencoder in learning
robust spatio-temporal representations and facilitates the modeling of
different relationships, ranging from intra-cluster to inter-cluster, in an
easy-to-hard training manner. Extensive experiments conducted on representative
benchmarks demonstrate the effectiveness of our proposed method. We have made
our model implementation publicly available at https://github.com/HKUDS/GPT-ST.
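To make the two key designs concrete, below is a minimal sketch of the general idea the abstract describes: pre-train by masking part of a spatio-temporal tensor, reconstructing the masked entries, and raising the mask ratio over epochs in an easy-to-hard manner. This is not the authors' GPT-ST implementation (which additionally uses customized parameter learners and hierarchical spatial pattern encoding; see the linked repository), and all class and function names here are hypothetical.

```python
# Illustrative sketch only: a toy spatio-temporal masked autoencoder with an
# easy-to-hard mask-ratio schedule. Names and architecture are assumptions,
# not the GPT-ST code from https://github.com/HKUDS/GPT-ST.
import torch
import torch.nn as nn


class SimpleSTMaskAutoencoder(nn.Module):
    """Toy masked autoencoder over tensors shaped [batch, time, nodes, features]."""

    def __init__(self, in_dim: int, hidden_dim: int = 64):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(in_dim, hidden_dim), nn.ReLU(), nn.Linear(hidden_dim, hidden_dim)
        )
        self.decoder = nn.Linear(hidden_dim, in_dim)

    def forward(self, x: torch.Tensor, mask: torch.Tensor) -> torch.Tensor:
        # Zero out masked entries before encoding, then reconstruct everything.
        z = self.encoder(x * (1.0 - mask))
        return self.decoder(z)


def mask_ratio_schedule(epoch: int, total_epochs: int,
                        start: float = 0.25, end: float = 0.75) -> float:
    """Easy-to-hard curriculum: small mask ratio early, larger later."""
    frac = epoch / max(total_epochs - 1, 1)
    return start + frac * (end - start)


def pretrain_step(model, optimizer, x, ratio):
    # Random (time, node)-wise mask, broadcast over the feature dimension.
    b, t, n, f = x.shape
    mask = (torch.rand(b, t, n, 1) < ratio).float().expand(-1, -1, -1, f)
    recon = model(x, mask)
    # Reconstruction loss is computed only on the masked positions.
    loss = ((recon - x) ** 2 * mask).sum() / mask.sum().clamp(min=1.0)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()


if __name__ == "__main__":
    model = SimpleSTMaskAutoencoder(in_dim=2)
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    x = torch.randn(8, 12, 20, 2)  # 8 samples, 12 time steps, 20 regions, 2 features
    for epoch in range(5):
        ratio = mask_ratio_schedule(epoch, total_epochs=5)
        loss = pretrain_step(model, opt, x, ratio)
        print(f"epoch {epoch}: mask ratio {ratio:.2f}, loss {loss:.4f}")
```

In the paper's framework, the pre-trained encoder's representations are then fed to downstream spatio-temporal baselines to enhance their predictions; the adaptive mask strategy in GPT-ST is learned rather than a fixed schedule like the one sketched above.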