Localised Adaptive Spatial-Temporal Graph Neural Network
- URL: http://arxiv.org/abs/2306.06930v2
- Date: Thu, 15 Jun 2023 13:54:24 GMT
- Title: Localised Adaptive Spatial-Temporal Graph Neural Network
- Authors: Wenying Duan, Xiaoxi He, Zimu Zhou, Lothar Thiele, Hong Rao
- Abstract summary: Adaptive Graph Sparsification (AGS) is a graph sparsification algorithm which successfully enables the localisation of ASTGNNs to an extreme extent.
We observe that spatial graphs in ASTGNNs can be sparsified by over 99.5% without any decline in test accuracy.
Localisation of ASTGNNs holds the potential to reduce the heavy overhead required on large-scale spatial-temporal data.
- Score: 17.707594255626216
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Spatial-temporal graph models are prevailing for abstracting and modelling
spatial and temporal dependencies. In this work, we ask the following question:
whether and to what extent can we localise spatial-temporal graph models? We
limit our scope to adaptive spatial-temporal graph neural networks (ASTGNNs),
the state-of-the-art model architecture. Our approach to localisation involves
sparsifying the spatial graph adjacency matrices. To this end, we propose
Adaptive Graph Sparsification (AGS), a graph sparsification algorithm which
successfully enables the localisation of ASTGNNs to an extreme extent (full
localisation). We apply AGS to two distinct ASTGNN architectures and nine
spatial-temporal datasets. Intriguingly, we observe that spatial graphs in
ASTGNNs can be sparsified by over 99.5\% without any decline in test accuracy.
Furthermore, even when ASTGNNs are fully localised, becoming graph-less and
purely temporal, we record no drop in accuracy for the majority of tested
datasets, with only minor accuracy deterioration observed in the remaining
datasets. However, when the partially or fully localised ASTGNNs are
reinitialised and retrained on the same data, there is a considerable and
consistent drop in accuracy. Based on these observations, we reckon that
\textit{(i)} in the tested data, the information provided by the spatial
dependencies is primarily included in the information provided by the temporal
dependencies and, thus, can be essentially ignored for inference; and
\textit{(ii)} although the spatial dependencies provide redundant information,
it is vital for the effective training of ASTGNNs and thus cannot be ignored
during training. Furthermore, the localisation of ASTGNNs holds the potential
to reduce the heavy computation overhead required on large-scale
spatial-temporal data and further enable the distributed deployment of ASTGNNs.
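The abstract's core operation is sparsifying the learned spatial adjacency matrices until almost all edges are removed. The paper's AGS algorithm is not reproduced here; the following is a generic magnitude-pruning sketch (all names and the `keep_ratio` parameter are illustrative) showing what a 99.5%-sparsified adaptive adjacency matrix looks like:

```python
import numpy as np

def sparsify_adjacency(adj: np.ndarray, keep_ratio: float) -> np.ndarray:
    """Zero out all but the largest-magnitude entries of an adjacency matrix.

    A generic magnitude-pruning sketch, not the paper's AGS algorithm:
    AGS ranks and removes edges differently, but the end state is similar,
    a spatial graph retaining only a tiny fraction of its edges.
    """
    flat = np.abs(adj).ravel()
    k = max(1, int(flat.size * keep_ratio))  # number of entries to keep
    threshold = np.partition(flat, -k)[-k]   # k-th largest magnitude
    mask = np.abs(adj) >= threshold          # keep only the strongest edges
    return adj * mask

# Example: a random stand-in for a learned adaptive adjacency over 200
# nodes, sparsified so that only 0.5% of its entries survive.
rng = np.random.default_rng(0)
adj = rng.normal(size=(200, 200))
sparse_adj = sparsify_adjacency(adj, keep_ratio=0.005)
sparsity = 1.0 - np.count_nonzero(sparse_adj) / sparse_adj.size
print(f"sparsity: {sparsity:.3%}")  # prints a sparsity above 99%
```

Fully localising the model, in these terms, corresponds to `keep_ratio` approaching zero: the adjacency contribution vanishes and only the temporal component remains.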
Related papers
- DyG-Mamba: Continuous State Space Modeling on Dynamic Graphs [59.434893231950205]
Dynamic graph learning aims to uncover evolutionary laws in real-world systems.
We propose DyG-Mamba, a new continuous state space model for dynamic graph learning.
We show that DyG-Mamba achieves state-of-the-art performance on most datasets.
arXiv Detail & Related papers (2024-08-13T15:21:46Z)
- Pre-Training Identification of Graph Winning Tickets in Adaptive Spatial-Temporal Graph Neural Networks [5.514795777097036]
We introduce the concept of the Graph Winning Ticket (GWT), derived from the Lottery Ticket Hypothesis (LTH)
By adopting a pre-determined star topology as a GWT prior to training, we balance edge reduction with efficient information propagation.
Our approach enables training ASTGNNs on the largest scale spatial-temporal dataset using a single A6000 equipped with 48 GB of memory.
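The star topology mentioned above trades edge count against reachability: with only one hub, any two nodes can still exchange information within two message-passing hops. A small illustrative sketch (not code from the paper; node count and hub choice are arbitrary) of such a pre-determined sparse prior:

```python
import numpy as np

def star_adjacency(n: int, hub: int = 0) -> np.ndarray:
    """Adjacency matrix of a star graph: every node connects only to the hub.

    Illustrates the idea of a pre-determined star topology as a sparse
    prior: it has 2*(n-1) directed edges instead of up to n*(n-1), yet
    keeps every node pair within two hops of each other via the hub.
    """
    adj = np.zeros((n, n))
    adj[hub, :] = 1.0
    adj[:, hub] = 1.0
    adj[hub, hub] = 0.0  # no self-loop on the hub
    return adj

adj = star_adjacency(5)
# Two-hop reachability: if every entry of (A + I)^2 is positive, every
# node pair communicates within two message-passing steps.
two_hop = np.linalg.matrix_power(adj + np.eye(5), 2)
print(np.all(two_hop > 0))  # True
```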
arXiv Detail & Related papers (2024-06-12T14:53:23Z)
- Mending of Spatio-Temporal Dependencies in Block Adjacency Matrix [3.529869282529924]
We propose a novel end-to-end learning architecture designed to mend the temporal dependencies, resulting in a well-connected graph.
Our methodology demonstrates superior performance on benchmark datasets, such as SurgVisDom and C2D2.
arXiv Detail & Related papers (2023-10-04T06:42:33Z)
- Uncovering the Missing Pattern: Unified Framework Towards Trajectory Imputation and Prediction [60.60223171143206]
Trajectory prediction is a crucial undertaking in understanding entity movement or human behavior from observed sequences.
Current methods often assume that the observed sequences are complete while ignoring the potential for missing values.
This paper presents a unified framework, the Graph-based Conditional Variational Recurrent Neural Network (GC-VRNN), which can perform trajectory imputation and prediction simultaneously.
arXiv Detail & Related papers (2023-03-28T14:27:27Z)
- Dynamic Graph Neural Network with Adaptive Edge Attributes for Air Quality Predictions [12.336689498639366]
We propose a novel Dynamic Graph Neural Network with Adaptive Edge Attributes (DGN-AEA) on the message passing network.
Unlike methods that rely on prior information to establish edges, our method obtains adaptive edge information through end-to-end training without any prior information.
arXiv Detail & Related papers (2023-02-20T13:45:55Z)
- Space-Time Graph Neural Networks with Stochastic Graph Perturbations [100.31591011966603]
Space-time graph neural networks (ST-GNNs) learn efficient graph representations of time-varying data.
In this paper we revisit the properties of ST-GNNs and prove that they are stable to stochastic graph perturbations.
Our analysis suggests that ST-GNNs are suitable for transfer learning on time-varying graphs.
arXiv Detail & Related papers (2022-04-18T17:20:12Z)
- STONet: A Neural-Operator-Driven Spatio-temporal Network [38.5696882090282]
Graph-based spatio-temporal neural networks are effective at modelling spatial dependency among discrete points sampled irregularly.
We propose a spatio-temporal framework based on neural operators for PDEs, which learns the mechanisms governing the dynamics of spatially-continuous physical quantities.
Experiments show our model's performance in forecasting spatially-continuous physical quantities, its generalisation to unseen spatial points, and its ability to handle temporally-irregular data.
arXiv Detail & Related papers (2021-11-19T10:41:49Z)
- Positional Encoder Graph Neural Networks for Geographic Data [1.840220263320992]
Graph neural networks (GNNs) provide a powerful and scalable solution for modeling continuous spatial data.
In this paper, we propose PE-GNN, a new framework that incorporates spatial context and correlation explicitly into the models.
arXiv Detail & Related papers (2021-10-06T16:08:44Z)
- Space-Time Graph Neural Networks [104.55175325870195]
We introduce space-time graph neural network (ST-GNN) to jointly process the underlying space-time topology of time-varying network data.
Our analysis shows that small variations in the network topology and time evolution of a system do not significantly affect the performance of ST-GNNs.
arXiv Detail & Related papers (2020-12-06T19:49:55Z)
- Spatio-Temporal Graph Scattering Transform [54.52797775999124]
Graph neural networks may be impractical in some real-world scenarios due to a lack of sufficient high-quality training data.
We put forth a novel mathematically designed framework to analyse spatio-temporal data.
arXiv Detail & Related papers (2020-11-07T19:03:04Z)
- On the spatial attention in Spatio-Temporal Graph Convolutional Networks for skeleton-based human action recognition [97.14064057840089]
Graph convolutional networks (GCNs) have shown promising performance in skeleton-based human action recognition by modelling a sequence of skeletons as a graph.
Most of the recently proposed spatio-temporal methods improve the performance by learning the graph structure at each layer of the network.
arXiv Detail & Related papers (2020-11-07T19:03:04Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.