Space-Time Graph Neural Networks with Stochastic Graph Perturbations
- URL: http://arxiv.org/abs/2210.16270v1
- Date: Fri, 28 Oct 2022 16:59:51 GMT
- Title: Space-Time Graph Neural Networks with Stochastic Graph Perturbations
- Authors: Samar Hadou, Charilaos Kanatsoulis, and Alejandro Ribeiro
- Abstract summary: Space-time graph neural networks (ST-GNNs) learn efficient graph representations of time-varying data.
In this paper we revisit the stability properties of ST-GNNs and prove that they are stable to stochastic graph perturbations.
Our analysis suggests that ST-GNNs are suitable for transfer learning on time-varying graphs.
- Score: 100.31591011966603
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Space-time graph neural networks (ST-GNNs) are recently developed
architectures that learn efficient graph representations of time-varying data.
ST-GNNs are particularly useful in multi-agent systems, due to their stability
properties and their ability to respect communication delays between the
agents. In this paper we revisit the stability properties of ST-GNNs and prove
that they are stable to stochastic graph perturbations. Our analysis suggests
that ST-GNNs are suitable for transfer learning on time-varying graphs and
enables the design of generalized convolutional architectures that jointly
process time-varying graphs and time-varying signals. Numerical experiments on
decentralized control systems validate our theoretical results and showcase the
benefits of traditional and generalized ST-GNN architectures.
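The stability claim can be illustrated with a toy numerical experiment. The sketch below is our own construction, not the paper's architecture: a simple causal space-time graph filter is run on a random graph, the graph is then perturbed by independent random edge dropout, and the relative change in the output is measured. Filter form, normalization, and all names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup (not the paper's model): N agents, T time steps,
# a graph shift operator S, and a time-varying graph signal x[t].
N, T = 8, 20
A = rng.random((N, N)) < 0.3
A = np.triu(A, 1)
A = (A | A.T).astype(float)               # symmetric random adjacency
deg = A.sum(1, keepdims=True); deg[deg == 0] = 1.0
S = A / deg                               # row-normalized shift operator
x = rng.standard_normal((T, N))           # time-varying graph signal

def st_filter(S, x, h):
    """Toy causal space-time graph filter: at each step, combine graph
    shifts of the current sample with a decayed previous output."""
    y = np.zeros_like(x)
    z = np.zeros(x.shape[1])
    for t in range(x.shape[0]):
        z = sum(h[k] * np.linalg.matrix_power(S, k) @ x[t]
                for k in range(len(h))) + 0.5 * z
        y[t] = z
    return y

h = np.array([1.0, 0.5, 0.25])            # illustrative filter taps
y_nom = st_filter(S, x, h)

# Stochastic graph perturbation: drop each edge independently with
# probability p, renormalize, and rerun the same filter.
p = 0.1
mask = (rng.random((N, N)) > p).astype(float)
mask = np.triu(mask, 1); mask = mask + mask.T
Ap = A * mask
degp = Ap.sum(1, keepdims=True); degp[degp == 0] = 1.0
Sp = Ap / degp
y_pert = st_filter(Sp, x, h)

# Stability in the paper's sense means this gap stays controlled.
gap = np.linalg.norm(y_nom - y_pert) / np.linalg.norm(y_nom)
print(f"relative output change under edge dropout: {gap:.3f}")
```

A small relative gap for small dropout probabilities is the behavior the stability result formalizes; the proof in the paper bounds this quantity in expectation rather than for a single draw.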
Related papers
- FourierGNN: Rethinking Multivariate Time Series Forecasting from a Pure
Graph Perspective [48.00240550685946]
Current state-of-the-art graph neural network (GNN)-based forecasting methods usually require both graph networks (e.g., GCN) and temporal networks (e.g., LSTM) to capture inter-series (spatial) dynamics and intra-series (temporal) dependencies, respectively.
We propose a novel Fourier Graph Neural Network (FourierGNN) by stacking our proposed Fourier Graph Operator (FGO) to perform matrix multiplications in Fourier space.
Our experiments on seven datasets demonstrate superior performance with higher efficiency and fewer parameters compared with state-of-the-art methods.
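The core idea of multiplying in Fourier space can be sketched with the convolution theorem. This is our simplification, not FourierGNN's exact operator: an elementwise multiplication after an FFT over the node dimension is equivalent to applying a circulant (shift-invariant) mixing matrix, at O(N log N) cost instead of O(N^2).

```python
import numpy as np

# Our toy illustration of Fourier-space mixing (the circulant assumption
# and all names are ours, not the paper's construction).
rng = np.random.default_rng(1)
N, d = 16, 4                          # nodes, feature channels
X = rng.standard_normal((N, d))
W_hat = rng.standard_normal((N, d))   # one weight per (frequency, channel)

X_hat = np.fft.fft(X, axis=0)         # to Fourier space over nodes
Y = np.fft.ifft(W_hat * X_hat, axis=0).real   # multiply, transform back

# The same mixing written as a dense circulant matrix multiply, for
# comparison on channel 0: C = F^{-1} diag(w) F.
C = np.fft.ifft(np.fft.fft(np.eye(N), axis=0) * W_hat[:, :1], axis=0).real
Y_dense = C @ X[:, :1]
print(np.allclose(Y[:, :1], Y_dense))  # → True
```

The equivalence holds because the input is real, so the real part of the complex circulant acting on it matches the FFT-domain computation; stacking such operators is what replaces the separate graph and temporal networks in the paper's framing.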
arXiv Detail & Related papers (2023-11-10T17:13:26Z) - Learning Stable Graph Neural Networks via Spectral Regularization [18.32587282139282]
Stability of graph neural networks (GNNs) characterizes how GNNs react to graph perturbations and provides guarantees for architecture performance in noisy scenarios.
This paper develops a self-regularized graph neural network (SR-GNN) that improves the architecture stability by regularizing the filter frequency responses in the graph spectral domain.
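Regularizing filter frequency responses can be sketched for a polynomial graph filter. This is our illustrative construction, not SR-GNN's loss: the response h(λ) = Σ_k h_k λ^k is evaluated on a spectral grid and the mean squared slope is penalized, since flatter responses react less to eigenvalue perturbations.

```python
import numpy as np

# Our sketch of a spectral-domain stability penalty (names and the exact
# penalty form are assumptions, not the paper's definition).
coeffs = np.array([1.0, -0.4, 0.1])      # filter taps h_k, lowest order first
lam = np.linspace(-1, 1, 201)            # grid over the graph spectrum
resp = np.polyval(coeffs[::-1], lam)     # frequency response h(lam)
slope = np.gradient(resp, lam)           # dh/dlam on the grid

# Added to the training loss, this term encourages flat (stable) responses.
spectral_penalty = np.mean(slope ** 2)
print(f"frequency-response smoothness penalty: {spectral_penalty:.4f}")
```

In training, this scalar would be weighted and added to the task loss, trading a little expressivity for robustness to graph perturbations.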
arXiv Detail & Related papers (2022-11-13T17:27:21Z) - Graph-Time Convolutional Neural Networks: Architecture and Theoretical
Analysis [12.995632804090198]
We introduce Graph-Time Convolutional Neural Networks (GTCNNs) as a principled architecture to aid learning.
The approach can work with any type of product graph, and we also introduce a parametric product graph to learn the spatiotemporal coupling.
Extensive numerical results on benchmark datasets corroborate our findings and show that the GTCNN compares favorably with state-of-the-art solutions.
arXiv Detail & Related papers (2022-06-30T10:20:52Z) - Spatio-Temporal Latent Graph Structure Learning for Traffic Forecasting [6.428566223253948]
We propose a new traffic forecasting framework, Spatio-Temporal Latent Graph Structure Learning networks (ST-LGSL).
The model employs a graph generator based on a multilayer perceptron (MLP) and k-nearest neighbors (kNN), which learns latent graph topological information from the entire data.
Combining dependencies from the ground-truth adjacency matrix with the similarity metric in kNN, ST-LGSL aggregates the top-k neighbors, focusing on both geography and node similarity.
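The MLP-plus-kNN graph generator can be sketched as follows. This is our simplification of the idea, not ST-LGSL's implementation: each node is embedded with a small MLP, and an adjacency matrix is built by connecting every node to its k most similar neighbors under cosine similarity.

```python
import numpy as np

# Our toy latent-graph generator (random untrained weights; all names are
# illustrative assumptions, not the paper's parameterization).
rng = np.random.default_rng(2)
N, d_in, d_hid, k = 10, 6, 4, 3
X = rng.standard_normal((N, d_in))            # per-node input features

W1 = rng.standard_normal((d_in, d_hid))
W2 = rng.standard_normal((d_hid, d_hid))
Z = np.maximum(X @ W1, 0) @ W2                # 2-layer MLP node embeddings

# Cosine similarity, then keep the top-k neighbors per node.
Zn = Z / (np.linalg.norm(Z, axis=1, keepdims=True) + 1e-12)
sim = Zn @ Zn.T
np.fill_diagonal(sim, -np.inf)                # exclude self-loops
A = np.zeros((N, N))
for i in range(N):
    nbrs = np.argsort(sim[i])[-k:]            # indices of top-k similar nodes
    A[i, nbrs] = 1.0
print(f"learned adjacency with {int(A.sum())} directed edges")
```

In the full model, the MLP weights are trained end to end with the forecasting loss, so the learned topology adapts to the task rather than staying fixed like the hand-crafted graph it replaces.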
arXiv Detail & Related papers (2022-02-25T10:02:49Z) - Multivariate Time Series Forecasting with Dynamic Graph Neural ODEs [65.18780403244178]
We propose a continuous model to forecast Multivariate Time series with dynamic Graph neural Ordinary Differential Equations (MTGODE)
Specifically, we first abstract multivariate time series into dynamic graphs with time-evolving node features and unknown graph structures.
Then, we design and solve a neural ODE to complement missing graph topologies and unify both spatial and temporal message passing.
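The continuous-dynamics idea can be sketched with a toy graph ODE. This is our construction, not MTGODE's parameterization: node states evolve under dx/dt = f(x, A), where f is one message-passing step with a contraction toward the current state, integrated with explicit Euler.

```python
import numpy as np

# Our toy graph neural ODE (weights random and untrained; the dynamics and
# integrator are illustrative assumptions).
rng = np.random.default_rng(3)
N, d = 6, 4
A = (rng.random((N, N)) < 0.4).astype(float)
np.fill_diagonal(A, 0.0)
deg = A.sum(1, keepdims=True); deg[deg == 0] = 1.0
P = A / deg                                   # row-normalized propagation
W = 0.1 * rng.standard_normal((d, d))

def f(x):
    # One message-passing step: aggregate neighbors, transform, and
    # contract toward the current state so the dynamics stay bounded.
    return np.tanh(P @ x @ W) - x

x = rng.standard_normal((N, d))
dt, steps = 0.1, 50
for _ in range(steps):                        # explicit Euler integration
    x = x + dt * f(x)
print(f"state norm after integration: {np.linalg.norm(x):.3f}")
```

A practical implementation would use an adaptive ODE solver and learn f end to end; the point of the continuous view is that spatial and temporal message passing become one unified flow.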
arXiv Detail & Related papers (2022-02-17T02:17:31Z) - Space-Time Graph Neural Networks [104.55175325870195]
We introduce space-time graph neural network (ST-GNN) to jointly process the underlying space-time topology of time-varying network data.
Our analysis shows that small variations in the network topology and time evolution of a system do not significantly affect the performance of ST-GNNs.
arXiv Detail & Related papers (2021-10-06T16:08:44Z) - Spatio-Temporal Graph Scattering Transform [54.52797775999124]
Graph neural networks may be impractical in some real-world scenarios due to a lack of sufficient high-quality training data.
We put forth a novel mathematically designed framework to analyze spatio-temporal data.
arXiv Detail & Related papers (2020-12-06T19:49:55Z) - Gated Graph Recurrent Neural Networks [176.3960927323358]
We introduce Graph Recurrent Neural Networks (GRNNs) as a general learning framework for graph processes.
To address the problem of vanishing gradients, we put forward GRNNs with three different gating mechanisms: time, node and edge gates.
The numerical results also show that GRNNs outperform GNNs and RNNs, highlighting the importance of taking both the temporal and graph structures of a graph process into account.
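A gated graph recurrence can be sketched as follows. This is our toy version, not the paper's exact gating: the hidden state is updated from a graph-filtered input and the previous state, modulated by a sigmoid "time gate" that controls how much of the past is retained, which is the mechanism used against vanishing gradients.

```python
import numpy as np

# Our toy gated GRNN step (weights random and untrained; gate placement and
# names are illustrative assumptions, not the paper's architecture).
rng = np.random.default_rng(4)
N, d, T = 5, 3, 12
A = (rng.random((N, N)) < 0.5).astype(float)
np.fill_diagonal(A, 0.0)
deg = A.sum(1, keepdims=True); deg[deg == 0] = 1.0
S = A / deg                                   # graph shift operator

Wx = 0.5 * rng.standard_normal((d, d))
Wh = 0.5 * rng.standard_normal((d, d))
Wg = 0.5 * rng.standard_normal((d, d))

sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
h = np.zeros((N, d))
for t in range(T):
    x = rng.standard_normal((N, d))           # graph signal at time t
    gate = sigmoid(S @ h @ Wg)                # time gate from the past state
    cand = np.tanh(S @ x @ Wx + S @ h @ Wh)   # graph-filtered candidate
    h = gate * h + (1.0 - gate) * cand        # gated recurrence
print(f"final hidden state shape: {h.shape}")
```

The node and edge gates in the paper follow the same pattern but condition the gate on node- and edge-level quantities instead of the full past state.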
arXiv Detail & Related papers (2020-02-03T22:35:14Z)
This list is automatically generated from the titles and abstracts of the papers in this site.