Spatio-Temporal Graph Scattering Transform
- URL: http://arxiv.org/abs/2012.03363v3
- Date: Tue, 9 Feb 2021 05:08:41 GMT
- Title: Spatio-Temporal Graph Scattering Transform
- Authors: Chao Pan, Siheng Chen, Antonio Ortega
- Abstract summary: Graph neural networks may be impractical in some real-world scenarios due to a lack of sufficient high-quality training data.
We put forth a novel mathematically designed framework to analyze spatio-temporal data.
- Score: 54.52797775999124
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Although spatio-temporal graph neural networks have achieved great empirical
success in handling multiple correlated time series, they may be impractical in
some real-world scenarios due to a lack of sufficient high-quality training
data. Furthermore, spatio-temporal graph neural networks lack theoretical
interpretation. To address these issues, we put forth a novel mathematically
designed framework to analyze spatio-temporal data. Our proposed
spatio-temporal graph scattering transform (ST-GST) extends traditional
scattering transforms to the spatio-temporal domain. It performs iterative
applications of spatio-temporal graph wavelets and nonlinear activation
functions, which can be viewed as a forward pass of spatio-temporal graph
convolutional networks without training. Since all the filter coefficients in
ST-GST are mathematically designed, it is promising for the real-world
scenarios with limited training data, and also allows for a theoretical
analysis, which shows that the proposed ST-GST is stable to small perturbations
of input signals and structures. Finally, our experiments show that i) ST-GST
outperforms spatio-temporal graph convolutional networks by an increase of 35%
in accuracy for the MSR Action3D dataset; ii) it is better and computationally more
efficient to design the transform based on separable spatio-temporal graphs
than the joint ones; and iii) the nonlinearity in ST-GST is critical to
empirical performance.
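
The abstract characterizes ST-GST as repeated spatio-temporal graph wavelet filtering followed by a nonlinearity, with every filter coefficient fixed by design rather than learned. The sketch below illustrates such a training-free, separable forward pass; the specific choices here (lazy-diffusion wavelets at dyadic scales, an absolute-value nonlinearity, and the helper names `lazy_diffusion`, `diffusion_wavelets`, `st_gst`) are illustrative assumptions, not the paper's exact construction.

```python
# Minimal sketch of a separable spatio-temporal graph scattering forward pass.
# Assumptions (not from the paper): lazy-diffusion wavelets, dyadic scales,
# and absolute value as the nonlinearity. No parameter is trained.
import numpy as np

def lazy_diffusion(A):
    """Lazy random-walk operator T = (I + A D^-1) / 2 for an adjacency matrix A."""
    d = A.sum(axis=0).astype(float)
    d[d == 0] = 1.0
    return 0.5 * (np.eye(A.shape[0]) + A / d)

def diffusion_wavelets(A, num_scales):
    """Band-pass bank Psi_j = T^(2^j) - T^(2^(j+1)) plus the low-pass T^(2^num_scales)."""
    T = lazy_diffusion(A)
    powers = [np.linalg.matrix_power(T, 2 ** j) for j in range(num_scales + 1)]
    return [powers[j] - powers[j + 1] for j in range(num_scales)], powers[-1]

def st_gst(X, A_space, A_time, num_scales=3, num_layers=2):
    """X: (num_nodes, num_steps) signal. Returns stacked scattering coefficients."""
    Ws, low_s = diffusion_wavelets(A_space, num_scales)   # spatial filter bank
    Wt, low_t = diffusion_wavelets(A_time, num_scales)    # temporal filter bank
    layer, coeffs = [X], []
    for _ in range(num_layers):
        nxt = []
        for U in layer:
            coeffs.append(low_s @ U @ low_t.T)            # low-pass summary of this path
            nxt.extend(np.abs(Ps @ U @ Pt.T)              # separable filtering + |.|
                       for Ps in Ws for Pt in Wt)
        layer = nxt
    coeffs.extend(low_s @ U @ low_t.T for U in layer)     # summarize the last layer
    return np.stack(coeffs)

# Example: 5 sensors on a ring graph observed over 8 time steps (chain graph in time).
A_space = np.roll(np.eye(5), 1, axis=0) + np.roll(np.eye(5), -1, axis=0)
A_time = np.eye(8, k=1) + np.eye(8, k=-1)
features = st_gst(np.random.randn(5, 8), A_space, A_time)
print(features.shape)  # (num_paths, 5, 8)
```

Because each path is filtered independently in space (left multiplication) and in time (right multiplication), the separable design never materializes a joint graph over all node-time pairs, which is one way to read the abstract's finding that separable spatio-temporal graphs are more efficient than joint ones. The resulting coefficients can then be fed to a simple classifier when labeled data is scarce.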
Related papers
- FourierGNN: Rethinking Multivariate Time Series Forecasting from a Pure
Graph Perspective [48.00240550685946]
Current state-of-the-art graph neural network (GNN)-based forecasting methods usually require both graph networks (e.g., GCN) and temporal networks (e.g., LSTM) to capture inter-series (spatial) dynamics and intra-series (temporal) dependencies, respectively.
We propose a novel Fourier Graph Neural Network (FourierGNN) by stacking our proposed Fourier Graph Operator (FGO) to perform matrix multiplications in Fourier space.
Our experiments on seven datasets have demonstrated superior performance with higher efficiency and fewer parameters compared with state-of-the-art methods.
arXiv Detail & Related papers (2023-11-10T17:13:26Z) - Space-Time Graph Neural Networks with Stochastic Graph Perturbations [100.31591011966603]
Space-time graph neural networks (ST-GNNs) learn efficient graph representations of time-varying data.
In this paper we revisit the properties of ST-GNNs and prove that they are stable to graph perturbations.
Our analysis suggests that ST-GNNs are suitable for transfer learning on time-varying graphs.
arXiv Detail & Related papers (2022-10-28T16:59:51Z) - Multivariate Time Series Forecasting with Dynamic Graph Neural ODEs [65.18780403244178]
We propose a continuous model to forecast Multivariate Time series with dynamic Graph neural Ordinary Differential Equations (MTGODE).
Specifically, we first abstract multivariate time series into dynamic graphs with time-evolving node features and unknown graph structures.
Then, we design and solve a neural ODE to complement missing graph topologies and unify both spatial and temporal message passing.
arXiv Detail & Related papers (2022-02-17T02:17:31Z) - Spatio-Temporal Graph Complementary Scattering Networks [27.78922896432688]
This work proposes a complementary mechanism to combine spatio-temporal graph scattering transforms and neural networks.
The essence is to leverage mathematically designed graph wavelets with pruning techniques to cover the major information and use trainable networks to capture complementary information.
arXiv Detail & Related papers (2021-10-23T06:02:43Z) - Space-Time Graph Neural Networks [104.55175325870195]
We introduce the space-time graph neural network (ST-GNN) to jointly process the underlying space-time topology of time-varying network data.
Our analysis shows that small variations in the network topology and time evolution of a system do not significantly affect the performance of ST-GNNs.
arXiv Detail & Related papers (2021-10-06T16:08:44Z) - Efficient and Stable Graph Scattering Transforms via Pruning [86.76336979318681]
Graph scattering transforms (GSTs) offer training-free deep GCN models that extract features from graph data.
The price paid by GSTs is exponential complexity in space and time that increases with the number of layers.
The present work addresses the complexity limitation of GSTs by introducing an efficient so-termed pruned (p)GST approach.
arXiv Detail & Related papers (2020-01-27T16:05:56Z)