How Expressive are Spectral-Temporal Graph Neural Networks for Time
Series Forecasting?
- URL: http://arxiv.org/abs/2305.06587v2
- Date: Wed, 14 Jun 2023 04:20:31 GMT
- Title: How Expressive are Spectral-Temporal Graph Neural Networks for Time
Series Forecasting?
- Authors: Ming Jin, Guangsi Shi, Yuan-Fang Li, Qingsong Wen, Bo Xiong, Tian
Zhou, Shirui Pan
- Abstract summary: We establish a theoretical framework that unravels the expressive power of spectral-temporal GNNs.
Our results show that linear spectral-temporal GNNs are universal under mild assumptions.
We propose a simple instantiation named Temporal Graph GegenConv (TGC), which significantly outperforms most existing models.
- Score: 46.95117922583253
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The spectral-temporal graph neural network is a promising abstraction
underlying most time series forecasting models that are based on graph neural
networks (GNNs). However, little is known about the theoretical underpinnings
of this branch of methods. In this paper, we establish a theoretical framework
that unravels
the expressive power of spectral-temporal GNNs. Our results show that linear
spectral-temporal GNNs are universal under mild assumptions, and their
expressive power is bounded by our extended first-order Weisfeiler-Leman
algorithm on discrete-time dynamic graphs. To make our findings useful in
practice on valid instantiations, we discuss related constraints in detail and
outline a theoretical blueprint for designing spatial and temporal modules in
spectral domains. Building on these insights, and to demonstrate how powerful
spectral-temporal GNNs can be under our framework, we propose a simple
instantiation named Temporal Graph GegenConv (TGC), which significantly
outperforms most existing models with only linear components and shows better
model efficiency.
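To make the TGC idea concrete, the sketch below shows a Gegenbauer-polynomial graph filter, the kind of linear spectral module the abstract's blueprint calls for. It is a minimal illustration, not the authors' implementation: the scalar coefficient parameterization `theta`, the rescaled normalized Laplacian `L_hat` (eigenvalues assumed in [-1, 1]), and the choice of `alpha` are all assumptions.

```python
import torch
import torch.nn as nn


class GegenConv(nn.Module):
    """Hypothetical Gegenbauer-polynomial graph filter (a sketch, not the
    paper's exact TGC layer). Filters node features x with
    sum_k theta_k * C_k^alpha(L_hat) x, where L_hat is a rescaled
    normalized Laplacian with eigenvalues in [-1, 1]."""

    def __init__(self, order: int, alpha: float = 1.0):
        super().__init__()
        self.order = order
        self.alpha = alpha
        # One scalar coefficient per polynomial order (assumption).
        self.theta = nn.Parameter(torch.zeros(order + 1))
        nn.init.normal_(self.theta, std=0.1)

    def forward(self, L_hat: torch.Tensor, x: torch.Tensor) -> torch.Tensor:
        # Gegenbauer recurrence:
        #   C_0(x) = 1, C_1(x) = 2*alpha*x,
        #   n*C_n(x) = 2x(n+alpha-1)*C_{n-1}(x) - (n+2alpha-2)*C_{n-2}(x)
        a = self.alpha
        c_prev, c_curr = x, 2.0 * a * (L_hat @ x)
        out = self.theta[0] * c_prev
        if self.order >= 1:
            out = out + self.theta[1] * c_curr
        for n in range(2, self.order + 1):
            c_next = (2.0 * (n + a - 1) * (L_hat @ c_curr)
                      - (n + 2 * a - 2) * c_prev) / n
            out = out + self.theta[n] * c_next
            c_prev, c_curr = c_curr, c_next
        return out
```

Stacking such purely linear spatial filters with a spectral temporal module is the pattern the paper's blueprint describes.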
Related papers
- On The Temporal Domain of Differential Equation Inspired Graph Neural
Networks [14.779420473274737]
We show that our model, called TDE-GNN, can capture a wide range of temporal dynamics that go beyond typical first- or second-order methods.
We demonstrate the benefit of learning the temporal dependencies using our method rather than using pre-defined temporal dynamics on several graph benchmarks.
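The "learned rather than pre-defined temporal dynamics" idea can be sketched as a time-stepping update whose history coefficients are trainable. Everything below (the coefficients `a`, the simple propagation term) is a hypothetical illustration under those assumptions, not TDE-GNN's actual scheme.

```python
import torch
import torch.nn as nn


class HigherOrderTemporalStep(nn.Module):
    """A sketch of learning an m-step temporal update instead of fixing
    first- or second-order dynamics (hypothetical, not TDE-GNN's exact
    scheme): h_{t+1} = sum_k a_k * h_{t-k} + F(A, h_t), with the
    coefficients a_k learned from data."""

    def __init__(self, dim: int, history: int):
        super().__init__()
        self.a = nn.Parameter(torch.zeros(history))  # learned temporal weights
        self.lin = nn.Linear(dim, dim)               # simple graph update F

    def forward(self, A: torch.Tensor, hs: list[torch.Tensor]) -> torch.Tensor:
        # hs = [h_t, h_{t-1}, ..., h_{t-m+1}], newest first.
        mix = sum(self.a[k] * hs[k] for k in range(len(hs)))
        return mix + torch.tanh(A @ self.lin(hs[0]))
```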
arXiv Detail & Related papers (2024-01-20T01:12:57Z) - FourierGNN: Rethinking Multivariate Time Series Forecasting from a Pure
Graph Perspective [48.00240550685946]
Current state-of-the-art graph neural network (GNN)-based forecasting methods usually require both graph networks (e.g., GCN) and temporal networks (e.g., LSTM) to capture inter-series (spatial) dynamics and intra-series (temporal) dependencies, respectively.
We propose a novel Fourier Graph Neural Network (FourierGNN) by stacking our proposed Fourier Graph Operator (FGO) to perform matrix multiplications in Fourier space.
Our experiments on seven datasets have demonstrated superior performance with higher efficiency and fewer parameters compared with state-of-the-art methods.
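One way to read the FGO idea is: move the flattened space-time signal into Fourier space, apply a learnable (complex) matrix multiplication there, and transform back. The sketch below follows that reading; the weight parameterization and tensor shapes are assumptions, not the paper's exact layer.

```python
import torch
import torch.nn as nn


class FourierGraphOperator(nn.Module):
    """A sketch of matrix multiplication in Fourier space (a hypothetical
    reading of FGO, not the authors' exact layer)."""

    def __init__(self, dim: int):
        super().__init__()
        # Complex weights stored as two real tensors (assumption).
        self.w_re = nn.Parameter(torch.randn(dim, dim) * 0.02)
        self.w_im = nn.Parameter(torch.randn(dim, dim) * 0.02)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (num_tokens, dim), tokens = nodes x timesteps flattened.
        xf = torch.fft.fft(x, dim=0)            # to Fourier space
        w = torch.complex(self.w_re, self.w_im)
        yf = xf @ w                             # multiply in Fourier space
        return torch.fft.ifft(yf, dim=0).real   # back to the time domain
```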
arXiv Detail & Related papers (2023-11-10T17:13:26Z) - DyExplainer: Explainable Dynamic Graph Neural Networks [37.16783248212211]
We present DyExplainer, a novel approach to explaining dynamic Graph Neural Networks (GNNs) on the fly.
DyExplainer trains a dynamic GNN backbone to extract representations of the graph at each snapshot.
We also augment our approach with contrastive learning techniques to provide prior-guided regularization.
arXiv Detail & Related papers (2023-10-25T05:26:33Z) - Space-Time Graph Neural Networks with Stochastic Graph Perturbations [100.31591011966603]
Space-time graph neural networks (ST-GNNs) learn efficient graph representations of time-varying data.
In this paper, we revisit the properties of ST-GNNs and prove that they are stable to stochastic graph perturbations.
Our analysis suggests that ST-GNNs are suitable for transfer learning on time-varying graphs.
arXiv Detail & Related papers (2022-10-28T16:59:51Z) - Scalable Spatiotemporal Graph Neural Networks [14.415967477487692]
Graph neural networks (GNNs) are often the core component of the forecasting architecture.
In most spatiotemporal GNNs, the computational complexity scales up to a quadratic factor with the length of the sequence times the number of links in the graph.
We propose a scalable architecture that exploits an efficient encoding of both temporal and spatial dynamics.
arXiv Detail & Related papers (2022-09-14T09:47:38Z) - An Explainer for Temporal Graph Neural Networks [27.393641343203363]
Temporal graph neural networks (TGNNs) have been widely used for modeling time-evolving graph-related tasks.
We propose a novel explainer framework for TGNN models.
arXiv Detail & Related papers (2022-09-02T04:12:40Z) - EvenNet: Ignoring Odd-Hop Neighbors Improves Robustness of Graph Neural
Networks [51.42338058718487]
Graph Neural Networks (GNNs) have received extensive research attention for their promising performance in graph machine learning.
Existing approaches, such as GCN and GPRGNN, are not robust in the face of homophily changes on test graphs.
We propose EvenNet, a spectral GNN corresponding to an even-polynomial graph filter.
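The even-polynomial filter admits a direct sketch: propagate two hops at a time so odd-hop neighbors never contribute. The coefficients `g` and normalized adjacency `A_hat` below are illustrative assumptions, not EvenNet's exact parameterization.

```python
import torch
import torch.nn as nn


class EvenFilter(nn.Module):
    """A sketch of an even-polynomial spectral filter in the spirit of
    EvenNet: y = sum_k g_k * A_hat^{2k} x, so only even-hop neighbors
    contribute (hypothetical parameterization)."""

    def __init__(self, hops: int):
        super().__init__()
        self.g = nn.Parameter(torch.ones(hops + 1) / (hops + 1))

    def forward(self, A_hat: torch.Tensor, x: torch.Tensor) -> torch.Tensor:
        out = self.g[0] * x
        h = x
        for k in range(1, len(self.g)):
            h = A_hat @ (A_hat @ h)   # advance two hops at a time
            out = out + self.g[k] * h
        return out
```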
arXiv Detail & Related papers (2022-05-27T10:48:14Z) - Fast Graph Attention Networks Using Effective Resistance Based Graph
Sparsification [70.50751397870972]
FastGAT is a method to make attention-based GNNs lightweight by using spectral sparsification to generate an optimal pruning of the input graph.
We experimentally evaluate FastGAT on several large real world graph datasets for node classification tasks.
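The generic effective-resistance sparsification recipe behind this kind of method can be sketched with a dense Laplacian pseudoinverse: score each edge by weight times effective resistance, then sample edges by those scores. This is the textbook recipe under small-graph assumptions, not FastGAT's scalable procedure.

```python
import numpy as np


def effective_resistance_sparsify(A: np.ndarray, keep: int, seed: int = 0):
    """Sketch of effective-resistance edge sampling (generic spectral
    sparsification, not FastGAT's exact algorithm)."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    L = np.diag(A.sum(axis=1)) - A
    Lp = np.linalg.pinv(L)                      # Laplacian pseudoinverse
    rows, cols = np.triu_indices(n, k=1)
    mask = A[rows, cols] > 0
    u, v = rows[mask], cols[mask]
    # Effective resistance R_uv = (e_u - e_v)^T L^+ (e_u - e_v).
    R = Lp[u, u] + Lp[v, v] - 2 * Lp[u, v]
    p = A[u, v] * R
    p = p / p.sum()
    idx = rng.choice(len(u), size=min(keep, len(u)), replace=False, p=p)
    S = np.zeros_like(A)
    S[u[idx], v[idx]] = A[u[idx], v[idx]]
    return S + S.T                              # sparsified symmetric graph
```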
arXiv Detail & Related papers (2020-06-15T22:07:54Z) - Binarized Graph Neural Network [65.20589262811677]
We develop a binarized graph neural network to learn the binary representations of the nodes with binary network parameters.
Our proposed method can be seamlessly integrated into the existing GNN-based embedding approaches.
Experiments indicate that the proposed binarized graph neural network, namely BGN, is orders of magnitude more efficient in terms of both time and space.
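A common way to binarize both parameters and representations is sign quantization with a straight-through estimator so gradients still flow; the sketch below shows that generic pattern, not BGN's specific architecture.

```python
import torch
import torch.nn as nn


class BinaryLinear(nn.Module):
    """Sketch of a binarized GNN building block (hypothetical, not BGN's
    exact layer): weights and activations quantized to {-1, +1}."""

    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.w = nn.Parameter(torch.randn(in_dim, out_dim) * 0.1)

    @staticmethod
    def _binarize(t: torch.Tensor) -> torch.Tensor:
        # Forward: sign(t); backward: identity (straight-through estimator).
        return t + (torch.sign(t) - t).detach()

    def forward(self, A_hat: torch.Tensor, x: torch.Tensor) -> torch.Tensor:
        xb = self._binarize(x)          # binary node features
        wb = self._binarize(self.w)     # binary weights
        return A_hat @ (xb @ wb)        # aggregation over binary quantities
```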
arXiv Detail & Related papers (2020-04-19T09:43:14Z)