Fast Temporal Wavelet Graph Neural Networks
- URL: http://arxiv.org/abs/2302.08643v3
- Date: Sat, 28 Oct 2023 20:38:05 GMT
- Title: Fast Temporal Wavelet Graph Neural Networks
- Authors: Duc Thien Nguyen, Manh Duc Tuan Nguyen, Truong Son Hy, Risi Kondor
- Abstract summary: We propose Fast Temporal Wavelet Graph Neural Networks (FTWGNN) for learning tasks on timeseries data.
We employ Multiresolution Matrix Factorization (MMF) to factorize the highly dense graph structure and compute the corresponding sparse wavelet basis.
Experimental results on real-world PEMS-BAY, METR-LA traffic datasets and AJILE12 ECoG dataset show that FTWGNN is competitive with the state-of-the-arts.
- Score: 7.477634824955323
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Spatio-temporal signal forecasting plays an important role in numerous
domains, especially in neuroscience and transportation. The task is challenging
due to the highly intricate spatial structure, as well as the non-linear
temporal dynamics of the network. To facilitate reliable and timely forecast
for the human brain and traffic networks, we propose the Fast Temporal Wavelet
Graph Neural Networks (FTWGNN) that is both time- and memory-efficient for
learning tasks on timeseries data with the underlying graph structure, thanks
to the theories of multiresolution analysis and wavelet theory on discrete
spaces. We employ Multiresolution Matrix Factorization (MMF) (Kondor et al.,
2014) to factorize the highly dense graph structure and compute the
corresponding sparse wavelet basis that allows us to construct fast wavelet
convolution as the backbone of our novel architecture. Experimental results on
real-world PEMS-BAY, METR-LA traffic datasets and AJILE12 ECoG dataset show
that FTWGNN is competitive with the state of the art while maintaining a low
computational footprint. Our PyTorch implementation is publicly available at
https://github.com/HySonLab/TWGNN
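The core idea the abstract describes is that once MMF yields a sparse orthogonal wavelet basis, graph filtering becomes cheap: a filter acts diagonally in the wavelet domain. The following is an illustrative sketch only, not the authors' implementation; the tiny Haar-style basis stands in for an MMF-derived basis, and `g` is a hypothetical learned filter.

```python
import numpy as np

# Fast wavelet convolution: y = W @ diag(g) @ W.T @ x, where the columns
# of W form a sparse orthonormal wavelet basis and g is a spectral filter.
# A normalized Haar basis on 4 nodes stands in for the MMF-derived basis.
W = np.array([
    [0.5,  0.5,  0.5,  0.5],                  # scaling (father) vector
    [0.5,  0.5, -0.5, -0.5],                  # coarse wavelet
    [1 / np.sqrt(2), -1 / np.sqrt(2), 0, 0],  # local wavelet
    [0, 0, 1 / np.sqrt(2), -1 / np.sqrt(2)],  # local wavelet
]).T  # columns are the basis vectors

def wavelet_conv(x, g):
    """Filter signal x with spectral coefficients g in the wavelet domain."""
    return W @ (g * (W.T @ x))

x = np.array([1.0, 2.0, 3.0, 4.0])
g = np.array([1.0, 0.5, 0.5, 0.5])  # hypothetical learned filter
y = wavelet_conv(x, g)
```

Because W is sparse, the two basis multiplications cost far less than a dense spectral transform, which is the source of the claimed time and memory savings.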
Related papers
- FourierGNN: Rethinking Multivariate Time Series Forecasting from a Pure
Graph Perspective [48.00240550685946]
Current state-of-the-art graph neural network (GNN)-based forecasting methods usually require both graph networks (e.g., GCN) and temporal networks (e.g., LSTM) to capture inter-series (spatial) dynamics and intra-series (temporal) dependencies, respectively.
We propose a novel Fourier Graph Neural Network (FourierGNN) by stacking our proposed Fourier Graph Operator (FGO) to perform matrix multiplications in Fourier space.
Our experiments on seven datasets have demonstrated superior performance with higher efficiency and fewer parameters compared with state-of-the-art methods.
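The summary above describes replacing separate spatial and temporal networks with multiplications in Fourier space. A hedged sketch of that general idea (illustrative names, not FourierGNN's actual API): pointwise multiplication by a kernel in the frequency domain is equivalent to a convolution in the original domain, so one learnable frequency-space kernel can play the role of an explicit convolution.

```python
import numpy as np

def fourier_op(x, kernel_f):
    """Apply a frequency-domain kernel: IFFT(FFT(x) * kernel_f).

    kernel_f is a hypothetical learnable kernel defined directly in
    Fourier space; multiplying there is a circular convolution in time.
    """
    x_f = np.fft.fft(x)
    return np.fft.ifft(x_f * kernel_f).real

n = 8
x = np.arange(n, dtype=float)
kernel_f = np.ones(n)          # identity kernel in Fourier space
y = fourier_op(x, kernel_f)    # returns x unchanged
```

Stacking such operators, as the FourierGNN summary suggests, keeps all mixing inside the frequency domain and avoids alternating between graph and sequence modules.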
arXiv Detail & Related papers (2023-11-10T17:13:26Z) - Temporal Aggregation and Propagation Graph Neural Networks for Dynamic
Representation [67.26422477327179]
Temporal graphs exhibit dynamic interactions between nodes over continuous time.
We propose a novel method of temporal graph convolution with the whole neighborhood.
Our proposed TAP-GNN outperforms existing temporal graph methods by a large margin in terms of both predictive performance and online inference latency.
arXiv Detail & Related papers (2023-04-15T08:17:18Z) - Space-Time Graph Neural Networks with Stochastic Graph Perturbations [100.31591011966603]
Space-time graph neural networks (ST-GNNs) learn efficient graph representations of time-varying data.
In this paper we revisit the properties of ST-GNNs and prove that they are stable to stochastic graph perturbations.
Our analysis suggests that ST-GNNs are suitable for transfer learning on time-varying graphs.
arXiv Detail & Related papers (2022-10-28T16:59:51Z) - STGIN: A Spatial Temporal Graph-Informer Network for Long Sequence
Traffic Speed Forecasting [8.596556653895028]
This study proposes a new spatial-temporal neural network architecture to handle long-term traffic parameter forecasting.
The attention mechanism potentially guarantees long-term prediction performance without significant information loss from distant inputs.
arXiv Detail & Related papers (2022-10-01T05:58:22Z) - Scalable Spatiotemporal Graph Neural Networks [14.415967477487692]
Graph neural networks (GNNs) are often the core component of the forecasting architecture.
In most spatiotemporal GNNs, the computational complexity scales quadratically with the product of the sequence length and the number of links in the graph.
We propose a scalable architecture that exploits an efficient encoding of both temporal and spatial dynamics.
arXiv Detail & Related papers (2022-09-14T09:47:38Z) - Learning Wave Propagation with Attention-Based Convolutional Recurrent
Autoencoder Net [0.0]
We present an end-to-end attention-based convolutional recurrent autoencoder (AB-CRAN) network for data-driven modeling of wave propagation phenomena.
We employ a denoising-based convolutional autoencoder from the full-order snapshots given by time-dependent hyperbolic partial differential equations for wave propagation.
The attention-based sequence-to-sequence network increases the time-horizon of prediction by five times compared to the plain RNN-LSTM.
arXiv Detail & Related papers (2022-01-17T20:51:59Z) - Efficient-Dyn: Dynamic Graph Representation Learning via Event-based
Temporal Sparse Attention Network [2.0047096160313456]
Dynamic graph neural networks have received more and more attention from researchers.
We propose a novel dynamic graph neural network, Efficient-Dyn.
It adaptively encodes temporal information into a sequence of patches with an equal amount of temporal-topological structure.
arXiv Detail & Related papers (2022-01-04T23:52:24Z) - Space-Time Graph Neural Networks [104.55175325870195]
We introduce space-time graph neural network (ST-GNN) to jointly process the underlying space-time topology of time-varying network data.
Our analysis shows that small variations in the network topology and time evolution of a system do not significantly affect the performance of ST-GNNs.
arXiv Detail & Related papers (2021-10-06T16:08:44Z) - Deep Cellular Recurrent Network for Efficient Analysis of Time-Series
Data with Spatial Information [52.635997570873194]
This work proposes a novel deep cellular recurrent neural network (DCRNN) architecture to process complex multi-dimensional time series data with spatial information.
The proposed architecture achieves state-of-the-art performance while utilizing substantially less trainable parameters when compared to comparable methods in the literature.
arXiv Detail & Related papers (2021-01-12T20:08:18Z) - Spatio-Temporal Graph Scattering Transform [54.52797775999124]
Graph neural networks may be impractical in some real-world scenarios due to a lack of sufficient high-quality training data.
We put forth a novel mathematically designed framework to analyze spatio-temporal data.
arXiv Detail & Related papers (2020-12-06T19:49:55Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.