Beyond Spatio-Temporal Representations: Evolving Fourier Transform for Temporal Graphs
- URL: http://arxiv.org/abs/2402.16078v2
- Date: Thu, 18 Apr 2024 14:13:19 GMT
- Title: Beyond Spatio-Temporal Representations: Evolving Fourier Transform for Temporal Graphs
- Authors: Anson Bastos, Kuldeep Singh, Abhishek Nadgeri, Manish Singh, Toyotaro Suzumura
- Abstract summary: We present the first invertible spectral transform that captures evolving representations on temporal graphs.
We develop a simple neural model induced with EFT for capturing evolving graph spectra.
We empirically validate our theoretical findings on a number of large-scale and standard temporal graph benchmarks.
- Score: 5.752149463974228
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We present the Evolving Graph Fourier Transform (EFT), the first invertible spectral transform that captures evolving representations on temporal graphs. We motivate our work by the inadequacy of existing methods for capturing the evolving graph spectra, which are also computationally expensive due to the temporal aspect along with the graph vertex domain. We view the problem as an optimization over the Laplacian of the continuous time dynamic graph. Additionally, we propose pseudo-spectrum relaxations that decompose the transformation process, making it highly computationally efficient. The EFT method adeptly captures the evolving graph's structural and positional properties, making it effective for downstream tasks on evolving graphs. Hence, as a reference implementation, we develop a simple neural model induced with EFT for capturing evolving graph spectra. We empirically validate our theoretical findings on a number of large-scale and standard temporal graph benchmarks and demonstrate that our model achieves state-of-the-art performance.
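As background for readers new to graph spectral methods, the sketch below applies a standard per-snapshot graph Fourier transform (Laplacian eigendecomposition, forward and inverse projection) to a toy temporal graph. It is only a baseline illustration of the kind of invertible transform that EFT generalizes; it is not the authors' EFT or its pseudo-spectrum relaxations, and recomputing an eigendecomposition at every timestep is exactly the per-snapshot cost the abstract describes as expensive.

```python
# Minimal illustration (NOT the paper's EFT): a per-snapshot graph Fourier
# transform from the eigendecomposition of each snapshot Laplacian. EFT instead
# poses the transform as an optimization over the Laplacian of the continuous
# time dynamic graph; this sketch only shows the invertible forward/inverse
# spectral transform that such methods generalize.
import numpy as np

def graph_fourier_transform(adj: np.ndarray, signal: np.ndarray):
    """Return (spectrum, eigenvectors) of `signal` w.r.t. the snapshot Laplacian."""
    deg = np.diag(adj.sum(axis=1))
    lap = deg - adj                          # combinatorial Laplacian L = D - A
    eigvals, eigvecs = np.linalg.eigh(lap)   # L symmetric => orthonormal U
    spectrum = eigvecs.T @ signal            # forward GFT: x_hat = U^T x
    return spectrum, eigvecs

def inverse_graph_fourier_transform(spectrum, eigvecs):
    return eigvecs @ spectrum                # inverse GFT: x = U x_hat

# Toy temporal graph: a list of adjacency snapshots with a shared node signal.
rng = np.random.default_rng(0)
snapshots = []
for _ in range(3):
    a = rng.integers(0, 2, size=(5, 5))
    a = np.triu(a, 1)
    snapshots.append((a + a.T).astype(float))  # symmetric, no self-loops

x = rng.normal(size=5)
for adj in snapshots:
    x_hat, U = graph_fourier_transform(adj, x)
    x_rec = inverse_graph_fourier_transform(x_hat, U)
    assert np.allclose(x, x_rec)             # the transform is invertible
```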
Related papers
- Through the Dual-Prism: A Spectral Perspective on Graph Data Augmentation for Graph Classification [71.36575018271405]
We introduce the Dual-Prism (DP) augmentation method, comprising DP-Noise and DP-Mask.
We find that keeping the low-frequency eigenvalues unchanged can preserve the critical properties at a large scale when generating augmented graphs.
arXiv Detail & Related papers (2024-01-18T12:58:53Z)
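The Dual-Prism summary above is concrete enough to sketch: perturb only the high-frequency Laplacian eigenvalues and keep the low-frequency ones fixed before reassembling the graph. The function below is an illustrative simplification (the `keep_ratio` and `noise_std` parameters are invented for this sketch), not the paper's DP-Noise or DP-Mask procedure.

```python
# Illustrative spectral augmentation: keep low-frequency Laplacian eigenvalues
# fixed, add noise to the rest, then map back to an adjacency-like matrix.
import numpy as np

def spectral_noise_augment(adj, keep_ratio=0.3, noise_std=0.1, seed=0):
    rng = np.random.default_rng(seed)
    deg = np.diag(adj.sum(axis=1))
    lap = deg - adj
    eigvals, eigvecs = np.linalg.eigh(lap)        # eigenvalues in ascending order
    k = int(keep_ratio * len(eigvals))            # low-frequency modes to preserve
    noisy = eigvals.copy()
    noisy[k:] += rng.normal(0.0, noise_std, size=len(eigvals) - k)
    lap_aug = eigvecs @ np.diag(noisy) @ eigvecs.T
    return np.diag(np.diag(lap_aug)) - lap_aug    # back to an adjacency-like matrix

cycle4 = np.array([[0, 1, 0, 1],
                   [1, 0, 1, 0],
                   [0, 1, 0, 1],
                   [1, 0, 1, 0]], dtype=float)
print(spectral_noise_augment(cycle4, keep_ratio=0.5))  # two lowest modes untouched
```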
- FourierGNN: Rethinking Multivariate Time Series Forecasting from a Pure Graph Perspective [48.00240550685946]
Current state-of-the-art graph neural network (GNN)-based forecasting methods usually require both graph networks (e.g., GCN) and temporal networks (e.g., LSTM) to capture inter-series (spatial) dynamics and intra-series (temporal) dependencies, respectively.
We propose a novel Fourier Graph Neural Network (FourierGNN) by stacking our proposed Fourier Graph Operator (FGO) to perform matrix multiplications in Fourier space.
Our experiments on seven datasets have demonstrated superior performance with higher efficiency and fewer parameters compared with state-of-the-art methods.
arXiv Detail & Related papers (2023-11-10T17:13:26Z)
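As a rough illustration of the FourierGNN idea above (matrix multiplications in Fourier space), the sketch below transforms node tokens along the graph axis with an FFT, applies a learnable complex weight matrix per frequency mode, and transforms back. The module name and shapes are assumptions for this sketch; the actual Fourier Graph Operator differs in detail.

```python
# Learnable mixing carried out in Fourier space, in the spirit of FourierGNN's
# Fourier Graph Operator (FGO); all names and shapes here are assumptions.
import torch
import torch.nn as nn

class FourierMix(nn.Module):
    def __init__(self, num_nodes: int, dim: int):
        super().__init__()
        # one complex weight matrix per retained frequency mode
        self.weight = nn.Parameter(
            0.02 * torch.randn(num_nodes // 2 + 1, dim, dim, dtype=torch.cfloat)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, num_nodes, dim)
        x_f = torch.fft.rfft(x, dim=1)                         # FFT along the node axis
        x_f = torch.einsum("bnd,nde->bne", x_f, self.weight)   # matmul in Fourier space
        return torch.fft.irfft(x_f, n=x.size(1), dim=1)        # back to the vertex domain

x = torch.randn(8, 20, 16)                       # batch of 8, 20 nodes, 16 features
print(FourierMix(num_nodes=20, dim=16)(x).shape) # torch.Size([8, 20, 16])
```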
- Deep Prompt Tuning for Graph Transformers [55.2480439325792]
Fine-tuning is resource-intensive and requires storing multiple copies of large models.
We propose a novel approach called deep graph prompt tuning as an alternative to fine-tuning.
By freezing the pre-trained parameters and only updating the added tokens, our approach reduces the number of free parameters and eliminates the need for multiple model copies.
arXiv Detail & Related papers (2023-09-18T20:12:17Z)
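The freeze-the-backbone, train-only-added-tokens pattern described above is standard enough to sketch generically. The code below uses a plain `nn.TransformerEncoder` as a stand-in backbone and prepends learnable prompt tokens to the node-token sequence; it is not the paper's deep graph prompt tuning architecture.

```python
# Generic prompt tuning: freeze a pre-trained backbone, train only added tokens.
import torch
import torch.nn as nn

class PromptedGraphTransformer(nn.Module):
    def __init__(self, backbone: nn.Module, dim: int, num_prompts: int = 8):
        super().__init__()
        self.backbone = backbone
        for p in self.backbone.parameters():
            p.requires_grad = False                         # frozen pre-trained weights
        self.prompts = nn.Parameter(torch.zeros(num_prompts, dim))  # only trainable part

    def forward(self, node_tokens: torch.Tensor) -> torch.Tensor:
        # node_tokens: (batch, num_nodes, dim)
        prompts = self.prompts.unsqueeze(0).expand(node_tokens.size(0), -1, -1)
        return self.backbone(torch.cat([prompts, node_tokens], dim=1))

dim = 32
backbone = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model=dim, nhead=4, batch_first=True), num_layers=2
)
model = PromptedGraphTransformer(backbone, dim)
print(model(torch.randn(2, 5, dim)).shape)                          # torch.Size([2, 13, 32])
print([n for n, p in model.named_parameters() if p.requires_grad])  # ['prompts']
```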
- TransformerG2G: Adaptive time-stepping for learning temporal graph embeddings using transformers [2.2120851074630177]
We develop a graph embedding model with uncertainty quantification, TransformerG2G, to learn temporal dynamics of temporal graphs.
Our experiments demonstrate that the proposed TransformerG2G model outperforms conventional multi-step methods.
By examining the attention weights, we can uncover temporal dependencies, identify influential elements, and gain insights into the complex interactions within the graph structure.
arXiv Detail & Related papers (2023-07-05T18:34:22Z)
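A minimal sketch of the "embedding with uncertainty quantification" idea above: a transformer encodes a node's history of context vectors and outputs a Gaussian embedding (mean plus a positive variance). TransformerG2G's adaptive time-stepping and training objective are not reproduced; all module names here are placeholders.

```python
# Transformer over a node's history -> Gaussian embedding (mean, variance).
import torch
import torch.nn as nn
import torch.nn.functional as F

class HistoryToGaussian(nn.Module):
    def __init__(self, dim: int, embed_dim: int):
        super().__init__()
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.to_mu = nn.Linear(dim, embed_dim)
        self.to_sigma = nn.Linear(dim, embed_dim)

    def forward(self, history: torch.Tensor):
        # history: (batch, timesteps, dim) per-node context over past snapshots
        h = self.encoder(history)[:, -1]               # summary of the latest state
        mu = self.to_mu(h)
        sigma = F.softplus(self.to_sigma(h)) + 1e-6    # variances must stay positive
        return mu, sigma

mu, sigma = HistoryToGaussian(dim=32, embed_dim=16)(torch.randn(4, 5, 32))
print(mu.shape, sigma.shape)   # torch.Size([4, 16]) torch.Size([4, 16])
```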
- EasyDGL: Encode, Train and Interpret for Continuous-time Dynamic Graph Learning [92.71579608528907]
This paper aims to design an easy-to-use pipeline (termed EasyDGL) composed of three key modules, offering both strong fitting ability and interpretability.
EasyDGL can effectively quantify the predictive power of the frequency content that a model learns from the evolving graph data.
arXiv Detail & Related papers (2023-03-22T06:35:08Z)
- FTM: A Frame-level Timeline Modeling Method for Temporal Graph Representation Learning [47.52733127616005]
We propose a Frame-level Timeline Modeling (FTM) method that helps to capture both short-term and long-term features.
Our method can be easily assembled with most temporal GNNs.
arXiv Detail & Related papers (2023-02-23T06:53:16Z)
- Time-aware Dynamic Graph Embedding for Asynchronous Structural Evolution [60.695162101159134]
Existing works merely view a dynamic graph as a sequence of changes.
We formulate dynamic graphs as temporal edge sequences associated with the joining time of vertices and the timespan of edges (ToE).
A time-aware Transformer is proposed to embed vertices' dynamic connections and ToEs into the learned vertex representations.
arXiv Detail & Related papers (2022-07-01T15:32:56Z)
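To make the role of ToEs concrete, the sketch below encodes edge timespans with a generic sinusoidal time encoding, adds them to neighbor features, and runs a transformer encoder to produce vertex representations. The encoding and mean pooling are assumptions invented for this illustration, not the paper's time-aware Transformer design.

```python
# Feed edge timespans (ToEs) into a transformer alongside neighbor features.
import math
import torch
import torch.nn as nn

def sinusoidal_time_encoding(t: torch.Tensor, dim: int) -> torch.Tensor:
    # t: (...,) timespans -> (..., dim) encoding
    freqs = torch.exp(torch.arange(0, dim, 2).float() * (-math.log(10000.0) / dim))
    angles = t.unsqueeze(-1) * freqs
    return torch.cat([torch.sin(angles), torch.cos(angles)], dim=-1)

dim = 32
encoder = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model=dim, nhead=4, batch_first=True), num_layers=1
)
neighbor_feats = torch.randn(2, 10, dim)   # 2 target vertices, 10 temporal edges each
toe = torch.rand(2, 10) * 100.0            # timespans of those edges (ToEs)
tokens = neighbor_feats + sinusoidal_time_encoding(toe, dim)
vertex_repr = encoder(tokens).mean(dim=1)  # (2, dim) learned vertex representations
print(vertex_repr.shape)
```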
- Spectral Graph Convolutional Networks With Lifting-based Adaptive Graph Wavelets [81.63035727821145]
Spectral graph convolutional networks (SGCNs) have been attracting increasing attention in graph representation learning.
We propose a novel class of spectral graph convolutional networks that implement graph convolutions with adaptive graph wavelets.
arXiv Detail & Related papers (2021-08-03T17:57:53Z)
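For context on what a spectral graph convolution computes, the sketch below applies a learnable polynomial of the normalized Laplacian to node features. It illustrates the generic SGCN operation only; the paper's lifting-based adaptive graph wavelets are not implemented here.

```python
# Plain spectral graph convolution via a learnable Laplacian polynomial filter.
import torch
import torch.nn as nn

class PolynomialSpectralConv(nn.Module):
    def __init__(self, in_dim: int, out_dim: int, order: int = 3):
        super().__init__()
        self.weights = nn.ModuleList(nn.Linear(in_dim, out_dim, bias=False)
                                     for _ in range(order + 1))

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # x: (num_nodes, in_dim), adj: (num_nodes, num_nodes) symmetric
        deg = adj.sum(dim=1)
        d_inv_sqrt = torch.diag(deg.clamp(min=1e-8).pow(-0.5))
        lap = torch.eye(adj.size(0)) - d_inv_sqrt @ adj @ d_inv_sqrt  # normalized Laplacian
        out, lap_k_x = 0.0, x
        for k, lin in enumerate(self.weights):
            if k > 0:
                lap_k_x = lap @ lap_k_x        # L^k x without an eigendecomposition
            out = out + lin(lap_k_x)
        return out

adj = (torch.rand(6, 6) > 0.5).float()
adj = torch.triu(adj, 1); adj = adj + adj.T
print(PolynomialSpectralConv(8, 4)(torch.randn(6, 8), adj).shape)  # torch.Size([6, 4])
```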
- Evolving-Graph Gaussian Processes [20.065168755580558]
Existing approaches have focused on static structures, whereas many real-world graphs have a dynamic structure, limiting the applications of GGPs.
We propose evolving-Graph Gaussian Processes (e-GGPs) to overcome this.
We demonstrate the benefits of e-GGPs over static graph Gaussian Process approaches.
arXiv Detail & Related papers (2021-06-29T07:16:04Z)
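A toy illustration of placing a Gaussian Process over graph data whose structure changes over time: node features are averaged with their current neighbors and a GP is refit on each snapshot using scikit-learn's `GaussianProcessRegressor`. The e-GGP mechanism for transferring the posterior across graph changes is not reproduced.

```python
# GP regression over neighbor-aggregated node features, refit per graph snapshot.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def neighbor_average(adj: np.ndarray, feats: np.ndarray) -> np.ndarray:
    deg = adj.sum(axis=1, keepdims=True) + 1.0
    return (feats + adj @ feats) / deg          # include the node itself

rng = np.random.default_rng(0)
feats = rng.normal(size=(8, 3))
y = rng.normal(size=8)
for t in range(3):                              # the graph structure evolves over time
    adj = (rng.random((8, 8)) > 0.6).astype(float)
    adj = np.triu(adj, 1); adj = adj + adj.T
    x_t = neighbor_average(adj, feats)
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0)).fit(x_t, y)
    mean, std = gp.predict(x_t, return_std=True)
    print(t, mean.shape, std.shape)
```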