Graph Wave Networks
- URL: http://arxiv.org/abs/2505.20034v2
- Date: Thu, 29 May 2025 03:11:51 GMT
- Title: Graph Wave Networks
- Authors: Juwei Yue, Haikuo Li, Jiawei Sheng, Yihan Guo, Xinghua Zhang, Chuan Zhou, Tingwen Liu, Li Guo
- Abstract summary: We develop a graph wave equation to leverage wave propagation on graphs. In detail, we demonstrate that the graph wave equation can be connected to traditional spectral GNNs. Experiments show that GWNs achieve SOTA and efficient performance on benchmark datasets.
- Score: 17.80926325018177
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Dynamics modeling has been introduced as a novel paradigm for message passing (MP) in graph neural networks (GNNs). Existing methods treat MP between nodes as a heat diffusion process and leverage the heat equation to model the temporal evolution of node embeddings. However, the heat equation can hardly depict the wave nature of graph signals in graph signal processing. Moreover, the heat equation is a partial differential equation (PDE) involving a first-order partial derivative in time, whose numerical solution usually has low stability and leads to inefficient model training. In this paper, we aim to depict more wave details in MP, since graph signals are essentially wave signals that can be seen as a superposition of a series of waves in the form of eigenvectors. This motivates us to consider MP as a wave propagation process that captures the temporal evolution of wave signals in the embedding space. Based on the wave equation in physics, we develop a graph wave equation to model wave propagation on graphs. In detail, we show that the graph wave equation can be connected to traditional spectral GNNs, facilitating the design of graph wave networks (GWNs) based on various Laplacians and enhancing the performance of spectral GNNs. Moreover, the graph wave equation is a PDE involving a second-order partial derivative in time, which has stronger stability on graphs than the heat equation with its first-order derivative in time. Additionally, we theoretically prove that the numerical solution derived from the graph wave equation is consistently stable, enabling significant gains in model efficiency while preserving performance. Extensive experiments show that GWNs achieve state-of-the-art and efficient performance on benchmark datasets, and exhibit outstanding performance on challenging graph problems such as over-smoothing and heterophily.
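To make the propagation concrete, here is a minimal sketch of wave-style message passing, assuming a symmetric normalized Laplacian L and an explicit central-difference (leapfrog) discretization of d^2u/dt^2 = -c^2 L u; the wave speed c, step size dt, and zero initial velocity are illustrative choices, not the paper's exact scheme.

```python
import numpy as np

def normalized_laplacian(adj):
    """Symmetric normalized Laplacian L = I - D^{-1/2} A D^{-1/2}."""
    deg = adj.sum(axis=1)
    d_inv_sqrt = np.zeros_like(deg)
    d_inv_sqrt[deg > 0] = deg[deg > 0] ** -0.5
    return np.eye(adj.shape[0]) - d_inv_sqrt[:, None] * adj * d_inv_sqrt[None, :]

def wave_propagate(adj, x0, steps=10, c=1.0, dt=0.5):
    """Leapfrog integration of the graph wave equation d^2u/dt^2 = -c^2 L u,
    a second-order-in-time counterpart of heat-diffusion message passing."""
    L = normalized_laplacian(adj)
    u_prev, u = x0.copy(), x0.copy()          # zero initial velocity
    for _ in range(steps):
        u_next = 2 * u - u_prev - (c * dt) ** 2 * (L @ u)
        u_prev, u = u, u_next
    return u

# Toy usage: 4-node path graph with 3-dimensional node features
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
x0 = np.random.randn(4, 3)
print(wave_propagate(adj, x0).shape)          # (4, 3)
```

For the normalized Laplacian (eigenvalues in [0, 2]), this explicit scheme stays bounded roughly when (c*dt)^2 * lambda_max <= 4, a classical CFL-type condition; the paper's stability claims are proved for its own discretization rather than this toy one.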
Related papers
- Generating Graphs via Spectral Diffusion [48.70458395826864]
We present GGSD, a novel graph generative model based on 1) the spectral decomposition of the graph Laplacian matrix and 2) a diffusion process. An extensive set of experiments on both synthetic and real-world graphs demonstrates the strengths of our model against state-of-the-art alternatives.
arXiv Detail & Related papers (2024-02-29T09:26:46Z)
- Beyond Spatio-Temporal Representations: Evolving Fourier Transform for Temporal Graphs [5.752149463974228]
We present the first invertible spectral transform that captures evolving representations on temporal graphs.
We develop a simple neural model induced with EFT for capturing evolving graph spectra.
We empirically validate our theoretical findings on a number of large-scale and standard temporal graph benchmarks.
arXiv Detail & Related papers (2024-02-25T13:05:25Z)
- Learning graph geometry and topology using dynamical systems based message-passing [21.571006438656323]
We introduce DYMAG: a message passing paradigm for GNNs built on the expressive power of graph-dynamics.
DYMAG makes use of complex graph dynamics based on the heat and wave equation as well as a more complex equation which admits chaotic solutions.
We demonstrate that DYMAG achieves superior performance in recovering the generating parameters of Erdős-Rényi and block random graphs.
arXiv Detail & Related papers (2023-09-18T16:39:51Z)
- Dynamic Causal Explanation Based Diffusion-Variational Graph Neural Network for Spatio-temporal Forecasting [60.03169701753824]
We propose a novel Dynamic Diffusion-Variational Graph Neural Network (DVGNN) for spatio-temporal forecasting.
The proposed DVGNN model outperforms state-of-the-art approaches and achieves outstanding Root Mean Squared Error results.
arXiv Detail & Related papers (2023-05-16T11:38:19Z)
- Non-separable Spatio-temporal Graph Kernels via SPDEs [69.4678086015418]
A lack of justified graph kernels for principled spatio-temporal modelling has held back their use in graph problems. We leverage a link between stochastic partial differential equations (SPDEs) and non-separable spatio-temporal graph kernels, and introduce a framework for deriving graph kernels via SPDEs. We show that by providing novel tools for GP modelling on graphs, we outperform pre-existing graph kernels in real-world applications.
arXiv Detail & Related papers (2021-11-16T14:53:19Z)
- Predicting traffic signals on transportation networks using spatio-temporal correlations on graphs [56.48498624951417]
This paper proposes a traffic propagation model that merges multiple heat diffusion kernels into a data-driven prediction model to forecast traffic signals.
We optimize the model parameters using Bayesian inference to minimize the prediction errors and, consequently, determine the mixing ratio of the two approaches.
The proposed model demonstrates prediction accuracy comparable to that of the state-of-the-art deep neural networks with lower computational effort.
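As a rough illustration of the kernel-mixing idea described above, the sketch below combines two graph heat kernels exp(-t L) at different diffusion times with a convex weight chosen by grid search; the two-kernel setup, the diffusion times, and the least-squares fit are stand-ins for the paper's multi-kernel model and Bayesian inference.

```python
import numpy as np
from scipy.linalg import expm

def heat_kernel(L, t):
    """Graph heat kernel K_t = exp(-t L) for a combinatorial Laplacian L."""
    return expm(-t * L)

def fit_mixing_ratio(L, x_obs, y_obs, t_fast=0.1, t_slow=1.0):
    """Grid-search the convex weight w mapping x_obs to y_obs through
    w * K_fast + (1 - w) * K_slow (a stand-in for Bayesian inference)."""
    K_fast, K_slow = heat_kernel(L, t_fast), heat_kernel(L, t_slow)
    best_w, best_err = 0.0, np.inf
    for w in np.linspace(0.0, 1.0, 101):
        pred = (w * K_fast + (1 - w) * K_slow) @ x_obs
        err = float(np.sum((pred - y_obs) ** 2))
        if err < best_err:
            best_w, best_err = w, err
    return best_w

# Toy usage: recover a mixing weight on a 4-node path graph
adj = np.array([[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0]], dtype=float)
L = np.diag(adj.sum(axis=1)) - adj          # combinatorial Laplacian
x = np.random.randn(4)                      # current traffic signal
y = heat_kernel(L, 0.4) @ x                 # synthetic "next" signal
print(fit_mixing_ratio(L, x, y))
```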
arXiv Detail & Related papers (2021-04-27T18:17:42Z)
- Hyperbolic Variational Graph Neural Network for Modeling Dynamic Graphs [77.33781731432163]
We learn dynamic graph representations in hyperbolic space for the first time, aiming to infer node representations.
We present a novel Hyperbolic Variational Graph Network, referred to as HVGNN.
In particular, to model the dynamics, we introduce a Temporal GNN (TGNN) based on a theoretically grounded time encoding approach.
arXiv Detail & Related papers (2021-04-06T01:44:15Z)
- Graph-Time Convolutional Neural Networks [9.137554315375919]
We represent spatial relationships through product graphs with a first-principles graph-time convolutional neural network (GTCNN).
We develop a graph-time convolutional filter by following the shift-and-sum spatiotemporal operator to learn higher-level features over the product graph (a toy product-graph filter sketch follows below).
We develop a zero-pad pooling that preserves the spatial graph while reducing the number of active nodes and the parameters.
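To illustrate the product-graph filtering described in this entry, the sketch below applies a polynomial (shift-and-sum) filter over a Cartesian product of a spatial graph shift and a directed temporal shift; the Cartesian product choice, filter order, and coefficients are illustrative assumptions rather than GTCNN's exact parameterization.

```python
import numpy as np

def cartesian_product_shift(S_graph, S_time):
    """Shift operator of the Cartesian product graph: S = S_G (x) I_T + I_N (x) S_T."""
    n, t = S_graph.shape[0], S_time.shape[0]
    return np.kron(S_graph, np.eye(t)) + np.kron(np.eye(n), S_time)

def graph_time_filter(S, x, coeffs):
    """Polynomial graph-time filter y = sum_k h_k S^k x (shift-and-sum)."""
    y = np.zeros_like(x)
    z = x.copy()
    for h in coeffs:
        y += h * z
        z = S @ z
    return y

# Toy usage: 3 spatial nodes observed over 4 time steps (node-major vectorization)
A = np.array([[0, 1, 1], [1, 0, 0], [1, 0, 0]], dtype=float)   # spatial adjacency
T = np.eye(4, k=-1)                                            # directed time shift (one-step delay)
S = cartesian_product_shift(A, T)
x = np.random.randn(3 * 4)
print(graph_time_filter(S, x, coeffs=[0.5, 0.3, 0.2]).shape)   # (12,)
```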
arXiv Detail & Related papers (2021-03-02T14:03:44Z)
- From Spectrum Wavelet to Vertex Propagation: Graph Convolutional Networks Based on Taylor Approximation [85.47548256308515]
Graph convolutional networks (GCN) have been recently utilized to extract the underlying structures of datasets with some labeled data and high-dimensional features.
Existing GCNs mostly rely on a first-order Chebyshev approximation of graph wavelet kernels (a minimal sketch of this propagation rule appears after this entry).
arXiv Detail & Related papers (2020-07-01T20:07:13Z)
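The propagation sketch referenced in the entry above shows the standard first-order Chebyshev (GCN) rule H' = ReLU(D^{-1/2} (A + I) D^{-1/2} H W); the renormalization trick with self-loops and the ReLU nonlinearity are the usual choices, given here for orientation rather than as the cited paper's Taylor-based variant.

```python
import numpy as np

def gcn_layer(adj, H, W, add_self_loops=True):
    """One first-order Chebyshev (GCN) propagation step:
    H' = ReLU(D^{-1/2} A_hat D^{-1/2} H W), with A_hat = A + I (self-loops)."""
    A_hat = adj + np.eye(adj.shape[0]) if add_self_loops else adj
    d_inv_sqrt = A_hat.sum(axis=1) ** -0.5
    A_norm = d_inv_sqrt[:, None] * A_hat * d_inv_sqrt[None, :]
    return np.maximum(A_norm @ H @ W, 0.0)   # ReLU

# Toy usage: 4 nodes, 3 input features, 2 output features
adj = np.array([[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0]], dtype=float)
H = np.random.randn(4, 3)
W = np.random.randn(3, 2)
print(gcn_layer(adj, H, W).shape)            # (4, 2)
```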