Hyperbolic-PDE GNN: Spectral Graph Neural Networks in the Perspective of A System of Hyperbolic Partial Differential Equations
- URL: http://arxiv.org/abs/2505.23014v1
- Date: Thu, 29 May 2025 02:49:26 GMT
- Title: Hyperbolic-PDE GNN: Spectral Graph Neural Networks in the Perspective of A System of Hyperbolic Partial Differential Equations
- Authors: Juwei Yue, Haikuo Li, Jiawei Sheng, Xiaodong Li, Taoyu Su, Tingwen Liu, Li Guo
- Abstract summary: Graph neural networks (GNNs) leverage message passing mechanisms to learn the topological features of graph data. We formulate message passing as a system of hyperbolic partial differential equations (hyperbolic PDEs). We establish a connection with spectral graph neural networks (spectral GNNs), serving as a message passing enhancement paradigm for spectral GNNs.
- Score: 17.919550332541963
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph neural networks (GNNs) leverage message passing mechanisms to learn the topological features of graph data. Traditional GNNs learn node features in a spatial domain unrelated to the topology, which can hardly preserve topological features. In this paper, we formulate message passing as a system of hyperbolic partial differential equations (hyperbolic PDEs), constituting a dynamical system that explicitly maps node representations into a particular solution space. This solution space is spanned by a set of eigenvectors describing the topological structure of graphs. Within this system, at any moment in time, a node's features can be decomposed into a superposition of the eigenvector basis. This not only enhances the interpretability of message passing but also enables the explicit extraction of fundamental characteristics of the topological structure. Furthermore, by solving this system of hyperbolic partial differential equations, we establish a connection with spectral graph neural networks (spectral GNNs), serving as a message passing enhancement paradigm for spectral GNNs. We further introduce polynomials to approximate arbitrary filter functions. Extensive experiments demonstrate that the paradigm of hyperbolic PDEs not only exhibits strong flexibility but also significantly enhances the performance of various spectral GNNs across diverse graph tasks.
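The two spectral ingredients of the abstract, decomposing node features into a superposition of Laplacian eigenvectors and applying a polynomial filter function, can be illustrated with a minimal NumPy sketch. This is a toy example, not the authors' implementation; the graph, feature vector, and filter coefficients are all hypothetical:

```python
import numpy as np

# Toy graph: a 4-cycle (hypothetical example).
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A      # combinatorial graph Laplacian

# Eigendecomposition: the columns of U span the solution space.
eigvals, U = np.linalg.eigh(L)

# One scalar feature channel per node.
x = np.array([1.0, 2.0, 3.0, 4.0])

# Decompose x into a superposition of the eigenvector basis.
coeffs = U.T @ x                    # spectral coefficients
assert np.allclose(U @ coeffs, x)   # x is exactly recovered from the basis

# Polynomial spectral filter g(lambda) = c0 + c1*lambda + c2*lambda^2,
# applied as g(L) x = U g(Lambda) U^T x.
c = [0.5, -0.3, 0.1]
g = c[0] + c[1] * eigvals + c[2] * eigvals**2
x_filtered = U @ (g * coeffs)

# Equivalent spatial-domain computation via a matrix polynomial in L:
# this equivalence is why polynomials can approximate filter functions
# without an explicit eigendecomposition.
x_poly = c[0] * x + c[1] * (L @ x) + c[2] * (L @ (L @ x))
assert np.allclose(x_filtered, x_poly)
```

The final assertion checks the identity g(L) = U g(Λ) Uᵀ, which underlies polynomial approximations of arbitrary filter functions.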
Related papers
- Large-Scale Spectral Graph Neural Networks via Laplacian Sparsification: Technical Report [21.288230563135055]
We propose a novel graph spectral sparsification method to approximate the propagation patterns of spectral Graph Neural Networks (GNNs). Our method allows the application of linear layers on the input node features, enabling end-to-end training as well as the handling of raw features.
arXiv Detail & Related papers (2025-01-08T15:36:19Z)
- On the Expressive Power of Spectral Invariant Graph Neural Networks [28.557550571187253]
We introduce a unified message-passing framework for designing spectral invariant GNNs, called Eigenspace Projection GNN (EPNN)
We show that EPNN essentially unifies all prior spectral invariant architectures, in that they are either strictly less expressive or equivalent to EPNN.
We discuss whether using spectral features can gain additional expressiveness when combined with more expressive GNNs.
arXiv Detail & Related papers (2024-06-06T17:59:41Z)
- Contextualized Messages Boost Graph Representations [1.5178009359320295]
A graph convolution network (SIR-GCN) that emphasizes the contextualized transformation of neighborhood feature representations is proposed. Experiments on synthetic and benchmark datasets demonstrate the superiority of SIR-GCN in property prediction tasks.
arXiv Detail & Related papers (2024-03-19T08:05:49Z)
- Supercharging Graph Transformers with Advective Diffusion [28.40109111316014]
This paper proposes AdvDIFFormer, a physics-inspired graph Transformer model designed to address this challenge. We show that AdvDIFFormer has provable capability for controlling generalization error with topological shifts. Empirically, the model demonstrates superiority in various predictive tasks across information networks, molecular screening and protein interactions.
arXiv Detail & Related papers (2023-10-10T08:40:47Z)
- Spectral Heterogeneous Graph Convolutions via Positive Noncommutative Polynomials [34.74726720818622]
We present Positive Spectral Heterogeneous Graph Convolutional Network (PSHGCN)
PSHGCN offers a simple yet effective method for learning valid heterogeneous graph filters.
PSHGCN exhibits remarkable scalability, efficiently handling large real-world graphs comprising millions of nodes and edges.
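A "valid" graph filter in this setting is one that is positive semidefinite, and a sum-of-squares construction guarantees that property by design. The following is a minimal NumPy sketch of the idea, not the PSHGCN code; the graph and coefficients are hypothetical:

```python
import numpy as np

# Toy graph Laplacian: a path graph on 4 nodes (hypothetical example).
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A

# An arbitrary polynomial in L need not be positive semidefinite...
c = np.array([-0.5, 1.0])
P = c[0] * np.eye(4) + c[1] * L
assert np.linalg.eigvalsh(P).min() < 0   # P itself has a negative eigenvalue

# ...but its "square" P^T P always is, yielding a valid (nonnegative) filter
# regardless of how the coefficients c are learned.
G = P.T @ P
assert (np.linalg.eigvalsh(G) >= -1e-9).all()
```

The design choice here is that validity is enforced structurally, so the coefficients can be trained freely without any projection step.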
arXiv Detail & Related papers (2023-05-31T14:09:42Z)
- Feature Expansion for Graph Neural Networks [26.671557021142572]
We decompose graph neural networks into determined feature spaces and trainable weights.
We theoretically find that the feature space tends to be linearly correlated due to repeated aggregations.
Motivated by these findings, we propose 1) feature subspaces flattening and 2) structural principal components to expand the feature space.
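The claim that repeated aggregation makes the feature space linearly correlated can be observed numerically. The sketch below (a hypothetical random-graph setup in NumPy, not the authors' code) stacks repeatedly aggregated features and checks that successive columns become almost perfectly aligned:

```python
import numpy as np

rng = np.random.default_rng(0)

# Random symmetric graph with self-loops (hypothetical setup).
n = 50
A = (rng.random((n, n)) < 0.3).astype(float)
A = np.maximum(A, A.T)                # symmetrize
np.fill_diagonal(A, 1.0)              # add self-loops
d = A.sum(axis=1)
A_hat = A / np.sqrt(np.outer(d, d))   # D^{-1/2} A D^{-1/2}

# Repeatedly aggregated features: columns [x, A_hat x, A_hat^2 x, ...]
x = rng.standard_normal(n)
K = 10
feats = np.empty((n, K))
feats[:, 0] = x
for k in range(1, K):
    feats[:, k] = A_hat @ feats[:, k - 1]

# Successive aggregations align with the dominant eigenvector, so the
# columns spanning the feature space become (nearly) linearly correlated.
def cos(u, v):
    return abs(u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))

sims = [cos(feats[:, k], feats[:, k + 1]) for k in range(K - 1)]
assert sims[-1] > 0.99   # later aggregations are almost perfectly correlated
```

This is the power-iteration effect: each multiplication by the normalized adjacency shrinks every spectral component relative to the dominant one, which motivates expanding the feature space rather than aggregating deeper.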
arXiv Detail & Related papers (2023-05-10T13:45:57Z)
- On the Expressiveness and Generalization of Hypergraph Neural Networks [77.65788763444877]
This extended abstract describes a framework for analyzing the expressiveness, learning, and (structural) generalization of hypergraph neural networks (HyperGNNs)
Specifically, we focus on how HyperGNNs can learn from finite datasets and generalize structurally to graph reasoning problems of arbitrary input sizes.
arXiv Detail & Related papers (2023-03-09T18:42:18Z)
- Relation Embedding based Graph Neural Networks for Handling Heterogeneous Graph [58.99478502486377]
We propose a simple yet efficient framework to make the homogeneous GNNs have adequate ability to handle heterogeneous graphs.
Specifically, we propose Relation Embedding based Graph Neural Networks (RE-GNNs), which employ only one parameter per relation to embed the importance of edge type relations and self-loop connections.
arXiv Detail & Related papers (2022-09-23T05:24:18Z)
- Dist2Cycle: A Simplicial Neural Network for Homology Localization [66.15805004725809]
Simplicial complexes can be viewed as high dimensional generalizations of graphs that explicitly encode multi-way ordered relations.
We propose a graph convolutional model for learning functions parametrized by the $k$-homological features of simplicial complexes.
arXiv Detail & Related papers (2021-10-28T14:59:41Z)
- ACE-HGNN: Adaptive Curvature Exploration Hyperbolic Graph Neural Network [72.16255675586089]
We propose an Adaptive Curvature Exploration Hyperbolic Graph Neural Network, named ACE-HGNN, to adaptively learn the optimal curvature according to the input graph and downstream tasks.
Experiments on multiple real-world graph datasets demonstrate significant and consistent improvements in model quality, with competitive performance and good generalization ability.
arXiv Detail & Related papers (2021-10-15T07:18:57Z)
- Spectral-Spatial Global Graph Reasoning for Hyperspectral Image Classification [50.899576891296235]
Convolutional neural networks have been widely applied to hyperspectral image classification.
Recent methods attempt to address this issue by performing graph convolutions on spatial topologies.
arXiv Detail & Related papers (2021-06-26T06:24:51Z)
- Multipole Graph Neural Operator for Parametric Partial Differential Equations [57.90284928158383]
One of the main challenges in using deep learning-based methods for simulating physical systems is formulating physics-based data.
We propose a novel multi-level graph neural network framework that captures interaction at all ranges with only linear complexity.
Experiments confirm our multi-graph network learns discretization-invariant solution operators to PDEs and can be evaluated in linear time.
arXiv Detail & Related papers (2020-06-16T21:56:22Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this content (including all information) and is not responsible for any consequences of its use.