Spectro-Riemannian Graph Neural Networks
- URL: http://arxiv.org/abs/2502.00401v1
- Date: Sat, 01 Feb 2025 11:31:01 GMT
- Title: Spectro-Riemannian Graph Neural Networks
- Authors: Karish Grover, Haiyang Yu, Xiang Song, Qi Zhu, Han Xie, Vassilis N. Ioannidis, Christos Faloutsos,
- Abstract summary: We propose the first graph representation learning paradigm that unifies CUrvature (geometric) and SPectral insights.
Cusp Laplacian is an extension of the traditional graph Laplacian based on Ollivier-Ricci curvature.
Cusp Pooling is a hierarchical attention mechanism combined with a curvature-based positional encoding.
Empirical evaluation across eight homophilic and heterophilic datasets demonstrates the superiority of CUSP in node classification and link prediction tasks.
- Score: 39.901731107377095
- Abstract: Can integrating spectral and curvature signals unlock new potential in graph representation learning? Non-Euclidean geometries, particularly Riemannian manifolds such as hyperbolic (negative curvature) and spherical (positive curvature), offer powerful inductive biases for embedding complex graph structures like scale-free, hierarchical, and cyclic patterns. Meanwhile, spectral filtering excels at processing signal variations across graphs, making it effective in homophilic and heterophilic settings. Leveraging both can significantly enhance the learned representations. To this end, we propose Spectro-Riemannian Graph Neural Networks (CUSP) - the first graph representation learning paradigm that unifies both CUrvature (geometric) and SPectral insights. CUSP is a mixed-curvature spectral GNN that learns spectral filters to optimize node embeddings in products of constant-curvature manifolds (hyperbolic, spherical, and Euclidean). Specifically, CUSP introduces three novel components: (a) Cusp Laplacian, an extension of the traditional graph Laplacian based on Ollivier-Ricci curvature, designed to capture the curvature signals better; (b) Cusp Filtering, which employs multiple Riemannian graph filters to obtain cues from various bands in the eigenspectrum; and (c) Cusp Pooling, a hierarchical attention mechanism combined with a curvature-based positional encoding to assess the relative importance of differently curved substructures in our graph. Empirical evaluation across eight homophilic and heterophilic datasets demonstrates the superiority of CUSP in node classification and link prediction tasks, with a gain of up to 5.3% over state-of-the-art models.
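To make the abstract's pipeline concrete, the sketch below gives a rough, illustrative reading of two of its ingredients: a curvature-reweighted graph Laplacian and a low-pass spectral filter applied to node features. It is not the paper's implementation: the edge reweighting exp(-kappa), the low-pass cutoff, and all function names here are assumptions chosen for illustration, while the actual Cusp Laplacian, Cusp Filtering (multiple Riemannian filter banks), and Cusp Pooling operate in products of constant-curvature manifolds as defined in the paper.

```python
# Minimal sketch, not the paper's implementation. Assumptions:
#   * Ollivier-Ricci edge curvatures are already computed (kappa below is made up).
#   * A simple reweighting w_ij = a_ij * exp(-kappa_ij), chosen only to
#     illustrate "curvature-aware" edge weights, not the paper's formula.
#   * A plain Euclidean low-pass filter, standing in for Cusp Filtering.
import numpy as np


def curvature_weighted_laplacian(adj: np.ndarray, kappa: np.ndarray) -> np.ndarray:
    """Build a symmetric, curvature-reweighted graph Laplacian L = D - W."""
    # Negatively curved (tree-like / bridge) edges receive larger weights.
    weights = adj * np.exp(-kappa)
    degree = np.diag(weights.sum(axis=1))
    return degree - weights


def low_pass_filter(laplacian: np.ndarray, x: np.ndarray, cutoff: float = 0.5) -> np.ndarray:
    """Keep only the smooth eigencomponents (eigenvalue <= cutoff * lambda_max)."""
    eigvals, eigvecs = np.linalg.eigh(laplacian)
    keep = eigvals <= cutoff * eigvals.max()
    coeffs = eigvecs.T @ x                  # graph Fourier transform
    return eigvecs[:, keep] @ coeffs[keep]  # inverse transform of the kept band


if __name__ == "__main__":
    # Toy 4-node path graph with made-up curvature values.
    adj = np.array([[0., 1., 0., 0.],
                    [1., 0., 1., 0.],
                    [0., 1., 0., 1.],
                    [0., 0., 1., 0.]])
    kappa = np.array([[0., -0.2, 0., 0.],
                      [-0.2, 0., 0.1, 0.],
                      [0., 0.1, 0., -0.3],
                      [0., 0., -0.3, 0.]])
    x = np.random.randn(4, 3)               # node features
    L = curvature_weighted_laplacian(adj, kappa)
    print(low_pass_filter(L, x).shape)      # -> (4, 3)
```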
Related papers
- Point Cloud Denoising With Fine-Granularity Dynamic Graph Convolutional Networks [58.050130177241186]
Noise perturbations often corrupt 3-D point clouds, hindering downstream tasks such as surface reconstruction, rendering, and further processing.
This paper introduces fine-granularity dynamic graph convolutional networks, called GDGCN, a novel approach to denoising 3-D point clouds.
arXiv Detail & Related papers (2024-11-21T14:19:32Z)
- Spectral Graph Reasoning Network for Hyperspectral Image Classification [0.43512163406551996]
Convolutional neural networks (CNNs) have achieved remarkable performance in hyperspectral image (HSI) classification.
We propose a spectral graph reasoning network (SGR) learning framework comprising two crucial modules.
Experiments on two HSI datasets demonstrate that the proposed architecture can significantly improve the classification accuracy.
arXiv Detail & Related papers (2024-07-02T20:29:23Z)
- Shape-aware Graph Spectral Learning [36.63516222161871]
Spectral Graph Neural Networks (GNNs) are gaining attention for their ability to surpass the limitations of message-passing GNNs.
Some works empirically show that the preferred graph frequency is related to the graph homophily level.
However, this relationship between graph frequency and the homophily/heterophily level has not been systematically analyzed or accounted for in existing spectral GNNs.
We propose shape-aware regularization on a Newton Interpolation-based spectral filter that can learn an arbitrary filter shape while incorporating prior knowledge about the shape desired for the corresponding homophily level.
arXiv Detail & Related papers (2023-10-16T04:57:30Z)
- HoloNets: Spectral Convolutions do extend to Directed Graphs [59.851175771106625]
Conventional wisdom dictates that spectral convolutional networks may only be deployed on undirected graphs.
Here we show this traditional reliance on the graph Fourier transform to be superfluous.
We provide a frequency-response interpretation of newly developed filters, investigate the influence of the basis used to express filters and discuss the interplay with characteristic operators on which networks are based.
arXiv Detail & Related papers (2023-10-03T17:42:09Z)
- Geometric Graph Filters and Neural Networks: Limit Properties and Discriminability Trade-offs [122.06927400759021]
We study the relationship between a graph neural network (GNN) and a manifold neural network (MNN) when the graph is constructed from a set of points sampled from the manifold.
We prove non-asymptotic error bounds showing that convolutional filters and neural networks on these graphs converge to convolutional filters and neural networks on the continuous manifold.
arXiv Detail & Related papers (2023-05-29T08:27:17Z)
- Graph Contrastive Learning under Heterophily via Graph Filters [51.46061703680498]
Graph contrastive learning (CL) methods learn node representations in a self-supervised manner by maximizing the similarity between the augmented node representations obtained via a GNN-based encoder.
In this work, we propose an effective graph CL method, namely HLCL, for learning graph representations under heterophily.
Our extensive experiments show that HLCL outperforms state-of-the-art graph CL methods on benchmark datasets with heterophily, as well as large-scale real-world graphs, by up to 7%, and outperforms graph supervised learning methods on datasets with heterophily by up to 10%.
arXiv Detail & Related papers (2023-03-11T08:32:39Z)
- Pointspectrum: Equivariance Meets Laplacian Filtering for Graph Representation Learning [3.7875603451557063]
Graph Representation Learning (GRL) has become essential for modern graph data mining and learning tasks.
While Graph Neural Networks (GNNs) have been used in state-of-the-art GRL architectures, they have been shown to suffer from oversmoothing.
We propose PointSpectrum, a spectral method that incorporates a set equivariant network to account for a graph's structure.
arXiv Detail & Related papers (2021-09-06T10:59:11Z)
- Spectral-Spatial Global Graph Reasoning for Hyperspectral Image Classification [50.899576891296235]
Convolutional neural networks have been widely applied to hyperspectral image classification.
Recent methods attempt to address the limitations of CNNs by performing graph convolutions on spatial topologies.
arXiv Detail & Related papers (2021-06-26T06:24:51Z)
- Gaussian Processes on Graphs via Spectral Kernel Learning [9.260186030255081]
We propose a graph spectrum-based Gaussian process for prediction of signals defined on nodes of the graph.
We demonstrate the interpretability of the model in synthetic experiments, in which we show that various ground-truth spectral filters can be accurately recovered.
arXiv Detail & Related papers (2020-06-12T17:51:22Z)