On the Expressive Power of Spectral Invariant Graph Neural Networks
- URL: http://arxiv.org/abs/2406.04336v1
- Date: Thu, 6 Jun 2024 17:59:41 GMT
- Title: On the Expressive Power of Spectral Invariant Graph Neural Networks
- Authors: Bohang Zhang, Lingxiao Zhao, Haggai Maron
- Abstract summary: We introduce a unified message-passing framework for designing spectral invariant GNNs, called Eigenspace Projection GNN (EPNN).
We show that EPNN essentially unifies all prior spectral invariant architectures, in that they are either strictly less expressive or equivalent to EPNN.
We discuss whether using spectral features can gain additional expressiveness when combined with more expressive GNNs.
- Score: 28.557550571187253
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Incorporating spectral information to enhance Graph Neural Networks (GNNs) has shown promising results but raises a fundamental challenge due to the inherent ambiguity of eigenvectors. Various architectures have been proposed to address this ambiguity, referred to as spectral invariant architectures. Notable examples include GNNs and Graph Transformers that use spectral distances, spectral projection matrices, or other invariant spectral features. However, the potential expressive power of these spectral invariant architectures remains largely unclear. The goal of this work is to gain a deep theoretical understanding of the expressive power obtainable when using spectral features. We first introduce a unified message-passing framework for designing spectral invariant GNNs, called Eigenspace Projection GNN (EPNN). A comprehensive analysis shows that EPNN essentially unifies all prior spectral invariant architectures, in that they are either strictly less expressive or equivalent to EPNN. A fine-grained expressiveness hierarchy among different architectures is also established. On the other hand, we prove that EPNN itself is bounded by a recently proposed class of Subgraph GNNs, implying that all these spectral invariant architectures are strictly less expressive than 3-WL. Finally, we discuss whether using spectral features can gain additional expressiveness when combined with more expressive GNNs.
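To make the eigenspace-projection idea concrete, the sketch below computes the projection matrix onto each Laplacian eigenspace. These projectors are invariant to sign flips and rotations of eigenvectors within an eigenspace, which is exactly the ambiguity the abstract refers to, and their pairwise entries can serve as edge features for an EPNN-style message-passing layer. The function name, the tolerance-based grouping of eigenvalues, and the suggested usage are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def eigenspace_projections(adj, tol=1e-8):
    """Projection matrices onto the eigenspaces of the graph Laplacian.

    Returns an array of shape (num_eigenspaces, n, n); the slice
    P[:, u, v] is a spectral-invariant feature for the node pair (u, v).
    """
    lap = np.diag(adj.sum(axis=1)) - adj           # combinatorial Laplacian
    eigvals, eigvecs = np.linalg.eigh(lap)         # ascending eigenvalues
    projections, start = [], 0
    for end in range(1, len(eigvals) + 1):
        # Close the current eigenspace once the eigenvalue changes.
        if end == len(eigvals) or eigvals[end] - eigvals[start] > tol:
            basis = eigvecs[:, start:end]
            projections.append(basis @ basis.T)    # basis-independent projector
            start = end
    return np.stack(projections)
```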
Related papers
- Spectral Graph Neural Networks are Incomplete on Graphs with a Simple Spectrum [2.916558661202724]
Spectral features are widely incorporated within Graph Neural Networks (GNNs) to improve their expressive power. We propose a method to provably improve SGNNs' expressivity on simple spectrum graphs.
arXiv Detail & Related papers (2025-06-05T19:26:29Z)
- Hyperbolic-PDE GNN: Spectral Graph Neural Networks in the Perspective of A System of Hyperbolic Partial Differential Equations [17.919550332541963]
Graph neural networks (GNNs) leverage message passing mechanisms to learn the topological features of graph data. We formulate message passing as a system of hyperbolic partial differential equations (hyperbolic PDEs) and establish a connection with spectral graph neural networks (spectral GNNs), serving as a message-passing enhancement paradigm for spectral GNNs.
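The paper's exact PDE system is not reproduced in this summary; as a rough illustration of what a hyperbolic (wave-like) message-passing update looks like, here is a leapfrog discretization of the graph wave equation d^2 x / dt^2 = -c^2 L x. The step count, constant c, and function name are our assumptions.

```python
import numpy as np

def graph_wave_step(adj, x0, steps=10, c=0.5):
    """Leapfrog discretization of d^2 x / dt^2 = -c^2 L x on a graph.

    Illustrative only: unlike diffusion (parabolic) updates, this
    hyperbolic update propagates information as waves.
    """
    lap = np.diag(adj.sum(axis=1)) - adj
    x_prev, x = x0.copy(), x0.copy()               # zero initial velocity
    for _ in range(steps):
        x_prev, x = x, 2 * x - x_prev - (c ** 2) * (lap @ x)
    return x
```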
arXiv Detail & Related papers (2025-05-29T02:49:26Z)
- Homomorphism Expressivity of Spectral Invariant Graph Neural Networks [25.040557677497745]
We prove that spectral invariant GNNs can exactly homomorphism-count a specific class of tree-like graphs, which we refer to as parallel trees.
Our results significantly extend Arvind et al. (2024) and settle their open questions.
We generalize our analysis to higher-order GNNs and answer an open question raised by Zhang et al. (2024).
arXiv Detail & Related papers (2025-03-01T13:23:49Z)
- Large-Scale Spectral Graph Neural Networks via Laplacian Sparsification: Technical Report [21.288230563135055]
We propose a novel graph spectral sparsification method to approximate the propagation patterns of spectral Graph Neural Networks (GNNs).
Our method allows the application of linear layers on the input node features, enabling end-to-end training as well as the handling of raw features.
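The paper's sparsifier is tailored to the propagation patterns of spectral GNNs; the snippet below only shows the generic template such methods build on: sample edges with some importance probability and reweight the survivors so the sparsified operator is unbiased in expectation. The simple Bernoulli sampler and all names here are illustrative assumptions.

```python
import numpy as np

def sparsify_edges(edges, weights, keep_probs, seed=0):
    """Generic one-shot graph sparsification by importance sampling.

    Keeping edge e with probability p_e and reweighting by w_e / p_e
    preserves the weighted Laplacian in expectation.
    """
    rng = np.random.default_rng(seed)
    kept = []
    for (u, v), w, p in zip(edges, weights, keep_probs):
        if rng.random() < p:
            kept.append(((u, v), w / p))           # inverse-probability weight
    return kept
```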
arXiv Detail & Related papers (2025-01-08T15:36:19Z)
- GrassNet: State Space Model Meets Graph Neural Network [57.62885438406724]
Graph State Space Network (GrassNet) is a novel graph neural network with theoretical support that provides a simple yet effective scheme for designing arbitrary graph spectral filters.
To the best of our knowledge, our work is the first to employ state space models (SSMs) for the design of GNN spectral filters.
Extensive experiments on nine public benchmarks reveal that GrassNet achieves superior performance in real-world graph modeling tasks.
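As a toy picture of what "SSMs as spectral filters" means, the sketch below scans a one-dimensional linear state space recurrence over the sorted eigenvalue sequence to produce one filter response per eigenvalue, then applies the filter in the eigenbasis. GrassNet's actual SSM parameterization is learned and considerably richer; every name and constant here is an assumption.

```python
import numpy as np

def ssm_spectral_filter(adj, x, a=0.9, b=0.2, c=1.0):
    """Scan a minimal linear SSM over the eigenvalue sequence, then
    filter the signal x as y = V diag(g) V^T x (toy illustration)."""
    lap = np.diag(adj.sum(axis=1)) - adj
    eigvals, eigvecs = np.linalg.eigh(lap)         # ascending eigenvalues
    state, responses = 0.0, np.empty_like(eigvals)
    for i, lam in enumerate(eigvals):              # h_i = a*h_{i-1} + b*lam_i
        state = a * state + b * lam
        responses[i] = c * state
    return eigvecs @ (responses * (eigvecs.T @ x))
```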
arXiv Detail & Related papers (2024-08-16T07:33:58Z)
- Rethinking Spectral Graph Neural Networks with Spatially Adaptive Filtering [31.595664867365322]
Spectral Graph Neural Networks (GNNs) are well-founded in the spectral domain, but their practical reliance on approximation implies a profound linkage to the spatial domain.
We establish a theoretical connection between spectral and spatial aggregation, unveiling an intrinsic interaction: spectral filtering implicitly transforms the original graph into an adapted new graph.
We propose a novel Spatially Adaptive Filtering (SAF) framework, which leverages this adapted graph for an auxiliary non-local aggregation.
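A minimal sketch of the "adapted new graph" in question, under our reading of the summary: a spectral filter g applied to the Laplacian induces an affinity matrix V g(Lambda) V^T, whose (generally dense, signed) entries can drive a non-local aggregation step. The filter in the usage comment is an illustrative assumption.

```python
import numpy as np

def spectrally_adapted_graph(adj, filter_fn):
    """Return A_hat = V g(Lambda) V^T, the new graph implicitly induced
    by the spectral filter g (sketch of the idea, not SAF itself)."""
    lap = np.diag(adj.sum(axis=1)) - adj
    eigvals, eigvecs = np.linalg.eigh(lap)
    return eigvecs @ np.diag(filter_fn(eigvals)) @ eigvecs.T

# e.g. a low-pass filter yields a smooth non-local affinity graph:
# new_adj = spectrally_adapted_graph(adj, lambda lam: np.exp(-lam))
```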
arXiv Detail & Related papers (2024-01-17T09:12:31Z)
- HoloNets: Spectral Convolutions do extend to Directed Graphs [59.851175771106625]
Conventional wisdom dictates that spectral convolutional networks may only be deployed on undirected graphs.
Here we show this traditional reliance on the graph Fourier transform to be superfluous.
We provide a frequency-response interpretation of newly developed filters, investigate the influence of the basis used to express filters and discuss the interplay with characteristic operators on which networks are based.
arXiv Detail & Related papers (2023-10-03T17:42:09Z)
- Towards Expressive Spectral-Temporal Graph Neural Networks for Time Series Forecasting [101.5022396668152]
Spectral-temporal graph neural networks are a promising abstraction underlying most time series forecasting models.
We establish a theoretical framework that unravels the expressive power of spectral-temporal GNNs.
Our findings pave the way for devising a broader array of provably expressive GNN-based models for time series.
arXiv Detail & Related papers (2023-05-11T05:56:38Z)
- Specformer: Spectral Graph Neural Networks Meet Transformers [51.644312964537356]
Spectral graph neural networks (GNNs) learn graph representations via spectral-domain graph convolutions.
We introduce Specformer, which effectively encodes the set of all eigenvalues and performs self-attention in the spectral domain.
By stacking multiple Specformer layers, one can build a powerful spectral GNN.
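A stripped-down sketch of self-attention over the eigenvalue set, following the summary's description: each eigenvalue is embedded as a token, the tokens attend to one another, and the outputs can be decoded into a learned filter. The real Specformer layer adds eigenvalue encodings, multiple heads, and a decoder; the shapes and the example embedding below are assumptions.

```python
import numpy as np

def eigenvalue_self_attention(eigvals, embed, W_q, W_k, W_v):
    """One self-attention pass over the set of eigenvalues (illustrative)."""
    tokens = embed(eigvals)                        # (n, d) eigenvalue tokens
    Q, K, V = tokens @ W_q, tokens @ W_k, tokens @ W_v
    scores = Q @ K.T / np.sqrt(K.shape[1])
    scores -= scores.max(axis=1, keepdims=True)    # numerical stability
    attn = np.exp(scores)
    attn /= attn.sum(axis=1, keepdims=True)        # row-wise softmax
    return attn @ V                                # refined spectral tokens

# e.g. a 2-d sinusoidal embedding:
# embed = lambda lam: np.stack([np.sin(lam), np.cos(lam)], axis=1)
```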
arXiv Detail & Related papers (2023-03-02T07:36:23Z)
- Equivariant Polynomials for Graph Neural Networks [38.15983687193912]
Graph Neural Networks (GNNs) are inherently limited in their expressive power.
This paper introduces an alternative power hierarchy based on the ability of GNNs to calculate equivariant polynomials of a certain degree.
These enhanced GNNs demonstrate state-of-the-art results in experiments across multiple graph learning benchmarks.
arXiv Detail & Related papers (2023-02-22T18:53:38Z)
- Equivariant Transduction through Invariant Alignment [71.45263447328374]
We introduce a novel group-equivariant architecture that incorporates a group-invariant hard alignment mechanism.
We find that our network's structure allows it to develop stronger equivariant properties than existing group-equivariant approaches.
We additionally find that it outperforms previous group-equivariant networks empirically on the SCAN task.
arXiv Detail & Related papers (2022-09-22T11:19:45Z)
- How Powerful are Spectral Graph Neural Networks [9.594432031144715]
Spectral Graph Neural Networks are a kind of Graph Neural Network based on graph signal filters.
We first prove that even spectral GNNs without nonlinearity can produce arbitrary graph signals.
We also establish a connection between the expressive power of spectral GNNs and Graph Isomorphism (GI) testing.
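A worked instance of the linearity claim as we read it: when all n Laplacian eigenvalues are distinct, a degree-(n-1) polynomial filter sum_k alpha_k L^k can match any target frequency response exactly, because the Vandermonde system below is invertible. Repeated eigenvalues (and missing frequency components in the input) are precisely the obstructions such analyses must handle.

```python
import numpy as np

def fit_polynomial_filter(adj, target_response):
    """Solve for coefficients alpha with sum_k alpha_k * lam_i**k equal
    to target_response[i]; requires distinct eigenvalues (illustrative)."""
    lap = np.diag(adj.sum(axis=1)) - adj
    eigvals = np.linalg.eigvalsh(lap)
    vander = np.vander(eigvals, increasing=True)   # vander[i, k] = lam_i**k
    return np.linalg.solve(vander, target_response)
```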
arXiv Detail & Related papers (2022-05-23T10:22:12Z)
- Representation Power of Graph Neural Networks: Improved Expressivity via Algebraic Analysis [124.97061497512804]
We show that standard Graph Neural Networks (GNNs) produce more discriminative representations than the Weisfeiler-Lehman (WL) algorithm.
We also show that simple convolutional architectures with white inputs produce equivariant features that count the closed paths in the graph.
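A quick Monte-Carlo check of the white-input claim: for isotropic random inputs x with E[x x^T] = I, the feature x * (A^k x) has expectation diag(A^k), the closed-walk counts of length k at each node. The sampling setup and names are ours, for illustration.

```python
import numpy as np

def closed_walk_features(adj, k, num_samples=2000, seed=0):
    """Estimate diag(A^k) (closed walks of length k per node) from
    white random inputs, illustrating the equivariant counting claim."""
    rng = np.random.default_rng(seed)
    walk_matrix = np.linalg.matrix_power(adj, k)
    acc = np.zeros(adj.shape[0])
    for _ in range(num_samples):
        x = rng.standard_normal(adj.shape[0])      # white input: E[x x^T] = I
        acc += x * (walk_matrix @ x)
    return acc / num_samples                       # ~= np.diag(walk_matrix)
```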
arXiv Detail & Related papers (2022-05-19T18:40:25Z)
- Equivariant Subgraph Aggregation Networks [23.26140936226352]
While two graphs may not be distinguishable by a message-passing neural network (MPNN), they often contain distinguishable subgraphs.
This paper proposes a novel framework called Equivariant Subgraph Aggregation Networks (ESAN) to exploit this observation: each graph is represented as a set of subgraphs and processed with a suitable equivariant architecture.
We develop novel variants of the 1-dimensional Weisfeiler-Leman (1-WL) test for graph isomorphism, and prove lower bounds on the expressiveness of ESAN.
We provide theoretical results that describe how design choices such as the subgraph selection policy and equivariant neural architecture affect our architecture's expressive power.
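A minimal forward pass in the ESAN spirit, using one common subgraph selection policy (node deletion): run a shared model on every node-deleted subgraph and pool the results. The `mpnn` argument is a placeholder for any (adjacency, features) -> embedding function; ESAN's full architecture also processes the subgraph set jointly with equivariant layers.

```python
import numpy as np

def subgraph_aggregation_forward(adj, feats, mpnn):
    """Run mpnn on each node-deleted subgraph and mean-pool (sketch)."""
    n = adj.shape[0]
    embeddings = []
    for v in range(n):
        keep = np.arange(n) != v                   # delete node v
        sub_adj = adj[np.ix_(keep, keep)]
        embeddings.append(mpnn(sub_adj, feats[keep]))
    return np.mean(np.stack(embeddings), axis=0)   # invariant set pooling
```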
arXiv Detail & Related papers (2021-10-06T16:45:07Z)
- Spectral-Spatial Global Graph Reasoning for Hyperspectral Image Classification [50.899576891296235]
Convolutional neural networks have been widely applied to hyperspectral image classification.
Recent methods attempt to address this issue by performing graph convolutions on spatial topologies.
arXiv Detail & Related papers (2021-06-26T06:24:51Z)