Position: Message-passing and spectral GNNs are two sides of the same coin
- URL: http://arxiv.org/abs/2602.10031v1
- Date: Tue, 10 Feb 2026 17:53:40 GMT
- Title: Position: Message-passing and spectral GNNs are two sides of the same coin
- Authors: Antonis Vasileiou, Juan Cervino, Pascal Frossard, Charilaos I. Kanatsoulis, Christopher Morris, Michael T. Schaub, Pierre Vandergheynst, Zhiyang Wang, Guy Wolf, Ron Levie
- Abstract summary: Graph neural networks (GNNs) are commonly divided into message-passing neural networks (MPNNs) and spectral graph neural networks (spectral GNNs). This paper argues that this divide is mostly artificial, hindering progress in the field.
- Score: 60.47572761832418
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Graph neural networks (GNNs) are commonly divided into message-passing neural networks (MPNNs) and spectral graph neural networks, reflecting two largely separate research traditions in machine learning and signal processing. This paper argues that this divide is mostly artificial, hindering progress in the field. We propose a viewpoint in which both MPNNs and spectral GNNs are understood as different parametrizations of permutation-equivariant operators acting on graph signals. From this perspective, many popular architectures are equivalent in expressive power, while genuine gaps arise only in specific regimes. We further argue that MPNNs and spectral GNNs offer complementary strengths. That is, MPNNs provide a natural language for discrete structure and expressivity analysis using tools from logic and graph isomorphism research, while the spectral perspective provides principled tools for understanding smoothing, bottlenecks, stability, and community structure. Overall, we posit that progress in graph learning will be accelerated by clearly understanding the key similarities and differences between these two types of GNNs, and by working towards unifying these perspectives within a common theoretical and conceptual framework rather than treating them as competing paradigms.
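The abstract's central claim, that MPNNs and spectral GNNs are different parametrizations of the same permutation-equivariant operators, can be illustrated numerically. The sketch below (a minimal illustration, not taken from the paper) shows that a one-hop message-passing layer on a small graph coincides with a degree-1 polynomial spectral filter in the eigenbasis of the adjacency matrix:

```python
import numpy as np

# Tiny undirected graph: 4 nodes, adjacency matrix as the graph shift operator.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
rng = np.random.default_rng(0)
X = rng.standard_normal((4, 3))   # node features, 3 channels
W0 = rng.standard_normal((3, 2))  # weights on the identity (self) term
W1 = rng.standard_normal((3, 2))  # weights on the one-hop (neighbor) term

# Message-passing view: each node combines its own state with the
# sum of its neighbors' states (spatial aggregation).
mp_out = X @ W0 + (A @ X) @ W1

# Spectral view: diagonalize A = V diag(lam) V^T and filter each
# eigen-mode with the degree-1 matrix polynomial W0 + lam * W1.
lam, V = np.linalg.eigh(A)
spec_out = sum((V @ np.diag(lam**k) @ V.T) @ X @ W
               for k, W in enumerate([W0, W1]))

print(np.allclose(mp_out, spec_out))  # True: same operator, two parametrizations
```

Higher-degree polynomial filters correspond in the same way to deeper message-passing stacks; genuine differences between the two families only appear once nonlinearities and specific filter parametrizations are fixed.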
Related papers
- Topology-aware Neural Flux Prediction Guided by Physics [13.352980442733987]
Graph Neural Networks (GNNs) often struggle to preserve high-frequency components of nodal signals when dealing with directed graphs. This paper proposes a novel framework that combines 1) explicit difference matrices that model directional gradients with 2) implicit physical constraints that enforce message passing within GNNs to be consistent with natural laws.
arXiv Detail & Related papers (2025-06-06T02:01:50Z)
- The Correspondence Between Bounded Graph Neural Networks and Fragments of First-Order Logic [8.430502131775723]
We propose GNN architectures that correspond precisely to prominent fragments of first-order logic (FO). Our results provide a unifying framework for understanding the logical expressiveness of GNNs within FO.
arXiv Detail & Related papers (2025-05-12T19:45:45Z)
- A Manifold Perspective on the Statistical Generalization of Graph Neural Networks [84.01980526069075]
We take a manifold perspective to establish the statistical generalization theory of GNNs on graphs sampled from a manifold in the spectral domain. We prove that the generalization bounds of GNNs decrease linearly with the size of the graphs on a logarithmic scale, and increase linearly with the spectral continuity constants of the filter functions.
arXiv Detail & Related papers (2024-06-07T19:25:02Z)
- Generalization Limits of Graph Neural Networks in Identity Effects Learning [12.302336258860116]
Graph Neural Networks (GNNs) have emerged as a powerful tool for data-driven learning on various graph domains.
We establish new generalization properties and fundamental limits of GNNs in the context of learning so-called identity effects.
Our study is motivated by the need to understand the capabilities of GNNs when performing simple cognitive tasks.
arXiv Detail & Related papers (2023-06-30T20:56:38Z)
- Quantifying the Optimization and Generalization Advantages of Graph Neural Networks Over Multilayer Perceptrons [50.33260238739837]
Graph neural networks (GNNs) have demonstrated remarkable capabilities in learning from graph-structured data. There remains, however, a lack of analysis comparing GNNs and multilayer perceptrons (MLPs) from an optimization and generalization perspective.
arXiv Detail & Related papers (2023-06-24T10:21:11Z)
- Fine-grained Expressivity of Graph Neural Networks [15.766353435658043]
We consider continuous extensions of both $1$-WL and MPNNs to graphons.
We show that the continuous variant of $1$-WL delivers an accurate topological characterization of the expressive power of MPNNs on graphons.
We also evaluate different MPNN architectures based on their ability to preserve graph distances.
arXiv Detail & Related papers (2023-06-06T14:12:23Z)
- Privacy-Preserving Representation Learning for Text-Attributed Networks with Simplicial Complexes [24.82096971322501]
I will study learning network representations with text attributes for simplicial complexes (RT4SC) via simplicial neural networks (SNNs).
I will conduct research on two potential attacks on the representation outputs from SNNs.
I will study a privacy-preserving deterministic differentially private alternating direction method of multiplier to learn secure representation outputs from SNNs.
arXiv Detail & Related papers (2023-02-09T00:32:06Z)
- Graph Neural Networks with Learnable Structural and Positional Representations [83.24058411666483]
A major issue with arbitrary graphs is the absence of canonical positional information of nodes.
We introduce a positional encoding (PE) of nodes, and inject it into the input layer, as in Transformers.
We observe a performance increase for molecular datasets, from 2.87% up to 64.14% when considering learnable PE for both GNN classes.
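One common choice of node PE, which this line of work builds on, is the eigenvectors of the graph Laplacian. The sketch below (a hypothetical minimal illustration, not the paper's learnable scheme) shows how such encodings can be computed and concatenated to the input features:

```python
import numpy as np

# Tiny undirected graph: 4 nodes.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
D = np.diag(A.sum(axis=1))
L = D - A                                # combinatorial graph Laplacian

# Eigenvectors of L give coordinates that reflect graph position;
# skip the constant (lowest) eigenvector and keep the next k modes.
lam, V = np.linalg.eigh(L)
k = 2
pe = V[:, 1:k + 1]                       # (4, 2) positional encoding

rng = np.random.default_rng(0)
X = rng.standard_normal((4, 3))          # original node features
X_in = np.concatenate([X, pe], axis=1)   # input layer now carries PE channels
print(X_in.shape)                        # (4, 5)
```

The learnable PEs studied in the paper go further by updating positional representations layer by layer; the fixed Laplacian eigenvectors above are only the usual initialization.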
arXiv Detail & Related papers (2021-10-15T05:59:15Z)
- Binarized Graph Neural Network [65.20589262811677]
We develop a binarized graph neural network to learn the binary representations of the nodes with binary network parameters.
Our proposed method can be seamlessly integrated into the existing GNN-based embedding approaches.
Experiments indicate that the proposed binarized graph neural network, namely BGN, is orders of magnitude more efficient in terms of both time and space.
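The space savings come from storing each embedding dimension as a single sign bit rather than a float. The sketch below (a hypothetical illustration of the storage/speed intuition only; BGN's actual training with binary network parameters is not shown) binarizes embeddings and scores similarity with cheap integer dot products:

```python
import numpy as np

rng = np.random.default_rng(0)
H = rng.standard_normal((1000, 64))   # real-valued node embeddings (float64)
B = np.sign(H).astype(np.int8)        # binarized to +/-1; packable to 64 bits/node

# Similarity with binary codes is an integer dot product (equivalently,
# an affine function of Hamming distance) instead of float arithmetic.
scores = B @ B[0]                     # agreement counts with node 0, in [-64, 64]
print(B.dtype, scores.shape)
```

Packing the +/-1 codes into bitfields and using popcount instructions is where the orders-of-magnitude gains reported in such work typically come from.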
arXiv Detail & Related papers (2020-04-19T09:43:14Z)
- Bridging the Gap between Spatial and Spectral Domains: A Survey on Graph Neural Networks [52.76042362922247]
Graph neural networks (GNNs) are designed to handle the non-Euclidean structure of graphs.
Existing GNNs are presented using various techniques, making direct comparison and cross-referencing difficult.
We organize existing GNNs into spatial and spectral domains and expose the connections within each domain.
arXiv Detail & Related papers (2020-02-27T01:15:10Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information above and is not responsible for any consequences of its use.