The Generalized Skew Spectrum of Graphs
- URL: http://arxiv.org/abs/2505.23609v1
- Date: Thu, 29 May 2025 16:18:01 GMT
- Title: The Generalized Skew Spectrum of Graphs
- Authors: Armando Bellante, Martin Plávala, Alessandro Luongo
- Abstract summary: We introduce a new class of graph invariants that are isomorphism-invariant and capable of embedding richer graph structures. By applying generalization-preserving heuristics to this family, we improve the Skew Spectrum's expressivity at the same computational cost.
- Score: 42.70026220176376
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This paper proposes a family of permutation-invariant graph embeddings, generalizing the Skew Spectrum of graphs of Kondor & Borgwardt (2008). Grounded in group theory and harmonic analysis, our method introduces a new class of graph invariants that are isomorphism-invariant and capable of embedding richer graph structures - including attributed graphs, multilayer graphs, and hypergraphs - which the Skew Spectrum could not handle. Our generalization further defines a family of functions that enables a trade-off between computational complexity and expressivity. By applying generalization-preserving heuristics to this family, we improve the Skew Spectrum's expressivity at the same computational cost. We formally prove the invariance of our generalization, demonstrate its improved expressiveness through experiments, and discuss its efficient computation.
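As a minimal illustration of what permutation (isomorphism) invariance means for a graph embedding, the sketch below checks that a simple spectral embedding, the sorted eigenvalues of the adjacency matrix, is unchanged when the nodes are relabelled. This toy invariant is not the Skew Spectrum or the paper's generalization (the adjacency spectrum is strictly weaker, since non-isomorphic cospectral graphs exist); it only illustrates the invariance property the paper proves for its embeddings.

```python
import numpy as np

def spectral_embedding(adj: np.ndarray) -> np.ndarray:
    """Sorted adjacency eigenvalues: a simple isomorphism-invariant embedding."""
    return np.sort(np.linalg.eigvalsh(adj))

rng = np.random.default_rng(0)
n = 6

# Random undirected graph on n nodes
A = rng.integers(0, 2, size=(n, n))
A = np.triu(A, 1)
A = A + A.T

# Relabel the nodes with a random permutation matrix P: A' = P A P^T
perm = rng.permutation(n)
P = np.eye(n)[perm]
A_perm = P @ A @ P.T

# The embedding is unchanged under relabelling (permutation invariance)
assert np.allclose(spectral_embedding(A), spectral_embedding(A_perm))
```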
Related papers
- Invariant Graph Transformer for Out-of-Distribution Generalization [21.60139614144787]
We introduce the Graph Out-Of-Distribution generalized Transformer (GOODFormer). It aims to learn generalized graph representations by capturing invariant relationships between predictive graph structures and labels. We design an evolving subgraph positional and structural encoder to effectively and efficiently capture the encoding information of dynamically changing subgraphs.
arXiv Detail & Related papers (2025-08-01T04:11:53Z) - Equivariance Everywhere All At Once: A Recipe for Graph Foundation Models [13.053266613831447]
We present a recipe for designing graph foundation models for node-level tasks from first principles. The key ingredient underpinning our study is a systematic investigation of the symmetries that a graph foundation model must respect. We validate our approach through extensive experiments on 29 real-world node classification datasets.
arXiv Detail & Related papers (2025-06-17T08:05:08Z) - What Improves the Generalization of Graph Transformers? A Theoretical Dive into the Self-attention and Positional Encoding [67.59552859593985]
Graph Transformers, which incorporate self-attention and positional encoding, have emerged as a powerful architecture for various graph learning tasks.
This paper introduces the first theoretical investigation of a shallow Graph Transformer for semi-supervised classification.
arXiv Detail & Related papers (2024-06-04T05:30:16Z) - CKGConv: General Graph Convolution with Continuous Kernels [24.58050212186722]
We propose a novel and general graph convolution framework by parameterizing the kernels as continuous functions of pseudo-coordinates derived via graph positional encoding.
We name this Continuous Kernel Graph Convolution (CKGConv).
We show that CKGConv-based Networks outperform existing graph convolutional networks and perform comparably to the best graph transformers across a variety of graph datasets.
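A rough sketch of the idea, under illustrative assumptions not taken from the paper: Laplacian-eigenvector positional encodings serve as pseudo-coordinates, a tiny MLP plays the role of the continuous kernel, and the kernel support is restricted to graph edges.

```python
import numpy as np

rng = np.random.default_rng(1)

def laplacian_pe(adj: np.ndarray, k: int) -> np.ndarray:
    """Pseudo-coordinates: the first k non-trivial Laplacian eigenvectors
    (eigenvector sign/basis ambiguity is ignored in this toy example)."""
    lap = np.diag(adj.sum(axis=1)) - adj
    _, vecs = np.linalg.eigh(lap)
    return vecs[:, 1:k + 1]          # skip the constant eigenvector

def continuous_kernel(delta: np.ndarray, W1: np.ndarray, W2: np.ndarray) -> float:
    """A tiny MLP mapping relative pseudo-coordinates to a scalar kernel value."""
    return np.tanh(delta @ W1) @ W2

def ckg_conv(adj, x, pe, W1, W2):
    """Convolution whose weights come from a continuous function of
    positional-encoding differences, restricted here to graph edges."""
    out = np.zeros_like(x)
    n = adj.shape[0]
    for i in range(n):
        for j in range(n):
            if adj[i, j]:
                out[i] += continuous_kernel(pe[i] - pe[j], W1, W2) * x[j]
    return out

# Toy graph: a 5-cycle with scalar node features
n, k, hidden = 5, 2, 8
A = np.zeros((n, n))
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1
x = rng.normal(size=(n, 1))
pe = laplacian_pe(A, k)
W1 = rng.normal(size=(k, hidden))
W2 = rng.normal(size=(hidden,))
print(ckg_conv(A, x, pe, W1, W2))
```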
arXiv Detail & Related papers (2024-04-21T10:26:13Z) - Homomorphism Counts for Graph Neural Networks: All About That Basis [8.25219440625445]
We argue for a more fine-grained approach, which incorporates the homomorphism counts of all structures in the "basis" of the target pattern.
This yields strictly more expressive architectures without incurring any additional overhead in terms of computational complexity.
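As a standard, minimal example of homomorphism counts used as graph features (not the paper's basis construction): the number of homomorphisms from the triangle K3 into a graph equals the trace of the cubed adjacency matrix.

```python
import numpy as np

def triangle_homomorphisms(adj: np.ndarray) -> int:
    """hom(K3, G) = trace(A^3): closed walks of length 3."""
    return int(np.trace(np.linalg.matrix_power(adj, 3)))

def edge_homomorphisms(adj: np.ndarray) -> int:
    """hom(K2, G) = sum of A = 2 * |E| for a simple undirected graph."""
    return int(adj.sum())

# A 4-cycle has no triangle homomorphisms; K4 has 4 * 3 * 2 = 24
C4 = np.array([[0, 1, 0, 1],
               [1, 0, 1, 0],
               [0, 1, 0, 1],
               [1, 0, 1, 0]])
K4 = np.ones((4, 4)) - np.eye(4)
print(triangle_homomorphisms(C4), triangle_homomorphisms(K4))  # 0 24
print(edge_homomorphisms(C4), edge_homomorphisms(K4))          # 8 12
```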
arXiv Detail & Related papers (2024-02-13T16:57:06Z) - Spectral Augmentations for Graph Contrastive Learning [50.149996923976836]
Contrastive learning has emerged as a premier method for learning representations with or without supervision.
Recent studies have shown its utility in graph representation learning for pre-training.
We propose a set of well-motivated graph transformation operations to provide a bank of candidates when constructing augmentations for a graph contrastive objective.
arXiv Detail & Related papers (2023-02-06T16:26:29Z) - Generalized Spectral Clustering for Directed and Undirected Graphs [4.286327408435937]
We present a generalized spectral clustering framework that can address both directed and undirected graphs.
Our approach is based on the spectral relaxation of a new functional that we introduce as the generalized Dirichlet energy of a graph function.
We also propose a practical parametrization of the regularizing measure constructed from the iterated powers of the natural random walk on the graph.
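A toy sketch of spectral bipartitioning for a directed graph, using a naive random-walk symmetrization rather than the paper's generalized Dirichlet energy or its regularizing measure; every choice below is an illustrative assumption.

```python
import numpy as np

def directed_spectral_bipartition(adj: np.ndarray) -> np.ndarray:
    """Two-way spectral cut of a directed graph via a simple symmetrization."""
    d_out = adj.sum(axis=1)
    P = adj / np.maximum(d_out, 1)[:, None]   # random-walk transition matrix
    S = 0.5 * (P + P.T)                       # naive symmetrization
    L = np.diag(S.sum(axis=1)) - S            # Laplacian of the symmetrized graph
    _, vecs = np.linalg.eigh(L)
    fiedler = vecs[:, 1]                      # second-smallest eigenvector
    return (fiedler > 0).astype(int)          # sign gives a 2-way partition

# Two directed 3-cycles joined by a single edge 2 -> 3
A = np.zeros((6, 6))
for i in range(3):
    A[i, (i + 1) % 3] = 1
    A[3 + i, 3 + (i + 1) % 3] = 1
A[2, 3] = 1
print(directed_spectral_bipartition(A))  # should separate {0,1,2} from {3,4,5}
```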
arXiv Detail & Related papers (2022-03-07T09:18:42Z) - Spectral-Spatial Global Graph Reasoning for Hyperspectral Image Classification [50.899576891296235]
Convolutional neural networks have been widely applied to hyperspectral image classification.
Recent methods attempt to address the limitations of these models by performing graph convolutions on spatial topologies.
arXiv Detail & Related papers (2021-06-26T06:24:51Z) - Scaling Graph Clustering with Distributed Sketches [1.1011268090482575]
We present a method inspired by spectral clustering where we instead use matrix sketches derived from random dimension-reducing projections.
We show that our method produces embeddings that yield performant clustering results given a fully-dynamic block model stream.
We also discuss the effects of block model parameters upon the required dimensionality of the subsequent embeddings, and show how random projections could significantly improve the performance of graph clustering in distributed memory.
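A single-machine sketch of the core idea, replacing eigenvector computation with a random dimension-reducing projection of the normalized adjacency matrix; the distributed aspects, the specific sketch family, and the block-model stream are not reproduced here.

```python
import numpy as np

def sketch_embedding(adj: np.ndarray, dim: int, seed: int = 0) -> np.ndarray:
    """Embed nodes by randomly projecting rows of the normalized adjacency
    matrix (a Johnson-Lindenstrauss-style sketch) instead of computing
    eigenvectors as in classical spectral clustering."""
    rng = np.random.default_rng(seed)
    d = adj.sum(axis=1)
    d_inv_sqrt = 1.0 / np.sqrt(np.maximum(d, 1))
    A_norm = d_inv_sqrt[:, None] * adj * d_inv_sqrt[None, :]
    R = rng.normal(size=(adj.shape[0], dim)) / np.sqrt(dim)  # random projection
    return A_norm @ R

# Two dense 4-node blocks with no edges across: nodes in the same block get
# similar sketches, so a standard clustering step can recover the blocks.
A = np.zeros((8, 8))
A[:4, :4] = 1 - np.eye(4)
A[4:, 4:] = 1 - np.eye(4)
print(np.round(sketch_embedding(A, dim=3), 2))
```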
arXiv Detail & Related papers (2020-07-24T17:38:04Z) - Graph Pooling with Node Proximity for Hierarchical Representation Learning [80.62181998314547]
We propose a novel graph pooling strategy that leverages node proximity to improve the hierarchical representation learning of graph data with their multi-hop topology.
Results show that the proposed graph pooling strategy is able to achieve state-of-the-art performance on a collection of public graph classification benchmark datasets.
arXiv Detail & Related papers (2020-06-19T13:09:44Z) - The Inverse G-Wishart Distribution and Variational Message Passing [0.0]
We show that the Inverse G-Wishart family of distributions enables fundamental variational message passing factor graph fragments to be expressed elegantly and succinctly.
Message passing on a factor graph is a powerful paradigm for coding approximate inference algorithms for arbitrarily large graphical models.
arXiv Detail & Related papers (2020-05-20T06:59:48Z) - Permutation Invariant Graph Generation via Score-Based Generative Modeling [114.12935776726606]
We propose a permutation invariant approach to modeling graphs, using the recent framework of score-based generative modeling.
In particular, we design a permutation equivariant, multi-channel graph neural network to model the gradient of the data distribution at the input graph.
For graph generation, we find that our learning approach achieves better or comparable results to existing models on benchmark datasets.
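A minimal check of what permutation equivariance means for such a network, using a single generic message-passing layer (not the authors' multi-channel score network): relabelling the input graph relabels the node-wise output in exactly the same way.

```python
import numpy as np

def gnn_layer(adj: np.ndarray, x: np.ndarray, W: np.ndarray) -> np.ndarray:
    """One message-passing layer: aggregate neighbour features, then transform."""
    return np.tanh(adj @ x @ W)

rng = np.random.default_rng(0)
n, d = 5, 3
A = rng.integers(0, 2, size=(n, n))
A = np.triu(A, 1)
A = A + A.T                      # random undirected graph
X = rng.normal(size=(n, d))      # node features
W = rng.normal(size=(d, d))

# Permutation equivariance: f(P A P^T, P X) = P f(A, X)
P = np.eye(n)[rng.permutation(n)]
lhs = gnn_layer(P @ A @ P.T, P @ X, W)
rhs = P @ gnn_layer(A, X, W)
assert np.allclose(lhs, rhs)
```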
arXiv Detail & Related papers (2020-03-02T03:06:14Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.