A Theory of Random Graph Shift in Truncated-Spectrum vRKHS
- URL: http://arxiv.org/abs/2602.23880v1
- Date: Fri, 27 Feb 2026 10:19:57 GMT
- Title: A Theory of Random Graph Shift in Truncated-Spectrum vRKHS
- Authors: Zhang Wan, Tingting Mu, Samuel Kaski
- Abstract summary: This paper develops a theory of graph classification under domain shift through a random-graph generative lens. We consider intra-class graphs sharing the same random graph model (RGM) and the domain shift induced by changes in RGM components.
- Score: 26.195791008324495
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This paper develops a theory of graph classification under domain shift through a random-graph generative lens, where we consider intra-class graphs sharing the same random graph model (RGM) and domain shift induced by changes in RGM components. While classic domain adaptation (DA) theories underpin existing techniques for handling graph distribution shift, the information in graph samples, which are themselves structured objects, is less explored. The non-Euclidean nature of graphs and specialized architectures for graph learning further complicate a fine-grained analysis of graph distribution shifts. In this paper, we propose a theory that takes the RGM as the data-generating process and exploits its connection to hypothesis complexity from a function-space perspective for such fine-grained analysis. Building on a vector-valued reproducing kernel Hilbert space (vRKHS) formulation, we derive a generalization bound whose shift penalty factorizes into (i) a domain-discrepancy term, (ii) a spectral-geometry term summarized by the accessible truncated spectrum, and (iii) an amplitude term that aggregates convergence and construction-stability effects. We empirically verify the insights on these terms on both real data and simulations.
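The bound's spectral-geometry term is summarized by the "accessible truncated spectrum". As a minimal numerical sketch, not the paper's vRKHS construction, spectral truncation can be illustrated on a graph Laplacian by keeping only the k smallest eigenpairs; the function name and the choice of the combinatorial Laplacian here are our own assumptions:

```python
import numpy as np

def truncated_laplacian_spectrum(adj, k):
    """Return the k smallest eigenvalues/eigenvectors of the combinatorial
    graph Laplacian L = D - A, as a stand-in for an 'accessible' truncated
    spectral summary of a graph."""
    deg = np.diag(adj.sum(axis=1))
    lap = deg - adj                         # combinatorial Laplacian
    eigvals, eigvecs = np.linalg.eigh(lap)  # eigh returns ascending eigenvalues
    return eigvals[:k], eigvecs[:, :k]

# Toy example: 4-node path graph
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
vals, vecs = truncated_laplacian_spectrum(A, 2)
```

For a connected graph the smallest Laplacian eigenvalue is 0, so the truncated spectrum mostly carries information in the higher retained eigenpairs; how such a summary enters the paper's shift penalty is specified by the paper itself, not by this sketch.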
Related papers
- Generative Risk Minimization for Out-of-Distribution Generalization on Graphs [71.48583448654522]
We propose an innovative framework, named Generative Risk Minimization (GRM), designed to generate an invariant subgraph for each input graph to be classified, rather than extracting one. We conduct extensive experiments across a variety of real-world graph datasets for both node-level and graph-level OOD generalization.
arXiv Detail & Related papers (2025-02-11T21:24:13Z) - What Improves the Generalization of Graph Transformers? A Theoretical Dive into the Self-attention and Positional Encoding [67.59552859593985]
Graph Transformers, which incorporate self-attention and positional encoding, have emerged as a powerful architecture for various graph learning tasks.
This paper introduces the first theoretical investigation of a shallow Graph Transformer for semi-supervised classification.
arXiv Detail & Related papers (2024-06-04T05:30:16Z) - Generalization of Graph Neural Networks through the Lens of Homomorphism [7.223313563198697]
We propose to study the generalization of Graph Neural Networks (GNNs) through a novel perspective - analyzing the entropy of graph homomorphism.
By linking graph homomorphism with information-theoretic measures, we derive generalization bounds for both graph and node classifications.
These bounds are capable of capturing subtleties inherent in various graph structures, including but not limited to paths, cycles and cliques.
arXiv Detail & Related papers (2024-03-10T03:51:59Z) - Supercharging Graph Transformers with Advective Diffusion [28.40109111316014]
This paper proposes the Advective Diffusion Transformer (AdvDIFFormer), a physics-inspired graph Transformer model designed to address this challenge. We show that AdvDIFFormer has provable capability for controlling generalization error under topological shifts. Empirically, the model demonstrates superiority in various predictive tasks across information networks, molecular screening and protein interactions.
arXiv Detail & Related papers (2023-10-10T08:40:47Z) - Graph Condensation via Receptive Field Distribution Matching [61.71711656856704]
This paper focuses on creating a small graph to represent the original graph, so that GNNs trained on the size-reduced graph can make accurate predictions.
We view the original graph as a distribution of receptive fields and aim to synthesize a small graph whose receptive fields share a similar distribution.
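Viewing a graph as a distribution of receptive fields can be made concrete by histogramming k-hop neighborhood sizes, which is what a k-layer GNN's receptive field covers. The sketch below is our own illustration of that quantity, not the distribution-matching procedure of the cited paper:

```python
import numpy as np

def k_hop_receptive_sizes(adj, k):
    """Number of nodes within k hops of each node, i.e. the size of each
    node's receptive field under k rounds of message passing."""
    n = adj.shape[0]
    step = np.clip(adj + np.eye(n, dtype=int), 0, 1)  # A + I: one hop, keeping self
    reach = np.eye(n, dtype=int)
    for _ in range(k):
        reach = np.clip(reach @ step, 0, 1)  # reachable within one more hop
    return reach.sum(axis=1)

# Toy example: 4-node path graph
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]])
sizes = k_hop_receptive_sizes(A, 2)
```

Matching the histogram of such sizes (and, more faithfully, of the receptive-field subgraphs themselves) between the original and the condensed graph is the idea the summary above describes.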
arXiv Detail & Related papers (2022-06-28T02:10:05Z) - Neural Sheaf Diffusion: A Topological Perspective on Heterophily and Oversmoothing in GNNs [16.88394293874848]
We use cellular sheaf theory to show that the underlying geometry of the graph is deeply linked with the performance of GNNs.
By considering a hierarchy of increasingly general sheaves, we study how the ability of the sheaf diffusion process to achieve linear separation of the classes in the infinite time limit expands.
We prove that when the sheaf is non-trivial, discretised parametric diffusion processes have greater control than GNNs over their behaviour.
arXiv Detail & Related papers (2022-02-09T17:25:02Z) - Generating the Graph Gestalt: Kernel-Regularized Graph Representation Learning [47.506013386710954]
A complete scientific understanding of graph data should address both global and local structure.
We propose a joint model for both as complementary objectives in a graph VAE framework.
Our experiments demonstrate a significant improvement in the realism of the generated graph structures, typically by 1-2 orders of magnitude on graph-structure metrics.
arXiv Detail & Related papers (2021-06-29T10:48:28Z) - Spectral-Spatial Global Graph Reasoning for Hyperspectral Image Classification [50.899576891296235]
Convolutional neural networks have been widely applied to hyperspectral image classification.
Recent methods attempt to address the limitations of such models by performing graph convolutions on spatial topologies.
arXiv Detail & Related papers (2021-06-26T06:24:51Z) - Hyperbolic Graph Embedding with Enhanced Semi-Implicit Variational Inference [48.63194907060615]
We build off of semi-implicit graph variational auto-encoders to capture higher-order statistics in a low-dimensional graph latent representation.
We incorporate hyperbolic geometry in the latent space through a Poincare embedding to efficiently represent graphs exhibiting hierarchical structure.
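The Poincaré (ball) model mentioned above has a closed-form geodesic distance, which is why hierarchical structure embeds efficiently: distances blow up near the boundary, leaving exponential "room" for tree branches. A minimal sketch of that distance, independent of the cited paper's implementation:

```python
import math

def poincare_distance(u, v):
    """Geodesic distance between two points strictly inside the unit
    Poincare ball: d(u, v) = acosh(1 + 2 ||u - v||^2 / ((1 - ||u||^2)(1 - ||v||^2)))."""
    nu = sum(x * x for x in u)                       # ||u||^2
    nv = sum(x * x for x in v)                       # ||v||^2
    duv = sum((a - b) ** 2 for a, b in zip(u, v))    # ||u - v||^2
    return math.acosh(1.0 + 2.0 * duv / ((1.0 - nu) * (1.0 - nv)))
```

Note how a point at radius 0.9 is much farther from the origin than one at radius 0.5, despite only moving 0.4 in Euclidean terms; that distortion is what makes hierarchies cheap to represent.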
arXiv Detail & Related papers (2020-10-31T05:48:34Z) - Block-Approximated Exponential Random Graphs [77.4792558024487]
An important challenge in the field of exponential random graphs (ERGs) is the fitting of non-trivial ERGs on large graphs.
We propose an approximation framework for such non-trivial ERGs that results in dyadic-independence (i.e., edge-independent) distributions.
Our methods are scalable to sparse graphs consisting of millions of nodes.
arXiv Detail & Related papers (2020-02-14T11:42:16Z)
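A dyadic-independence (edge-independent) distribution is one where each edge appears independently with its own probability, so sampling reduces to independent coin flips over node pairs. The sketch below shows only that sampling step for an undirected simple graph; the cited paper's block approximation of the probability matrix P is not reproduced here, and the function name is our own:

```python
import numpy as np

def sample_edge_independent(P, seed=0):
    """Sample an undirected simple graph in which edge (i, j) appears
    independently with probability P[i, j] (a dyadic-independence model).
    Only the strict upper triangle of P is used; the result is symmetric
    with a zero diagonal."""
    rng = np.random.default_rng(seed)
    n = P.shape[0]
    upper = np.triu(rng.random((n, n)) < P, k=1)  # one coin flip per pair i < j
    return (upper | upper.T).astype(int)
```

Because each dyad is independent, sampling is O(n^2) and trivially parallel, which is what makes such approximations scalable to large sparse graphs.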
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.