SpectralGap: Graph-Level Out-of-Distribution Detection via Laplacian Eigenvalue Gaps
- URL: http://arxiv.org/abs/2505.15177v2
- Date: Fri, 23 May 2025 09:32:51 GMT
- Title: SpectralGap: Graph-Level Out-of-Distribution Detection via Laplacian Eigenvalue Gaps
- Authors: Jiawei Gu, Ziyue Qiao, Zechao Li
- Abstract summary: We propose SpecGap, an effective post-hoc approach for OOD detection on graphs. SpecGap achieves state-of-the-art performance across multiple benchmark datasets.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The task of graph-level out-of-distribution (OOD) detection is crucial for deploying graph neural networks in real-world settings. In this paper, we observe a significant difference in the relationship between the largest and second-largest eigenvalues of the Laplacian matrix for in-distribution (ID) and OOD graph samples: \textit{OOD samples often exhibit anomalous spectral gaps (the difference between the largest and second-largest eigenvalues)}. This observation motivates us to propose SpecGap, an effective post-hoc approach for OOD detection on graphs. SpecGap adjusts features by subtracting the component associated with the second-largest eigenvalue, scaled by the spectral gap, from the high-level features (i.e., $\mathbf{X}-\left(\lambda_n-\lambda_{n-1}\right) \mathbf{u}_{n-1} \mathbf{v}_{n-1}^T$). SpecGap achieves state-of-the-art performance across multiple benchmark datasets. We present extensive ablation studies and comprehensive theoretical analyses to support our empirical results. As a parameter-free post-hoc method, SpecGap can be easily integrated into existing graph neural network models without requiring any additional training or model modification.
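The adjustment in the abstract can be sketched numerically. The snippet below is an illustrative reading, not the authors' implementation: it assumes the spectral gap $\lambda_n-\lambda_{n-1}$ is taken from the graph Laplacian's eigenvalues and that $\mathbf{u}_{n-1}, \mathbf{v}_{n-1}$ are the singular vectors of the second-largest component of an SVD of the feature matrix $\mathbf{X}$; the function names are hypothetical.

```python
import numpy as np

def laplacian_spectral_gap(adj):
    """Difference between the largest and second-largest
    eigenvalues of the (unnormalized) graph Laplacian L = D - A."""
    deg = np.diag(adj.sum(axis=1))
    lap = deg - adj
    eigvals = np.linalg.eigvalsh(lap)  # ascending order for symmetric L
    return eigvals[-1] - eigvals[-2]

def specgap_adjust(X, gap):
    """Illustrative feature adjustment in the spirit of
    X - (lambda_n - lambda_{n-1}) * u_{n-1} v_{n-1}^T,
    taking u, v from the second-largest singular component of X."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)  # s descending
    return X - gap * np.outer(U[:, 1], Vt[1, :])
```

For a 3-node path graph, the Laplacian eigenvalues are 0, 1, and 3, so the spectral gap is 2; an anomalously large or small gap is the signal the paper associates with OOD samples.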
Related papers
- GCEPNet: Graph Convolution-Enhanced Expectation Propagation for Massive MIMO Detection [5.714553194279462]
We show that a real-valued system can be modeled as spectral signal convolution on graph, through which the correlation between unknown variables can be captured.
Based on such analysis, we propose graph convolution-enhanced expectation propagation (GCEPNet) with better generalization capacity.
arXiv Detail & Related papers (2024-04-23T10:13:39Z) - Graph Out-of-Distribution Generalization with Controllable Data Augmentation [51.17476258673232]
Graph Neural Network (GNN) has demonstrated extraordinary performance in classifying graph properties.
Due to the selection bias of training and testing data, distribution deviation is widespread.
We propose OOD calibration to measure the distribution deviation of virtual samples.
arXiv Detail & Related papers (2023-08-16T13:10:27Z) - Quantifying the Optimization and Generalization Advantages of Graph Neural Networks Over Multilayer Perceptrons [50.33260238739837]
Graph neural networks (GNNs) have demonstrated remarkable capabilities in learning from graph-structured data.
There remains a lack of analysis comparing GNNs and MLPs from an optimization and generalization perspective.
arXiv Detail & Related papers (2023-06-24T10:21:11Z) - Graph Classification Gaussian Processes via Spectral Features [7.474662887810221]
Graph classification aims to categorise graphs based on their structure and node attributes.
In this work, we propose to tackle this task using tools from graph signal processing by deriving spectral features.
We show that even such a simple approach, having no learned parameters, can yield competitive performance compared to strong neural network and graph kernel baselines.
arXiv Detail & Related papers (2023-06-06T15:31:05Z) - Graph Fourier MMD for Signals on Graphs [67.68356461123219]
We propose a novel distance between distributions and signals on graphs.
GFMMD is defined via an optimal witness function that is both smooth on the graph and maximizes difference in expectation.
We showcase it on graph benchmark datasets as well as on single cell RNA-sequencing data analysis.
arXiv Detail & Related papers (2023-06-05T00:01:17Z) - Optimality of Message-Passing Architectures for Sparse Graphs [12.42591017155152]
We study the node classification problem on feature-decorated graphs in the sparse setting, i.e., when the expected degree of a node is $O(1)$ in the number of nodes.
We introduce a notion of Bayes optimality for node classification tasks, called local Bayes optimality.
We show that the optimal message-passing architecture interpolates between a standard MLP in the regime of low graph signal and a typical convolution in the regime of high graph signal.
arXiv Detail & Related papers (2023-05-17T17:31:20Z) - RankFeat: Rank-1 Feature Removal for Out-of-distribution Detection [65.67315418971688]
RankFeat is a simple yet effective post hoc approach for OOD detection.
RankFeat achieves the state-of-the-art performance and reduces the average false positive rate (FPR95) by 17.90% compared with the previous best method.
arXiv Detail & Related papers (2022-09-18T16:01:31Z) - Graph Spectral Embedding using the Geodesic Betweeness Centrality [76.27138343125985]
We introduce the Graph Sylvester Embedding (GSE), an unsupervised graph representation of local similarity, connectivity, and global structure.
GSE uses the solution of the Sylvester equation to capture both network structure and neighborhood proximity in a single representation.
arXiv Detail & Related papers (2022-05-07T04:11:23Z) - Optimal Propagation for Graph Neural Networks [51.08426265813481]
We propose a bi-level optimization approach for learning the optimal graph structure.
We also explore a low-rank approximation model for further reducing the time complexity.
arXiv Detail & Related papers (2022-05-06T03:37:00Z) - Issues with Propagation Based Models for Graph-Level Outlier Detection [16.980621769406916]
Graph-Level Outlier Detection (GLOD) is the task of identifying unusual graphs within a graph database.
This paper identifies and delves into a fundamental and intriguing issue with applying propagation based models to GLOD.
We find that ROC-AUC performance of the models change significantly depending on which class is down-sampled.
arXiv Detail & Related papers (2020-12-23T19:38:21Z) - Embedding Graph Auto-Encoder for Graph Clustering [90.8576971748142]
Graph auto-encoder (GAE) models are based on semi-supervised graph convolution networks (GCNs).
We design a specific GAE-based model for graph clustering to be consistent with the theory, namely Embedding Graph Auto-Encoder (EGAE).
EGAE consists of one encoder and dual decoders.
arXiv Detail & Related papers (2020-02-20T09:53:28Z) - Block-Approximated Exponential Random Graphs [77.4792558024487]
An important challenge in the field of exponential random graphs (ERGs) is the fitting of non-trivial ERGs on large graphs.
We propose an approximative framework to such non-trivial ERGs that result in dyadic independence (i.e., edge independent) distributions.
Our methods are scalable to sparse graphs consisting of millions of nodes.
arXiv Detail & Related papers (2020-02-14T11:42:16Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.