SmoothGNN: Smoothing-based GNN for Unsupervised Node Anomaly Detection
- URL: http://arxiv.org/abs/2405.17525v1
- Date: Mon, 27 May 2024 14:23:30 GMT
- Title: SmoothGNN: Smoothing-based GNN for Unsupervised Node Anomaly Detection
- Authors: Xiangyu Dong, Xingyi Zhang, Yanni Sun, Lei Chen, Mingxuan Yuan, Sibo Wang
- Abstract summary: The smoothing issue leads to indistinguishable node representations, which poses a significant challenge in the field of graph learning.
We observe that anomalous and normal nodes show different patterns in the smoothing process, which can be leveraged to enhance node anomaly detection (NAD) tasks.
We introduce a Smoothing-aware Spectral Graph Neural Network, which establishes the connection between the spectral space of graphs and the smoothing process.
- Score: 14.1365182085686
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The smoothing issue leads to indistinguishable node representations, which poses a significant challenge in the field of graph learning. However, this issue also presents an opportunity to reveal underlying properties behind different types of nodes, which have been overlooked in previous studies. Through empirical and theoretical analysis of real-world node anomaly detection (NAD) datasets, we observe that anomalous and normal nodes show different patterns in the smoothing process, which can be leveraged to enhance NAD tasks. Motivated by these findings, in this paper, we propose a novel unsupervised NAD framework. Specifically, according to our theoretical analysis, we design a Smoothing Learning Component. Subsequently, we introduce a Smoothing-aware Spectral Graph Neural Network, which establishes the connection between the spectral space of graphs and the smoothing process. Additionally, we demonstrate that the Dirichlet Energy, which reflects the smoothness of a graph, can serve as coefficients for node representations across different dimensions of the spectral space. Building upon these observations and analyses, we devise a novel anomaly measure for the NAD task. Extensive experiments on 9 real-world datasets show that SmoothGNN outperforms the best rival by an average of 14.66% in AUC and 7.28% in Precision, with 75x running time speed-up, which validates the effectiveness and efficiency of our framework.
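The Dirichlet energy mentioned in the abstract is a standard smoothness measure and is straightforward to compute. Below is a minimal NumPy sketch (our illustration, not the paper's released code) of the energy tr(X^T L X) under the combinatorial Laplacian, together with one normalized propagation step; it shows how the energy typically decays as features are smoothed. All function names are ours.

```python
import numpy as np

def dirichlet_energy(A: np.ndarray, X: np.ndarray) -> float:
    """Dirichlet energy tr(X^T L X) with the combinatorial Laplacian
    L = D - A; lower values mean smoother node features."""
    L = np.diag(A.sum(axis=1)) - A
    return float(np.trace(X.T @ L @ X))

def smooth_step(A: np.ndarray, X: np.ndarray) -> np.ndarray:
    """One symmetric-normalized propagation step, D^{-1/2} A D^{-1/2} X."""
    d = A.sum(axis=1)
    d_inv_sqrt = np.zeros_like(d)
    d_inv_sqrt[d > 0] = d[d > 0] ** -0.5
    return (d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]) @ X

# Toy graph: the energy typically decays as features are smoothed.
rng = np.random.default_rng(0)
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = rng.normal(size=(4, 2))
for step in range(3):
    print(step, round(dirichlet_energy(A, X), 4))
    X = smooth_step(A, X)
```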
Related papers
- Decoupled Graph Energy-based Model for Node Out-of-Distribution Detection on Heterophilic Graphs [61.226857589092]
Out-of-distribution (OOD) detection for nodes in graph learning remains underexplored.
GNNSafe adapted energy-based detection to the graph domain with state-of-the-art performance.
We introduce DeGEM, which decomposes the learning process into two parts: a graph encoder that leverages topology information for node representations and an energy head that operates in latent space.
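For context, energy-based OOD detectors in the GNNSafe line typically score a node by the energy of its classifier logits, E(x) = -logsumexp(logits). The sketch below is a hedged illustration of that standard score, not DeGEM's actual energy head:

```python
import numpy as np

def energy_score(logits: np.ndarray) -> np.ndarray:
    """Standard energy score E(x) = -logsumexp(logits); higher energy
    usually indicates out-of-distribution inputs."""
    m = logits.max(axis=1, keepdims=True)              # stabilize the exp
    return -(m.squeeze(1) + np.log(np.exp(logits - m).sum(axis=1)))

# Toy logits for three nodes: confident, uniform, and mildly confident.
logits = np.array([[8.0, 0.5, 0.2],
                   [1.0, 1.0, 1.0],
                   [2.0, 1.5, 0.3]])
print(energy_score(logits))  # the confident node gets the lowest energy
```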
arXiv Detail & Related papers (2025-02-25T07:20:00Z)
- High-Pass Graph Convolutional Network for Enhanced Anomaly Detection: A Novel Approach [0.0]
This paper proposes a novel approach for Graph Anomaly Detection (GAD): a High-Pass Graph Convolutional Network (HP-GCN).
The proposed HP-GCN leverages high-frequency components to detect anomalies, since anomalous nodes tend to introduce high-frequency signals into an otherwise smooth network of normal nodes.
The model is evaluated and validated on YelpChi, Amazon, T-Finance, and T-Social datasets.
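As a hedged illustration of the high-pass idea (the filter form is our assumption, not the paper's exact architecture), the normalized Laplacian L = I - D^{-1/2} A D^{-1/2} acts as a high-pass operator: a node whose features deviate from its neighbours produces a large response, which can feed an anomaly score.

```python
import numpy as np

def high_pass_filter(A: np.ndarray, X: np.ndarray) -> np.ndarray:
    """Apply the normalized Laplacian L = I - D^{-1/2} A D^{-1/2},
    which keeps high-frequency (rapidly varying) components of X."""
    d = A.sum(axis=1)
    d_inv_sqrt = np.zeros_like(d)
    d_inv_sqrt[d > 0] = d[d > 0] ** -0.5
    A_norm = d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]
    return X - A_norm @ X

# The outlier node produces the largest high-pass response.
A = np.array([[0, 1, 1],
              [1, 0, 1],
              [1, 1, 0]], dtype=float)
X = np.array([[1.0], [1.0], [5.0]])   # node 2 deviates from its neighbours
print(np.linalg.norm(high_pass_filter(A, X), axis=1))
```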
arXiv Detail & Related papers (2024-11-04T05:38:07Z)
- Spiking Graph Neural Network on Riemannian Manifolds [51.15400848660023]
Graph neural networks (GNNs) have become the dominant solution for learning on graphs.
Existing spiking GNNs consider graphs in Euclidean space, ignoring the structural geometry.
We present a Manifold-valued Spiking GNN (MSG).
MSG achieves performance superior to previous spiking GNNs and energy efficiency superior to conventional GNNs.
arXiv Detail & Related papers (2024-10-23T15:09:02Z)
- Understanding Virtual Nodes: Oversmoothing, Oversquashing, and Node Heterogeneity [4.59357989139429]
Augmenting MPNNs with a virtual node (VN) has been found to improve performance on a range of benchmarks.
We show that VNs typically avoid replicating anti-smoothing approaches to maintain expressive power.
We propose a variant of VN with the same computational complexity, which can have different sensitivity to nodes based on the graph structure.
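For reference, the standard VN construction the paper analyzes simply appends one extra node connected to every existing node, giving all pairs a two-hop path through the VN. A minimal sketch (implementation details are ours):

```python
import numpy as np

def add_virtual_node(A: np.ndarray) -> np.ndarray:
    """Append one virtual node connected to every existing node."""
    n = A.shape[0]
    A_vn = np.zeros((n + 1, n + 1))
    A_vn[:n, :n] = A
    A_vn[:n, n] = 1.0   # edges from every node to the virtual node
    A_vn[n, :n] = 1.0   # and back, keeping the graph undirected
    return A_vn

A = np.array([[0, 1], [1, 0]], dtype=float)
print(add_virtual_node(A))
```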
arXiv Detail & Related papers (2024-05-22T10:51:12Z)
- Robust Node Representation Learning via Graph Variational Diffusion Networks [7.335425547621226]
In recent years, compelling evidence has revealed that GNN-based node representation learning can be substantially degraded by perturbations to the graph structure.
To learn robust node representation in the presence of perturbations, various works have been proposed to safeguard GNNs.
We propose the Graph Variational Diffusion Network (GVDN), a new node encoder that effectively manipulates Gaussian noise to safeguard robustness on perturbed graphs.
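A variational encoder of this kind typically injects noise via the reparameterization trick, z = mu + sigma * eps with eps ~ N(0, I). The sketch below illustrates that generic mechanism under our own assumptions, not GVDN's specific architecture:

```python
import numpy as np

def noisy_representation(mu: np.ndarray, log_sigma: np.ndarray,
                         rng: np.random.Generator) -> np.ndarray:
    """Reparameterization trick: z = mu + sigma * eps, eps ~ N(0, I).
    Training against injected Gaussian noise encourages representations
    that stay stable under graph perturbations."""
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(log_sigma) * eps

rng = np.random.default_rng(0)
mu = np.zeros((4, 8))                # per-node means from a GNN encoder
log_sigma = np.full((4, 8), -1.0)    # per-node log standard deviations
print(noisy_representation(mu, log_sigma, rng).shape)  # (4, 8)
```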
arXiv Detail & Related papers (2023-12-18T03:18:53Z)
- DEGREE: Decomposition Based Explanation For Graph Neural Networks [55.38873296761104]
We propose DEGREE to provide a faithful explanation for GNN predictions.
By decomposing the information generation and aggregation mechanism of GNNs, DEGREE allows tracking the contributions of specific components of the input graph to the final prediction.
We also design a subgraph level interpretation algorithm to reveal complex interactions between graph nodes that are overlooked by previous methods.
arXiv Detail & Related papers (2023-05-22T10:29:52Z)
- A Dimension-Augmented Physics-Informed Neural Network (DaPINN) with High Level Accuracy and Efficiency [0.20391237204597357]
Physics-informed neural networks (PINNs) have been widely applied in different fields.
We propose a novel dimension-augmented physics-informed neural network (DaPINN).
DaPINN simultaneously and significantly improves the accuracy and efficiency of the PINN.
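Dimension augmentation can be pictured as feeding the network simple transforms of the input alongside the input itself; the particular transform below (a squared coordinate) is our illustrative assumption, not necessarily DaPINN's choice:

```python
import numpy as np

def augment_input(x: np.ndarray) -> np.ndarray:
    """Augment a 1-D collocation coordinate with a power-series term,
    so the network sees (x, x^2) instead of x alone."""
    return np.concatenate([x, x ** 2], axis=1)

x = np.linspace(0.0, 1.0, 5).reshape(-1, 1)   # collocation points
print(augment_input(x))   # shape (5, 2): original and squared coordinate
```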
arXiv Detail & Related papers (2022-10-19T15:54:37Z)
- Momentum Diminishes the Effect of Spectral Bias in Physics-Informed Neural Networks [72.09574528342732]
Physics-informed neural network (PINN) algorithms have shown promising results in solving a wide range of problems involving partial differential equations (PDEs).
They often fail to converge to desirable solutions when the target function contains high-frequency features, due to a phenomenon known as spectral bias.
In the present work, we exploit neural tangent kernels (NTKs) to investigate the training dynamics of PINNs evolving under stochastic gradient descent with momentum (SGDM).
arXiv Detail & Related papers (2022-06-29T19:03:10Z)
- Edge Graph Neural Networks for Massive MIMO Detection [15.970981766599035]
Massive Multiple-Input Multiple-Output (MIMO) detection is an important problem in modern wireless communication systems.
While traditional Belief Propagation (BP) detectors perform poorly on loopy graphs, the recent Graph Neural Networks (GNNs)-based method can overcome the drawbacks of BP and achieve superior performance.
arXiv Detail & Related papers (2022-05-22T08:01:47Z)
- A Unified View on Graph Neural Networks as Graph Signal Denoising [49.980783124401555]
Graph Neural Networks (GNNs) have risen to prominence in learning representations for graph structured data.
In this work, we establish mathematically that the aggregation processes in a group of representative GNN models can be regarded as solving a graph denoising problem.
We instantiate a novel GNN model, ADA-UGNN, derived from UGNN, to handle graphs with adaptive smoothness across nodes.
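The denoising view can be made concrete: one gradient step on the objective ||F - X||_F^2 + c tr(F^T L F), started at F = X with step size 1/(2c), recovers GCN-style propagation A_norm X. Below is a minimal sketch of that derivation (our illustration, with L = I - A_norm):

```python
import numpy as np

def denoise_step(A: np.ndarray, X: np.ndarray, c: float = 1.0) -> np.ndarray:
    """One gradient step on ||F - X||_F^2 + c * tr(F^T L F), started at
    F = X with step size 1/(2c); with L = I - A_norm this equals
    A_norm @ X, i.e. GCN-style feature propagation."""
    d = A.sum(axis=1)
    d_inv_sqrt = np.zeros_like(d)
    d_inv_sqrt[d > 0] = d[d > 0] ** -0.5
    A_norm = d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]
    L = np.eye(A.shape[0]) - A_norm
    grad = 2.0 * c * L @ X           # fidelity term has zero gradient at F = X
    return X - grad / (2.0 * c)      # algebraically equal to A_norm @ X

A = np.array([[0, 1, 1], [1, 0, 1], [1, 1, 0]], dtype=float)
X = np.array([[1.0], [1.0], [5.0]])
print(denoise_step(A, X))            # each node moves toward its neighbours
```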
arXiv Detail & Related papers (2020-10-05T04:57:18Z)
- Understanding and Resolving Performance Degradation in Graph Convolutional Networks [105.14867349802898]
Graph Convolutional Network (GCN) stacks several layers and in each layer performs a PROPagation operation (PROP) and a TRANsformation operation (TRAN) for learning node representations over graph-structured data.
GCNs tend to suffer a performance drop as the model gets deeper.
We study performance degradation of GCNs by experimentally examining how stacking only TRANs or PROPs works.
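To make the ablation concrete, the two operations can be written separately: PROP is parameter-free neighborhood averaging and TRAN is a learned transformation with a nonlinearity. A hedged sketch with placeholder random weights (ours, not the paper's code):

```python
import numpy as np

rng = np.random.default_rng(0)

def prop(A: np.ndarray, X: np.ndarray) -> np.ndarray:
    """PROP: parameter-free propagation, here D^{-1/2} A D^{-1/2} X."""
    d = A.sum(axis=1)
    d_inv_sqrt = np.zeros_like(d)
    d_inv_sqrt[d > 0] = d[d > 0] ** -0.5
    return (d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]) @ X

def tran(X: np.ndarray, W: np.ndarray) -> np.ndarray:
    """TRAN: learned linear transformation followed by ReLU."""
    return np.maximum(X @ W, 0.0)

# One GCN layer = PROP followed by TRAN; the paper's ablation stacks
# only one kind of operation to isolate its effect.
A = np.array([[0, 1, 1], [1, 0, 1], [1, 1, 0]], dtype=float)
X = rng.normal(size=(3, 4))
W = rng.normal(size=(4, 4))
print(tran(prop(A, X), W).shape)
```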
arXiv Detail & Related papers (2020-06-12T12:12:12Z)
- Stochastic Graph Neural Networks [123.39024384275054]
Graph neural networks (GNNs) model nonlinear representations in graph data with applications in distributed agent coordination, control, and planning.
Current GNN architectures assume ideal scenarios and ignore link fluctuations that occur due to environment, human factors, or external attacks.
In these situations, the GNN fails to address its distributed task if the topological randomness is not considered accordingly.
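One common way to model such link fluctuations, sketched below under our own assumptions (independent edge drops with a fixed keep probability), is to train over random realizations of the adjacency matrix:

```python
import numpy as np

def sample_adjacency(A: np.ndarray, keep_prob: float,
                     rng: np.random.Generator) -> np.ndarray:
    """Sample a random realization of the graph: each edge survives
    independently with probability keep_prob (mask kept symmetric)."""
    mask = rng.random(A.shape) < keep_prob
    mask = np.triu(mask, 1)            # sample the upper triangle only
    mask = mask | mask.T               # mirror for an undirected graph
    return A * mask

rng = np.random.default_rng(0)
A = np.ones((4, 4)) - np.eye(4)        # complete graph on 4 nodes
print(sample_adjacency(A, keep_prob=0.7, rng=rng))
```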
arXiv Detail & Related papers (2020-06-04T08:00:00Z)