SmoothGNN: Smoothing-based GNN for Unsupervised Node Anomaly Detection
- URL: http://arxiv.org/abs/2405.17525v1
- Date: Mon, 27 May 2024 14:23:30 GMT
- Title: SmoothGNN: Smoothing-based GNN for Unsupervised Node Anomaly Detection
- Authors: Xiangyu Dong, Xingyi Zhang, Yanni Sun, Lei Chen, Mingxuan Yuan, Sibo Wang
- Abstract summary: The smoothing issue leads to indistinguishable node representations, which poses a significant challenge in the field of graph learning.
We observe that anomalous and normal nodes show different patterns in the smoothing process, which can be leveraged to enhance node detection tasks.
We introduce a Smoothing-aware Spectral Graph Neural Network, which establishes the connection between the spectral space of graphs and the smoothing process.
- Score: 14.1365182085686
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The smoothing issue leads to indistinguishable node representations, which poses a significant challenge in the field of graph learning. However, this issue also presents an opportunity to reveal underlying properties behind different types of nodes, which have been overlooked in previous studies. Through empirical and theoretical analysis of real-world node anomaly detection (NAD) datasets, we observe that anomalous and normal nodes show different patterns in the smoothing process, which can be leveraged to enhance NAD tasks. Motivated by these findings, in this paper, we propose a novel unsupervised NAD framework. Specifically, according to our theoretical analysis, we design a Smoothing Learning Component. Subsequently, we introduce a Smoothing-aware Spectral Graph Neural Network, which establishes the connection between the spectral space of graphs and the smoothing process. Additionally, we demonstrate that the Dirichlet Energy, which reflects the smoothness of a graph, can serve as coefficients for node representations across different dimensions of the spectral space. Building upon these observations and analyses, we devise a novel anomaly measure for the NAD task. Extensive experiments on 9 real-world datasets show that SmoothGNN outperforms the best rival by an average of 14.66% in AUC and 7.28% in Precision, with 75x running time speed-up, which validates the effectiveness and efficiency of our framework.
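The abstract's claim that Dirichlet Energy reflects graph smoothness can be illustrated with a minimal sketch. The formula below is the standard degree-normalized Dirichlet Energy; SmoothGNN's exact coefficients and spectral construction are defined in the paper, so treat this purely as an illustration of why the measure drops as node representations are smoothed:

```python
import numpy as np

# Toy undirected graph (a 4-cycle) with 2-D node features.
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
X = np.array([[1.0, 0.0],
              [0.9, 0.1],
              [0.1, 0.9],
              [0.0, 1.0]])

n = X.shape[0]
deg = np.zeros(n)
for i, j in edges:
    deg[i] += 1
    deg[j] += 1

def dirichlet_energy(X, edges, deg):
    """Degree-normalized Dirichlet energy: lower value = smoother signal."""
    e = 0.0
    for i, j in edges:
        diff = X[i] / np.sqrt(1 + deg[i]) - X[j] / np.sqrt(1 + deg[j])
        e += np.dot(diff, diff)
    return 0.5 * e

before = dirichlet_energy(X, edges, deg)

# One step of mean-neighbor aggregation (the basic GNN smoothing operation)
# drives the energy down, i.e. representations become more alike.
A = np.zeros((n, n))
for i, j in edges:
    A[i, j] = A[j, i] = 1.0
X_smoothed = (A @ X) / deg[:, None]
after = dirichlet_energy(X_smoothed, edges, deg)

print(after < before)  # True: aggregation smooths the signal
```

Tracking how fast this energy decays per node is one way the different smoothing patterns of anomalous versus normal nodes could be surfaced, which is the intuition the paper builds on.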
Related papers
- Design Your Own Universe: A Physics-Informed Agnostic Method for Enhancing Graph Neural Networks [34.16727363891593]
We propose a model-agnostic enhancement framework for Graph Neural Networks (GNNs).
This framework enriches the graph structure by introducing additional nodes and rewiring connections with both positive and negative weights.
We theoretically verify that GNNs enhanced through our approach can effectively circumvent the over-smoothing issue and exhibit robustness against over-squashing.
Empirical validations on benchmarks for homophilic, heterophilic graphs, and long-term graph datasets show that GNNs enhanced by our method significantly outperform their original counterparts.
arXiv Detail & Related papers (2024-01-26T00:47:43Z) - SimCalib: Graph Neural Network Calibration based on Similarity between Nodes [60.92081159963772]
Graph neural networks (GNNs) have exhibited impressive performance in modeling graph data as exemplified in various applications.
We shed light on the relationship between GNN calibration and nodewise similarity via theoretical analysis.
A novel calibration framework, named SimCalib, is accordingly proposed to consider similarity between nodes at global and local levels.
arXiv Detail & Related papers (2023-12-19T04:58:37Z) - Robust Node Representation Learning via Graph Variational Diffusion Networks [7.335425547621226]
In recent years, compelling evidence has revealed that GNN-based node representation learning can be substantially deteriorated by perturbations in a graph structure.
To learn robust node representation in the presence of perturbations, various works have been proposed to safeguard GNNs.
We propose the Graph Variational Diffusion Network (GVDN), a new node encoder that effectively manipulates Gaussian noise to safeguard robustness on perturbed graphs.
arXiv Detail & Related papers (2023-12-18T03:18:53Z) - Neural Tangent Kernels Motivate Graph Neural Networks with Cross-Covariance Graphs [94.44374472696272]
We investigate neural tangent kernels (NTKs) and alignment in the context of graph neural networks (GNNs).
Our results establish the theoretical guarantees on the optimality of the alignment for a two-layer GNN.
These guarantees are characterized by the graph shift operator being a function of the cross-covariance between the input and the output data.
arXiv Detail & Related papers (2023-10-16T19:54:21Z) - Deep Graph Neural Networks via Flexible Subgraph Aggregation [50.034313206471694]
Graph neural networks (GNNs) can learn from graph-structured data and learn the representation of nodes through aggregating neighborhood information.
In this paper, we evaluate the expressive power of GNNs from the perspective of subgraph aggregation.
We propose a sampling-based node-level residual module (SNR) that can achieve a more flexible utilization of different hops of subgraph aggregation.
arXiv Detail & Related papers (2023-05-09T12:03:42Z) - EvenNet: Ignoring Odd-Hop Neighbors Improves Robustness of Graph Neural Networks [51.42338058718487]
Graph Neural Networks (GNNs) have received extensive research attention for their promising performance in graph machine learning.
Existing approaches, such as GCN and GPRGNN, are not robust in the face of homophily changes on test graphs.
We propose EvenNet, a spectral GNN corresponding to an even-polynomial graph filter.
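The core mechanism of an even-polynomial graph filter, which propagates only over even-hop neighborhoods, can be sketched as follows. The propagation matrix, polynomial order, and weights here are illustrative choices, not EvenNet's exact parameterization:

```python
import numpy as np

def even_poly_filter(A, x, weights):
    """Apply y = sum_k w_k * P^(2k) x, where P is the symmetrically
    normalized adjacency. Only even powers of P are used, so odd-hop
    neighbors contribute nothing to the output."""
    deg = A.sum(axis=1)
    d_inv_sqrt = 1.0 / np.sqrt(np.maximum(deg, 1e-12))
    P = d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]
    P2 = P @ P                  # one application = two-hop propagation
    y = np.zeros_like(x)
    z = x.copy()                # z holds P^(2k) x
    for w in weights:
        y += w * z
        z = P2 @ z
    return y

# Toy graph: a path 0-1-2. Node 1 is an odd-hop (1-hop) neighbor of node 0.
A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
x = np.array([1.0, 0.0, 0.0])   # signal concentrated on node 0
y = even_poly_filter(A, x, weights=[0.5, 0.3, 0.2])
print(y)  # node 1 (odd-hop from the source) receives exactly zero
```

Restricting the filter to even powers is what makes its spectral response symmetric in the eigenvalues, which is the property the paper links to robustness under homophily changes.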
arXiv Detail & Related papers (2022-05-27T10:48:14Z) - Deep Graph-level Anomaly Detection by Glocal Knowledge Distillation [61.39364567221311]
Graph-level anomaly detection (GAD) describes the problem of detecting graphs that are abnormal in their structure and/or the features of their nodes.
One of the challenges in GAD is to devise graph representations that enable the detection of both locally- and globally-anomalous graphs.
We introduce a novel deep anomaly detection approach for GAD that learns rich global and local normal pattern information by joint random distillation of graph and node representations.
arXiv Detail & Related papers (2021-12-19T05:04:53Z) - SAIL: Self-Augmented Graph Contrastive Learning [40.76236706250037]
This paper studies learning node representations with graph neural networks (GNNs) for unsupervised scenarios.
We derive a theoretical analysis and provide an empirical demonstration about the non-steady performance of GNNs over different graph datasets.
arXiv Detail & Related papers (2020-09-02T10:27:30Z) - Stochastic Graph Recurrent Neural Network [6.656993023468793]
We propose SGRNN, a novel neural architecture that applies latent variables to simultaneously capture evolution in node attributes and topology.
Specifically, deterministic states are separated from states in the iterative process to suppress mutual interference.
Experiments on real-world datasets demonstrate the effectiveness of the proposed model.
arXiv Detail & Related papers (2020-09-01T16:14:30Z) - Towards Deeper Graph Neural Networks [63.46470695525957]
Graph convolutions perform neighborhood aggregation and represent one of the most important graph operations.
Several recent studies attribute this performance deterioration to the over-smoothing issue.
We propose Deep Adaptive Graph Neural Network (DAGNN) to adaptively incorporate information from large receptive fields.
arXiv Detail & Related papers (2020-07-18T01:11:14Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.