Integrating Graph Neural Networks with Scattering Transform for Anomaly Detection
- URL: http://arxiv.org/abs/2404.10800v3
- Date: Wed, 24 Apr 2024 12:43:30 GMT
- Title: Integrating Graph Neural Networks with Scattering Transform for Anomaly Detection
- Authors: Abdeljalil Zoubir, Badr Missaoui
- Abstract summary: We present two novel methods in Network Intrusion Detection Systems (NIDS) using Graph Neural Networks (GNNs).
The first approach, Scattering Transform with E-GraphSAGE (STEG), utilizes the scattering transform to conduct multi-resolution analysis of edge feature vectors.
The second approach improves node representation by initializing node embeddings with Node2Vec, rather than the standard practice of uniform values.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In this paper, we present two novel methods in Network Intrusion Detection Systems (NIDS) using Graph Neural Networks (GNNs). The first approach, Scattering Transform with E-GraphSAGE (STEG), utilizes the scattering transform to conduct multi-resolution analysis of edge feature vectors. This provides a detailed representation that is essential for identifying subtle anomalies in network traffic. The second approach improves node representation by initializing node embeddings with Node2Vec rather than the standard uniform values, thereby capturing a more accurate and holistic picture of the network. Our methods show significant performance improvements over existing state-of-the-art methods on benchmark NIDS datasets.
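To make the STEG idea more concrete, below is a minimal, illustrative sketch of the multi-resolution step: each edge feature vector is expanded into scattering-style coefficients (averaged moduli of band-pass responses at dyadic scales) before being handed to an edge-aware GNN such as E-GraphSAGE. The filter design, function names, and scale count are assumptions for illustration only; the paper's actual pipeline (which could, e.g., rely on a library such as Kymatio) may differ.

```python
# Minimal, illustrative sketch of the STEG idea (not the authors' code):
# expand each edge feature vector into multi-resolution, scattering-style
# coefficients (averaged moduli of band-pass responses at dyadic scales).
# A full scattering transform adds second-order paths and proper wavelet
# filters; this toy version only shows the principle.
import numpy as np

def haar_bandpass(j):
    """Crude Haar-style band-pass filter at dyadic scale 2**j."""
    half = 2 ** j
    return np.concatenate([np.ones(half), -np.ones(half)]) / (2.0 * half)

def scatter_edge_vector(x, num_scales=3):
    """First-order scattering-style coefficients of a 1-D edge feature vector."""
    coeffs = [x.mean()]                                   # zeroth order: low-pass average
    for j in range(num_scales):
        response = np.abs(np.convolve(x, haar_bandpass(j), mode="same"))
        coeffs.append(response.mean())                    # averaged modulus at scale 2**j
    return np.array(coeffs)

# Example: a small batch of edge feature vectors (e.g., per-flow NetFlow statistics).
rng = np.random.default_rng(0)
edge_features = rng.normal(size=(5, 32))                  # 5 edges, 32 raw features each
enriched = np.stack([scatter_edge_vector(e) for e in edge_features])
print(enriched.shape)                                     # (5, 1 + num_scales)
```

For the second method, the constant node initialization would instead be replaced by random-walk embeddings learned up front, for example with a Node2Vec implementation such as torch_geometric.nn.Node2Vec, before GNN training; the exact setup used in the paper may differ.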
Related papers
- TCCT-Net: Two-Stream Network Architecture for Fast and Efficient Engagement Estimation via Behavioral Feature Signals [58.865901821451295]
We present a novel two-stream feature fusion "Tensor-Convolution and Convolution-Transformer Network" (TCCT-Net) architecture.
To better learn the meaningful patterns in the temporal-spatial domain, we design a "CT" stream that integrates a hybrid convolutional-transformer.
In parallel, to efficiently extract rich patterns from the temporal-frequency domain, we introduce a "TC" stream that uses Continuous Wavelet Transform (CWT) to represent information in a 2D tensor form.
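As a rough illustration of that "TC" stream idea (not the authors' pipeline), the snippet below uses PyWavelets to turn a synthetic 1-D behavioral signal into a 2-D time-frequency tensor; the sampling rate, wavelet choice, and scale range are assumptions.

```python
# Hypothetical sketch: Continuous Wavelet Transform of a 1-D signal into a
# 2-D (scale x time) tensor that a 2-D convolutional stream could consume.
import numpy as np
import pywt

fs = 128                                       # assumed sampling rate (Hz)
t = np.arange(0, 4, 1.0 / fs)                  # 4 s synthetic behavioral signal
signal = np.sin(2 * np.pi * 6 * t) + 0.5 * np.sin(2 * np.pi * 20 * t)

scales = np.arange(1, 65)                      # 64 scales -> 64 frequency rows
coeffs, freqs = pywt.cwt(signal, scales, "morl", sampling_period=1.0 / fs)

tfr = np.abs(coeffs)                           # 2-D tensor of shape (64, 512)
print(tfr.shape)
```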
arXiv Detail & Related papers (2024-04-15T06:01:48Z)
- Ensemble Quadratic Assignment Network for Graph Matching [52.20001802006391]
Graph matching is a commonly used technique in computer vision and pattern recognition.
Recent data-driven approaches have improved the graph matching accuracy remarkably.
We propose a graph neural network (GNN) based approach to combine the advantages of data-driven and traditional methods.
arXiv Detail & Related papers (2024-03-11T06:34:05Z)
- GNN-LoFI: a Novel Graph Neural Network through Localized Feature-based Histogram Intersection [51.608147732998994]
Graph neural networks are increasingly becoming the framework of choice for graph-based machine learning.
We propose a new graph neural network architecture that substitutes classical message passing with an analysis of the local distribution of node features.
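A back-of-the-envelope sketch of that idea, under one reading of the abstract (scalar node features, bin count, and the pairwise comparison are assumptions): describe each node's neighbourhood by a normalised histogram of its neighbours' feature values and compare histograms with the histogram-intersection kernel.

```python
# Toy illustration of localized feature-based histogram intersection:
# neighbourhood feature histograms instead of summed messages.
import numpy as np

def neighbourhood_histogram(features, neighbours, bins=4, value_range=(0.0, 1.0)):
    """Normalised histogram of the (scalar) features of a node's neighbours."""
    hist, _ = np.histogram(features[neighbours], bins=bins, range=value_range)
    return hist / max(hist.sum(), 1)

def histogram_intersection(h1, h2):
    """Histogram-intersection kernel: sum of element-wise minima."""
    return np.minimum(h1, h2).sum()

# Toy graph: 5 nodes, one scalar feature each, adjacency as neighbour lists.
features = np.array([0.1, 0.9, 0.4, 0.5, 0.8])
adjacency = {0: [1, 2], 1: [0, 3], 2: [0, 3, 4], 3: [1, 2], 4: [2]}

h0 = neighbourhood_histogram(features, adjacency[0])
h4 = neighbourhood_histogram(features, adjacency[4])
print(histogram_intersection(h0, h4))          # similarity of the two neighbourhoods
```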
arXiv Detail & Related papers (2024-01-17T13:04:23Z)
- Leveraging Graph Diffusion Models for Network Refinement Tasks [72.54590628084178]
We propose a novel graph generative framework, SGDM, based on subgraph diffusion.
Our framework not only improves the scalability and fidelity of graph diffusion models, but also leverages the reverse process to perform novel, conditional generation tasks.
arXiv Detail & Related papers (2023-11-29T18:02:29Z)
- Graph Neural Network Based Node Deployment for Throughput Enhancement [20.56966053013759]
We propose a novel graph neural network (GNN) method for the network node deployment problem.
We show that an expressive GNN has the capacity to approximate both the function value and the traffic permutation, as a theoretic support for the proposed method.
arXiv Detail & Related papers (2022-08-19T08:06:28Z)
- Interpolation-based Correlation Reduction Network for Semi-Supervised Graph Learning [49.94816548023729]
We propose a novel graph contrastive learning method, termed Interpolation-based Correlation Reduction Network (ICRN).
In our method, we improve the discriminative capability of the latent feature by enlarging the margin of decision boundaries.
By combining the two settings, we extract rich supervision information from both the abundant unlabeled nodes and the rare yet valuable labeled nodes for discriminative representation learning.
arXiv Detail & Related papers (2022-06-06T14:26:34Z)
- Improving the Training of Graph Neural Networks with Consistency Regularization [9.239633445211574]
We investigate how consistency regularization can help improve the performance of graph neural networks.
We combine the consistency regularization methods with two state-of-the-art GNNs and conduct experiments on the ogbn-products dataset.
With the consistency regularization, the performance of state-of-the-art GNNs can be improved by 0.3% on the ogbn-products dataset.
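One common way to implement such a consistency term (the paper may use a different formulation; the toy MLP, `consistency_loss`, and `lambda_c` are placeholders) is to penalise disagreement between two stochastic forward passes on the same unlabeled nodes.

```python
# Hedged sketch of a generic consistency-regularization loss for a model with
# dropout: two stochastic passes on the same inputs should agree.
import torch
import torch.nn.functional as F

def consistency_loss(model, x, lambda_c=1.0):
    model.train()                              # keep dropout active for both passes
    log_p_a = F.log_softmax(model(x), dim=-1)  # first stochastic pass
    p_b = F.softmax(model(x), dim=-1)          # second stochastic pass
    return lambda_c * F.kl_div(log_p_a, p_b, reduction="batchmean")

# Toy stand-in for a GNN's node classifier head.
model = torch.nn.Sequential(
    torch.nn.Linear(16, 32), torch.nn.ReLU(),
    torch.nn.Dropout(0.5), torch.nn.Linear(32, 4),
)
x = torch.randn(8, 16)                         # 8 nodes, 16 features
print(consistency_loss(model, x))              # added to the supervised loss during training
```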
arXiv Detail & Related papers (2021-12-08T14:51:30Z)
- Equivariance-bridged SO(2)-Invariant Representation Learning using Graph Convolutional Network [0.1657441317977376]
Training a Convolutional Neural Network (CNN) to be robust against rotation has mostly been done with data augmentation.
This paper aims to reduce dependence on data augmentation by building structural rotational invariance into the network.
Our method achieves the state-of-the-art image classification performance on rotated MNIST and CIFAR-10 images.
arXiv Detail & Related papers (2021-06-18T08:37:45Z)
- Binarized Graph Neural Network [65.20589262811677]
We develop a binarized graph neural network to learn the binary representations of the nodes with binary network parameters.
Our proposed method can be seamlessly integrated into the existing GNN-based embedding approaches.
Experiments indicate that the proposed binarized graph neural network, namely BGN, is orders of magnitude more efficient in terms of both time and space.
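The core binarization trick can be sketched as follows (a guess at the mechanism from the abstract; the BGN architecture itself is more involved): apply a sign function in the forward pass and a straight-through estimator in the backward pass so that gradients still flow.

```python
# Hedged sketch: binarize representations to {-1, +1} in the forward pass,
# pass gradients through unchanged in the backward pass (straight-through estimator).
import torch

class BinarizeSTE(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x):
        return torch.sign(x)                   # binary values in the forward pass

    @staticmethod
    def backward(ctx, grad_output):
        return grad_output                     # identity gradient (straight-through)

x = torch.randn(4, 8, requires_grad=True)      # e.g., node embeddings from a GNN layer
x_bin = BinarizeSTE.apply(x)
x_bin.sum().backward()
print(x_bin.unique(), x.grad.sum())            # values are +/-1; gradients reach x
```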
arXiv Detail & Related papers (2020-04-19T09:43:14Z)