Encrypted Internet traffic classification using a supervised Spiking
Neural Network
- URL: http://arxiv.org/abs/2101.09818v1
- Date: Sun, 24 Jan 2021 22:46:08 GMT
- Title: Encrypted Internet traffic classification using a supervised Spiking
Neural Network
- Authors: Ali Rasteh, Florian Delpech, Carlos Aguilar-Melchor, Romain Zimmer,
Saeed Bagheri Shouraki and Timothée Masquelier
- Abstract summary: This paper uses machine learning techniques for encrypted traffic classification, looking only at packet size and time of arrival.
Spiking neural networks (SNNs) are inspired by how biological neurons operate.
Surprisingly, a simple SNN reached an accuracy of 95.9% on ISCX datasets, outperforming previous approaches.
- Score: 2.8544513613730205
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Internet traffic recognition is an essential tool for access providers since
recognizing traffic categories related to different data packets transmitted on
a network helps them define adapted priorities. That means, for instance, high
priority requirements for an audio conference and low ones for a file transfer,
to enhance user experience. As internet traffic becomes increasingly encrypted,
the mainstream classic traffic recognition technique, payload inspection, is
rendered ineffective. This paper uses machine learning techniques for encrypted
traffic classification, looking only at packet size and time of arrival.
Spiking neural networks (SNN), largely inspired by how biological neurons
operate, were used for two reasons. Firstly, they are able to recognize
time-related data packet features. Secondly, they can be implemented
efficiently on neuromorphic hardware with a low energy footprint. Here we used
a very simple feedforward SNN, with only one fully-connected hidden layer, and
trained in a supervised manner using the newly introduced method known as
Surrogate Gradient Learning. Surprisingly, such a simple SNN reached an
accuracy of 95.9% on ISCX datasets, outperforming previous approaches. Besides
better accuracy, there is also a very significant improvement in simplicity:
input size, number of neurons, and trainable parameters are all reduced by one to
four orders of magnitude. Next, we analyzed the reasons for this good accuracy.
It turns out that, beyond spatial (i.e. packet size) features, the SNN also
exploits temporal ones, mostly the nearly synchronous (within a 200ms range)
arrival times of packets with certain sizes. Taken together, these results show
that SNNs are an excellent fit for encrypted internet traffic classification:
they can be more accurate than conventional artificial neural networks (ANN),
and they could be implemented efficiently on low power embedded systems.
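
The architecture described in the abstract (a feedforward SNN with one fully-connected hidden layer of leaky integrate-and-fire neurons, trained with Surrogate Gradient Learning) can be sketched as follows. This is a minimal illustration in PyTorch, not the authors' code: the layer sizes, threshold, leak factor, and the fast-sigmoid surrogate are assumptions for the sake of the example, and input spike trains would in practice be derived from packet sizes and arrival times.

```python
import torch

class SpikeFn(torch.autograd.Function):
    """Heaviside spike nonlinearity with a surrogate gradient.
    Forward: hard threshold. Backward: derivative of a fast sigmoid,
    which is the trick that makes the non-differentiable spike trainable."""
    @staticmethod
    def forward(ctx, v):
        ctx.save_for_backward(v)
        return (v > 0).float()

    @staticmethod
    def backward(ctx, grad_out):
        (v,) = ctx.saved_tensors
        return grad_out / (1.0 + v.abs()) ** 2  # surrogate: 1 / (1 + |v|)^2

def run_snn(x, w_in, w_out, beta=0.9, threshold=1.0):
    """Feedforward SNN with one fully-connected hidden LIF layer.
    x: (time, batch, n_in) binary spike trains.
    Returns time-averaged readout activations, shape (batch, n_out)."""
    T, B, _ = x.shape
    v = torch.zeros(B, w_in.shape[1])     # hidden membrane potentials
    out = torch.zeros(B, w_out.shape[1])  # accumulated readout
    for t in range(T):
        v = beta * v + x[t] @ w_in        # leaky integration of input current
        s = SpikeFn.apply(v - threshold)  # spike where v crosses the threshold
        v = v * (1 - s)                   # reset neurons that fired
        out = out + s @ w_out             # sum hidden spikes into the readout
    return out / T
```

Because `SpikeFn` substitutes a smooth derivative in the backward pass, the whole loop can be trained end-to-end with an ordinary optimizer and cross-entropy loss on the averaged readout.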
Related papers
- Adaptive Spiking Neural Networks with Hybrid Coding [0.0]
Spiking Neural Networks (SNNs) are more energy-efficient and effective neural networks compared to Artificial Neural Networks (ANNs).
Traditional SNNs utilize the same neurons when processing input data across different time steps, limiting their ability to integrate and utilize temporal information effectively.
This paper introduces a hybrid encoding approach that not only reduces the required time steps for training but also continues to improve the overall network performance.
arXiv Detail & Related papers (2024-08-22T13:58:35Z) - How neural networks learn to classify chaotic time series [77.34726150561087]
We study the inner workings of neural networks trained to classify regular-versus-chaotic time series.
We find that the relation between input periodicity and activation periodicity is key for the performance of LKCNN models.
arXiv Detail & Related papers (2023-06-04T08:53:27Z) - Efficient Federated Learning with Spike Neural Networks for Traffic Sign
Recognition [70.306089187104]
We introduce powerful Spike Neural Networks (SNNs) into traffic sign recognition for energy-efficient and fast model training.
Numerical results indicate that the proposed federated SNN outperforms traditional federated convolutional neural networks in terms of accuracy, noise immunity, and energy efficiency as well.
arXiv Detail & Related papers (2022-05-28T03:11:48Z) - Object Detection with Spiking Neural Networks on Automotive Event Data [0.0]
We propose to train spiking neural networks (SNNs) directly on data coming from event cameras to design fast and efficient automotive embedded applications.
In this paper, we conducted experiments on two automotive event datasets, establishing new state-of-the-art classification results for spiking neural networks.
arXiv Detail & Related papers (2022-05-09T14:39:47Z) - Mining the Weights Knowledge for Optimizing Neural Network Structures [1.995792341399967]
We introduce a switcher neural network (SNN) that uses as inputs the weights of a task-specific neural network (called TNN for short)
By mining the knowledge contained in the weights, the SNN outputs scaling factors for turning off neurons in the TNN.
In terms of accuracy, we outperform baseline networks and other structure learning methods stably and significantly.
arXiv Detail & Related papers (2021-10-11T05:20:56Z) - The Devil Is in the Details: An Efficient Convolutional Neural Network
for Transport Mode Detection [3.008051369744002]
Transport mode detection is a classification problem aiming to design an algorithm that can infer the transport mode of a user given multimodal signals.
We show that a small, optimized model can perform as well as a current deep model.
arXiv Detail & Related papers (2021-09-16T08:05:47Z) - Quantized Neural Networks via {-1, +1} Encoding Decomposition and
Acceleration [83.84684675841167]
We propose a novel encoding scheme using -1, +1 to decompose quantized neural networks (QNNs) into multi-branch binary networks.
We validate the effectiveness of our method on large-scale image classification, object detection, and semantic segmentation tasks.
arXiv Detail & Related papers (2021-06-18T03:11:15Z) - Binary Graph Neural Networks [69.51765073772226]
Graph Neural Networks (GNNs) have emerged as a powerful and flexible framework for representation learning on irregular data.
In this paper, we present and evaluate different strategies for the binarization of graph neural networks.
We show that through careful design of the models, and control of the training process, binary graph neural networks can be trained at only a moderate cost in accuracy on challenging benchmarks.
arXiv Detail & Related papers (2020-12-31T18:48:58Z) - Going Deeper With Directly-Trained Larger Spiking Neural Networks [20.40894876501739]
Spiking neural networks (SNNs) are promising for bio-plausible coding of spatio-temporal information and event-driven signal processing.
However, the unique working mode of SNNs makes them more difficult to train than traditional networks.
We propose a threshold-dependent batch normalization (tdBN) method based on the emerging spatio-temporal backpropagation (STBP).
arXiv Detail & Related papers (2020-10-29T07:15:52Z) - Progressive Tandem Learning for Pattern Recognition with Deep Spiking
Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) for low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
arXiv Detail & Related papers (2020-07-02T15:38:44Z) - You Only Spike Once: Improving Energy-Efficient Neuromorphic Inference
to ANN-Level Accuracy [51.861168222799186]
Spiking Neural Networks (SNNs) are a type of neuromorphic, or brain-inspired network.
SNNs are sparse, accessing very few weights, and typically only use addition operations instead of the more power-intensive multiply-and-accumulate operations.
In this work, we aim to overcome the limitations of TTFS-encoded neuromorphic systems.
arXiv Detail & Related papers (2020-06-03T15:55:53Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.