Bayesian Hyperparameter Optimization for Deep Neural Network-Based
Network Intrusion Detection
- URL: http://arxiv.org/abs/2207.09902v1
- Date: Thu, 7 Jul 2022 20:08:38 GMT
- Title: Bayesian Hyperparameter Optimization for Deep Neural Network-Based
Network Intrusion Detection
- Authors: Mohammad Masum, Hossain Shahriar, Hisham Haddad, Md Jobair Hossain
Faruk, Maria Valero, Md Abdullah Khan, Mohammad A. Rahman, Muhaiminul I.
Adnan, Alfredo Cuzzocrea
- Abstract summary: Deep neural networks (DNN) have been successfully applied for intrusion detection problems.
This paper proposes a novel Bayesian optimization-based framework for the automatic optimization of hyperparameters.
We show that the proposed framework demonstrates significantly higher intrusion detection performance than the random search optimization-based approach.
- Score: 2.304713283039168
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Traditional network intrusion detection approaches encounter feasibility and
sustainability issues to combat modern, sophisticated, and unpredictable
security attacks. Deep neural networks (DNN) have been successfully applied for
intrusion detection problems. The optimal use of DNN-based classifiers requires
careful tuning of the hyperparameters. Manually tuning the hyperparameters is
tedious, time-consuming, and computationally expensive. Hence, there is a need
for an automatic technique to find optimal hyperparameters for the best use of
DNN in intrusion detection. This paper proposes a novel Bayesian
optimization-based framework for the automatic optimization of hyperparameters,
ensuring the best DNN architecture. We evaluated the performance of the
proposed framework on NSL-KDD, a benchmark dataset for network intrusion
detection. The experimental results show the framework's effectiveness as the
resultant DNN architecture demonstrates significantly higher intrusion
detection performance than the random search optimization-based approach in
terms of accuracy, precision, recall, and F1-score.
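To make the approach concrete, below is a minimal, hedged sketch of Bayesian hyperparameter optimization for a DNN-style intrusion detector. It is not the authors' implementation: it uses scikit-optimize's Gaussian-process search over a scikit-learn MLP, synthetic data stands in for the preprocessed NSL-KDD features, and the search space (depth, width, learning rate, L2 penalty) is illustrative.

```python
# Illustrative sketch only: Gaussian-process Bayesian optimization of a small
# DNN's hyperparameters. Synthetic data replaces the NSL-KDD preprocessing.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier
from skopt import gp_minimize
from skopt.space import Integer, Real
from skopt.utils import use_named_args

# Stand-in for the preprocessed NSL-KDD feature matrix (41 features) and labels.
X, y = make_classification(n_samples=2000, n_features=41, n_informative=20,
                           random_state=0)

# Hypothetical search space: network depth, width, learning rate, L2 penalty.
space = [
    Integer(1, 3, name="n_layers"),
    Integer(32, 256, name="n_units"),
    Real(1e-4, 1e-1, prior="log-uniform", name="lr"),
    Real(1e-6, 1e-2, prior="log-uniform", name="alpha"),
]

@use_named_args(space)
def objective(n_layers, n_units, lr, alpha):
    model = MLPClassifier(hidden_layer_sizes=(n_units,) * n_layers,
                          learning_rate_init=lr, alpha=alpha,
                          max_iter=200, random_state=0)
    # Negate the cross-validated F1-score because gp_minimize minimizes.
    return -cross_val_score(model, X, y, cv=3, scoring="f1").mean()

result = gp_minimize(objective, space, n_calls=30, random_state=0)
print("best F1-score:", -result.fun)
print("best hyperparameters:", result.x)
```

The Gaussian-process surrogate and its acquisition function choose which hyperparameter configuration to evaluate next, which is what lets Bayesian optimization use a fixed evaluation budget more efficiently than the random search baseline the abstract compares against.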
Related papers
- Enhanced Convolution Neural Network with Optimized Pooling and Hyperparameter Tuning for Network Intrusion Detection [0.0]
We propose an Enhanced Convolutional Neural Network (EnCNN) for Network Intrusion Detection Systems (NIDS).
We compare EnCNN with various machine learning algorithms, including Logistic Regression, Decision Trees, Support Vector Machines (SVM), and ensemble methods like Random Forest, AdaBoost, and Voting Ensemble.
The results show that EnCNN significantly improves detection accuracy, with a notable 10% increase over state-of-the-art approaches.
arXiv Detail & Related papers (2024-09-27T11:20:20Z)
- Parallel Hyperparameter Optimization Of Spiking Neural Network [0.5371337604556311]
Spiking Neural Networks (SNNs) are based on a more biologically inspired approach than conventional artificial neural networks.
We tackle the signal loss issue of SNNs that leads to what we call silent networks.
By defining an early stopping criterion, we were able to instantiate larger and more flexible search spaces.
arXiv Detail & Related papers (2024-03-01T11:11:59Z)
- Quantifying uncertainty for deep learning based forecasting and flow-reconstruction using neural architecture search ensembles [0.8258451067861933]
We present an automated approach to deep neural network (DNN) discovery and demonstrate how this may also be utilized for ensemble-based uncertainty quantification.
We highlight how the proposed method not only discovers high-performing neural network ensembles for our tasks, but also quantifies uncertainty seamlessly.
We demonstrate the feasibility of this framework for two tasks, forecasting from historical data and flow reconstruction from sparse sensors, on sea-surface temperature data.
arXiv Detail & Related papers (2023-02-20T03:57:06Z)
- Auto-PINN: Understanding and Optimizing Physics-Informed Neural Architecture [77.59766598165551]
Physics-informed neural networks (PINNs) are revolutionizing science and engineering practice by bringing the power of deep learning to bear on scientific computation.
Here, we propose Auto-PINN, which applies Neural Architecture Search (NAS) techniques to PINN design.
A comprehensive set of pre-experiments using standard PDE benchmarks allows us to probe the structure-performance relationship in PINNs.
arXiv Detail & Related papers (2022-05-27T03:24:31Z)
- Comparative Analysis of Interval Reachability for Robust Implicit and Feedforward Neural Networks [64.23331120621118]
We use interval reachability analysis to obtain robustness guarantees for implicit neural networks (INNs).
INNs are a class of implicit learning models that use implicit equations as layers.
We show that our approach performs at least as well as, and generally better than, applying state-of-the-art interval bound propagation methods to INNs.
arXiv Detail & Related papers (2022-04-01T03:31:27Z)
- Adaptive Anomaly Detection for Internet of Things in Hierarchical Edge Computing: A Contextual-Bandit Approach [81.5261621619557]
We propose an adaptive anomaly detection scheme with hierarchical edge computing (HEC).
We first construct multiple anomaly detection DNN models with increasing complexity, and associate each of them with a corresponding HEC layer.
Then, we design an adaptive model selection scheme that is formulated as a contextual-bandit problem and solved by using a reinforcement learning policy network.
arXiv Detail & Related papers (2021-08-09T08:45:47Z)
- Delta-STN: Efficient Bilevel Optimization for Neural Networks using Structured Response Jacobians [5.33024001730262]
Self-Tuning Networks (STNs) have recently gained traction due to their ability to amortize the optimization of the inner objective.
We propose the $\Delta$-STN, an improved hypernetwork architecture which stabilizes training.
arXiv Detail & Related papers (2020-10-26T12:12:23Z)
- Bayesian Optimization with Machine Learning Algorithms Towards Anomaly Detection [66.05992706105224]
In this paper, an effective anomaly detection framework is proposed utilizing the Bayesian Optimization technique.
The performance of the considered algorithms is evaluated using the ISCX 2012 dataset.
Experimental results show the effectiveness of the proposed framework in terms of accuracy, precision, recall, and low false-alarm rate.
arXiv Detail & Related papers (2020-08-05T19:29:35Z)
- An Asymptotically Optimal Multi-Armed Bandit Algorithm and Hyperparameter Optimization [48.5614138038673]
We propose an efficient and robust bandit-based algorithm called Sub-Sampling (SS) in the scenario of hyperparameter search evaluation.
We also develop a novel hyperparameter optimization algorithm called BOSS.
Empirical studies validate our theoretical arguments of SS and demonstrate the superior performance of BOSS on a number of applications.
arXiv Detail & Related papers (2020-07-11T03:15:21Z)
- GraN: An Efficient Gradient-Norm Based Detector for Adversarial and Misclassified Examples [77.99182201815763]
Deep neural networks (DNNs) are vulnerable to adversarial examples and other data perturbations.
GraN is a time- and parameter-efficient method that is easily adaptable to any DNN.
GraN achieves state-of-the-art performance on numerous problem set-ups.
arXiv Detail & Related papers (2020-04-20T10:09:27Z)
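As a rough illustration of the gradient-norm idea behind the GraN entry above, the sketch below scores an input by the norm of the loss gradient evaluated at the model's own predicted label; larger norms tend to accompany adversarial or misclassified inputs. This is a generic PyTorch sketch under that assumption, not the GraN procedure itself, which the abstract does not fully specify.

```python
# Generic gradient-norm scoring sketch (not the GraN method itself).
import torch
import torch.nn.functional as F

def gradient_norm_score(model, x):
    # Score one input by the parameter-gradient norm of the loss taken at
    # the model's own predicted (pseudo) label.
    logits = model(x.unsqueeze(0))            # add a batch dimension
    pseudo_label = logits.argmax(dim=1)       # the model's own prediction
    loss = F.cross_entropy(logits, pseudo_label)
    grads = torch.autograd.grad(loss, list(model.parameters()))
    return torch.sqrt(sum(g.pow(2).sum() for g in grads)).item()

# Tiny stand-in classifier and input, purely for illustration.
model = torch.nn.Sequential(torch.nn.Flatten(), torch.nn.Linear(28 * 28, 10))
print(gradient_norm_score(model, torch.randn(1, 28, 28)))
```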
This list is automatically generated from the titles and abstracts of the papers on this site.