Quantum Machine Learning for UAV Swarm Intrusion Detection
- URL: http://arxiv.org/abs/2509.01812v1
- Date: Mon, 01 Sep 2025 22:36:30 GMT
- Title: Quantum Machine Learning for UAV Swarm Intrusion Detection
- Authors: Kuan-Cheng Chen, Samuel Yen-Chi Chen, Tai-Yue Li, Chen-Yu Liu, Kin K. Leung
- Abstract summary: Intrusion detection in UAV swarms is complicated by high mobility, non-stationary traffic, and severe class imbalance. We benchmark three quantum-machine-learning (QML) approaches - quantum kernels, variational quantum neural networks (QNNs), and hybrid quantum-trained neural networks (QT-NNs). All models consume an 8-feature flow representation and are evaluated under identical preprocessing, balancing, and noise-model assumptions. Results reveal clear trade-offs: quantum kernels and QT-NNs excel in low-data, nonlinear regimes, while deeper QNNs suffer from trainability issues and CNNs dominate when abundant data offset their larger parameter count.
- Score: 25.52804434998647
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Intrusion detection in unmanned-aerial-vehicle (UAV) swarms is complicated by high mobility, non-stationary traffic, and severe class imbalance. Leveraging a 120 k-flow simulation corpus that covers five attack types, we benchmark three quantum-machine-learning (QML) approaches - quantum kernels, variational quantum neural networks (QNNs), and hybrid quantum-trained neural networks (QT-NNs) - against strong classical baselines. All models consume an 8-feature flow representation and are evaluated under identical preprocessing, balancing, and noise-model assumptions. We analyse the influence of encoding strategy, circuit depth, qubit count, and shot noise, reporting accuracy, macro-F1, ROC-AUC, Matthews correlation, and quantum-resource footprints. Results reveal clear trade-offs: quantum kernels and QT-NNs excel in low-data, nonlinear regimes, while deeper QNNs suffer from trainability issues, and CNNs dominate when abundant data offset their larger parameter count. The complete codebase and dataset partitions are publicly released to enable reproducible QML research in network security.
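The quantum-kernel approach benchmarked above can be illustrated with a minimal sketch. The paper's exact feature map, circuit depth, and noise model are not specified in this summary, so the following assumes a simple per-qubit angle encoding of the 8-feature flow vector and computes the fidelity kernel K(x, y) = |⟨φ(x)|φ(y)⟩|²; the function names (`angle_encode`, `fidelity_kernel`, `kernel_matrix`) are hypothetical, not from the paper's codebase.

```python
import numpy as np

def angle_encode(x):
    """Encode each feature x_i as a single-qubit state RY(x_i)|0>.

    Returns an (n_features, 2) array of per-qubit amplitudes; the full
    register state is the tensor product of the rows.
    """
    x = np.asarray(x, dtype=float)
    return np.column_stack([np.cos(x / 2), np.sin(x / 2)])

def fidelity_kernel(x, y):
    """Quantum kernel K(x, y) = |<phi(x)|phi(y)>|^2.

    With a product-state angle encoding, the register overlap factorizes
    into a product of per-qubit overlaps, so no 2^n statevector is needed:
    <phi(x)|phi(y)> = prod_i cos((x_i - y_i) / 2).
    """
    qx, qy = angle_encode(x), angle_encode(y)
    per_qubit = np.sum(qx * qy, axis=1)  # per-qubit overlap <phi(x_i)|phi(y_i)>
    return float(np.prod(per_qubit) ** 2)

def kernel_matrix(X):
    """Gram matrix for a batch of flow vectors (rows of X)."""
    n = len(X)
    K = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            K[i, j] = fidelity_kernel(X[i], X[j])
    return K

# Toy example: three hypothetical 8-feature flow vectors, scaled to [0, pi]
# as angle encodings usually require.
rng = np.random.default_rng(0)
X = rng.uniform(0, np.pi, size=(3, 8))
K = kernel_matrix(X)
print(np.round(K, 3))
```

The resulting Gram matrix could then be passed to any kernel classifier (e.g. an SVM with a precomputed kernel). Note that this idealized simulation ignores shot noise; under the paper's noise-model assumptions, each kernel entry would instead be estimated from a finite number of circuit measurements.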
Related papers
- Critical Evaluation of Quantum Machine Learning for Adversarial Robustness [1.274988274746616]
We present a systematization of adversarial robustness in Quantum Machine Learning (QML). We implement representative attacks in three threat models - black-box, gray-box, and white-box. Our findings guide the development of secure and resilient QML architectures for practical deployment.
arXiv Detail & Related papers (2025-11-19T00:13:17Z)
- CTRQNets & LQNets: Continuous Time Recurrent and Liquid Quantum Neural Networks [76.53016529061821]
We develop the Liquid Quantum Neural Network (LQNet) and the Continuous Time Recurrent Quantum Neural Network (CTRQNet). LQNet and CTRQNet achieve accuracy increases as high as 40% on binary classification over CIFAR 10.
arXiv Detail & Related papers (2024-08-28T00:56:03Z)
- Coherent Feed Forward Quantum Neural Network [2.1178416840822027]
Quantum machine learning, focusing on quantum neural networks (QNNs), remains a vastly uncharted field of study.
We introduce a bona fide QNN model, which seamlessly aligns with the versatility of a traditional FFNN in terms of its adaptable intermediate layers and nodes.
We test our proposed model on various benchmarking datasets such as the diagnostic breast cancer (Wisconsin) and credit card fraud detection datasets.
arXiv Detail & Related papers (2024-02-01T15:13:26Z)
- How neural networks learn to classify chaotic time series [77.34726150561087]
We study the inner workings of neural networks trained to classify regular-versus-chaotic time series.
We find that the relation between input periodicity and activation periodicity is key for the performance of LKCNN models.
arXiv Detail & Related papers (2023-06-04T08:53:27Z)
- Exploring the Vulnerabilities of Machine Learning and Quantum Machine Learning to Adversarial Attacks using a Malware Dataset: A Comparative Analysis [0.0]
Machine learning (ML) and quantum machine learning (QML) have shown remarkable potential in tackling complex problems.
Their susceptibility to adversarial attacks raises concerns when deploying these systems in security-sensitive applications.
We present a comparative analysis of the vulnerability of ML and QNN models to adversarial attacks using a malware dataset.
arXiv Detail & Related papers (2023-05-31T06:31:42Z)
- Problem-Dependent Power of Quantum Neural Networks on Multi-Class Classification [83.20479832949069]
Quantum neural networks (QNNs) have become an important tool for understanding the physical world, but their advantages and limitations are not fully understood.
Here we investigate the problem-dependent power of quantum classifiers (QCs) on multi-class classification tasks.
Our work sheds light on the problem-dependent power of QNNs and offers a practical tool for evaluating their potential merit.
arXiv Detail & Related papers (2022-12-29T10:46:40Z)
- DeepQMLP: A Scalable Quantum-Classical Hybrid Deep Neural Network Architecture for Classification [6.891238879512672]
Quantum machine learning (QML) is promising for potential speedups and improvements in conventional machine learning (ML) tasks.
We present a scalable quantum-classical hybrid deep neural network (DeepQMLP) architecture inspired by classical deep neural network architectures.
DeepQMLP provides up to 25.3% lower loss and 7.92% higher accuracy during inference under noise than QMLP.
arXiv Detail & Related papers (2022-02-02T15:29:46Z)
- Quantum convolutional neural network for classical data classification [0.8057006406834467]
We benchmark fully parameterized quantum convolutional neural networks (QCNNs) for classical data classification.
We propose a quantum neural network model inspired by CNN that only uses two-qubit interactions throughout the entire algorithm.
arXiv Detail & Related papers (2021-08-02T06:48:34Z)
- A quantum algorithm for training wide and deep classical neural networks [72.2614468437919]
We show that conditions amenable to classical trainability via gradient descent coincide with those necessary for efficiently solving quantum linear systems.
We numerically demonstrate that the MNIST image dataset satisfies such conditions.
We provide empirical evidence for $O(\log n)$ training of a convolutional neural network with pooling.
arXiv Detail & Related papers (2021-07-19T23:41:03Z)
- Branching Quantum Convolutional Neural Networks [0.0]
Small-scale quantum computers are already showing potential gains in learning tasks on large quantum and very large classical data sets.
We present a generalization of QCNN, the branching quantum convolutional neural network, or bQCNN, with substantially higher expressibility.
arXiv Detail & Related papers (2020-12-28T19:00:03Z)
- Decentralizing Feature Extraction with Quantum Convolutional Neural Network for Automatic Speech Recognition [101.69873988328808]
We build upon a quantum convolutional neural network (QCNN) composed of a quantum circuit encoder for feature extraction.
Input speech is first up-streamed to a quantum computing server to extract the Mel-spectrogram.
The corresponding convolutional features are encoded using a quantum circuit algorithm with random parameters.
The encoded features are then down-streamed to the local RNN model for the final recognition.
arXiv Detail & Related papers (2020-10-26T03:36:01Z)
- On the learnability of quantum neural networks [132.1981461292324]
We consider the learnability of the quantum neural network (QNN) built on the variational hybrid quantum-classical scheme.
We show that if a concept can be efficiently learned by a QNN, then it can also be effectively learned by a QNN even in the presence of gate noise.
arXiv Detail & Related papers (2020-07-24T06:34:34Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.