Research of the Variational Shadow Quantum Circuit Based on the Whale Optimization Algorithm in Image Classification
- URL: http://arxiv.org/abs/2505.09994v1
- Date: Thu, 15 May 2025 06:14:02 GMT
- Title: Research of the Variational Shadow Quantum Circuit Based on the Whale Optimization Algorithm in Image Classification
- Authors: Shuang Wu, Xueliang Song, Yumin Dong, Fanghua Jia
- Abstract summary: This paper proposes an improved Variational Shadow Quantum Circuit (VSQC) model based on the Whale Optimization Algorithm. In this paper, we use VSQC models with different localized shadow circuits to achieve the binary classification task on the MNIST dataset. Our design of strongly entangled shadow circuits performs the best in terms of classification accuracy.
- Score: 5.476164902473674
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: To explore the possibility of cross-fertilization between quantum computing and neural networks, and to improve the classification performance of quantum neural networks, this paper proposes an improved Variational Shadow Quantum Circuit model based on the Whale Optimization Algorithm (VSQC-WOA). In this model, we design a strongly entangled local shadow circuit that achieves an efficient characterization of global features through local shadow feature extraction and a sliding mechanism, providing a rich quantum feature representation for the classification task. Gradients are computed with the parameter-shift method, and the features processed by the shadow circuit are passed to a classical fully connected neural network (FCNN) for processing and classification. The model also introduces the Whale Optimization Algorithm (WOA) to further optimize the weights and biases of the FCNN, which improves the expressive power and classification accuracy of the model. We first use VSQC models with different localized shadow circuits for the binary classification task on the MNIST dataset; our strongly entangled shadow circuit design achieves the best classification accuracy. The VSQC-WOA model is then applied to multiclass classification on the MNIST dataset (three classes as an example), and various comparison experiments verify the effectiveness of the proposed VSQC-WOA model as well as its robustness and generalization ability.
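The WOA step used to tune the FCNN weights can be sketched in a few lines. Below is a minimal, generic implementation of the standard encircling/spiral whale updates applied to an arbitrary loss function; the population size, bounds, and the toy quadratic objective are illustrative choices, not the paper's actual settings.

```python
import numpy as np

rng = np.random.default_rng(0)

def woa_minimize(loss, dim, n_whales=20, iters=200, bound=5.0):
    """Minimal Whale Optimization Algorithm: encircling and spiral updates."""
    X = rng.uniform(-bound, bound, (n_whales, dim))
    best = min(X, key=loss).copy()
    for t in range(iters):
        a = 2.0 * (1 - t / iters)                 # control parameter: 2 -> 0
        for i in range(n_whales):
            r1, r2 = rng.random(dim), rng.random(dim)
            A, C = 2 * a * r1 - a, 2 * r2
            if rng.random() < 0.5:
                if np.all(np.abs(A) < 1):         # exploit: encircle the best whale
                    X[i] = best - A * np.abs(C * best - X[i])
                else:                             # explore: move toward a random whale
                    rand = X[rng.integers(n_whales)]
                    X[i] = rand - A * np.abs(C * rand - X[i])
            else:                                 # logarithmic spiral toward the best
                l = rng.uniform(-1.0, 1.0, dim)
                X[i] = np.abs(best - X[i]) * np.exp(l) * np.cos(2 * np.pi * l) + best
            X[i] = np.clip(X[i], -bound, bound)
            if loss(X[i]) < loss(best):           # best position only ever improves
                best = X[i].copy()
    return best

# Toy stand-in for the FCNN loss: a quadratic with its minimum at w = 1.
best = woa_minimize(lambda w: float(np.sum((w - 1.0) ** 2)), dim=3)
```

In the paper's pipeline this search would run over the FCNN weights and biases, with the classification loss in place of the quadratic.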
Related papers
- A Comprehensively Adaptive Architectural Optimization-Ingrained Quantum Neural Network Model for Cloud Workloads Prediction [4.501295034557007]
This work proposes a novel Comprehensively Adaptive Architectural Optimization-based Variable Quantum Neural Network (CA-QNN). The model converts workload data into qubits, processed through qubit neurons with Controlled-NOT-gated activation functions for intuitive pattern recognition. The proposed model demonstrates superior prediction accuracy, reducing prediction errors by up to 93.40% and 91.27% compared to existing deep learning and QNN-based approaches.
arXiv Detail & Related papers (2025-07-11T05:07:21Z) - Selective Feature Re-Encoded Quantum Convolutional Neural Network with Joint Optimization for Image Classification [3.8876018618878585]
Quantum convolutional neural networks (QCNNs) have demonstrated promising results in classifying both quantum and classical data. This study proposes a novel strategy to enhance feature processing and a QCNN architecture for improved classification accuracy.
arXiv Detail & Related papers (2025-07-02T18:51:56Z) - Single-Qudit Quantum Neural Networks for Multiclass Classification [0.0]
This paper proposes a single-qudit quantum neural network for multiclass classification. Our design employs a $d$-dimensional unitary operator, where $d$ corresponds to the number of classes. We evaluate our model on the MNIST and EMNIST datasets, demonstrating competitive accuracy.
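The mechanism described above, a single $d$-dimensional unitary whose output distribution over basis states selects among $d$ classes, can be sketched with plain numpy. The random unitary and the one-basis-state-per-class readout below are illustrative assumptions, not the trained circuit from the paper:

```python
import numpy as np

rng = np.random.default_rng(42)

def random_unitary(d: int) -> np.ndarray:
    """Random d x d unitary via QR decomposition (stand-in for the trained gate)."""
    z = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    q, r = np.linalg.qr(z)
    return q * (np.diag(r) / np.abs(np.diag(r)))  # normalize column phases

d = 3                                # one qudit level per class (assumption)
U = random_unitary(d)
state = U @ np.eye(d)[0]             # act on the qudit ground state |0>
probs = np.abs(state) ** 2           # Born-rule probabilities over the d levels
predicted_class = int(np.argmax(probs))
```

In a trained model, `U` would be a parameterized unitary optimized so that the most probable measurement outcome matches the input's label.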
arXiv Detail & Related papers (2025-03-12T11:12:05Z) - Lean classical-quantum hybrid neural network model for image classification [12.353900068459446]
We introduce a Lean Classical-Quantum Hybrid Neural Network (LCQHNN), which achieves efficient classification performance with only four layers of variational circuits. Our experiments demonstrate that LCQHNN achieves 100%, 99.02%, and 85.55% classification accuracy on MNIST, FashionMNIST, and CIFAR-10 datasets.
arXiv Detail & Related papers (2024-12-03T00:37:11Z) - Pointer Networks with Q-Learning for Combinatorial Optimization [55.2480439325792]
We introduce the Pointer Q-Network (PQN), a hybrid neural architecture that integrates model-free Q-value policy approximation with Pointer Networks (Ptr-Nets).
Our empirical results demonstrate the efficacy of this approach, also testing the model in unstable environments.
arXiv Detail & Related papers (2023-11-05T12:03:58Z) - Tensor Ring Optimized Quantum-Enhanced Tensor Neural Networks [32.76948546010625]
Quantum machine learning researchers often rely on incorporating Tensor Networks (TN) into Deep Neural Networks (DNN).
To address this issue, a multi-layer design of a Tensor Ring optimized variational Quantum learning classifier (Quan-TR) is proposed.
It is referred to as Tensor Ring optimized Quantum-enhanced tensor neural Networks (TR-QNet).
On quantum simulations, the proposed TR-QNet achieves promising accuracies of 94.5%, 86.16%, and 83.54% on the Iris, MNIST, and CIFAR-10 datasets, respectively.
arXiv Detail & Related papers (2023-10-02T18:07:10Z) - Development of a Novel Quantum Pre-processing Filter to Improve Image Classification Accuracy of Neural Network Models [1.2965700352825555]
This paper proposes a novel quantum pre-processing filter (QPF) to improve the image classification accuracy of neural network (NN) models.
The results show that the image classification accuracy based on the MNIST (handwritten 10 digits) and the EMNIST (handwritten 47 class digits and letters) datasets can be improved.
However, tests performed on the developed QPF approach against a relatively complex GTSRB dataset with 43 distinct class real-life traffic sign images showed a degradation in the classification accuracy.
arXiv Detail & Related papers (2023-08-22T01:27:04Z) - SA-CNN: Application to text categorization issues using simulated annealing-based convolutional neural network optimization [0.0]
Convolutional neural networks (CNNs) are a representative class of deep learning algorithms.
We introduce SA-CNN neural networks for text classification tasks based on Text-CNN neural networks.
arXiv Detail & Related papers (2023-03-13T14:27:34Z) - Vertical Layering of Quantized Neural Networks for Heterogeneous Inference [57.42762335081385]
We study a new vertical-layered representation of neural network weights for encapsulating all quantized models into a single one.
We can theoretically achieve any precision network for on-demand service while only needing to train and maintain one model.
arXiv Detail & Related papers (2022-12-10T15:57:38Z) - Towards Theoretically Inspired Neural Initialization Optimization [66.04735385415427]
We propose a differentiable quantity, named GradCosine, with theoretical insights to evaluate the initial state of a neural network.
We show that both the training and test performance of a network can be improved by maximizing GradCosine under norm constraint.
Generalized from the sample-wise analysis to the real batch setting, the resulting Neural Initialization Optimization (NIO) algorithm automatically finds a better initialization with negligible cost.
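As a rough illustration, GradCosine can be read as a cosine-alignment statistic over per-sample gradients; the paper's exact definition may differ, so the mean pairwise form below is an assumption for the sketch:

```python
import numpy as np

def grad_cosine(sample_grads: np.ndarray) -> float:
    """Mean pairwise cosine similarity between per-sample gradient vectors."""
    g = sample_grads / np.linalg.norm(sample_grads, axis=1, keepdims=True)
    n = len(g)
    sims = g @ g.T                                  # all pairwise cosine similarities
    return float((sims.sum() - n) / (n * (n - 1)))  # drop the n self-similarities

# Well-aligned per-sample gradients score close to 1; conflicting ones score lower.
aligned = grad_cosine(np.array([[1.0, 0.0], [1.0, 0.1]]))
orthogonal = grad_cosine(np.array([[1.0, 0.0], [0.0, 1.0]]))
```

Maximizing such a statistic over the initialization (under a norm constraint) favors starting points where per-sample gradients agree.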
arXiv Detail & Related papers (2022-10-12T06:49:16Z) - Efficient Quantum Feature Extraction for CNN-based Learning [5.236201168829204]
We propose a quantum-classical deep network structure to enhance classical CNN model discriminability.
We build a parameterized quantum circuit (PQC), a more potent function approximator, with more complex structures to capture the features within the receptive field.
The results disclose that the model with ansatz in high expressibility achieves lower cost and higher accuracy.
arXiv Detail & Related papers (2022-01-04T17:04:07Z) - Optimising for Interpretability: Convolutional Dynamic Alignment Networks [108.83345790813445]
We introduce a new family of neural network models called Convolutional Dynamic Alignment Networks (CoDA Nets).
Their core building blocks are Dynamic Alignment Units (DAUs), which are optimised to transform their inputs with dynamically computed weight vectors that align with task-relevant patterns.
CoDA Nets model the classification prediction through a series of input-dependent linear transformations, allowing for linear decomposition of the output into individual input contributions.
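The decomposition property described here, an input-dependent weight matrix W(x) whose product with x splits exactly into per-input contributions, can be checked on a toy dynamic-linear layer (a stand-in for a real stack of DAUs, with made-up parameters):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(4, 3, 3))       # made-up parameters of the dynamic weight map

def dynamic_weights(x: np.ndarray) -> np.ndarray:
    """Toy input-dependent weight matrix W(x): 4 outputs, 3 inputs."""
    return np.tanh(A @ x)

x = np.array([0.5, -1.0, 2.0])
W = dynamic_weights(x)               # weights are computed from the input itself
y = W @ x                            # the prediction is linear in x given W(x)
contributions = W * x                # contribution of each input x_i to each output
# contributions.sum(axis=1) recovers y exactly: the decomposition is lossless.
```

Because the output is exactly linear in x once W(x) is fixed, each column of `contributions` attributes a share of every output to one input dimension, which is the source of CoDA Nets' interpretability.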
arXiv Detail & Related papers (2021-09-27T12:39:46Z) - On Circuit-based Hybrid Quantum Neural Networks for Remote Sensing Imagery Classification [88.31717434938338]
The hybrid QCNNs enrich the classical architecture of CNNs by introducing a quantum layer within a standard neural network.
The novel QCNN proposed in this work is applied to the Land Use and Land Cover (LULC) classification, chosen as an Earth Observation (EO) use case.
The results of the multiclass classification prove the effectiveness of the presented approach, demonstrating that the QCNN's performance exceeds that of its classical counterparts.
arXiv Detail & Related papers (2021-09-20T12:41:50Z) - Quantum Machine Learning with SQUID [64.53556573827525]
We present the Scaled QUantum IDentifier (SQUID), an open-source framework for exploring hybrid Quantum-Classical algorithms for classification problems.
We provide examples of using SQUID in a standard binary classification problem from the popular MNIST dataset.
arXiv Detail & Related papers (2021-04-30T21:34:11Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences.