Hybrid Quantum-Classical Feature Extraction approach for Image Classification using Autoencoders and Quantum SVMs
- URL: http://arxiv.org/abs/2410.18814v1
- Date: Thu, 24 Oct 2024 15:02:05 GMT
- Title: Hybrid Quantum-Classical Feature Extraction approach for Image Classification using Autoencoders and Quantum SVMs
- Authors: Donovan Slabbert, Francesco Petruccione
- Abstract summary: NISQ-era quantum computers have limitations, which include noise, scalability, read-in and read-out times, and gate operation times.
Strategies should therefore be devised to mitigate the impact that complex datasets can have on the overall efficiency of a quantum machine learning pipeline.
We apply a classical feature extraction method using a ResNet10-inspired convolutional autoencoder to both reduce the dimensionality of the dataset and extract meaningful features.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In order to leverage quantum computers for machine learning tasks such as image classification, careful consideration is required: NISQ-era quantum computers have limitations, which include noise, scalability, read-in and read-out times, and gate operation times. Therefore, strategies should be devised to mitigate the impact that complex datasets can have on the overall efficiency of a quantum machine learning pipeline, which may otherwise lead to excessive resource demands or increased noise. We apply a classical feature extraction method using a ResNet10-inspired convolutional autoencoder to both reduce the dimensionality of the dataset and extract abstract and meaningful features before feeding them into a quantum machine learning block. The quantum block of choice is a quantum-enhanced support vector machine (QSVM), since support vector machines typically do not require large sample sizes to identify patterns in data and their quantum circuits are short in depth, which limits the impact of noise. The autoencoder is trained to extract meaningful features through image reconstruction, aiming to minimize the mean squared error across a training set. Three image datasets are used to illustrate the pipeline: HTRU-1, MNIST, and CIFAR-10. We also include a quantum-enhanced one-class support vector machine (QOCSVM) for the highly unbalanced HTRU-1 set, as well as classical machine learning results to serve as a benchmark. Finally, the HTRU-2 dataset is also included as a benchmark for a dataset with well-correlated features. The autoencoder achieved near-perfect reconstruction and high classification accuracy for MNIST, while CIFAR-10 showed poorer performance due to image complexity, and HTRU-1 struggled because of dataset imbalance. This highlights the need for a balance between dimensionality reduction through classical feature extraction and prediction performance using quantum methods.
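To make the pipeline concrete, below is a minimal sketch (not the authors' code) of the two stages described in the abstract: a small convolutional autoencoder, trained to minimize mean squared reconstruction error, compresses each image to a low-dimensional latent vector, and a quantum-kernel support vector machine classifies those latents. The ConvAutoencoder architecture, the 4-dimensional latent space, and the ZZFeatureMap kernel are illustrative assumptions; the paper uses a ResNet10-inspired autoencoder and does not prescribe these particular settings.

```python
# Hedged sketch of an autoencoder + QSVM pipeline; layer sizes, latent
# dimension, and feature map are assumptions, not the paper's configuration.
import torch
import torch.nn as nn
from qiskit.circuit.library import ZZFeatureMap
from qiskit_machine_learning.kernels import FidelityQuantumKernel
from qiskit_machine_learning.algorithms import QSVC

LATENT_DIM = 4  # assumption: one qubit per latent feature


class ConvAutoencoder(nn.Module):
    """Toy convolutional autoencoder for 28x28 grayscale images (e.g. MNIST)."""

    def __init__(self, latent_dim: int = LATENT_DIM):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 8, 3, stride=2, padding=1), nn.ReLU(),   # 28x28 -> 14x14
            nn.Conv2d(8, 16, 3, stride=2, padding=1), nn.ReLU(),  # 14x14 -> 7x7
            nn.Flatten(),
            nn.Linear(16 * 7 * 7, latent_dim),
        )
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 16 * 7 * 7), nn.ReLU(),
            nn.Unflatten(1, (16, 7, 7)),
            nn.ConvTranspose2d(16, 8, 3, stride=2, padding=1, output_padding=1), nn.ReLU(),
            nn.ConvTranspose2d(8, 1, 3, stride=2, padding=1, output_padding=1), nn.Sigmoid(),
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))


def train_autoencoder(model, loader, epochs=5, lr=1e-3):
    """Train by minimizing mean squared reconstruction error, as in the paper."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        for images, _ in loader:  # labels are unused during reconstruction
            opt.zero_grad()
            loss_fn(model(images), images).backward()
            opt.step()
    return model


def fit_qsvm(latents, labels):
    """Classify autoencoder latents (numpy arrays) with a quantum-kernel SVM."""
    feature_map = ZZFeatureMap(feature_dimension=LATENT_DIM, reps=2)
    kernel = FidelityQuantumKernel(feature_map=feature_map)
    qsvc = QSVC(quantum_kernel=kernel)
    qsvc.fit(latents, labels)
    return qsvc
```

With a qubit-per-feature encoding like the one sketched here, the autoencoder's bottleneck size directly controls the width of the quantum circuit, which is the trade-off between dimensionality reduction and prediction performance that the abstract highlights.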
Related papers
- An Efficient Quantum Classifier Based on Hamiltonian Representations (2025-04-13)
  Quantum machine learning (QML) is a discipline that seeks to transfer the advantages of quantum computing to data-driven tasks.
  We propose an efficient approach that circumvents the costs associated with data encoding by mapping inputs to a finite set of Pauli strings.
  We evaluate our approach on text and image classification tasks against well-established classical and quantum models.
- HQViT: Hybrid Quantum Vision Transformer for Image Classification (2025-04-03)
  We propose a Hybrid Quantum Vision Transformer (HQViT) to accelerate model training while enhancing model performance.
  HQViT introduces whole-image processing with amplitude encoding to better preserve global image information without additional positional encoding.
  Experiments across various computer vision datasets demonstrate that HQViT outperforms existing models, achieving an improvement of up to 10.9% (on the MNIST 10-class task) over the state of the art.
- EnQode: Fast Amplitude Embedding for Quantum Machine Learning Using Classical Data (2025-03-18)
  Amplitude embedding (AE) is essential in quantum machine learning (QML) for encoding classical data onto quantum circuits.
  We introduce EnQode, a fast AE technique based on symbolic representation that addresses AE's limitations by clustering dataset samples.
  With over 90% fidelity in data mapping, EnQode enables robust, high-performance QML on noisy intermediate-scale quantum (NISQ) devices.
- Extending Quantum Perceptrons: Rydberg Devices, Multi-Class Classification, and Error Tolerance (2024-11-13)
  Quantum Neuromorphic Computing (QNC) merges quantum computation with neural computation to create scalable, noise-resilient algorithms for quantum machine learning (QML).
  At the core of QNC is the quantum perceptron (QP), which leverages the analog dynamics of interacting qubits to enable universal quantum computation.
- Quantum Transfer Learning for MNIST Classification Using a Hybrid Quantum-Classical Approach (2024-08-05)
  This research explores the integration of quantum computing with classical machine learning for image classification tasks.
  We propose a hybrid quantum-classical approach that leverages the strengths of both paradigms.
  The experimental results indicate that while the hybrid model demonstrates the feasibility of integrating quantum computing with classical techniques, the accuracy of the final model, trained on quantum outcomes, is currently lower than that of the classical model trained on compressed features.
- Supervised binary classification of small-scale digits images with a trapped-ion quantum processor (2024-06-17)
  We show that a quantum processor can correctly solve the basic classification task considered.
  As the capabilities of quantum processors increase, they can become a useful tool for machine learning.
- Quantum Transfer Learning with Adversarial Robustness for Classification of High-Resolution Image Datasets (2024-01-30)
  We propose a quantum transfer learning architecture that integrates quantum variational circuits with a classical machine learning network pre-trained on the ImageNet dataset.
  We demonstrate the superior performance of our QTL approach over classical and quantum machine learning approaches that do not involve transfer learning.
- Quantum support vector data description for anomaly detection (2023-10-10)
  Anomaly detection is a critical problem in data analysis and pattern recognition, finding applications in various domains.
  We introduce quantum support vector data description (QSVDD), an unsupervised learning algorithm designed for anomaly detection.
- Quantum support vector machines for classification and regression on a trapped-ion quantum computer (2023-07-05)
  We examine our quantum machine learning models, which are based on quantum support vector classification (QSVC) and quantum support vector regression (QSVR).
  We investigate these models using a quantum-circuit simulator, both with and without noise, as well as the IonQ Harmony quantum processor.
  For the classification tasks, the performance of our QSVC models using 4 qubits of the trapped-ion quantum computer was comparable to that obtained from noiseless quantum-circuit simulations.
- A didactic approach to quantum machine learning with a single qubit (2022-11-23)
  We focus on the case of learning with a single qubit, using data re-uploading techniques.
  We implement the different proposed formulations on toy and real-world datasets using the Qiskit quantum computing SDK.
- QSAN: A Near-term Achievable Quantum Self-Attention Network (2022-07-14)
  The Self-Attention Mechanism (SAM) is good at capturing the internal connections of features.
  A novel Quantum Self-Attention Network (QSAN) is proposed for image classification tasks on near-term quantum devices.
- Hybrid quantum-classical classifier based on tensor network and variational quantum circuit (2020-11-30)
  We introduce a hybrid model combining quantum-inspired tensor networks (TN) and variational quantum circuits (VQC) to perform supervised learning tasks.
  We show that a matrix product state based TN with low bond dimensions performs better than PCA as a feature extractor for compressing data fed into the VQCs in binary classification on the MNIST dataset.
- Decentralizing Feature Extraction with Quantum Convolutional Neural Network for Automatic Speech Recognition (2020-10-26)
  We build upon a quantum convolutional neural network (QCNN) composed of a quantum circuit encoder for feature extraction.
  Input speech is first up-streamed to a quantum computing server to extract its Mel-spectrogram.
  The corresponding convolutional features are encoded using a quantum circuit algorithm with random parameters.
  The encoded features are then down-streamed to the local RNN model for the final recognition.
- Widening and Squeezing: Towards Accurate and Efficient QNNs (2020-02-03)
  Quantization neural networks (QNNs) are very attractive to industry because of their extremely low computation and storage overhead, but their performance is still worse than that of networks with full-precision parameters.
  Most existing methods aim to enhance the performance of QNNs, especially binary neural networks, by exploiting more effective training techniques.
  We address this problem by projecting features in the original full-precision networks to high-dimensional quantization features.