Towards Transfer Learning for Large-Scale Image Classification Using
Annealing-based Quantum Boltzmann Machines
- URL: http://arxiv.org/abs/2311.15966v1
- Date: Mon, 27 Nov 2023 16:07:49 GMT
- Title: Towards Transfer Learning for Large-Scale Image Classification Using
Annealing-based Quantum Boltzmann Machines
- Authors: Daniëlle Schuman, Leo Sünkel, Philipp Altmann, Jonas Stein,
Christoph Roch, Thomas Gabor, Claudia Linnhoff-Popien
- Abstract summary: We present an approach to employ Quantum Annealing (QA) in image classification.
We propose using annealing-based Quantum Boltzmann Machines as part of a hybrid quantum-classical pipeline.
We find that our approach consistently outperforms its classical baseline in terms of test accuracy and AUC-ROC score.
- Score: 7.106829260811707
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Quantum Transfer Learning (QTL) recently gained popularity as a hybrid
quantum-classical approach for image classification tasks by efficiently
combining the feature extraction capabilities of large Convolutional Neural
Networks with the potential benefits of Quantum Machine Learning (QML).
Existing approaches, however, only utilize gate-based Variational Quantum
Circuits for the quantum part of these procedures. In this work we present an
approach to employ Quantum Annealing (QA) in QTL-based image classification.
Specifically, we propose using annealing-based Quantum Boltzmann Machines as
part of a hybrid quantum-classical pipeline to learn the classification of
real-world, large-scale data such as medical images through supervised
training. We demonstrate our approach by applying it to the three-class
COVID-CT-MD dataset, a collection of lung Computed Tomography (CT) scan slices.
Using Simulated Annealing as a stand-in for actual QA, we compare our method to
classical transfer learning with a neural network of the same order of
magnitude to demonstrate its improved classification performance. We find that
our approach consistently outperforms its classical baseline in terms of test
accuracy and AUC-ROC score, and needs fewer training epochs to do so.
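The core quantum ingredient described above, sampling from a Boltzmann Machine's energy function with Simulated Annealing as a classical stand-in for Quantum Annealing, can be illustrated with a minimal NumPy toy. This is not the authors' implementation; the Ising-style energy function, the geometric temperature schedule, and the ferromagnetic test instance are all illustrative assumptions:

```python
import numpy as np

def sa_sample(h, J, n_sweeps=200, t_start=5.0, t_end=0.05, seed=0):
    """Draw one spin configuration s in {-1,+1}^n from the Boltzmann
    distribution of E(s) = h.s + (1/2) s.J.s via simulated annealing,
    used here as a classical stand-in for quantum annealing."""
    rng = np.random.default_rng(seed)
    n = len(h)
    s = rng.choice([-1, 1], size=n)
    for t in np.geomspace(t_start, t_end, n_sweeps):
        for i in range(n):
            # effective local field on spin i (self-coupling excluded)
            field = h[i] + J[i] @ s - J[i, i] * s[i]
            d_e = -2.0 * s[i] * field  # energy change if s[i] is flipped
            if d_e <= 0 or rng.random() < np.exp(-d_e / t):
                s[i] = -s[i]
    return s

# Toy check: a 6-spin ferromagnetic ring (J_ij = -1 on edges) should
# anneal into a fully aligned ground state.
n = 6
J = np.zeros((n, n))
for i in range(n):
    J[i, (i + 1) % n] = J[(i + 1) % n, i] = -1.0
sample = sa_sample(np.zeros(n), J)
print(sample)
```

In the paper's pipeline such samples would estimate the model expectations needed for the Boltzmann Machine's gradient; here the sampler is only shown on a toy Ising instance.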
Related papers
- Quantum Transfer Learning for MNIST Classification Using a Hybrid Quantum-Classical Approach [0.0]
This research explores the integration of quantum computing with classical machine learning for image classification tasks.
We propose a hybrid quantum-classical approach that leverages the strengths of both paradigms.
The experimental results indicate that while the hybrid model demonstrates the feasibility of integrating quantum computing with classical techniques, the accuracy of the final model, trained on quantum outcomes, is currently lower than the classical model trained on compressed features.
arXiv Detail & Related papers (2024-08-05T22:16:27Z) - Bridging Classical and Quantum Machine Learning: Knowledge Transfer From
Classical to Quantum Neural Networks Using Knowledge Distillation [0.0]
This paper introduces a new method to transfer knowledge from classical to quantum neural networks using knowledge distillation.
We adapt classical convolutional neural network (CNN) architectures like LeNet and AlexNet to serve as teacher networks.
Quantum models achieve an average accuracy improvement of 0.80% on the MNIST dataset and 5.40% on the more complex Fashion MNIST dataset.
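The knowledge-distillation objective summarized above, a quantum student trained on temperature-softened outputs of a classical teacher, follows the standard soft-label formulation. A minimal NumPy sketch of that loss (the temperature value and example logits are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def distillation_loss(student_logits, teacher_logits, temperature=4.0):
    """KL divergence between temperature-softened teacher and student
    distributions, scaled by T^2 as in standard knowledge distillation."""
    t = temperature
    p = softmax(teacher_logits / t)   # soft teacher targets
    q = softmax(student_logits / t)   # student predictions
    kl = (p * (np.log(p) - np.log(q))).sum(axis=-1).mean()
    return float(t * t * kl)

teacher = np.array([[4.0, 1.0, -2.0]])   # e.g. logits from LeNet/AlexNet
student = np.array([[2.5, 0.5, -1.0]])   # e.g. logits from the quantum model
loss = distillation_loss(student, teacher)
print(loss)
```

The T^2 factor keeps gradient magnitudes comparable across temperatures, which is why it appears in the usual distillation recipe.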
arXiv Detail & Related papers (2023-11-23T05:06:43Z) - Hybrid quantum transfer learning for crack image classification on NISQ
hardware [62.997667081978825]
We present an application of quantum transfer learning for detecting cracks in gray value images.
We compare the performance and training time of PennyLane's standard qubits with IBM's qasm_simulator and real backends.
arXiv Detail & Related papers (2023-07-31T14:45:29Z) - Problem-Dependent Power of Quantum Neural Networks on Multi-Class
Classification [83.20479832949069]
Quantum neural networks (QNNs) have become an important tool for understanding the physical world, but their advantages and limitations are not fully understood.
Here we investigate the problem-dependent power of QNNs on multi-class classification tasks.
Our work sheds light on the problem-dependent power of QNNs and offers a practical tool for evaluating their potential merit.
arXiv Detail & Related papers (2022-12-29T10:46:40Z) - The Quantum Path Kernel: a Generalized Quantum Neural Tangent Kernel for
Deep Quantum Machine Learning [52.77024349608834]
Building a quantum analog of classical deep neural networks represents a fundamental challenge in quantum computing.
A key issue is how to address the inherent non-linearity of classical deep learning.
We introduce the Quantum Path Kernel, a formulation of quantum machine learning capable of replicating those aspects of deep machine learning.
arXiv Detail & Related papers (2022-12-22T16:06:24Z) - A didactic approach to quantum machine learning with a single qubit [68.8204255655161]
We focus on the case of learning with a single qubit, using data re-uploading techniques.
We implement the different proposed formulations in toy and real-world datasets using the qiskit quantum computing SDK.
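The data re-uploading idea used in the single-qubit paper above, feeding the same input into the circuit repeatedly between trainable rotations, can be simulated classically with 2x2 unitaries. A minimal sketch (the layer structure, gate choices, and parameter values are illustrative assumptions, not the paper's exact ansatz):

```python
import numpy as np

def ry(theta):
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]], dtype=complex)

def rz(theta):
    return np.array([[np.exp(-1j * theta / 2), 0],
                     [0, np.exp(1j * theta / 2)]])

def reupload_prob0(x, layers):
    """Single-qubit data re-uploading circuit: each layer applies a
    trainable rotation followed by a data-encoding rotation, so the
    input x enters the circuit once per layer.  Returns P(|0>), which
    can be thresholded for binary classification."""
    state = np.array([1, 0], dtype=complex)
    for a, b, w in layers:
        state = rz(b) @ ry(a) @ state   # trainable rotation
        state = ry(w * x) @ state       # re-upload the data point
    return abs(state[0]) ** 2

layers = [(0.3, 0.1, 1.0), (1.1, -0.4, 0.5)]  # hypothetical parameters
p = reupload_prob0(0.7, layers)
print(p)
```

Repeating the encoding is what gives a single qubit enough expressivity to fit non-trivial decision boundaries; with all parameters at zero the circuit is the identity and P(|0>) stays 1.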
arXiv Detail & Related papers (2022-11-23T18:25:32Z) - Multiclass classification using quantum convolutional neural networks
with hybrid quantum-classical learning [0.5999777817331318]
We propose a quantum machine learning approach based on quantum convolutional neural networks for solving multiclass classification problems.
We use the proposed approach to demonstrate 4-class classification on the MNIST dataset using eight qubits for data encoding and four ancilla qubits.
Our results demonstrate comparable accuracy of our solution with classical convolutional neural networks with comparable numbers of trainable parameters.
arXiv Detail & Related papers (2022-03-29T09:07:18Z) - Classical-to-Quantum Transfer Learning for Spoken Command Recognition
Based on Quantum Neural Networks [13.485144642413907]
This work investigates an extension of transfer learning applied in machine learning algorithms to the emerging hybrid end-to-end quantum neural network (QNN) for spoken command recognition (SCR).
We put forth a hybrid transfer learning algorithm that allows a pre-trained classical network to be transferred to the classical part of the hybrid QNN model.
We assess the hybrid transfer learning algorithm applied to the hybrid classical-quantum QNN for SCR on the Google speech command dataset.
arXiv Detail & Related papers (2021-10-17T00:45:31Z) - On Circuit-based Hybrid Quantum Neural Networks for Remote Sensing
Imagery Classification [88.31717434938338]
The hybrid QCNNs enrich the classical architecture of CNNs by introducing a quantum layer within a standard neural network.
The novel QCNN proposed in this work is applied to the Land Use and Land Cover (LULC) classification, chosen as an Earth Observation (EO) use case.
The results of the multiclass classification prove the effectiveness of the presented approach, demonstrating that the QCNN's performance exceeds that of its classical counterparts.
arXiv Detail & Related papers (2021-09-20T12:41:50Z) - RGB Image Classification with Quantum Convolutional Ansaetze [18.379304679643436]
We propose two types of quantum circuit ansaetze to simulate convolution operations on RGB images.
To the best of our knowledge, this is the first work to apply a quantum convolutional circuit to RGB images effectively.
We also investigate the relationship between the size of quantum circuit ansatz and the learnability of the hybrid quantum-classical convolutional neural network.
arXiv Detail & Related papers (2021-07-23T09:38:59Z) - Quantum Machine Learning with SQUID [64.53556573827525]
We present the Scaled QUantum IDentifier (SQUID), an open-source framework for exploring hybrid Quantum-Classical algorithms for classification problems.
We provide examples of using SQUID in a standard binary classification problem from the popular MNIST dataset.
arXiv Detail & Related papers (2021-04-30T21:34:11Z)
This list is automatically generated from the titles and abstracts of the papers in this site.