Bridging Classical and Quantum Machine Learning: Knowledge Transfer From Classical to Quantum Neural Networks Using Knowledge Distillation
- URL: http://arxiv.org/abs/2311.13810v1
- Date: Thu, 23 Nov 2023 05:06:43 GMT
- Title: Bridging Classical and Quantum Machine Learning: Knowledge Transfer From Classical to Quantum Neural Networks Using Knowledge Distillation
- Authors: Mohammad Junayed Hasan and M.R.C. Mahdy
- Abstract summary: This paper introduces a new method to transfer knowledge from classical to quantum neural networks using knowledge distillation.
We adapt classical convolutional neural network (CNN) architectures like LeNet and AlexNet to serve as teacher networks.
Quantum models achieve an average accuracy improvement of 0.80% on the MNIST dataset and 5.40% on the more complex Fashion MNIST dataset.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Very recently, studies have shown that quantum neural networks surpass classical neural networks in tasks like image classification when a similar number of learnable parameters is used. However, the development and optimization of quantum models are currently hindered by issues such as qubit instability and limited qubit availability, leading to error-prone systems with weak performance. In contrast, classical models can exhibit high performance owing to substantial resource availability. As a result, more studies have been focusing on hybrid classical-quantum integration. One line of research focuses in particular on transfer learning through classical-quantum or quantum-quantum approaches. Unlike previous studies, this paper introduces a new method to transfer knowledge from classical to quantum neural networks using knowledge distillation, effectively bridging the gap between classical machine learning and emergent quantum computing techniques. We adapt classical convolutional neural network (CNN) architectures such as LeNet and AlexNet to serve as teacher networks, guiding the training of quantum student models by providing supervisory signals through a KL-divergence loss during backpropagation. The approach yields significant performance improvements for the quantum models while depending solely on classical CNNs, with quantum models achieving an average accuracy improvement of 0.80% on the MNIST dataset and 5.40% on the more complex Fashion MNIST dataset. Applying this technique eliminates the cumbersome training of large quantum models for transfer learning in resource-constrained settings and enables reusing existing pre-trained classical models to improve performance. Thus, this study paves the way for future research in quantum machine learning (QML) by positioning knowledge distillation as a core technique for advancing QML applications.
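The distillation setup described in the abstract reduces to a standard teacher-student objective. Below is a minimal PyTorch-style sketch of such a loss, assuming a frozen classical CNN teacher and a quantum student that both emit class logits; the function name and the `temperature` and `alpha` hyperparameters are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.5):
    """Hinton-style knowledge distillation loss (a sketch; the paper's
    exact formulation and hyperparameters are assumptions here)."""
    # Hard-label term: ordinary cross-entropy against the ground truth.
    hard = F.cross_entropy(student_logits, labels)
    # Soft-label term: KL divergence between temperature-softened teacher
    # and student distributions, scaled by T^2 as is standard.
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2
    return alpha * hard + (1 - alpha) * soft

# Training-step sketch: the classical teacher is pre-trained and frozen,
# so only the quantum student's parameters receive gradients.
#   with torch.no_grad():
#       teacher_logits = teacher(images)
#   loss = distillation_loss(student(images), teacher_logits, labels)
#   loss.backward(); optimizer.step()
```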
Related papers
- Quantum-Trained Convolutional Neural Network for Deepfake Audio Detection [3.2927352068925444]
Deepfake technologies pose challenges to privacy, security, and information integrity.
This paper introduces a Quantum-Trained Convolutional Neural Network framework designed to enhance the detection of deepfake audio.
arXiv Detail & Related papers (2024-10-11T20:52:10Z)
- Let the Quantum Creep In: Designing Quantum Neural Network Models by Gradually Swapping Out Classical Components [1.024113475677323]
Modern AI systems are often built on neural networks.
We propose a framework where classical neural network layers are gradually replaced by quantum layers.
We conduct numerical experiments on image classification datasets to demonstrate the change of performance brought by the systematic introduction of quantum components.
arXiv Detail & Related papers (2024-09-26T07:01:29Z)
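As a generic illustration of the layer-swapping idea in "Let the Quantum Creep In" above (not the authors' code), the sketch below replaces one classical block of a small PyTorch network with a PennyLane quantum layer; the circuit template, depth, and layer sizes are arbitrary assumptions.

```python
import torch.nn as nn
import pennylane as qml

n_qubits = 4
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev, interface="torch")
def circuit(inputs, weights):
    # Encode classical activations as rotation angles, then entangle.
    qml.AngleEmbedding(inputs, wires=range(n_qubits))
    qml.BasicEntanglerLayers(weights, wires=range(n_qubits))
    return [qml.expval(qml.PauliZ(w)) for w in range(n_qubits)]

# Two variational layers of single-qubit rotations plus CNOT entanglers.
quantum_layer = qml.qnn.TorchLayer(circuit, weight_shapes={"weights": (2, n_qubits)})

# Classical baseline vs. the same model with one layer swapped out:
classical = nn.Sequential(nn.Linear(8, n_qubits), nn.Tanh(),
                          nn.Linear(n_qubits, 2))
hybrid = nn.Sequential(nn.Linear(8, n_qubits), nn.Tanh(),
                       quantum_layer,  # quantum block replaces a classical one
                       nn.Linear(n_qubits, 2))
```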
- QTRL: Toward Practical Quantum Reinforcement Learning via Quantum-Train [18.138290778243075]
We apply the Quantum-Train method to reinforcement learning tasks, called QTRL, to train a classical policy network model.
The training result of QTRL is a classical model, meaning the inference stage requires only a classical computer.
arXiv Detail & Related papers (2024-07-08T16:41:03Z)
- A Quantum-Classical Collaborative Training Architecture Based on Quantum State Fidelity [50.387179833629254]
We introduce a collaborative classical-quantum architecture called co-TenQu.
Co-TenQu enhances a classical deep neural network by up to 41.72% in a fair setting.
It outperforms other quantum-based methods by up to 1.9 times and achieves similar accuracy while utilizing 70.59% fewer qubits.
arXiv Detail & Related papers (2024-02-23T14:09:41Z)
- ShadowNet for Data-Centric Quantum System Learning [188.683909185536]
We propose a data-centric learning paradigm combining the strength of neural-network protocols and classical shadows.
Capitalizing on the generalization power of neural networks, this paradigm can be trained offline and excel at predicting previously unseen systems.
We present the instantiation of our paradigm in quantum state tomography and direct fidelity estimation tasks and conduct numerical analysis up to 60 qubits.
arXiv Detail & Related papers (2023-08-22T09:11:53Z)
- Classical-to-Quantum Transfer Learning Facilitates Machine Learning with Variational Quantum Circuit [62.55763504085508]
We prove that a classical-to-quantum transfer learning architecture using a Variational Quantum Circuit (VQC) improves the representation and generalization (estimation error) capabilities of the VQC model.
We show that the architecture of classical-to-quantum transfer learning leverages pre-trained classical generative AI models, making it easier to find the optimal parameters for the VQC in the training stage.
arXiv Detail & Related papers (2023-05-18T03:08:18Z)
- Pooling techniques in hybrid quantum-classical convolutional neural networks [0.0]
An in-depth study of pooling techniques in hybrid quantum-classical convolutional neural networks (QCCNNs) for classifying 2D medical images is performed.
We find similar or better performance in comparison to an equivalent classical model and a QCCNN without pooling.
It is promising to study architectural choices in QCCNNs in more depth for future applications.
arXiv Detail & Related papers (2023-05-09T16:51:46Z)
- Problem-Dependent Power of Quantum Neural Networks on Multi-Class Classification [83.20479832949069]
Quantum neural networks (QNNs) have become an important tool for understanding the physical world, but their advantages and limitations are not fully understood.
Here we investigate the problem-dependent power of quantum classifiers (QCs) on multi-class classification tasks.
Our work sheds light on the problem-dependent power of QNNs and offers a practical tool for evaluating their potential merit.
arXiv Detail & Related papers (2022-12-29T10:46:40Z)
- Classical-to-quantum convolutional neural network transfer learning [1.9336815376402723]
Machine learning using quantum convolutional neural networks (QCNNs) has demonstrated success in both quantum and classical data classification.
We propose transfer learning as an effective strategy for utilizing small QCNNs in the noisy intermediate-scale quantum era.
arXiv Detail & Related papers (2022-08-31T09:15:37Z)
- The dilemma of quantum neural networks [63.82713636522488]
We show that quantum neural networks (QNNs) fail to provide any benefit over classical learning models.
QNNs suffer from the severely limited effective model capacity, which incurs poor generalization on real-world datasets.
These results force us to rethink the role of current QNNs and to design novel protocols for solving real-world problems with quantum advantages.
arXiv Detail & Related papers (2021-06-09T10:41:47Z)
- Quantum Deformed Neural Networks [83.71196337378022]
We develop a new quantum neural network layer designed to run efficiently on a quantum computer.
It can be simulated on a classical computer when restricted in the way it entangles input states.
arXiv Detail & Related papers (2020-10-21T09:46:12Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.