Bridging Classical and Quantum Machine Learning: Knowledge Transfer From
Classical to Quantum Neural Networks Using Knowledge Distillation
- URL: http://arxiv.org/abs/2311.13810v1
- Date: Thu, 23 Nov 2023 05:06:43 GMT
- Title: Bridging Classical and Quantum Machine Learning: Knowledge Transfer From
Classical to Quantum Neural Networks Using Knowledge Distillation
- Authors: Mohammad Junayed Hasan and M.R.C. Mahdy
- Abstract summary: This paper introduces a new method to transfer knowledge from classical to quantum neural networks using knowledge distillation.
We adapt classical convolutional neural network (CNN) architectures like LeNet and AlexNet to serve as teacher networks.
Quantum models achieve an average accuracy improvement of 0.80% on the MNIST dataset and 5.40% on the more complex Fashion MNIST dataset.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Very recently, studies have shown that quantum neural networks surpass
classical neural networks in tasks like image classification when a similar
number of learnable parameters is used. However, the development and
optimization of quantum models are currently hindered by issues such as qubit
instability and limited qubit availability, leading to error-prone systems with
weak performance. In contrast, classical models can exhibit high performance
owing to substantial resource availability. As a result, more studies have been
focusing on hybrid classical-quantum integration. A line of research
particularly focuses on transfer learning through classical-quantum integration
or quantum-quantum approaches. Unlike previous studies, this paper introduces a
new method to transfer knowledge from classical to quantum neural networks
using knowledge distillation, effectively bridging the gap between classical
machine learning and emergent quantum computing techniques. We adapt classical
convolutional neural network (CNN) architectures like LeNet and AlexNet to
serve as teacher networks, facilitating the training of student quantum models
by providing supervisory signals through a KL-divergence loss during backpropagation.
The approach yields significant performance improvements for the quantum models
by solely depending on classical CNNs, with quantum models achieving an average
accuracy improvement of 0.80% on the MNIST dataset and 5.40% on the more
complex Fashion MNIST dataset. Applying this technique eliminates the
cumbersome training of huge quantum models for transfer learning in
resource-constrained settings and enables reusing existing pre-trained
classical models to improve performance. Thus, this study paves the way for
future research in quantum machine learning (QML) by positioning knowledge
distillation as a core technique for advancing QML applications.
Related papers
- Quantum-Trained Convolutional Neural Network for Deepfake Audio Detection [3.2927352068925444]
Deepfake technologies pose challenges to privacy, security, and information integrity.
This paper introduces a Quantum-Trained Convolutional Neural Network framework designed to enhance the detection of deepfake audio.
arXiv Detail & Related papers (2024-10-11T20:52:10Z) - Let the Quantum Creep In: Designing Quantum Neural Network Models by
Gradually Swapping Out Classical Components [1.024113475677323]
Modern AI systems are often built on neural networks.
We propose a framework where classical neural network layers are gradually replaced by quantum layers.
We conduct numerical experiments on image classification datasets to demonstrate the change of performance brought by the systematic introduction of quantum components.
arXiv Detail & Related papers (2024-09-26T07:01:29Z) - CTRQNets & LQNets: Continuous Time Recurrent and Liquid Quantum Neural Networks [76.53016529061821]
The Liquid Quantum Neural Network (LQNet) and Continuous Time Recurrent Quantum Neural Network (CTRQNet) are developed.
LQNet and CTRQNet achieve accuracy increases as high as 40% on CIFAR-10 through binary classification.
arXiv Detail & Related papers (2024-08-28T00:56:03Z) - QTRL: Toward Practical Quantum Reinforcement Learning via Quantum-Train [18.138290778243075]
We apply the Quantum-Train method to reinforcement learning tasks, called QTRL, to train the classical policy network model.
The result of QTRL training is a classical model, meaning the inference stage requires only a classical computer.
arXiv Detail & Related papers (2024-07-08T16:41:03Z) - A Quantum-Classical Collaborative Training Architecture Based on Quantum
State Fidelity [50.387179833629254]
We introduce a collaborative classical-quantum architecture called co-TenQu.
Co-TenQu enhances a classical deep neural network by up to 41.72% in a fair setting.
It outperforms other quantum-based methods by up to 1.9 times and achieves similar accuracy while utilizing 70.59% fewer qubits.
arXiv Detail & Related papers (2024-02-23T14:09:41Z) - ShadowNet for Data-Centric Quantum System Learning [188.683909185536]
We propose a data-centric learning paradigm combining the strength of neural-network protocols and classical shadows.
Capitalizing on the generalization power of neural networks, this paradigm can be trained offline and excel at predicting previously unseen systems.
We present the instantiation of our paradigm in quantum state tomography and direct fidelity estimation tasks and conduct numerical analysis up to 60 qubits.
arXiv Detail & Related papers (2023-08-22T09:11:53Z) - Pooling techniques in hybrid quantum-classical convolutional neural
networks [0.0]
An in-depth study of pooling techniques in hybrid quantum-classical convolutional neural networks (QCCNNs) for classifying 2D medical images is performed.
We find similar or better performance in comparison to an equivalent classical model and a QCCNN without pooling.
Studying architectural choices in QCCNNs in more depth is promising for future applications.
arXiv Detail & Related papers (2023-05-09T16:51:46Z) - Quantum Imitation Learning [74.15588381240795]
We propose quantum imitation learning (QIL) with the hope of utilizing quantum advantage to speed up IL.
We develop two QIL algorithms, quantum behavioural cloning (Q-BC) and quantum generative adversarial imitation learning (Q-GAIL).
Experimental results demonstrate that both Q-BC and Q-GAIL achieve performance comparable to their classical counterparts.
arXiv Detail & Related papers (2023-04-04T12:47:35Z) - Variational Quantum Neural Networks (VQNNS) in Image Classification [0.0]
This paper investigates how training of quantum neural networks (QNNs) can be done using quantum optimization algorithms.
A QNN structure is constructed in which a variational parameterized circuit is incorporated as the input layer, named the Variational Quantum Neural Network (VQNN).
VQNNs are evaluated on MNIST digit recognition (less complex) and crack image classification datasets, converging in less time than a plain QNN while attaining decent training accuracy.
arXiv Detail & Related papers (2023-03-10T11:24:32Z) - Problem-Dependent Power of Quantum Neural Networks on Multi-Class
Classification [83.20479832949069]
Quantum neural networks (QNNs) have become an important tool for understanding the physical world, but their advantages and limitations are not fully understood.
Here we investigate the problem-dependent power of quantum classifiers (QCs) on multi-class classification tasks.
Our work sheds light on the problem-dependent power of QNNs and offers a practical tool for evaluating their potential merit.
arXiv Detail & Related papers (2022-12-29T10:46:40Z) - The Quantum Path Kernel: a Generalized Quantum Neural Tangent Kernel for
Deep Quantum Machine Learning [52.77024349608834]
Building a quantum analog of classical deep neural networks represents a fundamental challenge in quantum computing.
A key issue is how to address the inherent non-linearity of classical deep learning.
We introduce the Quantum Path Kernel, a formulation of quantum machine learning capable of replicating these aspects of deep machine learning.
arXiv Detail & Related papers (2022-12-22T16:06:24Z) - QuanGCN: Noise-Adaptive Training for Robust Quantum Graph Convolutional
Networks [124.7972093110732]
We propose quantum graph convolutional networks (QuanGCN), which learn local message passing among nodes via a sequence of crossing-gate quantum operations.
To mitigate the inherent noise from modern quantum devices, we apply a sparsity constraint to sparsify the nodes' connections.
Our QuanGCN is functionally comparable or even superior to classical algorithms on several benchmark graph datasets.
arXiv Detail & Related papers (2022-11-09T21:43:16Z) - Classical-to-quantum convolutional neural network transfer learning [1.9336815376402723]
Machine learning using quantum convolutional neural networks (QCNNs) has demonstrated success in both quantum and classical data classification.
We propose transfer learning as an effective strategy for utilizing small QCNNs in the noisy intermediate-scale quantum era.
arXiv Detail & Related papers (2022-08-31T09:15:37Z) - Quantum-inspired Complex Convolutional Neural Networks [17.65730040410185]
We improve quantum-inspired neurons by exploiting complex-valued weights, which have richer representational capacity and better non-linearity.
We derive models of quantum-inspired convolutional neural networks (QICNNs) capable of processing high-dimensional data.
The classification accuracy of the five QICNNs is tested on the MNIST and CIFAR-10 datasets.
arXiv Detail & Related papers (2021-10-31T03:10:48Z) - The dilemma of quantum neural networks [63.82713636522488]
We show that quantum neural networks (QNNs) fail to provide any benefit over classical learning models.
QNNs suffer from severely limited effective model capacity, which incurs poor generalization on real-world datasets.
These results force us to rethink the role of current QNNs and to design novel protocols for solving real-world problems with quantum advantages.
arXiv Detail & Related papers (2021-06-09T10:41:47Z) - Quantum neural networks with deep residual learning [29.929891641757273]
In this paper, a novel quantum neural network with deep residual learning (ResQNN) is proposed.
Our ResQNN is able to learn an unknown unitary and achieves remarkable performance.
arXiv Detail & Related papers (2020-12-14T18:11:07Z) - Quantum Deformed Neural Networks [83.71196337378022]
We develop a new quantum neural network layer designed to run efficiently on a quantum computer.
It can be simulated on a classical computer when restricted in the way it entangles input states.
arXiv Detail & Related papers (2020-10-21T09:46:12Z)