Problem-Dependent Power of Quantum Neural Networks on Multi-Class
Classification
- URL: http://arxiv.org/abs/2301.01597v3
- Date: Mon, 30 Oct 2023 10:48:11 GMT
- Title: Problem-Dependent Power of Quantum Neural Networks on Multi-Class
Classification
- Authors: Yuxuan Du, Yibo Yang, Dacheng Tao, Min-Hsiu Hsieh
- Abstract summary: Quantum neural networks (QNNs) have become an important tool for understanding the physical world, but their advantages and limitations are not fully understood.
Here we systematically investigate the problem-dependent power of quantum neural classifiers (QCs) on multi-class classification tasks.
Our work sheds light on the problem-dependent power of QNNs and offers a practical tool for evaluating their potential merit.
- Score: 83.20479832949069
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Quantum neural networks (QNNs) have become an important tool for
understanding the physical world, but their advantages and limitations are not
fully understood. Some QNNs with specific encoding methods can be efficiently
simulated by classical surrogates, while others with quantum memory may perform
better than classical classifiers. Here we systematically investigate the
problem-dependent power of quantum neural classifiers (QCs) on multi-class
classification tasks. Through the analysis of expected risk, a measure that
weighs the training loss and the generalization error of a classifier jointly,
we identify two key findings: first, the training loss dominates the power
rather than the generalization ability; second, QCs undergo a U-shaped risk
curve, in contrast to the double-descent risk curve of deep neural classifiers.
We also reveal the intrinsic connection between optimal QCs and the Helstrom
bound and the equiangular tight frame. Using these findings, we propose a
method that uses loss dynamics to probe whether a QC may be more effective than
a classical classifier on a particular learning task. Numerical results
demonstrate the effectiveness of our approach in explaining the superiority of
QCs over multilayer perceptrons on parity datasets and their limitations
relative to convolutional neural networks on image datasets. Our work sheds light on the
problem-dependent power of QNNs and offers a practical tool for evaluating
their potential merit.
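The loss-dynamics probe proposed in the abstract can be sketched in a minimal, purely illustrative form: train the candidate QC and a classical baseline on the same task, compare their training-loss trajectories, and favor the model whose loss settles at a lower value (reflecting the paper's finding that training loss, rather than generalization, dominates the power of QCs). The helper name `probe_loss_dynamics`, the tolerance, and the synthetic loss curves below are assumptions for illustration, not the authors' implementation.

```python
def probe_loss_dynamics(qc_losses, classical_losses, tol=1e-3):
    """Heuristic probe: compare training-loss trajectories of a quantum
    classifier (QC) and a classical baseline on the same task.

    Returns "quantum" if the QC's final training loss is lower by more
    than `tol`, "classical" if the baseline's is, else "inconclusive".
    """
    gap = classical_losses[-1] - qc_losses[-1]
    if gap > tol:
        return "quantum"
    if gap < -tol:
        return "classical"
    return "inconclusive"


# Synthetic trajectories (illustrative only): on a parity-like task the
# QC's loss decays toward zero while the MLP baseline plateaus near 0.4.
qc = [1.0 * 0.5 ** t for t in range(10)]
mlp = [0.4 + 0.6 * 0.8 ** t for t in range(10)]

print(probe_loss_dynamics(qc, mlp))  # favors the quantum classifier here
```

In practice the trajectories would come from actual training runs; the decision rule above only captures the qualitative idea of reading off relative power from loss dynamics.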
Related papers
- Bridging Classical and Quantum Machine Learning: Knowledge Transfer From
Classical to Quantum Neural Networks Using Knowledge Distillation [0.0]
This paper introduces a new method to transfer knowledge from classical to quantum neural networks using knowledge distillation.
We adapt classical convolutional neural network (CNN) architectures like LeNet and AlexNet to serve as teacher networks.
Quantum models achieve an average accuracy improvement of 0.80% on the MNIST dataset and 5.40% on the more complex Fashion MNIST dataset.
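The classical-to-quantum distillation above can be sketched with the standard knowledge-distillation loss (KL divergence between temperature-softened teacher and student distributions, in Hinton et al.'s formulation). The temperature and the logits below are hypothetical, and whether the paper uses this exact loss is an assumption here.

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of logits."""
    exps = [math.exp(z / temperature) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=4.0):
    """KL divergence between the teacher's and student's softened class
    distributions, scaled by T^2 so gradients keep a consistent scale."""
    p = softmax(teacher_logits, temperature)  # teacher (e.g. a CNN like LeNet)
    q = softmax(student_logits, temperature)  # student (e.g. a quantum model)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return temperature ** 2 * kl

# Hypothetical logits: the student broadly agrees with the teacher,
# so the loss is small but nonzero.
loss = distillation_loss([4.0, 1.0, 0.5], [3.5, 1.2, 0.3])
```

Minimizing this loss pushes the quantum student toward the teacher's full output distribution rather than hard labels alone, which is the mechanism behind the reported accuracy gains.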
arXiv Detail & Related papers (2023-11-23T05:06:43Z) - Coreset selection can accelerate quantum machine learning models with
provable generalization [6.733416056422756]
Quantum neural networks (QNNs) and quantum kernels stand as prominent figures in the realm of quantum machine learning.
We present a unified approach: coreset selection, aimed at expediting the training of QNNs and quantum kernels.
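Coreset selection can be illustrated with the classic greedy k-center heuristic, one common selection rule (whether the paper uses this exact rule is an assumption here): repeatedly add the point farthest from the current coreset, then train on the coreset instead of the full dataset.

```python
def greedy_k_center(points, k):
    """Greedy k-center coreset: start from the first point, then
    repeatedly add the point farthest (in Euclidean distance) from the
    current coreset. Training a QNN or quantum kernel on the coreset
    rather than the full set reduces the number of circuit evaluations."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    coreset = [points[0]]
    while len(coreset) < k:
        farthest = max(points, key=lambda p: min(dist(p, c) for c in coreset))
        coreset.append(farthest)
    return coreset

data = [(0.0, 0.0), (0.1, 0.0), (5.0, 5.0), (5.1, 5.0), (0.0, 5.0)]
core = greedy_k_center(data, 3)  # picks well-spread representatives
```

The heuristic keeps the selected points well spread over the data, which is the property coreset-based speedups rely on for generalization guarantees.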
arXiv Detail & Related papers (2023-09-19T08:59:46Z) - A Post-Training Approach for Mitigating Overfitting in Quantum
Convolutional Neural Networks [0.24578723416255752]
We study post-training approaches for mitigating overfitting in quantum convolutional neural networks (QCNNs).
We find that a straightforward adaptation of a classical post-training method, known as neuron dropout, to the quantum setting leads to a substantial decrease in success probability of the QCNN.
We argue that this effect exposes the crucial role of entanglement in QCNNs and the vulnerability of QCNNs to entanglement loss.
arXiv Detail & Related papers (2023-09-04T21:46:24Z) - ShadowNet for Data-Centric Quantum System Learning [188.683909185536]
We propose a data-centric learning paradigm combining the strengths of neural-network protocols and classical shadows.
Capitalizing on the generalization power of neural networks, this paradigm can be trained offline and excel at predicting previously unseen systems.
We present the instantiation of our paradigm in quantum state tomography and direct fidelity estimation tasks and conduct numerical analysis up to 60 qubits.
arXiv Detail & Related papers (2023-08-22T09:11:53Z) - Classical-to-Quantum Transfer Learning Facilitates Machine Learning with Variational Quantum Circuit [62.55763504085508]
We prove that a classical-to-quantum transfer learning architecture using a Variational Quantum Circuit (VQC) improves the representation and generalization (estimation error) capabilities of the VQC model.
We show that the architecture of classical-to-quantum transfer learning leverages pre-trained classical generative AI models, making it easier to find the optimal parameters for the VQC in the training stage.
arXiv Detail & Related papers (2023-05-18T03:08:18Z) - Quantum Imitation Learning [74.15588381240795]
We propose quantum imitation learning (QIL) in the hope of leveraging quantum advantage to speed up imitation learning (IL).
We develop two QIL algorithms: quantum behavioural cloning (Q-BC) and quantum generative adversarial imitation learning (Q-GAIL).
Experimental results demonstrate that both Q-BC and Q-GAIL achieve performance comparable to their classical counterparts.
arXiv Detail & Related papers (2023-04-04T12:47:35Z) - On Circuit-based Hybrid Quantum Neural Networks for Remote Sensing
Imagery Classification [88.31717434938338]
The hybrid QCNNs enrich the classical architecture of CNNs by introducing a quantum layer within a standard neural network.
The novel QCNN proposed in this work is applied to the Land Use and Land Cover (LULC) classification, chosen as an Earth Observation (EO) use case.
The multi-class classification results demonstrate the effectiveness of the presented approach: the QCNN outperforms its classical counterparts.
arXiv Detail & Related papers (2021-09-20T12:41:50Z) - The dilemma of quantum neural networks [63.82713636522488]
We show that quantum neural networks (QNNs) fail to provide any benefit over classical learning models.
QNNs suffer from the severely limited effective model capacity, which incurs poor generalization on real-world datasets.
These results force us to rethink the role of current QNNs and to design novel protocols for solving real-world problems with quantum advantages.
arXiv Detail & Related papers (2021-06-09T10:41:47Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.