Subtleties in the trainability of quantum machine learning models
- URL: http://arxiv.org/abs/2110.14753v1
- Date: Wed, 27 Oct 2021 20:28:53 GMT
- Title: Subtleties in the trainability of quantum machine learning models
- Authors: Supanut Thanasilp, Samson Wang, Nhat A. Nghiem, Patrick J. Coles, M. Cerezo
- Abstract summary: We show that gradient scaling results for Variational Quantum Algorithms (VQAs) can be applied to study the gradient scaling of Quantum Machine Learning (QML) models.
Our results indicate that features deemed detrimental for VQA trainability can also lead to issues such as barren plateaus in QML.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: A new paradigm for data science has emerged, with quantum data, quantum
models, and quantum computational devices. This field, called Quantum Machine
Learning (QML), aims to achieve a speedup over traditional machine learning for
data analysis. However, its success usually hinges on efficiently training the
parameters in quantum neural networks, and the field of QML is still lacking
theoretical scaling results for their trainability. Some trainability results
have been proven for a closely related field called Variational Quantum
Algorithms (VQAs). While both fields involve training a parametrized quantum
circuit, there are crucial differences that make the results for one setting
not readily applicable to the other. In this work we bridge the two frameworks
and show that gradient scaling results for VQAs can also be applied to study
the gradient scaling of QML models. Our results indicate that features deemed
detrimental for VQA trainability can also lead to issues such as barren
plateaus in QML. Consequently, our work has implications for several QML
proposals in the literature. In addition, we provide theoretical and numerical
evidence that QML models exhibit further trainability issues not present in
VQAs, arising from the use of a training dataset. We refer to these as
dataset-induced barren plateaus. These results are most relevant when dealing
with classical data, as here the choice of embedding scheme (i.e., the map
between classical data and quantum states) can greatly affect the gradient
scaling.
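As a rough, illustrative companion to the barren-plateau discussion above (not code or data from the paper), the sketch below estimates the variance of a single cost-gradient component for a randomly initialized hardware-efficient ansatz with a global observable, as a function of the number of qubits. The standard barren-plateau diagnostic is this variance decaying exponentially with system size. The ansatz, observable, sample counts, and the use of PennyLane are all assumptions made for illustration.

```python
# Minimal sketch, assuming PennyLane is installed; illustrative only, not the
# paper's code. Estimates Var[dC/dtheta] over random initializations, the
# quantity whose exponential decay with qubit number signals a barren plateau.
import numpy as onp                 # plain NumPy for sampling and statistics
import pennylane as qml
from pennylane import numpy as pnp  # differentiable array type for parameters


def gradient_variance(n_qubits, n_layers=4, n_samples=50, seed=0):
    """Estimate the variance of one gradient component over random inits."""
    dev = qml.device("default.qubit", wires=n_qubits)

    @qml.qnode(dev)
    def cost(params):
        # Hardware-efficient ansatz: single-qubit rotations + entangling CNOTs.
        for layer in range(n_layers):
            for w in range(n_qubits):
                qml.RY(params[layer, w], wires=w)
            for w in range(n_qubits - 1):
                qml.CNOT(wires=[w, w + 1])
        # Global cost: product of Pauli-Z on every qubit (barren-plateau prone).
        obs = qml.PauliZ(0)
        for w in range(1, n_qubits):
            obs = obs @ qml.PauliZ(w)
        return qml.expval(obs)

    rng = onp.random.default_rng(seed)
    grad_fn = qml.grad(cost)
    samples = []
    for _ in range(n_samples):
        params = pnp.array(rng.uniform(0, 2 * onp.pi, (n_layers, n_qubits)),
                           requires_grad=True)
        samples.append(grad_fn(params)[0, 0])  # one fixed gradient component
    return onp.var(samples)


if __name__ == "__main__":
    for n in range(2, 9, 2):
        print(f"{n} qubits: Var[grad] ~ {gradient_variance(n):.3e}")
```

If the printed variances shrink roughly exponentially as qubits are added, the randomly initialized model sits on a barren plateau and gradient-based training becomes exponentially expensive in the number of measurement shots.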
Related papers
- Quantum Active Learning [3.3202982522589934]
Training a quantum neural network typically demands a substantial labeled training set for supervised learning.
Quantum active learning (QAL) effectively trains the model, achieving performance comparable to that obtained on fully labeled datasets.
We elucidate the negative result that QAL can be overtaken by a random-sampling baseline through a range of numerical experiments.
arXiv Detail & Related papers (2024-05-28T14:39:54Z) - Transition Role of Entangled Data in Quantum Machine Learning [51.6526011493678]
Entanglement serves as the resource to empower quantum computing.
Recent progress has highlighted its positive impact on learning quantum dynamics.
We establish a quantum no-free-lunch (NFL) theorem for learning quantum dynamics using entangled data.
arXiv Detail & Related papers (2023-06-06T08:06:43Z) - Classical-to-Quantum Transfer Learning Facilitates Machine Learning with Variational Quantum Circuit [62.55763504085508]
We prove that a classical-to-quantum transfer learning architecture using a Variational Quantum Circuit (VQC) improves the representation and generalization (estimation error) capabilities of the VQC model.
We show that the architecture of classical-to-quantum transfer learning leverages pre-trained classical generative AI models, making it easier to find the optimal parameters for the VQC in the training stage.
arXiv Detail & Related papers (2023-05-18T03:08:18Z) - Quantum Imitation Learning [74.15588381240795]
We propose quantum imitation learning (QIL) in the hope of using quantum advantage to speed up imitation learning (IL).
We develop two QIL algorithms: quantum behavioural cloning (Q-BC) and quantum generative adversarial imitation learning (Q-GAIL).
Experimental results demonstrate that both Q-BC and Q-GAIL can achieve performance comparable to their classical counterparts.
arXiv Detail & Related papers (2023-04-04T12:47:35Z) - Reflection Equivariant Quantum Neural Networks for Enhanced Image Classification [0.7232471205719458]
We build new machine learning models that explicitly respect the symmetries inherent in their data, an approach known as geometric quantum machine learning (GQML).
We find that these networks are capable of consistently and significantly outperforming generic ansätze on complicated real-world image datasets.
arXiv Detail & Related papers (2022-12-01T04:10:26Z) - A didactic approach to quantum machine learning with a single qubit [68.8204255655161]
We focus on the case of learning with a single qubit, using data re-uploading techniques; a minimal illustrative sketch of such a circuit appears after this list.
We implement the different proposed formulations on toy and real-world datasets using the Qiskit quantum computing SDK.
arXiv Detail & Related papers (2022-11-23T18:25:32Z) - Entangled Datasets for Quantum Machine Learning [0.0]
We argue that, rather than classical datasets, one should employ quantum datasets composed of quantum states.
We show how a quantum neural network can be trained to generate the states in the NTangled dataset.
We also consider an alternative entanglement-based dataset, which is scalable and is composed of states prepared by quantum circuits.
arXiv Detail & Related papers (2021-09-08T02:20:13Z) - Quantum Federated Learning with Quantum Data [87.49715898878858]
Quantum machine learning (QML) has emerged as a promising field that builds on developments in quantum computing to tackle large, complex machine learning problems.
This paper proposes the first fully quantum federated learning framework that can operate over quantum data and, thus, share the learning of quantum circuit parameters in a decentralized manner.
arXiv Detail & Related papers (2021-05-30T12:19:27Z) - Hybrid Quantum-Classical Graph Convolutional Network [7.0132255816377445]
This research provides a hybrid quantum-classical graph convolutional network (QGCNN) for learning high-energy physics (HEP) data.
The proposed framework demonstrates an advantage over classical multilayer perceptrons and convolutional neural networks in terms of the number of parameters.
In terms of testing accuracy, the QGCNN shows comparable performance to a quantum convolutional neural network on the same HEP dataset.
arXiv Detail & Related papers (2021-01-15T16:02:52Z) - Quantum circuit architecture search for variational quantum algorithms [88.71725630554758]
We propose a resource- and runtime-efficient scheme termed quantum architecture search (QAS).
QAS automatically seeks a near-optimal ansatz to balance the benefits and side effects of adding more noisy quantum gates.
We implement QAS on both the numerical simulator and real quantum hardware, via the IBM cloud, to accomplish data classification and quantum chemistry tasks.
arXiv Detail & Related papers (2020-10-20T12:06:27Z)
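As flagged in the single-qubit entry above, here is a minimal, hedged sketch of the data re-uploading idea (not code from any of the listed papers): data-encoding and trainable rotations alternate on one qubit, and the probability of measuring |0> serves as the class score. The gate choices, layer count, and use of Qiskit's Statevector simulator are illustrative assumptions.

```python
# Minimal single-qubit data re-uploading sketch, assuming Qiskit is installed.
# Illustrative only; the cited paper's actual circuits and training may differ.
import numpy as np
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector


def reuploading_circuit(x, thetas):
    """Alternate data-encoding RY(x) and trainable RZ(theta) on one qubit."""
    qc = QuantumCircuit(1)
    for theta in thetas:        # one re-uploading layer per trainable angle
        qc.ry(x, 0)             # re-upload the (scalar) data point
        qc.rz(theta, 0)         # trainable rotation
    return qc


def predict(x, thetas):
    """Probability of measuring |0>, used as a simple binary class score."""
    state = Statevector(reuploading_circuit(x, thetas))
    return float(np.abs(state.data[0]) ** 2)


if __name__ == "__main__":
    thetas = np.linspace(0.1, 1.0, 4)      # four illustrative trainable angles
    for x in (0.0, 1.5, 3.0):
        print(f"x={x:.1f} -> P(|0>) = {predict(x, thetas):.3f}")
```

In a full classifier the angles would be optimized against a labeled dataset; repeating the encoding between trainable gates is what gives even a single qubit the expressivity to fit nonlinear decision boundaries.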