Generalization in quantum machine learning from few training data
- URL: http://arxiv.org/abs/2111.05292v1
- Date: Tue, 9 Nov 2021 17:49:46 GMT
- Title: Generalization in quantum machine learning from few training data
- Authors: Matthias C. Caro, Hsin-Yuan Huang, M. Cerezo, Kunal Sharma, Andrew
Sornborger, Lukasz Cincio, Patrick J. Coles
- Abstract summary: Modern quantum machine learning (QML) methods involve variationally optimizing a parameterized quantum circuit on a training data set.
We show that the generalization error of a quantum machine learning model with $T$ trainable gates scales at worst as $\sqrt{T/N}$.
We also show that classification of quantum states across a phase transition with a quantum convolutional neural network requires only a very small training data set.
- Score: 4.325561431427748
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Modern quantum machine learning (QML) methods involve variationally
optimizing a parameterized quantum circuit on a training data set, and
subsequently making predictions on a testing data set (i.e., generalizing). In
this work, we provide a comprehensive study of generalization performance in
QML after training on a limited number $N$ of training data points. We show
that the generalization error of a quantum machine learning model with $T$
trainable gates scales at worst as $\sqrt{T/N}$. When only $K \ll T$ gates have
undergone substantial change in the optimization process, we prove that the
generalization error improves to $\sqrt{K / N}$. Our results imply that the
compiling of unitaries into a polynomial number of native gates, a crucial
application for the quantum computing industry that typically uses
exponential-size training data, can be sped up significantly. We also show that
classification of quantum states across a phase transition with a quantum
convolutional neural network requires only a very small training data set.
Other potential applications include learning quantum error correcting codes or
quantum dynamical simulation. Our work injects new hope into the field of QML,
as good generalization is guaranteed from few training data.
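To make the claimed scaling concrete, the following minimal Python sketch (not from the paper; the unit prefactor and the example values of $T$, $K$, and $N$ are assumptions chosen purely for illustration) prints how the worst-case $\sqrt{T/N}$ rate and the improved $\sqrt{K/N}$ rate shrink as the training set grows:

```python
import math

def gen_error_scaling(num_gates: int, num_train: int) -> float:
    """Illustrative sqrt(gates / training points) scaling of the
    generalization-error bound; the constant prefactor is set to 1,
    which is an assumption, not the paper's exact bound."""
    return math.sqrt(num_gates / num_train)

T = 1000  # trainable gates in the circuit (example value)
K = 20    # gates that changed substantially during optimization (example)
for N in (10, 100, 1000, 10000):
    print(f"N={N:>5d}  sqrt(T/N)={gen_error_scaling(T, N):.3f}  "
          f"sqrt(K/N)={gen_error_scaling(K, N):.3f}")
```

Even with a thousand trainable gates, if only $K = 20$ of them change substantially during optimization, the $\sqrt{K/N}$ bound is already small at modest $N$.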
Related papers
- Variational quantum regression algorithm with encoded data structure [0.21756081703276003]
We construct a quantum regression algorithm wherein the quantum state directly encodes the classical data table.
We show explicitly, for the first time, how the linked structure of the classical data can be exploited directly through quantum subroutines.
arXiv Detail & Related papers (2023-07-07T00:30:16Z) - Transition Role of Entangled Data in Quantum Machine Learning [51.6526011493678]
Entanglement serves as a key resource for quantum computing.
Recent progress has highlighted its positive impact on learning quantum dynamics.
We establish a quantum no-free-lunch (NFL) theorem for learning quantum dynamics using entangled data.
arXiv Detail & Related papers (2023-06-06T08:06:43Z) - Classical-to-Quantum Transfer Learning Facilitates Machine Learning with Variational Quantum Circuit [62.55763504085508]
We prove that a classical-to-quantum transfer learning architecture using a Variational Quantum Circuit (VQC) improves the representation and generalization (estimation error) capabilities of the VQC model.
We show that the architecture of classical-to-quantum transfer learning leverages pre-trained classical generative AI models, making it easier to find the optimal parameters for the VQC in the training stage.
arXiv Detail & Related papers (2023-05-18T03:08:18Z) - A didactic approach to quantum machine learning with a single qubit [68.8204255655161]
We focus on the case of learning with a single qubit, using data re-uploading techniques.
We implement the different proposed formulations on toy and real-world datasets using the Qiskit quantum computing SDK (a minimal sketch of this idea appears after this list).
arXiv Detail & Related papers (2022-11-23T18:25:32Z) - Theoretical Guarantees for Permutation-Equivariant Quantum Neural
Networks [0.0]
We show how to build equivariant quantum neural networks (QNNs).
We prove that they do not suffer from barren plateaus, quickly reach overparametrization, and generalize well from small amounts of data.
Our work provides the first theoretical guarantees for equivariant QNNs, indicating the power and potential of geometric quantum machine learning (GQML).
arXiv Detail & Related papers (2022-10-18T16:35:44Z) - Quantum neural networks [0.0]
This thesis combines two of the most exciting research areas of the last decades: quantum computing and machine learning.
We introduce dissipative quantum neural networks (DQNNs), which are capable of universal quantum computation and have low memory requirements while training.
arXiv Detail & Related papers (2022-05-17T07:47:00Z) - Out-of-distribution generalization for learning quantum dynamics [2.1503874224655997]
We show that one can learn the action of a unitary on entangled states having trained only on product states.
This advances the prospects of learning quantum dynamics on near-term quantum hardware.
arXiv Detail & Related papers (2022-04-21T17:15:23Z) - Optimal quantum dataset for learning a unitary transformation [5.526775342940154]
How to learn a unitary transformation efficiently is a fundamental problem in quantum machine learning.
We introduce a quantum dataset consisting of $n+1$ mixed states that are sufficient for exact training.
We show that the size of the quantum dataset with mixed states can be reduced to a constant, which yields an optimal quantum dataset for learning a unitary.
arXiv Detail & Related papers (2022-03-01T15:29:39Z) - A quantum algorithm for training wide and deep classical neural networks [72.2614468437919]
We show that conditions amenable to classical trainability via gradient descent coincide with those necessary for efficiently solving quantum linear systems.
We numerically demonstrate that the MNIST image dataset satisfies such conditions.
We provide empirical evidence for $O(\log n)$ training of a convolutional neural network with pooling.
arXiv Detail & Related papers (2021-07-19T23:41:03Z) - Quantum Federated Learning with Quantum Data [87.49715898878858]
Quantum machine learning (QML) has emerged as a promising field that builds on developments in quantum computing to tackle large, complex machine learning problems.
This paper proposes the first fully quantum federated learning framework that can operate over quantum data and, thus, share the learning of quantum circuit parameters in a decentralized manner.
arXiv Detail & Related papers (2021-05-30T12:19:27Z) - Quantum Gram-Schmidt Processes and Their Application to Efficient State
Read-out for Quantum Algorithms [87.04438831673063]
We present an efficient read-out protocol that yields the classical vector form of the generated state.
Our protocol suits the case that the output state lies in the row space of the input matrix.
One of our technical tools is an efficient quantum algorithm for performing the Gram-Schmidt orthonormalization procedure.
arXiv Detail & Related papers (2020-04-14T11:05:26Z)
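As a rough illustration of the single-qubit data re-uploading idea from the didactic paper above, here is a minimal, hypothetical Qiskit sketch (the layer count, gate choices, and parameters are illustrative assumptions, not the authors' formulation): the classical input is re-encoded between trainable rotations, layer by layer.

```python
import numpy as np
from qiskit import QuantumCircuit

def reuploading_circuit(x: float, thetas: np.ndarray) -> QuantumCircuit:
    """Single-qubit data re-uploading ansatz: alternate a data-encoding
    rotation with a trainable rotation in each layer."""
    qc = QuantumCircuit(1)
    for w, b in thetas:        # one (weight, bias) pair per layer
        qc.ry(w * x, 0)        # re-upload the scaled input
        qc.rz(b, 0)            # trainable rotation
    qc.measure_all()
    return qc

# Three layers with arbitrary parameters, purely for illustration.
params = np.random.default_rng(0).uniform(-np.pi, np.pi, size=(3, 2))
print(reuploading_circuit(0.5, params).draw())
```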
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of this information and is not responsible for any consequences of its use.