Generalization in quantum machine learning from few training data
- URL: http://arxiv.org/abs/2111.05292v1
- Date: Tue, 9 Nov 2021 17:49:46 GMT
- Title: Generalization in quantum machine learning from few training data
- Authors: Matthias C. Caro, Hsin-Yuan Huang, M. Cerezo, Kunal Sharma, Andrew
Sornborger, Lukasz Cincio, Patrick J. Coles
- Abstract summary: Modern quantum machine learning (QML) methods involve variationally optimizing a parameterized quantum circuit on a training data set.
We show that the generalization error of a quantum machine learning model with $T$ trainable gates scales at worst as $\sqrt{T/N}$.
We also show that classification of quantum states across a phase transition with a quantum convolutional neural network requires only a very small training data set.
- Score: 4.325561431427748
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Modern quantum machine learning (QML) methods involve variationally
optimizing a parameterized quantum circuit on a training data set, and
subsequently making predictions on a testing data set (i.e., generalizing). In
this work, we provide a comprehensive study of generalization performance in
QML after training on a limited number $N$ of training data points. We show
that the generalization error of a quantum machine learning model with $T$
trainable gates scales at worst as $\sqrt{T/N}$. When only $K \ll T$ gates have
undergone substantial change in the optimization process, we prove that the
generalization error improves to $\sqrt{K / N}$. Our results imply that the
compiling of unitaries into a polynomial number of native gates, a crucial
application for the quantum computing industry that typically uses
exponential-size training data, can be sped up significantly. We also show that
classification of quantum states across a phase transition with a quantum
convolutional neural network requires only a very small training data set.
Other potential applications include learning quantum error correcting codes or
quantum dynamical simulation. Our work injects new hope into the field of QML,
as good generalization is guaranteed from few training data.
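As a quick sanity check on these scalings, the following is a minimal Python sketch (my own illustration, not code from the paper; the constant-free bounds below drop the constants and logarithmic factors of the actual theorems) that inverts the $\sqrt{T/N}$ bound to estimate how many training points suffice for a target generalization error, and shows the saving when only $K \ll T$ gates change appreciably.

```python
import math

def gen_error_bound(num_gates: int, num_data: int) -> float:
    """Worst-case scaling from the paper: generalization error ~ sqrt(T / N).
    Constant-free illustrative form (constants/log factors dropped)."""
    return math.sqrt(num_gates / num_data)

def data_needed(num_gates: int, target_error: float) -> int:
    """Invert sqrt(T / N) <= eps to get N >= T / eps**2 (up to constants/logs)."""
    return math.ceil(num_gates / target_error ** 2)

T = 1000    # trainable gates in the parameterized circuit
K = 20      # gates that change substantially during optimization (K << T)
eps = 0.1   # target generalization error

print(data_needed(T, eps))       # 100000 points if all T gates are counted
print(data_needed(K, eps))       # 2000 points when only K gates effectively move
print(gen_error_bound(K, 2000))  # ~0.1, consistent with the target error
```

The point of the comparison is the paper's refinement: when optimization only substantially changes $K \ll T$ gates, the data requirement shrinks by a factor of roughly $T/K$.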
Related papers
- Efficient Learning for Linear Properties of Bounded-Gate Quantum Circuits [63.733312560668274]
Given a quantum circuit containing $d$ tunable RZ gates and $G-d$ Clifford gates, can a learner perform purely classical inference to efficiently predict its linear properties?
We prove that sample complexity scaling linearly in $d$ is necessary and sufficient to achieve a small prediction error, while the corresponding computational complexity may scale exponentially in $d$.
We devise a kernel-based learning model capable of trading off prediction error against computational complexity, transitioning from exponential to polynomial scaling in many practical settings.
arXiv Detail & Related papers (2024-08-22T08:21:28Z) - The curse of random quantum data [62.24825255497622]
We quantify the performances of quantum machine learning in the landscape of quantum data.
We find that training efficiency and generalization capability in quantum machine learning are exponentially suppressed as the number of qubits increases.
Our findings apply to both the quantum kernel method and the large-width limit of quantum neural networks.
arXiv Detail & Related papers (2024-08-19T12:18:07Z) - Variational quantum regression algorithm with encoded data structure [0.21756081703276003]
We construct a quantum regression algorithm wherein the quantum state directly encodes the classical data table.
We show explicitly, for the first time, how the linkage of the classical data structure can be exploited directly through quantum subroutines.
arXiv Detail & Related papers (2023-07-07T00:30:16Z) - Transition Role of Entangled Data in Quantum Machine Learning [51.6526011493678]
Entanglement serves as a key resource that empowers quantum computing.
Recent progress has highlighted its positive impact on learning quantum dynamics.
We establish a quantum no-free-lunch (NFL) theorem for learning quantum dynamics using entangled data.
arXiv Detail & Related papers (2023-06-06T08:06:43Z) - Classical-to-Quantum Transfer Learning Facilitates Machine Learning with Variational Quantum Circuit [62.55763504085508]
We prove that a classical-to-quantum transfer learning architecture using a Variational Quantum Circuit (VQC) improves the representation and generalization (estimation error) capabilities of the VQC model.
We show that the architecture of classical-to-quantum transfer learning leverages pre-trained classical generative AI models, making it easier to find the optimal parameters for the VQC in the training stage.
arXiv Detail & Related papers (2023-05-18T03:08:18Z) - Generalization of Quantum Machine Learning Models Using Quantum Fisher Information Metric [0.0]
We introduce the data quantum Fisher information metric (DQFIM).
It describes the capacity of variational quantum algorithms depending on variational ansatz, training data and their symmetries.
Using the dynamical Lie algebra, we explain how to generalize using a low number of training states.
Finally, we find that out-of-distribution generalization, where training and testing data are drawn from different data distributions, can be better than using the same distribution.
arXiv Detail & Related papers (2023-03-23T17:32:20Z) - Quantum neural networks [0.0]
This thesis combines two of the most exciting research areas of the last decades: quantum computing and machine learning.
We introduce dissipative quantum neural networks (DQNNs), which are capable of universal quantum computation and have low memory requirements while training.
arXiv Detail & Related papers (2022-05-17T07:47:00Z) - Out-of-distribution generalization for learning quantum dynamics [2.1503874224655997]
We show that one can learn the action of a unitary on entangled states having trained only product states.
This advances the prospects of learning quantum dynamics on near term quantum hardware.
arXiv Detail & Related papers (2022-04-21T17:15:23Z) - Optimal quantum dataset for learning a unitary transformation [5.526775342940154]
How to learn a unitary transformation efficiently is a fundamental problem in quantum machine learning.
We introduce a quantum dataset consisting of $n+1$ mixed states that are sufficient for exact training.
We show that the size of the quantum dataset with mixed states can be reduced to a constant, yielding an optimal quantum dataset for learning a unitary.
arXiv Detail & Related papers (2022-03-01T15:29:39Z) - A quantum algorithm for training wide and deep classical neural networks [72.2614468437919]
We show that conditions amenable to classical trainability via gradient descent coincide with those necessary for efficiently solving quantum linear systems.
We numerically demonstrate that the MNIST image dataset satisfies such conditions.
We provide empirical evidence for $O(\log n)$ training of a convolutional neural network with pooling.
arXiv Detail & Related papers (2021-07-19T23:41:03Z) - Quantum Federated Learning with Quantum Data [87.49715898878858]
Quantum machine learning (QML) has emerged as a promising field that leans on the developments in quantum computing to explore large complex machine learning problems.
This paper proposes the first fully quantum federated learning framework that can operate over quantum data and, thus, share the learning of quantum circuit parameters in a decentralized manner.
arXiv Detail & Related papers (2021-05-30T12:19:27Z)