Out-of-distribution generalization for learning quantum dynamics
- URL: http://arxiv.org/abs/2204.10268v3
- Date: Sun, 9 Jul 2023 04:31:25 GMT
- Title: Out-of-distribution generalization for learning quantum dynamics
- Authors: Matthias C. Caro, Hsin-Yuan Huang, Nicholas Ezzell, Joe Gibbs, Andrew T. Sornborger, Lukasz Cincio, Patrick J. Coles, Zoë Holmes
- Abstract summary: We show that one can learn the action of a unitary on entangled states having trained only on product states.
This advances the prospects of learning quantum dynamics on near term quantum hardware.
- Score: 2.1503874224655997
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Generalization bounds are a critical tool to assess the training data
requirements of Quantum Machine Learning (QML). Recent work has established
guarantees for in-distribution generalization of quantum neural networks
(QNNs), where training and testing data are drawn from the same data
distribution. However, there are currently no results on out-of-distribution
generalization in QML, where we require a trained model to perform well even on
data drawn from a distribution different from the training distribution. Here, we
prove out-of-distribution generalization for the task of learning an unknown
unitary. In particular, we show that one can learn the action of a unitary on
entangled states having trained only on product states. Since product states can
be prepared using only single-qubit gates, this advances the prospects of
learning quantum dynamics on near term quantum hardware, and further opens up
new methods for both the classical and quantum compilation of quantum circuits.
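To make the abstract's claim concrete, here is a minimal numpy sketch, not the authors' implementation: it fits a fully parameterized two-qubit unitary to an unknown target using only random product states (preparable with single-qubit gates alone), then evaluates the learned unitary out of distribution on an entangled Bell state. The generator basis, the finite-difference optimizer, and all names are illustrative assumptions.

```python
# Minimal sketch (illustrative, not the paper's code): learn an unknown
# two-qubit unitary U from product states, test on an entangled state.
import numpy as np

rng = np.random.default_rng(7)
n = 2          # number of qubits
d = 2 ** n     # Hilbert-space dimension

def herm_basis(d):
    """Basis of d*d Hermitian matrices (generators of U(d))."""
    basis = []
    for i in range(d):
        for j in range(i, d):
            m = np.zeros((d, d), dtype=complex)
            if i == j:
                m[i, i] = 1.0
                basis.append(m)
            else:
                m[i, j] = m[j, i] = 1.0
                basis.append(m)
                m2 = np.zeros((d, d), dtype=complex)
                m2[i, j] = -1j
                m2[j, i] = 1j
                basis.append(m2)
    return basis

GENS = herm_basis(d)

def V(theta):
    """V(theta) = exp(-i * sum_k theta_k G_k), via eigendecomposition."""
    H = sum(t * g for t, g in zip(theta, GENS))
    w, P = np.linalg.eigh(H)
    return P @ np.diag(np.exp(-1j * w)) @ P.conj().T

def random_product_state(n):
    """Tensor product of random single-qubit states (single-qubit gates only)."""
    psi = np.array([1.0 + 0j])
    for _ in range(n):
        v = rng.normal(size=2) + 1j * rng.normal(size=2)
        psi = np.kron(psi, v / np.linalg.norm(v))
    return psi

U = V(rng.normal(size=len(GENS)))  # unknown target unitary

def cost(theta, states):
    """Average infidelity between V(theta)|psi> and U|psi>."""
    Vt = V(theta)
    return np.mean([1.0 - abs(np.vdot(U @ s, Vt @ s)) ** 2 for s in states])

train = [random_product_state(n) for _ in range(20)]  # product states only

# Plain finite-difference gradient descent (the paper trains a QNN instead).
theta, eps, lr = rng.normal(size=len(GENS)), 1e-4, 1.0
for _ in range(300):
    grad = np.array([(cost(theta + eps * e, train) -
                      cost(theta - eps * e, train)) / (2 * eps)
                     for e in np.eye(len(theta))])
    theta -= lr * grad

# Out-of-distribution test on the entangled Bell state (|00> + |11>)/sqrt(2).
bell = np.zeros(d, dtype=complex)
bell[0] = bell[-1] = 1 / np.sqrt(2)
print("train infidelity:", cost(theta, train))
print("Bell-state infidelity:", cost(theta, [bell]))
```

If the out-of-distribution claim holds, the Bell-state infidelity should track the training infidelity even though no entangled state was ever used during training.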
Related papers
- Efficient Learning for Linear Properties of Bounded-Gate Quantum Circuits [63.733312560668274]
Given a quantum circuit containing d tunable RZ gates and G-d Clifford gates, can a learner perform purely classical inference to efficiently predict its linear properties?
We prove that the sample complexity scaling linearly in d is necessary and sufficient to achieve a small prediction error, while the corresponding computational complexity may scale exponentially in d.
We devise a kernel-based learning model capable of trading off prediction error and computational complexity, transitioning from exponential to polynomial scaling in many practical settings.
arXiv Detail & Related papers (2024-08-22T08:21:28Z) - Understanding quantum machine learning also requires rethinking generalization [0.3683202928838613]
We show that traditional approaches to understanding generalization fail to explain the behavior of quantum models.
Experiments reveal that state-of-the-art quantum neural networks accurately fit random states and random labeling of training data.
arXiv Detail & Related papers (2023-06-23T12:04:13Z) - Generalization of Quantum Machine Learning Models Using Quantum Fisher Information Metric [0.0]
We introduce the data quantum Fisher information metric (DQFIM).
It describes the capacity of variational quantum algorithms depending on the variational ansatz, the training data, and their symmetries.
Using the dynamical Lie algebra, we explain how to generalize from a low number of training states.
Finally, we find that out-of-distribution generalization, where training and testing data are drawn from different distributions, can be better than generalization from identically distributed data.
arXiv Detail & Related papers (2023-03-23T17:32:20Z) - A didactic approach to quantum machine learning with a single qubit [68.8204255655161]
We focus on the case of learning with a single qubit, using data re-uploading techniques (a minimal sketch of the re-uploading circuit appears after this list).
We implement the different proposed formulations on toy and real-world datasets using the qiskit quantum computing SDK.
arXiv Detail & Related papers (2022-11-23T18:25:32Z) - Protocols for classically training quantum generative models on probability distributions [17.857341127079305]
Quantum Generative Modelling (QGM) relies on preparing quantum states and generating samples from these states as hidden or known probability distributions.
We propose protocols for classical training of QGMs based on circuits of a specific type that admit efficient gradient computation.
We numerically demonstrate the end-to-end training of IQP circuits using probability distributions for up to 30 qubits on a regular desktop computer.
arXiv Detail & Related papers (2022-10-24T17:57:09Z) - Do Quantum Circuit Born Machines Generalize? [58.720142291102135]
We present the first work in the literature to treat the QCBM's generalization performance as an integral evaluation metric for quantum generative models.
We show that the QCBM is able to effectively learn the reweighted dataset and generate unseen samples with higher quality than those in the training set.
arXiv Detail & Related papers (2022-07-27T17:06:34Z) - Quantum neural networks [0.0]
This thesis combines two of the most exciting research areas of the last decades: quantum computing and machine learning.
We introduce dissipative quantum neural networks (DQNNs), which are capable of universal quantum computation and have low memory requirements while training.
arXiv Detail & Related papers (2022-05-17T07:47:00Z) - Theory of Quantum Generative Learning Models with Maximum Mean Discrepancy [67.02951777522547]
We study the learnability of quantum circuit Born machines (QCBMs) and quantum generative adversarial networks (QGANs).
We first analyze the generalization ability of QCBMs and identify their advantages when quantum devices can directly access the target distribution.
Next, we prove how the generalization error bound of QGANs depends on the employed Ansatz, the number of qudits, and input states.
arXiv Detail & Related papers (2022-05-10T08:05:59Z) - Generalization in quantum machine learning from few training data [4.325561431427748]
Modern quantum machine learning (QML) methods involve variationally optimizing a parameterized quantum circuit on a training data set.
We show that the generalization error of a quantum machine learning model with $T$ trainable gates scales at worst as $\sqrt{T/N}$, where $N$ is the size of the training data set.
We also show that classification of quantum states across a phase transition with a quantum convolutional neural network requires only a very small training data set.
arXiv Detail & Related papers (2021-11-09T17:49:46Z) - Learnability of the output distributions of local quantum circuits [53.17490581210575]
We investigate, within two different oracle models, the learnability of quantum circuit Born machines.
We first show a negative result, that the output distributions of super-logarithmic depth Clifford circuits are not sample-efficiently learnable.
We show that in a more powerful oracle model, namely when directly given access to samples, the output distributions of local Clifford circuits are computationally efficiently PAC learnable.
arXiv Detail & Related papers (2021-10-11T18:00:20Z) - Quantum Federated Learning with Quantum Data [87.49715898878858]
Quantum machine learning (QML) has emerged as a promising field that builds on developments in quantum computing to explore large, complex machine learning problems.
This paper proposes the first fully quantum federated learning framework that can operate over quantum data and, thus, share the learning of quantum circuit parameters in a decentralized manner.
arXiv Detail & Related papers (2021-05-30T12:19:27Z)
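The single-qubit data re-uploading entry above (2022-11-23) lends itself to a compact illustration. Below is a minimal plain-numpy sketch of the re-uploading forward pass; the paper's experiments use the qiskit SDK, and the layer count, the ZYZ rotation decomposition, and all function names here are illustrative assumptions rather than the paper's implementation.

```python
# Minimal sketch of single-qubit data re-uploading (illustrative only).
import numpy as np

def rot(phi):
    """General single-qubit rotation R(phi) = Rz(phi[2]) Ry(phi[1]) Rz(phi[0])."""
    a, b, c = phi
    rz = lambda t: np.diag([np.exp(-0.5j * t), np.exp(0.5j * t)])
    ry = lambda t: np.array([[np.cos(t / 2), -np.sin(t / 2)],
                             [np.sin(t / 2),  np.cos(t / 2)]])
    return rz(c) @ ry(b) @ rz(a)

def reupload(x, thetas, weights):
    """Re-upload the same feature x in every layer: apply R(theta_l + w_l * x)."""
    psi = np.array([1.0 + 0j, 0.0])           # start in |0>
    for th, w in zip(thetas, weights):
        psi = rot(th + w * x) @ psi
    return psi

def predict(x, thetas, weights):
    """Binary label from the |1>-population of the final state."""
    p1 = abs(reupload(x, thetas, weights)[1]) ** 2
    return int(p1 > 0.5)

# Example: 4 layers, each with 3 trainable angles and 3 data weights.
rng = np.random.default_rng(0)
thetas = rng.normal(size=(4, 3))
weights = rng.normal(size=(4, 3))
print(predict(0.3, thetas, weights))
```

Re-uploading the data point in every layer is what gives a single qubit nontrivial expressivity: the output probability becomes a trainable trigonometric series in x rather than a single rotation.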
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information it provides and is not responsible for any consequences of its use.