Protocols for classically training quantum generative models on
probability distributions
- URL: http://arxiv.org/abs/2210.13442v2
- Date: Fri, 6 Oct 2023 13:14:06 GMT
- Authors: Sachin Kasture, Oleksandr Kyriienko, Vincent E. Elfving
- Abstract summary: Quantum Generative Modelling (QGM) relies on preparing quantum states and generating samples from these states as hidden (or known) probability distributions.
We propose protocols for classical training of QGMs based on circuits of a specific type that admit efficient gradient computation.
We numerically demonstrate end-to-end training of IQP circuits on probability distributions of up to 30 qubits using a regular desktop computer.
- Score: 17.857341127079305
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Quantum Generative Modelling (QGM) relies on preparing quantum states and
generating samples from these states as hidden (or known) probability
distributions. As distributions from some classes of quantum states (circuits)
are inherently hard to sample classically, QGM represents an excellent testbed
for quantum supremacy experiments. Furthermore, generative tasks are
increasingly relevant for industrial machine learning applications, and thus
QGM is a strong candidate for demonstrating a practical quantum advantage.
However, this requires that quantum circuits are trained to represent
industrially relevant distributions, and in practice the corresponding training
stage carries an extensive cost on current quantum hardware. In this work, we
propose protocols for classical training of QGMs based on circuits of a
specific type that admit an efficient gradient computation while remaining hard
to sample. In particular, we consider Instantaneous Quantum Polynomial (IQP)
circuits and their extensions. By analyzing their classical simulability in
terms of time complexity, sparsity, and anti-concentration properties, we
develop a classically tractable way of simulating their output probability
distributions, allowing classical training towards a target probability
distribution. The corresponding quantum sampling from IQP circuits can be
performed efficiently, unlike classical sampling. We numerically demonstrate
the end-to-end training of IQP circuits on probability distributions of up to
30 qubits using a regular desktop computer. When applied to industrially
relevant distributions, this combination of classical training with quantum
sampling represents an avenue for reaching quantum advantage in the NISQ era.
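The circuit family in question is easy to state: an IQP circuit is a Hadamard layer, a diagonal phase layer generated by products of Z operators, and a final Hadamard layer. The statevector sketch below computes the exact output distribution for a handful of qubits; it is illustrative only, scales exponentially, and is not the efficient protocol proposed in the paper, and the subset-to-angle parameterization (`thetas`) is an assumption made here for concreteness.

```python
import numpy as np

def iqp_probs(n, thetas):
    """Exact output distribution of an n-qubit IQP circuit
    H^(xn) . exp(i * sum_S theta_S prod_{j in S} Z_j) . H^(xn) |0...0>.
    thetas maps tuples of qubit indices to angles, e.g. {(0, 1): 0.7}."""
    dim = 2 ** n
    # First Hadamard layer on |0...0> gives the uniform superposition.
    state = np.full(dim, dim ** -0.5, dtype=complex)
    # Diagonal phase layer: Z_j has eigenvalue (-1)^{x_j} on basis state x.
    bits = (np.arange(dim)[:, None] >> np.arange(n)) & 1  # shape (dim, n)
    phase = np.zeros(dim)
    for subset, theta in thetas.items():
        phase += theta * (-1.0) ** bits[:, list(subset)].sum(axis=1)
    state *= np.exp(1j * phase)
    # Final Hadamard layer via an in-place fast Walsh-Hadamard transform.
    step = 1
    while step < dim:
        for i in range(0, dim, 2 * step):
            a = state[i:i + step].copy()
            b = state[i + step:i + 2 * step].copy()
            state[i:i + step] = a + b
            state[i + step:i + 2 * step] = a - b
        step *= 2
    state /= np.sqrt(dim)
    return np.abs(state) ** 2
```

With all angles zero the two Hadamard layers cancel and all probability returns to |0...0>; for non-trivial angles the resulting distribution can be fed to a classical training loop, while sampling the same circuit on quantum hardware avoids the exponential statevector cost.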
Related papers
- Efficient Learning for Linear Properties of Bounded-Gate Quantum Circuits [63.733312560668274]
Given a quantum circuit containing d tunable RZ gates and G-d Clifford gates, can a learner perform purely classical inference to efficiently predict its linear properties?
We prove that the sample complexity scaling linearly in d is necessary and sufficient to achieve a small prediction error, while the corresponding computational complexity may scale exponentially in d.
We devise a kernel-based learning model capable of trading off prediction error and computational complexity, transitioning from exponential to polynomial scaling in many practical settings.
arXiv Detail & Related papers (2024-08-22T08:21:28Z)
- A Quantum-Classical Collaborative Training Architecture Based on Quantum State Fidelity [50.387179833629254]
We introduce a collaborative classical-quantum architecture called co-TenQu.
Co-TenQu enhances a classical deep neural network by up to 41.72% in a fair setting.
It outperforms other quantum-based methods by up to 1.9 times and achieves similar accuracy while utilizing 70.59% fewer qubits.
arXiv Detail & Related papers (2024-02-23T14:09:41Z)
- Learning hard distributions with quantum-enhanced Variational Autoencoders [2.545905720487589]
We introduce a quantum-enhanced VAE (QeVAE) that uses quantum correlations to improve the fidelity over classical VAEs.
We empirically show that the QeVAE outperforms classical models on several classes of quantum states.
Our work paves the way for new applications of quantum generative learning algorithms.
arXiv Detail & Related papers (2023-05-02T16:50:24Z)
- A Framework for Demonstrating Practical Quantum Advantage: Racing Quantum against Classical Generative Models [62.997667081978825]
We build on a previously proposed framework for evaluating the generalization performance of generative models.
We establish the first comparative race towards practical quantum advantage (PQA) between classical and quantum generative models.
Our results suggest that QCBMs are more efficient in the data-limited regime than the other state-of-the-art classical generative models.
arXiv Detail & Related papers (2023-03-27T22:48:28Z)
- TeD-Q: a tensor network enhanced distributed hybrid quantum machine learning framework [59.07246314484875]
TeD-Q is an open-source software framework for quantum machine learning.
It seamlessly integrates classical machine learning libraries with quantum simulators.
It provides a graphical mode in which the quantum circuit and the training progress can be visualized in real-time.
arXiv Detail & Related papers (2023-01-13T09:35:05Z)
- Theory of Quantum Generative Learning Models with Maximum Mean Discrepancy [67.02951777522547]
We study the learnability of quantum circuit Born machines (QCBMs) and quantum generative adversarial networks (QGANs).
We first analyze the generalization ability of QCBMs and identify their advantages when the quantum devices can directly access the target distribution.
Next, we prove how the generalization error bound of QGANs depends on the employed Ansatz, the number of qudits, and input states.
arXiv Detail & Related papers (2022-05-10T08:05:59Z)
- Quantum-enhanced Markov chain Monte Carlo [0.22166578153935784]
We introduce a quantum algorithm to sample from distributions that pose a bottleneck in several applications.
In each step, the quantum processor explores the model in superposition to propose a random move, which is then accepted or rejected by a classical computer.
We find that this quantum algorithm converges in fewer iterations than common classical MCMC alternatives on relevant problem instances.
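The hybrid loop described here has a simple classical skeleton in which only the proposal step is quantum. Below is a minimal Metropolis sketch with a classical single-spin-flip stub standing in for the quantum proposal; the function names and the toy 1D Ising energy are illustrative assumptions, not the paper's implementation.

```python
import math
import random

def mh_step(state, energy, propose, beta, rng):
    """One Metropolis step: a proposal (quantum in the paper,
    a classical stub here) followed by the classical accept/reject."""
    candidate = propose(state, rng)
    delta = energy(candidate) - energy(state)
    # Accept with probability min(1, exp(-beta * delta)).
    if delta <= 0 or rng.random() < math.exp(-beta * delta):
        return candidate
    return state

# Toy 1D Ising energy and a single-spin-flip classical proposal.
def ising_energy(spins):
    return -sum(spins[i] * spins[i + 1] for i in range(len(spins) - 1))

def flip_one(spins, rng):
    i = rng.randrange(len(spins))
    return spins[:i] + [-spins[i]] + spins[i + 1:]

rng = random.Random(0)
chain = [1, -1, 1, -1]
for _ in range(1000):
    chain = mh_step(chain, ising_energy, flip_one, beta=2.0, rng=rng)
```

Replacing `flip_one` with a quantum-generated proposal changes only the `propose` argument; the accept/reject logic, and hence detailed balance, stays entirely classical.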
arXiv Detail & Related papers (2022-03-23T15:50:12Z)
- Learnability of the output distributions of local quantum circuits [53.17490581210575]
We investigate, within two different oracle models, the learnability of quantum circuit Born machines.
We first show a negative result, that the output distributions of super-logarithmic depth Clifford circuits are not sample-efficiently learnable.
We show that in a more powerful oracle model, namely when directly given access to samples, the output distributions of local Clifford circuits are computationally efficiently PAC learnable.
arXiv Detail & Related papers (2021-10-11T18:00:20Z)
- Every Classical Sampling Circuit is a Quantum Sampling Circuit [0.8122270502556371]
This note introduces "Q-marginals", which are quantum states encoding some probability distribution.
It shows that these can be prepared directly from a classical circuit that samples from the probability distribution of interest.
arXiv Detail & Related papers (2021-09-10T12:52:23Z)
- Enhancing Generative Models via Quantum Correlations [1.6099403809839032]
Generative modeling using samples drawn from the probability distribution constitutes a powerful approach for unsupervised machine learning.
We show theoretically that such quantum correlations provide a powerful resource for generative modeling.
We numerically test this separation on standard machine learning data sets and show that it holds for practical problems.
arXiv Detail & Related papers (2021-01-20T22:57:22Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.