Learnability of the output distributions of local quantum circuits
- URL: http://arxiv.org/abs/2110.05517v1
- Date: Mon, 11 Oct 2021 18:00:20 GMT
- Title: Learnability of the output distributions of local quantum circuits
- Authors: Marcel Hinsche, Marios Ioannou, Alexander Nietner, Jonas Haferkamp,
Yihui Quek, Dominik Hangleiter, Jean-Pierre Seifert, Jens Eisert, Ryan Sweke
- Abstract summary: We investigate, within two different oracle models, the learnability of quantum circuit Born machines.
We first show a negative result, that the output distributions of super-logarithmic depth Clifford circuits are not sample-efficiently learnable.
We show that in a more powerful oracle model, namely when directly given access to samples, the output distributions of local Clifford circuits are computationally efficiently PAC learnable.
- Score: 53.17490581210575
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: There is currently considerable interest in understanding the
potential advantages quantum devices can offer for probabilistic modelling. In this work we
investigate, within two different oracle models, the probably approximately
correct (PAC) learnability of quantum circuit Born machines, i.e., the output
distributions of local quantum circuits. We first show a negative result,
namely, that the output distributions of super-logarithmic depth Clifford
circuits are not sample-efficiently learnable in the statistical query model,
i.e., when given query access to empirical expectation values of bounded
functions over the sample space. This immediately implies the hardness, for
both quantum and classical algorithms, of learning from statistical queries the
output distributions of local quantum circuits using any gate set which
includes the Clifford group. As many practical generative modelling algorithms
use statistical queries -- including those for training quantum circuit Born
machines -- our result is broadly applicable and strongly limits the
possibility of a meaningful quantum advantage for learning the output
distributions of local quantum circuits. As a positive result, we show that in
a more powerful oracle model, namely when directly given access to samples, the
output distributions of local Clifford circuits are computationally efficiently
PAC learnable by a classical learner. Our results are equally applicable to the
problems of learning an algorithm for generating samples from the target
distribution (generative modelling) and learning an algorithm for evaluating
its probabilities (density modelling). They provide the first rigorous insights
into the learnability of output distributions of local quantum circuits from
the probabilistic modelling perspective.
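The statistical query model in the negative result above can be made concrete with a short sketch: an SQ learner never sees raw samples, only (possibly adversarially perturbed) expectation values of bounded functions over the sample space. The class name, `tau` tolerance parameter, and interface below are illustrative assumptions, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

class StatisticalQueryOracle:
    """Illustrative SQ oracle: answers E_x[phi(x)] for bounded phi, up to tolerance tau."""

    def __init__(self, samples, tau):
        self.samples = samples  # samples drawn from the target distribution
        self.tau = tau          # allowed perturbation of each answer

    def query(self, phi):
        # Empirical expectation of a bounded function phi: {0,1}^n -> [-1, 1]
        values = np.array([phi(x) for x in self.samples])
        assert np.all(np.abs(values) <= 1.0), "phi must be bounded by 1"
        estimate = values.mean()
        # An adversarial SQ oracle may shift the answer by up to tau
        noise = rng.uniform(-self.tau, self.tau)
        return float(np.clip(estimate + noise, -1.0, 1.0))

# Example: samples from the uniform distribution over 3-bit strings
samples = rng.integers(0, 2, size=(1000, 3))
oracle = StatisticalQueryOracle(samples, tau=0.01)
# Query the expectation of the first bit (close to 0.5 for the uniform distribution)
p_first_bit = oracle.query(lambda x: float(x[0]))
```

The hardness result says that any learner restricted to such `query` calls needs super-polynomially many of them to learn the output distributions of super-logarithmic-depth Clifford circuits.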
Related papers
- QuantumSEA: In-Time Sparse Exploration for Noise Adaptive Quantum
Circuits [82.50620782471485]
QuantumSEA is an in-time sparse exploration method for noise-adaptive quantum circuits.
It aims to achieve two key objectives: (1) implicit circuit capacity during training and (2) noise robustness.
The method establishes state-of-the-art results with only half the number of quantum gates and a 2x saving in circuit-execution time.
arXiv Detail & Related papers (2024-01-10T22:33:00Z)
- A New Initial Distribution for Quantum Generative Adversarial Networks to Load Probability Distributions [4.043200001974071]
We propose a novel method for generating an initial distribution that improves the learning efficiency of qGANs.
Our method uses the classical process of label replacement to generate various probability distributions in shallow quantum circuits.
arXiv Detail & Related papers (2023-06-21T14:33:35Z)
- Importance sampling for stochastic quantum simulations [68.8204255655161]
We introduce the qDrift protocol, which builds random product formulas by sampling from the Hamiltonian according to the coefficients.
We show that the simulation cost can be reduced while achieving the same accuracy, by considering the individual simulation cost during the sampling stage.
Results are confirmed by numerical simulations performed on a lattice nuclear effective field theory.
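The sampling stage described above can be sketched as follows. This is a minimal illustration of qDrift-style term sampling for a Hamiltonian $H = \sum_j h_j H_j$, where terms are drawn with probability proportional to their coefficient magnitudes; the function name and interface are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(1)

def qdrift_schedule(coeffs, t, n_steps):
    """Sample a qDrift-style gate sequence for H = sum_j h_j H_j.

    Returns a list of (term_index, angle) pairs: each step applies
    exp(-i * angle * H_j), with j drawn with probability |h_j| / lambda.
    """
    coeffs = np.abs(np.asarray(coeffs, dtype=float))
    lam = coeffs.sum()            # lambda = sum_j |h_j|
    probs = coeffs / lam          # sampling weights proportional to |h_j|
    angle = lam * t / n_steps     # identical rotation angle for every step
    terms = rng.choice(len(coeffs), size=n_steps, p=probs)
    return [(int(j), angle) for j in terms]

# Example: three Hamiltonian terms with coefficients 0.5, 0.3, 0.2
schedule = qdrift_schedule([0.5, 0.3, 0.2], t=1.0, n_steps=100)
```

The importance-sampling refinement the entry describes would additionally weight these probabilities by per-term simulation cost rather than by coefficient magnitude alone.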
arXiv Detail & Related papers (2022-12-12T15:06:32Z)
- Protocols for classically training quantum generative models on probability distributions [17.857341127079305]
Quantum Generative Modelling (QGM) relies on preparing quantum states and generating samples from hidden (or known) probability distributions.
We propose protocols for classical training of QGMs based on circuits of a specific type that admit an efficient gradient.
We numerically demonstrate the end-to-end training of IQP circuits using probability distributions for up to 30 qubits on a regular desktop computer.
arXiv Detail & Related papers (2022-10-24T17:57:09Z)
- A single $T$-gate makes distribution learning hard [56.045224655472865]
This work provides an extensive characterization of the learnability of the output distributions of local quantum circuits.
We show that for a wide variety of the most practically relevant learning algorithms -- including hybrid quantum-classical algorithms -- even the generative modelling problem associated with depth $d = \omega(\log(n))$ Clifford circuits is hard.
arXiv Detail & Related papers (2022-07-07T08:04:15Z)
- Theory of Quantum Generative Learning Models with Maximum Mean Discrepancy [67.02951777522547]
We study learnability of quantum circuit Born machines (QCBMs) and quantum generative adversarial networks (QGANs)
We first analyze the generalization ability of QCBMs and identify their advantages when the quantum devices can directly access the target distribution.
Next, we prove how the generalization error bound of QGANs depends on the employed Ansatz, the number of qudits, and input states.
arXiv Detail & Related papers (2022-05-10T08:05:59Z)
- Generalization Metrics for Practical Quantum Advantage in Generative Models [68.8204255655161]
Generative modeling is a widely accepted natural use case for quantum computers.
We construct a simple and unambiguous approach to probe practical quantum advantage for generative modeling by measuring the algorithm's generalization performance.
Our simulation results show that our quantum-inspired models have up to a $68\times$ enhancement in generating unseen unique and valid samples.
arXiv Detail & Related papers (2022-01-21T16:35:35Z)
- Enhancing Generative Models via Quantum Correlations [1.6099403809839032]
Generative modelling using samples drawn from a probability distribution constitutes a powerful approach to unsupervised machine learning.
We show theoretically that quantum correlations provide a powerful resource for generative modelling.
We numerically test this separation on standard machine learning data sets and show that it holds for practical problems.
arXiv Detail & Related papers (2021-01-20T22:57:22Z)
- On the Quantum versus Classical Learnability of Discrete Distributions [9.980327191634071]
We study the comparative power of classical and quantum learners for generative modelling within the Probably Approximately Correct (PAC) framework.
Our primary result is the explicit construction of a class of discrete probability distributions which, under the decisional Diffie-Hellman assumption, is provably not efficiently PAC learnable by a classical generative modelling algorithm.
This class of distributions provides a concrete example of a generative modelling problem for which quantum learners exhibit a provable advantage over classical learning algorithms.
arXiv Detail & Related papers (2020-07-28T19:43:46Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.