On the Quantum versus Classical Learnability of Discrete Distributions
- URL: http://arxiv.org/abs/2007.14451v2
- Date: Tue, 9 Mar 2021 09:49:32 GMT
- Title: On the Quantum versus Classical Learnability of Discrete Distributions
- Authors: Ryan Sweke, Jean-Pierre Seifert, Dominik Hangleiter and Jens Eisert
- Abstract summary: We study the comparative power of classical and quantum learners for generative modelling within the Probably Approximately Correct (PAC) framework.
Our primary result is the explicit construction of a class of discrete probability distributions which, under the decisional Diffie-Hellman assumption, is provably not efficiently PAC learnable by a classical generative modelling algorithm.
This class of distributions provides a concrete example of a generative modelling problem for which quantum learners exhibit a provable advantage over classical learning algorithms.
- Score: 9.980327191634071
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Here we study the comparative power of classical and quantum learners for
generative modelling within the Probably Approximately Correct (PAC) framework.
More specifically we consider the following task: Given samples from some
unknown discrete probability distribution, output with high probability an
efficient algorithm for generating new samples from a good approximation of the
original distribution. Our primary result is the explicit construction of a
class of discrete probability distributions which, under the decisional
Diffie-Hellman assumption, is provably not efficiently PAC learnable by a
classical generative modelling algorithm, but for which we construct an
efficient quantum learner. This class of distributions therefore provides a
concrete example of a generative modelling problem for which quantum learners
exhibit a provable advantage over classical learning algorithms. In addition,
we discuss techniques for proving classical generative modelling hardness
results, as well as the relationship between the PAC learnability of Boolean
functions and the PAC learnability of discrete probability distributions.
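To make the learning task concrete, here is a minimal Python sketch of the flavour of such a construction: a keyed function $f_k$ induces a distribution over pairs $(x, f_k(x))$ with $x$ uniform. This is an illustrative toy, not the paper's actual DDH-based scheme; the group size, base element, and keyed function below are assumptions chosen purely for readability and carry no cryptographic security.

    # Toy sketch of a keyed-function-induced distribution, standing in for the
    # paper's DDH-based construction. NOT cryptographically secure at this size.
    import random

    P = 8191  # small prime modulus; a real instantiation needs a DDH-hard group
    G = 17    # element of Z_P^* used as the base (illustrative choice)

    def f(key: int, x: int) -> int:
        # Toy keyed function f_k(x) = G^(k*x) mod P, a stand-in for a PRF.
        return pow(G, key * x, P)

    def sample(key: int, rng: random.Random) -> tuple[int, int]:
        # One draw from the distribution D_k: uniform x paired with f_k(x).
        x = rng.randrange(1, P - 1)
        return x, f(key, x)

    # A learner must output an efficient generator for D_k from samples alone.
    # In the real construction this would break DDH classically, while a quantum
    # learner can recover the key via discrete logarithms and then resample.
    rng = random.Random(0)
    secret_key = rng.randrange(1, P - 1)
    print([sample(secret_key, rng) for _ in range(5)])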
Related papers
- A New Initial Distribution for Quantum Generative Adversarial Networks to Load Probability Distributions [4.043200001974071]
We propose a novel method for generating an initial distribution that improves the learning efficiency of qGANs.
Our method uses the classical process of label replacement to generate various probability distributions in shallow quantum circuits.
arXiv Detail & Related papers (2023-06-21T14:33:35Z)
- A super-polynomial quantum-classical separation for density modelling [9.980327191634071]
We show that fault-tolerant quantum computers can offer a super-polynomial advantage over classical learning algorithms.
We also show that any weak pseudo-random function can be used to construct a classically hard density modelling problem.
arXiv Detail & Related papers (2022-10-26T18:00:03Z)
- A single $T$-gate makes distribution learning hard [56.045224655472865]
This work provides an extensive characterization of the learnability of the output distributions of local quantum circuits.
We show that for a wide variety of the most practically relevant learning algorithms -- including hybrid quantum-classical algorithms -- even the generative modelling problem associated with depth $d=\omega(\log(n))$ Clifford circuits is hard.
arXiv Detail & Related papers (2022-07-07T08:04:15Z)
- Generalization Metrics for Practical Quantum Advantage in Generative Models [68.8204255655161]
Generative modeling is a widely accepted natural use case for quantum computers.
We construct a simple and unambiguous approach to probe practical quantum advantage for generative modeling by measuring the algorithm's generalization performance.
Our simulation results show that our quantum-inspired models have up to a $68\times$ enhancement in generating unseen unique and valid samples.
arXiv Detail & Related papers (2022-01-21T16:35:35Z)
- Sampling from Arbitrary Functions via PSD Models [55.41644538483948]
We take a two-step approach by first modeling the probability distribution and then sampling from that model.
We show that these models can approximate a large class of densities concisely using few evaluations, and present a simple algorithm to effectively sample from these models.
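The two-step workflow can be sketched as follows; a plain empirical histogram stands in here for the paper's PSD models, purely to illustrate the fit-then-sample pattern.

    # Sketch of the generic fit-then-sample pattern. A histogram model is used
    # here for illustration; the paper's PSD models fill the same role with
    # approximation and sampling guarantees.
    import random
    from collections import Counter

    def fit(samples: list[int]) -> dict[int, float]:
        # Step 1: model the distribution (here, by empirical frequencies).
        counts = Counter(samples)
        return {x: c / len(samples) for x, c in counts.items()}

    def draw(model: dict[int, float], rng: random.Random, k: int) -> list[int]:
        # Step 2: sample from the fitted model.
        return rng.choices(list(model), weights=list(model.values()), k=k)

    rng = random.Random(1)
    data = [rng.choice([0, 0, 1, 2]) for _ in range(1000)]  # unknown toy source
    print(draw(fit(data), rng, 10))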
arXiv Detail & Related papers (2021-10-20T12:25:22Z)
- Learnability of the output distributions of local quantum circuits [53.17490581210575]
We investigate, within two different oracle models, the learnability of quantum circuit Born machines.
We first show a negative result, that the output distributions of super-logarithmic depth Clifford circuits are not sample-efficiently learnable.
We show that in a more powerful oracle model, namely when directly given access to samples, the output distributions of local Clifford circuits are computationally efficiently PAC learnable.
arXiv Detail & Related papers (2021-10-11T18:00:20Z)
- Probabilistic Generating Circuits [50.98473654244851]
We propose probabilistic generating circuits (PGCs) for the efficient representation of probability generating polynomials.
PGCs are not just a theoretical framework that unifies vastly different existing models, but also show huge potential in modeling realistic data.
We exhibit a simple class of PGCs that are not trivially subsumed by simple combinations of PCs and DPPs, and obtain competitive performance on a suite of density estimation benchmarks.
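For context, the object a PGC computes is the standard generating polynomial of a distribution over $n$ binary variables, $g(z_1,\dots,z_n) = \sum_{S \subseteq [n]} \Pr[X = \mathbf{1}_S] \prod_{i \in S} z_i$; the circuit represents this polynomial compactly even when the sum has exponentially many terms.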
arXiv Detail & Related papers (2021-02-19T07:06:53Z)
- Enhancing Generative Models via Quantum Correlations [1.6099403809839032]
Generative modeling using samples drawn from a probability distribution constitutes a powerful approach for unsupervised machine learning.
We show theoretically that quantum correlations provide a powerful resource for generative modeling.
We numerically test this separation on standard machine learning data sets and show that it holds for practical problems.
arXiv Detail & Related papers (2021-01-20T22:57:22Z)
- Probabilistic Circuits for Variational Inference in Discrete Graphical Models [101.28528515775842]
Inference in discrete graphical models with variational methods is difficult.
Many sampling-based methods have been proposed for estimating the Evidence Lower Bound (ELBO).
We propose a new approach that leverages the tractability of probabilistic circuit models, such as Sum Product Networks (SPNs).
We show that selective-SPNs are suitable as an expressive variational distribution, and prove that when the log-density of the target model is a polynomial the corresponding ELBO can be computed analytically.
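For reference, with an unnormalized target density $\tilde{p}$, partition function $Z = \sum_x \tilde{p}(x)$, and variational distribution $q$, the quantity in question is $\mathrm{ELBO}(q) = \mathbb{E}_{x\sim q}[\log \tilde{p}(x)] + H(q) \le \log Z$, so an analytic ELBO avoids the variance of sampling-based estimators.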
arXiv Detail & Related papers (2020-10-22T05:04:38Z)
- Anomaly detection with variational quantum generative adversarial networks [0.0]
Generative adversarial networks (GANs) are a machine learning framework comprising a generative model for sampling from a target distribution and a discriminative model trained to distinguish the generator's samples from data.
We introduce variational quantum-classical Wasserstein GANs to address known instabilities of GAN training, and embed this model in a classical machine learning framework for anomaly detection.
Our model replaces the generator of Wasserstein GANs with a hybrid quantum-classical neural net and leaves the classical discriminative model unchanged.
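For reference, the standard Wasserstein GAN objective the hybrid generator $G$ is trained against is $\min_G \max_{\|D\|_L \le 1} \mathbb{E}_{x \sim P_r}[D(x)] - \mathbb{E}_{z \sim p(z)}[D(G(z))]$, where the critic $D$ ranges over 1-Lipschitz functions.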
arXiv Detail & Related papers (2020-10-20T17:48:04Z)