Generative Quantum Learning of Joint Probability Distribution Functions
- URL: http://arxiv.org/abs/2109.06315v2
- Date: Tue, 8 Nov 2022 16:55:17 GMT
- Title: Generative Quantum Learning of Joint Probability Distribution Functions
- Authors: Elton Yechao Zhu, Sonika Johri, Dave Bacon, Mert Esencan, Jungsang
Kim, Mark Muir, Nikhil Murgai, Jason Nguyen, Neal Pisenti, Adam Schouela,
Ksenia Sosnova, Ken Wright
- Abstract summary: We design quantum machine learning algorithms to model copulas.
We show that any copula can be naturally mapped to a multipartite maximally entangled state.
- A variational ansatz we christen as a 'qopula' creates arbitrary correlations between variables.
- Score: 1.221966660783828
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Modeling joint probability distributions is an important task in a wide
variety of fields. One popular technique for this employs a family of
multivariate distributions with uniform marginals called copulas. While the
theory of modeling joint distributions via copulas is well understood, it gets
practically challenging to accurately model real data with many variables. In
this work, we design quantum machine learning algorithms to model copulas. We
show that any copula can be naturally mapped to a multipartite maximally
entangled state. A variational ansatz we christen as a 'qopula' creates
arbitrary correlations between variables while maintaining the copula structure
starting from a set of Bell pairs for two variables, or GHZ states for multiple
variables. As an application, we train a Quantum Generative Adversarial Network
(QGAN) and a Quantum Circuit Born Machine (QCBM) using this variational ansatz
to generate samples from joint distributions of two variables for historical
data from the stock market. We demonstrate our generative learning algorithms
on trapped ion quantum computers from IonQ for up to 8 qubits and show that our
results outperform those obtained through equivalent classical generative
learning. Further, we present theoretical arguments for exponential advantage
in our model's expressivity over classical models based on communication and
computational complexity arguments.
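The copula-to-entangled-state correspondence above can be illustrated classically. A minimal NumPy sketch (not the paper's variational circuit; the toy exponential marginals are hypothetical) samples the comonotonic copula that measuring a maximally entangled state induces, then pushes it through inverse CDFs per Sklar's theorem:

```python
import numpy as np

rng = np.random.default_rng(0)

n = 3          # qubits per variable
d = 2 ** n     # discretization levels per marginal
N = 10_000     # number of samples

# Measuring both halves of a maximally entangled state
# |Phi> = (1/sqrt(d)) * sum_k |k>|k> yields the pair (k, k) with
# probability 1/d: each register alone is uniform, and the two are
# perfectly correlated -- the comonotonic copula.
k = rng.integers(0, d, size=N)
u1 = (k + rng.random(N)) / d   # dequantize bins to uniforms on [0, 1)
u2 = u1                        # identical outcomes under perfect correlation

# Sklar's theorem: push the uniform marginals through inverse CDFs to
# obtain a joint sample with any desired marginals, here two exponentials.
x1 = -np.log1p(-u1)            # Exp(1) via inverse-CDF sampling
x2 = -np.log1p(-u2) / 2.0      # Exp(2)
```

The qopula ansatz generalizes this picture by variationally entangling the registers to realize arbitrary correlations rather than the perfect correlation shown here.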
Related papers
- QonFusion -- Quantum Approaches to Gaussian Random Variables:
Applications in Stable Diffusion and Brownian Motion [1.90365714903665]
This strategy serves as a substitute for conventional pseudorandom number generators (PRNGs).
QonFusion is a Python library congruent with both PyTorch and PennyLane.
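QonFusion's own API is not reproduced here; as a hedged illustration of the underlying idea, turning uniform samples (from any source, quantum or pseudorandom) into Gaussian variates, the classical Box-Muller transform suffices:

```python
import numpy as np

def box_muller(u1, u2):
    """Map two independent Uniform(0,1] arrays to two independent
    standard-normal arrays via the Box-Muller transform."""
    r = np.sqrt(-2.0 * np.log(u1))
    return r * np.cos(2.0 * np.pi * u2), r * np.sin(2.0 * np.pi * u2)

rng = np.random.default_rng(42)
N = 100_000
# 1 - random() lies in (0, 1], keeping log() finite.
z1, z2 = box_muller(1.0 - rng.random(N), rng.random(N))
```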
arXiv Detail & Related papers (2023-09-28T08:51:18Z)
- Learning hard distributions with quantum-enhanced Variational Autoencoders [2.545905720487589]
We introduce a quantum-enhanced VAE (QeVAE) that uses quantum correlations to improve the fidelity over classical VAEs.
We empirically show that the QeVAE outperforms classical models on several classes of quantum states.
Our work paves the way for new applications of quantum generative learning algorithms.
arXiv Detail & Related papers (2023-05-02T16:50:24Z)
- A Framework for Demonstrating Practical Quantum Advantage: Racing Quantum against Classical Generative Models [62.997667081978825]
We build on a proposed framework for evaluating the generalization performance of generative models.
We establish the first comparative race towards practical quantum advantage (PQA) between classical and quantum generative models.
Our results suggest that QCBMs are more efficient in the data-limited regime than the other state-of-the-art classical generative models.
arXiv Detail & Related papers (2023-03-27T22:48:28Z)
- A performance characterization of quantum generative models [35.974070202997176]
We compare quantum circuits used for quantum generative modeling.
We learn the underlying probability distribution of the data sets via two popular training methods.
We empirically find that a variant of the discrete architecture, which learns the copula of the probability distribution, outperforms all other methods.
arXiv Detail & Related papers (2023-01-23T11:00:29Z)
- Importance sampling for stochastic quantum simulations [68.8204255655161]
We introduce the qDrift protocol, which builds random product formulas by sampling from the Hamiltonian according to the coefficients.
We show that the simulation cost can be reduced while achieving the same accuracy, by considering the individual simulation cost during the sampling stage.
Results are confirmed by numerical simulations performed on a lattice nuclear effective field theory.
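The qDrift term-selection rule can be sketched in a few lines; the Hamiltonian coefficients below are toy values, and only the sampling step of the random product formula is shown, not the simulated circuit:

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy Hamiltonian H = sum_j h_j P_j, represented only by its
# coefficients (the Pauli terms P_j do not enter the sampling rule).
h = np.array([0.9, 0.5, 0.25, 0.05])
lam = np.abs(h).sum()        # qDrift normalization: lambda = sum_j |h_j|

# qDrift draws term j with probability |h_j| / lambda at each of the
# N steps of the random product formula.
N = 100_000
draws = rng.choice(len(h), size=N, p=np.abs(h) / lam)
freqs = np.bincount(draws, minlength=len(h)) / N
```

The importance-sampling refinement described above would reweight this distribution by each term's individual simulation cost.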
arXiv Detail & Related papers (2022-12-12T15:06:32Z)
- A didactic approach to quantum machine learning with a single qubit [68.8204255655161]
We focus on the case of learning with a single qubit, using data re-uploading techniques.
We implement the different proposed formulations on toy and real-world datasets using the Qiskit quantum computing SDK.
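A minimal NumPy sketch of single-qubit data re-uploading (an illustrative state-vector simulation, not the paper's Qiskit code; the weights and biases are arbitrary placeholders):

```python
import numpy as np

def ry(theta):
    """Single-qubit rotation about the Y axis as a real 2x2 unitary."""
    c, s = np.cos(theta / 2.0), np.sin(theta / 2.0)
    return np.array([[c, -s], [s, c]])

def reupload_expectation(x, weights, biases):
    """Data re-uploading: feed the input x into every layer's rotation
    angle, then return <Z> of the final single-qubit state."""
    state = np.array([1.0, 0.0])
    for w, b in zip(weights, biases):
        state = ry(w * x + b) @ state
    # <Z> = |amp0|^2 - |amp1|^2 (amplitudes are real here)
    return state[0] ** 2 - state[1] ** 2

out = reupload_expectation(0.3, weights=[1.0, -0.5, 2.0], biases=[0.1, 0.2, 0.0])
```

Training fits the weights and biases so that the sign (or value) of `<Z>` matches the target labels.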
arXiv Detail & Related papers (2022-11-23T18:25:32Z)
- Copula-based Risk Aggregation with Trapped Ion Quantum Computers [1.541403735141431]
Copulas are mathematical tools for modeling joint probability distributions.
The recent finding that copulas can be expressed as maximally entangled quantum states has revealed a promising approach to practical quantum advantages.
We study the training of QCBMs with different levels of precision and circuit design on a simulator and a state-of-the-art trapped ion quantum computer.
arXiv Detail & Related papers (2022-06-23T18:39:30Z)
- Learnability of the output distributions of local quantum circuits [53.17490581210575]
We investigate, within two different oracle models, the learnability of quantum circuit Born machines.
We first show a negative result, that the output distributions of super-logarithmic depth Clifford circuits are not sample-efficiently learnable.
We show that in a more powerful oracle model, namely when directly given access to samples, the output distributions of local Clifford circuits are computationally efficiently PAC learnable.
arXiv Detail & Related papers (2021-10-11T18:00:20Z)
- Top-N: Equivariant set and graph generation without exchangeability [61.24699600833916]
We consider one-shot probabilistic decoders that map a vector-shaped prior to a distribution over sets or graphs.
These functions can be integrated into variational autoencoders (VAE), generative adversarial networks (GAN) or normalizing flows.
Top-n is a deterministic, non-exchangeable set creation mechanism which learns to select the most relevant points from a trainable reference set.
arXiv Detail & Related papers (2021-10-05T14:51:19Z)
- Deep Archimedean Copulas [98.96141706464425]
ACNet is a novel differentiable neural network architecture that enforces structural properties.
We show that ACNet is able to both approximate common Archimedean Copulas and generate new copulas which may provide better fits to data.
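For reference, classical Archimedean copulas such as the Clayton family (the kind of target ACNet approximates) can be sampled by conditional inversion; this NumPy sketch is illustrative and unrelated to ACNet's implementation:

```python
import numpy as np

def clayton_sample(n, theta, rng):
    """Draw n pairs from the Clayton copula (a standard Archimedean
    family) by conditional inversion; theta > 0 sets the dependence."""
    u = 1.0 - rng.random(n)   # Uniform(0, 1], avoids u = 0
    w = 1.0 - rng.random(n)
    # Invert the conditional CDF C(v | u) = w for the Clayton family.
    v = (u ** (-theta) * (w ** (-theta / (1.0 + theta)) - 1.0) + 1.0) ** (-1.0 / theta)
    return u, v

rng = np.random.default_rng(1)
u, v = clayton_sample(50_000, theta=2.0, rng=rng)
```

Larger `theta` strengthens the lower-tail dependence characteristic of the Clayton family.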
arXiv Detail & Related papers (2020-12-05T22:58:37Z)
- Learnability and Complexity of Quantum Samples [26.425493366198207]
Given a quantum circuit, a quantum computer can sample the output distribution exponentially faster in the number of bits n than a classical computer.
Can we learn the underlying quantum distribution using models whose number of training parameters scales in n, under a fixed training time?
We study four kinds of generative models: the Deep Boltzmann Machine (DBM), Generative Adversarial Networks (GANs), Long Short-Term Memory (LSTM), and the Autoregressive GAN, on learning quantum data sets generated by deep random circuits.
arXiv Detail & Related papers (2020-10-22T18:45:25Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided (including this list) and is not responsible for any consequences of its use.