A performance characterization of quantum generative models
- URL: http://arxiv.org/abs/2301.09363v3
- Date: Tue, 26 Mar 2024 09:48:12 GMT
- Title: A performance characterization of quantum generative models
- Authors: Carlos A. Riofrío, Oliver Mitevski, Caitlin Jones, Florian Krellner, Aleksandar Vučković, Joseph Doetsch, Johannes Klepsch, Thomas Ehmer, Andre Luckow
- Abstract summary: We compare quantum circuits used for quantum generative modeling.
We learn the underlying probability distribution of the data sets via two popular training methods.
We empirically find that a variant of the discrete architecture, which learns the copula of the probability distribution, outperforms all other methods.
- Score: 35.974070202997176
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Quantum generative modeling is a growing area of interest for industry-relevant applications. With the field still in its infancy, there are many competing techniques. This work is an attempt to systematically compare a broad range of these techniques to guide quantum computing practitioners when deciding which models and techniques to use in their applications. We compare fundamentally different architectural ansatzes of parametric quantum circuits used for quantum generative modeling: 1. a continuous architecture, which produces continuous-valued data samples, and 2. a discrete architecture, which samples on a discrete grid. We compare the performance of different data transformations: normalization by the min-max transform or by the probability integral transform. We learn the underlying probability distribution of the data sets via two popular training methods: 1. quantum circuit Born machines (QCBM), and 2. quantum generative adversarial networks (QGAN). We study their performance and trade-offs as the number of model parameters increases, using similarly trained classical neural networks as a baseline. The study is performed on six low-dimensional synthetic and two real financial data sets. Our two key findings are that: 1. For all data sets, our quantum models require similar or fewer parameters than their classical counterparts; in the extreme case, the quantum models require two orders of magnitude fewer parameters. 2. We empirically find that a variant of the discrete architecture, which learns the copula of the probability distribution, outperforms all other methods.
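As a rough illustration of the two data transformations named in the abstract, here is a minimal NumPy/SciPy sketch (assumed helper code, not taken from the paper): min-max normalization rescales each feature to [0, 1], while the probability integral transform maps each marginal through its empirical CDF to obtain approximately uniform marginals, which is the representation the copula-learning variant operates on.

```python
# Sketch of the two data transformations compared in the paper: min-max
# normalization and the probability integral transform (PIT).
# Illustrative only; the paper does not publish this exact code.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
data = rng.lognormal(mean=0.0, sigma=0.5, size=(1000, 2))  # toy 2D data set

# 1. Min-max transform: rescale each feature to [0, 1].
minmax = (data - data.min(axis=0)) / (data.max(axis=0) - data.min(axis=0))

# 2. Probability integral transform: map each marginal through its empirical
#    CDF, yielding approximately uniform marginals (the copula of the data).
pit = np.column_stack(
    [stats.rankdata(col) / (len(col) + 1) for col in data.T]
)
```

The abstract also names quantum circuit Born machines as one of the two training methods. The following is a generic QCBM-style training loop written with PennyLane (an assumption; the paper does not specify this library, ansatz, or loss): a hardware-efficient circuit defines a Born distribution over bit strings on a discrete grid, and its parameters are optimized to match a toy target histogram via a KL-divergence loss.

```python
# Minimal QCBM-style sketch (not the authors' implementation): a parameterized
# circuit's measurement probabilities are trained toward a target histogram.
import pennylane as qml
from pennylane import numpy as np  # autograd-aware NumPy

n_qubits, n_layers = 3, 2
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def born_probs(params):
    # Hardware-efficient layers: single-qubit rotations + nearest-neighbour CNOTs.
    for layer in params:
        for w in range(n_qubits):
            qml.RY(layer[w], wires=w)
        for w in range(n_qubits - 1):
            qml.CNOT(wires=[w, w + 1])
    return qml.probs(wires=range(n_qubits))

def kl_divergence(params, target):
    p = born_probs(params)
    return np.sum(target * np.log((target + 1e-9) / (p + 1e-9)))

# Toy target: a discretized Gaussian over the 2**n_qubits grid points.
grid = np.arange(2 ** n_qubits)
target = np.exp(-0.5 * ((grid - 3.5) / 1.5) ** 2)
target = np.array(target / target.sum(), requires_grad=False)

params = np.array(
    np.random.uniform(0, 2 * np.pi, (n_layers, n_qubits)), requires_grad=True
)
opt = qml.GradientDescentOptimizer(stepsize=0.3)
for _ in range(200):
    params = opt.step(lambda p: kl_divergence(p, target), params)
```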
Related papers
- Towards Efficient Quantum Hybrid Diffusion Models [68.43405413443175]
We propose a new methodology to design quantum hybrid diffusion models.
We propose two possible hybridization schemes combining quantum computing's superior generalization with classical networks' modularity.
arXiv Detail & Related papers (2024-02-25T16:57:51Z) - Multimodal deep representation learning for quantum cross-platform verification [60.01590250213637]
Cross-platform verification, a critical undertaking in the realm of early-stage quantum computing, endeavors to characterize the similarity of two imperfect quantum devices executing identical algorithms.
We introduce an innovative multimodal learning approach, recognizing that the formalism of data in this task embodies two distinct modalities.
We devise a multimodal neural network to independently extract knowledge from these modalities, followed by a fusion operation to create a comprehensive data representation.
arXiv Detail & Related papers (2023-11-07T04:35:03Z) - Quantum machine learning for image classification [39.58317527488534]
This research introduces two quantum machine learning models that leverage the principles of quantum mechanics for effective computations.
Our first model, a hybrid quantum neural network with parallel quantum circuits, enables the execution of computations even in the noisy intermediate-scale quantum era.
A second model introduces a hybrid quantum neural network with a Quanvolutional layer, reducing image resolution via a convolution process.
arXiv Detail & Related papers (2023-04-18T18:23:20Z) - A Framework for Demonstrating Practical Quantum Advantage: Racing Quantum against Classical Generative Models [62.997667081978825]
We build on a previously proposed framework for evaluating the generalization performance of generative models.
We establish the first comparative race towards practical quantum advantage (PQA) between classical and quantum generative models.
Our results suggest that QCBMs are more efficient in the data-limited regime than the other state-of-the-art classical generative models.
arXiv Detail & Related papers (2023-03-27T22:48:28Z) - Tensor Networks or Decision Diagrams? Guidelines for Classical Quantum Circuit Simulation [65.93830818469833]
Tensor networks and decision diagrams have been developed independently, with differing perspectives, terminologies, and backgrounds in mind.
We consider how these techniques approach classical quantum circuit simulation, and examine their (dis)similarities with regard to their most applicable abstraction level.
We provide guidelines for when to better use tensor networks and when to better use decision diagrams in classical quantum circuit simulation.
arXiv Detail & Related papers (2023-02-13T19:00:00Z) - A didactic approach to quantum machine learning with a single qubit [68.8204255655161]
We focus on the case of learning with a single qubit, using data re-uploading techniques.
We implement the different proposed formulations on toy and real-world datasets using the Qiskit quantum computing SDK.
arXiv Detail & Related papers (2022-11-23T18:25:32Z) - Copula-based Risk Aggregation with Trapped Ion Quantum Computers [1.541403735141431]
Copulas are mathematical tools for modeling joint probability distributions.
A recent finding that copulas can be expressed as maximally entangled quantum states has revealed a promising approach to practical quantum advantage.
We study the training of QCBMs with different levels of precision and circuit design on a simulator and a state-of-the-art trapped ion quantum computer.
arXiv Detail & Related papers (2022-06-23T18:39:30Z) - Provably efficient variational generative modeling of quantum many-body systems via quantum-probabilistic information geometry [3.5097082077065003]
We introduce a generalization of quantum natural gradient descent to parameterized mixed states.
We also provide a robust first-order approximating algorithm, Quantum-Probabilistic Mirror Descent.
Our approaches extend previously sample-efficient techniques to allow for flexibility in model choice.
arXiv Detail & Related papers (2022-06-09T17:58:15Z) - A tensor network discriminator architecture for classification of quantum data on quantum computers [0.0]
We demonstrate the use of matrix product state (MPS) models for discriminating quantum data on quantum computers using holographic algorithms.
We experimentally evaluate the models on Quantinuum's H1-2 trapped ion quantum computer using entangled input data modeled as translationally invariant, bond-dimension-4 MPSs.
arXiv Detail & Related papers (2022-02-22T14:19:42Z) - Generalization Metrics for Practical Quantum Advantage in Generative Models [68.8204255655161]
Generative modeling is a widely accepted natural use case for quantum computers.
We construct a simple and unambiguous approach to probe practical quantum advantage for generative modeling by measuring the algorithm's generalization performance.
Our simulation results show that our quantum-inspired models have up to a $68\times$ enhancement in generating unseen unique and valid samples.
arXiv Detail & Related papers (2022-01-21T16:35:35Z) - Binary classifiers for noisy datasets: a comparative study of existing quantum machine learning frameworks and some new approaches [0.0]
We apply quantum machine learning frameworks to improve binary classification.
Noisy datasets of this kind are common in financial data.
The new models exhibit better learning characteristics under asymmetric noise in the dataset.
arXiv Detail & Related papers (2021-11-05T10:29:05Z)