Towards a scalable discrete quantum generative adversarial neural
network
- URL: http://arxiv.org/abs/2209.13993v1
- Date: Wed, 28 Sep 2022 10:42:38 GMT
- Title: Towards a scalable discrete quantum generative adversarial neural
network
- Authors: Smit Chaudhary, Patrick Huembeli, Ian MacCormack, Taylor L. Patti,
Jean Kossaifi, and Alexey Galda
- Abstract summary: We introduce a fully quantum generative adversarial network intended for use with binary data.
In particular, we incorporate noise reuploading in the generator and auxiliary qubits in the discriminator to enhance expressivity.
We empirically demonstrate the expressive power of our model on both synthetic data and low-energy states of an Ising model.
- Score: 8.134122612459633
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We introduce a fully quantum generative adversarial network intended for use
with binary data. The architecture incorporates several features found in other
classical and quantum machine learning models, which up to this point had not
been used in conjunction. In particular, we incorporate noise reuploading in
the generator, auxiliary qubits in the discriminator to enhance expressivity,
and a direct connection between the generator and discriminator circuits,
obviating the need to access the generator's probability distribution. We show
that, as separate components, the generator and discriminator perform as
desired. We empirically demonstrate the expressive power of our model on both
synthetic data and low-energy states of an Ising model. Our
demonstrations suggest that the model is not only capable of reproducing
discrete training data, but also of potentially generalizing from it.
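As a rough illustration of the noise-reuploading idea (not the authors' actual circuit, whose gate set and layout are not given here), the single-qubit sketch below re-uploads a noise angle z between trainable RY layers, simulated with plain Python complex arithmetic. All angles and layer counts are illustrative:

```python
import cmath
import math

def ry(theta):
    """Single-qubit RY rotation matrix (real entries)."""
    c, s = math.cos(theta / 2), math.sin(theta / 2)
    return [[c, -s], [s, c]]

def rz(theta):
    """Single-qubit RZ rotation matrix (phase gate)."""
    return [[cmath.exp(-1j * theta / 2), 0], [0, cmath.exp(1j * theta / 2)]]

def apply(gate, state):
    """Apply a 2x2 gate to a single-qubit statevector."""
    return [gate[0][0] * state[0] + gate[0][1] * state[1],
            gate[1][0] * state[0] + gate[1][1] * state[1]]

def reuploading_circuit(thetas, z):
    """P(|1>) for a circuit that re-uploads the noise angle z between
    trainable RY layers: RY(t0) RZ(z) RY(t1) RZ(z) RY(t2) ..."""
    state = [1.0 + 0j, 0.0 + 0j]          # start in |0>
    for i, t in enumerate(thetas):
        state = apply(ry(t), state)
        if i < len(thetas) - 1:            # re-upload z between layers
            state = apply(rz(z), state)
    return abs(state[1]) ** 2

p = reuploading_circuit([0.7, 1.3, 0.4], z=0.9)
```

With more layers the output probability becomes a higher-degree trigonometric polynomial in z, which is the expressivity argument behind reuploading.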
Related papers
- Neural Network Parameter Diffusion [50.85251415173792]
Diffusion models have achieved remarkable success in image and video generation.
In this work, we demonstrate that diffusion models can also generate high-performing neural network parameters.
arXiv Detail & Related papers (2024-02-20T16:59:03Z) - Quantum Generative Modeling of Sequential Data with Trainable Token
Embedding [0.0]
A quantum-inspired generative model known as the Born machine has shown great advances in learning classical and quantum data.
We generalize the embedding method into trainable quantum measurement operators that can be simultaneously honed with MPS.
Our study indicates that, combined with trainable embedding, Born machines exhibit better performance and learn deeper correlations from the dataset.
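For context, a minimal pure-Python sketch of a standard MPS Born machine (fixed computational-basis embedding; the paper's trainable measurement operators are not reproduced here): each site holds one matrix per symbol, and the Born rule assigns p(x) proportional to the squared contraction. All sizes and the random tensors are illustrative:

```python
import itertools
import random

random.seed(0)

def rand_matrix():
    """Random 2x2 MPS site tensor for one symbol (bond dimension 2)."""
    return [[random.gauss(0, 1) for _ in range(2)] for _ in range(2)]

# Toy MPS over 3 binary variables: one matrix per symbol per site,
# plus boundary vectors l and r.
mps = [{0: rand_matrix(), 1: rand_matrix()} for _ in range(3)]
left, right = [1.0, 0.0], [1.0, 0.0]

def matvec(m, v):
    return [m[0][0] * v[0] + m[0][1] * v[1],
            m[1][0] * v[0] + m[1][1] * v[1]]

def amplitude(bits):
    """<l| A1[x1] A2[x2] A3[x3] |r>, contracted right to left."""
    v = right[:]
    for site, x in zip(reversed(mps), reversed(bits)):
        v = matvec(site[x], v)
    return left[0] * v[0] + left[1] * v[1]

# Normalization over all bitstrings (feasible only for tiny systems).
Z = sum(amplitude(b) ** 2 for b in itertools.product((0, 1), repeat=3))

def born_prob(bits):
    """Born rule: p(x) = |amplitude(x)|^2 / Z."""
    return amplitude(bits) ** 2 / Z
```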
arXiv Detail & Related papers (2023-11-08T22:56:37Z) - Exponential Quantum Communication Advantage in Distributed Inference and Learning [19.827903766111987]
We present a framework for distributed computation over a quantum network.
We show that for models within this framework, inference and training using gradient descent can be performed with exponentially less communication.
We also show that models in this class can encode highly nonlinear features of their inputs, and their expressivity increases exponentially with model depth.
arXiv Detail & Related papers (2023-10-11T02:19:50Z) - Improving Out-of-Distribution Robustness of Classifiers via Generative
Interpolation [56.620403243640396]
Deep neural networks achieve superior performance for learning from independent and identically distributed (i.i.d.) data.
However, their performance deteriorates significantly when handling out-of-distribution (OoD) data.
We develop a simple yet effective method called Generative Interpolation to fuse generative models trained from multiple domains for synthesizing diverse OoD samples.
arXiv Detail & Related papers (2023-07-23T03:53:53Z) - A Framework for Demonstrating Practical Quantum Advantage: Racing
Quantum against Classical Generative Models [62.997667081978825]
We build on a previously proposed framework for evaluating the generalization performance of generative models.
We establish the first comparative race towards practical quantum advantage (PQA) between classical and quantum generative models.
Our results suggest that QCBMs are more efficient in the data-limited regime than the other state-of-the-art classical generative models.
arXiv Detail & Related papers (2023-03-27T22:48:28Z) - Parallel Hybrid Networks: an interplay between quantum and classical
neural networks [0.0]
We introduce a new, interpretable class of hybrid quantum neural networks that pass the inputs of the dataset in parallel.
We demonstrate this on two synthetic datasets sampled from periodic distributions with added protrusions as noise.
arXiv Detail & Related papers (2023-03-06T15:45:28Z) - Effective Dynamics of Generative Adversarial Networks [16.51305515824504]
Generative adversarial networks (GANs) are a class of machine-learning models that use adversarial training to generate new samples.
One major form of training failure, known as mode collapse, involves the generator failing to reproduce the full diversity of modes in the target probability distribution.
We present an effective model of GAN training, which captures the learning dynamics by replacing the generator neural network with a collection of particles in the output space.
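The particle picture can be sketched in a few lines: replace the generator with particles in output space and move each one up the gradient of the discriminator's log-confidence. The toy discriminator below is a fixed analytic stand-in, not a trained network:

```python
import math
import random

random.seed(1)

def D(x):
    """Toy discriminator confidence, peaked at the single 'real' mode
    x = 2 (illustrative stand-in for a trained discriminator)."""
    return math.exp(-(x - 2.0) ** 2)

# Effective model: the generator network is replaced by particles
# living directly in the output space.
particles = [random.gauss(0.0, 0.5) for _ in range(50)]

def step(particles, lr=0.1, eps=1e-4):
    """Move each particle up a numerical gradient of log D."""
    out = []
    for x in particles:
        grad = (math.log(D(x + eps)) - math.log(D(x - eps))) / (2 * eps)
        out.append(x + lr * grad)
    return out

for _ in range(200):
    particles = step(particles)
```

All particles collapse onto the single peak of D; with a multi-modal target, the same dynamics make mode collapse directly visible.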
arXiv Detail & Related papers (2022-12-08T22:04:01Z) - Dynamic Inference with Neural Interpreters [72.90231306252007]
We present Neural Interpreters, an architecture that factorizes inference in a self-attention network as a system of modules.
Inputs to the model are routed through a sequence of functions in a way that is learned end-to-end.
We show that Neural Interpreters perform on par with the vision transformer using fewer parameters, while being transferable to a new task in a sample-efficient manner.
arXiv Detail & Related papers (2021-10-12T23:22:45Z) - Discrete-Valued Neural Communication [85.3675647398994]
We show that restricting the transmitted information among components to discrete representations is a beneficial bottleneck.
Even though individuals have different understandings of what a "cat" is based on their specific experiences, the shared discrete token makes it possible for communication among individuals to be unimpeded by individual differences in internal representation.
We extend the quantization mechanism from the Vector-Quantized Variational Autoencoder to multi-headed discretization with shared codebooks and use it for discrete-valued neural communication.
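A minimal sketch of multi-headed discretization with a shared codebook: split a vector into head-sized chunks and snap each chunk to its nearest entry in one common codebook. Codebook size, head count, and the random vectors are all illustrative:

```python
import random

random.seed(0)

# Shared codebook: 8 code vectors of dimension 4 (one head's width).
codebook = [[random.gauss(0, 1) for _ in range(4)] for _ in range(8)]

def nearest(vec):
    """Index of the closest codebook entry (squared Euclidean distance)."""
    dists = [sum((a - b) ** 2 for a, b in zip(vec, code))
             for code in codebook]
    return dists.index(min(dists))

def multi_head_quantize(vec, heads=2):
    """Split vec into head-sized chunks and quantize each against the
    single shared codebook, as in multi-headed discretization."""
    width = len(vec) // heads
    chunks = [vec[i * width:(i + 1) * width] for i in range(heads)]
    indices = [nearest(c) for c in chunks]
    quantized = [x for i in indices for x in codebook[i]]
    return indices, quantized

indices, quantized = multi_head_quantize(
    [random.gauss(0, 1) for _ in range(8)])
```

The transmitted message is just the list of discrete indices; the receiver reconstructs the vector from the same shared codebook.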
arXiv Detail & Related papers (2021-07-06T03:09:25Z) - Quantum semi-supervised generative adversarial network for enhanced data
classification [1.1110435360741175]
We propose a quantum semi-supervised generative adversarial network (qSGAN).
The system is composed of a quantum generator and a classical discriminator/classifier (D/C).
arXiv Detail & Related papers (2020-10-26T17:11:49Z) - Discriminator Contrastive Divergence: Semi-Amortized Generative Modeling
by Exploring Energy of the Discriminator [85.68825725223873]
Generative Adversarial Networks (GANs) have shown great promise in modeling high dimensional data.
We introduce the Discriminator Contrastive Divergence, which is well motivated by the property of WGAN's discriminator.
We demonstrate the benefits of significantly improved generation on both synthetic data and several real-world image generation benchmarks.
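A generic sketch of the underlying idea (not the paper's exact procedure): treat the discriminator's logit as a negative energy and refine generator samples with Langevin dynamics on it. The quadratic logit, step sizes, and noise scale below are illustrative:

```python
import math
import random

random.seed(2)

def logit_d(x):
    """Toy discriminator logit; higher means 'more real', peaked at x = 1
    (illustrative stand-in for a trained WGAN-style discriminator)."""
    return -(x - 1.0) ** 2

def langevin_refine(x, steps=500, lr=0.01, eps=1e-4):
    """Refine a sample by noisy gradient ascent on the discriminator
    logit, i.e. Langevin dynamics on the energy E(x) = -logit_d(x).
    Noise is scaled down by 0.1 to keep the illustration tame."""
    for _ in range(steps):
        grad = (logit_d(x + eps) - logit_d(x - eps)) / (2 * eps)
        x = x + lr * grad + math.sqrt(2 * lr) * 0.1 * random.gauss(0, 1)
    return x

# Crude 'generator' samples start far from the mode, then get refined.
refined = [langevin_refine(random.gauss(-1.0, 0.5)) for _ in range(30)]
```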
arXiv Detail & Related papers (2020-04-05T01:50:16Z)
This list is automatically generated from the titles and abstracts of the papers in this site.