VAE-QWGAN: Addressing Mode Collapse in Quantum GANs via Autoencoding Priors
- URL: http://arxiv.org/abs/2409.10339v2
- Date: Thu, 22 May 2025 00:46:32 GMT
- Title: VAE-QWGAN: Addressing Mode Collapse in Quantum GANs via Autoencoding Priors
- Authors: Aaron Mark Thomas, Harry Youel, Sharu Theresa Jose
- Abstract summary: VAE-QWGAN combines the strengths of a classical Variational AutoEncoder (VAE) with a hybrid Quantum Wasserstein GAN (QWGAN). We show that VAE-QWGAN demonstrates significant improvement over existing QGAN approaches.
- Score: 3.823356975862005
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Recent proposals for quantum generative adversarial networks (GANs) suffer from mode collapse, analogous to classical GANs, wherein the distribution learnt by the GAN fails to capture the full mode complexity of the target distribution. Mode collapse can arise from the use of uninformed prior distributions in the generative learning task. To alleviate mode collapse in quantum GANs, this work presents a novel hybrid quantum-classical generative model, the VAE-QWGAN, which combines the strengths of a classical Variational AutoEncoder (VAE) with a hybrid Quantum Wasserstein GAN (QWGAN). The VAE-QWGAN fuses the VAE decoder and the QWGAN generator into a single quantum model, and uses the VAE encoder for data-dependent latent-vector sampling during training. This, in turn, enhances the diversity and quality of the generated images. To generate new data from the trained model at inference, we sample from a Gaussian mixture model (GMM) prior that is learnt on the latent vectors produced during training. We conduct extensive image-generation experiments with QGANs on the MNIST and Fashion-MNIST datasets and compute a range of metrics that measure the diversity and quality of the generated samples. We show that VAE-QWGAN delivers a significant improvement over existing QGAN approaches.
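As a concrete illustration of the inference pipeline, the sketch below fits a GMM on encoder latents and samples from it in place of an uninformed prior. This is a minimal sketch under stated assumptions: scikit-learn's GaussianMixture stands in for the learnt GMM prior, and the encoder latents and generator are random stand-ins, not the authors' model.

```python
# Minimal sketch: fit a GMM prior on encoder latents, then sample it at
# inference instead of an uninformed prior. All names are illustrative.
import numpy as np
from sklearn.mixture import GaussianMixture

def fit_latent_prior(encoder_latents, n_components=10):
    """Fit a GMM on latent vectors collected from the VAE encoder during training."""
    gmm = GaussianMixture(n_components=n_components, covariance_type="full")
    gmm.fit(encoder_latents)
    return gmm

def sample_images(gmm, generator, n_samples=16):
    """Draw data-dependent latents from the learnt GMM and decode them."""
    z, _ = gmm.sample(n_samples)
    return generator(z)  # the fused VAE-decoder / QWGAN-generator model

rng = np.random.default_rng(0)
latents = rng.normal(size=(1000, 8))   # stand-in for stored encoder outputs
W = rng.normal(size=(8, 784))          # stand-in for a trained generator
prior = fit_latent_prior(latents, n_components=5)
fake = sample_images(prior, lambda z: z @ W)
print(fake.shape)                      # (16, 784)
```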
Related papers
- Improving GANs by leveraging the quantum noise from real hardware [0.0]
We propose a novel approach to generative adversarial networks (GANs) in which the standard i.i.d. Gaussian latent prior is replaced or hybridized with a quantum-correlated prior. We show that intrinsic quantum randomness and device-specific imperfections can provide a structured inductive bias that enhances GAN performance.
arXiv Detail & Related papers (2025-07-02T16:56:09Z)
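A minimal sketch of the hybrid-prior idea in the entry above: GAN latents are drawn as a blend of i.i.d. Gaussian noise and quantum-sourced noise. The convex mixing rule and the `quantum_noise` placeholder are illustrative assumptions, not the paper's exact recipe.

```python
# Sketch: blend i.i.d. Gaussian noise with quantum-sourced noise to form the
# GAN latent prior. `quantum_noise` is a placeholder for hardware samples.
import numpy as np

def quantum_noise(shape, rng):
    """Stand-in for device-derived noise, e.g. measured bitstrings, centred."""
    bits = rng.integers(0, 2, size=shape)
    return 2.0 * bits - 1.0            # map {0, 1} -> {-1, +1}

def hybrid_prior(n, dim, alpha=0.3, rng=None):
    """Convex blend (weight alpha) of classical and quantum noise sources."""
    if rng is None:
        rng = np.random.default_rng()
    g = rng.normal(size=(n, dim))
    q = quantum_noise((n, dim), rng)
    return np.sqrt(1.0 - alpha) * g + np.sqrt(alpha) * q

z = hybrid_prior(64, 100)              # latent batch fed to the generator
print(z.shape)                         # (64, 100)
```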
- Text Generation Beyond Discrete Token Sampling [75.96920867382859]
Mixture of Inputs (MoI) is a training-free method for autoregressive generation. MoI consistently improves performance across multiple models, including QwQ-32B, Nemotron-Super-49B, Gemma-3-27B, and DAPO-Qwen-32B.
arXiv Detail & Related papers (2025-05-20T18:41:46Z)
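A hedged sketch of the MoI idea above: instead of feeding only the sampled token's embedding to the next step, feed a blend of that embedding and the distribution-weighted expectation over the vocabulary. The blend weight `beta` and the exact mixing rule are assumptions for illustration.

```python
# Sketch: mix the sampled token's embedding with the probability-weighted
# embedding of the whole distribution; `beta` is an illustrative blend weight.
import torch

def mixture_of_inputs(embedding, probs, sampled_id, beta=0.5):
    hard = embedding(torch.tensor(sampled_id))  # embedding of the sampled token
    soft = probs @ embedding.weight             # expectation over the vocabulary
    return (1.0 - beta) * hard + beta * soft    # blended next-step input

emb = torch.nn.Embedding(100, 16)
p = torch.softmax(torch.randn(100), dim=0)
tok = int(torch.multinomial(p, 1))
x_next = mixture_of_inputs(emb, p, tok)
print(x_next.shape)                             # torch.Size([16])
```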
- Quantum Generative Models for Image Generation: Insights from MNIST and MedMNIST [0.0]
We introduce two novel noise strategies: intrinsic quantum-generated noise and a tailored noise-scheduling mechanism.
We evaluate our model on the MNIST and MedMNIST datasets to examine its feasibility and performance.
arXiv Detail & Related papers (2025-03-30T06:36:22Z)
- Investigating Parameter-Efficiency of Hybrid QuGANs Based on Geometric Properties of Generated Sea Route Graphs [3.9456729020535013]
We use quantum-classical hybrid generative adversarial networks (QuGANs) to artificially generate graphs of shipping routes. We compare hybrid QuGANs with classical Generative Adversarial Networks (GANs). Our results indicate that QuGANs are indeed able to quickly learn and represent underlying geometric properties and distributions.
arXiv Detail & Related papers (2025-01-15T09:08:05Z)
- Quantum Down Sampling Filter for Variational Auto-encoder [0.504868948270058]
Variational autoencoders (VAEs) are fundamental for generative modeling and image reconstruction.
This study introduces a hybrid model, the quantum variational autoencoder (Q-VAE).
Q-VAE integrates quantum encoding within the encoder while utilizing fully connected layers to extract meaningful representations.
arXiv Detail & Related papers (2025-01-09T11:08:55Z)
- Efficient Generative Modeling with Residual Vector Quantization-Based Tokens [5.949779668853557]
ResGen is an efficient RVQ-based discrete diffusion model that generates high-fidelity samples without compromising sampling speed.
We validate the efficacy and generalizability of the proposed method on two challenging tasks: conditional image generation on ImageNet 256x256 and zero-shot text-to-speech synthesis.
As we scale the depth of RVQ, our generative models exhibit enhanced generation fidelity or faster sampling speeds compared to similarly sized baseline models.
arXiv Detail & Related papers (2024-12-13T15:31:17Z)
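The residual-vector-quantization tokens that ResGen builds on can be sketched in a few lines: each stage quantizes the residual left by the previous stages, so reconstruction error drops with depth. The codebooks below are random stand-ins; ResGen's diffusion model itself is not reproduced here.

```python
# Sketch of residual vector quantization: each stage quantizes the residual
# left by the previous stages. Codebooks are random stand-ins.
import numpy as np

def rvq_encode(x, codebooks):
    """Return one code index per stage; the residual shrinks with depth."""
    residual, codes = x.copy(), []
    for cb in codebooks:                                 # cb has shape (K, dim)
        idx = int(np.argmin(((residual - cb) ** 2).sum(axis=1)))
        codes.append(idx)
        residual = residual - cb[idx]                    # quantize the leftover
    return codes, residual

def rvq_decode(codes, codebooks):
    return sum(cb[i] for i, cb in zip(codes, codebooks))

rng = np.random.default_rng(0)
books = [rng.normal(size=(256, 32)) for _ in range(4)]   # 4 stages, 256 codes each
x = rng.normal(size=32)
codes, err = rvq_encode(x, books)
print(codes, float(np.linalg.norm(err)))                 # error drops as depth grows
```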
- A Matrix Product State Model for Simultaneous Classification and Generation [0.8192907805418583]
Quantum machine learning (QML) is a rapidly expanding field that merges the principles of quantum computing with the techniques of machine learning.
Here, we present a novel matrix product state (MPS) model, where the MPS functions as both a classifier and a generator.
Our contributions offer insights into the mechanics of tensor network methods for generation tasks.
arXiv Detail & Related papers (2024-06-25T10:23:36Z)
- Towards Efficient Quantum Hybrid Diffusion Models [68.43405413443175]
We propose a new methodology for designing quantum hybrid diffusion models, with two possible hybridization schemes that combine quantum computing's superior generalization with classical networks' modularity.
arXiv Detail & Related papers (2024-02-25T16:57:51Z)
- Approximately Equivariant Quantum Neural Network for $p4m$ Group Symmetries in Images [30.01160824817612]
This work proposes equivariant Quantum Convolutional Neural Networks (EquivQCNNs) for image classification under planar $p4m$ symmetry.
We present results for different use cases, such as phase detection in the 2D Ising model and classification of the extended MNIST dataset.
arXiv Detail & Related papers (2023-10-03T18:01:02Z)
- A Bayesian Non-parametric Approach to Generative Models: Integrating Variational Autoencoder and Generative Adversarial Networks using Wasserstein and Maximum Mean Discrepancy [2.5109359014278954]
We propose a novel generative model within the Bayesian non-parametric learning (BNPL) framework to address some notable failure modes in generative adversarial networks (GANs) and variational autoencoders (VAEs). We demonstrate that the BNPL framework enhances training stability and provides robustness and accuracy guarantees when incorporating the Wasserstein distance and the maximum mean discrepancy measure (WMMD) into our model's loss function.
arXiv Detail & Related papers (2023-08-27T08:58:31Z)
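A short sketch of the MMD half of the WMMD loss above, using an RBF kernel with a fixed bandwidth as an illustrative choice; the paper's kernel and weighting are not reproduced here.

```python
# Sketch of a (biased) MMD^2 estimate with an RBF kernel of fixed bandwidth.
import torch

def mmd2(x, y, sigma=1.0):
    """MMD^2 between sample sets x (n, d) and y (m, d)."""
    def k(a, b):
        return torch.exp(-torch.cdist(a, b) ** 2 / (2.0 * sigma ** 2))
    return k(x, x).mean() + k(y, y).mean() - 2.0 * k(x, y).mean()

real = torch.randn(128, 2)
fake = torch.randn(128, 2) + 1.0
print(mmd2(real, fake))   # larger when the two sample sets differ
```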
- Learning hard distributions with quantum-enhanced Variational Autoencoders [2.545905720487589]
We introduce a quantum-enhanced VAE (QeVAE) that uses quantum correlations to improve the fidelity over classical VAEs.
We empirically show that the QeVAE outperforms classical models on several classes of quantum states.
Our work paves the way for new applications of quantum generative learning algorithms.
arXiv Detail & Related papers (2023-05-02T16:50:24Z)
- A Framework for Demonstrating Practical Quantum Advantage: Racing Quantum against Classical Generative Models [62.997667081978825]
We build on a previously proposed framework for evaluating the generalization performance of generative models.
We establish the first comparative race towards practical quantum advantage (PQA) between classical and quantum generative models.
Our results suggest that quantum circuit Born machines (QCBMs) are more efficient in the data-limited regime than the other state-of-the-art classical generative models.
arXiv Detail & Related papers (2023-03-27T22:48:28Z)
- Fully Bayesian Autoencoders with Latent Sparse Gaussian Processes [23.682509357305406]
Autoencoders and their variants are among the most widely used models in representation learning and generative modeling.
We propose a novel Sparse Gaussian Process Bayesian Autoencoder model in which we impose fully sparse Gaussian Process priors on the latent space of a Bayesian Autoencoder.
arXiv Detail & Related papers (2023-02-09T09:57:51Z)
- Hybrid Quantum-Classical Generative Adversarial Network for High Resolution Image Generation [14.098992977726942]
Quantum machine learning (QML) has received increasing attention due to its potential to outperform classical machine learning methods in various problems.
A subclass of QML methods is quantum generative adversarial networks (QGANs) which have been studied as a quantum counterpart of classical GANs.
Here we integrate classical and quantum techniques to propose a new hybrid quantum-classical GAN framework.
arXiv Detail & Related papers (2022-12-22T11:18:35Z)
- FewGAN: Generating from the Joint Distribution of a Few Images [95.6635227371479]
We introduce FewGAN, a generative model for generating novel, high-quality and diverse images.
FewGAN is a hierarchical patch-GAN that applies quantization at the first coarse scale, followed by a pyramid of residual fully convolutional GANs at finer scales.
In an extensive set of experiments, it is shown that FewGAN outperforms baselines both quantitatively and qualitatively.
arXiv Detail & Related papers (2022-07-18T07:11:28Z)
- ClusterQ: Semantic Feature Distribution Alignment for Data-Free Quantization [111.12063632743013]
We propose a new and effective data-free quantization method termed ClusterQ.
To obtain high inter-class separability of semantic features, we cluster and align the feature distribution statistics.
We also incorporate the intra-class variance to solve class-wise mode collapse.
arXiv Detail & Related papers (2022-04-30T06:58:56Z)
- A new perspective on probabilistic image modeling [92.89846887298852]
We present the deep convolutional Gaussian mixture model (DCGMM), a new probabilistic approach for image modeling capable of density estimation, sampling, and tractable inference.
DCGMMs can be trained end-to-end by SGD from random initial conditions, much like CNNs.
We show that DCGMMs compare favorably to several recent probabilistic circuit (PC) and sum-product network (SPN) models in terms of inference, classification, and sampling.
arXiv Detail & Related papers (2022-03-21T14:53:57Z)
- Diffusion bridges vector quantized Variational AutoEncoders [0.0]
Our framework extends the standard VQ-VAE and enables end-to-end training.
We show that our model is competitive with an autoregressive prior on the mini-ImageNet dataset.
arXiv Detail & Related papers (2022-02-10T08:38:12Z)
- Controllable and Compositional Generation with Latent-Space Energy-Based Models [60.87740144816278]
Controllable generation is one of the key requirements for successful adoption of deep generative models in real-world applications.
In this work, we use energy-based models (EBMs) to handle compositional generation over a set of attributes.
By composing energy functions with logical operators, this work is the first to achieve such compositionality in generating photo-realistic images of resolution 1024x1024.
arXiv Detail & Related papers (2021-10-21T03:31:45Z)
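The logical composition of energies mentioned above can be sketched directly: AND sums attribute energies, OR takes a soft minimum, and sampling runs Langevin dynamics on the composed energy. The attribute EBMs below are stand-in linear networks, and the step sizes are illustrative.

```python
# Sketch: compose attribute energies with logical operators and sample the
# composed energy with Langevin dynamics. Attribute EBMs are stand-ins.
import torch

def e_and(energies, z):
    return sum(E(z) for E in energies)                 # low only if all are low

def e_or(energies, z, tau=1.0):
    vals = torch.stack([E(z) for E in energies])
    return -tau * torch.logsumexp(-vals / tau, dim=0)  # soft minimum

def langevin_sample(energy, z, steps=50, step_size=0.01):
    for _ in range(steps):
        z = z.detach().requires_grad_(True)
        grad = torch.autograd.grad(energy(z).sum(), z)[0]
        z = z - step_size * grad + (2.0 * step_size) ** 0.5 * torch.randn_like(z)
    return z.detach()

E_smile = torch.nn.Linear(8, 1)    # stand-in attribute energy networks
E_glasses = torch.nn.Linear(8, 1)
z0 = torch.randn(16, 8)
z = langevin_sample(lambda v: e_and([E_smile, E_glasses], v), z0)
print(z.shape)                     # torch.Size([16, 8])
```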
- Quantum Machine Learning with SQUID [64.53556573827525]
We present the Scaled QUantum IDentifier (SQUID), an open-source framework for exploring hybrid quantum-classical algorithms for classification problems.
We provide examples of using SQUID on a standard binary classification problem from the popular MNIST dataset.
arXiv Detail & Related papers (2021-04-30T21:34:11Z)
- Anomaly detection with variational quantum generative adversarial networks [0.0]
Generative adversarial networks (GANs) are a machine learning framework comprising a generative model for sampling from a target distribution and a discriminative model trained to distinguish generated samples from data.
We introduce variational quantum-classical Wasserstein GANs to address these issues and embed this model in a classical machine learning framework for anomaly detection.
Our model replaces the generator of Wasserstein GANs with a hybrid quantum-classical neural net and leaves the classical discriminative model unchanged.
arXiv Detail & Related papers (2020-10-20T17:48:04Z)
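A hedged sketch of the architecture above: a parametrized quantum circuit (here simulated with PennyLane) serves as the generator while the classical critic and Wasserstein loss are unchanged. The circuit layout, sizes, and critic are illustrative stand-ins, not the paper's exact model.

```python
# Sketch: a parametrized quantum circuit (simulated) as the WGAN generator;
# the classical critic and Wasserstein loss stay unchanged.
import pennylane as qml
import torch

n_qubits = 4
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev, interface="torch")
def quantum_generator(z, weights):
    for i in range(n_qubits):          # angle-encode the latent vector
        qml.RY(z[i], wires=i)
    for i in range(n_qubits):          # one trainable variational layer
        qml.RY(weights[i], wires=i)
    for i in range(n_qubits - 1):      # entangling chain
        qml.CNOT(wires=[i, i + 1])
    return [qml.expval(qml.PauliZ(i)) for i in range(n_qubits)]

weights = torch.randn(n_qubits, requires_grad=True)
critic = torch.nn.Linear(n_qubits, 1)  # stand-in classical critic

z = torch.rand(n_qubits)
fake = torch.stack(quantum_generator(z, weights)).float()  # cast for the critic
g_loss = -critic(fake).mean()          # Wasserstein generator objective
g_loss.backward()                      # gradients reach the circuit weights
```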
- GANs with Variational Entropy Regularizers: Applications in Mitigating the Mode-Collapse Issue [95.23775347605923]
Building on the success of deep learning, Generative Adversarial Networks (GANs) provide a modern approach to learning a probability distribution from observed samples.
GANs often suffer from the mode collapse issue where the generator fails to capture all existing modes of the input distribution.
We take an information-theoretic approach and maximize a variational lower bound on the entropy of the generated samples to increase their diversity.
arXiv Detail & Related papers (2020-09-24T19:34:37Z)
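A minimal sketch of entropy-regularizing the generator as described above: an entropy estimate of the generated batch, scaled by `lam`, is subtracted from the adversarial loss so that diverse batches are rewarded. A nearest-neighbour distance proxy stands in for the paper's variational lower bound.

```python
# Sketch: subtract a batch-entropy estimate (weight lam) from the generator
# loss; a nearest-neighbour proxy stands in for the variational bound.
import torch

def batch_entropy_proxy(samples):
    """Mean log nearest-neighbour distance; grows with sample diversity."""
    n = samples.shape[0]
    d = torch.cdist(samples, samples)
    d = d.masked_fill(torch.eye(n, dtype=torch.bool), float("inf"))
    return torch.log(d.min(dim=1).values + 1e-8).mean()

def generator_loss(critic_scores, fake_samples, lam=0.1):
    adv = -critic_scores.mean()                       # usual adversarial term
    return adv - lam * batch_entropy_proxy(fake_samples)

fake = torch.randn(64, 2, requires_grad=True)
loss = generator_loss(torch.randn(64, 1), fake)
loss.backward()                                       # diversity gets rewarded
```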
This list is automatically generated from the titles and abstracts of the papers on this site.