Quantum latent distributions in deep generative models
- URL: http://arxiv.org/abs/2508.19857v1
- Date: Wed, 27 Aug 2025 13:20:01 GMT
- Title: Quantum latent distributions in deep generative models
- Authors: Omar Bacarreza, Thorin Farnsworth, Alexander Makarovskiy, Hugo Wallner, Tessa Hicks, Santiago Sempere-Llagostera, John Price, Robert J. A. Francis-Jones, William R. Clements
- Abstract summary: We show that quantum latent distributions can lead to improved generative performance in GANs. This work confirms that near-term quantum processors can expand the capabilities of deep generative models.
- Score: 31.751144081456683
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Many successful families of generative models leverage a low-dimensional latent distribution that is mapped to a data distribution. Though simple latent distributions are commonly used, it has been shown that more sophisticated distributions can improve performance. For instance, recent work has explored using the distributions produced by quantum processors and found empirical improvements. However, when latent space distributions produced by quantum processors can be expected to improve performance, and whether these improvements are reproducible, are open questions that we investigate in this work. We prove that, under certain conditions, these "quantum latent distributions" enable generative models to produce data distributions that classical latent distributions cannot efficiently produce. We also provide actionable intuitions to identify when such quantum advantages may arise in real-world settings. We perform benchmarking experiments on both a synthetic quantum dataset and the QM9 molecular dataset, using both simulated and real photonic quantum processors. Our results demonstrate that quantum latent distributions can lead to improved generative performance in GANs compared to a range of classical baselines. We also explore diffusion and flow matching models, identifying architectures compatible with quantum latent distributions. This work confirms that near-term quantum processors can expand the capabilities of deep generative models.
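To make the setup concrete: the only architectural change relative to a standard GAN is the source of the latent vectors fed to the generator. The minimal sketch below is not the authors' code; the network, dimensions, and the classically simulated "quantum-like" sampler are illustrative stand-ins for samples that would come from a photonic processor. It shows how a latent sampler can be swapped without touching the rest of the model.

```python
import numpy as np

rng = np.random.default_rng(0)

def gaussian_latent(batch, dim):
    """Standard classical prior: i.i.d. Gaussian latent vectors."""
    return rng.standard_normal((batch, dim))

def quantum_like_latent(batch, dim):
    """Stand-in for samples from a (photonic) quantum processor.

    Here we simply draw correlated binary strings classically; in the paper's
    setting these samples would instead come from measuring a quantum device.
    """
    cause = rng.random((batch, 1)) < 0.5          # shared hidden cause
    noise = rng.random((batch, dim)) < 0.2        # independent flips
    return np.logical_xor(cause, noise).astype(np.float64)

class Generator:
    """Tiny MLP generator mapping latent vectors to data space."""

    def __init__(self, latent_dim, data_dim, hidden=64):
        self.w1 = rng.standard_normal((latent_dim, hidden)) * 0.1
        self.w2 = rng.standard_normal((hidden, data_dim)) * 0.1

    def __call__(self, z):
        h = np.tanh(z @ self.w1)
        return h @ self.w2

latent_dim, data_dim = 8, 32
gen = Generator(latent_dim, data_dim)

# The only change between the classical and "quantum" GAN variants is which
# latent sampler is plugged into the generator; the discriminator and the
# training loop (omitted here) are unchanged.
for sampler in (gaussian_latent, quantum_like_latent):
    z = sampler(16, latent_dim)
    x_fake = gen(z)
    print(sampler.__name__, x_fake.shape)
```

In the paper's experiments the second sampler would be replaced by measurement outcomes from a simulated or real photonic quantum processor, with the generator and discriminator left as in an ordinary GAN.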
Related papers
- Limits of quantum generative models with classical sampling hardness [2.321580694317368]
We study quantum generative models from the perspective of output distributions.
We find that models that anticoncentrate are not trainable on average, including those exhibiting quantum advantage.
We conclude that quantum advantage can still be found in generative models, although its source must be distinct from anticoncentration.
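For context on the summary above, one common formalization of anticoncentration (a generic definition, not quoted from this paper) bounds the average output collision probability of an $n$-qubit circuit family by a constant multiple of the uniform value $2^{-n}$:

```latex
% Generic second-moment formalization of anticoncentration (illustrative):
\mathbb{E}_{U}\!\left[\sum_{x \in \{0,1\}^{n}} p_{U}(x)^{2}\right] \;\le\; \frac{\alpha}{2^{n}}
\quad \text{for some constant } \alpha \ge 1,
% i.e. the collision probability of the output distribution p_U is, on average
% over circuits U, within a constant factor of that of the uniform
% distribution, whose collision probability is exactly 2^{-n}.
```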
arXiv Detail & Related papers (2025-12-31T11:40:50Z)
- Mitigating Barren plateaus in quantum denoising diffusion probabilistic models [49.90716699848553]
Quantum generative models leverage quantum superposition and entanglement to enhance learning efficiency for both classical and quantum data.
QuDDPM has been proposed as a promising framework for quantum generative learning.
We show that barren plateaus emerge in QuDDPMs due to the use of 2-design states as the input for the denoising process.
We introduce an improved QuDDPM that utilizes a distribution maintaining a certain distance from the Haar distribution, ensuring better trainability.
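The barren-plateau phenomenon mentioned above has a standard quantitative form (stated generically here, not taken from this paper's analysis): when the states feeding a parameterized circuit form approximate 2-designs, cost-function gradients concentrate exponentially around zero,

```latex
% Generic barren-plateau statement (illustrative):
\operatorname{Var}_{\boldsymbol{\theta}}\!\left[\partial_{\theta_k} C(\boldsymbol{\theta})\right] \in O\!\left(b^{-n}\right), \qquad b > 1,
% so, by Chebyshev's inequality, the probability of observing a gradient
% component above any fixed threshold vanishes exponentially in the qubit
% number n, which is what makes such models hard to train at scale.
```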
arXiv Detail & Related papers (2025-12-07T07:01:44Z)
- Overcoming Dimensional Factorization Limits in Discrete Diffusion Models through Quantum Joint Distribution Learning [79.65014491424151]
We propose a quantum Discrete Denoising Diffusion Probabilistic Model (QD3PM).
It enables joint probability learning through diffusion and denoising in exponentially large Hilbert spaces.
This paper establishes a new theoretical paradigm in generative models by leveraging the quantum advantage in joint distribution learning.
arXiv Detail & Related papers (2025-05-08T11:48:21Z)
- Quantum Chebyshev Probabilistic Models for Fragmentation Functions [14.379311972506791]
We study fragmentation functions (FFs) of charged pions and kaons from single-inclusive hadron production in electron-positron annihilation.
Our results highlight the growing potential of quantum generative modeling for addressing problems in scientific discovery and advancing data analysis in high-energy physics.
arXiv Detail & Related papers (2025-03-20T12:09:44Z)
- Quantum Latent Diffusion Models [65.16624577812436]
We propose a potential version of a quantum diffusion model that leverages the established idea of classical latent diffusion models.
This involves using a traditional autoencoder to reduce images, followed by operations with variational circuits in the latent space.
The results demonstrate an advantage of the quantum version, as evidenced by better metrics for the images it generates.
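The pipeline described in this summary (a classical autoencoder for compression, then a variational quantum circuit acting in latent space) can be sketched in a few lines. This is a toy illustration under stated assumptions, not the paper's implementation: the encoder is a random untrained projection, the circuit is a 2-qubit statevector simulation, and all dimensions are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)

def encoder(x, latent_dim=4):
    """Random, untrained projection of a flattened image to a low-dim latent."""
    w = rng.standard_normal((x.shape[-1], latent_dim)) * 0.05
    return np.tanh(x @ w)

def ry(theta):
    """Single-qubit RY rotation matrix."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

# CNOT with qubit 0 as control, qubit 1 as target (big-endian ordering)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

def variational_latent_layer(z, params):
    """Angle-encode a 4-dim latent on 2 qubits, apply trainable RY layers with
    entanglement, and return Z expectation values as the processed latent."""
    state = np.array([1.0, 0.0, 0.0, 0.0])
    for angles in (z[:2], params[:2], z[2:4], params[2:4]):
        state = np.kron(ry(angles[0]), ry(angles[1])) @ state
        state = CNOT @ state
    probs = np.abs(state) ** 2
    z0 = probs[0] + probs[1] - probs[2] - probs[3]   # <Z> on qubit 0
    z1 = probs[0] - probs[1] + probs[2] - probs[3]   # <Z> on qubit 1
    return np.array([z0, z1])

image = rng.random(64)                 # stand-in for a flattened image
latent = encoder(image)                # classical compression step
params = rng.uniform(0, np.pi, 4)      # variational circuit parameters
print(variational_latent_layer(latent, params))
```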
arXiv Detail & Related papers (2025-01-19T21:24:02Z)
- Quantum-Noise-Driven Generative Diffusion Models [1.6385815610837167]
We propose three quantum-noise-driven generative diffusion models that could be experimentally tested on real quantum systems.
The idea is to harness unique quantum features, in particular the non-trivial interplay among coherence, entanglement and noise.
Our results are expected to pave the way for new quantum-inspired or quantum-based generative diffusion algorithms.
arXiv Detail & Related papers (2023-08-23T09:09:32Z)
- Importance sampling for stochastic quantum simulations [68.8204255655161]
We build on the qDrift protocol, which constructs random product formulas by sampling Hamiltonian terms in proportion to the magnitudes of their coefficients.
We show that the simulation cost can be reduced while achieving the same accuracy, by considering the individual simulation cost during the sampling stage.
Results are confirmed by numerical simulations performed on a lattice nuclear effective field theory.
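The baseline construction this paper builds on can be sketched directly: qDrift draws each step of a random product formula by sampling a Hamiltonian term with probability proportional to the magnitude of its coefficient. The paper's contribution, additionally weighting the sampling by per-term simulation cost, is not reproduced here; the function name and toy coefficients below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

def qdrift_schedule(coeffs, time, n_samples):
    """Sample a qDrift-style random product formula.

    Given H = sum_j h_j * H_j, each of the n_samples steps picks term j with
    probability |h_j| / lambda, where lambda = sum_j |h_j|, and evolves it for
    a fixed duration lambda * time / n_samples.  Returns the sampled term
    indices and the common per-step evolution time.
    """
    coeffs = np.asarray(coeffs, dtype=float)
    lam = np.abs(coeffs).sum()
    probs = np.abs(coeffs) / lam
    terms = rng.choice(len(coeffs), size=n_samples, p=probs)
    step_time = lam * time / n_samples
    return terms, step_time

# Toy Hamiltonian with two dominant terms and many weak ones.
coeffs = np.concatenate([[2.0, -1.5], 0.05 * rng.standard_normal(20)])
terms, dt = qdrift_schedule(coeffs, time=1.0, n_samples=200)

# Dominant terms are sampled far more often, which is the point of the
# importance-sampling construction: effort concentrates where it matters.
print("per-step time:", dt)
print("counts for the two dominant terms:",
      np.count_nonzero(terms == 0), np.count_nonzero(terms == 1))
```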
arXiv Detail & Related papers (2022-12-12T15:06:32Z)
- Generalization Metrics for Practical Quantum Advantage in Generative Models [68.8204255655161]
Generative modeling is a widely accepted natural use case for quantum computers.
We construct a simple and unambiguous approach to probe practical quantum advantage for generative modeling by measuring the algorithm's generalization performance.
Our simulation results show that our quantum-inspired models have up to a $68\times$ enhancement in generating unseen unique and valid samples.
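The kind of generalization count this summary refers to, the fraction of generated samples that are valid, mutually unique, and not present in the training data, can be written generically as follows. The helper and the parity-based validity rule are hypothetical; the paper defines its own precise metrics.

```python
def unseen_unique_valid(generated, training_set, is_valid):
    """Count generated samples that are valid, mutually unique, and absent
    from the training set -- a generic version of the kind of generalization
    count the abstract refers to (the paper's exact metrics may differ)."""
    train = set(training_set)
    kept = {s for s in generated if is_valid(s) and s not in train}
    return len(kept), len(kept) / max(len(generated), 1)

# Toy example over bit-strings: "valid" here means even parity (hypothetical rule).
training = ["0000", "0011", "1100"]
samples = ["0011", "0101", "0110", "0110", "1111", "1000"]
count, frac = unseen_unique_valid(samples, training,
                                  lambda s: s.count("1") % 2 == 0)
print(count, frac)   # 0101, 0110, 1111 are valid, unique, unseen -> 3, 0.5
```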
arXiv Detail & Related papers (2022-01-21T16:35:35Z)
- Learnability of the output distributions of local quantum circuits [53.17490581210575]
We investigate, within two different oracle models, the learnability of quantum circuit Born machines.
We first show a negative result, that the output distributions of super-logarithmic depth Clifford circuits are not sample-efficiently learnable.
We show that in a more powerful oracle model, namely when directly given access to samples, the output distributions of local Clifford circuits are computationally efficiently PAC learnable.
arXiv Detail & Related papers (2021-10-11T18:00:20Z)
- Learnability and Complexity of Quantum Samples [26.425493366198207]
Given a quantum circuit, a quantum computer can sample the output distribution exponentially faster in the number of bits than classical computers.
Can we learn the underlying quantum distribution using models with training parameters that scale polynomially in the number of bits $n$, under a fixed training time?
We study four kinds of generative models: Deep Boltzmann machine (DBM), Generative Adversarial Networks (GANs), Long Short-Term Memory (LSTM) and Autoregressive GAN, on learning a quantum dataset generated by deep random circuits.
arXiv Detail & Related papers (2020-10-22T18:45:25Z)