Conditioning in Generative Quantum Denoising Diffusion Models
- URL: http://arxiv.org/abs/2509.17569v1
- Date: Mon, 22 Sep 2025 11:01:03 GMT
- Title: Conditioning in Generative Quantum Denoising Diffusion Models
- Authors: Daniel Quinn, Lorenzo Buffoni, Stefano Gherardini, Gabriele De Chiara
- Abstract summary: We introduce a conditioning mechanism that enables the generation of quantum states from multiple target distributions. We validate our method through numerical simulations that span single-qubit generation tasks, entangled state preparation, and many-body ground state generation.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Quantum denoising diffusion models have recently emerged as a powerful framework for generative quantum machine learning. In this work, we extend these models by introducing a conditioning mechanism that enables the generation of quantum states drawn from multiple target distributions. By sharing parameters across distinct classes of quantum states, our approach avoids the need to train separate models for each distribution. We validate our method through numerical simulations that span single-qubit generation tasks, entangled state preparation, and many-body ground state generation. Across these tasks, conditioning significantly reduced the error of targeted state generation by up to an order of magnitude. Finally, we perform an ablation study to quantify the effect of key hyperparameters on the model performance.
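The key idea of the abstract, one set of shared parameters plus a class-dependent condition instead of one model per distribution, can be illustrated with a toy classical analogue. This is a hypothetical sketch only: the paper trains a parameterized quantum circuit, not the linear map below, and the targets, dimensions, and learning rate here are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy classical analogue of conditioned denoising (illustrative only; the
# paper's model is a parameterized quantum circuit, not this linear map).
# One shared weight matrix W is reused for every class; a per-class
# embedding E[c] injects the condition, so no separate model per class.
n_classes, dim, steps, lr = 2, 3, 2000, 0.1
targets = np.array([[0.0, 0.0, 1.0],   # class 0: Bloch vector of |0>
                    [1.0, 0.0, 0.0]])  # class 1: Bloch vector of |+>
W = np.eye(dim)                        # shared parameters
E = np.zeros((n_classes, dim))         # class-conditional embeddings

for _ in range(steps):
    c = rng.integers(n_classes)
    noisy = targets[c] + 0.3 * rng.normal(size=dim)  # forward noising step
    pred = W @ noisy + E[c]                          # conditioned denoising
    err = pred - targets[c]
    W -= lr * np.outer(err, noisy)                   # SGD on squared error
    E[c] -= lr * err
```

After training, `W @ noisy + E[c]` maps a noisy sample back toward the class-c target; swapping only the embedding steers generation between distributions while the weights stay shared, which is the parameter-sharing idea the abstract describes.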
Related papers
- Quantum Scrambling Born Machine [0.0]
Quantum generative modeling, where the Born rule naturally defines probability distributions, is a promising near-term application of quantum computing. We propose a Quantum Scrambling Born Machine in which a fixed entangling unitary provides multi-qubit entanglement, while only single-qubit rotations are optimized. We show that, for the benchmark distributions and system sizes considered, once the entangler produces near-Haar-typical entanglement the model learns the target distribution with weak sensitivity to the scrambler's microscopic origin.
arXiv Detail & Related papers (2026-02-19T11:33:56Z)
- Mitigating Barren plateaus in quantum denoising diffusion probabilistic models [49.90716699848553]
Quantum generative models leverage quantum superposition and entanglement to enhance learning efficiency for both classical and quantum data. QuDDPM has been proposed as a promising framework for quantum generative learning. We show that barren plateaus emerge in QuDDPMs due to the use of 2-design states as the input for the denoising process. We introduce an improved QuDDPM that utilizes a distribution maintaining a certain distance from the Haar distribution, ensuring better trainability.
arXiv Detail & Related papers (2025-12-07T07:01:44Z)
- Overcoming Dimensional Factorization Limits in Discrete Diffusion Models through Quantum Joint Distribution Learning [79.65014491424151]
We propose a quantum Discrete Denoising Diffusion Probabilistic Model (QD3PM). It enables joint probability learning through diffusion and denoising in exponentially large Hilbert spaces. This paper establishes a new theoretical paradigm in generative models by leveraging the quantum advantage in joint distribution learning.
arXiv Detail & Related papers (2025-05-08T11:48:21Z)
- Quantum Latent Diffusion Models [65.16624577812436]
We propose a potential version of a quantum diffusion model that leverages the established idea of classical latent diffusion models. This involves using a traditional autoencoder to reduce images, followed by operations with variational circuits in the latent space. The results demonstrate an advantage in using a quantum version, as evidenced by better metrics for the images generated by the quantum version.
arXiv Detail & Related papers (2025-01-19T21:24:02Z)
- Energy-Based Diffusion Language Models for Text Generation [126.23425882687195]
Energy-based Diffusion Language Model (EDLM) is an energy-based model operating at the full sequence level for each diffusion step. Our framework offers a 1.3x sampling speedup over existing diffusion models.
arXiv Detail & Related papers (2024-10-28T17:25:56Z) - Quantum State Generation with Structure-Preserving Diffusion Model [33.108168285414195]
This article considers the generative modeling of the (mixed) states of quantum systems.
The key contribution is an algorithmic innovation that respects the physical nature of quantum states.
arXiv Detail & Related papers (2024-04-09T14:21:51Z) - QuEST: Low-bit Diffusion Model Quantization via Efficient Selective Finetuning [52.157939524815866]
In this paper, we identify imbalanced activation distributions as a primary source of quantization difficulty. We propose to adjust these distributions through weight finetuning to be more quantization-friendly. Our method demonstrates its efficacy across three high-resolution image generation tasks.
arXiv Detail & Related papers (2024-02-06T03:39:44Z) - Quantum Denoising Diffusion Models [4.763438526927999]
We introduce two quantum diffusion models and benchmark their capabilities against their classical counterparts.
Our models surpass the classical models with similar parameter counts in terms of performance metrics FID, SSIM, and PSNR.
arXiv Detail & Related papers (2024-01-13T11:38:08Z) - Quantum Generative Diffusion Model: A Fully Quantum-Mechanical Model for Generating Quantum State Ensemble [40.06696963935616]
We introduce Quantum Generative Diffusion Model (QGDM) as their simple and elegant quantum counterpart.
QGDM exhibits faster convergence than Quantum Generative Adversarial Network (QGAN).
It can achieve 53.02% higher fidelity in mixed-state generation than QGAN.
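The fidelity reported for mixed-state generation is, in the standard convention, the Uhlmann fidelity between density matrices. As an illustration (not code from either paper), it can be computed as:

```python
import numpy as np
from scipy.linalg import sqrtm

def fidelity(rho, sigma):
    """Uhlmann fidelity F(rho, sigma) = (Tr sqrt(sqrt(rho) sigma sqrt(rho)))^2
    between two density matrices."""
    s = sqrtm(rho)                 # principal matrix square root of rho
    # sqrtm may return a complex array with tiny imaginary parts; keep the
    # real part of the trace before squaring.
    return float(np.real(np.trace(sqrtm(s @ sigma @ s))) ** 2)
```

For a pure state rho = |psi><psi| this reduces to <psi|sigma|psi>; for example, the fidelity between |0><0| and the maximally mixed single-qubit state I/2 is 0.5.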
arXiv Detail & Related papers (2024-01-13T10:56:34Z) - Quantum-Noise-Driven Generative Diffusion Models [1.6385815610837167]
We propose three quantum-noise-driven generative diffusion models that could be experimentally tested on real quantum systems.
The idea is to harness unique quantum features, in particular the non-trivial interplay among coherence, entanglement and noise.
Our results are expected to pave the way for new quantum-inspired or quantum-based generative diffusion algorithms.
arXiv Detail & Related papers (2023-08-23T09:09:32Z) - Learning hard distributions with quantum-enhanced Variational
Autoencoders [2.545905720487589]
We introduce a quantum-enhanced VAE (QeVAE) that uses quantum correlations to improve the fidelity over classical VAEs.
We empirically show that the QeVAE outperforms classical models on several classes of quantum states.
Our work paves the way for new applications of quantum generative learning algorithms.
arXiv Detail & Related papers (2023-05-02T16:50:24Z) - State preparation and measurement in a quantum simulation of the O(3)
sigma model [65.01359242860215]
We show that fixed points of the non-linear O(3) sigma model can be reproduced near a quantum phase transition of a spin model with just two qubits per lattice site.
We apply Trotter methods to obtain results for the complexity of adiabatic ground state preparation in both the weak-coupling and quantum-critical regimes.
We present and analyze a quantum algorithm based on non-unitary randomized simulation methods.
arXiv Detail & Related papers (2020-06-28T23:44:12Z) - Certified variational quantum algorithms for eigenstate preparation [0.0]
We develop a means to certify the termination of variational algorithms.
We demonstrate our approach by applying it to three models: the transverse field Ising model, the model of one-dimensional spinless fermions with competing interactions, and the Schwinger model of quantum electrodynamics.
arXiv Detail & Related papers (2020-06-23T18:00:00Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.