Sample-based training of quantum generative models
- URL: http://arxiv.org/abs/2511.11802v1
- Date: Fri, 14 Nov 2025 19:00:02 GMT
- Title: Sample-based training of quantum generative models
- Authors: Maria Demidik, Cenk Tüysüz, Michele Grossi, Karl Jansen
- Abstract summary: We introduce a training framework that extends the principle of contrastive divergence to quantum models. By deriving the circuit structure and providing a general recipe for constructing it, we obtain quantum circuits that generate the samples required for parameter updates.
- Score: 1.3521721488318912
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Quantum computers can efficiently sample from probability distributions that are believed to be classically intractable, providing a foundation for quantum generative modeling. However, practical training of such models remains challenging, as gradient evaluation via the parameter-shift rule scales linearly with the number of parameters and requires repeated expectation-value estimation under finite-shot noise. We introduce a training framework that extends the principle of contrastive divergence to quantum models. By deriving the circuit structure and providing a general recipe for constructing it, we obtain quantum circuits that generate the samples required for parameter updates, yielding constant scaling with respect to the cost of a forward pass, analogous to backpropagation in classical neural networks. Numerical results demonstrate that the method attains accuracy comparable to likelihood-based optimization while requiring substantially fewer samples. The framework thereby establishes a scalable route to training expressive quantum generative models directly on quantum hardware.
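As context for the scaling claim in the abstract, here is a minimal sketch of parameter-shift gradient evaluation on a toy single-qubit simulator (hypothetical function names; this is not the paper's framework or code). Each parameter's gradient requires two shifted circuit evaluations, so a full gradient costs 2P evaluations for P parameters, which is the linear overhead the sample-based approach aims to avoid.

```python
import numpy as np

def expectation(thetas):
    """<Z> after applying RY(theta_k) for each parameter in sequence to |0>."""
    state = np.array([1.0, 0.0])
    for t in thetas:
        c, s = np.cos(t / 2), np.sin(t / 2)
        state = np.array([[c, -s], [s, c]]) @ state  # RY rotation matrix
    return state[0] ** 2 - state[1] ** 2  # <Z> expectation value

def parameter_shift_grad(thetas):
    """Exact gradient via the parameter-shift rule:
    d<Z>/dtheta_k = [f(theta_k + pi/2) - f(theta_k - pi/2)] / 2,
    costing 2 * len(thetas) circuit evaluations in total."""
    grad = np.zeros(len(thetas))
    for k in range(len(thetas)):
        plus, minus = thetas.copy(), thetas.copy()
        plus[k] += np.pi / 2
        minus[k] -= np.pi / 2
        grad[k] = 0.5 * (expectation(plus) - expectation(minus))
    return grad

thetas = np.array([0.3, 1.1, -0.7])
g = parameter_shift_grad(thetas)  # 6 circuit evaluations for 3 parameters
```

In this toy circuit the RY rotations compose, so the expectation is cos(sum of angles) and every partial derivative equals -sin(sum of angles), which the shifted evaluations reproduce exactly; on hardware each evaluation would additionally carry finite-shot noise, compounding the linear cost.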
Related papers
- Symbolic Pauli Propagation for Gradient-Enabled Pre-Training of Quantum Circuits [0.0]
Quantum Machine Learning models typically require expensive on-chip training procedures and often lack efficient gradient estimation methods. By employing Pauli propagation, it is possible to derive a symbolic representation of observables as analytic functions of a circuit's parameters. The proposed approach is demonstrated on the Variational Quantum Eigensolver for obtaining the ground state of a spin model.
arXiv Detail & Related papers (2025-12-18T15:44:07Z) - Fermionic Born Machines: Classical training of quantum generative models based on Fermion Sampling [0.0]
We introduce Fermionic Born Machines as an example of classically trainable quantum generative models. The model employs parameterized magic states and fermionic linear optical (FLO) transformations with learnable parameters. The specific structure of the ansatz induces a loss landscape that exhibits favorable characteristics for optimization.
arXiv Detail & Related papers (2025-11-17T19:03:03Z) - A purely Quantum Generative Modeling through Unitary Scrambling and Collapse [6.647966634235082]
Quantum Scrambling and Collapse Generative Model (QGen) is a purely quantum paradigm that eliminates classical dependencies. We introduce a measurement-based training principle that decomposes learning into tractable subproblems, mitigating barren plateaus. Empirically, QGen outperforms classical and hybrid baselines under a matched parameter budget, while maintaining robustness under finite-shot sampling.
arXiv Detail & Related papers (2025-06-12T11:00:21Z) - VQC-MLPNet: An Unconventional Hybrid Quantum-Classical Architecture for Scalable and Robust Quantum Machine Learning [50.95799256262098]
Variational quantum circuits (VQCs) hold promise for quantum machine learning but face challenges in expressivity, trainability, and noise resilience. We propose VQC-MLPNet, a hybrid architecture where a VQC generates the first-layer weights of a classical multilayer perceptron during training, while inference is performed entirely classically.
arXiv Detail & Related papers (2025-06-12T01:38:15Z) - Flowing Through Hilbert Space: Quantum-Enhanced Generative Models for Lattice Field Theory [0.9208007322096533]
We develop a hybrid quantum-classical normalizing flow model to explore quantum-enhanced sampling in such regimes. Our approach embeds parameterized quantum circuits within a classical normalizing flow architecture, leveraging amplitude encoding and quantum entanglement to enhance expressivity in the generative process.
arXiv Detail & Related papers (2025-05-15T17:58:16Z) - Train on classical, deploy on quantum: scaling generative quantum machine learning to a thousand qubits [0.27309692684728604]
We show that instantaneous generative models based on quantum circuits can be trained efficiently on classical hardware. By combining our approach with a data-dependent parameter initialisation strategy, we do not encounter issues of barren plateaus. We find that the quantum models can successfully learn from high dimensional data, and perform surprisingly well compared to simple energy-based classical generative models.
arXiv Detail & Related papers (2025-03-04T19:00:02Z) - Quantum Latent Diffusion Models [65.16624577812436]
We propose a quantum diffusion model that leverages the established idea of classical latent diffusion models. This involves using a traditional autoencoder to reduce images, followed by operations with variational circuits in the latent space. The results demonstrate an advantage of the quantum version, as evidenced by better metrics for the generated images.
arXiv Detail & Related papers (2025-01-19T21:24:02Z) - Efficient Learning for Linear Properties of Bounded-Gate Quantum Circuits [62.46800898243033]
Recent progress in quantum learning theory prompts a question: can linear properties of a large-qubit circuit be efficiently learned from measurement data generated by varying classical inputs? We prove that sample complexity scaling linearly in $d$ is required to achieve a small prediction error, while the corresponding computational complexity may scale exponentially in $d$. We propose a kernel-based method leveraging classical shadows and truncated trigonometric expansions, enabling a controllable trade-off between prediction accuracy and computational overhead.
arXiv Detail & Related papers (2024-08-22T08:21:28Z) - Parameterized quantum circuits as universal generative models for continuous multivariate distributions [1.118478900782898]
Parameterized quantum circuits have been extensively used as the basis for machine learning models in regression, classification, and generative tasks.
In this work, we elucidate expectation value sampling-based models and prove the universality of such variational quantum algorithms.
Our results may help guide the design of future quantum circuits in generative modelling tasks.
arXiv Detail & Related papers (2024-02-15T10:08:31Z) - Importance sampling for stochastic quantum simulations [68.8204255655161]
We introduce the qDrift protocol, which builds random product formulas by sampling from the Hamiltonian according to the coefficients.
We show that the simulation cost can be reduced while achieving the same accuracy, by considering the individual simulation cost during the sampling stage.
Results are confirmed by numerical simulations performed on a lattice nuclear effective field theory.
arXiv Detail & Related papers (2022-12-12T15:06:32Z) - Generalization Metrics for Practical Quantum Advantage in Generative Models [68.8204255655161]
Generative modeling is a widely accepted natural use case for quantum computers.
We construct a simple and unambiguous approach to probe practical quantum advantage for generative modeling by measuring the algorithm's generalization performance.
Our simulation results show that our quantum-inspired models have up to a $68\times$ enhancement in generating unseen unique and valid samples.
arXiv Detail & Related papers (2022-01-21T16:35:35Z) - Quantum algorithms for quantum dynamics: A performance study on the spin-boson model [68.8204255655161]
Quantum algorithms for quantum dynamics simulations are traditionally based on implementing a Trotter approximation of the time-evolution operator.
Variational quantum algorithms have become an indispensable alternative, enabling small-scale simulations on present-day hardware.
We show that, despite providing a clear reduction of quantum gate cost, the variational method in its current implementation is unlikely to lead to a quantum advantage.
arXiv Detail & Related papers (2021-08-09T18:00:05Z) - Flexible Model Aggregation for Quantile Regression [92.63075261170302]
Quantile regression is a fundamental problem in statistical learning motivated by a need to quantify uncertainty in predictions.
We investigate methods for aggregating any number of conditional quantile models.
All of the models we consider in this paper can be fit using modern deep learning toolkits.
arXiv Detail & Related papers (2021-02-26T23:21:16Z) - Gradient $\ell_1$ Regularization for Quantization Robustness [70.39776106458858]
We derive a simple regularization scheme that improves robustness against post-training quantization.
By training quantization-ready networks, our approach enables storing a single set of weights that can be quantized on-demand to different bit-widths.
arXiv Detail & Related papers (2020-02-18T12:31:34Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.