Fermionic Born Machines: Classical training of quantum generative models based on Fermion Sampling
- URL: http://arxiv.org/abs/2511.13844v1
- Date: Mon, 17 Nov 2025 19:03:03 GMT
- Title: Fermionic Born Machines: Classical training of quantum generative models based on Fermion Sampling
- Authors: Bence Bakó, Zoltán Kolarovszki, Zoltán Zimborás
- Abstract summary: We introduce Fermionic Born Machines as an example of classically trainable quantum generative models. The model employs parameterized magic states and fermionic linear optical (FLO) transformations with learnable parameters. The specific structure of the ansatz induces a loss landscape that exhibits favorable characteristics for optimization.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Quantum generative learning is a promising application of quantum computers, but faces several trainability challenges, including the difficulty in experimental gradient estimations. For certain structured quantum generative models, however, expectation values of local observables can be efficiently computed on a classical computer, enabling fully classical training without quantum gradient evaluations. Although training is classically efficient, sampling from these circuits is still believed to be classically hard, so inference must be carried out on a quantum device, potentially yielding a computational advantage. In this work, we introduce Fermionic Born Machines as an example of such classically trainable quantum generative models. The model employs parameterized magic states and fermionic linear optical (FLO) transformations with learnable parameters. The training exploits a decomposition of the magic states into Gaussian operators, which permits efficient estimation of expectation values. Furthermore, the specific structure of the ansatz induces a loss landscape that exhibits favorable characteristics for optimization. The FLO circuits can be implemented, via fermion-to-qubit mappings, on qubit architectures to sample from the learned distribution during inference. Numerical experiments on systems up to 160 qubits demonstrate the effectiveness of our model and training framework.
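The classical trainability described above rests on a standard property of FLO (matchgate) circuits: they evolve fermionic Gaussian states by rotating a Majorana covariance matrix, so local expectation values reduce to small matrix algebra rather than state-vector simulation. The following is a minimal illustrative sketch of that mechanism only, not the paper's actual training algorithm (which additionally decomposes magic states into Gaussian operators); all function names are invented for this example.

```python
import numpy as np

def covariance(occupations):
    """Majorana covariance matrix M_jk = (i/2)<[c_j, c_k]> of a Fock state.

    Illustrative toy: occupied modes carry a +1 block, empty modes a -1 block.
    """
    n = len(occupations)
    M = np.zeros((2 * n, 2 * n))
    for j, occ in enumerate(occupations):
        s = 1.0 if occ else -1.0
        M[2 * j, 2 * j + 1] = s
        M[2 * j + 1, 2 * j] = -s
    return M

def flo_rotation(n, j, k, theta):
    """SO(2n) rotation mixing modes j and k (a beamsplitter-like FLO gate)."""
    O = np.eye(2 * n)
    for a, b in [(2 * j, 2 * k), (2 * j + 1, 2 * k + 1)]:
        O[a, a] = O[b, b] = np.cos(theta)
        O[a, b] = np.sin(theta)
        O[b, a] = -np.sin(theta)
    return O

def mode_occupations(M):
    """Read <n_j> = (1 + M_{2j,2j+1}) / 2 off the covariance matrix."""
    n = M.shape[0] // 2
    return np.array([(1 + M[2 * j, 2 * j + 1]) / 2 for j in range(n)])

# One fermion in mode 0 of two modes; mix the modes by angle theta.
M = covariance([1, 0])
theta = np.pi / 3
O = flo_rotation(2, 0, 1, theta)
M = O @ M @ O.T                          # FLO circuits act as M -> O M O^T
print(mode_occupations(M))               # [cos^2(theta), sin^2(theta)] = [0.25, 0.75]
```

The key point, which the sketch makes concrete, is that this classical bookkeeping scales polynomially in the number of modes, whereas sampling full measurement outcomes from such circuits (Fermion Sampling, once magic states are added) is believed to remain classically hard.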
Related papers
- Symbolic Pauli Propagation for Gradient-Enabled Pre-Training of Quantum Circuits [0.0]
Quantum Machine Learning models typically require expensive on-chip training procedures and often lack efficient gradient estimation methods. By employing Pauli propagation, it is possible to derive a symbolic representation of observables as analytic functions of a circuit's parameters. The proposed approach is demonstrated on the Variational Quantum Eigensolver for obtaining the ground state of a spin model.
arXiv Detail & Related papers (2025-12-18T15:44:07Z) - Sample-based training of quantum generative models [1.3521721488318912]
We introduce a training framework that extends the principle of contrastive divergence to quantum models. By deriving the circuit structure and providing a general recipe for constructing it, we obtain quantum circuits that generate the samples required for parameter updates.
arXiv Detail & Related papers (2025-11-14T19:00:02Z) - Experimental demonstration of boson sampling as a hardware accelerator for Monte Carlo integration [0.0]
We present an experimental demonstration of boson sampling as a hardware accelerator for Monte Carlo integration. We implement a proof-of-principle experiment on a programmable photonic platform to compute the first-order energy correction of a three-boson system.
arXiv Detail & Related papers (2025-09-29T18:59:34Z) - Flowing Through Hilbert Space: Quantum-Enhanced Generative Models for Lattice Field Theory [0.9208007322096533]
We develop a hybrid quantum-classical normalizing flow model to explore quantum-enhanced sampling in such regimes. Our approach embeds parameterized quantum circuits within a classical normalizing flow architecture, leveraging amplitude encoding and quantum entanglement to enhance expressivity in the generative process.
arXiv Detail & Related papers (2025-05-15T17:58:16Z) - Train on classical, deploy on quantum: scaling generative quantum machine learning to a thousand qubits [0.27309692684728604]
We show that instantaneous generative models based on quantum circuits can be trained efficiently on classical hardware. By combining our approach with a data-dependent parameter initialisation strategy, we do not encounter issues of barren plateaus. We find that the quantum models can successfully learn from high dimensional data, and perform surprisingly well compared to simple energy-based classical generative models.
arXiv Detail & Related papers (2025-03-04T19:00:02Z) - Quantum Latent Diffusion Models [65.16624577812436]
We propose a potential version of a quantum diffusion model that leverages the established idea of classical latent diffusion models. This involves using a traditional autoencoder to reduce images, followed by operations with variational circuits in the latent space. The results demonstrate an advantage in using a quantum version, as evidenced by obtaining better metrics for the images generated by the quantum version.
arXiv Detail & Related papers (2025-01-19T21:24:02Z) - Efficient quantum-enhanced classical simulation for patches of quantum landscapes [0.0]
We show that it is always possible to generate a classical surrogate of a sub-region of an expectation landscape produced by a parameterized quantum circuit. We provide a quantum-enhanced classical algorithm which, after simple measurements on a quantum device, allows one to classically simulate approximate expectation values of a subregion of a landscape.
arXiv Detail & Related papers (2024-11-29T18:00:07Z) - Fourier Neural Operators for Learning Dynamics in Quantum Spin Systems [77.88054335119074]
We use FNOs to model the evolution of random quantum spin systems.
We apply FNOs to a compact set of Hamiltonian observables instead of the entire $2^n$-dimensional quantum wavefunction.
arXiv Detail & Related papers (2024-09-05T07:18:09Z) - Efficient Learning for Linear Properties of Bounded-Gate Quantum Circuits [62.46800898243033]
Recent progress in quantum learning theory prompts a question: can linear properties of a large-qubit circuit be efficiently learned from measurement data generated by varying classical inputs? We prove that the sample complexity scaling linearly in $d$ is required to achieve a small prediction error, while the corresponding computational complexity may scale exponentially in $d$. We propose a kernel-based method leveraging classical shadows and truncated trigonometric expansions, enabling a controllable trade-off between prediction accuracy and computational overhead.
arXiv Detail & Related papers (2024-08-22T08:21:28Z) - Quantum data learning for quantum simulations in high-energy physics [55.41644538483948]
We explore the applicability of quantum-data learning to practical problems in high-energy physics.
We make use of ansatz based on quantum convolutional neural networks and numerically show that it is capable of recognizing quantum phases of ground states.
The observation of non-trivial learning properties demonstrated in these benchmarks will motivate further exploration of the quantum-data learning architecture in high-energy physics.
arXiv Detail & Related papers (2023-06-29T18:00:01Z) - Pre-training Tensor-Train Networks Facilitates Machine Learning with Variational Quantum Circuits [70.97518416003358]
Variational quantum circuits (VQCs) hold promise for quantum machine learning on noisy intermediate-scale quantum (NISQ) devices.
While tensor-train networks (TTNs) can enhance VQC representation and generalization, the resulting hybrid model, TTN-VQC, faces optimization challenges due to the Polyak-Lojasiewicz (PL) condition.
To mitigate this challenge, we introduce Pre+TTN-VQC, a pre-trained TTN model combined with a VQC.
arXiv Detail & Related papers (2023-05-18T03:08:18Z) - Efficient estimation of trainability for variational quantum circuits [43.028111013960206]
We find an efficient method to compute the cost function and its variance for a wide class of variational quantum circuits.
This method can be used to certify trainability for variational quantum circuits and explore design strategies that can overcome the barren plateau problem.
arXiv Detail & Related papers (2023-02-09T14:05:18Z) - Generalization Metrics for Practical Quantum Advantage in Generative Models [68.8204255655161]
Generative modeling is widely regarded as a natural use case for quantum computers.
We construct a simple and unambiguous approach to probe practical quantum advantage for generative modeling by measuring the algorithm's generalization performance.
Our simulation results show that our quantum-inspired models have up to a $68\times$ enhancement in generating unseen unique and valid samples.
arXiv Detail & Related papers (2022-01-21T16:35:35Z) - Quantum-tailored machine-learning characterization of a superconducting qubit [50.591267188664666]
We develop an approach to characterize the dynamics of a quantum device and learn device parameters.
This approach outperforms physics-agnostic recurrent neural networks trained on numerically generated and experimental data.
This demonstration shows how leveraging domain knowledge improves the accuracy and efficiency of this characterization task.
arXiv Detail & Related papers (2021-06-24T15:58:57Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.