Generative quantum machine learning via denoising diffusion
probabilistic models
- URL: http://arxiv.org/abs/2310.05866v4
- Date: Fri, 16 Feb 2024 16:39:10 GMT
- Title: Generative quantum machine learning via denoising diffusion
probabilistic models
- Authors: Bingzhi Zhang, Peng Xu, Xiaohui Chen and Quntao Zhuang
- Abstract summary: We propose the quantum denoising diffusion probabilistic model (QuDDPM) to enable efficiently trainable generative learning of quantum data.
We provide bounds on the learning error and demonstrate QuDDPM's capability in learning a correlated quantum noise model, quantum many-body phases, and the topological structure of quantum data.
- Score: 17.439525936236166
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Deep generative models are a key enabling technology for computer
vision, text generation, and large language models. Denoising diffusion
probabilistic models (DDPMs) have recently gained much attention due to their
ability to generate diverse and high-quality samples in many computer vision
tasks, as well as their flexible model architectures and relatively simple
training scheme. Quantum generative models, empowered by entanglement and
superposition, have brought new insights into learning classical and quantum
data. Inspired by the classical counterpart, we propose the quantum denoising
diffusion probabilistic model (QuDDPM) to enable efficiently trainable
generative learning of quantum data. QuDDPM adopts sufficient layers of
circuits to guarantee expressivity, while it introduces multiple intermediate
training tasks as interpolation between the target distribution and noise to
avoid barren plateaus and guarantee efficient training. We provide bounds on
the learning error and demonstrate QuDDPM's capability in learning a correlated
quantum noise model, quantum many-body phases, and the topological structure of
quantum data. The results provide a paradigm for versatile and efficient
quantum generative learning.
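The forward-scrambling / stepwise-denoising scheme described in the abstract can be illustrated with a small, classically simulated toy example. The sketch below (Python, using Qiskit's quantum_info module) is an assumption-laden illustration rather than the authors' implementation: the ansatz, the max-fidelity ensemble-distance surrogate, the hill-climbing optimizer, and all sizes and hyperparameters are placeholders standing in for the circuits, distribution distance, and training procedure used in the paper.

```python
# Toy, classically simulated sketch of the QuDDPM idea: forward diffusion
# scrambles a small ensemble of quantum states toward noise; one parameterized
# denoising circuit per step is then trained against the intermediate ensembles.
# Every concrete choice below (ansatz, loss surrogate, optimizer, sizes) is an
# illustrative assumption, not the paper's construction.
import numpy as np
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector, state_fidelity

rng = np.random.default_rng(0)
N, T, SAMPLES = 2, 4, 16          # qubits, diffusion steps, ensemble size

def target_sample() -> Statevector:
    """Toy 'quantum data': product states clustered around |00>."""
    qc = QuantumCircuit(N)
    for q in range(N):
        qc.ry(0.3 * rng.standard_normal(), q)
    return Statevector.from_instruction(qc)

def scramble(state: Statevector, eps: float) -> Statevector:
    """One forward step: a weak random rotation layer plus weak entanglers."""
    qc = QuantumCircuit(N)
    for q in range(N):
        qc.ry(eps * rng.standard_normal(), q)
        qc.rz(eps * rng.standard_normal(), q)
    for q in range(N - 1):
        qc.crx(eps * rng.standard_normal(), q, q + 1)
    return state.evolve(qc)

def ansatz(params: np.ndarray) -> QuantumCircuit:
    """Shallow hardware-efficient denoising circuit (one of many valid choices)."""
    qc = QuantumCircuit(N)
    k = 0
    for q in range(N):
        qc.ry(params[k], q); qc.rz(params[k + 1], q); k += 2
    for q in range(N - 1):
        qc.cx(q, q + 1)
    for q in range(N):
        qc.ry(params[k], q); k += 1
    return qc

def ensemble_distance(generated, reference) -> float:
    """Crude surrogate for a distribution distance: mean infidelity of each
    generated state to its closest reference state."""
    return float(np.mean([1 - max(state_fidelity(g, r) for r in reference)
                          for g in generated]))

# Forward process: build noisy ensembles E_0 (data), E_1, ..., E_T (scrambled).
ensembles = [[target_sample() for _ in range(SAMPLES)]]
for _ in range(T):
    ensembles.append([scramble(s, eps=0.6) for s in ensembles[-1]])

# Backward process: train one denoising circuit per step, noisiest step first.
current = ensembles[T]
for t in range(T, 0, -1):
    theta = 0.1 * rng.standard_normal(3 * N)
    best = ensemble_distance([s.evolve(ansatz(theta)) for s in current],
                             ensembles[t - 1])
    for _ in range(200):          # hill climbing instead of gradient descent
        trial = theta + 0.1 * rng.standard_normal(3 * N)
        loss = ensemble_distance([s.evolve(ansatz(trial)) for s in current],
                                 ensembles[t - 1])
        if loss < best:
            theta, best = trial, loss
    current = [s.evolve(ansatz(theta)) for s in current]
    print(f"denoising step {t}: surrogate distance to E_{t - 1} = {best:.3f}")
```

The key structural point from the abstract is preserved: each backward step is trained against the corresponding intermediate ensemble of the forward process, which is what keeps any single training task tractable and avoids barren plateaus; the sketch only mirrors that overall forward/backward structure.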
Related papers
- Mixed-State Quantum Denoising Diffusion Probabilistic Model [0.40964539027092906]
We propose a mixed-state quantum denoising diffusion probabilistic model (MSQuDDPM) to eliminate the need for scrambling unitaries.
MSQuDDPM integrates depolarizing noise channels in the forward diffusion process and parameterized quantum circuits with projective measurements in the backward denoising steps.
We evaluate MSQuDDPM on quantum ensemble generation tasks, demonstrating its successful performance.
arXiv Detail & Related papers (2024-11-26T17:20:58Z)
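The MSQuDDPM entry above replaces scrambling unitaries with depolarizing noise channels in the forward diffusion. For reference, a single depolarizing step is just the map ρ → (1 − p)ρ + p·I/d; the short sketch below illustrates that channel on an arbitrary example state (the state, noise schedule, and step count are assumptions for illustration, not the MSQuDDPM setup).

```python
# Generic illustration of a depolarizing step, the building block MSQuDDPM
# uses in its forward diffusion: rho -> (1 - p) * rho + p * I / d.
# The example state and noise schedule below are arbitrary assumptions.
import numpy as np
from qiskit.quantum_info import DensityMatrix, random_statevector

def depolarize(rho: DensityMatrix, p: float) -> DensityMatrix:
    """Apply a global depolarizing channel with strength p."""
    d = rho.dim
    mixed = np.eye(d) / d
    return DensityMatrix((1 - p) * rho.data + p * mixed)

# Forward diffusion as repeated depolarization: purity decays toward 1/d.
rho = DensityMatrix(random_statevector(4))      # a random 2-qubit pure state
for t, p in enumerate([0.1, 0.2, 0.3, 0.4], start=1):
    rho = depolarize(rho, p)
    print(f"step {t}: purity = {rho.purity().real:.3f}")
```

Because the depolarizing channel contracts every state toward the maximally mixed state, repeated application gives a natural mixed-state analogue of the Gaussian forward process in classical DDPMs.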
- Fourier Neural Operators for Learning Dynamics in Quantum Spin Systems [77.88054335119074]
We use FNOs to model the evolution of random quantum spin systems.
We apply FNOs to a compact set of Hamiltonian observables instead of the entire $2^n$-dimensional quantum wavefunction.
arXiv Detail & Related papers (2024-09-05T07:18:09Z)
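For readers unfamiliar with the architecture named in the entry above, the core of a Fourier Neural Operator is a spectral convolution: transform the input to Fourier space, keep a small number of low-frequency modes, multiply them by learned complex weights, and transform back. The NumPy sketch below shows that single operation on a 1-D signal; the signal, mode count, and weights are illustrative placeholders unrelated to the observables used in the paper.

```python
# Minimal 1-D spectral-convolution layer, the core operation inside a Fourier
# Neural Operator: FFT -> truncate to a few low-frequency modes -> multiply by
# learned complex weights -> inverse FFT. Shapes and weights are placeholders.
import numpy as np

def spectral_conv_1d(x: np.ndarray, weights: np.ndarray, n_modes: int) -> np.ndarray:
    """x: real signal of length L; weights: complex array of shape (n_modes,)."""
    x_hat = np.fft.rfft(x)                          # to Fourier space
    out_hat = np.zeros_like(x_hat)
    out_hat[:n_modes] = x_hat[:n_modes] * weights   # act on retained modes only
    return np.fft.irfft(out_hat, n=x.size)          # back to physical space

rng = np.random.default_rng(0)
L, n_modes = 64, 8
weights = rng.standard_normal(n_modes) + 1j * rng.standard_normal(n_modes)
signal = np.sin(np.linspace(0, 4 * np.pi, L))
print(spectral_conv_1d(signal, weights, n_modes).shape)   # (64,)
```

Because the learned weights act only on the retained Fourier modes, the layer is independent of the input discretization, which is part of what makes FNOs attractive as surrogate models for dynamics.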
- Efficient Learning for Linear Properties of Bounded-Gate Quantum Circuits [63.733312560668274]
Given a quantum circuit containing $d$ tunable RZ gates and $G-d$ Clifford gates, can a learner perform purely classical inference to efficiently predict its linear properties?
We prove that the sample complexity scaling linearly in d is necessary and sufficient to achieve a small prediction error, while the corresponding computational complexity may scale exponentially in d.
We devise a kernel-based learning model capable of trading off prediction error and computational complexity, transitioning from exponential to polynomial scaling in many practical settings.
arXiv Detail & Related papers (2024-08-22T08:21:28Z)
- The curse of random quantum data [62.24825255497622]
We quantify the performance of quantum machine learning in the landscape of quantum data.
We find that the training efficiency and generalization capabilities of quantum machine learning are exponentially suppressed as the number of qubits increases.
Our findings apply to both the quantum kernel method and the large-width limit of quantum neural networks.
arXiv Detail & Related papers (2024-08-19T12:18:07Z)
- Quantum-Noise-Driven Generative Diffusion Models [1.6385815610837167]
We propose three quantum-noise-driven generative diffusion models that could be experimentally tested on real quantum systems.
The idea is to harness unique quantum features, in particular the non-trivial interplay among coherence, entanglement and noise.
Our results are expected to pave the way for new quantum-inspired or quantum-based generative diffusion algorithms.
arXiv Detail & Related papers (2023-08-23T09:09:32Z)
- A didactic approach to quantum machine learning with a single qubit [68.8204255655161]
We focus on the case of learning with a single qubit, using data re-uploading techniques.
We implement the different proposed formulations in toy and real-world datasets using the qiskit quantum computing SDK.
arXiv Detail & Related papers (2022-11-23T18:25:32Z)
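The single-qubit data re-uploading approach described in the entry above can be sketched in a few lines of Qiskit: the input features are encoded repeatedly, interleaved with trainable rotations, and a prediction is read off the expectation value of Z. The layer count, encoding gates, and parameter values below are illustrative assumptions, and the training loop (as well as the paper's specific formulations) is omitted.

```python
# Sketch of single-qubit data re-uploading: re-encode the data point in every
# layer, interleave trainable rotations, and read out <Z>. Layer count,
# encoding choice, and parameters are illustrative assumptions.
import numpy as np
from qiskit import QuantumCircuit
from qiskit.quantum_info import Pauli, Statevector

def reuploading_circuit(x: np.ndarray, theta: np.ndarray, layers: int = 3) -> QuantumCircuit:
    qc = QuantumCircuit(1)
    for l in range(layers):
        qc.rz(float(x[0]), 0)          # data encoding, repeated in every layer
        qc.ry(float(x[1]), 0)
        qc.rz(theta[3 * l], 0)         # trainable rotations
        qc.ry(theta[3 * l + 1], 0)
        qc.rz(theta[3 * l + 2], 0)
    return qc

def predict(x: np.ndarray, theta: np.ndarray, layers: int = 3) -> float:
    """Sign of <Z> on the final state gives a binary class label."""
    state = Statevector.from_instruction(reuploading_circuit(x, theta, layers))
    return float(np.real(state.expectation_value(Pauli("Z"))))

rng = np.random.default_rng(1)
theta = rng.uniform(-np.pi, np.pi, size=3 * 3)   # 3 trainable angles per layer
print(predict(np.array([0.4, -1.2]), theta))
```

Repeating the encoding across layers is what gives the single qubit its expressive power in this approach.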
- Generative Quantum Machine Learning [0.0]
The aim of this thesis is to develop new generative quantum machine learning algorithms.
We introduce a quantum generative adversarial network and a quantum Boltzmann machine implementation, both of which can be realized with parameterized quantum circuits.
arXiv Detail & Related papers (2021-11-24T19:00:21Z)
- Enhancing Generative Models via Quantum Correlations [1.6099403809839032]
Generative modeling using samples drawn from a probability distribution constitutes a powerful approach to unsupervised machine learning.
We show theoretically that quantum correlations provide a powerful resource for generative modeling.
We numerically test this separation on standard machine learning data sets and show that it holds for practical problems.
arXiv Detail & Related papers (2021-01-20T22:57:22Z)
- Generation of High-Resolution Handwritten Digits with an Ion-Trap Quantum Computer [55.41644538483948]
We implement a quantum-circuit-based generative model to learn and sample the prior distribution of a Generative Adversarial Network.
We train this hybrid algorithm on an ion-trap device based on $^{171}$Yb$^{+}$ ion qubits to generate high-quality images.
arXiv Detail & Related papers (2020-12-07T18:51:28Z)
- Experimental Quantum Generative Adversarial Networks for Image Generation [93.06926114985761]
We experimentally achieve the learning and generation of real-world hand-written digit images on a superconducting quantum processor.
Our work provides guidance for developing advanced quantum generative models on near-term quantum devices.
arXiv Detail & Related papers (2020-10-13T06:57:17Z)
- Noise robustness and experimental demonstration of a quantum generative adversarial network for continuous distributions [0.5249805590164901]
We numerically simulate the noisy hybrid quantum generative adversarial networks (HQGANs) to learn continuous probability distributions.
We also investigate how different parameters affect the training time, with the aim of reducing the computational scaling of the algorithm.
Our results pave the way for experimental exploration of different quantum machine learning algorithms on noisy intermediate scale quantum devices.
arXiv Detail & Related papers (2020-06-02T23:14:45Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences of its use.