Towards Efficient Quantum Hybrid Diffusion Models
- URL: http://arxiv.org/abs/2402.16147v1
- Date: Sun, 25 Feb 2024 16:57:51 GMT
- Title: Towards Efficient Quantum Hybrid Diffusion Models
- Authors: Francesca De Falco, Andrea Ceschini, Alessandro Sebastianelli,
Bertrand Le Saux, Massimo Panella
- Abstract summary: We propose a new methodology to design quantum hybrid diffusion models.
We propose two possible hybridization schemes combining quantum computing's superior generalization with classical networks' modularity.
- Score: 68.43405413443175
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In this paper, we propose a new methodology to design quantum hybrid
diffusion models, derived from classical U-Nets with ResNet and Attention
layers. Specifically, we propose two possible different hybridization schemes
layers. Specifically, we propose two different hybridization schemes
combining quantum computing's superior generalization with classical networks'
modularity. In the first one, we act at the vertex: ResNet convolutional
layers are gradually replaced with variational circuits to create Quantum
ResNet blocks. In the second proposed architecture, we extend the hybridization
to the intermediate level of the encoder, due to its higher sensitivity in the
feature extraction process. In order to conduct an in-depth analysis of the
potential advantages stemming from the integration of quantum layers, images
generated by quantum hybrid diffusion models are compared to those generated by
classical models, and evaluated in terms of several quantitative metrics. The
results demonstrate an advantage in using hybrid quantum diffusion models, as
they generally synthesize better-quality images and converge faster. Moreover,
they show the additional advantage of having fewer parameters to
train compared to the classical models, with a reduction that depends on the
extent to which the vertex is hybridized.
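To make the first hybridization scheme concrete, the snippet below is a minimal, hypothetical sketch (using PennyLane with PyTorch) of a "Quantum ResNet" block in which a variational quantum circuit stands in for one convolutional branch of a residual block. The qubit count, circuit depth, and per-pixel channel encoding are illustrative assumptions, not the authors' exact design.

```python
# Hypothetical sketch: a variational quantum circuit replaces the second branch
# of a classical residual block. Sizes and encodings are illustrative assumptions.
import torch
import torch.nn as nn
import pennylane as qml

n_qubits = 4  # assumed: one qubit per feature channel
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev, interface="torch")
def variational_circuit(inputs, weights):
    # Encode classical features as rotation angles, then apply trainable entangling layers.
    qml.AngleEmbedding(inputs, wires=range(n_qubits))
    qml.StronglyEntanglingLayers(weights, wires=range(n_qubits))
    return [qml.expval(qml.PauliZ(w)) for w in range(n_qubits)]

weight_shapes = {"weights": (2, n_qubits, 3)}  # 2 variational layers

class QuantumResNetBlock(nn.Module):
    """Residual block whose second branch is a variational quantum layer acting
    on the channel vector of each spatial position (assumed hybridization)."""

    def __init__(self, channels: int = n_qubits):
        super().__init__()
        self.conv = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.norm = nn.GroupNorm(1, channels)
        self.qlayer = qml.qnn.TorchLayer(variational_circuit, weight_shapes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = torch.relu(self.norm(self.conv(x)))
        b, c, height, width = h.shape
        # Treat each pixel's channel vector as one input to the quantum circuit.
        h = h.permute(0, 2, 3, 1).reshape(-1, c)
        h = self.qlayer(h)
        h = h.reshape(b, height, width, c).permute(0, 3, 1, 2)
        return x + h  # residual (skip) connection

# Usage: a tiny feature map; channel count must match the qubit count in this sketch.
block = QuantumResNetBlock()
out = block(torch.randn(1, n_qubits, 8, 8))
print(out.shape)  # torch.Size([1, 4, 8, 8])
```

In the second scheme described in the abstract, the same kind of variational layer would additionally be introduced at the intermediate (encoder) level of the U-Net; the sketch above covers only the vertex-level substitution.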
Related papers
- Quantum Transfer Learning for MNIST Classification Using a Hybrid Quantum-Classical Approach [0.0]
This research explores the integration of quantum computing with classical machine learning for image classification tasks.
We propose a hybrid quantum-classical approach that leverages the strengths of both paradigms.
The experimental results indicate that while the hybrid model demonstrates the feasibility of integrating quantum computing with classical techniques, the accuracy of the final model, trained on quantum outcomes, is currently lower than the classical model trained on compressed features.
arXiv Detail & Related papers (2024-08-05T22:16:27Z)
- Hybrid Quantum-Classical Normalizing Flow [5.85475369017678]
We propose a hybrid quantum-classical normalizing flow (HQCNF) model based on parameterized quantum circuits.
We test our model on the image generation problem.
Compared with other quantum generative models, such as quantum generative adversarial networks (QGAN), our model achieves a lower (better) Fréchet distance (FID) score.
arXiv Detail & Related papers (2024-05-22T16:37:22Z)
- A Comparative Analysis of Hybrid-Quantum Classical Neural Networks [5.247197295547863]
This paper performs an extensive comparative analysis between different hybrid quantum-classical machine learning algorithms for image classification.
The performance comparison of the hybrid models, based on the accuracy, provides us with an understanding of hybrid quantum-classical convergence in correlation with the quantum layer count and the qubit count variations in the circuit.
arXiv Detail & Related papers (2024-02-16T09:59:44Z)
- A Framework for Demonstrating Practical Quantum Advantage: Racing Quantum against Classical Generative Models [62.997667081978825]
We build on a previously proposed framework for evaluating the generalization performance of generative models.
We establish the first comparative race towards practical quantum advantage (PQA) between classical and quantum generative models.
Our results suggest that QCBMs are more efficient in the data-limited regime than the other state-of-the-art classical generative models.
arXiv Detail & Related papers (2023-03-27T22:48:28Z)
- The Quantum Path Kernel: a Generalized Quantum Neural Tangent Kernel for Deep Quantum Machine Learning [52.77024349608834]
Building a quantum analog of classical deep neural networks represents a fundamental challenge in quantum computing.
A key issue is how to address the inherent non-linearity of classical deep learning.
We introduce the Quantum Path Kernel, a formulation of quantum machine learning capable of replicating key aspects of deep machine learning.
arXiv Detail & Related papers (2022-12-22T16:06:24Z)
- Hybrid Quantum-Classical Generative Adversarial Network for High Resolution Image Generation [14.098992977726942]
Quantum machine learning (QML) has received increasing attention due to its potential to outperform classical machine learning methods in various problems.
A subclass of QML methods is quantum generative adversarial networks (QGANs) which have been studied as a quantum counterpart of classical GANs.
Here we integrate classical and quantum techniques to propose a new hybrid quantum-classical GAN framework.
arXiv Detail & Related papers (2022-12-22T11:18:35Z)
- Photonic Quantum Computing For Polymer Classification [62.997667081978825]
Two polymer classes, visual (VIS) and near-infrared (NIR), are defined based on the size of the polymer gaps.
We present a hybrid classical-quantum approach to the binary classification of polymer structures.
arXiv Detail & Related papers (2022-11-22T11:59:52Z)
- Training Hybrid Classical-Quantum Classifiers via Stochastic Variational Optimization [32.562122826341266]
Quantum machine learning has emerged as a potential practical application of near-term quantum devices.
In this work, we study a two-layer hybrid classical-quantum classifier in which a first layer of quantum neurons implementing generalized linear models (QGLMs) is followed by a second classical combining layer.
Experiments show the advantages of the approach for a variety of activation functions implemented by QGLM neurons.
arXiv Detail & Related papers (2022-01-21T10:30:24Z)
- Variational Quantum Optimization with Multi-Basis Encodings [62.72309460291971]
We introduce a new variational quantum algorithm that benefits from two innovations: multi-basis graph complexity and nonlinear activation functions.
Our approach results in increased optimization performance, a two-fold increase in effective landscapes, and a reduction in measurement requirements.
arXiv Detail & Related papers (2021-06-24T20:16:02Z)
- Optimal Gradient Quantization Condition for Communication-Efficient Distributed Training [99.42912552638168]
Communication of gradients is costly for training deep neural networks with multiple devices in computer vision applications.
In this work, we deduce the optimal condition for both binary and multi-level gradient quantization for any gradient distribution.
Based on the optimal condition, we develop two novel quantization schemes: biased BinGrad and unbiased ORQ for binary and multi-level gradient quantization, respectively (a generic sketch of stochastic binary quantization follows after this list).
arXiv Detail & Related papers (2020-02-25T18:28:39Z)
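To make the idea of communication-efficient gradient quantization concrete, the following is a minimal, generic sketch of unbiased stochastic binary quantization. It is not the paper's BinGrad or ORQ scheme; the per-tensor scale and the probability rule are standard illustrative choices.

```python
# Generic, hypothetical sketch of unbiased stochastic binary gradient quantization
# for communication-efficient distributed training. NOT the paper's BinGrad/ORQ.
import torch

def binary_quantize(grad: torch.Tensor) -> torch.Tensor:
    """Map each entry to {-s, +s} with probabilities chosen so that
    E[quantized] == grad (unbiased), where s = max |grad|."""
    s = grad.abs().max()
    if s == 0:
        return torch.zeros_like(grad)
    # P(+s) = (1 + g/s) / 2 ensures E[q] = s * (2 * P(+s) - 1) = g.
    prob_plus = (1 + grad / s) / 2
    signs = torch.where(torch.rand_like(grad) < prob_plus,
                        torch.ones_like(grad), -torch.ones_like(grad))
    # Only the scalar s and one bit per entry need to be communicated.
    return s * signs

# Usage: average many quantized draws of a fake gradient to see the unbiasedness.
g = torch.randn(100_000)
q = torch.stack([binary_quantize(g) for _ in range(200)]).mean(dim=0)
print((q - g).abs().mean())  # shrinks toward 0 as more draws are averaged
```

The design choice here trades precision for bandwidth: each worker transmits one bit per gradient entry plus a single scale, and the injected quantization noise averages out across workers because the estimator is unbiased.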
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.