Quantum Neural Networks in Practice: A Comparative Study with Classical Models from Standard Data Sets to Industrial Images
- URL: http://arxiv.org/abs/2411.19276v1
- Date: Thu, 28 Nov 2024 17:13:45 GMT
- Title: Quantum Neural Networks in Practice: A Comparative Study with Classical Models from Standard Data Sets to Industrial Images
- Authors: Daniel Basilewitsch, João F. Bravo, Christian Tutschku, Frederick Struckmeier,
- Abstract summary: In this study, we compare the performance of randomized classical and quantum neural networks for the task of binary image classification.
Our study provides an industry perspective on the prospects of quantum machine learning for practical image classification tasks.
- Score: 0.5892638927736115
- License:
- Abstract: Image classification tasks are among the most prominent examples that can be reliably solved by classical machine learning models. In this study, we compare the performance of randomized classical and quantum neural networks as well as classical and quantum-classical hybrid convolutional neural networks for the task of binary image classification. To this end, we employ various data sets of increasing complexity - (i) an artificial hypercube dataset, (ii) MNIST handwritten digits, and (iii) real-world industrial images from laser cutting machines. We analyze the performance of the employed quantum models with respect to correlations between classification accuracy and various hyperparameters. For the random quantum neural networks, we additionally compare their performance with some known literature models and how top-performing models from one data set perform on the others. In general, we observe fairly similar performances of classical and quantum or hybrid models. Our study provides an industry perspective on the prospects of quantum machine learning for practical image classification tasks.
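To make the setup concrete, the sketch below shows what a small variational quantum classifier for binary image classification could look like. It is an illustrative assumption only, written with PennyLane using a generic 4-qubit AngleEmbedding/StronglyEntanglingLayers ansatz and a square loss; it is not the randomized quantum networks or hybrid convolutional models actually benchmarked in the paper.

```python
# Illustrative sketch only (not the paper's models): a tiny variational
# quantum classifier for binary labels in {-1, +1}, assuming PennyLane.
# Qubit count, ansatz, loss and toy data are placeholder assumptions.
import numpy as np
import pennylane as qml
from pennylane import numpy as pnp

n_qubits = 4        # assumes each image is reduced to 4 features (e.g. via PCA)
n_layers = 2
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def circuit(weights, features):
    # Encode the reduced image features as single-qubit rotation angles.
    qml.AngleEmbedding(features, wires=range(n_qubits))
    # Trainable entangling layers; random initialization loosely mirrors
    # the "randomized" quantum neural networks discussed in the abstract.
    qml.StronglyEntanglingLayers(weights, wires=range(n_qubits))
    # One Pauli-Z expectation value acts as the binary classification score.
    return qml.expval(qml.PauliZ(0))

def cost(weights, X, y):
    # Mean squared error between the circuit output and the +/-1 labels.
    loss = 0.0
    for x, target in zip(X, y):
        loss = loss + (circuit(weights, x) - target) ** 2
    return loss / len(X)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.uniform(0.0, np.pi, size=(8, n_qubits))   # toy "image" features
    y = rng.choice([-1.0, 1.0], size=8)               # toy binary labels
    shape = qml.StronglyEntanglingLayers.shape(n_layers=n_layers, n_wires=n_qubits)
    weights = pnp.array(rng.uniform(0.0, 2 * np.pi, size=shape), requires_grad=True)

    opt = qml.GradientDescentOptimizer(stepsize=0.2)
    for _ in range(20):
        weights = opt.step(lambda w: cost(w, X, y), weights)
    preds = [np.sign(float(circuit(weights, x))) for x in X]
    print("training cost:", float(cost(weights, X, y)), "predictions:", preds)
```

A hybrid convolutional variant in the spirit of the paper would feed the circuit with features produced by a small classical CNN instead of the toy features used here.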
Related papers
- Quantum Latent Diffusion Models [65.16624577812436]
We propose a potential version of a quantum diffusion model that leverages the established idea of classical latent diffusion models.
This involves using a traditional autoencoder to reduce images, followed by operations with variational circuits in the latent space.
The results demonstrate an advantage of the quantum version, as evidenced by better metrics for the images it generates.
arXiv Detail & Related papers (2025-01-19T21:24:02Z)
- Photonic quantum generative adversarial networks for classical data [0.0]
In generative learning, models are trained to produce new samples that follow the distribution of the target data.
We present a quantum GAN based on linear optical circuits and Fock-space encoding.
We demonstrate that the model can learn to generate images by training the model end-to-end experimentally on a single-photon quantum processor.
arXiv Detail & Related papers (2024-05-09T18:00:10Z)
- CQural: A Novel CNN based Hybrid Architecture for Quantum Continual Machine Learning [0.0]
We show that it is possible to circumvent catastrophic forgetting in continual learning with novel hybrid classical-quantum neural networks.
We also claim that if the model is trained with these explanations, it tends to give better performance and learn specific features that are far from the decision boundary.
arXiv Detail & Related papers (2023-05-16T18:19:12Z)
- Quantum machine learning for image classification [39.58317527488534]
This research introduces two quantum machine learning models that leverage the principles of quantum mechanics for effective computations.
Our first model, a hybrid quantum neural network with parallel quantum circuits, enables the execution of computations even in the noisy intermediate-scale quantum era.
A second model introduces a hybrid quantum neural network with a Quanvolutional layer, reducing image resolution via a convolution process.
arXiv Detail & Related papers (2023-04-18T18:23:20Z)
- A Framework for Demonstrating Practical Quantum Advantage: Racing Quantum against Classical Generative Models [62.997667081978825]
We build on a proposed framework for evaluating the generalization performance of generative models.
We establish the first comparative race towards practical quantum advantage (PQA) between classical and quantum generative models.
Our results suggest that quantum circuit Born machines (QCBMs) are more efficient in the data-limited regime than the other state-of-the-art classical generative models.
arXiv Detail & Related papers (2023-03-27T22:48:28Z)
- A didactic approach to quantum machine learning with a single qubit [68.8204255655161]
We focus on the case of learning with a single qubit, using data re-uploading techniques.
We implement the different proposed formulations in toy and real-world datasets using the qiskit quantum computing SDK.
arXiv Detail & Related papers (2022-11-23T18:25:32Z)
- Comparing concepts of quantum and classical neural network models for image classification task [0.456877715768796]
This material includes the results of experiments on training and performance of a hybrid quantum-classical neural network.
Although its simulation is time-consuming, the quantum network outperforms the classical network.
arXiv Detail & Related papers (2021-08-19T18:49:30Z)
- Quantum Self-Supervised Learning [22.953284192004034]
We propose a hybrid quantum-classical neural network architecture for contrastive self-supervised learning.
We apply our best quantum model to classify unseen images on the ibmq_paris quantum computer.
arXiv Detail & Related papers (2021-03-26T18:00:00Z)
- Counterfactual Generative Networks [59.080843365828756]
We propose to decompose the image generation process into independent causal mechanisms that we train without direct supervision.
By exploiting appropriate inductive biases, these mechanisms disentangle object shape, object texture, and background.
We show that the counterfactual images can improve out-of-distribution robustness with a marginal drop in performance on the original classification task.
arXiv Detail & Related papers (2021-01-15T10:23:12Z)
- Generation of High-Resolution Handwritten Digits with an Ion-Trap Quantum Computer [55.41644538483948]
We implement a quantum-circuit based generative model to learn and sample the prior distribution of a Generative Adversarial Network.
We train this hybrid algorithm on an ion-trap device based on $^{171}$Yb$^+$ ion qubits to generate high-quality images.
arXiv Detail & Related papers (2020-12-07T18:51:28Z)
- Reservoir Memory Machines as Neural Computers [70.5993855765376]
Differentiable neural computers extend artificial neural networks with an explicit memory without interference.
We achieve some of the computational capabilities of differentiable neural computers with a model that can be trained very efficiently.
arXiv Detail & Related papers (2020-09-14T12:01:30Z)
This list is automatically generated from the titles and abstracts of the papers on this site.