Predict better with less training data using a QNN
- URL: http://arxiv.org/abs/2206.03960v1
- Date: Wed, 8 Jun 2022 15:25:58 GMT
- Title: Predict better with less training data using a QNN
- Authors: Barry D. Reese and Marek Kowalik and Christian Metzl and Christian
Bauckhage and Eldar Sultanow
- Abstract summary: We describe a quanvolutional neural network (QNN) algorithm that efficiently maps classical image data to quantum states.
We empirically observe a genuine quantum advantage for an industrial application where the advantage is due to superior data encoding.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Over the past decade, machine learning revolutionized vision-based quality
assessment for which convolutional neural networks (CNNs) have now become the
standard. In this paper, we consider a potential next step in this development
and describe a quanvolutional neural network (QNN) algorithm that efficiently
maps classical image data to quantum states and allows for reliable image
analysis. We practically demonstrate how to leverage quantum devices in
computer vision and how to introduce quantum convolutions into classical CNNs.
Dealing with a real-world use case in industrial quality control, we implement
our hybrid QNN model within the PennyLane framework and empirically observe
that it achieves better predictions with far less training data than classical
CNNs. In other words, we empirically observe a genuine quantum advantage for an
industrial application where the advantage is due to superior data encoding.
Related papers
- CTRQNets & LQNets: Continuous Time Recurrent and Liquid Quantum Neural Networks [76.53016529061821]
This work develops the Liquid Quantum Neural Network (LQNet) and the Continuous Time Recurrent Quantum Neural Network (CTRQNet).
LQNet and CTRQNet achieve accuracy increases as high as 40% on CIFAR-10 binary classification.
arXiv Detail & Related papers (2024-08-28T00:56:03Z) - Impact of Data Augmentation on QCNNs [1.1510009152620664]
Quantum Convolutional Neural Networks (QCNNs) are proposed as a novel generalization of CNNs that incorporates quantum mechanisms.
This paper implements and compares both CNNs and QCNNs by testing losses and prediction accuracy on three commonly used datasets.
arXiv Detail & Related papers (2023-12-01T05:28:19Z) - A Quantum Convolutional Neural Network Approach for Object Detection and
Classification [0.0]
The time and accuracy of QCNNs are compared with classical CNNs and ANN models under different conditions.
The analysis shows that QCNNs have the potential to outperform both classical CNNs and ANN models in terms of accuracy and efficiency for certain applications.
arXiv Detail & Related papers (2023-07-17T02:38:04Z) - Traffic Sign Classification Using Deep and Quantum Neural Networks [0.0]
Quantum Neural Networks (QNNs) are an emerging technology that can be used in many applications including computer vision.
In this paper, we present a traffic sign classification system implemented using a hybrid quantum-classical convolutional neural network.
arXiv Detail & Related papers (2022-09-30T06:16:03Z) - On Circuit-based Hybrid Quantum Neural Networks for Remote Sensing
Imagery Classification [88.31717434938338]
The hybrid QCNNs enrich the classical architecture of CNNs by introducing a quantum layer within a standard neural network.
The novel QCNN proposed in this work is applied to the Land Use and Land Cover (LULC) classification, chosen as an Earth Observation (EO) use case.
The results of the multiclass classification demonstrate the effectiveness of the presented approach, with the QCNN outperforming its classical counterparts.
arXiv Detail & Related papers (2021-09-20T12:41:50Z) - A quantum algorithm for training wide and deep classical neural networks [72.2614468437919]
We show that conditions amenable to classical trainability via gradient descent coincide with those necessary for efficiently solving quantum linear systems.
We numerically demonstrate that the MNIST image dataset satisfies such conditions.
We provide empirical evidence for $O(\log n)$ training of a convolutional neural network with pooling.
arXiv Detail & Related papers (2021-07-19T23:41:03Z) - A Quantum Convolutional Neural Network for Image Classification [7.745213180689952]
We propose a novel neural network model named Quantum Convolutional Neural Network (QCNN)
QCNN is based on implementable quantum circuits and has a similar structure as classical convolutional neural networks.
Numerical simulation results on the MNIST dataset demonstrate the effectiveness of our model.
arXiv Detail & Related papers (2021-07-08T06:47:34Z) - The dilemma of quantum neural networks [63.82713636522488]
We show that quantum neural networks (QNNs) fail to provide any benefit over classical learning models.
QNNs suffer from the severely limited effective model capacity, which incurs poor generalization on real-world datasets.
These results force us to rethink the role of current QNNs and to design novel protocols for solving real-world problems with quantum advantages.
arXiv Detail & Related papers (2021-06-09T10:41:47Z) - Quantum Optical Convolutional Neural Network: A Novel Image Recognition
Framework for Quantum Computing [0.0]
We report a novel quantum computing based deep learning model, the Quantum Optical Convolutional Neural Network (QOCNN)
We benchmarked this new architecture against a traditional CNN based on the seminal LeNet model.
We conclude that switching to a quantum computing based approach to deep learning may result in comparable accuracies to classical models.
arXiv Detail & Related papers (2020-12-19T23:10:04Z) - Toward Trainability of Quantum Neural Networks [87.04438831673063]
Quantum Neural Networks (QNNs) have been proposed as generalizations of classical neural networks to achieve the quantum speed-up.
Serious bottlenecks exist for training QNNs because gradients vanish at a rate exponential in the number of input qubits.
We study QNNs with tree-tensor and step-controlled structures for binary classification. Simulations show faster convergence and better accuracy than QNNs with random structures.
arXiv Detail & Related papers (2020-11-12T08:32:04Z) - Decentralizing Feature Extraction with Quantum Convolutional Neural
Network for Automatic Speech Recognition [101.69873988328808]
We build upon a quantum convolutional neural network (QCNN) composed of a quantum circuit encoder for feature extraction.
An input speech signal is first up-streamed to a quantum computing server, which extracts its Mel-spectrogram.
The corresponding convolutional features are encoded using a quantum circuit algorithm with random parameters.
The encoded features are then down-streamed to the local RNN model for the final recognition.
arXiv Detail & Related papers (2020-10-26T03:36:01Z)
This list is automatically generated from the titles and abstracts of the papers in this site.