Scalable Quantum Convolutional Neural Networks
- URL: http://arxiv.org/abs/2209.12372v1
- Date: Mon, 26 Sep 2022 02:07:00 GMT
- Title: Scalable Quantum Convolutional Neural Networks
- Authors: Hankyul Baek, Won Joon Yun, Joongheon Kim
- Abstract summary: We propose a new version of the quantum convolutional neural network (QCNN) named the scalable quantum convolutional neural network (sQCNN).
In addition, using the fidelity of QC, we propose an sQCNN training algorithm named reverse fidelity training (RF-Train) that maximizes the performance of sQCNN.
- Score: 12.261689483681145
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: With the beginning of the noisy intermediate-scale quantum (NISQ) era,
quantum neural network (QNN) has recently emerged as a solution for the
problems that classical neural networks cannot solve. Moreover, the quantum
convolutional neural network (QCNN) is attracting attention as the next
generation of QNN because it can process high-dimensional vector inputs.
However, due to the nature of quantum computing, it is difficult for a
conventional QCNN to extract a sufficient number of features. Motivated by
this, we propose a new version of QCNN, named the scalable
quantum convolutional neural network (sQCNN). In addition, using the fidelity
of QC, we propose an sQCNN training algorithm named reverse fidelity training
(RF-Train) that maximizes the performance of sQCNN.
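RF-Train is described only at a high level in the abstract. As a rough, hypothetical illustration, the sketch below assumes that "reverse fidelity" means penalizing the pairwise fidelity (overlap) between the output states of different quantum filters, so that filters are pushed toward extracting diverse features; the state vectors, filter count, and loss-term shape are all illustrative stand-ins, not the paper's actual construction.

```python
import numpy as np

def state_fidelity(psi, phi):
    """Fidelity |<psi|phi>|^2 between two normalized pure-state vectors."""
    return abs(np.vdot(psi, phi)) ** 2

def reverse_fidelity_penalty(filter_states):
    """Average pairwise fidelity across quantum-filter output states.
    Adding this term to a training loss (to be minimized) would push
    filters toward mutually distinguishable feature states."""
    n = len(filter_states)
    total, pairs = 0.0, 0
    for i in range(n):
        for j in range(i + 1, n):
            total += state_fidelity(filter_states[i], filter_states[j])
            pairs += 1
    return total / pairs if pairs else 0.0

# Toy example: two 2-qubit pure states.
psi = np.array([1, 0, 0, 0], dtype=complex)                # |00>
phi = np.array([1, 1, 0, 0], dtype=complex) / np.sqrt(2)   # (|00>+|01>)/sqrt(2)
print(state_fidelity(psi, phi))  # ~0.5
```

Identical filter states give a penalty of 1.0; orthogonal states give 0, which is the direction such a regularizer would favor.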
Related papers
- CTRQNets & LQNets: Continuous Time Recurrent and Liquid Quantum Neural Networks [76.53016529061821]
Liquid Quantum Neural Network (LQNet) and Continuous Time Recurrent Quantum Neural Network (CTRQNet) are developed.
LQNet and CTRQNet achieve accuracy increases of up to 40% on CIFAR-10 binary classification.
arXiv Detail & Related papers (2024-08-28T00:56:03Z)
- A Quantum Convolutional Neural Network Approach for Object Detection and Classification [0.0]
The time and accuracy of QCNNs are compared with classical CNNs and ANN models under different conditions.
The analysis shows that QCNNs have the potential to outperform both classical CNNs and ANN models in terms of accuracy and efficiency for certain applications.
arXiv Detail & Related papers (2023-07-17T02:38:04Z)
- Variational Quantum Neural Networks (VQNNS) in Image Classification [0.0]
This paper investigates how quantum neural networks (QNNs) can be trained using quantum optimization algorithms.
A QNN structure is constructed in which a variational parameterized circuit is incorporated as the input layer, named Variational Quantum Neural Networks (VQNNs).
VQNNs are evaluated on the MNIST digit recognition (less complex) and crack image classification datasets, converging in less time than a plain QNN while maintaining decent training accuracy.
arXiv Detail & Related papers (2023-03-10T11:24:32Z)
- Quantum Recurrent Neural Networks for Sequential Learning [11.133759363113867]
We propose a new kind of quantum recurrent neural network (QRNN) to find quantum advantageous applications in the near term.
Our QRNN is built by stacking the QRBs in a staggered way, which greatly reduces the algorithm's requirements on the coherence time of quantum devices.
The numerical experiments show that our QRNN achieves much better prediction (classification) accuracy than classical RNNs and state-of-the-art QNN models for sequential learning.
arXiv Detail & Related papers (2023-02-07T04:04:39Z)
- 3D Scalable Quantum Convolutional Neural Networks for Point Cloud Data Processing in Classification Applications [10.90994913062223]
A novel 3D scalable quantum convolutional neural network (sQCNN-3D) is proposed for point cloud data processing in classification applications.
arXiv Detail & Related papers (2022-10-18T10:14:03Z)
- QTN-VQC: An End-to-End Learning framework for Quantum Neural Networks [71.14713348443465]
We introduce a trainable quantum tensor network (QTN) for quantum embedding on a variational quantum circuit (VQC).
QTN enables an end-to-end parametric model pipeline, namely QTN-VQC, from the generation of quantum embedding to the output measurement.
Our experiments on the MNIST dataset demonstrate the advantages of QTN for quantum embedding over other quantum embedding approaches.
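The summary above describes an end-to-end pipeline from quantum embedding to output measurement. As a generic, simplified stand-in (not the authors' QTN construction), the sketch below angle-encodes classical features into a product state via single-qubit RY rotations and reads out per-qubit Pauli-Z expectations, simulated with plain NumPy:

```python
import numpy as np

def ry(theta):
    """Single-qubit RY rotation matrix."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def angle_embed(features):
    """Encode each classical feature as an RY angle on its own qubit,
    returning the full product statevector (2**n amplitudes)."""
    state = np.array([1.0])
    for x in features:
        state = np.kron(state, ry(x) @ np.array([1.0, 0.0]))
    return state

def expectation_z(state, n_qubits):
    """Pauli-Z expectation value on each qubit of an n-qubit statevector."""
    probs = (np.abs(state) ** 2).reshape([2] * n_qubits)
    return np.array([1.0 - 2.0 * probs.take(1, axis=q).sum()
                     for q in range(n_qubits)])

feats = np.array([0.3, 1.2])
state = angle_embed(feats)
# For RY(theta)|0>, the Z expectation is cos(theta).
print(np.allclose(expectation_z(state, 2), np.cos(feats)))  # True
```

In an actual VQC pipeline, trainable parameterized layers and entangling gates would sit between the embedding and the measurement; this sketch shows only the embedding/readout endpoints.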
arXiv Detail & Related papers (2021-10-06T14:44:51Z)
- The dilemma of quantum neural networks [63.82713636522488]
We show that quantum neural networks (QNNs) fail to provide any benefit over classical learning models.
QNNs suffer from severely limited effective model capacity, which leads to poor generalization on real-world datasets.
These results force us to rethink the role of current QNNs and to design novel protocols for solving real-world problems with quantum advantages.
arXiv Detail & Related papers (2021-06-09T10:41:47Z)
- Toward Trainability of Quantum Neural Networks [87.04438831673063]
Quantum Neural Networks (QNNs) have been proposed as generalizations of classical neural networks to achieve the quantum speed-up.
Serious bottlenecks exist for training QNNs because gradients vanish at a rate exponential in the number of input qubits.
We study QNNs with tree-tensor and step-controlled structures for binary classification. Simulations show faster convergence rates and better accuracy than QNNs with random structures.
arXiv Detail & Related papers (2020-11-12T08:32:04Z)
- Decentralizing Feature Extraction with Quantum Convolutional Neural Network for Automatic Speech Recognition [101.69873988328808]
We build upon a quantum convolutional neural network (QCNN) composed of a quantum circuit encoder for feature extraction.
The input speech is first up-streamed to a quantum computing server, where its Mel-spectrogram is extracted.
The corresponding convolutional features are encoded using a quantum circuit algorithm with random parameters.
The encoded features are then down-streamed to the local RNN model for the final recognition.
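As a loose classical illustration of the encoding step above (not the paper's actual circuit), the sketch below treats the server-side quantum encoder with random parameters as a fixed Haar-random unitary applied to each Mel-feature frame; the frame dimension, the unitary stand-in, and all names are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_unitary(dim, rng):
    """Sample a Haar-random unitary via QR decomposition of a complex Gaussian."""
    z = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
    q, r = np.linalg.qr(z)
    d = np.diagonal(r)
    return q * (d / np.abs(d))  # fix column phases so the distribution is Haar

def quantum_encode(frames, unitary):
    """Stand-in for the server-side encoder: apply one fixed random
    unitary to every feature frame (frames: T x dim array)."""
    return frames @ unitary.T

# Toy Mel-like features: 10 frames of dimension 8.
frames = rng.normal(size=(10, 8)).astype(complex)
U = random_unitary(8, rng)
encoded = quantum_encode(frames, U)
# Unitary encoding preserves per-frame norms.
print(np.allclose(np.linalg.norm(frames, axis=1),
                  np.linalg.norm(encoded, axis=1)))  # True
```

In the paper's setting the encoded features would then be down-streamed and fed to the local RNN as ordinary real-valued inputs; here the unitary stand-in only illustrates that a random-parameter encoding is norm-preserving, not trained.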
arXiv Detail & Related papers (2020-10-26T03:36:01Z)
- On the learnability of quantum neural networks [132.1981461292324]
We consider the learnability of the quantum neural network (QNN) built on the variational hybrid quantum-classical scheme.
We show that if a concept can be efficiently learned by a QNN, then it can also be effectively learned by a QNN even in the presence of gate noise.
arXiv Detail & Related papers (2020-07-24T06:34:34Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented (including all listed content) and is not responsible for any consequences of its use.