Random Quantum Neural Networks (RQNN) for Noisy Image Recognition
- URL: http://arxiv.org/abs/2203.01764v1
- Date: Thu, 3 Mar 2022 15:15:29 GMT
- Title: Random Quantum Neural Networks (RQNN) for Noisy Image Recognition
- Authors: Debanjan Konar, Erol Gelenbe, Soham Bhandary, Aditya Das Sarma, and
Attila Cangi
- Abstract summary: We introduce a novel class of supervised Random Quantum Neural Networks (RQNNs) with a robust training strategy.
The proposed RQNN employs hybrid classical-quantum algorithms with superposition state and amplitude encoding features.
Experiments on the MNIST, FashionMNIST, and KMNIST datasets demonstrate that the proposed RQNN model achieves an average classification accuracy of $94.9\%$.
- Score: 0.9205287316703888
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: Classical Random Neural Networks (RNNs) have demonstrated effective
applications in decision making, signal processing, and image recognition
tasks. However, their implementations have been limited to deterministic digital
systems that output probability distributions in lieu of the stochastic behavior
of random spiking signals. We introduce a novel class of supervised Random
Quantum Neural Networks (RQNNs) with a robust training strategy to better
exploit the random nature of the spiking RNN. The proposed RQNN employs hybrid
classical-quantum algorithms with superposition state and amplitude encoding
features, inspired by quantum information theory and the brain's
spatial-temporal stochastic spiking property of neuron information encoding. We
have extensively validated our proposed RQNN model, relying on hybrid
classical-quantum algorithms via the PennyLane Quantum simulator with a limited
number of \emph{qubits}. Experiments on the MNIST, FashionMNIST, and KMNIST
datasets demonstrate that the proposed RQNN model achieves an average
classification accuracy of $94.9\%$. Additionally, the experimental findings
illustrate the proposed RQNN's effectiveness and resilience in noisy settings,
with enhanced image classification accuracy when compared to the classical
counterparts (RNNs), classical Spiking Neural Networks (SNNs), and the
classical convolutional neural network (AlexNet). Furthermore, the RQNN can
deal with noise, which is useful for various applications, including computer
vision in NISQ devices. The PyTorch code (https://github.com/darthsimpus/RQN)
is made available on GitHub to reproduce the results reported in this
manuscript.
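The amplitude-encoding feature described above packs $2^n$ pixel intensities into the amplitudes of an $n$-qubit state, which is why a limited number of qubits suffices for image data. A minimal NumPy sketch of that mapping (an illustrative reconstruction, not the authors' PennyLane implementation; the 2x2 patch size is an assumed example):

```python
import numpy as np

def amplitude_encode(pixels):
    """Map a vector of 2**n pixel intensities onto the amplitudes of an
    n-qubit quantum state by L2-normalising it (illustrative sketch only)."""
    v = np.asarray(pixels, dtype=float).ravel()
    n_qubits = int(np.log2(v.size))
    assert v.size == 2 ** n_qubits, "pixel count must be a power of two"
    state = v / np.linalg.norm(v)  # amplitudes; probabilities are their squares
    return n_qubits, state

# A 2x2 grayscale patch (4 pixels) fits into just 2 qubits.
n, psi = amplitude_encode([0.1, 0.5, 0.3, 0.9])
print(n)                                 # 2 qubits
print(np.allclose(np.sum(psi ** 2), 1))  # True: a valid normalised state
```

The exponential compression (a 256-pixel 16x16 image needs only 8 qubits) is what makes this encoding attractive on NISQ-scale simulators and devices.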
Related papers
- Accurate Mapping of RNNs on Neuromorphic Hardware with Adaptive Spiking Neurons [2.9410174624086025]
We present a $\Sigma\Delta$-low-pass RNN (lpRNN) for mapping rate-based RNNs to spiking neural networks (SNNs).
An adaptive spiking neuron model encodes signals using $\Sigma\Delta$-modulation and enables precise mapping.
We demonstrate the implementation of the lpRNN on Intel's neuromorphic research chip Loihi.
arXiv Detail & Related papers (2024-07-18T14:06:07Z)
- Quantum Implicit Neural Representations [4.2216663697289665]
Implicit neural representations have emerged as a powerful paradigm to represent signals such as images and sounds.
Traditional neural networks face challenges in accurately modeling high-frequency components of signals.
We propose Quantum Implicit Representation Network (QIREN), a novel quantum generalization of FNNs.
arXiv Detail & Related papers (2024-06-06T09:04:48Z)
- A Hybrid Neural Coding Approach for Pattern Recognition with Spiking Neural Networks [53.31941519245432]
Brain-inspired spiking neural networks (SNNs) have demonstrated promising capabilities in solving pattern recognition tasks.
These SNNs are built on homogeneous neurons that use a uniform neural coding for information representation.
In this study, we argue that SNN architectures should be holistically designed to incorporate heterogeneous coding schemes.
arXiv Detail & Related papers (2023-05-26T02:52:12Z)
- Variational Quantum Neural Networks (VQNNS) in Image Classification [0.0]
This paper investigates how the training of quantum neural networks (QNNs) can be done using quantum optimization algorithms.
A QNN structure is constructed in which a variational parameterized circuit is incorporated as the input layer, named the Variational Quantum Neural Network (VQNN).
VQNNs are evaluated on the MNIST digit recognition (less complex) and crack image classification datasets, converging in less time than a plain QNN while retaining decent training accuracy.
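At its core, a variational parameterized circuit of this kind is a parameterized rotation whose measured expectation value is a smooth, trainable function of the circuit parameter. A single-qubit NumPy sketch of that idea (a simplifying illustration, not the paper's actual multi-qubit architecture):

```python
import numpy as np

def ry(theta):
    """Single-qubit RY rotation gate."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def expval_z(theta):
    """Apply RY(theta) to |0> and return <Z>, the circuit's scalar output."""
    state = ry(theta) @ np.array([1.0, 0.0])   # |psi> = RY(theta)|0>
    z = np.array([[1.0, 0.0], [0.0, -1.0]])    # Pauli-Z observable
    return float(state @ z @ state)

# <Z> = cos(theta): a differentiable output a classical optimizer can train.
print(round(expval_z(0.0), 6))    # 1.0
print(round(expval_z(np.pi), 6))  # -1.0
```

Because the output varies smoothly with the parameter, gradients (e.g. via the parameter-shift rule) can be fed to a classical optimizer, which is the hybrid classical-quantum training loop these QNN papers rely on.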
arXiv Detail & Related papers (2023-03-10T11:24:32Z)
- Quantum Recurrent Neural Networks for Sequential Learning [11.133759363113867]
We propose a new kind of quantum recurrent neural network (QRNN) to find quantum advantageous applications in the near term.
Our QRNN is built by stacking quantum recurrent blocks (QRBs) in a staggered way that greatly reduces the algorithm's requirement on the coherence time of quantum devices.
The numerical experiments show that our QRNN achieves much better performance in prediction (classification) accuracy against the classical RNN and state-of-the-art QNN models for sequential learning.
arXiv Detail & Related papers (2023-02-07T04:04:39Z)
- Quantum-inspired Complex Convolutional Neural Networks [17.65730040410185]
We improve the quantum-inspired neurons by exploiting the complex-valued weights which have richer representational capacity and better non-linearity.
We design quantum-inspired convolutional neural network (QICNN) models capable of processing high-dimensional data.
The classification accuracy of the five QICNNs is tested on the MNIST and CIFAR-10 datasets.
arXiv Detail & Related papers (2021-10-31T03:10:48Z)
- A quantum algorithm for training wide and deep classical neural networks [72.2614468437919]
We show that conditions amenable to classical trainability via gradient descent coincide with those necessary for efficiently solving quantum linear systems.
We numerically demonstrate that the MNIST image dataset satisfies such conditions.
We provide empirical evidence for $O(\log n)$ training of a convolutional neural network with pooling.
arXiv Detail & Related papers (2021-07-19T23:41:03Z)
- Deep Networks for Direction-of-Arrival Estimation in Low SNR [89.45026632977456]
We introduce a Convolutional Neural Network (CNN) that is trained from multi-channel data of the true array manifold matrix.
We train a CNN in the low-SNR regime to predict DoAs across all SNRs.
Our robust solution can be applied in several fields, ranging from wireless array sensors to acoustic microphones or sonars.
arXiv Detail & Related papers (2020-11-17T12:52:18Z)
- Toward Trainability of Quantum Neural Networks [87.04438831673063]
Quantum Neural Networks (QNNs) have been proposed as generalizations of classical neural networks to achieve the quantum speed-up.
Serious bottlenecks exist for training QNNs because gradients vanish at a rate exponential in the number of input qubits.
We propose QNNs with tree-tensor and step-controlled structures for binary classification. Simulations show faster convergence rates and better accuracy compared to QNNs with random structures.
arXiv Detail & Related papers (2020-11-12T08:32:04Z)
- Decentralizing Feature Extraction with Quantum Convolutional Neural Network for Automatic Speech Recognition [101.69873988328808]
We build upon a quantum convolutional neural network (QCNN) composed of a quantum circuit encoder for feature extraction.
An input speech signal is first up-streamed to a quantum computing server to extract its Mel-spectrogram.
The corresponding convolutional features are encoded using a quantum circuit algorithm with random parameters.
The encoded features are then down-streamed to the local RNN model for the final recognition.
arXiv Detail & Related papers (2020-10-26T03:36:01Z)
- On the learnability of quantum neural networks [132.1981461292324]
We consider the learnability of the quantum neural network (QNN) built on the variational hybrid quantum-classical scheme.
We show that if a concept can be efficiently learned by a QNN, then it can also be effectively learned by a QNN even in the presence of gate noise.
arXiv Detail & Related papers (2020-07-24T06:34:34Z)
This list is automatically generated from the titles and abstracts of the papers in this site.