Implementation of quantum stochastic walks for function approximation,
two-dimensional data classification, and sequence classification
- URL: http://arxiv.org/abs/2103.03018v2
- Date: Mon, 18 Apr 2022 15:27:41 GMT
- Authors: Lu-Ji Wang, Jia-Yi Lin, Shengjun Wu
- Abstract summary: We study a quantum neural network based on quantum walks on a graph, and use gradient descent to update the network parameters.
A simple QSNN with five neurons is trained to determine whether a sequence of words is a sentence or not, and we find that a QSNN can reduce the number of training steps.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We study a quantum stochastic neural network (QSNN) based on quantum
stochastic walks on a graph, and use gradient descent to update the network
parameters. We apply a toy model of QSNN with a few neurons to the problems of
function approximation, two-dimensional data classification, and sequence
classification. A simple QSNN with five neurons is trained to determine whether
a sequence of words is a sentence or not, and we find that a QSNN can reduce
the number of training steps. A QSNN with 11 neurons shows a quantum advantage
in improving the accuracy of recognizing new types of inputs like verses.
Moreover, with our toy model, we find the coherent QSNN is more robust against
both label noise and device noise, compared with the decoherent QSNN. These
results show that quantum stochastic walks may be a useful resource to
implement a quantum neural network.
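The dynamics underlying such a network can be sketched numerically. In a quantum stochastic walk, the density matrix evolves under a Lindblad-type master equation that interpolates between coherent (Hamiltonian) and decoherent (classical random-walk) dynamics via a mixing parameter p, which mirrors the abstract's coherent-vs-decoherent QSNN distinction. The graph, Hamiltonian, jump operators, and step size below are illustrative assumptions, not the authors' actual construction:

```python
import numpy as np

# Hedged sketch (not the authors' code): one Euler step of the
# quantum-stochastic-walk master equation
#   d rho/dt = -(1-p) i [H, rho]
#              + p * sum_k ( L_k rho L_k^dag - 1/2 {L_k^dag L_k, rho} ),
# where p in [0, 1] interpolates between coherent (p = 0) and
# fully decoherent (p = 1) dynamics.

def qsw_step(rho, H, jumps, p, dt):
    """Advance the density matrix rho by one Euler step of size dt."""
    comm = -1j * (H @ rho - rho @ H)             # coherent part
    diss = np.zeros_like(rho)
    for L in jumps:                              # decoherent part
        LdL = L.conj().T @ L
        diss += L @ rho @ L.conj().T - 0.5 * (LdL @ rho + rho @ LdL)
    return rho + dt * ((1 - p) * comm + p * diss)

# 3-node path graph: the adjacency matrix doubles as the walk Hamiltonian.
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=complex)
H = A
deg = A.real.sum(axis=0)
# Classical-walk jump operators L_jk = sqrt(A_jk / d_k) |j><k|.
jumps = [np.sqrt(A[j, k].real / deg[k]) * np.outer(np.eye(3)[j], np.eye(3)[k])
         for j in range(3) for k in range(3) if A[j, k] != 0]

rho = np.diag([1.0, 0.0, 0.0]).astype(complex)   # walker starts on node 0
for _ in range(100):
    rho = qsw_step(rho, H, jumps, p=0.5, dt=0.01)

print(np.trace(rho).real)  # total probability stays ~1
```

Because the commutator is traceless and the jump operators satisfy sum_k L_k^dag L_k = I, each Euler step preserves the trace exactly; training a QSNN then amounts to parameterizing H (and/or the L_k) and applying gradient descent to those parameters.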
Related papers
- Approximation rates of quantum neural networks for periodic functions via Jackson's inequality [2.217547045999963]
Quantum neural networks (QNNs) are an analog of classical neural networks in the world of quantum computing.
We study the approximation capabilities of QNNs for periodic functions.
arXiv Detail & Related papers (2025-11-20T08:44:24Z)
- Extending Quantum Perceptrons: Rydberg Devices, Multi-Class Classification, and Error Tolerance [67.77677387243135]
Quantum Neuromorphic Computing (QNC) merges quantum computation with neural computation to create scalable, noise-resilient algorithms for quantum machine learning (QML).
At the core of QNC is the quantum perceptron (QP), which leverages the analog dynamics of interacting qubits to enable universal quantum computation.
arXiv Detail & Related papers (2024-11-13T23:56:20Z)
- Fourier Neural Operators for Learning Dynamics in Quantum Spin Systems [77.88054335119074]
We use FNOs to model the evolution of random quantum spin systems.
We apply FNOs to a compact set of Hamiltonian observables instead of the entire $2^n$-dimensional quantum wavefunction.
arXiv Detail & Related papers (2024-09-05T07:18:09Z)
- Randomness-enhanced expressivity of quantum neural networks [7.7991930692137466]
We propose a novel approach to enhance the expressivity of QNNs by incorporating randomness into quantum circuits.
We prove that our approach can accurately approximate arbitrary target operators using Uhlmann's theorem for majorization.
We find the expressivity of QNNs is enhanced by introducing randomness for multiple learning tasks, which could have broad application in quantum machine learning.
arXiv Detail & Related papers (2023-08-09T07:17:13Z)
- Random Quantum Neural Networks (RQNN) for Noisy Image Recognition [0.9205287316703888]
We introduce a novel class of supervised Random Quantum Neural Networks (RQNNs) with a robust training strategy.
The proposed RQNN employs hybrid classical-quantum algorithms with superposition state and amplitude encoding features.
Experiments on the MNIST, FashionMNIST, and KMNIST datasets demonstrate that the proposed RQNN model achieves an average classification accuracy of $94.9\%$.
arXiv Detail & Related papers (2022-03-03T15:15:29Z)
- Quantum-inspired Complex Convolutional Neural Networks [17.65730040410185]
We improve the quantum-inspired neurons by exploiting the complex-valued weights which have richer representational capacity and better non-linearity.
We draw the models of quantum-inspired convolutional neural networks (QICNNs) capable of processing high-dimensional data.
The classification accuracy of the five QICNNs is tested on the MNIST and CIFAR-10 datasets.
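As a rough illustration of the complex-valued-weight idea (not the paper's architecture), a single quantum-inspired neuron can combine complex weights with a magnitude-based non-linearity such as modReLU, which rescales the modulus while preserving the phase; the choice of activation here is our assumption:

```python
import numpy as np

# Hedged sketch of a complex-valued neuron: complex weights act on a
# complex input, then modReLU thresholds the magnitude while keeping
# the phase. The paper's exact layer and activation may differ.

def mod_relu(z, b):
    """modReLU: shift the magnitude by bias b, clamp at 0, keep the phase."""
    mag = np.abs(z)
    scale = np.maximum(mag + b, 0.0) / np.maximum(mag, 1e-12)
    return z * scale

rng = np.random.default_rng(0)
W = rng.standard_normal((4, 8)) + 1j * rng.standard_normal((4, 8))
x = rng.standard_normal(8) + 1j * rng.standard_normal(8)
y = mod_relu(W @ x, b=-1.0)
print(y.shape)  # (4,) complex activations
```

The richer representational capacity comes from the weights carrying both amplitude and phase; a negative bias b zeroes out low-magnitude activations, giving the non-linearity.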
arXiv Detail & Related papers (2021-10-31T03:10:48Z)
- Quantum convolutional neural network for classical data classification [0.8057006406834467]
We benchmark fully parameterized quantum convolutional neural networks (QCNNs) for classical data classification.
We propose a quantum neural network model inspired by CNN that only uses two-qubit interactions throughout the entire algorithm.
arXiv Detail & Related papers (2021-08-02T06:48:34Z)
- Toward Trainability of Quantum Neural Networks [87.04438831673063]
Quantum Neural Networks (QNNs) have been proposed as generalizations of classical neural networks to achieve the quantum speed-up.
Serious bottlenecks exist for training QNNs: the gradient vanishes at a rate exponential in the number of input qubits.
We propose QNNs with tree-tensor and step-controlled structures for binary classification. Simulations show faster convergence rates and better accuracy compared to QNNs with random structures.
arXiv Detail & Related papers (2020-11-12T08:32:04Z)
- Decentralizing Feature Extraction with Quantum Convolutional Neural Network for Automatic Speech Recognition [101.69873988328808]
We build upon a quantum convolutional neural network (QCNN) composed of a quantum circuit encoder for feature extraction.
The input speech is first up-streamed to a quantum computing server to extract the Mel-spectrogram.
The corresponding convolutional features are encoded using a quantum circuit algorithm with random parameters.
The encoded features are then down-streamed to the local RNN model for the final recognition.
arXiv Detail & Related papers (2020-10-26T03:36:01Z)
- Searching for Low-Bit Weights in Quantized Neural Networks [129.8319019563356]
Quantized neural networks with low-bit weights and activations are attractive for developing AI accelerators.
We propose to regard the discrete weights in an arbitrary quantized neural network as searchable variables and use a differentiable method to search for them accurately.
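One common way to make discrete weights searchable by gradient descent, shown here purely as a hedged sketch and not necessarily the paper's method, is a straight-through estimator: quantize a continuous latent weight in the forward pass and route gradients to the latent weight in the backward pass.

```python
import numpy as np

# Hedged sketch (names ours, not the paper's API): binary weights
# searched via a straight-through estimator on latent real weights.

def forward(w_latent, x):
    w_q = np.sign(w_latent)          # discrete low-bit {-1, +1} weights
    return w_q @ x

def backward(w_latent, x, grad_out):
    # Straight-through estimator: treat d(sign)/dw as 1 inside [-1, 1]
    # and 0 outside, so the latent weights receive usable gradients.
    ste_mask = (np.abs(w_latent) <= 1.0).astype(float)
    return np.outer(grad_out, x) * ste_mask

rng = np.random.default_rng(0)
w = rng.standard_normal((2, 3)) * 0.1
x = rng.standard_normal(3)
target = np.array([1.0, -1.0])

for _ in range(50):                  # tiny gradient-descent loop
    y = forward(w, x)
    grad_y = 2 * (y - target)        # gradient of the squared error
    w -= 0.1 * backward(w, x, grad_y)

print(np.sign(w))                    # the searched binary weights
```

The key design point is that the quantized weights are never updated directly; only the latent variables move, and the discrete weights follow whenever a latent weight crosses zero.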
arXiv Detail & Related papers (2020-09-18T09:13:26Z)
- On the learnability of quantum neural networks [132.1981461292324]
We consider the learnability of the quantum neural network (QNN) built on the variational hybrid quantum-classical scheme.
We show that if a concept can be efficiently learned by a QNN, then it can also be effectively learned by a QNN even in the presence of gate noise.
arXiv Detail & Related papers (2020-07-24T06:34:34Z)
- Recurrent Quantum Neural Networks [7.6146285961466]
Recurrent neural networks are the foundation of many sequence-to-sequence models in machine learning.
We construct a quantum recurrent neural network (QRNN) with demonstrable performance on non-trivial tasks.
We evaluate the QRNN on MNIST classification, both by feeding the QRNN each image pixel by pixel and by utilising modern data augmentation as a preprocessing step.
arXiv Detail & Related papers (2020-06-25T17:59:44Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences of its use.