Quantum neural networks with deep residual learning
- URL: http://arxiv.org/abs/2012.07772v2
- Date: Tue, 29 Dec 2020 15:00:17 GMT
- Title: Quantum neural networks with deep residual learning
- Authors: Yanying Liang, Wei Peng, Zhu-Jun Zheng, Olli Silvén, Guoying Zhao
- Abstract summary: In this paper, a novel quantum neural network with deep residual learning (ResQNN) is proposed.
Our ResQNN is able to learn an unknown unitary and achieve remarkable performance.
- Score: 29.929891641757273
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Inspired by the success of neural networks in the classical machine learning
tasks, there has been tremendous effort to develop quantum neural networks
(QNNs), especially for quantum data or tasks that are inherently quantum in
nature. Currently, with the imminent advent of quantum computing processors,
which promise to evade the computational and thermodynamic limitations of
classical computation, designing an efficient quantum neural network becomes a valuable
task in quantum machine learning. In this paper, a novel quantum neural network
with deep residual learning (ResQNN) is proposed. Specifically, a multi-layer
quantum perceptron with residual connections is provided. Our ResQNN is
able to learn an unknown unitary and achieves remarkable performance. Moreover, the
model can be trained in an end-to-end fashion, analogous to
backpropagation in classical neural networks. To explore the effectiveness
of our ResQNN, we perform extensive experiments on quantum data under the
setting of both clean and noisy training data. The experimental results show
the robustness and superiority of our ResQNN when compared to a remarkable
recent work from Nature Communications, 2020. Moreover, when training with a
higher proportion of noisy data, the superiority of our ResQNN becomes even
more significant, implying strong generalization ability and remarkable
tolerance to noisy data of the proposed method.
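The residual idea at the heart of ResQNN can be illustrated in a classical setting. The sketch below is not the authors' quantum perceptron; it is a minimal hypothetical example showing why adding the input back to a layer's output (a residual connection) makes deep stacks easy to train: with small weights, each layer starts near the identity map.

```python
import numpy as np

def layer(x, W, b):
    """A single perceptron layer; the tanh non-linearity is a hypothetical choice."""
    return np.tanh(W @ x + b)

def residual_layer(x, W, b):
    """Residual variant: the input is added back to the layer output,
    so the layer only needs to learn a correction to the identity map."""
    return x + layer(x, W, b)

rng = np.random.default_rng(0)
x = rng.standard_normal(4)
W = rng.standard_normal((4, 4))
b = rng.standard_normal(4)

# With near-zero weights, a residual layer approximates the identity,
# so stacking many of them does not degrade the signal.
out = residual_layer(x, 1e-6 * W, 1e-6 * b)
assert np.allclose(out, x, atol=1e-4)
```

In the paper's setting the same principle is applied to quantum perceptron layers rather than classical ones.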
Related papers
- CTRQNets & LQNets: Continuous Time Recurrent and Liquid Quantum Neural Networks [76.53016529061821]
The Liquid Quantum Neural Network (LQNet) and Continuous Time Recurrent Quantum Neural Network (CTRQNet) are developed.
LQNet and CTRQNet achieve accuracy increases as high as 40% on CIFAR-10 through binary classification.
arXiv Detail & Related papers (2024-08-28T00:56:03Z)
- ShadowNet for Data-Centric Quantum System Learning [188.683909185536]
We propose a data-centric learning paradigm combining the strength of neural-network protocols and classical shadows.
Capitalizing on the generalization power of neural networks, this paradigm can be trained offline and excel at predicting previously unseen systems.
We present the instantiation of our paradigm in quantum state tomography and direct fidelity estimation tasks and conduct numerical analysis up to 60 qubits.
arXiv Detail & Related papers (2023-08-22T09:11:53Z)
- Quantum Neural Network for Quantum Neural Computing [0.0]
We propose a new quantum neural network model for quantum neural computing.
Our model circumvents the problem that the state-space size grows exponentially with the number of neurons.
We benchmark our model for handwritten digit recognition and other nonlinear classification tasks.
arXiv Detail & Related papers (2023-05-15T11:16:47Z)
- Learning Quantum Processes with Memory -- Quantum Recurrent Neural Networks [0.0]
We propose fully quantum recurrent neural networks, based on dissipative quantum neural networks.
We demonstrate the potential of these algorithms to learn complex quantum processes with memory.
Numerical simulations indicate that our quantum recurrent neural networks exhibit a striking ability to generalise from small training sets.
arXiv Detail & Related papers (2023-01-19T16:58:39Z)
- QuanGCN: Noise-Adaptive Training for Robust Quantum Graph Convolutional Networks [124.7972093110732]
We propose quantum graph convolutional networks (QuanGCN), which learn local message passing among nodes via a sequence of crossing-gate quantum operations.
To mitigate the inherent noise of modern quantum devices, we apply a sparsity constraint to sparsify the nodes' connections.
Our QuanGCN is functionally comparable or even superior to classical algorithms on several benchmark graph datasets.
arXiv Detail & Related papers (2022-11-09T21:43:16Z)
- Quantum-inspired Complex Convolutional Neural Networks [17.65730040410185]
We improve the quantum-inspired neurons by exploiting the complex-valued weights which have richer representational capacity and better non-linearity.
We design quantum-inspired convolutional neural network (QICNN) models capable of processing high-dimensional data.
The classification accuracy of the five QICNNs is tested on the MNIST and CIFAR-10 datasets.
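The claim above, that complex-valued weights carry richer representations than real-valued ones of the same size, can be sketched classically. This is an illustrative example, not the QICNN architecture; the magnitude read-out at the end is an assumed design choice.

```python
import numpy as np

rng = np.random.default_rng(1)

# A complex-valued linear layer: each weight carries both a magnitude and a
# phase, giving richer interactions than a real weight of the same size.
W = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
x = rng.standard_normal(3) + 1j * rng.standard_normal(3)

z = W @ x        # complex pre-activation (magnitude and phase interact)
y = np.abs(z)    # magnitude read-out yields a real, non-negative feature vector

assert z.dtype == np.complex128
assert np.all(y >= 0)
```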
arXiv Detail & Related papers (2021-10-31T03:10:48Z)
- QDCNN: Quantum Dilated Convolutional Neural Network [1.52292571922932]
We propose a novel hybrid quantum-classical algorithm called quantum dilated convolutional neural networks (QDCNNs).
Our method extends the concept of dilated convolution, which has been widely applied in modern deep learning algorithms, to the context of hybrid neural networks.
The proposed QDCNNs are able to capture larger context during the quantum convolution process while reducing the computational cost.
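Dilated convolution, the classical concept QDCNNs extend to the quantum setting, is what lets a filter capture larger context at no extra parameter cost: a dilation rate d applies the kernel taps d samples apart. The helper below is a classical 1-D sketch, not the paper's quantum circuit.

```python
import numpy as np

def dilated_conv1d(x, kernel, dilation):
    """1-D dilated convolution with valid padding: kernel taps are applied
    `dilation` samples apart, widening the receptive field without adding
    parameters."""
    k = len(kernel)
    span = (k - 1) * dilation + 1  # receptive field of one output sample
    return np.array([
        sum(kernel[j] * x[i + j * dilation] for j in range(k))
        for i in range(len(x) - span + 1)
    ])

x = np.arange(8, dtype=float)   # [0, 1, ..., 7]
kernel = [1.0, 1.0, 1.0]

# dilation=1 is an ordinary convolution; dilation=2 skips every other sample,
# so each output sees a window of 5 inputs instead of 3.
assert np.allclose(dilated_conv1d(x, kernel, 1), [3, 6, 9, 12, 15, 18])
assert np.allclose(dilated_conv1d(x, kernel, 2), [6, 9, 12, 15])
```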
arXiv Detail & Related papers (2021-10-29T10:24:34Z)
- A quantum algorithm for training wide and deep classical neural networks [72.2614468437919]
We show that conditions amenable to classical trainability via gradient descent coincide with those necessary for efficiently solving quantum linear systems.
We numerically demonstrate that the MNIST image dataset satisfies such conditions.
We provide empirical evidence for $O(\log n)$ training of a convolutional neural network with pooling.
arXiv Detail & Related papers (2021-07-19T23:41:03Z)
- A Quantum Convolutional Neural Network for Image Classification [7.745213180689952]
We propose a novel neural network model named Quantum Convolutional Neural Network (QCNN).
QCNN is based on implementable quantum circuits and has a similar structure as classical convolutional neural networks.
Numerical simulation results on the MNIST dataset demonstrate the effectiveness of our model.
arXiv Detail & Related papers (2021-07-08T06:47:34Z)
- The dilemma of quantum neural networks [63.82713636522488]
We show that quantum neural networks (QNNs) fail to provide any benefit over classical learning models.
QNNs suffer from severely limited effective model capacity, which leads to poor generalization on real-world datasets.
These results force us to rethink the role of current QNNs and to design novel protocols for solving real-world problems with quantum advantages.
arXiv Detail & Related papers (2021-06-09T10:41:47Z)
- Quantum Deformed Neural Networks [83.71196337378022]
We develop a new quantum neural network layer designed to run efficiently on a quantum computer.
It can be simulated on a classical computer when restricted in the way it entangles input states.
arXiv Detail & Related papers (2020-10-21T09:46:12Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.