Synergy between noisy quantum computers and scalable classical deep learning
- URL: http://arxiv.org/abs/2404.07802v1
- Date: Thu, 11 Apr 2024 14:47:18 GMT
- Title: Synergy between noisy quantum computers and scalable classical deep learning
- Authors: Simone Cantori, Andrea Mari, David Vitali, Sebastiano Pilati
- Abstract summary: We investigate the potential of combining the computational power of noisy quantum computers and of classical scalable convolutional neural networks (CNNs).
The goal is to accurately predict exact expectation values of parameterized quantum circuits representing the Trotter-decomposed dynamics of quantum Ising models.
Thanks to the quantum information, our CNNs succeed even when supervised learning based only on classical descriptors fails.
- Score: 0.4999814847776097
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We investigate the potential of combining the computational power of noisy quantum computers and of classical scalable convolutional neural networks (CNNs). The goal is to accurately predict exact expectation values of parameterized quantum circuits representing the Trotter-decomposed dynamics of quantum Ising models. By incorporating (simulated) noisy expectation values alongside circuit structure information, our CNNs effectively capture the underlying relationships between circuit architecture and output behaviour, enabling predictions for circuits with more qubits than those included in the training set. Notably, thanks to the quantum information, our CNNs succeed even when supervised learning based only on classical descriptors fails. Furthermore, they outperform a popular error mitigation scheme, namely, zero-noise extrapolation, demonstrating that the synergy between quantum and classical computational tools leads to higher accuracy compared with quantum-only or classical-only approaches. By tuning the noise strength, we explore the crossover from a computationally powerful classical CNN assisted by quantum noisy data, towards rather precise quantum computations, further error-mitigated via classical deep learning.
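A hedged sketch (not the authors' code) of the two ingredients described in the abstract: a zero-noise-extrapolation baseline, and a CNN that regresses the exact expectation value from a circuit descriptor plus the noisy expectation value measured on the device. All names, shapes, and hyperparameters below are illustrative assumptions.

```python
# Illustrative sketch only; names, shapes, and hyperparameters are assumptions.
import numpy as np
import torch
import torch.nn as nn


def zero_noise_extrapolation(scale_factors, noisy_values, degree=1):
    """Baseline mitigation mentioned in the abstract: fit the expectation value
    as a function of the noise scale factor and extrapolate the fit to zero noise."""
    coeffs = np.polyfit(scale_factors, noisy_values, deg=degree)
    return np.polyval(coeffs, 0.0)


class NoiseAssistedCNN(nn.Module):
    """CNN over a (qubit, Trotter-step) grid of circuit descriptors, with the
    noisy expectation value appended as one extra scalar feature."""

    def __init__(self, n_channels=3, hidden=32):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(n_channels, hidden, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(hidden, hidden, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),  # global pooling keeps the output size
        )                             # independent of the number of qubits
        self.head = nn.Sequential(nn.Linear(hidden + 1, hidden), nn.ReLU(),
                                  nn.Linear(hidden, 1))

    def forward(self, descriptors, noisy_value):
        # descriptors: (batch, n_channels, n_qubits, n_trotter_steps)
        # noisy_value: (batch, 1) expectation value from the noisy device
        h = self.conv(descriptors).flatten(1)
        return self.head(torch.cat([h, noisy_value], dim=1))
```

Because the convolutions and the global pooling act locally on the (qubit, Trotter-step) grid, a network of this kind can be evaluated on circuits with more qubits than those in the training set, which is the scalability property the abstract emphasizes.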
Related papers
- Distributed quantum machine learning via classical communication [0.7378853859331619]
We present an experimentally accessible distributed quantum machine learning scheme that integrates quantum processor units via classical communication.
Our results indicate that incorporating classical communication notably improves classification accuracy compared to schemes without communication.
arXiv Detail & Related papers (2024-08-29T08:05:57Z) - Efficient Learning for Linear Properties of Bounded-Gate Quantum Circuits [63.733312560668274]
Given a quantum circuit containing d tunable RZ gates and G-d Clifford gates, can a learner perform purely classical inference to efficiently predict its linear properties?
We prove that the sample complexity scaling linearly in d is necessary and sufficient to achieve a small prediction error, while the corresponding computational complexity may scale exponentially in d.
We devise a kernel-based learning model capable of trading off prediction error and computational complexity, transitioning from exponential to polynomial scaling in many practical settings.
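A hedged sketch of a kernel-based regressor of the kind described in this entry: each circuit is summarized by its d tunable RZ angles, and a kernel model predicts the linear property of interest purely classically. The RBF kernel and the toy target below are stand-ins, not the paper's actual kernel or data.

```python
# Illustrative stand-in; the paper's kernel and data generation differ.
import numpy as np
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(0)
d = 8                                                     # number of tunable RZ gates
angles_train = rng.uniform(0, 2 * np.pi, size=(200, d))   # toy training circuits
y_train = np.cos(angles_train).mean(axis=1)               # toy "linear property"

model = KernelRidge(kernel="rbf", gamma=0.5, alpha=1e-3)
model.fit(angles_train, y_train)

angles_test = rng.uniform(0, 2 * np.pi, size=(20, d))
pred = model.predict(angles_test)                          # purely classical inference
```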
arXiv Detail & Related papers (2024-08-22T08:21:28Z) - A Quantum-Classical Collaborative Training Architecture Based on Quantum
State Fidelity [50.387179833629254]
We introduce a collaborative classical-quantum architecture called co-TenQu.
Co-TenQu enhances a classical deep neural network by up to 41.72% in a fair setting.
It outperforms other quantum-based methods by up to 1.9 times and achieves similar accuracy while utilizing 70.59% fewer qubits.
arXiv Detail & Related papers (2024-02-23T14:09:41Z) - Bridging Classical and Quantum Machine Learning: Knowledge Transfer From
Classical to Quantum Neural Networks Using Knowledge Distillation [0.0]
This paper introduces a new method to transfer knowledge from classical to quantum neural networks using knowledge distillation.
We adapt classical convolutional neural network (CNN) architectures like LeNet and AlexNet to serve as teacher networks.
Quantum models achieve an average accuracy improvement of 0.80% on the MNIST dataset and 5.40% on the more complex Fashion MNIST dataset.
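A minimal sketch of the distillation objective used in this kind of classical-to-quantum knowledge transfer: a pretrained classical teacher (e.g. a LeNet-style CNN) provides soft targets that the quantum student is trained to match. The quantum student itself is not implemented here; the loss below is the standard distillation formulation, assumed for illustration.

```python
# Sketch of a standard distillation loss; the quantum student is a placeholder.
import torch
import torch.nn.functional as F


def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    # Soft-target term: KL divergence between temperature-softened distributions.
    soft = F.kl_div(F.log_softmax(student_logits / T, dim=1),
                    F.softmax(teacher_logits / T, dim=1),
                    reduction="batchmean") * (T * T)
    # Hard-target term: ordinary cross-entropy with the true labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

# Training step (teacher frozen, student trainable), schematically:
# with torch.no_grad():
#     teacher_logits = classical_teacher(images)
# loss = distillation_loss(quantum_student(images), teacher_logits, labels)
```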
arXiv Detail & Related papers (2023-11-23T05:06:43Z) - QuanGCN: Noise-Adaptive Training for Robust Quantum Graph Convolutional
Networks [124.7972093110732]
We propose quantum graph convolutional networks (QuanGCN), which learns the local message passing among nodes with the sequence of crossing-gate quantum operations.
To mitigate the inherent noise of modern quantum devices, we apply a sparsity constraint that sparsifies the nodes' connections.
Our QuanGCN is functionally comparable to, or even superior to, the classical algorithms on several benchmark graph datasets.
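A heavily hedged sketch of the sparsification idea only: the quantum crossing-gate message passing is not reproduced; instead, a classical graph-convolution step with an L1 penalty on learnable edge weights illustrates how a sparsity constraint can prune the nodes' connections. Everything below is an illustrative stand-in.

```python
# Classical stand-in for the sparsified message-passing idea; not QuanGCN itself.
import torch
import torch.nn as nn


class SparseMessagePassing(nn.Module):
    def __init__(self, n_nodes, in_dim, out_dim):
        super().__init__()
        # Learnable node-to-node connection strengths.
        self.edge_weights = nn.Parameter(0.1 * torch.randn(n_nodes, n_nodes))
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, x):
        # x: (n_nodes, in_dim); node features are mixed through the learned edges.
        return torch.relu(self.lin(self.edge_weights @ x))

    def sparsity_penalty(self):
        # L1 term that pushes unused connections toward zero.
        return self.edge_weights.abs().mean()

# Training objective (sketch): loss = task_loss + 1e-3 * layer.sparsity_penalty()
```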
arXiv Detail & Related papers (2022-11-09T21:43:16Z) - Classical-to-quantum convolutional neural network transfer learning [1.9336815376402723]
Machine learning using quantum convolutional neural networks (QCNNs) has demonstrated success in both quantum and classical data classification.
We propose transfer learning as an effective strategy for utilizing small QCNNs in the noisy intermediate-scale quantum era.
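A hedged sketch of the classical-to-quantum transfer-learning pattern: a pretrained classical CNN is frozen and used as a feature extractor, and only a small trainable module, standing in for the few-qubit QCNN, is optimized on its compressed features. The feature extractor and head below are placeholders; sizes and names are assumptions.

```python
# Placeholder modules; in practice the extractor would be a pretrained classical
# CNN and the head a small variational quantum circuit (QCNN).
import torch
import torch.nn as nn


class FrozenFeatureExtractor(nn.Module):
    """Stand-in for a pretrained classical CNN (e.g. trained on image data)."""

    def __init__(self, out_dim=8):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
            nn.Flatten(), nn.Linear(32, out_dim),
        )
        for p in self.parameters():      # freeze: only the quantum part
            p.requires_grad = False      # would be trained downstream

    def forward(self, x):
        return self.net(x)


# out_dim matches the few features a small QCNN could encode on NISQ hardware;
# `qcnn_head` is a classical stand-in for that variational circuit.
extractor = FrozenFeatureExtractor(out_dim=8)
qcnn_head = nn.Sequential(nn.Linear(8, 8), nn.Tanh(), nn.Linear(8, 2))
optimizer = torch.optim.Adam(qcnn_head.parameters(), lr=1e-3)
```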
arXiv Detail & Related papers (2022-08-31T09:15:37Z) - Supervised learning of random quantum circuits via scalable neural
networks [0.0]
Deep convolutional neural networks (CNNs) are trained to predict single-qubit and two-qubit expectation values.
The CNNs often outperform the quantum devices, depending on the circuit depth, on the network depth, and on the training set size.
arXiv Detail & Related papers (2022-06-21T13:05:52Z) - Comparing concepts of quantum and classical neural network models for
image classification task [0.456877715768796]
This material includes the results of experiments on training and performance of a hybrid quantum-classical neural network.
Although its simulation is time-consuming, the quantum network outperforms the classical network.
arXiv Detail & Related papers (2021-08-19T18:49:30Z) - Quantum Federated Learning with Quantum Data [87.49715898878858]
Quantum machine learning (QML) has emerged as a promising field that leans on the developments in quantum computing to explore large complex machine learning problems.
This paper proposes the first fully quantum federated learning framework that can operate over quantum data and, thus, share the learning of quantum circuit parameters in a decentralized manner.
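A minimal sketch of the federated-averaging idea applied to quantum circuit parameters: each node optimizes its local parameter vector on its own quantum data, and only the parameters, never the data, are shared and averaged centrally. Local training is abstracted away; shapes and names are assumptions.

```python
# Federated averaging over circuit parameters; local optimization is a placeholder.
import numpy as np


def federated_average(local_params):
    """Average the circuit-parameter vectors proposed by the participating nodes."""
    return np.mean(np.stack(local_params, axis=0), axis=0)


n_nodes, n_params = 4, 12
global_params = np.zeros(n_params)
for round_ in range(5):
    local = []
    for node in range(n_nodes):
        # Placeholder for a local optimization step on node-private quantum data.
        rng = np.random.default_rng(node + round_)
        local.append(global_params + 0.1 * rng.normal(size=n_params))
    global_params = federated_average(local)   # decentralized parameter sharing
```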
arXiv Detail & Related papers (2021-05-30T12:19:27Z) - The Hintons in your Neural Network: a Quantum Field Theory View of Deep
Learning [84.33745072274942]
We show how to represent linear and non-linear layers as unitary quantum gates, and interpret the fundamental excitations of the quantum model as particles.
On top of opening a new perspective and techniques for studying neural networks, the quantum formulation is well suited for optical quantum computing.
arXiv Detail & Related papers (2021-03-08T17:24:29Z) - Quantum Deformed Neural Networks [83.71196337378022]
We develop a new quantum neural network layer designed to run efficiently on a quantum computer.
It can be simulated on a classical computer when restricted in the way it entangles input states.
arXiv Detail & Related papers (2020-10-21T09:46:12Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.