Supervised learning of random quantum circuits via scalable neural
networks
- URL: http://arxiv.org/abs/2206.10348v1
- Date: Tue, 21 Jun 2022 13:05:52 GMT
- Title: Supervised learning of random quantum circuits via scalable neural
networks
- Authors: S. Cantori, D. Vitali, S. Pilati
- Abstract summary: Deep convolutional neural networks (CNNs) are trained to predict single-qubit and two-qubit expectation values.
The CNNs often outperform the quantum devices, depending on the circuit depth, on the network depth, and on the training set size.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Predicting the output of quantum circuits is a hard computational task that
plays a pivotal role in the development of universal quantum computers. Here we
investigate the supervised learning of output expectation values of random
quantum circuits. Deep convolutional neural networks (CNNs) are trained to
predict single-qubit and two-qubit expectation values using databases of
classically simulated circuits. These circuits are represented via an
appropriately designed one-hot encoding of the constituent gates. The
prediction accuracy for previously unseen circuits is analyzed, also making
comparisons with small-scale quantum computers available from the free IBM
Quantum program. The CNNs often outperform the quantum devices, depending on
the circuit depth, on the network depth, and on the training set size. Notably,
our CNNs are designed to be scalable. This allows us to exploit transfer
learning and to perform extrapolations to circuits larger than those included
in the training set. These CNNs also demonstrate remarkable resilience against
noise, namely, they remain accurate even when trained on (simulated)
expectation values averaged over very few measurements.
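The abstract specifies only that circuits are represented through a one-hot encoding of their constituent gates and that the CNNs are scalable; it does not give the exact input layout or architecture. The sketch below is a hypothetical illustration of how such a pipeline could look, assuming a (gate-type, depth, qubit) one-hot grid, an invented gate alphabet `GATES`, and a fully convolutional PyTorch model (here called `ScalableCNN`) whose weights do not depend on the number of qubits, which is the property that would allow transfer learning and extrapolation to larger circuits.

```python
# Hypothetical sketch (not the authors' code): one-hot encoding of a random
# circuit plus a scalable, fully convolutional regressor for per-qubit
# expectation values. The gate alphabet, tensor layout, and layer sizes are
# illustrative assumptions, not details taken from the paper.
import torch
import torch.nn as nn

GATES = ["id", "x", "h", "rz", "cx_ctrl", "cx_tgt"]  # assumed gate alphabet

def encode_circuit(layers, n_qubits):
    """Map a list of circuit layers (dict: qubit -> gate name) to a one-hot
    tensor of shape (len(GATES), depth, n_qubits)."""
    depth = len(layers)
    x = torch.zeros(len(GATES), depth, n_qubits)
    for d, layer in enumerate(layers):
        for q in range(n_qubits):
            x[GATES.index(layer.get(q, "id")), d, q] = 1.0
    return x

class ScalableCNN(nn.Module):
    """Convolutions act only along the depth/qubit axes, so the same weights
    can be applied to circuits with more qubits than seen during training."""
    def __init__(self, channels=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(len(GATES), channels, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv2d(channels, 1, kernel_size=1),
        )

    def forward(self, x):                 # x: (batch, len(GATES), depth, n_qubits)
        h = self.net(x)                   # (batch, 1, depth, n_qubits)
        return h.mean(dim=2).squeeze(1)   # one predicted expectation value per qubit

# Toy usage: a 3-qubit, depth-2 circuit.
circuit = [{0: "h", 1: "x"}, {1: "cx_ctrl", 2: "cx_tgt"}]
x = encode_circuit(circuit, n_qubits=3).unsqueeze(0)
model = ScalableCNN()
print(model(x).shape)  # torch.Size([1, 3])
```

Because no layer flattens the qubit axis into a fixed-size dense map, the same trained weights can be evaluated on wider circuits than those in the training set; whether the paper realizes scalability in exactly this way is not stated in the abstract.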
Related papers
- CTRQNets & LQNets: Continuous Time Recurrent and Liquid Quantum Neural Networks [76.53016529061821]
The Liquid Quantum Neural Network (LQNet) and the Continuous Time Recurrent Quantum Neural Network (CTRQNet) are developed.
LQNet and CTRQNet achieve accuracy increases as high as 40% on CIFAR 10 through binary classification.
arXiv Detail & Related papers (2024-08-28T00:56:03Z)
- Efficient Learning for Linear Properties of Bounded-Gate Quantum Circuits [63.733312560668274]
Given a quantum circuit containing d tunable RZ gates and G-d Clifford gates, can a learner perform purely classical inference to efficiently predict its linear properties?
We prove that the sample complexity scaling linearly in d is necessary and sufficient to achieve a small prediction error, while the corresponding computational complexity may scale exponentially in d.
We devise a kernel-based learning model capable of trading off prediction error and computational complexity, transitioning from exponential to polynomial scaling in many practical settings.
arXiv Detail & Related papers (2024-08-22T08:21:28Z)
- Synergy between noisy quantum computers and scalable classical deep learning [0.4999814847776097]
We investigate the potential of combining the computational power of noisy quantum computers and classical scalable convolutional neural networks (CNNs).
The goal is to accurately predict exact expectation values of parameterized quantum circuits representing the Trotter-decomposed dynamics of quantum Ising models.
Thanks to the quantum information, our CNNs succeed even when supervised learning based only on classical descriptors fails.
arXiv Detail & Related papers (2024-04-11T14:47:18Z)
- Challenges and opportunities in the supervised learning of quantum circuit outputs [0.0]
Deep neural networks have proven capable of predicting some output properties of relevant random quantum circuits.
We investigate if and to what extent neural networks can learn to predict the output expectation values of circuits often employed in variational quantum algorithms.
arXiv Detail & Related papers (2024-02-07T16:10:13Z)
- QuantumSEA: In-Time Sparse Exploration for Noise Adaptive Quantum Circuits [82.50620782471485]
QuantumSEA is an in-time sparse exploration for noise-adaptive quantum circuits.
It aims to achieve two key objectives: (1) implicit circuit capacity during training and (2) noise robustness.
Our method establishes state-of-the-art results with only half the number of quantum gates and a 2x saving in circuit execution time.
arXiv Detail & Related papers (2024-01-10T22:33:00Z)
- Quantum neural networks [0.0]
This thesis combines two of the most exciting research areas of the last decades: quantum computing and machine learning.
We introduce dissipative quantum neural networks (DQNNs), which are capable of universal quantum computation and have low memory requirements while training.
arXiv Detail & Related papers (2022-05-17T07:47:00Z)
- A quantum algorithm for training wide and deep classical neural networks [72.2614468437919]
We show that conditions amenable to classical trainability via gradient descent coincide with those necessary for efficiently solving quantum linear systems.
We numerically demonstrate that the MNIST image dataset satisfies such conditions.
We provide empirical evidence for $O(\log n)$ training of a convolutional neural network with pooling.
arXiv Detail & Related papers (2021-07-19T23:41:03Z)
- Fast Swapping in a Quantum Multiplier Modelled as a Queuing Network [64.1951227380212]
We propose that quantum circuits can be modeled as queuing networks.
Our method is scalable and has the potential speed and precision necessary for large scale quantum circuit compilation.
arXiv Detail & Related papers (2021-06-26T10:55:52Z)
- QFCNN: Quantum Fourier Convolutional Neural Network [4.344289435743451]
We propose a new hybrid quantum-classical circuit, namely the Quantum Fourier Convolutional Network (QFCN).
Our model theoretically achieves an exponential speed-up compared with classical CNNs and improves over the existing best results for quantum CNNs.
We demonstrate the potential of this architecture by applying it to different deep learning tasks, including traffic prediction and image classification.
arXiv Detail & Related papers (2021-06-19T04:37:39Z)
- Quantum Federated Learning with Quantum Data [87.49715898878858]
Quantum machine learning (QML) has emerged as a promising field that leans on the developments in quantum computing to explore large complex machine learning problems.
This paper proposes the first fully quantum federated learning framework that can operate over quantum data and, thus, share the learning of quantum circuit parameters in a decentralized manner.
arXiv Detail & Related papers (2021-05-30T12:19:27Z)
- Branching Quantum Convolutional Neural Networks [0.0]
Small-scale quantum computers are already showing potential gains in learning tasks on large quantum and very large classical data sets.
We present a generalization of the quantum convolutional neural network (QCNN): the branching quantum convolutional neural network, or bQCNN, with substantially higher expressibility.
arXiv Detail & Related papers (2020-12-28T19:00:03Z)
This list is automatically generated from the titles and abstracts of the papers in this site.