On quantum backpropagation, information reuse, and cheating measurement collapse
- URL: http://arxiv.org/abs/2305.13362v1
- Date: Mon, 22 May 2023 18:00:02 GMT
- Title: On quantum backpropagation, information reuse, and cheating measurement collapse
- Authors: Amira Abbas, Robbie King, Hsin-Yuan Huang, William J. Huggins, Ramis Movassagh, Dar Gilboa, Jarrod R. McClean
- Abstract summary: We investigate whether parameterized quantum models can train as efficiently as classical neural networks.
We introduce an algorithm with foundations in shadow tomography that matches backpropagation scaling in quantum resources.
- Score: 6.476797054353113
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The success of modern deep learning hinges on the ability to train neural
networks at scale. Through clever reuse of intermediate information,
backpropagation facilitates training through gradient computation at a total
cost roughly proportional to running the function, rather than incurring an
additional factor proportional to the number of parameters - which can now be
in the trillions. Naively, one expects that quantum measurement collapse
entirely rules out the reuse of quantum information as in backpropagation. But
recent developments in shadow tomography, which assumes access to multiple
copies of a quantum state, have challenged that notion. Here, we investigate
whether parameterized quantum models can train as efficiently as classical
neural networks. We show that achieving backpropagation scaling is impossible
without access to multiple copies of a state. With this added ability, we
introduce an algorithm with foundations in shadow tomography that matches
backpropagation scaling in quantum resources while reducing classical auxiliary
computational costs to open problems in shadow tomography. These results
highlight the nuance of reusing quantum information for practical purposes and
clarify the unique difficulties in training large quantum models, which could
alter the course of quantum machine learning.
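To make the scaling gap described in the abstract concrete, here is a minimal numpy sketch (an illustration under toy assumptions, not the paper's algorithm): the model f(theta) = prod_k cos(theta_k) stands in for a parameterized circuit, the reverse-mode gradient reuses the intermediates of a single forward-style pass, and the parameter-shift estimate pays roughly two fresh evaluations per parameter.

```python
import numpy as np

# Toy stand-in for a parameterized quantum model: f(theta) = prod_k cos(theta_k).
def forward(theta):
    return np.prod(np.cos(theta))

# Backpropagation-style gradient: every partial derivative is recovered from
# the intermediates of one forward-style pass, so the total cost is a small
# constant multiple of a single call to forward() (information reuse).
def reverse_mode_grad(theta):
    cos, sin = np.cos(theta), np.sin(theta)
    full = np.prod(cos)
    # d/dtheta_k prod_j cos(theta_j) = -sin(theta_k) * prod_{j != k} cos(theta_j)
    return -sin * full / cos

# Naive quantum-style gradient via the parameter-shift rule: two fresh
# evaluations per parameter, ~2 * len(theta) forward passes in total.
def parameter_shift_grad(theta):
    grads = np.empty_like(theta)
    for k in range(len(theta)):
        shift = np.zeros_like(theta)
        shift[k] = np.pi / 2
        grads[k] = 0.5 * (forward(theta + shift) - forward(theta - shift))
    return grads

theta = np.random.default_rng(0).uniform(-1.0, 1.0, size=8)
print(np.allclose(reverse_mode_grad(theta), parameter_shift_grad(theta)))  # True
```

On hardware, each parameter-shift evaluation would additionally require many measurement shots, which is the per-parameter overhead the paper is concerned with avoiding.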
Related papers
- Efficient Learning for Linear Properties of Bounded-Gate Quantum Circuits [63.733312560668274]
Given a quantum circuit containing d tunable RZ gates and G-d Clifford gates, can a learner perform purely classical inference to efficiently predict its linear properties?
We prove that the sample complexity scaling linearly in d is necessary and sufficient to achieve a small prediction error, while the corresponding computational complexity may scale exponentially in d.
We devise a kernel-based learning model capable of trading off prediction error and computational complexity, transitioning from exponential to polynomial scaling in many practical settings.
arXiv Detail & Related papers (2024-08-22T08:21:28Z) - Quantum Network Tomography via Learning Isometries on Stiefel Manifold [7.796175259362575]
We propose an efficient method for quantum network tomography by learning isometries on the Stiefel manifold.
As a result, our proposed method exhibits high accuracy and efficiency.
arXiv Detail & Related papers (2024-04-10T13:05:39Z) - Machine Learning for Practical Quantum Error Mitigation [0.0]
We show that machine learning for quantum error mitigation can drastically reduce overheads while maintaining or even surpassing the accuracy of conventional methods.
Our results highlight the potential of classical machine learning for practical quantum computation.
arXiv Detail & Related papers (2023-09-29T16:17:12Z) - ShadowNet for Data-Centric Quantum System Learning [188.683909185536]
We propose a data-centric learning paradigm combining the strengths of neural-network protocols and classical shadows (a toy classical-shadow estimator is sketched after this list).
Capitalizing on the generalization power of neural networks, this paradigm can be trained offline and excels at predicting previously unseen systems.
We present the instantiation of our paradigm in quantum state tomography and direct fidelity estimation tasks and conduct numerical analysis up to 60 qubits.
arXiv Detail & Related papers (2023-08-22T09:11:53Z) - Hybrid quantum transfer learning for crack image classification on NISQ hardware [62.997667081978825]
We present an application of quantum transfer learning for detecting cracks in gray value images.
We compare the performance and training time of PennyLane's standard qubits with IBM's qasm_simulator and real backends.
arXiv Detail & Related papers (2023-07-31T14:45:29Z) - Quantum Imitation Learning [74.15588381240795]
We propose quantum imitation learning (QIL) with a hope to utilize quantum advantage to speed up IL.
We develop two QIL algorithms, quantum behavioural cloning (Q-BC) and quantum generative adversarial imitation learning (Q-GAIL)
Experimental results demonstrate that both Q-BC and Q-GAIL can achieve performance comparable to their classical counterparts.
arXiv Detail & Related papers (2023-04-04T12:47:35Z) - A didactic approach to quantum machine learning with a single qubit [68.8204255655161]
We focus on the case of learning with a single qubit, using data re-uploading techniques (a minimal single-qubit re-uploading model is sketched after this list).
We implement the different proposed formulations on toy and real-world datasets using the qiskit quantum computing SDK.
arXiv Detail & Related papers (2022-11-23T18:25:32Z) - From Quantum Graph Computing to Quantum Graph Learning: A Survey [86.8206129053725]
We first elaborate on the correlations between quantum mechanics and graph theory to show that quantum computers are able to generate useful solutions.
For its practicability and wide-applicability, we give a brief review of typical graph learning techniques.
We give a snapshot of quantum graph learning in the expectation that it serves as a catalyst for subsequent research.
arXiv Detail & Related papers (2022-02-19T02:56:47Z) - Variational learning for quantum artificial neural networks [0.0]
We first review a series of recent works describing the implementation of artificial neurons and feed-forward neural networks on quantum processors.
We then present an original realization of efficient individual quantum nodes based on variational unsampling protocols.
While keeping full compatibility with the overall memory-efficient feed-forward architecture, our constructions effectively reduce the quantum circuit depth required to determine the activation probability of single neurons.
arXiv Detail & Related papers (2021-03-03T16:10:15Z) - VSQL: Variational Shadow Quantum Learning for Classification [6.90132007891849]
We propose a new hybrid quantum-classical framework for supervised quantum learning, which we call Variational Shadow Quantum Learning.
We first use variational shadow quantum circuits to extract classical features in a convolutional manner and then utilize a fully-connected neural network to complete the classification task.
We show that this method could sharply reduce the number of parameters and thus better facilitate quantum circuit training.
arXiv Detail & Related papers (2020-12-15T13:51:01Z)
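Both the main paper (via shadow tomography) and the ShadowNet and VSQL entries above build on classical shadows. As background, the sketch below is a minimal single-qubit classical-shadow estimator with random Pauli-basis measurements; it is an illustrative reconstruction of the standard protocol, not the procedure of any specific paper listed here.

```python
import numpy as np

rng = np.random.default_rng(1)

I2 = np.eye(2, dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
Sdg = np.array([[1, 0], [0, -1j]], dtype=complex)  # S^dagger
# Unitaries rotating the X, Y, Z eigenbases into the computational basis.
BASIS_ROTATIONS = {"X": H, "Y": H @ Sdg, "Z": I2}

def classical_shadow(rho, num_snapshots):
    """Single-qubit classical shadows from random Pauli-basis measurements."""
    snapshots = []
    for _ in range(num_snapshots):
        U = BASIS_ROTATIONS[rng.choice(["X", "Y", "Z"])]
        probs = np.clip(np.real(np.diag(U @ rho @ U.conj().T)), 0.0, None)
        outcome = rng.choice([0, 1], p=probs / probs.sum())
        b = np.zeros((2, 1), dtype=complex)
        b[outcome] = 1.0
        # Inverse of the single-qubit measurement channel: rho_hat = 3 U^dag |b><b| U - I.
        snapshots.append(3 * U.conj().T @ (b @ b.conj().T) @ U - I2)
    return snapshots

def estimate_expectation(snapshots, observable):
    return float(np.mean([np.real(np.trace(observable @ s)) for s in snapshots]))

# Example: estimate <Z> for cos(a/2)|0> + sin(a/2)|1>; the exact value is cos(a).
a = 0.7
psi = np.array([[np.cos(a / 2)], [np.sin(a / 2)]], dtype=complex)
rho = psi @ psi.conj().T
print(estimate_expectation(classical_shadow(rho, 20000), Z), np.cos(a))
```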
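For the didactic single-qubit entry above, the following is a minimal sketch of a data re-uploading model: the scalar input is re-encoded in every layer and interleaved with trainable rotations. The layer structure, parameter names, and the |0>-probability readout here are illustrative assumptions, not that paper's exact formulation.

```python
import numpy as np

def ry(angle):
    c, s = np.cos(angle / 2), np.sin(angle / 2)
    return np.array([[c, -s], [s, c]], dtype=complex)

def rz(angle):
    return np.array([[np.exp(-0.5j * angle), 0],
                     [0, np.exp(0.5j * angle)]], dtype=complex)

def reupload_prob0(x, layers):
    """Single-qubit data re-uploading model.

    Each layer re-encodes the scalar input x through a data-dependent rotation
    Ry(w * x) followed by a trainable rotation Rz(theta); the output is the
    probability of measuring |0>, usable as a binary class score.
    """
    state = np.array([1.0 + 0j, 0.0 + 0j])
    for w, theta in layers:
        state = rz(theta) @ (ry(w * x) @ state)
    return float(np.abs(state[0]) ** 2)

# Three layers of (data weight, trainable angle); in practice these would be
# optimized against labelled data with a classical optimizer.
layers = [(1.3, 0.4), (-0.7, 1.1), (2.0, -0.3)]
print(reupload_prob0(0.5, layers))
```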
This list is automatically generated from the titles and abstracts of the papers on this site.