Backpropagation scaling in parameterised quantum circuits
- URL: http://arxiv.org/abs/2306.14962v3
- Date: Fri, 28 Jun 2024 09:44:11 GMT
- Title: Backpropagation scaling in parameterised quantum circuits
- Authors: Joseph Bowles, David Wierichs, Chae-Yeun Park
- Abstract summary: We introduce circuits that are not known to be classically simulable and admit gradient estimation with significantly fewer circuits.
Specifically, these circuits allow for fast estimation of the gradient, higher order partial derivatives and the Fisher information matrix.
In a toy classification problem on 16 qubits, such circuits show competitive performance with other methods, while reducing the training cost by about two orders of magnitude.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The discovery of the backpropagation algorithm ranks among the most important moments in the history of machine learning, and has made possible the training of large-scale neural networks through its ability to compute gradients at roughly the same computational cost as model evaluation. Despite its importance, a similar backpropagation-like scaling for gradient evaluation of parameterised quantum circuits has remained elusive. Currently, the most popular method requires sampling from a number of circuits that scales with the number of circuit parameters, making training of large-scale quantum circuits prohibitively expensive in practice. Here we address this problem by introducing a class of structured circuits that are not known to be classically simulable and admit gradient estimation with significantly fewer circuits. In the simplest case -- for which the parameters feed into commuting quantum gates -- these circuits allow for fast estimation of the gradient, higher-order partial derivatives and the Fisher information matrix. Moreover, specific families of parameterised circuits exist for which the scaling of gradient estimation is in line with classical backpropagation, and can thus be trained at scale. In a toy classification problem on 16 qubits, such circuits show competitive performance with other methods, while reducing the training cost by about two orders of magnitude.
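The linear-in-parameters circuit count that the abstract refers to is easy to make explicit: with the standard parameter-shift rule, each trainable parameter needs roughly two shifted circuit evaluations, so one gradient costs about 2P circuits for P parameters. Below is a minimal, hypothetical PennyLane sketch of that baseline cost; it uses a generic hardware-efficient ansatz, not the commuting-generator circuits introduced in the paper.

```python
# Minimal sketch (assumptions: PennyLane installed, a generic hardware-efficient
# ansatz) of why parameter-shift gradients scale with the parameter count.
# This is NOT the commuting-generator construction of the paper.
import pennylane as qml
from pennylane import numpy as np

n_qubits, n_layers = 4, 3
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev, diff_method="parameter-shift")
def circuit(weights):
    qml.StronglyEntanglingLayers(weights, wires=range(n_qubits))
    return qml.expval(qml.PauliZ(0))

shape = qml.StronglyEntanglingLayers.shape(n_layers=n_layers, n_wires=n_qubits)
weights = np.array(np.random.uniform(0, 2 * np.pi, size=shape), requires_grad=True)

grad = qml.grad(circuit)(weights)  # one pair of shifted circuits per parameter

# Each rotation angle needs two shifted evaluations, so one gradient costs
# roughly 2 * P circuit executions -- the scaling the paper sets out to avoid.
print(f"P = {weights.size} parameters -> ~{2 * weights.size} circuits per gradient")
```

Roughly speaking, the commuting-gate circuits of the paper remove this factor of P because the observables entering the gradient can be estimated from measurements in a shared basis; the sketch above only illustrates the baseline cost they improve on.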
Related papers
- QAdaPrune: Adaptive Parameter Pruning For Training Variational Quantum Circuits [2.3332157823623403]
QAdaPrune is an adaptive parameter pruning algorithm that automatically determines the pruning threshold and then intelligently prunes redundant and non-performing parameters.
We show that the resulting sparse parameter sets yield quantum circuits that perform comparably to the unpruned quantum circuits.
arXiv Detail & Related papers (2024-08-23T19:57:40Z) - Adaptive Planning Search Algorithm for Analog Circuit Verification [53.97809573610992]
We propose a machine learning (ML) approach that requires fewer simulations.
We show that the proposed approach is able to provide OCCs closer to the specifications for all circuits.
arXiv Detail & Related papers (2023-06-23T12:57:46Z) - Parsimonious Optimisation of Parameters in Variational Quantum Circuits [1.303764728768944]
We propose a novel Quantum-Gradient Sampling method that requires the execution of at most two circuits per iteration to update the optimisable parameters.
Our proposed method achieves convergence rates similar to classical gradient descent and empirically outperforms gradient coordinate descent and SPSA.
arXiv Detail & Related papers (2023-06-20T18:50:18Z) - Quantum circuit debugging and sensitivity analysis via local inversions [62.997667081978825]
We present a technique that pinpoints the sections of a quantum circuit that affect the circuit output the most.
We demonstrate the practicality and efficacy of the proposed technique by applying it to example algorithmic circuits implemented on IBM quantum machines.
arXiv Detail & Related papers (2022-04-12T19:39:31Z) - Gaussian initializations help deep variational quantum circuits escape from the barren plateau [87.04438831673063]
Variational quantum circuits have been widely employed in quantum simulation and quantum machine learning in recent years.
However, quantum circuits with random structures have poor trainability due to the exponentially vanishing gradient with respect to the circuit depth and the qubit number.
This result leads to a general belief that deep quantum circuits will not be feasible for practical tasks.
arXiv Detail & Related papers (2022-03-17T15:06:40Z) - Mode connectivity in the loss landscape of parameterized quantum circuits [1.7546369508217283]
Variational training of parameterized quantum circuits (PQCs) underpins many algorithms employed on near-term noisy intermediate-scale quantum (NISQ) devices.
We adapt the qualitative loss landscape characterization for neural networks introduced by Goodfellow et al. and Li et al., and the connectivity tests used by Draxler et al., to study loss landscape features in PQC training.
arXiv Detail & Related papers (2021-11-09T18:28:46Z) - FLIP: A flexible initializer for arbitrarily-sized parametrized quantum circuits [105.54048699217668]
We propose FLIP, a FLexible Initializer for arbitrarily-sized Parametrized quantum circuits.
FLIP can be applied to any family of PQCs, and instead of relying on a generic set of initial parameters, it is tailored to learn the structure of successful parameters.
We illustrate the advantage of using FLIP in three scenarios: a family of problems with proven barren plateaus, PQC training to solve max-cut problem instances, and PQC training for finding the ground state energies of 1D Fermi-Hubbard models.
arXiv Detail & Related papers (2021-03-15T17:38:33Z) - Capacity and quantum geometry of parametrized quantum circuits [0.0]
Parametrized quantum circuits can be effectively implemented on current devices.
We evaluate the capacity and trainability of these circuits using the geometric structure of the parameter space.
Our results enhance the understanding of parametrized quantum circuits for improving variational quantum algorithms.
arXiv Detail & Related papers (2021-02-02T18:16:57Z) - Characterizing the loss landscape of variational quantum circuits [77.34726150561087]
We introduce a way to compute the Hessian of the loss function of VQCs.
We show how this information can be interpreted and compared to classical neural networks.
arXiv Detail & Related papers (2020-08-06T17:48:12Z) - Machine Learning Optimization of Quantum Circuit Layouts [63.55764634492974]
We introduce a quantum circuit mapping, QXX, and its machine learning version, QXX-MLP.
The latter automatically infers the optimal QXX parameter values such that the laid-out circuit has a reduced depth.
We present empirical evidence for the feasibility of learning the layout method using approximation.
arXiv Detail & Related papers (2020-07-29T05:26:19Z) - Layerwise learning for quantum neural networks [7.2237324920669055]
We present a layerwise learning strategy for parametrized quantum circuits.
The circuit depth is incrementally grown during optimization, and only subsets of parameters are updated in each training step (a minimal sketch of this idea follows the entry).
We demonstrate our approach on an image-classification task on handwritten digits, and show that layerwise learning attains an 8% lower generalization error on average.
arXiv Detail & Related papers (2020-06-26T10:44:46Z)
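As a companion to the layerwise-learning entry above, here is a minimal, hypothetical PennyLane sketch of the general idea: grow the circuit one layer at a time and optimise only the newest layer's parameters while earlier layers stay frozen. It is an illustration under those assumptions, not the authors' exact procedure.

```python
# Minimal sketch (assumptions: PennyLane installed, a toy cost of <Z_0>) of the
# layerwise-learning idea: grow the circuit one layer at a time and train only
# the newest layer's parameters. Illustrative only, not the authors' code.
import pennylane as qml
from pennylane import numpy as np

n_qubits, max_layers, steps_per_layer = 4, 3, 20
dev = qml.device("default.qubit", wires=n_qubits)

def layer(params):
    # one hardware-efficient layer: single-qubit rotations + nearest-neighbour CNOTs
    for w in range(n_qubits):
        qml.RY(params[w], wires=w)
    for w in range(n_qubits - 1):
        qml.CNOT(wires=[w, w + 1])

@qml.qnode(dev)
def cost(trainable, frozen):
    for p in frozen:          # previously trained layers, parameters held fixed
        layer(p)
    layer(trainable)          # only this layer's parameters receive gradients
    return qml.expval(qml.PauliZ(0))

opt = qml.GradientDescentOptimizer(stepsize=0.2)
frozen = []
for _ in range(max_layers):
    params = np.array(np.random.normal(0, 0.1, n_qubits), requires_grad=True)
    for _ in range(steps_per_layer):
        params = opt.step(lambda p: cost(p, frozen), params)
    frozen.append(np.array(params, requires_grad=False))

print("final cost:", cost(frozen[-1], frozen[:-1]))
```

Because only a small subset of parameters is trainable at any time, each optimisation step needs few circuit evaluations, mirroring the strategy described in the entry above.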
This list is automatically generated from the titles and abstracts of the papers in this site.