Avoiding barren plateaus using classical shadows
- URL: http://arxiv.org/abs/2201.08194v2
- Date: Thu, 30 Jun 2022 10:17:25 GMT
- Title: Avoiding barren plateaus using classical shadows
- Authors: Stefan H. Sack, Raimel A. Medina, Alexios A. Michailidis, Richard
Kueng and Maksym Serbyn
- Abstract summary: Variational quantum algorithms are promising algorithms for achieving quantum advantage on near-term devices.
The optimization landscape of expressive variational ansätze is dominated by large regions in parameter space, known as barren plateaus.
- Score: 1.5749416770494702
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Variational quantum algorithms are promising algorithms for achieving quantum
advantage on near-term devices. The quantum hardware is used to implement a
variational wave function and measure observables, whereas the classical
computer is used to store and update the variational parameters. The
optimization landscape of expressive variational ansätze is, however, dominated
by large regions in parameter space, known as barren plateaus, with vanishing
gradients that prevent efficient optimization. In this work we propose a
general algorithm to avoid barren plateaus in the initialization and throughout
the optimization. To this end we define a notion of weak barren plateaus (WBP)
based on the entropies of local reduced density matrices. The presence of WBPs
can be efficiently quantified using recently introduced shadow tomography of
the quantum state with a classical computer. We demonstrate that avoidance of
WBPs suffices to ensure sizable gradients in the initialization. In addition,
we demonstrate that decreasing the gradient step size, guided by the entropies,
allows one to avoid WBPs during the optimization process. This paves the way for
efficient barren plateau free optimization on near-term devices.
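The WBP diagnostic described above can be sketched numerically: randomized Pauli measurements yield classical-shadow snapshots of each single-qubit reduced density matrix, from which the second Rényi entropy $S_2 = -\log_2 \mathrm{Tr}(\rho^2)$ is estimated on a classical computer. The following is a minimal sketch, not the paper's implementation; the snapshot-inversion formula follows the standard random-Pauli classical-shadow protocol, while the function names, shot counts, and any WBP threshold are illustrative assumptions.

```python
import numpy as np

# Basis-rotation unitaries: measuring Pauli P is implemented as
# "apply ROT[P], then measure in the computational (Z) basis".
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
Sdg = np.diag([1.0, -1.0j])
ROT = {"Z": np.eye(2, dtype=complex), "X": H, "Y": H @ Sdg}

def shadow_snapshots(psi, n_qubits, n_shots, rng):
    """Single-qubit classical-shadow snapshots from random Pauli measurements
    of the state vector `psi`."""
    snaps = np.empty((n_shots, n_qubits, 2, 2), dtype=complex)
    for s in range(n_shots):
        bases = rng.choice(["X", "Y", "Z"], size=n_qubits)
        U = ROT[bases[0]]
        for b in bases[1:]:
            U = np.kron(U, ROT[b])
        probs = np.abs(U @ psi) ** 2
        outcome = rng.choice(2 ** n_qubits, p=probs / probs.sum())
        for q in range(n_qubits):
            bit = (outcome >> (n_qubits - 1 - q)) & 1  # qubit 0 = leftmost factor
            ket = ROT[bases[q]].conj().T[:, bit]       # U^dag |bit>
            # Inverse of the measurement channel: rho_hat = 3 U^dag|b><b|U - I.
            snaps[s, q] = 3 * np.outer(ket, ket.conj()) - np.eye(2)
    return snaps

def renyi2_entropy(snaps_q):
    """Estimate S2 = -log2 Tr(rho^2) for one qubit from its snapshots.

    Averaging Tr(rho1_hat @ rho2_hat) over two independent halves of the
    data keeps the purity estimate unbiased."""
    half = len(snaps_q) // 2
    purity = np.real(np.trace(snaps_q[:half].mean(0) @ snaps_q[half:].mean(0)))
    purity = float(np.clip(purity, 0.5, 1.0))  # physical range for one qubit
    return -np.log2(purity)
```

A weak barren plateau would then be flagged when the estimated local entropies approach their maximum (1 bit per qubit): for instance, a Bell pair gives $S_2 \approx 1$ on each qubit, while a product state gives $S_2 \approx 0$.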
Related papers
- Flattened one-bit stochastic gradient descent: compressed distributed optimization with controlled variance [55.01966743652196]
We propose a novel algorithm for distributed stochastic gradient descent (SGD) with compressed gradient communication in the parameter-server framework.
Our gradient compression technique, named flattened one-bit stochastic gradient descent (FO-SGD), relies on two simple algorithmic ideas.
arXiv Detail & Related papers (2024-05-17T21:17:27Z) - Classical Post-processing for Unitary Block Optimization Scheme to Reduce the Effect of Noise on Optimization of Variational Quantum Eigensolvers [0.0]
Variational Quantum Eigensolvers (VQE) are a promising approach for finding the classically intractable ground state of a Hamiltonian.
Here we develop two classical post-processing techniques which improve the Unitary Block Optimization Scheme (UBOS), especially when measurements have large noise.
arXiv Detail & Related papers (2024-04-29T18:11:53Z) - Differentially Private Optimization with Sparse Gradients [60.853074897282625]
We study differentially private (DP) optimization problems under sparsity of individual gradients.
Building on this, we obtain pure- and approximate-DP algorithms with almost optimal rates for convex optimization with sparse gradients.
arXiv Detail & Related papers (2024-04-16T20:01:10Z) - Line Search Strategy for Navigating through Barren Plateaus in Quantum Circuit Training [0.0]
Variational quantum algorithms are viewed as promising candidates for demonstrating quantum advantage on near-term devices.
This work introduces a novel optimization method designed to alleviate the adverse effects of barren plateau (BP) problems during circuit training.
We have successfully applied our optimization strategy to quantum circuits comprising $16$ qubits and $15000$ entangling gates.
arXiv Detail & Related papers (2024-02-07T20:06:29Z) - Optimizing Variational Quantum Algorithms with qBang: Efficiently Interweaving Metric and Momentum to Navigate Flat Energy Landscapes [0.0]
Variational quantum algorithms (VQAs) represent a promising approach to utilizing current quantum computing infrastructures.
We propose the quantum Broyden adaptive natural gradient (qBang) approach, a novel optimizer that aims to distill the best aspects of existing approaches.
arXiv Detail & Related papers (2023-04-27T00:06:48Z) - Accelerated First-Order Optimization under Nonlinear Constraints [73.2273449996098]
We exploit the connection between first-order algorithms for constrained optimization and non-smooth dynamical systems to design a new class of accelerated first-order algorithms.
An important property of these algorithms is that constraints are expressed in terms of velocities instead of positions.
arXiv Detail & Related papers (2023-02-01T08:50:48Z) - Avoiding barren plateaus via transferability of smooth solutions in
Hamiltonian Variational Ansatz [0.0]
Variational Quantum Algorithms (VQAs) represent leading candidates to achieve computational speed-ups on current quantum devices.
Two major hurdles are the proliferation of low-quality variational local minima, and the exponential vanishing of gradients in the cost function landscape.
Here we show that by employing iterative search schemes one can effectively prepare the ground state of paradigmatic quantum many-body models.
arXiv Detail & Related papers (2022-06-04T12:52:29Z) - BEINIT: Avoiding Barren Plateaus in Variational Quantum Algorithms [0.7462336024223667]
Barren plateaus are a notorious problem in the optimization of variational quantum algorithms.
We propose an alternative strategy which initializes the parameters of a unitary gate by drawing from a beta distribution.
We empirically show that our proposed framework significantly reduces the possibility of a complex quantum neural network getting stuck in a barren plateau.
arXiv Detail & Related papers (2022-04-28T19:46:10Z) - Self-Tuning Stochastic Optimization with Curvature-Aware Gradient
Filtering [53.523517926927894]
We explore the use of exact per-sample Hessian-vector products and gradients to construct self-tuning quadratics.
We prove that our model-based procedure converges in the noisy gradient setting.
This is an interesting step for constructing self-tuning quadratics.
arXiv Detail & Related papers (2020-11-09T22:07:30Z) - Adaptive pruning-based optimization of parameterized quantum circuits [62.997667081978825]
Variational hybrid quantum-classical algorithms are powerful tools to maximize the use of Noisy Intermediate Scale Quantum devices.
We propose a strategy for such ansätze used in variational quantum algorithms, which we call "Parameter-Efficient Circuit Training" (PECT).
Instead of optimizing all of the ansatz parameters at once, PECT launches a sequence of variational algorithms.
arXiv Detail & Related papers (2020-10-01T18:14:11Z) - Sub-linear Regret Bounds for Bayesian Optimisation in Unknown Search
Spaces [63.22864716473051]
We propose a novel BO algorithm which expands (and shifts) the search space over iterations.
We show theoretically that for both our algorithms, the cumulative regret grows at sub-linear rates.
arXiv Detail & Related papers (2020-09-05T14:24:40Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it contains and is not responsible for any consequences of its use.