Natural Evolutionary Strategies for Variational Quantum Computation
- URL: http://arxiv.org/abs/2012.00101v2
- Date: Mon, 29 Mar 2021 17:15:50 GMT
- Title: Natural Evolutionary Strategies for Variational Quantum Computation
- Authors: Abhinav Anand, Matthias Degroote, and Alán Aspuru-Guzik
- Abstract summary: Natural evolutionary strategies (NES) are a family of gradient-free black-box optimization algorithms.
This study illustrates their use for the optimization of randomly-initialized parametrized quantum circuits (PQCs) in the region of vanishing gradients.
- Score: 0.7874708385247353
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Natural evolutionary strategies (NES) are a family of gradient-free black-box
optimization algorithms. This study illustrates their use for the optimization
of randomly-initialized parametrized quantum circuits (PQCs) in the region of
vanishing gradients. We show that using the NES gradient estimator the
exponential decrease in variance can be alleviated. We implement two specific
approaches, the exponential and separable natural evolutionary strategies, for
parameter optimization of PQCs and compare them against standard gradient
descent. We apply them to two different problems of ground state energy
estimation using variational quantum eigensolver (VQE) and state preparation
with circuits of varying depth and length. We also introduce batch optimization
for circuits with larger depth to extend the use of evolutionary strategies to
a larger number of parameters. We achieve accuracy comparable to
state-of-the-art optimization techniques in all the above cases with a lower
number of circuit evaluations. Our empirical results indicate that one can use
NES as a hybrid tool in tandem with other gradient-based methods for
optimization of deep quantum circuits in regions with vanishing gradients.
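The NES gradient estimator the abstract refers to can be sketched in a few lines. The snippet below is a minimal illustration, not the paper's implementation: a classical toy landscape (`energy`, a sum of cosines with its global minimum of -8 when every angle equals pi) stands in for a circuit expectation value, and the population size, smoothing width `sigma`, and learning rate are arbitrary illustrative choices.

```python
import numpy as np

def nes_gradient(cost, theta, sigma=0.1, pop_size=20, rng=None):
    """NES 'log-likelihood trick' gradient estimate: weight Gaussian
    perturbations by the cost observed at the perturbed points.
    Antithetic pairs (eps, -eps) and a mean baseline reduce variance."""
    rng = np.random.default_rng() if rng is None else rng
    eps = rng.standard_normal((pop_size, theta.size))
    eps = np.concatenate([eps, -eps])            # antithetic sampling
    fit = np.array([cost(theta + sigma * e) for e in eps])
    fit -= fit.mean()                            # baseline subtraction
    return (eps.T @ fit) / (len(eps) * sigma)

# Toy stand-in for a PQC energy landscape (NOT a real circuit):
def energy(theta):
    return float(np.sum(np.cos(theta)))

rng = np.random.default_rng(0)
theta = rng.uniform(0.0, 2.0 * np.pi, size=8)
for _ in range(300):
    theta = theta - 0.2 * nes_gradient(energy, theta, rng=rng)
print(energy(theta))
```

Because the estimate averages the cost over a smoothed neighbourhood of `theta` rather than relying on exact partial derivatives, its variance is controlled by the population size and `sigma`, which is what makes this kind of estimator usable in flat regions where analytic gradients vanish.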
Related papers
- Optimizing a parameterized controlled gate with Free Quaternion Selection [0.4353365283165517]
In this study, we propose an algorithm to estimate the optimal parameters for locally minimizing the cost value of a single-qubit gate.
To benchmark the performance, we apply the proposed method to various optimization problems, including the Variational Quantum Eigensolver (VQE) for Ising and molecular Hamiltonians.
arXiv Detail & Related papers (2024-09-20T14:46:00Z)
- Line Search Strategy for Navigating through Barren Plateaus in Quantum Circuit Training [0.0]
Variational quantum algorithms are viewed as promising candidates for demonstrating quantum advantage on near-term devices.
This work introduces a novel optimization method designed to alleviate the adverse effects of barren plateau (BP) problems during circuit training.
We have successfully applied our optimization strategy to quantum circuits comprising $16$ qubits and $15000$ entangling gates.
arXiv Detail & Related papers (2024-02-07T20:06:29Z)
- Parsimonious Optimisation of Parameters in Variational Quantum Circuits [1.303764728768944]
We propose a novel Quantum-Gradient Sampling that requires the execution of at most two circuits per iteration to update the optimisable parameters.
Our proposed method achieves similar convergence rates to classical gradient descent, and empirically outperforms gradient coordinate descent, and SPSA.
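For context, the classic two-evaluation method in this family is SPSA, one of the baselines the entry compares against. The sketch below shows a generic SPSA step with fixed gains, not the proposed Quantum-Gradient Sampling; the quadratic `cost` is an illustrative stand-in for a circuit evaluation.

```python
import numpy as np

def spsa_step(cost, theta, a=0.05, c=0.1, rng=None):
    """One SPSA iteration: a single random simultaneous perturbation
    yields a gradient estimate for ALL parameters from just two cost
    evaluations, independent of the parameter count."""
    rng = np.random.default_rng() if rng is None else rng
    delta = rng.choice([-1.0, 1.0], size=theta.size)   # Rademacher directions
    # For delta_i in {-1, +1}, dividing by delta_i equals multiplying by it.
    g_hat = (cost(theta + c * delta) - cost(theta - c * delta)) / (2.0 * c) * delta
    return theta - a * g_hat

# Toy quadratic cost with its minimum at all-ones (NOT a real circuit).
def cost(x):
    return float(np.sum((x - 1.0) ** 2))

rng = np.random.default_rng(2)
x = np.zeros(4)
for _ in range(500):
    x = spsa_step(cost, x, rng=rng)
print(cost(x))
```

The appeal of this family of methods is the constant per-iteration circuit count: unlike the parameter-shift rule, the cost of an update does not grow with the number of optimisable parameters.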
arXiv Detail & Related papers (2023-06-20T18:50:18Z)
- Variational Quantum Optimization with Multi-Basis Encodings [62.72309460291971]
We introduce a new variational quantum algorithm that benefits from two innovations: multi-basis graph complexity and nonlinear activation functions.
Our results show increased optimization performance, a two-fold increase in effective landscapes, and a reduction in measurement complexity.
arXiv Detail & Related papers (2021-06-24T20:16:02Z)
- Normalized Gradient Descent for Variational Quantum Algorithms [4.403985869332685]
Variational quantum algorithms (VQAs) are promising methods that leverage noisy quantum computers.
The normalized gradient descent (NGD) method, which employs the normalized gradient vector to update the parameters, has been successfully utilized in several optimization problems.
We propose a new NGD method that can attain faster convergence than the ordinary NGD.
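As a point of reference, the ordinary NGD update this entry builds on can be sketched as follows. This is a generic illustration on a toy quadratic, not the paper's proposed variant; the step size and iteration count are arbitrary.

```python
import numpy as np

def ngd_step(theta, grad, lr=0.05, eps=1e-12):
    """Ordinary normalized gradient descent: step along the *unit*
    gradient direction, so progress per step equals lr regardless of
    how small the raw gradient magnitude is (useful on flat landscapes)."""
    return theta - lr * grad / (np.linalg.norm(grad) + eps)

# Toy quadratic objective with its minimum at the origin.
def f(x):
    return float(x @ x)

def grad_f(x):
    return 2.0 * x

x = np.array([3.0, -4.0])        # start at distance 5 from the optimum
for _ in range(120):
    x = ngd_step(x, grad_f(x))
print(f(x))
```

The trade-off is visible in the toy run: the iterate travels at a constant speed toward the minimum but then oscillates within a ball of radius `lr` around it, which is why NGD variants typically shrink the step size near convergence.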
arXiv Detail & Related papers (2021-06-21T11:03:12Z)
- Zeroth-Order Hybrid Gradient Descent: Towards A Principled Black-Box Optimization Framework [100.36569795440889]
This work studies zeroth-order (ZO) optimization, which does not require first-order gradient information.
We show that with a graceful design in coordinate importance sampling, the proposed ZO optimization method is efficient both in terms of complexity and function query cost.
arXiv Detail & Related papers (2020-12-21T17:29:58Z)
- Adaptive pruning-based optimization of parameterized quantum circuits [62.997667081978825]
Variational hybrid quantum-classical algorithms are powerful tools to maximize the use of Noisy Intermediate Scale Quantum devices.
We propose a strategy for such ansatze used in variational quantum algorithms, which we call "Parameter-Efficient Circuit Training" (PECT).
Instead of optimizing all of the ansatz parameters at once, PECT launches a sequence of variational algorithms.
arXiv Detail & Related papers (2020-10-01T18:14:11Z)
- Channel-Directed Gradients for Optimization of Convolutional Neural Networks [50.34913837546743]
We introduce optimization methods for convolutional neural networks that can be used to improve existing gradient-based optimization in terms of generalization error.
We show that defining the gradients along the output channel direction leads to a performance boost, while other directions can be detrimental.
arXiv Detail & Related papers (2020-08-25T00:44:09Z)
- Cross Entropy Hyperparameter Optimization for Constrained Problem Hamiltonians Applied to QAOA [68.11912614360878]
Hybrid quantum-classical algorithms such as the Quantum Approximate Optimization Algorithm (QAOA) are considered among the most encouraging approaches for taking advantage of near-term quantum computers in practical applications.
Such algorithms are usually implemented in a variational form, combining a classical optimization method with a quantum machine to find good solutions to an optimization problem.
In this study we apply a Cross-Entropy method to shape this landscape, which allows the classical optimizer to find better parameters more easily and hence results in improved performance.
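A generic Cross-Entropy method loop of the kind this entry describes can be sketched as follows. The Gaussian sampling distribution, population size, elite fraction, and toy quadratic cost are illustrative assumptions, not the paper's QAOA setup.

```python
import numpy as np

def cross_entropy_minimize(cost, dim, iters=50, pop=64, elite_frac=0.2, rng=None):
    """Cross-Entropy method: sample candidates from a Gaussian, keep the
    lowest-cost 'elite' fraction, and refit the Gaussian to the elites,
    so the sampling distribution concentrates on promising regions."""
    rng = np.random.default_rng() if rng is None else rng
    mu, sigma = np.zeros(dim), np.ones(dim)
    n_elite = max(1, int(pop * elite_frac))
    for _ in range(iters):
        samples = mu + sigma * rng.standard_normal((pop, dim))
        costs = np.array([cost(s) for s in samples])
        elites = samples[np.argsort(costs)[:n_elite]]
        mu = elites.mean(axis=0)
        sigma = elites.std(axis=0) + 1e-3    # floor keeps exploration alive
    return mu

# Toy quadratic landscape standing in for a QAOA parameter cost.
target = np.array([0.7, -1.2])
best = cross_entropy_minimize(lambda x: float(np.sum((x - target) ** 2)), dim=2,
                              rng=np.random.default_rng(1))
print(best)
```

Like NES, this is a distribution-level search: it never queries a gradient, which is why it remains informative on landscapes where pointwise derivatives are exponentially small.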
arXiv Detail & Related papers (2020-03-11T13:52:41Z)
- Adaptivity of Stochastic Gradient Methods for Nonconvex Optimization [71.03797261151605]
Adaptivity is an important yet under-studied property in modern optimization theory.
Our algorithm is proved to achieve the best-available convergence rates for non-PL objectives while simultaneously outperforming existing algorithms for PL objectives.
arXiv Detail & Related papers (2020-02-13T05:42:27Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences of its use.