A Recursive Lower Bound on the Energy Improvement of the Quantum Approximate Optimization Algorithm
- URL: http://arxiv.org/abs/2405.10125v2
- Date: Mon, 18 Nov 2024 14:13:35 GMT
- Title: A Recursive Lower Bound on the Energy Improvement of the Quantum Approximate Optimization Algorithm
- Authors: Raimel A. Medina, Maksym Serbyn
- Abstract summary: We use an analytic expansion of the cost function around transition states to gain insights into the deep QAOA.
Our numerical study confirms the accuracy of our approximations, and reveals that the obtained bound and the true value of the QAOA cost function gain have a characteristic exponential decrease with the number of layers $p$.
- Abstract: The Quantum Approximate Optimization Algorithm (QAOA) uses a quantum computer to implement a variational method with $2p$ layers of alternating unitary operators, optimized by a classical computer to minimize a cost function. While rigorous performance guarantees exist for the QAOA at small depths $p$, the behavior at large depths remains less clear, though simulations suggest exponentially fast convergence for certain problems. In this work, we gain insights into the deep QAOA using an analytic expansion of the cost function around transition states. Transition states are constructed recursively: from a local minimum of the QAOA with $p$ layers we obtain transition states of the QAOA with $p+1$ layers, which are stationary points characterized by a unique direction of negative curvature. We construct an analytic estimate of the negative curvature and the corresponding direction in parameter space at each transition state. Expansion of the QAOA cost function along the negative direction to quartic order gives a lower bound on the improvement of the QAOA cost function. We provide physical intuition behind the analytic expressions for the local curvature and quartic expansion coefficient. Our numerical study confirms the accuracy of our approximations, and reveals that the obtained bound and the true value of the QAOA cost function gain have a characteristic exponential decrease with the number of layers $p$, with the bound decreasing more rapidly. Our study establishes an analytical method for recursively studying the QAOA applicable in the regime of high circuit depth.
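To make the quartic-expansion argument concrete, here is a minimal numerical sketch (my own illustration, not code from the paper). It models the cost along the negative-curvature direction as $f(x) = \tfrac{a}{2}x^2 + \tfrac{b}{4}x^4$ with curvature $a < 0$ and quartic coefficient $b > 0$ (an illustrative normalization; the paper's coefficients may differ), for which the model minimum $-a^2/(4b)$ lower-bounds the achievable energy gain.

```python
import numpy as np

def quartic_gain_bound(curvature, quartic_coeff):
    """Lower bound on the energy decrease along a negative-curvature direction.

    Models the cost along the unstable direction as
        f(x) = 0.5 * a * x**2 + 0.25 * b * x**4,   a < 0, b > 0,
    whose minimum is f_min = -a**2 / (4 * b); the guaranteed gain is |f_min|.
    """
    a, b = curvature, quartic_coeff
    assert a < 0 and b > 0, "need negative curvature and a stabilizing quartic term"
    x_star = np.sqrt(-a / b)   # location of the quartic-model minimum
    gain = a**2 / (4 * b)      # energy improvement guaranteed by the model
    return x_star, gain

# Illustrative numbers only
x_star, gain = quartic_gain_bound(-0.08, 0.5)
print(f"step x* = {x_star:.3f}, guaranteed gain = {gain:.5f}")
```

In this convention, a curvature $a$ that decays exponentially in $p$ makes the bound $a^2/(4b)$ decay at roughly twice the rate, qualitatively matching the abstract's observation that the bound shrinks faster than the true gain.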
Related papers
- A Sample Efficient Alternating Minimization-based Algorithm For Robust Phase Retrieval [56.67706781191521]
In this work, we study a robust phase retrieval problem, where the task is to recover an unknown signal from magnitude-only measurements corrupted by outliers.
Our proposed approach avoids the need for computationally expensive spectral initialization, relying instead on a simple gradient step and explicit handling of outliers.
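As background for the alternating-minimization idea, a generic sketch under my own assumptions (random Gaussian measurements, no outliers, random rather than spectral initialization; this is not the paper's algorithm): alternate between fixing the unknown signs of the linear measurements and solving the resulting least-squares problem.

```python
import numpy as np

rng = np.random.default_rng(3)
n, m = 20, 200
A = rng.normal(size=(m, n))
x_true = rng.normal(size=n)
y = np.abs(A @ x_true)          # magnitude-only measurements

x = rng.normal(size=n)          # random (not spectral) initialization
for t in range(50):
    s = np.sign(A @ x)                                   # step 1: fix signs from the current estimate
    x, *_ = np.linalg.lstsq(A, s * y, rcond=None)        # step 2: least-squares signal update

# Recovery is only up to a global sign; error is small when the alternation converges
err = min(np.linalg.norm(x - x_true), np.linalg.norm(x + x_true))
print(err / np.linalg.norm(x_true))
```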
arXiv Detail & Related papers (2024-09-07T06:37:23Z) - Efficient Quantum Gradient and Higher-order Derivative Estimation via Generalized Hadamard Test [2.5545813981422882]
Gradient-based methods are crucial for understanding the behavior of parameterized quantum circuits (PQCs).
Existing gradient estimation methods, such as Finite Difference, Shift Rule, Hadamard Test, and Direct Hadamard Test, often yield suboptimal gradient circuits for certain PQCs.
We introduce the Flexible Hadamard Test, which, when applied to first-order gradient estimation methods, can invert the roles of ansatz generators and observables.
We also introduce Quantum Automatic Differentiation (QAD), a unified gradient method that adaptively selects the best gradient estimation technique for individual parameters within a PQC.
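For context on the baseline estimators the summary lists, here is a minimal parameter-shift gradient on a one-qubit toy circuit (the standard shift rule for generators with eigenvalues $\pm 1/2$; not the paper's Flexible Hadamard Test or QAD).

```python
import numpy as np

Z = np.diag([1.0, -1.0])

def ry(theta):
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def energy(theta):
    """<0| RY(theta)^dag Z RY(theta) |0> = cos(theta)."""
    psi = ry(theta) @ np.array([1.0, 0.0])
    return float(psi.conj() @ Z @ psi)

def parameter_shift_grad(theta):
    # Exact gradient for generators with eigenvalues +-1/2, e.g. RY = exp(-i theta Y / 2)
    return 0.5 * (energy(theta + np.pi / 2) - energy(theta - np.pi / 2))

theta = 0.7
print(parameter_shift_grad(theta), -np.sin(theta))   # both ~ -0.6442
```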
arXiv Detail & Related papers (2024-08-10T02:08:54Z) - Linearly simplified QAOA parameters and transferability [0.6834295298053009]
The Quantum Approximate Optimization Algorithm (QAOA) provides a way to solve optimization problems using quantum computers.
We present some numerical results that are obtained for instances of the random Ising model and of the max-cut problem.
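A hypothetical sketch of what a linearly simplified parametrization can look like (my own illustrative endpoints, not values from the paper): each layer's angles are a linear function of the layer index, so only two scalars are optimized instead of $2p$ angles.

```python
import numpy as np

def linear_qaoa_schedule(p, gamma_max=0.8, beta_max=0.6):
    """Linear-ramp ansatz: gammas grow, betas shrink with layer index.

    Reduces the 2p QAOA angles to two scalars (gamma_max, beta_max),
    mimicking a discretized annealing path. Endpoint values here are
    illustrative assumptions, not taken from the paper.
    """
    k = np.arange(1, p + 1)
    gammas = gamma_max * k / p            # ramps up to gamma_max
    betas = beta_max * (1 - (k - 1) / p)  # ramps down from beta_max
    return gammas, betas

gammas, betas = linear_qaoa_schedule(p=5)
print(np.round(gammas, 3), np.round(betas, 3))
```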
arXiv Detail & Related papers (2024-05-01T17:34:32Z) - A Mean-Field Analysis of Neural Stochastic Gradient Descent-Ascent for Functional Minimax Optimization [90.87444114491116]
This paper studies minimax optimization problems defined over infinite-dimensional function classes of overparametricized two-layer neural networks.
We address (i) the convergence of the gradient descent-ascent algorithm and (ii) the representation learning of the neural networks.
Results show that the feature representation induced by the neural networks is allowed to deviate from the initial one by the magnitude of $O(\alpha^{-1})$, measured in terms of the Wasserstein distance.
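For readers unfamiliar with the algorithm in point (i), a minimal gradient descent-ascent loop on a toy quadratic minimax problem (a generic sketch, not the paper's mean-field setting):

```python
import numpy as np

# Toy minimax: min_x max_y f(x, y) = 0.5*x**2 + x*y - 0.5*y**2
def grad_x(x, y): return x + y
def grad_y(x, y): return x - y

x, y, lr = 1.0, -0.5, 0.1
for t in range(200):
    gx, gy = grad_x(x, y), grad_y(x, y)   # simultaneous gradients
    x -= lr * gx                          # descent on the min player
    y += lr * gy                          # ascent on the max player
print(x, y)   # both approach the saddle point (0, 0)
```

The strongly convex-concave quadratic terms are what make simultaneous gradient descent-ascent converge here; on a purely bilinear objective the same loop spirals away from the saddle.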
arXiv Detail & Related papers (2024-04-18T16:46:08Z) - Trainability Barriers in Low-Depth QAOA Landscapes [0.0]
Quantum Alternating Operator Ansatz (QAOA) is a prominent variational quantum algorithm for solving optimization problems.
Previous results have given analytical performance guarantees for a small, fixed number of parameters.
We study the difficulty of training in the intermediate regime, which is the focus of most current numerical studies.
arXiv Detail & Related papers (2024-02-15T18:45:30Z) - Recursive greedy initialization of the quantum approximate optimization
algorithm with guaranteed improvement [1.720510639137902]
Quantum approximate optimization algorithm (QAOA) is a variational quantum algorithm, where a quantum computer implements a variational ansatz consisting of $p$ layers of alternating unitary operators.
We present an analytic construction of $2p+1$ transition states for QAOA with $p+1$ layers, built from a local minimum of QAOA with $p$ layers.
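A sketch of how such transition states can be enumerated, assuming the construction pads the depth-$p$ minimum with a zero layer (my reading of the construction; the exact placement pattern should be checked against the paper):

```python
import numpy as np

def transition_state_candidates(gammas, betas):
    """Pad a depth-p local minimum with one zero layer.

    Inserting (gamma, beta) = (0, 0) leaves the circuit's output state
    unchanged, so each candidate keeps the energy of the p-layer minimum
    while sitting at a stationary point of the (p+1)-layer landscape.
    Assumed placement: the zero beta goes in the same or the preceding
    slot as the zero gamma, giving 2p+1 candidates.
    """
    p = len(gammas)
    candidates = []
    for i in range(p + 1):          # slot for the zero gamma
        for j in (i - 1, i):        # slot for the zero beta
            if 0 <= j <= p:
                g = np.insert(gammas, i, 0.0)
                b = np.insert(betas, j, 0.0)
                candidates.append((g, b))
    return candidates

cands = transition_state_candidates(np.array([0.4, 0.7]), np.array([0.5, 0.2]))
print(len(cands))   # 2p + 1 = 5 for p = 2
```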
arXiv Detail & Related papers (2022-09-02T16:40:21Z) - Differentiable Annealed Importance Sampling and the Perils of Gradient
Noise [68.44523807580438]
Annealed importance sampling (AIS) and related algorithms are highly effective tools for marginal likelihood estimation.
Differentiability is a desirable property as it would admit the possibility of optimizing marginal likelihood as an objective.
We propose a differentiable algorithm by abandoning Metropolis-Hastings steps, which further unlocks mini-batch computation.
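A minimal AIS estimator with unadjusted Langevin transitions, i.e. with the Metropolis-Hastings correction dropped as the summary describes (a toy 1-D Gaussian example of my own, not the paper's differentiable estimator):

```python
import numpy as np

rng = np.random.default_rng(0)

def log_f0(x): return -0.5 * x**2            # unnormalized N(0, 1)
def log_f1(x): return -0.5 * (x - 2.0)**2    # unnormalized N(2, 1)
def log_ft(x, b): return (1 - b) * log_f0(x) + b * log_f1(x)
def grad_log_ft(x, b): return -(1 - b) * x - b * (x - 2.0)

n_particles, n_steps, step = 5000, 100, 0.1
betas = np.linspace(0.0, 1.0, n_steps + 1)

x = rng.normal(size=n_particles)             # exact samples from f0
log_w = np.zeros(n_particles)
for t in range(n_steps):
    b_old, b_new = betas[t], betas[t + 1]
    log_w += log_ft(x, b_new) - log_ft(x, b_old)   # AIS incremental weights
    # Unadjusted Langevin transition targeting f_{b_new}: no MH accept/reject,
    # which keeps every operation differentiable.
    x = x + step * grad_log_ft(x, b_new) + np.sqrt(2 * step) * rng.normal(size=n_particles)

print(np.exp(log_w).mean())   # estimates Z1/Z0, which equals 1 here
```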
arXiv Detail & Related papers (2021-07-21T17:10:14Z) - Quantum annealing initialization of the quantum approximate optimization
algorithm [0.0]
Quantum approximate optimization algorithm (QAOA) is a prospective near-term quantum algorithm.
The external parameter optimization required in QAOA could become a performance bottleneck.
In this work we visualize the optimization landscape of the QAOA applied to the MaxCut problem on random graphs.
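A self-contained sketch of such a landscape scan for $p = 1$ MaxCut QAOA on a 4-node ring (my own toy setup; the paper uses random graphs and visualization tooling not reproduced here):

```python
import itertools
import numpy as np

edges = [(0, 1), (1, 2), (2, 3), (3, 0)]    # 4-node ring; MaxCut optimum = 4
n = 4

# Diagonal of the cost operator: number of cut edges per bitstring
bits = np.array(list(itertools.product([0, 1], repeat=n)))
cost = np.array([sum(b[i] != b[j] for i, j in edges) for b in bits], dtype=float)

def qaoa_energy(gamma, beta):
    psi = np.full(2**n, 2**(-n / 2), dtype=complex)      # |+>^n
    psi = np.exp(-1j * gamma * cost) * psi               # phase separator
    rx = np.array([[np.cos(beta), -1j * np.sin(beta)],
                   [-1j * np.sin(beta), np.cos(beta)]])  # e^{-i beta X}
    mixer = rx
    for _ in range(n - 1):
        mixer = np.kron(mixer, rx)                       # X-mixer on every qubit
    psi = mixer @ psi
    return np.real(psi.conj() @ (cost * psi))            # expected cut value

# Coarse grid scan of the p = 1 landscape
gs = np.linspace(0, np.pi, 25)
bs = np.linspace(0, np.pi / 2, 25)
landscape = np.array([[qaoa_energy(g, b) for b in bs] for g in gs])
g_i, b_i = np.unravel_index(landscape.argmax(), landscape.shape)
print(f"best <C> = {landscape.max():.3f} at gamma={gs[g_i]:.2f}, beta={bs[b_i]:.2f}")
```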
arXiv Detail & Related papers (2021-01-14T17:45:13Z) - Zeroth-Order Hybrid Gradient Descent: Towards A Principled Black-Box
Optimization Framework [100.36569795440889]
This work studies zeroth-order (ZO) optimization, which does not require first-order gradient information.
We show that with a graceful design in coordinate importance sampling, the proposed ZO optimization method is efficient both in terms of iteration complexity and function query cost.
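A minimal two-point random-direction ZO gradient estimator (a generic sketch; the paper's coordinate importance sampling and hybrid design are not reproduced):

```python
import numpy as np

rng = np.random.default_rng(1)

def f(x):                          # black-box objective: only function values available
    return np.sum((x - 1.0)**2)

def zo_gradient(f, x, mu=1e-4, n_dirs=20):
    """Two-point random-direction estimate of grad f using 2 * n_dirs queries."""
    d = len(x)
    g = np.zeros(d)
    for _ in range(n_dirs):
        u = rng.normal(size=d)
        g += (f(x + mu * u) - f(x - mu * u)) / (2 * mu) * u
    return g / n_dirs

x = np.zeros(5)
for t in range(300):
    x -= 0.05 * zo_gradient(f, x)  # plain ZO gradient descent
print(np.round(x, 3))              # approaches the minimizer (1, ..., 1)
```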
arXiv Detail & Related papers (2020-12-21T17:29:58Z) - Faster Convergence of Stochastic Gradient Langevin Dynamics for
Non-Log-Concave Sampling [110.88857917726276]
We provide a new convergence analysis of stochastic gradient Langevin dynamics (SGLD) for sampling from a class of distributions that can be non-log-concave.
At the core of our approach is a novel conductance analysis of SGLD using an auxiliary time-reversible Markov Chain.
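A minimal SGLD loop on a non-log-concave (double-well) target, with injected gradient noise standing in for minibatching (a toy sketch of my own, not the paper's analysis setting):

```python
import numpy as np

rng = np.random.default_rng(2)

def grad_U(x):
    # Double-well potential U(x) = 0.25 * (x**2 - 4)**2 with modes at x = +-2
    return x * (x**2 - 4)

def sgld(n_iter=200_000, step=1e-2, noise_scale=0.5):
    x, samples = 0.1, []
    for t in range(n_iter):
        # "Stochastic" gradient: exact gradient plus noise, standing in for minibatching
        g = grad_U(x) + noise_scale * rng.normal()
        x = x - step * g + np.sqrt(2 * step) * rng.normal()
        samples.append(x)
    return np.array(samples)

s = sgld()
print((s > 0).mean())   # roughly 0.5: the chain visits both modes at +-2
```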
arXiv Detail & Related papers (2020-10-19T15:23:18Z) - Towards Better Understanding of Adaptive Gradient Algorithms in
Generative Adversarial Nets [71.05306664267832]
Adaptive algorithms perform gradient updates using the history of gradients and are ubiquitous in training deep neural networks.
In this paper we analyze a variant of the Optimistic Adagrad (OAdagrad) algorithm for nonconvex-nonconcave min-max problems.
Our experiments show that the advantage of adaptive over non-adaptive gradient algorithms in GAN training can be observed empirically.
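For intuition on why optimism helps in min-max training, a minimal optimistic gradient update on a bilinear game (a generic sketch; not the paper's OAdagrad variant, which additionally uses Adagrad-style step sizes):

```python
import numpy as np

# Bilinear game f(x, y) = x * y: plain simultaneous GDA spirals outward,
# while the optimistic update (lookahead via the previous gradient) converges.
x, y, lr = 1.0, 1.0, 0.1
gx_prev, gy_prev = y, x
for t in range(500):
    gx, gy = y, x                    # grad_x f and grad_y f at the current point
    x -= lr * (2 * gx - gx_prev)     # optimistic descent step
    y += lr * (2 * gy - gy_prev)     # optimistic ascent step
    gx_prev, gy_prev = gx, gy
print(x, y)   # both shrink toward the equilibrium (0, 0)
```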
arXiv Detail & Related papers (2019-12-26T22:10:10Z)