Reducing measurement costs by recycling the Hessian in adaptive variational quantum algorithms
- URL: http://arxiv.org/abs/2401.05172v2
- Date: Mon, 18 Nov 2024 18:38:37 GMT
- Title: Reducing measurement costs by recycling the Hessian in adaptive variational quantum algorithms
- Authors: Mafalda Ramôa, Luis Paulo Santos, Nicholas J. Mayhall, Edwin Barnes, Sophia E. Economou
- Abstract summary: We propose an improved quasi-Newton optimization protocol specifically tailored to adaptive VQAs.
We implement a quasi-Newton algorithm where an approximation to the inverse Hessian matrix is continuously built and grown across the iterations of an adaptive VQA.
- Abstract: Adaptive protocols enable the construction of more efficient state preparation circuits in variational quantum algorithms (VQAs) by utilizing data obtained from the quantum processor during the execution of the algorithm. This idea originated with ADAPT-VQE, an algorithm that iteratively grows the state preparation circuit operator by operator, with each new operator accompanied by a new variational parameter, and where all parameters acquired thus far are optimized in each iteration. In ADAPT-VQE and other adaptive VQAs that followed it, it has been shown that initializing parameters to their optimal values from the previous iteration speeds up convergence and avoids shallow local traps in the parameter landscape. However, no other data from the optimization performed at one iteration is carried over to the next. In this work, we propose an improved quasi-Newton optimization protocol specifically tailored to adaptive VQAs. The distinctive feature in our proposal is that approximate second derivatives of the cost function are recycled across iterations in addition to parameter values. We implement a quasi-Newton optimizer where an approximation to the inverse Hessian matrix is continuously built and grown across the iterations of an adaptive VQA. The resulting algorithm has the flavor of a continuous optimization where the dimension of the search space is augmented when the gradient norm falls below a given threshold. We show that this inter-optimization exchange of second-order information leads the Hessian estimate carried in the optimizer state to better approximate the exact Hessian. As a result, our method achieves a superlinear convergence rate even in situations where the typical quasi-Newton optimizer converges only linearly. Our protocol decreases the measurement costs in implementing adaptive VQAs on quantum hardware as well as the runtime of their classical simulation.
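The core idea lends itself to a compact sketch. Below is a minimal illustration, not the authors' implementation: a textbook BFGS optimizer is warm-started at each adaptive iteration with the inverse Hessian recycled from the previous one, padded with an identity row and column for the new parameter. The growing quadratic cost is a toy stand-in for a VQA energy, and the identity padding is our assumption for illustration.

```python
import numpy as np

def backtracking(cost, x, p, g, alpha=1.0, beta=0.5, c=1e-4):
    """Armijo backtracking line search."""
    while cost(x + alpha * p) > cost(x) + c * alpha * (g @ p):
        alpha *= beta
    return alpha

def bfgs(cost, grad, x0, H0, gtol=1e-8, max_iter=100):
    """Plain BFGS, warm-started with a recycled inverse Hessian H0."""
    x, H = x0.copy(), H0.copy()
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < gtol:
            break
        p = -H @ g                                   # quasi-Newton direction
        s = backtracking(cost, x, p, g) * p
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        if y @ s > 1e-12:                            # curvature condition
            rho = 1.0 / (y @ s)
            I = np.eye(x.size)
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x, H

# Toy stand-in for an adaptive VQA: the cost gains one parameter per outer
# iteration (a growing quadratic with a fixed SPD "exact Hessian" A).
rng = np.random.default_rng(0)
M = rng.standard_normal((6, 6))
A, b = M @ M.T + 6 * np.eye(6), rng.standard_normal(6)

x, H = np.zeros(0), np.zeros((0, 0))
for n in range(1, 7):
    # Grow the ansatz: the new parameter starts at 0; the recycled inverse
    # Hessian is embedded in a larger matrix, with an identity entry for the
    # new row/column (our illustrative choice).
    x = np.append(x, 0.0)
    H_grown = np.eye(n)
    H_grown[: n - 1, : n - 1] = H
    An, bn = A[:n, :n], b[:n]
    x, H = bfgs(lambda v: 0.5 * v @ An @ v - bn @ v,
                lambda v: An @ v - bn, x, H_grown)
print("recycled-Hessian solution:", x)
```

By the last outer iteration, the recycled matrix already encodes curvature for all previously added directions, which is what lets the optimizer behave like one continuous quasi-Newton run over a growing search space.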
Related papers
- Accelerated First-Order Optimization under Nonlinear Constraints [73.2273449996098]
We exploit analogies between first-order algorithms for constrained optimization and non-smooth dynamical systems to design a new class of accelerated first-order algorithms.
An important property of these algorithms is that constraints are expressed in terms of velocities instead of positions.
arXiv Detail & Related papers (2023-02-01T08:50:48Z)
- Iterative-Free Quantum Approximate Optimization Algorithm Using Neural Networks [20.051757447006043]
We propose a practical method that uses a simple, fully connected neural network to find better parameters tailored to a new given problem instance.
Our method is consistently the fastest to converge while also yielding the best final result.
arXiv Detail & Related papers (2022-08-21T14:05:11Z)
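A minimal sketch of the idea in the paper above: a small fully connected network maps hand-chosen features of a problem instance directly to QAOA angles, so no per-instance iterative optimization loop is needed at deployment. The features, layer sizes, and training target (regression onto good angles found for previously solved instances) are illustrative assumptions, not the paper's exact setup.

```python
import numpy as np

def init_mlp(n_in, n_hidden, n_out, seed=0):
    """Small fully connected net: instance features -> 2p QAOA angles."""
    rng = np.random.default_rng(seed)
    return (0.1 * rng.standard_normal((n_in, n_hidden)), np.zeros(n_hidden),
            0.1 * rng.standard_normal((n_hidden, n_out)), np.zeros(n_out))

def predict_angles(features, params, p=2):
    """One forward pass; in practice the net would first be trained by
    regression onto angles found for previously solved instances."""
    W1, b1, W2, b2 = params
    h = np.tanh(features @ W1 + b1)     # hidden layer
    out = h @ W2 + b2                   # 2p outputs
    return out[:p], out[p:]             # (gammas, betas) for a depth-p circuit

# Hypothetical instance features for a MaxCut graph: [n_nodes, edge_density].
params = init_mlp(n_in=2, n_hidden=16, n_out=4)
gammas, betas = predict_angles(np.array([12.0, 0.3]), params)
print("predicted angles:", gammas, betas)
```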
- Performance comparison of optimization methods on variational quantum algorithms [2.690135599539986]
Variational quantum algorithms (VQAs) offer a promising path towards using near-term quantum hardware for applications in academic and industrial research.
We study the performance of four commonly used optimization methods: SLSQP, COBYLA, CMA-ES, and SPSA.
arXiv Detail & Related papers (2021-11-26T12:13:20Z)
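Of the optimizers named above, SPSA is the one most tailored to noisy quantum cost functions; here is a minimal textbook implementation (standard gain schedules, not the paper's exact hyperparameters). It estimates the gradient from just two cost evaluations per iteration, independent of the number of parameters.

```python
import numpy as np

def spsa(cost, x0, n_iter=200, a=0.2, c=0.1, alpha=0.602, gamma=0.101, seed=0):
    """Minimal SPSA: two cost evaluations per step, any number of parameters."""
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    for k in range(1, n_iter + 1):
        ak = a / k ** alpha                           # step-size gain schedule
        ck = c / k ** gamma                           # perturbation schedule
        delta = rng.choice([-1.0, 1.0], size=x.size)  # Rademacher directions
        g_hat = (cost(x + ck * delta) - cost(x - ck * delta)) / (2.0 * ck) * delta
        x -= ak * g_hat
    return x

# Toy usage on a noisy quadratic, mimicking shot noise in a VQA cost:
noise = np.random.default_rng(1)
cost = lambda v: np.sum((v - 1.0) ** 2) + 0.01 * noise.standard_normal()
print("SPSA estimate:", spsa(cost, np.zeros(4)))
```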
- Quantum Approximate Optimization Algorithm with Adaptive Bias Fields [4.03537866744963]
The quantum approximate optimization algorithm (QAOA) transforms a simple many-qubit wavefunction into one which encodes a solution to a difficult classical optimization problem.
In this paper, the QAOA is modified by updating the operators themselves to include local fields, using information from the measured wavefunction at the end of one step to improve the operators at later steps.
arXiv Detail & Related papers (2021-05-25T13:51:09Z)
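A sketch of the classical update at the heart of the modification described above: local bias fields in the QAOA operators are nudged toward the single-qubit <Z> expectation values measured at the end of a step. The relaxation-style rule and learning rate below are illustrative assumptions rather than the paper's exact prescription.

```python
import numpy as np

def update_bias_fields(h, z_measured, lr=0.5):
    """Relax each local field h_j toward the measured <Z_j> (illustrative rule)."""
    return h + lr * (np.asarray(z_measured) - h)

# Usage: after measuring the state produced by one step of the circuit,
# feed the single-qubit <Z> expectations back into the operators.
h = np.zeros(4)                                  # one local field per qubit
z_measured = np.array([0.8, -0.3, 0.1, -0.9])    # hypothetical measured <Z_j>
h = update_bias_fields(h, z_measured)
print("updated bias fields:", h)
```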
- Bilevel Optimization: Convergence Analysis and Enhanced Design [63.64636047748605]
Bilevel optimization is a powerful tool for many machine learning problems, such as meta-learning and hyperparameter optimization.
We propose a novel stochastic bilevel optimizer, named stocBiO, which features a sample-efficient hypergradient estimator.
arXiv Detail & Related papers (2020-10-15T18:09:48Z)
- Adaptive pruning-based optimization of parameterized quantum circuits [62.997667081978825]
Variational hybrid quantum-classical algorithms are powerful tools to maximize the use of Noisy Intermediate-Scale Quantum (NISQ) devices.
We propose a strategy for such ansätze used in variational quantum algorithms, which we call "Parameter-Efficient Circuit Training" (PECT).
Instead of optimizing all of the ansatz parameters at once, PECT launches a sequence of variational algorithms.
arXiv Detail & Related papers (2020-10-01T18:14:11Z)
- Convergence of adaptive algorithms for weakly convex constrained optimization [59.36386973876765]
We prove the $\tilde{\mathcal{O}}(t^{-1/4})$ rate of convergence for the norm of the gradient of the Moreau envelope.
Our analysis works with mini-batch size of $1$, constant first- and second-order moment parameters, and possibly unbounded optimization domains.
arXiv Detail & Related papers (2020-06-11T17:43:19Z)
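For reference, the stationarity measure in the rate quoted above is the gradient norm of the Moreau envelope; its standard textbook definition (not specific to the paper) is:

```latex
% Moreau envelope of f with parameter \lambda > 0, and its gradient
% expressed via the proximal operator.
\[
  f_{\lambda}(x) = \min_{y} \left\{ f(y) + \frac{1}{2\lambda}\|y - x\|^{2} \right\},
  \qquad
  \nabla f_{\lambda}(x) = \frac{1}{\lambda}\bigl(x - \operatorname{prox}_{\lambda f}(x)\bigr).
\]
```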
- Cross Entropy Hyperparameter Optimization for Constrained Problem Hamiltonians Applied to QAOA [68.11912614360878]
Hybrid quantum-classical algorithms such as the Quantum Approximate Optimization Algorithm (QAOA) are considered among the most encouraging approaches for taking advantage of near-term quantum computers in practical applications.
Such algorithms are usually implemented in a variational form, combining a classical optimization method with a quantum machine to find good solutions to an optimization problem.
In this study we apply a Cross-Entropy method to shape the energy landscape, which allows the classical optimizer to find better parameters more easily and hence results in improved performance.
arXiv Detail & Related papers (2020-03-11T13:52:41Z)
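The Cross-Entropy method referenced above admits a very short generic implementation. The Gaussian sampling model, elite fraction, and toy objective below are standard textbook choices, not the paper's specific setup for QAOA hyperparameters.

```python
import numpy as np

def cross_entropy_minimize(cost, mu, sigma, n_samples=64, elite_frac=0.125,
                           n_rounds=30, seed=0):
    """Generic CEM: sample, keep the elite fraction, refit the Gaussian."""
    rng = np.random.default_rng(seed)
    n_elite = max(1, int(n_samples * elite_frac))
    for _ in range(n_rounds):
        samples = rng.normal(mu, sigma, size=(n_samples, mu.size))
        scores = np.array([cost(s) for s in samples])
        elites = samples[np.argsort(scores)[:n_elite]]  # lowest-cost samples
        mu, sigma = elites.mean(axis=0), elites.std(axis=0) + 1e-6
    return mu

# Toy usage: minimize a shifted quadratic over two "QAOA angles".
print(cross_entropy_minimize(lambda v: np.sum((v - 0.7) ** 2),
                             mu=np.zeros(2), sigma=np.ones(2)))
```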
- Proximal Gradient Algorithm with Momentum and Flexible Parameter Restart for Nonconvex Optimization [73.38702974136102]
Various types of parameter restart schemes have been proposed for accelerated algorithms to facilitate their practical convergence rates.
In this paper, we propose an accelerated proximal gradient algorithm with parameter restart for solving nonconvex and nonsmooth problems.
arXiv Detail & Related papers (2020-02-26T16:06:27Z)
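To illustrate the family of methods the paper above belongs to, here is a generic accelerated proximal gradient loop with an adaptive momentum restart on a LASSO toy problem. This is the standard O'Donoghue-Candès-style restart template, not the authors' specific scheme.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def apg_restart(grad_g, prox_h, x0, step, n_iter=300):
    x_prev, y, t = x0.copy(), x0.copy(), 1.0
    for _ in range(n_iter):
        x = prox_h(y - step * grad_g(y), step)       # proximal gradient step
        if (y - x) @ (x - x_prev) > 0:               # momentum opposes descent:
            y, t = x.copy(), 1.0                     # restart the momentum
        else:
            t_next = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
            y = x + ((t - 1.0) / t_next) * (x - x_prev)
            t = t_next
        x_prev = x
    return x_prev

# Toy LASSO: min_x 0.5 * ||A x - b||^2 + lam * ||x||_1
rng = np.random.default_rng(0)
A, b, lam = rng.standard_normal((20, 10)), rng.standard_normal(20), 0.1
x = apg_restart(lambda v: A.T @ (A @ v - b),
                lambda v, s: soft_threshold(v, s * lam),
                np.zeros(10), step=1.0 / np.linalg.norm(A, 2) ** 2)
print("LASSO solution:", np.round(x, 3))
```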
- Accelerating Quantum Approximate Optimization Algorithm using Machine Learning [6.735657356113614]
We propose a machine learning based approach to accelerate quantum approximate optimization algorithm (QAOA) implementation.
QAOA is a hybrid quantum-classical algorithm intended to demonstrate so-called quantum supremacy.
We show that the proposed approach can curtail the number of optimization iterations by up to 65.7%, based on an analysis performed with 264 flavors of graphs.
arXiv Detail & Related papers (2020-02-04T02:21:00Z)