Adiabatic-Passage-Based Parameter Setting for Quantum Approximate
Optimization Algorithm
- URL: http://arxiv.org/abs/2312.00077v3
- Date: Mon, 22 Jan 2024 07:15:05 GMT
- Title: Adiabatic-Passage-Based Parameter Setting for Quantum Approximate
Optimization Algorithm
- Authors: Mingyou Wu, Hanwu Chen
- Abstract summary: We propose a novel adiabatic-passage-based parameter setting method.
This method remarkably reduces the optimization cost, specifically when applied to the 3-SAT problem, to a sublinear level.
- Score: 0.7252027234425334
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The Quantum Approximate Optimization Algorithm (QAOA) exhibits significant
potential for tackling combinatorial optimization problems. Despite its promise
for near-term quantum devices, a major challenge in applying QAOA lies in the
cost of circuit runs associated with parameter optimization. Existing methods
for parameter setting generally incur at least a superlinear cost concerning
the depth p of QAOA. In this study, we propose a novel adiabatic-passage-based
parameter setting method that remarkably reduces the optimization cost,
specifically when applied to the 3-SAT problem, to a sublinear level. Beginning
with an analysis of the random model of the specific problem, this method
applies a problem-dependent preprocessing on the problem Hamiltonian
analytically, effectively segregating the magnitude of parameters from the
scale of the problem. Consequently, a problem-independent initialization is
achieved without incurring any optimization cost or pre-computation.
Furthermore, the parameter space is adjusted based on the continuity of the
optimal adiabatic passage, resulting in a reduction in the disparity of
parameters between adjacent layers of QAOA. By leveraging this continuity, the
cost to find quasi-optimal parameters is significantly reduced to a sublinear
level.
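As a concrete illustration of the adiabatic-passage idea, the sketch below initializes QAOA parameters from a discretized linear annealing ramp and evaluates the resulting state on a toy 4-node Max-Cut instance. This is a minimal, hypothetical example of the general adiabatic-inspired warm start, not the paper's exact preprocessing or schedule; the graph, depth, and total annealing time are assumptions.

```python
# Hedged sketch: QAOA parameters set from a discretized adiabatic passage
# (linear ramp), evaluated on a hypothetical 4-node ring Max-Cut instance.
import numpy as np

n = 4                                      # qubits
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]   # ring graph (illustrative)

# Diagonal of the Max-Cut cost Hamiltonian in the computational basis.
cost = np.zeros(2 ** n)
for z in range(2 ** n):
    bits = [(z >> q) & 1 for q in range(n)]
    cost[z] = sum(bits[i] != bits[j] for i, j in edges)

def qaoa_state(gammas, betas):
    """Apply p layers of e^{-i beta B} e^{-i gamma C} to |+...+>."""
    state = np.full(2 ** n, 2 ** (-n / 2), dtype=complex)
    for gamma, beta in zip(gammas, betas):
        # Phase separation: diagonal cost Hamiltonian.
        state = np.exp(-1j * gamma * cost) * state
        # Mixer: e^{-i beta X} applied to every qubit.
        rx = np.array([[np.cos(beta), -1j * np.sin(beta)],
                       [-1j * np.sin(beta), np.cos(beta)]])
        psi = state.reshape([2] * n)
        for q in range(n):
            psi = np.moveaxis(
                np.tensordot(rx, np.moveaxis(psi, q, 0), axes=1), 0, q)
        state = psi.reshape(2 ** n)
    return state

def adiabatic_init(p, total_time=5.0):
    """Linear-ramp schedule: gamma grows, beta shrinks along the passage."""
    dt = total_time / p
    s = (np.arange(p) + 0.5) / p           # midpoints of the discretized passage
    return s * dt, (1 - s) * dt            # gammas, betas

gammas, betas = adiabatic_init(p=6)
psi = qaoa_state(gammas, betas)
expected_cut = float(np.real(np.vdot(psi, cost * psi)))
print(round(expected_cut, 3))
```

Note that the ramp itself costs no circuit runs to construct; it serves only as an optimization-free starting point of the kind the abstract describes, from which a classical optimizer would then refine the parameters.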
Related papers
- Linearly simplified QAOA parameters and transferability [0.6834295298053009]
The Quantum Approximate Optimization Algorithm (QAOA) provides a way to solve optimization problems using quantum computers.
We present some numerical results that are obtained for instances of the random Ising model and of the max-cut problem.
arXiv Detail & Related papers (2024-05-01T17:34:32Z)
- Hybrid GRU-CNN Bilinear Parameters Initialization for Quantum Approximate Optimization Algorithm [7.502733639318316]
We propose a hybrid optimization approach that integrates Gated Recurrent Units (GRU), Convolutional Neural Networks (CNN), and a bilinear strategy as an innovative alternative to conventional approximations for predicting optimal parameters of QAOA circuits.
We employ the bilinear strategy to initialize QAOA circuit parameters at greater depths, with reference parameters obtained from GRU-CNN optimization.
arXiv Detail & Related papers (2023-11-14T03:00:39Z)
- Probabilistic tensor optimization of quantum circuits for the max-$k$-cut problem [0.0]
We propose a technique for optimizing parameterized circuits in variational quantum algorithms.
We illustrate our approach on the example of the quantum approximate optimization algorithm (QAOA) applied to the max-$k$-cut problem.
arXiv Detail & Related papers (2023-10-16T12:56:22Z)
- A Depth-Progressive Initialization Strategy for Quantum Approximate Optimization Algorithm [0.0]
We first discuss the patterns of optimal parameters in QAOA in two directions.
We then discuss the symmetries and periodicity of the expectation, which are used to determine the bounds of the search space.
We propose a strategy which predicts the new initial parameters by taking the difference between previous optimal parameters.
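One plausible reading of this difference-based prediction is a linear extrapolation of the per-layer parameter schedule. The helper below is a hedged sketch under that assumption; the function name and the exact extrapolation rule are illustrative, not the cited paper's procedure:

```python
# Hedged sketch: extend a depth-p optimal schedule to a depth-(p+1) initial
# guess by reusing the difference of the last two optimal per-layer values.
def extend_schedule(params):
    """params: optimal per-layer values at depth p; returns a depth-(p+1) guess."""
    if len(params) < 2:
        return params + [params[-1]]        # no trend yet: repeat the last value
    step = params[-1] - params[-2]          # difference between previous optima
    return params + [params[-1] + step]     # linear extrapolation for the new layer

gammas_p = [0.25, 0.5, 0.75]                # hypothetical depth-3 optimum
print(extend_schedule(gammas_p))            # [0.25, 0.5, 0.75, 1.0]
```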
arXiv Detail & Related papers (2022-09-22T23:49:11Z)
- STORM+: Fully Adaptive SGD with Momentum for Nonconvex Optimization [74.1615979057429]
We investigate nonconvex stochastic optimization problems where the objective is an expectation over smooth loss functions.
Our work builds on the STORM algorithm, in conjunction with a novel approach to adaptively set the learning rate and momentum parameters.
arXiv Detail & Related papers (2021-11-01T15:43:36Z)
- Parameters Fixing Strategy for Quantum Approximate Optimization Algorithm [0.0]
We propose a strategy to give high approximation ratio on average, even at large circuit depths, by initializing QAOA with the optimal parameters obtained from the previous depths.
We test our strategy on the Max-cut problem of certain classes of graphs such as the 3-regular graphs and the Erdős-Rényi graphs.
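A minimal sketch of such a warm start, under the assumption that the depth-p optimum is reused verbatim and one fresh layer is appended (the zero initialization for the new layer is an assumption, not the cited paper's exact choice):

```python
# Hedged sketch: warm-start depth p+1 with the depth-p optimal parameters,
# appending one new layer initialized at zero (assumed, illustrative choice).
def warm_start(opt_gammas, opt_betas, new_layer=(0.0, 0.0)):
    """Reuse a depth-p optimum as the initial point for depth p+1."""
    g0, b0 = new_layer
    return opt_gammas + [g0], opt_betas + [b0]

g, b = warm_start([0.3, 0.5], [0.4, 0.2])   # hypothetical depth-2 optimum
print(g, b)                                 # [0.3, 0.5, 0.0] [0.4, 0.2, 0.0]
```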
arXiv Detail & Related papers (2021-08-11T15:44:16Z)
- Adaptive pruning-based optimization of parameterized quantum circuits [62.997667081978825]
Variational hybrid quantum-classical algorithms are powerful tools to maximize the use of Noisy Intermediate Scale Quantum devices.
We propose a strategy for such ansatze used in variational quantum algorithms, which we call "Parameter-Efficient Circuit Training" (PECT).
Instead of optimizing all of the ansatz parameters at once, PECT launches a sequence of variational algorithms.
arXiv Detail & Related papers (2020-10-01T18:14:11Z)
- Convergence of adaptive algorithms for weakly convex constrained optimization [59.36386973876765]
We prove the $\tilde{\mathcal{O}}(t^{-1/4})$ rate of convergence for the norm of the gradient of the Moreau envelope.
Our analysis works with mini-batch size of $1$, constant first and second order moment parameters, and possibly smooth optimization domains.
arXiv Detail & Related papers (2020-06-11T17:43:19Z)
- Cross Entropy Hyperparameter Optimization for Constrained Problem Hamiltonians Applied to QAOA [68.11912614360878]
Hybrid quantum-classical algorithms such as Quantum Approximate Optimization Algorithm (QAOA) are considered as one of the most encouraging approaches for taking advantage of near-term quantum computers in practical applications.
Such algorithms are usually implemented in a variational form, combining a classical optimization method with a quantum machine to find good solutions to an optimization problem.
In this study we apply a Cross-Entropy method to shape the optimization landscape, which allows the classical optimizer to find better parameters more easily and hence results in improved performance.
arXiv Detail & Related papers (2020-03-11T13:52:41Z)
- Proximal Gradient Algorithm with Momentum and Flexible Parameter Restart for Nonconvex Optimization [73.38702974136102]
Various types of parameter restart schemes have been proposed for accelerated algorithms to facilitate their convergence in practice.
In this paper, we propose an algorithm for solving nonsmooth problems.
arXiv Detail & Related papers (2020-01-15T16:11:04Z)
- Support recovery and sup-norm convergence rates for sparse pivotal estimation [79.13844065776928]
In high dimensional sparse regression, pivotal estimators are estimators for which the optimal regularization parameter is independent of the noise level.
We show minimax sup-norm convergence rates for non-smoothed and smoothed, single-task and multitask square-root Lasso-type estimators.
arXiv Detail & Related papers (2020-01-15T16:11:04Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.