Accelerating Quantum Approximate Optimization Algorithm using Machine
Learning
- URL: http://arxiv.org/abs/2002.01089v2
- Date: Mon, 6 Apr 2020 14:50:01 GMT
- Title: Accelerating Quantum Approximate Optimization Algorithm using Machine
Learning
- Authors: Mahabubul Alam, Abdullah Ash-Saki, Swaroop Ghosh
- Abstract summary: We propose a machine learning-based approach to accelerate quantum approximate optimization algorithm (QAOA) implementation.
QAOA is a promising quantum-classical hybrid algorithm intended to demonstrate so-called quantum supremacy.
We show that the proposed approach can curtail the number of optimization iterations by 44.9% on average (up to 65.7%), based on an analysis performed with 264 flavors of graphs.
- Score: 6.735657356113614
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We propose a machine learning-based approach to accelerate quantum
approximate optimization algorithm (QAOA) implementation. QAOA is a promising
quantum-classical hybrid algorithm intended to demonstrate so-called quantum
supremacy. In QAOA, a parametric quantum circuit and a classical optimizer
iterate in a closed loop to solve hard combinatorial optimization problems. The
performance of QAOA improves with an increasing number of stages (depth) in the
quantum circuit. However, each added stage introduces two new parameters for
the classical optimizer, increasing the number of optimization-loop iterations.
We note a correlation among the parameters of lower-depth and higher-depth QAOA
implementations and exploit it by developing a machine learning model that
predicts gate parameters close to their optimal values. As a result, the
optimization loop converges in fewer iterations. We choose the graph MaxCut
problem as a prototype to solve using QAOA. We perform a feature extraction
routine over 100 different QAOA instances and develop a training data-set with
13,860 optimal parameters. We present our analysis for 4 flavors of regression
models and 4 flavors of classical optimizers. Finally, we show that the
proposed approach can curtail the number of optimization iterations by 44.9% on
average (up to 65.7%), based on an analysis performed with 264 flavors of
graphs.
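The warm-start idea in the abstract can be sketched end to end with ordinary linear regression: learn a map from the optimal depth-$p$ angles to the depth-$(p+1)$ angles, then hand the prediction to the classical optimizer as its starting point. Below is a minimal sketch in plain NumPy; the synthetic data and the linear target mapping are illustrative assumptions, not the paper's actual features, models, or dataset.

```python
# Hedged sketch of the warm-start idea: fit a regression model mapping
# optimal depth-p QAOA angles to depth-(p+1) angles, then use its
# prediction to initialize the classical optimizer. All data here is
# synthetic; the paper builds its training set from 100 QAOA instances
# with 13,860 optimal parameters.
import numpy as np

rng = np.random.default_rng(0)

# Toy training set: optimal (gamma_1, beta_1) at depth p=1 mapped to the
# four angles (gamma_1, gamma_2, beta_1, beta_2) at depth p=2.
X = rng.uniform(0.0, np.pi, size=(200, 2))
Y = np.hstack([X + 0.1, 0.5 * X])          # fake but correlated targets

# Linear least-squares regression (one stand-in for the paper's four
# regression-model flavors), with a bias column.
A = np.hstack([X, np.ones((200, 1))])
W, *_ = np.linalg.lstsq(A, Y, rcond=None)

# The predicted depth-2 angles become the optimizer's starting point, so
# the optimization loop should converge in fewer iterations than from a
# random initialization.
p1_optimum = np.array([0.8, 0.4])
warm_start = np.append(p1_optimum, 1.0) @ W
print(warm_start.shape)  # (4,)
```

Any of the four regression flavors could replace the least-squares fit; the structure — train on lower-depth optima, predict higher-depth initializations — stays the same.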
Related papers
- Quantum Approximate Optimization Algorithm Parameter Prediction Using a Convolutional Neural Network [0.0]
We build a convolutional neural network to predict the parameters of the depth $p+1$ QAOA from those of its depth $p$ counterpart.
An average approximation ratio of $0.92735$ for Max-Cut over $800$ Erdős–Rényi graphs is obtained.
arXiv Detail & Related papers (2022-11-17T13:20:58Z)
- Iterative-Free Quantum Approximate Optimization Algorithm Using Neural Networks [20.051757447006043]
We propose a practical method that uses a simple, fully connected neural network to find better parameters tailored to a new given problem instance.
Our method is consistently the fastest to converge while also achieving the best final result.
arXiv Detail & Related papers (2022-08-21T14:05:11Z)
- Twisted hybrid algorithms for combinatorial optimization [68.8204255655161]
Proposed hybrid algorithms encode a cost function into a problem Hamiltonian and optimize its energy by varying over a set of states with low circuit complexity.
We show that for levels $p=2,\ldots,6$, the level $p$ can be reduced by one while roughly maintaining the expected approximation ratio.
arXiv Detail & Related papers (2022-03-01T19:47:16Z)
- Performance comparison of optimization methods on variational quantum algorithms [2.690135599539986]
Variational quantum algorithms (VQAs) offer a promising path towards using near-term quantum hardware for applications in academic and industrial research.
We study the performance of four commonly used gradient-free optimization methods: SLSQP, COBYLA, CMA-ES, and SPSA.
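Of the four methods, SPSA is simple enough to sketch in a few lines: it approximates the gradient with only two cost evaluations per iteration, regardless of dimension, by perturbing all parameters simultaneously along a random direction — a good fit for noisy VQA cost functions. The gain schedules and toy quadratic below are illustrative choices, not the benchmark setup used in the comparison.

```python
# Hedged sketch of SPSA (Simultaneous Perturbation Stochastic
# Approximation), one of the four gradient-free methods compared.
import numpy as np

def spsa_minimize(cost, theta, iters=200, a=0.2, c=0.1, seed=0):
    rng = np.random.default_rng(seed)
    for k in range(1, iters + 1):
        ak = a / k ** 0.602            # standard SPSA gain schedules
        ck = c / k ** 0.101
        # One random +/-1 perturbation direction; two cost evaluations
        # give a gradient estimate for ALL parameters at once.
        delta = rng.choice([-1.0, 1.0], size=theta.shape)
        ghat = (cost(theta + ck * delta) - cost(theta - ck * delta)) / (2 * ck) * delta
        theta = theta - ak * ghat
    return theta

# Toy quadratic standing in for a VQA energy landscape; the iterates
# converge toward its minimum at [1, 1].
quadratic = lambda t: float(np.sum((t - 1.0) ** 2))
result = spsa_minimize(quadratic, np.zeros(2))
print(np.round(result, 2))
```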
arXiv Detail & Related papers (2021-11-26T12:13:20Z)
- Parameters Fixing Strategy for Quantum Approximate Optimization Algorithm [0.0]
We propose a strategy that gives a high approximation ratio on average, even at large circuit depths, by initializing QAOA with the optimal parameters obtained from the previous depths.
We test our strategy on the Max-cut problem for certain classes of graphs, such as the 3-regular graphs and the Erdős–Rényi graphs.
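Structurally, this strategy reduces to a loop: optimize depth $p$, then append initial values for the new layer's $(\gamma, \beta)$ and optimize depth $p+1$ from there. A toy sketch follows, with a classical stand-in cost and naive finite-difference gradient descent; the cost function and the 0.1 initial values for new layers are illustrative assumptions, not the paper's setup.

```python
# Hedged sketch of the parameter-fixing strategy: optimize depth p, then
# warm-start depth p+1 with the depth-p optimum plus small initial
# values for the new layer. The cost is a classical stand-in for a QAOA
# energy where deeper "circuits" (more params) can reach lower cost.
import numpy as np

def toy_cost(params):
    return float(np.sum(np.cos(params)) - len(params))

def gradient_descent(cost, theta, steps=300, lr=0.1, eps=1e-5):
    theta = theta.copy()
    for _ in range(steps):
        grad = np.array([
            (cost(theta + eps * e) - cost(theta - eps * e)) / (2 * eps)
            for e in np.eye(len(theta))
        ])
        theta -= lr * grad
    return theta

theta = np.array([0.3, 0.3])        # depth p = 1: (gamma, beta)
costs = []
for p in range(1, 4):
    theta = gradient_descent(toy_cost, theta)
    costs.append(toy_cost(theta))
    # Fix the optimized angles; append small values for the new layer.
    theta = np.append(theta, [0.1, 0.1])

print([round(c, 2) for c in costs])  # → [-4.0, -8.0, -12.0]
```

Each depth starts near the previous optimum, so the cost decreases monotonically with depth instead of restarting from a random point.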
arXiv Detail & Related papers (2021-08-11T15:44:16Z)
- Reducing the Variance of Gaussian Process Hyperparameter Optimization with Preconditioning [54.01682318834995]
Preconditioning is a highly effective step for any iterative method involving matrix-vector multiplication.
We prove that preconditioning has an additional benefit that has been previously unexplored.
It simultaneously can reduce variance at essentially negligible cost.
arXiv Detail & Related papers (2021-07-01T06:43:11Z)
- Transferability of optimal QAOA parameters between random graphs [3.321726991033431]
We show that convergence of the optimal QAOA parameters around specific values can be explained and predicted based on the local properties of the graphs.
We demonstrate how optimized parameters calculated for a 6-node random graph can be successfully used without modification as nearly optimal parameters for a 64-node random graph.
arXiv Detail & Related papers (2021-06-14T15:57:47Z)
- Bilevel Optimization: Convergence Analysis and Enhanced Design [63.64636047748605]
Bilevel optimization is a tool for many machine learning problems.
We propose a novel sample-efficient hypergradient estimator named stoc-BiO.
arXiv Detail & Related papers (2020-10-15T18:09:48Z)
- Adaptive pruning-based optimization of parameterized quantum circuits [62.997667081978825]
Variational hybrid quantum-classical algorithms are powerful tools to maximize the use of Noisy Intermediate-Scale Quantum (NISQ) devices.
We propose a strategy for the ansatze used in variational quantum algorithms, which we call "Parameter-Efficient Circuit Training" (PECT).
Instead of optimizing all of the ansatz parameters at once, PECT launches a sequence of variational algorithms.
arXiv Detail & Related papers (2020-10-01T18:14:11Z)
- Convergence of adaptive algorithms for weakly convex constrained optimization [59.36386973876765]
We prove the $\tilde{\mathcal{O}}(t^{-1/4})$ rate of convergence for the norm of the gradient of the Moreau envelope.
Our analysis works with mini-batch size of $1$, constant first and second order moment parameters, and possibly unbounded optimization domains.
arXiv Detail & Related papers (2020-06-11T17:43:19Z)
- Cross Entropy Hyperparameter Optimization for Constrained Problem Hamiltonians Applied to QAOA [68.11912614360878]
Hybrid quantum-classical algorithms such as Quantum Approximate Optimization Algorithm (QAOA) are considered as one of the most encouraging approaches for taking advantage of near-term quantum computers in practical applications.
Such algorithms are usually implemented in a variational form, combining a classical optimization method with a quantum machine to find good solutions to an optimization problem.
In this study, we apply a Cross-Entropy method to shape the optimization landscape, which allows the classical optimizer to find better parameters more easily and hence results in improved performance.
arXiv Detail & Related papers (2020-03-11T13:52:41Z)
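The Cross-Entropy method mentioned in the last entry is itself compact: sample candidate parameters from a Gaussian, keep the elite fraction with the lowest cost, and refit the Gaussian to the elites so the distribution concentrates around good regions. A generic sketch follows; the population size, elite fraction, and stand-in cost are illustrative choices, not the study's configuration.

```python
# Hedged sketch of the Cross-Entropy method for parameter optimization:
# iteratively refit a Gaussian sampling distribution to the best-scoring
# (elite) candidates.
import numpy as np

def cross_entropy_minimize(cost, dim, iters=30, pop=100, elite_frac=0.2, seed=0):
    rng = np.random.default_rng(seed)
    mean, std = np.zeros(dim), np.ones(dim)
    n_elite = int(pop * elite_frac)
    for _ in range(iters):
        samples = rng.normal(mean, std, size=(pop, dim))
        scores = np.array([cost(s) for s in samples])
        elites = samples[np.argsort(scores)[:n_elite]]   # lowest-cost samples
        mean = elites.mean(axis=0)
        std = elites.std(axis=0) + 1e-6                  # avoid collapse to zero
    return mean

# Toy stand-in for a QAOA energy landscape with its minimum at (1, -1).
landscape = lambda t: float((t[0] - 1.0) ** 2 + (t[1] + 1.0) ** 2)
best = cross_entropy_minimize(landscape, dim=2)
print(np.round(best, 2))
```

The same loop applies unchanged to QAOA angles: `cost` would evaluate the expectation value of the problem Hamiltonian for a candidate parameter set.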
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.