Efficient and Robust Parameter Optimization of the Unitary Coupled-Cluster Ansatz
- URL: http://arxiv.org/abs/2401.04910v2
- Date: Tue, 25 Jun 2024 01:46:50 GMT
- Title: Efficient and Robust Parameter Optimization of the Unitary Coupled-Cluster Ansatz
- Authors: Weitang Li, Yufei Ge, Shixin Zhang, Yuqin Chen, Shengyu Zhang
- Abstract summary: We propose sequential optimization with approximate parabola (SOAP) for parameter optimization of unitary coupled-cluster ansatz on quantum computers.
Numerical benchmark studies on molecular systems demonstrate that SOAP achieves significantly faster convergence and greater robustness to noise.
SOAP is further validated through experiments on a superconducting quantum computer using a 2-qubit model system.
- Score: 4.607081302947026
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The variational quantum eigensolver (VQE) framework has been instrumental in advancing near-term quantum algorithms. However, parameter optimization remains a significant bottleneck for VQE, requiring a large number of measurements for successful algorithm execution. In this paper, we propose sequential optimization with approximate parabola (SOAP) as an efficient and robust optimizer specifically designed for parameter optimization of the unitary coupled-cluster ansatz on quantum computers. SOAP leverages sequential optimization and approximates the energy landscape as quadratic functions, minimizing the number of energy evaluations required to optimize each parameter. To capture parameter correlations, SOAP incorporates the average direction from the previous iteration into the optimization direction set. Numerical benchmark studies on molecular systems demonstrate that SOAP achieves significantly faster convergence and greater robustness to noise compared to traditional optimization methods. Furthermore, numerical simulations up to 20 qubits reveal that SOAP scales well with the number of parameters in the ansatz. The exceptional performance of SOAP is further validated through experiments on a superconducting quantum computer using a 2-qubit model system.
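The key mechanics described above can be illustrated with a short sketch: sweep over the parameters one at a time, probe the energy at three points along each direction, fit a parabola, and jump to its vertex; after each sweep, append the averaged displacement to the direction set as a rough proxy for parameter correlations. The `energy` callable, the probe spacing `delta`, and the bookkeeping below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def soap_like_minimize(energy, x0, n_sweeps=50, delta=0.1):
    """Sketch of sequential, parabola-based parameter optimization."""
    x = np.asarray(x0, dtype=float).copy()
    dim = len(x)
    basis = [np.eye(dim)[i] for i in range(dim)]   # one direction per parameter
    directions = list(basis)

    for _ in range(n_sweeps):
        x_start = x.copy()
        for d in directions:
            # Three energy evaluations along the current direction.
            e_minus = energy(x - delta * d)
            e_center = energy(x)
            e_plus = energy(x + delta * d)
            # Curvature of the fitted parabola; step to its vertex if convex.
            curvature = e_plus - 2.0 * e_center + e_minus
            if curvature > 1e-12:
                step = 0.5 * delta * (e_minus - e_plus) / curvature
                x = x + step * d
        # Averaged displacement of the sweep becomes an extra search
        # direction, a crude way to capture parameter correlations.
        shift = x - x_start
        norm = np.linalg.norm(shift)
        directions = list(basis) + ([shift / norm] if norm > 1e-12 else [])
    return x
```

Because each one-dimensional slice of the landscape is approximated by a quadratic, only a few energy evaluations are needed per parameter, which is what drives the measurement savings described in the abstract.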
Related papers
- Poisson Process for Bayesian Optimization [126.51200593377739]
We propose a ranking-based surrogate model based on the Poisson process and introduce an efficient BO framework, namely Poisson Process Bayesian Optimization (PoPBO).
Compared to the classic GP-BO method, our PoPBO has lower costs and better robustness to noise, which is verified by abundant experiments.
arXiv Detail & Related papers (2024-02-05T02:54:50Z) - Federated Conditional Stochastic Optimization [110.513884892319]
Conditional stochastic optimization has found applications in a wide range of machine learning tasks, such as invariant learning, AUPRC maximization, and MAML.
This paper proposes conditional stochastic optimization algorithms for the distributed federated learning setting.
arXiv Detail & Related papers (2023-10-04T01:47:37Z) - Optimization of Annealed Importance Sampling Hyperparameters [77.34726150561087]
Annealed Importance Sampling (AIS) is a popular algorithm used to estimate the intractable marginal likelihood of deep generative models.
We present a parametric AIS process with flexible intermediary distributions and optimize the bridging distributions to use fewer sampling steps.
We assess the performance of our optimized AIS for marginal likelihood estimation of deep generative models and compare it to other estimators.
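For context, plain annealed importance sampling with the standard geometric bridging path p_k(x) ∝ prior(x) · likelihood(x)^{β_k} and a linear β schedule can be sketched as below; the cited paper's contribution is to parameterize and optimize the bridging distributions themselves, which is not reproduced here. The `log_prior`, `log_likelihood`, and `sample_prior` callables are hypothetical placeholders.

```python
import numpy as np

def ais_log_marginal(log_prior, log_likelihood, sample_prior,
                     n_steps=100, n_chains=64, mcmc_step=0.1, rng=None):
    """Sketch of vanilla AIS for estimating the log marginal likelihood."""
    rng = np.random.default_rng() if rng is None else rng
    betas = np.linspace(0.0, 1.0, n_steps + 1)
    x = sample_prior(n_chains)            # samples from the prior (beta = 0)
    log_w = np.zeros(n_chains)            # log importance weights

    for k in range(1, n_steps + 1):
        # Weight update from the change in inverse temperature.
        log_w += (betas[k] - betas[k - 1]) * log_likelihood(x)

        # One Metropolis step targeting the k-th bridging distribution.
        def log_p(z):
            return log_prior(z) + betas[k] * log_likelihood(z)

        proposal = x + mcmc_step * rng.standard_normal(x.shape)
        accept = np.log(rng.random(n_chains)) < (log_p(proposal) - log_p(x))
        x[accept] = proposal[accept]

    # log of the mean importance weight estimates log Z.
    return np.logaddexp.reduce(log_w) - np.log(n_chains)
```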
arXiv Detail & Related papers (2022-09-27T07:58:25Z) - Performance comparison of optimization methods on variational quantum
algorithms [2.690135599539986]
Variational quantum algorithms (VQAs) offer a promising path towards using near-term quantum hardware for applications in academic and industrial research.
We study the performance of four commonly used gradient-free optimization methods: SLSQP, COBYLA, CMA-ES, and SPSA.
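Of the four methods, SPSA is the one designed specifically for noisy objectives, so a minimal sketch may be useful: it estimates the full gradient from just two function evaluations per iteration, regardless of the number of parameters. The gain-sequence constants below follow commonly quoted defaults and are illustrative, not values from the cited benchmark.

```python
import numpy as np

def spsa_minimize(f, x0, n_iterations=200, a=0.2, c=0.1,
                  alpha=0.602, gamma=0.101, rng=None):
    """Sketch of simultaneous perturbation stochastic approximation (SPSA)."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float).copy()
    for k in range(1, n_iterations + 1):
        a_k = a / k**alpha                             # step-size schedule
        c_k = c / k**gamma                             # perturbation schedule
        delta = rng.choice([-1.0, 1.0], size=x.shape)  # random +/-1 perturbation
        y_plus = f(x + c_k * delta)                    # two (noisy) evaluations
        y_minus = f(x - c_k * delta)
        g_hat = (y_plus - y_minus) / (2.0 * c_k * delta)
        x = x - a_k * g_hat
    return x
```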
arXiv Detail & Related papers (2021-11-26T12:13:20Z) - Stochastic Gradient Line Bayesian Optimization: Reducing Measurement
Shots in Optimizing Parameterized Quantum Circuits [4.94950858749529]
We develop an efficient framework for circuit optimization with fewer measurement shots.
We formulate an adaptive measurement-shot strategy to achieve the optimization feasibly without relying on precise expectation-value estimation.
We show that a suffix-averaging technique can significantly reduce the effect of statistical and hardware noise in the optimization of VQAs.
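Suffix averaging itself is simple to state: instead of returning the optimizer's final iterate, return the mean of the last stretch of parameter iterates, so that shot noise and hardware drift are averaged out. A minimal sketch, with an illustrative averaging fraction rather than the paper's setting:

```python
import numpy as np

def suffix_average(parameter_history, fraction=0.5):
    """Mean of the last `fraction` of parameter iterates."""
    history = np.asarray(parameter_history)   # shape (n_iterations, n_params)
    start = int(len(history) * (1.0 - fraction))
    return history[start:].mean(axis=0)

# e.g. theta_final = suffix_average(optimizer_trace)  # instead of optimizer_trace[-1]
```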
arXiv Detail & Related papers (2021-11-15T18:00:14Z) - Parameters Fixing Strategy for Quantum Approximate Optimization
Algorithm [0.0]
We propose a strategy that gives a high approximation ratio on average, even at large circuit depths, by initializing QAOA with the optimal parameters obtained at the previous depths.
We test our strategy on the Max-Cut problem for certain classes of graphs, such as 3-regular graphs and Erdős-Rényi graphs.
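The warm-start idea can be sketched as follows (not the authors' code): optimize at depth p, then reuse the optimal angles as the starting point at depth p + 1, initializing only the newly added layer near zero. The `expectation(gammas, betas)` callable and the COBYLA inner optimizer are assumptions for illustration.

```python
import numpy as np
from scipy.optimize import minimize

def optimize_qaoa_incrementally(expectation, max_depth, rng=None):
    """Sketch of depth-by-depth QAOA optimization with parameter reuse."""
    rng = np.random.default_rng() if rng is None else rng
    gammas, betas = np.array([]), np.array([])
    for p in range(1, max_depth + 1):
        # Warm start: previous optimum plus a small random angle for the new layer.
        g0 = np.append(gammas, 0.01 * rng.random())
        b0 = np.append(betas, 0.01 * rng.random())
        x0 = np.concatenate([g0, b0])
        result = minimize(lambda x: expectation(x[:p], x[p:]), x0, method="COBYLA")
        gammas, betas = result.x[:p], result.x[p:]
    return gammas, betas
```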
arXiv Detail & Related papers (2021-08-11T15:44:16Z) - Bilevel Optimization: Convergence Analysis and Enhanced Design [63.64636047748605]
Bilevel optimization has arisen as a powerful tool for many machine learning problems.
We propose a novel stochastic bilevel optimization algorithm (stocBiO) with a sample-efficient hypergradient estimator.
arXiv Detail & Related papers (2020-10-15T18:09:48Z) - Adaptive pruning-based optimization of parameterized quantum circuits [62.997667081978825]
Variational hybrid quantum-classical algorithms are powerful tools to maximize the use of Noisy Intermediate-Scale Quantum (NISQ) devices.
We propose a strategy for such ansatze used in variational quantum algorithms, which we call "Parameter-Efficient Circuit Training" (PECT).
Instead of optimizing all of the ansatz parameters at once, PECT launches a sequence of variational algorithms.
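The "sequence of variational algorithms" can be sketched as block-wise optimization: repeatedly optimize a small group of ansatz parameters while the rest stay frozen. The block partitioning, sweep count, and COBYLA sub-optimizer below are illustrative assumptions; the adaptive pruning that gives the cited method its name is not reproduced here.

```python
import numpy as np
from scipy.optimize import minimize

def blockwise_vqe(energy, x0, block_size=4, n_sweeps=3):
    """Sketch of optimizing ansatz parameters block by block."""
    x = np.asarray(x0, dtype=float).copy()
    blocks = [list(range(i, min(i + block_size, len(x))))
              for i in range(0, len(x), block_size)]
    for _ in range(n_sweeps):
        for idx in blocks:
            def sub_energy(sub_params, idx=idx):
                y = x.copy()
                y[idx] = sub_params          # vary only this block
                return energy(y)
            result = minimize(sub_energy, x[idx], method="COBYLA")
            x[idx] = result.x
    return x
```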
arXiv Detail & Related papers (2020-10-01T18:14:11Z) - Cross Entropy Hyperparameter Optimization for Constrained Problem
Hamiltonians Applied to QAOA [68.11912614360878]
Hybrid quantum-classical algorithms such as the Quantum Approximate Optimization Algorithm (QAOA) are considered among the most promising approaches for taking advantage of near-term quantum computers in practical applications.
Such algorithms are usually implemented in a variational form, combining a classical optimization method with a quantum machine to find good solutions to an optimization problem.
In this study, we apply a Cross-Entropy method to shape the optimization landscape, which allows the classical optimizer to find better parameters more easily and hence results in improved performance.
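For reference, the generic cross-entropy method looks like the sketch below: sample candidate parameters from a Gaussian, keep the elite fraction with the lowest objective value, and refit the Gaussian to the elites. All constants are illustrative, and the QAOA-specific constraint-penalty objective used in the cited study is left abstract behind the `objective` callable.

```python
import numpy as np

def cross_entropy_minimize(objective, mean, std, n_samples=50,
                           elite_frac=0.2, n_iterations=30, rng=None):
    """Sketch of the generic cross-entropy optimization method."""
    rng = np.random.default_rng() if rng is None else rng
    mean = np.asarray(mean, dtype=float)
    std = np.asarray(std, dtype=float)
    n_elite = max(1, int(elite_frac * n_samples))
    for _ in range(n_iterations):
        samples = rng.normal(mean, std, size=(n_samples, len(mean)))
        scores = np.array([objective(s) for s in samples])
        elites = samples[np.argsort(scores)[:n_elite]]
        mean, std = elites.mean(axis=0), elites.std(axis=0) + 1e-6
    return mean
```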
arXiv Detail & Related papers (2020-03-11T13:52:41Z) - Accelerating Quantum Approximate Optimization Algorithm using Machine
Learning [6.735657356113614]
We propose a machine learning based approach to accelerate quantum approximate optimization algorithm (QAOA) implementation.
QAOA is a quantum-classical hybrid algorithm regarded as a candidate for demonstrating so-called quantum supremacy.
We show that the proposed approach can curtail the number of optimization iterations by up to 65.7%, based on an analysis performed with 264 flavors of graphs.
arXiv Detail & Related papers (2020-02-04T02:21:00Z)
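A hedged sketch of the general recipe behind such ML-assisted warm starts (not the cited paper's specific model): fit a regressor from simple graph features to previously found optimal QAOA angles, then use its prediction as the initial point for a new instance so the classical optimizer needs fewer iterations. The feature choice and the random-forest model are illustrative assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def train_angle_predictor(graph_features, optimal_angles):
    """Fit a regressor mapping graph features to good QAOA angles."""
    model = RandomForestRegressor(n_estimators=100)
    model.fit(np.asarray(graph_features), np.asarray(optimal_angles))
    return model

def predict_initial_angles(model, new_graph_features):
    """Predicted (gamma, beta) warm start for an unseen graph instance."""
    return model.predict(np.asarray(new_graph_features).reshape(1, -1))[0]
```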