Stochastic Gradient Line Bayesian Optimization: Reducing Measurement
Shots in Optimizing Parameterized Quantum Circuits
- URL: http://arxiv.org/abs/2111.07952v1
- Date: Mon, 15 Nov 2021 18:00:14 GMT
- Title: Stochastic Gradient Line Bayesian Optimization: Reducing Measurement
Shots in Optimizing Parameterized Quantum Circuits
- Authors: Shiro Tamiya, Hayata Yamasaki
- Abstract summary: We develop an efficient framework for circuit optimization with fewer measurement shots.
We formulate an adaptive measurement-shot strategy to achieve the optimization feasibly without relying on precise expectation-value estimation.
We show that a technique of suffix averaging can significantly reduce the effect of statistical and hardware noise in the optimization for the VQAs.
- Score: 4.94950858749529
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Optimization of parameterized quantum circuits is indispensable for
applications of near-term quantum devices to computational tasks with
variational quantum algorithms (VQAs). However, the existing optimization
algorithms for VQAs require an excessive number of quantum-measurement shots in
estimating expectation values of observables or iterating updates of circuit
parameters, whose cost has been a crucial obstacle for practical use. To
address this problem, we develop an efficient framework, \textit{stochastic
gradient line Bayesian optimization} (SGLBO), for the circuit optimization with
fewer measurement shots. The SGLBO reduces the cost of measurement shots by
estimating an appropriate direction of updating the parameters based on
stochastic gradient descent (SGD) and further by utilizing Bayesian
optimization (BO) to estimate the optimal step size in each iteration of the
SGD. We formulate an adaptive measurement-shot strategy to achieve the
optimization feasibly without relying on precise expectation-value estimation
and many iterations; moreover, we show that a technique of suffix averaging can
significantly reduce the effect of statistical and hardware noise in the
optimization for the VQAs. Our numerical simulation demonstrates that the SGLBO
augmented with these techniques can drastically reduce the required number of
measurement shots, improve the accuracy in the optimization, and enhance the
robustness against noise compared with other state-of-the-art optimizers in
representative tasks for the VQAs. These results establish a framework of
quantum-circuit optimizers integrating two different optimization approaches,
SGD and BO, to reduce the cost of measurement shots significantly.
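To make the recipe concrete, below is a minimal, self-contained sketch of an SGLBO-style loop under toy assumptions: a quadratic stand-in for the measured expectation value with Gaussian shot noise, central-difference gradients in place of hardware parameter-shift estimates, a small NumPy Gaussian process with an expected-improvement acquisition for the one-dimensional step-size search, a naive linearly growing shot schedule standing in for the adaptive-shot strategy, and suffix averaging over the second half of the iterates. It illustrates the idea only; it is not the authors' implementation.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

def noisy_energy(theta, shots):
    """Toy stand-in for a measured VQA cost: quadratic bowl + shot noise ~ 1/sqrt(shots)."""
    return np.sum((theta - 1.0) ** 2) + rng.normal(scale=1.0 / np.sqrt(shots))

def stochastic_gradient(theta, shots, eps=0.1):
    """Central-difference gradient from few-shot evaluations (a stand-in for
    parameter-shift estimates on hardware)."""
    g = np.zeros_like(theta)
    for i in range(theta.size):
        e = np.zeros_like(theta)
        e[i] = eps
        g[i] = (noisy_energy(theta + e, shots) - noisy_energy(theta - e, shots)) / (2 * eps)
    return g

def bo_step_size(theta, d, shots, a_max=1.0, n_init=3, n_bo=5, ell=0.25, jitter=1e-2):
    """1D Bayesian optimization of f(a) = E(theta - a*d) with a small RBF-kernel GP
    and an expected-improvement acquisition over a in [0, a_max]."""
    f = lambda a: noisy_energy(theta - a * d, shots)
    A = list(np.linspace(0.0, a_max, n_init))
    Y = [f(a) for a in A]
    k = lambda u, v: np.exp(-0.5 * ((u - v) / ell) ** 2)
    for _ in range(n_bo):
        Aa, Ya = np.array(A), np.array(Y)
        Kinv = np.linalg.inv(k(Aa[:, None], Aa[None, :]) + jitter * np.eye(len(Aa)))
        cand = np.linspace(0.0, a_max, 101)
        ks = k(cand[:, None], Aa[None, :])
        mu = Ya.mean() + ks @ Kinv @ (Ya - Ya.mean())          # GP posterior mean
        sd = np.sqrt(np.clip(1.0 - np.einsum('ij,jk,ik->i', ks, Kinv, ks), 1e-12, None))
        z = (Ya.min() - mu) / sd
        ei = (Ya.min() - mu) * norm.cdf(z) + sd * norm.pdf(z)  # expected improvement
        a_next = float(cand[np.argmax(ei)])
        A.append(a_next)
        Y.append(f(a_next))
    return A[int(np.argmin(Y))]

theta = rng.normal(size=4)
iterates = []
for t in range(30):
    shots = 50 + 10 * t                    # naive stand-in for the adaptive-shot strategy
    g = stochastic_gradient(theta, shots)
    d = g / (np.linalg.norm(g) + 1e-12)    # SGD supplies the update direction...
    theta = theta - bo_step_size(theta, d, shots) * d   # ...BO picks the step size
    iterates.append(theta.copy())

theta_sa = np.mean(iterates[len(iterates) // 2:], axis=0)  # suffix averaging
print("suffix-averaged parameters:", theta_sa)
```

The division of labor is the point: the few-shot gradient only has to get the direction roughly right, while the one-dimensional BO concentrates measurement effort on deciding how far to move along it.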
Related papers
- End-to-End Learning for Fair Multiobjective Optimization Under
Uncertainty [55.04219793298687]
The Predict-Then-Optimize (PtO) paradigm in machine learning aims to maximize downstream decision quality.
This paper extends the PtO methodology to optimization problems with nondifferentiable Ordered Weighted Averaging (OWA) objectives.
It shows how optimization of OWA functions can be effectively integrated with parametric prediction for fair and robust optimization under uncertainty.
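For reference, an OWA objective applies a fixed weight vector to the sorted outcome vector; the sort is what makes it nondifferentiable. A minimal sketch under one common fairness convention (largest weights on the worst outcomes):

```python
import numpy as np

def owa(outcomes, weights):
    """Ordered Weighted Average: weights act on the sorted outcome vector.
    Assigning the largest weights to the smallest outcomes emphasizes fairness;
    the sorting step is what makes the objective nondifferentiable."""
    return float(np.sort(outcomes) @ np.sort(weights)[::-1])

print(owa(np.array([0.2, 0.9, 0.5]), np.array([0.5, 0.3, 0.2])))  # 0.43
```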
arXiv Detail & Related papers (2024-02-12T16:33:35Z) - Reducing measurement costs by recycling the Hessian in adaptive variational quantum algorithms [0.0]
We propose an improved quasi-Newton optimization protocol specifically tailored to adaptive VQAs.
We implement a quasi-Newton algorithm where an approximation to the inverse Hessian matrix is continuously built and grown across the iterations of an adaptive VQA.
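As a rough illustration of the recycling idea, the sketch below combines a textbook BFGS inverse-Hessian update with an identity-initialized extension for parameters that the adaptive ansatz appends; these details are assumptions for illustration, not the paper's protocol.

```python
import numpy as np

def bfgs_update(Hinv, s, y):
    """Textbook BFGS update of the inverse Hessian from parameter step s and
    gradient change y (assumes the curvature condition y @ s > 0)."""
    rho = 1.0 / (y @ s)
    V = np.eye(len(s)) - rho * np.outer(s, y)
    return V @ Hinv @ V.T + rho * np.outer(s, s)

def grow_hinv(Hinv, n_new):
    """When the adaptive ansatz appends n_new parameters, keep the old block
    and initialize the new directions with the identity (an assumption)."""
    n_old = Hinv.shape[0]
    H = np.eye(n_old + n_new)
    H[:n_old, :n_old] = Hinv
    return H
```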
arXiv Detail & Related papers (2024-01-10T14:08:04Z) - Efficient and Robust Parameter Optimization of the Unitary Coupled-Cluster Ansatz [4.607081302947026]
We propose sequential optimization with approximate parabola (SOAP) for parameter optimization of unitary coupled-cluster ansatz on quantum computers.
Numerical benchmark studies on molecular systems demonstrate that SOAP achieves significantly faster convergence and greater robustness to noise.
SOAP is further validated through experiments on a superconducting quantum computer using a 2-qubit model system.
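A minimal sketch of what a parabola-based sequential sweep could look like; the three-point stencil, spacing, and fallback rule are illustrative assumptions rather than the SOAP specification.

```python
import numpy as np

def soap_sweep(energy, theta, delta=0.2):
    """Sweep the parameters one at a time: sample the energy at three points,
    fit a parabola, and jump to its vertex."""
    theta = theta.copy()
    for i in range(theta.size):
        ts = theta[i] + np.array([-delta, 0.0, delta])
        es = []
        for t in ts:
            theta[i] = t
            es.append(energy(theta))
        a, b, _ = np.polyfit(ts, es, 2)  # E(t) ~ a t^2 + b t + c
        # jump to the vertex if the fit is convex, else keep the best sample
        theta[i] = -b / (2 * a) if a > 0 else ts[int(np.argmin(es))]
    return theta

energy = lambda th: float(np.sum((th - 0.5) ** 2))
print(soap_sweep(energy, np.zeros(3)))  # each parameter jumps to ~0.5
```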
arXiv Detail & Related papers (2024-01-10T03:30:39Z) - Iterative Layerwise Training for Quantum Approximate Optimization
Algorithm [0.39945675027960637]
The capability of the quantum approximate optimization algorithm (QAOA) to solve optimization problems has been intensively studied in recent years.
We propose an iterative layerwise optimization strategy and explore the possible reduction in optimization cost when solving problems with QAOA.
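One common form of layerwise training can be sketched as follows, with each newly added layer's (gamma, beta) pair optimized while all earlier layers stay frozen; qaoa_energy is a hypothetical placeholder for a measured expectation value, not a real QAOA circuit.

```python
import numpy as np
from scipy.optimize import minimize

def qaoa_energy(gammas, betas):
    """Hypothetical placeholder for the measured <H_C> of a depth-p QAOA circuit."""
    return float(np.sum(np.cos(gammas) * np.sin(betas)))

def layerwise_train(p_max):
    gammas, betas = [], []
    for _ in range(p_max):
        frozen_g, frozen_b = tuple(gammas), tuple(betas)  # earlier layers stay fixed
        obj = lambda x: qaoa_energy(np.array(frozen_g + (x[0],)),
                                    np.array(frozen_b + (x[1],)))
        res = minimize(obj, x0=np.zeros(2), method="COBYLA")
        gammas.append(float(res.x[0]))
        betas.append(float(res.x[1]))
    return np.array(gammas), np.array(betas)

print(layerwise_train(3))
```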
arXiv Detail & Related papers (2023-09-24T05:12:48Z) - On-Chip Hardware-Aware Quantization for Mixed Precision Neural Networks [52.97107229149988]
We propose an On-Chip Hardware-Aware Quantization framework, performing hardware-aware mixed-precision quantization on deployed edge devices.
For efficiency metrics, we build an On-Chip Quantization Aware pipeline, which allows the quantization process to perceive the actual hardware efficiency of the quantization operator.
For accuracy metrics, we propose Mask-Guided Quantization Estimation technology to effectively estimate the accuracy impact of operators in the on-chip scenario.
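For background, the primitive being assigned per-layer precision here is uniform quantization; a minimal symmetric-quantizer sketch follows (the on-chip search and mask-guided estimation themselves are not reproduced, and the bit-width choices below are illustrative assumptions).

```python
import numpy as np

def quantize(w, bits):
    """Symmetric uniform quantization of a weight tensor at a given bit-width."""
    qmax = 2 ** (bits - 1) - 1
    scale = np.max(np.abs(w)) / qmax
    q = np.clip(np.round(w / scale), -qmax - 1, qmax)
    return q * scale  # dequantized tensor

w = np.random.randn(64, 64).astype(np.float32)
w8 = quantize(w, 8)  # e.g., 8-bit for an accuracy-sensitive layer
w4 = quantize(w, 4)  # 4-bit for a tolerant layer (mixed precision)
```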
arXiv Detail & Related papers (2023-09-05T04:39:34Z) - Optimization of Annealed Importance Sampling Hyperparameters [77.34726150561087]
Annealed Importance Sampling (AIS) is a popular algorithm used to estimate the intractable marginal likelihood of deep generative models.
We present a parametric AIS process with flexible intermediate distributions and optimize the bridging distributions to use fewer sampling steps.
We assess the performance of our optimized AIS for marginal likelihood estimation of deep generative models and compare it to other estimators.
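To ground the terminology, here is a minimal sketch of plain AIS with a geometric bridging path and one Metropolis move per temperature; the toy endpoint densities and linear schedule are assumptions, and parameterizing and optimizing such bridges is precisely the paper's contribution.

```python
import numpy as np

rng = np.random.default_rng(0)
log_p0 = lambda x: -0.5 * x ** 2                  # N(0,1), unnormalized
log_p1 = lambda x: -0.5 * (x - 3.0) ** 2 / 0.25   # N(3, 0.5^2), unnormalized

def ais(n_samples=1000, n_steps=50, step=0.5):
    betas = np.linspace(0.0, 1.0, n_steps + 1)
    x = rng.normal(size=n_samples)                # exact samples from p0
    logw = np.zeros(n_samples)
    for b_prev, b in zip(betas[:-1], betas[1:]):
        logw += (b - b_prev) * (log_p1(x) - log_p0(x))  # bridge-density ratio
        # one Metropolis move targeting p_b ∝ p0^(1-b) * p1^b
        prop = x + step * rng.normal(size=n_samples)
        log_acc = ((1 - b) * (log_p0(prop) - log_p0(x))
                   + b * (log_p1(prop) - log_p1(x)))
        accept = np.log(rng.uniform(size=n_samples)) < log_acc
        x = np.where(accept, prop, x)
    return logw  # log importance weights

logw = ais()
# logsumexp(logw) - log(n) estimates log(Z1/Z0), roughly log(0.5) here
print(np.log(np.mean(np.exp(logw - logw.max()))) + logw.max())
```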
arXiv Detail & Related papers (2022-09-27T07:58:25Z) - Performance comparison of optimization methods on variational quantum
algorithms [2.690135599539986]
Variational quantum algorithms (VQAs) offer a promising path towards using near-term quantum hardware for applications in academic and industrial research.
We study the performance of four commonly used gradient-free optimization methods: SLSQP, COBYLA, CMA-ES, and SPSA.
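Of the four, SPSA is compact enough to sketch in a few lines; the gain-sequence constants below are common defaults from the SPSA literature, not the paper's settings.

```python
import numpy as np

def spsa_minimize(f, theta, n_iter=200, a=0.1, c=0.1, A=20,
                  alpha=0.602, gamma=0.101, seed=0):
    """SPSA: estimate the full gradient from just two function evaluations
    per iteration using a random simultaneous perturbation."""
    rng = np.random.default_rng(seed)
    for k in range(n_iter):
        ak = a / (k + 1 + A) ** alpha
        ck = c / (k + 1) ** gamma
        delta = rng.choice([-1.0, 1.0], size=theta.size)  # Rademacher perturbation
        ghat = (f(theta + ck * delta) - f(theta - ck * delta)) / (2 * ck) * (1.0 / delta)
        theta = theta - ak * ghat
    return theta

print(spsa_minimize(lambda t: np.sum((t - 1.0) ** 2), np.zeros(3)))
```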
arXiv Detail & Related papers (2021-11-26T12:13:20Z) - Adaptive pruning-based optimization of parameterized quantum circuits [62.997667081978825]
Variational hybrid quantum-classical algorithms are powerful tools for maximizing the use of Noisy Intermediate-Scale Quantum devices.
We propose a strategy for training the ansatzes used in variational quantum algorithms, which we call "Parameter-Efficient Circuit Training" (PECT).
Instead of optimizing all of the ansatz parameters at once, PECT launches a sequence of variational algorithms.
arXiv Detail & Related papers (2020-10-01T18:14:11Z) - Automatically Learning Compact Quality-aware Surrogates for Optimization
Problems [55.94450542785096]
Solving optimization problems with unknown parameters requires learning a predictive model to predict the values of the unknown parameters and then solving the problem using these values.
Recent work has shown that including the optimization problem as a layer in the model-training pipeline results in predictions of the unobserved parameters that yield higher decision quality.
We show that we can improve solution quality by learning a low-dimensional surrogate model of a large optimization problem.
arXiv Detail & Related papers (2020-06-18T19:11:54Z) - Adaptivity of Stochastic Gradient Methods for Nonconvex Optimization [71.03797261151605]
Adaptivity is an important yet under-studied property in modern optimization theory.
Our algorithm is proven to achieve the best-available convergence guarantee for non-PL objectives while simultaneously outperforming existing algorithms on Polyak-Lojasiewicz (PL) objectives.
arXiv Detail & Related papers (2020-02-13T05:42:27Z)