Transferring linearly fixed QAOA angles: performance and real device results
- URL: http://arxiv.org/abs/2504.12632v1
- Date: Thu, 17 Apr 2025 04:17:51 GMT
- Title: Transferring linearly fixed QAOA angles: performance and real device results
- Authors: Ryo Sakai, Hiromichi Matsuyama, Wai-Hong Tam, Yu Yamashiro
- Abstract summary: We investigate a simplified approach that combines linear parameterization with parameter transferring, reducing the parameter space to just 4 dimensions regardless of the number of layers. We compare this combined approach with standard QAOA and other parameter setting strategies such as INTERP and FOURIER, which require computationally demanding incremental layer-by-layer optimization. Our experiments extend from classical simulation to actual quantum hardware implementation on IBM's Eagle processor, demonstrating the approach's viability on current NISQ devices.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Quantum Approximate Optimization Algorithm (QAOA) enables solving combinatorial optimization problems on quantum computers by optimizing variational parameters for quantum circuits. We investigate a simplified approach that combines linear parameterization with parameter transferring, reducing the parameter space to just 4 dimensions regardless of the number of layers. This simplification draws inspiration from quantum annealing schedules providing both theoretical grounding and practical advantages. We compare this combined approach with standard QAOA and other parameter setting strategies such as INTERP and FOURIER, which require computationally demanding incremental layer-by-layer optimization. Notably, previously known methods like INTERP and FOURIER yield parameters that can be well fitted by linear functions, which supports our linearization strategy. Our analysis reveals that for the random Ising model, cost landscapes in this reduced parameter space demonstrate consistent structural patterns across different problem instances. Our experiments extend from classical simulation to actual quantum hardware implementation on IBM's Eagle processor, demonstrating the approach's viability on current NISQ devices. Furthermore, the numerical results indicate that parameter transferability primarily depends on the energy scale of problem instances, with normalization techniques improving transfer quality. Most of our numerical experiments are conducted on the random Ising model, while problem-dependence is also investigated across other models. A key advantage of parameter transferring is the complete elimination of instance-specific classical optimization overhead, as pre-trained parameters can be directly applied to other problem instances, reducing classical optimization costs by orders of magnitude for deeper circuits.
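The abstract's central idea, fixing each QAOA angle sequence to a linear function of the layer index so that only 4 scalars are optimized regardless of depth, can be sketched as follows. This is a minimal illustration: the function name and the exact intercept-plus-slope form are assumptions for exposition, not the paper's notation.

```python
import numpy as np

def linear_qaoa_angles(p, gamma_start, gamma_slope, beta_start, beta_slope):
    """Generate p-layer QAOA angles (gammas, betas) from 4 scalars.

    Illustrative sketch: instead of optimizing 2*p independent angles,
    each sequence is constrained to a linear schedule over the layer
    index l = 0, ..., p-1, so the search space is 4-dimensional for
    any circuit depth p.
    """
    layers = np.arange(p)
    gammas = gamma_start + gamma_slope * layers  # cost-Hamiltonian angles
    betas = beta_start + beta_slope * layers     # mixer-Hamiltonian angles
    return gammas, betas

# Example: a 10-layer schedule from 4 (hypothetical) transferred parameters.
gammas, betas = linear_qaoa_angles(10, 0.1, 0.05, 0.6, -0.05)
```

Under this constraint, parameters pre-trained on one instance can be reused on another by simply re-evaluating the schedule, which is what makes the instance-specific optimization overhead vanish.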
Related papers
- Cross-Problem Parameter Transfer in Quantum Approximate Optimization Algorithm: A Machine Learning Approach [0.7560883489000579]
We study whether pretrained QAOA parameters of MaxCut can be used as is or to warm start the Maximum Independent Set (MIS) circuits.
Our experimental results show that such parameter transfer can significantly reduce the number of optimization iterations required.
arXiv Detail & Related papers (2025-04-14T21:56:11Z) - Optuna vs Code Llama: Are LLMs a New Paradigm for Hyperparameter Tuning? [42.362388367152256]
Large language models (LLMs) are used to fine-tune a parameter-efficient version of Code Llama using LoRA. Our method achieves competitive or superior results in terms of Root Mean Square Error (RMSE) while significantly reducing computational overhead.
arXiv Detail & Related papers (2025-04-08T13:15:47Z) - Conditional Diffusion-based Parameter Generation for Quantum Approximate Optimization Algorithm [7.48670688063184]
The Quantum Approximate Optimization Algorithm (QAOA) is a hybrid algorithm that shows promise in efficiently solving the MaxCut problem. A generative learning model, specifically the denoising diffusion probabilistic model (DDPM), is used to learn the distribution of circuit parameters on the graph dataset.
arXiv Detail & Related papers (2024-07-17T01:18:27Z) - Linearly simplified QAOA parameters and transferability [0.6834295298053009]
The Quantum Approximate Optimization Algorithm (QAOA) provides a way to solve optimization problems using quantum computers.
We present some numerical results that are obtained for instances of the random Ising model and of the max-cut problem.
arXiv Detail & Related papers (2024-05-01T17:34:32Z) - Bayesian Parameterized Quantum Circuit Optimization (BPQCO): A task and hardware-dependent approach [49.89480853499917]
Variational quantum algorithms (VQA) have emerged as a promising quantum alternative for solving optimization and machine learning problems.
In this paper, we experimentally demonstrate the influence of the circuit design on the performance obtained for two classification problems.
We also study the degradation of the obtained circuits in the presence of noise when simulating real quantum computers.
arXiv Detail & Related papers (2024-04-17T11:00:12Z) - Adiabatic-Passage-Based Parameter Setting for Quantum Approximate Optimization Algorithm [0.7252027234425334]
We propose a novel adiabatic-passage-based parameter setting method.
This method remarkably reduces the optimization cost, specifically when applied to the 3-SAT problem, to a sublinear level.
arXiv Detail & Related papers (2023-11-30T01:06:41Z) - Parsimonious Optimisation of Parameters in Variational Quantum Circuits [1.303764728768944]
We propose a novel Quantum-Gradient Sampling method that requires the execution of at most two circuits per iteration to update the optimisable parameters.
Our proposed method achieves similar convergence rates to classical gradient descent, and empirically outperforms gradient coordinate descent, and SPSA.
arXiv Detail & Related papers (2023-06-20T18:50:18Z) - Optimization of Annealed Importance Sampling Hyperparameters [77.34726150561087]
Annealed Importance Sampling (AIS) is a popular algorithm used to estimate the intractable marginal likelihood of deep generative models.
We present a parametric AIS process with flexible intermediary distributions and optimize the bridging distributions to use fewer steps for sampling.
We assess the performance of our optimized AIS for marginal likelihood estimation of deep generative models and compare it to other estimators.
arXiv Detail & Related papers (2022-09-27T07:58:25Z) - FLIP: A flexible initializer for arbitrarily-sized parametrized quantum circuits [105.54048699217668]
We propose a FLexible Initializer for arbitrarily-sized Parametrized quantum circuits.
FLIP can be applied to any family of PQCs, and instead of relying on a generic set of initial parameters, it is tailored to learn the structure of successful parameters.
We illustrate the advantage of using FLIP in three scenarios: a family of problems with proven barren plateaus, PQC training to solve max-cut problem instances, and PQC training for finding the ground state energies of 1D Fermi-Hubbard models.
arXiv Detail & Related papers (2021-03-15T17:38:33Z) - Adaptive pruning-based optimization of parameterized quantum circuits [62.997667081978825]
Variational hybrid quantum-classical algorithms are powerful tools to maximize the use of Noisy Intermediate-Scale Quantum devices.
We propose a strategy for such ansatze used in variational quantum algorithms, which we call "Parameter-Efficient Circuit Training" (PECT).
Instead of optimizing all of the ansatz parameters at once, PECT launches a sequence of variational algorithms.
arXiv Detail & Related papers (2020-10-01T18:14:11Z) - Cross Entropy Hyperparameter Optimization for Constrained Problem Hamiltonians Applied to QAOA [68.11912614360878]
Hybrid quantum-classical algorithms such as Quantum Approximate Optimization Algorithm (QAOA) are considered as one of the most encouraging approaches for taking advantage of near-term quantum computers in practical applications.
Such algorithms are usually implemented in a variational form, combining a classical optimization method with a quantum machine to find good solutions to an optimization problem.
In this study we apply a Cross-Entropy method to shape this landscape, which allows the classical optimizer to find better parameters more easily and hence results in an improved performance.
arXiv Detail & Related papers (2020-03-11T13:52:41Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences.