Benchmarking Lie-Algebraic Pretraining and Non-Variational QWOA for the MaxCut Problem
- URL: http://arxiv.org/abs/2512.22856v1
- Date: Sun, 28 Dec 2025 09:42:02 GMT
- Title: Benchmarking Lie-Algebraic Pretraining and Non-Variational QWOA for the MaxCut Problem
- Authors: Matthaus Zering, Jolyon Joyce, Tal Gurfinkel, Jingbo Wang
- Abstract summary: This paper provides a comparative performance analysis of two strategies designed to improve trainability. We benchmark both methods on the unweighted MaxCut problem using a circuit depth of $p = 256$ across 200 Erdős-Rényi and 200 3-regular graphs. Both approaches significantly improve upon the standard randomly initialized QWOA: NV-QWOA attains a mean approximation ratio of 98.9% in just 60 iterations, while the Lie-algebraic pretrained QWOA improves to 77.71% after 500 iterations.
- Score: 4.103893081207555
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The Quantum Approximate Optimization Algorithm (QAOA) is a leading candidate for achieving quantum advantage in combinatorial optimization on Noisy Intermediate-Scale Quantum (NISQ) devices. However, random initialization of the variational parameters typically leads to vanishing gradients, rendering standard variational optimization ineffective. This paper provides a comparative performance analysis of two distinct strategies designed to improve trainability: a Lie-algebraic pretraining framework that uses Lie-algebraic classical simulation to find near-optimal initializations, and the non-variational QWOA (NV-QWOA), which targets a restricted parameter subspace covered by three hyperparameters. We benchmark both methods on the unweighted MaxCut problem using a circuit depth of $p = 256$ across 200 Erdős-Rényi and 200 3-regular graphs, each with 16 vertices. Both approaches significantly improve upon the standard randomly initialized QWOA. NV-QWOA attains a mean approximation ratio of 98.9\% in just 60 iterations, while the Lie-algebraic pretrained QWOA improves to 77.71\% after 500 iterations. That optimization proceeds more quickly for NV-QWOA is not surprising given its significantly smaller parameter space; however, that an algorithm with so few tunable parameters reliably finds near-optimal solutions is remarkable. These findings suggest that the structured parameterization of NV-QWOA offers a more robust training approach than pretraining on lower-dimensional auxiliary problems. Future work is needed to confirm scaling to larger problem sizes and to assess generalization to other problem classes.
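The benchmark metric above, the approximation ratio, is the achieved cut value divided by the optimal cut value. A minimal sketch in pure Python (using a hypothetical 5-vertex cycle graph for illustration, not one of the paper's 16-vertex instances):

```python
from itertools import product

def cut_value(edges, assignment):
    """Number of edges crossing the cut defined by a 0/1 assignment."""
    return sum(1 for u, v in edges if assignment[u] != assignment[v])

def max_cut_brute_force(n, edges):
    """Exact MaxCut by enumerating all 2^n bitstrings (fine for small n)."""
    return max(cut_value(edges, bits) for bits in product((0, 1), repeat=n))

# Hypothetical 5-vertex cycle graph, chosen only for illustration.
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]
optimum = max_cut_brute_force(5, edges)        # best cut of C5 is 4 edges
candidate = cut_value(edges, [0, 1, 0, 1, 0])  # a sampled bitstring
ratio = candidate / optimum                    # approximation ratio in [0, 1]
```

At 16 vertices the 2^16 enumeration used for the exact optimum is still cheap classically, which is what makes approximation ratios easy to report at this benchmark size.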
Related papers
- QAOA-Predictor: Forecasting Success Probabilities and Minimal Depths for Efficient Fixed-Parameter Optimization [0.9558392439655014]
We propose a novel approach using a Graph Neural Network (GNN) to predict Quantum Approximate Optimization Algorithm (QAOA) performance. We demonstrate that the GNN accurately predicts QAOA performance within a 10% margin of the true values.
arXiv Detail & Related papers (2026-03-03T13:43:44Z) - Towards a Unified Analysis of Neural Networks in Nonparametric Instrumental Variable Regression: Optimization and Generalization [66.08522228989634]
We establish the first global convergence result of neural networks for the two-stage least squares (2SLS) approach in nonparametric instrumental variable regression (NPIV). This is achieved by adopting a lifted perspective through mean-field Langevin dynamics (MFLD).
arXiv Detail & Related papers (2025-11-18T17:51:17Z) - A Gradient Meta-Learning Joint Optimization for Beamforming and Antenna Position in Pinching-Antenna Systems [63.213207442368294]
We consider a novel optimization design for multi-waveguide pinching-antenna systems. The proposed GML-JO algorithm is robust to different choices and achieves better performance compared with existing optimization methods.
arXiv Detail & Related papers (2025-06-14T17:35:27Z) - Adam assisted Fully informed Particle Swarm Optimization ( Adam-FIPSO ) based Parameter Prediction for the Quantum Approximate Optimization Algorithm (QAOA) [1.024113475677323]
The Quantum Approximate Optimization Algorithm (QAOA) is a prominent variational algorithm used for solving optimization problems such as the Max-Cut problem. A key challenge in QAOA lies in efficiently identifying suitable parameters that lead to high-quality solutions.
arXiv Detail & Related papers (2025-06-07T13:14:41Z) - Extrapolation method to optimize linear-ramp QAOA parameters: Evaluation of QAOA runtime scaling [0.0]
The linear-ramp QAOA has been proposed to address this issue, as it relies on only two parameters which have to be optimized. We apply this method to several use cases such as portfolio optimization, feature selection and clustering, and compare the quantum runtime scaling with that of classical methods.
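The two-parameter idea can be sketched as follows: two scalars define linear schedules across all p layers, with mixer angles ramping down and cost angles ramping up. The exact parameterization varies between papers, so this is one common convention rather than the scheme of that specific work:

```python
def linear_ramp_schedule(p, delta_beta, delta_gamma):
    """Linear-ramp schedule: betas (mixer angles) decrease linearly and
    gammas (cost angles) increase linearly over layers k = 1..p, so only
    the two slopes delta_beta and delta_gamma need to be optimized."""
    betas  = [delta_beta  * (1 - (k - 0.5) / p) for k in range(1, p + 1)]
    gammas = [delta_gamma * ((k - 0.5) / p)     for k in range(1, p + 1)]
    return betas, gammas

# Hypothetical slope values, for illustration only.
betas, gammas = linear_ramp_schedule(4, 0.8, 0.9)
```

The appeal is that the classical optimization problem stays two-dimensional regardless of circuit depth, much like the three-hyperparameter subspace of NV-QWOA in the main paper.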
arXiv Detail & Related papers (2025-04-11T14:30:26Z) - Iterative-Free Quantum Approximate Optimization Algorithm Using Neural Networks [20.051757447006043]
We propose a practical method that uses a simple, fully connected neural network to find better parameters tailored to a new given problem instance.
Our method is consistently the fastest to converge while also achieving the best final result.
arXiv Detail & Related papers (2022-08-21T14:05:11Z) - Parameters Fixing Strategy for Quantum Approximate Optimization Algorithm [0.0]
We propose a strategy to give high approximation ratio on average, even at large circuit depths, by initializing QAOA with the optimal parameters obtained from the previous depths.
We test our strategy on the Max-cut problem of certain classes of graphs such as the 3-regular graphs and the Erdős-Rényi graphs.
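One concrete instance of this warm-start idea is the INTERP heuristic of Zhou et al. (2020), which linearly interpolates the depth-p optimum to produce a depth-(p+1) initial guess. The sketch below illustrates that heuristic; it is not necessarily the exact scheme of the paper above:

```python
def interp_next_depth(params):
    """INTERP warm start (Zhou et al. 2020): build a depth-(p+1) initial
    guess by linearly interpolating the optimal depth-p angles. Applied
    separately to the beta and gamma schedules; boundary angles at
    indices 0 and p+1 are taken as zero."""
    p = len(params)
    padded = [0.0] + list(params) + [0.0]  # params_0 = params_{p+1} = 0
    return [((i - 1) / p) * padded[i - 1] + ((p - i + 1) / p) * padded[i]
            for i in range(1, p + 2)]

# Hypothetical depth-2 optimum, for illustration only.
gammas_p3 = interp_next_depth([0.3, 0.6])  # depth-3 initial guess
```

Because successive depth optima tend to follow a smooth schedule, each interpolated guess starts near a good basin, avoiding the barren plateaus that random initialization hits.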
arXiv Detail & Related papers (2021-08-11T15:44:16Z) - Unified Convergence Analysis for Adaptive Optimization with Moving Average Estimator [75.05106948314956]
We show that an increasingly large momentum parameter for the first-order moment is sufficient for adaptive scaling. We also give insights into increasing the momentum in a stagewise manner in accordance with a stagewise decreasing step size.
arXiv Detail & Related papers (2021-04-30T08:50:24Z) - Adaptive pruning-based optimization of parameterized quantum circuits [62.997667081978825]
Variational hybrid quantum-classical algorithms are powerful tools to maximize the use of Noisy Intermediate-Scale Quantum devices.
We propose a strategy for such ansatze used in variational quantum algorithms, which we call "Parameter-Efficient Circuit Training" (PECT).
Instead of optimizing all of the ansatz parameters at once, PECT launches a sequence of variational algorithms.
arXiv Detail & Related papers (2020-10-01T18:14:11Z) - Convergence of adaptive algorithms for weakly convex constrained optimization [59.36386973876765]
We prove the $\tilde{\mathcal{O}}(t^{-1/4})$ rate of convergence for the norm of the gradient of the Moreau envelope.
Our analysis works with mini-batch size of $1$, constant first and second order moment parameters, and possibly smooth optimization domains.
arXiv Detail & Related papers (2020-06-11T17:43:19Z) - Cross Entropy Hyperparameter Optimization for Constrained Problem Hamiltonians Applied to QAOA [68.11912614360878]
Hybrid quantum-classical algorithms such as the Quantum Approximate Optimization Algorithm (QAOA) are considered among the most promising approaches for exploiting near-term quantum computers in practical applications.
Such algorithms are usually implemented in a variational form, combining a classical optimization method with a quantum machine to find good solutions to an optimization problem.
In this study we apply a Cross-Entropy method to shape this landscape, which allows the classical optimizer to find better parameters more easily and hence results in improved performance.
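The Cross-Entropy method referenced here is a general stochastic search: sample candidates from a distribution, keep an elite fraction, refit the distribution, repeat. A minimal one-dimensional sketch, with a toy quadratic objective standing in for the actual QAOA energy landscape:

```python
import random
import statistics

def cross_entropy_search(objective, mu, sigma,
                         n_samples=50, n_elite=10, iters=20):
    """Cross-Entropy method sketch (minimization): draw Gaussian samples,
    keep the n_elite best, refit mean and std to the elite set, repeat.
    Illustrative only; the paper uses it to tune Hamiltonian
    hyperparameters rather than a 1-D toy objective."""
    for _ in range(iters):
        samples = [random.gauss(mu, sigma) for _ in range(n_samples)]
        elite = sorted(samples, key=objective)[:n_elite]  # lowest cost
        mu = statistics.mean(elite)
        sigma = statistics.stdev(elite) + 1e-6  # avoid premature collapse
    return mu

random.seed(0)
best = cross_entropy_search(lambda x: (x - 3.0) ** 2, mu=0.0, sigma=5.0)
```

The distribution contracts around low-cost regions over iterations, which is what "shaping the landscape" buys the classical outer loop.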
arXiv Detail & Related papers (2020-03-11T13:52:41Z) - Accelerating Quantum Approximate Optimization Algorithm using Machine Learning [6.735657356113614]
We propose a machine learning based approach to accelerate quantum approximate optimization algorithm (QAOA) implementation.
QAOA is a quantum-classical hybrid algorithm proposed as a candidate for demonstrating so-called quantum supremacy.
We show that the proposed approach can curtail the number of optimization iterations by up to 65.7%, based on an analysis performed with 264 flavors of graphs.
arXiv Detail & Related papers (2020-02-04T02:21:00Z)
This list is automatically generated from the titles and abstracts of the papers in this site.