Quantum Optimization with Classical Chaos
- URL: http://arxiv.org/abs/2510.01334v1
- Date: Wed, 01 Oct 2025 18:02:46 GMT
- Title: Quantum Optimization with Classical Chaos
- Authors: Malick A. Gaye, Omar Shehab, Paraj Titum, Gregory Quiroz
- Abstract summary: Quantum Approximate Optimization Algorithm (QAOA) is a powerful tool for solving various problems such as Maximum Satisfiability and Maximum Cut. Hard computational problems, however, require deep circuits that place high demands on classical variational parameter optimization.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The Quantum Approximate Optimization Algorithm (QAOA) is a powerful tool in solving various combinatorial problems such as Maximum Satisfiability and Maximum Cut. Hard computational problems, however, require deep circuits that place high demands on classical variational parameter optimization. Ultimately, this has necessitated investigations into alternative methods for effective QAOA parameterizations. Here, we study a parameterization scheme based on classical chaotic recursive mapping, which enables significant reductions in the scaling of the variational parameter space. Through numerical investigations of hard Maximum Satisfiability problems, we demonstrate that the chaotic mapping can effectively match the performance of standard QAOA when subject to a limited number of classical optimization iterations and short-depth circuits. Insight into this behavior is elucidated through the lens of classical dynamical systems and used to inform hybridized schemes that leverage both standard and chaotic parameterizations. It is shown that these hybridized approaches can boost QAOA performance beyond that of the standard approach alone, especially for deep circuits. Through this study, we provide a new perspective that introduces a generalized framework for specifying performant, dynamical-map-based QAOA parameterizations.
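The abstract's central idea, generating a depth-p angle schedule from a low-dimensional chaotic recursion so that the variational search space no longer grows with depth, can be illustrated with a minimal sketch. The choice of the logistic map, the scale factors, and all names below are illustrative assumptions, not the authors' exact scheme:

```python
import numpy as np

def chaotic_angles(p, r=3.9, x0=0.4, gamma_scale=np.pi, beta_scale=np.pi / 2):
    """Generate p QAOA (gamma, beta) pairs from a logistic-map orbit.

    Only the map parameters (r, x0) and the two scale factors are varied,
    so the classical search space stays 4-dimensional at any depth p,
    instead of the 2p free angles of standard QAOA.
    """
    gammas, betas = [], []
    x = x0
    for _ in range(p):
        x = r * x * (1.0 - x)   # logistic recursion x_{k+1} = r * x_k * (1 - x_k)
        gammas.append(gamma_scale * x)
        x = r * x * (1.0 - x)
        betas.append(beta_scale * x)
    return np.array(gammas), np.array(betas)

# angle schedule for a depth-6 circuit
gammas, betas = chaotic_angles(6)
```

A classical optimizer would then tune only `(r, x0, gamma_scale, beta_scale)` while the map unrolls them into a full-depth schedule.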
Related papers
- An Introduction to the Quantum Approximate Optimization Algorithm [51.56484100374058]
The tutorial begins by outlining variational quantum circuits and QUBO problems. Next, it explores the QAOA in detail, covering its Hamiltonian formulation, gate decomposition, and example applications. The tutorial then extends these concepts to higher-order Hamiltonians and discusses the associated symmetries and circuit construction.
arXiv Detail & Related papers (2025-11-23T09:54:20Z) - Transferring linearly fixed QAOA angles: performance and real device results [0.0]
We investigate a simplified approach that combines linear parameterization with parameter transferring, reducing the parameter space to just 4 dimensions regardless of the number of layers. We compare this combined approach with standard QAOA and other parameter-setting strategies such as INTERP and FOURIER, which require computationally demanding incremental layer-by-layer optimization. Our experiments extend from classical simulation to actual quantum hardware implementation on IBM's Eagle processor, demonstrating the approach's viability on current NISQ devices.
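The 4-dimensional linear parameterization described above can be sketched as endpoint interpolation; the particular interpolation form and the names below are assumptions for illustration, not necessarily the paper's exact ansatz:

```python
import numpy as np

def linear_angles(p, g_start, g_end, b_start, b_end):
    """4-parameter linear schedule: the layer-i angles interpolate
    linearly between fixed endpoint values, independent of depth p."""
    t = np.linspace(0.0, 1.0, p)             # layer fractions 0 .. 1
    gammas = g_start + (g_end - g_start) * t
    betas = b_start + (b_end - b_start) * t
    return gammas, betas

# a depth-8 schedule from only 4 free parameters
gammas, betas = linear_angles(8, 0.1, 0.8, 0.7, 0.05)
```

Transferring then amounts to reusing the same 4 endpoint values across problem instances rather than re-optimizing each layer.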
arXiv Detail & Related papers (2025-04-17T04:17:51Z) - MG-Net: Learn to Customize QAOA with Circuit Depth Awareness [51.78425545377329]
Quantum Approximate Optimization Algorithm (QAOA) and its variants exhibit immense potential in tackling optimization challenges.
The requisite circuit depth for satisfactory performance is problem-specific and often exceeds the maximum capability of current quantum devices.
We introduce the Mixer Generator Network (MG-Net), a unified deep learning framework adept at dynamically formulating optimal mixer Hamiltonians.
arXiv Detail & Related papers (2024-09-27T12:28:18Z) - Adiabatic-Passage-Based Parameter Setting for Quantum Approximate Optimization Algorithm [0.7252027234425334]
We propose a novel adiabatic-passage-based parameter-setting method.
This method markedly reduces the optimization cost; applied to the 3-SAT problem, the cost drops to a sublinear level.
arXiv Detail & Related papers (2023-11-30T01:06:41Z) - Benchmarking Metaheuristic-Integrated QAOA against Quantum Annealing [0.0]
The study provides insights into the strengths and limitations of both Quantum Annealing and metaheuristic-integrated QAOA across different problem domains.
The findings suggest that the hybrid approach can leverage classical optimization strategies to enhance the solution quality and convergence speed of QAOA.
arXiv Detail & Related papers (2023-09-28T18:55:22Z) - A Deep Unrolling Model with Hybrid Optimization Structure for Hyperspectral Image Deconvolution [50.13564338607482]
We propose a novel optimization framework for the hyperspectral deconvolution problem, called DeepMix. It consists of three distinct modules: a data consistency module, a module that enforces the effect of the handcrafted regularizers, and a denoising module. This work proposes a context-aware denoising module designed to sustain the advancements achieved by the cooperative efforts of the other modules.
arXiv Detail & Related papers (2023-06-10T08:25:16Z) - Twisted hybrid algorithms for combinatorial optimization [68.8204255655161]
The proposed hybrid algorithms encode a cost function into a problem Hamiltonian and optimize its energy by varying over a set of states with low circuit complexity.
We show that for levels $p=2,\ldots,6$, the level $p$ can be reduced by one while roughly maintaining the expected approximation ratio.
arXiv Detail & Related papers (2022-03-01T19:47:16Z) - Parameters Fixing Strategy for Quantum Approximate Optimization Algorithm [0.0]
We propose a strategy to achieve a high approximation ratio on average, even at large circuit depths, by initializing QAOA with the optimal parameters obtained from the previous depths.
We test our strategy on the Max-Cut problem for certain classes of graphs, such as 3-regular graphs and Erdős–Rényi graphs.
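The parameter-fixing idea, reusing the optimal depth-p angles to initialize depth p+1, can be sketched as follows; the small initial value chosen for the new layer's angles is an assumption, not a detail from the abstract:

```python
import numpy as np

def warm_start_next_depth(gammas, betas, new_gamma=0.01, new_beta=0.01):
    """Build a depth-(p+1) initial point from optimal depth-p angles:
    keep all previously optimized angles fixed as the starting values
    and append a small angle for the newly added layer."""
    return np.append(gammas, new_gamma), np.append(betas, new_beta)

# depth-2 optimum -> depth-3 initialization
g3, b3 = warm_start_next_depth(np.array([0.3, 0.5]), np.array([0.6, 0.2]))
```

Iterating this from depth 1 upward gives the classical optimizer a warm start at every depth instead of a random restart.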
arXiv Detail & Related papers (2021-08-11T15:44:16Z) - Adaptive pruning-based optimization of parameterized quantum circuits [62.997667081978825]
Variational hybrid quantum-classical algorithms are powerful tools to maximize the use of Noisy Intermediate-Scale Quantum devices.
We propose a strategy for such ansatze used in variational quantum algorithms, which we call "Parameter-Efficient Circuit Training" (PECT).
Instead of optimizing all of the ansatz parameters at once, PECT launches a sequence of variational algorithms.
arXiv Detail & Related papers (2020-10-01T18:14:11Z) - A Dynamical Systems Approach for Convergence of the Bayesian EM Algorithm [59.99439951055238]
We show how (discrete-time) Lyapunov stability theory can serve as a powerful tool to aid, or even lead, in the analysis (and potential design) of optimization algorithms that are not necessarily gradient-based.
The particular ML problem that this paper focuses on is that of parameter estimation in an incomplete-data Bayesian framework via the popular optimization algorithm known as maximum a posteriori expectation-maximization (MAP-EM).
We show that fast convergence (linear or quadratic) is achieved, which could have been difficult to unveil without our adopted systems-and-control (S&C) approach.
arXiv Detail & Related papers (2020-06-23T01:34:18Z) - Cross Entropy Hyperparameter Optimization for Constrained Problem Hamiltonians Applied to QAOA [68.11912614360878]
Hybrid quantum-classical algorithms such as the Quantum Approximate Optimization Algorithm (QAOA) are considered one of the most promising approaches for taking advantage of near-term quantum computers in practical applications.
Such algorithms are usually implemented in a variational form, combining a classical optimization method with a quantum machine to find good solutions to an optimization problem.
In this study, we apply a Cross-Entropy method to shape this landscape, which allows the classical optimizer to find better parameters more easily and hence results in improved performance.
arXiv Detail & Related papers (2020-03-11T13:52:41Z) - On Hyper-parameter Tuning for Stochastic Optimization Algorithms [28.88646928299302]
This paper proposes the first algorithmic framework for tuning the hyper-parameters of stochastic optimization algorithms based on reinforcement learning.
The proposed framework can be used as a standard tool for hyper-parameter tuning in such algorithms.
arXiv Detail & Related papers (2020-03-04T12:29:12Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences.