Convergence guarantee for linearly-constrained combinatorial optimization with a quantum alternating operator ansatz
- URL: http://arxiv.org/abs/2409.18829v1
- Date: Fri, 27 Sep 2024 15:23:47 GMT
- Title: Convergence guarantee for linearly-constrained combinatorial optimization with a quantum alternating operator ansatz
- Authors: Brayden Goldstein-Gelb, Phillip C. Lotshaw
- Abstract summary: We present a quantum alternating operator ansatz (QAOA$^+$) that solves a class of linearly constrained optimization problems.
For problems in this class, we devise circuits that provably converge to the optimal solution as the number of circuit layers increases.
This analysis extends QAOA$^+$ performance guarantees to a more general set of linearly-constrained problems and provides tools for future generalizations.
- Score: 0.0
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: We present a quantum alternating operator ansatz (QAOA$^+$) that solves a class of linearly constrained optimization problems by evolving a quantum state within a Hilbert subspace of feasible problem solutions. Our main focus is on a class of problems with a linear constraint containing sequential integer coefficients. For problems in this class, we devise QAOA$^+$ circuits that provably converge to the optimal solution as the number of circuit layers increases, generalizing previous guarantees for solving unconstrained problems or problems with symmetric constraints. Our approach includes asymmetric ``mixing'' Hamiltonians that drive transitions between feasible states, as well as a method to incorporate an arbitrary known feasible solution as the initial state, each of which can be applied beyond the specific linear constraints considered here. This analysis extends QAOA$^+$ performance guarantees to a more general set of linearly-constrained problems and provides tools for future generalizations.
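As a rough illustration of the structure described in the abstract, the following minimal numpy sketch alternates phase-separation and mixing layers on the feasible subspace of a toy constraint $1x_1 + 2x_2 + 3x_3 = 3$, starting from a known feasible solution. The objective, mixer matrix, and angles are placeholders chosen for illustration only; this is not the paper's circuit construction.

```python
import numpy as np
from itertools import product

# Illustrative sketch only (assumed toy instance, not the paper's circuit construction):
# alternate phase-separation and mixing layers on the feasible subspace of the
# linear constraint 1*x1 + 2*x2 + 3*x3 = 3, starting from a known feasible solution.

def feasible_states(coeffs, b):
    """Enumerate bit strings x with sum_i coeffs[i] * x_i = b."""
    n = len(coeffs)
    return [x for x in product((0, 1), repeat=n) if np.dot(coeffs, x) == b]

def qaoa_plus_expectation(cost, mixer, x0_index, gammas, betas):
    """Evolve the basis state |x0> by alternating layers exp(-i*gamma*C) and
    exp(-i*beta*M), both restricted to the feasible subspace, then return <C>."""
    state = np.zeros(len(cost), dtype=complex)
    state[x0_index] = 1.0                       # known feasible solution as initial state
    eigvals, eigvecs = np.linalg.eigh(mixer)    # mixer is Hermitian on the subspace
    for gamma, beta in zip(gammas, betas):
        state = np.exp(-1j * gamma * cost) * state  # phase-separation layer (C is diagonal)
        state = eigvecs @ (np.exp(-1j * beta * eigvals) * (eigvecs.conj().T @ state))  # mixing layer
    return float(np.abs(state) ** 2 @ cost)

coeffs, b = [1, 2, 3], 3
feasible = feasible_states(coeffs, b)                      # [(0, 0, 1), (1, 1, 0)]
cost = np.array([sum(x) for x in feasible], dtype=float)   # placeholder objective: Hamming weight
mixer = np.ones((len(feasible),) * 2) - np.eye(len(feasible))  # toy mixer coupling feasible states
print(qaoa_plus_expectation(cost, mixer, x0_index=0, gammas=[0.4, 0.7], betas=[0.3, 0.5]))
```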
Related papers
- Cons-training tensor networks [2.8834278113855896]
We introduce a novel family of tensor networks, termed constrained matrix product states (MPS).
These networks exactly incorporate arbitrary discrete linear constraints, including inequalities, into sparse block structures.
These networks are particularly tailored for modeling distributions with support strictly over the feasible space.
arXiv Detail & Related papers (2024-05-15T00:13:18Z) - Symmetries and Dimension Reduction in Quantum Approximate Optimization
Algorithm [1.3469999282609788]
We focus on the generalized formulation of optimization problems defined on sets of $n$-element $d$-ary strings.
Our main contribution encompasses dimension reductions for the originally proposed QAOA.
Restricting the algorithm to spaces of smaller dimension may lead to significant acceleration of both quantum and classical simulation of circuits.
arXiv Detail & Related papers (2023-09-25T00:35:40Z) - Constrained Optimization via Exact Augmented Lagrangian and Randomized
Iterative Sketching [55.28394191394675]
We develop an adaptive inexact Newton method for equality-constrained nonlinear, nonconvex optimization problems.
We demonstrate the superior performance of our method on benchmark nonlinear problems, constrained logistic regression with data from LIBSVM, and a PDE-constrained problem.
arXiv Detail & Related papers (2023-05-28T06:33:37Z) - Approximation of optimization problems with constraints through kernel
Sum-Of-Squares [77.27820145069515]
We show that pointwise inequalities are turned into equalities within a class of nonnegative kSoS functions.
We also show that focusing on pointwise equality constraints enables the use of scattering inequalities to mitigate the curse of dimensionality in sampling the constraints.
arXiv Detail & Related papers (2023-01-16T10:30:04Z) - Quantum Worst-Case to Average-Case Reductions for All Linear Problems [66.65497337069792]
We study the problem of designing worst-case to average-case reductions for quantum algorithms.
We provide an explicit and efficient transformation of quantum algorithms that are only correct on a small fraction of their inputs into ones that are correct on all inputs.
arXiv Detail & Related papers (2022-12-06T22:01:49Z) - Symmetric Tensor Networks for Generative Modeling and Constrained
Combinatorial Optimization [72.41480594026815]
Constrained optimization problems abound in industry, from portfolio optimization to logistics.
One of the major roadblocks in solving these problems is the presence of non-trivial hard constraints which limit the valid search space.
In this work, we encode arbitrary integer-valued equality constraints of the form $Ax=b$ directly into U(1) symmetric tensor networks (TNs) and leverage their applicability as quantum-inspired generative models.
arXiv Detail & Related papers (2022-11-16T18:59:54Z) - Faster Algorithm and Sharper Analysis for Constrained Markov Decision
Process [56.55075925645864]
The problem of constrained Markov decision process (CMDP) is investigated, where an agent aims to maximize the expected accumulated discounted reward subject to multiple constraints.
A new primal-dual approach is proposed with a novel integration of three ingredients: a regularized policy optimizer, a dual regularizer, and Nesterov's gradient descent on the dual.
This is the first demonstration that nonconcave CMDP problems can attain the complexity lower bound of $\mathcal{O}(1/\epsilon)$ for convex optimization subject to convex constraints.
arXiv Detail & Related papers (2021-10-20T02:57:21Z) - Modeling Linear Inequality Constraints in Quadratic Binary Optimization
for Variational Quantum Eigensolver [0.0]
This paper introduces the use of tailored variational forms for the variational quantum eigensolver.
Four constraints that commonly appear in optimization problems are modeled.
The main advantage of the proposed methodology is that the number of parameters in the variational form remains constant.
arXiv Detail & Related papers (2020-07-26T23:36:22Z) - Conditional gradient methods for stochastically constrained convex
minimization [54.53786593679331]
We propose two novel conditional gradient-based methods for solving structured convex optimization problems.
The most important feature of our framework is that only a subset of the constraints is processed at each iteration.
Our algorithms rely on variance reduction and smoothing used in conjunction with conditional gradient steps, and are accompanied by rigorous convergence guarantees.
arXiv Detail & Related papers (2020-07-07T21:26:35Z) - Constructing Driver Hamiltonians for Optimization Problems with Linear
Constraints [0.0]
We develop a simple and intuitive framework for reasoning about the commutation of Hamiltonians with linear constraints.
Because unitary operators are exponentials of Hermitian operators, these results can also be applied to the construction of mixers in the Quantum Alternating Operator Ansatz framework.
arXiv Detail & Related papers (2020-06-22T06:47:50Z)
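The commutation criterion mentioned in the last entry above can be checked numerically. The sketch below is a toy assumption, not the framework of either paper: it builds the diagonal constraint operator for $x_1 + 2x_2 + 3x_3$ and verifies that a two-qubit XY hopping term fails to commute with it, while a term coupling the equal-value feasible states $|110\rangle$ and $|001\rangle$ does commute and hence preserves feasibility.

```python
import numpy as np
from functools import reduce

# Toy numerical check (an assumption, not the framework from either paper): a mixer
# preserves feasibility of the linear constraint sum_i c_i x_i = b iff it commutes
# with the diagonal constraint operator C = sum_i c_i (I - Z_i) / 2.

I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def op_on(op, i, n):
    """Embed a single-qubit operator on qubit i of an n-qubit register."""
    return reduce(np.kron, [op if j == i else I2 for j in range(n)])

def constraint_operator(coeffs):
    """Diagonal operator with eigenvalue sum_i coeffs[i] * x_i on basis state |x>."""
    n = len(coeffs)
    return sum(c * op_on((I2 - Z) / 2, i, n) for i, c in enumerate(coeffs))

def commutes(A, B, tol=1e-10):
    return np.allclose(A @ B - B @ A, 0, atol=tol)

def basis_state(bits):
    return reduce(np.kron, [np.array([1, 0], dtype=complex) if b == 0
                            else np.array([0, 1], dtype=complex) for b in bits])

C = constraint_operator([1, 2, 3])   # constraint value x1 + 2*x2 + 3*x3

# A two-qubit XY hop between qubits 0 and 1 moves weight between constraint values 1 and 2.
hop_01 = op_on(X, 0, 3) @ op_on(X, 1, 3) + op_on(Y, 0, 3) @ op_on(Y, 1, 3)
print(commutes(hop_01, C))           # False: it leaves the feasible subspace

# A term coupling |110> and |001>, which share constraint value 3, stays feasible.
swap_feasible = np.outer(basis_state([1, 1, 0]), basis_state([0, 0, 1]))
swap_feasible = swap_feasible + swap_feasible.conj().T
print(commutes(swap_feasible, C))    # True: exp(-i*beta*mixer) preserves feasibility
```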