Minimal Constraints in the Parity Formulation of Optimization Problems
- URL: http://arxiv.org/abs/2008.10458v1
- Date: Mon, 24 Aug 2020 14:04:33 GMT
- Title: Minimal Constraints in the Parity Formulation of Optimization Problems
- Authors: Martin Lanthaler and Wolfgang Lechner
- Abstract summary: We introduce a method to find the minimal strength of the constraints required to preserve the correct ground state.
We find that, depending on the problem class, the exponent ranges from linear ($\alpha \propto 1$) to quadratic ($\alpha \propto 2$) scaling with the number of logical qubits.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: As a means to solve optimization problems using quantum computers, the
problem is typically recast into an Ising spin model whose ground state is the
solution of the optimization problem. An alternative to the Ising formulation
is the Lechner-Hauke-Zoller model, which has the form of a lattice gauge model
with nearest-neighbor 4-body constraints. Here we introduce a method to find
the minimal strength of the constraints required to preserve the correct
ground state. Based on this, we derive upper and lower bounds for the minimal
constraint strengths. We find that, depending on the problem class, the
exponent ranges from linear ($\alpha \propto 1$) to quadratic ($\alpha \propto 2$)
scaling with the number of logical qubits.
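To make the construction concrete, here is a small brute-force sketch (ours, not the authors' derivation): N = 4 logical spins with random all-to-all couplings are mapped to 6 parity qubits; three hand-picked closed loops stand in for the nearest-neighbor plaquette constraints, and the constraint strength c is scanned until the constrained parity ground state reproduces the logical ground-state energy.

```python
# Brute-force illustration of a minimal constraint strength in an LHZ-style
# parity mapping. N = 4 logical spins; loops are hard-coded for this size.
import itertools
import numpy as np

rng = np.random.default_rng(0)
N = 4
pairs = [(i, j) for i in range(N) for j in range(i + 1, N)]  # 6 parity qubits
J = {p: rng.normal() for p in pairs}

# Logical ground-state energy by exhaustive search over 2^N spin configurations.
def logical_energy(sigma):
    return -sum(J[(i, j)] * sigma[i] * sigma[j] for (i, j) in pairs)

E_logical = min(logical_energy(s) for s in itertools.product([-1, 1], repeat=N))

# Three independent closed loops; their parity product must be +1 on any
# physical configuration that comes from a logical spin assignment.
idx = {p: k for k, p in enumerate(pairs)}
loops = [((0, 1), (1, 2), (0, 2)), ((0, 1), (1, 3), (0, 3)), ((0, 2), (2, 3), (0, 3))]

def parity_energy(s, c):
    e = -sum(J[p] * s[idx[p]] for p in pairs)
    # each violated loop (parity product -1) is penalized with strength c
    e += sum(c * (1 - np.prod([s[idx[p]] for p in loop])) / 2 for loop in loops)
    return e

# Scan the constraint strength; report the smallest scanned c whose parity
# ground state is constraint-satisfying and matches the logical energy.
states = list(itertools.product([-1, 1], repeat=len(pairs)))
for c in np.linspace(0.0, 10.0, 201):
    gs = min(states, key=lambda s: parity_energy(s, c))
    valid = all(np.prod([gs[idx[p]] for p in loop]) == 1 for loop in loops)
    if valid and np.isclose(parity_energy(gs, c), E_logical):
        print(f"smallest scanned constraint strength: c = {c:.2f}")
        break
```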
Related papers
- A quantum central path algorithm for linear optimization [5.450016817940232]
We propose a novel quantum algorithm for solving linear optimization problems by quantum-mechanical simulation of the central path.
This approach yields an algorithm for solving linear optimization problems involving $m$ constraints and $n$ variables to $\varepsilon$-optimality.
In the standard gate model (i.e., without access to quantum RAM), our algorithm can obtain highly precise solutions to LO problems using at most $\mathcal{O}\left(\sqrt{m+n}\,\textsf{nnz}(A)\,\frac{R_1}{\dots}\right)$ …
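For intuition about the object being followed, here is a hedged classical sketch (ours; toy problem data and constants): standard barrier path-following on a tiny LP, i.e., the central path that the paper's algorithm traces by quantum-mechanical simulation instead.

```python
# Classical central-path following for a tiny LP: minimize c^T x s.t. G x <= h.
import numpy as np

c = np.array([1.0, 2.0])
G = np.array([[-1.0, 0.0], [0.0, -1.0], [1.0, 1.0]])
h = np.array([0.0, 0.0, 1.0])          # feasible set: x >= 0, x1 + x2 <= 1

def newton_step(x, t):
    r = h - G @ x                      # slacks, must stay positive
    grad = t * c + G.T @ (1.0 / r)     # gradient of t*c^T x - sum(log r)
    H = G.T @ np.diag(1.0 / r**2) @ G  # Hessian of the log barrier
    return np.linalg.solve(H, -grad)

x, t = np.array([0.25, 0.25]), 1.0
while len(h) / t > 1e-8:               # duality-gap bound m/t
    for _ in range(50):                # damped Newton to re-center on the path
        dx = newton_step(x, t)
        step = 1.0
        while np.any(h - G @ (x + step * dx) <= 0):  # stay strictly feasible
            step *= 0.5
        x = x + step * dx
        if np.linalg.norm(dx) < 1e-10:
            break
    t *= 10.0                          # follow the path toward the optimum
print("approximate LP solution:", np.round(x, 6))  # expect ~ (0, 0)
```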
arXiv Detail & Related papers (2023-11-07T13:26:20Z) - Constrained Optimization via Exact Augmented Lagrangian and Randomized
Iterative Sketching [55.28394191394675]
We develop an adaptive inexact Newton method for equality-constrained nonlinear, nonconvex optimization problems.
We demonstrate the superior performance of our method on benchmark nonlinear problems, constrained logistic regression with data from LIBSVM, and a PDE-constrained problem.
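As a rough illustration (ours, much simplified relative to the paper), the following sketch runs an augmented-Lagrangian loop whose inner Newton systems are solved only inexactly by a randomized iterative solver (Kaczmarz), in the spirit of randomized sketching; the quadratic problem and all constants are hand-picked.

```python
# Augmented Lagrangian with inexact, randomized inner solves on a toy QP:
# minimize 0.5 x^T Q x subject to a^T x = 1.
import numpy as np

rng = np.random.default_rng(1)
n = 10
Q = np.diag(rng.uniform(1.0, 5.0, n))        # objective: f(x) = 0.5 x^T Q x
a = np.ones(n)                               # equality constraint: a^T x = 1

def kaczmarz(H, b, iters=2000):
    """Inexact solve of H d = b by randomized Kaczmarz row projections."""
    d = np.zeros_like(b)
    norms = np.sum(H**2, axis=1)
    probs = norms / norms.sum()
    for _ in range(iters):
        i = rng.choice(len(b), p=probs)
        d += (b[i] - H[i] @ d) / norms[i] * H[i]
    return d

x, lam, rho = np.zeros(n), 0.0, 1.0
for outer in range(15):
    viol = a @ x - 1.0                       # constraint violation
    grad = Q @ x + (lam + rho * viol) * a    # gradient of the augmented Lagrangian
    H = Q + rho * np.outer(a, a)             # its Hessian (constant here)
    x = x + kaczmarz(H, -grad)               # inexact Newton step
    lam += rho * (a @ x - 1.0)               # multiplier (dual) update
print("constraint violation:", abs(a @ x - 1.0))
```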
arXiv Detail & Related papers (2023-05-28T06:33:37Z) - A Block Coordinate Descent Method for Nonsmooth Composite Optimization
under Orthogonality Constraints [7.716156977428555]
Nonsmooth composite optimization with orthogonality constraints has a broad spectrum of applications in statistical learning and data science.
OBCD is a new block coordinate descent method for solving such nonsmooth composite problems.
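A toy sketch of the block-coordinate idea (ours; a smooth objective is used for brevity, while the paper targets nonsmooth composite terms): each update applies a 2x2 rotation block, so the iterate stays exactly orthogonal.

```python
# Block-coordinate descent over orthogonal matrices: minimize ||X A - B||_F^2
# by exact minimization over one 2x2 rotation block per step.
import numpy as np

rng = np.random.default_rng(2)
n = 6
A = rng.normal(size=(n, n))
X_true = np.linalg.qr(rng.normal(size=(n, n)))[0]
if np.linalg.det(X_true) < 0:
    X_true[:, 0] *= -1.0        # ensure det +1 so rotations starting at I can reach it
B = X_true @ A                  # f(X) = ||X A - B||_F^2 is minimized at X_true

X = np.eye(n)
for sweep in range(400):
    i, j = rng.choice(n, size=2, replace=False)   # random 2-row coordinate block
    R = X @ A
    # Restricted to a rotation of rows (i, j), f is const - 2(p cos t + q sin t),
    # so the exact block minimizer is t* = atan2(q, p).
    p = R[i] @ B[i] + R[j] @ B[j]
    q = R[i] @ B[j] - R[j] @ B[i]
    t = np.arctan2(q, p)
    ci, si = np.cos(t), np.sin(t)
    # simultaneous update of both rows keeps X exactly orthogonal
    X[[i, j]] = np.array([ci * X[i] - si * X[j], si * X[i] + ci * X[j]])
print("objective:", np.linalg.norm(X @ A - B) ** 2)
print("orthogonal:", np.allclose(X.T @ X, np.eye(n)))
```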
arXiv Detail & Related papers (2023-04-07T13:44:59Z) - Best Policy Identification in Linear MDPs [70.57916977441262]
We investigate the problem of best policy identification in discounted linear Markov decision processes in the fixed-confidence setting under a generative model.
The lower bound, obtained as the solution of an intricate non-convex optimization program, can be used as the starting point to devise such algorithms.
arXiv Detail & Related papers (2022-08-11T04:12:50Z) - Minimax Optimal Quantization of Linear Models: Information-Theoretic
Limits and Efficient Algorithms [59.724977092582535]
We consider the problem of quantizing a linear model learned from measurements.
We derive an information-theoretic lower bound for the minimax risk under this setting.
We show that our method and upper-bounds can be extended for two-layer ReLU neural networks.
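To illustrate the setting (ours; a plain uniform scalar quantizer as a baseline, not the paper's minimax-optimal scheme): fit a linear model from noisy measurements, quantize its weights at several bit widths, and measure the test risk.

```python
# Fit a linear model, then quantize its weights to b bits per coordinate.
import numpy as np

rng = np.random.default_rng(3)
n, d = 500, 10
w_star = rng.normal(size=d)
X = rng.normal(size=(n, d))
y = X @ w_star + 0.1 * rng.normal(size=n)

w_hat, *_ = np.linalg.lstsq(X, y, rcond=None)   # learned (unquantized) model

def quantize(w, bits, radius=4.0):
    """Uniform scalar quantizer on [-radius, radius] with 2^bits levels."""
    levels = 2 ** bits
    step = 2 * radius / (levels - 1)
    return np.clip(np.round((w + radius) / step) * step - radius, -radius, radius)

X_test = rng.normal(size=(2000, d))
y_test = X_test @ w_star
for bits in (2, 4, 8):
    w_q = quantize(w_hat, bits)
    risk = np.mean((X_test @ w_q - y_test) ** 2)
    print(f"{bits} bits per weight: test risk {risk:.5f}")
```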
arXiv Detail & Related papers (2022-02-23T02:39:04Z) - A Momentum-Assisted Single-Timescale Stochastic Approximation Algorithm
for Bilevel Optimization [112.59170319105971]
We propose a new algorithm -- the Momentum-assisted Single-Timescale Stochastic Approximation (MSTSA) algorithm -- for tackling bilevel optimization problems.
MSTSA allows us to control the error in its iterations caused by inexact solutions of the lower-level subproblem.
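The single-timescale structure can be sketched on a toy quadratic bilevel instance (ours; not the exact MSTSA updates): one inexact lower-level step per iteration, followed by a momentum-assisted hypergradient step on the upper level.

```python
# Single-timescale bilevel sketch: the lower level is never solved to optimality.
import numpy as np

rng = np.random.default_rng(4)
d = 5
c = rng.normal(size=d)

# Lower level: g(x, y) = 0.5||y - x||^2, so y*(x) = x and dy*/dx = I.
# Upper level: f(y) = 0.5||y - c||^2, so the hypergradient at (x, y) is y - c.
def hypergradient(x, y):
    return y - c

x, y = rng.normal(size=d), rng.normal(size=d)
m = np.zeros(d)                       # momentum buffer for the upper-level step
alpha, beta, mu = 0.5, 0.9, 0.5       # hand-picked step sizes and momentum
for t in range(500):
    y = y - mu * (y - x)              # ONE lower-level step per iteration
    m = beta * m + (1 - beta) * hypergradient(x, y)
    x = x - alpha * m                 # momentum-assisted upper-level step
print("upper-level error ||x - c||:", np.linalg.norm(x - c))
```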
arXiv Detail & Related papers (2021-02-15T07:10:33Z) - Maximin Optimization for Binary Regression [24.351803097593887]
Regression problems with binary weights are ubiquitous in quantized learning models and digital communication systems.
The Lagrangian method also performs well in regression with cross-entropy loss, as well as in non-convex multi-layer saddle-point optimization.
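For a feel of the problem class (ours; not the paper's maximin method), this sketch compares brute-force binary-weight least squares with an unconstrained-then-round baseline on a tiny instance.

```python
# Least-squares regression with binary weights w in {-1, +1}^d.
import itertools
import numpy as np

rng = np.random.default_rng(6)
n, d = 40, 8
w_true = rng.choice([-1.0, 1.0], size=d)
X = rng.normal(size=(n, d))
y = X @ w_true + 0.3 * rng.normal(size=n)

# Exact solution by brute force over all 2^d sign patterns (tiny d only).
best = min(itertools.product([-1.0, 1.0], repeat=d),
           key=lambda w: np.linalg.norm(X @ np.array(w) - y))
# Baseline: unconstrained least squares followed by sign rounding.
w_ls, *_ = np.linalg.lstsq(X, y, rcond=None)
rounded = np.sign(w_ls)
print("brute force loss:", np.linalg.norm(X @ np.array(best) - y) ** 2)
print("round-LS loss:  ", np.linalg.norm(X @ rounded - y) ** 2)
```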
arXiv Detail & Related papers (2020-10-10T19:47:40Z) - On Suboptimality of Least Squares with Application to Estimation of
Convex Bodies [74.39616164169131]
We settle an open problem regarding optimality of Least Squares in estimating a convex set from noisy support function measurements in dimension $d \geq 6$.
We establish that Least Squares is sub-optimal, and achieves a rate of $\tilde{\Theta}_d(n^{-2/(d-1)})$ whereas the minimax rate is $\Theta_d(n^{-4/(d+3)})$.
arXiv Detail & Related papers (2020-06-07T05:19:00Z) - Gradient-Free Methods for Saddle-Point Problem [125.99533416395765]
We generalize the approach of Gasnikov et al., 2017, which allows one to solve (stochastic) convex optimization problems with an inexact gradient-free oracle.
Our approach reduces the required number of oracle calls by a factor of $\frac{n}{\log n}$.
In the second part of the paper, we analyze the case when such an assumption cannot be made and propose a general approach for modifying the method to solve this problem.
arXiv Detail & Related papers (2020-05-12T16:44:27Z)
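The two-point gradient-free oracle underlying such methods can be sketched as follows (the toy convex-concave objective and hand-picked constants are ours): finite-difference estimates stand in for gradients in gradient descent-ascent.

```python
# Two-point zeroth-order gradient estimates in gradient descent-ascent.
import numpy as np

rng = np.random.default_rng(5)

def f(x, y):
    # Toy convex-concave objective with saddle point at x = y = 0.
    return 0.5 * x @ x + x @ y - 0.5 * y @ y

def zo_grad(fun, z, tau=1e-4, samples=20):
    """Two-point zeroth-order gradient estimate of fun at z."""
    g = np.zeros_like(z)
    for _ in range(samples):
        u = rng.normal(size=z.shape)
        g += (fun(z + tau * u) - fun(z - tau * u)) / (2 * tau) * u
    return g / samples

d = 4
x, y = rng.normal(size=d), rng.normal(size=d)
eta = 0.05
for t in range(2000):
    gx = zo_grad(lambda v: f(v, y), x)       # estimated gradient in x
    gy = zo_grad(lambda v: f(x, v), y)       # estimated gradient in y
    x, y = x - eta * gx, y + eta * gy        # descent in x, ascent in y
print("distance to saddle:", np.linalg.norm(np.concatenate([x, y])))
```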