Efficient QUBO transformation for Higher Degree Pseudo Boolean Functions
- URL: http://arxiv.org/abs/2107.11695v1
- Date: Sat, 24 Jul 2021 22:13:42 GMT
- Title: Efficient QUBO transformation for Higher Degree Pseudo Boolean Functions
- Authors: Amit Verma, Mark Lewis, Gary Kochenberger
- Abstract summary: It is useful to have a method for transforming higher degree pseudo-Boolean problems to QUBO format.
This paper improves on the existing cubic-to-quadratic transformation approach by minimizing the number of additional variables as well as the penalty coefficient.
- Score: 0.46040036610482665
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Quadratic Unconstrained Binary Optimization (QUBO) is recognized as a
unifying framework for modeling a wide range of problems. Problems can be
solved with commercial solvers customized for solving QUBO, and since QUBO
formulations have degree two, it is useful to have a method for transforming higher-degree
pseudo-Boolean problems to QUBO format. The standard transformation approach
requires additional auxiliary variables supported by penalty terms for each
higher degree term. This paper improves on the existing cubic-to-quadratic
transformation approach by minimizing both the number of additional variables
and the penalty coefficient. Extensive experimental testing on Max 3-SAT modeled
as QUBO shows a near-100% reduction in the size of the subproblem used to
minimize the number of auxiliary variables.
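For context, the "standard transformation approach" referred to above is typically a Rosenberg-style substitution: a product of two variables inside a higher-degree term is replaced by an auxiliary binary variable, enforced by a quadratic penalty. The sketch below is a minimal, illustrative version of that per-term substitution under assumed names and an assumed penalty weight M; it is not the improved transformation proposed in the paper. A second sketch, showing how a single Max 3-SAT clause is itself written in QUBO form, appears after the related-papers list below.

```python
# Minimal sketch of the standard cubic-to-quadratic substitution (Rosenberg-style),
# not the improved transformation proposed in the paper. For a cubic term
# c * x1 * x2 * x3 (minimization), the product x1*x2 is replaced by an auxiliary
# binary y, enforced by the penalty M * (x1*x2 - 2*x1*y - 2*x2*y + 3*y),
# which equals 0 when y == x1*x2 and is >= 1 otherwise.
from itertools import product

M = 10.0  # illustrative penalty weight; must dominate the gain from violating y == x1*x2


def cubic_term(x1, x2, x3, c=1.0):
    """Original degree-3 term c * x1 * x2 * x3."""
    return c * x1 * x2 * x3


def quadratized_term(x1, x2, x3, y, c=1.0):
    """Degree-2 reformulation: c * y * x3 plus the penalty tying y to x1*x2."""
    penalty = x1 * x2 - 2 * x1 * y - 2 * x2 * y + 3 * y
    return c * y * x3 + M * penalty


# Sanity check: minimizing over the auxiliary y recovers the cubic term exactly.
for x1, x2, x3 in product((0, 1), repeat=3):
    best = min(quadratized_term(x1, x2, x3, y) for y in (0, 1))
    assert best == cubic_term(x1, x2, x3)
print("Quadratization matches the cubic term on all 8 assignments.")
```

The overhead visible in this sketch is what the paper targets: every higher-degree term handled this way adds an auxiliary variable and a penalty term whose coefficient must be large enough to dominate the objective, and the proposed approach reduces both.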
Related papers
- Transforming optimization problems into a QUBO form: A tutorial [45.31975029877049]
Practically relevant problems of quadratic optimization often contain multidimensional arrays of variables interconnected by linear constraints.
The paper identifies and considers three main transformations of the original problem statement.
Convenient formulas for calculations are presented and proven, simplifying the implementation of these transformations.
arXiv Detail & Related papers (2024-10-28T14:38:09Z) - Double Momentum Method for Lower-Level Constrained Bilevel Optimization [31.28781889710351]
We propose a new hypergradient for LCBO leveraging the nonsmooth implicit function theorem instead of relying on restrictive assumptions.
In addition, we propose a single-loop, single-timescale iteration algorithm based on the double-momentum method and an adaptive step size method.
arXiv Detail & Related papers (2024-06-25T09:05:22Z) - Variable Substitution and Bilinear Programming for Aligning Partially Overlapping Point Sets [48.1015832267945]
This research presents a method that meets these requirements by minimizing the objective function of the RPM algorithm.
A branch-and-bound (BnB) algorithm is devised, which branches solely over the parameters, thereby boosting the convergence rate.
Empirical evaluations demonstrate better robustness of the proposed methodology against non-rigid deformation, positional noise, and outliers, when compared with prevailing state-of-the-art transformations.
arXiv Detail & Related papers (2024-05-14T13:28:57Z) - Pattern QUBOs: Algorithmic construction of 3SAT-to-QUBO transformations [9.466991829376576]
We will introduce the name Pattern QUBO for a concept that has been used implicitly in the construction of 3SAT-to-QUBO transformations before.
We will show that approximate 3SAT-to-QUBO transformations can nevertheless be very effective in some cases.
arXiv Detail & Related papers (2023-05-04T08:57:51Z) - Influence of Different 3SAT-to-QUBO Transformations on the Solution
Quality of Quantum Annealing: A Benchmark Study [9.466991829376576]
We show that the choice of QUBO transformation can significantly impact the number of correct solutions the quantum annealer returns.
We also empirically show that the number of different quadratic values of a QUBO instance, combined with their range, can significantly impact the solution quality.
arXiv Detail & Related papers (2023-05-01T08:40:58Z) - Enhanced Bilevel Optimization via Bregman Distance [104.96004056928474]
We propose a bilevel optimization method based on the Bregman distance.
We also propose an accelerated version of the SBiO-BreD method (ASBiO-BreD) using a variance reduction technique.
arXiv Detail & Related papers (2021-07-26T16:18:43Z) - Square Root Bundle Adjustment for Large-Scale Reconstruction [56.44094187152862]
We propose a new formulation for the bundle adjustment problem which relies on nullspace marginalization of landmark variables by QR decomposition.
Our approach, which we call square root bundle adjustment, is algebraically equivalent to the commonly used Schur complement trick.
We show in real-world experiments with the BAL datasets that even in single precision the proposed solver achieves on average equally accurate solutions.
arXiv Detail & Related papers (2021-03-02T16:26:20Z) - Hybrid Trilinear and Bilinear Programming for Aligning Partially
Overlapping Point Sets [85.71360365315128]
In many applications, we need algorithms which can align partially overlapping point sets and are invariant to the corresponding transformations, as in the RPM algorithm.
We first show that the objective is a cubic polynomial function. We then utilize the convex envelopes of trilinear and bilinear monomials to derive its lower bound.
We next develop a branch-and-bound (BnB) algorithm which only branches over the transformation variables and runs efficiently.
arXiv Detail & Related papers (2021-01-19T04:24:23Z) - A Hybrid Framework Using a QUBO Solver For Permutation-Based
Combinatorial Optimization [5.460573052311485]
We propose a hybrid framework to solve large-scale permutation-based problems using a high-performance quadratic unconstrained binary optimization solver.
We propose techniques to overcome the challenges of using a QUBO solver that typically comes with a limited number of bits.
arXiv Detail & Related papers (2020-09-27T07:15:25Z) - Boosting Data Reduction for the Maximum Weight Independent Set Problem
Using Increasing Transformations [59.84561168501493]
We introduce new generalized data reduction and transformation rules for the maximum weight independent set problem.
Surprisingly, these so-called increasing transformations can simplify the problem and also open up the reduction space to yield even smaller irreducible graphs later in the algorithm.
Our algorithm computes significantly smaller irreducible graphs on all except one instance, solves more instances to optimality than previously possible, is up to two orders of magnitude faster than the best state-of-the-art solver, and finds higher-quality solutions than solvers DynWVC and HILS.
arXiv Detail & Related papers (2020-08-12T08:52:50Z) - Conditional gradient methods for stochastically constrained convex
minimization [54.53786593679331]
We propose two novel conditional gradient-based methods for solving structured convex optimization problems.
The most important feature of our framework is that only a subset of the constraints is processed at each iteration.
Our algorithms rely on variance reduction and smoothing used in conjunction with conditional gradient steps, and are accompanied by rigorous convergence guarantees.
arXiv Detail & Related papers (2020-07-07T21:26:35Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.