Transforming optimization problems into a QUBO form: A tutorial
- URL: http://arxiv.org/abs/2410.21074v1
- Date: Mon, 28 Oct 2024 14:38:09 GMT
- Title: Transforming optimization problems into a QUBO form: A tutorial
- Authors: Alexander M. Semenov, Sergey R. Usmanov, Aleksey K. Fedorov
- Abstract summary: Practically relevant problems of quadratic optimization often contain multidimensional arrays of variables interconnected by linear constraints.
The paper identifies and considers three main transformations of the original problem statement.
Convenient formulas for calculations are presented and proven, simplifying the implementation of these transformations.
- Score: 45.31975029877049
- Abstract: Practically relevant problems of quadratic optimization often contain multidimensional arrays of variables interconnected by linear constraints, such as equalities and inequalities. The values of each variable depend on its specific meaning and can be binary, integer, discrete, and continuous. These circumstances make it technically difficult to reduce the original problem statement to the QUBO form. The paper identifies and considers three main transformations of the original problem statement, namely, the transition from a multidimensional to a one-dimensional array of variables, the transition in mixed problems to binary variables, and the inclusion of linear constraints in the objective function in the form of quadratic penalties. Convenient formulas for calculations are presented and proven, simplifying the implementation of these transformations. In particular, the formulas for the transition in the problem statement from a multidimensional to a one-dimensional array of variables are based on the use of the Kronecker product of matrices. The considered transformations are illustrated by numerous examples.
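As an illustration of the third transformation the abstract describes (including linear constraints in the objective as quadratic penalties), the sketch below folds an equality constraint A x = b over binary x into the QUBO matrix. This is a generic illustration, not the paper's own formulas: the function names, the toy instance, and the penalty weight are assumptions chosen for the example. Because x_i^2 = x_i for binary variables, the linear part of the penalty lands on the diagonal, and the constant offset lam * b.b can be dropped.

```python
from itertools import product

def qubo_with_equality_penalty(Q, A, b, lam):
    """Fold the constraints A x = b (x binary) into Q as the quadratic
    penalty lam * ||A x - b||^2; linear terms go on the diagonal since
    x_i^2 = x_i, and the constant offset lam * b.b is dropped."""
    n = len(Q)
    Qp = [row[:] for row in Q]
    for i in range(n):
        for j in range(n):
            Qp[i][j] += lam * sum(A[k][i] * A[k][j] for k in range(len(A)))
        Qp[i][i] -= 2 * lam * sum(A[k][i] * b[k] for k in range(len(A)))
    return Qp

def evaluate(Qp, x):
    n = len(x)
    return sum(Qp[i][j] * x[i] * x[j] for i in range(n) for j in range(n))

# Toy instance: minimize x0*x1 subject to x0 + x1 = 1 over x in {0,1}^2.
Q = [[0.0, 0.5], [0.5, 0.0]]        # symmetric: x^T Q x = x0*x1
A, b = [[1.0, 1.0]], [1.0]
Qp = qubo_with_equality_penalty(Q, A, b, lam=10.0)

# Brute-force check: the penalized minimum lies on the feasible set.
best = min(product((0, 1), repeat=2), key=lambda x: evaluate(Qp, x))
print(best)  # a feasible point, i.e. best[0] + best[1] == 1
```

The penalty weight lam must dominate the scale of the objective for the penalized minimum to stay feasible; the paper's other two transformations (index flattening via Kronecker products and binary encoding of integer variables) are preprocessing steps that produce the Q, A, b used here.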
Related papers
- Online Stability Improvement of Groebner Basis Solvers using Deep Learning [29.805408457645886]
We show that different variable or monomial orderings lead to different elimination templates.
We then prove that the original set of coefficients may contain sufficient information to train a selection of a good solver.
arXiv Detail & Related papers (2024-01-17T16:51:28Z)
- Data-driven path collective variables [0.0]
We propose a new method for the generation, optimization, and comparison of collective variables.
The resulting collective variable is one-dimensional, interpretable, and differentiable.
We demonstrate the validity of the method on two different applications.
arXiv Detail & Related papers (2023-12-21T14:07:47Z)
- Outlier detection in regression: conic quadratic formulations [3.04585143845864]
In many applications, it is important to account for the presence of outliers, i.e., corrupted input data points.
Existing approaches in the literature, typically relying on the linearization of the cubic terms using big-M constraints, suffer from weak relaxation and poor performance in practice.
In this work we derive stronger second-order conic relaxations that do not involve big-M constraints.
arXiv Detail & Related papers (2023-07-12T07:44:13Z)
- Manifold Gaussian Variational Bayes on the Precision Matrix [70.44024861252554]
We propose an optimization algorithm for Variational Inference (VI) in complex models.
We develop an efficient algorithm for Gaussian Variational Inference whose updates satisfy the positive definite constraint on the variational covariance matrix.
Due to its black-box nature, MGVBP stands as a ready-to-use solution for VI in complex models.
arXiv Detail & Related papers (2022-10-26T10:12:31Z)
- Equivariant Disentangled Transformation for Domain Generalization under Combination Shift [91.38796390449504]
Combinations of domains and labels are not observed during training but appear in the test environment.
We provide a unique formulation of the combination shift problem based on the concepts of homomorphism, equivariance, and a refined definition of disentanglement.
arXiv Detail & Related papers (2022-08-03T12:31:31Z)
- Efficient QUBO transformation for Higher Degree Pseudo Boolean Functions [0.46040036610482665]
It is useful to have a method for transforming higher degree pseudo-Boolean problems to QUBO format.
This paper improves on the existing cubic-to-quadratic transformation approach by minimizing both the number of additional variables and the penalty coefficients.
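For context, the classic cubic-to-quadratic reduction that this line of work builds on (often attributed to Rosenberg) replaces a product x1*x2 with a fresh binary variable y, enforced by a quadratic penalty that vanishes exactly when y = x1*x2. The sketch below is a generic illustration of that baseline, not the paper's improved construction; the names and the penalty weight M are assumptions for the example.

```python
from itertools import product

# Penalty x1*x2 - 2*(x1 + x2)*y + 3*y is 0 when y == x1*x2
# and >= 1 otherwise, for binary x1, x2, y.
def rosenberg_penalty(x1, x2, y):
    return x1 * x2 - 2 * (x1 + x2) * y + 3 * y

# The cubic monomial x1*x2*x3 becomes the quadratic y*x3 + M * penalty.
def reduced_cubic(x1, x2, x3, y, M=2):
    return y * x3 + M * rosenberg_penalty(x1, x2, y)

# Check: minimizing over y recovers the cubic term for every binary input.
for x1, x2, x3 in product((0, 1), repeat=3):
    assert min(reduced_cubic(x1, x2, x3, y) for y in (0, 1)) == x1 * x2 * x3
```

Each reduction adds one auxiliary variable and one penalty term, which is why minimizing both quantities, as the paper sets out to do, matters for higher-degree pseudo-Boolean functions.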
arXiv Detail & Related papers (2021-07-24T22:13:42Z)
- Boosting Data Reduction for the Maximum Weight Independent Set Problem Using Increasing Transformations [59.84561168501493]
We introduce new generalized data reduction and transformation rules for the maximum weight independent set problem.
Surprisingly, these so-called increasing transformations can simplify the problem and also open up the reduction space to yield even smaller irreducible graphs later in the algorithm.
Our algorithm computes significantly smaller irreducible graphs on all except one instance, solves more instances to optimality than previously possible, is up to two orders of magnitude faster than the best state-of-the-art solver, and finds higher-quality solutions than solvers DynWVC and HILS.
arXiv Detail & Related papers (2020-08-12T08:52:50Z)
- Modeling Linear Inequality Constraints in Quadratic Binary Optimization for Variational Quantum Eigensolver [0.0]
This paper introduces tailored variational forms for the variational quantum eigensolver.
Four types of constraints that commonly appear in optimization problems are modeled.
The main advantage of the proposed methodology is that the number of parameters in the variational form remains constant.
arXiv Detail & Related papers (2020-07-26T23:36:22Z)
- Eigendecomposition-Free Training of Deep Networks for Linear Least-Square Problems [107.3868459697569]
We introduce an eigendecomposition-free approach to training a deep network.
We show that our approach is much more robust than explicit differentiation of the eigendecomposition.
Our method has better convergence properties and yields state-of-the-art results.
arXiv Detail & Related papers (2020-04-15T04:29:34Z)
- Multi-Objective Matrix Normalization for Fine-grained Visual Recognition [153.49014114484424]
Bilinear pooling achieves great success in fine-grained visual recognition (FGVC).
Recent methods have shown that the matrix power normalization can stabilize the second-order information in bilinear features.
We propose an efficient Multi-Objective Matrix Normalization (MOMN) method that can simultaneously normalize a bilinear representation.
arXiv Detail & Related papers (2020-03-30T08:40:35Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.