A degree reduction method for an efficient QUBO formulation for the
graph coloring problem
- URL: http://arxiv.org/abs/2306.12081v2
- Date: Mon, 27 Nov 2023 07:48:47 GMT
- Title: A degree reduction method for an efficient QUBO formulation for the
graph coloring problem
- Authors: Namho Hong, Hyunwoo Jung, Hyosang Kang, Hyunjin Lim, Chaehwan Seol,
and Seokhyun Um
- Abstract summary: We introduce a new degree reduction method for homogeneous symmetric polynomials on binary variables that generalizes the conventional degree reduction methods introduced by Freedman and Ishikawa.
We also design a degree reduction algorithm for general polynomials on binary variables, simulate it on the graph coloring problem for random graphs, and compare the results with the conventional methods.
- Score: 0.32985979395737774
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We introduce a new degree reduction method for homogeneous symmetric
polynomials on binary variables that generalizes the conventional degree
reduction methods on monomials introduced by Freedman and Ishikawa. We also
design a degree reduction algorithm for general polynomials on binary
variables, simulated on the graph coloring problem for random graphs, and
compared the results with the conventional methods. The simulation results show
that our new method produces reduced quadratic polynomials that contain fewer
variables than the reduced quadratic polynomials produced by the conventional
methods.
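For context, one of the conventional methods the abstract refers to is Freedman's substitution, which replaces a negative degree-d monomial with a quadratic expression in one auxiliary binary variable: -x_1...x_d = min_w w((d-1) - x_1 - ... - x_d). A minimal sketch verifying this identity for d = 3 by brute force (this illustrates the conventional method being generalized, not the paper's new method):

```python
from itertools import product

# Freedman's substitution for a negative cubic monomial:
#   -x1*x2*x3 == min over w in {0,1} of w*(2 - x1 - x2 - x3)
def original(x1, x2, x3):
    return -x1 * x2 * x3

def quadratic(x1, x2, x3, w):
    # degree <= 2 in the binary variables x1, x2, x3, w
    return w * (2 - x1 - x2 - x3)

for x in product((0, 1), repeat=3):
    assert original(*x) == min(quadratic(*x, w) for w in (0, 1))
```

In a QUBO formulation, each such reduction trades a higher-degree penalty term for an extra binary variable, which is why a reduction method that introduces fewer auxiliary variables (as the paper reports for its approach) yields smaller quadratic models.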
Related papers
- Stochastic Gradient Descent for Gaussian Processes Done Right [86.83678041846971]
We show that when done right -- by which we mean using specific insights from optimisation and kernel communities -- gradient descent is highly effective.
We introduce a stochastic dual descent algorithm, explain its design in an intuitive manner and illustrate the design choices.
Our method places Gaussian process regression on par with state-of-the-art graph neural networks for molecular binding affinity prediction.
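The idea summarized above can be illustrated with a toy stand-in: kernel ridge regression solved by stochastic coordinate updates on the dual objective. This is a hedged sketch of the general approach, not the paper's algorithm; the data, kernel, lengthscale, and regularization value are all invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression data (invented for illustration).
X = rng.uniform(-3, 3, size=(40, 1))
y = np.sin(X[:, 0])

def rbf(A, B, lengthscale=1.0):
    # squared-exponential kernel matrix
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * lengthscale ** 2))

K = rbf(X, X)
lam = 0.1  # ridge regularization

# Stochastic coordinate updates on the dual objective
#   f(a) = 0.5 * a @ (K + lam*I) @ a - a @ y,
# a simplified stand-in for stochastic dual descent.
a = np.zeros(len(y))
for _ in range(10_000):
    i = rng.integers(len(y))
    grad_i = K[i] @ a + lam * a[i] - y[i]
    a[i] -= grad_i / (K[i, i] + lam)   # exact minimization in coordinate i

# Residual of the dual optimality condition (K + lam*I) a = y.
residual = np.linalg.norm((K + lam * np.eye(len(y))) @ a - y)
```

Working in the dual keeps every iterate a valid kernel-expansion weight vector, and each update touches only one row of the kernel matrix.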
arXiv Detail & Related papers (2023-10-31T16:15:13Z) - Polynomial Preconditioning for Gradient Methods [9.13755431537592]
We propose a new family of preconditioners generated by symmetric polynomials.
They provide first-order optimization methods with a provable improvement of the condition number.
We show how to incorporate polynomial preconditioning into the Gradient and Fast Gradient Methods and establish the corresponding global complexity bounds.
arXiv Detail & Related papers (2023-01-30T18:55:38Z) - A Direct Reduction from the Polynomial to the Adversary Method [92.54311953850168]
We give a simple and direct reduction from the polynomial method (in the form of a dual polynomial) to the adversary method.
This shows that any lower bound in the form of a dual polynomial is an adversary lower bound of a specific form.
arXiv Detail & Related papers (2023-01-24T21:37:20Z) - Sparse resultant based minimal solvers in computer vision and their
connection with the action matrix [17.31412310131552]
We show that for some camera geometry problems our sparse-resultant-based method leads to smaller and more stable solvers than the state-of-the-art Gröbner basis-based solvers.
It provides a competitive alternative to popular Gröbner basis-based methods for minimal problems in computer vision.
arXiv Detail & Related papers (2023-01-16T14:25:19Z) - Vector-Valued Least-Squares Regression under Output Regularity
Assumptions [73.99064151691597]
We propose and analyse a reduced-rank method for solving least-squares regression problems with infinite-dimensional output.
We derive learning bounds for our method, and study under which settings statistical performance is improved in comparison to the full-rank method.
arXiv Detail & Related papers (2022-11-16T15:07:00Z) - Graph Polynomial Convolution Models for Node Classification of
Non-Homophilous Graphs [52.52570805621925]
We investigate efficient learning from higher-order graph convolutions and learning directly from the adjacency matrix for node classification.
We show that the resulting model leads to new graph representations and a residual scaling parameter.
We demonstrate that the proposed methods obtain improved accuracy for node classification of non-homophilous graphs.
arXiv Detail & Related papers (2022-09-12T04:46:55Z) - Barzilai and Borwein conjugate gradient method equipped with a
non-monotone line search technique and its application on non-negative matrix
factorization [1.6058099298620423]
We first modify the non-monotone line search method by introducing a new trigonometric function to calculate the non-monotone parameter.
Under some suitable assumptions, we prove that the new algorithm has the global convergence property.
The efficiency and effectiveness of the proposed method are determined in practice by applying the algorithm to some standard test problems and non-negative matrix factorization problems.
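As background for the summary above, the Barzilai-Borwein step size reuses the last displacement and gradient difference to mimic a quasi-Newton scale. A minimal sketch on an invented strongly convex quadratic, using plain BB1 steps without the paper's non-monotone line search:

```python
import numpy as np

# Toy strongly convex quadratic f(x) = 0.5*x@A@x - b@x
# (problem data invented for illustration; minimizer solves A x = b).
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])

x = np.zeros(2)
g = A @ x - b
alpha = 0.1                        # initial step size
for _ in range(50):
    x_new = x - alpha * g
    g_new = A @ x_new - b
    s, yk = x_new - x, g_new - g   # displacement and gradient difference
    if abs(s @ yk) > 1e-15:
        alpha = (s @ s) / (s @ yk)  # BB1 step size
    x, g = x_new, g_new
```

BB steps are inherently non-monotone in the objective, which is why the paper pairs them with a non-monotone line search to obtain global convergence guarantees.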
arXiv Detail & Related papers (2021-09-13T03:35:36Z) - A block-sparse Tensor Train Format for sample-efficient high-dimensional
Polynomial Regression [0.0]
Low-rank tensors are an established framework for high-dimensional problems.
We propose to extend this framework by including the concept of block-sparsity.
This allows us to adapt the ansatz space to align better with known sample results.
arXiv Detail & Related papers (2021-04-29T10:57:53Z) - Conditional gradient methods for stochastically constrained convex
minimization [54.53786593679331]
We propose two novel conditional gradient-based methods for solving structured convex optimization problems.
The most important feature of our framework is that only a subset of the constraints is processed at each iteration.
Our algorithms rely on variance reduction and smoothing used in conjunction with conditional gradient steps, and are accompanied by rigorous convergence guarantees.
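The conditional gradient step the summary refers to can be shown on a toy problem: classic Frank-Wolfe minimizing a quadratic over the probability simplex, where the linear minimization oracle just picks the best vertex. This sketch omits the paper's constraint subsampling, variance reduction, and smoothing; the data is invented:

```python
import numpy as np

# Frank-Wolfe (conditional gradient) for
#   f(x) = 0.5 * ||x - b||^2  over the probability simplex.
b = np.array([0.1, 0.6, 0.3, -0.2])
x = np.full(4, 0.25)               # feasible start: uniform distribution

for t in range(200):
    grad = x - b
    i = int(np.argmin(grad))       # linear minimization oracle: best vertex e_i
    s = np.zeros(4)
    s[i] = 1.0
    gamma = 2.0 / (t + 2)          # standard open-loop step size
    x = x + gamma * (s - x)        # convex combination: stays in the simplex
```

Each iterate is a convex combination of simplex vertices, so feasibility is maintained without any projection step.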
arXiv Detail & Related papers (2020-07-07T21:26:35Z) - Cogradient Descent for Bilinear Optimization [124.45816011848096]
We introduce a Cogradient Descent algorithm (CoGD) to address the bilinear problem.
We solve one variable by considering its coupling relationship with the other, leading to a synchronous gradient descent.
Our algorithm is applied to solve problems with one variable under the sparsity constraint.
arXiv Detail & Related papers (2020-06-16T13:41:54Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences of its use.