Quantization-Based Optimization: Alternative Stochastic Approximation of
Global Optimization
- URL: http://arxiv.org/abs/2211.03972v1
- Date: Tue, 8 Nov 2022 03:01:45 GMT
- Title: Quantization-Based Optimization: Alternative Stochastic Approximation of
Global Optimization
- Authors: Jinwuk Seok and Chang Sik Cho
- Abstract summary: We propose a global optimization algorithm based on quantizing the energy level of an objective function in an NP-hard problem.
Numerical experiments show that the proposed algorithm outperforms conventional learning methods in solving NP-hard optimization problems.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this study, we propose a global optimization algorithm based on quantizing
the energy level of an objective function in an NP-hard problem. According to
the white noise hypothesis for a quantization error with a dense and uniform
distribution, we can regard the quantization error as i.i.d. white noise. From
stochastic analysis, the proposed algorithm converges weakly only under
conditions satisfying Lipschitz continuity, instead of local convergence
properties such as the Hessian constraint of the objective function. This shows
that the proposed algorithm ensures global optimization by Laplace's condition.
Numerical experiments show that the proposed algorithm outperforms conventional
learning methods in solving NP-hard optimization problems such as the traveling
salesman problem.
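A minimal sketch in Python of the general idea described in the abstract (quantize the energy level of the objective and let the quantization error act as exploration noise). The shrinking quantization schedule, the acceptance rule, and the Rastrigin test function are illustrative assumptions, not the authors' exact algorithm:

```python
# Illustrative sketch only: a toy "quantized energy" search inspired by the abstract.
import numpy as np

def rastrigin(x):
    """Standard multimodal test function with many local minima."""
    return 10 * x.size + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

def quantize(value, h):
    """Quantize an energy level with resolution h; the rounding error plays
    the role of the white noise discussed in the abstract (assumption)."""
    return h * np.floor(value / h)

def quantized_search(f, x0, steps=20000, h0=10.0, sigma=0.1, seed=0):
    rng = np.random.default_rng(seed)
    x, best = x0.copy(), x0.copy()
    for t in range(1, steps + 1):
        h = h0 / np.log(t + 1)  # shrinking quantization step, akin to a cooling schedule
        y = x + sigma * rng.standard_normal(x.shape)  # random candidate move
        # Accept when the quantized energy does not increase: coarse quantization
        # lets the search climb small barriers early on, while fine quantization
        # later makes the rule behave like plain descent.
        if quantize(f(y), h) <= quantize(f(x), h):
            x = y
        if f(x) < f(best):
            best = x.copy()
    return best

if __name__ == "__main__":
    x_star = quantized_search(rastrigin, x0=np.full(2, 3.0))
    print(x_star, rastrigin(x_star))
```

Coarse quantization makes nearby energy levels indistinguishable, so moves over small barriers are accepted much as with the noise term in annealing-type methods; as the resolution shrinks, the search reduces to ordinary descent.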
Related papers
- Analysis of the Non-variational Quantum Walk-based Optimisation Algorithm [0.0]
This paper introduces in detail a non-variational quantum algorithm designed to solve a wide range of optimisation problems.
The algorithm returns optimal and near-optimal solutions from repeated preparation and measurement of an amplified state.
arXiv Detail & Related papers (2024-07-29T13:54:28Z)
- Local to Global: A Distributed Quantum Approximate Optimization Algorithm for Pseudo-Boolean Optimization Problems [7.723735038335632]
The Quantum Approximate Optimization Algorithm (QAOA) is considered a promising candidate for demonstrating quantum supremacy.
However, limited qubit availability and restricted coherence time challenge QAOA in solving large-scale pseudo-Boolean problems.
We propose a distributed QAOA which can solve a general pseudo-Boolean problem by converting it to a simplified Ising model.
arXiv Detail & Related papers (2023-10-08T08:07:11Z)
- Ising formulation of integer optimization problems for utilizing quantum annealing in iterative improvement strategy [1.14219428942199]
We propose an Ising formulation of integer optimization problems to utilize quantum annealing in the iterative improvement strategy.
We analytically show that a first-order phase transition is successfully avoided for a fully connected ferromagnetic Potts model if the overlap between a ground state and a candidate solution exceeds a threshold.
arXiv Detail & Related papers (2022-11-08T02:12:49Z)
- Exploring the Algorithm-Dependent Generalization of AUPRC Optimization with List Stability [107.65337427333064]
Optimization of the Area Under the Precision-Recall Curve (AUPRC) is a crucial problem for machine learning.
In this work, we present the first trial in the algorithm-dependent generalization of AUPRC optimization.
Experiments on three image retrieval datasets speak to the effectiveness and soundness of our framework.
arXiv Detail & Related papers (2022-09-27T09:06:37Z)
- An Algebraically Converging Stochastic Gradient Descent Algorithm for Global Optimization [14.336473214524663]
A key component in the algorithm is the randomness based on the value of the objective function.
We prove the convergence of the algorithm with an algebraic rate, both in probability and in the parameter space.
We present several numerical examples to demonstrate the efficiency and robustness of the algorithm.
arXiv Detail & Related papers (2022-04-12T16:27:49Z)
- First-Order Algorithms for Nonlinear Generalized Nash Equilibrium Problems [88.58409977434269]
We consider the problem of computing an equilibrium in a class of nonlinear generalized Nash equilibrium problems (NGNEPs).
Our contribution is to provide two simple first-order algorithmic frameworks based on the quadratic penalty method and the augmented Lagrangian method.
We provide nonasymptotic theoretical guarantees for these algorithms.
arXiv Detail & Related papers (2022-04-07T00:11:05Z)
- Amortized Implicit Differentiation for Stochastic Bilevel Optimization [53.12363770169761]
We study a class of algorithms for solving bilevel optimization problems in both deterministic and stochastic settings.
We exploit a warm-start strategy to amortize the estimation of the exact gradient.
By using this framework, our analysis shows these algorithms to match the computational complexity of methods that have access to an unbiased estimate of the gradient.
arXiv Detail & Related papers (2021-11-29T15:10:09Z)
- High Probability Complexity Bounds for Non-Smooth Stochastic Optimization with Heavy-Tailed Noise [51.31435087414348]
It is essential to theoretically guarantee that algorithms provide small objective residual with high probability.
Existing methods for non-smooth convex optimization have complexity bounds with dependence on the confidence level.
We propose novel stepsize rules for two methods with gradient clipping.
arXiv Detail & Related papers (2021-06-10T17:54:21Z)
- Recent Theoretical Advances in Non-Convex Optimization [56.88981258425256]
Motivated by recent increased interest in the analysis of optimization algorithms for non-convex optimization in deep networks and other problems in data science, we give an overview of recent theoretical results on optimization algorithms for non-convex optimization.
arXiv Detail & Related papers (2020-12-11T08:28:51Z)
- Convergence of adaptive algorithms for weakly convex constrained optimization [59.36386973876765]
We prove the $\tilde{\mathcal{O}}(t^{-1/4})$ rate of convergence for the norm of the gradient of the Moreau envelope (see the note after this list).
Our analysis works with a mini-batch size of $1$, constant first and second order moment parameters, and possibly unbounded optimization domains.
arXiv Detail & Related papers (2020-06-11T17:43:19Z)
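For context on the last entry above: the Moreau envelope it refers to is a standard smoothing of a (weakly convex) function. The definition below is the textbook one; the symbols $f$, $\lambda$, and $x$ are generic and not notation taken from that paper:

$$
f_\lambda(x) \;=\; \min_{y}\Big\{\, f(y) + \tfrac{1}{2\lambda}\,\|y - x\|^2 \,\Big\},
\qquad \lambda > 0 \text{ sufficiently small,}
$$

where $\|\nabla f_\lambda(x)\|$ serves as a near-stationarity measure for $f$; it is the quantity whose $\tilde{\mathcal{O}}(t^{-1/4})$ decay rate is established in that paper.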
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.