Run Time Analysis for Random Local Search on Generalized Majority
Functions
- URL: http://arxiv.org/abs/2204.12770v3
- Date: Mon, 26 Sep 2022 07:43:26 GMT
- Title: Run Time Analysis for Random Local Search on Generalized Majority
Functions
- Authors: Carola Doerr and Martin S. Krejca
- Abstract summary: We study how one of its properties -- neutrality -- influences the run time of random local search.
We provide a theorem, applicable to many optimization algorithms, that links the run time of MAJORITY with its symmetric version HASMAJORITY.
- Score: 1.0965065178451106
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Run time analysis of evolutionary algorithms has recently made significant
progress in linking algorithm performance to algorithm parameters. However,
settings that study the impact of problem parameters are rare. The recently
proposed W-model provides a good framework for such analyses, generating
pseudo-Boolean optimization problems with tunable properties.
We initiate theoretical research of the W-model by studying how one of its
properties -- neutrality -- influences the run time of random local search.
Neutrality creates plateaus in the search space by first performing a majority
vote for subsets of the solution candidate and then evaluating the
smaller-dimensional string via a low-level fitness function.
We prove upper bounds for the expected run time of random local search on
this MAJORITY problem for its entire parameter spectrum. To this end, we
provide a theorem, applicable to many optimization algorithms, that links the
run time of MAJORITY with its symmetric version HASMAJORITY, where a sufficient
majority is needed to optimize the subset. We also introduce a generalized
version of classic drift theorems as well as a generalized version of Wald's
equation, both of which we believe to be of independent interest.
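To make the neutrality layer described in the abstract concrete, here is a minimal sketch that composes a block-wise majority vote with OneMax as the low-level fitness function. The equal block size mu, the tie-breaking rule (ties count as one), and the choice of OneMax are assumptions made for illustration, not the paper's exact W-model definitions.

```python
# Minimal sketch of a W-model-style neutrality layer (illustrative only):
# a majority vote over consecutive blocks of size mu reduces an n-bit
# candidate to an (n // mu)-bit string, which is then scored by a
# low-level fitness function (OneMax here, as an assumption).

def majority_reduce(bits: list[int], mu: int) -> list[int]:
    """Map each block of mu consecutive bits to its majority value (ties -> 1)."""
    assert len(bits) % mu == 0, "for simplicity, mu must divide n"
    return [1 if 2 * sum(bits[i:i + mu]) >= mu else 0
            for i in range(0, len(bits), mu)]

def onemax(bits: list[int]) -> int:
    """Low-level fitness: the number of ones."""
    return sum(bits)

def majority_fitness(bits: list[int], mu: int) -> int:
    """Neutrality-transformed fitness: OneMax of the reduced string."""
    return onemax(majority_reduce(bits, mu))

# Example: n = 12, mu = 3, so the reduced string has length 4.
x = [1, 1, 0,  0, 0, 0,  1, 0, 1,  1, 1, 1]
print(majority_reduce(x, 3))   # [1, 0, 1, 1]
print(majority_fitness(x, 3))  # 3
```

Flipping a bit inside a block whose majority does not change leaves the fitness untouched, which is exactly the plateau behaviour the abstract attributes to neutrality.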
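For concreteness, the following sketch shows random local search on such a majority-composed fitness, assuming the standard loop that flips exactly one uniformly chosen bit per iteration and accepts the offspring if its fitness is not worse; the stopping criterion and the concrete fitness are illustrative assumptions rather than the paper's definitions.

```python
import random

# Sketch of random local search (RLS) on a majority-composed fitness:
# flip one uniformly chosen bit per iteration and accept the offspring
# if its fitness does not decrease (so plateau moves are accepted).

def majority_onemax(bits: list[int], mu: int) -> int:
    """Block-wise majority vote (ties count as 1) followed by OneMax."""
    blocks = (bits[i:i + mu] for i in range(0, len(bits), mu))
    return sum(1 for b in blocks if 2 * sum(b) >= mu)

def rls(n: int, mu: int, rng: random.Random, max_iters: int = 10**6) -> int:
    """Return the iteration at which the all-ones reduced string is first reached."""
    x = [rng.randint(0, 1) for _ in range(n)]
    fx = majority_onemax(x, mu)
    optimum = n // mu  # every reduced bit equals 1
    for t in range(1, max_iters + 1):
        i = rng.randrange(n)   # flip exactly one uniformly chosen bit
        x[i] ^= 1
        fy = majority_onemax(x, mu)
        if fy >= fx:
            fx = fy            # accept: strict improvement or plateau move
        else:
            x[i] ^= 1          # revert the flip
        if fx == optimum:
            return t
    return max_iters

rng = random.Random(42)
print(rls(n=60, mu=3, rng=rng))
```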
Related papers
- Generalized Short Path Algorithms: Towards Super-Quadratic Speedup over Markov Chain Search for Combinatorial Optimization [3.3508820106803796]
We analyze generalizations of algorithms based on the short-path framework first proposed by Hastings.
Under some commonly satisfied technical conditions, an appropriate generalization can achieve super-quadratic speedups.
We conclude the paper with a numerical analysis that guides the choice of parameters for short path algorithms.
arXiv Detail & Related papers (2024-10-30T17:52:56Z)
- Runtime Analysis of a Multi-Valued Compact Genetic Algorithm on Generalized OneMax [2.07180164747172]
We provide a first runtime analysis of a generalized OneMax function.
We show that the r-cGA solves this r-valued OneMax problem efficiently.
Based on experiments, we also state a conjecture about the expected runtime of another variant of the multi-valued OneMax function.
arXiv Detail & Related papers (2024-04-17T10:40:12Z)
- DeciLS-PBO: an Effective Local Search Method for Pseudo-Boolean Optimization [10.513103815142731]
We find two ways to improve local search algorithms for solving Pseudo-Boolean Optimization (PBO).
Our algorithm, DeciLS-PBO, shows promising performance compared to state-of-the-art algorithms.
arXiv Detail & Related papers (2023-01-28T17:03:56Z)
- Efficient Non-Parametric Optimizer Search for Diverse Tasks [93.64739408827604]
We present the first efficient, scalable, and general framework that can directly search on the tasks of interest.
Inspired by the innate tree structure of the underlying math expressions, we re-arrange the spaces into a super-tree.
We adopt an adaptation of the Monte Carlo method to tree search, equipped with rejection sampling and equivalent-form detection.
arXiv Detail & Related papers (2022-09-27T17:51:31Z)
- Exploring the Algorithm-Dependent Generalization of AUPRC Optimization with List Stability [107.65337427333064]
Optimization of the Area Under the Precision-Recall Curve (AUPRC) is a crucial problem in machine learning.
In this work, we present the first trial in the algorithm-dependent generalization of AUPRC optimization.
Experiments on three image retrieval datasets speak to the effectiveness and soundness of our framework.
arXiv Detail & Related papers (2022-09-27T09:06:37Z)
- Towards Optimally Efficient Tree Search with Deep Learning [76.64632985696237]
This paper investigates the classical integer least-squares problem, which estimates integer signals from linear models.
The problem is NP-hard and often arises in diverse applications such as signal processing, bioinformatics, communications and machine learning.
We propose a general hyper-accelerated tree search (HATS) algorithm by employing a deep neural network to estimate the optimal heuristic for the underlying simplified memory-bounded A* algorithm.
arXiv Detail & Related papers (2021-01-07T08:00:02Z)
- Recent Theoretical Advances in Non-Convex Optimization [56.88981258425256]
Motivated by the recent increased interest in the analysis of optimization algorithms for non-convex optimization in deep networks and other problems in data analysis, we give an overview of recent theoretical results on optimization algorithms for non-convex optimization.
arXiv Detail & Related papers (2020-12-11T08:28:51Z)
- Fast Perturbative Algorithm Configurators [0.0]
We prove a linear lower bound on the expected time to optimise any parameter tuning problem for ParamRLS and ParamILS.
We propose a harmonic mutation operator for perturbative algorithm configurators that tunes parameters in polylogarithmic time for unimodal and approximately unimodal parameter landscapes.
An experimental analysis confirms the superiority of the approach in practice for a number of configuration scenarios.
arXiv Detail & Related papers (2020-07-07T10:48:32Z)
- Convergence of adaptive algorithms for weakly convex constrained optimization [59.36386973876765]
We prove the $\tilde{\mathcal{O}}(t^{-1/4})$ rate of convergence for the norm of the gradient of the Moreau envelope.
Our analysis works with mini-batch size of $1$, constant first- and second-order moment parameters, and possibly unbounded optimization domains.
arXiv Detail & Related papers (2020-06-11T17:43:19Z)
- Hardness of Random Optimization Problems for Boolean Circuits, Low-Degree Polynomials, and Langevin Dynamics [78.46689176407936]
We show that families of algorithms fail to produce nearly optimal solutions with high probability.
For the case of Boolean circuits, our results improve the state-of-the-art bounds known in circuit complexity theory.
arXiv Detail & Related papers (2020-04-25T05:45:59Z)
- Parameterized Complexity Analysis of Randomized Search Heuristics [16.759797540118665]
This chapter compiles a number of results that apply the theory of parameterized complexity to the running-time analysis of randomized search heuristics.
We outline the main results and proof techniques for a collection of randomized search heuristics tasked with solving NP-hard optimization problems.
arXiv Detail & Related papers (2020-01-15T03:43:56Z)