(1+1)-CMA-ES with Margin for Discrete and Mixed-Integer Problems
- URL: http://arxiv.org/abs/2305.00849v1
- Date: Mon, 1 May 2023 14:34:55 GMT
- Title: (1+1)-CMA-ES with Margin for Discrete and Mixed-Integer Problems
- Authors: Yohei Watanabe, Kento Uchida, Ryoki Hamano, Shota Saito, Masahiro
Nomura and Shinichi Shirakawa
- Abstract summary: This paper introduces the margin correction into (1+1)-CMA-ES, an elitist version of CMA-ES.
The (1+1)-CMA-ES is often advantageous for unimodal functions and can be computationally less expensive.
The numerical simulation using benchmark functions on mixed-integer, integer, and binary domains shows that (1+1)-CMA-ES with margin outperforms the CMA-ES with margin.
- Score: 6.181926091202142
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: The covariance matrix adaptation evolution strategy (CMA-ES) is an efficient
continuous black-box optimization method. The CMA-ES possesses many attractive
features, including invariance properties and a well-tuned default
hyperparameter setting. Moreover, several components to specialize the CMA-ES
have been proposed, such as noise handling and constraint handling. To utilize
these advantages in mixed-integer optimization problems, the CMA-ES with margin
has been proposed. The CMA-ES with margin prevents premature convergence of the
discrete variables through the margin correction, in which the distribution
parameters are modified so that the probability of generating a different value
for each discrete variable does not vanish. The margin correction has been applied to the
($\mu/\mu_\mathrm{w}$,$\lambda$)-CMA-ES, while this paper introduces the margin
correction into (1+1)-CMA-ES, an elitist version of CMA-ES. The (1+1)-CMA-ES is
often advantageous for unimodal functions and can be computationally less
expensive. To counteract the performance deterioration on mixed-integer
optimization, we use the discretized elitist solution as the mean of the
sampling distribution and modify the margin correction so that it does not move
the elitist solution. Numerical simulations using benchmark functions on
mixed-integer, integer, and binary domains show that the (1+1)-CMA-ES with
margin outperforms the CMA-ES with margin and is better than or comparable to
several methods specialized to a particular search domain.
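As a rough illustration of the margin-correction idea described in the abstract, the Python sketch below lower-bounds, for a single integer coordinate obtained by rounding a univariate Gaussian sample, the probability of generating a value different from the discretized elitist mean. The threshold alpha, the geometric step-size inflation, and all function names are assumptions made for illustration, not the paper's exact update rule; in line with the elitist variant, the mean is never moved and only the spread is corrected.

import math

def norm_cdf(x: float) -> float:
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def prob_change(m: float, sigma: float) -> float:
    """Probability that round(N(m, sigma^2)) differs from round(m)."""
    k = round(m)
    lo, hi = k - 0.5, k + 0.5  # rounding cell of the current integer value
    p_stay = norm_cdf((hi - m) / sigma) - norm_cdf((lo - m) / sigma)
    return 1.0 - p_stay

def margin_correction(m: float, sigma: float, alpha: float = 0.2) -> float:
    """Inflate sigma until the probability of changing the integer is >= alpha.

    The mean m (the discretized elitist solution) is left untouched, mirroring
    the requirement that the correction must not move the elitist solution.
    """
    while prob_change(m, sigma) < alpha:
        sigma *= 1.1  # simple geometric inflation; illustrative, not the paper's rule
    return sigma

if __name__ == "__main__":
    m, sigma = 3.02, 0.05               # distribution almost collapsed onto the integer 3
    print(prob_change(m, sigma))        # nearly zero: the integer can hardly change
    print(margin_correction(m, sigma))  # inflated sigma that restores the margin

In the full (1+1)-CMA-ES with margin, the correction acts on the multivariate distribution parameters rather than a single coordinate, but the underlying idea of keeping the change probability bounded away from zero is the same.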
Related papers
- CMA-ES for Discrete and Mixed-Variable Optimization on Sets of Points [9.130749109828717]
This paper focuses on optimization on sets of points and proposes an optimization method, the CMA-ES-SoP, by extending the covariance matrix adaptation evolution strategy (CMA-ES).
The CMA-ES-SoP incorporates a margin correction that maintains the generation probability of neighboring points to prevent premature convergence to a specific non-optimal point.
Numerical simulations demonstrated that the CMA-ES-SoP successfully solved optimization problems on sets of points, whereas the naive CMA-ES failed due to premature convergence.
arXiv Detail & Related papers (2024-08-23T13:10:06Z)
- CatCMA: Stochastic Optimization for Mixed-Category Problems [9.13074910982872]
Black-box optimization problems often require simultaneously optimizing different types of variables, such as continuous, integer, and categorical variables.
Several Bayesian optimization methods can deal with mixed-category black-box optimization (MC-BBO), but they suffer from a lack of scalability to high-dimensional problems and from high internal computational cost.
This paper proposes CatCMA, a new optimization method for MC-BBO problems.
arXiv Detail & Related papers (2024-05-16T10:11:18Z)
- ALEXR: An Optimal Single-Loop Algorithm for Convex Finite-Sum Coupled Compositional Stochastic Optimization [53.14532968909759]
We introduce an efficient single-loop primal-dual block-coordinate algorithm, dubbed ALEXR.
We establish the convergence rates of ALEXR in both convex and strongly convex cases under smoothness and non-smoothness conditions.
We present lower complexity bounds to demonstrate that the convergence rates of ALEXR are optimal among first-order block-coordinate algorithms for the considered class of cFCCO problems.
arXiv Detail & Related papers (2023-12-04T19:00:07Z)
- Natural Evolution Strategy for Mixed-Integer Black-Box Optimization [0.0]
CMA-ES w. Margin reportedly showed good performance on MI-BBO benchmark problems.
DX-NES-ICI was up to 3.7 times better than CMA-ES w. Margin in terms of the rate of finding the optimal solutions.
arXiv Detail & Related papers (2023-04-21T03:34:22Z)
- Marginal Probability-Based Integer Handling for CMA-ES Tackling Single- and Multi-Objective Mixed-Integer Black-Box Optimization [10.42820615166362]
We study the mixed-integer black-box optimization (MI-BBO) problem where continuous and integer variables should be optimized simultaneously.
We propose a simple integer handling for the CMA-ES based on lower-bounding the marginal probabilities associated with the generation of integer variables in the multivariate Gaussian distribution (MGD).
arXiv Detail & Related papers (2022-12-19T06:08:20Z)
- Optimization of Annealed Importance Sampling Hyperparameters [77.34726150561087]
Annealed Importance Sampling (AIS) is a popular algorithm used to estimate the intractable marginal likelihood of deep generative models.
We present a parametric AIS process with flexible intermediate distributions and optimize the bridging distributions to use fewer sampling steps.
We assess the performance of our optimized AIS for marginal likelihood estimation of deep generative models and compare it to other estimators.
arXiv Detail & Related papers (2022-09-27T07:58:25Z)
- CMA-ES with Margin: Lower-Bounding Marginal Probability for Mixed-Integer Black-Box Optimization [5.237999056930947]
This study targets the mixed-integer black-box optimization (MI-BBO) problem where continuous and integer variables should be optimized simultaneously.
We propose a simple modification of the CMA-ES based on lower-bounding the marginal probabilities associated with the generation of integer variables in the MGD.
arXiv Detail & Related papers (2022-05-26T16:47:56Z)
- Faster Algorithm and Sharper Analysis for Constrained Markov Decision Process [56.55075925645864]
The problem of the constrained Markov decision process (CMDP) is investigated, where an agent aims to maximize the expected accumulated discounted reward subject to multiple constraints.
A new utilities-dual convex approach is proposed with a novel integration of three ingredients: a regularized policy, a dual regularizer, and Nesterov's gradient descent on the dual.
This is the first demonstration that nonconcave CMDP problems can attain the $\mathcal{O}(1/\epsilon)$ complexity lower bound for optimization subject to convex constraints.
arXiv Detail & Related papers (2021-10-20T02:57:21Z)
- Sparse Quadratic Optimisation over the Stiefel Manifold with Application to Permutation Synchronisation [71.27989298860481]
We address the non-convex optimisation problem of finding a matrix on the Stiefel manifold that maximises a quadratic objective function.
We propose a simple yet effective sparsity-promoting algorithm for finding the dominant eigenspace matrix.
arXiv Detail & Related papers (2021-09-30T19:17:35Z)
- Stochastic Gradient Descent-Ascent and Consensus Optimization for Smooth Games: Convergence Analysis under Expected Co-coercivity [49.66890309455787]
We introduce the expected co-coercivity condition, explain its benefits, and provide the first last-iterate convergence guarantees of SGDA and SCO.
We prove linear convergence of both methods to a neighborhood of the solution when they use constant step-size.
Our convergence guarantees hold under the arbitrary sampling paradigm, and we give insights into the complexity of minibatching.
arXiv Detail & Related papers (2021-06-30T18:32:46Z)
- Distributionally Robust Bayesian Optimization [121.71766171427433]
We present a novel distributionally robust Bayesian optimization algorithm (DRBO) for zeroth-order, noisy optimization.
Our algorithm provably obtains sub-linear robust regret in various settings.
We demonstrate the robust performance of our method on both synthetic and real-world benchmarks.
arXiv Detail & Related papers (2020-02-20T22:04:30Z)