CMA-ES with Margin: Lower-Bounding Marginal Probability for
Mixed-Integer Black-Box Optimization
- URL: http://arxiv.org/abs/2205.13482v1
- Date: Thu, 26 May 2022 16:47:56 GMT
- Authors: Ryoki Hamano, Shota Saito, Masahiro Nomura, Shinichi Shirakawa
- Abstract summary: This study targets the mixed-integer black-box optimization (MI-BBO) problem where continuous and integer variables should be optimized simultaneously.
We propose a simple modification of the CMA-ES based on lower-bounding the marginal probabilities associated with the generation of integer variables in the MGD.
- Score: 5.237999056930947
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: This study targets the mixed-integer black-box optimization (MI-BBO) problem
where continuous and integer variables should be optimized simultaneously. The
CMA-ES, our focus in this study, is a population-based stochastic search method
that samples solution candidates from a multivariate Gaussian distribution
(MGD), which shows excellent performance in continuous BBO. In the CMA-ES, the
parameters of the MGD, the mean and (co)variance, are updated based on the
evaluation values of the candidate solutions. If the CMA-ES is applied to
MI-BBO with straightforward discretization, however, the variance corresponding
to the integer variables becomes much smaller than the granularity of the
discretization before the optimal solution is reached, which leads to
stagnation of the optimization. In particular, when binary variables are
included in the problem, this stagnation is more likely to occur because the
granularity of the discretization is wider, and the existing modification to
the CMA-ES does not address it. To overcome these limitations,
we propose a simple modification of the CMA-ES based on lower-bounding the
marginal probabilities associated with the generation of integer variables in
the MGD. The numerical experiments on the MI-BBO benchmark problems demonstrate
the efficiency and robustness of the proposed method.
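As a rough illustration of the lower-bounding idea, the sketch below inflates a per-coordinate standard deviation until the marginal probability of sampling an integer other than the one nearest the mean reaches a threshold `alpha`. This is a simplified, hypothetical variant for intuition only; the paper's actual correction modifies the mean and covariance of the MGD, and the function and parameter names here are not from the paper.

```python
import math

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def margin_correct_sigma(m, sigma, alpha=0.05):
    """Inflate sigma so that the marginal probability of sampling an
    integer other than round(m) is at least alpha. Simplified sketch of
    the lower-bounding idea, not the paper's exact update rule."""
    lo, hi = round(m) - 0.5, round(m) + 0.5  # cell that rounds to round(m)

    def p_other(s):
        # probability mass falling outside the cell around round(m)
        return norm_cdf((lo - m) / s) + (1.0 - norm_cdf((hi - m) / s))

    if p_other(sigma) >= alpha:
        return sigma  # margin already satisfied, no correction needed
    # p_other(s) grows toward 1 as s grows, so bracket then bisect
    s_lo, s_hi = sigma, sigma
    while p_other(s_hi) < alpha:
        s_lo, s_hi = s_hi, s_hi * 2.0
    for _ in range(60):
        s_mid = 0.5 * (s_lo + s_hi)
        if p_other(s_mid) >= alpha:
            s_hi = s_mid
        else:
            s_lo = s_mid
    return s_hi
```

For example, with mean 0 and sigma = 0.01, nearly every sample rounds to 0 and the search stalls; requiring alpha = 0.2 inflates sigma to roughly 0.39, so about one sample in five leaves the current cell and the integer coordinate keeps exploring.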
Related papers
- A Fresh Look at Generalized Category Discovery through Non-negative Matrix Factorization [83.12938977698988]
Generalized Category Discovery (GCD) aims to classify both base and novel images using labeled base data.
Current approaches inadequately address the intrinsic optimization of the co-occurrence matrix $\bar{A}$ based on cosine similarity.
We propose a Non-Negative Generalized Category Discovery (NN-GCD) framework to address these deficiencies.
arXiv Detail & Related papers (2024-10-29T07:24:11Z) - CMA-ES for Discrete and Mixed-Variable Optimization on Sets of Points [9.130749109828717]
This paper focuses on optimization on sets of points and proposes an optimization method that extends the covariance matrix adaptation evolution strategy (CMA-ES).
The CMA-ES-SoP incorporates margin correction that maintains the generation probability of neighboring points to prevent premature convergence to a specific non-optimal point.
Numerical simulations demonstrated that the CMA-ES-SoP successfully optimized the optimization problems on sets of points, whereas the naive CMA-ES failed to optimize them due to premature convergence.
arXiv Detail & Related papers (2024-08-23T13:10:06Z) - CatCMA : Stochastic Optimization for Mixed-Category Problems [9.13074910982872]
Black-box optimization problems often require simultaneously optimizing different types of variables, such as continuous, integer, and categorical variables.
Several Bayesian optimization methods can deal with mixed-category black-box optimization (MC-BBO), but they suffer from a lack of scalability to high-dimensional problems and high internal computational costs.
This paper proposes CatCMA, a new optimization method for MC-BBO problems.
arXiv Detail & Related papers (2024-05-16T10:11:18Z) - Moreau Envelope ADMM for Decentralized Weakly Convex Optimization [55.2289666758254]
This paper proposes a proximal variant of the alternating direction method of multipliers (ADMM) for distributed optimization.
The results of our numerical experiments indicate that our method is faster and more robust than widely-used approaches.
arXiv Detail & Related papers (2023-08-31T14:16:30Z) - (1+1)-CMA-ES with Margin for Discrete and Mixed-Integer Problems [6.181926091202142]
This paper introduces the margin correction into (1+1)-CMA-ES, an elitist version of CMA-ES.
The (1+1)-CMA-ES is often advantageous for unimodal functions and can be computationally less expensive.
The numerical simulation using benchmark functions on mixed-integer, integer, and binary domains shows that (1+1)-CMA-ES with margin outperforms the CMA-ES with margin.
arXiv Detail & Related papers (2023-05-01T14:34:55Z) - Marginal Probability-Based Integer Handling for CMA-ES Tackling
Single- and Multi-Objective Mixed-Integer Black-Box Optimization [10.42820615166362]
We study the mixed-integer black-box optimization (MI-BBO) problem where continuous and integer variables should be optimized simultaneously.
We propose a simple integer handling for the CMA-ES based on lower-bounding the marginal probabilities associated with the generation of integer variables in the MGD.
arXiv Detail & Related papers (2022-12-19T06:08:20Z) - Optimization of Annealed Importance Sampling Hyperparameters [77.34726150561087]
Annealed Importance Sampling (AIS) is a popular algorithm used to estimate the intractable marginal likelihood of deep generative models.
We present a parametric AIS process with flexible intermediate distributions and optimize the bridging distributions to require fewer sampling steps.
We assess the performance of our optimized AIS for marginal likelihood estimation of deep generative models and compare it to other estimators.
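The AIS estimator summarized above can be sketched on a 1-D toy problem: anneal from a standard-normal prior to an unnormalized Gaussian target through a geometric schedule, applying a few Metropolis steps per temperature and accumulating incremental importance weights. The target, schedule, and transition kernel here are illustrative choices, not those of the paper.

```python
import math, random

def ais_log_z(mu=3.0, n_chains=2000, n_temps=50, mh_steps=3, seed=0):
    """Minimal AIS estimate of log Z for the unnormalized target
    f1(x) = exp(-(x - mu)^2 / 2), whose true Z is sqrt(2*pi),
    starting from a standard-normal prior. Illustrative sketch only."""
    rng = random.Random(seed)

    def log_f0(x):  # log density of the N(0, 1) prior (normalized)
        return -0.5 * x * x - 0.5 * math.log(2 * math.pi)

    def log_f1(x):  # log of the unnormalized target
        return -0.5 * (x - mu) ** 2

    betas = [t / n_temps for t in range(n_temps + 1)]
    log_ws = []
    for _ in range(n_chains):
        x = rng.gauss(0.0, 1.0)  # exact sample from the prior
        log_w = 0.0
        for b_prev, b in zip(betas, betas[1:]):
            # incremental importance weight between adjacent temperatures
            log_w += (b - b_prev) * (log_f1(x) - log_f0(x))

            def log_ft(y):  # tempered log density at temperature b
                return (1.0 - b) * log_f0(y) + b * log_f1(y)

            # a few random-walk Metropolis steps targeting log_ft
            for _ in range(mh_steps):
                y = x + rng.gauss(0.0, 1.0)
                if math.log(rng.random()) < log_ft(y) - log_ft(x):
                    x = y
        log_ws.append(log_w)
    # log-mean-exp of the per-chain weights
    m = max(log_ws)
    return m + math.log(sum(math.exp(w - m) for w in log_ws) / n_chains)
```

With these settings the estimate lands close to the true value log sqrt(2*pi) ≈ 0.919; shrinking `n_temps` degrades it, which is exactly the trade-off the optimized bridging distributions aim to soften.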
arXiv Detail & Related papers (2022-09-27T07:58:25Z) - MMES: Mixture Model based Evolution Strategy for Large-Scale
Optimization [36.37871629761407]
This work provides an efficient sampling method for the covariance matrix adaptation evolution strategy (CMA-ES) in large-scale settings.
We analyze the probability distribution of this mixture model and show that it approximates the Gaussian distribution of CMA-ES with a controllable accuracy.
We use this sampling method, coupled with a novel method for mutation strength adaptation, to formulate the mixture model based evolution strategy (MMES).
arXiv Detail & Related papers (2022-03-15T14:33:37Z) - Differentiable Annealed Importance Sampling and the Perils of Gradient
Noise [68.44523807580438]
Annealed importance sampling (AIS) and related algorithms are highly effective tools for marginal likelihood estimation.
Differentiability is a desirable property as it would admit the possibility of optimizing marginal likelihood as an objective.
We propose a differentiable algorithm by abandoning Metropolis-Hastings steps, which further unlocks mini-batch computation.
arXiv Detail & Related papers (2021-07-21T17:10:14Z) - Stochastic Gradient Descent-Ascent and Consensus Optimization for Smooth
Games: Convergence Analysis under Expected Co-coercivity [49.66890309455787]
We introduce the expected co-coercivity condition, explain its benefits, and provide the first last-iterate convergence guarantees of SGDA and SCO.
We prove linear convergence of both methods to a neighborhood of the solution when they use constant step-size.
Our convergence guarantees hold under the arbitrary sampling paradigm, and we give insights into the complexity of minibatching.
arXiv Detail & Related papers (2021-06-30T18:32:46Z) - Distributionally Robust Bayesian Optimization [121.71766171427433]
We present a novel distributionally robust Bayesian optimization algorithm (DRBO) for zeroth-order, noisy optimization.
Our algorithm provably obtains sub-linear robust regret in various settings.
We demonstrate the robust performance of our method on both synthetic and real-world benchmarks.
arXiv Detail & Related papers (2020-02-20T22:04:30Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.