Natural Evolution Strategy for Unconstrained and Implicitly Constrained
Problems with Ridge Structure
- URL: http://arxiv.org/abs/2108.09455v2
- Date: Sat, 16 Oct 2021 23:51:59 GMT
- Title: Natural Evolution Strategy for Unconstrained and Implicitly Constrained
Problems with Ridge Structure
- Authors: Masahiro Nomura, Isao Ono
- Abstract summary: We propose a new natural evolution strategy for unconstrained black-box function optimization (BBFO) problems and implicitly constrained BBFO problems.
FM-NES shows better performance than DX-NES-IC on problems with ridge structure and almost the same performance as DX-NES-IC on the others.
- Score: 2.512827436728378
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper, we propose a new natural evolution strategy for unconstrained
black-box function optimization (BBFO) problems and implicitly constrained BBFO
problems. BBFO problems are known to be difficult because explicit
representations of objective functions are not available. Implicit constraints
make the problems more difficult because whether or not a solution is feasible
is revealed when the solution is evaluated with the objective function.
DX-NES-IC is one of the promising methods for implicitly constrained BBFO
problems. DX-NES-IC has shown better performance than conventional methods on
implicitly constrained benchmark problems. However, DX-NES-IC has a problem in
that the moving speed of the probability distribution is slow on ridge
structure. To address the problem, we propose the Fast Moving Natural Evolution
Strategy (FM-NES) that accelerates the movement of the probability distribution
on ridge structure by introducing the rank-one update into DX-NES-IC. The
rank-one update is utilized in CMA-ES. Since naively introducing the rank-one
update makes the search performance deteriorate on implicitly constrained
problems, we propose a condition for performing the rank-one update. We also
propose resetting the shape of the probability distribution when an infeasible
solution is sampled for the first time. In numerical experiments using
unconstrained and implicitly constrained benchmark problems, FM-NES showed
better performance than DX-NES-IC on problems with ridge structure and almost
the same performance as DX-NES-IC on the others. Furthermore, FM-NES
outperformed xNES, CMA-ES, xNES with the resampling technique, and CMA-ES with
the resampling technique.
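The rank-one update borrowed from CMA-ES accumulates successive shifts of the distribution mean in an evolution path and stretches the covariance along that path, which is what lets the distribution move quickly along a ridge. A minimal sketch of one such update step (the learning rates `c_c`, `c_1` and the weight `mu_eff` are illustrative placeholders, not the settings used in FM-NES):

```python
import numpy as np

def rank_one_update(C, p_c, mean_new, mean_old, sigma,
                    c_c=0.1, c_1=0.2, mu_eff=3.0):
    """One CMA-ES-style rank-one covariance update (illustrative constants)."""
    # Accumulate the normalized mean shift in the evolution path.
    p_c = (1 - c_c) * p_c + np.sqrt(c_c * (2 - c_c) * mu_eff) \
        * (mean_new - mean_old) / sigma
    # Stretch the covariance along the accumulated search direction.
    C = (1 - c_1) * C + c_1 * np.outer(p_c, p_c)
    return C, p_c
```

Because the outer product adds variance only along the path direction, repeated moves in a consistent direction elongate the search distribution along the ridge.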
Related papers
- Enhancing GNNs Performance on Combinatorial Optimization by Recurrent Feature Update [0.09986418756990156]
We introduce a novel algorithm, denoted hereafter as QRF-GNN, leveraging the power of GNNs to efficiently solve combinatorial optimization (CO) problems.
It relies on unsupervised learning by minimizing a loss function derived from QUBO relaxation.
Experimental results show that QRF-GNN drastically surpasses existing learning-based approaches and is comparable to state-of-the-art conventional solvers.
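A QUBO relaxation loss of the kind mentioned above is typically formed by replacing binary variables x ∈ {0,1}ⁿ with soft assignments p ∈ [0,1]ⁿ produced by the network and minimizing the differentiable objective pᵀQp; thresholding p recovers a candidate binary solution. A hedged sketch of that loss (the matrix `Q` and the threshold are illustrative, not taken from the paper):

```python
import numpy as np

def qubo_relaxation_loss(p, Q):
    """Differentiable QUBO objective p^T Q p on relaxed variables p in [0,1]^n."""
    return p @ Q @ p

def round_solution(p, threshold=0.5):
    """Project relaxed probabilities back to binary decision variables."""
    return (p >= threshold).astype(int)
```

In an unsupervised GNN setting, `p` would be the network's sigmoid outputs and the loss is backpropagated directly, with no labeled solutions required.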
arXiv Detail & Related papers (2024-07-23T13:34:35Z) - Non-convex Bayesian Learning via Stochastic Gradient Markov Chain Monte
Carlo [4.656426393230839]
The rise of artificial intelligence (AI) hinges on the efficient training of modern deep neural networks (DNNs) and on quantifying their uncertainty.
In this thesis, we propose a tool to handle the exploitation problem in Monte Carlo sampling.
We also propose two dynamic importance sampling algorithms for the underlying ordinary differential equation (ODE) system.
arXiv Detail & Related papers (2023-05-30T18:25:11Z) - Can Decentralized Stochastic Minimax Optimization Algorithms Converge
Linearly for Finite-Sum Nonconvex-Nonconcave Problems? [56.62372517641597]
Decentralized minimax optimization has been actively studied in the past few years due to its applications in a wide range of machine learning tasks.
This paper develops two novel decentralized stochastic minimax optimization algorithms for the nonconvex-nonconcave problem.
arXiv Detail & Related papers (2023-04-24T02:19:39Z) - Natural Evolution Strategy for Mixed-Integer Black-Box Optimization [0.0]
CMA-ES w. Margin reportedly showed good performance on MI-BBO benchmark problems.
DX-NES-ICI was up to 3.7 times better than CMA-ES w. Margin in terms of the rate of finding optimal solutions.
arXiv Detail & Related papers (2023-04-21T03:34:22Z) - Decentralized Riemannian Algorithm for Nonconvex Minimax Problems [82.50374560598493]
Minimax optimization algorithms have been developed to solve many machine learning problems.
In this paper, we propose two types of decentralized Riemannian minimax algorithms.
For this setting, we propose DRSGDA and establish its gradient complexity.
arXiv Detail & Related papers (2023-02-08T01:42:45Z) - Efficient Few-Shot Object Detection via Knowledge Inheritance [62.36414544915032]
Few-shot object detection (FSOD) aims at learning a generic detector that can adapt to unseen tasks with scarce training samples.
We present an efficient pretrain-transfer framework (PTF) baseline with no increase in computational cost.
We also propose an adaptive length re-scaling (ALR) strategy to alleviate the vector length inconsistency between the predicted novel weights and the pretrained base weights.
arXiv Detail & Related papers (2022-03-23T06:24:31Z) - Fast Moving Natural Evolution Strategy for High-Dimensional Problems [2.512827436728378]
We propose a new variant of natural evolution strategies (NES) for high-dimensional black-box optimization problems.
The proposed method, CR-FM-NES, extends a recently proposed state-of-the-art NES, the Fast Moving Natural Evolution Strategy (FM-NES), to be applicable to high-dimensional problems.
arXiv Detail & Related papers (2022-01-27T10:18:11Z) - Gaussian MRF Covariance Modeling for Efficient Black-Box Adversarial
Attacks [86.88061841975482]
We study the problem of generating adversarial examples in a black-box setting, where we only have access to a zeroth order oracle.
We use this setting to find fast one-step adversarial attacks, akin to a black-box version of the Fast Gradient Sign Method (FGSM).
We show that the method uses fewer queries and achieves higher attack success rates than the current state of the art.
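With only a zeroth-order oracle, a one-step FGSM-style attack needs a gradient estimate built from function queries alone. A minimal coordinate-wise finite-difference sketch (the paper's actual method instead exploits a Gaussian MRF covariance model for query efficiency; `f`, `eps`, and `delta` here are illustrative):

```python
import numpy as np

def zeroth_order_fgsm(f, x, eps, delta=1e-3):
    """One-step sign attack using central finite differences on a
    zeroth-order oracle f (illustrative coordinate-wise estimator)."""
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = delta
        # Central difference estimate of the i-th partial derivative.
        g[i] = (f(x + e) - f(x - e)) / (2 * delta)
    # FGSM-style step: move each coordinate by eps in the sign direction.
    return x + eps * np.sign(g)
```

This estimator costs 2n queries per step, which is exactly the budget pressure that motivates modeling gradient correlations instead of probing every coordinate.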
arXiv Detail & Related papers (2020-10-08T18:36:51Z) - Communication-Efficient Robust Federated Learning Over Heterogeneous
Datasets [147.11434031193164]
This work investigates fault-resilient federated learning when the data samples are non-uniformly distributed across workers.
In the presence of adversarially faulty workers who may strategically corrupt datasets, the local messages exchanged can be unreliable.
The present work introduces a fault-resilient proximal gradient (FRPG) algorithm that relies on Nesterov's acceleration technique, along with a local variant (LFRPG).
For strongly convex loss functions, FRPG and LFRPG have provably faster convergence rates than a benchmark robust aggregation algorithm.
arXiv Detail & Related papers (2020-06-17T16:50:33Z) - Corruption-Tolerant Gaussian Process Bandit Optimization [130.60115798580136]
We consider the problem of optimizing an unknown (typically non-convex) function with a bounded norm.
We introduce an algorithm, Fast-Slow GP-UCB, that combines a "fast but non-robust" method with a "slow but robust" one.
We argue that certain dependencies on the corruption level cannot be avoided.
arXiv Detail & Related papers (2020-03-04T09:46:58Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.