Generative Evolutionary Strategy For Black-Box Optimizations
- URL: http://arxiv.org/abs/2205.03056v4
- Date: Sat, 27 Jan 2024 15:08:13 GMT
- Title: Generative Evolutionary Strategy For Black-Box Optimizations
- Authors: Changhwi Park
- Abstract summary: Black-box optimization in high-dimensional space is challenging.
Recent neural network-based black-box optimization studies have shown noteworthy achievements.
This study proposes a black-box optimization method based on the evolution strategy (ES) and the generative neural network (GNN) model.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Many scientific and technological problems are related to optimization. Among
them, black-box optimization in high-dimensional space is particularly
challenging. Recent neural network-based black-box optimization studies have
shown noteworthy achievements. However, their capability in high-dimensional
search space is still limited. This study proposes a black-box optimization
method based on the evolution strategy (ES) and the generative neural network
(GNN) model. We designed the algorithm so that the ES and the GNN model work
cooperatively. This hybrid model enables reliable training of surrogate
networks; it optimizes multi-objective, high-dimensional, and stochastic
black-box functions. Our method outperforms baseline optimization methods in
our experiments, including ES and Bayesian optimization.
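The cooperative loop the abstract describes (an ES proposing candidates while a learned model guides the search) can be illustrated with a minimal sketch. This is not the paper's algorithm: the generative neural network is replaced here by a simple linear surrogate fit by least squares, and all names (`es_with_surrogate`, `sphere`) and hyperparameters are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def sphere(x):
    """Toy black-box objective (minimize)."""
    return float(np.sum(x ** 2))

def es_with_surrogate(f, dim=8, pop=20, iters=30, sigma=0.3):
    """ES loop where a surrogate, fit to all points evaluated so far,
    pre-ranks candidates so true evaluations go to the promising half.
    The surrogate stands in for the paper's generative network."""
    mean = rng.normal(size=dim)
    X, y = [], []
    for _ in range(iters):
        cand = mean + sigma * rng.normal(size=(pop, dim))
        if len(X) >= dim + 1:
            # Fit a linear surrogate y ~ w.x + b by least squares.
            A = np.hstack([np.asarray(X), np.ones((len(X), 1))])
            w, *_ = np.linalg.lstsq(A, np.asarray(y), rcond=None)
            pred = cand @ w[:-1] + w[-1]
            cand = cand[np.argsort(pred)][: pop // 2]  # keep promising half
        scores = np.array([f(c) for c in cand])
        X.extend(cand); y.extend(scores)
        elite = cand[np.argsort(scores)][: max(2, len(cand) // 4)]
        mean = elite.mean(axis=0)   # recombine elites into the new mean
        sigma *= 0.95               # slowly shrink the search radius
    return mean, f(mean)

best_x, best_val = es_with_surrogate(sphere)
print(best_val)
```

The surrogate only filters candidates; selection still uses true evaluations, which is one way such a hybrid can stay reliable even when the learned model is imperfect.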
Related papers
- Enhancing CNN Classification with Lamarckian Memetic Algorithms and Local Search [0.0]
We propose a novel approach integrating a two-stage training technique with population-based optimization algorithms incorporating local search capabilities.
Our experiments demonstrate that the proposed method outperforms state-of-the-art gradient-based techniques.
arXiv Detail & Related papers (2024-10-26T17:31:15Z) - Sharpness-Aware Black-Box Optimization [47.95184866255126]
We propose a Sharpness-Aware Black-box Optimization (SABO) algorithm, which applies a sharpness-aware minimization strategy to improve the model generalization.
Empirically, extensive experiments on the black-box prompt fine-tuning tasks demonstrate the effectiveness of the proposed SABO method in improving model generalization performance.
arXiv Detail & Related papers (2024-10-16T11:08:06Z) - Covariance-Adaptive Sequential Black-box Optimization for Diffusion Targeted Generation [60.41803046775034]
We show how to perform user-preferred targeted generation via diffusion models with only black-box target scores of users.
Experiments on both numerical test problems and target-guided 3D-molecule generation tasks show the superior performance of our method in achieving better target scores.
arXiv Detail & Related papers (2024-06-02T17:26:27Z) - PINN-BO: A Black-box Optimization Algorithm using Physics-Informed Neural Networks [11.618811218101058]
Black-box optimization is a powerful approach for discovering global optima in noisy and expensive black-box functions.
We propose PINN-BO, a black-box optimization algorithm employing Physics-Informed Neural Networks.
We show that our algorithm is more sample-efficient compared to existing methods.
arXiv Detail & Related papers (2024-02-05T17:58:17Z) - Quantum Inspired Optimization for Industrial Scale Problems [0.5417521241272644]
We use TN-GEO, a quantum-inspired model-based optimization method, to assess the efficacy of quantum-inspired methods when applied to realistic problems.
In this case, the problem of interest is the optimization of a realistic assembly line based on BMW's currently utilized manufacturing schedule.
Through a comparison of optimization techniques, we found that quantum-inspired model-based optimization, when combined with conventional black-box methods, can find lower-cost solutions in certain contexts.
arXiv Detail & Related papers (2023-05-03T15:19:36Z) - Neural-BO: A Black-box Optimization Algorithm using Deep Neural Networks [12.218039144209017]
We propose a novel black-box optimization algorithm where the black-box function is modeled using a neural network.
Our algorithm does not need a Bayesian neural network to estimate predictive uncertainty and is therefore computationally favorable.
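The idea of modeling a black-box function with a plain (non-Bayesian) neural surrogate can be sketched roughly as follows. This is an illustrative toy, not the Neural-BO algorithm: `TinyMLP`, the training schedule, and the pick-the-predicted-best acquisition step are all assumptions of this sketch.

```python
import numpy as np

rng = np.random.default_rng(1)

def objective(x):
    return float(np.sum((x - 0.5) ** 2))  # toy target, minimum at x = 0.5

class TinyMLP:
    """One-hidden-layer regression net trained by full-batch gradient descent."""
    def __init__(self, dim, hidden=16, lr=0.05):
        self.W1 = rng.normal(scale=0.5, size=(dim, hidden))
        self.b1 = np.zeros(hidden)
        self.w2 = rng.normal(scale=0.5, size=hidden)
        self.b2 = 0.0
        self.lr = lr

    def predict(self, X):
        self.H = np.tanh(X @ self.W1 + self.b1)  # cache activations for fit()
        return self.H @ self.w2 + self.b2

    def fit(self, X, y, steps=200):
        for _ in range(steps):
            err = self.predict(X) - y                       # (n,)
            g2 = self.H.T @ err / len(y)                    # grad w.r.t. w2
            gH = np.outer(err, self.w2) * (1 - self.H ** 2)  # backprop through tanh
            g1 = X.T @ gH / len(y)                          # grad w.r.t. W1
            self.w2 -= self.lr * g2
            self.b2 -= self.lr * err.mean()
            self.W1 -= self.lr * g1
            self.b1 -= self.lr * gH.mean(axis=0)

# Surrogate-guided search: fit the net to the evaluations so far, then
# evaluate the candidate the net predicts to be best.
dim, X, y = 3, [], []
for _ in range(10):  # initial random design
    X.append(rng.uniform(0, 1, dim)); y.append(objective(X[-1]))
for _ in range(20):  # surrogate-guided rounds
    net = TinyMLP(dim)
    net.fit(np.asarray(X), np.asarray(y))
    cand = rng.uniform(0, 1, size=(200, dim))
    best = cand[np.argmin(net.predict(cand))]
    X.append(best); y.append(objective(best))
print(min(y))
```

Because the surrogate is a plain network, there is no posterior to maintain, which is the computational advantage the summary alludes to; a real method would add some exploration mechanism rather than pure exploitation.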
arXiv Detail & Related papers (2023-03-03T02:53:56Z) - Transfer Learning for Bayesian Optimization: A Survey [29.229660973338145]
Bayesian optimization (BO) is a powerful tool for modeling and optimizing expensive "black-box" functions.
Researchers in the BO community propose to incorporate the spirit of transfer learning to accelerate the optimization process.
arXiv Detail & Related papers (2023-02-12T14:37:25Z) - An Empirical Evaluation of Zeroth-Order Optimization Methods on AI-driven Molecule Optimization [78.36413169647408]
We study the effectiveness of various ZO optimization methods for optimizing molecular objectives.
We show the advantages of ZO sign-based gradient descent (ZO-signGD).
We demonstrate the potential effectiveness of ZO optimization methods on widely used benchmark tasks from the Guacamol suite.
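ZO-signGD admits a compact sketch: estimate the gradient from random finite differences, then step along its sign. The hyperparameters and the toy objective below are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

def f(x):
    return float(np.sum(x ** 2))  # toy smooth objective

def zo_sign_gd(f, x0, lr=0.05, mu=1e-3, q=20, steps=100):
    """Zeroth-order sign-based gradient descent: average q random
    finite-difference gradient estimates, then step along the sign."""
    x = x0.copy()
    for _ in range(steps):
        fx = f(x)
        g = np.zeros_like(x)
        for _ in range(q):
            u = rng.normal(size=x.shape)
            g += (f(x + mu * u) - fx) / mu * u  # directional estimate
        x -= lr * np.sign(g / q)
        # lr is fixed here; in practice it is usually decayed
    return x

x = zo_sign_gd(f, rng.normal(size=5))
print(f(x))
```

Taking only the sign discards the gradient's magnitude, which makes the update robust to noisy finite-difference estimates, at the cost of oscillating within a step-size-wide band around the optimum.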
arXiv Detail & Related papers (2022-10-27T01:58:10Z) - Conservative Objective Models for Effective Offline Model-Based Optimization [78.19085445065845]
Computational design problems arise in a number of settings, from synthetic biology to computer architectures.
We propose a method that learns a model of the objective function that lower bounds the actual value of the ground-truth objective on out-of-distribution inputs.
COMs are simple to implement and outperform a number of existing methods on a wide range of MBO problems.
arXiv Detail & Related papers (2021-07-14T17:55:28Z) - Bayesian Optimization for Selecting Efficient Machine Learning Models [53.202224677485525]
We present a unified Bayesian Optimization framework for jointly optimizing models for both prediction effectiveness and training efficiency.
Experiments on model selection for recommendation tasks indicate that models selected this way significantly improve training efficiency.
arXiv Detail & Related papers (2020-08-02T02:56:30Z) - Self-Directed Online Machine Learning for Topology Optimization [58.920693413667216]
Self-directed Online Learning Optimization integrates a Deep Neural Network (DNN) with Finite Element Method (FEM) calculations.
Our algorithm was tested by four types of problems including compliance minimization, fluid-structure optimization, heat transfer enhancement and truss optimization.
It reduced the computational time by 2 to 5 orders of magnitude compared with direct optimization, and outperformed all state-of-the-art algorithms tested in our experiments.
arXiv Detail & Related papers (2020-02-04T20:00:28Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.