Exploring the effectiveness of surrogate-assisted evolutionary
algorithms on the batch processing problem
- URL: http://arxiv.org/abs/2210.17149v1
- Date: Mon, 31 Oct 2022 09:00:39 GMT
- Title: Exploring the effectiveness of surrogate-assisted evolutionary
algorithms on the batch processing problem
- Authors: Mohamed Z. Variawa, Terence L. Van Zyl and Matthew Woolway
- Abstract summary: This paper introduces a simulation of a well-known batch processing problem in the literature.
Evolutionary algorithms such as the Genetic Algorithm (GA) and Differential Evolution (DE) are used to find the optimal schedule for the simulation.
We then compare the quality of solutions obtained by the surrogate-assisted versions of the algorithms against the baseline algorithms.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Real-world optimisation problems typically have objective functions which
cannot be expressed analytically. These optimisation problems are evaluated
through expensive physical experiments or simulations. Cheap approximations of
the objective function can reduce the computational requirements for solving
these expensive optimisation problems. These cheap approximations may be
machine learning or statistical models and are known as surrogate models. This
paper introduces a simulation of a well-known batch processing problem in the
literature. Evolutionary algorithms such as the Genetic Algorithm (GA) and
Differential Evolution (DE) are used to find the optimal schedule for the
simulation. We then compare the quality of solutions obtained by the
surrogate-assisted versions of the algorithms against the baseline algorithms.
Surrogate assistance is achieved through the Probabilistic Surrogate-Assisted
Framework (PSAF). The results highlight the potential for improving baseline
evolutionary algorithms through surrogates. For different time horizons, the
solutions are evaluated with respect to several quality indicators. It is shown
that the PSAF-assisted GA (PSAF-GA) and PSAF-assisted DE (PSAF-DE) provided
improvement in some time horizons. In others, they either maintained the
solutions or showed some deterioration. The results also highlight the need to
tune the hyper-parameters used by the surrogate-assisted framework, as the
surrogate-assisted algorithm, in some instances, deteriorates relative to the baseline
algorithm.
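To make the surrogate-assisted setup concrete, the sketch below shows one common pattern for coupling a cheap surrogate with Differential Evolution: a Gaussian process regressor, refit on every truly evaluated point, pre-screens trial vectors so that expensive objective calls are spent only on candidates the model predicts to be competitive. This is an illustrative sketch under stated assumptions, not the paper's PSAF implementation; `expensive_objective`, the population settings, and the choice of a Gaussian process are placeholders standing in for the batch-processing simulation and its tuned configuration.

```python
# Minimal sketch: differential evolution with surrogate pre-screening.
# NOTE: expensive_objective is a cheap stand-in, not the batch-processing
# simulation from the paper, and the GP surrogate is only one possible choice.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

rng = np.random.default_rng(0)
DIM, POP, GENS, BOUNDS = 5, 20, 30, (-5.0, 5.0)

def expensive_objective(x):
    # Placeholder for an expensive simulation call (minimisation).
    return float(np.sum(x ** 2))

X_seen, y_seen = [], []  # archive of all truly evaluated points

def true_eval(x):
    y = expensive_objective(x)
    X_seen.append(x.copy())
    y_seen.append(y)
    return y

pop = rng.uniform(*BOUNDS, size=(POP, DIM))
fit = np.array([true_eval(x) for x in pop])

for gen in range(GENS):
    # Refit the surrogate on everything evaluated so far.
    surrogate = GaussianProcessRegressor().fit(np.array(X_seen), np.array(y_seen))
    for i in range(POP):
        # Standard DE/rand/1/bin mutation and binomial crossover.
        a, b, c = pop[rng.choice([j for j in range(POP) if j != i], 3, replace=False)]
        trial = np.where(rng.random(DIM) < 0.9, a + 0.8 * (b - c), pop[i])
        trial = np.clip(trial, *BOUNDS)
        # Pre-screening: spend a real evaluation only if the surrogate's
        # optimistic estimate (mean minus one std) beats the current member.
        mu, sigma = surrogate.predict(trial.reshape(1, -1), return_std=True)
        if mu[0] - sigma[0] <= fit[i]:
            y = true_eval(trial)
            if y < fit[i]:
                pop[i], fit[i] = trial, y

print("best fitness:", fit.min(), "true evaluations:", len(y_seen))
```

In this pattern, the screening threshold and how often the surrogate is refit are exactly the kind of hyper-parameters the abstract flags as needing careful tuning, since an overly aggressive filter can reject trials that would have improved on the baseline search.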
Related papers
- Comparative study of regression vs pairwise models for surrogate-based heuristic optimisation [1.2535250082638645]
This paper addresses the formulation of surrogate problems as both regression models that approximate fitness (surface surrogate models) and a novel way to connect classification models (pairwise surrogate models).
The performance of the overall search, when using online machine learning-based surrogate models, depends not only on the accuracy of the predictive model but also on the kind of bias towards positive or negative cases.
arXiv Detail & Related papers (2024-10-04T13:19:06Z)
- Model Uncertainty in Evolutionary Optimization and Bayesian Optimization: A Comparative Analysis [5.6787965501364335]
Black-box optimization problems are common in many real-world applications.
These problems require optimization through input-output interactions without access to internal workings.
Two widely used gradient-free optimization techniques, evolutionary optimization and Bayesian optimization, are employed to address such challenges.
This paper aims to elucidate the similarities and differences in the utilization of model uncertainty between these two methods.
arXiv Detail & Related papers (2024-03-21T13:59:19Z)
- GE-AdvGAN: Improving the transferability of adversarial samples by gradient editing-based adversarial generative model [69.71629949747884]
Adversarial generative models, such as Generative Adversarial Networks (GANs), are widely applied for generating various types of data.
In this work, we propose a novel algorithm named GE-AdvGAN to enhance the transferability of adversarial samples.
arXiv Detail & Related papers (2024-01-11T16:43:16Z)
- An Optimization-based Deep Equilibrium Model for Hyperspectral Image Deconvolution with Convergence Guarantees [71.57324258813675]
We propose a novel methodology for addressing the hyperspectral image deconvolution problem.
A new optimization problem is formulated, leveraging a learnable regularizer in the form of a neural network.
The derived iterative solver is then expressed as a fixed-point calculation problem within the Deep Equilibrium framework.
arXiv Detail & Related papers (2023-06-10T08:25:16Z)
- Rank-Based Learning and Local Model Based Evolutionary Algorithm for High-Dimensional Expensive Multi-Objective Problems [1.0499611180329806]
The proposed algorithm consists of three parts: rank-based learning, hyper-volume-based non-dominated search, and local search in the relatively sparse objective space.
The experimental results of benchmark problems and a real-world application on geothermal reservoir heat extraction optimization demonstrate that the proposed algorithm shows superior performance.
arXiv Detail & Related papers (2023-04-19T06:25:04Z)
- Multi-surrogate Assisted Efficient Global Optimization for Discrete Problems [0.9127162004615265]
This paper investigates the possible benefit of a concurrent utilization of multiple simulation-based surrogate models to solve discrete problems.
Our findings indicate that SAMA-DiEGO can rapidly converge to better solutions on a majority of the test problems.
arXiv Detail & Related papers (2022-12-13T09:10:08Z)
- An Empirical Evaluation of Zeroth-Order Optimization Methods on AI-driven Molecule Optimization [78.36413169647408]
We study the effectiveness of various ZO optimization methods for optimizing molecular objectives.
We show the advantages of ZO sign-based gradient descent (ZO-signGD).
We demonstrate the potential effectiveness of ZO optimization methods on widely used benchmark tasks from the Guacamol suite.
arXiv Detail & Related papers (2022-10-27T01:58:10Z)
- Multi-objective hyperparameter optimization with performance uncertainty [62.997667081978825]
This paper presents results on multi-objective hyperparameter optimization with uncertainty on the evaluation of Machine Learning algorithms.
We combine the sampling strategy of Tree-structured Parzen Estimators (TPE) with the metamodel obtained after training a Gaussian Process Regression (GPR) with heterogeneous noise.
Experimental results on three analytical test functions and three ML problems show the improvement over multi-objective TPE and GPR.
arXiv Detail & Related papers (2022-09-09T14:58:43Z)
- Surrogate-Assisted Genetic Algorithm for Wrapper Feature Selection [4.89253144446913]
We propose a novel multi-stage feature selection framework utilizing multiple levels of approximations, or surrogates.
Our experiments show that SAGA can arrive at near-optimal solutions three times faster than a wrapper GA, on average.
arXiv Detail & Related papers (2021-11-17T12:33:18Z)
- EOS: a Parallel, Self-Adaptive, Multi-Population Evolutionary Algorithm for Constrained Global Optimization [68.8204255655161]
EOS is a global optimization algorithm for constrained and unconstrained problems of real-valued variables.
It implements a number of improvements to the well-known Differential Evolution (DE) algorithm.
Results show that EOS is capable of achieving increased performance compared to state-of-the-art single-population self-adaptive DE algorithms.
arXiv Detail & Related papers (2020-07-09T10:19:22Z)
- Automatically Learning Compact Quality-aware Surrogates for Optimization Problems [55.94450542785096]
Solving optimization problems with unknown parameters requires learning a predictive model to predict the values of the unknown parameters and then solving the problem using these values.
Recent work has shown that including the optimization problem as a layer in the model training pipeline results in predictions of the unobserved parameters that lead to better decision quality.
We show that we can improve solution quality by learning a low-dimensional surrogate model of a large optimization problem.
arXiv Detail & Related papers (2020-06-18T19:11:54Z)