Surrogate Assisted Evolutionary Algorithm for Medium Scale Expensive
Multi-Objective Optimisation Problems
- URL: http://arxiv.org/abs/2002.03150v1
- Date: Sat, 8 Feb 2020 12:06:08 GMT
- Title: Surrogate Assisted Evolutionary Algorithm for Medium Scale Expensive
Multi-Objective Optimisation Problems
- Authors: Xiaoran Ruan, Ke Li, Bilel Derbel, Arnaud Liefooghe
- Abstract summary: Building a surrogate model of an objective function has been shown to be effective in assisting evolutionary algorithms (EAs) to solve real-world complex optimisation problems.
We propose a Gaussian process surrogate model assisted EA for medium-scale expensive multi-objective optimisation problems with up to 50 decision variables.
The effectiveness of our proposed algorithm is validated on benchmark problems with 10, 20, and 50 variables, in comparison with three state-of-the-art SAEAs.
- Score: 4.338938227238059
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Building a surrogate model of an objective function has been shown to be
effective in assisting evolutionary algorithms (EAs) to solve real-world complex
optimisation problems that involve either computationally expensive numerical
simulations or costly physical experiments. However, these methods have mostly
been demonstrated on small-scale problems with fewer than 10 decision variables. The
scalability of surrogate-assisted EAs (SAEAs) has not been well studied yet.
In this paper, we propose a Gaussian process surrogate model assisted EA for
medium-scale expensive multi-objective optimisation problems with up to 50
decision variables. There are three distinctive features of our proposed SAEA.
First, instead of using all decision variables in surrogate model building, we
use only the correlated ones to build the surrogate model for each objective
function. Second, rather than directly optimising the surrogate objective
functions, the original multi-objective optimisation problem is transformed into
a new one based on the surrogate models. Last but not least, a subset
selection method is developed to choose a few promising candidate
solutions for actual objective function evaluations, thereby updating the training
dataset. The effectiveness of our proposed algorithm is validated on benchmark
problems with 10, 20, and 50 variables, in comparison with three state-of-the-art
SAEAs.
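A minimal Python sketch of one iteration of the loop described above is given below, assuming scikit-learn's GaussianProcessRegressor as the surrogate. The correlation threshold, the placeholder objectives, the naive scalarisation standing in for the problem transformation, and the batch size of five are illustrative assumptions, not the authors' settings.

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)
n_var, n_obj = 50, 2                        # medium scale: up to 50 variables
X = rng.random((100, n_var))                # decision vectors evaluated so far
Y = np.column_stack([X[:, :25].sum(1),      # placeholder "expensive" objectives
                     (1 - X[:, 25:]).sum(1)])

def select_correlated_vars(X, y, threshold=0.1):
    # Feature 1: keep only variables whose absolute Pearson correlation
    # with this objective exceeds a threshold (assumed screening criterion).
    corr = np.array([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(X.shape[1])])
    idx = np.where(corr > threshold)[0]
    return idx if idx.size else np.arange(X.shape[1])

# One GP surrogate per objective, built on its correlated variables only.
surrogates = []
for m in range(n_obj):
    idx = select_correlated_vars(X, Y[:, m])
    gp = GaussianProcessRegressor(kernel=RBF(), normalize_y=True).fit(X[:, idx], Y[:, m])
    surrogates.append((idx, gp))

# Features 2 and 3 are only sketched: score random candidates with the
# surrogate means (a naive scalarisation in place of the transformed problem)
# and keep a small batch of promising points for real, expensive evaluation.
candidates = rng.random((500, n_var))
pred = np.column_stack([gp.predict(candidates[:, idx]) for idx, gp in surrogates])
batch = candidates[np.argsort(pred.sum(1))[:5]]   # 5 candidates to evaluate
print(batch.shape)

Screening variables per objective keeps each GP's input dimension manageable, which is one way the abstract's first feature addresses scaling to 50 decision variables.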
Related papers
- Autoformulation of Mathematical Optimization Models Using LLMs [50.030647274271516]
We develop an automated approach to creating optimization models from natural language descriptions for commercial solvers.
We identify the three core challenges of autoformulation: (1) defining the vast, problem-dependent hypothesis space, (2) efficiently searching this space under uncertainty, and (3) evaluating formulation correctness.
arXiv Detail & Related papers (2024-11-03T20:41:38Z) - A First Look at Kolmogorov-Arnold Networks in Surrogate-assisted Evolutionary Algorithms [5.198324938447394]
Surrogate-assisted Evolutionary Algorithms (SAEAs) are an essential method for solving expensive problems.
This paper introduces Kolmogorov-Arnold Networks (KANs) as surrogate models within SAEAs.
KANs demonstrate commendable performance within SAEAs, effectively decreasing the number of function calls and enhancing the optimization efficiency.
arXiv Detail & Related papers (2024-05-26T09:12:44Z) - Enhancing SAEAs with Unevaluated Solutions: A Case Study of Relation
Model for Expensive Optimization [6.382398222493027]
This paper presents a framework using unevaluated solutions to enhance the efficiency of SAEAs.
The surrogate model is employed to identify high-quality solutions for direct generation of new solutions without evaluation.
arXiv Detail & Related papers (2023-09-21T12:09:55Z) - Multi-surrogate Assisted Efficient Global Optimization for Discrete
Problems [0.9127162004615265]
This paper investigates the possible benefit of a concurrent utilization of multiple simulation-based surrogate models to solve discrete problems.
Our findings indicate that SAMA-DiEGO can rapidly converge to better solutions on a majority of the test problems.
arXiv Detail & Related papers (2022-12-13T09:10:08Z) - Dynamic Multi-objective Ensemble of Acquisition Functions in Batch
Bayesian Optimization [1.1602089225841632]
The acquisition function plays a crucial role in the optimization process.
Three acquisition functions are dynamically selected from a set based on their current and historical performance and treated as the objectives of a multi-objective problem (MOP).
Optimizing this MOP with an evolutionary multi-objective algorithm yields a set of non-dominated candidate solutions (a minimal sketch of the acquisition-ensemble idea appears after this list).
arXiv Detail & Related papers (2022-06-22T14:09:18Z) - Towards Explainable Metaheuristic: Mining Surrogate Fitness Models for
Importance of Variables [69.02115180674885]
We use four benchmark problems to train a surrogate model and investigate the learning of the search space by the surrogate model.
We show that the surrogate model picks out key characteristics of the problem as it is trained on population data from each generation.
arXiv Detail & Related papers (2022-05-31T09:16:18Z) - Data-Driven Evolutionary Multi-Objective Optimization Based on
Multiple-Gradient Descent for Disconnected Pareto Fronts [6.560512252982714]
This paper proposes a data-driven evolutionary multi-objective optimization (EMO) algorithm based on multiple-gradient descent.
Its infill criterion recommends a batch of promising candidate solutions to conduct expensive objective function evaluations.
arXiv Detail & Related papers (2022-05-28T06:01:41Z) - RoMA: Robust Model Adaptation for Offline Model-based Optimization [115.02677045518692]
We consider the problem of searching for an input that maximizes a black-box objective function, given a static dataset of input-output queries.
A popular approach to solving this problem is maintaining a proxy model that approximates the true objective function.
Here, the main challenge is how to avoid adversarially optimized inputs during the search.
arXiv Detail & Related papers (2021-10-27T05:37:12Z) - Conservative Objective Models for Effective Offline Model-Based
Optimization [78.19085445065845]
Computational design problems arise in a number of settings, from synthetic biology to computer architectures.
We propose a method that learns a model of the objective function that lower bounds the actual value of the ground-truth objective on out-of-distribution inputs.
COMs are simple to implement and outperform a number of existing methods on a wide range of MBO problems.
arXiv Detail & Related papers (2021-07-14T17:55:28Z) - Modeling the Second Player in Distributionally Robust Optimization [90.25995710696425]
We argue for the use of neural generative models to characterize the worst-case distribution.
This approach poses a number of implementation and optimization challenges.
We find that the proposed approach yields models that are more robust than comparable baselines.
arXiv Detail & Related papers (2021-03-18T14:26:26Z) - Automatically Learning Compact Quality-aware Surrogates for Optimization
Problems [55.94450542785096]
Solving optimization problems with unknown parameters requires learning a predictive model to predict the values of the unknown parameters and then solving the problem using these values.
Recent work has shown that including the optimization problem as a layer in the model training pipeline results in predictions of the unobserved parameters that lead to higher decision quality.
We show that we can improve solution quality by learning a low-dimensional surrogate model of a large optimization problem.
arXiv Detail & Related papers (2020-06-18T19:11:54Z)
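As a companion to the acquisition-ensemble entry above ("Dynamic Multi-objective Ensemble of Acquisition Functions in Batch Bayesian Optimization"), the following Python sketch illustrates the general idea under simplifying assumptions: three standard acquisition functions (expected improvement, probability of improvement, and a negated lower confidence bound) are computed from a single Gaussian process and treated as objectives to be maximised, and the non-dominated candidates form the batch for expensive evaluation. The dynamic selection from a larger pool described in that paper is omitted here.

import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor

rng = np.random.default_rng(1)
X = rng.random((30, 5))
y = np.sin(X.sum(1))                            # placeholder expensive objective (minimised)
gp = GaussianProcessRegressor(normalize_y=True).fit(X, y)

cand = rng.random((200, 5))
mu, sigma = gp.predict(cand, return_std=True)
best = y.min()
z = (best - mu) / np.maximum(sigma, 1e-12)

ei = (best - mu) * norm.cdf(z) + sigma * norm.pdf(z)   # expected improvement
pi = norm.cdf(z)                                       # probability of improvement
lcb = -(mu - 2.0 * sigma)                              # negated lower confidence bound

A = np.column_stack([ei, pi, lcb])                     # three objectives, all maximised

def non_dominated(A):
    # Indices of rows not dominated by any other row (maximisation).
    keep = []
    for i in range(len(A)):
        dominated = np.any(np.all(A >= A[i], axis=1) & np.any(A > A[i], axis=1))
        if not dominated:
            keep.append(i)
    return np.array(keep)

batch = cand[non_dominated(A)]
print(len(batch), "non-dominated candidates selected for expensive evaluation")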
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the content listed here (including all information) and is not responsible for any consequences.