GPSAF: A Generalized Probabilistic Surrogate-Assisted Framework for
Constrained Single- and Multi-objective Optimization
- URL: http://arxiv.org/abs/2204.04054v1
- Date: Wed, 6 Apr 2022 13:22:30 GMT
- Title: GPSAF: A Generalized Probabilistic Surrogate-Assisted Framework for
Constrained Single- and Multi-objective Optimization
- Authors: Julian Blank and Kalyanmoy Deb
- Abstract summary: This paper proposes a generalized probabilistic surrogate-assisted framework (GPSAF).
GPSAF is applicable to a broad category of unconstrained and constrained, single- and multi-objective optimization algorithms.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Significant effort has been made to solve computationally expensive
optimization problems in the past two decades, and various optimization methods
incorporating surrogates into optimization have been proposed. Most research
focuses on either exploiting the surrogate by defining a utility optimization
problem or customizing an existing optimization method to use one or multiple
approximation models. However, little attention has been paid to generic
concepts applicable to different types of algorithms and optimization problems
simultaneously. Thus, this paper proposes a generalized probabilistic
surrogate-assisted framework (GPSAF), applicable to a broad category of
unconstrained and constrained, single- and multi-objective optimization
algorithms. The idea is based on a surrogate assisting an existing optimization
method. The assistance is based on two distinct phases, one facilitating
exploration and another exploiting the surrogates. The exploration and
exploitation of surrogates are automatically balanced by performing a
probabilistic knockout tournament among different clusters of solutions. A
study of multiple well-known population-based optimization algorithms is
conducted with and without the proposed surrogate assistance on single- and
multi-objective optimization problems with a maximum solution evaluation budget
of 300 or less. The results indicate the effectiveness of applying GPSAF to an
optimization algorithm and the competitiveness with other surrogate-assisted
algorithms.
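The probabilistic knockout tournament described in the abstract can be sketched as follows. This is a minimal, single-objective, minimization-only illustration and not the paper's actual procedure: the `p_trust` parameter (a fixed probability of trusting the surrogate's pairwise comparison) is a simplifying assumption standing in for GPSAF's uncertainty-derived selection probabilities, and clustering of solutions is omitted.

```python
import random

def probabilistic_knockout(candidates, surrogate, p_trust=0.9, rng=None):
    """Select one candidate via a probabilistic knockout tournament.

    In each round, candidates are paired and compared on their
    surrogate-predicted objective values (minimization). The surrogate's
    verdict is trusted with probability p_trust; otherwise the outcome is
    flipped, a crude stand-in for accounting for model uncertainty.
    """
    rng = rng or random.Random()
    pool = list(candidates)
    while len(pool) > 1:
        rng.shuffle(pool)
        winners = []
        for i in range(0, len(pool) - 1, 2):
            a, b = pool[i], pool[i + 1]
            better, worse = (a, b) if surrogate(a) <= surrogate(b) else (b, a)
            winners.append(better if rng.random() < p_trust else worse)
        if len(pool) % 2:  # an odd candidate gets a bye into the next round
            winners.append(pool[-1])
        pool = winners
    return pool[0]

# Toy usage: a cheap quadratic stands in for a fitted surrogate model.
winner = probabilistic_knockout(
    [[-2.0], [0.1], [3.0]],
    surrogate=lambda x: x[0] ** 2,
    p_trust=1.0,  # fully trust the surrogate -> deterministic outcome
)
```

With `p_trust < 1`, predicted-worse candidates occasionally survive, which is one way such a tournament can trade off exploiting the surrogate against exploring solutions it ranks poorly.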
Related papers
- Learning Joint Models of Prediction and Optimization [56.04498536842065]
The Predict-Then-Optimize framework uses machine learning models to predict unknown parameters of an optimization problem from features before solving it.
This paper proposes an alternative method, in which optimal solutions are learned directly from the observable features by joint predictive models.
arXiv Detail & Related papers (2024-09-07T19:52:14Z)
- Model Uncertainty in Evolutionary Optimization and Bayesian Optimization: A Comparative Analysis [5.6787965501364335]
Black-box optimization problems are common in many real-world applications.
These problems require optimization through input-output interactions without access to internal workings.
Two widely used gradient-free optimization techniques, evolutionary optimization and Bayesian optimization, are employed to address such challenges.
This paper aims to elucidate the similarities and differences in the utilization of model uncertainty between these two methods.
arXiv Detail & Related papers (2024-03-21T13:59:19Z)
- Analyzing and Enhancing the Backward-Pass Convergence of Unrolled Optimization [50.38518771642365]
The integration of constrained optimization models as components in deep networks has led to promising advances on many specialized learning tasks.
A central challenge in this setting is backpropagation through the solution of an optimization problem, which often lacks a closed form.
This paper provides theoretical insights into the backward pass of unrolled optimization, showing that it is equivalent to the solution of a linear system by a particular iterative method.
A system called Folded Optimization is proposed to construct more efficient backpropagation rules from unrolled solver implementations.
arXiv Detail & Related papers (2023-12-28T23:15:18Z)
- Federated Multi-Level Optimization over Decentralized Networks [55.776919718214224]
We study the problem of distributed multi-level optimization over a network, where agents can only communicate with their immediate neighbors.
We propose a novel gossip-based distributed multi-level optimization algorithm that enables networked agents to solve optimization problems at different levels in a single timescale.
Our algorithm achieves optimal sample complexity, scaling linearly with the network size, and demonstrates state-of-the-art performance on various applications.
arXiv Detail & Related papers (2023-10-10T00:21:10Z)
- Backpropagation of Unrolled Solvers with Folded Optimization [55.04219793298687]
The integration of constrained optimization models as components in deep networks has led to promising advances on many specialized learning tasks.
One typical strategy is algorithm unrolling, which relies on automatic differentiation through the operations of an iterative solver.
This paper provides theoretical insights into the backward pass of unrolled optimization, leading to a system for generating efficiently solvable analytical models of backpropagation.
arXiv Detail & Related papers (2023-01-28T01:50:42Z)
- Multi-surrogate Assisted Efficient Global Optimization for Discrete Problems [0.9127162004615265]
This paper investigates the possible benefit of a concurrent utilization of multiple simulation-based surrogate models to solve discrete problems.
Our findings indicate that the proposed algorithm, SAMA-DiEGO, can rapidly converge to better solutions on a majority of the test problems.
arXiv Detail & Related papers (2022-12-13T09:10:08Z)
- An Empirical Evaluation of Zeroth-Order Optimization Methods on AI-driven Molecule Optimization [78.36413169647408]
We study the effectiveness of various ZO optimization methods for optimizing molecular objectives.
We show the advantages of ZO sign-based gradient descent (ZO-signGD).
We demonstrate the potential effectiveness of ZO optimization methods on widely used benchmark tasks from the Guacamol suite.
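As a rough illustration of the ZO-signGD idea mentioned above (a sketch, not the authors' implementation): the gradient is estimated from finite differences along random directions, and only its sign is used for the update. The parameters `mu`, `lr`, and `n_dirs` are illustrative choices, not values from the paper.

```python
import math
import random

def zo_sign_gd_step(f, x, mu=1e-3, lr=0.1, n_dirs=10, rng=None):
    """One zeroth-order sign-gradient-descent step (illustrative sketch).

    Estimates the gradient of f at x from forward finite differences along
    n_dirs random Gaussian directions, then moves each coordinate by a
    fixed step lr in the direction opposite the estimated gradient's sign.
    """
    rng = rng or random.Random(0)
    d = len(x)
    fx = f(x)
    grad = [0.0] * d
    for _ in range(n_dirs):
        u = [rng.gauss(0.0, 1.0) for _ in range(d)]
        fu = f([xi + mu * ui for xi, ui in zip(x, u)])
        scale = (fu - fx) / mu  # directional derivative estimate along u
        for i in range(d):
            grad[i] += scale * u[i] / n_dirs
    return [xi - lr * math.copysign(1.0, gi) for xi, gi in zip(x, grad)]

# Toy usage: a few sign-based steps on a quadratic, using only f-values.
x = [3.0, -2.0]
for _ in range(20):
    x = zo_sign_gd_step(lambda v: sum(t * t for t in v), x)
```

Using only the sign makes the update magnitude independent of how noisy the gradient estimate is, which is part of the appeal of sign-based ZO methods in black-box settings.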
arXiv Detail & Related papers (2022-10-27T01:58:10Z)
- Enhanced Opposition Differential Evolution Algorithm for Multimodal Optimization [0.2538209532048866]
Most real-world problems are multimodal in nature, consisting of multiple optimum values.
Classical gradient-based methods fail for optimization problems in which the objective functions are either discontinuous or non-differentiable.
We have proposed an algorithm known as Enhanced Opposition Differential Evolution (EODE) algorithm to solve the MMOPs.
arXiv Detail & Related papers (2022-08-23T16:18:27Z)
- A survey on multi-objective hyperparameter optimization algorithms for Machine Learning [62.997667081978825]
This article presents a systematic survey of the literature published between 2014 and 2020 on multi-objective HPO algorithms.
We distinguish between metaheuristic-based algorithms, metamodel-based algorithms, and approaches using a mixture of both.
We also discuss the quality metrics used to compare multi-objective HPO procedures and present future research directions.
arXiv Detail & Related papers (2021-11-23T10:22:30Z)
- Batched Data-Driven Evolutionary Multi-Objective Optimization Based on Manifold Interpolation [6.560512252982714]
We propose a framework for implementing batched data-driven evolutionary multi-objective optimization.
It is so general that any off-the-shelf evolutionary multi-objective optimization algorithm can be applied in a plug-in manner.
Our proposed framework features faster convergence and stronger resilience to various Pareto-front (PF) shapes.
arXiv Detail & Related papers (2021-09-12T23:54:26Z)
- PAMELI: A Meta-Algorithm for Computationally Expensive Multi-Objective Optimization Problems [0.0]
The proposed algorithm is based on solving a set of surrogate problems defined by models of the real one.
Our algorithm also performs a meta-search for optimal surrogate models and navigation strategies for the optimization landscape.
arXiv Detail & Related papers (2021-03-19T11:18:03Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.