Batched Data-Driven Evolutionary Multi-Objective Optimization Based on
Manifold Interpolation
- URL: http://arxiv.org/abs/2109.05639v1
- Date: Sun, 12 Sep 2021 23:54:26 GMT
- Title: Batched Data-Driven Evolutionary Multi-Objective Optimization Based on
Manifold Interpolation
- Authors: Ke Li, Renzhi Chen
- Abstract summary: We propose a framework for implementing batched data-driven evolutionary multi-objective optimization.
It is so general that any off-the-shelf evolutionary multi-objective optimization algorithm can be applied in a plug-in manner.
Our proposed framework features faster convergence and stronger resilience to various Pareto front (PF) shapes.
- Score: 6.560512252982714
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Multi-objective optimization problems are ubiquitous in real-world science,
engineering and design optimization problems. It is not uncommon that the
objective functions are black boxes, the evaluation of which usually involves
time-consuming and/or costly physical experiments. Data-driven evolutionary
optimization can be used to search for a set of non-dominated trade-off
solutions, where the expensive objective functions are approximated by a
surrogate model. In this paper, we propose a framework for implementing batched
data-driven evolutionary multi-objective optimization. It is so general that
any off-the-shelf evolutionary multi-objective optimization algorithm can be
applied in a plug-in manner. In particular, it has two unique components: 1)
based on the Karush-Kuhn-Tucker conditions, a manifold interpolation approach
that explores more diversified solutions with a convergence guarantee along the
manifold of the approximated Pareto-optimal set; and 2) a batch recommendation
approach that reduces the computational time of the optimization process by
evaluating multiple samples at a time in parallel. Experiments on 136 benchmark
test problem instances with irregular Pareto-optimal front shapes against six
state-of-the-art surrogate-assisted EMO algorithms fully demonstrate the
effectiveness and superiority of our proposed framework. In particular, our
proposed framework features faster convergence and stronger resilience to
various Pareto front (PF) shapes.
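To make the two components concrete, below is a minimal, hypothetical Python sketch of one round of such a batched loop. The KKT conditions for Pareto optimality imply, under regularity conditions, that the Pareto-optimal set lies on a low-dimensional manifold in the decision space; the sketch stands in for the paper's KKT-based manifold interpolation with simple convex combinations of non-dominated points, and for its infill criterion with a distance-based diversity rule. All function names, parameters and selection rules are illustrative assumptions, not the paper's actual operators.

```python
import numpy as np

def evaluate_expensive(x):
    # Stand-in for the costly black-box objectives (two objectives here).
    return np.array([np.sum(x ** 2), np.sum((x - 1.0) ** 2)])

def non_dominated_mask(F):
    # Boolean mask marking the non-dominated rows of the objective matrix F.
    n = len(F)
    mask = np.ones(n, dtype=bool)
    for i in range(n):
        for j in range(n):
            if i != j and np.all(F[j] <= F[i]) and np.any(F[j] < F[i]):
                mask[i] = False
                break
    return mask

def interpolate_candidates(X_nd, n_new, rng):
    # Crude stand-in for manifold interpolation: random convex combinations of
    # pairs of non-dominated solutions, which stay near the approximated
    # Pareto-set manifold when it is locally close to linear.
    idx = rng.integers(0, len(X_nd), size=(n_new, 2))
    w = rng.random((n_new, 1))
    return w * X_nd[idx[:, 0]] + (1.0 - w) * X_nd[idx[:, 1]]

rng = np.random.default_rng(0)
dim, batch_size, n_init, n_rounds = 5, 4, 20, 10

X = rng.random((n_init, dim))                       # initial design
F = np.array([evaluate_expensive(x) for x in X])    # expensive evaluations

for _ in range(n_rounds):
    nd = non_dominated_mask(F)
    cand = interpolate_candidates(X[nd], 50, rng)
    # Batch recommendation: pick mutually distant candidates as a diversity
    # proxy; the paper instead uses a surrogate-based infill criterion.
    batch = [cand[0]]
    while len(batch) < batch_size:
        dist = np.min([np.linalg.norm(cand - p, axis=1) for p in batch], axis=0)
        batch.append(cand[np.argmax(dist)])
    X_new = np.array(batch)
    # In practice the recommended batch is evaluated in parallel.
    F_new = np.array([evaluate_expensive(x) for x in X_new])
    X, F = np.vstack([X, X_new]), np.vstack([F, F_new])

print("non-dominated solutions found:", int(np.sum(non_dominated_mask(F))))
```

In the actual framework, surrogate models of the expensive objectives, the convergence-guaranteed interpolation along the approximated Pareto-optimal set, and the batch infill criterion replace these placeholders.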
Related papers
- Analyzing and Enhancing the Backward-Pass Convergence of Unrolled
Optimization [50.38518771642365]
The integration of constrained optimization models as components in deep networks has led to promising advances on many specialized learning tasks.
A central challenge in this setting is backpropagation through the solution of an optimization problem, which often lacks a closed form.
This paper provides theoretical insights into the backward pass of unrolled optimization, showing that it is equivalent to the solution of a linear system by a particular iterative method.
A system called Folded Optimization is proposed to construct more efficient backpropagation rules from unrolled solver implementations.
arXiv Detail & Related papers (2023-12-28T23:15:18Z)
- Combining Kernelized Autoencoding and Centroid Prediction for Dynamic
Multi-objective Optimization [3.431120541553662]
This paper proposes a unified paradigm, which combines the kernelized autoencoding evolutionary search and the centroid-based prediction.
The proposed method is compared with five state-of-the-art algorithms on a number of complex benchmark problems.
arXiv Detail & Related papers (2023-12-02T00:24:22Z)
- Evolutionary Solution Adaption for Multi-Objective Metal Cutting Process
Optimization [59.45414406974091]
We introduce a framework for system flexibility that allows us to study the ability of an algorithm to transfer solutions from previous optimization tasks.
We study the flexibility of NSGA-II, which we extend by two variants: 1) varying goals, which optimizes solutions for two tasks simultaneously to obtain in-between source solutions expected to be more adaptable, and 2) an active-inactive genotype, which accommodates different possibilities that can be activated or deactivated.
Results show that adaptation with standard NSGA-II greatly reduces the number of evaluations required for optimization to a target goal, while the proposed variants further reduce the adaptation costs.
arXiv Detail & Related papers (2023-05-31T12:07:50Z)
- A Survey on Multi-Objective based Parameter Optimization for Deep
Learning [1.3223682837381137]
We focus on exploring the effectiveness of multi-objective optimization strategies for parameter optimization in conjunction with deep neural networks.
The two methods are combined to provide valuable insights into the generation of predictions and analysis in multiple applications.
arXiv Detail & Related papers (2023-05-17T07:48:54Z)
- Backpropagation of Unrolled Solvers with Folded Optimization [55.04219793298687]
The integration of constrained optimization models as components in deep networks has led to promising advances on many specialized learning tasks.
One typical strategy is algorithm unrolling, which relies on automatic differentiation through the operations of an iterative solver.
This paper provides theoretical insights into the backward pass of unrolled optimization, leading to a system for generating efficiently solvable analytical models of backpropagation.
arXiv Detail & Related papers (2023-01-28T01:50:42Z)
- An Empirical Evaluation of Zeroth-Order Optimization Methods on
AI-driven Molecule Optimization [78.36413169647408]
We study the effectiveness of various ZO optimization methods for optimizing molecular objectives.
We show the advantages of ZO sign-based gradient descent (ZO-signGD).
We demonstrate the potential effectiveness of ZO optimization methods on widely used benchmark tasks from the Guacamol suite.
arXiv Detail & Related papers (2022-10-27T01:58:10Z)
- Late Fusion Multi-view Clustering via Global and Local Alignment
Maximization [61.89218392703043]
Multi-view clustering (MVC) optimally integrates complementary information from different views to improve clustering performance.
Most existing approaches directly fuse multiple pre-specified similarities to learn an optimal similarity matrix for clustering.
We propose late fusion MVC via alignment to address these issues.
arXiv Detail & Related papers (2022-08-02T01:49:31Z)
- Data-Driven Evolutionary Multi-Objective Optimization Based on
Multiple-Gradient Descent for Disconnected Pareto Fronts [6.560512252982714]
This paper proposes a data-driven evolutionary multi-objective optimization (EMO) algorithm based on multiple-gradient descent.
Its infill criterion recommends a batch of promising candidate solutions to conduct expensive objective function evaluations (a minimal sketch of the underlying multiple-gradient descent step is given after this list).
arXiv Detail & Related papers (2022-05-28T06:01:41Z)
- A novel multiobjective evolutionary algorithm based on decomposition and
multi-reference points strategy [14.102326122777475]
The multiobjective evolutionary algorithm based on decomposition (MOEA/D) has been regarded as a highly promising approach for solving multiobjective optimization problems (MOPs).
We propose an improved MOEA/D algorithm by virtue of the well-known Pascoletti-Serafini scalarization method and a new strategy of multi-reference points.
arXiv Detail & Related papers (2021-10-27T02:07:08Z)
- Bilevel Optimization: Convergence Analysis and Enhanced Design [63.64636047748605]
Bilevel optimization has become a powerful tool for many machine learning problems.
We propose a novel bilevel optimization algorithm named stoc-BiO, featuring a sample-efficient hypergradient estimator.
arXiv Detail & Related papers (2020-10-15T18:09:48Z)
- Adaptive Sampling of Pareto Frontiers with Binary Constraints Using
Regression and Classification [0.0]
We present a novel adaptive optimization algorithm for black-box multi-objective optimization problems with binary constraints.
Our method is based on probabilistic regression and classification models, which act as a surrogate for the optimization goals.
We also present a novel ellipsoid truncation method to speed up the expected hypervolume calculation.
arXiv Detail & Related papers (2020-08-27T09:15:02Z)
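As referenced in the multiple-gradient descent entry above, the following is a minimal, hypothetical sketch of a multiple-gradient descent step for two differentiable objectives, using the classical closed-form min-norm combination of the two gradients. The related paper additionally builds surrogate models and a batch infill criterion on top of such a step, which is not reproduced here; all names and constants below are illustrative assumptions.

```python
import numpy as np

# Two toy differentiable objectives and their gradients (placeholders).
def f1(x): return np.sum(x ** 2)
def f2(x): return np.sum((x - 2.0) ** 2)
def grad_f1(x): return 2.0 * x
def grad_f2(x): return 2.0 * (x - 2.0)

def mgd_direction(g1, g2):
    # Min-norm convex combination of the two gradients:
    #   minimize ||a*g1 + (1-a)*g2||^2  subject to 0 <= a <= 1.
    denom = np.dot(g1 - g2, g1 - g2)
    a = 0.5 if denom == 0.0 else np.clip(np.dot(g2 - g1, g2) / denom, 0.0, 1.0)
    # The negative of the min-norm element is a common descent direction
    # for both objectives (zero at Pareto-stationary points).
    return -(a * g1 + (1.0 - a) * g2)

x = np.array([3.0, -1.0])
for _ in range(100):
    d = mgd_direction(grad_f1(x), grad_f2(x))
    if np.linalg.norm(d) < 1e-8:   # approximately Pareto-stationary
        break
    x = x + 0.1 * d

print("x:", x, "f1:", f1(x), "f2:", f2(x))
```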
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.