Benchmarking Subset Selection from Large Candidate Solution Sets in
Evolutionary Multi-objective Optimization
- URL: http://arxiv.org/abs/2201.06700v1
- Date: Tue, 18 Jan 2022 02:09:08 GMT
- Title: Benchmarking Subset Selection from Large Candidate Solution Sets in Evolutionary Multi-objective Optimization
- Authors: Ke Shang and Tianye Shu and Hisao Ishibuchi and Yang Nan and Lie Meng Pang
- Abstract summary: In the evolutionary multi-objective optimization (EMO) field, the standard practice is to present the final population of an EMO algorithm as the output.
Recently, a new EMO framework has been proposed to solve this issue by storing all the non-dominated solutions generated during the evolution in an archive.
This paper proposes a benchmark test suite for subset selection from large candidate solution sets.
- Score: 6.544757635738911
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In the evolutionary multi-objective optimization (EMO) field, the standard
practice is to present the final population of an EMO algorithm as the output.
However, it has been shown that the final population often includes solutions
which are dominated by other solutions generated and discarded in previous
generations. Recently, a new EMO framework has been proposed to solve this
issue by storing all the non-dominated solutions generated during the evolution
in an archive and selecting a subset of solutions from the archive as the
output. The key component in this framework is the subset selection from the
archive which usually stores a large number of candidate solutions. However,
most studies on subset selection focus on small candidate solution sets for
environmental selection. There is no benchmark test suite for large-scale
subset selection. This paper aims to fill this research gap by proposing a
benchmark test suite for subset selection from large candidate solution sets,
and comparing some representative methods using the proposed test suite. The
proposed test suite together with the benchmarking studies provides a baseline
for researchers to understand, use, compare, and develop subset selection
methods in the EMO field.
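The framework described in the abstract can be summarized in a short sketch: every non-dominated objective vector found during the run is kept in an unbounded archive, and a small subset is chosen from that archive as the final output. The code below is a minimal illustration of that idea, not the authors' benchmark suite; the function names, the use of NumPy, and the distance-based greedy rule (a simple stand-in for the hypervolume-, IGD-, and IGD+-based methods compared in the paper) are assumptions made for illustration only.

```python
# Minimal sketch of the archive-based framework (assumption: minimization).
# NOT the paper's benchmark code; selection rule is an illustrative stand-in.
import numpy as np

def nondominated(points):
    """Return the non-dominated subset of a set of objective vectors (minimization)."""
    pts = np.asarray(points, dtype=float)
    keep = np.ones(len(pts), dtype=bool)
    for i in range(len(pts)):
        others = np.delete(pts, i, axis=0)
        # discard a point if some other point is no worse in every objective
        # and strictly better in at least one
        keep[i] = not np.any(np.all(others <= pts[i], axis=1) &
                             np.any(others < pts[i], axis=1))
    return pts[keep]

def distance_based_subset(archive, k):
    """Greedy max-min-distance selection of k points from a large archive."""
    pts = np.asarray(archive, dtype=float)
    k = min(k, len(pts))
    chosen = [int(np.argmin(pts[:, 0]))]              # start from an extreme point
    dist = np.linalg.norm(pts - pts[chosen[0]], axis=1)
    while len(chosen) < k:
        nxt = int(np.argmax(dist))                    # farthest from the current subset
        chosen.append(nxt)
        dist = np.minimum(dist, np.linalg.norm(pts - pts[nxt], axis=1))
    return pts[chosen]

# hypothetical usage: output = distance_based_subset(nondominated(all_evaluated_points), k=100)
```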
Related papers
- An incremental preference elicitation-based approach to learning potentially non-monotonic preferences in multi-criteria sorting [53.36437745983783]
We first construct a max-margin optimization-based model to represent potentially non-monotonic preferences.
We devise information amount measurement methods and question selection strategies to pinpoint the most informative alternative in each iteration.
Two incremental preference elicitation-based algorithms are developed to learn potentially non-monotonic preferences.
arXiv Detail & Related papers (2024-09-04T14:36:20Z)
- Training Greedy Policy for Proposal Batch Selection in Expensive Multi-Objective Combinatorial Optimization [52.80408805368928]
We introduce a novel greedy-style subset selection algorithm for batch acquisition.
Our experiments on the red fluorescent proteins show that our proposed method matches the baseline performance with 1.69x fewer queries.
arXiv Detail & Related papers (2024-06-21T05:57:08Z)
- Large Language Models Are Not Robust Multiple Choice Selectors [117.72712117510953]
Multiple choice questions (MCQs) serve as a common yet important task format in the evaluation of large language models (LLMs).
This work shows that modern LLMs are vulnerable to option position changes due to their inherent "selection bias".
We propose a label-free, inference-time debiasing method, called PriDe, which separates the model's prior bias for option IDs from the overall prediction distribution.
arXiv Detail & Related papers (2023-09-07T17:44:56Z)
- Domain Generalization via Rationale Invariance [70.32415695574555]
This paper offers a new perspective to ease the challenge of domain generalization, which involves maintaining robust results even in unseen environments.
We propose treating the element-wise contributions to the final results as the rationale for making a decision and representing the rationale for each sample as a matrix.
Our experiments demonstrate that the proposed approach achieves competitive results across various datasets, despite its simplicity.
arXiv Detail & Related papers (2023-08-22T03:31:40Z)
- Optimal Data Selection: An Online Distributed View [61.31708750038692]
We develop algorithms for the online and distributed version of the problem.
In learning tasks on ImageNet and MNIST, we show that our selection methods outperform random selection by 5-20%.
arXiv Detail & Related papers (2022-01-25T18:56:16Z)
- Clustering-Based Subset Selection in Evolutionary Multiobjective Optimization [11.110675371854988]
Subset selection is an important component in evolutionary multiobjective optimization (EMO) algorithms.
Clustering-based methods have not been evaluated in the context of subset selection from solution sets obtained by EMO algorithms.
arXiv Detail & Related papers (2021-08-19T02:56:41Z)
- Fast Greedy Subset Selection from Large Candidate Solution Sets in Evolutionary Multi-objective Optimization [11.110675371854988]
We discuss the efficiency of greedy subset selection for the hypervolume, IGD, and IGD+ indicators.
Our idea is to use the submodular property, which is known for the hypervolume indicator, to improve their efficiency (a minimal sketch of this idea is given after this list).
arXiv Detail & Related papers (2021-02-01T16:14:15Z)
- Evolutionary Multi-Objective Optimization Algorithm Framework with Three Solution Sets [7.745468825770201]
It is assumed that a final solution is selected by a decision maker from a non-dominated solution set obtained by an EMO algorithm.
In this paper, we suggest the use of a general EMO framework with three solution sets to handle various situations.
arXiv Detail & Related papers (2020-12-14T08:04:07Z)
- Algorithm Configurations of MOEA/D with an Unbounded External Archive [7.745468825770201]
We show that the performance of MOEA/D is improved by linearly changing the reference point specification during its execution.
We also examine the use of a genetic algorithm-based offline hyper-heuristic method to find the best configuration of MOEA/D in each framework.
arXiv Detail & Related papers (2020-07-27T08:14:37Z)
- Solution Subset Selection for Final Decision Making in Evolutionary Multi-Objective Optimization [7.745468825770201]
We discuss subset selection from the viewpoint of final decision making.
We show that the formulated function is the same as the IGD+ indicator.
arXiv Detail & Related papers (2020-06-15T06:26:58Z)
- Extreme Algorithm Selection With Dyadic Feature Representation [78.13985819417974]
We propose the setting of extreme algorithm selection (XAS) where we consider fixed sets of thousands of candidate algorithms.
We assess the applicability of state-of-the-art AS techniques to the XAS setting and propose approaches leveraging a dyadic feature representation.
arXiv Detail & Related papers (2020-01-29T09:40:58Z)
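As a companion to the "Fast Greedy Subset Selection from Large Candidate Solution Sets" entry above, the sketch below illustrates how the submodular property of the hypervolume indicator enables a lazy greedy selection: a marginal gain computed in an earlier iteration is an upper bound on the current one, so most candidates never need to be re-evaluated. It is restricted to the two-objective minimization case with a user-chosen reference point; the function names, the heap-based bookkeeping, and the simple O(n log n) hypervolume routine are assumptions made for illustration, not the authors' implementation.

```python
# Lazy greedy hypervolume subset selection, 2-objective minimization only.
# Assumptions (not from the papers above): the archive holds mutually
# non-dominated points, and ref is dominated by all of them.
import heapq

def hv_2d(points, ref):
    """Hypervolume of mutually non-dominated 2-D points w.r.t. reference point ref."""
    if not points:
        return 0.0
    hv, prev_f2 = 0.0, ref[1]
    for f1, f2 in sorted(points):                     # ascending f1 -> non-increasing f2
        hv += (ref[0] - f1) * (prev_f2 - f2)
        prev_f2 = f2
    return hv

def lazy_greedy_hv_subset(archive, k, ref):
    """Greedily pick k points maximizing hypervolume, re-evaluating gains lazily."""
    archive = [tuple(map(float, p)) for p in archive]
    selected, hv_sel = [], 0.0
    # max-heap entries: (negative gain, index, iteration in which the gain was computed)
    heap = [(-hv_2d([p], ref), i, 0) for i, p in enumerate(archive)]
    heapq.heapify(heap)
    for it in range(1, min(k, len(archive)) + 1):
        while True:
            neg_gain, i, stamp = heapq.heappop(heap)
            if stamp == it:                           # gain is exact for the current subset;
                selected.append(archive[i])           # by submodularity it is also the best
                hv_sel = hv_2d(selected, ref)
                break
            # stale upper bound: recompute the true marginal gain and push back
            gain = hv_2d(selected + [archive[i]], ref) - hv_sel
            heapq.heappush(heap, (-gain, i, it))
    return selected
```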