Fast Greedy Subset Selection from Large Candidate Solution Sets in
Evolutionary Multi-objective Optimization
- URL: http://arxiv.org/abs/2102.00941v1
- Date: Mon, 1 Feb 2021 16:14:15 GMT
- Authors: Weiyu Chen, Hisao Ishibuchi, and Ke Shang
- Abstract summary: We discuss the efficiency of greedy subset selection for the hypervolume, IGD and IGD+ indicators.
Our idea is to use the submodular property, which is known for the hypervolume indicator, to improve their efficiency.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Subset selection is an interesting and important topic in the field of
evolutionary multi-objective optimization (EMO). In particular, in an EMO
algorithm with an unbounded external archive, subset selection is an essential
post-processing procedure to select a pre-specified number of solutions as the
final result. In this paper, we discuss the efficiency of greedy subset
selection for the hypervolume, IGD, and IGD+ indicators. Greedy algorithms
usually handle subset selection efficiently. However, when a large number of
solutions are given (e.g., subset selection from tens of thousands of solutions
in an unbounded external archive), they often become time-consuming. Our idea
is to use the submodular property, which is known for the hypervolume
indicator, to improve their efficiency. First, we prove that the IGD and IGD+
indicators are also submodular. Next, based on the submodular property, we
propose an efficient greedy inclusion algorithm for each indicator. Then, we
demonstrate through computational experiments that the proposed algorithms are
much faster than the standard greedy subset selection algorithms.
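The IGD and IGD+ indicators mentioned in the abstract have standard definitions: IGD averages, over a set of reference points, the Euclidean distance from each reference point to its nearest solution, while IGD+ counts only the objective components in which a solution is worse than the reference point (for minimization). A minimal NumPy sketch of both, with illustrative data (not from the paper):

```python
import numpy as np

def igd(solutions, reference_points):
    """Inverted generational distance: average Euclidean distance from each
    reference point to its nearest solution (smaller is better)."""
    # pairwise distances, shape (n_ref, n_sol)
    d = np.linalg.norm(reference_points[:, None, :] - solutions[None, :, :], axis=2)
    return d.min(axis=1).mean()

def igd_plus(solutions, reference_points):
    """IGD+ (for minimization): only objective values worse than the
    reference point contribute, so dominating solutions incur zero distance."""
    diff = np.maximum(solutions[None, :, :] - reference_points[:, None, :], 0.0)
    return np.linalg.norm(diff, axis=2).min(axis=1).mean()

ref = np.array([[1.0, 0.0], [0.0, 1.0]])  # illustrative reference points
sols = np.array([[0.0, 0.0]])             # one solution dominating both points
print(igd(sols, ref))       # Euclidean IGD is 1.0 here
print(igd_plus(sols, ref))  # IGD+ is 0.0: the solution dominates both points
```

The difference in the last two lines illustrates why IGD+ is often preferred as a Pareto-compliant variant: a solution that dominates a reference point is not penalized for being far from it.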
Related papers
- Training Greedy Policy for Proposal Batch Selection in Expensive Multi-Objective Combinatorial Optimization [52.80408805368928]
We introduce a novel greedy-style subset selection algorithm for batch acquisition.
Our experiments on red fluorescent proteins show that the proposed method matches baseline performance with 1.69x fewer queries.
arXiv Detail & Related papers (2024-06-21T05:57:08Z)
- Feature Selection as Deep Sequential Generative Learning [50.00973409680637]
We develop a deep variational transformer model over a joint of sequential reconstruction, variational, and performance evaluator losses.
Our model can distill feature selection knowledge and learn a continuous embedding space to map feature selection decision sequences into embedding vectors associated with utility scores.
arXiv Detail & Related papers (2024-03-06T16:31:56Z)
- On Distributed Larger-Than-Memory Subset Selection With Pairwise Submodular Functions [32.09166873008287]
Submodularity, a discrete analogue of convexity, is commonly used for solving subset selection problems.
In this paper, we propose a novel distributed bounding algorithm with provable approximation guarantees.
We show that these algorithms find high quality subsets on CIFAR-100 and ImageNet with marginal or no loss in quality compared to centralized methods.
arXiv Detail & Related papers (2024-02-26T09:38:39Z)
- Rank-Based Learning and Local Model Based Evolutionary Algorithm for High-Dimensional Expensive Multi-Objective Problems [1.0499611180329806]
The proposed algorithm consists of three parts: rank-based learning, hyper-volume-based non-dominated search, and local search in the relatively sparse objective space.
The experimental results of benchmark problems and a real-world application on geothermal reservoir heat extraction optimization demonstrate that the proposed algorithm shows superior performance.
arXiv Detail & Related papers (2023-04-19T06:25:04Z)
- Fast Feature Selection with Fairness Constraints [49.142308856826396]
We study the fundamental problem of selecting optimal features for model construction.
This problem is computationally challenging on large datasets, even with the use of greedy algorithm variants.
We extend the adaptive query model, recently proposed for the greedy forward selection for submodular functions, to the faster paradigm of Orthogonal Matching Pursuit for non-submodular functions.
The proposed algorithm achieves exponentially fast parallel run time in the adaptive query model, scaling much better than prior work.
arXiv Detail & Related papers (2022-02-28T12:26:47Z)
- Benchmarking Subset Selection from Large Candidate Solution Sets in Evolutionary Multi-objective Optimization [6.544757635738911]
In the evolutionary multi-objective optimization (EMO) field, the standard practice is to present the final population of an EMO algorithm as the output.
Recently, a new EMO framework has been proposed to address the limitations of this practice by storing all the non-dominated solutions generated during the evolution in an archive.
This paper proposes a benchmark test suite for subset selection from large candidate solution sets.
arXiv Detail & Related papers (2022-01-18T02:09:08Z)
- Joint Deep Reinforcement Learning and Unfolding: Beam Selection and Precoding for mmWave Multiuser MIMO with Lens Arrays [54.43962058166702]
Millimeter-wave (mmWave) multiuser multiple-input multiple-output (MU-MIMO) systems with discrete lens arrays (DLAs) have received great attention.
In this work, we investigate the joint design of a beam precoding matrix for mmWave MU-MIMO systems with DLA.
arXiv Detail & Related papers (2021-01-05T03:55:04Z)
- Lazy Greedy Hypervolume Subset Selection from Large Candidate Solution Sets [5.222705629027499]
We propose a new lazy greedy algorithm exploiting the submodular property of the hypervolume indicator.
Experimental results show that the proposed algorithm is hundreds of times faster than the original greedy inclusion algorithm.
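The lazy greedy idea summarized above relies on submodularity: as the selected subset grows, each candidate's marginal gain can only shrink, so a cached gain is a valid upper bound and most re-evaluations can be skipped. A generic sketch follows, using a toy weighted-coverage objective as a stand-in for the hypervolume contribution (the `cover` data and function names are illustrative, not from the paper):

```python
import heapq

def lazy_greedy(candidates, gain, k):
    """Greedy maximization of a submodular set function with lazy evaluation:
    stale cached gains are upper bounds, so a candidate whose cached gain was
    computed in the current iteration and still tops the heap is the true best."""
    selected, it = [], 0
    # max-heap entries: (negated cached gain, candidate index, iteration stamp)
    heap = [(-gain(c, selected), i, it) for i, c in enumerate(candidates)]
    heapq.heapify(heap)
    while heap and len(selected) < k:
        neg_g, i, stamp = heapq.heappop(heap)
        if stamp == it:
            # cached gain is current and dominates all other upper bounds
            selected.append(candidates[i])
            it += 1
        else:
            # refresh the stale gain and reinsert with the current stamp
            heapq.heappush(heap, (-gain(candidates[i], selected), i, it))
    return selected

# toy submodular objective: weighted coverage as a stand-in for hypervolume
cover = {"a": {1, 2, 3}, "b": {3, 4}, "c": {4, 5, 6, 7}, "d": {1}}

def coverage_gain(c, selected):
    covered = set().union(*(cover[s] for s in selected)) if selected else set()
    return len(cover[c] - covered)

print(lazy_greedy(list(cover), coverage_gain, 2))  # picks "c" (4 new), then "a" (3 new)
```

In this sketch only the candidates that reach the top of the heap are re-evaluated; with an expensive gain function such as the hypervolume contribution, this is where the reported speedups come from.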
arXiv Detail & Related papers (2020-07-04T09:19:45Z)
- Extreme Algorithm Selection With Dyadic Feature Representation [78.13985819417974]
We propose the setting of extreme algorithm selection (XAS) where we consider fixed sets of thousands of candidate algorithms.
We assess the applicability of state-of-the-art AS techniques to the XAS setting and propose approaches leveraging a dyadic feature representation.
arXiv Detail & Related papers (2020-01-29T09:40:58Z)
- Optimal Clustering from Noisy Binary Feedback [75.17453757892152]
We study the problem of clustering a set of items from binary user feedback.
We devise an algorithm with a minimal cluster recovery error rate.
For adaptive selection, we develop an algorithm inspired by the derivation of the information-theoretical error lower bounds.
arXiv Detail & Related papers (2019-10-14T09:18:26Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences.