Speeding up Local Search for the Indicator-based Subset Selection Problem by a Candidate List Strategy
- URL: http://arxiv.org/abs/2503.04224v1
- Date: Thu, 06 Mar 2025 09:06:49 GMT
- Title: Speeding up Local Search for the Indicator-based Subset Selection Problem by a Candidate List Strategy
- Authors: Keisuke Korogi, Ryoji Tanabe
- Abstract summary: In evolutionary multi-objective optimization, the indicator-based subset selection problem involves finding a subset of points that maximizes a given quality indicator. Local search is an effective approach for obtaining a high-quality subset in this problem. This paper proposes a candidate list strategy for local search in the indicator-based subset selection problem.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In evolutionary multi-objective optimization, the indicator-based subset selection problem involves finding a subset of points that maximizes a given quality indicator. Local search is an effective approach for obtaining a high-quality subset in this problem. However, local search incurs a high computational cost, especially as the size of the point set and the number of objectives increase. To address this issue, this paper proposes a candidate list strategy for local search in the indicator-based subset selection problem. In the proposed strategy, each point in a given point set has a candidate list. During search, each point is only eligible to swap with unselected points in its associated candidate list. This restriction drastically reduces the number of swaps at each iteration of local search. We consider two types of candidate lists: nearest neighbor and random neighbor lists. This paper investigates the effectiveness of the proposed candidate list strategy on various Pareto fronts. The results show that the proposed strategy with the nearest neighbor list can significantly speed up local search on continuous Pareto fronts without significantly compromising the subset quality. The results also show that the sequential use of the two lists can address the discontinuity of Pareto fronts.
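The strategy described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's exact algorithm: the 2-D hypervolume indicator, the first-improvement acceptance rule, and all function names are assumptions made for the sketch.

```python
import random


def hypervolume_2d(points, ref=(1.1, 1.1)):
    """Hypervolume (minimization convention) of 2-D points w.r.t. a reference point."""
    hv, prev_f2 = 0.0, ref[1]
    for f1, f2 in sorted(points):
        if f2 < prev_f2:  # points not improving f2 add no area in this sweep
            hv += (ref[0] - f1) * (prev_f2 - f2)
            prev_f2 = f2
    return hv


def nearest_neighbor_lists(points, k):
    """For each point, the indices of its k nearest neighbors in objective space."""
    n = len(points)
    lists = []
    for i in range(n):
        order = sorted(
            (j for j in range(n) if j != i),
            key=lambda j: sum((a - b) ** 2 for a, b in zip(points[i], points[j])),
        )
        lists.append(order[:k])
    return lists


def local_search_with_candidate_lists(points, m, k, seed=0):
    """First-improvement local search for subset selection. Each selected point
    may only be swapped with UNSELECTED points in its own candidate list, which
    shrinks the neighborhood examined at every iteration."""
    rng = random.Random(seed)
    cand = nearest_neighbor_lists(points, k)
    selected = set(rng.sample(range(len(points)), m))
    best = hypervolume_2d([points[i] for i in selected])
    improved = True
    while improved:
        improved = False
        for i in list(selected):
            if i not in selected:  # i was swapped out earlier in this pass
                continue
            for j in cand[i]:
                if j in selected:
                    continue
                trial = (selected - {i}) | {j}
                hv = hypervolume_2d([points[t] for t in trial])
                if hv > best:  # first-improvement acceptance
                    selected, best, improved = trial, hv, True
                    break
    return selected, best
```

Using a random neighbor list instead only changes `nearest_neighbor_lists` (sample `k` indices uniformly); the search loop is identical, which is what makes the sequential use of the two lists straightforward.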
Related papers
- Training Greedy Policy for Proposal Batch Selection in Expensive Multi-Objective Combinatorial Optimization [52.80408805368928]
We introduce a novel greedy-style subset selection algorithm for batch acquisition.
Our experiments on red fluorescent proteins show that the proposed method matches baseline performance with 1.69x fewer queries.
arXiv Detail & Related papers (2024-06-21T05:57:08Z)
- Optimistic Query Routing in Clustering-based Approximate Maximum Inner Product Search [9.01394829787271]
This work bridges the gap by studying routing in clustering-based maximum inner product search. We present a framework that incorporates the moments of the distribution of inner products within each shard to estimate the maximum inner product.
arXiv Detail & Related papers (2024-05-20T17:47:18Z)
- Frequent Itemset-driven Search for Finding Minimum Node Separators in Complex Networks [61.2383572324176]
We propose a frequent itemset-driven search approach, which integrates the concept of frequent itemset mining in data mining into the well-known memetic search framework.
It iteratively employs the frequent itemset recombination operator to generate promising offspring solutions based on itemsets that frequently occur in high-quality solutions.
In particular, it discovers 29 new upper bounds and matches 18 previous best-known bounds.
arXiv Detail & Related papers (2022-01-18T11:16:40Z)
- Adaptive Sampling for Heterogeneous Rank Aggregation from Noisy Pairwise Comparisons [85.5955376526419]
In rank aggregation problems, users exhibit various accuracy levels when comparing pairs of items.
We propose an elimination-based active sampling strategy, which estimates the ranking of items via noisy pairwise comparisons.
We prove that our algorithm can return the true ranking of items with high probability.
arXiv Detail & Related papers (2021-10-08T13:51:55Z)
- Determinantal Beam Search [75.84501052642361]
Beam search is a go-to strategy for decoding neural sequence models.
In use-cases that call for multiple solutions, a diverse or representative set is often desired.
By posing iterations in beam search as a series of subdeterminant problems, we can turn the algorithm into a diverse subset selection process.
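As a loose illustration of the subdeterminant idea, here is a generic greedy, DPP-style diverse subset selection over an RBF similarity kernel. This is not the paper's beam-search formulation; the kernel choice, the greedy rule, and all names are assumptions made for the sketch.

```python
import math


def det(m):
    """Determinant via Laplace expansion (adequate for the small matrices here)."""
    if len(m) == 1:
        return m[0][0]
    return sum(
        (-1) ** j * m[0][j] * det([row[:j] + row[j + 1:] for row in m[1:]])
        for j in range(len(m))
    )


def rbf_kernel(points, gamma=1.0):
    """Similarity matrix: close items have entries near 1, distant items near 0."""
    n = len(points)
    return [
        [math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(points[i], points[j])))
         for j in range(n)]
        for i in range(n)
    ]


def greedy_dpp_select(points, k, gamma=1.0):
    """Greedily add the item that maximizes the determinant of the kernel
    submatrix of the chosen set; larger determinants favor diverse subsets."""
    K = rbf_kernel(points, gamma)
    chosen = []
    for _ in range(k):
        best_j, best_det = None, -1.0
        for j in range(len(points)):
            if j in chosen:
                continue
            sub = chosen + [j]
            d = det([[K[a][b] for b in sub] for a in sub])
            if d > best_det:
                best_j, best_det = j, d
        chosen.append(best_j)
    return chosen
```

Because similar items make the submatrix nearly singular, the determinant objective penalizes redundancy, which is the sense in which subset selection becomes "diverse".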
arXiv Detail & Related papers (2021-06-14T13:01:46Z)
- Extreme Algorithm Selection With Dyadic Feature Representation [78.13985819417974]
We propose the setting of extreme algorithm selection (XAS) where we consider fixed sets of thousands of candidate algorithms.
We assess the applicability of state-of-the-art AS techniques to the XAS setting and propose approaches leveraging a dyadic feature representation.
arXiv Detail & Related papers (2020-01-29T09:40:58Z)
- Optimal Clustering from Noisy Binary Feedback [75.17453757892152]
We study the problem of clustering a set of items from binary user feedback.
We devise an algorithm with a minimal cluster recovery error rate.
For adaptive selection, we develop an algorithm inspired by the derivation of the information-theoretical error lower bounds.
arXiv Detail & Related papers (2019-10-14T09:18:26Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.