MCS-HMS: A Multi-Cluster Selection Strategy for the Human Mental Search
Algorithm
- URL: http://arxiv.org/abs/2111.10676v1
- Date: Sat, 20 Nov 2021 20:49:52 GMT
- Title: MCS-HMS: A Multi-Cluster Selection Strategy for the Human Mental Search
Algorithm
- Authors: Ehsan Bojnordi, Seyed Jalaleddin Mousavirad, Gerald Schaefer, Iakov
Korovin
- Abstract summary: Population-based metaheuristic algorithms have received significant attention in global optimisation.
Human Mental Search (HMS) is a relatively recent population-based metaheuristic that has been shown to work well in comparison to other algorithms.
We propose an improvement to HMS in which the best bids from multiple clusters are used to benefit from enhanced exploration.
- Score: 2.3127833924896173
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Population-based metaheuristic algorithms have received significant attention
in global optimisation. Human Mental Search (HMS) is a relatively recent
population-based metaheuristic that has been shown to work well in comparison
to other algorithms. However, HMS is time-consuming and suffers from relatively
poor exploration. Having clustered the candidate solutions, HMS selects a
winner cluster with the best mean objective function. This is not necessarily
the best criterion to choose the winner group and limits the exploration
ability of the algorithm. In this paper, we propose an improvement to the HMS
algorithm in which the best bids from multiple clusters are used to benefit
from enhanced exploration. We also use a one-step k-means algorithm in the
clustering phase to improve the speed of the algorithm. Our experimental
results show that MCS-HMS outperforms HMS as well as other population-based
metaheuristic algorithms
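The two ingredients of the abstract — a one-step k-means pass over the candidate solutions, followed by taking the best bid from each of several best-ranked clusters instead of a single winner cluster — can be sketched as below. This is a minimal illustrative sketch, not the paper's implementation: the function names, the random centre initialisation, and ranking clusters by their best member (rather than the mean criticised in the abstract) are assumptions.

```python
import random

def one_step_kmeans(points, k, seed=0):
    """One assignment pass of k-means: choose k random centres and
    assign every candidate to its nearest centre (no refinement loop),
    which is what makes the clustering phase cheap."""
    rng = random.Random(seed)
    centres = rng.sample(points, k)
    dist2 = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b))
    clusters = [[] for _ in range(k)]
    for p in points:
        clusters[min(range(k), key=lambda i: dist2(p, centres[i]))].append(p)
    return clusters

def multi_cluster_best_bids(clusters, f, m):
    """Rank non-empty clusters by their best (lowest-f) member and return
    the best bid from each of the top m clusters, so that more than one
    region of the search space contributes a leader."""
    ranked = sorted((c for c in clusters if c), key=lambda c: min(f(p) for p in c))
    return [min(c, key=f) for c in ranked[:m]]

# Usage on a toy sphere (minimisation) objective:
sphere = lambda p: sum(x * x for x in p)
population = [(float(i), float(10 - i)) for i in range(11)]
clusters = one_step_kmeans(population, k=4, seed=1)
bids = multi_cluster_best_bids(clusters, sphere, m=2)
```

With m = 1 this degenerates to a single winner cluster as in plain HMS; m > 1 is the multi-cluster selection that the abstract credits with improved exploration.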
Related papers
- A Modular Spatial Clustering Algorithm with Noise Specification [0.0]
Bacteria-Farm algorithm is inspired by the growth of bacteria in closed experimental farms.
In contrast with other clustering algorithms, our algorithm also has a provision to specify the amount of noise to be excluded during clustering.
arXiv Detail & Related papers (2023-09-18T18:05:06Z)
- A Hybrid Chimp Optimization Algorithm and Generalized Normal Distribution Algorithm with Opposition-Based Learning Strategy for Solving Data Clustering Problems [0.0]
This paper is concerned with data clustering to separate clusters based on the connectivity principle for categorizing similar and dissimilar data into different groups.
Successful meta-heuristic optimization algorithms and intelligence-based methods have been introduced to attain the optimal solution in a reasonable time.
arXiv Detail & Related papers (2023-02-16T23:29:01Z)
- HARRIS: Hybrid Ranking and Regression Forests for Algorithm Selection [75.84584400866254]
We propose a new algorithm selector leveraging special forests, combining the strengths of both approaches while alleviating their weaknesses.
HARRIS' decisions are based on a forest model whose trees are built by optimising a hybrid ranking and regression loss function.
arXiv Detail & Related papers (2022-10-31T14:06:11Z)
- An enhanced method of initial cluster center selection for K-means algorithm [0.0]
We propose a novel approach to improving initial cluster selection for the K-means algorithm.
The convex hull algorithm facilitates computing the first two centroids, while the remaining ones are selected according to their distance from the previously chosen centers.
We obtained clustering errors of only 7.33%, 7.90%, and 0% on the Iris, Letter, and Ruspini data sets, respectively.
arXiv Detail & Related papers (2022-10-18T00:58:50Z)
- Adaptive Group Collaborative Artificial Bee Colony Algorithm [12.843155301033512]
The artificial bee colony (ABC) algorithm has been shown to be competitive.
However, it is poor at balancing global searching in the whole solution space (exploration) against quick searching in a local region of the solution space (exploitation).
To improve the performance of ABC, an adaptive group collaborative ABC (AgABC) algorithm is introduced.
arXiv Detail & Related papers (2021-12-02T13:33:37Z)
- HMS-OS: Improving the Human Mental Search Optimisation Algorithm by Grouping in both Search and Objective Space [10.61900641909722]
We propose a novel HMS algorithm, HMS-OS, based on clustering in both objective and search space.
For further improvement, HMS-OS benefits from an adaptive selection of the number of mental processes in the mental search operator.
arXiv Detail & Related papers (2021-11-19T12:56:33Z)
- Machine Learning for Online Algorithm Selection under Censored Feedback [71.6879432974126]
In online algorithm selection (OAS), instances of an algorithmic problem class are presented to an agent one after another, and the agent has to quickly select a presumably best algorithm from a fixed set of candidate algorithms.
For decision problems such as satisfiability (SAT), quality typically refers to the algorithm's runtime.
In this work, we revisit multi-armed bandit algorithms for OAS and discuss their capability of dealing with the problem.
We adapt them towards runtime-oriented losses, allowing for partially censored data while keeping a space- and time-complexity independent of the time horizon.
arXiv Detail & Related papers (2021-09-13T18:10:52Z)
- Differentially Private Clustering: Tight Approximation Ratios [57.89473217052714]
We give efficient differentially private algorithms for basic clustering problems.
Our results imply an improved algorithm for the Sample and Aggregate privacy framework.
One of the tools used in our 1-Cluster algorithm can be employed to get a faster quantum algorithm for ClosestPair in a moderate number of dimensions.
arXiv Detail & Related papers (2020-08-18T16:22:06Z)
- Run2Survive: A Decision-theoretic Approach to Algorithm Selection based on Survival Analysis [75.64261155172856]
Survival analysis (SA) naturally supports censored data and offers appropriate ways to use such data for learning distributional models of algorithm runtime.
We leverage such models as a basis of a sophisticated decision-theoretic approach to algorithm selection, which we dub Run2Survive.
In an extensive experimental study with the standard benchmark ASlib, our approach is shown to be highly competitive and in many cases even superior to state-of-the-art AS approaches.
arXiv Detail & Related papers (2020-07-06T15:20:17Z)
- Extreme Algorithm Selection With Dyadic Feature Representation [78.13985819417974]
We propose the setting of extreme algorithm selection (XAS) where we consider fixed sets of thousands of candidate algorithms.
We assess the applicability of state-of-the-art AS techniques to the XAS setting and propose approaches leveraging a dyadic feature representation.
arXiv Detail & Related papers (2020-01-29T09:40:58Z)
- Optimal Clustering from Noisy Binary Feedback [75.17453757892152]
We study the problem of clustering a set of items from binary user feedback.
We devise an algorithm with a minimal cluster recovery error rate.
For adaptive selection, we develop an algorithm inspired by the derivation of the information-theoretical error lower bounds.
arXiv Detail & Related papers (2019-10-14T09:18:26Z)
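The initial-centre selection described in the K-means entry above (a convex hull yields the first two centroids; the rest are chosen by distance from the already-selected centres) can be sketched roughly as follows. This is a hedged reconstruction from the abstract only, not the paper's code: taking the farthest-apart pair of hull vertices as the first two centres, and a farthest-point rule for the remainder, are assumptions.

```python
import itertools

def dist2(a, b):
    """Squared Euclidean distance between two 2-D points."""
    return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2

def convex_hull(points):
    """Andrew's monotone chain; returns hull vertices in CCW order."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def initial_centers(points, k):
    """First two centres: the farthest-apart pair of convex-hull vertices
    (an assumed reading of the abstract). Remaining centres: the point
    farthest from all centres chosen so far (farthest-point heuristic)."""
    hull = convex_hull(points)
    c1, c2 = max(itertools.combinations(hull, 2), key=lambda ab: dist2(*ab))
    centers = [c1, c2]
    while len(centers) < k:
        centers.append(max(points, key=lambda p: min(dist2(p, c) for c in centers)))
    return centers

# Usage: a square with two interior points; hull vertices are the corners.
pts = [(0.0, 0.0), (4.0, 0.0), (4.0, 4.0), (0.0, 4.0), (2.0, 2.0), (1.0, 3.0)]
centers = initial_centers(pts, 3)
```

Spreading the first centres across the hull's diameter keeps the seeds far apart, which is the usual motivation for such initialisation schemes over random seeding.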
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information shown and is not responsible for any consequences of its use.