A New K-means Grey Wolf Algorithm for Engineering Problems
- URL: http://arxiv.org/abs/2103.05760v1
- Date: Sat, 27 Feb 2021 04:29:07 GMT
- Title: A New K-means Grey Wolf Algorithm for Engineering Problems
- Authors: Hardi M. Mohammed, Zrar Kh. Abdul, Tarik A. Rashid, Abeer Alsadoon,
Nebojsa Bacanin
- Abstract summary: The main purpose of this paper is to overcome GWO's tendency to become trapped in local optima.
The proposed algorithm is called K-means clustering Grey Wolf Optimization (KMGWO).
- Score: 5.373967658884675
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Purpose: Researchers have increasingly developed metaheuristic
algorithms and use them extensively in business, science, and engineering. One
of the most common metaheuristic optimization algorithms is Grey Wolf
Optimization (GWO), which works by imitating how grey wolves search for and
attack prey. The main purpose of this paper is to overcome GWO's tendency to
become trapped in local optima.
Design or Methodology or Approach: In this paper, the K-means clustering
algorithm is used to enhance the performance of the original Grey Wolf
Optimization by dividing the population into different parts. The proposed
algorithm is called K-means clustering Grey Wolf Optimization (KMGWO).
Findings: Results show that KMGWO is more efficient than GWO. To evaluate its
performance, KMGWO is applied to the 10 CEC2019 benchmark test functions, and
the results confirm that it outperforms GWO. KMGWO is also compared to Cat
Swarm Optimization (CSO), the hybrid Whale Optimization Algorithm-Bat
Algorithm (WOA-BAT), and WOA, and it achieves the first rank in terms of
performance. Statistical tests confirm that KMGWO is significantly better than
the compared algorithms. KMGWO is also applied to a pressure vessel design
problem, where it again delivers superior results.
Originality/value: Results show that KMGWO is superior to GWO. Compared
against cat swarm optimization (CSO), the whale optimization algorithm-bat
algorithm (WOA-BAT), WOA, and GWO, KMGWO achieves the first rank in terms of
performance. KMGWO is also superior when used to solve a classical
engineering design problem.
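
The abstract names the ingredients of KMGWO (standard GWO position updates
plus a K-means partition of the population) but not the exact cluster-handling
strategy, so the following Python sketch is only a plausible reading: at each
iteration, K-means splits the wolves into clusters and the usual
alpha/beta/delta-driven GWO update runs inside each cluster. The function
names, the skip rule for clusters with fewer than three wolves, and the
scikit-learn dependency are illustrative assumptions, not details taken from
the paper.

```python
import numpy as np
from sklearn.cluster import KMeans


def gwo_step(wolves, fitness, a):
    """One standard GWO update: every wolf moves toward the average of the
    positions suggested by the three best wolves (alpha, beta, delta)."""
    order = np.argsort(fitness)
    leaders = wolves[order[:3]]  # alpha, beta, delta
    new_positions = np.empty_like(wolves)
    for i, x in enumerate(wolves):
        candidates = []
        for leader in leaders:
            r1, r2 = np.random.rand(x.size), np.random.rand(x.size)
            A = 2.0 * a * r1 - a        # exploration/exploitation coefficient
            C = 2.0 * r2                # emphasis on the leader's position
            D = np.abs(C * leader - x)  # distance to the leader
            candidates.append(leader - A * D)
        new_positions[i] = np.mean(candidates, axis=0)
    return new_positions


def kmgwo(objective, dim, bounds, n_wolves=30, n_clusters=3, max_iter=200):
    """KMGWO sketch: cluster the population with K-means, then run the GWO
    update inside each cluster so subgroups search semi-independently."""
    lo, hi = bounds
    wolves = np.random.uniform(lo, hi, size=(n_wolves, dim))
    for t in range(max_iter):
        a = 2.0 - 2.0 * t / max_iter  # 'a' decreases linearly from 2 to 0
        labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(wolves)
        for k in range(n_clusters):
            idx = np.where(labels == k)[0]
            if idx.size < 3:  # a cluster needs its own alpha, beta, delta
                continue
            fit = np.array([objective(w) for w in wolves[idx]])
            wolves[idx] = np.clip(gwo_step(wolves[idx], fit, a), lo, hi)
    # Return the best wolf of the final population (a full implementation
    # would track the best solution seen over all iterations).
    best = min(wolves, key=objective)
    return best, objective(best)
```

A toy run would be, e.g., kmgwo(lambda x: float(np.sum(x**2)), dim=10,
bounds=(-100.0, 100.0)). A sketch of the pressure vessel design objective,
which both this paper and the WOAGWO paper below use as an engineering test
case, appears after the related-papers list.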
Related papers
- A Novel Hybrid Grey Wolf Differential Evolution Algorithm [1.2842469556228848]
We introduce a new algorithm, GWO-DE, based on the hybridization of GWO with two differential evolution (DE) variants. We evaluate the new algorithm on various numerical benchmark functions.
arXiv Detail & Related papers (2025-07-02T17:56:02Z) - Provably Faster Algorithms for Bilevel Optimization via Without-Replacement Sampling [96.47086913559289]
Gradient-based algorithms are widely used in bilevel optimization.
We introduce a without-replacement sampling-based algorithm that achieves a faster convergence rate.
We validate our algorithms on both synthetic and real-world applications.
arXiv Detail & Related papers (2024-11-07T17:05:31Z) - An Enhanced Grey Wolf Optimizer with Elite Inheritance and Balance Search Mechanisms [13.754649370512924]
The Grey Wolf Optimizer (GWO) is recognized as a novel meta-heuristic algorithm inspired by the social leadership hierarchy and hunting mechanism of grey wolves.
The original GWO has two significant flaws in its fundamental optimization mechanisms.
An enhanced Grey Wolf Optimizer with an Elite Inheritance Mechanism and a Balance Search Mechanism is proposed.
arXiv Detail & Related papers (2024-04-09T03:28:00Z) - GOOSE Algorithm: A Powerful Optimization Tool for Real-World Engineering Challenges and Beyond [4.939986309170004]
The GOOSE algorithm is benchmarked on 19 well-known test functions.
The proposed algorithm is tested on 10 modern benchmark functions.
The findings attest to the proposed algorithm's superior performance.
arXiv Detail & Related papers (2023-07-19T19:14:25Z) - Improved Algorithms for Neural Active Learning [74.89097665112621]
We improve the theoretical and empirical performance of neural-network (NN)-based active learning algorithms for the non-parametric streaming setting.
We introduce two regret metrics, defined by minimizing the population loss, that are more suitable for active learning than the one used in state-of-the-art (SOTA) related work.
arXiv Detail & Related papers (2022-10-02T05:03:38Z) - Regret Bounds for Expected Improvement Algorithms in Gaussian Process
Bandit Optimization [63.8557841188626]
The expected improvement (EI) algorithm is one of the most popular strategies for optimization under uncertainty.
We propose a variant of EI with a standard incumbent defined via the GP predictive mean.
We show that our algorithm converges, and achieves a cumulative regret bound of $\mathcal{O}(\gamma_T \sqrt{T})$.
arXiv Detail & Related papers (2022-03-15T13:17:53Z) - Duck swarm algorithm: theory, numerical optimization, and applications [6.244015536594532]
A swarm intelligence-based optimization algorithm, named Duck Swarm Algorithm (DSA), is proposed in this study.
Two rules are modeled from the food-finding and foraging behaviour of ducks, corresponding to the exploration and exploitation phases of the proposed DSA.
Results show that DSA is a high-performance optimization method in terms of convergence speed and exploration-exploitation balance.
arXiv Detail & Related papers (2021-12-27T04:53:36Z) - An Accelerated Variance-Reduced Conditional Gradient Sliding Algorithm
for First-order and Zeroth-order Optimization [111.24899593052851]
Conditional gradient algorithm (also known as the Frank-Wolfe algorithm) has recently regained popularity in the machine learning community.
ARCS is the first zeroth-order conditional gradient sliding-type algorithm for solving convex problems in zeroth-order optimization.
In first-order optimization, the convergence results of ARCS substantially outperform those of previous algorithms in terms of the number of gradient oracle queries.
arXiv Detail & Related papers (2021-09-18T07:08:11Z) - Provably Faster Algorithms for Bilevel Optimization [54.83583213812667]
Bilevel optimization has been widely applied in many important machine learning applications.
We propose two new algorithms for bilevel optimization.
We show that both algorithms achieve a complexity of $\mathcal{O}(\epsilon^{-1.5})$, which outperforms all existing algorithms by an order of magnitude.
arXiv Detail & Related papers (2021-06-08T21:05:30Z) - Double Coverage with Machine-Learned Advice [100.23487145400833]
We study the fundamental online $k$-server problem in a learning-augmented setting.
We show that our algorithm achieves, for any $k$, an almost optimal consistency-robustness tradeoff.
arXiv Detail & Related papers (2021-03-02T11:04:33Z) - The Archerfish Hunting Optimizer: a novel metaheuristic algorithm for
global optimization [0.8315801422499861]
Global optimization solves real-world problems numerically or analytically by minimizing their objective functions.
We propose a global metaheuristic algorithm, the Archerfish Hunting Optimizer (AHO), based on the hunting behaviour of the archerfish.
arXiv Detail & Related papers (2021-02-03T16:22:31Z) - A Novel Hybrid GWO with WOA for Global Numerical Optimization and
Solving Pressure Vessel Design [1.1802674324027231]
Grey Wolf Optimization (GWO) is a very competitive algorithm compared to other common metaheuristic algorithms.
In this paper, a hybrid of WOA and GWO, called WOAGWO, is presented.
The proposed WOAGWO is also evaluated against the original WOA, GWO, and three other commonly used algorithms.
arXiv Detail & Related papers (2020-02-28T21:15:16Z)
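
Both the KMGWO paper above and this WOAGWO paper use the pressure vessel
design problem as their engineering test case. Neither summary states the
formulation, but the four-variable version below (shell thickness Ts, head
thickness Th, inner radius R, cylinder length L) is the standard one in the
metaheuristics literature; the static penalty weight is an illustrative
assumption that turns the constrained problem into a plain objective usable
with the KMGWO sketch above.

```python
import math


def pressure_vessel(x, penalty=1e6):
    """Classical pressure vessel design cost with a static penalty for
    violated constraints; x = (Ts, Th, R, L) in inches."""
    ts, th, r, l = x
    cost = (0.6224 * ts * r * l + 1.7781 * th * r**2
            + 3.1661 * ts**2 * l + 19.84 * ts**2 * r)
    # Constraints in g(x) <= 0 form: two thickness codes, a minimum-volume
    # requirement of 1,296,000 cubic inches, and a length limit of 240 in.
    g = [
        -ts + 0.0193 * r,
        -th + 0.00954 * r,
        -math.pi * r**2 * l - (4.0 / 3.0) * math.pi * r**3 + 1_296_000.0,
        l - 240.0,
    ]
    return cost + penalty * sum(max(0.0, gi) for gi in g)
```

With the sketch above, a run would look like kmgwo(pressure_vessel, dim=4,
bounds=(0.0625, 200.0)); published studies typically use tighter per-variable
bounds (thicknesses in multiples of 0.0625 in, radius and length roughly in
[10, 200]).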