The Archerfish Hunting Optimizer: a novel metaheuristic algorithm for
global optimization
- URL: http://arxiv.org/abs/2102.02134v1
- Date: Wed, 3 Feb 2021 16:22:31 GMT
- Title: The Archerfish Hunting Optimizer: a novel metaheuristic algorithm for
global optimization
- Authors: Farouq Zitouni, Saad Harous, Abdelghani Belkeram, Lokman Elhakim Baba
Hammou
- Abstract summary: Global optimization solves real-world problems numerically or analytically by minimizing their objective functions.
We propose a global metaheuristic algorithm based on archerfish hunting behavior, the Archerfish Hunting Optimizer (AHO).
- Score: 0.8315801422499861
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Global optimization solves real-world problems numerically or analytically by
minimizing their objective functions. Most of the analytical algorithms are
greedy and computationally intractable. Metaheuristics are nature-inspired
optimization algorithms. They numerically find a near-optimal solution for
optimization problems in a reasonable amount of time. We propose a novel
metaheuristic algorithm for global optimization. It is based on the shooting
and jumping behaviors of the archerfish for hunting aerial insects. We name it
the Archerfish Hunting Optimizer (AHO). We perform two sorts of comparisons to
validate the proposed algorithm's performance. First, AHO is compared to 12
recent metaheuristic algorithms (the algorithms accepted for the 2020
competition on single-objective bound-constrained numerical optimization) on
ten test functions of the benchmark CEC 2020 for unconstrained optimization.
Second, the performance of AHO and three recent metaheuristic algorithms is
evaluated using five engineering design problems taken from the benchmark CEC
2020 for non-convex constrained optimization. The experimental results are
evaluated using the Wilcoxon signed-rank and the Friedman tests. The
statistical indicators show that the Archerfish Hunting Optimizer achieves
highly competitive performance against well-established optimizers.
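To make the behavior described above concrete, here is a minimal illustrative sketch of a shooting/jumping metaheuristic in the spirit of AHO. The paper's exact update equations are not reproduced in this summary, so the population update, step scales, and parameters below (pop_size, p_shoot, the annealed move scale) are assumptions, demonstrated on the sphere function.

```python
# Illustrative shooting/jumping metaheuristic sketch, NOT the authors' exact
# AHO update rules: "shooting" exploits around the best-known solution,
# "jumping" explores new regions, and the move scale shrinks over time.
import numpy as np

def sphere(x):
    """Toy objective: f(x) = sum(x_i^2), minimized at the origin."""
    return np.sum(x ** 2)

def aho_like_search(f, dim=10, pop_size=30, iters=500,
                    lower=-100.0, upper=100.0, p_shoot=0.5, seed=0):
    rng = np.random.default_rng(seed)
    pop = rng.uniform(lower, upper, size=(pop_size, dim))
    fitness = np.apply_along_axis(f, 1, pop)
    best = pop[np.argmin(fitness)].copy()

    for t in range(iters):
        # Shrink the move scale over time: explore early, exploit late.
        scale = (upper - lower) * (1.0 - t / iters)
        for i in range(pop_size):
            if rng.random() < p_shoot:
                # "Shooting": aim a perturbed step toward the best solution.
                candidate = (pop[i] + rng.random(dim) * (best - pop[i])
                             + 0.1 * scale * rng.standard_normal(dim))
            else:
                # "Jumping": leap to a random point in a shrinking neighborhood.
                candidate = pop[i] + scale * rng.uniform(-1, 1, dim)
            candidate = np.clip(candidate, lower, upper)
            fc = f(candidate)
            if fc < fitness[i]:  # greedy replacement of the parent
                pop[i], fitness[i] = candidate, fc
                if fc < f(best):
                    best = candidate.copy()
    return best, f(best)

best, val = aho_like_search(sphere)
print(f"best objective found: {val:.3e}")
```

Annealing the move scale shifts the balance from exploration (jumping) to exploitation (shooting) over the run, which is the generic structure the abstract describes.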
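The evaluation protocol mentioned in the abstract (Wilcoxon signed-rank and Friedman tests over per-function results) can be run with scipy.stats. The scores below are fabricated placeholders for illustration, not the paper's data.

```python
# Statistical comparison sketch with assumed data: final objective values for
# three optimizers over ten benchmark functions (values are made up).
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
aho   = rng.uniform(0.0, 1.0, 10)        # hypothetical per-function scores for AHO
alg_b = aho + rng.uniform(0.0, 0.5, 10)  # hypothetical competitor B
alg_c = aho + rng.uniform(0.0, 0.8, 10)  # hypothetical competitor C

# Wilcoxon signed-rank test: pairwise comparison of AHO against one
# competitor across the same set of test functions.
w_stat, w_p = stats.wilcoxon(aho, alg_b)
print(f"Wilcoxon: statistic={w_stat:.2f}, p-value={w_p:.4f}")

# Friedman test: do the three algorithms' rankings differ across functions?
f_stat, f_p = stats.friedmanchisquare(aho, alg_b, alg_c)
print(f"Friedman: statistic={f_stat:.2f}, p-value={f_p:.4f}")
```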
Related papers
- BMR and BWR: Two simple metaphor-free optimization algorithms for solving real-life non-convex constrained and unconstrained problems [0.5755004576310334]
Two simple yet powerful optimization algorithms, named the Best-Mean-Random (BMR) and Best-Worst-Random (BWR) algorithms, are developed and presented in this paper.
arXiv Detail & Related papers (2024-07-15T18:11:47Z) - GOOSE Algorithm: A Powerful Optimization Tool for Real-World Engineering Challenges and Beyond [4.939986309170004]
The GOOSE algorithm is benchmarked on 19 well-known test functions.
The proposed algorithm is tested on 10 modern benchmark functions.
The findings attest to the proposed algorithm's superior performance.
arXiv Detail & Related papers (2023-07-19T19:14:25Z) - Improving Performance Insensitivity of Large-scale Multiobjective
Optimization via Monte Carlo Tree Search [7.34812867861951]
We propose an evolutionary algorithm for solving large-scale multiobjective optimization problems based on Monte Carlo tree search.
The proposed method samples the decision variables to construct new nodes on the Monte Carlo tree for optimization and evaluation.
It selects nodes with good evaluation for further search to reduce the performance sensitivity caused by large-scale decision variables.
arXiv Detail & Related papers (2023-04-08T17:15:49Z) - Duck swarm algorithm: theory, numerical optimization, and applications [6.244015536594532]
A swarm intelligence-based optimization algorithm, named Duck Swarm Algorithm (DSA), is proposed in this study.
Two rules are modeled from the ducks' food finding and foraging behaviors, corresponding to the exploration and exploitation phases of the proposed DSA.
Results show that DSA is a high-performance optimization method in terms of convergence speed and exploration-exploitation balance.
arXiv Detail & Related papers (2021-12-27T04:53:36Z) - An Accelerated Variance-Reduced Conditional Gradient Sliding Algorithm
for First-order and Zeroth-order Optimization [111.24899593052851]
The conditional gradient algorithm (also known as the Frank-Wolfe algorithm) has recently regained popularity in the machine learning community (a minimal Frank-Wolfe sketch appears after this list).
ARCS is the first zeroth-order conditional gradient sliding type algorithm for solving convex problems in zeroth-order optimization.
In first-order optimization, the convergence results of ARCS substantially outperform those of previous algorithms in terms of the number of gradient oracle queries.
arXiv Detail & Related papers (2021-09-18T07:08:11Z) - Provably Faster Algorithms for Bilevel Optimization [54.83583213812667]
Bilevel optimization has been widely applied in many important machine learning applications.
We propose two new algorithms for bilevel optimization.
We show that both algorithms achieve a complexity of $\mathcal{O}(\epsilon^{-1.5})$, which outperforms all existing algorithms by an order of magnitude.
arXiv Detail & Related papers (2021-06-08T21:05:30Z) - Dynamic Cat Swarm Optimization Algorithm for Backboard Wiring Problem [0.9990687944474739]
This paper presents a powerful swarm intelligence meta-heuristic optimization algorithm called Dynamic Cat Swarm Optimization.
The proposed algorithm suggests a new method to provide a proper balance between the exploration and exploitation phases by modifying the selection scheme and the seeking mode of the algorithm.
Optimization results show the effectiveness of the proposed algorithm, which ranks first compared to several well-known algorithms available in the literature.
arXiv Detail & Related papers (2021-04-27T19:41:27Z) - Recent Theoretical Advances in Non-Convex Optimization [56.88981258425256]
Motivated by recent increased interest in the analysis of optimization algorithms for non-convex optimization in deep networks and other problems in data analysis, we give an overview of recent theoretical results on optimization algorithms for non-convex optimization.
arXiv Detail & Related papers (2020-12-11T08:28:51Z) - Global Optimization of Gaussian processes [52.77024349608834]
We propose a reduced-space formulation with Gaussian processes trained on few data points.
The approach also leads to significantly smaller and computationally cheaper subproblems for lower bounding.
In total, the proposed method reduces convergence time by orders of magnitude.
arXiv Detail & Related papers (2020-05-21T20:59:11Z) - Private Stochastic Convex Optimization: Optimal Rates in Linear Time [74.47681868973598]
We study the problem of minimizing the population loss given i.i.d. samples from a distribution over convex loss functions.
A recent work of Bassily et al. has established the optimal bound on the excess population loss achievable given $n$ samples.
We describe two new techniques for deriving convex optimization algorithms, both achieving the optimal bound on excess loss and using $O(\min\{n, n^2/d\})$ gradient computations.
arXiv Detail & Related papers (2020-05-10T19:52:03Z) - Self-Directed Online Machine Learning for Topology Optimization [58.920693413667216]
Self-directed Online Learning Optimization integrates a Deep Neural Network (DNN) with Finite Element Method (FEM) calculations.
Our algorithm was tested by four types of problems including compliance minimization, fluid-structure optimization, heat transfer enhancement and truss optimization.
It reduced the computational time by 2 to 5 orders of magnitude compared with directly using heuristic methods, and outperformed all state-of-the-art algorithms tested in our experiments.
arXiv Detail & Related papers (2020-02-04T20:00:28Z)
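For context on the conditional gradient entry above, here is a minimal sketch of the textbook Frank-Wolfe step, minimizing a least-squares objective over the probability simplex. It is a generic first-order illustration, not the ARCS variant described in that paper.

```python
# Textbook Frank-Wolfe sketch: minimize f(x) = ||Ax - b||^2 over the
# probability simplex, where the linear minimization oracle is a vertex.
import numpy as np

def frank_wolfe_simplex(A, b, iters=200):
    n = A.shape[1]
    x = np.full(n, 1.0 / n)               # start at the simplex barycenter
    for t in range(iters):
        grad = 2.0 * A.T @ (A @ x - b)     # gradient of ||Ax - b||^2
        s = np.zeros(n)
        s[np.argmin(grad)] = 1.0           # linear minimization oracle: best vertex
        gamma = 2.0 / (t + 2.0)            # classic diminishing step size
        x = (1.0 - gamma) * x + gamma * s  # convex combination stays feasible
    return x

rng = np.random.default_rng(0)
A, b = rng.standard_normal((20, 5)), rng.standard_normal(20)
x = frank_wolfe_simplex(A, b)
print("solution:", np.round(x, 3), "objective:", np.sum((A @ x - b) ** 2))
```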