Egret Swarm Optimization Algorithm: An Evolutionary Computation Approach
for Model Free Optimization
- URL: http://arxiv.org/abs/2207.14667v1
- Date: Fri, 29 Jul 2022 13:24:29 GMT
- Title: Egret Swarm Optimization Algorithm: An Evolutionary Computation Approach
for Model Free Optimization
- Authors: Zuyan Chen, Adam Francis, Shuai Li, Bolin Liao, Dunhui Xiao
- Abstract summary: A novel meta-heuristic algorithm, Egret Swarm Optimization Algorithm (ESOA), is proposed in this paper.
ESOA is inspired by two egret species' (Great Egret and Snowy Egret) hunting behavior.
The performance of ESOA on 36 benchmark functions as well as 2 engineering problems is compared.
- Score: 5.486833154281385
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: A novel meta-heuristic algorithm, Egret Swarm Optimization Algorithm (ESOA),
is proposed in this paper, which is inspired by two egret species' (Great Egret
and Snowy Egret) hunting behavior. ESOA consists of three primary components:
Sit-And-Wait Strategy, Aggressive Strategy as well as Discriminant Conditions.
The performance of ESOA on 36 benchmark functions as well as 2 engineering
problems is compared with Particle Swarm Optimization (PSO), Genetic Algorithm
(GA), Differential Evolution (DE), Grey Wolf Optimizer (GWO), and Harris Hawks
Optimization (HHO). The results demonstrate the superior effectiveness and robustness
of ESOA. The source code used in this work can be retrieved from
https://github.com/Knightsll/Egret_Swarm_Optimization_Algorithm;
https://ww2.mathworks.cn/matlabcentral/fileexchange/115595-egret-swarm-optimization-algorithm-esoa.
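The abstract names ESOA's three components but not their update equations. As a rough sketch of how such a component pair plus a discriminant condition could be wired together (the step rules below are illustrative assumptions, not the published ESOA updates):

```python
import numpy as np

def esoa_like_minimize(f, dim, bounds, n_agents=20, n_iter=200, seed=0):
    """Illustrative swarm loop: each agent proposes a cautious
    'sit-and-wait' candidate and an 'aggressive' exploratory candidate,
    and a discriminant condition keeps whichever is better. These
    update rules are placeholders, not the published ESOA equations."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_agents, dim))
    fx = np.array([f(xi) for xi in x])
    best, fbest = x[fx.argmin()].copy(), fx.min()
    for t in range(n_iter):
        step = 0.1 * (hi - lo) * (1 - t / n_iter)  # shrink moves over time
        for i in range(n_agents):
            # sit-and-wait: small move biased toward the current best
            sit = np.clip(x[i] + step * rng.standard_normal(dim)
                          + 0.5 * (best - x[i]), lo, hi)
            # aggressive: larger random jump for exploration
            agg = np.clip(x[i] + 3 * step * rng.standard_normal(dim), lo, hi)
            # discriminant condition: accept the better candidate if it improves
            cand, fcand = min(((sit, f(sit)), (agg, f(agg))),
                              key=lambda p: p[1])
            if fcand < fx[i]:
                x[i], fx[i] = cand, fcand
                if fcand < fbest:
                    best, fbest = cand.copy(), fcand
    return best, fbest
```

On a 5-dimensional sphere function this toy loop converges reliably, which is all it is meant to show; the reference behavior lives in the linked repositories.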
Related papers
- A Novel Hybrid Grey Wolf Differential Evolution Algorithm [1.2842469556228848]
We introduce a new algorithm, GWO-DE, based on the hybridization of GWO and two DE variants. We evaluate the new algorithm on various numerical benchmark functions.
arXiv Detail & Related papers (2025-07-02T17:56:02Z) - ELRA: Exponential learning rate adaption gradient descent optimization
method [83.88591755871734]
We present a novel, fast (exponential rate), ab initio (hyper-free), gradient-based adaptation method.
The main idea of the method is to adapt the learning rate $\alpha$ by situational awareness.
It can be applied to problems of any dimension $n$ and scales only linearly.
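The summary does not spell out ELRA's adaptation rule, so as a generic illustration of adapting $\alpha$ during descent, here is the classic "bold driver" heuristic (a stand-in for the idea, not ELRA's actual mechanism):

```python
import numpy as np

def bold_driver_gd(f, grad, x0, alpha=0.1, up=1.05, down=0.5, n_iter=200):
    """Multiplicative step-size adaptation ('bold driver' heuristic):
    grow alpha after a step that lowers the loss, shrink it and reject
    the step otherwise. Illustrative only; not ELRA's rule."""
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    for _ in range(n_iter):
        step = x - alpha * grad(x)
        fstep = f(step)
        if fstep < fx:       # success: accept the step, grow alpha
            x, fx = step, fstep
            alpha *= up
        else:                # failure: reject the step, shrink alpha
            alpha *= down
    return x, fx, alpha
```

On a convex quadratic, alpha grows until a step overshoots, then halves, so the method self-tunes without a hand-picked learning rate.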
arXiv Detail & Related papers (2023-09-12T14:36:13Z) - GOOSE Algorithm: A Powerful Optimization Tool for Real-World Engineering Challenges and Beyond [4.939986309170004]
The GOOSE algorithm is benchmarked on 19 well-known test functions.
The proposed algorithm is tested on 10 modern benchmark functions.
The findings attest to the proposed algorithm's superior performance.
arXiv Detail & Related papers (2023-07-19T19:14:25Z) - Genetically Modified Wolf Optimization with Stochastic Gradient Descent
for Optimising Deep Neural Networks [0.0]
This research aims to analyze an alternative approach to optimizing neural network (NN) weights, with the use of population-based metaheuristic algorithms.
A hybrid between Grey Wolf Optimization (GWO) and Genetic Algorithms (GA) is explored, in conjunction with Stochastic Gradient Descent (SGD).
This algorithm allows for a combination between exploitation and exploration, whilst also tackling the issue of high-dimensionality.
arXiv Detail & Related papers (2023-01-21T13:22:09Z) - Massively Parallel Genetic Optimization through Asynchronous Propagation
of Populations [50.591267188664666]
Propulate is an evolutionary optimization algorithm and software package for global optimization.
We provide an MPI-based implementation of our algorithm, which features variants of selection, mutation, crossover, and migration.
We find that Propulate is up to three orders of magnitude faster without sacrificing solution accuracy.
arXiv Detail & Related papers (2023-01-20T18:17:34Z) - An efficient hybrid classification approach for COVID-19 based on Harris
Hawks Optimization and Salp Swarm Optimization [0.0]
This study presents a hybrid binary version of the Harris Hawks Optimization algorithm (HHO) and Salp Swarm Optimization (SSA) for Covid-19 classification.
The proposed algorithm (HHOSSA) achieved 96% accuracy with the SVM classifier and 98% accuracy with each of two other classifiers.
arXiv Detail & Related papers (2022-12-25T19:52:18Z) - An Empirical Evaluation of Zeroth-Order Optimization Methods on
AI-driven Molecule Optimization [78.36413169647408]
We study the effectiveness of various ZO optimization methods for optimizing molecular objectives.
We show the advantages of ZO sign-based gradient descent (ZO-signGD).
We demonstrate the potential effectiveness of ZO optimization methods on widely used benchmark tasks from the Guacamol suite.
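The core idea of sign-based zeroth-order descent can be sketched with a coordinate-wise finite-difference gradient estimate followed by a sign update (a toy version for intuition, not the paper's exact ZO-signGD configuration):

```python
import numpy as np

def zo_sign_gd(f, x0, mu=1e-5, lr=0.05, decay=0.99, n_iter=300):
    """Zeroth-order sign gradient descent on a black-box f:
    estimate each partial derivative from two function queries,
    then step by the sign of the estimate (illustrative sketch)."""
    x = np.asarray(x0, dtype=float)
    d = x.size
    for _ in range(n_iter):
        g = np.empty(d)
        for i in range(d):          # coordinate-wise two-point estimate
            e = np.zeros(d)
            e[i] = mu
            g[i] = (f(x + e) - f(x - e)) / (2 * mu)
        x = x - lr * np.sign(g)     # sign update: fixed-size steps
        lr *= decay                 # decay so the sign steps can settle
    return x
```

The sign update makes every step the same size regardless of gradient magnitude, which is what makes the method robust to noisy zeroth-order estimates.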
arXiv Detail & Related papers (2022-10-27T01:58:10Z) - Duck swarm algorithm: theory, numerical optimization, and applications [6.244015536594532]
A swarm intelligence-based optimization algorithm, named Duck Swarm Algorithm (DSA), is proposed in this study.
Two rules are modeled on the food-finding and foraging behavior of ducks, which correspond to the exploration and exploitation phases of the proposed DSA.
Results show that DSA is a high-performance optimization method in terms of convergence speed and exploration-exploitation balance.
arXiv Detail & Related papers (2021-12-27T04:53:36Z) - An Accelerated Variance-Reduced Conditional Gradient Sliding Algorithm
for First-order and Zeroth-order Optimization [111.24899593052851]
Conditional gradient algorithm (also known as the Frank-Wolfe algorithm) has recently regained popularity in the machine learning community.
ARCS is the first zeroth-order conditional gradient sliding-type algorithm for solving convex problems in zeroth-order optimization.
In first-order optimization, the convergence results of ARCS substantially outperform previous algorithms in terms of the number of gradient oracle queries.
arXiv Detail & Related papers (2021-09-18T07:08:11Z) - The Archerfish Hunting Optimizer: a novel metaheuristic algorithm for
global optimization [0.8315801422499861]
Global optimization solves real-world problems numerically or analytically by minimizing their objective functions.
We propose a global metaheuristic algorithm, the Archerfish Hunting Optimizer (AHO), inspired by the hunting behavior of the archerfish.
arXiv Detail & Related papers (2021-02-03T16:22:31Z) - Bilevel Optimization: Convergence Analysis and Enhanced Design [63.64636047748605]
Bilevel optimization is a tool for many machine learning problems.
We propose a novel sample-efficient stochastic gradient estimator named stocBiO.
arXiv Detail & Related papers (2020-10-15T18:09:48Z) - A Primer on Zeroth-Order Optimization in Signal Processing and Machine
Learning [95.85269649177336]
ZO optimization iteratively performs three major steps: gradient estimation, descent direction, and solution update.
We demonstrate promising applications of ZO optimization, such as evaluating and generating explanations from black-box deep learning models, and efficient online sensor management.
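The three steps listed above can be sketched in a few lines with the standard two-point random-direction gradient estimator (a generic ZO illustration under the stated three-step scheme, not a specific method from the primer):

```python
import numpy as np

def zo_minimize(f, x0, mu=1e-3, lr=0.1, n_samples=20, n_iter=200, seed=0):
    """Generic zeroth-order loop: (1) gradient estimation from
    function queries only, (2) descent direction, (3) solution update."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    d = x.size
    for _ in range(n_iter):
        g = np.zeros(d)
        for _ in range(n_samples):   # step 1: two-point random-direction estimate
            u = rng.standard_normal(d)
            g += (f(x + mu * u) - f(x - mu * u)) / (2 * mu) * u
        g /= n_samples
        direction = -g               # step 2: descent direction (steepest descent)
        x = x + lr * direction       # step 3: solution update
    return x
```

Each iteration costs 2 × n_samples function evaluations and no gradients, which is exactly the black-box setting this primer surveys.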
arXiv Detail & Related papers (2020-06-11T06:50:35Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences.