Goat Optimization Algorithm: A Novel Bio-Inspired Metaheuristic for Global Optimization
- URL: http://arxiv.org/abs/2503.02331v1
- Date: Tue, 04 Mar 2025 06:44:07 GMT
- Title: Goat Optimization Algorithm: A Novel Bio-Inspired Metaheuristic for Global Optimization
- Authors: Hamed Nozari, Hoessein Abdi, Agnieszka Szmelter-Jarosz
- Abstract summary: This paper presents a novel bio-inspired metaheuristic optimization technique inspired by goats' adaptive foraging, strategic movement, and parasite avoidance behaviors. The algorithm's performance is evaluated on standard unimodal and multimodal benchmark functions. The findings suggest that GOA is a promising advancement in bio-inspired optimization techniques.
- Score: 1.2289361708127877
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This paper presents the Goat Optimization Algorithm (GOA), a novel bio-inspired metaheuristic optimization technique inspired by goats' adaptive foraging, strategic movement, and parasite avoidance behaviors. GOA is designed to balance exploration and exploitation effectively by incorporating three key mechanisms: adaptive foraging for global search, movement toward the best solution for local refinement, and a jump strategy to escape local optima. A solution filtering mechanism is introduced to enhance robustness and maintain population diversity. The algorithm's performance is evaluated on standard unimodal and multimodal benchmark functions, demonstrating significant improvements over existing metaheuristics, including Particle Swarm Optimization (PSO), Grey Wolf Optimizer (GWO), Genetic Algorithm (GA), Whale Optimization Algorithm (WOA), and Artificial Bee Colony (ABC). Comparative analysis highlights GOA's superior convergence rate, enhanced global search capability, and higher solution accuracy. A Wilcoxon rank-sum test confirms the statistical significance of GOA's exceptional performance. Despite its efficiency, computational complexity and parameter sensitivity remain areas for further optimization. Future research will focus on adaptive parameter tuning, hybridization with other metaheuristics, and real-world applications in supply chain management, bioinformatics, and energy optimization. The findings suggest that GOA is a promising advancement in bio-inspired optimization techniques.
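The abstract names the algorithm's mechanisms but not its update equations, so the following is only a minimal numpy sketch of a GOA-style loop under assumed step rules: adaptive foraging as a shrinking random step, movement toward the best solution, an occasional random jump to escape local optima, and bound clipping as a stand-in for solution filtering. The function name, parameters (pop_size, jump_prob), and step sizes are illustrative assumptions, not the authors' formulation.

```python
import numpy as np

def goa_minimize(objective, dim, bounds, pop_size=30, iterations=200,
                 jump_prob=0.1, seed=0):
    """Illustrative GOA-style loop: adaptive foraging (global search),
    movement toward the best goat (local refinement), a random jump to
    escape local optima, and clipping as a simple solution filter."""
    rng = np.random.default_rng(seed)
    low, high = bounds
    goats = rng.uniform(low, high, size=(pop_size, dim))      # initial herd
    fitness = np.apply_along_axis(objective, 1, goats)
    best = goats[fitness.argmin()].copy()

    for t in range(iterations):
        radius = 1.0 - t / iterations                         # shrinking foraging radius
        for i in range(pop_size):
            if rng.random() < jump_prob:
                # jump strategy: random relocation to escape local optima
                candidate = rng.uniform(low, high, size=dim)
            else:
                forage = 0.1 * radius * (high - low) * rng.normal(size=dim)
                pull = rng.random() * (best - goats[i])       # move toward best solution
                candidate = goats[i] + forage + pull
            candidate = np.clip(candidate, low, high)         # keep solutions feasible
            f = objective(candidate)
            if f < fitness[i]:                                # greedy replacement
                goats[i], fitness[i] = candidate, f
        best = goats[fitness.argmin()].copy()
    return best, float(fitness.min())

# Example: minimize the 10-dimensional sphere function on [-5, 5]^10
best_x, best_f = goa_minimize(lambda x: float(np.sum(x ** 2)),
                              dim=10, bounds=(-5.0, 5.0))
print(best_f)
```

For the significance comparison the abstract mentions, a Wilcoxon rank-sum test over repeated runs of two algorithms can be computed with scipy.stats.ranksums, though the paper's exact test configuration is not given here.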
Related papers
- Testing the Efficacy of Hyperparameter Optimization Algorithms in Short-Term Load Forecasting [0.0]
We use the Panama Electricity dataset to evaluate the performance of HPO algorithms on a surrogate forecasting algorithm, XGBoost, in terms of accuracy (i.e., MAPE, $R^2$) and runtime.
Results reveal significant runtime advantages for HPO algorithms over Random Search.
arXiv Detail & Related papers (2024-10-19T09:08:52Z) - The Firefighter Algorithm: A Hybrid Metaheuristic for Optimization Problems [3.2432648012273346]
The Firefighter Optimization (FFO) algorithm is a new hybrid metaheuristic for optimization problems.
To evaluate the performance of FFO, extensive experiments were conducted, wherein the FFO was examined against 13 commonly used optimization algorithms.
The results demonstrate that FFO achieves comparable performance and, in some scenarios, outperforms commonly adopted optimization algorithms in terms of the obtained fitness, time taken for execution, and search space covered per unit of time.
arXiv Detail & Related papers (2024-06-01T18:38:59Z) - Localized Zeroth-Order Prompt Optimization [54.964765668688806]
We propose a novel algorithm, namely localized zeroth-order prompt optimization (ZOPO)
ZOPO incorporates a Neural Tangent Kernel-based derived Gaussian process into standard zeroth-order optimization for an efficient search of well-performing local optima in prompt optimization.
Remarkably, ZOPO outperforms existing baselines in terms of both the optimization performance and the query efficiency.
arXiv Detail & Related papers (2024-03-05T14:18:15Z) - MADA: Meta-Adaptive Optimizers through hyper-gradient Descent [73.1383658672682]
We introduce Meta-Adaptive Optimizers (MADA), a unified framework that can generalize several known optimizers and dynamically learn the most suitable one during training.
We empirically compare MADA to other popular optimizers on vision and language tasks, and find that MADA consistently outperforms Adam and other popular optimizers.
We also propose AVGrad, a modification of AMSGrad that replaces the maximum operator with averaging, which is more suitable for hyper-gradient optimization.
arXiv Detail & Related papers (2024-01-17T00:16:46Z) - Enhancing Optimization Through Innovation: The Multi-Strategy Improved
Black Widow Optimization Algorithm (MSBWOA) [11.450701963760817]
This paper introduces a Multi-Strategy Improved Black Widow Optimization Algorithm (MSBWOA)
It is designed to enhance the performance of the standard Black Widow Algorithm (BW) in solving complex optimization problems.
It integrates four key strategies: initializing the population using Tent chaotic mapping to enhance diversity and initial exploratory capability (a minimal sketch of such an initializer appears after this list); implementing mutation optimization on the least fit individuals to maintain population dynamics and prevent premature convergence; and adding a random perturbation strategy to enhance the algorithm's ability to escape local optima.
arXiv Detail & Related papers (2023-12-20T19:55:36Z) - Advancements in Optimization: Adaptive Differential Evolution with
Diversification Strategy [0.0]
The study employs single-objective optimization in a two-dimensional space and runs ADEDS on each of the benchmark functions with multiple iterations.
ADEDS consistently outperforms standard DE for a variety of optimization challenges, including functions with numerous local optima, plate-shaped, valley-shaped, stretched-shaped, and noisy functions.
arXiv Detail & Related papers (2023-10-02T10:05:41Z) - Learning Regions of Interest for Bayesian Optimization with Adaptive
Level-Set Estimation [84.0621253654014]
We propose a framework, called BALLET, which adaptively filters for a high-confidence region of interest.
We show theoretically that BALLET can efficiently shrink the search space, and can exhibit a tighter regret bound than standard BO.
arXiv Detail & Related papers (2023-07-25T09:45:47Z) - Massively Parallel Genetic Optimization through Asynchronous Propagation
of Populations [50.591267188664666]
Propulate is an evolutionary optimization algorithm and software package for global optimization.
We provide an MPI-based implementation of our algorithm, which features variants of selection, mutation, crossover, and migration.
We find that Propulate is up to three orders of magnitude faster without sacrificing solution accuracy.
arXiv Detail & Related papers (2023-01-20T18:17:34Z) - Nesterov Meets Optimism: Rate-Optimal Separable Minimax Optimization [108.35402316802765]
We propose a new first-order optimization algorithm -- Accelerated Gradient-Optimistic Gradient (AG-OG) Ascent.
We show that AG-OG achieves the optimal convergence rate (up to a constant) for a variety of settings.
We further extend our algorithm to more general settings and achieve the optimal convergence rate in both bi-SC-SC and bi-C-SC settings.
arXiv Detail & Related papers (2022-10-31T17:59:29Z) - An Empirical Evaluation of Zeroth-Order Optimization Methods on
AI-driven Molecule Optimization [78.36413169647408]
We study the effectiveness of various ZO optimization methods for optimizing molecular objectives.
We show the advantages of ZO sign-based gradient descent (ZO-signGD)
We demonstrate the potential effectiveness of ZO optimization methods on widely used benchmark tasks from the Guacamol suite.
arXiv Detail & Related papers (2022-10-27T01:58:10Z) - EOS: a Parallel, Self-Adaptive, Multi-Population Evolutionary Algorithm
for Constrained Global Optimization [68.8204255655161]
EOS is a global optimization algorithm for constrained and unconstrained problems of real-valued variables.
It implements a number of improvements to the well-known Differential Evolution (DE) algorithm.
Results prove that EOS is capable of achieving increased performance compared to state-of-the-art single-population self-adaptive DE algorithms.
arXiv Detail & Related papers (2020-07-09T10:19:22Z) - A Novel Meta-Heuristic Optimization Algorithm Inspired by the Spread of
Viruses [0.0]
A novel nature-inspired meta-heuristic optimization algorithm called virus spread optimization (VSO) is proposed.
VSO loosely mimics the spread of viruses among hosts, and can be effectively applied to solving many challenging and continuous optimization problems.
arXiv Detail & Related papers (2020-06-11T09:35:28Z)
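As flagged in the MSBWOA entry above, Tent chaotic-map initialization is a concrete, easily sketched component. Below is a minimal numpy version of one common way such an initializer is written; the map parameter (mu), per-dimension seeding, and rescaling to the search bounds are illustrative assumptions, not the MSBWOA authors' exact formulation.

```python
import numpy as np

def tent_map_population(pop_size, dim, low, high, mu=1.99, seed=0):
    """Build an initial population from a Tent chaotic map instead of a
    plain uniform RNG; each row is one individual rescaled to [low, high]."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(0.01, 0.99, size=dim)          # one chaotic seed per dimension
    population = np.empty((pop_size, dim))
    for i in range(pop_size):
        # tent map: x -> mu*x if x < 0.5, else mu*(1 - x); mu slightly below 2
        # keeps the floating-point orbit from collapsing to 0
        x = np.where(x < 0.5, mu * x, mu * (1.0 - x))
        population[i] = low + x * (high - low)     # rescale to the search bounds
    return population

pop = tent_map_population(pop_size=30, dim=10, low=-5.0, high=5.0)
print(pop.shape)  # (30, 10)
```

The intuition such papers appeal to is that the chaotic sequence spreads successive individuals across the search space rather than leaving coverage entirely to chance, which is the "diversity" benefit the entry refers to.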
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences.