An Improved Dung Beetle Optimizer for Random Forest Optimization
- URL: http://arxiv.org/abs/2411.17738v1
- Date: Sun, 24 Nov 2024 06:48:55 GMT
- Title: An Improved Dung Beetle Optimizer for Random Forest Optimization
- Authors: Lianghao Tan, Xiaoyi Liu, Dong Liu, Shubing Liu, Weixi Wu, Huangqi Jiang
- Abstract summary: This paper proposes an improved algorithm based on circle mapping and a longitudinal-horizontal crossover strategy (CICRDBO).
The improved algorithm performs well in both convergence speed and optimization accuracy.
- Score: 5.609805090828983
- Abstract: To improve the convergence speed and optimization accuracy of the Dung Beetle Optimizer (DBO), this paper proposes an improved algorithm based on circle mapping and a longitudinal-horizontal crossover strategy (CICRDBO). First, the Circle method is used to map the initial population, increasing its diversity. Second, the longitudinal-horizontal crossover strategy is applied during the dung beetles' position updates to enhance global search ability. Simulations on 10 benchmark test functions demonstrate that the improved algorithm performs well in both convergence speed and optimization accuracy. The improved algorithm is further applied to hyperparameter selection for the Random Forest classification algorithm in a binary classification prediction task in the retail industry. Comparisons across various hyperparameter combinations demonstrate the practicality of the improved algorithm, followed by a SHapley Additive exPlanations (SHAP) analysis.
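For intuition, here is a minimal Python sketch of the two ingredients named in the abstract: Circle-map population initialization and the crisscross (longitudinal-horizontal) crossover with greedy selection. The Circle-map constants, the operator forms, and the `rf_fitness` decoding of a position vector into Random Forest hyperparameters follow common conventions from the chaotic-map and crisscross-optimization literature; they are assumptions, not the paper's exact implementation.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def circle_map_init(pop_size, dim, lb, ub, seed=0):
    """Chaotic Circle-map initialization (constants are the commonly used ones):
    x_{k+1} = (x_k + 0.2 - (0.5 / (2*pi)) * sin(2*pi*x_k)) mod 1."""
    rng = np.random.default_rng(seed)
    x = rng.random(dim)
    pop = np.empty((pop_size, dim))
    for i in range(pop_size):
        x = np.mod(x + 0.2 - 0.5 / (2 * np.pi) * np.sin(2 * np.pi * x), 1.0)
        pop[i] = lb + x * (ub - lb)  # scale chaotic sequence into the bounds
    return pop

def horizontal_crossover(pop, fit, f, rng):
    """Arithmetic crossover between randomly paired individuals;
    greedy selection keeps a child only if it improves on its parent."""
    perm = rng.permutation(len(pop))
    d = pop.shape[1]
    for a, b in zip(perm[::2], perm[1::2]):
        r1, r2 = rng.random(d), rng.random(d)
        c1, c2 = rng.uniform(-1, 1, d), rng.uniform(-1, 1, d)
        ca = r1 * pop[a] + (1 - r1) * pop[b] + c1 * (pop[a] - pop[b])
        cb = r2 * pop[b] + (1 - r2) * pop[a] + c2 * (pop[b] - pop[a])
        for idx, child in ((a, ca), (b, cb)):
            fc = f(child)
            if fc < fit[idx]:
                pop[idx], fit[idx] = child, fc

def vertical_crossover(pop, fit, f, rng):
    """Crossover between two random dimensions of one individual,
    which can free dimensions stuck in local optima."""
    for i in range(len(pop)):
        d1, d2 = rng.choice(pop.shape[1], 2, replace=False)
        r = rng.random()
        child = pop[i].copy()
        child[d1] = r * pop[i][d1] + (1 - r) * pop[i][d2]
        fc = f(child)
        if fc < fit[i]:
            pop[i], fit[i] = child, fc

# Hypothetical fitness for the paper's application: cross-validated error of a
# Random Forest whose hyperparameters are decoded from a position vector.
Xc, yc = make_classification(n_samples=500, random_state=0)

def rf_fitness(pos):
    clf = RandomForestClassifier(n_estimators=int(np.clip(pos[0], 10, 300)),
                                 max_depth=int(np.clip(pos[1], 2, 30)),
                                 random_state=0)
    return 1.0 - cross_val_score(clf, Xc, yc, cv=3).mean()  # error to minimize
```

Inside the DBO loop, both crossovers would be applied after each round of dung-beetle position updates; because the greedy selection never accepts a worse child, the best-so-far fitness is non-increasing.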
Related papers
- Provably Faster Algorithms for Bilevel Optimization via Without-Replacement Sampling [96.47086913559289]
Gradient-based algorithms are widely used in bilevel optimization.
We introduce a without-replacement sampling-based algorithm which achieves a faster convergence rate (see the sketch below).
We validate our algorithms over both synthetic and real-world applications.
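Setting the bilevel machinery aside, the core sampling idea is simple: draw a fresh random permutation each epoch so every component function is visited exactly once, rather than sampling i.i.d. with replacement. A single-level sketch (illustrative names, not the paper's API):

```python
import numpy as np

def sgd_epoch_without_replacement(w, grads, lr, rng):
    """One epoch of without-replacement (shuffled) sampling: each of the n
    component gradients is used exactly once, in random order."""
    for i in rng.permutation(len(grads)):
        w = w - lr * grads[i](w)
    return w

def sgd_epoch_with_replacement(w, grads, lr, rng):
    """Baseline i.i.d. sampling: some components may repeat or be missed."""
    for _ in range(len(grads)):
        i = rng.integers(len(grads))
        w = w - lr * grads[i](w)
    return w
```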
arXiv Detail & Related papers (2024-11-07T17:05:31Z)
- Enhancing Gaussian Process Surrogates for Optimization and Posterior Approximation via Random Exploration [2.984929040246293]
The paper proposes novel noise-free Bayesian optimization strategies that rely on a random exploration step to enhance the accuracy of Gaussian process surrogate models.
The new algorithms retain the ease of implementation of the classical GP-UCB, but an additional exploration step facilitates their convergence.
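A minimal sketch of that recipe, assuming a fixed candidate grid, a constant UCB coefficient, and a periodic random query as the exploration step (all simplifications; the paper's exploration rule is more refined):

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def gp_ucb_random_explore(f, candidates, n_iter=30, beta=2.0,
                          explore_every=5, seed=0):
    """Classical GP-UCB over a candidate set, plus a periodic pure-random
    query standing in for the paper's random exploration step."""
    rng = np.random.default_rng(seed)
    X = [candidates[rng.integers(len(candidates))]]
    y = [f(X[0])]
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.2))
    for t in range(1, n_iter):
        gp.fit(np.array(X), np.array(y))
        if t % explore_every == 0:
            x_next = candidates[rng.integers(len(candidates))]  # explore
        else:
            mu, sigma = gp.predict(candidates, return_std=True)
            x_next = candidates[np.argmax(mu + beta * sigma)]   # GP-UCB
        X.append(x_next)
        y.append(f(x_next))
    best = int(np.argmax(y))
    return X[best], y[best]

# Toy usage: maximize a 1-D function on a grid.
grid = np.linspace(0.0, 1.0, 200).reshape(-1, 1)
x_star, y_star = gp_ucb_random_explore(lambda x: -(float(x[0]) - 0.3) ** 2, grid)
```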
arXiv Detail & Related papers (2024-01-30T14:16:06Z)
- Performance Evaluation of Evolutionary Algorithms for Analog Integrated Circuit Design Optimisation [0.0]
An automated sizing approach for analog circuits is presented in this paper.
A targeted exploration of the search space is implemented using a particle-generation function and a repair-bounds function.
The algorithms are tuned and modified to converge to a better optimum.
arXiv Detail & Related papers (2023-10-19T03:26:36Z)
- Advancements in Optimization: Adaptive Differential Evolution with Diversification Strategy [0.0]
The study employs single-objective optimization in a two-dimensional space and runs ADEDS on each of the benchmark functions with multiple iterations.
ADEDS consistently outperforms standard DE for a variety of optimization challenges, including functions with numerous local optima, plate-shaped, valley-shaped, stretched-shaped, and noisy functions.
arXiv Detail & Related papers (2023-10-02T10:05:41Z)
- Fast Computation of Optimal Transport via Entropy-Regularized Extragradient Methods [75.34939761152587]
Efficient computation of the optimal transport distance between two distributions serves as an algorithmic subroutine that empowers various applications.
This paper develops a scalable first-order optimization-based method that computes optimal transport to within $\varepsilon$ additive accuracy.
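For reference, the standard entropy-regularized objective that such methods target (written in the usual notation, not necessarily the paper's) is:

```latex
\min_{P \in \Pi(\mu,\nu)} \langle C, P \rangle - \eta\, H(P),
\qquad H(P) = -\sum_{i,j} P_{ij} \log P_{ij},
\qquad \Pi(\mu,\nu) = \{ P \ge 0 : P\mathbf{1} = \mu,\; P^{\top}\mathbf{1} = \nu \},
```

where taking the regularization weight $\eta$ on the order of $\varepsilon / \log n$ keeps the regularized optimum within $\varepsilon$ additive accuracy of the true optimal transport cost.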
arXiv Detail & Related papers (2023-01-30T15:46:39Z)
- Efficient Non-Parametric Optimizer Search for Diverse Tasks [93.64739408827604]
We present the first efficient, scalable, and general framework that can directly search on the tasks of interest.
Inspired by the innate tree structure of the underlying math expressions, we re-arrange the spaces into a super-tree.
We adopt an adaptation of the Monte Carlo method to tree search, equipped with rejection sampling and equivalent-form detection.
arXiv Detail & Related papers (2022-09-27T17:51:31Z)
- Improved Binary Forward Exploration: Learning Rate Scheduling Method for Stochastic Optimization [3.541406632811038]
A new gradient-based optimization approach that automatically schedules the learning rate, called Binary Forward Exploration (BFE), has been proposed recently.
This paper investigates improved algorithms based on BFE, in order to optimize the efficiency and robustness of the new methodology.
The method does not aim to beat others but to provide a different viewpoint on optimizing the gradient descent process.
arXiv Detail & Related papers (2022-07-09T05:28:44Z)
- Dynamic Cat Swarm Optimization Algorithm for Backboard Wiring Problem [0.9990687944474739]
This paper presents a powerful swarm intelligence meta-heuristic optimization algorithm called Dynamic Cat Swarm Optimization.
The proposed algorithm suggests a new method to provide a proper balance between the exploration and exploitation phases by modifying the selection scheme and the seeking mode of the algorithm.
Optimization results show the effectiveness of the proposed algorithm, which ranks first compared to several well-known algorithms in the literature.
arXiv Detail & Related papers (2021-04-27T19:41:27Z)
- Bilevel Optimization: Convergence Analysis and Enhanced Design [63.64636047748605]
Bilevel optimization is a powerful tool for many machine learning problems.
We propose a novel stochastic bilevel optimizer named stocBiO, which features a sample-efficient hypergradient estimator.
arXiv Detail & Related papers (2020-10-15T18:09:48Z)
- Stochastic batch size for adaptive regularization in deep network optimization [63.68104397173262]
We propose a first-order optimization algorithm incorporating adaptive regularization, applicable to machine learning problems in the deep learning framework.
We empirically demonstrate the effectiveness of our algorithm using an image classification task based on conventional network models applied to commonly used benchmark datasets.
arXiv Detail & Related papers (2020-04-14T07:54:53Z)
- Adaptivity of Stochastic Gradient Methods for Nonconvex Optimization [71.03797261151605]
Adaptivity is an important yet under-studied property in modern optimization theory.
Our algorithm provably achieves the best-available convergence rate for non-PL (Polyak-Łojasiewicz) objectives while simultaneously outperforming existing algorithms for PL objectives.
arXiv Detail & Related papers (2020-02-13T05:42:27Z)
This list is automatically generated from the titles and abstracts of the papers on this site.