A Performance Analysis of Basin Hopping Compared to Established
Metaheuristics for Global Optimization
- URL: http://arxiv.org/abs/2403.05877v1
- Date: Sat, 9 Mar 2024 11:02:47 GMT
- Title: A Performance Analysis of Basin Hopping Compared to Established
Metaheuristics for Global Optimization
- Authors: Marco Baioletti, Valentino Santucci, Marco Tomassini
- Abstract summary: We perform numerical experiments using the IOH profiler environment with the BBOB test function set and two difficult real-world problems.
Results show that Basin Hopping can be considered a good candidate for global numerical optimization problems along with the more established metaheuristics.
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: During the last decades many metaheuristics for global numerical optimization
have been proposed. Among them, Basin Hopping is very simple and
straightforward to implement, although rarely used outside its original
Physical Chemistry community. In this work, our aim is to compare Basin
Hopping, and two population variants of it, with readily available
implementations of the well known metaheuristics Differential Evolution,
Particle Swarm Optimization, and Covariance Matrix Adaptation Evolution
Strategy. We perform numerical experiments using the IOH profiler environment
with the BBOB test function set and two difficult real-world problems. The
experiments were carried out in two different but complementary ways: by
measuring the performance under a fixed budget of function evaluations and by
considering a fixed target value. The general conclusion is that Basin Hopping
and its newly introduced population variant are almost as good as Covariance
Matrix Adaptation on the synthetic benchmark functions and better than it on
the two hard cluster energy minimization problems. Thus, the proposed analyses
show that Basin Hopping can be considered a good candidate for global numerical
optimization problems along with the more established metaheuristics,
especially if one wants to obtain quick and reliable results on an unknown
problem.
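As a rough, self-contained illustration of the kind of experiment described above (not the authors' actual IOH profiler setup), the sketch below runs SciPy's readily available Basin Hopping implementation on the Rastrigin function, used here as a stand-in for a BBOB-style multimodal benchmark, and counts objective evaluations so the run can be read in the fixed-budget sense. The function choice, budget, step size, and wrapper class are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import basinhopping


def rastrigin(x):
    """Highly multimodal, separable test function; global minimum f(0) = 0."""
    x = np.asarray(x)
    return 10.0 * x.size + np.sum(x * x - 10.0 * np.cos(2.0 * np.pi * x))


class BudgetedObjective:
    """Counts objective evaluations so runs can be compared under a fixed budget."""

    def __init__(self, func, max_evals):
        self.func = func
        self.max_evals = max_evals
        self.evals = 0
        self.best = np.inf

    def __call__(self, x):
        self.evals += 1
        value = self.func(x)
        self.best = min(self.best, value)
        return value


dim = 10                                       # problem dimension (illustrative)
objective = BudgetedObjective(rastrigin, max_evals=20_000)
x0 = np.random.default_rng(42).uniform(-5.0, 5.0, dim)


def budget_spent(x, f, accept):
    """basinhopping callback: returning True stops the run once the budget is used up."""
    return objective.evals >= objective.max_evals


result = basinhopping(
    objective,
    x0,
    niter=200,                                # number of hops (perturbation + local search)
    stepsize=0.5,                             # size of the random perturbation between hops
    minimizer_kwargs={"method": "L-BFGS-B"},  # local minimizer started after each hop
    callback=budget_spent,
)

print(f"best value {objective.best:.4g} within {objective.evals} evaluations")
```

A fixed-target reading of the same run would instead record how many evaluations were spent before the best value first dropped below a chosen target.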
Related papers
- BoTier: Multi-Objective Bayesian Optimization with Tiered Composite Objectives
We introduce BoTier, a composite objective that can flexibly represent a hierarchy of preferences over both experiment outcomes and input parameters.
Importantly, BoTier is implemented in an auto-differentiable fashion, enabling seamless integration with the BoTorch library.
arXiv Detail & Related papers (2025-01-26T15:05:37Z)
- Co-Learning Bayesian Optimization
We propose a novel BO algorithm labeled as co-learning BO (CLBO), which exploits both model diversity and agreement on unlabeled information to improve the overall surrogate accuracy with limited samples.
Through tests on five numerical toy problems and three engineering benchmarks, the effectiveness of the proposed CLBO is well demonstrated.
arXiv Detail & Related papers (2025-01-23T02:25:10Z)
- Tackling Prevalent Conditions in Unsupervised Combinatorial Optimization: Cardinality, Minimum, Covering, and More
We aim to tackle prevalent (i.e., commonly involved) conditions in unsupervised CO.
First, we concretize the targets for objective construction and derandomization with theoretical justification.
Then, for various conditions commonly involved in different CO problems, we derive nontrivial objectives and derandomization to meet the targets.
arXiv Detail & Related papers (2024-05-14T08:35:39Z)
- Poisson Process for Bayesian Optimization
We propose a ranking-based surrogate model based on the Poisson process and introduce an efficient BO framework, namely Poisson Process Bayesian Optimization (PoPBO).
Compared to the classic GP-BO method, our PoPBO has lower costs and better robustness to noise, which is verified by abundant experiments.
arXiv Detail & Related papers (2024-02-05T02:54:50Z)
- Fast Bayesian Optimization of Needle-in-a-Haystack Problems using Zooming Memory-Based Initialization
A Needle-in-a-Haystack problem arises when there is an extreme imbalance of optimum conditions relative to the size of the dataset.
We present a Zooming Memory-Based Initialization algorithm that builds on conventional Bayesian optimization principles.
arXiv Detail & Related papers (2022-08-26T23:57:41Z)
- Optimizer Amalgamation
We are motivated to study a new problem named Optimizer Amalgamation: how can we best combine a pool of "teacher" optimizers into a single "student" optimizer with stronger problem-specific performance?
First, we define three differentiable mechanisms to amalgamate a pool of analytical optimizers by gradient descent.
To reduce the variance of the amalgamation process, we also explore methods to stabilize it by perturbing the target.
arXiv Detail & Related papers (2022-03-12T16:07:57Z)
- Harnessing Heterogeneity: Learning from Decomposed Feedback in Bayesian Modeling
We introduce a novel GP regression to incorporate the subgroup feedback.
Our modified regression has provably lower variance -- and thus a more accurate posterior -- compared to previous approaches.
We execute our algorithm on two disparate social problems.
arXiv Detail & Related papers (2021-07-07T03:57:22Z)
- Zeroth-Order Hybrid Gradient Descent: Towards A Principled Black-Box Optimization Framework
This work concerns zeroth-order (ZO) optimization, which does not require first-order gradient information.
We show that, with a careful design of coordinate importance sampling, the proposed ZO optimization method is efficient in terms of both iteration complexity and function query cost.
arXiv Detail & Related papers (2020-12-21T17:29:58Z)
- On the Convergence Rate of Projected Gradient Descent for a Back-Projection based Objective
We consider a back-projection (BP) based fidelity term as an alternative to the common least squares (LS) term.
We show that using the BP term, rather than the LS term, requires fewer iterations of optimization algorithms.
arXiv Detail & Related papers (2020-05-03T00:58:23Z)
- Scalable Constrained Bayesian Optimization
The global optimization of a high-dimensional black-box function under black-box constraints is a pervasive task in machine learning, control, and the scientific community.
We propose the scalable constrained Bayesian optimization (SCBO) algorithm that overcomes the above challenges and pushes the state of the art.
arXiv Detail & Related papers (2020-02-20T01:48:46Z)
- On the Performance of Metaheuristics: A Different Perspective
We study some basic evolutionary and swarm-intelligence metaheuristics, i.e., Genetic Algorithm (GA), Particle Swarm Optimization (PSO), Artificial Bee Colony (ABC), Teaching-Learning-Based Optimization (TLBO), and the Cuckoo Optimization Algorithm (COA).
A large number of experiments have been conducted on 20 different optimization benchmark functions with different characteristics, and the results reveal some fundamental conclusions in addition to a ranking order among these metaheuristics.
arXiv Detail & Related papers (2020-01-24T09:34:10Z)
This list is automatically generated from the titles and abstracts of the papers in this site.