GNBG-Generated Test Suite for Box-Constrained Numerical Global
Optimization
- URL: http://arxiv.org/abs/2312.07034v1
- Date: Tue, 12 Dec 2023 07:40:12 GMT
- Title: GNBG-Generated Test Suite for Box-Constrained Numerical Global
Optimization
- Authors: Amir H. Gandomi (1,2), Danial Yazdani (1), Mohammad Nabi Omidvar (3),
and Kalyanmoy Deb (4) ((1) Faculty of Engineering & Information Technology,
University of Technology Sydney, (2) University Research and Innovation
Center (EKIK), Obuda University, (3) School of Computing, University of
Leeds, and Leeds University Business School, (4) BEACON Center, Michigan
State University)
- Abstract summary: This document introduces a set of 24 box-constrained numerical global optimization problem instances.
Cases cover a broad spectrum of problem features, including varying degrees of modality, ruggedness, symmetry, conditioning, variable interaction structures, basin linearity, and deceptiveness.
- Score: 5.804807909435654
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: This document introduces a set of 24 box-constrained numerical global
optimization problem instances, systematically constructed using the
Generalized Numerical Benchmark Generator (GNBG). These instances cover a broad
spectrum of problem features, including varying degrees of modality,
ruggedness, symmetry, conditioning, variable interaction structures, basin
linearity, and deceptiveness. Purposefully designed, this test suite offers
varying difficulty levels and problem characteristics, facilitating rigorous
evaluation and comparative analysis of optimization algorithms. By presenting
these problems, we aim to provide researchers with a structured platform to
assess the strengths and weaknesses of their algorithms against challenges with
known, controlled characteristics. For reproducibility, the MATLAB source code
for this test suite is publicly available.
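The official test suite is distributed as MATLAB source code; the sketch below only illustrates the general workflow it supports: evaluating an optimizer against a box-constrained objective under a fixed evaluation budget. The objective here is a plain shifted sphere, not one of the 24 GNBG instances, and the bounds, dimension, and budget are all assumptions chosen for illustration.

```python
import numpy as np

# Illustrative sketch: a hand-rolled box-constrained test problem plus a naive
# random-search baseline. shifted_sphere is NOT a GNBG instance; the box
# [-100, 100]^5 and the budget are assumed values for demonstration only.

rng = np.random.default_rng(0)
DIM = 5
LOWER, UPPER = -100.0, 100.0           # box constraints
SHIFT = rng.uniform(-80.0, 80.0, DIM)  # hypothetical optimum location

def shifted_sphere(x):
    """Unimodal objective: f(x) = sum((x - SHIFT)^2), minimum 0 at SHIFT."""
    return float(np.sum((x - SHIFT) ** 2))

def random_search(f, budget=20000):
    """Sample uniformly inside the box and keep the best point found."""
    best_x = rng.uniform(LOWER, UPPER, DIM)
    best_f = f(best_x)
    for _ in range(budget - 1):
        x = rng.uniform(LOWER, UPPER, DIM)
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x, best_f

best_x, best_f = random_search(shifted_sphere)
print(f"best objective after budget exhausted: {best_f:.3f}")
```

The GNBG instances replace the trivial objective above with functions whose modality, conditioning, and variable interactions are controlled by generator parameters, so the same evaluation loop can probe specific algorithm weaknesses.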
Related papers
- A Benchmark for Maximum Cut: Towards Standardization of the Evaluation of Learned Heuristics for Combinatorial Optimization [12.016449555335976]
We propose an open-source benchmark suite MaxCut-Bench dedicated to the NP-hard Maximum Cut problem in both its weighted and unweighted variants.
We use the benchmark in an attempt to systematically corroborate or reproduce the results of several popular learning-based approaches.
Our results show that several of the learned heuristics fail to outperform a naive greedy algorithm, and that only one of them consistently outperforms Tabu Search.
arXiv Detail & Related papers (2024-06-14T19:44:23Z)
- GNBG: A Generalized and Configurable Benchmark Generator for Continuous Numerical Optimization [5.635586285644365]
It is crucial to use a benchmark test suite that encompasses a diverse range of problem instances with various characteristics.
Traditional benchmark suites often consist of numerous fixed test functions, making it challenging to align these with specific research objectives.
This paper introduces the Generalized Numerical Benchmark Generator (GNBG) for single-objective, box-constrained, continuous numerical optimization.
arXiv Detail & Related papers (2023-12-12T09:04:34Z)
- A Sequential Deep Learning Algorithm for Sampled Mixed-integer Optimisation Problems [0.3867363075280544]
We introduce and analyse two efficient algorithms for mixed-integer optimisation problems.
We show that both algorithms exhibit finite-time convergence towards the optimal solution.
We establish quantitatively the efficacy of these algorithms by means of three numerical tests.
arXiv Detail & Related papers (2023-01-25T17:10:52Z)
- Amortized Implicit Differentiation for Stochastic Bilevel Optimization [53.12363770169761]
We study a class of algorithms for solving bilevel optimization problems in both deterministic and stochastic settings.
We exploit a warm-start strategy to amortize the estimation of the exact gradient.
By using this framework, our analysis shows these algorithms to match the computational complexity of methods that have access to an unbiased estimate of the gradient.
arXiv Detail & Related papers (2021-11-29T15:10:09Z)
- Generating Large-scale Dynamic Optimization Problem Instances Using the Generalized Moving Peaks Benchmark [9.109331015600185]
This document describes the generalized moving peaks benchmark (GMPB) and how it can be used to generate problem instances for continuous large-scale dynamic optimization problems.
It presents a set of 15 benchmark problems, the relevant source code, and a performance indicator, designed for comparative studies and competitions in large-scale dynamic optimization.
arXiv Detail & Related papers (2021-07-23T03:57:50Z)
- Harnessing Heterogeneity: Learning from Decomposed Feedback in Bayesian Modeling [68.69431580852535]
We introduce a novel GP regression to incorporate the subgroup feedback.
Our modified regression has provably lower variance, and thus a more accurate posterior, compared to previous approaches.
We execute our algorithm on two disparate social problems.
arXiv Detail & Related papers (2021-07-07T03:57:22Z)
- Fractal Structure and Generalization Properties of Stochastic Optimization Algorithms [71.62575565990502]
We prove that the generalization error of an optimization algorithm can be bounded by the complexity of the fractal structure that underlies its generalization measure.
We further specialize our results to specific problems (e.g., linear/logistic regression, one-hidden-layer neural networks) and algorithms.
arXiv Detail & Related papers (2021-06-09T08:05:36Z)
- Efficient Methods for Structured Nonconvex-Nonconcave Min-Max Optimization [98.0595480384208]
We propose a generalization of the extragradient method which converges to a stationary point.
The algorithm applies not only to general $p$-normed spaces, but also to general $p$-dimensional vector spaces.
arXiv Detail & Related papers (2020-10-31T21:35:42Z)
- EOS: a Parallel, Self-Adaptive, Multi-Population Evolutionary Algorithm for Constrained Global Optimization [68.8204255655161]
EOS is a global optimization algorithm for constrained and unconstrained problems of real-valued variables.
It implements a number of improvements to the well-known Differential Evolution (DE) algorithm.
Results prove that EOS is capable of achieving increased performance compared to state-of-the-art single-population self-adaptive DE algorithms.
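The Differential Evolution baseline that EOS improves upon can be sketched in a few lines. The snippet below is a minimal DE/rand/1/bin loop on a box-constrained sphere function; it is NOT the EOS algorithm (no self-adaptation, no multi-population, no parallelism), and all parameter values are illustrative assumptions.

```python
import numpy as np

# Minimal DE/rand/1/bin sketch for a box-constrained problem.
# Illustrative only: classic DE, not EOS; F, CR, pop_size are assumed values.

rng = np.random.default_rng(1)

def de_rand_1_bin(f, lower, upper, dim, pop_size=30, F=0.8, CR=0.9, gens=200):
    pop = rng.uniform(lower, upper, (pop_size, dim))
    fit = np.array([f(x) for x in pop])
    for _ in range(gens):
        for i in range(pop_size):
            # Mutation: combine three distinct individuals other than i.
            idx = rng.choice([j for j in range(pop_size) if j != i], 3, replace=False)
            a, b, c = pop[idx]
            mutant = np.clip(a + F * (b - c), lower, upper)
            # Binomial crossover with at least one gene taken from the mutant.
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True
            trial = np.where(cross, mutant, pop[i])
            # Greedy selection: keep the trial if it is no worse.
            ft = f(trial)
            if ft <= fit[i]:
                pop[i], fit[i] = trial, ft
    best = int(np.argmin(fit))
    return pop[best], float(fit[best])

sphere = lambda x: float(np.sum(x ** 2))
x_best, f_best = de_rand_1_bin(sphere, -5.0, 5.0, dim=10)
print(f"best sphere value: {f_best:.6f}")
```

Self-adaptive DE variants such as those EOS is compared against additionally adapt F and CR online rather than fixing them.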
arXiv Detail & Related papers (2020-07-09T10:19:22Z)
- Total Deep Variation: A Stable Regularizer for Inverse Problems [71.90933869570914]
We introduce the data-driven general-purpose total deep variation regularizer.
In its core, a convolutional neural network extracts local features on multiple scales and in successive blocks.
We achieve state-of-the-art results for numerous imaging tasks.
arXiv Detail & Related papers (2020-06-15T21:54:15Z)
- Scalable and Customizable Benchmark Problems for Many-Objective Optimization [0.0]
We propose a parameterized generator of scalable and customizable benchmark problems for many-objective problems (MaOPs)
It is able to generate problems that reproduce features present in other benchmarks and also problems with some new features.
arXiv Detail & Related papers (2020-01-26T12:39:51Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.