Generating Large-scale Dynamic Optimization Problem Instances Using the
Generalized Moving Peaks Benchmark
- URL: http://arxiv.org/abs/2107.11019v1
- Date: Fri, 23 Jul 2021 03:57:50 GMT
- Title: Generating Large-scale Dynamic Optimization Problem Instances Using the
Generalized Moving Peaks Benchmark
- Authors: Mohammad Nabi Omidvar, Danial Yazdani, Juergen Branke, Xiaodong Li,
Shengxiang Yang, Xin Yao
- Abstract summary: This document describes the generalized moving peaks benchmark (GMPB) and how it can be used to generate problem instances for continuous large-scale dynamic optimization problems.
It presents a set of 15 benchmark problems, the relevant source code, and a performance indicator, designed for comparative studies and competitions in large-scale dynamic optimization.
- Score: 9.109331015600185
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: This document describes the generalized moving peaks benchmark (GMPB) and how
it can be used to generate problem instances for continuous large-scale dynamic
optimization problems. It presents a set of 15 benchmark problems, the relevant
source code, and a performance indicator, designed for comparative studies and
competitions in large-scale dynamic optimization. Although its primary purpose
is to provide a coherent basis for running competitions, its generality allows
the interested reader to use this document as a guide to design customized
problem instances to investigate issues beyond the scope of the presented
benchmark suite. To this end, we explain the modular structure of the GMPB and
how its constituents can be assembled to form problem instances with a variety
of controllable characteristics ranging from unimodal to highly multimodal,
symmetric to highly asymmetric, smooth to highly irregular, and various degrees
of variable interaction and ill-conditioning.
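To make the modular structure concrete, below is a minimal sketch of a GMPB-style landscape in Python/NumPy. It assumes the max-over-peaks form suggested by the abstract, where each peak has a center, a height, a diagonal width matrix (unequal widths yield ill-conditioning), a rotation matrix (inducing variable interaction), and an element-wise irregularity transform (trading smoothness for ruggedness). All names, parameter values, and the exact composition order are illustrative assumptions, not the paper's equations verbatim.

import numpy as np

def irregularity_transform(y, tau=0.05, etas=(10.0, 20.0, 10.0, 20.0)):
    """Element-wise irregularity transform (illustrative).

    tau controls the distortion strength and etas its frequencies;
    tau = 0 leaves the input unchanged, giving a smooth peak.
    """
    out = np.zeros_like(y, dtype=float)
    pos, neg = y > 0, y < 0
    lp = np.log(y[pos])
    out[pos] = np.exp(lp + tau * (np.sin(etas[0] * lp) + np.sin(etas[1] * lp)))
    ln = np.log(-y[neg])
    out[neg] = -np.exp(ln + tau * (np.sin(etas[2] * ln) + np.sin(etas[3] * ln)))
    return out

def gmpb_peak(x, center, height, widths, rotation, tau=0.0):
    """Value of a single cone-like peak at x (hypothetical form).

    widths: per-dimension scaling; unequal entries -> ill-conditioning.
    rotation: orthogonal matrix; non-identity -> variable interaction.
    """
    y = irregularity_transform(rotation @ (x - center), tau=tau)
    return height - np.sqrt(y @ np.diag(widths) @ y)

def gmpb(x, peaks):
    """Landscape value: the upper envelope (max) over all peaks."""
    return max(gmpb_peak(x, **p) for p in peaks)

# Example: a 2-D instance with two peaks; adding peaks raises multimodality.
rng = np.random.default_rng(0)
q, _ = np.linalg.qr(rng.standard_normal((2, 2)))  # random rotation matrix
peaks = [
    dict(center=np.array([1.0, -2.0]), height=50.0,
         widths=np.array([1.0, 10.0]), rotation=q, tau=0.05),
    dict(center=np.array([-3.0, 4.0]), height=40.0,
         widths=np.array([2.0, 2.0]), rotation=np.eye(2), tau=0.0),
]
print(gmpb(np.array([0.5, -1.5]), peaks))

Re-sampling centers, heights, and widths at each environmental change would turn such a static instance into the dynamic problems the benchmark targets; scaling the dimensionality and peak count yields the large-scale variants.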
Related papers
- LLaMA-Berry: Pairwise Optimization for O1-like Olympiad-Level Mathematical Reasoning [56.273799410256075]
The framework combines Monte Carlo Tree Search (MCTS) with iterative Self-Refine to optimize the reasoning path.
The framework has been tested on general and advanced benchmarks, showing superior performance in terms of search efficiency and problem-solving capability.
arXiv Detail & Related papers (2024-10-03T18:12:29Z)
- MG-Net: Learn to Customize QAOA with Circuit Depth Awareness [51.78425545377329]
Quantum Approximate Optimization Algorithm (QAOA) and its variants exhibit immense potential in tackling optimization challenges.
The requisite circuit depth for satisfactory performance is problem-specific and often exceeds the maximum capability of current quantum devices.
We introduce the Mixer Generator Network (MG-Net), a unified deep learning framework adept at dynamically formulating optimal mixer Hamiltonians.
arXiv Detail & Related papers (2024-09-27T12:28:18Z)
- GNBG: A Generalized and Configurable Benchmark Generator for Continuous Numerical Optimization [5.635586285644365]
It is crucial to use a benchmark test suite that encompasses a diverse range of problem instances with various characteristics.
Traditional benchmark suites often consist of numerous fixed test functions, making it challenging to align these with specific research objectives.
This paper introduces the Generalized Numerical Benchmark Generator (GNBG) for single-objective, box-constrained, continuous numerical optimization.
arXiv Detail & Related papers (2023-12-12T09:04:34Z)
- GNBG-Generated Test Suite for Box-Constrained Numerical Global Optimization [5.804807909435654]
This document introduces a set of 24 box-constrained numerical global optimization problem instances.
Cases cover a broad spectrum of problem features, including varying degrees of modality, ruggedness, symmetry, conditioning, variable interaction structures, basin linearity, and deceptiveness.
arXiv Detail & Related papers (2023-12-12T07:40:12Z)
- RGM: A Robust Generalizable Matching Model [49.60975442871967]
We propose a deep model for sparse and dense matching, termed RGM (Robust Generalist Matching).
To narrow the gap between synthetic training samples and real-world scenarios, we build a new, large-scale dataset with sparse correspondence ground truth.
We are able to mix up various dense and sparse matching datasets, significantly improving the training diversity.
arXiv Detail & Related papers (2023-10-18T07:30:08Z)
- A Scalable Test Problem Generator for Sequential Transfer Optimization [32.171233314036286]
Sequential transfer optimization (STO) aims to improve the optimization performance on a task of interest by exploiting previously-solved optimization tasks stored in a database.
Existing test problems are either simply generated by assembling other benchmark functions or extended from specific practical problems with limited scalability.
In this study, we first introduce four concepts for characterizing STO problems and present an important problem feature, namely similarity distribution.
arXiv Detail & Related papers (2023-04-17T06:48:07Z)
- Backpropagation of Unrolled Solvers with Folded Optimization [55.04219793298687]
The integration of constrained optimization models as components in deep networks has led to promising advances on many specialized learning tasks.
One typical strategy is algorithm unrolling, which relies on automatic differentiation through the operations of an iterative solver.
This paper provides theoretical insights into the backward pass of unrolled optimization, leading to a system for generating efficiently solvable analytical models of backpropagation.
arXiv Detail & Related papers (2023-01-28T01:50:42Z)
- Multi-Objective Constrained Optimization for Energy Applications via Tree Ensembles [55.23285485923913]
Energy systems optimization problems are complex due to strongly non-linear system behavior and multiple competing objectives.
In some cases, proposed optimal solutions need to obey explicit input constraints related to physical properties or safety-critical operating conditions.
This paper proposes a novel data-driven strategy using tree ensembles for constrained multi-objective optimization of black-box problems.
arXiv Detail & Related papers (2021-11-04T20:18:55Z)
- Competition on Dynamic Optimization Problems Generated by Generalized Moving Peaks Benchmark (GMPB) [5.1812733319583915]
This document introduces the Generalized Moving Peaks Benchmark (GMPB).
GMPB is adept at generating landscapes with a broad spectrum of characteristics.
This document delves into the intricacies of GMPB, detailing the myriad ways in which its parameters can be tuned to produce these diverse landscape characteristics.
arXiv Detail & Related papers (2021-06-11T05:31:01Z)
- Joint Continuous and Discrete Model Selection via Submodularity [1.332560004325655]
In model selection problems for machine learning, the desire for a well-performing model with meaningful structure is typically expressed through a regularized optimization problem.
In many scenarios, however, meaningful structure is specified in some discrete space, leading to difficult nonconvex optimization problems.
We show how simple continuous or discrete constraints can also be handled for certain problem classes, motivated by robust optimization.
arXiv Detail & Related papers (2021-02-17T21:14:47Z)
- Posterior Differential Regularization with f-divergence for Improving Model Robustness [95.05725916287376]
We focus on methods that regularize the model posterior difference between clean and noisy inputs.
We generalize the posterior differential regularization to the family of $f$-divergences.
Our experiments show that regularizing the posterior differential with $f$-divergence can substantially improve model robustness.
arXiv Detail & Related papers (2020-10-23T19:58:01Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences of its use.