Competition on Dynamic Optimization Problems Generated by Generalized
Moving Peaks Benchmark (GMPB)
- URL: http://arxiv.org/abs/2106.06174v3
- Date: Wed, 13 Dec 2023 09:16:47 GMT
- Title: Competition on Dynamic Optimization Problems Generated by Generalized
Moving Peaks Benchmark (GMPB)
- Authors: Danial Yazdani (1), Michalis Mavrovouniotis (2), Changhe Li (3),
Wenjian Luo (4), Mohammad Nabi Omidvar (5), Amir H. Gandomi (6), Trung Thanh
Nguyen (7), Juergen Branke (8), Xiaodong Li (9), Shengxiang Yang (10), and
Xin Yao (11) ((1) Faculty of Engineering & Information Technology, University
of Technology Sydney,(2) ERATOSTHENES Centre of Excellence, (3) School of
Automation, China University of Geosciences, (4) Guangdong Provincial Key
Laboratory of Novel Security Intelligence Technologies, School of Computer
Science and Technology, Harbin Institute of Technology and Peng Cheng
Laboratory, (5) School of Computing, University of Leeds, and Leeds
University Business School, (6) Faculty of Engineering & Information
Technology, University of Technology Sydney and University Research and
Innovation Center (EKIK), Obuda University, (7) Liverpool Logistics, Offshore
and Marine (LOOM) Research Institute, Faculty of Engineering and Technology,
School of Engineering, Liverpool John Moores University, (8) Warwick Business
School, University of Warwick, (9) School of Science (Computer Science), RMIT
University, (10) Center for Computational Intelligence (CCI), School of
Computer Science and Informatics, De Montfort University, (11) Research
Institute of Trustworthy Autonomous Systems (RITAS), and Guangdong Provincial
Key Laboratory of Brain-inspired Intelligent Computation, Department of
Computer Science and Engineering, Southern University of Science and
Technology, and CERCIA, School of Computer Science, University of Birmingham)
- Abstract summary: This document introduces the Generalized Moving Peaks Benchmark (GMPB).
GMPB is adept at generating landscapes with a broad spectrum of characteristics.
This document delves into the intricacies of GMPB, detailing the myriad ways in which its parameters can be tuned to produce these diverse landscape characteristics.
- Score: 5.1812733319583915
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: This document introduces the Generalized Moving Peaks Benchmark (GMPB), a
tool for generating continuous dynamic optimization problem instances that is
used for the CEC 2024 Competition on Dynamic Optimization. GMPB is adept at
generating landscapes with a broad spectrum of characteristics, offering
everything from unimodal to highly multimodal landscapes and ranging from
symmetric to highly asymmetric configurations. The landscapes also vary in
texture, from smooth to highly irregular surfaces, encompassing diverse degrees
of variable interaction and conditioning. This document delves into the
intricacies of GMPB, detailing the myriad ways in which its parameters can be
tuned to produce these diverse landscape characteristics. GMPB's MATLAB
implementation is available on the EDOLAB Platform.
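For orientation, here is a minimal Python sketch (not the official MATLAB implementation on the EDOLAB Platform) of how a GMPB-style landscape is typically evaluated, assuming the commonly published form: the fitness at a point is the maximum over several peaks, each defined by a height, a centre, a rotation matrix, a diagonal width matrix, and an element-wise irregularity transform whose parameters (tau and the omega frequencies below) control asymmetry and surface roughness. All parameter names and values here are illustrative assumptions, not the CEC 2024 competition settings.

```python
import numpy as np

def irregularity_transform(y, tau=0.1, omegas=(5.0, 10.0, 15.0, 20.0)):
    # Element-wise transform T(.) that injects irregularity and asymmetry;
    # tau = 0 recovers a smooth, symmetric cone around each peak.
    out = np.zeros_like(y)
    pos, neg = y > 0, y < 0
    lp = np.log(y[pos])
    out[pos] = np.exp(lp + tau * (np.sin(omegas[0] * lp) + np.sin(omegas[1] * lp)))
    ln = np.log(-y[neg])
    out[neg] = -np.exp(ln + tau * (np.sin(omegas[2] * ln) + np.sin(omegas[3] * ln)))
    return out

def gmpb_like_fitness(x, centers, heights, rotations, width_matrices, tau=0.1):
    # Fitness = max over peaks of h_k - sqrt(T(x - c_k)^T R_k^T W_k R_k T(x - c_k)),
    # where the diagonal width matrix W_k controls conditioning per dimension.
    values = []
    for c, h, R, W in zip(centers, heights, rotations, width_matrices):
        t = irregularity_transform(np.asarray(x, dtype=float) - c, tau=tau)
        values.append(h - np.sqrt(t @ R.T @ W @ R @ t))
    return max(values)

# Toy 2-D instance with two peaks; unequal diagonal widths yield ill-conditioning.
centers = [np.array([0.0, 0.0]), np.array([20.0, -15.0])]
heights = [50.0, 40.0]
rotations = [np.eye(2), np.eye(2)]
width_matrices = [np.eye(2), np.diag([1.0, 16.0])]
print(gmpb_like_fitness([1.0, 2.0], centers, heights, rotations, width_matrices))
```

In this sketch, adding peaks yields multimodality, increasing tau roughens the surface, non-identity rotations introduce variable interaction, and spreading the diagonal entries of W_k increases conditioning.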
Related papers
- Margin Matching Preference Optimization: Enhanced Model Alignment with Granular Feedback [64.67540769692074]
Large language models (LLMs) fine-tuned with alignment techniques, such as reinforcement learning from human feedback, have been instrumental in developing some of the most capable AI systems to date.
We introduce an approach called Margin Matching Preference Optimization (MMPO), which incorporates relative quality margins into optimization, leading to improved LLM policies and reward models.
Experiments with both human and AI feedback data demonstrate that MMPO consistently outperforms baseline methods, often by a substantial margin, on popular benchmarks including MT-bench and RewardBench.
arXiv Detail & Related papers (2024-10-04T04:56:11Z) - Generalized Preference Optimization: A Unified Approach to Offline Alignment [54.97015778517253]
We propose generalized preference optimization (GPO), a family of offline losses parameterized by a general class of convex functions.
GPO enables a unified view over preference optimization, encompassing existing algorithms such as DPO, IPO and SLiC as special cases.
Our results present new algorithmic toolkits and empirical insights to alignment practitioners.
arXiv Detail & Related papers (2024-02-08T15:33:09Z) - 360 Layout Estimation via Orthogonal Planes Disentanglement and Multi-view Geometric Consistency Perception [56.84921040837699]
Existing panoramic layout estimation solutions tend to recover room boundaries from a vertically compressed sequence, yielding imprecise results.
We propose an orthogonal plane disentanglement network (termed DOPNet) to distinguish ambiguous semantics.
We also present an unsupervised adaptation technique tailored for horizon-depth and ratio representations.
Our solution outperforms other SoTA models on both monocular layout estimation and multi-view layout estimation tasks.
arXiv Detail & Related papers (2023-12-26T12:16:03Z) - Parameter Efficient Fine-tuning via Cross Block Orchestration for Segment Anything Model [81.55141188169621]
We equip PEFT with a cross-block orchestration mechanism to enable the adaptation of the Segment Anything Model (SAM) to various downstream scenarios.
We propose an intra-block enhancement module, which introduces a linear projection head whose weights are generated from a hyper-complex layer.
Our proposed approach consistently improves the segmentation performance significantly on novel scenarios with only around 1K additional parameters.
arXiv Detail & Related papers (2023-11-28T11:23:34Z) - Global Optimization: A Machine Learning Approach [7.052596485478637]
Bertsimas and Ozturk (2023) proposed OCTHaGOn as a way of solving black-box global optimization problems.
We provide extensions to this approach by approximating the original problem using other MIO-representable ML models.
We show improvements in solution feasibility and optimality in the majority of instances.
arXiv Detail & Related papers (2023-11-03T06:33:38Z) - SIGMA: Scale-Invariant Global Sparse Shape Matching [50.385414715675076]
We propose a novel mixed-integer programming (MIP) formulation for generating precise sparse correspondences for non-rigid shapes.
We show state-of-the-art results for sparse non-rigid matching on several challenging 3D datasets.
arXiv Detail & Related papers (2023-08-16T14:25:30Z) - MA-BBOB: Many-Affine Combinations of BBOB Functions for Evaluating
AutoML Approaches in Noiseless Numerical Black-Box Optimization Contexts [0.8258451067861933]
(MA-)BBOB is built on the publicly available IOHprofiler platform.
It provides access to the interactive IOHanalyzer module for performance analysis and visualization, and enables comparisons with the rich and growing data collection available for the (MA-)BBOB functions.
arXiv Detail & Related papers (2023-06-18T19:32:12Z) - A Pareto-optimal compositional energy-based model for sampling and
optimization of protein sequences [55.25331349436895]
Deep generative models have emerged as a popular machine learning-based approach for inverse problems in the life sciences.
These problems often require sampling new designs that satisfy multiple properties of interest in addition to learning the data distribution.
arXiv Detail & Related papers (2022-10-19T19:04:45Z) - Hybrid Parameter Search and Dynamic Model Selection for Mixed-Variable
Bayesian Optimization [6.204805504959941]
We propose a new type of hybrid model for Bayesian optimization (BO) adept at managing mixed variables.
Our proposed new hybrid models (named hybridM) merge the Monte Carlo Tree Search structure (MCTS) for categorical variables with Gaussian Processes (GP) for continuous ones.
Our innovations, including dynamic online kernel selection in the surrogate modeling phase, position our hybrid models as an advancement in mixed-variable surrogate models.
arXiv Detail & Related papers (2022-06-03T06:34:09Z) - Explicitly Multi-Modal Benchmarks for Multi-Objective Optimization [1.9282110216621833]
We introduce a benchmarking approach based on basin connectivity (3BC) that uses basins of attraction.
The 3BC allows for the specification of a multimodal landscape through a kind of topological analysis called the basin graph.
arXiv Detail & Related papers (2021-10-07T05:51:32Z) - Generating Large-scale Dynamic Optimization Problem Instances Using the
Generalized Moving Peaks Benchmark [9.109331015600185]
This document describes the generalized moving peaks benchmark (GMPB) and how it can be used to generate problem instances for continuous large-scale dynamic optimization problems.
It presents a set of 15 benchmark problems, the relevant source code, and a performance indicator, designed for comparative studies and competitions in large-scale dynamic optimization.
arXiv Detail & Related papers (2021-07-23T03:57:50Z)
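The last entry above mentions a performance indicator designed for comparative studies and competitions in dynamic optimization. As a rough illustration (not the official scoring code), a best-error-before-change style indicator can be computed as follows, assuming the optimum value of each environment is known to the benchmark:

```python
import numpy as np

def best_error_before_change(best_found_per_env, optimum_per_env):
    # For each environment, the error is f(optimum) minus the fitness of the best
    # solution found just before the environment changes; the indicator is the
    # mean of these errors over all environments.
    errors = np.asarray(optimum_per_env, dtype=float) - np.asarray(best_found_per_env, dtype=float)
    return errors.mean()

# Toy usage: three environments with known optimum values and the best fitness
# an algorithm reached right before each change.
print(best_error_before_change(best_found_per_env=[48.7, 61.2, 55.0],
                               optimum_per_env=[50.0, 63.0, 55.5]))
```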