PlatMetaX: An Integrated MATLAB platform for Meta-Black-Box Optimization
- URL: http://arxiv.org/abs/2503.22722v1
- Date: Wed, 26 Mar 2025 02:03:35 GMT
- Title: PlatMetaX: An Integrated MATLAB platform for Meta-Black-Box Optimization
- Authors: Xu Yang, Rui Wang, Kaiwen Li, Wenhua Li, Tao Zhang, Fujun He,
- Abstract summary: We present PlatMetaX, a novel platform that integrates the strengths of MetaBox and PlatEMO, offering a comprehensive framework for developing, evaluating, and comparing optimization algorithms.
- Score: 13.141855165689448
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The landscape of optimization problems has become increasingly complex, necessitating the development of advanced optimization techniques. Meta-Black-Box Optimization (MetaBBO), which involves refining the optimization algorithms themselves via meta-learning, has emerged as a promising approach. Recognizing the limitations in existing platforms, we present PlatMetaX, a novel MATLAB platform for MetaBBO with reinforcement learning. PlatMetaX integrates the strengths of MetaBox and PlatEMO, offering a comprehensive framework for developing, evaluating, and comparing optimization algorithms. The platform is designed to handle a wide range of optimization problems, from single-objective to multi-objective, and is equipped with a rich set of baseline algorithms and evaluation metrics. We demonstrate the utility of PlatMetaX through extensive experiments and provide insights into its design and implementation. PlatMetaX is available at: https://github.com/Yxxx616/PlatMetaX.
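To make the MetaBBO-RL setting concrete, the sketch below shows the two-level control loop the abstract describes: a meta-level policy (trainable by reinforcement learning) observes the state of a base-level black-box optimizer and adjusts its configuration each generation. This is a minimal Python sketch under our own assumptions, not PlatMetaX's actual MATLAB API; every class, function, and parameter name here is hypothetical.

import numpy as np

class BaseOptimizer:
    """Toy base-level black-box optimizer: a (mu, lambda)-style evolution
    strategy whose mutation step size is exposed to the meta level."""
    def __init__(self, dim, pop_size=20):
        self.pop = np.random.uniform(-5, 5, (pop_size, dim))
        self.sigma = 0.5  # step size, the knob the meta-level agent controls

    def state(self, fitness):
        # Low-dimensional features the meta-level policy conditions on.
        return np.array([fitness.mean(), fitness.std(), self.sigma])

    def step(self, objective, action):
        self.sigma = float(np.clip(action, 1e-3, 2.0))  # apply meta decision
        parents = np.repeat(self.pop, 2, axis=0)        # lambda = 2 * mu
        offspring = parents + self.sigma * np.random.randn(*parents.shape)
        fitness = np.apply_along_axis(objective, 1, offspring)
        best = np.argsort(fitness)[: len(self.pop)]     # truncation selection
        self.pop = offspring[best]
        return fitness[best]

def meta_rollout(policy, objective, dim=10, generations=100):
    """One meta-episode: the policy steers the optimizer on one problem;
    the final best fitness can serve as the reward for training the policy."""
    opt = BaseOptimizer(dim)
    fitness = np.apply_along_axis(objective, 1, opt.pop)
    for _ in range(generations):
        action = policy(opt.state(fitness))    # meta level: pick configuration
        fitness = opt.step(objective, action)  # base level: one generation
    return fitness.min()

# Usage with a fixed (untrained) policy on the sphere function.
sphere = lambda x: float(np.sum(x ** 2))
print(meta_rollout(lambda s: 0.3, sphere))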
Related papers
- Toward Automated Algorithm Design: A Survey and Practical Guide to Meta-Black-Box-Optimization [22.902923118981857]
We introduce Meta-Black-Box-Optimization (MetaBBO) as an emerging avenue within the Evolutionary Computation (EC) community.
Despite the success of MetaBBO, the current literature provides insufficient summaries of its key aspects and lacks practical guidance for implementation.
arXiv Detail & Related papers (2024-11-01T14:32:19Z) - Symbol: Generating Flexible Black-Box Optimizers through Symbolic Equation Learning [16.338146844605404]
We present Symbol, a framework that promotes the automated discovery of black-box optimizers through symbolic equation learning.
Specifically, we propose a Symbolic Equation Generator (SEG) that allows closed-form optimization rules to be dynamically generated.
Extensive experiments reveal that the optimizers generated by Symbol not only surpass the state-of-the-art BBO and MetaBBO baselines, but also exhibit exceptional zero-shot generalization abilities.
arXiv Detail & Related papers (2024-02-04T05:41:27Z) - Contextual Stochastic Bilevel Optimization [50.36775806399861]
We introduce contextual stochastic bilevel optimization (CSBO) -- a bilevel optimization framework in which the lower-level problem minimizes an expectation conditioned on some contextual information and on the upper-level variable (a generic formulation is sketched after this list).
It is important for applications such as meta-learning, personalized learning, end-to-end learning, and Wasserstein distributionally robust optimization with side information (WDRO-SI).
arXiv Detail & Related papers (2023-10-27T23:24:37Z) - MetaBox: A Benchmark Platform for Meta-Black-Box Optimization with Reinforcement Learning [25.687304354503148]
We introduce MetaBox, the first benchmark platform specifically tailored for developing and evaluating MetaBBO-RL methods.
MetaBox offers a flexible algorithmic template that allows users to effortlessly implement their unique designs within the platform.
It provides a broad spectrum of over 300 problem instances, ranging from synthetic to realistic scenarios, and an extensive library of 19 baseline methods.
arXiv Detail & Related papers (2023-10-12T11:55:17Z) - DAC-MR: Data Augmentation Consistency Based Meta-Regularization for Meta-Learning [55.733193075728096]
We propose a meta-knowledge informed meta-learning (MKIML) framework to improve meta-learning.
We preliminarily integrate meta-knowledge into the meta-objective via an appropriate meta-regularization (MR) objective.
The proposed DAC-MR is expected to learn well-performing meta-models from training tasks with noisy, sparse or unavailable meta-data.
arXiv Detail & Related papers (2023-05-13T11:01:47Z) - Meta Mirror Descent: Optimiser Learning for Fast Convergence [85.98034682899855]
We take a different perspective, starting from mirror descent rather than gradient descent and meta-learning the corresponding Bregman divergence (the underlying update is sketched after this list).
Within this paradigm, we formalise a novel meta-learning objective of minimising the regret bound of learning.
Unlike many meta-learned optimisers, it also supports convergence and generalisation guarantees and uniquely does so without requiring validation data.
arXiv Detail & Related papers (2022-03-05T11:41:13Z) - Meta-Learning with Neural Tangent Kernels [58.06951624702086]
We propose the first meta-learning paradigm in the Reproducing Kernel Hilbert Space (RKHS) induced by the meta-model's Neural Tangent Kernel (NTK).
Within this paradigm, we introduce two meta-learning algorithms, which no longer need a sub-optimal iterative inner-loop adaptation as in the MAML framework.
We achieve this goal by 1) replacing the adaptation with a fast-adaptive regularizer in the RKHS; and 2) solving the adaptation analytically based on the NTK theory.
arXiv Detail & Related papers (2021-02-07T20:53:23Z) - MetaMix: Improved Meta-Learning with Interpolation-based Consistency Regularization [14.531741503372764]
We propose MetaMix, an approach that generates virtual feature-target pairs within each episode to regularize the backbone models (a minimal sketch of the interpolation follows this list).
It can be integrated with any MAML-based algorithm and learns decision boundaries that generalize better to new tasks.
arXiv Detail & Related papers (2020-09-29T02:44:13Z) - BOML: A Modularized Bilevel Optimization Library in Python for Meta Learning [52.90643948602659]
BOML is a modularized optimization library that unifies several meta-learning algorithms into a common bilevel optimization framework.
It provides a hierarchical optimization pipeline together with a variety of iteration modules, which can be used to solve the mainstream categories of meta-learning methods.
arXiv Detail & Related papers (2020-09-28T14:21:55Z) - On the Global Optimality of Model-Agnostic Meta-Learning [133.16370011229776]
Model-agnostic meta-learning (MAML) formulates meta-learning as a bilevel optimization problem, where the inner level solves each subtask based on a shared prior (the bilevel formulation is sketched after this list).
We characterize the optimality of the stationary points attained by MAML for both reinforcement learning and supervised learning, where the inner-level and outer-level problems are solved via first-order optimization methods.
arXiv Detail & Related papers (2020-06-23T17:33:14Z)
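The MAML entry directly above describes a bilevel structure; written out in standard textbook form (our notation, not text from the paper), the one-step-adaptation variant reads:

% Bilevel formulation of MAML (standard one-step form; notation is ours):
%   theta: shared meta-parameters; T_i: task i; alpha: inner-loop step size.
\begin{align}
\min_{\theta} \quad & \sum_{i} \mathcal{L}_{\mathcal{T}_i}\bigl(\phi_i(\theta)\bigr)
    && \text{(outer level: meta-objective over tasks)} \\
\text{s.t.} \quad & \phi_i(\theta) = \theta - \alpha \,\nabla_{\theta}\,\mathcal{L}_{\mathcal{T}_i}(\theta)
    && \text{(inner level: per-task adaptation)}
\end{align}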
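The Contextual Stochastic Bilevel Optimization entry above can likewise be made concrete. A generic way to write a CSBO problem, hedged as our own notation rather than the paper's, is:

% Generic CSBO problem (notation is ours, not the paper's):
%   x: upper-level variable; xi: observed context; y*(x, xi): lower-level solution.
\begin{align}
\min_{x} \quad & \mathbb{E}_{\xi}\Bigl[ f\bigl(x,\, y^{*}(x,\xi),\, \xi\bigr) \Bigr] \\
\text{s.t.} \quad & y^{*}(x,\xi) \in \arg\min_{y}\; \mathbb{E}\bigl[\, g(x, y, \eta) \mid \xi \,\bigr]
\end{align}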
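For the Meta Mirror Descent entry, the object being meta-learned is the Bregman divergence appearing in the standard mirror-descent update; the update itself (textbook form, our notation) is:

% Mirror descent with Bregman divergence D_psi (textbook form):
%   psi: mirror map; eta: step size; meta-learning psi learns the geometry.
\begin{align}
x_{t+1} &= \arg\min_{x}\; \eta \,\langle \nabla f(x_t),\, x \rangle + D_{\psi}(x, x_t), \\
D_{\psi}(x, y) &= \psi(x) - \psi(y) - \langle \nabla \psi(y),\, x - y \rangle .
\end{align}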
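Finally, for the MetaMix entry, the core mechanism is mixup-style convex interpolation of feature-target pairs within an episode. The snippet below is a minimal, hypothetical Python sketch of that idea; all names are ours, not the authors' code.

import numpy as np

def metamix_pairs(features, targets, alpha=0.5, rng=None):
    """Mixup-style virtual feature-target pairs within one episode
    (hypothetical sketch of the MetaMix mechanism, not the authors' code)."""
    if rng is None:
        rng = np.random.default_rng()
    lam = rng.beta(alpha, alpha)            # interpolation coefficient
    perm = rng.permutation(len(features))   # random pairing within the episode
    mixed_x = lam * features + (1 - lam) * features[perm]
    mixed_y = lam * targets + (1 - lam) * targets[perm]
    return mixed_x, mixed_y                 # extra regularization data

# Usage on toy episode data: 8 examples, 16-dim features, 4 classes (one-hot).
x = np.random.randn(8, 16)
y = np.eye(4)[np.random.randint(0, 4, size=8)]
vx, vy = metamix_pairs(x, y)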