Robustness and Invariance of Hybrid Metaheuristics under Objective Function Transformations
- URL: http://arxiv.org/abs/2509.05445v1
- Date: Fri, 05 Sep 2025 18:55:33 GMT
- Title: Robustness and Invariance of Hybrid Metaheuristics under Objective Function Transformations
- Authors: Grzegorz Sroka, Sławomir T. Wierzchoń
- Abstract summary: This paper evaluates the robustness and structural invariance of hybrid population-based metaheuristics under various objective space transformations. A lightweight plug-and-play hybridization operator is applied to nineteen state-of-the-art algorithms.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This paper evaluates the robustness and structural invariance of hybrid population-based metaheuristics under various objective space transformations. A lightweight plug-and-play hybridization operator is applied to nineteen state-of-the-art algorithms, including differential evolution (DE), particle swarm optimization (PSO), and recent bio-inspired methods, without modifying their internal logic. Benchmarking on the CEC-2017 suite across four dimensions (10, 30, 50, 100) is performed under five transformation types: baseline, translation, scaling, rotation, and constant shift. Statistical comparisons based on Wilcoxon and Friedman tests, Bayesian dominance analysis, and convergence trajectory profiling consistently show that differential-based hybrids (e.g., hIMODE, hSHADE, hDMSSA) maintain high accuracy, stability, and invariance under all tested deformations. In contrast, classical algorithms, especially PSO- and HHO-based variants, exhibit significant performance degradation under non-separable or distorted landscapes. The findings confirm the superiority of adaptive, structurally resilient hybrids for real-world optimization tasks subject to domain-specific transformations.
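The five transformation types named in the abstract can be sketched in a few lines. This is a minimal illustration, not the paper's benchmark code: the base function (a sphere) and the wrapper names are assumptions chosen for clarity.

```python
import numpy as np

def sphere(x):
    """Baseline objective: f(x) = sum(x_i^2). Illustrative stand-in for a benchmark function."""
    return float(np.sum(x ** 2))

def make_transformed(f, dim, rng):
    """Build the five objective-space variants: baseline, translation,
    scaling, rotation, and constant shift (names assumed for illustration)."""
    shift = rng.uniform(-1.0, 1.0, dim)          # translation vector in search space
    scale = rng.uniform(0.5, 2.0, dim)           # per-coordinate scaling factors
    Q, _ = np.linalg.qr(rng.standard_normal((dim, dim)))  # random orthogonal (rotation) matrix
    bias = 100.0                                 # constant shift added to f-values
    return {
        "baseline":    f,
        "translation": lambda x: f(x - shift),
        "scaling":     lambda x: f(scale * x),
        "rotation":    lambda x: f(Q @ x),
        "const_shift": lambda x: f(x) + bias,
    }

rng = np.random.default_rng(0)
variants = make_transformed(sphere, dim=10, rng=rng)
x = rng.standard_normal(10)
for name, g in sorted(variants.items()):
    print(f"{name:12s} f(x) = {g(x):.4f}")
```

Note that the sphere itself is rotation-invariant (an orthogonal `Q` preserves the norm), which is why rotation is only a meaningful stress test when applied to non-separable or ill-conditioned base functions, matching the abstract's observation that degradation shows up on "non-separable or distorted landscapes".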
Related papers
- LEVDA: Latent Ensemble Variational Data Assimilation via Differentiable Dynamics [6.953554594702111]
We propose Latent Ensemble Variational Data Assimilation (LEVDA), an ensemble-space variational smoother. It assimilates states and unknown parameters without the need for adjoint code or auxiliary observation-to-latent encoders. It substantially improves assimilation accuracy and computational efficiency compared to full-state 4DEnVar.
arXiv Detail & Related papers (2026-02-23T00:54:59Z) - Rethinking Diffusion Models with Symmetries through Canonicalization with Applications to Molecular Graph Generation [56.361076943802594]
CanonFlow achieves state-of-the-art performance on the challenging GEOM-DRUG dataset, and the advantage remains large in few-step generation.
arXiv Detail & Related papers (2026-02-16T18:58:55Z) - Variational Entropic Optimal Transport [67.76725267984578]
We propose Variational Entropic Optimal Transport (VarEOT) for domain translation problems. VarEOT is based on an exact variational reformulation of the log-partition $\log \mathbb{E}[\exp(\cdot)]$ as a tractable generalization over an auxiliary positive normalizer. Experiments on synthetic data and unpaired image-to-image translation demonstrate competitive or improved translation quality.
arXiv Detail & Related papers (2026-02-02T15:48:44Z) - Regime-Adaptive Bayesian Optimization via Dirichlet Process Mixtures of Gaussian Processes [14.367563771141592]
RAMBO is a Dirichlet Process Mixture of Gaussian Processes that automatically discovers latent regimes during optimization. We derive collapsed Gibbs sampling that analytically marginalizes latent functions for efficient inference. Our acquisition functions decompose uncertainty into intra-regime and inter-regime components.
arXiv Detail & Related papers (2026-01-27T20:45:50Z) - Neural Optimal Transport Meets Multivariate Conformal Prediction [58.43397908730771]
We propose a framework for conditional vector quantile regression (CVQR). CVQR combines neural optimal transport with quantile regression and applies it to multivariate conformal prediction.
arXiv Detail & Related papers (2025-09-29T19:50:19Z) - Sequential, Parallel and Consecutive Hybrid Evolutionary-Swarm Optimization Metaheuristics [43.05659890525653]
This paper explores hybrid evolutionary-swarm metaheuristics that combine the features of PSO and GA in a sequential, parallel and consecutive manner. The experimental results demonstrate that the hybrid approaches achieve superior convergence and consistency. The paper introduces a novel consecutive hybrid PSO-GA evolutionary algorithm that ensures continuity between PSO and GA steps through explicit information transfer mechanisms.
arXiv Detail & Related papers (2025-08-01T00:23:36Z) - EvoGrad: Metaheuristics in a Differentiable Wonderland [1.065497990128313]
Differentiable programming has revolutionised optimisation by enabling efficient gradient-based training of complex models. EvoGrad is a unified differentiable framework that integrates EC and SI with gradient-based optimisation. Our results show the substantial benefits of fully differentiable evolutionary and swarm optimisation.
arXiv Detail & Related papers (2025-05-28T15:42:07Z) - Accelerating Evolution: Integrating PSO Principles into Real-Coded Genetic Algorithm Crossover [2.854482269849925]
This study introduces an innovative crossover operator named Particle Swarm Optimization-inspired Crossover (PSOX). PSOX uniquely incorporates guidance from both the current global best solution and historical optimal solutions across multiple generations. This novel mechanism enables the algorithm to maintain population diversity while simultaneously accelerating convergence toward promising regions of the search space.
arXiv Detail & Related papers (2025-05-06T06:17:57Z) - Adaptive sparse variational approximations for Gaussian process regression [6.169364905804677]
We construct a variational approximation to a hierarchical Bayes procedure, and derive upper bounds for the contraction rate of the variational posterior. Our theoretical results are accompanied by numerical analysis both on synthetic and real world data sets.
arXiv Detail & Related papers (2025-04-04T09:57:00Z) - Structure Language Models for Protein Conformation Generation [66.42864253026053]
Traditional physics-based simulation methods often struggle with sampling equilibrium conformations. Deep generative models have shown promise in generating protein conformations as a more efficient alternative. We introduce Structure Language Modeling as a novel framework for efficient protein conformation generation.
arXiv Detail & Related papers (2024-10-24T03:38:51Z) - Joint State Estimation and Noise Identification Based on Variational Optimization [8.536356569523127]
A novel adaptive Kalman filter method based on conjugate-computation variational inference, referred to as CVIAKF, is proposed.
The effectiveness of CVIAKF is validated through synthetic and real-world datasets of maneuvering target tracking.
arXiv Detail & Related papers (2023-12-15T07:47:03Z) - Manifold Gaussian Variational Bayes on the Precision Matrix [70.44024861252554]
We propose an optimization algorithm for Variational Inference (VI) in complex models.
We develop an efficient algorithm for Gaussian Variational Inference whose updates satisfy the positive definite constraint on the variational covariance matrix.
Due to its black-box nature, MGVBP stands as a ready-to-use solution for VI in complex models.
arXiv Detail & Related papers (2022-10-26T10:12:31Z) - NAG-GS: Semi-Implicit, Accelerated and Robust Stochastic Optimizer [45.47667026025716]
We propose a novel, robust and accelerated iteration that relies on two key elements.
The convergence and stability of the obtained method, referred to as NAG-GS, are first studied extensively.
We show that NAG-GS is competitive with state-of-the-art methods such as momentum SGD with weight decay and AdamW for the training of machine learning models.
arXiv Detail & Related papers (2022-09-29T16:54:53Z) - Improving Covariance Conditioning of the SVD Meta-layer by Orthogonality [65.67315418971688]
Nearest Orthogonal Gradient (NOG) and Optimal Learning Rate (OLR) are proposed.
Experiments on visual recognition demonstrate that our methods can simultaneously improve the covariance conditioning and generalization.
arXiv Detail & Related papers (2022-07-05T15:39:29Z)