A many-objective evolutionary algorithm using indicator-driven weight vector optimization
- URL: http://arxiv.org/abs/2510.02709v1
- Date: Fri, 03 Oct 2025 04:14:24 GMT
- Title: A many-objective evolutionary algorithm using indicator-driven weight vector optimization
- Authors: Xiaojing Han, Yuanxin Li
- Abstract summary: This study proposes an adaptive many-objective evolutionary algorithm with a simplified hypervolume indicator. It synthesizes indicator assessment techniques with decomposition-based methods to facilitate self-adaptive and dynamic adjustment of the weight vectors. Experimental results demonstrate that the proposed algorithm is efficient and effective when compared with six state-of-the-art algorithms.
- Score: 7.849314686124955
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: For regular Pareto Fronts (PFs), such as those that are smooth, continuous, and uniformly distributed, fixed weight vectors are sufficient for decomposition-based multi-objective optimization approaches. However, for irregular PFs, including degenerate, disconnected, or inverted ones, fixed weight vectors often cause a non-uniform distribution of the solution sets or even poor optimization results. To address this issue, this study proposes an adaptive many-objective evolutionary algorithm with a simplified hypervolume indicator. It synthesizes indicator assessment techniques with decomposition-based methods to facilitate self-adaptive and dynamic adjustment of the weight vectors in many-objective optimization methods. Specifically, based on the MOEA/D framework, it uses a simplified hypervolume indicator to accurately assess the solution distribution. Simultaneously, the R2 indicator (as an approximation of the hypervolume) dynamically regulates the update frequency of the weight vectors. Experimental results demonstrate that the proposed algorithm is efficient and effective when compared with six state-of-the-art algorithms.
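The R2 indicator the abstract uses to approximate the hypervolume can be sketched as a weighted-Tchebycheff scalarization averaged over a weight-vector set. The snippet below is a minimal illustration of that idea, not the authors' implementation; the function name, array shapes, and use of the ideal point as reference are assumptions:

```python
import numpy as np

def r2_indicator(points, weights, ideal):
    """Unary R2 indicator with Tchebycheff scalarization (minimization).

    points  : (n, m) objective vectors of the solution set
    weights : (k, m) weight vectors
    ideal   : (m,)  ideal (reference) point

    Smaller values indicate a set closer to (and better spread along) the PF;
    ranking sets by R2 approximates ranking them by hypervolume.
    """
    diff = np.abs(points - ideal)                  # (n, m) distances to ideal
    cheb = np.max(weights[:, None, :] * diff[None, :, :], axis=2)  # (k, n)
    # For each weight vector, take the best solution, then average over weights.
    return float(np.mean(np.min(cheb, axis=1)))
```

A set dominated by another set receives a larger R2 value, which is what lets the algorithm use it as a cheap trigger for weight-vector updates.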
Related papers
- Multi-Dimensional Visual Data Recovery: Scale-Aware Tensor Modeling and Accelerated Randomized Computation [51.65236537605077]
We propose a new type of network compression optimization technique, fully randomized tensor network compression (FCTN). FCTN has significant advantages in correlation characterization and transpositional invariance in algebra, and has notable achievements in multi-dimensional data processing and analysis. We derive efficient algorithms with guarantees to solve the formulated models.
arXiv Detail & Related papers (2026-02-13T14:56:37Z) - Parallel Diffusion Solver via Residual Dirichlet Policy Optimization [88.7827307535107]
Diffusion models (DMs) have achieved state-of-the-art generative performance but suffer from high sampling latency due to their sequential denoising nature. Existing solver-based acceleration methods often face significant image quality degradation under a low-dimensional budget. We propose the Ensemble Parallel Direction solver (dubbed EPD-EPr), a novel ODE solver that mitigates these errors by incorporating multiple parallel gradient evaluations in each step.
arXiv Detail & Related papers (2025-12-28T05:48:55Z) - Tuning-Free Structured Sparse Recovery of Multiple Measurement Vectors using Implicit Regularization [13.378211527081582]
We introduce a tuning-free framework to recover sparse signals from multiple measurement vectors. We show that the optimization dynamics exhibit a "momentum-like" effect, causing the norms of rows in the true support to grow significantly faster than others.
arXiv Detail & Related papers (2025-12-03T02:53:11Z) - Self-Supervised Coarsening of Unstructured Grid with Automatic Differentiation [55.88862563823878]
In this work, we present an original algorithm to coarsen an unstructured grid based on the concepts of differentiable physics. We demonstrate the performance of the algorithm on two PDEs: a linear equation which governs slightly compressible fluid flow in porous media, and the wave equation. Our results show that in the considered scenarios, we reduced the number of grid points by up to a factor of 10 while preserving the dynamics of the modeled variable in the points of interest.
arXiv Detail & Related papers (2025-07-24T11:02:13Z) - A Differential Evolution Algorithm with Neighbor-hood Mutation for DOA Estimation [11.842677286643609]
The two-dimensional (2D) Multiple Signal Classification algorithm is a powerful technique for high-resolution direction-of-arrival (DOA) estimation in array signal processing. We reformulate the peak-finding process as a multimodal optimization problem and propose a Differential Evolution algorithm with Neighborhood Mutation (DE-NM) to efficiently locate multiple spectral peaks.
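A neighborhood-restricted DE/rand/1 mutation of the kind this summary describes can be sketched as follows. The distance-based neighborhood, the parameters `k` and `F`, and the function name are illustrative assumptions, not the exact DE-NM operator:

```python
import numpy as np

rng = np.random.default_rng(0)

def neighborhood_mutation(pop, idx, k=5, F=0.5):
    """DE/rand/1 mutation with donors drawn from the k nearest neighbors
    (Euclidean distance) of individual `idx`.

    Restricting donors to a neighborhood keeps mutants inside one basin of
    a multimodal landscape, so several spectral peaks can be tracked at once.
    Requires k >= 3.
    """
    d = np.linalg.norm(pop - pop[idx], axis=1)
    neigh = np.argsort(d)[1:k + 1]                 # k nearest, excluding self
    r1, r2, r3 = rng.choice(neigh, size=3, replace=False)
    return pop[r1] + F * (pop[r2] - pop[r3])       # donor vector
```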
arXiv Detail & Related papers (2025-07-08T14:30:01Z) - A novel algorithm for optimizing bundle adjustment in image sequence alignment [6.322876598831792]
This paper introduces a novel algorithm for optimizing the Bundle Adjustment (BA) model in the context of image sequence alignment for cryo-electron tomography. Extensive experiments on both synthetic and real-world datasets were conducted to evaluate the algorithm's performance.
arXiv Detail & Related papers (2024-11-10T03:19:33Z) - Ensemble Kalman Filtering Meets Gaussian Process SSM for Non-Mean-Field and Online Inference [47.460898983429374]
We introduce an ensemble Kalman filter (EnKF) into the non-mean-field (NMF) variational inference framework to approximate the posterior distribution of the latent states.
This novel marriage between EnKF and GPSSM not only eliminates the need for extensive parameterization in learning variational distributions, but also enables an interpretable, closed-form approximation of the evidence lower bound (ELBO).
We demonstrate that the resulting EnKF-aided online algorithm embodies a principled objective function by ensuring data-fitting accuracy while incorporating model regularizations to mitigate overfitting.
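The EnKF analysis step at the heart of this approach can be illustrated with a standard stochastic-update sketch. The linear observation operator and the variable names below are assumptions for illustration; this is not the paper's GPSSM-specific update:

```python
import numpy as np

def enkf_update(ensemble, obs, H, R, rng):
    """Stochastic EnKF analysis step with a linear observation operator.

    ensemble : (N, d) prior state ensemble
    obs      : (p,)   observation
    H        : (p, d) observation operator
    R        : (p, p) observation-noise covariance
    """
    N = ensemble.shape[0]
    X = ensemble - ensemble.mean(axis=0)        # state anomalies (N, d)
    Y = X @ H.T                                 # observation-space anomalies (N, p)
    P_yy = Y.T @ Y / (N - 1) + R                # innovation covariance (p, p)
    P_xy = X.T @ Y / (N - 1)                    # cross covariance (d, p)
    K = P_xy @ np.linalg.inv(P_yy)              # Kalman gain (d, p)
    # Perturbed observations make the updated ensemble spread consistent with R.
    perturbed = obs + rng.multivariate_normal(np.zeros(len(obs)), R, size=N)
    return ensemble + (perturbed - ensemble @ H.T) @ K.T
```

With a small observation noise, the posterior ensemble mean moves close to the observation, which is the "data-fitting" behavior the summary refers to.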
arXiv Detail & Related papers (2023-12-10T15:22:30Z) - Stochastic Optimal Control Matching [53.156277491861985]
Our work introduces Stochastic Optimal Control Matching (SOCM), a novel Iterative Diffusion Optimization (IDO) technique for stochastic optimal control.
The control is learned via a least squares problem by trying to fit a matching vector field.
Experimentally, our algorithm achieves lower error than all the existing IDO techniques for optimal control.
arXiv Detail & Related papers (2023-12-04T16:49:43Z) - Federated Conditional Stochastic Optimization [110.513884892319]
Conditional stochastic optimization has found applications in a wide range of machine learning tasks, such as invariant learning, AUPRC maximization, and AML.
This paper proposes algorithms for federated conditional stochastic optimization.
arXiv Detail & Related papers (2023-10-04T01:47:37Z) - Momentum Accelerates the Convergence of Stochastic AUPRC Maximization [80.8226518642952]
We study optimization of areas under precision-recall curves (AUPRC), which is widely used for imbalanced tasks.
We develop novel momentum methods with a better iteration complexity of $O(1/\epsilon^4)$ for finding an $\epsilon$-stationary solution.
We also design a novel family of adaptive methods with the same complexity of $O(1/\epsilon^4)$, which enjoy faster convergence in practice.
arXiv Detail & Related papers (2021-07-02T16:21:52Z) - Evolutionary Variational Optimization of Generative Models [0.0]
We combine two popular optimization approaches to derive learning algorithms for generative models: variational optimization and evolutionary algorithms.
We show that evolutionary algorithms can effectively and efficiently optimize the variational bound.
In the category of "zero-shot" learning, we observed the evolutionary variational algorithm to significantly improve the state-of-the-art in many benchmark settings.
arXiv Detail & Related papers (2020-12-22T19:06:33Z) - Finding optimal Pulse Repetition Intervals with Many-objective Evolutionary Algorithms [0.0]
We consider the problem of finding Pulse Repetition Intervals allowing the best compromises mitigating range and Doppler ambiguities in a Pulsed-Doppler radar system.
We use it as a baseline to compare several Evolutionary Algorithms for black-box optimization with different metrics.
arXiv Detail & Related papers (2020-11-13T13:56:13Z) - Adaptivity of Stochastic Gradient Methods for Nonconvex Optimization [71.03797261151605]
Adaptivity is an important yet under-studied property in modern optimization theory.
Our algorithm is proved to achieve the best-available convergence for non-PL objectives while simultaneously outperforming existing algorithms for PL objectives.
arXiv Detail & Related papers (2020-02-13T05:42:27Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.