A Non-Dominated Sorting Evolutionary Algorithm Updating When Required
- URL: http://arxiv.org/abs/2507.03864v1
- Date: Sat, 05 Jul 2025 02:28:10 GMT
- Title: A Non-Dominated Sorting Evolutionary Algorithm Updating When Required
- Authors: Lucas R. C. Farias, Abimael J. F. Santos, Matheus R. B. Nobre,
- Abstract summary: NSGA-III relies on uniformly distributed reference points to promote diversity in many-objective optimization problems. This paper proposes NSGA-III with Update when Required (NSGA-III-UR), a hybrid algorithm that selectively activates reference vector adaptation.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The NSGA-III algorithm relies on uniformly distributed reference points to promote diversity in many-objective optimization problems. However, this strategy may underperform when facing irregular Pareto fronts, where certain vectors remain unassociated with any optimal solutions. While adaptive schemes such as A-NSGA-III address this issue by dynamically modifying reference points, they may introduce unnecessary complexity in regular scenarios. This paper proposes NSGA-III with Update when Required (NSGA-III-UR), a hybrid algorithm that selectively activates reference vector adaptation based on the estimated regularity of the Pareto front. Experimental results on benchmark suites (DTLZ1-7, IDTLZ1-2) and real-world problems demonstrate that NSGA-III-UR consistently outperforms NSGA-III and A-NSGA-III across diverse problem landscapes.
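The abstract does not say how front regularity is estimated; below is a minimal sketch of one plausible "update when required" gate, assuming irregularity is flagged when a large fraction of reference vectors attract no associated solutions (the normalization, association rule, and `threshold` are illustrative assumptions, not the paper's specification).

```python
import numpy as np

def associate_counts(objs, ref_dirs):
    """Count how many population members associate with each reference
    direction (nearest by perpendicular distance, as in NSGA-III)."""
    # Simplified min-max normalization; NSGA-III proper uses
    # ideal/nadir-point normalization.
    f = (objs - objs.min(axis=0)) / (np.ptp(objs, axis=0) + 1e-12)
    norms = np.linalg.norm(ref_dirs, axis=1)
    unit = ref_dirs / norms[:, None]
    proj = f @ ref_dirs.T / norms                      # scalar projections
    d_perp = np.linalg.norm(
        f[:, None, :] - proj[:, :, None] * unit[None, :, :], axis=2)
    return np.bincount(d_perp.argmin(axis=1), minlength=len(ref_dirs))

def update_required(objs, ref_dirs, threshold=0.2):
    """Activate reference vector adaptation only when the front looks
    irregular, i.e. many reference vectors have no associated solution."""
    counts = associate_counts(objs, ref_dirs)
    return np.mean(counts == 0) > threshold
```

In a run, `update_required` would be checked each generation; when it returns False the algorithm proceeds as plain NSGA-III, avoiding A-NSGA-III's adaptation overhead on regular fronts.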
Related papers
- A Gradient Meta-Learning Joint Optimization for Beamforming and Antenna Position in Pinching-Antenna Systems [63.213207442368294]
We consider a novel optimization design for multi-waveguide pinching-antenna systems. The proposed GML-JO algorithm is robust to different choices and achieves better performance compared with existing optimization methods.
arXiv Detail & Related papers (2025-06-14T17:35:27Z) - GPU-accelerated Evolutionary Many-objective Optimization Using Tensorized NSGA-III [13.487945730611193]
We propose a fully tensorized implementation of NSGA-III for large-scale many-objective optimization. The tensorized implementation maintains the exact selection and variation mechanisms of NSGA-III while achieving significant acceleration. Results show that the tensorized NSGA-III achieves speedups of up to $3629\times$ over the CPU version of NSGA-III.
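As an illustration of the tensorization idea only (not the paper's implementation), pairwise Pareto dominance can be expressed as a single broadcasted comparison; the same pattern runs unchanged on GPU array libraries such as CuPy or JAX, which is where speedups of this kind come from.

```python
import numpy as np

def nondominated_mask(F):
    """Vectorized non-dominated check for an (n, m) objective matrix F
    (minimization). D[i, j] is True iff i dominates j: no worse in every
    objective and strictly better in at least one. A solution is
    non-dominated iff no other solution dominates it."""
    le = (F[:, None, :] <= F[None, :, :]).all(axis=2)
    lt = (F[:, None, :] < F[None, :, :]).any(axis=2)
    D = le & lt
    return ~D.any(axis=0)
```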
arXiv Detail & Related papers (2025-04-08T14:09:23Z) - Runtime Analyses of NSGA-III on Many-Objective Problems [10.955844285189372]
We present the first runtime analyses of NSGA-III on the popular many-objective benchmark problems mLOTZ, mOMM, and mCOCZ.
We show how these parameters should be scaled with the problem dimension, the number of objectives and the fitness range.
To our knowledge, these are the first runtime analyses for NSGA-III for more than 3 objectives.
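For readers unfamiliar with the benchmarks, here is a sketch of the classic two-objective LOTZ together with the blockwise many-objective construction commonly used to define mLOTZ; treat the exact benchmark definitions in the paper as authoritative.

```python
def lotz(x):
    """Two-objective LeadingOnes-TrailingZeros on a bitstring
    (maximization): f1 counts leading ones, f2 counts trailing zeros."""
    n = len(x)
    lo = next((i for i, b in enumerate(x) if b == 0), n)
    tz = next((i for i, b in enumerate(reversed(x)) if b == 1), n)
    return lo, tz

def mlotz(x, m):
    """m-objective variant: split the bitstring into m/2 equal blocks and
    apply LOTZ to each block (a common construction in runtime analyses)."""
    assert m % 2 == 0 and len(x) % (m // 2) == 0
    k = len(x) // (m // 2)
    return tuple(v for j in range(m // 2) for v in lotz(x[j * k:(j + 1) * k]))
```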
arXiv Detail & Related papers (2024-04-17T14:39:14Z) - HKNAS: Classification of Hyperspectral Imagery Based on Hyper Kernel
Neural Architecture Search [104.45426861115972]
We propose to directly generate structural parameters by utilizing specifically designed hyper kernels.
We obtain three kinds of networks to separately conduct pixel-level or image-level classifications with 1-D or 3-D convolutions.
A series of experiments on six public datasets demonstrate that the proposed methods achieve state-of-the-art results.
arXiv Detail & Related papers (2023-04-23T17:27:40Z) - A Mathematical Runtime Analysis of the Non-dominated Sorting Genetic
Algorithm III (NSGA-III) [9.853329403413701]
The Non-dominated Sorting Genetic Algorithm II (NSGA-II) is the most prominent multi-objective evolutionary algorithm for real-world applications.
We provide the first mathematical runtime analysis of the NSGA-III, a refinement of the NSGA-II aimed at better handling more than two objectives.
arXiv Detail & Related papers (2022-11-15T15:10:36Z) - Shape-constrained Symbolic Regression with NSGA-III [0.0]
Shape-constrained symbolic regression (SCSR) makes it possible to include prior knowledge in data-based modeling.
This paper presents a multi-criteria approach that minimizes the approximation error as well as the constraint violations.
Specifically, the two algorithms NSGA-II and NSGA-III are implemented and compared against each other in terms of model quality and runtime.
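As a hedged sketch of what such a bi-objective fitness could look like, the following assumes a monotone-increasing shape constraint checked by sampling on a grid; the paper's constraint handling may differ (interval arithmetic is also common in the SCSR literature).

```python
import numpy as np

def scsr_objectives(model, X, y, grid):
    """Two minimization objectives for shape-constrained regression:
    (1) mean squared error on the data, and (2) total violation of an
    assumed monotone-increasing constraint, estimated by sampling the
    model on a sorted 1-D grid and penalizing every decrease."""
    mse = float(np.mean((model(X) - y) ** 2))
    g = model(np.sort(grid))
    violation = float(np.sum(np.maximum(0.0, -np.diff(g))))
    return mse, violation
```

Both objectives would then be handed to NSGA-II or NSGA-III as-is, letting selection trade accuracy against constraint compliance.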
arXiv Detail & Related papers (2022-09-28T06:10:34Z) - Orthogonal Graph Neural Networks [53.466187667936026]
Graph neural networks (GNNs) have received tremendous attention due to their superiority in learning node representations.
However, stacking more convolutional layers significantly decreases their performance.
We propose a novel Ortho-GConv, which could generally augment the existing GNN backbones to stabilize the model training and improve the model's generalization performance.
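The exact Ortho-GConv construction is not detailed here; as background only, a standard soft-orthogonality penalty of the kind used to stabilize training looks like this.

```python
import numpy as np

def soft_orthogonality_penalty(W):
    """Soft orthogonality regularizer ||W^T W - I||_F^2; adding a small
    multiple of this to the task loss keeps the weight matrix W
    near-orthogonal, which preserves feature norms across deep stacks."""
    gram = W.T @ W
    return float(np.sum((gram - np.eye(W.shape[1])) ** 2))
```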
arXiv Detail & Related papers (2021-09-23T12:39:01Z) - EOS: a Parallel, Self-Adaptive, Multi-Population Evolutionary Algorithm
for Constrained Global Optimization [68.8204255655161]
EOS is a global optimization algorithm for constrained and unconstrained problems of real-valued variables.
It implements a number of improvements to the well-known Differential Evolution (DE) algorithm.
Results show that EOS is capable of achieving increased performance compared to state-of-the-art single-population self-adaptive DE algorithms.
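For background, this is the classic DE/rand/1/bin generation step that EOS builds on; EOS's self-adaptive parameter control and multi-population machinery are not shown, and the helper below is only a sketch.

```python
import numpy as np

def de_rand_1_bin(pop, F=0.8, CR=0.9, rng=None):
    """One DE/rand/1/bin generation over an (n, d) population: mutate
    each target with a scaled difference of two random vectors, then
    recombine by binomial crossover (parent-vs-trial selection omitted)."""
    rng = rng if rng is not None else np.random.default_rng()
    n, d = pop.shape
    trials = np.empty_like(pop)
    for i in range(n):
        r1, r2, r3 = rng.choice([j for j in range(n) if j != i],
                                size=3, replace=False)
        mutant = pop[r1] + F * (pop[r2] - pop[r3])
        cross = rng.random(d) < CR
        cross[rng.integers(d)] = True   # ensure at least one mutant gene
        trials[i] = np.where(cross, mutant, pop[i])
    return trials
```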
arXiv Detail & Related papers (2020-07-09T10:19:22Z) - Convergence of adaptive algorithms for weakly convex constrained
optimization [59.36386973876765]
We prove the $\tilde{\mathcal{O}}(t^{-1/4})$ rate of convergence for the norm of the gradient of the Moreau envelope.
Our analysis works with a mini-batch size of $1$, constant first- and second-order moment parameters, and possibly unbounded optimization domains.
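For context, the Moreau envelope used here is the standard one, $\varphi_\lambda(x) = \min_y \{\varphi(y) + \frac{1}{2\lambda}\|x - y\|^2\}$; for weakly convex $\varphi$, a small $\|\nabla \varphi_\lambda(x)\|$ certifies that $x$ is close to a near-stationary point.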
arXiv Detail & Related papers (2020-06-11T17:43:19Z) - A Tailored NSGA-III Instantiation for Flexible Job Shop Scheduling [18.401817124823832]
A customized evolutionary algorithm is proposed to solve the flexible job shop scheduling problem.
Different local search strategies are employed to explore the neighborhood for better solutions.
The experimental results show excellent performance with less computing budget.
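The abstract does not name the local search strategies; one common illustrative neighborhood move for job-shop schedules is an intra-machine swap, sketched below over a hypothetical `schedule` mapping from machine id to its ordered operation list.

```python
import random

def swap_neighbor(schedule):
    """Return a copy of `schedule` with two operations on one machine
    swapped: a typical neighborhood move for job-shop local search."""
    machine = random.choice(
        [m for m, ops in schedule.items() if len(ops) >= 2])
    i, j = random.sample(range(len(schedule[machine])), 2)
    neighbor = {m: list(ops) for m, ops in schedule.items()}
    neighbor[machine][i], neighbor[machine][j] = \
        neighbor[machine][j], neighbor[machine][i]
    return neighbor
```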
arXiv Detail & Related papers (2020-04-14T14:49:36Z) - Adaptivity of Stochastic Gradient Methods for Nonconvex Optimization [71.03797261151605]
Adaptivity is an important yet under-studied property in modern optimization theory.
Our algorithm provably achieves the best-available convergence guarantees for non-PL objectives while simultaneously outperforming existing algorithms for PL objectives.
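For context, PL refers to the Polyak-Łojasiewicz condition: an objective $f$ with minimum value $f^*$ is $\mu$-PL when $\frac{1}{2}\|\nabla f(x)\|^2 \ge \mu (f(x) - f^*)$ for all $x$, which permits linear convergence rates that general nonconvex objectives do not.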
arXiv Detail & Related papers (2020-02-13T05:42:27Z)