Dominated Novelty Search: Rethinking Local Competition in Quality-Diversity
- URL: http://arxiv.org/abs/2502.00593v1
- Date: Sat, 01 Feb 2025 23:15:50 GMT
- Title: Dominated Novelty Search: Rethinking Local Competition in Quality-Diversity
- Authors: Ryan Bahlous-Boldi, Maxence Faldor, Luca Grillotti, Hannah Janmohamed, Lisa Coiffard, Lee Spector, Antoine Cully
- Abstract summary: We introduce Dominated Novelty Search, a Quality-Diversity algorithm that implements local competition through dynamic fitness transformations.
Our experiments show that Dominated Novelty Search significantly outperforms existing approaches across standard Quality-Diversity benchmarks.
- Score: 6.576386892835931
- Abstract: Quality-Diversity is a family of evolutionary algorithms that generate diverse, high-performing solutions through local competition principles inspired by natural evolution. While research has focused on improving specific aspects of Quality-Diversity algorithms, surprisingly little attention has been paid to investigating alternative formulations of local competition itself -- the core mechanism distinguishing Quality-Diversity from traditional evolutionary algorithms. Most approaches implement local competition through explicit collection mechanisms like fixed grids or unstructured archives, imposing artificial constraints that require predefined bounds or hard-to-tune parameters. We show that Quality-Diversity methods can be reformulated as Genetic Algorithms where local competition occurs through fitness transformations rather than explicit collection mechanisms. Building on this insight, we introduce Dominated Novelty Search, a Quality-Diversity algorithm that implements local competition through dynamic fitness transformations, eliminating the need for predefined bounds or parameters. Our experiments show that Dominated Novelty Search significantly outperforms existing approaches across standard Quality-Diversity benchmarks, while maintaining its advantage in challenging scenarios like high-dimensional and unsupervised spaces.
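The abstract describes local competition as a dynamic fitness transformation rather than an explicit archive. As a minimal sketch of that idea (not the authors' reference implementation), one common formulation scores each solution by the average descriptor-space distance to its k nearest *fitter* neighbors, so a solution is penalized only when better solutions crowd its local region. The function name, the `k` default, and the handling of the global best are assumptions for illustration.

```python
import math

def dominated_novelty(fitnesses, descriptors, k=3):
    """Sketch of a competition-fitness transformation.

    Each solution's score is the mean descriptor-space distance to its
    k nearest neighbors with strictly higher fitness. A solution with no
    fitter neighbor (the current best) gets +inf, so it is never culled.
    """
    n = len(fitnesses)
    scores = []
    for i in range(n):
        # Distances to every solution that strictly beats i on fitness.
        dists = sorted(
            math.dist(descriptors[i], descriptors[j])
            for j in range(n)
            if fitnesses[j] > fitnesses[i]
        )
        if not dists:
            scores.append(math.inf)  # global best: never penalized
        else:
            nearest = dists[:k]  # k nearest fitter neighbors (fewer if n is small)
            scores.append(sum(nearest) / len(nearest))
    return scores
```

Under this transformation, a low-fitness solution sitting next to a fitter one receives a low score (it is locally dominated), while a low-fitness solution far from all fitter solutions keeps a high score (it is novel), which is what lets selection act as local competition without a predefined grid.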
Related papers
- Discovering Quality-Diversity Algorithms via Meta-Black-Box Optimization [8.5083347559272]
Quality-Diversity is a family of evolutionary algorithms that generate diverse populations of high-performing solutions.
We propose using meta-learning to automatically discover novel Quality-Diversity algorithms.
arXiv Detail & Related papers (2025-02-04T10:13:13Z)
- Randomized Adversarial Style Perturbations for Domain Generalization [49.888364462991234]
We propose a novel domain generalization technique, referred to as Randomized Adversarial Style Perturbation (RASP)
The proposed algorithm perturbs the style of a feature in an adversarial direction towards a randomly selected class, and makes the model learn against being misled by the unexpected styles observed in unseen target domains.
We evaluate the proposed algorithm via extensive experiments on various benchmarks and show that our approach improves domain generalization performance, especially in large-scale benchmarks.
arXiv Detail & Related papers (2023-04-04T17:07:06Z)
- Safe Multi-agent Learning via Trapping Regions [89.24858306636816]
We apply the concept of trapping regions, known from qualitative theory of dynamical systems, to create safety sets in the joint strategy space for decentralized learning.
We propose a binary partitioning algorithm for verification that candidate sets form trapping regions in systems with known learning dynamics, and a sampling algorithm for scenarios where learning dynamics are not known.
arXiv Detail & Related papers (2023-02-27T14:47:52Z)
- Balancing Discriminability and Transferability for Source-Free Domain Adaptation [55.143687986324935]
Conventional domain adaptation (DA) techniques aim to improve domain transferability by learning domain-invariant representations.
The requirement of simultaneous access to labeled source and unlabeled target renders them unsuitable for the challenging source-free DA setting.
We derive novel insights to show that a mixup between original and corresponding translated generic samples enhances the discriminability-transferability trade-off.
arXiv Detail & Related papers (2022-06-16T09:06:22Z)
- Optimal Algorithms for Decentralized Stochastic Variational Inequalities [113.43047601775453]
This work concentrates on the decentralized setting, which is increasingly important but not well understood.
We present lower bounds for both communication and local iterations and construct optimal algorithms that match these lower bounds.
Our algorithms are the best available not only in the decentralized case, but also compared with the deterministic and non-distributed literature.
arXiv Detail & Related papers (2022-02-06T13:14:02Z)
- Substitution of the Fittest: A Novel Approach for Mitigating Disengagement in Coevolutionary Genetic Algorithms [3.8073142980733]
We propose substitution of the fittest (SF), designed to counteract the problem of disengagement in two-population competitive coevolutionary genetic algorithms.
In a minimal domain, we perform a controlled evaluation of the ability to maintain engagement and the capacity to discover optimal solutions.
arXiv Detail & Related papers (2021-08-06T15:10:36Z)
- Towards robust and domain agnostic reinforcement learning competitions [12.731614722371376]
Reinforcement learning competitions have formed the basis for standard research benchmarks.
Despite this, a majority of challenges suffer from the same fundamental problems.
We present a new framework of competition design that promotes the development of algorithms that overcome these barriers.
arXiv Detail & Related papers (2021-06-07T16:15:46Z)
- cMLSGA: A Co-Evolutionary Multi-Level Selection Genetic Algorithm for Multi-Objective Optimization [0.0]
The Multi-Level Selection Genetic Algorithm (MLSGA) already shows good performance on a range of problems.
This paper proposes a distinct set of co-evolutionary mechanisms, which defines co-evolution as competition between collectives rather than individuals.
arXiv Detail & Related papers (2021-04-22T13:52:21Z)
- Quality-Diversity Optimization: a novel branch of stochastic optimization [5.677685109155078]
Multimodal optimization algorithms search for the highest peaks in the search space, of which there can be more than one.
Quality-Diversity algorithms are a recent addition to the evolutionary computation toolbox that, rather than searching for a single set of local optima, try to illuminate the search space.
arXiv Detail & Related papers (2020-12-08T09:52:50Z)
- Towards Fair Knowledge Transfer for Imbalanced Domain Adaptation [61.317911756566126]
We propose the Towards Fair Knowledge Transfer framework to handle the fairness challenge in imbalanced cross-domain learning.
Specifically, a novel cross-domain mixup generation is exploited to augment the minority source set with target information to enhance fairness.
Our model significantly improves overall accuracy by over 20% on two benchmarks.
arXiv Detail & Related papers (2020-10-23T06:29:09Z)
- Contradictory Structure Learning for Semi-supervised Domain Adaptation [67.89665267469053]
Current adversarial adaptation methods attempt to align the cross-domain features.
Two challenges remain unsolved: 1) the conditional distribution mismatch and 2) the bias of the decision boundary towards the source domain.
We propose a novel framework for semi-supervised domain adaptation by unifying the learning of opposite structures.
arXiv Detail & Related papers (2020-02-06T22:58:20Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.