BR-NS: an Archive-less Approach to Novelty Search
- URL: http://arxiv.org/abs/2104.03936v1
- Date: Thu, 8 Apr 2021 17:31:34 GMT
- Title: BR-NS: an Archive-less Approach to Novelty Search
- Authors: Achkan Salehi, Alexandre Coninx, Stephane Doncieux
- Abstract summary: We discuss an alternative approach to novelty estimation, dubbed Behavior Recognition based Novelty Search (BR-NS).
BR-NS does not require an archive, makes no assumptions about the metrics that can be defined in the behavior space, and does not rely on nearest-neighbour search.
We conduct experiments to gain insight into its feasibility and dynamics as well as potential advantages over archive-based NS in terms of time complexity.
- Score: 70.13948372218849
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: As open-ended learning based on divergent search algorithms such as Novelty
Search (NS) draws more and more attention from the research community, it is
natural to expect that its application to increasingly complex real-world
problems will require the exploration to operate in higher dimensional Behavior
Spaces which will not necessarily be Euclidean. Novelty Search traditionally
relies on k-nearest neighbours search and an archive of previously visited
behavior descriptors which are assumed to live in a Euclidean space. This is
problematic for a number of reasons. On one hand, Euclidean distance and
nearest-neighbour search are known to lose their discriminative power and
become less meaningful in high-dimensional spaces. On the other hand, the archive has to be
bounded since, memory considerations aside, the computational complexity of
finding nearest neighbours in that archive grows linearithmically with its
size. A sub-optimal bound can result in "cycling" in the behavior space, which
inhibits the progress of the exploration. Furthermore, the performance of NS
depends on a number of algorithmic choices and hyperparameters, such as the
strategies for adding elements to and removing them from the archive, and the number of
neighbours to use in k-nn search. In this paper, we discuss an alternative
approach to novelty estimation, dubbed Behavior Recognition based Novelty
Search (BR-NS), which does not require an archive, makes no assumptions about
the metrics that can be defined in the behavior space, and does not rely on
nearest-neighbour search. We conduct experiments to gain insight into its feasibility
and dynamics as well as potential advantages over archive-based NS in terms of
time complexity.
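To make the contrast concrete, below is a minimal Python sketch, not the authors' implementation: classical archive-based novelty scored by k-nearest-neighbour distances over a growing archive, next to a BR-NS-style estimator that replaces the archive with a frozen random map plus a trained predictor and scores novelty as prediction error. The linear maps stand in for the neural networks used in the paper; all names and hyperparameters are illustrative assumptions.

```python
import numpy as np

class ArchiveNovelty:
    """Classical NS: novelty = mean distance to the k nearest archived
    descriptors. Each query sorts distances over an archive of size n,
    hence the (near-)linearithmic per-query cost that forces bounding."""
    def __init__(self, k=10):
        self.k = k
        self.archive = []                     # visited behavior descriptors

    def add(self, bd):
        self.archive.append(np.asarray(bd))   # grows without bound unless pruned

    def novelty(self, bd):
        if not self.archive:
            return np.inf
        dists = np.linalg.norm(np.asarray(self.archive) - bd, axis=1)
        return np.sort(dists)[: min(self.k, len(dists))].mean()

class BRNSNovelty:
    """BR-NS-style estimator (simplified): a frozen random map F and a
    trained predictor G. Novelty is the prediction error ||G(bd) - F(bd)||^2;
    training G on visited descriptors drives familiar regions toward low
    novelty, so no archive and no k-NN search are needed."""
    def __init__(self, dim_in, dim_out=32, lr=1e-2, seed=0):
        rng = np.random.default_rng(seed)
        self.F = rng.normal(size=(dim_in, dim_out))   # frozen random "network"
        self.G = np.zeros((dim_in, dim_out))          # learned predictor
        self.lr = lr

    def novelty(self, bd):
        err = bd @ self.G - bd @ self.F
        return float(err @ err)               # cost independent of past visits

    def update(self, bd):
        # one SGD step pulling G(bd) toward F(bd) for a visited descriptor
        err = bd @ self.G - bd @ self.F
        self.G -= self.lr * np.outer(bd, err)
```

In this simplified form, a novelty query against the learned estimator costs a fixed matrix product regardless of how many behaviors have been evaluated, which is the kind of time-complexity advantage over archive-based NS that the abstract alludes to.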
Related papers
- Pathfinding with Lazy Successor Generation [12.02023514105999]
We study a pathfinding problem where only locations are given, and edges are implicitly defined.
Despite its simple structure, this problem becomes non-trivial with a massive number of locations.
We propose LaCAS*, a novel algorithm that does not generate all successors at once but generates them gradually as the search progresses.
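The gradual-generation idea can be illustrated generically; this is a toy sketch of lazy successor generation in a best-first search, under assumed function names, and not the LaCAS* algorithm itself:

```python
import heapq

def lazy_best_first(start, is_goal, h, successors):
    """Best-first search that draws ONE successor per expansion.
    successors(node) -> iterator of (edge_cost, neighbor), ideally yielded
    in increasing cost order so that cheap successors are tried first."""
    gens = {start: successors(start)}         # per-node lazy generators
    g = {start: 0.0}                          # best known path costs
    tie = 0                                   # heap tie-breaker
    frontier = [(h(start), tie, start)]
    while frontier:
        _, _, node = heapq.heappop(frontier)
        if is_goal(node):
            return g[node]
        step = next(gens[node], None)         # generate a single successor
        if step is None:
            continue                          # generator exhausted: retire node
        tie += 1                              # re-queue node for its NEXT successor
        heapq.heappush(frontier, (g[node] + h(node), tie, node))
        cost, nbr = step
        if g[node] + cost < g.get(nbr, float("inf")):
            g[nbr] = g[node] + cost
            gens.setdefault(nbr, successors(nbr))
            tie += 1
            heapq.heappush(frontier, (g[nbr] + h(nbr), tie, nbr))
    return None                               # goal unreachable
```

Generating successors on demand keeps per-expansion work small when the branching factor is huge, e.g. when every location is implicitly connected to every other, which is the regime the summary describes.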
arXiv Detail & Related papers (2024-08-27T23:25:25Z)
- Efficient NAS with FaDE on Hierarchical Spaces [0.6372911857214884]
We present FaDE which uses differentiable architecture search to obtain relative performance predictions on finite regions of a hierarchical NAS space.
FaDE is especially suited to deep hierarchical, i.e. multi-cell, search spaces.
arXiv Detail & Related papers (2024-04-24T21:33:17Z)
- Lexically-Accelerated Dense Retrieval [29.327878974130055]
LADR (Lexically-Accelerated Dense Retrieval) is a simple yet effective approach that improves the efficiency of existing dense retrieval models.
LADR consistently achieves both precision and recall that are on par with an exhaustive search on standard benchmarks.
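As a rough sketch of that pipeline, with the caveat that the function and variable names below are assumptions rather than the paper's API: a lexical first pass seeds a candidate pool, the pool is expanded through a precomputed document-proximity graph, and only that pool is scored with the dense model.

```python
import numpy as np

def ladr_style_search(query_vec, lexical_candidates, doc_vecs, neighbours, k=10):
    """query_vec: dense query embedding, shape (d,)
    lexical_candidates: doc ids returned by a lexical (e.g. BM25) first pass
    doc_vecs: dense document embeddings, shape (num_docs, d)
    neighbours: dict mapping doc id -> ids of nearby documents (precomputed)
    """
    pool = set(lexical_candidates)
    for doc in lexical_candidates:            # one round of graph expansion
        pool.update(neighbours.get(doc, ()))
    pool = np.fromiter(pool, dtype=int)
    scores = doc_vecs[pool] @ query_vec       # dense scoring of the pool only
    return pool[np.argsort(-scores)[:k]].tolist()
```

Because the dense model only scores a small seeded pool instead of the whole corpus, per-query dense cost stays bounded; repeating the expansion step from the current top-k would approximate a more exhaustive search.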
arXiv Detail & Related papers (2023-07-31T15:44:26Z)
- k-Means Maximum Entropy Exploration [55.81894038654918]
Exploration in continuous spaces with sparse rewards is an open problem in reinforcement learning.
We introduce an artificial curiosity algorithm based on lower bounding an approximation to the entropy of the state visitation distribution.
We show that our approach is both computationally efficient and competitive on benchmarks for exploration in high-dimensional, continuous spaces.
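A rough sketch of the general idea, with the caveat that this is a caricature and not the paper's algorithm: maintain k-means centroids over visited states and hand out an intrinsic reward that grows with the distance to the nearest centroid, a standard way to turn a nearest-point distance into an entropy-style bonus.

```python
import numpy as np

class KMeansCuriosity:
    """Intrinsic reward from distance to the nearest of k centroids fitted
    online to visited states (all names and constants are illustrative)."""
    def __init__(self, n_clusters, dim, lr=0.05, seed=0):
        rng = np.random.default_rng(seed)
        self.centroids = rng.normal(size=(n_clusters, dim))
        self.lr = lr

    def intrinsic_reward(self, state):
        d = np.linalg.norm(self.centroids - state, axis=1)
        return float(np.log(d.min() + 1e-8))  # high when far from all clusters

    def update(self, state):
        # online k-means step: nudge the nearest centroid toward the state
        i = int(np.argmin(np.linalg.norm(self.centroids - state, axis=1)))
        self.centroids[i] += self.lr * (state - self.centroids[i])
```

Both the reward and the update cost O(k·dim) per step, independent of how many states have been visited, which is consistent with the computational-efficiency claim above.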
arXiv Detail & Related papers (2022-05-31T09:05:58Z)
- Geodesics, Non-linearities and the Archive of Novelty Search [69.6462706723023]
We show that a key effect of the archive is that it counterbalances the exploration biases that result from the use of inadequate behavior metrics.
Our observations seem to hint that attributing a more active role to the archive in sampling can be beneficial.
arXiv Detail & Related papers (2022-05-06T12:03:40Z)
- Exploring Complicated Search Spaces with Interleaving-Free Sampling [127.07551427957362]
In this paper, we build the search algorithm upon a complicated search space with long-distance connections.
We present a simple yet effective algorithm named IF-NAS, which uses a periodic sampling strategy to construct different sub-networks.
In the proposed search space, IF-NAS outperforms both random sampling and previous weight-sharing search algorithms by a significant margin.
arXiv Detail & Related papers (2021-12-05T06:42:48Z)
- Probabilistic DAG Search [29.47649645431227]
We develop a probabilistic framework to exploit a search space's latent structure and share information across the search tree.
We empirically find our algorithm to compare favorably to existing non-probabilistic alternatives in Tic-Tac-Toe and a feature selection application.
arXiv Detail & Related papers (2021-06-16T11:35:19Z)
- AutoSpace: Neural Architecture Search with Less Human Interference [84.42680793945007]
Current neural architecture search (NAS) algorithms still require expert knowledge and effort to design a search space for network construction.
We propose a novel differentiable evolutionary framework named AutoSpace, which evolves the search space toward an optimal one.
With the learned search space, the performance of recent NAS algorithms can be improved significantly compared with using previously manually designed spaces.
arXiv Detail & Related papers (2021-03-22T13:28:56Z)
- Leveraging Reinforcement Learning for evaluating Robustness of KNN Search Algorithms [0.0]
The problem of finding the K nearest neighbors of a given query point in a dataset has been studied for many years.
In this paper, we survey several novel K-Nearest Neighbor Search (KNNS) approaches that tackle the problem from a computational perspective.
To evaluate the robustness of a KNNS approach against adversarial points, we propose a generic Reinforcement Learning based framework.
arXiv Detail & Related papers (2021-02-10T16:10:58Z)
- GOLD-NAS: Gradual, One-Level, Differentiable [100.12492801459105]
We propose a novel algorithm named Gradual One-Level Differentiable Neural Architecture Search (GOLD-NAS).
It introduces a variable resource constraint into one-level optimization so that weak operators are gradually pruned out of the super-network.
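The pruning mechanic can be caricatured as follows; this is a generic sketch of gradual, resource-penalised pruning under assumed names, not GOLD-NAS itself (in particular, the task-loss gradient on the architecture weights is omitted):

```python
import numpy as np

def gradual_prune(n_ops, op_costs, steps=1000, lr=0.1, thresh=0.02, seed=0):
    """Keep one weight per candidate operator; ramp up a resource penalty
    and prune operators whose softmax weight drops below a threshold."""
    rng = np.random.default_rng(seed)
    alpha = rng.normal(scale=0.01, size=n_ops)  # architecture weights
    costs = np.asarray(op_costs, dtype=float)   # resource cost per operator
    alive = np.ones(n_ops, dtype=bool)
    for t in range(steps):
        lam = t / steps                          # gradually tightened constraint
        p = np.exp(alpha[alive]); p /= p.sum()   # softmax over surviving ops
        # gradient of lam * sum_i p_i * cost_i w.r.t. alpha (task loss omitted)
        grad = lam * p * (costs[alive] - p @ costs[alive])
        alpha[alive] -= lr * grad
        p = np.exp(alpha[alive]); p /= p.sum()
        alive[np.where(alive)[0][p < thresh]] = False   # prune weak operators
    return alive
```

Ramping the penalty weight over time is what makes the pruning "gradual": early on, all operators survive; as the constraint tightens, the resource-hungry ones lose weight and fall below the threshold.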
arXiv Detail & Related papers (2020-07-07T10:37:49Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information it provides and is not responsible for any consequences of its use.