Fitness Landscape Footprint: A Framework to Compare Neural Architecture
Search Problems
- URL: http://arxiv.org/abs/2111.01584v1
- Date: Tue, 2 Nov 2021 13:20:01 GMT
- Title: Fitness Landscape Footprint: A Framework to Compare Neural Architecture Search Problems
- Authors: Kalifou René Traoré, Andrés Camero and Xiao Xiang Zhu
- Abstract summary: We use fitness landscape analysis to study a neural architecture search problem.
We study two problems, the classical image classification benchmark CIFAR-10, and the Remote-Sensing problem So2Sat LCZ42.
- Score: 12.901952926144258
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Neural architecture search is a promising area of research dedicated to
automating the design of neural network models. The field is growing rapidly,
with a surge of methodologies ranging from Bayesian optimization and
neuroevolution to differentiable search, and applications in various contexts.
However, despite all these advances, few studies have presented insights on the
difficulty of the problem itself, so the success (or failure) of these
methodologies remains unexplained. In this sense, the field of optimization has
developed methods that highlight key aspects to describe optimization problems.
Fitness landscape analysis stands out when it comes to characterizing search
algorithms reliably and quantitatively. In this paper, we propose to use
fitness landscape analysis to study a neural architecture search problem.
In particular, we introduce the fitness landscape footprint, an aggregation of
eight (8) general-purpose metrics that synthesizes the landscape of an
architecture search problem. We studied two problems, the classical image
classification benchmark CIFAR-10 and the Remote-Sensing problem So2Sat LCZ42.
The results present a quantitative appraisal of the problems, characterizing
their relative difficulty and other traits, such as ruggedness or persistence,
that help tailor a search strategy to the problem. The footprint is also a tool
that enables the comparison of multiple problems.
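To make the kind of metric such a footprint aggregates concrete, here is a minimal illustrative sketch (not the paper's own code) of one classic landscape ruggedness estimate: the lag-1 autocorrelation of fitness values sampled along a random walk. The toy bit-string search space, the onemax and random-noise fitness functions, and the walk length are all assumptions chosen for illustration; a smooth landscape yields autocorrelation near 1, an uncorrelated one near 0.

```python
import random

def random_walk_autocorrelation(fitness, neighbors, start, steps=200, lag=1, seed=0):
    """Estimate landscape ruggedness as the lag-k autocorrelation of
    fitness values recorded along a random walk over the search space."""
    rng = random.Random(seed)
    x = start
    f = [fitness(x)]
    for _ in range(steps):
        x = rng.choice(neighbors(x))  # step to a uniformly random neighbor
        f.append(fitness(x))
    n = len(f)
    mean = sum(f) / n
    var = sum((v - mean) ** 2 for v in f) / n
    if var == 0:
        return 1.0  # flat landscape: perfectly correlated by convention
    cov = sum((f[i] - mean) * (f[i + lag] - mean) for i in range(n - lag)) / (n - lag)
    return cov / var

# Toy search space: fixed-length bit strings; neighbors differ in one bit.
def neighbors(bits):
    return [bits[:i] + (1 - bits[i],) + bits[i + 1:] for i in range(len(bits))]

smooth = lambda bits: sum(bits)                            # onemax: smooth landscape
rugged = lambda bits: random.Random(hash(bits)).random()   # uncorrelated noise

start = tuple([0] * 16)
print(random_walk_autocorrelation(smooth, neighbors, start))  # close to 1
print(random_walk_autocorrelation(rugged, neighbors, start))  # close to 0
```

Low autocorrelation signals a rugged landscape, where local search is less informative; this is the sort of quantitative signal that, combined with other metrics, lets a footprint compare the relative difficulty of search problems.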
Related papers
- Unsupervised Graph Neural Architecture Search with Disentangled Self-supervision [51.88848982611515]
Unsupervised graph neural architecture search remains unexplored in the literature.
We propose a novel Disentangled Self-supervised Graph Neural Architecture Search model.
Our model is able to achieve state-of-the-art performance against several baseline methods in an unsupervised manner.
arXiv Detail & Related papers (2024-03-08T05:23:55Z)
- Interactive Multi-Objective Evolutionary Optimization of Software Architectures [0.0]
Putting the human in the loop brings new challenges to the search-based software engineering field.
This paper explores how the interactive evolutionary computation can serve as a basis for integrating the human's judgment into the search process.
arXiv Detail & Related papers (2024-01-08T19:15:40Z)
- GLUECons: A Generic Benchmark for Learning Under Constraints [102.78051169725455]
In this work, we create a benchmark that is a collection of nine tasks in the domains of natural language processing and computer vision.
We model external knowledge as constraints, specify the sources of the constraints for each task, and implement various models that use these constraints.
arXiv Detail & Related papers (2023-02-16T16:45:36Z)
- A Collection of Deep Learning-based Feature-Free Approaches for Characterizing Single-Objective Continuous Fitness Landscapes [0.0]
Landscape insights are crucial for problem understanding as well as for assessing benchmark set diversity and composition.
In this work we provide a collection of different approaches to characterize optimization landscapes.
We demonstrate and validate the devised methods on the BBOB testbed and make predictions with the help of deep learning.
arXiv Detail & Related papers (2022-04-12T12:46:31Z)
- SuperNet in Neural Architecture Search: A Taxonomic Survey [14.037182039950505]
This survey focuses on the supernet optimization that builds a neural network that assembles all the architectures as its sub models by using weight sharing.
We aim to accomplish that by proposing them as solutions to the common challenges found in the literature: data-side optimization, poor rank correlation alleviation, and transferable NAS for a number of deployment scenarios.
arXiv Detail & Related papers (2022-04-08T08:29:52Z)
- Fine-Grained Image Analysis with Deep Learning: A Survey [146.22351342315233]
Fine-grained image analysis (FGIA) is a longstanding and fundamental problem in computer vision and pattern recognition.
This paper attempts to re-define and broaden the field of FGIA by consolidating two fundamental fine-grained research areas -- fine-grained image recognition and fine-grained image retrieval.
arXiv Detail & Related papers (2021-11-11T09:43:56Z)
- Making Differentiable Architecture Search less local [9.869449181400466]
Differentiable neural architecture search (DARTS) is a promising NAS approach that dramatically increases search efficiency.
It has been shown to suffer from performance collapse, where the search often leads to detrimental architectures.
We develop a more global optimisation scheme that is able to better explore the space without changing the DARTS problem formulation.
arXiv Detail & Related papers (2021-04-21T10:36:43Z)
- NAS-Navigator: Visual Steering for Explainable One-Shot Deep Neural Network Synthesis [53.106414896248246]
We present a framework that allows analysts to effectively build the solution sub-graph space and guide the network search by injecting their domain knowledge.
Applying this technique in an iterative manner allows analysts to converge to the best performing neural network architecture for a given application.
arXiv Detail & Related papers (2020-09-28T01:48:45Z)
- Disentangling Neural Architectures and Weights: A Case Study in Supervised Classification [8.976788958300766]
This work investigates the problem of disentangling the role of the neural structure and its edge weights.
We show that well-trained architectures may not need any link-specific fine-tuning of the weights.
We use a novel and computationally efficient method that translates the hard architecture-search problem into a feasible optimization problem.
arXiv Detail & Related papers (2020-09-11T11:22:22Z)
- Neural Topological SLAM for Visual Navigation [112.73876869904]
We design topological representations for space that leverage semantics and afford approximate geometric reasoning.
We describe supervised learning-based algorithms that can build, maintain and use such representations under noisy actuation.
arXiv Detail & Related papers (2020-05-25T17:56:29Z)
- RC-DARTS: Resource Constrained Differentiable Architecture Search [162.7199952019152]
We propose the resource constrained differentiable architecture search (RC-DARTS) method to learn architectures that are significantly smaller and faster.
We show that the RC-DARTS method learns lightweight neural architectures which have smaller model size and lower computational complexity.
arXiv Detail & Related papers (2019-12-30T05:02:38Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.