Neural Architecture Search From Fréchet Task Distance
- URL: http://arxiv.org/abs/2103.12827v2
- Date: Thu, 25 Mar 2021 14:13:56 GMT
- Title: Neural Architecture Search From Fréchet Task Distance
- Authors: Cat P. Le, Mohammadreza Soltani, Robert Ravier, Trevor Standley,
Silvio Savarese, Vahid Tarokh
- Abstract summary: We show how the distance between a target task and each task in a given set of baseline tasks can be used to reduce the neural architecture search space for the target task.
The complexity reduction in search space for task-specific architectures is achieved by building on the optimized architectures for similar tasks instead of doing a full search without using this side information.
- Score: 50.9995960884133
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We formulate a Fréchet-type asymmetric distance between tasks based on
Fisher Information Matrices. We show how the distance between a target task and
each task in a given set of baseline tasks can be used to reduce the neural
architecture search space for the target task. The complexity reduction in
search space for task-specific architectures is achieved by building on the
optimized architectures for similar tasks instead of doing a full search
without using this side information. Experimental results demonstrate the
efficacy of the proposed approach and its improvements over the
state-of-the-art methods.
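Since the abstract only states the construction at a high level, here is a minimal, non-authoritative sketch of the idea: a Fréchet-type distance computed from diagonal approximations of two tasks' Fisher Information Matrices (stored as vectors of diagonal entries), used to pick the nearest baseline task, whose optimized architecture would then seed the reduced search space. The diagonal approximation, the 1/sqrt(2) scaling, and all names are assumptions of this sketch, and the asymmetry mentioned in the abstract is not modeled here.

```python
import numpy as np

def frechet_task_distance(fisher_a: np.ndarray, fisher_b: np.ndarray) -> float:
    """Fréchet-type distance between two tasks represented by the diagonals of
    their Fisher Information Matrices (1-D arrays of non-negative entries).
    For zero-mean Gaussians with diagonal covariances, the Fréchet (2-Wasserstein)
    distance reduces to the Frobenius norm of the difference of the matrix
    square roots; the 1/sqrt(2) scaling is an assumption of this sketch."""
    sqrt_a = np.sqrt(np.clip(fisher_a, 0.0, None))
    sqrt_b = np.sqrt(np.clip(fisher_b, 0.0, None))
    return float(np.linalg.norm(sqrt_a - sqrt_b) / np.sqrt(2.0))

def nearest_baseline_task(target_fisher, baseline_fishers):
    """Index of the baseline task closest to the target; its optimized
    architecture would seed the reduced search space for the target task."""
    dists = [frechet_task_distance(target_fisher, f) for f in baseline_fishers]
    return int(np.argmin(dists)), dists

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    target = rng.random(10_000)                          # toy diagonal Fisher of the target task
    baselines = [rng.random(10_000) for _ in range(4)]   # toy diagonal Fishers of baseline tasks
    idx, dists = nearest_baseline_task(target, baselines)
    print("closest baseline task:", idx, "distances:", [round(d, 4) for d in dists])
```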
Related papers
- Fast Inference and Transfer of Compositional Task Structures for Few-shot Task Generalization [101.72755769194677]
We formulate it as a few-shot reinforcement learning problem where a task is characterized by a subtask graph.
Our multi-task subtask graph inferencer (MTSGI) first infers the common high-level task structure in terms of the subtask graph from the training tasks.
Experiments on 2D grid-world and complex web navigation domains show that the proposed method can learn and leverage the common underlying structure of the tasks for faster adaptation to unseen tasks.
arXiv Detail & Related papers (2022-05-25T10:44:25Z)
- Arch-Graph: Acyclic Architecture Relation Predictor for Task-Transferable Neural Architecture Search [96.31315520244605]
Arch-Graph is a transferable NAS method that predicts task-specific optimal architectures.
We show Arch-Graph's transferability and high sample efficiency across numerous tasks.
It finds architectures in the top 0.16% and 0.29% on average on two search spaces under a budget of only 50 models.
arXiv Detail & Related papers (2022-04-12T16:46:06Z)
- Elastic Architecture Search for Diverse Tasks with Different Resources [87.23061200971912]
We study a new challenging problem of efficient deployment for diverse tasks with different resources, where the resource constraint and task of interest corresponding to a group of classes are dynamically specified at testing time.
Previous NAS approaches seek to design architectures for all classes simultaneously, which may not be optimal for some individual tasks.
We present a novel and general framework, called Elastic Architecture Search (EAS), permitting instant specializations at runtime for diverse tasks with various resource constraints.
arXiv Detail & Related papers (2021-08-03T00:54:27Z)
- Exploring Relational Context for Multi-Task Dense Prediction [76.86090370115]
We consider a multi-task environment for dense prediction tasks, represented by a common backbone and independent task-specific heads.
We explore various attention-based contexts, such as global and local, in the multi-task setting.
We propose an Adaptive Task-Relational Context module, which samples from the pool of all available contexts for each task pair.
arXiv Detail & Related papers (2021-04-28T16:45:56Z)
- A Multi-Task Deep Learning Framework for Building Footprint Segmentation [0.0]
We propose a joint optimization scheme for the task of building footprint delineation.
We also introduce two auxiliary tasks: image reconstruction and building footprint boundary segmentation.
In particular, we propose a deep multi-task learning (MTL) based unified fully convolutional framework.
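As a minimal sketch of what such a joint objective could look like, assuming a main footprint-segmentation loss combined with the two auxiliary losses named above (the loss functions, weights, and names are hypothetical, not the paper's exact scheme):

```python
import numpy as np

def binary_cross_entropy(pred, target, eps=1e-7):
    """Per-pixel binary cross-entropy; `pred` holds probabilities in (0, 1)."""
    pred = np.clip(pred, eps, 1.0 - eps)
    return float(np.mean(-(target * np.log(pred) + (1.0 - target) * np.log(1.0 - pred))))

def joint_mtl_loss(seg_pred, seg_gt, bnd_pred, bnd_gt, rec_pred, image,
                   w_boundary=0.5, w_reconstruction=0.5):
    """Main building-footprint segmentation loss plus two auxiliary terms:
    boundary segmentation (BCE) and image reconstruction (MSE).
    The weights are illustrative hyperparameters."""
    seg_loss = binary_cross_entropy(seg_pred, seg_gt)
    bnd_loss = binary_cross_entropy(bnd_pred, bnd_gt)
    rec_loss = float(np.mean((rec_pred - image) ** 2))
    return seg_loss + w_boundary * bnd_loss + w_reconstruction * rec_loss

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    h, w = 64, 64
    image = rng.random((h, w, 3))
    seg_gt = (rng.random((h, w)) > 0.5).astype(float)
    bnd_gt = (rng.random((h, w)) > 0.9).astype(float)
    loss = joint_mtl_loss(rng.random((h, w)), seg_gt,
                          rng.random((h, w)), bnd_gt,
                          rng.random((h, w, 3)), image)
    print(f"toy joint loss: {loss:.4f}")
```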
arXiv Detail & Related papers (2021-04-19T15:07:27Z)
- Neural Architecture Search From Task Similarity Measure [28.5184196829547]
We propose a neural architecture search framework based on a similarity measure between various tasks defined in terms of Fisher information.
By utilizing the relation between a target and a set of existing tasks, the search space of architectures can be significantly reduced.
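Purely as an illustration of what reducing the search space could mean in practice (not the construction used in this paper), here is a toy sketch that keeps only candidate operations appearing in the optimized architecture of the most similar existing task; the operation names and representation are hypothetical.

```python
# Hypothetical cell-based search space: each edge in a cell chooses one operation.
FULL_OP_SET = {
    "conv_3x3", "conv_5x5", "sep_conv_3x3", "dil_conv_3x3",
    "max_pool_3x3", "avg_pool_3x3", "skip_connect", "none",
}

def reduced_op_set(nearest_baseline_ops):
    """Keep only operations used by the optimized architecture of the most
    similar existing task (plus skip/none so cells remain well formed).
    A toy stand-in for building on a similar task's architecture instead of
    searching the full space."""
    return (set(nearest_baseline_ops) & FULL_OP_SET) | {"skip_connect", "none"}

# Example: the nearest task's cell used only separable/dilated convs and max pooling.
print(sorted(reduced_op_set(["sep_conv_3x3", "dil_conv_3x3", "max_pool_3x3"])))
```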
arXiv Detail & Related papers (2021-02-27T15:26:14Z)
- MTL-NAS: Task-Agnostic Neural Architecture Search towards General-Purpose Multi-Task Learning [71.90902837008278]
We propose to incorporate neural architecture search (NAS) into general-purpose multi-task learning (GP-MTL).
In order to adapt to different task combinations, we disentangle the GP-MTL networks into single-task backbones.
We also propose a novel single-shot gradient-based search algorithm that closes the performance gap between the searched architectures and the final evaluation architecture.
arXiv Detail & Related papers (2020-03-31T09:49:14Z)
This list is automatically generated from the titles and abstracts of the papers in this site.