Self-supervised Neural Architecture Search
- URL: http://arxiv.org/abs/2007.01500v1
- Date: Fri, 3 Jul 2020 05:09:30 GMT
- Title: Self-supervised Neural Architecture Search
- Authors: Sapir Kaplan and Raja Giryes
- Abstract summary: We propose a self-supervised neural architecture search (SSNAS) that allows finding novel network models without the need for labeled data.
We show that such a search achieves results comparable to those of NAS with fully labeled data and that it can improve the performance of self-supervised learning.
- Score: 41.07083436560303
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Neural Architecture Search (NAS) has been used recently to achieve improved
performance in various tasks and most prominently in image classification. Yet,
current search strategies rely on large labeled datasets, which limits their
use when only a small fraction of the data is annotated.
Self-supervised learning has shown great promise in training neural networks
using unlabeled data. In this work, we propose a self-supervised neural
architecture search (SSNAS) that allows finding novel network models without
the need for labeled data. We show that such a search achieves results
comparable to those of NAS with fully labeled data and that it can improve
the performance of self-supervised learning. Moreover, we demonstrate
the advantage of the proposed approach when the number of labels in the search
is relatively small.
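The recipe, as we read it, is to run a differentiable (DARTS-style) cell search but drive it with a self-supervised contrastive objective instead of a labeled cross-entropy loss. A minimal sketch follows; the operation set, the SimCLR-style loss, and all names are illustrative assumptions, not the paper's exact code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Candidate operations on one edge of a DARTS-style cell
# (the exact operation set is an illustrative assumption).
OPS = {
    "skip":    lambda c: nn.Identity(),
    "conv3x3": lambda c: nn.Conv2d(c, c, 3, padding=1, bias=False),
    "conv5x5": lambda c: nn.Conv2d(c, c, 5, padding=2, bias=False),
    "maxpool": lambda c: nn.MaxPool2d(3, stride=1, padding=1),
}

class MixedOp(nn.Module):
    """Continuous relaxation of the operation choice (as in DARTS):
    the edge output is a softmax-weighted sum of all candidates."""
    def __init__(self, channels):
        super().__init__()
        self.ops = nn.ModuleList(build(channels) for build in OPS.values())
        self.alpha = nn.Parameter(1e-3 * torch.randn(len(OPS)))

    def forward(self, x):
        weights = F.softmax(self.alpha, dim=0)
        return sum(w * op(x) for w, op in zip(weights, self.ops))

def nt_xent_loss(z1, z2, tau=0.5):
    """SimCLR-style contrastive loss over two augmented views; this
    replaces the labeled loss as the search signal."""
    z = F.normalize(torch.cat([z1, z2]), dim=1)      # (2N, d)
    sim = z @ z.t() / tau                            # pairwise similarities
    sim.fill_diagonal_(float("-inf"))                # drop self-pairs
    n = z1.size(0)
    pos = torch.cat([torch.arange(n, 2 * n), torch.arange(n)])
    return F.cross_entropy(sim, pos)                 # positive = the other view
```

Both the network weights and the architecture parameters `alpha` are then updated against `nt_xent_loss`, so no labels enter the search.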
Related papers
- GeNAS: Neural Architecture Search with Better Generalization [14.92869716323226]
Recent neural architecture search (NAS) approaches rely on validation loss or accuracy to find a superior network for the target data.
In this paper, we investigate a new neural architecture search measure for discovering architectures with better generalization.
arXiv Detail & Related papers (2023-05-15T12:44:54Z)
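GeNAS replaces the usual validation-accuracy search signal with a generalization measure. As a rough illustration of that flavor of measure (not necessarily GeNAS's exact definition), a loss-landscape flatness proxy can be computed by perturbing weights and averaging the loss increase:

```python
import copy
import torch

@torch.no_grad()
def flatness_score(model, loss_fn, x, y, sigma=0.01, n_samples=8):
    """Average loss increase under random Gaussian weight perturbations.
    Flatter minima (smaller increase) are commonly associated with better
    generalization; this generic proxy stands in for GeNAS's measure."""
    base = loss_fn(model(x), y).item()
    increase = 0.0
    for _ in range(n_samples):
        noisy = copy.deepcopy(model)
        for p in noisy.parameters():
            p.add_(sigma * torch.randn_like(p))
        increase += loss_fn(noisy(x), y).item() - base
    return increase / n_samples
```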
- UnrealNAS: Can We Search Neural Architectures with Unreal Data? [84.78460976605425]
Neural architecture search (NAS) has shown great success in the automatic design of deep neural networks (DNNs).
Previous work has analyzed the necessity of having ground-truth labels in NAS and inspired broad interest.
We take a further step to question whether real data is necessary for NAS to be effective.
arXiv Detail & Related papers (2022-05-04T16:30:26Z)
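The "unreal data" question can be made concrete: if search rankings are driven mostly by architecture properties, they may survive even on synthetic inputs with random labels. A hypothetical generator for such a search set (our illustration, not the paper's data pipeline):

```python
import torch
from torch.utils.data import TensorDataset

def make_unreal_dataset(n=1024, num_classes=10, shape=(3, 32, 32)):
    """Random images paired with random labels: 'unreal' data for
    running the architecture search phase without any real dataset."""
    images = torch.rand(n, *shape)
    labels = torch.randint(0, num_classes, (n,))
    return TensorDataset(images, labels)
```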
- On the Effectiveness of Neural Ensembles for Image Classification with Small Datasets [2.3478438171452014]
We focus on image classification problems with a few labeled examples per class and improve data efficiency by using an ensemble of relatively small networks.
We show that ensembling relatively shallow networks is a simple yet effective technique that is generally better than current state-of-the-art approaches for learning from small datasets.
arXiv Detail & Related papers (2021-11-29T12:34:49Z)
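The ensembling itself is simple: train several small networks independently and average their predicted class probabilities at test time. A minimal sketch (function name is ours):

```python
import torch
import torch.nn.functional as F

@torch.no_grad()
def ensemble_predict(models, x):
    """Average the softmax outputs of independently trained shallow
    networks and return the consensus class per sample."""
    for m in models:
        m.eval()
    probs = torch.stack([F.softmax(m(x), dim=1) for m in models])
    return probs.mean(dim=0).argmax(dim=1)
```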
- Understanding and Accelerating Neural Architecture Search with Training-Free and Theory-Grounded Metrics [117.4281417428145]
This work targets designing a principled and unified training-free framework for Neural Architecture Search (NAS).
NAS has been studied intensively to automate the discovery of top-performing neural networks, but it suffers from heavy resource consumption and often incurs search bias due to truncated training or approximations.
We present a unified framework to understand and accelerate NAS by disentangling the "TEG" (trainability, expressivity, generalization) characteristics of searched networks.
arXiv Detail & Related papers (2021-08-26T17:52:07Z)
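Training-free NAS scores candidate networks at initialization, without any optimization on the target task. The paper's TEG indicators have their own theory-grounded definitions; as a generic stand-in, here is one of the simplest zero-cost proxies from this literature, the gradient norm at initialization:

```python
import torch

def grad_norm_score(model, loss_fn, x, y):
    """Zero-cost proxy: total gradient norm after a single backward pass
    at initialization. A crude trainability signal; the paper's actual
    metrics (trainability/expressivity/generalization) are more refined."""
    model.zero_grad()
    loss_fn(model(x), y).backward()
    return sum(p.grad.norm().item()
               for p in model.parameters() if p.grad is not None)
```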
- Search to aggregate neighborhood for graph neural network [47.47628113034479]
We propose a framework, Search to Aggregate NEighborhood (SANE), that automatically designs data-specific GNN architectures.
By designing a novel and expressive search space, we derive a differentiable search algorithm that is more efficient than previous reinforcement-learning-based methods.
arXiv Detail & Related papers (2021-04-14T03:15:19Z)
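SANE's search space covers how a GNN layer aggregates neighbor messages; making that choice differentiable lets gradient descent pick the aggregator. A sketch under our assumptions about the candidate set (the paper's space is richer):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DifferentiableAggregation(nn.Module):
    """Softmax-weighted mixture over candidate neighborhood aggregators,
    relaxing the discrete choice so it can be learned by gradient descent."""
    def __init__(self):
        super().__init__()
        self.candidates = {
            "mean": lambda h: h.mean(dim=1),
            "max":  lambda h: h.max(dim=1).values,
            "sum":  lambda h: h.sum(dim=1),
        }
        self.alpha = nn.Parameter(torch.zeros(len(self.candidates)))

    def forward(self, neighbor_feats):
        # neighbor_feats: (num_nodes, num_neighbors, feat_dim)
        weights = F.softmax(self.alpha, dim=0)
        return sum(w * agg(neighbor_feats)
                   for w, agg in zip(weights, self.candidates.values()))
```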
- Contrastive Self-supervised Neural Architecture Search [6.162410142452926]
This paper proposes a novel cell-based neural architecture search (NAS) algorithm.
Our algorithm capitalizes on the effectiveness of self-supervised learning for image representations.
Extensive experiments show that our search algorithm achieves state-of-the-art results.
arXiv Detail & Related papers (2021-02-21T08:38:28Z)
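Cell-based differentiable search is usually written as a bilevel problem; here the labeled losses on both levels are swapped for contrastive self-supervised ones. In LaTeX (our reading of the general recipe, not necessarily this paper's exact formulation):

```latex
\min_{\alpha}\; \mathcal{L}_{\mathrm{val}}^{\mathrm{contrastive}}\!\bigl(w^{*}(\alpha),\, \alpha\bigr)
\quad \text{s.t.} \quad
w^{*}(\alpha) \;=\; \arg\min_{w}\; \mathcal{L}_{\mathrm{train}}^{\mathrm{contrastive}}(w,\, \alpha)
```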
- Graph-Based Neural Network Models with Multiple Self-Supervised Auxiliary Tasks [79.28094304325116]
Graph Convolutional Networks are among the most promising approaches for capturing relationships among structured data points.
We propose three novel self-supervised auxiliary tasks to train graph-based neural network models in a multi-task fashion.
arXiv Detail & Related papers (2020-11-14T11:09:51Z)
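Multi-task training here means the main objective plus weighted self-supervised auxiliary losses, all backpropagated jointly; the summary does not spell out the three tasks, so this sketch keeps them abstract:

```python
def multi_task_loss(main_loss, aux_losses, aux_weights):
    """Combine the main (e.g., node classification) loss with weighted
    self-supervised auxiliary losses into one training objective."""
    return main_loss + sum(w * l for w, l in zip(aux_weights, aux_losses))

# e.g. total = multi_task_loss(cls_loss, [aux1, aux2, aux3], [0.5, 0.5, 0.5])
```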
- MS-RANAS: Multi-Scale Resource-Aware Neural Architecture Search [94.80212602202518]
We propose Multi-Scale Resource-Aware Neural Architecture Search (MS-RANAS).
We employ a one-shot architecture search approach to reduce the search cost.
We achieve state-of-the-art results in terms of the accuracy-speed trade-off.
arXiv Detail & Related papers (2020-09-29T11:56:01Z)
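A one-shot search trains a single weight-sharing supernet and samples one candidate path per step, so many architectures are evaluated for the price of one training run. A minimal sketch of such an edge (the operation set is assumed for illustration):

```python
import random
import torch.nn as nn

class OneShotEdge(nn.Module):
    """Weight-sharing edge of a one-shot supernet: all candidate ops live
    in one model, and each forward pass exercises a single sampled op."""
    def __init__(self, channels):
        super().__init__()
        self.ops = nn.ModuleList([
            nn.Identity(),
            nn.Conv2d(channels, channels, 3, padding=1, bias=False),
            nn.MaxPool2d(3, stride=1, padding=1),
        ])

    def forward(self, x, op_idx=None):
        if op_idx is None:              # training: sample a random path
            op_idx = random.randrange(len(self.ops))
        return self.ops[op_idx](x)
```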
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences of its use.