Towards Self-supervised and Weight-preserving Neural Architecture Search
- URL: http://arxiv.org/abs/2206.04125v1
- Date: Wed, 8 Jun 2022 18:48:05 GMT
- Title: Towards Self-supervised and Weight-preserving Neural Architecture Search
- Authors: Zhuowei Li, Yibo Gao, Zhenzhou Zha, Zhiqiang Hu, Qing Xia, Shaoting
Zhang, Dimitris N. Metaxas
- Abstract summary: We propose the self-supervised and weight-preserving neural architecture search (SSWP-NAS) as an extension of the current NAS framework.
Experiments show that the architectures searched by the proposed framework achieve state-of-the-art accuracy on CIFAR-10, CIFAR-100, and ImageNet datasets.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Neural architecture search (NAS) algorithms save tremendous labor from human
experts. Recent advancements further reduce the computational overhead to an
affordable level. However, it is still cumbersome to deploy the NAS techniques
in real-world applications due to the fussy procedures and the supervised
learning paradigm. In this work, we propose the self-supervised and
weight-preserving neural architecture search (SSWP-NAS) as an extension of the
current NAS framework by allowing self-supervision and retaining the
concomitant weights discovered during the search stage. As such, we simplify
the workflow of NAS to a one-stage and proxy-free procedure. Experiments show
that the architectures searched by the proposed framework achieve
state-of-the-art accuracy on CIFAR-10, CIFAR-100, and ImageNet datasets without
using manual labels. Moreover, we show that employing the concomitant weights
as initialization consistently outperforms the random initialization and the
two-stage weight pre-training method by a clear margin under semi-supervised
learning scenarios. Code is publicly available at
https://github.com/LzVv123456/SSWP-NAS.
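
The sketch below is not the authors' code; it only illustrates the two ideas named in the abstract, under stated assumptions: a label-free (self-supervised) training/search stage, followed by a weight-preserving hand-off where the concomitant weights found during search are reused as the fine-tuning initialization instead of re-initializing the searched architecture. The module, loss, and function names (TinyBackbone, self_supervised_search_step, the noisy-view consistency objective) are hypothetical placeholders, not SSWP-NAS internals.

```python
# Minimal sketch (assumed structure, not the SSWP-NAS implementation).
import copy
import torch
import torch.nn as nn

class TinyBackbone(nn.Module):
    """Stand-in for the architecture produced by the search stage."""
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.head = nn.Linear(16, num_classes)

    def forward(self, x):
        return self.head(self.features(x))

def self_supervised_search_step(model, x_unlabeled, optimizer):
    """One step of a stand-in self-supervised objective (consistency between
    two noisy views); the actual framework defines its own pretext task."""
    view_a = x_unlabeled + 0.1 * torch.randn_like(x_unlabeled)
    view_b = x_unlabeled + 0.1 * torch.randn_like(x_unlabeled)
    loss = nn.functional.mse_loss(model.features(view_a), model.features(view_b))
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# --- search stage: label-free training on the target data ----------------
model = TinyBackbone()
opt = torch.optim.SGD(model.parameters(), lr=0.01)
for _ in range(5):                      # toy loop; the real search runs far longer
    x = torch.randn(8, 3, 32, 32)       # unlabeled images
    self_supervised_search_step(model, x, opt)

# --- weight-preserving hand-off -------------------------------------------
# Keep the concomitant weights discovered during search as the fine-tuning
# initialization, rather than discarding them and starting from random init.
concomitant_state = copy.deepcopy(model.state_dict())
finetune_model = TinyBackbone()
finetune_model.load_state_dict(concomitant_state)
```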