LISSNAS: Locality-based Iterative Search Space Shrinkage for Neural
Architecture Search
- URL: http://arxiv.org/abs/2307.03110v1
- Date: Thu, 6 Jul 2023 16:28:51 GMT
- Title: LISSNAS: Locality-based Iterative Search Space Shrinkage for Neural
Architecture Search
- Authors: Bhavna Gopal, Arjun Sridhar, Tunhou Zhang and Yiran Chen
- Abstract summary: We propose an automated algorithm that shrinks a large space into a diverse, small search space with SOTA search performance.
Our method achieves a SOTA Top-1 accuracy of 77.6% on ImageNet under mobile constraints, along with best-in-class Kendall-Tau, architectural diversity, and search space size.
- Score: 30.079267927860347
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Search spaces are a hallmark of progress in Neural Architecture Search (NAS).
Large and complex search spaces with versatile building operators and
structures provide more opportunities to brew promising architectures, yet pose
severe challenges to efficient exploration and exploitation. Consequently,
several search space shrinkage methods optimize the search by selecting a single
sub-region that contains some well-performing networks. These methods yield
small gains in performance and efficiency, but they leave substantial room for
improved search performance and are ineffective at retaining architectural
diversity. We propose LISSNAS, an automated algorithm that
shrinks a large space into a diverse, small search space with SOTA search
performance. Our approach leverages locality, the relationship between
structural and performance similarity, to efficiently extract many pockets of
well-performing networks. We showcase our method on an array of search spaces
spanning various sizes and datasets. We demonstrate the effectiveness of our
shrunk spaces in one-shot search by achieving the best Top-1 accuracy in two
different search spaces. Our method achieves a SOTA Top-1 accuracy of 77.6% on
ImageNet under mobile constraints, along with best-in-class Kendall-Tau,
architectural diversity, and search space size.
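As an illustration of the locality idea (structurally similar architectures tend to have similar performance), the sketch below shows a toy shrinkage procedure. It is not the LISSNAS algorithm: the fixed-length operation-tuple encoding, the Hamming distance used as structural similarity, and the OP_SCORE proxy are all hypothetical stand-ins. The sketch simply keeps the structural neighborhoods around top-scoring architectures, so several diverse pockets survive instead of a single sub-region, and it ends with a Kendall-Tau computation of the kind of rank-correlation metric quoted above.

```python
# Illustrative sketch only -- not the LISSNAS algorithm. Assumed encoding:
# an architecture is a fixed-length tuple of operation names; Hamming
# distance stands in for structural similarity; OP_SCORE is a made-up proxy.
import random
from scipy.stats import kendalltau  # rank-correlation metric (Kendall-Tau)

OPS = ("conv3x3", "conv5x5", "sep3x3", "skip", "maxpool")
NUM_SLOTS = 6
OP_SCORE = {"conv3x3": 1.0, "conv5x5": 1.2, "sep3x3": 0.9, "skip": 0.3, "maxpool": 0.5}


def hamming(a, b):
    """Structural distance: number of slots whose operations differ."""
    return sum(x != y for x, y in zip(a, b))


def proxy_score(arch):
    """Hypothetical cheap performance estimate (toy stand-in for a real proxy)."""
    return sum(OP_SCORE[op] for op in arch)


def shrink(space, keep_top=10, radius=2):
    """Keep the structural neighborhoods ('pockets') around the top-scoring
    architectures, so several diverse sub-regions survive instead of one."""
    anchors = sorted(space, key=proxy_score, reverse=True)[:keep_top]
    return [a for a in space if any(hamming(a, c) <= radius for c in anchors)]


if __name__ == "__main__":
    random.seed(0)
    full_space = [tuple(random.choice(OPS) for _ in range(NUM_SLOTS))
                  for _ in range(2000)]
    shrunk_space = shrink(full_space)
    print(f"shrunk {len(full_space)} -> {len(shrunk_space)} architectures")

    # Kendall-Tau between the proxy ranking and a simulated "true" accuracy:
    # a higher tau means the shrunk space preserves the performance ranking.
    sample = random.sample(shrunk_space, min(50, len(shrunk_space)))
    proxy = [proxy_score(a) for a in sample]
    true_acc = [p + random.gauss(0, 0.3) for p in proxy]  # placeholder ground truth
    tau, _ = kendalltau(proxy, true_acc)
    print(f"Kendall-Tau over sampled architectures: {tau:.2f}")
```

In practice the proxy would be a supernet or predictor score and the distance an encoding-aware metric; the sketch only shows why locality lets many high-quality pockets be retained, which is what the diversity and Kendall-Tau claims above measure.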
Related papers
- Construction of Hierarchical Neural Architecture Search Spaces based on
Context-free Grammars [66.05096551112932]
We introduce a unifying search space design framework based on context-free grammars.
By enhancing and using their properties, we effectively enable search over the complete architecture.
We show that our search strategy can be superior to existing Neural Architecture Search approaches.
arXiv Detail & Related papers (2022-11-03T14:23:00Z) - Automated Dominative Subspace Mining for Efficient Neural Architecture Search [36.06889021273405]
We propose a novel Neural Architecture Search method via Dominative Subspace Mining (DSM-NAS).
DSM-NAS finds promising architectures in automatically mined subspaces.
Experimental results demonstrate that DSM-NAS not only reduces the search cost but also discovers better architectures than state-of-the-art methods in various benchmark search spaces.
arXiv Detail & Related papers (2022-10-31T09:54:28Z) - Searching a High-Performance Feature Extractor for Text Recognition
Network [92.12492627169108]
We design a domain-specific search space by exploring principles for designing good feature extractors.
As the space is huge and complexly structured, no existing NAS algorithms can be applied.
We propose a two-stage algorithm to effectively search in the space.
arXiv Detail & Related papers (2022-09-27T03:49:04Z) - BossNAS: Exploring Hybrid CNN-transformers with Block-wisely
Self-supervised Neural Architecture Search [100.28980854978768]
We present Block-wisely Self-supervised Neural Architecture Search (BossNAS).
We factorize the search space into blocks and utilize a novel self-supervised training scheme, named ensemble bootstrapping, to train each block separately.
We also present HyTra search space, a fabric-like hybrid CNN-transformer search space with searchable down-sampling positions.
arXiv Detail & Related papers (2021-03-23T10:05:58Z) - AutoSpace: Neural Architecture Search with Less Human Interference [84.42680793945007]
Current neural architecture search (NAS) algorithms still require expert knowledge and effort to design a search space for network construction.
We propose a novel differentiable evolutionary framework named AutoSpace, which evolves the search space to an optimal one.
With the learned search space, the performance of recent NAS algorithms can be improved significantly compared with using previously manually designed spaces.
arXiv Detail & Related papers (2021-03-22T13:28:56Z) - ISTA-NAS: Efficient and Consistent Neural Architecture Search by Sparse
Coding [86.40042104698792]
We formulate neural architecture search as a sparse coding problem.
In experiments, our two-stage method on CIFAR-10 requires only 0.05 GPU-day for search.
Our one-stage method produces state-of-the-art performances on both CIFAR-10 and ImageNet at the cost of only evaluation time.
arXiv Detail & Related papers (2020-10-13T04:34:24Z) - Breaking the Curse of Space Explosion: Towards Efficient NAS with
Curriculum Search [94.46818035655943]
We propose a curriculum search method that starts from a small search space and gradually incorporates the learned knowledge to guide the search in a large space.
With the proposed search strategy, our Curriculum Neural Architecture Search (CNAS) method significantly improves the search efficiency and finds better architectures than existing NAS methods.
arXiv Detail & Related papers (2020-07-07T02:29:06Z) - Neural Architecture Generator Optimization [9.082931889304723]
We are the first to investigate casting NAS as a problem of finding the optimal network generator.
We propose a new, hierarchical and graph-based search space capable of representing an extremely large variety of network types.
arXiv Detail & Related papers (2020-04-03T06:38:07Z)