Exploring Complicated Search Spaces with Interleaving-Free Sampling
- URL: http://arxiv.org/abs/2112.02488v1
- Date: Sun, 5 Dec 2021 06:42:48 GMT
- Title: Exploring Complicated Search Spaces with Interleaving-Free Sampling
- Authors: Yunjie Tian, Lingxi Xie, Jiemin Fang, Jianbin Jiao, Qixiang Ye, Qi
Tian
- Abstract summary: In this paper, we build the search algorithm upon a complicated search space with long-distance connections.
We present a simple yet effective algorithm named IF-NAS, where we perform a periodic sampling strategy to construct different sub-networks.
In the proposed search space, IF-NAS outperforms both random sampling and previous weight-sharing search algorithms by a significant margin.
- Score: 127.07551427957362
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The existing neural architecture search algorithms mostly work on
search spaces with short-distance connections. We argue that such designs,
though safe and stable, prevent the search algorithms from exploring more
complicated scenarios. In this paper, we build the search algorithm upon a
complicated search space with long-distance connections, and show that existing
weight-sharing search algorithms mostly fail due to the existence of
interleaved connections. Based on this observation, we present a simple
yet effective algorithm named IF-NAS, where we perform a periodic
sampling strategy to construct different sub-networks during the search
procedure, preventing interleaved connections from emerging in any of them. In
the proposed search space, IF-NAS outperforms both random sampling and previous
weight-sharing search algorithms by a significant margin. IF-NAS also
generalizes to micro cell-based spaces, which are much easier. Our research
emphasizes the importance of macro structure, and we look forward to further
efforts along this direction.
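The abstract describes the periodic sampling strategy only at a high level. As an illustration of the core idea, training sub-networks that never contain a pair of interleaved (crossing, non-nested) connections, here is a minimal Python sketch; the crossing test and the greedy grouping below are assumptions made for illustration, not the paper's exact procedure.

```python
def interleaved(e1, e2):
    """Two long-distance connections (i, j) and (k, l) interleave when
    they cross without nesting: i < k < j < l after sorting."""
    (i, j), (k, l) = sorted([e1, e2])
    return i < k < j < l

def interleaving_free_groups(edges):
    """Greedily partition candidate connections into groups such that
    no group contains an interleaved pair (assumed grouping scheme)."""
    groups = []
    for edge in sorted(edges):
        for group in groups:
            if not any(interleaved(edge, other) for other in group):
                group.append(edge)
                break
        else:
            groups.append([edge])
    return groups

def periodic_subnetwork_sampler(edges, num_steps):
    """Cycle through the groups so that every connection gets trained,
    while no sampled sub-network mixes interleaved connections."""
    groups = interleaving_free_groups(edges)
    for step in range(num_steps):
        yield groups[step % len(groups)]

# Toy macro space: nodes 0..4, long-distance connections as (src, dst).
edges = [(0, 2), (0, 3), (1, 3), (1, 4), (2, 4)]
for step, subnet in enumerate(periodic_subnetwork_sampler(edges, 6)):
    print(f"step {step}: active connections {subnet}")
```

Random sampling, by contrast, would draw arbitrary subsets of edges and therefore routinely activate interleaved pairs, which is the failure mode the abstract attributes to existing weight-sharing methods.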
Related papers
- TopoNAS: Boosting Search Efficiency of Gradient-based NAS via Topological Simplification [11.08910129925713]
TopoNAS is a model-agnostic approach for gradient-based one-shot NAS.
It significantly reduces search time and memory usage through topological simplification of searchable paths.
arXiv Detail & Related papers (2024-08-02T15:01:29Z)
- Construction of Hierarchical Neural Architecture Search Spaces based on Context-free Grammars [66.05096551112932]
We introduce a unifying search space design framework based on context-free grammars.
By enhancing and using their properties, we effectively enable search over the complete architecture.
We show that our search strategy can be superior to existing Neural Architecture Search approaches.
arXiv Detail & Related papers (2022-11-03T14:23:00Z)
- Automated Dominative Subspace Mining for Efficient Neural Architecture Search [36.06889021273405]
We propose a novel Neural Architecture Search method via Dominative Subspace Mining (DSM-NAS).
DSM-NAS finds promising architectures in automatically mined subspaces.
Experimental results demonstrate that DSM-NAS not only reduces the search cost but also discovers better architectures than state-of-the-art methods in various benchmark search spaces.
arXiv Detail & Related papers (2022-10-31T09:54:28Z)
- Searching a High-Performance Feature Extractor for Text Recognition Network [92.12492627169108]
We design a domain-specific search space by exploring principles for designing good feature extractors.
As the space is huge and complex in structure, no existing NAS algorithm can be applied.
We propose a two-stage algorithm to effectively search in the space.
arXiv Detail & Related papers (2022-09-27T03:49:04Z)
- Efficient Joint-Dimensional Search with Solution Space Regularization for Real-Time Semantic Segmentation [27.94898516315886]
We search for an optimal network structure that can run in real time for this problem.
A novel Solution Space Regularization (SSR) loss is first proposed to effectively encourage the supernet to converge to its discrete counterpart; a minimal illustration of this idea appears after this list.
A new Hierarchical and Progressive Solution Space Shrinking method is presented to further improve search efficiency.
arXiv Detail & Related papers (2022-08-10T11:07:33Z)
- RF-Next: Efficient Receptive Field Search for Convolutional Neural Networks [86.6139619721343]
We propose to find better receptive field combinations through a global-to-local search scheme.
Our search scheme exploits global search to find coarse combinations and local search to refine the receptive field combinations.
Our RF-Next models, which plug receptive field search into various models, boost performance on many tasks.
arXiv Detail & Related papers (2022-06-14T06:56:26Z)
- CrossBeam: Learning to Search in Bottom-Up Program Synthesis [51.37514793318815]
We propose training a neural model to learn a hands-on search policy for bottom-up synthesis.
Our approach, called CrossBeam, uses the neural model to choose how to combine previously-explored programs into new programs.
We observe that CrossBeam learns to search efficiently, exploring much smaller portions of the program space compared to the state-of-the-art.
arXiv Detail & Related papers (2022-03-20T04:41:05Z)
- AutoSpace: Neural Architecture Search with Less Human Interference [84.42680793945007]
Current neural architecture search (NAS) algorithms still require expert knowledge and effort to design a search space for network construction.
We propose a novel differentiable evolutionary framework named AutoSpace, which evolves the search space to an optimal one.
With the learned search space, the performance of recent NAS algorithms can be improved significantly compared with using previously manually designed spaces.
arXiv Detail & Related papers (2021-03-22T13:28:56Z)
- Searching for a Search Method: Benchmarking Search Algorithms for Generating NLP Adversarial Examples [10.993342896547691]
We study the behavior of several black-box search algorithms used for generating adversarial examples for natural language processing (NLP) tasks.
We perform a fine-grained analysis of three elements relevant to search: search algorithm, search space, and search budget.
arXiv Detail & Related papers (2020-09-09T17:04:42Z)
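The SSR loss from the Joint-Dimensional Search entry above is only summarized, not specified, in this list. As a minimal sketch of the general idea, a regularizer that pushes a supernet's continuous architecture weights toward a discrete, one-hot selection, the following uses an entropy penalty on softmaxed architecture parameters; the tensor shapes, the entropy form, and all names here are illustrative assumptions, not the paper's formulation.

```python
import torch
import torch.nn.functional as F

def discreteness_penalty(arch_logits: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Mean entropy of the softmaxed architecture weights per edge.
    Minimizing it pushes each edge's operator distribution toward
    one-hot, i.e. toward a discrete architecture (assumed regularizer)."""
    probs = F.softmax(arch_logits, dim=-1)             # (num_edges, num_ops)
    entropy = -(probs * torch.log(probs + eps)).sum(dim=-1)
    return entropy.mean()

# Hypothetical usage inside a one-shot NAS training step:
arch_logits = torch.randn(8, 5, requires_grad=True)    # 8 edges, 5 candidate ops
task_loss = torch.tensor(0.0)                          # stand-in for the supernet loss
lam = 0.1                                              # regularization strength
loss = task_loss + lam * discreteness_penalty(arch_logits)
loss.backward()
```

Driving the entropy term down collapses each edge's operator distribution onto a single operator, so the continuous supernet converges toward the discrete architecture that is eventually extracted.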
This list is automatically generated from the titles and abstracts of the papers on this site.