DAAS: Differentiable Architecture and Augmentation Policy Search
- URL: http://arxiv.org/abs/2109.15273v1
- Date: Thu, 30 Sep 2021 17:15:17 GMT
- Title: DAAS: Differentiable Architecture and Augmentation Policy Search
- Authors: Xiaoxing Wang, Xiangxiang Chu, Junchi Yan, Xiaokang Yang
- Abstract summary: This work considers the possible coupling between neural architectures and data augmentation and proposes an effective algorithm that jointly searches for both.
Our approach achieves 97.91% accuracy on CIFAR-10 and 76.6% Top-1 accuracy on the ImageNet dataset, demonstrating the strong performance of our search algorithm.
- Score: 107.53318939844422
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Neural architecture search (NAS) has been an active direction of automatic
machine learning (Auto-ML), aiming to explore efficient network structures. The
searched architecture is evaluated by training on datasets with fixed data
augmentation policies. However, recent work on auto-augmentation shows that the
best-suited augmentation policies can vary across different network structures.
Therefore, this work considers the possible coupling between neural architectures
and data augmentation and proposes an effective algorithm that jointly searches for both.
Specifically, 1) for the NAS task, we adopt a single-path differentiable method
with a Gumbel-softmax reparameterization strategy due to its memory efficiency;
2) for the auto-augmentation task, we introduce a novel search method based on
the policy gradient algorithm, which can significantly reduce the computational
complexity. Our approach achieves 97.91% accuracy on CIFAR-10 and 76.6% Top-1
accuracy on the ImageNet dataset, demonstrating the strong performance of our
search algorithm.
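For concreteness, below is a minimal PyTorch sketch of single-path operation selection with Gumbel-softmax reparameterization, the mechanism named in point 1 of the abstract. The class name, candidate operations, and tensor shapes are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (assumptions, not the DAAS code) of single-path differentiable
# operation selection with Gumbel-softmax on one edge of a searched cell.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MixedEdge(nn.Module):
    """One edge of the cell: only one candidate op is executed per forward pass."""

    def __init__(self, ops, tau=1.0):
        super().__init__()
        self.ops = nn.ModuleList(ops)                      # candidate operations
        self.alpha = nn.Parameter(torch.zeros(len(ops)))   # architecture logits
        self.tau = tau                                     # Gumbel-softmax temperature

    def forward(self, x):
        # Hard (single-path) sample: only the chosen op runs, which saves memory,
        # while straight-through gradients still flow back to all logits.
        weights = F.gumbel_softmax(self.alpha, tau=self.tau, hard=True)
        index = weights.argmax().item()
        return weights[index] * self.ops[index](x)

# Example usage with two toy candidate ops on a CIFAR-sized feature map.
edge = MixedEdge([nn.Conv2d(16, 16, 3, padding=1), nn.Identity()])
out = edge(torch.randn(2, 16, 32, 32))
```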
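Similarly, the following is a hedged REINFORCE-style sketch of how an augmentation policy could be searched with policy gradients, as point 2 describes. The operation count, reward function, and hyperparameters here are placeholders rather than the DAAS procedure.

```python
# Minimal policy-gradient (REINFORCE) sketch for augmentation policy search.
# The validation_accuracy() stub stands in for actually training/evaluating
# the network under the sampled augmentation.
import torch
import torch.nn.functional as F

num_ops = 5                                              # e.g. rotate, shear, color, ...
logits = torch.zeros(num_ops, requires_grad=True)        # augmentation policy parameters
optimizer = torch.optim.Adam([logits], lr=0.05)
baseline = 0.0                                           # moving-average reward baseline

def validation_accuracy(op_index: int) -> float:
    """Placeholder reward: accuracy obtained when training with augmentation op_index."""
    return torch.rand(1).item()                          # stands in for a real measurement

for step in range(100):
    probs = F.softmax(logits, dim=-1)
    dist = torch.distributions.Categorical(probs)
    op = dist.sample()                                   # sample one augmentation operation
    reward = validation_accuracy(op.item())
    baseline = 0.9 * baseline + 0.1 * reward             # reduces gradient variance
    loss = -(reward - baseline) * dist.log_prob(op)      # REINFORCE objective
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```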
Related papers
- A Pairwise Comparison Relation-assisted Multi-objective Evolutionary Neural Architecture Search Method with Multi-population Mechanism [58.855741970337675]
Neural architecture search (NAS) enables researchers to automatically explore vast search spaces and find efficient neural networks.
NAS suffers from a key bottleneck: numerous architectures need to be evaluated during the search process.
We propose SMEM-NAS, a pairwise comparison relation-assisted multi-objective evolutionary algorithm based on a multi-population mechanism.
arXiv Detail & Related papers (2024-07-22T12:46:22Z)
- Pruning-as-Search: Efficient Neural Architecture Search via Channel Pruning and Structural Reparameterization [50.50023451369742]
Pruning-as-Search (PaS) is an end-to-end channel pruning method that automatically and efficiently searches for the desired sub-network.
Our proposed architecture outperforms prior arts by around 1.0% top-1 accuracy on the ImageNet-1000 classification task.
arXiv Detail & Related papers (2022-06-02T17:58:54Z)
- Efficient Architecture Search for Diverse Tasks [29.83517145790238]
We study neural architecture search (NAS) for efficiently solving diverse problems.
We introduce DASH, a differentiable NAS algorithm that computes the mixture-of-operations using the Fourier diagonalization of convolution.
We evaluate DASH on NAS-Bench-360, a suite of ten tasks designed for NAS benchmarking in diverse domains.
arXiv Detail & Related papers (2022-04-15T17:21:27Z)
- ZARTS: On Zero-order Optimization for Neural Architecture Search [94.41017048659664]
Differentiable architecture search (DARTS) has been a popular one-shot paradigm for NAS due to its high efficiency.
This work turns to zero-order optimization and proposes a novel NAS scheme, called ZARTS, which searches without enforcing the gradient approximation used by DARTS.
In particular, results on 12 benchmarks verify the outstanding robustness of ZARTS, where the performance of DARTS collapses due to its known instability issue.
arXiv Detail & Related papers (2021-10-10T09:35:15Z)
- AutoSpace: Neural Architecture Search with Less Human Interference [84.42680793945007]
Current neural architecture search (NAS) algorithms still require expert knowledge and effort to design a search space for network construction.
We propose a novel differentiable evolutionary framework named AutoSpace, which evolves the search space to an optimal one.
With the learned search space, the performance of recent NAS algorithms can be improved significantly compared with using previously manually designed spaces.
arXiv Detail & Related papers (2021-03-22T13:28:56Z)
- ISTA-NAS: Efficient and Consistent Neural Architecture Search by Sparse Coding [86.40042104698792]
We formulate neural architecture search as a sparse coding problem.
In experiments, our two-stage method on CIFAR-10 requires only 0.05 GPU-day for search.
Our one-stage method produces state-of-the-art performance on both CIFAR-10 and ImageNet at the cost of only evaluation time.
arXiv Detail & Related papers (2020-10-13T04:34:24Z)
- Hypernetwork-Based Augmentation [1.6752182911522517]
We propose an efficient gradient-based search algorithm, called Hypernetwork-Based Augmentation (HBA).
Our HBA uses a hypernetwork to approximate a population-based training algorithm.
Our results show that HBA is competitive to the state-of-the-art methods in terms of both search speed and accuracy.
arXiv Detail & Related papers (2020-06-11T10:36:39Z)
- Hyperparameter optimization with REINFORCE and Transformers [2.1404235519012076]
Reinforcement Learning has yielded promising results for Neural Architecture Search (NAS).
We demonstrate how its performance can be improved by using a simplified Transformer block to model the policy network.
arXiv Detail & Related papers (2020-06-01T13:35:48Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.