NAS-FCOS: Efficient Search for Object Detection Architectures
- URL: http://arxiv.org/abs/2110.12423v1
- Date: Sun, 24 Oct 2021 12:20:04 GMT
- Authors: Ning Wang and Yang Gao and Hao Chen and Peng Wang and Zhi Tian and
Chunhua Shen and Yanning Zhang
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Neural Architecture Search (NAS) has shown great potential in effectively
reducing manual effort in network design by automatically discovering optimal
architectures. What is noteworthy is that as of now, object detection is less
touched by NAS algorithms despite its significant importance in computer
vision. To the best of our knowledge, most of the recent NAS studies on object
detection tasks fail to satisfactorily strike a balance between performance and
efficiency of the resulting models, let alone the excessive amount of
computational resources cost by those algorithms. Here we propose an efficient
method to obtain better object detectors by searching for the feature pyramid
network (FPN) as well as the prediction head of a simple anchor-free object
detector, namely, FCOS [36], using a tailored reinforcement learning paradigm.
With carefully designed search space, search algorithms, and strategies for
evaluating network quality, we are able to find top-performing detection
architectures within 4 days using 8 V100 GPUs. The discovered architectures
surpass state-of-the-art object detection models (such as Faster R-CNN,
RetinaNet, and FCOS) by 1.0 to 5.4 points in AP on the COCO dataset, with
comparable computation complexity and memory footprint, demonstrating the
efficacy of the proposed NAS method for object detection. Code is available at
https://github.com/Lausannen/NAS-FCOS.
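As a rough illustration of the reinforcement-learning search paradigm described above (not the paper's actual controller), architecture search can be sketched as a REINFORCE loop: a controller samples an architecture from the search space, a proxy evaluation returns a reward, and the policy is nudged toward high-reward architectures. Everything below is hypothetical: the op list, slot count, and `proxy_reward` are stand-ins for the paper's FPN/head search space and detector evaluation.

```python
import math
import random

random.seed(0)  # reproducibility of the toy run

# Hypothetical search space: for each of 4 FPN/head slots, pick one op.
OPS = ["conv3x3", "sep_conv3x3", "dil_conv3x3", "skip"]
NUM_SLOTS = 4
LR = 0.1

# One logit per (slot, op); the controller's policy is a softmax over ops.
logits = [[0.0] * len(OPS) for _ in range(NUM_SLOTS)]

def softmax(xs):
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def sample_architecture():
    """Sample one op index per slot from the current policy."""
    arch = []
    for slot in range(NUM_SLOTS):
        probs = softmax(logits[slot])
        r, acc = random.random(), 0.0
        for i, p in enumerate(probs):
            acc += p
            if r <= acc:
                arch.append(i)
                break
        else:
            arch.append(len(OPS) - 1)
    return arch

def proxy_reward(arch):
    """Stand-in for training and evaluating the sampled detector (e.g. AP on
    a proxy task). Here we pretend sep_conv3x3 (index 1) is best everywhere."""
    return sum(1.0 if op == 1 else 0.0 for op in arch) / NUM_SLOTS

baseline = 0.0
for step in range(1000):
    arch = sample_architecture()
    reward = proxy_reward(arch)
    baseline = 0.9 * baseline + 0.1 * reward   # moving-average baseline
    advantage = reward - baseline
    # REINFORCE update: raise the log-prob of sampled ops, scaled by advantage.
    for slot, op in enumerate(arch):
        probs = softmax(logits[slot])
        for i in range(len(OPS)):
            grad = (1.0 if i == op else 0.0) - probs[i]
            logits[slot][i] += LR * advantage * grad

best = [max(range(len(OPS)), key=lambda i: logits[s][i]) for s in range(NUM_SLOTS)]
print([OPS[i] for i in best])
```

In the real method the reward would come from training and evaluating a sampled FPN/head on a proxy detection task, which is where the carefully designed quality-evaluation strategies mentioned in the abstract matter; the policy-gradient skeleton itself stays the same.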
Related papers
- A Pairwise Comparison Relation-assisted Multi-objective Evolutionary Neural Architecture Search Method with Multi-population Mechanism [58.855741970337675]
Neural architecture search (NAS) enables researchers to automatically explore vast search spaces and find efficient neural networks.
NAS suffers from a key bottleneck, i.e., numerous architectures need to be evaluated during the search process.
We propose SMEM-NAS, a pairwise comparison relation-assisted multi-objective evolutionary algorithm based on a multi-population mechanism.
arXiv Detail & Related papers (2024-07-22T12:46:22Z)
- DNA Family: Boosting Weight-Sharing NAS with Block-Wise Supervisions [121.05720140641189]
We develop a family of models with the distilling neural architecture (DNA) techniques.
Our proposed DNA models can rate all architecture candidates, as opposed to previous works that can only access a sub-search space using algorithms.
Our models achieve state-of-the-art top-1 accuracy of 78.9% and 83.6% on ImageNet for a mobile convolutional network and a small vision transformer, respectively.
arXiv Detail & Related papers (2024-03-02T22:16:47Z)
- DONNAv2 -- Lightweight Neural Architecture Search for Vision tasks [6.628409795264665]
We present the next-generation neural architecture design for computationally efficient neural architecture distillation - DONNAv2.
DONNAv2 reduces the computational cost of DONNA by 10x for the larger datasets.
To improve the quality of NAS search space, DONNAv2 leverages a block knowledge distillation filter to remove blocks with high inference costs.
arXiv Detail & Related papers (2023-09-26T04:48:50Z)
- GeNAS: Neural Architecture Search with Better Generalization [14.92869716323226]
Recent neural architecture search (NAS) approaches rely on validation loss or accuracy to find the superior network for the target data.
In this paper, we investigate a new neural architecture search measure for excavating architectures with better generalization.
arXiv Detail & Related papers (2023-05-15T12:44:54Z)
- Multi-Objective Evolutionary for Object Detection Mobile Architectures Search [21.14296703753317]
We propose a mobile object detection backbone network architecture search algorithm based on non-dominated sorting for NAS scenarios.
The proposed approach can search the backbone networks with different depths, widths, or expansion sizes via a technique of weight mapping.
Under similar computational complexity, the searched backbone architecture achieves 2.0% higher mAP than MobileDet.
arXiv Detail & Related papers (2022-11-05T00:28:49Z)
- EAutoDet: Efficient Architecture Search for Object Detection [110.99532343155073]
EAutoDet framework can discover practical backbone and FPN architectures for object detection in 1.4 GPU-days.
We propose a kernel reusing technique by sharing the weights of candidate operations on one edge and consolidating them into one convolution.
In particular, the discovered architectures surpass state-of-the-art object detection NAS methods and achieve 40.1 mAP with 120 FPS and 49.2 mAP with 41.3 FPS on COCO test-dev set.
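The kernel-reusing idea can be illustrated with a toy sketch (pure Python; the 3x3 and 1x1 candidate ops and the architecture weights are hypothetical, not EAutoDet's actual implementation): because convolution is linear in the kernel, a weighted sum of candidate kernels, aligned to a common size, produces the same output as the weighted sum of the candidates' individual outputs, so all candidates on an edge can be evaluated with a single consolidated convolution.

```python
import random

def conv2d(img, kernel):
    """'Valid' 2D convolution (cross-correlation) on nested Python lists."""
    kh, kw = len(kernel), len(kernel[0])
    h, w = len(img), len(img[0])
    return [[sum(img[i + di][j + dj] * kernel[di][dj]
                 for di in range(kh) for dj in range(kw))
             for j in range(w - kw + 1)]
            for i in range(h - kh + 1)]

def pad_to_3x3(k1x1):
    """Embed a 1x1 kernel at the centre of a zero 3x3 kernel."""
    k = [[0.0] * 3 for _ in range(3)]
    k[1][1] = k1x1[0][0]
    return k

random.seed(0)
img = [[random.random() for _ in range(6)] for _ in range(6)]
k3 = [[random.random() for _ in range(3)] for _ in range(3)]
k1 = [[random.random()]]
a3, a1 = 0.7, 0.3           # architecture weights of the two candidate ops
k1p = pad_to_3x3(k1)

# Consolidate: one kernel = weighted sum of the aligned candidate kernels.
merged = [[a3 * k3[i][j] + a1 * k1p[i][j] for j in range(3)] for i in range(3)]

out_merged = conv2d(img, merged)   # single convolution, 4x4 output
out_3 = conv2d(img, k3)            # 4x4
out_1 = conv2d(img, k1)            # 6x6; crop to the 3x3 'valid' region below

# Linearity check: merged-kernel output == weighted sum of branch outputs.
out_sum = [[a3 * out_3[i][j] + a1 * out_1[i + 1][j + 1] for j in range(4)]
           for i in range(4)]
err = max(abs(out_merged[i][j] - out_sum[i][j])
          for i in range(4) for j in range(4))
print("max abs difference:", err)
```

The difference is zero up to floating-point rounding, which is the property that lets weight-sharing NAS methods collapse an edge's candidate operations into one convolution during the search.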
arXiv Detail & Related papers (2022-03-21T05:56:12Z)
- Poisoning the Search Space in Neural Architecture Search [0.0]
We evaluate the robustness of one such algorithm known as Efficient NAS against data poisoning attacks on the original search space.
Our results provide insights into the challenges to surmount in using NAS for more adversarially robust architecture search.
arXiv Detail & Related papers (2021-06-28T05:45:57Z)
- Search to aggregate neighborhood for graph neural network [47.47628113034479]
We propose a framework, which tries to Search to Aggregate NEighborhood (SANE) to automatically design data-specific GNN architectures.
By designing a novel and expressive search space, we propose a differentiable search algorithm, which is more efficient than previous reinforcement learning based methods.
arXiv Detail & Related papers (2021-04-14T03:15:19Z)
- OPANAS: One-Shot Path Aggregation Network Architecture Search for Object Detection [82.04372532783931]
Recently, neural architecture search (NAS) has been exploited to design feature pyramid networks (FPNs).
We propose a novel One-Shot Path Aggregation Network Architecture Search (OPANAS) algorithm, which significantly improves both searching efficiency and detection accuracy.
arXiv Detail & Related papers (2021-03-08T01:48:53Z)
- DA-NAS: Data Adapted Pruning for Efficient Neural Architecture Search [76.9225014200746]
Efficient search is a core issue in Neural Architecture Search (NAS).
We present DA-NAS that can directly search the architecture for large-scale target tasks while allowing a large candidate set in a more efficient manner.
It is 2x faster than previous methods, while its accuracy is state-of-the-art at 76.2% under a small FLOPs constraint.
arXiv Detail & Related papers (2020-03-27T17:55:21Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of this information and is not responsible for any consequences arising from its use.