Combined Depth Space based Architecture Search For Person
Re-identification
- URL: http://arxiv.org/abs/2104.04163v1
- Date: Fri, 9 Apr 2021 02:40:01 GMT
- Title: Combined Depth Space based Architecture Search For Person
Re-identification
- Authors: Hanjun Li, Gaojie Wu, Wei-Shi Zheng
- Abstract summary: We aim to design a lightweight and suitable network for person re-identification (ReID).
We propose a novel search space called Combined Depth Space (CDS), based on which we search for an efficient network architecture, called CDNet.
We then propose a low-cost search strategy, named the Top-k Sample Search strategy, to make full use of the search space and avoid being trapped in a local optimum.
- Score: 70.86236888223569
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Most works on person re-identification (ReID) take advantage of large
backbone networks such as ResNet, which are designed for image classification
instead of ReID, for feature extraction. However, these backbones may not be
computationally efficient or the most suitable architectures for ReID. In this
work, we aim to design a lightweight and suitable network for ReID. We propose
a novel search space called Combined Depth Space (CDS), based on which we
search for an efficient network architecture, which we call CDNet, via a
differentiable architecture search algorithm. Through the use of the combined
basic building blocks in CDS, CDNet tends to focus on combined pattern
information that is typically found in images of pedestrians. We then propose a
low-cost search strategy named the Top-k Sample Search strategy to make full
use of the search space and avoid being trapped in a local optimum.
Furthermore, an effective Fine-grained Balance Neck (FBLNeck), which is
removable at the inference time, is presented to balance the effects of triplet
loss and softmax loss during the training process. Extensive experiments show
that our CDNet (~1.8M parameters) achieves performance comparable to
state-of-the-art lightweight networks.
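The differentiable search over CDS and the Top-k Sample idea can be sketched roughly as follows. This is an illustrative toy, not the paper's implementation: the candidate-operation names, their behaviors, and the value of k are invented for the example.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Hypothetical candidate operations for one searchable layer
# (names and behaviors are illustrative, not the paper's actual CDS blocks).
CANDIDATE_OPS = {
    "combined_block_3x3": lambda x: 0.9 * x,
    "combined_block_5x5": lambda x: 0.8 * x,
    "skip_connect":       lambda x: x,
}

def mixed_op(x, alphas):
    """Differentiable-NAS relaxation: the layer's output is the
    softmax-weighted sum of all candidate operations."""
    w = softmax(alphas)
    return sum(wi * op(x) for wi, op in zip(w, CANDIDATE_OPS.values()))

def top_k_sample(alphas, k=2):
    """Sketch of a top-k sampling idea: keep the k candidates with the
    largest architecture weights instead of only the argmax, so the
    search can consider more of the space than a single greedy choice."""
    idx = np.argsort(alphas)[::-1][:k]
    names = list(CANDIDATE_OPS)
    return [names[i] for i in idx]

alphas = np.array([0.2, 1.1, -0.3])   # learnable architecture parameters
y = mixed_op(np.ones(4), alphas)      # forward pass through the mixture
print(top_k_sample(alphas, k=2))      # -> ['combined_block_5x5', 'combined_block_3x3']
```

In a real search the architecture parameters would be trained jointly with the network weights, and the final architecture is read off from the strongest candidates per layer.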
Related papers
- Tiered Pruning for Efficient Differentiable Inference-Aware Neural
Architecture Search [0.0]
We first introduce a bi-path building block for DNAS, which can search over inner hidden dimensions at low memory and compute cost.
Second, we present an algorithm for pruning blocks within a layer of the SuperNet during the search.
Third, we describe a novel technique for pruning unnecessary layers during the search.
arXiv Detail & Related papers (2022-09-23T18:03:54Z)
- Pruning-as-Search: Efficient Neural Architecture Search via Channel
Pruning and Structural Reparameterization [50.50023451369742]
Pruning-as-Search (PaS) is an end-to-end channel pruning method to search out desired sub-network automatically and efficiently.
Our proposed architecture outperforms prior art by around 1.0% top-1 accuracy on the ImageNet-1000 classification task.
arXiv Detail & Related papers (2022-06-02T17:58:54Z)
- Partially-Connected Differentiable Architecture Search for Deepfake and
Spoofing Detection [14.792884010821762]
This paper reports the first successful application of a differentiable architecture search (DARTS) approach to the deepfake and spoofing detection problems.
DARTS operates upon a continuous, differentiable search space which enables both the architecture and parameters to be optimised via gradient descent.
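The continuous relaxation behind DARTS can be illustrated with a toy example (the two candidate operations, learning rate, and step count below are invented for illustration): the discrete choice between operations becomes a softmax mixture whose logits are themselves trained by gradient descent.

```python
import numpy as np

# Toy sketch of the DARTS relaxation (illustrative, not the paper's code):
# a discrete choice between two candidate operations is relaxed into a
# softmax mixture, making the architecture logits differentiable.
x, target = 1.5, 1.5
ops = [lambda v: v, lambda v: -v]   # two toy candidate operations

def softmax(a):
    e = np.exp(a - a.max())
    return e / e.sum()

def loss(alpha):
    m = softmax(alpha)
    out = sum(mi * op(x) for mi, op in zip(m, ops))
    return (out - target) ** 2

def grad(alpha, eps=1e-5):
    # central-difference numeric gradient, good enough for the sketch
    g = np.zeros_like(alpha)
    for i in range(alpha.size):
        d = np.zeros_like(alpha); d[i] = eps
        g[i] = (loss(alpha + d) - loss(alpha - d)) / (2 * eps)
    return g

alpha = np.zeros(2)                 # start from a uniform mixture
for _ in range(500):
    alpha -= 0.1 * grad(alpha)      # gradient descent on architecture logits

print(np.argmax(alpha))             # -> 0 (the identity op wins)
```

Reading off the argmax of the learned logits recovers a discrete architecture, which is how DARTS-style methods derive the final network after the search.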
arXiv Detail & Related papers (2021-04-07T13:53:20Z)
- Sequential End-to-end Network for Efficient Person Search [7.3658840620058115]
Person search aims at jointly solving Person Detection and Person Re-identification (re-ID).
Existing works have designed end-to-end networks based on Faster R-CNN.
We propose a Sequential End-to-end Network (SeqNet) to extract superior features.
arXiv Detail & Related papers (2021-03-18T10:28:24Z)
- Trilevel Neural Architecture Search for Efficient Single Image
Super-Resolution [127.92235484598811]
This paper proposes a trilevel neural architecture search (NAS) method for efficient single image super-resolution (SR).
To model the discrete search space, we apply a new continuous relaxation that builds a hierarchical mixture of network paths, cell operations, and kernel widths.
An efficient search algorithm is proposed to perform optimization in a hierarchical supernet manner.
arXiv Detail & Related papers (2021-01-17T12:19:49Z)
- ISTA-NAS: Efficient and Consistent Neural Architecture Search by Sparse
Coding [86.40042104698792]
We formulate neural architecture search as a sparse coding problem.
In experiments, our two-stage method on CIFAR-10 requires only 0.05 GPU-day for search.
Our one-stage method produces state-of-the-art performances on both CIFAR-10 and ImageNet at the cost of only evaluation time.
arXiv Detail & Related papers (2020-10-13T04:34:24Z)
- Cyclic Differentiable Architecture Search [99.12381460261841]
Differentiable ARchiTecture Search, i.e., DARTS, has drawn great attention in neural architecture search.
We propose new joint objectives and a novel Cyclic Differentiable ARchiTecture Search framework, dubbed CDARTS.
In the DARTS search space, we achieve 97.52% top-1 accuracy on CIFAR10 and 76.3% top-1 accuracy on ImageNet.
arXiv Detail & Related papers (2020-06-18T17:55:19Z)
- DC-NAS: Divide-and-Conquer Neural Architecture Search [108.57785531758076]
We present a divide-and-conquer (DC) approach to effectively and efficiently search deep neural architectures.
We achieve 75.1% top-1 accuracy on the ImageNet dataset, higher than that of state-of-the-art methods using the same search space.
arXiv Detail & Related papers (2020-05-29T09:02:16Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information and is not responsible for any consequences of its use.