Lightweight Monocular Depth with a Novel Neural Architecture Search Method
- URL: http://arxiv.org/abs/2108.11105v1
- Date: Wed, 25 Aug 2021 08:06:28 GMT
- Title: Lightweight Monocular Depth with a Novel Neural Architecture Search Method
- Authors: Lam Huynh, Phong Nguyen, Jiri Matas, Esa Rahtu, Janne Heikkila
- Abstract summary: This paper presents a novel neural architecture search method, called LiDNAS, for generating lightweight monocular depth estimation models.
We construct the search space on a pre-defined backbone network to balance layer diversity and search space size.
The LiDNAS optimized models achieve results superior to compact depth estimation state-of-the-art on NYU-Depth-v2, KITTI, and ScanNet.
- Score: 46.97673710849343
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This paper presents a novel neural architecture search method, called LiDNAS,
for generating lightweight monocular depth estimation models. Unlike previous
neural architecture search (NAS) approaches, where finding optimized networks
is computationally highly demanding, the introduced Assisted Tabu Search
leads to efficient architecture exploration. Moreover, we construct the search
space on a pre-defined backbone network to balance layer diversity and search
space size. The LiDNAS method outperforms the state-of-the-art NAS approach,
proposed for disparity and depth estimation, in terms of search efficiency and
output model performance. The LiDNAS optimized models achieve results superior
to compact depth estimation state-of-the-art on NYU-Depth-v2, KITTI, and
ScanNet, while being 7%-500% more compact in size, i.e., in the number of model parameters.
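The abstract gives only the high-level recipe, so the following is a minimal sketch of what a tabu-style exploration over a fixed backbone can look like. It is an illustration under stated assumptions, not the paper's Assisted Tabu Search: the operation set OPS, the number of configurable blocks, the tabu tenure, and the evaluate_proxy scoring function are all placeholders.

```python
# Minimal, illustrative tabu search over a backbone-constrained search space.
# NOT the paper's Assisted Tabu Search: the encoding, move set, and scoring
# function are simplifying assumptions for this sketch.
import random

OPS = ["conv3x3", "conv5x5", "dwconv3x3", "mbconv_e3", "identity"]
NUM_BLOCKS = 8      # blocks of a fixed (pre-defined) backbone to re-configure
TABU_TENURE = 5     # iterations for which a visited architecture stays forbidden
ITERATIONS = 50

def evaluate_proxy(arch):
    """Stand-in for a cheap quality/size proxy; a real system would train or
    estimate the candidate model. Here: an arbitrary toy score."""
    return sum((hash((i, op)) % 100) / 100.0 for i, op in enumerate(arch)) / len(arch)

def neighbors(arch):
    """All architectures reachable by changing the operation of one block."""
    for i in range(len(arch)):
        for op in OPS:
            if op != arch[i]:
                cand = list(arch)
                cand[i] = op
                yield tuple(cand)

def tabu_search(seed=0):
    random.seed(seed)
    current = tuple(random.choice(OPS) for _ in range(NUM_BLOCKS))
    best, best_score = current, evaluate_proxy(current)
    tabu = {}  # architecture -> iteration until which it remains tabu
    for it in range(ITERATIONS):
        candidates = [a for a in neighbors(current) if tabu.get(a, -1) < it]
        if not candidates:
            break
        current = max(candidates, key=evaluate_proxy)  # best non-tabu move
        tabu[current] = it + TABU_TENURE
        score = evaluate_proxy(current)
        if score > best_score:
            best, best_score = current, score
    return best, best_score

if __name__ == "__main__":
    arch, score = tabu_search()
    print("best architecture:", arch, "proxy score:", round(score, 3))
```

Restricting every candidate to re-configurations of a pre-defined backbone, as the abstract describes, is what keeps the neighborhood small and the exploration tractable.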
Related papers
- FlatNAS: optimizing Flatness in Neural Architecture Search for Out-of-Distribution Robustness [3.724847012963521]
This study introduces a novel NAS solution, called Flat Neural Architecture Search (FlatNAS).
It explores the interplay between a novel figure of merit based on robustness to weight perturbations and single NN optimization with Sharpness-Aware Minimization (SAM).
The OOD robustness of the NAS-designed models is evaluated by focusing on robustness to input data corruptions, using popular benchmark datasets in the literature.
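SAM is an existing, published optimizer, so a rough sketch of a single SAM update is shown below to make the weight-perturbation idea concrete. This is generic SAM under placeholder assumptions (the model, loss, and radius rho), not FlatNAS's flatness figure of merit or its NAS integration.

```python
# Rough sketch of one Sharpness-Aware Minimization (SAM) step: perturb the
# weights toward (approximately) the highest-loss point in a small L2 ball,
# then update the original weights with the gradient taken at that point.
import torch
import torch.nn as nn

def sam_step(model, loss_fn, x, y, base_opt, rho=0.05):
    # 1) Gradient at the current weights.
    loss = loss_fn(model(x), y)
    loss.backward()
    grads = [p.grad.detach().clone() for p in model.parameters()]
    grad_norm = torch.sqrt(sum(g.pow(2).sum() for g in grads)) + 1e-12

    # 2) Ascend within an L2 ball of radius rho.
    eps = [rho * g / grad_norm for g in grads]
    with torch.no_grad():
        for p, e in zip(model.parameters(), eps):
            p.add_(e)

    # 3) Gradient at the perturbed weights; undo the perturbation, then step.
    base_opt.zero_grad()
    loss_fn(model(x), y).backward()
    with torch.no_grad():
        for p, e in zip(model.parameters(), eps):
            p.sub_(e)
    base_opt.step()
    base_opt.zero_grad()
    return loss.item()

if __name__ == "__main__":
    model = nn.Linear(4, 2)
    opt = torch.optim.SGD(model.parameters(), lr=0.1)
    x, y = torch.randn(8, 4), torch.randint(0, 2, (8,))
    print("loss:", sam_step(model, nn.CrossEntropyLoss(), x, y, opt))
```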
arXiv Detail & Related papers (2024-02-29T12:33:14Z)
- SVD-NAS: Coupling Low-Rank Approximation and Neural Architecture Search [7.221206118679026]
This work proposes the SVD-NAS framework that couples the domains of low-rank approximation and neural architecture search.
Results demonstrate that the SVD-NAS achieves 2.06-12.85pp higher accuracy on ImageNet than state-of-the-art methods under the data-limited problem setting.
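To illustrate the low-rank side of the coupling (the NAS side is not reproduced here), the sketch below replaces a dense layer with a rank-k factorization obtained from a truncated SVD. The layer sizes and the rank are placeholder assumptions, not values from the paper.

```python
# Generic low-rank approximation of a linear layer via truncated SVD:
# W (out x in) ~= U_k S_k V_k^T, realized as two smaller linear layers.
import torch
import torch.nn as nn

def low_rank_factorize(linear: nn.Linear, rank: int) -> nn.Sequential:
    W = linear.weight.detach()
    U, S, Vh = torch.linalg.svd(W, full_matrices=False)
    first = nn.Linear(linear.in_features, rank, bias=False)
    second = nn.Linear(rank, linear.out_features, bias=linear.bias is not None)
    first.weight.data = torch.diag(S[:rank]) @ Vh[:rank]   # (rank, in)
    second.weight.data = U[:, :rank].clone()                # (out, rank)
    if linear.bias is not None:
        second.bias.data = linear.bias.detach().clone()
    return nn.Sequential(first, second)

if __name__ == "__main__":
    layer = nn.Linear(256, 128)
    approx = low_rank_factorize(layer, rank=32)
    x = torch.randn(4, 256)
    err = (layer(x) - approx(x)).abs().max().item()
    orig = sum(p.numel() for p in layer.parameters())
    new = sum(p.numel() for p in approx.parameters())
    print(f"max abs error: {err:.4f}, params: {orig} -> {new}")
```

Deciding where and with what rank to apply such factorizations is, roughly, the kind of choice the coupled architecture search is meant to automate.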
arXiv Detail & Related papers (2022-08-22T15:41:28Z)
- Fast Neural Architecture Search for Lightweight Dense Prediction Networks [41.605107921584775]
We present LDP, a lightweight dense prediction neural architecture search (NAS) framework.
Starting from a pre-defined generic backbone, LDP applies the novel Assisted Tabu Search for efficient architecture exploration.
Experiments show that the proposed framework yields consistent improvements on all tested dense prediction tasks.
arXiv Detail & Related papers (2022-03-03T20:17:10Z)
- Searching Efficient Model-guided Deep Network for Image Denoising [61.65776576769698]
We present a novel approach by connecting model-guided design with NAS (MoD-NAS).
MoD-NAS employs a highly reusable width search strategy and a densely connected search block to automatically select the operations of each layer.
Experimental results on several popular datasets show that our MoD-NAS has achieved even better PSNR performance than current state-of-the-art methods.
arXiv Detail & Related papers (2021-04-06T14:03:01Z)
- PV-NAS: Practical Neural Architecture Search for Video Recognition [83.77236063613579]
Deep neural networks for video tasks are highly customized, and designing such networks requires domain experts and costly trial-and-error tests.
Recent advances in network architecture search have boosted image recognition performance by a large margin.
In this study, we propose a practical solution, namely Practical Video Neural Architecture Search (PV-NAS).
arXiv Detail & Related papers (2020-11-02T08:50:23Z)
- Neural Architecture Search of SPD Manifold Networks [79.45110063435617]
We propose a new neural architecture search (NAS) problem of Symmetric Positive Definite (SPD) manifold networks.
We first introduce a geometrically rich and diverse SPD neural architecture search space for an efficient SPD cell design.
We exploit a differentiable NAS algorithm on our relaxed continuous search space for SPD neural architecture search.
arXiv Detail & Related papers (2020-10-27T18:08:57Z)
- Hyperparameter Optimization in Neural Networks via Structured Sparse Recovery [54.60327265077322]
We study two important problems in the automated design of neural networks through the lens of sparse recovery methods.
In the first part of this paper, we establish a novel connection between hyperparameter optimization (HPO) and structured sparse recovery.
In the second part of this paper, we establish a connection between NAS and structured sparse recovery.
arXiv Detail & Related papers (2020-07-07T00:57:09Z)
- DrNAS: Dirichlet Neural Architecture Search [88.56953713817545]
We treat the continuously relaxed architecture mixing weights as random variables, modeled by a Dirichlet distribution.
With recently developed pathwise derivatives, the Dirichlet parameters can be easily optimized with gradient-based optimizers.
To alleviate the large memory consumption of differentiable NAS, we propose a simple yet effective progressive learning scheme.
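The summary above names a concrete mechanism: architecture mixing weights drawn from a learned Dirichlet distribution and optimized through pathwise derivatives. Below is a rough PyTorch sketch of that idea, not the DrNAS reference implementation; the candidate operations and layer sizes are placeholder assumptions.

```python
# Sketch: mix candidate operations with weights sampled from a Dirichlet
# distribution whose concentration parameters are learned. rsample() provides
# pathwise derivatives, so gradients reach the concentration parameters.
import torch
import torch.nn as nn

class DirichletMixedOp(nn.Module):
    def __init__(self, channels):
        super().__init__()
        self.ops = nn.ModuleList([
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.Conv2d(channels, channels, 5, padding=2),
            nn.Identity(),
        ])
        # Unconstrained parameters; softplus keeps the concentrations positive.
        self.log_conc = nn.Parameter(torch.zeros(len(self.ops)))

    def forward(self, x):
        conc = nn.functional.softplus(self.log_conc) + 1e-4
        weights = torch.distributions.Dirichlet(conc).rsample()
        return sum(w * op(x) for w, op in zip(weights, self.ops))

if __name__ == "__main__":
    layer = DirichletMixedOp(channels=8)
    out = layer(torch.randn(2, 8, 16, 16))
    out.mean().backward()
    print("grad on concentration params:", layer.log_conc.grad)
```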
arXiv Detail & Related papers (2020-06-18T08:23:02Z)
- Bonsai-Net: One-Shot Neural Architecture Search via Differentiable Pruners [1.4180331276028662]
One-shot Neural Architecture Search (NAS) aims to minimize the computational expense of discovering state-of-the-art models.
We present Bonsai-Net, an efficient one-shot NAS method to explore our relaxed search space.
arXiv Detail & Related papers (2020-06-12T14:44:00Z)
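As a generic illustration of the differentiable-pruner idea (not Bonsai-Net's specific method), the sketch below gates each candidate operation with a learnable weight and permanently drops operations whose gates collapse toward zero; the candidate operations and the pruning threshold are placeholder assumptions.

```python
# Sketch: learnable gates on candidate operations; operations whose gates fall
# below a threshold are removed from the model during the search.
import torch
import torch.nn as nn

class PrunableMixedOp(nn.Module):
    def __init__(self, channels, threshold=0.05):
        super().__init__()
        self.ops = nn.ModuleList([
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.Conv2d(channels, channels, 5, padding=2),
            nn.AvgPool2d(3, stride=1, padding=1),
        ])
        self.gate_logits = nn.Parameter(torch.zeros(len(self.ops)))
        self.threshold = threshold

    def forward(self, x):
        gates = torch.sigmoid(self.gate_logits)
        return sum(g * op(x) for g, op in zip(gates, self.ops) if op is not None)

    def prune(self):
        """Permanently drop operations whose gates fell below the threshold."""
        gates = torch.sigmoid(self.gate_logits.detach())
        for i, g in enumerate(gates):
            if g < self.threshold:
                self.ops[i] = None  # freed; skipped in forward()

if __name__ == "__main__":
    op = PrunableMixedOp(channels=8)
    with torch.no_grad():
        op.gate_logits[2] = -5.0       # pretend the search drove this gate to ~0
    op.prune()
    print(op(torch.randn(1, 8, 16, 16)).shape)  # runs with the two remaining ops
```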