MS-NAS: Multi-Scale Neural Architecture Search for Medical Image
Segmentation
- URL: http://arxiv.org/abs/2007.06151v1
- Date: Mon, 13 Jul 2020 02:02:00 GMT
- Title: MS-NAS: Multi-Scale Neural Architecture Search for Medical Image
Segmentation
- Authors: Xingang Yan, Weiwen Jiang, Yiyu Shi, and Cheng Zhuo
- Abstract summary: This paper presents a Multi-Scale NAS framework that features a multi-scale search space, from network backbone to cell operation, and multi-scale fusion capability.
On various segmentation datasets, MS-NAS outperforms state-of-the-art methods, achieving 0.6-5.4% mIoU and 0.4-3.5% DSC improvements.
- Score: 16.206524842952636
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The recent breakthroughs of Neural Architecture Search (NAS) have motivated
various applications in medical image segmentation. However, most existing works
either simply rely on hyper-parameter tuning or stick to a fixed network
backbone, thereby limiting the underlying search space and the chance of
identifying more efficient architectures. This paper presents a Multi-Scale NAS
(MS-NAS) framework that features a multi-scale search space, from network
backbone to cell operation, and a multi-scale fusion capability to fuse features
of different sizes. To mitigate the computational overhead incurred by the
larger search space, a partial channel connection scheme and a two-step decoding
method are utilized, reducing computation while maintaining optimization
quality. Experimental results show that on various segmentation datasets, MS-NAS
outperforms state-of-the-art methods, achieving 0.6-5.4% mIoU and 0.4-3.5% DSC
improvements while reducing computational resource consumption by 18.0-24.9%.
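The abstract names the partial channel connection scheme without detail; the sketch below illustrates the general idea as popularized by PC-DARTS, which the scheme resembles: only a 1/K slice of the channels is routed through the candidate operations while the rest bypass them, shrinking the memory and compute of the over-parameterized search network. The class name, the value of K, and the toy operation set are illustrative assumptions, not the authors' code.

```python
import torch
import torch.nn as nn

class PartialChannelMixedOp(nn.Module):
    """Softmax-weighted mixture of candidate ops applied to a 1/K channel
    slice; the other channels bypass the ops (hypothetical sketch)."""

    def __init__(self, channels, ops, k=4):
        super().__init__()
        assert channels % k == 0, "channels must be divisible by k"
        self.k = k
        self.ops = nn.ModuleList(ops)                     # candidate operations
        self.alpha = nn.Parameter(torch.zeros(len(ops)))  # architecture weights

    def forward(self, x):
        c = x.size(1) // self.k
        x_active, x_bypass = x[:, :c], x[:, c:]           # 1/K slice vs. the rest
        weights = torch.softmax(self.alpha, dim=-1)
        # weighted sum of candidate ops, computed on the active slice only
        out = sum(w * op(x_active) for w, op in zip(weights, self.ops))
        return torch.cat([out, x_bypass], dim=1)          # reassemble all channels

# toy candidate set operating on 64/4 = 16 channels
ops = [nn.Conv2d(16, 16, 3, padding=1),
       nn.MaxPool2d(3, stride=1, padding=1),
       nn.Identity()]
mixed = PartialChannelMixedOp(channels=64, ops=ops, k=4)
y = mixed(torch.randn(2, 64, 32, 32))                     # shape is preserved
```

Since only C/K channels enter the mixture, the per-edge memory and compute of the search phase drop roughly by a factor of K, which is the kind of saving the abstract attributes to the scheme.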
Related papers
- A Pairwise Comparison Relation-assisted Multi-objective Evolutionary Neural Architecture Search Method with Multi-population Mechanism [58.855741970337675]
Neural architecture search (NAS) enables researchers to automatically explore vast search spaces and find efficient neural networks.
NAS suffers from a key bottleneck, i.e., numerous architectures need to be evaluated during the search process.
We propose SMEM-NAS, a pairwise comparison relation-assisted multi-objective evolutionary algorithm based on a multi-population mechanism.
arXiv Detail & Related papers (2024-07-22T12:46:22Z)
- Lightweight Neural Architecture Search for Temporal Convolutional Networks at the Edge [21.72253397805102]
This work focuses in particular on Temporal Convolutional Networks (TCNs), a convolutional model for time-series processing.
We propose the first NAS tool that explicitly targets the optimization of the most peculiar architectural parameters of TCNs.
We test the proposed NAS on four real-world, edge-relevant tasks, involving audio and bio-signals.
arXiv Detail & Related papers (2023-01-24T19:47:40Z)
- Generalizing Few-Shot NAS with Gradient Matching [165.5690495295074]
One-Shot methods train one supernet to approximate the performance of every architecture in the search space via weight-sharing.
Few-Shot NAS reduces the level of weight-sharing by splitting the One-Shot supernet into multiple separated sub-supernets.
The proposed method significantly outperforms its Few-Shot counterparts while surpassing previous comparable methods in the accuracy of derived architectures (a minimal weight-sharing sketch follows this entry).
arXiv Detail & Related papers (2022-03-29T03:06:16Z)
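For a concrete picture of the weight-sharing this entry describes, here is a minimal one-shot supernet sketch under assumed names: each layer holds every candidate operation with shared weights, any architecture is evaluated by picking one operation per layer, and Few-Shot NAS would split such a supernet into sub-supernets that each commit to one choice at a chosen layer.

```python
import random
import torch
import torch.nn as nn

def candidate_ops():
    # toy, shape-preserving candidate set so layers can be chained freely
    return [nn.Conv2d(8, 8, 3, padding=1),
            nn.Conv2d(8, 8, 5, padding=2),
            nn.Identity()]

class OneShotLayer(nn.Module):
    """One supernet layer: all candidate ops live here with shared weights."""
    def __init__(self):
        super().__init__()
        self.ops = nn.ModuleList(candidate_ops())

    def forward(self, x, choice):
        return self.ops[choice](x)               # run only the sampled op

supernet = nn.ModuleList(OneShotLayer() for _ in range(3))

def evaluate(arch, x):
    # 'arch' is one op index per layer; every arch reuses the same weights
    for layer, choice in zip(supernet, arch):
        x = layer(x, choice)
    return x

arch = [random.randrange(3) for _ in range(3)]   # a sampled architecture
y = evaluate(arch, torch.randn(1, 8, 16, 16))
# Few-Shot splitting: fix arch[0] to each of the 3 choices to obtain
# 3 sub-supernets with less weight-sharing across architectures.
```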
- HyperSegNAS: Bridging One-Shot Neural Architecture Search with 3D Medical Image Segmentation using HyperNet [51.60655410423093]
We introduce HyperSegNAS to enable one-shot Neural Architecture Search (NAS) for medical image segmentation.
We show that HyperSegNAS yields better-performing and more intuitive architectures compared to previous state-of-the-art (SOTA) segmentation networks.
Our method is evaluated on public datasets from the Medical Segmentation Decathlon (MSD) challenge, and achieves SOTA performance.
arXiv Detail & Related papers (2021-12-20T16:21:09Z)
- TND-NAS: Towards Non-differentiable Objectives in Progressive Differentiable NAS Framework [6.895590095853327]
Differentiable architecture search has gradually become the mainstream research topic in the field of Neural Architecture Search (NAS).
Recent differentiable NAS also aims at further improving search performance and reducing GPU memory consumption.
We propose TND-NAS, which combines the high efficiency of the differentiable NAS framework with the compatibility of non-differentiable metrics in multi-objective NAS.
arXiv Detail & Related papers (2021-11-06T14:19:36Z)
- BiX-NAS: Searching Efficient Bi-directional Architecture for Medical Image Segmentation [85.0444711725392]
We study a multi-scale upgrade of a bi-directional skip-connected network, and then automatically discover an efficient architecture with a novel two-phase Neural Architecture Search (NAS) algorithm, namely BiX-NAS.
Our proposed method reduces the network computational cost by sifting out ineffective multi-scale features at different levels and iterations.
We evaluate BiX-NAS on two segmentation tasks using three different medical image datasets, and the experimental results show that our BiX-NAS searched architecture achieves the state-of-the-art performance with significantly lower computational cost.
arXiv Detail & Related papers (2021-06-26T14:33:04Z)
- HR-NAS: Searching Efficient High-Resolution Neural Architectures with Lightweight Transformers [48.74623838201632]
High-resolution representations (HR) are essential for dense prediction tasks such as segmentation, detection, and pose estimation.
This work proposes a novel NAS method, called HR-NAS, which is able to find efficient and accurate networks for different tasks.
HR-NAS is capable of achieving state-of-the-art trade-offs between performance and FLOPs for three dense prediction tasks and an image classification task.
arXiv Detail & Related papers (2021-06-11T18:11:36Z)
- Searching Efficient Model-guided Deep Network for Image Denoising [61.65776576769698]
We present a novel approach by connecting model-guided design with NAS (MoD-NAS).
MoD-NAS employs a highly reusable width search strategy and a densely connected search block to automatically select the operations of each layer.
Experimental results on several popular datasets show that our MoD-NAS has achieved even better PSNR performance than current state-of-the-art methods.
arXiv Detail & Related papers (2021-04-06T14:03:01Z)
- Binarized Neural Architecture Search for Efficient Object Recognition [120.23378346337311]
Binarized neural architecture search (BNAS) produces extremely compressed models to reduce the huge computational cost on embedded devices for edge computing.
An accuracy of 96.53% vs. 97.22% is achieved on the CIFAR-10 dataset, but with a significantly compressed model and a 40% faster search than the state-of-the-art PC-DARTS (a binarization sketch follows this entry).
arXiv Detail & Related papers (2020-09-08T15:51:23Z)
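The BNAS entry hinges on weight binarization; a common way to keep binarized weights trainable (not necessarily BNAS's exact scheme) is sign binarization with a straight-through gradient estimator, sketched below with assumed class names.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BinarizeSTE(torch.autograd.Function):
    """Sign binarization with a straight-through estimator (STE)."""
    @staticmethod
    def forward(ctx, w):
        ctx.save_for_backward(w)
        return torch.sign(w)                  # forward: weights in {-1, 0, +1}

    @staticmethod
    def backward(ctx, grad_out):
        (w,) = ctx.saved_tensors
        # backward: pass gradients only where |w| <= 1 (clipped identity)
        return grad_out * (w.abs() <= 1).float()

class BinaryConv2d(nn.Conv2d):
    """Conv layer binarizing its weights on the fly; the optimizer still
    updates the underlying real-valued master weights."""
    def forward(self, x):
        w_bin = BinarizeSTE.apply(self.weight)
        return F.conv2d(x, w_bin, self.bias, self.stride,
                        self.padding, self.dilation, self.groups)

layer = BinaryConv2d(8, 8, 3, padding=1)
out = layer(torch.randn(1, 8, 16, 16))
out.sum().backward()                          # gradients flow via the STE
```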
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.