BiX-NAS: Searching Efficient Bi-directional Architecture for Medical
Image Segmentation
- URL: http://arxiv.org/abs/2106.14033v2
- Date: Wed, 30 Jun 2021 12:59:44 GMT
- Title: BiX-NAS: Searching Efficient Bi-directional Architecture for Medical
Image Segmentation
- Authors: Xinyi Wang, Tiange Xiang, Chaoyi Zhang, Yang Song, Dongnan Liu, Heng
Huang, Weidong Cai
- Abstract summary: We study a multi-scale upgrade of a bi-directional skip connected network and then automatically discover an efficient architecture by a novel two-phase Neural Architecture Search (NAS) algorithm, namely BiX-NAS.
Our proposed method reduces the network computational cost by sifting out ineffective multi-scale features at different levels and iterations.
We evaluate BiX-NAS on two segmentation tasks using three different medical image datasets, and the experimental results show that our BiX-NAS searched architecture achieves state-of-the-art performance at significantly lower computational cost.
- Score: 85.0444711725392
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The recurrent mechanism has recently been introduced into U-Net for various medical image segmentation tasks. Existing studies have focused on promoting network recursion by reusing building blocks. Although network parameters can be greatly reduced this way, computational cost still grows inevitably with the preset number of iterations. In this work, we study a multi-scale upgrade of a bi-directional skip-connected network and then automatically discover an efficient architecture with a novel two-phase Neural Architecture Search (NAS) algorithm, namely BiX-NAS. Our proposed method reduces the network computational cost by sifting out ineffective multi-scale features at different levels and iterations. We evaluate BiX-NAS on two segmentation tasks using three different medical image datasets, and the experimental results show that our BiX-NAS searched architecture achieves state-of-the-art performance at significantly lower computational cost.
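To make the architecture concrete, below is a minimal PyTorch sketch of a recurrent encoder-decoder with bi-directional skip connections in the spirit of the paper: forward skips feed encoded features to the decoder, backward skips return decoded features to the encoder, and the same blocks are reused at every iteration. The class name, channel widths, and single-resolution layout are illustrative assumptions, not the paper's actual BiX-Net.

```python
import torch
import torch.nn as nn

class BiDirectionalRecurrentSketch(nn.Module):
    """Minimal sketch (not the paper's BiX-Net): one resolution level,
    fixed width, blocks reused across iterations."""

    def __init__(self, in_ch=3, ch=32, num_classes=2, iterations=3):
        super().__init__()
        self.iterations = iterations
        self.stem = nn.Conv2d(in_ch, ch, 3, padding=1)
        # Reused blocks: parameter count stays constant regardless of the
        # iteration count, but computation grows linearly with it.
        self.encode = nn.Sequential(nn.Conv2d(2 * ch, ch, 3, padding=1), nn.ReLU())
        self.decode = nn.Sequential(nn.Conv2d(2 * ch, ch, 3, padding=1), nn.ReLU())
        self.head = nn.Conv2d(ch, num_classes, 1)

    def forward(self, x):
        feat = self.stem(x)
        dec = torch.zeros_like(feat)  # backward skip is empty at iteration 0
        for _ in range(self.iterations):
            # Backward skip: decoded features are brought back to the encoder.
            enc = self.encode(torch.cat([feat, dec], dim=1))
            # Forward skip: encoded features are passed on to the decoder.
            dec = self.decode(torch.cat([enc, dec], dim=1))
        return self.head(dec)
```

Reusing the blocks is what saves parameters, while the per-forward cost still scales with the iteration count; BiX-NAS targets exactly this cost by pruning ineffective skip features from the iterated connections.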
Related papers
- DNA Family: Boosting Weight-Sharing NAS with Block-Wise Supervisions [121.05720140641189]
We develop a family of models with distilling neural architecture (DNA) techniques.
Our proposed DNA models can rate all architecture candidates, as opposed to previous works that can only access a sub-search space (a toy illustration of this block-wise rating follows this entry).
Our models achieve state-of-the-art top-1 accuracy of 78.9% and 83.6% on ImageNet for a mobile convolutional network and a small vision transformer, respectively.
arXiv Detail & Related papers (2024-03-02T22:16:47Z)
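The block-wise supervision behind the DNA entry above can be pictured with a toy rating function: a trained candidate block is scored by how closely it reproduces the teacher block's output from the teacher's input features, so every candidate can be rated independently. This is a hypothetical sketch of the general idea, not the DNA implementation; all names are invented.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def rate_candidate_block(candidate: nn.Module,
                         teacher_in: torch.Tensor,
                         teacher_out: torch.Tensor) -> float:
    """Rate one (already trained) candidate block by its block-wise
    distillation error against the teacher's features; lower is better."""
    candidate.eval()
    with torch.no_grad():
        pred = candidate(teacher_in)
    return F.mse_loss(pred, teacher_out).item()
```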
- HASA: Hybrid Architecture Search with Aggregation Strategy for Echinococcosis Classification and Ovary Segmentation in Ultrasound Images [0.0]
We propose a hybrid NAS framework for ultrasound (US) image classification and segmentation.
Our method generates more powerful and lightweight models for these US image classification and segmentation tasks.
arXiv Detail & Related papers (2022-04-14T01:43:00Z)
- Towards Bi-directional Skip Connections in Encoder-Decoder Architectures and Beyond [95.46272735589648]
We propose backward skip connections that bring decoded features back to the encoder.
Our design can be jointly adopted with forward skip connections in any encoder-decoder architecture.
We propose a novel two-phase Neural Architecture Search (NAS) algorithm, namely BiX-NAS, to search for the best multi-scale skip connections (a generic gating sketch follows this entry).
arXiv Detail & Related papers (2022-03-11T01:38:52Z)
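A simple mental model for searching over multi-scale skip connections is a supernet in which every candidate connection carries a learnable gate, and the weakest gates are sifted out after training. The sketch below illustrates only that gating idea under assumed names and shapes; BiX-NAS's actual two-phase procedure is more involved than plain gating.

```python
import torch
import torch.nn as nn

class GatedSkipFusion(nn.Module):
    """Hypothetical gated fusion of candidate skip connections
    (illustrative only, not the BiX-NAS search algorithm)."""

    def __init__(self, num_skips: int, ch: int):
        super().__init__()
        self.gates = nn.Parameter(torch.zeros(num_skips))  # one gate per skip
        self.fuse = nn.Conv2d(ch, ch, 1)

    def forward(self, skips):
        # skips: list of num_skips tensors, each of shape [N, ch, H, W]
        weights = torch.sigmoid(self.gates)
        mixed = sum(w * s for w, s in zip(weights, skips))
        return self.fuse(mixed)

    def surviving_skips(self, k: int):
        # After supernet training, keep the k strongest connections
        # and discard the rest as "ineffective".
        return torch.topk(torch.sigmoid(self.gates), k).indices.tolist()
```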
- HyperSegNAS: Bridging One-Shot Neural Architecture Search with 3D Medical Image Segmentation using HyperNet [51.60655410423093]
We introduce HyperSegNAS to enable one-shot Neural Architecture Search (NAS) for medical image segmentation.
We show that HyperSegNAS yields better performing and more intuitive architectures compared to the previous state-of-the-art (SOTA) segmentation networks.
Our method is evaluated on public datasets from the Medical Segmentation Decathlon (MSD) challenge, and achieves SOTA performance.
arXiv Detail & Related papers (2021-12-20T16:21:09Z)
- Memory-Efficient Hierarchical Neural Architecture Search for Image Restoration [68.6505473346005]
We propose a memory-efficient hierarchical NAS method (HiNAS) for image denoising and image super-resolution tasks.
With a single GTX1080Ti GPU, it takes only about 1 hour to search for the denoising network on BSD500 and 3.5 hours to search for the super-resolution structure on DIV2K.
arXiv Detail & Related papers (2020-12-24T12:06:17Z)
- Evaluating the Effectiveness of Efficient Neural Architecture Search for Sentence-Pair Tasks [14.963150544536203]
Neural Architecture Search (NAS) methods have recently achieved competitive or state-of-the-art (SOTA) performance on a variety of natural language processing and computer vision tasks.
In this work, we explore the applicability of a SOTA NAS algorithm, Efficient Neural Architecture Search (ENAS), to two sentence-pair tasks.
arXiv Detail & Related papers (2020-10-08T20:26:34Z)
- Binarized Neural Architecture Search for Efficient Object Recognition [120.23378346337311]
Binarized neural architecture search (BNAS) produces extremely compressed models to reduce huge computational cost on embedded devices for edge computing.
An accuracy of 96.53% vs. 97.22% is achieved on the CIFAR-10 dataset, but with a significantly compressed model and a 40% faster search than the state-of-the-art PC-DARTS. (A sketch of such a binarized operation follows this entry.)
arXiv Detail & Related papers (2020-09-08T15:51:23Z)
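The 1-bit weights at the core of the BNAS entry above can be sketched generically: the forward pass binarizes weights with sign(), and a straight-through estimator lets gradients flow back to the latent full-precision weights. This is a standard binarized convolution for illustration, not the BNAS search itself.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BinarizeSTE(torch.autograd.Function):
    """sign() binarization with a straight-through estimator:
    gradients pass through wherever |w| <= 1."""

    @staticmethod
    def forward(ctx, w):
        ctx.save_for_backward(w)
        return torch.sign(w)

    @staticmethod
    def backward(ctx, grad_out):
        (w,) = ctx.saved_tensors
        return grad_out * (w.abs() <= 1).float()

class BinaryConv2d(nn.Conv2d):
    """Convolution with 1-bit weights; latent float weights are trained."""

    def forward(self, x):
        w_bin = BinarizeSTE.apply(self.weight)
        return F.conv2d(x, w_bin, self.bias, self.stride,
                        self.padding, self.dilation, self.groups)
```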
- MS-NAS: Multi-Scale Neural Architecture Search for Medical Image Segmentation [16.206524842952636]
This paper presents a Multi-Scale NAS framework featuring a multi-scale search space from network backbone to cell operation.
On various datasets for segmentation, MS-NAS outperforms the state-of-the-art methods and achieves 0.6-5.4% mIOU and 0.4-3.5% DSC improvements.
arXiv Detail & Related papers (2020-07-13T02:02:00Z)
- Fast Neural Network Adaptation via Parameter Remapping and Architecture Search [35.61441231491448]
Deep neural networks achieve remarkable performance in many computer vision tasks.
Most state-of-the-art (SOTA) semantic segmentation and object detection approaches reuse neural network architectures designed for image classification as the backbone.
One major challenge, though, is that ImageNet pre-training of the search space representation incurs huge computational cost.
In this paper, we propose a Fast Neural Network Adaptation (FNA) method, which can adapt both the architecture and parameters of a seed network. (A kernel-remapping sketch follows this entry.)
arXiv Detail & Related papers (2020-01-08T13:45:15Z)
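FNA's kernel-level parameter remapping can be pictured as placing the seed network's pretrained kernel at the center of a larger target kernel, so the adapted network starts from the seed's function instead of random initialization. A minimal sketch assuming zero-padded center placement; the helper name is hypothetical.

```python
import torch

def remap_kernel(seed_w: torch.Tensor, new_k: int) -> torch.Tensor:
    """Remap a pretrained [out_c, in_c, k, k] conv kernel into a larger
    new_k x new_k kernel by zero-padded center placement."""
    out_c, in_c, k, _ = seed_w.shape
    assert new_k >= k and (new_k - k) % 2 == 0
    pad = (new_k - k) // 2
    new_w = torch.zeros(out_c, in_c, new_k, new_k)
    new_w[:, :, pad:pad + k, pad:pad + k] = seed_w
    return new_w

# Usage: expand a pretrained 3x3 conv kernel to 5x5.
w5 = remap_kernel(torch.randn(64, 32, 3, 3), 5)
```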
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.