Meta-Learning of NAS for Few-shot Learning in Medical Image Applications
- URL: http://arxiv.org/abs/2203.08951v1
- Date: Wed, 16 Mar 2022 21:21:51 GMT
- Title: Meta-Learning of NAS for Few-shot Learning in Medical Image Applications
- Authors: Viet-Khoa Vo-Ho, Kashu Yamazaki, Hieu Hoang, Minh-Triet Tran, Ngan Le
- Abstract summary: Neural Architecture Search (NAS) has motivated various applications in medical imaging.
NAS requires the availability of large annotated data, considerable resources, and pre-defined tasks.
We introduce various NAS approaches in medical imaging with different applications such as classification, segmentation, detection, and reconstruction.
- Score: 10.666687733540668
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Deep learning methods have been successful in solving tasks in machine
learning and have made breakthroughs in many sectors owing to their ability to
automatically extract features from unstructured data. However, their
performance relies on manual trial-and-error processes for selecting an
appropriate network architecture, hyperparameters for training, and
pre-/post-processing procedures. Even though it has been shown that network
architecture plays a critical role in learning feature representations from
data and in the final performance, searching for the best network architecture
is computationally intensive and relies heavily on researchers' experience.
Automated machine learning (AutoML) and its advanced techniques, such as
Neural Architecture Search (NAS), have been proposed to address those
limitations. Beyond general computer vision tasks, NAS has also motivated
various applications in multiple areas, including medical imaging. In medical
imaging, NAS has made significant progress in improving the accuracy of image classification,
segmentation, reconstruction, and more. However, NAS requires the availability
of large annotated data, considerable computation resources, and pre-defined
tasks. To address such limitations, meta-learning has been adopted in the
scenarios of few-shot learning and multiple tasks. In this book chapter, we
first present a brief review of NAS by discussing well-known approaches in
search space, search strategy, and evaluation strategy. We then introduce
various NAS approaches in medical imaging for different applications,
including classification, segmentation, detection, and reconstruction. Meta-learning in
NAS for few-shot learning and multiple tasks is then explained. Finally, we
describe several open problems in NAS.
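As a concrete anchor for the three NAS components reviewed in the chapter (search space, search strategy, evaluation strategy), here is a deliberately minimal random-search loop in Python; the search space and the stand-in evaluation are illustrative assumptions, not the chapter's method.

    import random

    # Assumed toy search space: one choice per design axis.
    SEARCH_SPACE = {
        "depth": [2, 4, 6, 8],
        "width": [16, 32, 64],
        "kernel_size": [3, 5, 7],
    }

    def sample_architecture(rng):
        # Search strategy: plain random sampling from the space.
        return {axis: rng.choice(opts) for axis, opts in SEARCH_SPACE.items()}

    def evaluate(arch, rng):
        # Evaluation strategy: stand-in for "train arch, return val accuracy".
        return (0.5 + 0.04 * SEARCH_SPACE["depth"].index(arch["depth"])
                    + 0.02 * rng.random())

    rng = random.Random(0)
    best_arch, best_score = None, float("-inf")
    for _ in range(20):                      # search budget
        arch = sample_architecture(rng)
        score = evaluate(arch, rng)
        if score > best_score:
            best_arch, best_score = arch, score
    print(best_arch, round(best_score, 3))

Real NAS systems replace the sampler with a learned or evolutionary strategy and the evaluation stub with actual training, which is exactly where the compute cost discussed in the abstract comes from.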
Related papers
- Fair Differentiable Neural Network Architecture Search for Long-Tailed Data with Self-Supervised Learning [0.0]
This paper explores how to improve the search and training performance of NAS on long-tailed datasets.
We first discuss related work on NAS and deep learning methods for long-tailed datasets.
Then, we focus on an existing work, called SSF-NAS, which integrates self-supervised learning and fair differentiable NAS.
Finally, we conduct a series of experiments on the CIFAR10-LT dataset for performance evaluation.
arXiv Detail & Related papers (2024-06-19T12:39:02Z)
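The fair differentiable NAS used by SSF-NAS above builds on the DARTS idea of relaxing the discrete choice among candidate operations into a softmax-weighted mixture learned by gradient descent. Below is a minimal PyTorch sketch of one such mixed edge; it shows the generic mechanism only, and every name in it is an assumption, not SSF-NAS itself.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class MixedOp(nn.Module):
        # One edge of a differentiable NAS cell: a softmax-weighted
        # mixture of candidate operations (generic DARTS-style sketch).
        def __init__(self, channels):
            super().__init__()
            self.ops = nn.ModuleList([
                nn.Conv2d(channels, channels, 3, padding=1),
                nn.Conv2d(channels, channels, 5, padding=2),
                nn.Identity(),                       # skip connection
            ])
            # Architecture parameters, learned jointly with the weights.
            self.alpha = nn.Parameter(torch.zeros(len(self.ops)))

        def forward(self, x):
            w = F.softmax(self.alpha, dim=0)         # relax the discrete choice
            return sum(wi * op(x) for wi, op in zip(w, self.ops))

    edge = MixedOp(channels=8)
    y = edge(torch.randn(1, 8, 16, 16))              # output: (1, 8, 16, 16)

After search, the operation with the largest alpha on each edge is kept; fairness-oriented variants constrain how the alphas compete during search.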
- How Much Is Hidden in the NAS Benchmarks? Few-Shot Adaptation of a NAS Predictor [22.87207410692821]
We borrow from the rich field of meta-learning for few-shot adaptation and study the applicability of those methods to NAS.
Our meta-learning approach not only shows superior (or matching) performance in cross-validation experiments but also extrapolates successfully to a new search space and new tasks.
arXiv Detail & Related papers (2023-11-30T10:51:46Z)
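Few-shot adaptation of a meta-learned NAS predictor, as in the entry above, can be pictured as a short inner-loop fine-tune on a handful of (architecture, accuracy) pairs. The sketch below is a minimal illustration under assumed encodings and names, not the paper's method.

    import torch
    import torch.nn as nn

    # Assumed: architectures are encoded as fixed-length feature vectors.
    predictor = nn.Sequential(nn.Linear(16, 64), nn.ReLU(), nn.Linear(64, 1))
    # In practice these weights come from meta-training across many search
    # spaces/tasks; here they are random stand-ins.

    def few_shot_adapt(model, x_support, y_support, steps=5, lr=1e-2):
        # Inner loop: fine-tune the meta-learned predictor on a few
        # labelled (architecture encoding, accuracy) pairs.
        opt = torch.optim.SGD(model.parameters(), lr=lr)
        for _ in range(steps):
            loss = nn.functional.mse_loss(model(x_support).squeeze(-1), y_support)
            opt.zero_grad()
            loss.backward()
            opt.step()
        return model

    x_support = torch.randn(10, 16)      # 10 architectures from a new space
    y_support = torch.rand(10)           # their measured accuracies
    adapted = few_shot_adapt(predictor, x_support, y_support)
    x_query = torch.randn(100, 16)       # unevaluated candidates
    ranking = adapted(x_query).squeeze(-1).argsort(descending=True)

The adapted predictor is then used to rank unevaluated candidates, so only the top-ranked architectures need real training.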
- Meta-prediction Model for Distillation-Aware NAS on Unseen Datasets [55.2118691522524]
Distillation-aware Neural Architecture Search (DaNAS) aims to search for an optimal student architecture.
We propose DaSS (Distillation-aware Student Search), a distillation-aware meta accuracy prediction model that can predict a given architecture's final performance on a dataset.
arXiv Detail & Related papers (2023-05-26T14:00:35Z)
- NASiam: Efficient Representation Learning using Neural Architecture Search for Siamese Networks [76.8112416450677]
Siamese networks are among the most popular methods for self-supervised visual representation learning (SSL).
NASiam is a novel approach that, for the first time, uses differentiable NAS to improve the multilayer perceptron projector and predictor (encoder/predictor pair).
NASiam reaches competitive performance on both small-scale (i.e., CIFAR-10/CIFAR-100) and large-scale (i.e., ImageNet) image classification datasets while costing only a few GPU hours.
arXiv Detail & Related papers (2023-01-31T19:48:37Z)
- Neural Architecture Search for Dense Prediction Tasks in Computer Vision [74.9839082859151]
Deep learning has led to a rising demand for neural network architecture engineering.
Neural architecture search (NAS) aims at automatically designing neural network architectures in a data-driven manner rather than manually.
NAS has become applicable to a much wider range of problems in computer vision.
arXiv Detail & Related papers (2022-02-15T08:06:50Z)
- HyperSegNAS: Bridging One-Shot Neural Architecture Search with 3D Medical Image Segmentation using HyperNet [51.60655410423093]
We introduce HyperSegNAS to enable one-shot Neural Architecture Search (NAS) for medical image segmentation.
We show that HyperSegNAS yields better performing and more intuitive architectures compared to the previous state-of-the-art (SOTA) segmentation networks.
Our method is evaluated on public datasets from the Medical Decathlon (MSD) challenge and achieves SOTA performance.
arXiv Detail & Related papers (2021-12-20T16:21:09Z)
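One-shot NAS of the kind HyperSegNAS uses trains a single weight-sharing supernet so that sub-architectures can be scored without retraining each one. Below is a minimal generic sketch of that mechanism; the layer choices and names are assumptions, and HyperSegNAS's actual HyperNet-based weight generation is not modeled.

    import random
    import torch
    import torch.nn as nn

    class SuperNet(nn.Module):
        # Weight-sharing supernet: every layer holds all candidate ops;
        # a sampled path picks one op per layer (generic one-shot sketch).
        def __init__(self, dim=32, depth=3):
            super().__init__()
            self.layers = nn.ModuleList([
                nn.ModuleList([nn.Linear(dim, dim), nn.Identity()])
                for _ in range(depth)
            ])

        def forward(self, x, path):
            for choices, idx in zip(self.layers, path):
                x = torch.relu(choices[idx](x))
            return x

    net = SuperNet()
    # Training: sample a random path per step so every op receives gradients.
    path = [random.randrange(2) for _ in net.layers]
    out = net(torch.randn(4, 32), path)
    # Search: rank many paths by validation score using the shared weights,
    # then retrain only the best path from scratch.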
- Across-Task Neural Architecture Search via Meta Learning [1.225795556154044]
Adequate labeled data and expensive compute resources are prerequisites for the success of neural architecture search (NAS).
It is challenging to apply NAS in meta-learning scenarios with limited compute resources and data.
In this paper, an across-task neural architecture search (AT-NAS) is proposed to address the problem by combining gradient-based meta-learning with EA-based NAS.
arXiv Detail & Related papers (2021-10-12T09:07:33Z)
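The AT-NAS combination above can be read as an evolutionary outer loop whose fitness evaluation first adapts meta-learned weights for a few gradient steps on each task. The toy loop below illustrates only that control flow, with a mock fitness standing in for real adaptation; none of the names are from the paper.

    import random

    rng = random.Random(0)
    GENOME_LEN = 8                   # assumed: an architecture is a bit-string

    def adapt_and_evaluate(genome):
        # Stand-in for: initialize the candidate from meta-learned weights,
        # take a few gradient steps on the task's support set, then measure
        # accuracy on its query set.
        return sum(genome) / GENOME_LEN + 0.1 * rng.random()

    def mutate(genome):
        child = genome[:]
        child[rng.randrange(GENOME_LEN)] ^= 1    # flip one architectural choice
        return child

    population = [[rng.randrange(2) for _ in range(GENOME_LEN)]
                  for _ in range(8)]
    for _ in range(15):              # EA outer loop (search strategy)
        scored = sorted(population, key=adapt_and_evaluate, reverse=True)
        parents = scored[:4]         # keep the fittest half
        population = parents + [mutate(rng.choice(parents)) for _ in range(4)]
    best = max(population, key=adapt_and_evaluate)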
- NAS-Bench-360: Benchmarking Diverse Tasks for Neural Architecture Search [18.9676056830197]
Most existing neural architecture search (NAS) benchmarks and algorithms prioritize performance on well-studied tasks.
We present NAS-Bench-360, a benchmark suite for evaluating state-of-the-art NAS methods for convolutional neural networks (CNNs).
arXiv Detail & Related papers (2021-10-12T01:13:18Z)
- TransNAS-Bench-101: Improving Transferability and Generalizability of Cross-Task Neural Architecture Search [98.22779489340869]
We propose TransNAS-Bench-101, a benchmark dataset containing network performance across seven vision tasks.
We explore two fundamentally different types of search space: cell-level search space and macro-level search space.
With 7,352 backbones evaluated on seven tasks, 51,464 trained models with detailed training information are provided.
arXiv Detail & Related papers (2021-05-25T12:15:21Z)
- A Comprehensive Survey on Hardware-Aware Neural Architecture Search [6.23453131728063]
Neural Architecture Search (NAS) methods have been growing in popularity and have been studied extensively in the past few years.
Applying NAS to real-world problems still poses significant challenges and is not widely practical.
One solution growing in popularity is to use multi-objective optimization algorithms in the NAS search strategy by taking into account execution latency, energy consumption, memory footprint, etc.
This kind of NAS, called hardware-aware NAS (HW-NAS), makes searching for the most efficient architecture more complicated and opens several questions.
arXiv Detail & Related papers (2021-01-22T21:13:46Z)
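A common way to realize the multi-objective HW-NAS search described above is to keep only the Pareto-optimal candidates across accuracy and hardware cost. A small illustrative filter (the architectures and numbers are made up):

    # Each candidate: (name, accuracy %, latency ms) -- illustrative values.
    candidates = [
        ("arch_a", 74.1, 12.0),
        ("arch_b", 76.3, 25.0),
        ("arch_c", 73.8, 30.0),   # dominated by arch_a: worse on both axes
        ("arch_d", 75.5, 15.0),
    ]

    def pareto_front(cands):
        # Keep candidates not dominated, i.e., no other candidate is at least
        # as accurate AND at least as fast, and strictly better on one axis.
        front = []
        for name, acc, lat in cands:
            dominated = any(a >= acc and l <= lat and (a > acc or l < lat)
                            for _, a, l in cands)
            if not dominated:
                front.append((name, acc, lat))
        return front

    print(pareto_front(candidates))   # arch_a, arch_b, arch_d survive

The search strategy then picks from this front according to the deployment target's latency or energy budget.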
- NAS-Bench-201: Extending the Scope of Reproducible Neural Architecture Search [55.12928953187342]
We propose an extension to NAS-Bench-101: NAS-Bench-201 with a different search space, results on multiple datasets, and more diagnostic information.
NAS-Bench-201 has a fixed search space and provides a unified benchmark for almost any up-to-date NAS algorithms.
We provide additional diagnostic information such as fine-grained loss and accuracy, which can inspire new designs of NAS algorithms.
arXiv Detail & Related papers (2020-01-02T05:28:26Z)