Probeable DARTS with Application to Computational Pathology
- URL: http://arxiv.org/abs/2108.06859v1
- Date: Mon, 16 Aug 2021 02:16:06 GMT
- Title: Probeable DARTS with Application to Computational Pathology
- Authors: Sheyang Tang, Mahdi S. Hosseini, Lina Chen, Sonal Varma, Corwyn
Rowsell, Savvas Damaskinos, Konstantinos N. Plataniotis, Zhou Wang
- Abstract summary: We use differentiable architecture search (DARTS) for its efficiency.
We then apply our search framework to CPath applications by searching for the optimal network architecture.
Results show that the searched network outperforms state-of-the-art networks in both prediction accuracy and computational complexity.
- Score: 44.20005949950844
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: AI technology has made remarkable achievements in computational pathology
(CPath), especially with the help of deep neural networks. However, network
performance depends heavily on architecture design, which commonly requires
human experts with domain knowledge. In this paper, we address this challenge
with recent advances in neural architecture search (NAS) to find an optimal
network for CPath applications. In particular, we use differentiable
architecture search (DARTS) for its efficiency. We first adopt a probing metric
to show that the original DARTS lacks proper hyperparameter tuning on the CIFAR
dataset, and that the generalization issue can be addressed using an adaptive
optimization strategy. We then apply our search framework to CPath
applications by searching for the optimal network architecture on a
histological tissue type dataset (ADP). Results show that the searched network
outperforms state-of-the-art networks in terms of prediction accuracy and
computational complexity. We further conduct extensive experiments to demonstrate
the transferability of the searched network to new CPath applications, its
robustness against downscaled inputs, and the reliability of its
predictions.
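The core mechanism behind DARTS, as used in the abstract above, is a continuous relaxation: each edge of the network computes a softmax-weighted mixture of candidate operations, so architecture weights can be learned by gradient descent. A minimal sketch of that mixed operation is shown below; the toy operations and 1-D feature vector are illustrative assumptions, not the paper's actual search space.

```python
import numpy as np

def softmax(alpha):
    """Numerically stable softmax over architecture weights."""
    e = np.exp(alpha - alpha.max())
    return e / e.sum()

# Toy stand-ins for DARTS candidate ops (real spaces use convs, pooling, etc.)
candidate_ops = {
    "identity": lambda x: x,
    "scale_2x": lambda x: 2.0 * x,   # placeholder for a learned op
    "zero":     lambda x: np.zeros_like(x),
}

def mixed_op(x, alpha):
    """DARTS relaxation: sum_o softmax(alpha)_o * o(x) over candidate ops."""
    weights = softmax(alpha)
    return sum(w * op(x) for w, op in zip(weights, candidate_ops.values()))

x = np.array([1.0, -1.0])
alpha = np.array([0.0, 0.0, 0.0])   # uniform weights -> plain average of ops
y = mixed_op(x, alpha)

# After search, discretization keeps the op with the largest alpha:
alpha_trained = np.array([0.1, 2.0, -1.0])
best = max(zip(alpha_trained, candidate_ops), key=lambda t: t[0])[1]  # "scale_2x"
```

With uniform alpha the mixture averages the three ops, so `y` equals `x` here ((x + 2x + 0)/3); training shifts alpha toward the best-performing operation, which discretization then selects.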
Related papers
- EM-DARTS: Hierarchical Differentiable Architecture Search for Eye Movement Recognition [54.99121380536659]
Eye movement biometrics have received increasing attention thanks to their highly secure identification.
Deep learning (DL) models have been recently successfully applied for eye movement recognition.
However, DL architectures are still determined by human prior knowledge.
We propose EM-DARTS, a hierarchical differentiable architecture search algorithm to automatically design the DL architecture for eye movement recognition.
arXiv Detail & Related papers (2024-09-22T13:11:08Z) - AutoGCN -- Towards Generic Human Activity Recognition with Neural
Architecture Search [0.16385815610837165]
This paper introduces AutoGCN, a generic Neural Architecture Search (NAS) algorithm for Human Activity Recognition (HAR) using Graph Convolution Networks (GCNs)
We conduct extensive experiments on two large-scale datasets focused on skeleton-based action recognition to assess the proposed algorithm's performance.
arXiv Detail & Related papers (2024-02-02T11:07:27Z) - NAS-ASDet: An Adaptive Design Method for Surface Defect Detection
Network using Neural Architecture Search [5.640706784987607]
We propose a new method called NAS-ASDet to adaptively design network for surface defect detection.
First, a refined and industry-appropriate search space that can adaptively adjust the feature distribution is designed.
Then, a progressive search strategy with a deep supervision mechanism is used to explore the search space faster and better.
arXiv Detail & Related papers (2023-11-18T03:15:45Z) - SuperNet in Neural Architecture Search: A Taxonomic Survey [14.037182039950505]
This survey focuses on the supernet optimization that builds a neural network that assembles all the architectures as its sub models by using weight sharing.
We aim to accomplish this by presenting supernet methods as solutions to common challenges in the literature: data-side optimization, alleviation of poor rank correlation, and transferable NAS for a range of deployment scenarios.
arXiv Detail & Related papers (2022-04-08T08:29:52Z) - Neural Architecture Search for Speech Emotion Recognition [72.1966266171951]
We propose to apply neural architecture search (NAS) techniques to automatically configure the SER models.
We show that NAS can improve SER performance (54.89% to 56.28%) while maintaining model parameter sizes.
arXiv Detail & Related papers (2022-03-31T10:16:10Z) - AutoAdapt: Automated Segmentation Network Search for Unsupervised Domain
Adaptation [4.793219747021116]
We perform neural architecture search (NAS) to provide architecture-level perspective and analysis for domain adaptation.
We propose bridging this gap by using maximum mean discrepancy and regional weighted entropy to estimate the accuracy metric.
arXiv Detail & Related papers (2021-06-24T17:59:02Z) - iDARTS: Differentiable Architecture Search with Stochastic Implicit
Gradients [75.41173109807735]
Differentiable ARchiTecture Search (DARTS) has recently become the mainstream of neural architecture search (NAS)
We tackle the hypergradient computation in DARTS based on the implicit function theorem.
We show that the architecture optimisation with the proposed method, named iDARTS, is expected to converge to a stationary point.
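The iDARTS entry above refers to computing the architecture hypergradient via the implicit function theorem. A sketch of that quantity follows; the notation is a standard rendering of IFT-based bilevel differentiation and is assumed, not taken from the paper itself.

```latex
% Bilevel NAS objective: the weights are an implicit function of alpha,
%   w^*(\alpha) = \arg\min_{w} \mathcal{L}_{\mathrm{train}}(w, \alpha).
% The implicit function theorem then gives the validation hypergradient:
\nabla_{\alpha} \mathcal{L}_{\mathrm{val}}
  = \partial_{\alpha} \mathcal{L}_{\mathrm{val}}
  - \partial_{w} \mathcal{L}_{\mathrm{val}}
    \left( \nabla^{2}_{w} \mathcal{L}_{\mathrm{train}} \right)^{-1}
    \nabla^{2}_{w,\alpha} \mathcal{L}_{\mathrm{train}}
```

In practice the inverse Hessian is never formed explicitly; it is approximated, which is where stochastic implicit-gradient methods such as iDARTS come in.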
arXiv Detail & Related papers (2021-06-21T00:44:11Z) - On the performance of deep learning for numerical optimization: an
application to protein structure prediction [0.0]
We present a study on the performance of deep learning models on global optimization problems.
The proposed approach adopts the idea of the neural architecture search (NAS) to generate efficient neural networks.
Experiments reveal that the generated learning models can achieve competitive results when compared to hand-designed algorithms.
arXiv Detail & Related papers (2020-12-17T17:01:30Z) - NAS-Navigator: Visual Steering for Explainable One-Shot Deep Neural
Network Synthesis [53.106414896248246]
We present a framework that allows analysts to effectively build the solution sub-graph space and guide the network search by injecting their domain knowledge.
Applying this technique in an iterative manner allows analysts to converge to the best performing neural network architecture for a given application.
arXiv Detail & Related papers (2020-09-28T01:48:45Z) - Off-Policy Reinforcement Learning for Efficient and Effective GAN
Architecture Search [50.40004966087121]
We introduce a new reinforcement learning based neural architecture search (NAS) methodology for generative adversarial network (GAN) architecture search.
The key idea is to formulate the GAN architecture search problem as a Markov decision process (MDP) for smoother architecture sampling.
We exploit an off-policy GAN architecture search algorithm that makes efficient use of the samples generated by previous policies.
arXiv Detail & Related papers (2020-07-17T18:29:17Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences.