Direct Federated Neural Architecture Search
- URL: http://arxiv.org/abs/2010.06223v3
- Date: Tue, 20 Oct 2020 07:55:54 GMT
- Title: Direct Federated Neural Architecture Search
- Authors: Anubhav Garg, Amit Kumar Saha, Debo Dutta
- Abstract summary: We present an effective approach for direct federated NAS which is hardware agnostic, computationally lightweight, and a one-stage method to search for ready-to-deploy neural network models.
Our results show an order of magnitude reduction in resource consumption while edging out prior art in accuracy.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Neural Architecture Search (NAS) is a collection of methods to craft the way
neural networks are built. We apply this idea to Federated Learning (FL),
wherein predefined neural network models are trained on the client/device data.
This approach is not optimal, as the model developers cannot observe the local
data and hence cannot build highly accurate and efficient models. NAS is
promising for FL because it can automatically search for global and
personalized models on the non-IID data. Most NAS methods, however, are
computationally expensive and require fine-tuning after the search, making
them a complex two-stage process with possible human intervention. There is
thus a need for end-to-end NAS that can run on the heterogeneous data and
resource distributions typically seen in the FL scenario. In this paper, we
present an
effective approach for direct federated NAS which is hardware agnostic,
computationally lightweight, and a one-stage method to search for
ready-to-deploy neural network models. Our results show an order of magnitude
reduction in resource consumption while edging out prior art in accuracy. This
opens up a window of opportunity to create optimized and computationally
efficient federated learning systems.
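
To make the one-stage search concrete, here is a minimal sketch of a direct federated NAS round in the spirit of the abstract: each client samples an architecture from a shared distribution, updates both the supernet weights and the architecture parameters on its local data, and the server federally averages both. The toy search space, the placeholder gradient steps, and all names (local_step, federated_round, and so on) are illustrative assumptions, not the paper's actual implementation.

import numpy as np

rng = np.random.default_rng(0)
NUM_OPS, NUM_EDGES = 4, 6  # toy search space: 6 edges, 4 candidate ops per edge

def local_step(weights, arch_logits, num_samples):
    """One client's local update (placeholder): sample an architecture from
    the current distribution, then nudge the weights and the logits of the
    sampled ops. A real client would backpropagate through the sampled
    subnetwork on its private, possibly non-IID data."""
    probs = np.exp(arch_logits) / np.exp(arch_logits).sum(axis=1, keepdims=True)
    sampled = np.array([rng.choice(NUM_OPS, p=p) for p in probs])
    new_weights = weights - 0.01 * rng.standard_normal(weights.shape)
    new_logits = arch_logits.copy()
    new_logits[np.arange(NUM_EDGES), sampled] += 0.01 * rng.random(NUM_EDGES)
    return new_weights, new_logits, num_samples

def federated_round(weights, arch_logits, client_sizes):
    """Server round: broadcast the global state, collect client updates, and
    average both the weights and the architecture logits (FedAvg-style),
    weighting each client by its number of samples."""
    updates = [local_step(weights, arch_logits, n) for n in client_sizes]
    total = sum(n for _, _, n in updates)
    avg_w = sum(w * n for w, _, n in updates) / total
    avg_a = sum(a * n for _, a, n in updates) / total
    return avg_w, avg_a

weights = rng.standard_normal((NUM_EDGES, NUM_OPS))  # toy supernet weights
arch_logits = np.zeros((NUM_EDGES, NUM_OPS))         # uniform prior over ops
client_sizes = rng.integers(50, 200, size=8)         # toy client shard sizes
for _ in range(5):
    weights, arch_logits = federated_round(weights, arch_logits, client_sizes)
print("most likely op per edge:", arch_logits.argmax(axis=1))

Because the architecture parameters travel through the same averaging step as the weights, the search happens inside the federated loop itself, which is what makes a one-stage, ready-to-deploy outcome possible without a separate fine-tuning stage.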
Related papers
- DNA Family: Boosting Weight-Sharing NAS with Block-Wise Supervisions
We develop a family of models with the distilling neural architecture (DNA) techniques.
Our proposed DNA models can rate all architecture candidates, as opposed to previous works that can only access a sub-search space using heuristic algorithms.
Our models achieve state-of-the-art top-1 accuracy of 78.9% and 83.6% on ImageNet for a mobile convolutional network and a small vision transformer, respectively.
arXiv Detail & Related papers (2024-03-02T22:16:47Z) - Meta-prediction Model for Distillation-Aware NAS on Unseen Datasets
Distillation-aware Neural Architecture Search (DaNAS) aims to search for an optimal student architecture.
We propose a distillation-aware meta accuracy prediction model, DaSS (Distillation-aware Student Search), which can predict a given architecture's final performance on a dataset.
arXiv Detail & Related papers (2023-05-26T14:00:35Z) - DiffusionNAG: Predictor-guided Neural Architecture Generation with Diffusion Models
We propose a novel conditional Neural Architecture Generation (NAG) framework based on diffusion models, dubbed DiffusionNAG.
Specifically, we consider the neural architectures as directed graphs and propose a graph diffusion model for generating them.
We validate the effectiveness of DiffusionNAG through extensive experiments in two predictor-based NAS scenarios: Transferable NAS and Bayesian Optimization (BO)-based NAS.
When integrated into a BO-based algorithm, DiffusionNAG outperforms existing BO-based NAS approaches, particularly in the large MobileNetV3 search space on the ImageNet 1K dataset.
arXiv Detail & Related papers (2023-05-26T13:58:18Z) - Neural Architecture Search via Two Constant Shared Weights Initialisations
We present a zero-cost metric that is highly correlated with the train set accuracy across the NAS-Bench-101, NAS-Bench-201, and NAS-Bench-NLP benchmark datasets.
Our method is easy to integrate within existing NAS algorithms and takes a fraction of a second to evaluate a single network; a hedged sketch of the idea appears after this list.
arXiv Detail & Related papers (2023-02-09T02:25:38Z) - Generalization Properties of NAS under Activation and Skip Connection Search
We study the generalization properties of Neural Architecture Search (NAS) under a unifying framework.
We derive the lower (and upper) bounds of the minimum eigenvalue of the Neural Tangent Kernel (NTK) under the (in)finite-width regime.
We show how the derived results can guide NAS to select the top-performing architectures, even in the case without training.
arXiv Detail & Related papers (2022-09-15T12:11:41Z) - UnrealNAS: Can We Search Neural Architectures with Unreal Data?
Neural architecture search (NAS) has shown great success in the automatic design of deep neural networks (DNNs).
Previous work has analyzed the necessity of having ground-truth labels in NAS and inspired broad interest.
We take a further step to question whether real data is necessary for NAS to be effective.
arXiv Detail & Related papers (2022-05-04T16:30:26Z) - PEng4NN: An Accurate Performance Estimation Engine for Efficient Automated Neural Network Architecture Search
Neural network (NN) models are increasingly used in scientific simulations, AI, and other high performance computing fields.
NAS attempts to find well-performing NN models for specialized datasets, where performance is measured by key metrics that capture the NN capabilities.
We propose a performance estimation strategy that reduces the resources for training NNs and increases NAS throughput without jeopardizing accuracy.
arXiv Detail & Related papers (2021-01-11T20:49:55Z) - FDNAS: Improving Data Privacy and Model Diversity in AutoML
We propose a Federated Direct Neural Architecture Search (FDNAS) framework that allows hardware-aware NAS from decentralized non-IID data of clients.
To further adapt to the various data distributions of clients, inspired by meta-learning, a cluster Federated Direct Neural Architecture Search (CFDNAS) framework is proposed to achieve client-aware NAS.
arXiv Detail & Related papers (2020-11-06T14:13:42Z) - MS-RANAS: Multi-Scale Resource-Aware Neural Architecture Search
We propose Multi-Scale Resource-Aware Neural Architecture Search (MS-RANAS).
We employ a one-shot architecture search approach in order to obtain a reduced search cost.
We achieve state-of-the-art results in terms of accuracy-speed trade-off.
arXiv Detail & Related papers (2020-09-29T11:56:01Z) - Towards Non-I.I.D. and Invisible Data with FedNAS: Federated Deep Learning via Neural Architecture Search
We propose a Federated NAS (FedNAS) algorithm to help scattered workers collaboratively search for a better architecture with higher accuracy.
Our experiments on a non-IID dataset show that the architecture searched by FedNAS can outperform the manually predefined architecture.
arXiv Detail & Related papers (2020-04-18T08:04:44Z) - Federated Neural Architecture Search
We bring automatic neural architecture search into decentralized training, as a new DNN training paradigm called Federated Neural Architecture Search.
We present FedNAS, a highly optimized framework for efficient federated NAS.
Tested on large-scale datasets and typical CNN architectures, FedNAS achieves model accuracy comparable to state-of-the-art NAS algorithms.
arXiv Detail & Related papers (2020-02-15T10:01:05Z)
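
The zero-cost metric entry above ("Neural Architecture Search via Two Constant Shared Weights Initialisations") lends itself to a short sketch: score an architecture by how strongly its outputs diverge between two constant shared-weight initialisations, using a single forward minibatch and no training. The toy network, the constants c1 and c2, and the normalised mean-absolute-difference statistic below are assumptions for illustration; the paper's exact statistic may differ.

import numpy as np

def forward(arch, weight_value, x):
    """Toy 'network': a stack of tanh linear layers whose shapes encode the
    architecture, with every weight set to the same constant value."""
    h = x
    for fan_in, fan_out in arch:
        w = np.full((fan_in, fan_out), weight_value)
        h = np.tanh(h @ w)
    return h

def two_init_score(arch, x, c1=1e-3, c2=1e-1):
    """Zero-cost score: relative divergence of the outputs obtained under two
    different constant shared-weight initialisations."""
    y1, y2 = forward(arch, c1, x), forward(arch, c2, x)
    return np.abs(y1 - y2).mean() / (np.abs(y1).mean() + np.abs(y2).mean() + 1e-12)

rng = np.random.default_rng(0)
x = rng.standard_normal((32, 16))  # a single minibatch is all that is needed
candidates = [[(16, 32), (32, 8)], [(16, 64), (64, 64), (64, 8)]]
scores = [two_init_score(a, x) for a in candidates]
print("candidates ranked by score:",
      sorted(range(len(candidates)), key=lambda i: -scores[i]))

No gradients or training steps are involved, which is consistent with the entry's claim that a single network can be evaluated in a fraction of a second.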
This list is automatically generated from the titles and abstracts of the papers on this site.