Towards Non-I.I.D. and Invisible Data with FedNAS: Federated Deep
Learning via Neural Architecture Search
- URL: http://arxiv.org/abs/2004.08546v4
- Date: Mon, 4 Jan 2021 02:18:08 GMT
- Title: Towards Non-I.I.D. and Invisible Data with FedNAS: Federated Deep
Learning via Neural Architecture Search
- Authors: Chaoyang He, Murali Annavaram, Salman Avestimehr
- Abstract summary: We propose a Federated NAS (FedNAS) algorithm to help scattered workers collaboratively search for a better architecture with higher accuracy.
Our experiments on a non-IID dataset show that the architecture searched by FedNAS can outperform the manually predefined architecture.
- Score: 15.714385295889944
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Federated Learning (FL) has proven to be an effective learning framework
when data cannot be centralized due to privacy, communication costs, and
regulatory restrictions. When training deep learning models in an FL
setting, practitioners typically employ a predefined model architecture discovered in a
centralized environment. However, this predefined architecture may not be the
optimal choice, because it may not fit data that are not independently and
identically distributed (non-IID). Thus, we advocate automating federated learning
(AutoFL) to improve model accuracy and reduce the manual design effort. We
specifically study AutoFL via Neural Architecture Search (NAS), which can
automate the design process. We propose a Federated NAS (FedNAS) algorithm to
help scattered workers collaboratively search for a better architecture with
higher accuracy. We also build a system based on FedNAS. Our experiments on a
non-IID dataset show that the architecture searched by FedNAS can outperform
the manually predefined architecture.
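
As a rough illustration of the round structure implied by the abstract (each
worker locally updates both its model weights and its DARTS-style architecture
parameters, and a server aggregates the results), here is a minimal Python
sketch. The function names, tensor shapes, placeholder gradients, and the use
of FedAvg-style weighted averaging are assumptions made for illustration, not
the authors' implementation.

```python
import numpy as np

def local_search(w, alpha, data, epochs=1, lr=0.01):
    """One client's local architecture-search step (placeholder gradients).

    In a real system these would be gradients of the training/validation
    losses w.r.t. the weights `w` and architecture parameters `alpha`,
    computed on the client's private (possibly non-IID) data.
    """
    for _ in range(epochs):
        grad_w = np.random.randn(*w.shape) * 0.01
        grad_alpha = np.random.randn(*alpha.shape) * 0.01
        w = w - lr * grad_w
        alpha = alpha - lr * grad_alpha
    return w, alpha

def fednas_round(global_w, global_alpha, client_datasets):
    """One communication round: broadcast, local search, weighted aggregation."""
    sizes = np.array([len(d) for d in client_datasets], dtype=float)
    fractions = sizes / sizes.sum()
    new_w = np.zeros_like(global_w)
    new_alpha = np.zeros_like(global_alpha)
    for frac, data in zip(fractions, client_datasets):
        w_k, alpha_k = local_search(global_w.copy(), global_alpha.copy(), data)
        new_w += frac * w_k          # aggregate model weights
        new_alpha += frac * alpha_k  # aggregate architecture parameters
    return new_w, new_alpha

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    w = rng.standard_normal(10)      # stand-in for network weights
    alpha = rng.standard_normal(8)   # stand-in for mixed-op architecture params
    clients = [list(range(n)) for n in (50, 120, 80)]  # client data stubs
    for r in range(3):
        w, alpha = fednas_round(w, alpha, clients)
        print(f"round {r}: ||alpha|| = {np.linalg.norm(alpha):.3f}")
```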
Related papers
- DNA Family: Boosting Weight-Sharing NAS with Block-Wise Supervisions [121.05720140641189]
We develop a family of models with distilling neural architecture (DNA) techniques.
Our proposed DNA models can rate all architecture candidates, as opposed to previous works that can only access a sub-search space using algorithms.
Our models achieve state-of-the-art top-1 accuracy of 78.9% and 83.6% on ImageNet for a mobile convolutional network and a small vision transformer, respectively.
arXiv Detail & Related papers (2024-03-02T22:16:47Z) - Meta-prediction Model for Distillation-Aware NAS on Unseen Datasets [55.2118691522524]
Distillation-aware Neural Architecture Search (DaNAS) aims to search for an optimal student architecture.
We propose a distillation-aware meta accuracy prediction model, DaSS (Distillation-aware Student Search), which can predict a given architecture's final performances on a dataset.
arXiv Detail & Related papers (2023-05-26T14:00:35Z) - NASiam: Efficient Representation Learning using Neural Architecture
Search for Siamese Networks [76.8112416450677]
Siamese networks are one of the most popular methods for self-supervised visual representation learning (SSL).
NASiam is a novel approach that, for the first time, uses differentiable NAS to improve the multilayer perceptron projector and predictor (encoder/predictor pair).
NASiam reaches competitive performance on both small-scale (i.e., CIFAR-10/CIFAR-100) and large-scale (i.e., ImageNet) image classification datasets while costing only a few GPU hours.
arXiv Detail & Related papers (2023-01-31T19:48:37Z) - Generalization Properties of NAS under Activation and Skip Connection
Search [66.8386847112332]
We study the generalization properties of Neural Architecture Search (NAS) under a unifying framework.
We derive the lower (and upper) bounds of the minimum eigenvalue of the Neural Tangent Kernel (NTK) under the (in)finite-width regime.
We show how the derived results can guide NAS to select the top-performing architectures, even in the case without training.
arXiv Detail & Related papers (2022-09-15T12:11:41Z) - UnrealNAS: Can We Search Neural Architectures with Unreal Data? [84.78460976605425]
Neural architecture search (NAS) has shown great success in the automatic design of deep neural networks (DNNs).
Previous work has analyzed the necessity of having ground-truth labels in NAS and inspired broad interest.
We take a further step to question whether real data is necessary for NAS to be effective.
arXiv Detail & Related papers (2022-05-04T16:30:26Z) - Pretraining Neural Architecture Search Controllers with Locality-based
Self-Supervised Learning [0.0]
We propose a pretraining scheme that can be applied to controller-based NAS.
Our method, locality-based self-supervised classification task, leverages the structural similarity of network architectures to obtain good architecture representations.
arXiv Detail & Related papers (2021-03-15T06:30:36Z) - Self-supervised Cross-silo Federated Neural Architecture Search [13.971827232338716]
We present Self-supervised Vertical Federated Neural Architecture Search (SS-VFNAS) for automating Vertical Federated Learning (VFL).
In the proposed framework, each party first conducts NAS using a self-supervised approach to find a locally optimal architecture with its own data.
We demonstrate experimentally that our approach has superior performance, communication efficiency and privacy compared to Federated NAS.
arXiv Detail & Related papers (2021-01-28T09:57:30Z) - FDNAS: Improving Data Privacy and Model Diversity in AutoML [7.402044070683503]
We propose a Federated Direct Neural Architecture Search (FDNAS) framework that allows hardware-aware NAS from the decentralized non-IID data of clients.
To further adapt for various data distributions of clients, inspired by meta-learning, a cluster Federated Direct Neural Architecture Search (CFDNAS) framework is proposed to achieve client-aware NAS.
arXiv Detail & Related papers (2020-11-06T14:13:42Z) - Direct Federated Neural Architecture Search [0.0]
We present an effective approach for direct federated NAS which is hardware agnostic, computationally lightweight, and a one-stage method to search for ready-to-deploy neural network models.
Our results show an order of magnitude reduction in resource consumption while edging out prior art in accuracy.
arXiv Detail & Related papers (2020-10-13T08:11:35Z) - A Privacy-Preserving Distributed Architecture for
Deep-Learning-as-a-Service [68.84245063902908]
This paper introduces a novel distributed architecture for deep-learning-as-a-service.
It is able to preserve users' sensitive data while providing Cloud-based machine and deep learning services.
arXiv Detail & Related papers (2020-03-30T15:12:03Z) - Federated Neural Architecture Search [19.573780215917477]
We propose incorporating automatic neural architecture search into decentralized training, as a new DNN training paradigm called Federated Neural Architecture Search.
We present FedNAS, a highly optimized framework for efficient federated NAS.
Tested on large-scale datasets and typical CNN architectures, FedNAS achieves model accuracy comparable to state-of-the-art NAS algorithms.
arXiv Detail & Related papers (2020-02-15T10:01:05Z)