FedorAS: Federated Architecture Search under system heterogeneity
- URL: http://arxiv.org/abs/2206.11239v2
- Date: Thu, 23 Jun 2022 10:04:54 GMT
- Title: FedorAS: Federated Architecture Search under system heterogeneity
- Authors: Lukasz Dudziak, Stefanos Laskaridis, Javier Fernandez-Marques
- Abstract summary: Federated learning (FL) has recently gained considerable attention due to its ability to use decentralised data while preserving privacy.
It also poses additional challenges related to the heterogeneity of the participating devices, in terms of their computational capabilities and contributed data.
We design our system, FedorAS, to discover and train promising architectures when dealing with devices of varying capabilities holding non-IID distributed data.
- Score: 7.187123335023895
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Federated learning (FL) has recently gained considerable attention due to its
ability to use decentralised data while preserving privacy. However, it also
poses additional challenges related to the heterogeneity of the participating
devices, both in terms of their computational capabilities and contributed
data. Meanwhile, Neural Architecture Search (NAS) has been successfully used
with centralised datasets, producing state-of-the-art results in constrained
(hardware-aware) and unconstrained settings. However, even the most recent work
lying at the intersection of NAS and FL assumes a homogeneous compute
environment with datacenter-grade hardware and does not address the issues of
working with constrained, heterogeneous devices. As a result, practical usage
of NAS in a federated setting remains an open problem that we address in our
work. We design our system, FedorAS, to discover and train promising
architectures when dealing with devices of varying capabilities holding non-IID
distributed data, and present empirical evidence of its effectiveness across
different settings. Specifically, we evaluate FedorAS across datasets spanning
three different modalities (vision, speech, text) and show its better
performance compared to state-of-the-art federated solutions, while maintaining
resource efficiency.
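To make the federated setting described above concrete, below is a minimal sketch of the weighted aggregation step (FedAvg-style) that underlies FL systems such as the one the paper targets. The devices, dataset sizes, and two-parameter "model" are illustrative assumptions, not details from FedorAS; real systems aggregate full tensors and account for per-device capability constraints.

```python
# Sketch of one federated averaging round over heterogeneous clients.
# Each device trains locally and sends back (weights, num_samples);
# the server averages weights in proportion to local dataset size,
# which matters precisely because data is non-IID and unevenly sized.

def fedavg_round(client_updates):
    """Aggregate client models, weighted by local dataset size.

    client_updates: list of (weights, num_samples) tuples, where
    weights is a flat list of floats standing in for a model.
    """
    total = sum(n for _, n in client_updates)
    dim = len(client_updates[0][0])
    aggregated = [0.0] * dim
    for weights, n in client_updates:
        for i, w in enumerate(weights):
            aggregated[i] += w * (n / total)
    return aggregated

# Three hypothetical devices holding differently sized local datasets.
updates = [
    ([1.0, 2.0], 10),  # phone with little data
    ([3.0, 4.0], 30),  # tablet
    ([5.0, 6.0], 60),  # edge server holding most of the data
]
global_weights = fedavg_round(updates)
```

Here the third client contributes 60% of the samples and therefore dominates the average; a federated NAS system layers an architecture-search loop on top of rounds like this one.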
Related papers
- Fair Differentiable Neural Network Architecture Search for Long-Tailed Data with Self-Supervised Learning [0.0]
This paper explores how to improve the search and training performance of NAS on long-tailed datasets.
We first discuss related work on NAS and deep learning methods for long-tailed datasets.
Then, we focus on an existing work, called SSF-NAS, which integrates self-supervised learning and fair differentiable NAS.
Finally, we conduct a series of experiments on the CIFAR10-LT dataset for performance evaluation.
arXiv Detail & Related papers (2024-06-19T12:39:02Z) - Multi-objective Differentiable Neural Architecture Search [58.67218773054753]
We propose a novel NAS algorithm that encodes user preferences for the trade-off between performance and hardware metrics.
Our method outperforms existing MOO NAS methods across a broad range of qualitatively different search spaces and datasets.
arXiv Detail & Related papers (2024-02-28T10:09:04Z) - Effective Intrusion Detection in Heterogeneous Internet-of-Things Networks via Ensemble Knowledge Distillation-based Federated Learning [52.6706505729803]
We introduce Federated Learning (FL) to collaboratively train a decentralized shared model of Intrusion Detection Systems (IDS).
FLEKD enables a more flexible aggregation method than conventional model fusion techniques.
Experiment results show that the proposed approach outperforms local training and traditional FL in terms of both speed and performance.
arXiv Detail & Related papers (2024-01-22T14:16:37Z) - Federated Learning for Computationally-Constrained Heterogeneous Devices: A Survey [3.219812767529503]
Federated learning (FL) offers a privacy-preserving trade-off between communication overhead and model accuracy.
We outline the challenges FL has to overcome to be widely applicable in real-world applications.
arXiv Detail & Related papers (2023-07-18T12:05:36Z) - FS-Real: Towards Real-World Cross-Device Federated Learning [60.91678132132229]
Federated Learning (FL) aims to train high-quality models in collaboration with distributed clients while not uploading their local data.
There is still a considerable gap between flourishing FL research and real-world scenarios, mainly caused by the characteristics of heterogeneous devices and their scales.
We propose an efficient and scalable prototyping system for real-world cross-device FL, FS-Real.
arXiv Detail & Related papers (2023-03-23T15:37:17Z) - FedHiSyn: A Hierarchical Synchronous Federated Learning Framework for Resource and Data Heterogeneity [56.82825745165945]
Federated Learning (FL) enables training a global model without sharing the decentralized raw data stored on multiple devices to protect data privacy.
We propose a hierarchical synchronous FL framework, i.e., FedHiSyn, to tackle the problems of straggler effects and outdated models.
We evaluate the proposed framework based on MNIST, EMNIST, CIFAR10 and CIFAR100 datasets and diverse heterogeneous settings of devices.
arXiv Detail & Related papers (2022-06-21T17:23:06Z) - FedILC: Weighted Geometric Mean and Invariant Gradient Covariance for Federated Learning on Non-IID Data [69.0785021613868]
Federated learning is a distributed machine learning approach which enables a shared server model to learn by aggregating the locally-computed parameter updates with the training data from spatially-distributed client silos.
We propose the Federated Invariant Learning Consistency (FedILC) approach, which leverages the gradient covariance and the geometric mean of Hessians to capture both inter-silo and intra-silo consistencies.
This is relevant to various fields such as medical healthcare, computer vision, and the Internet of Things (IoT).
arXiv Detail & Related papers (2022-05-19T03:32:03Z) - Towards Tailored Models on Private AIoT Devices: Federated Direct Neural Architecture Search [22.69123714900226]
We propose a Federated Direct Neural Architecture Search (FDNAS) framework that allows for hardware-friendly NAS from non-IID data across devices.
Experiments on non-IID datasets have shown the state-of-the-art accuracy-efficiency trade-offs achieved by the proposed solution.
arXiv Detail & Related papers (2022-02-23T13:10:01Z) - Rethinking Architecture Design for Tackling Data Heterogeneity in Federated Learning [53.73083199055093]
We show that attention-based architectures (e.g., Transformers) are fairly robust to distribution shifts.
Our experiments show that replacing convolutional networks with Transformers can greatly reduce catastrophic forgetting of previous devices.
arXiv Detail & Related papers (2021-06-10T21:04:18Z) - Self-supervised Cross-silo Federated Neural Architecture Search [13.971827232338716]
We present Self-supervised Vertical Federated Neural Architecture Search (SS-VFNAS) for automating Vertical Federated Learning (VFL).
In the proposed framework, each party first conducts NAS using a self-supervised approach to find a locally optimal architecture with its own data.
We demonstrate experimentally that our approach has superior performance, communication efficiency and privacy compared to Federated NAS.
arXiv Detail & Related papers (2021-01-28T09:57:30Z) - FDNAS: Improving Data Privacy and Model Diversity in AutoML [7.402044070683503]
We propose a Federated Direct Neural Architecture Search (FDNAS) framework that allows hardware-aware NAS from decentralized non-IID data of clients.
To further adapt for various data distributions of clients, inspired by meta-learning, a cluster Federated Direct Neural Architecture Search (CFDNAS) framework is proposed to achieve client-aware NAS.
arXiv Detail & Related papers (2020-11-06T14:13:42Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.