EdgeFL: A Lightweight Decentralized Federated Learning Framework
- URL: http://arxiv.org/abs/2309.02936v1
- Date: Wed, 6 Sep 2023 11:55:41 GMT
- Title: EdgeFL: A Lightweight Decentralized Federated Learning Framework
- Authors: Hongyi Zhang, Jan Bosch, Helena Holmström Olsson
- Abstract summary: We introduce EdgeFL, an edge-only lightweight decentralized FL framework.
By adopting an edge-only model training and aggregation approach, EdgeFL eliminates the need for a central server.
We show that EdgeFL achieves superior performance compared to existing FL platforms/frameworks.
- Score: 8.934690279361286
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Federated Learning (FL) has emerged as a promising approach for collaborative
machine learning, addressing data privacy concerns. However, existing FL
platforms and frameworks often present challenges for software engineers in
terms of complexity, limited customization options, and scalability
limitations. In this paper, we introduce EdgeFL, an edge-only lightweight
decentralized FL framework, designed to overcome the limitations of centralized
aggregation and scalability in FL deployments. By adopting an edge-only model
training and aggregation approach, EdgeFL eliminates the need for a central
server, enabling seamless scalability across diverse use cases. With a
straightforward integration process requiring just four lines of code (LOC),
software engineers can easily incorporate FL functionalities into their AI
products. Furthermore, EdgeFL offers the flexibility to customize aggregation
functions, empowering engineers to adapt them to specific needs. Our evaluation
demonstrates that EdgeFL achieves superior performance compared to existing FL
platforms/frameworks: it reduces weight-update latency and enables faster model
evolution, enhancing the efficiency of
edge devices. Moreover, EdgeFL exhibits improved classification accuracy
compared to traditional centralized FL approaches. By leveraging EdgeFL,
software engineers can harness the benefits of federated learning while
overcoming the challenges associated with existing FL platforms/frameworks.
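To make the abstract's claims concrete, below is a minimal sketch of what an edge-only, serverless FL loop of this kind might look like: each node trains locally, pulls weights directly from its peers, and applies a pluggable aggregation function. All names here (ToyModel, EdgeNode, fedavg, round) are illustrative assumptions, not EdgeFL's actual API, and the few-line integration shown at the end only mirrors the style the abstract describes.

```python
import numpy as np

class ToyModel:
    """Toy linear least-squares model so the sketch runs end to end."""
    def __init__(self, dim):
        self.w = np.zeros(dim)
    def fit(self, x, y, lr=0.1):
        grad = x.T @ (x @ self.w - y) / len(y)  # least-squares gradient
        self.w -= lr * grad
    def get_weights(self):
        return [self.w.copy()]
    def set_weights(self, weights):
        self.w = weights[0].copy()

def fedavg(weight_sets):
    """Default aggregation: element-wise mean of each layer across peers."""
    return [np.mean(np.stack(layers), axis=0) for layers in zip(*weight_sets)]

class EdgeNode:
    """One edge device: trains locally, then averages weights pulled directly
    from its peers -- no central aggregation server involved."""
    def __init__(self, model, peers=None, aggregate=fedavg):
        self.model = model          # anything with fit/get_weights/set_weights
        self.peers = peers or []    # other EdgeNode instances (network stubs)
        self.aggregate = aggregate  # pluggable aggregation function

    def round(self, x, y):
        self.model.fit(x, y)                                 # local training
        weight_sets = [p.model.get_weights() for p in self.peers]
        weight_sets.append(self.model.get_weights())         # include self
        self.model.set_weights(self.aggregate(weight_sets))  # edge-side merge

# A "few lines of code" integration in the spirit the abstract describes:
rng = np.random.default_rng(0)
x, y = rng.normal(size=(32, 3)), rng.normal(size=32)
nodes = [EdgeNode(ToyModel(3)) for _ in range(3)]
for n in nodes:
    n.peers = [p for p in nodes if p is not n]  # fully connected peer group
for n in nodes:
    n.round(x, y)                               # train + aggregate, serverless
```

Swapping `fedavg` for another function passed via `aggregate` is how a customizable aggregation hook of the kind the abstract mentions would plug in.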
Related papers
- Advances in APPFL: A Comprehensive and Extensible Federated Learning Framework [1.4206132527980742]
Federated learning (FL) is a distributed machine learning paradigm enabling collaborative model training while preserving data privacy.
We present the recent advances in developing APPFL, a framework and benchmarking suite for federated learning.
We demonstrate the capabilities of APPFL through extensive experiments evaluating various aspects of FL, including communication efficiency, privacy preservation, computational performance, and resource utilization.
arXiv Detail & Related papers (2024-09-17T22:20:26Z) - SpaFL: Communication-Efficient Federated Learning with Sparse Models and Low Computational Overhead [75.87007729801304]
SpaFL, a communication-efficient FL framework, is proposed to optimize sparse model structures with low computational overhead.
Experiments show that SpaFL improves accuracy while requiring much less communication and computing resources compared to sparse baselines.
arXiv Detail & Related papers (2024-06-01T13:10:35Z) - AdaptSFL: Adaptive Split Federated Learning in Resource-constrained Edge Networks [15.195798715517315]
Split federated learning (SFL) is a promising solution that offloads the primary training workload to a server via model partitioning.
We propose AdaptSFL, a novel resource-adaptive SFL framework, to expedite SFL under resource-constrained edge computing systems.
arXiv Detail & Related papers (2024-03-19T19:05:24Z) - Communication Efficient ConFederated Learning: An Event-Triggered SAGA
Approach [67.27031215756121]
Federated learning (FL) is a machine learning paradigm that trains models without gathering the local data held by various data sources.
Standard FL, which employs a single server, can only support a limited number of users, leading to degraded learning capability.
In this work, we consider a multi-server FL framework, referred to as Confederated Learning (CFL), in order to accommodate a larger number of users.
arXiv Detail & Related papers (2024-02-28T03:27:10Z) - Adaptive Federated Pruning in Hierarchical Wireless Networks [69.6417645730093]
Federated Learning (FL) is a privacy-preserving distributed learning framework where a server aggregates models updated by multiple devices without accessing their private datasets.
In this paper, we introduce model pruning for hierarchical FL (HFL) in wireless networks to reduce the neural network scale (a generic magnitude-pruning sketch appears after this list).
We show that our proposed HFL with model pruning achieves learning accuracy similar to HFL without pruning while reducing communication cost by about 50 percent.
arXiv Detail & Related papers (2023-05-15T22:04:49Z) - Automated Federated Learning in Mobile Edge Networks -- Fast Adaptation
and Convergence [83.58839320635956]
Federated Learning (FL) can be used in mobile edge networks to train machine learning models in a distributed manner.
Recent work has interpreted FL within a Model-Agnostic Meta-Learning (MAML) framework, which brings significant advantages in fast adaptation and convergence over heterogeneous datasets (a first-order sketch of this view appears after this list).
This paper addresses how much benefit MAML brings to FL and how to maximize such benefit over mobile edge networks.
arXiv Detail & Related papers (2023-03-23T02:42:10Z) - Hierarchical Personalized Federated Learning Over Massive Mobile Edge
Computing Networks [95.39148209543175]
We propose hierarchical personalized federated learning (HPFL), an algorithm for deploying personalized FL (PFL) over massive mobile edge computing (MEC) networks.
HPFL combines the objectives of training loss minimization and round latency minimization while jointly determining the optimal bandwidth allocation.
arXiv Detail & Related papers (2023-03-19T06:00:05Z) - EasyFL: A Low-code Federated Learning Platform For Dummies [21.984721627569783]
We propose the first low-code Federated Learning (FL) platform, EasyFL, to enable users with various levels of expertise to experiment and prototype FL applications with little coding.
With only a few lines of code, EasyFL empowers them with many out-of-the-box functionalities to accelerate experimentation and deployment.
Our implementations show that EasyFL requires only three lines of code to build a vanilla FL application, at least 10x fewer than other platforms.
arXiv Detail & Related papers (2021-05-17T04:15:55Z) - Wireless Communications for Collaborative Federated Learning [160.82696473996566]
Internet of Things (IoT) devices may not be able to transmit their collected data to a central controller for training machine learning models.
Google's seminal FL algorithm requires all devices to be directly connected with a central controller.
This paper introduces a novel FL framework, called collaborative FL (CFL), which enables edge devices to implement FL with less reliance on a central controller.
arXiv Detail & Related papers (2020-06-03T20:00:02Z)
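The pruning idea in the hierarchical-FL entry above can be made concrete with a short sketch. This shows generic magnitude pruning with a binary mask, a common baseline technique; it is an assumption for illustration, not that paper's exact scheme.

```python
import numpy as np

def prune_by_magnitude(weights, sparsity=0.5):
    """Zero out the `sparsity` fraction of weights with smallest magnitude;
    return the pruned weights and the binary mask that was applied."""
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)
    threshold = np.partition(flat, k)[k] if k > 0 else 0.0
    mask = (np.abs(weights) >= threshold).astype(weights.dtype)
    return weights * mask, mask

w = np.random.default_rng(0).normal(size=(4, 4))
pruned, mask = prune_by_magnitude(w, sparsity=0.5)
# Only the surviving entries (plus the mask) need to travel up the hierarchy,
# which is where a roughly 50 percent communication saving can come from.
print(f"kept fraction: {mask.mean():.2f}")  # ~0.50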
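The MAML view of FL mentioned in the mobile-edge entry above can likewise be sketched: each client takes an inner adaptation step, and the global model is updated toward post-adaptation performance. This uses the first-order variant with toy quadratic losses as a stated simplification; the paper's exact formulation may differ.

```python
import numpy as np

# Each client i has a toy quadratic loss L_i(w) = 0.5 * ||w - t_i||^2,
# so grad L_i(w) = w - t_i; distinct targets mimic heterogeneous data.
targets = [np.array([1.0, 0.0]), np.array([0.0, 1.0]), np.array([2.0, 2.0])]
grad = lambda w, t: w - t

w = np.zeros(2)          # shared global model
alpha, beta = 0.1, 0.5   # inner (adaptation) and outer (meta) step sizes

for _ in range(100):
    outer_grads = []
    for t in targets:
        w_adapted = w - alpha * grad(w, t)      # one local adaptation step
        outer_grads.append(grad(w_adapted, t))  # first-order MAML gradient
    w -= beta * np.mean(outer_grads, axis=0)    # aggregated meta-update

# One inner step from the learned w now personalizes quickly to each client,
# which is the "fast adaptation" benefit the entry above refers to.
for t in targets:
    print(np.round(w - alpha * grad(w, t), 3))
```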
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.