Decentralized Federated Learning: Fundamentals, State of the Art,
Frameworks, Trends, and Challenges
- URL: http://arxiv.org/abs/2211.08413v5
- Date: Wed, 13 Sep 2023 08:12:26 GMT
- Title: Decentralized Federated Learning: Fundamentals, State of the Art,
Frameworks, Trends, and Challenges
- Authors: Enrique Tomás Martínez Beltrán, Mario Quiles Pérez, Pedro
Miguel Sánchez Sánchez, Sergio López Bernal, Gérôme Bovet, Manuel
Gil Pérez, Gregorio Martínez Pérez, Alberto Huertas Celdrán
- Abstract summary: Federated Learning (FL) has gained relevance in training collaborative models without sharing sensitive data.
Decentralized Federated Learning (DFL) emerged to address the limitations of centralized approaches by promoting decentralized model aggregation.
This article identifies and analyzes the main fundamentals of DFL in terms of federation architectures, topologies, communication mechanisms, security approaches, and key performance indicators.
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: In recent years, Federated Learning (FL) has gained relevance in training
collaborative models without sharing sensitive data. Since its birth,
Centralized FL (CFL) has been the most common approach in the literature, where
a central entity creates a global model. However, a centralized approach leads
to increased latency due to bottlenecks, heightened vulnerability to system
failures, and trustworthiness concerns affecting the entity responsible for the
global model creation. Decentralized Federated Learning (DFL) emerged to
address these concerns by promoting decentralized model aggregation and
minimizing reliance on centralized architectures. However, despite the work
done in DFL, the literature has not (i) studied the main aspects
differentiating DFL and CFL; (ii) analyzed DFL frameworks to create and
evaluate new solutions; and (iii) reviewed application scenarios using DFL.
Thus, this article identifies and analyzes the main fundamentals of DFL in
terms of federation architectures, topologies, communication mechanisms,
security approaches, and key performance indicators. Additionally, the paper at
hand explores existing mechanisms to optimize critical DFL fundamentals. Then,
the most relevant features of the current DFL frameworks are reviewed and
compared. After that, it analyzes the most used DFL application scenarios,
identifying solutions based on the fundamentals and frameworks previously
defined. Finally, the evolution of existing DFL solutions is studied to provide
a list of trends, lessons learned, and open challenges.
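The contrast the abstract draws between centralized aggregation (CFL) and decentralized aggregation (DFL) can be sketched in a few lines. The snippet below is an illustrative toy, not code from the paper: it compares FedAvg-style central averaging against gossip averaging over a ring topology, one of the DFL topologies the survey covers. All names and the ring/uniform-weight choice are assumptions made for the example.

```python
import numpy as np

# Illustrative sketch (not from the surveyed paper): centralized
# aggregation (CFL) vs decentralized gossip averaging (DFL).
# Each "model" is a flat parameter vector; the DFL topology is a ring.

rng = np.random.default_rng(0)
n_nodes, dim = 4, 3
models = [rng.normal(size=dim) for _ in range(n_nodes)]

# CFL: a central server averages all client models (FedAvg-style).
global_model = np.mean(models, axis=0)

# DFL: each node averages only with its two ring neighbors.
# The uniform 1/3 weights form a doubly stochastic mixing matrix,
# so repeated rounds preserve the mean and drive all local models
# toward the same consensus value -- with no central server.
def gossip_round(models):
    n = len(models)
    return [
        (models[i] + models[(i - 1) % n] + models[(i + 1) % n]) / 3
        for i in range(n)
    ]

local = models
for _ in range(50):
    local = gossip_round(local)

# After enough rounds every local model matches the global average.
assert all(np.allclose(m, global_model, atol=1e-6) for m in local)
```

The convergence rate depends on the topology's spectral gap, which is one reason the survey treats topology choice as a key DFL fundamental.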
Related papers
- Advances in APPFL: A Comprehensive and Extensible Federated Learning Framework [1.4206132527980742]
Federated learning (FL) is a distributed machine learning paradigm enabling collaborative model training while preserving data privacy.
We present the recent advances in developing APPFL, a framework and benchmarking suite for federated learning.
We demonstrate the capabilities of APPFL through extensive experiments evaluating various aspects of FL, including communication efficiency, privacy preservation, computational performance, and resource utilization.
arXiv Detail & Related papers (2024-09-17T22:20:26Z) - Towards Understanding Generalization and Stability Gaps between Centralized and Decentralized Federated Learning [57.35402286842029]
We show that centralized federated learning always generalizes better than decentralized federated learning (DFL).
We also conduct experiments on several common setups in FL to validate that our theoretical analysis is consistent with experimental phenomena and contextually valid in several general and practical scenarios.
arXiv Detail & Related papers (2023-10-05T11:09:42Z) - Heterogeneous Federated Learning: State-of-the-art and Research
Challenges [117.77132819796105]
Heterogeneous Federated Learning (HFL) is much more challenging, and the corresponding solutions are diverse and complex.
New advances in HFL are reviewed and a new taxonomy of existing HFL methods is proposed.
Several critical and promising future research directions in HFL are discussed.
arXiv Detail & Related papers (2023-07-20T06:32:14Z) - Decentralized Federated Learning: A Survey and Perspective [45.81975053649379]
Decentralized FL (DFL) is a decentralized network architecture that eliminates the need for a central server.
DFL enables direct communication between clients, resulting in significant savings in communication resources.
arXiv Detail & Related papers (2023-06-02T15:12:58Z) - Deep Equilibrium Models Meet Federated Learning [71.57324258813675]
This study explores the problem of Federated Learning (FL) by utilizing the Deep Equilibrium (DEQ) models instead of conventional deep learning networks.
We claim that incorporating DEQ models into the federated learning framework naturally addresses several open problems in FL.
To the best of our knowledge, this study is the first to establish a connection between DEQ models and federated learning.
arXiv Detail & Related papers (2023-05-29T22:51:40Z) - Bayesian Federated Learning: A Survey [54.40136267717288]
Federated learning (FL) demonstrates its advantages in integrating distributed infrastructure, communication, computing and learning in a privacy-preserving manner.
The robustness and capabilities of existing FL methods are challenged by limited and dynamic data and conditions.
BFL has emerged as a promising approach to address these issues.
arXiv Detail & Related papers (2023-04-26T03:41:17Z) - Vertical Federated Learning: A Structured Literature Review [0.0]
Federated learning (FL) has emerged as a promising distributed learning paradigm with an added advantage of data privacy.
In this paper, we present a structured literature review discussing the state-of-the-art approaches in VFL.
arXiv Detail & Related papers (2022-12-01T16:16:41Z) - Vertical Federated Learning: Challenges, Methodologies and Experiments [34.4865409422585]
Vertical federated learning (VFL) is capable of constructing a hyper ML model by combining sub-models from different clients.
In this paper, we discuss key challenges in VFL with effective solutions, and conduct experiments on real-life datasets.
arXiv Detail & Related papers (2022-02-09T06:56:41Z) - SSFL: Tackling Label Deficiency in Federated Learning via Personalized
Self-Supervision [34.38856587032084]
Federated Learning (FL) is transforming the ML training ecosystem from a centralized over-the-cloud setting to distributed training over edge devices.
We propose self-supervised federated learning (SSFL), a unified self-supervised and personalized federated learning framework.
We show that the evaluation-accuracy gap between supervised and unsupervised learning in FL is both small and reasonable.
arXiv Detail & Related papers (2021-10-06T02:58:45Z) - Edge-assisted Democratized Learning Towards Federated Analytics [67.44078999945722]
We show the hierarchical learning structure of the proposed edge-assisted democratized learning mechanism, namely Edge-DemLearn.
We also validate Edge-DemLearn as a flexible model training mechanism to build a distributed control and aggregation methodology in regions.
arXiv Detail & Related papers (2020-12-01T11:46:03Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.