Asymmetrically Decentralized Federated Learning
- URL: http://arxiv.org/abs/2310.05093v1
- Date: Sun, 8 Oct 2023 09:46:26 GMT
- Title: Asymmetrically Decentralized Federated Learning
- Authors: Qinglun Li, Miao Zhang, Nan Yin, Quanjun Yin, Li Shen
- Abstract summary: Decentralized Federated Learning (DFL) has emerged, which discards the server with a peer-to-peer (P2P) communication framework.
This paper proposes the DFedSGPSM algorithm, which is based on asymmetric topologies and utilizes the Push-Sum protocol.
- Score: 22.21977974314497
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: To address the communication burden and privacy concerns associated with the
centralized server in Federated Learning (FL), Decentralized Federated Learning
(DFL) has emerged, which discards the server with a peer-to-peer (P2P)
communication framework. However, most existing DFL algorithms are based on
symmetric topologies, such as ring and grid topologies, which can easily lead
to deadlocks and are susceptible to the impact of network link quality in
practice. To address these issues, this paper proposes the DFedSGPSM algorithm,
which is based on asymmetric topologies and utilizes the Push-Sum protocol to
effectively solve consensus optimization problems. To further improve algorithm
performance and alleviate local heterogeneous overfitting in Federated Learning
(FL), our algorithm combines the Sharpness Aware Minimization (SAM) optimizer
and local momentum. The SAM optimizer employs gradient perturbations to
generate locally flat models and searches for models with uniformly low loss
values, mitigating local heterogeneous overfitting. The local momentum
accelerates the optimization process of the SAM optimizer. Theoretical analysis
proves that DFedSGPSM achieves a convergence rate of
$\mathcal{O}(\frac{1}{\sqrt{T}})$ in a non-convex smooth setting under mild
assumptions. This analysis also reveals that better topological connectivity
achieves tighter upper bounds. Empirically, extensive experiments are conducted
on the MNIST, CIFAR10, and CIFAR100 datasets, demonstrating the superior
performance of our algorithm compared to state-of-the-art optimizers.
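Below is a minimal sketch of the two ingredients the abstract describes: a local SAM-style step with momentum, and Push-Sum mixing over an asymmetric (column-stochastic) topology. It is an illustration under stated assumptions, not the paper's implementation; the function names, the toy quadratic losses, the mixing matrix, and all hyperparameters are hypothetical.

```python
import numpy as np

def sam_momentum_step(x, grad_fn, m, rho=0.05, lr=0.1, beta=0.9):
    """One SAM-style local step: perturb the model in the ascent direction,
    take the gradient at the perturbed point (seeking a flat region), and
    fold it into a local momentum buffer to accelerate optimization."""
    g = grad_fn(x)
    eps = rho * g / (np.linalg.norm(g) + 1e-12)   # SAM gradient perturbation
    g_sam = grad_fn(x + eps)                      # gradient at perturbed point
    m = beta * m + g_sam                          # local momentum
    return x - lr * m, m

def push_sum_mix(states, weights, A):
    """Push-Sum mixing with a column-stochastic matrix A (asymmetric topology
    allowed): mix model states and Push-Sum weights, then de-bias each node's
    estimate as z_i / w_i."""
    states = A @ states
    weights = A @ weights
    return states, weights, states / weights[:, None]

# Toy run: 4 nodes, each with a quadratic loss centered at a different point.
rng = np.random.default_rng(0)
targets = rng.normal(size=(4, 2))
grads = [lambda x, t=t: x - t for t in targets]   # grad of 0.5*||x - t||^2

A = np.array([[0.5, 0.0, 0.0, 0.5],               # column-stochastic,
              [0.5, 0.5, 0.0, 0.0],               # asymmetric ring-like
              [0.0, 0.5, 0.5, 0.0],               # topology
              [0.0, 0.0, 0.5, 0.5]])

x = np.zeros((4, 2)); m = np.zeros((4, 2)); w = np.ones(4)
for _ in range(200):
    for i in range(4):
        x[i], m[i] = sam_momentum_step(x[i], grads[i], m[i])
    x, w, est = push_sum_mix(x, w, A)

print(est.round(3))   # all rows approach the minimizer of the average loss
```

In this toy setting the de-biased estimates converge to the minimizer of the average of the local losses, illustrating how Push-Sum achieves consensus without requiring a symmetric (doubly-stochastic) topology.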
Related papers
- Fast Decentralized Gradient Tracking for Federated Minimax Optimization with Local Updates [5.269633789700637]
K-GT-Minimax's ability to handle data heterogeneity underscores its significance in advancing federated learning research and applications.
arXiv Detail & Related papers (2024-05-07T17:25:56Z) - Semi-Federated Learning: Convergence Analysis and Optimization of A
Hybrid Learning Framework [70.83511997272457]
We propose a semi-federated learning (SemiFL) paradigm to leverage both the base station (BS) and devices for a hybrid implementation of centralized learning (CL) and FL.
We propose a two-stage algorithm to solve this intractable problem, in which we provide closed-form solutions for the beamformers.
arXiv Detail & Related papers (2023-10-04T03:32:39Z) - Federated Conditional Stochastic Optimization [110.513884892319]
Conditional stochastic optimization has found applications in a wide range of machine learning tasks, such as invariant learning, AUPRC maximization, and MAML.
This paper proposes algorithms for federated conditional stochastic optimization.
arXiv Detail & Related papers (2023-10-04T01:47:37Z) - DFedADMM: Dual Constraints Controlled Model Inconsistency for
Decentralized Federated Learning [52.83811558753284]
Decentralized Federated Learning (DFL) discards the central server and establishes a decentralized communication network.
Existing DFL methods still suffer from two major challenges: local inconsistency and local overfitting.
arXiv Detail & Related papers (2023-08-16T11:22:36Z) - Learner Referral for Cost-Effective Federated Learning Over Hierarchical
IoT Networks [21.76836812021954]
This paper jointly addresses learner-referral-aided federated client selection (LRef-FedCS), communication resource scheduling, and local model accuracy optimization (LMAO).
Our proposed LRef-FedCS approach could achieve a good balance between high global accuracy and reducing cost.
arXiv Detail & Related papers (2023-07-19T13:33:43Z) - Dynamic Regularized Sharpness Aware Minimization in Federated Learning: Approaching Global Consistency and Smooth Landscape [59.841889495864386]
In federated learning (FL), a cluster of local clients is trained under the coordination of a global server.
Clients are prone to overfit to their own optima, which can deviate significantly from the global objective.
FedSMOO adopts a dynamic regularizer to guarantee that the local optima move towards the global objective.
Our theoretical analysis indicates that FedSMOO achieves a fast $\mathcal{O}(1/T)$ convergence rate with a low generalization bound.
arXiv Detail & Related papers (2023-05-19T10:47:44Z) - Escaping Saddle Points with Bias-Variance Reduced Local Perturbed SGD
for Communication Efficient Nonconvex Distributed Learning [58.79085525115987]
Local methods are one of the promising approaches to reduce communication time.
We show that the communication complexity is better than that of non-local methods when the heterogeneity of the local datasets is smaller than the smoothness of the local loss.
arXiv Detail & Related papers (2022-02-12T15:12:17Z) - DESTRESS: Computation-Optimal and Communication-Efficient Decentralized
Nonconvex Finite-Sum Optimization [43.31016937305845]
Internet-of-things, networked sensing, autonomous systems and federated learning call for decentralized algorithms for finite-sum optimizations.
We develop a DEcentralized STochastic REcurSive gradient method (DESTRESS) for nonconvex finite-sum optimization.
Detailed theoretical and numerical comparisons show that DESTRESS improves upon prior decentralized algorithms.
arXiv Detail & Related papers (2021-10-04T03:17:41Z) - FedPD: A Federated Learning Framework with Optimal Rates and Adaptivity
to Non-IID Data [59.50904660420082]
Federated Learning (FL) has become a popular paradigm for learning from distributed data.
To effectively utilize data at different devices without moving them to the cloud, algorithms such as the Federated Averaging (FedAvg) have adopted a "computation then aggregation" (CTA) model.
arXiv Detail & Related papers (2020-05-22T23:07:42Z)
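For contrast with the decentralized schemes above, here is a minimal sketch of the "computation then aggregation" (CTA) pattern that FedAvg follows: clients compute locally starting from the global model, and a server averages the results. The quadratic per-client losses, client count, and hyperparameters are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def fedavg_round(global_x, client_targets, local_steps=5, lr=0.1):
    """One FedAvg round: each client runs local SGD from the global model
    (computation), then the server averages the client models (aggregation)."""
    updates = []
    for t in client_targets:
        x = global_x.copy()
        for _ in range(local_steps):
            x -= lr * (x - t)        # local SGD on 0.5*||x - t||^2
        updates.append(x)
    return np.mean(updates, axis=0)  # server-side averaging

rng = np.random.default_rng(1)
targets = rng.normal(size=(8, 2))    # 8 clients with non-identical optima
x = np.zeros(2)
for _ in range(50):
    x = fedavg_round(x, targets)
print(x.round(3), targets.mean(axis=0).round(3))  # iterate vs. average optimum
```

In this toy quadratic setting the FedAvg iterate converges to the minimizer of the average client loss, which is what the CTA model aims for without moving raw data to the cloud.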