Mining Latent Relationships among Clients: Peer-to-peer Federated
Learning with Adaptive Neighbor Matching
- URL: http://arxiv.org/abs/2203.12285v1
- Date: Wed, 23 Mar 2022 09:10:14 GMT
- Title: Mining Latent Relationships among Clients: Peer-to-peer Federated
Learning with Adaptive Neighbor Matching
- Authors: Zexi Li, Jiaxun Lu, Shuang Luo, Didi Zhu, Yunfeng Shao, Yinchuan Li,
Zhimeng Zhang, Chao Wu
- Abstract summary: In federated learning (FL), clients may have diverse objectives; merging all clients' knowledge into one global model can cause negative transfer to local performance.
We take advantage of peer-to-peer (P2P) FL, where clients communicate with neighbors without a central server.
We propose an algorithm that enables clients to form an effective communication topology in a decentralized manner without assuming the number of clusters.
- Score: 6.959557494221414
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In federated learning (FL), clients may have diverse objectives; merging all
clients' knowledge into one global model can cause negative transfer to local
performance. Clustered FL is therefore proposed to group similar clients into
clusters and maintain several global models. Nevertheless, current clustered FL
algorithms require the number of clusters to be assumed in advance, and they are
not effective at exploring the latent relationships among clients. Instead, we
take advantage of peer-to-peer (P2P) FL, where clients communicate with
neighbors without a central server, and propose an algorithm that enables
clients to form an effective communication topology in a decentralized manner
without assuming the number of clusters. Additionally, the P2P setting relieves
the concerns raised by the central server in centralized FL, such as
reliability and communication-bandwidth problems. In our method, 1) we present
two novel metrics for measuring client similarity that are applicable under P2P
protocols; 2) we devise a two-stage algorithm: in the first stage, an efficient
method enables clients to match same-cluster neighbors with high confidence;
3) in the second stage, a heuristic method based on
Expectation Maximization under a Gaussian Mixture Model assumption on the
similarities lets clients discover more neighbors with similar
objectives. We provide a theoretical analysis of how our method is superior to
its P2P FL counterpart, and extensive experiments show that it outperforms
all P2P FL baselines and has comparable or even superior performance to
centralized clustered FL. Moreover, results show that our method is highly
effective at mining latent cluster relationships under various degrees of
heterogeneity without assuming the number of clusters, and it remains effective
even under low communication budgets.
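The second stage described above can be illustrated with a minimal sketch. All names and numbers below are illustrative assumptions, not the authors' implementation: a client computes similarity scores to its peers (e.g. cosine similarity between flattened model updates, one plausible choice of metric), fits a two-component 1-D Gaussian Mixture Model to those scores with EM, and keeps the peers assigned to the higher-mean component as likely same-cluster neighbors.

```python
import numpy as np

def cosine_similarity(u, v):
    """Cosine similarity between two flattened model-update vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

def select_neighbors_gmm(sims, n_iter=100):
    """Fit a 2-component 1-D GMM to similarity scores via EM and return
    the indices of peers assigned to the higher-mean component."""
    x = np.asarray(sims, dtype=float)
    # Initialize the two means at the extremes; shared variance, equal weights.
    mu = np.array([x.min(), x.max()])
    var = np.full(2, x.var() + 1e-6)
    pi = np.array([0.5, 0.5])
    for _ in range(n_iter):
        # E-step: responsibility of each component for each score.
        dens = pi * np.exp(-(x[:, None] - mu) ** 2 / (2 * var)) \
               / np.sqrt(2 * np.pi * var)
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and variances.
        nk = resp.sum(axis=0)
        pi = nk / len(x)
        mu = (resp * x[:, None]).sum(axis=0) / nk
        var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk + 1e-6
    same_cluster = int(np.argmax(mu))  # higher-mean component = similar peers
    return [i for i in range(len(x)) if resp[i, same_cluster] > 0.5]
```

For instance, with scores `[0.9, 0.88, 0.85, 0.1, 0.15, 0.92]` the sketch keeps the three high-similarity peers and the last one, discarding the two low-similarity peers, without any preset number of clusters.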
Related papers
- Smart Sampling: Helping from Friendly Neighbors for Decentralized Federated Learning [10.917048408073846]
We introduce AFIND+, a simple yet efficient algorithm for sampling and aggregating neighbors in Decentralized FL (DFL).
AFIND+ identifies helpful neighbors, adaptively adjusts the number of selected neighbors, and strategically aggregates the sampled neighbors' models.
Numerical results on real-world datasets demonstrate that AFIND+ outperforms other sampling algorithms in DFL.
arXiv Detail & Related papers (2024-07-05T12:10:54Z)
- Efficient Model Compression for Hierarchical Federated Learning [10.37403547348343]
Federated learning (FL) has garnered significant attention due to its capacity to preserve privacy within distributed learning systems.
This paper introduces a novel hierarchical FL framework that integrates the benefits of clustered FL and model compression.
arXiv Detail & Related papers (2024-05-27T12:17:47Z)
- Federated cINN Clustering for Accurate Clustered Federated Learning [33.72494731516968]
Federated Learning (FL) presents an innovative approach to privacy-preserving distributed machine learning.
We propose the Federated cINN Clustering Algorithm (FCCA) to robustly cluster clients into different groups.
arXiv Detail & Related papers (2023-09-04T10:47:52Z)
- Towards Instance-adaptive Inference for Federated Learning [80.38701896056828]
Federated learning (FL) is a distributed learning paradigm that enables multiple clients to learn a powerful global model by aggregating local training.
In this paper, we present a novel FL algorithm, i.e., FedIns, to handle intra-client data heterogeneity by enabling instance-adaptive inference in the FL framework.
Our experiments show that our FedIns outperforms state-of-the-art FL algorithms, e.g., a 6.64% improvement against the top-performing method with less than 15% communication cost on Tiny-ImageNet.
arXiv Detail & Related papers (2023-08-11T09:58:47Z)
- Efficient Distribution Similarity Identification in Clustered Federated Learning via Principal Angles Between Client Data Subspaces [59.33965805898736]
Clustered learning has been shown to produce promising results by grouping clients into clusters.
Existing FL algorithms essentially try to group together clients with similar distributions.
Prior FL algorithms assess these similarities only indirectly during training.
arXiv Detail & Related papers (2022-09-21T17:37:54Z)
- On the Convergence of Clustered Federated Learning [57.934295064030636]
In a federated learning system, the clients, e.g. mobile devices and organization participants, usually have different personal preferences or behavior patterns.
This paper proposes a novel weighted client-based clustered FL algorithm that leverages both client groups and individual clients in a unified optimization framework.
arXiv Detail & Related papers (2022-02-13T02:39:19Z)
- FedChain: Chained Algorithms for Near-Optimal Communication Cost in Federated Learning [24.812767482563878]
Federated learning (FL) aims to minimize the communication complexity of training a model over heterogeneous data distributed across many clients.
We propose FedChain, an algorithmic framework that combines the strengths of local methods and global methods to achieve fast convergence in terms of R.
arXiv Detail & Related papers (2021-08-16T02:57:06Z)
- Low-Latency Federated Learning over Wireless Channels with Differential Privacy [142.5983499872664]
In federated learning (FL), model training is distributed over clients and local models are aggregated by a central server.
In this paper, we aim to minimize FL training delay over wireless channels, constrained by overall training performance as well as each client's differential privacy (DP) requirement.
arXiv Detail & Related papers (2021-06-20T13:51:18Z)
- Clustered Sampling: Low-Variance and Improved Representativity for Clients Selection in Federated Learning [4.530678016396477]
This work addresses the problem of optimizing communications between server and clients in federated learning (FL).
Current sampling approaches in FL are either biased or non-optimal in terms of server-client communication and training stability.
We prove that clustered sampling leads to better client representativity and to reduced variance of the client aggregation weights in FL.
arXiv Detail & Related papers (2021-05-12T18:19:20Z)
- Blockchain Assisted Decentralized Federated Learning (BLADE-FL): Performance Analysis and Resource Allocation [119.19061102064497]
We propose a decentralized FL framework by integrating blockchain into FL, namely, blockchain assisted decentralized federated learning (BLADE-FL).
In a round of the proposed BLADE-FL, each client broadcasts its trained model to other clients, competes to generate a block based on the received models, and then aggregates the models from the generated block before its local training of the next round.
We explore the impact of lazy clients on the learning performance of BLADE-FL, and characterize the relationship among the optimal K, the learning parameters, and the proportion of lazy clients.
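As an illustration only (not the BLADE-FL implementation, which involves block generation and mining), the aggregation step of the round described above, where each client averages the models recorded in the winning block before its next round of local training, could be sketched as:

```python
import numpy as np

def aggregate_block(block_models):
    """Illustrative sketch: a client receives the models recorded in the
    generated block and averages them parameter-wise to obtain the model
    it will refine in its next local-training round."""
    stacked = np.stack(list(block_models.values()))  # one row per client model
    return stacked.mean(axis=0)                      # simple average aggregation
```

A real decentralized round would add the broadcast and block-competition steps and typically a weighting scheme; simple averaging is used here purely to keep the sketch self-contained.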
arXiv Detail & Related papers (2021-01-18T07:19:08Z)
- Faster Non-Convex Federated Learning via Global and Local Momentum [57.52663209739171]
FedGLOMO is the first (first-order) FL algorithm of its kind.
Our algorithm is provably optimal even with communication between the clients and the server.
arXiv Detail & Related papers (2020-12-07T21:05:31Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.