FedCompetitors: Harmonious Collaboration in Federated Learning with
Competing Participants
- URL: http://arxiv.org/abs/2312.11391v1
- Date: Mon, 18 Dec 2023 17:53:01 GMT
- Title: FedCompetitors: Harmonious Collaboration in Federated Learning with
Competing Participants
- Authors: Shanli Tan, Hao Cheng, Xiaohu Wu, Han Yu, Tiantian He, Yew-Soon Ong,
Chongjun Wang, and Xiaofeng Tao
- Abstract summary: Federated learning (FL) provides a privacy-preserving approach for collaborative training of machine learning models.
It is crucial to select appropriate collaborators for each FL participant (FL-PT) based on data complementarity.
It is also imperative to consider inter-individual relationships among FL-PTs, since some FL-PTs engage in competition.
- Score: 41.070716405671206
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Federated learning (FL) provides a privacy-preserving approach for
collaborative training of machine learning models. Given the potential data
heterogeneity, it is crucial to select appropriate collaborators for each FL
participant (FL-PT) based on data complementarity. Recent studies have
addressed this challenge. Similarly, it is imperative to consider the
inter-individual relationships among FL-PTs where some FL-PTs engage in
competition. Although FL literature has acknowledged the significance of this
scenario, practical methods for establishing FL ecosystems remain largely
unexplored. In this paper, we extend a principle from the balance theory,
namely "the friend of my enemy is my enemy", to ensure the absence of
conflicting interests within an FL ecosystem. The extended principle and the
resulting problem are formulated via graph theory and integer linear
programming. A polynomial-time algorithm is proposed to determine the
collaborators of each FL-PT. The solution guarantees high scalability, allowing
even competing FL-PTs to smoothly join the ecosystem without conflict of
interest. The proposed framework jointly considers competition and data
heterogeneity. Extensive experiments on real-world and synthetic data
demonstrate its efficacy compared to five alternative approaches, and its
ability to establish efficient collaboration networks among FL-PTs.
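The extended balance-theory principle can be illustrated as a conflict-aware coalition check: no collaboration component may contain two competing FL-PTs. The sketch below is illustrative only, not the paper's actual polynomial-time algorithm or its integer-linear-programming formulation; the union-find structure, the greedy merge order, and the example edge lists are all assumptions made for the demonstration.

```python
class UnionFind:
    """Minimal union-find over FL-PT indices 0..n-1."""
    def __init__(self, n):
        self.parent = list(range(n))

    def find(self, x):
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]  # path halving
            x = self.parent[x]
        return x

    def union(self, a, b):
        ra, rb = self.find(a), self.find(b)
        if ra != rb:
            self.parent[ra] = rb


def form_coalitions(n, benefit_edges, competitor_pairs):
    """Greedily merge FL-PTs along mutually beneficial edges, rejecting any
    merge that would place two competitors in the same collaboration
    component ("the friend of my enemy is my enemy")."""
    uf = UnionFind(n)
    competitors = {frozenset(p) for p in competitor_pairs}
    members = {i: {i} for i in range(n)}  # component root -> member set
    for a, b in benefit_edges:
        ra, rb = uf.find(a), uf.find(b)
        if ra == rb:
            continue
        # Reject the merge if the combined component would contain enemies.
        if any(frozenset((u, v)) in competitors
               for u in members[ra] for v in members[rb]):
            continue
        uf.union(ra, rb)
        root = uf.find(ra)
        merged = members.pop(ra) | members.pop(rb)
        members[root] = merged
    # Group FL-PTs by their final component.
    groups = {}
    for i in range(n):
        groups.setdefault(uf.find(i), set()).add(i)
    return list(groups.values())
```

For example, with four FL-PTs where 0 and 1 compete, candidate edges (0,2), (1,2), (1,3) yield the components {0, 2} and {1, 3}: FL-PT 2 cannot collaborate with both 0 and 1 without creating a conflict of interest.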
Related papers
- Free-Rider and Conflict Aware Collaboration Formation for Cross-Silo Federated Learning [32.35705737668307]
Federated learning (FL) is a machine learning paradigm that allows multiple FL participants to collaborate on training models without sharing private data.
We propose an optimal FL collaboration formation strategy -- FedEgoists -- which ensures that a FL-PT can benefit from FL if and only if it benefits the FL ecosystem.
We theoretically prove that the FL-PT coalitions formed are optimal since no coalitions can collaborate together to improve the utility of any of their members.
arXiv Detail & Related papers (2024-10-25T06:13:26Z)
- Redefining Contributions: Shapley-Driven Federated Learning [3.9539878659683363]
Federated learning (FL) has emerged as a pivotal approach in machine learning.
It is challenging to ensure global model convergence when participants do not contribute equally and/or honestly.
This paper proposes a novel contribution assessment method called ShapFed for fine-grained evaluation of participant contributions in FL.
arXiv Detail & Related papers (2024-06-01T22:40:31Z)
- OCD-FL: A Novel Communication-Efficient Peer Selection-based Decentralized Federated Learning [2.603477777158694]
We propose an opportunistic communication-efficient decentralized federated learning (OCD-FL) scheme.
OCD-FL consists of a systematic FL peer selection for collaboration, aiming to achieve maximum FL knowledge gain while reducing energy consumption.
Experimental results demonstrate the capability of OCD-FL to achieve similar or better performances than the fully collaborative FL, while significantly reducing consumed energy by at least 30% and up to 80%.
arXiv Detail & Related papers (2024-03-06T20:34:08Z)
- Privacy-preserving Federated Primal-dual Learning for Non-convex and Non-smooth Problems with Model Sparsification [51.04894019092156]
Federated learning (FL) has been recognized as a rapidly growing area, where the model is trained over distributed clients under the orchestration of a parameter server (PS).
In this paper, we propose a novel privacy-preserving primal-dual algorithm with model sparsification for non-convex and non-smooth FL problems.
Its unique insightful properties and detailed analyses are also presented.
arXiv Detail & Related papers (2023-10-30T14:15:47Z)
- Deep Equilibrium Models Meet Federated Learning [71.57324258813675]
This study explores the problem of Federated Learning (FL) by utilizing the Deep Equilibrium (DEQ) models instead of conventional deep learning networks.
We claim that incorporating DEQ models into the federated learning framework naturally addresses several open problems in FL.
To the best of our knowledge, this study is the first to establish a connection between DEQ models and federated learning.
arXiv Detail & Related papers (2023-05-29T22:51:40Z)
- Efficient Split-Mix Federated Learning for On-Demand and In-Situ Customization [107.72786199113183]
Federated learning (FL) provides a distributed learning framework for multiple participants to collaborate learning without sharing raw data.
In this paper, we propose a novel Split-Mix FL strategy for heterogeneous participants that, once training is done, provides in-situ customization of model sizes and robustness.
arXiv Detail & Related papers (2022-03-18T04:58:34Z)
- CoFED: Cross-silo Heterogeneous Federated Multi-task Learning via Co-training [11.198612582299813]
Federated Learning (FL) is a machine learning technique that enables participants to train high-quality models collaboratively without exchanging their private data.
We propose a communication-efficient FL scheme, CoFED, based on pseudo-labeling unlabeled data like co-training.
Experimental results show that CoFED achieves better performance with a lower communication cost.
arXiv Detail & Related papers (2022-02-17T11:34:20Z)
- Towards Verifiable Federated Learning [15.758657927386263]
Federated learning (FL) is an emerging paradigm of collaborative machine learning that preserves user privacy while building powerful models.
Due to the nature of open participation by self-interested entities, FL needs to guard against potential misbehaviours by legitimate FL participants.
Verifiable federated learning has become an emerging topic of research that has attracted significant interest from the academia and the industry alike.
arXiv Detail & Related papers (2022-02-15T09:52:25Z)
- Heterogeneous Federated Learning via Grouped Sequential-to-Parallel Training [60.892342868936865]
Federated learning (FL) is a rapidly growing privacy-preserving collaborative machine learning paradigm.
We propose a data heterogeneous-robust FL approach, FedGSP, to address this challenge.
We show that FedGSP improves the accuracy by 3.7% on average compared with seven state-of-the-art approaches.
arXiv Detail & Related papers (2022-01-31T03:15:28Z)
- Local Learning Matters: Rethinking Data Heterogeneity in Federated Learning [61.488646649045215]
Federated learning (FL) is a promising strategy for performing privacy-preserving, distributed learning with a network of clients (i.e., edge devices)
arXiv Detail & Related papers (2021-11-28T19:03:39Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this content (including all information) and is not responsible for any consequences arising from its use.