A Coopetitive-Compatible Data Generation Framework for Cross-silo Federated Learning
- URL: http://arxiv.org/abs/2509.18120v1
- Date: Wed, 10 Sep 2025 13:29:05 GMT
- Title: A Coopetitive-Compatible Data Generation Framework for Cross-silo Federated Learning
- Authors: Thanh Linh Nguyen, Quoc-Viet Pham
- Abstract summary: Cross-silo federated learning (CFL) enables organizations to collaboratively train artificial intelligence (AI) models while preserving data privacy by keeping data local. We propose CoCoGen, a coopetitive-compatible data generation framework, to model, analyze, and optimize collaborative learning under heterogeneous and competitive settings.
- Score: 3.7046358119786498
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Cross-silo federated learning (CFL) enables organizations (e.g., hospitals or banks) to collaboratively train artificial intelligence (AI) models while preserving data privacy by keeping data local. While prior work has primarily addressed statistical heterogeneity across organizations, a critical challenge arises from economic competition, where organizations may act as market rivals, making them hesitant to participate in joint training due to potential utility loss (i.e., reduced net benefit). Furthermore, the combined effects of statistical heterogeneity and inter-organizational competition on organizational behavior and system-wide social welfare remain underexplored. In this paper, we propose CoCoGen, a coopetitive-compatible data generation framework, leveraging generative AI (GenAI) and potential game theory to model, analyze, and optimize collaborative learning under heterogeneous and competitive settings. Specifically, CoCoGen characterizes competition and statistical heterogeneity through learning performance and utility-based formulations and models each training round as a weighted potential game. We then derive GenAI-based data generation strategies that maximize social welfare. Experimental results on the Fashion-MNIST dataset reveal how varying heterogeneity and competition levels affect organizational behavior and demonstrate that CoCoGen consistently outperforms baseline methods.
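The abstract models each training round as a weighted potential game. As a minimal illustrative sketch (not the paper's actual formulation), the toy game below gives each organization a shared concave benefit from the total amount of generated data minus a private generation cost; this form admits an exact potential function, so best-response dynamics converge to a Nash equilibrium. All names, cost values, and the benefit function are assumptions for illustration only.

```python
import math

# Hypothetical setup: two organizations each pick a synthetic-data
# generation level in 0..3. Utility = shared benefit of the total
# minus a private per-unit cost. With this form the game is an exact
# potential game, so best-response dynamics converge.
LEVELS = range(4)        # assumed generation levels
COSTS = [2.0, 3.0]       # assumed per-unit costs per organization

def benefit(total):
    return 10.0 * math.sqrt(total)   # assumed concave shared benefit

def utility(i, profile):
    return benefit(sum(profile)) - COSTS[i] * profile[i]

def potential(profile):
    # Exact potential: Phi(s) = B(sum_i s_i) - sum_i c_i * s_i.
    # Any unilateral deviation changes u_i and Phi by the same amount.
    return benefit(sum(profile)) - sum(c * s for c, s in zip(COSTS, profile))

def best_response_dynamics(profile):
    # Repeatedly let each player switch to its best response until no
    # player can strictly improve; the potential rises monotonically.
    changed = True
    while changed:
        changed = False
        for i in range(len(profile)):
            best = max(LEVELS,
                       key=lambda s: utility(i, profile[:i] + [s] + profile[i + 1:]))
            if utility(i, profile[:i] + [best] + profile[i + 1:]) > utility(i, profile) + 1e-12:
                profile = profile[:i] + [best] + profile[i + 1:]
                changed = True
    return profile

eq = best_response_dynamics([0, 0])
```

The resulting profile is a pure Nash equilibrium: no organization can improve its utility by unilaterally changing its generation level. The paper's setting additionally accounts for statistical heterogeneity and competition terms inside the utilities, which this sketch omits.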
Related papers
- Robust Asymmetric Heterogeneous Federated Learning with Corrupted Clients [60.22876915395139]
This paper studies a challenging robust federated learning task with model-heterogeneous and data-corrupted clients. Data corruption is unavoidable due to factors such as random noise, compression artifacts, or environmental conditions in real-world deployment. We propose a novel Robust Asymmetric Heterogeneous Federated Learning framework to address these issues.
arXiv Detail & Related papers (2025-03-12T09:52:04Z) - Convergence-aware Clustered Federated Graph Learning Framework for Collaborative Inter-company Labor Market Forecasting [38.13767335441753]
Labor market forecasting on talent demand and supply is essential for business management and economic development.
Previous studies ignore the interconnections among the demand-supply sequences of different companies when predicting variations.
We propose a Meta-personalized Convergence-aware Clustered Federated Learning framework to provide accurate and timely collaborative talent demand and supply prediction.
arXiv Detail & Related papers (2024-09-29T04:11:23Z) - DAMe: Personalized Federated Social Event Detection with Dual Aggregation Mechanism [55.45581907514175]
This paper proposes a personalized federated learning framework with a dual aggregation mechanism for social event detection, namely DAMe.
We introduce a global aggregation strategy to provide clients with maximum external knowledge of their preferences.
In addition, we incorporate a global-local event-centric constraint to prevent local overfitting and "client-drift".
arXiv Detail & Related papers (2024-09-01T04:56:41Z) - Federated Knowledge Recycling: Privacy-Preserving Synthetic Data Sharing [5.0243930429558885]
Federated Knowledge Recycling (FedKR) is a cross-silo federated learning approach that uses locally generated synthetic data to facilitate collaboration between institutions.
FedKR combines advanced data generation techniques with a dynamic aggregation process to provide greater security against privacy attacks than existing methods.
arXiv Detail & Related papers (2024-07-30T13:56:26Z) - Generalizable Heterogeneous Federated Cross-Correlation and Instance Similarity Learning [60.058083574671834]
This paper presents FCCL+, a novel federated correlation and similarity learning method with non-target distillation.
For the heterogeneity issue, we leverage irrelevant unlabeled public data for communication.
For catastrophic forgetting in the local updating stage, FCCL+ introduces Federated Non-Target Distillation.
arXiv Detail & Related papers (2023-09-28T09:32:27Z) - Federated Learning-Empowered AI-Generated Content in Wireless Networks [58.48381827268331]
Federated learning (FL) can be leveraged to improve learning efficiency and achieve privacy protection for AIGC.
We present FL-based techniques for empowering AIGC, and aim to enable users to generate diverse, personalized, and high-quality content.
arXiv Detail & Related papers (2023-07-14T04:13:11Z) - Strategic Data Sharing between Competitors [4.8951183832371]
We introduce a general framework for analyzing this data-sharing trade-off.
We study an instantiation of the framework, based on a conventional market model from economic theory.
Our findings indicate a profound impact of market conditions on the data-sharing incentives.
arXiv Detail & Related papers (2023-05-25T13:34:12Z) - FedPNN: One-shot Federated Classification via Evolving Clustering Method and Probabilistic Neural Network hybrid [4.241208172557663]
We propose a two-stage federated learning approach aimed at privacy protection.
In the first stage, a synthetic dataset is generated by employing two different distributions as noise.
In the second stage, the Federated Probabilistic Neural Network (FedPNN) is developed and employed to build a globally shared classification model.
arXiv Detail & Related papers (2023-04-09T03:23:37Z) - Combating Exacerbated Heterogeneity for Robust Models in Federated Learning [91.88122934924435]
The combination of adversarial training and federated learning can lead to undesired robustness deterioration.
We propose a novel framework called Slack Federated Adversarial Training (SFAT).
We verify the rationality and effectiveness of SFAT on various benchmarked and real-world datasets.
arXiv Detail & Related papers (2023-03-01T06:16:15Z) - Personalizing Federated Learning with Over-the-Air Computations [84.8089761800994]
Federated edge learning is a promising technology to deploy intelligence at the edge of wireless networks in a privacy-preserving manner.
Under such a setting, multiple clients collaboratively train a global generic model under the coordination of an edge server.
This paper presents a distributed training paradigm that employs analog over-the-air computation to address the communication bottleneck.
arXiv Detail & Related papers (2023-02-24T08:41:19Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences.