FedGES: A Federated Learning Approach for BN Structure Learning
- URL: http://arxiv.org/abs/2502.01538v1
- Date: Mon, 03 Feb 2025 17:16:02 GMT
- Title: FedGES: A Federated Learning Approach for BN Structure Learning
- Authors: Pablo Torrijos, José A. Gámez, José M. Puerta
- Abstract summary: This research introduces Federated GES (FedGES), a novel Federated Learning approach tailored for BN structure learning in decentralized settings.
FedGES uniquely addresses privacy and security challenges by exchanging only evolving network structures, not parameters or data.
- Abstract: Bayesian Network (BN) structure learning traditionally centralizes data, raising privacy concerns when data is distributed across multiple entities. This research introduces Federated GES (FedGES), a novel Federated Learning approach tailored for BN structure learning in decentralized settings using the Greedy Equivalence Search (GES) algorithm. FedGES uniquely addresses privacy and security challenges by exchanging only evolving network structures, not parameters or data. It enables collaborative model development, using structural fusion to combine, in successive iterations, the limited models generated by each client. A controlled structural fusion is also proposed to enhance client consensus when adding any edge. Experimental results on various BNs from bnlearn's BN Repository validate the effectiveness of FedGES, particularly in high-dimensional (a large number of variables) and sparse data scenarios, offering a practical and privacy-preserving solution for real-world BN structure learning.
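To make the round structure described in the abstract concrete, below is a minimal Python sketch of a FedGES-style loop: each client refines the current structure on its private data and returns only a network structure, and the server fuses the returned structures. The helper names (local_structure_search, fuse_structures) are hypothetical, the local learner is a toy placeholder standing in for GES, and the vote-threshold fusion rule is an assumed simplification of the paper's controlled structural fusion, not its exact procedure.

```python
# Illustrative sketch only: the local learner is a stub standing in for GES,
# and the fusion rule is a simple edge-consensus threshold. Both are
# assumptions for illustration, not the exact procedure from the paper.
from typing import FrozenSet, List, Tuple

Edge = Tuple[str, str]          # directed edge (parent, child)
Structure = FrozenSet[Edge]     # a BN structure represented as a set of edges


def local_structure_search(client_id: int, seed_structure: Structure) -> Structure:
    """Placeholder for the client-side GES run on private local data.

    In FedGES each client would refine `seed_structure` with GES on its own
    data and return only the learned structure (never data or parameters).
    Here we simply return a toy structure per client for demonstration.
    """
    toy = [
        frozenset({("A", "B"), ("B", "C")}),
        frozenset({("A", "B"), ("A", "C")}),
        frozenset({("A", "B"), ("C", "B")}),
    ]
    return seed_structure | toy[client_id % len(toy)]


def fuse_structures(structures: List[Structure], min_votes: int) -> Structure:
    """Assumed form of controlled structural fusion: keep an edge only if at
    least `min_votes` clients propose it, approximating client consensus."""
    votes = {}
    for s in structures:
        for edge in s:
            votes[edge] = votes.get(edge, 0) + 1
    return frozenset(e for e, v in votes.items() if v >= min_votes)


def federated_rounds(n_clients: int, n_rounds: int, min_votes: int) -> Structure:
    """Server loop: broadcast the fused structure, collect client structures,
    fuse them, and repeat. Only structures are exchanged between rounds."""
    fused: Structure = frozenset()
    for _ in range(n_rounds):
        client_structures = [
            local_structure_search(c, fused) for c in range(n_clients)
        ]
        fused = fuse_structures(client_structures, min_votes)
    return fused


if __name__ == "__main__":
    # With three toy clients and a consensus threshold of 2, only the edge
    # proposed by a majority of clients survives fusion.
    print(sorted(federated_rounds(n_clients=3, n_rounds=2, min_votes=2)))
```

The key design point this sketch mirrors is that the quantities crossing the client/server boundary are structures (edge sets), not raw data or model parameters, and that the fusion step mediates which edges enter the shared model.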
Related papers
- Optimizing Federated Graph Learning with Inherent Structural Knowledge and Dual-Densely Connected GNNs [6.185201353691423]
Federated Graph Learning (FGL) enables clients to collaboratively train powerful Graph Neural Networks (GNNs) in a distributed manner without exposing their private data.
Existing methods either overlook the inherent structural knowledge in graph data or capture it at the cost of significantly increased resource demands.
We propose FedDense, a novel FGL framework that optimizes the utilization efficiency of inherent structural knowledge.
arXiv Detail & Related papers (2024-08-21T14:37:50Z) - Decoupled Subgraph Federated Learning [57.588938805581044]
We address the challenge of federated learning on graph-structured data distributed across multiple clients.
We present a novel framework for this scenario, named FedStruct, that harnesses deep structural dependencies.
We validate the effectiveness of FedStruct through experimental results conducted on six datasets for semi-supervised node classification.
arXiv Detail & Related papers (2024-02-29T13:47:23Z) - Federated Learning Empowered by Generative Content [55.576885852501775]
Federated learning (FL) enables leveraging distributed private data for model training in a privacy-preserving way.
We propose a novel FL framework termed FedGC, designed to mitigate data heterogeneity issues by diversifying private data with generative content.
We conduct a systematic empirical study on FedGC, covering diverse baselines, datasets, scenarios, and modalities.
arXiv Detail & Related papers (2023-12-10T07:38:56Z) - Factor-Assisted Federated Learning for Personalized Optimization with Heterogeneous Data [6.024145412139383]
Federated learning is an emerging distributed machine learning framework aimed at protecting data privacy.
Data in different clients contain both common knowledge and personalized knowledge.
We develop a novel personalized federated learning framework for heterogeneous data, which we refer to as FedSplit.
arXiv Detail & Related papers (2023-12-07T13:05:47Z) - Personalizing Federated Learning with Over-the-Air Computations [84.8089761800994]
Federated edge learning is a promising technology to deploy intelligence at the edge of wireless networks in a privacy-preserving manner.
Under such a setting, multiple clients collaboratively train a global generic model under the coordination of an edge server.
This paper presents a distributed training paradigm that employs analog over-the-air computation to address the communication bottleneck.
arXiv Detail & Related papers (2023-02-24T08:41:19Z) - Federated Learning on Non-IID Graphs via Structural Knowledge Sharing [47.140441784462794]
Federated graph learning (FGL) enables clients to train strong GNN models in a distributed manner without sharing their private data.
We propose FedStar, an FGL framework that extracts and shares the common underlying structure information for inter-graph learning tasks.
We perform extensive experiments over both cross-dataset and cross-domain non-IID FGL settings, demonstrating FedStar's superiority.
arXiv Detail & Related papers (2022-11-23T15:12:16Z) - Towards Privacy-Aware Causal Structure Learning in Federated Setting [27.5652887311069]
We study a privacy-aware causal structure learning problem in the federated setting.
We propose a novel Federated PC (FedPC) algorithm with two new strategies for preserving data privacy without centralizing data.
arXiv Detail & Related papers (2022-11-13T14:54:42Z) - Personalization Improves Privacy-Accuracy Tradeoffs in Federated Optimization [57.98426940386627]
We show that coordinating local learning with private centralized learning yields a generically useful and improved tradeoff between accuracy and privacy.
We illustrate our theoretical results with experiments on synthetic and real-world datasets.
arXiv Detail & Related papers (2022-02-10T20:44:44Z) - Towards Federated Bayesian Network Structure Learning with Continuous Optimization [14.779035801521717]
We present a cross-silo federated learning approach to estimate the structure of a Bayesian network.
We develop a distributed structure learning method based on continuous optimization.
arXiv Detail & Related papers (2021-10-18T14:36:05Z) - Federated Knowledge Graphs Embedding [50.35484170815679]
We propose a novel decentralized, scalable learning framework, Federated Knowledge Graphs Embedding (FKGE).
FKGE exploits adversarial generation between pairs of knowledge graphs to translate identical entities and relations of different domains into near embedding spaces.
In order to protect the privacy of the training data, FKGE further implements a privacy-preserving neural network structure to guarantee no raw data leakage.
arXiv Detail & Related papers (2021-05-17T05:30:41Z)
This list is automatically generated from the titles and abstracts of the papers on this site.