Towards Practical Federated Causal Structure Learning
- URL: http://arxiv.org/abs/2306.09433v2
- Date: Mon, 19 Jun 2023 16:19:49 GMT
- Title: Towards Practical Federated Causal Structure Learning
- Authors: Zhaoyu Wang, Pingchuan Ma, Shuai Wang
- Abstract summary: FedC2SL is a constraint-based causal structure learning scheme that learns causal graphs using a conditional independence test.
The study evaluates FedC2SL using both synthetic datasets and real-world data against existing solutions.
- Score: 9.74796970978203
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Understanding causal relations is vital in scientific discovery. The process
of causal structure learning involves identifying causal graphs from
observational data to understand such relations. Usually, a central server
performs this task, but sharing data with the server poses privacy risks.
Federated learning can solve this problem, but existing solutions for federated
causal structure learning make unrealistic assumptions about data and lack
convergence guarantees. FedC2SL is a federated constraint-based causal
structure learning scheme that learns causal graphs using a federated
conditional independence test, which examines conditional independence between
two variables under a condition set without collecting raw data from clients.
FedC2SL requires weaker and more realistic assumptions about data and offers
stronger resistance to data variability among clients. FedPC and FedFCI are the
two variants of FedC2SL for causal structure learning in causal sufficiency and
causal insufficiency, respectively. The study evaluates FedC2SL using both
synthetic datasets and real-world data against existing solutions and finds it
demonstrates encouraging performance and strong resilience to data
heterogeneity among clients.
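The core primitive described in the abstract, a conditional independence test computed without collecting raw data, can be illustrated with a minimal sketch for discrete data: each client shares only local contingency counts, and the server pools them into a stratified chi-squared test. The function names and the count-based aggregation below are illustrative assumptions for exposition, not FedC2SL's exact protocol.

```python
from collections import Counter
from scipy.stats import chi2

def local_counts(rows, x, y, z):
    """Client-side: aggregate joint counts of (X, Y, Z-assignment).
    Only these counts leave the client -- raw rows stay local."""
    counts = Counter()
    for row in rows:
        counts[(row[x], row[y], tuple(row[j] for j in z))] += 1
    return counts

def federated_ci_test(client_counts, alpha=0.05):
    """Server-side: pool client counts and run a chi-squared test of
    X independent of Y, stratified by each value of the condition set Z.
    Returns (is_independent, p_value)."""
    pooled = Counter()
    for counts in client_counts:
        pooled.update(counts)
    # Group pooled counts by Z-stratum
    strata = {}
    for (xv, yv, zv), n in pooled.items():
        strata.setdefault(zv, Counter())[(xv, yv)] += n
    stat, dof = 0.0, 0
    for table in strata.values():
        xs = sorted({xv for xv, _ in table})
        ys = sorted({yv for _, yv in table})
        total = sum(table.values())
        row = {xv: sum(table[(xv, yv)] for yv in ys) for xv in xs}
        col = {yv: sum(table[(xv, yv)] for xv in xs) for yv in ys}
        for xv in xs:
            for yv in ys:
                expected = row[xv] * col[yv] / total
                if expected > 0:
                    stat += (table[(xv, yv)] - expected) ** 2 / expected
        dof += (len(xs) - 1) * (len(ys) - 1)
    if dof == 0:
        return True, 1.0
    p_value = chi2.sf(stat, dof)
    return p_value > alpha, float(p_value)
```

A constraint-based learner such as FedPC would invoke this test repeatedly to decide which edges to remove from the candidate graph; the privacy benefit is that only aggregate counts, never individual records, cross the client boundary.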
Related papers
- Federated Granger Causality Learning for Interdependent Clients with State Space Representation [0.6499759302108926]
We develop a federated approach to learning Granger causality.
We propose augmenting the client models with the Granger causality information learned by the server.
We also study the convergence of the framework to a centralized oracle model.
arXiv Detail & Related papers (2025-01-23T18:04:21Z)
- One-shot Federated Learning via Synthetic Distiller-Distillate Communication [63.89557765137003]
One-shot Federated Learning (FL) is a powerful technique that facilitates collaborative training of machine learning models in a single round of communication.
We propose FedSD2C, a novel and practical one-shot FL framework designed to address these challenges.
arXiv Detail & Related papers (2024-12-06T17:05:34Z)
- FedDW: Distilling Weights through Consistency Optimization in Heterogeneous Federated Learning [14.477559543490242]
Federated Learning (FL) is an innovative distributed machine learning paradigm that enables neural network training across devices without centralizing data.
Previous research shows that in IID environments, the parameter structure of the model is expected to adhere to certain specific consistency principles.
This paper identifies the consistency between the two and leverages it to regulate training, underpinning our proposed FedDW framework.
Experimental results show FedDW outperforms 10 state-of-the-art FL methods, improving accuracy by an average of 3% in highly heterogeneous settings.
arXiv Detail & Related papers (2024-12-05T12:32:40Z)
- Interventional Causal Structure Discovery over Graphical Models with Convergence and Optimality Guarantees [0.0]
We develop a bilevel optimization (Bloom) framework for causal structure learning.
Bloom not only provides theoretical support for causal structure discovery from both interventional and observational data, but also yields an efficient causal discovery algorithm.
Experiments on both synthetic and real-world datasets show that Bloom markedly surpasses other leading learning algorithms.
arXiv Detail & Related papers (2024-08-09T02:22:50Z)
- On the Federated Learning Framework for Cooperative Perception [28.720571541022245]
Federated learning offers a promising solution by enabling data privacy-preserving collaborative enhancements in perception, decision-making, and planning among connected and autonomous vehicles.
This study introduces a specialized federated learning framework for CP, termed the federated dynamic weighted aggregation (FedDWA) algorithm.
This framework employs dynamic client weighting to direct model convergence and integrates a novel loss function that utilizes Kullback-Leibler divergence (KLD) to counteract detrimental effects of non-independently and identically distributed (Non-IID) and unbalanced data.
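The dynamic-weighting idea in that blurb can be sketched in a few lines: down-weight clients whose label distributions diverge (by KL divergence) from the pooled distribution before averaging their model parameters. This is one plausible KLD-based weighting rule chosen for illustration; the actual FedDWA algorithm and its loss function may differ.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """KL(p || q) for discrete distributions, with smoothing to
    avoid division by zero."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p, q = p / p.sum(), q / q.sum()
    return float(np.sum(p * np.log(p / q)))

def dynamic_weights(client_label_dists):
    """Assign larger aggregation weight to clients whose label
    distribution is closer to the pooled (average) distribution.
    Hypothetical rule, not the published FedDWA weighting."""
    global_dist = np.mean(client_label_dists, axis=0)
    divs = np.array([kl_divergence(d, global_dist) for d in client_label_dists])
    raw = 1.0 / (1.0 + divs)  # less divergence -> larger weight
    return raw / raw.sum()

def aggregate(client_params, weights):
    """FedAvg-style weighted average of client model parameters."""
    return np.average(client_params, axis=0, weights=weights)
```

The intuition matches the blurb: under Non-IID and unbalanced data, naive uniform averaging lets outlier clients drag the global model, so divergence-aware weights steer convergence toward the shared distribution.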
arXiv Detail & Related papers (2024-04-26T04:34:45Z)
- DAGnosis: Localized Identification of Data Inconsistencies using Structures [73.39285449012255]
Identification and appropriate handling of inconsistencies in data at deployment time is crucial to reliably use machine learning models.
We use directed acyclic graphs (DAGs) to encode the training set's feature probability distribution and independencies as a structure.
Our method, called DAGnosis, leverages these structural interactions to bring valuable and insightful data-centric conclusions.
arXiv Detail & Related papers (2024-02-26T11:29:16Z)
- Federated Causal Discovery from Heterogeneous Data [70.31070224690399]
We propose a novel FCD method that accommodates arbitrary causal models and heterogeneous data.
The approach constructs summary statistics as a proxy for the raw data to protect data privacy.
We conduct extensive experiments on synthetic and real datasets to show the efficacy of our method.
arXiv Detail & Related papers (2024-02-20T18:53:53Z)
- Personalized Federated Learning with Attention-based Client Selection [57.71009302168411]
We propose FedACS, a new PFL algorithm with an Attention-based Client Selection mechanism.
FedACS integrates an attention mechanism to enhance collaboration among clients with similar data distributions.
Experiments on CIFAR10 and FMNIST validate FedACS's superiority.
arXiv Detail & Related papers (2023-12-23T03:31:46Z)
- Fake It Till Make It: Federated Learning with Consensus-Oriented Generation [52.82176415223988]
We propose federated learning with consensus-oriented generation (FedCOG).
FedCOG consists of two key components at the client side: complementary data generation and knowledge-distillation-based model training.
Experiments on classical and real-world FL datasets show that FedCOG consistently outperforms state-of-the-art methods.
arXiv Detail & Related papers (2023-12-10T18:49:59Z)
- Federated Causal Discovery [74.37739054932733]
This paper develops a gradient-based learning framework named DAG-Shared Federated Causal Discovery (DS-FCD).
It can learn the causal graph without directly touching local data and naturally handle the data heterogeneity.
Extensive experiments on both synthetic and real-world datasets verify the efficacy of the proposed method.
arXiv Detail & Related papers (2021-12-07T08:04:12Z)
This list is automatically generated from the titles and abstracts of the papers in this site.