Federated Causal Discovery
- URL: http://arxiv.org/abs/2112.03555v1
- Date: Tue, 7 Dec 2021 08:04:12 GMT
- Title: Federated Causal Discovery
- Authors: Erdun Gao and Junjia Chen and Li Shen and Tongliang Liu and Mingming
Gong and Howard Bondell
- Abstract summary: This paper develops a gradient-based learning framework named DAG-Shared Federated Causal Discovery (DS-FCD).
It can learn the causal graph without directly touching local data and naturally handles data heterogeneity.
Extensive experiments on both synthetic and real-world datasets verify the efficacy of the proposed method.
- Score: 74.37739054932733
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Causal discovery aims to learn a causal graph from observational data. To
date, most causal discovery methods require data to be stored on a central
server. However, data owners increasingly decline to share their personal data
to avoid privacy leakage, which complicates the task by cutting off its first
step. A puzzle arises: $\textit{how do we infer causal relations from
decentralized data?}$ In this paper, under the additive noise model assumption,
we take the first step in developing a gradient-based learning framework named
DAG-Shared Federated Causal Discovery (DS-FCD), which can learn the causal
graph without directly touching local data and naturally handle data
heterogeneity. DS-FCD benefits from a two-level structure in each local model.
The first level learns the causal graph and communicates with the server to
obtain model information from other clients, while the second level
approximates the causal mechanisms and is updated locally on each client's own
data to accommodate the data heterogeneity. Moreover, DS-FCD formulates the
overall learning task as a continuous optimization problem by taking advantage
of an equality acyclicity constraint, which can be naturally solved by
gradient descent methods. Extensive experiments on both synthetic and
real-world datasets verify the efficacy of the proposed method.
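In gradient-based structure-learning methods of this family, the equality acyclicity constraint is typically the NOTEARS function $h(W) = \mathrm{tr}(e^{W \circ W}) - d$, which vanishes exactly when the weighted adjacency matrix $W$ encodes a DAG. A minimal sketch of that constraint, plus a FedAvg-style average of the clients' shared DAG parameters; both are illustrative assumptions about the approach, not the paper's exact algorithm:

```python
import numpy as np
from scipy.linalg import expm


def acyclicity(W):
    """NOTEARS-style equality constraint h(W) = tr(e^{W*W}) - d.

    h(W) == 0 iff the weighted adjacency matrix W encodes a DAG;
    h(W) > 0 whenever W contains a directed cycle.
    """
    d = W.shape[0]
    return np.trace(expm(W * W)) - d  # W * W is the elementwise square


def aggregate_shared_dag(client_Ws):
    """Hypothetical server step: average the DAG-shared adjacency
    matrices uploaded by the clients (FedAvg-style), while each
    client's causal-mechanism parameters stay local."""
    return np.mean(client_Ws, axis=0)
```

With this formulation, each client can penalize `acyclicity(W)` by ordinary gradient descent, and only the graph-level parameters `W` ever leave the client, which matches the privacy motivation described in the abstract.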
Related papers
- TCGU: Data-centric Graph Unlearning based on Transferable Condensation [36.670771080732486]
Transferable Condensation Graph Unlearning (TCGU) is a data-centric solution to zero-glance graph unlearning.
We show that TCGU can achieve superior performance in terms of model utility, unlearning efficiency, and unlearning efficacy than existing GU methods.
arXiv Detail & Related papers (2024-10-09T02:14:40Z)
- Federated Causal Discovery from Heterogeneous Data [70.31070224690399]
We propose a novel FCD method attempting to accommodate arbitrary causal models and heterogeneous data.
These approaches involve constructing summary statistics as a proxy of the raw data to protect data privacy.
We conduct extensive experiments on synthetic and real datasets to show the efficacy of our method.
arXiv Detail & Related papers (2024-02-20T18:53:53Z)
- Fake It Till Make It: Federated Learning with Consensus-Oriented Generation [52.82176415223988]
We propose federated learning with consensus-oriented generation (FedCOG)
FedCOG consists of two key components at the client side: complementary data generation and knowledge-distillation-based model training.
Experiments on classical and real-world FL datasets show that FedCOG consistently outperforms state-of-the-art methods.
arXiv Detail & Related papers (2023-12-10T18:49:59Z)
- Federated Causality Learning with Explainable Adaptive Optimization [25.910766140488395]
We propose a federated causal discovery strategy (FedCausal) to learn the unified global causal graph from decentralized heterogeneous data.
We show that FedCausal can effectively deal with non-independently and identically distributed (non-iid) data.
arXiv Detail & Related papers (2023-12-09T11:18:20Z)
- Towards Practical Federated Causal Structure Learning [9.74796970978203]
FedC2SL is a constraint-based causal structure learning scheme that learns causal graphs using a conditional independence test.
The study evaluates FedC2SL using both synthetic datasets and real-world data against existing solutions.
arXiv Detail & Related papers (2023-06-15T18:23:58Z)
- Discovering Dynamic Causal Space for DAG Structure Learning [64.763763417533]
We propose a dynamic causal space for DAG structure learning, coined CASPER.
It integrates the graph structure into the score function as a new measure in the causal space to faithfully reflect the causal distance between estimated and ground truth DAG.
arXiv Detail & Related papers (2023-06-05T12:20:40Z)
- Towards Understanding and Mitigating Dimensional Collapse in Heterogeneous Federated Learning [112.69497636932955]
Federated learning aims to train models across different clients without the sharing of data for privacy considerations.
We study how data heterogeneity affects the representations of the globally aggregated models.
We propose FedDecorr, a novel method that can effectively mitigate dimensional collapse in federated learning.
arXiv Detail & Related papers (2022-10-01T09:04:17Z)
- Rethinking Data Heterogeneity in Federated Learning: Introducing a New Notion and Standard Benchmarks [65.34113135080105]
We show that data heterogeneity in current setups is not necessarily a problem and can in fact be beneficial for the FL participants.
Our observations are intuitive.
Our code is available at https://github.com/MMorafah/FL-SC-NIID.
arXiv Detail & Related papers (2022-09-30T17:15:19Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences.