FedGraph: an Aggregation Method from Graph Perspective
- URL: http://arxiv.org/abs/2210.02733v1
- Date: Thu, 6 Oct 2022 07:48:50 GMT
- Title: FedGraph: an Aggregation Method from Graph Perspective
- Authors: Zhifang Deng, Xiaohong Huang, Dandan Li, Xueguang Yuan
- Abstract summary: Federated Learning (FL) has become an effective solution to collaboratively train the model while preserving each client's privacy.
FedAvg is a standard aggregation algorithm that uses each client's share of the total dataset size as its aggregation weight.
We propose FedGraph, which can adjust the aggregation weights adaptively according to the training condition of local models.
- Score: 3.1236343261481165
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: With the increasingly strengthened data privacy act and the difficult data
centralization, Federated Learning (FL) has become an effective solution to
collaboratively train the model while preserving each client's privacy. FedAvg
is a standard aggregation algorithm that uses each client's share of the total
dataset size as its aggregation weight. However, it cannot handle
non-independent and identically distributed (non-i.i.d.) data well because of
its fixed aggregation weights and its neglect of data distribution. In this
paper, we propose an aggregation strategy, namely FedGraph, that effectively
handles non-i.i.d. datasets by adjusting the aggregation weights adaptively
according to the training condition of the local models throughout the
training process. FedGraph takes three factors into account, from coarse to
fine: the proportion of each local dataset size, the topology factor of model
graphs, and the model weights. We calculate the gravitational force between
local models by transforming the local models into topology graphs. The
FedGraph can better capture the internal correlation between local models
through a weighted combination of each local dataset's proportion, the topology
structure, and the model weights. The proposed FedGraph has been applied to the
MICCAI Federated Tumor Segmentation Challenge 2021 (FeTS) datasets, and the
validation results show that our method surpasses the previous state-of-the-art
by 2.76 in mean Dice Similarity Score. The source code will be available on
GitHub.
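The contrast between FedAvg's fixed weights and FedGraph's adaptive, three-factor weighting can be sketched as follows. This is a minimal illustration, not the paper's actual formulas: the topology and model-similarity scores, and the mixing coefficients `alpha`/`beta`/`gamma`, are hypothetical placeholders.

```python
# Sketch: fixed FedAvg weights vs. a FedGraph-style adaptive blend of
# three factors (dataset proportion, topology score, model score).
# All scores and coefficients here are illustrative assumptions.

def normalize(xs):
    """Scale a list of non-negative scores so they sum to 1."""
    total = sum(xs)
    return [x / total for x in xs]

def fedavg_weights(dataset_sizes):
    """Standard FedAvg: weight each client by its share of the data."""
    return normalize(dataset_sizes)

def fedgraph_style_weights(dataset_sizes, topology_scores, model_scores,
                           alpha=0.5, beta=0.3, gamma=0.2):
    """Blend three normalized factors, coarse to fine, into one weight
    per client. alpha/beta/gamma are hypothetical mixing coefficients."""
    p = normalize(dataset_sizes)
    t = normalize(topology_scores)
    m = normalize(model_scores)
    return normalize([alpha * pi + beta * ti + gamma * mi
                      for pi, ti, mi in zip(p, t, m)])

def aggregate(client_params, weights):
    """Weighted average of per-client parameter vectors (flat lists)."""
    return [sum(w * params[i] for w, params in zip(weights, client_params))
            for i in range(len(client_params[0]))]

# Example: three clients with unequal data and differing scores.
sizes = [100, 300, 600]
w_avg = fedavg_weights(sizes)  # proportional weights, approx. [0.1, 0.3, 0.6]
w_adaptive = fedgraph_style_weights(sizes, [0.9, 0.5, 0.4], [0.8, 0.6, 0.5])
global_model = aggregate([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]], w_avg)
```

The key design point is that the adaptive weights change every round as the topology and model scores evolve, whereas FedAvg's weights stay fixed for the whole run.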
Related papers
- FedGT: Federated Node Classification with Scalable Graph Transformer [27.50698154862779]
We propose a scalable Federated Graph Transformer (FedGT) in the paper.
FedGT computes clients' similarity based on the aligned global nodes with optimal transport.
arXiv Detail & Related papers (2024-01-26T21:02:36Z)
- Exploiting Label Skews in Federated Learning with Model Concatenation [39.38427550571378]
Federated Learning (FL) has emerged as a promising solution to perform deep learning on different data owners without exchanging raw data.
Among different non-IID types, label skews have been challenging and common in image classification and other tasks.
We propose FedConcat, a simple and effective approach that concatenates these local models as the base of the global model.
arXiv Detail & Related papers (2023-12-11T10:44:52Z)
- FedDisco: Federated Learning with Discrepancy-Aware Collaboration [41.828780724903744]
We propose a novel aggregation method, Federated Learning with Discrepancy-aware Collaboration (FedDisco).
Our FedDisco outperforms several state-of-the-art methods and can be easily incorporated with many existing methods to further enhance the performance.
arXiv Detail & Related papers (2023-05-30T17:20:51Z)
- Distributed Learning over Networks with Graph-Attention-Based Personalization [49.90052709285814]
We propose a graph-based personalized algorithm (GATTA) for distributed deep learning.
In particular, the personalized model in each agent is composed of a global part and a node-specific part.
By treating each agent as one node in a graph and the node-specific parameters as its features, the benefits of the graph attention mechanism can be inherited.
arXiv Detail & Related papers (2023-05-22T13:48:30Z)
- Graph Learning Across Data Silos [12.343382413705394]
We consider the problem of inferring graph topology from smooth graph signals in a novel but practical scenario.
Data are located in distributed clients and prohibited from leaving local clients due to factors such as privacy concerns.
We propose an auto-weighted multiple graph learning model to jointly learn a personalized graph for each local client and a single consensus graph for all clients.
arXiv Detail & Related papers (2023-01-17T02:14:57Z)
- Towards Understanding and Mitigating Dimensional Collapse in Heterogeneous Federated Learning [112.69497636932955]
Federated learning aims to train models across different clients without the sharing of data for privacy considerations.
We study how data heterogeneity affects the representations of the globally aggregated models.
We propose FedDecorr, a novel method that can effectively mitigate dimensional collapse in federated learning.
arXiv Detail & Related papers (2022-10-01T09:04:17Z)
- Model Inversion Attacks against Graph Neural Networks [65.35955643325038]
We study model inversion attacks against Graph Neural Networks (GNNs).
In this paper, we present GraphMI to infer the private training graph data.
Our experimental results show that such defenses are not sufficiently effective and call for more advanced defenses against privacy attacks.
arXiv Detail & Related papers (2022-09-16T09:13:43Z)
- FedEgo: Privacy-preserving Personalized Federated Graph Learning with Ego-graphs [22.649780281947837]
In some practical scenarios, graph data are stored separately in multiple distributed parties, which may not be directly shared due to conflicts of interest.
We propose FedEgo, a federated graph learning framework based on ego-graphs to tackle the challenges above.
arXiv Detail & Related papers (2022-08-29T15:47:36Z)
- Personalized Subgraph Federated Learning [56.52903162729729]
We introduce a new subgraph FL problem, personalized subgraph FL, which focuses on the joint improvement of the interrelated local GNNs.
We propose a novel framework, FEDerated Personalized sUBgraph learning (FED-PUB), to tackle it.
We validate our FED-PUB for its subgraph FL performance on six datasets, considering both non-overlapping and overlapping subgraphs.
arXiv Detail & Related papers (2022-06-21T09:02:53Z)
- Tackling the Local Bias in Federated Graph Learning [48.887310972708036]
In Federated graph learning (FGL), a global graph is distributed across different clients, where each client holds a subgraph.
Existing FGL methods fail to effectively utilize cross-client edges, losing structural information during the training.
We propose a novel FGL framework to make the local models similar to the model trained in a centralized setting.
arXiv Detail & Related papers (2021-10-22T08:22:36Z)
- Training Robust Graph Neural Networks with Topology Adaptive Edge Dropping [116.26579152942162]
Graph neural networks (GNNs) are processing architectures that exploit graph structural information to model representations from network data.
Despite their success, GNNs suffer from sub-optimal generalization performance given limited training data.
This paper proposes Topology Adaptive Edge Dropping to improve generalization performance and learn robust GNN models.
arXiv Detail & Related papers (2021-06-05T13:20:36Z)
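The edge-dropping idea in the entry above can be sketched minimally. Note this simplified version drops edges uniformly at random as a generic structural regularizer; the paper's method is topology-adaptive, and the function name and probabilities here are illustrative assumptions.

```python
# Minimal sketch of edge dropping as structural regularization for GNN
# training. Uniform random dropping is an assumption for illustration;
# the paper's actual criterion adapts to the graph topology.
import random

def drop_edges(edges, drop_prob, rng=None):
    """Return a subsample of the edge list, keeping each edge with
    probability 1 - drop_prob. Applied per training epoch, this gives
    the GNN a different sparsified view of the graph each time."""
    rng = rng or random.Random()
    return [e for e in edges if rng.random() >= drop_prob]

# Example: a small graph, sparsified with a fixed seed for repeatability.
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
kept = drop_edges(edges, drop_prob=0.4, rng=random.Random(0))
```

Resampling the kept edges each epoch is what provides the regularization effect: the model never trains on the exact same topology twice.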
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences.