FedEgo: Privacy-preserving Personalized Federated Graph Learning with
Ego-graphs
- URL: http://arxiv.org/abs/2208.13685v1
- Date: Mon, 29 Aug 2022 15:47:36 GMT
- Title: FedEgo: Privacy-preserving Personalized Federated Graph Learning with
Ego-graphs
- Authors: Taolin Zhang, Chuan Chen, Yaomin Chang, Lin Shu, and Zibin Zheng
- Abstract summary: In some practical scenarios, graph data are stored separately in multiple distributed parties, which may not be directly shared due to conflicts of interest.
We propose FedEgo, a federated graph learning framework based on ego-graphs to tackle the challenges above.
- Score: 22.649780281947837
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: As special information carriers containing both structure and feature
information, graphs are widely used in graph mining, e.g., Graph Neural
Networks (GNNs). However, in some practical scenarios, graph data are stored
separately in multiple distributed parties, which may not be directly shared
due to conflicts of interest. Hence, federated graph neural networks are
proposed to address such data silo problems while preserving the privacy of
each party (or client). Nevertheless, different graph data distributions among
various parties, a phenomenon known as statistical heterogeneity, may degrade
the performance of naive federated learning algorithms like FedAvg. In this
paper, we propose FedEgo, a federated graph learning framework based on
ego-graphs to tackle the challenges above, where each client trains its local
model while also contributing to the training of a global model. FedEgo
applies GraphSAGE over ego-graphs to make full use of the structure information
and utilizes Mixup for privacy concerns. To deal with the statistical
heterogeneity, we integrate personalization into learning and propose an
adaptive mixing coefficient strategy that enables clients to achieve their
optimal personalization. Extensive experimental results and in-depth analysis
demonstrate the effectiveness of FedEgo.
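As a rough illustration of two ingredients named in the abstract (FedAvg-style aggregation and a per-client mixing coefficient for personalization), the sketch below averages client parameters into a global model and then mixes that global model with one client's local parameters. The function names, the convex-combination update, and the toy values are illustrative assumptions, not FedEgo's actual algorithm.

```python
# Hedged sketch: FedAvg aggregation plus a per-client mixing coefficient.
# All names and the update rule are illustrative, not FedEgo's real method.

def fedavg(client_params):
    """Element-wise average of client parameter vectors (FedAvg)."""
    n = len(client_params)
    return [sum(ws) / n for ws in zip(*client_params)]

def mix_parameters(local, global_, lam):
    """Convex combination of local and global parameters with coefficient lam."""
    return [lam * g + (1.0 - lam) * l for l, g in zip(local, global_)]

# Two clients with toy 3-parameter models.
clients = [[1.0, 2.0, 3.0], [3.0, 2.0, 1.0]]
global_model = fedavg(clients)

# A client that benefits little from the global model uses a small lam.
personalized = mix_parameters(clients[0], global_model, lam=0.25)
print(global_model)   # [2.0, 2.0, 2.0]
print(personalized)   # [1.25, 2.0, 2.75]
```

In the paper the mixing coefficient is adapted per client to reach its optimal personalization; here `lam` is simply passed in as a fixed value.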
Related papers
- Toward Personalized Federated Node Classification in One-shot Communication [27.325478113745206]
We propose a one-shot personalized Federated Graph Learning method for node classification.
Our method estimates and aggregates class-wise feature distribution statistics to construct a global pseudo-graph on the server.
Our method significantly outperforms state-of-the-art baselines across various settings.
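The class-wise statistics step summarized above might look roughly like the following: each client computes per-class feature means and counts, and the server merges them with count weighting before building its pseudo-graph. The names and the exact statistics used (means only, count-weighted merge) are illustrative assumptions, not the paper's actual construction.

```python
# Hedged sketch: aggregating class-wise feature means across clients.
# Illustrative only; the paper's statistics and pseudo-graph step may differ.

def class_means(features, labels):
    """Per-class (mean feature vector, count) on one client."""
    stats = {}
    for x, y in zip(features, labels):
        s, n = stats.get(y, ([0.0] * len(x), 0))
        stats[y] = ([a + b for a, b in zip(s, x)], n + 1)
    return {y: ([v / n for v in s], n) for y, (s, n) in stats.items()}

def aggregate(client_stats):
    """Count-weighted merge of per-client class means on the server."""
    merged = {}
    for stats in client_stats:
        for y, (mean, n) in stats.items():
            s, m = merged.get(y, ([0.0] * len(mean), 0))
            merged[y] = ([a + n * b for a, b in zip(s, mean)], m + n)
    return {y: [v / m for v in s] for y, (s, m) in merged.items()}

# Client A has two class-0 nodes, client B has one.
a = class_means([[1.0, 1.0], [3.0, 3.0]], [0, 0])
b = class_means([[5.0, 5.0]], [0])
print(aggregate([a, b]))   # {0: [3.0, 3.0]}
```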
arXiv Detail & Related papers (2024-11-18T05:59:29Z)
- Federated Graph Learning with Graphless Clients [52.5629887481768]
Federated Graph Learning (FGL) trains machine learning models, such as Graph Neural Networks (GNNs), collaboratively across clients without sharing their local graph data.
We propose a novel framework FedGLS to tackle the problem in FGL with graphless clients.
arXiv Detail & Related papers (2024-11-13T06:54:05Z)
- Federated Hypergraph Learning: Hyperedge Completion with Local Differential Privacy [6.295242666794106]
FedHGL is designed to collaboratively train a comprehensive hypergraph neural network across multiple clients.
Cross-client feature aggregation is performed at the central server and distributed back so that this information can be utilized by the clients.
arXiv Detail & Related papers (2024-08-09T16:31:41Z)
- Privacy-preserving design of graph neural networks with applications to vertical federated learning [56.74455367682945]
We present an end-to-end graph representation learning framework called VESPER.
VESPER is capable of training high-performance GNN models over both sparse and dense graphs under reasonable privacy budgets.
arXiv Detail & Related papers (2023-10-31T15:34:59Z)
- Distributed Learning over Networks with Graph-Attention-Based Personalization [49.90052709285814]
We propose a graph-based personalized algorithm (GATTA) for distributed deep learning.
In particular, the personalized model in each agent is composed of a global part and a node-specific part.
By treating each agent as one node in a graph and the node-specific parameters as its features, the benefits of the graph attention mechanism can be inherited.
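The split described above (a shared global part plus a node-specific part, with neighbors weighted by attention) can be sketched as follows. The dot-product-plus-softmax scoring and the fixed mixing weight `alpha` are illustrative assumptions, not GATTA's actual mechanism.

```python
import math

# Hedged sketch: attention-weighted aggregation of neighbors' node-specific
# parameters. Scoring and mixing here are illustrative, not GATTA's method.

def attention_weights(own, neighbors):
    """Softmax over dot-product scores between own and neighbor features."""
    scores = [sum(a * b for a, b in zip(own, nb)) for nb in neighbors]
    mx = max(scores)
    exps = [math.exp(s - mx) for s in scores]
    z = sum(exps)
    return [e / z for e in exps]

def aggregate_node_part(own_params, neighbor_params, weights, alpha=0.5):
    """Mix own node-specific parameters with the attention-weighted
    average of neighbors' node-specific parameters."""
    avg = [sum(w * p[i] for w, p in zip(weights, neighbor_params))
           for i in range(len(own_params))]
    return [alpha * o + (1 - alpha) * a for o, a in zip(own_params, avg)]
```

The global part would be trained with ordinary federated averaging; only the node-specific part goes through this attention step.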
arXiv Detail & Related papers (2023-05-22T13:48:30Z)
- Graph Mixture of Experts: Learning on Large-Scale Graphs with Explicit Diversity Modeling [60.0185734837814]
Graph neural networks (GNNs) have found extensive applications in learning from graph data.
To bolster the generalization capacity of GNNs, it has become customary to enrich training graph structures with techniques like graph augmentation.
This study introduces the concept of Mixture-of-Experts (MoE) to GNNs, with the aim of augmenting their capacity to adapt to a diverse range of training graph structures.
arXiv Detail & Related papers (2023-04-06T01:09:36Z)
- Federated Learning over Coupled Graphs [39.86903030911785]
Federated Learning (FL) has been proposed to solve the data isolation issue, mainly for Euclidean data.
We propose a novel FL framework for graph data, FedCog, to efficiently handle coupled graphs that are a kind of distributed graph data.
arXiv Detail & Related papers (2023-01-26T13:43:26Z)
- Graph Learning Across Data Silos [12.343382413705394]
We consider the problem of inferring graph topology from smooth graph signals in a novel but practical scenario.
Data are located in distributed clients and prohibited from leaving local clients due to factors such as privacy concerns.
We propose an auto-weighted multiple graph learning model to jointly learn a personalized graph for each local client and a single consensus graph for all clients.
arXiv Detail & Related papers (2023-01-17T02:14:57Z)
- Privatized Graph Federated Learning [57.14673504239551]
We introduce graph federated learning, which consists of multiple units connected by a graph.
We show how graph homomorphic perturbations can be used to ensure the algorithm is differentially private.
arXiv Detail & Related papers (2022-03-14T13:48:23Z)
- GraphMI: Extracting Private Graph Data from Graph Neural Networks [59.05178231559796]
We present the Graph Model Inversion attack (GraphMI), which aims to extract private graph data of the training graph by inverting the GNN.
Specifically, we propose a projected gradient module to tackle the discreteness of graph edges while preserving the sparsity and smoothness of graph features.
We design a graph auto-encoder module to efficiently exploit graph topology, node attributes, and target model parameters for edge inference.
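The projected-gradient idea above can be illustrated with a minimal sketch: after a continuous gradient step on a relaxed adjacency matrix, entries are projected back onto [0, 1], and the result is finally rounded to discrete edges. The clip-and-threshold projection and all names are illustrative assumptions, not GraphMI's exact procedure.

```python
# Hedged sketch of projected gradient ascent on a relaxed adjacency matrix.
# Illustrative only; GraphMI's actual projection and objective may differ.

def gradient_step(adj, grad, lr=0.1):
    """One ascent step on the relaxed (continuous) adjacency entries."""
    return [[v + lr * g for v, g in zip(row, grow)]
            for row, grow in zip(adj, grad)]

def project_adjacency(adj):
    """Project relaxed edge weights back onto the feasible set [0, 1]."""
    return [[min(1.0, max(0.0, v)) for v in row] for row in adj]

def discretize(adj, thresh=0.5):
    """Round the relaxed adjacency to discrete 0/1 edges at the end."""
    return [[1 if v >= thresh else 0 for v in row] for row in adj]
```

A full attack would iterate `gradient_step` and `project_adjacency` against the target model's loss, with a single `discretize` call at the end.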
arXiv Detail & Related papers (2021-06-05T07:07:52Z)
- FedGL: Federated Graph Learning Framework with Global Self-Supervision [22.124339267195822]
FedGL is capable of obtaining a high-quality global graph model while protecting data privacy.
The global self-supervision enables the information of each client to flow and share in a privacy-preserving manner.
arXiv Detail & Related papers (2021-05-07T11:27:23Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.