FedEgo: Privacy-preserving Personalized Federated Graph Learning with
Ego-graphs
- URL: http://arxiv.org/abs/2208.13685v1
- Date: Mon, 29 Aug 2022 15:47:36 GMT
- Title: FedEgo: Privacy-preserving Personalized Federated Graph Learning with
Ego-graphs
- Authors: Taolin Zhang, Chuan Chen, Yaomin Chang, Lin Shu, and Zibin Zheng
- Abstract summary: In some practical scenarios, graph data are stored separately in multiple distributed parties, which may not be directly shared due to conflicts of interest.
We propose FedEgo, a federated graph learning framework based on ego-graphs to tackle the challenges above.
- Score: 22.649780281947837
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: As special information carriers containing both structure and feature
information, graphs are widely used in graph mining, e.g., Graph Neural
Networks (GNNs). However, in some practical scenarios, graph data are stored
separately in multiple distributed parties, which may not be directly shared
due to conflicts of interest. Hence, federated graph neural networks are
proposed to address such data silo problems while preserving the privacy of
each party (or client). Nevertheless, different graph data distributions among
various parties, which is known as the statistical heterogeneity, may degrade
the performance of naive federated learning algorithms like FedAvg. In this
paper, we propose FedEgo, a federated graph learning framework based on
ego-graphs to tackle the challenges above, where each client trains its local
model while also contributing to the training of a global model. FedEgo
applies GraphSAGE over ego-graphs to make full use of the structural information
and utilizes Mixup for privacy concerns. To deal with the statistical
heterogeneity, we integrate personalization into learning and propose an
adaptive mixing coefficient strategy that enables clients to achieve their
optimal personalization. Extensive experimental results and in-depth analysis
demonstrate the effectiveness of FedEgo.
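For intuition only, here is a minimal Python sketch of the two ingredients the abstract names: Mixup over pooled ego-graph representations (for privacy) and mixing of local and global parameters (for personalization). The linear parameter blocks, the fixed mixing coefficient, and all function names below are illustrative assumptions; the paper's GraphSAGE training and adaptive coefficient strategy are not reproduced here.

```python
# Illustrative sketch only, not the authors' implementation.
import numpy as np

def mixup_ego_graphs(features, labels, alpha=0.5, seed=0):
    """Mix random pairs of pooled ego-graph representations and their labels,
    so that no single raw ego-graph is shared with the server."""
    rng = np.random.default_rng(seed)
    lam = rng.beta(alpha, alpha)
    perm = rng.permutation(len(features))
    mixed_x = lam * features + (1.0 - lam) * features[perm]
    mixed_y = lam * labels + (1.0 - lam) * labels[perm]
    return mixed_x, mixed_y

def personalize(local_weights, global_weights, mix_coef):
    """Blend local and global parameters; FedEgo adapts mix_coef per client,
    while here it is a constant placeholder."""
    return mix_coef * local_weights + (1.0 - mix_coef) * global_weights

# Toy usage with random tensors standing in for pooled ego-graph embeddings.
x = np.random.rand(8, 16)                     # 8 ego-graphs, 16-dim embeddings
y = np.eye(4)[np.random.randint(0, 4, 8)]     # one-hot labels, 4 classes
mixed_x, mixed_y = mixup_ego_graphs(x, y)
w_local, w_global = np.random.rand(16, 4), np.random.rand(16, 4)
w_personal = personalize(w_local, w_global, mix_coef=0.7)
```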
Related papers
- Privacy-preserving design of graph neural networks with applications to vertical federated learning [56.74455367682945]
We present an end-to-end graph representation learning framework called VESPER.
VESPER is capable of training high-performance GNN models over both sparse and dense graphs under reasonable privacy budgets.
arXiv Detail & Related papers (2023-10-31T15:34:59Z)
- Distributed Learning over Networks with Graph-Attention-Based Personalization [49.90052709285814]
We propose a graph-based personalized algorithm (GATTA) for distributed deep learning.
In particular, the personalized model in each agent is composed of a global part and a node-specific part.
By treating each agent as one node in a graph and the node-specific parameters as its features, the benefits of the graph attention mechanism can be inherited.
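A minimal sketch of the stated decomposition, under the assumption (not taken from the paper) that attention scores are plain dot products between node-specific feature vectors: each agent aggregates its neighbors' global parameter blocks with attention weights and keeps its node-specific block untouched.

```python
# Hedged sketch of attention-weighted aggregation across agents; shapes and
# the scoring function are assumptions, not the GATTA algorithm.
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def attention_aggregate(own_feat, neighbor_feats, neighbor_globals):
    """Weight each neighbor's global parameter block by an attention score
    derived from node-specific features, then combine."""
    scores = np.array([own_feat @ nf for nf in neighbor_feats])
    weights = softmax(scores)
    return sum(w * g for w, g in zip(weights, neighbor_globals))

# Toy usage: 3 neighbor agents, 8-dim node features, 8x2 global parameter blocks.
rng = np.random.default_rng(1)
own = rng.normal(size=8)
neighbor_feats = [rng.normal(size=8) for _ in range(3)]
neighbor_globals = [rng.normal(size=(8, 2)) for _ in range(3)]
aggregated_global = attention_aggregate(own, neighbor_feats, neighbor_globals)
```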
arXiv Detail & Related papers (2023-05-22T13:48:30Z)
- Graph Mixture of Experts: Learning on Large-Scale Graphs with Explicit Diversity Modeling [60.0185734837814]
Graph neural networks (GNNs) have found extensive applications in learning from graph data.
To bolster the generalization capacity of GNNs, it has become common practice to enrich training graph structures with techniques such as graph augmentation.
This study introduces the concept of Mixture-of-Experts (MoE) to GNNs, with the aim of augmenting their capacity to adapt to a diverse range of training graph structures.
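The following is a rough sketch of the mixture-of-experts idea applied to a single message-passing layer: a gating network scores several expert transformations per node and the outputs are combined by the gate weights. The mean-neighbor aggregation, dense gating, and all shapes are assumptions for exposition, not the paper's architecture.

```python
# Rough MoE-on-GNN sketch; not the paper's architecture.
import numpy as np

def moe_gnn_layer(h, adj, expert_ws, gate_w):
    """Aggregate neighbor features, score experts with a gating network,
    and return the gate-weighted mixture of expert outputs per node."""
    deg = adj.sum(axis=1, keepdims=True).clip(min=1)
    agg = adj @ h / deg                           # mean aggregation over neighbors
    gates = np.exp(agg @ gate_w)
    gates /= gates.sum(axis=1, keepdims=True)     # softmax over experts
    expert_outs = np.stack([agg @ w for w in expert_ws], axis=1)  # (n, E, d_out)
    return (gates[..., None] * expert_outs).sum(axis=1)

# Toy usage: 5 nodes, 4-dim features, 3 experts producing 6-dim outputs.
rng = np.random.default_rng(2)
h = rng.normal(size=(5, 4))
adj = (rng.random((5, 5)) > 0.5).astype(float)
experts = [rng.normal(size=(4, 6)) for _ in range(3)]
gate = rng.normal(size=(4, 3))
out = moe_gnn_layer(h, adj, experts, gate)        # shape (5, 6)
```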
arXiv Detail & Related papers (2023-04-06T01:09:36Z)
- Federated Learning over Coupled Graphs [39.86903030911785]
Federated Learning (FL) has been proposed to solve the data isolation issue, mainly for Euclidean data.
We propose FedCog, a novel FL framework for graph data that efficiently handles coupled graphs, a kind of distributed graph data.
arXiv Detail & Related papers (2023-01-26T13:43:26Z)
- Graph Learning Across Data Silos [12.343382413705394]
We consider the problem of inferring graph topology from smooth graph signals in a novel but practical scenario.
Data are located in distributed clients and prohibited from leaving local clients due to factors such as privacy concerns.
We propose an auto-weighted multiple graph learning model to jointly learn a personalized graph for each local client and a single consensus graph for all clients.
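One generic way such an auto-weighted scheme can look (the update rules below are a guess for illustration, not the model proposed in the paper): each client's graph estimate is pulled toward a consensus graph, and client weights are set inversely to how far each estimate is from that consensus.

```python
# Loose illustration of auto-weighted consensus over client graph estimates.
import numpy as np

def consensus_step(client_graphs, eps=1e-8):
    """One round: average the client graphs, derive weights from the residuals,
    and recompute the consensus as the weighted combination."""
    consensus = np.mean(client_graphs, axis=0)
    residuals = np.array([np.linalg.norm(g - consensus) for g in client_graphs])
    weights = 1.0 / (residuals + eps)
    weights /= weights.sum()
    consensus = sum(w * g for w, g in zip(weights, client_graphs))
    return consensus, weights

# Toy usage: 3 clients, each holding a 4-node adjacency estimate.
rng = np.random.default_rng(3)
graphs = [rng.random((4, 4)) for _ in range(3)]
consensus_graph, client_weights = consensus_step(graphs)
```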
arXiv Detail & Related papers (2023-01-17T02:14:57Z)
- Federated Graph Representation Learning using Self-Supervision [18.015793175772835]
Federated graph representation learning (FedGRL) brings the benefits of distributed training to graph structured data while simultaneously addressing some privacy and compliance concerns related to data curation.
We consider a realistic and novel problem setting, wherein cross-silo clients have access to vast amounts of unlabeled data but limited or no labeled data, and additionally have diverse downstream class label domains.
We propose a novel FedGRL formulation in which a shared global model is optimized collaboratively using a self-supervised objective and receives downstream task supervision through local client models.
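A compact, hedged sketch of that split, with placeholder losses that are not the paper's formulation: a shared encoder is updated against a self-supervised objective, while each client keeps a small supervised head for its own downstream labels.

```python
# Placeholder objectives only; the actual FedGRL losses are not shown here.
import numpy as np

def ssl_loss(encoder_w, x):
    """Stand-in self-supervised objective: reconstruction error of a linear
    auto-encoder built from the shared encoder."""
    z = x @ encoder_w
    x_hat = z @ encoder_w.T
    return np.mean((x - x_hat) ** 2)

def local_supervised_loss(encoder_w, head_w, x, y):
    """Each client's downstream head consumes the shared encoder's output."""
    logits = (x @ encoder_w) @ head_w
    return np.mean((logits - y) ** 2)

# Toy usage with random data in place of client graphs and labels.
rng = np.random.default_rng(5)
x, y = rng.normal(size=(10, 8)), rng.normal(size=(10, 3))
enc, head = rng.normal(size=(8, 4)), rng.normal(size=(4, 3))
print(ssl_loss(enc, x), local_supervised_loss(enc, head, x, y))
```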
arXiv Detail & Related papers (2022-10-27T02:13:42Z)
- Privatized Graph Federated Learning [57.14673504239551]
We introduce graph federated learning, which consists of multiple units connected by a graph.
We show how graph homomorphic perturbations can be used to ensure the algorithm is differentially private.
arXiv Detail & Related papers (2022-03-14T13:48:23Z)
- GraphMI: Extracting Private Graph Data from Graph Neural Networks [59.05178231559796]
We present the Graph Model Inversion attack (GraphMI), which aims to extract private graph data of the training graph by inverting the GNN.
Specifically, we propose a projected gradient module to tackle the discreteness of graph edges while preserving the sparsity and smoothness of graph features.
We design a graph auto-encoder module to efficiently exploit graph topology, node attributes, and target model parameters for edge inference.
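For the projected gradient part, a simplified illustration is to relax edges to continuous values in [0, 1] during optimization and project back after every step; the dummy gradient below stands in for the attack objective, and GraphMI's actual loss and auto-encoder module are not shown.

```python
# Simplified projected-gradient update on a relaxed adjacency matrix.
import numpy as np

def project(adj):
    """Keep the relaxed adjacency symmetric, zero-diagonal, and in [0, 1]."""
    adj = (adj + adj.T) / 2.0
    np.fill_diagonal(adj, 0.0)
    return np.clip(adj, 0.0, 1.0)

def pgd_step(adj, grad, lr=0.1):
    """Gradient step on the relaxed edges followed by projection."""
    return project(adj - lr * grad)

# Toy usage with a random gradient standing in for the attack loss gradient.
rng = np.random.default_rng(4)
A = project(rng.random((6, 6)))
g = rng.normal(size=(6, 6))
A = pgd_step(A, g)
```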
arXiv Detail & Related papers (2021-06-05T07:07:52Z)
- FedGL: Federated Graph Learning Framework with Global Self-Supervision [22.124339267195822]
FedGL is capable of obtaining a high-quality global graph model while protecting data privacy.
The global self-supervision enables the information of each client to flow and share in a privacy-preserving manner.
arXiv Detail & Related papers (2021-05-07T11:27:23Z)
- Tensor Graph Convolutional Networks for Multi-relational and Robust Learning [74.05478502080658]
This paper introduces a tensor-graph convolutional network (TGCN) for scalable semi-supervised learning (SSL) from data associated with a collection of graphs that are represented by a tensor.
The proposed architecture achieves markedly improved performance relative to standard GCNs, copes with state-of-the-art adversarial attacks, and leads to remarkable SSL performance over protein-protein interaction networks.
arXiv Detail & Related papers (2020-03-15T02:33:21Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.