Federated Learning on Non-IID Graphs via Structural Knowledge Sharing
- URL: http://arxiv.org/abs/2211.13009v1
- Date: Wed, 23 Nov 2022 15:12:16 GMT
- Title: Federated Learning on Non-IID Graphs via Structural Knowledge Sharing
- Authors: Yue Tan, Yixin Liu, Guodong Long, Jing Jiang, Qinghua Lu, Chengqi
Zhang
- Abstract summary: Federated graph learning (FGL) enables clients to train strong GNN models in a distributed manner without sharing their private data.
We propose FedStar, an FGL framework that extracts and shares the common underlying structure information for inter-graph learning tasks.
We perform extensive experiments over both cross-dataset and cross-domain non-IID FGL settings, demonstrating FedStar's superiority.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph neural networks (GNNs) have shown their superiority in modeling graph
data. Owing to the advantages of federated learning, federated graph learning
(FGL) enables clients to train strong GNN models in a distributed manner
without sharing their private data. A core challenge in federated systems is
the non-IID problem, which also widely exists in real-world graph data. For
example, local data of clients may come from diverse datasets or even domains,
e.g., social networks and molecules, increasing the difficulty for FGL methods
to capture commonly shared knowledge and learn a generalized encoder. From
real-world graph datasets, we observe that some structural properties are
shared by various domains, presenting great potential for sharing structural
knowledge in FGL. Inspired by this, we propose FedStar, an FGL framework that
extracts and shares the common underlying structure information for inter-graph
federated learning tasks. To explicitly extract the structure information
rather than encoding it along with the node features, we define structure
embeddings and encode them with an independent structure encoder. Then, the
structure encoder is shared across clients while the feature-based knowledge is
learned in a personalized way, making FedStar capable of capturing more
structure-based domain-invariant information and avoiding feature misalignment
issues. We perform extensive experiments over both cross-dataset and
cross-domain non-IID FGL settings, demonstrating the superiority of FedStar.
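The mechanism described above can be sketched in a few lines: a feature-free structure embedding (here, one-hot node degree, one simple choice; the paper's exact embeddings may differ) and a communication round that averages only the structure-encoder weights across clients while each feature encoder stays local. The names `fedstar_round`, `degree_structure_embedding`, and the toy weight matrices are illustrative, not taken from the paper.

```python
import numpy as np

def degree_structure_embedding(adj, max_deg=8):
    # One-hot node degree: a structural signal independent of node features.
    n = adj.shape[0]
    deg = adj.sum(axis=1).astype(int)
    emb = np.zeros((n, max_deg + 1))
    emb[np.arange(n), np.clip(deg, 0, max_deg)] = 1.0
    return emb

def fedstar_round(clients):
    # Average only the structure-encoder weights (shared, domain-invariant
    # knowledge); feature encoders are left untouched (personalized knowledge).
    shared = np.mean([c["structure_encoder"] for c in clients], axis=0)
    for c in clients:
        c["structure_encoder"] = shared.copy()
    return clients

# Two toy clients whose "encoders" are 4x4 weight matrices.
rng = np.random.default_rng(0)
clients = [{"structure_encoder": rng.normal(size=(4, 4)),
            "feature_encoder": rng.normal(size=(4, 4))} for _ in range(2)]
feat_before = [c["feature_encoder"].copy() for c in clients]
clients = fedstar_round(clients)
# After the round, structure encoders agree; feature encoders are unchanged.
```

In a full implementation the same split would apply to real GNN parameters: the server aggregates the structure branch as in FedAvg, while each client keeps its feature branch local, which is what lets FedStar avoid feature misalignment across domains.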
Related papers
- Decoupled Subgraph Federated Learning
We address the challenge of federated learning on graph-structured data distributed across multiple clients.
We present a novel framework for this scenario, named FedStruct, that harnesses deep structural dependencies.
We validate the effectiveness of FedStruct through experimental results conducted on six datasets for semi-supervised node classification.
arXiv Detail & Related papers (2024-02-29T13:47:23Z)
- GraphEdit: Large Language Models for Graph Structure Learning
Graph Structure Learning (GSL) focuses on capturing intrinsic dependencies and interactions among nodes in graph-structured data.
Existing GSL methods heavily depend on explicit graph structural information as supervision signals.
We propose GraphEdit, an approach that leverages large language models (LLMs) to learn complex node relationships in graph-structured data.
arXiv Detail & Related papers (2024-02-23T08:29:42Z)
- DGNN: Decoupled Graph Neural Networks with Structural Consistency between Attribute and Graph Embedding Representations
Graph neural networks (GNNs) demonstrate a robust capability for representation learning on graphs with complex structures.
A novel GNN framework, dubbed Decoupled Graph Neural Networks (DGNN), is introduced to obtain a more comprehensive embedding representation of nodes.
Experimental results on several graph benchmark datasets verify DGNN's superiority in the node classification task.
arXiv Detail & Related papers (2024-01-28T06:43:13Z)
- Privacy-preserving design of graph neural networks with applications to vertical federated learning
We present an end-to-end graph representation learning framework called VESPER.
VESPER is capable of training high-performance GNN models over both sparse and dense graphs under reasonable privacy budgets.
arXiv Detail & Related papers (2023-10-31T15:34:59Z)
- Lumos: Heterogeneity-aware Federated Graph Learning over Decentralized Devices
Graph neural networks (GNNs) have been widely deployed in real-world networked applications and systems.
We propose the first federated GNN framework called Lumos that supports supervised and unsupervised learning.
Based on the constructed tree for each client, a decentralized tree-based GNN trainer is proposed to support versatile training.
arXiv Detail & Related papers (2023-03-01T13:27:06Z)
- Federated Graph-based Networks with Shared Embedding
We propose Federated Graph-based Networks with Shared Embedding (Feras), which uses shared embedding data to train the network and avoids the direct sharing of original data.
Feras enables the training of current graph-based models in the federated learning framework while addressing privacy concerns.
arXiv Detail & Related papers (2022-10-03T12:51:15Z)
- FedEgo: Privacy-preserving Personalized Federated Graph Learning with Ego-graphs
In some practical scenarios, graph data are stored separately in multiple distributed parties, which may not be directly shared due to conflicts of interest.
We propose FedEgo, a federated graph learning framework based on ego-graphs to tackle the challenges above.
arXiv Detail & Related papers (2022-08-29T15:47:36Z)
- Privatized Graph Federated Learning
We introduce graph federated learning, which consists of multiple units connected by a graph.
We show how graph homomorphic perturbations can be used to ensure the algorithm is differentially private.
arXiv Detail & Related papers (2022-03-14T13:48:23Z)
- FedGL: Federated Graph Learning Framework with Global Self-Supervision
FedGL is capable of obtaining a high-quality global graph model while protecting data privacy.
The global self-supervision enables the information of each client to flow and share in a privacy-preserving manner.
arXiv Detail & Related papers (2021-05-07T11:27:23Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.