Federated Learning on Non-IID Graphs via Structural Knowledge Sharing
- URL: http://arxiv.org/abs/2211.13009v1
- Date: Wed, 23 Nov 2022 15:12:16 GMT
- Title: Federated Learning on Non-IID Graphs via Structural Knowledge Sharing
- Authors: Yue Tan, Yixin Liu, Guodong Long, Jing Jiang, Qinghua Lu, Chengqi Zhang
- Abstract summary: Federated graph learning (FGL) enables clients to train strong GNN models in a distributed manner without sharing their private data.
We propose FedStar, an FGL framework that extracts and shares the common underlying structure information for inter-graph learning tasks.
We perform extensive experiments over both cross-dataset and cross-domain non-IID FGL settings, demonstrating FedStar's superiority.
- Score: 47.140441784462794
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph neural networks (GNNs) have shown their superiority in modeling graph
data. Owing to the advantages of federated learning, federated graph learning
(FGL) enables clients to train strong GNN models in a distributed manner
without sharing their private data. A core challenge in federated systems is
the non-IID problem, which also widely exists in real-world graph data. For
example, local data of clients may come from diverse datasets or even domains,
e.g., social networks and molecules, increasing the difficulty for FGL methods
to capture commonly shared knowledge and learn a generalized encoder. From
real-world graph datasets, we observe that some structural properties are
shared by various domains, presenting great potential for sharing structural
knowledge in FGL. Inspired by this, we propose FedStar, an FGL framework that
extracts and shares the common underlying structure information for inter-graph
federated learning tasks. To explicitly extract the structure information
rather than encoding it along with the node features, we define structure
embeddings and encode them with an independent structure encoder. Then, the
structure encoder is shared across clients while the feature-based knowledge is
learned in a personalized way, making FedStar capable of capturing more
structure-based domain-invariant information and avoiding feature misalignment
issues. We perform extensive experiments over both cross-dataset and
cross-domain non-IID FGL settings, demonstrating the superiority of FedStar.
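The sharing scheme described above can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: each client is assumed to hold two parameter groups, a "structure" encoder and a "feature" encoder, and a communication round averages only the structure parameters (standard FedAvg) while leaving the feature parameters personalized.

```python
# Minimal sketch of FedStar-style parameter sharing (illustrative only).
# Parameters are represented as flat dicts of floats for clarity.

def fedavg(param_sets):
    """Elementwise average of a list of parameter dicts."""
    n = len(param_sets)
    return {k: sum(p[k] for p in param_sets) / n for k in param_sets[0]}

def fedstar_round(clients):
    """One communication round: aggregate structure encoders only;
    feature-based knowledge stays local to each client."""
    shared = fedavg([c["structure"] for c in clients])
    for c in clients:
        c["structure"] = dict(shared)  # broadcast shared structure encoder
    return clients

clients = [
    {"structure": {"w": 1.0}, "feature": {"w": 10.0}},
    {"structure": {"w": 3.0}, "feature": {"w": 20.0}},
]
clients = fedstar_round(clients)
# After the round: structure weights are identical (averaged) across
# clients, while feature weights remain client-specific.
```

In a real system each group would be a GNN's state dict and local training would run between rounds; the point here is only which parameters cross the client boundary.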
Related papers
- Federated Graph Learning with Graphless Clients [52.5629887481768]
Federated Graph Learning (FGL) is tasked with training machine learning models, such as Graph Neural Networks (GNNs)
We propose a novel framework FedGLS to tackle the problem in FGL with graphless clients.
arXiv Detail & Related papers (2024-11-13T06:54:05Z)
- FedSSP: Federated Graph Learning with Spectral Knowledge and Personalized Preference [31.796411806840087]
Personalized Federated Graph Learning (pFGL) facilitates the decentralized training of Graph Neural Networks (GNNs) without compromising privacy.
Previous pFGL methods incorrectly share non-generic knowledge globally and fail to tailor personalized solutions locally.
We propose our pFGL framework FedSSP which Shares generic Spectral knowledge while satisfying graph Preferences.
arXiv Detail & Related papers (2024-10-26T07:09:27Z)
- Learning to Model Graph Structural Information on MLPs via Graph Structure Self-Contrasting [50.181824673039436]
We propose a Graph Structure Self-Contrasting (GSSC) framework that learns graph structural information without message passing.
The proposed framework is based purely on Multi-Layer Perceptrons (MLPs), where the structural information is only implicitly incorporated as prior knowledge.
It first applies structural sparsification to remove potentially uninformative or noisy edges in the neighborhood, and then performs structural self-contrasting in the sparsified neighborhood to learn robust node representations.
arXiv Detail & Related papers (2024-09-09T12:56:02Z)
- Optimizing Federated Graph Learning with Inherent Structural Knowledge and Dual-Densely Connected GNNs [6.185201353691423]
Federated Graph Learning (FGL) enables clients to collaboratively train powerful Graph Neural Networks (GNNs) in a distributed manner without exposing their private data.
Existing methods either overlook the inherent structural knowledge in graph data or capture it at the cost of significantly increased resource demands.
We propose FedDense, a novel FGL framework that optimizes the utilization efficiency of inherent structural knowledge.
arXiv Detail & Related papers (2024-08-21T14:37:50Z)
- Federated Graph Learning with Structure Proxy Alignment [43.13100155569234]
Federated Graph Learning (FGL) aims to learn graph learning models over graph data distributed in multiple data owners.
We propose FedSpray, a novel FGL framework that learns local class-wise structure proxies in the latent space.
Our goal is to obtain the aligned structure proxies that can serve as reliable, unbiased neighboring information for node classification.
arXiv Detail & Related papers (2024-08-18T07:32:54Z)
- Decoupled Subgraph Federated Learning [57.588938805581044]
We address the challenge of federated learning on graph-structured data distributed across multiple clients.
We present a novel framework for this scenario, named FedStruct, that harnesses deep structural dependencies.
We validate the effectiveness of FedStruct through experimental results conducted on six datasets for semi-supervised node classification.
arXiv Detail & Related papers (2024-02-29T13:47:23Z)
- GraphEdit: Large Language Models for Graph Structure Learning [62.618818029177355]
Graph Structure Learning (GSL) focuses on capturing intrinsic dependencies and interactions among nodes in graph-structured data.
Existing GSL methods heavily depend on explicit graph structural information as supervision signals.
We propose GraphEdit, an approach that leverages large language models (LLMs) to learn complex node relationships in graph-structured data.
arXiv Detail & Related papers (2024-02-23T08:29:42Z)
- Lumos: Heterogeneity-aware Federated Graph Learning over Decentralized Devices [19.27111697495379]
Graph neural networks (GNNs) have been widely deployed in real-world networked applications and systems.
We propose the first federated GNN framework called Lumos that supports supervised and unsupervised learning.
Based on the constructed tree for each client, a decentralized tree-based GNN trainer is proposed to support versatile training.
arXiv Detail & Related papers (2023-03-01T13:27:06Z)
- Privatized Graph Federated Learning [57.14673504239551]
We introduce graph federated learning, which consists of multiple units connected by a graph.
We show how graph homomorphic perturbations can be used to ensure the algorithm is differentially private.
arXiv Detail & Related papers (2022-03-14T13:48:23Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it provides and is not responsible for any consequences of its use.