FedHGN: A Federated Framework for Heterogeneous Graph Neural Networks
- URL: http://arxiv.org/abs/2305.09729v1
- Date: Tue, 16 May 2023 18:01:49 GMT
- Title: FedHGN: A Federated Framework for Heterogeneous Graph Neural Networks
- Authors: Xinyu Fu, Irwin King
- Abstract summary: Heterogeneous graph neural networks (HGNNs) can learn from typed and relational graph data more effectively than conventional GNNs.
With larger parameter spaces, HGNNs may require more training data, which is often scarce in real-world applications due to privacy regulations.
We propose FedHGN, a novel and general FGL framework for HGNNs.
- Score: 45.94642721490744
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Heterogeneous graph neural networks (HGNNs) can learn from typed and
relational graph data more effectively than conventional GNNs. With larger
parameter spaces, HGNNs may require more training data, which is often scarce
in real-world applications due to privacy regulations (e.g., GDPR). Federated
graph learning (FGL) enables multiple clients to train a GNN collaboratively
without sharing their local data. However, existing FGL methods mainly focus on
homogeneous GNNs or knowledge graph embeddings; few have considered
heterogeneous graphs and HGNNs. In federated heterogeneous graph learning,
clients may have private graph schemas. Conventional FL/FGL methods attempting
to define a global HGNN model would violate schema privacy. To address these
challenges, we propose FedHGN, a novel and general FGL framework for HGNNs.
FedHGN adopts schema-weight decoupling to enable schema-agnostic knowledge
sharing and employs coefficients alignment to stabilize the training process
and improve HGNN performance. With better privacy preservation, FedHGN
consistently outperforms local training and conventional FL methods on three
widely adopted heterogeneous graph datasets with varying client numbers. The
code is available at https://github.com/cynricfu/FedHGN .
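Based on the abstract, schema-weight decoupling can be read as factoring relation-specific HGNN weights into shared basis matrices and client-private coefficients, so that only schema-agnostic bases are exchanged with the server while relation names and coefficients stay local. The sketch below is an illustrative reading of that idea, not the paper's implementation; the function name, relation name, and dimensions are assumptions:

```python
import numpy as np

def relation_weight(bases, coeffs):
    """Compose a relation-specific weight matrix from shared bases.

    bases:  (B, d_in, d_out) basis matrices shared across clients
    coeffs: (B,) client-private coefficients for one relation
    """
    # Contract the basis axis: W = sum_b coeffs[b] * bases[b]
    return np.tensordot(coeffs, bases, axes=1)  # shape (d_in, d_out)

rng = np.random.default_rng(0)
B, d_in, d_out = 4, 8, 8
bases = rng.standard_normal((B, d_in, d_out))             # shared with server
coeffs = {"author-writes-paper": rng.standard_normal(B)}  # kept on the client

W = relation_weight(bases, coeffs["author-writes-paper"])
# Only `bases` would be aggregated server-side; relation names and
# coefficients never leave the client, so the graph schema stays private.
```

Under this reading, the "coefficients alignment" mentioned in the abstract would regularize the private coefficients across clients to keep the shared bases consistent during training.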
Related papers
- Federated Graph Learning with Structure Proxy Alignment [43.13100155569234]
Federated Graph Learning (FGL) aims to learn graph learning models over graph data distributed across multiple data owners.
We propose FedSpray, a novel FGL framework that learns local class-wise structure proxies in the latent space.
Our goal is to obtain the aligned structure proxies that can serve as reliable, unbiased neighboring information for node classification.
arXiv Detail & Related papers (2024-08-18T07:32:54Z)
- FedGT: Federated Node Classification with Scalable Graph Transformer [27.50698154862779]
We propose a scalable Federated Graph Transformer (FedGT) in this paper.
FedGT computes clients' similarity based on the aligned global nodes with optimal transport.
arXiv Detail & Related papers (2024-01-26T21:02:36Z)
- Graph Ladling: Shockingly Simple Parallel GNN Training without Intermediate Communication [100.51884192970499]
GNNs are a powerful family of neural networks for learning over graphs.
Scaling GNNs by either deepening or widening suffers from prevalent issues of unhealthy gradients, over-smoothing, and information squashing.
We propose not to deepen or widen current GNNs, but instead present a data-centric perspective of model soups tailored for GNNs.
arXiv Detail & Related papers (2023-06-18T03:33:46Z)
- SplitGNN: Splitting GNN for Node Classification with Heterogeneous Attention [29.307331758493323]
We propose a split learning-based graph neural network (SplitGNN) for graph computation.
Our SplitGNN allows the isolated heterogeneous neighborhood to be collaboratively utilized.
We demonstrate the effectiveness of our SplitGNN on node classification tasks for two standard public datasets and a real-world dataset.
arXiv Detail & Related papers (2023-01-27T12:08:44Z)
- Relation Embedding based Graph Neural Networks for Handling Heterogeneous Graph [58.99478502486377]
We propose a simple yet efficient framework that enables homogeneous GNNs to handle heterogeneous graphs adequately.
Specifically, we propose Relation Embedding based Graph Neural Networks (RE-GNNs), which employ only one parameter per relation to embed the importance of edge type relations and self-loop connections.
arXiv Detail & Related papers (2022-09-23T05:24:18Z)
- MentorGNN: Deriving Curriculum for Pre-Training GNNs [61.97574489259085]
We propose an end-to-end model named MentorGNN that aims to supervise the pre-training process of GNNs across graphs.
We shed new light on the problem of domain adaptation on relational data (i.e., graphs) by deriving a natural and interpretable upper bound on the generalization error of the pre-trained GNNs.
arXiv Detail & Related papers (2022-08-21T15:12:08Z)
- Graph Neural Networks for Graphs with Heterophily: A Survey [98.45621222357397]
We provide a comprehensive review of graph neural networks (GNNs) for heterophilic graphs.
Specifically, we propose a systematic taxonomy that essentially governs existing heterophilic GNN models.
We discuss the correlation between graph heterophily and various graph research domains, aiming to facilitate the development of more effective GNNs.
arXiv Detail & Related papers (2022-02-14T23:07:47Z)
- FedGraphNN: A Federated Learning System and Benchmark for Graph Neural Networks [68.64678614325193]
Graph Neural Network (GNN) research is rapidly growing thanks to the capacity of GNNs to learn representations from graph-structured data.
Centralizing a massive amount of real-world graph data for GNN training is prohibitive due to user-side privacy concerns.
We introduce FedGraphNN, an open research federated learning system and a benchmark to facilitate GNN-based FL research.
arXiv Detail & Related papers (2021-04-14T22:11:35Z)
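The federated setting shared by FedHGN, FedGraphNN, and the other FGL papers above rests on server-side aggregation of client model updates rather than raw graph data. A minimal FedAvg-style aggregation step, with all names and values purely illustrative, can be sketched as:

```python
def fedavg(client_weights, client_sizes):
    """Weighted average of client parameter vectors (FedAvg-style aggregation).

    client_weights: list of parameter lists, one per client
    client_sizes:   number of local training samples per client
    """
    total = sum(client_sizes)
    dim = len(client_weights[0])
    # Each client's update is weighted by its share of the total data.
    return [sum(w[i] * n / total for w, n in zip(client_weights, client_sizes))
            for i in range(dim)]

# Three clients with different amounts of local graph data.
clients = [[1.0, 2.0], [3.0, 0.0], [0.0, 6.0]]
sizes = [10, 20, 10]
global_w = fedavg(clients, sizes)  # [1.75, 2.0]
```

Only the aggregated parameters are redistributed to clients; the privacy question these papers study is what additional information (e.g., a private graph schema, in FedHGN's case) still leaks through the shared parameters themselves.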
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it provides and is not responsible for any consequences of its use.