Federated Graph Learning -- A Position Paper
- URL: http://arxiv.org/abs/2105.11099v1
- Date: Mon, 24 May 2021 05:39:24 GMT
- Title: Federated Graph Learning -- A Position Paper
- Authors: Huanding Zhang, Tao Shen, Fei Wu, Mingyang Yin, Hongxia Yang, Chao Wu
- Abstract summary: Federated learning (FL) is an emerging technique that can collaboratively train a shared model while keeping the data decentralized.
We term it federated graph learning (FGL).
Considering how graph data are distributed among clients, we propose four types of FGL: inter-graph FL, intra-graph FL (further divided into horizontal and vertical FGL), and graph-structured FL.
- Score: 36.424411232612606
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph neural networks (GNNs) have been successful in many fields and
have given rise to a variety of research efforts and applications in industry.
However, in some privacy-sensitive scenarios (such as finance and healthcare),
training a GNN model centrally is challenging due to distributed data silos.
Federated learning (FL) is an emerging technique that can collaboratively train
a shared model while keeping the data decentralized, making it a rational
solution for distributed GNN training. We term this federated graph learning (FGL).
Although FGL has received increasing attention recently, the definition and
challenges of FGL are still up in the air. In this position paper, we present a
categorization to clarify them. Considering how graph data are distributed among
clients, we propose four types of FGL: inter-graph FL, intra-graph FL, and
graph-structured FL, where intra-graph FL is further divided into horizontal and
vertical FGL. For each type of FGL, we discuss the formulation and applications
in detail and propose some potential challenges.
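To make the categorization concrete, consider horizontal intra-graph FL: every
client holds a subgraph of the same underlying graph with a shared feature and
label space, and a server coordinates training of a shared GNN. The following is
a minimal sketch of that setting, assuming a FedAvg-style weight average, a toy
one-layer GCN, and synthetic client subgraphs; none of these specifics come from
the paper.

```python
# Minimal sketch of horizontal intra-graph FGL (FedAvg over GNN weights).
# Assumptions (not from the paper): a one-layer GCN with dense adjacency,
# synthetic client subgraphs, and plain weighted averaging as the aggregation rule.
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyGCN(nn.Module):
    """One-layer GCN: logits = (A_hat @ X) W + b."""
    def __init__(self, in_dim, num_classes):
        super().__init__()
        self.lin = nn.Linear(in_dim, num_classes)

    def forward(self, a_hat, x):
        return self.lin(a_hat @ x)   # neighborhood aggregation, then linear map

def local_update(model, a_hat, x, y, epochs=5, lr=0.01):
    """One round of local training on a client's subgraph."""
    model = copy.deepcopy(model)
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    for _ in range(epochs):
        opt.zero_grad()
        loss = F.cross_entropy(model(a_hat, x), y)
        loss.backward()
        opt.step()
    return model.state_dict()

def fed_avg(states, weights):
    """Weighted average of client state dicts (FedAvg)."""
    avg = copy.deepcopy(states[0])
    total = sum(weights)
    for key in avg:
        avg[key] = sum(w * s[key] for w, s in zip(weights, states)) / total
    return avg

# Synthetic clients: each holds a subgraph with the same feature/label space.
torch.manual_seed(0)
clients = []
for _ in range(3):
    n = 20
    adj = (torch.rand(n, n) < 0.2).float()
    a_hat = (adj + torch.eye(n)) / (adj.sum(1, keepdim=True) + 1)  # roughly row-normalized, with self-loops
    clients.append((a_hat, torch.randn(n, 8), torch.randint(0, 3, (n,))))

global_model = TinyGCN(in_dim=8, num_classes=3)
for rnd in range(10):  # communication rounds
    states = [local_update(global_model, a, x, y) for a, x, y in clients]
    global_model.load_state_dict(fed_avg(states, [x.shape[0] for _, x, _ in clients]))
```

Roughly speaking, inter-graph FL would instead have each client hold whole graphs
(e.g., molecules) for graph-level tasks, while vertical intra-graph FL and
graph-structured FL change what is exchanged rather than this basic round structure.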
Related papers
- Federated Graph Learning with Graphless Clients [52.5629887481768]
Federated Graph Learning (FGL) is tasked with training machine learning models, such as Graph Neural Networks (GNNs), over graph data distributed across multiple clients.
We propose a novel framework FedGLS to tackle the problem in FGL with graphless clients.
arXiv Detail & Related papers (2024-11-13T06:54:05Z) - OpenFGL: A Comprehensive Benchmarks for Federated Graph Learning [36.04858706246336]
Federated graph learning (FGL) has emerged as a promising distributed training paradigm for graph neural networks across multiple local systems without direct data sharing.
Despite the proliferation of FGL, the diverse motivations from practical applications, spanning various research backgrounds and experimental settings, pose a significant challenge to fair evaluation.
We propose OpenFGL, a unified benchmark designed for the primary FGL scenarios: Graph-FL and Subgraph-FL.
arXiv Detail & Related papers (2024-08-29T06:40:01Z) - Federated Graph Learning with Structure Proxy Alignment [43.13100155569234]
Federated Graph Learning (FGL) aims to learn graph learning models over graph data distributed across multiple data owners.
We propose FedSpray, a novel FGL framework that learns local class-wise structure proxies in the latent space.
Our goal is to obtain the aligned structure proxies that can serve as reliable, unbiased neighboring information for node classification.
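As a rough illustration of class-wise structure proxies, the sketch below computes
per-class embedding centers on each client, averages them on the server, and uses
the result as a regularization target; the averaging rule and the regularizer are
assumptions for illustration, not FedSpray's actual alignment objective.

```python
# Rough sketch: class-wise proxies computed locally, aligned on the server,
# then used to regularize a client's node embeddings. All specifics assumed.
import torch

def local_proxies(embeddings, labels, num_classes):
    """Per-class mean embedding on one client (class-wise structure proxy)."""
    return torch.stack([embeddings[labels == c].mean(dim=0) for c in range(num_classes)])

def align_proxies(client_proxies):
    """Server side: a simple average of client proxies as the 'aligned' proxy."""
    return torch.stack(client_proxies).mean(dim=0)

def proxy_regularizer(embeddings, labels, aligned):
    """Pull each node embedding toward the aligned proxy of its class."""
    return ((embeddings - aligned[labels]) ** 2).mean()

torch.manual_seed(0)
num_classes, d = 3, 8
clients = [(torch.randn(25, d), torch.randint(0, num_classes, (25,))) for _ in range(4)]

proxies = [local_proxies(e, y, num_classes) for e, y in clients]
aligned = align_proxies(proxies)                      # shared, unbiased neighboring signal
reg = proxy_regularizer(*clients[0], aligned)         # added to a client's training loss
```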
arXiv Detail & Related papers (2024-08-18T07:32:54Z) - Chasing Fairness in Graphs: A GNN Architecture Perspective [73.43111851492593]
We propose Fair Message Passing (FMP), designed within a unified optimization framework for graph neural networks (GNNs).
In FMP, aggregation is first adopted to utilize neighbors' information, and then a bias mitigation step explicitly pushes demographic group node representation centers together.
Experiments on node classification tasks demonstrate that the proposed FMP outperforms several baselines in terms of fairness and accuracy on three real-world datasets.
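The two steps described above lend themselves to a toy illustration: aggregate
neighbor information first, then add a term whose minimization pulls the
demographic-group representation centers together. The penalty form, shapes, and
random data below are assumptions for illustration, not FMP's actual
optimization-framework derivation.

```python
# Toy illustration of the two FMP-style steps: (1) neighbor aggregation,
# (2) a bias-mitigation term that pulls group representation centers together.
import torch

def aggregate(a_hat, h):
    # Step 1: plain neighbor aggregation (normalized adjacency times features).
    return a_hat @ h

def group_center_gap(z, groups):
    # Step 2 signal: squared distance between the two group representation centers.
    c0 = z[groups == 0].mean(dim=0)
    c1 = z[groups == 1].mean(dim=0)
    return ((c0 - c1) ** 2).sum()

torch.manual_seed(0)
n, d = 30, 16
adj = (torch.rand(n, n) < 0.2).float()
a_hat = (adj + torch.eye(n)) / (adj.sum(1, keepdim=True) + 1)  # roughly row-normalized
groups = torch.randint(0, 2, (n,))                             # binary sensitive attribute
h = torch.randn(n, d, requires_grad=True)                      # node embeddings

z = aggregate(a_hat, h)                     # utilize neighbors' information first
task_loss = z.pow(2).mean()                 # placeholder task loss (no labels here)
fair_loss = group_center_gap(z, groups)     # minimizing this pulls group centers together
(task_loss + 0.5 * fair_loss).backward()    # gradients now carry the fairness signal
```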
arXiv Detail & Related papers (2023-12-19T18:00:15Z) - Learning Strong Graph Neural Networks with Weak Information [64.64996100343602]
We develop a principled approach to the problem of graph learning with weak information (GLWI).
We propose D$^2$PT, a dual-channel GNN framework that performs long-range information propagation not only on the input graph with incomplete structure, but also on a global graph that encodes global semantic similarities.
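A rough sketch of the dual-channel idea follows, under the assumption that the
second channel propagates over a kNN graph built from feature similarities and
that the two channels are simply averaged; D$^2$PT's actual architecture and
training objective are more involved.

```python
# Sketch: propagate on the (possibly incomplete) input graph and, in parallel,
# on a "global" graph built from feature similarities. The kNN construction and
# the averaging of the two channels are assumptions, not D^2PT's exact design.
import torch
import torch.nn.functional as F

def row_normalize(adj):
    return adj / (adj.sum(dim=1, keepdim=True) + 1e-8)

def propagate(a_hat, x, steps=5):
    # long-range propagation: repeated neighborhood averaging
    h = x
    for _ in range(steps):
        h = a_hat @ h
    return h

torch.manual_seed(0)
n, d, k = 50, 16, 5
x = torch.randn(n, d)
input_adj = (torch.rand(n, n) < 0.05).float()           # sparse, possibly incomplete structure

# global channel: kNN graph over cosine similarity of node features
sim = F.normalize(x, dim=1) @ F.normalize(x, dim=1).T
topk = sim.topk(k + 1, dim=1).indices                   # +1 keeps the node itself
global_adj = torch.zeros(n, n).scatter_(1, topk, 1.0)

h_input = propagate(row_normalize(input_adj + torch.eye(n)), x)
h_global = propagate(row_normalize(global_adj), x)
h = 0.5 * (h_input + h_global)                          # fuse the two channels (assumed rule)
```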
arXiv Detail & Related papers (2023-05-29T04:51:09Z) - FedHGN: A Federated Framework for Heterogeneous Graph Neural Networks [45.94642721490744]
Heterogeneous graph neural networks (HGNNs) can learn from typed and relational graph data more effectively than conventional GNNs.
With larger parameter spaces, HGNNs may require more training data, which is often scarce in real-world applications due to privacy regulations.
We propose FedHGN, a novel and general FGL framework for HGNNs.
arXiv Detail & Related papers (2023-05-16T18:01:49Z) - GLASU: A Communication-Efficient Algorithm for Federated Learning with
Vertically Distributed Graph Data [44.02629656473639]
We propose a model splitting method that splits a backbone GNN across the clients and the server, together with a communication-efficient algorithm, GLASU, to train such a model.
We offer a theoretical analysis and conduct extensive numerical experiments on real-world datasets, showing that the proposed algorithm effectively trains a GNN model, whose performance matches that of the backbone GNN when trained in a centralized manner.
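A minimal sketch of splitting a backbone GNN between clients and a server for
vertically partitioned graph data: each client holds a different feature block
for the same nodes and computes a local embedding, and the server fuses the
embeddings and applies the output layers. The concatenation-based fusion and all
dimensions below are assumptions, and the communication-saving mechanisms that
make GLASU efficient are omitted.

```python
# Sketch of a split GNN for vertically partitioned graph data (assumed design).
import torch
import torch.nn as nn

class ClientBlock(nn.Module):
    """Client-side part of the split GNN: one propagation + linear layer."""
    def __init__(self, in_dim, hid_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, hid_dim)

    def forward(self, a_hat, x):
        return torch.relu(self.lin(a_hat @ x))

class ServerHead(nn.Module):
    """Server-side part: fuses client embeddings and classifies."""
    def __init__(self, hid_dim, num_clients, num_classes):
        super().__init__()
        self.out = nn.Linear(hid_dim * num_clients, num_classes)

    def forward(self, client_embeddings):
        return self.out(torch.cat(client_embeddings, dim=1))

torch.manual_seed(0)
n, num_clients, num_classes = 40, 3, 4
adj = (torch.rand(n, n) < 0.1).float()
a_hat = (adj + torch.eye(n)) / (adj.sum(1, keepdim=True) + 1)
features = [torch.randn(n, 6) for _ in range(num_clients)]     # vertical feature split
labels = torch.randint(0, num_classes, (n,))

clients = [ClientBlock(6, 8) for _ in range(num_clients)]
server = ServerHead(8, num_clients, num_classes)
params = [p for m in clients for p in m.parameters()] + list(server.parameters())
opt = torch.optim.Adam(params, lr=0.01)

for _ in range(20):  # each step stands in for one client->server->client round
    opt.zero_grad()
    embeddings = [m(a_hat, x) for m, x in zip(clients, features)]  # computed locally
    logits = server(embeddings)                                    # fused on the server
    loss = nn.functional.cross_entropy(logits, labels)
    loss.backward()                                                # gradients flow back to clients
    opt.step()
```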
arXiv Detail & Related papers (2023-03-16T17:47:55Z) - Personalized Subgraph Federated Learning [56.52903162729729]
We introduce a new subgraph FL problem, personalized subgraph FL, which focuses on the joint improvement of the interrelated local GNNs.
We propose a novel framework, FEDerated Personalized sUBgraph learning (FED-PUB), to tackle it.
We validate our FED-PUB for its subgraph FL performance on six datasets, considering both non-overlapping and overlapping subgraphs.
arXiv Detail & Related papers (2022-06-21T09:02:53Z) - Graph-level Neural Networks: Current Progress and Future Directions [61.08696673768116]
Graph-level Neural Networks (GLNNs, deep learning-based graph-level learning methods) have attracted increasing attention due to their superiority in modeling high-dimensional data.
We propose a systematic taxonomy covering GLNNs upon deep neural networks, graph neural networks, and graph pooling.
arXiv Detail & Related papers (2022-05-31T06:16:55Z) - FedGL: Federated Graph Learning Framework with Global Self-Supervision [22.124339267195822]
FedGL is capable of obtaining a high-quality global graph model while protecting data privacy.
The global self-supervision enables the information of each client to flow and share in a privacy-preserving manner.
arXiv Detail & Related papers (2021-05-07T11:27:23Z) - GraphFL: A Federated Learning Framework for Semi-Supervised Node
Classification on Graphs [48.13100386338979]
We propose the first FL framework, namely GraphFL, for semi-supervised node classification on graphs.
We propose two GraphFL methods to respectively address the non-IID issue in graph data and handle the tasks with new label domains.
We adopt representative graph neural networks as GraphSSC methods and evaluate GraphFL on multiple graph datasets.
arXiv Detail & Related papers (2020-12-08T03:13:29Z)