FedGL: Federated Graph Learning Framework with Global Self-Supervision
- URL: http://arxiv.org/abs/2105.03170v1
- Date: Fri, 7 May 2021 11:27:23 GMT
- Title: FedGL: Federated Graph Learning Framework with Global Self-Supervision
- Authors: Chuan Chen, Weibo Hu, Ziyue Xu, Zibin Zheng
- Abstract summary: FedGL is capable of obtaining a high-quality global graph model while protecting data privacy.
The global self-supervision enables the information of each client to flow and share in a privacy-preserving manner.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph data are ubiquitous in the real world. Graph learning (GL) tries to
mine and analyze graph data so that valuable information can be discovered.
Existing GL methods are designed for centralized scenarios. In practice,
however, graph data are usually distributed across different
organizations, i.e., the curse of isolated data islands. To address this
problem, we incorporate federated learning into GL and propose a general
Federated Graph Learning framework FedGL, which is capable of obtaining a
high-quality global graph model while protecting data privacy by discovering
the global self-supervision information during the federated training.
Concretely, we propose to upload the prediction results and node embeddings to
the server for discovering the global pseudo label and global pseudo graph,
which are distributed to each client to enrich the training labels and
complement the graph structure respectively, thereby improving the quality of
each local model. Moreover, the global self-supervision enables the information
of each client to flow and share in a privacy-preserving manner, thus
alleviating the heterogeneity and utilizing the complementarity of graph data
among different clients. Finally, experimental results show that FedGL
significantly outperforms baselines on four widely used graph datasets.
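The server-side discovery step described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's exact algorithm: the function names, the prediction-averaging with a confidence threshold for pseudo labels, and the cosine-kNN construction of the pseudo graph are all assumptions made for the sketch.

```python
import numpy as np

def global_pseudo_labels(client_probs, threshold=0.9):
    """Average clients' uploaded class-probability predictions and keep
    only confident nodes as global pseudo labels (averaging and the
    confidence threshold are illustrative choices)."""
    avg = np.mean(client_probs, axis=0)   # (n_nodes, n_classes)
    conf = avg.max(axis=1)
    labels = avg.argmax(axis=1)
    mask = conf >= threshold              # pseudo-label only confident nodes
    return labels, mask

def global_pseudo_graph(embeddings, k=3):
    """Build a global pseudo graph by linking each node to its k nearest
    neighbors in the uploaded embedding space (cosine similarity)."""
    norm = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sim = norm @ norm.T
    np.fill_diagonal(sim, -np.inf)        # exclude self-loops
    adj = np.zeros_like(sim)
    for i in range(sim.shape[0]):
        nbrs = np.argsort(sim[i])[-k:]    # indices of top-k similar nodes
        adj[i, nbrs] = 1.0
    return np.maximum(adj, adj.T)         # symmetrize

# Toy round: 2 clients, 4 nodes, 2 classes
probs = np.array([
    [[0.95, 0.05], [0.6, 0.4], [0.1, 0.9], [0.5, 0.5]],
    [[0.90, 0.10], [0.7, 0.3], [0.2, 0.8], [0.5, 0.5]],
])
labels, mask = global_pseudo_labels(probs)
emb = np.random.default_rng(0).normal(size=(4, 8))
adj = global_pseudo_graph(emb, k=2)
```

The pseudo labels would enrich each client's training set, while the pseudo-graph edges would complement each client's local structure, matching the two roles described in the abstract.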
Related papers
- OpenGraph: Towards Open Graph Foundation Models [20.401374302429627]
We develop a general graph foundation model to understand the complex topological patterns present in diverse graph data.
We propose a unified graph tokenizer to adapt our graph model to generalize well on unseen graph data.
We also develop a scalable graph transformer, which effectively captures node-wise dependencies within the global topological context.
arXiv Detail & Related papers (2024-03-02T08:05:03Z)
- GraphGLOW: Universal and Generalizable Structure Learning for Graph Neural Networks [72.01829954658889]
This paper introduces the mathematical definition of this novel problem setting.
We devise a general framework that coordinates a single graph-shared structure learner and multiple graph-specific GNNs.
The well-trained structure learner can directly produce adaptive structures for unseen target graphs without any fine-tuning.
arXiv Detail & Related papers (2023-06-20T03:33:22Z)
- Learning Strong Graph Neural Networks with Weak Information [64.64996100343602]
We develop a principled approach to the problem of graph learning with weak information (GLWI).
We propose D$^2$PT, a dual-channel GNN framework that performs long-range information propagation not only on the input graph with incomplete structure, but also on a global graph that encodes global semantic similarities.
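The dual-channel idea can be sketched roughly as below; the APPNP-style propagation rule, the toy graphs, and the simple averaging of the two channels are illustrative assumptions, not the D$^2$PT architecture itself.

```python
import numpy as np

def propagate(adj, x, steps=2, alpha=0.5):
    """APPNP-style propagation: repeatedly mix each node's own features
    with the mean of its neighbors' features."""
    deg = adj.sum(axis=1, keepdims=True) + 1e-8   # avoid division by zero
    p = adj / deg                                  # row-normalized adjacency
    h = x
    for _ in range(steps):
        h = alpha * x + (1 - alpha) * (p @ h)
    return h

def dual_channel(adj_input, adj_global, x):
    """Dual-channel sketch: propagate on the (possibly incomplete) input
    graph and on a global semantic-similarity graph, then average."""
    return 0.5 * (propagate(adj_input, x) + propagate(adj_global, x))

# Toy example: a 3-node path graph vs. a denser "global semantic" graph
adj_in = np.array([[0., 1., 0.], [1., 0., 1.], [0., 1., 0.]])
adj_gl = np.array([[0., 1., 1.], [1., 0., 1.], [1., 1., 0.]])
x = np.eye(3)
h = dual_channel(adj_in, adj_gl, x)
```

The second channel lets information flow between nodes that are semantically similar even when no edge exists in the incomplete input graph.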
arXiv Detail & Related papers (2023-05-29T04:51:09Z)
- Federated Learning over Coupled Graphs [39.86903030911785]
Federated Learning (FL) has been proposed to solve the data isolation issue, mainly for Euclidean data.
We propose a novel FL framework for graph data, FedCog, to efficiently handle coupled graphs that are a kind of distributed graph data.
arXiv Detail & Related papers (2023-01-26T13:43:26Z)
- Graph Learning Across Data Silos [12.343382413705394]
We consider the problem of inferring graph topology from smooth graph signals in a novel but practical scenario.
Data are located in distributed clients and prohibited from leaving local clients due to factors such as privacy concerns.
We propose an auto-weighted multiple graph learning model to jointly learn a personalized graph for each local client and a single consensus graph for all clients.
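One way to picture the personalized/consensus split is the sketch below, where each client's graph is a weighted adjacency matrix and the consensus is an auto-weighted average. The inverse-distance weighting rule and the single-step update are illustrative assumptions, not the paper's optimization objective.

```python
import numpy as np

def consensus_update(local_graphs, eps=1e-8):
    """One consensus step: weight each client's personalized graph inversely
    to its distance from the current consensus, then re-average
    (illustrative auto-weighting, not the paper's exact update rule)."""
    consensus = np.mean(local_graphs, axis=0)
    # auto-weights: local graphs closer to the consensus count more
    dists = np.array([np.linalg.norm(g - consensus) for g in local_graphs])
    w = 1.0 / (dists + eps)
    w /= w.sum()
    return np.tensordot(w, local_graphs, axes=1)

# Three clients, each holding a 2-node weighted graph
graphs = np.stack([
    np.array([[0., 1.0], [1.0, 0.]]),
    np.array([[0., 0.8], [0.8, 0.]]),
    np.array([[0., 0.2], [0.2, 0.]]),
])
c = consensus_update(graphs)
```

Each client keeps its personalized graph for local inference, while the consensus graph captures structure shared across clients without raw data leaving any silo.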
arXiv Detail & Related papers (2023-01-17T02:14:57Z)
- DYNAFED: Tackling Client Data Heterogeneity with Global Dynamics [60.60173139258481]
Local training on non-iid distributed data results in deflected local optima.
A natural solution is to gather all client data onto the server, such that the server has a global view of the entire data distribution.
In this paper, we put forth an idea to collect and leverage global knowledge on the server without hindering data privacy.
arXiv Detail & Related papers (2022-11-20T06:13:06Z)
- FedEgo: Privacy-preserving Personalized Federated Graph Learning with Ego-graphs [22.649780281947837]
In some practical scenarios, graph data are stored separately in multiple distributed parties, which may not be directly shared due to conflicts of interest.
We propose FedEgo, a federated graph learning framework based on ego-graphs to tackle the challenges above.
arXiv Detail & Related papers (2022-08-29T15:47:36Z)
- Graph Representation Learning via Contrasting Cluster Assignments [57.87743170674533]
We propose a novel unsupervised graph representation model based on contrasting cluster assignments, called GRCCA.
It combines clustering algorithms and contrastive learning to make good use of local and global information jointly.
GRCCA is highly competitive across most tasks.
arXiv Detail & Related papers (2021-12-15T07:28:58Z)
- Self-supervised Graph-level Representation Learning with Local and Global Structure [71.45196938842608]
We propose a unified framework called Local-instance and Global-semantic Learning (GraphLoG) for self-supervised whole-graph representation learning.
Besides preserving the local similarities, GraphLoG introduces the hierarchical prototypes to capture the global semantic clusters.
An efficient online expectation-maximization (EM) algorithm is further developed for learning the model.
arXiv Detail & Related papers (2021-06-08T05:25:38Z)
- GraphFL: A Federated Learning Framework for Semi-Supervised Node Classification on Graphs [48.13100386338979]
We propose the first FL framework, namely GraphFL, for semi-supervised node classification on graphs.
We propose two GraphFL methods to respectively address the non-IID issue in graph data and handle the tasks with new label domains.
We adopt representative graph neural networks as GraphSSC methods and evaluate GraphFL on multiple graph datasets.
arXiv Detail & Related papers (2020-12-08T03:13:29Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.